Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods that handle irregular sampling directly. We compared the linear interpolation technique with approaches that analyze the correlation functions and persistence of irregularly sampled time series directly, namely the Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test, we investigated the performance of these techniques. All methods had comparable root mean-square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e., very irregular data, the bias and RMSE of the interpolation technique increased strongly. For highly irregular time series, the Gaussian kernel method yielded a 40% lower RMSE for the lag-1 autocorrelation function (ACF) than the linear interpolation scheme; for the cross-correlation function (CCF), the RMSE was lowered by 60%. The Lomb-Scargle technique gave results comparable to the kernel methods in the univariate case, but poorer results in the bivariate case, especially for the high-frequency components of the signal. This article was authored by K. Rehfeld, N. Marwan, J. Heitzig, and others.
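To illustrate the kernel-based idea, the sketch below estimates a correlation at a chosen lag directly from unevenly spaced samples: each observation pair is weighted by a Gaussian kernel on the mismatch between its actual time difference and the target lag. The function name, bandwidth choice, and normalization here are illustrative assumptions following the general principle of kernel estimators, not the paper's exact implementation.

```python
import numpy as np

def gaussian_kernel_corr(tx, x, ty, y, lag, h):
    """Kernel-weighted correlation of (tx, x) and (ty, y) at a given lag.

    Each observation pair (i, j) is weighted by a Gaussian kernel on the
    mismatch between its actual time difference ty[j] - tx[i] and the
    target lag; h is the kernel bandwidth (illustrative choice).
    """
    xs = (x - x.mean()) / x.std()            # standardize both series
    ys = (y - y.mean()) / y.std()
    d = ty[None, :] - tx[:, None] - lag      # pairwise lag mismatches
    w = np.exp(-0.5 * (d / h) ** 2)          # Gaussian kernel weights
    return float(np.sum(w * np.outer(xs, ys)) / np.sum(w))
```

For the ACF, the same series is passed twice (tx = ty, x = y); for the CCF, two different series are used. No interpolation onto a regular grid is needed.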
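The Lomb-Scargle approach likewise avoids interpolation by estimating spectral power directly from unevenly spaced samples. A minimal sketch using SciPy's `scipy.signal.lombscargle` routine; the test signal, sampling scheme, and frequency grid are illustrative assumptions, not taken from the benchmark:

```python
import numpy as np
from scipy.signal import lombscargle

# Irregularly sampled sinusoid with angular frequency 1 rad/s.
rng = np.random.default_rng(42)
t = np.sort(rng.uniform(0.0, 40.0, 250))   # uneven sampling times
x = np.sin(t)

# Evaluate the periodogram on a grid of candidate angular frequencies.
freqs = np.linspace(0.1, 3.0, 300)
power = lombscargle(t, x - x.mean(), freqs)

peak = freqs[np.argmax(power)]             # recovered dominant frequency
```

The periodogram peaks at the true angular frequency even though the samples are unevenly spaced, which is why the method works well in the univariate case.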