A Computational Approach to Detect Inhomogeneities in Time Series Data

Yazıcı, Ceyda
Detection of possible inhomogeneity within a series is an important problem in time series analysis. Inhomogeneity can originate from many sources, such as a mean shift, a change in variance or trend, a gradual change, or a sudden decrease or increase in the series. Since time series arise in many application areas, possible changepoints should be investigated before conducting any analysis. Available methods have certain drawbacks that may lead to unreliable inferences. These include the assumption of independent and identically distributed or normally distributed observations, which may not hold for dependent data, or the need for a highly correlated reference series. In this thesis, a computational approach is proposed to obtain an absolute test of whether the data are homogeneous. For this purpose, the likelihood ratio test for a mean shift in AR(1) models is considered, and the moving block bootstrap is then used to detect breakpoints, especially those close to the beginning or end of the series. In order to derive the related test statistic, the exact likelihood is used; the critical values of the test statistic are obtained by a simulation study for different sample sizes, and an appropriate block length is suggested for detecting changepoints. The performance of the proposed method is then compared with the best-performing tests in the literature. The comparison study and real-life applications reveal that the proposed method performs better than the existing methods.
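The procedure described above can be illustrated with a minimal sketch. This is not the thesis's exact test: the statistic below is a simple maximum standardized mean-difference over candidate changepoints rather than the exact-likelihood ratio statistic, and the centering step used to approximate the null is a deliberate simplification. The block length, sample size, and shift location are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_shift_stat(x):
    """Max absolute standardized mean difference over candidate split points
    (a simple stand-in for the likelihood ratio statistic)."""
    n = len(x)
    best = 0.0
    for k in range(2, n - 1):
        d = abs(x[:k].mean() - x[k:].mean())
        best = max(best, d * np.sqrt(k * (n - k) / n))
    return best

def moving_block_bootstrap(x, block_len, n_boot=200, rng=rng):
    """Approximate the null distribution of the statistic by resampling
    overlapping blocks, which preserves short-range AR(1) dependence."""
    n = len(x)
    blocks = np.array([x[i:i + block_len] for i in range(n - block_len + 1)])
    n_blocks = int(np.ceil(n / block_len))
    stats = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, len(blocks), n_blocks)
        stats[b] = mean_shift_stat(np.concatenate(blocks[idx])[:n])
    return stats

# Simulate an AR(1) series with a mean shift close to the end of the series
n, phi = 200, 0.5
e = rng.normal(size=n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]
x[180:] += 2.0  # breakpoint near the series end

obs = mean_shift_stat(x)
# Centering is a crude approximation of the null; a full treatment would
# resample residuals after removing the estimated shift.
null = moving_block_bootstrap(x - x.mean(), block_len=10)
p_value = (null >= obs).mean()
print(f"statistic = {obs:.2f}, bootstrap p-value = {p_value:.3f}")
```

The block resampling keeps within-block serial correlation intact, which is why the moving block bootstrap is suited to dependent data such as AR(1) series; an i.i.d. bootstrap would understate the variability of the statistic.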