This paper studies the estimation of a structural change in the persistence of a univariate time series. The break is such that the process has a unit root [i.e., is I(1)] in the pre-break regime but reverts to a stationary [i.e., I(0)] process in the post-break regime, or vice versa. Chong (2001) develops the limit theory for the estimation of such autoregressive processes and shows that the rate of convergence of the breakpoint estimator in the I(1)–I(0) case is faster than that in the I(0)–I(1) case, which enables the break date to be estimated much more precisely in the former case. In this paper, we show that the faster rate is an artifact of the assumed data-generating process, which is characterized by a spurious jump at the true breakpoint. Under a reformulation that avoids this jump, the same rate of convergence prevails in both cases. An important implication of this result is that existing confidence intervals in the I(1)–I(0) case have asymptotically zero coverage rates when the break magnitude is fixed. A small simulation study confirms the relevance of the asymptotic results in finite samples.