In nonparametric regression with errors in the explanatory variable, the regression function is typically assumed to be smooth, and in particular not to have a rapidly changing derivative. However, not all real-data applications have this property. When the property fails, conventional techniques, usually based on kernel methods, perform unsatisfactorily. We suggest an adaptive, wavelet-based approach, founded on the concept of explained sum of squares, and using matrix regularisation to reduce noise. This non-standard technique is used because conventional wavelet methods fail to estimate wavelet coefficients consistently in the presence of measurement error. We demonstrate that our approach enjoys very good performance, especially when the regression function is erratic. Pronounced maxima and minima are recovered more accurately than with conventional methods, which tend to flatten peaks and troughs. Our techniques also have advantages when estimating conventional, smooth functions, since they require less sophisticated smoothing-parameter choice; that problem is particularly challenging in the measurement-error setting. A real-data example is discussed, along with an extensive simulation study. This is joint work with Peter Hall and Jason Tran.
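The flattening of peaks mentioned above can be seen in a minimal simulation. The sketch below (not the authors' method; notation and all parameter values are assumed for illustration) generates data from a standard errors-in-variables model, Y = g(X) + eps with the covariate observed only as W = X + U, and fits a naive Nadaraya-Watson kernel estimator against the contaminated W. The estimated maximum is visibly attenuated relative to the true peak height g(0.5) = 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Errors-in-variables model (notation assumed, not from the abstract):
#   Y = g(X) + eps, but only W = X + U is observed, a noisy surrogate for X.
def g(x):
    # A regression function with a pronounced peak at x = 0.5.
    return np.exp(-50.0 * (x - 0.5) ** 2)

n = 2000
X = rng.uniform(0.0, 1.0, n)
U = rng.normal(0.0, 0.1, n)        # measurement error in the covariate
W = X + U                          # observed, contaminated covariate
Y = g(X) + rng.normal(0.0, 0.05, n)

def nadaraya_watson(x0, w, y, h):
    # Naive kernel estimator that ignores the measurement error by
    # smoothing against W instead of the unobserved X.
    k = np.exp(-0.5 * ((x0[:, None] - w[None, :]) / h) ** 2)
    return (k * y).sum(axis=1) / k.sum(axis=1)

grid = np.linspace(0.0, 1.0, 201)
fit = nadaraya_watson(grid, W, Y, h=0.05)

# True peak height is g(0.5) = 1; the naive fit recovers a lower peak,
# because smoothing against W convolves g with the error distribution.
print(round(g(0.5), 3), round(fit.max(), 3))
```

The attenuation arises because the naive estimator effectively targets the convolution of g with the measurement-error density, which widens and lowers the bump; this is the failure mode that the wavelet-based approach described in the abstract is designed to avoid.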