The results seem quite good. We shall see soon if this holds for other images as well.
For simple POT smoothing, for each tCurrent (mCurrent) I have to retain tPrev (mPrev) in order to compute tNew (mNew). So, for each line I have to store the previous line => O(z) memory.
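As a minimal sketch of the simple scheme, here is one possible per-line update. The averaging rule `(tCurrent + tPrev) / 2` is my assumption for illustration; the actual weighting in the POT implementation may differ, and the function name is hypothetical:

```python
def simple_smooth_line(t_current, t_prev):
    """Smooth one line of t parameters using only the previous line.

    Hypothetical sketch: the real update rule may weight the terms
    differently. Storing one previous value per column is what gives
    the O(z) memory cost mentioned above.
    """
    return [(tc + tp) / 2.0 for tc, tp in zip(t_current, t_prev)]
```

The key point is that only one extra line (tPrev) has to be kept around between iterations.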
In the averaged smoothing approach, for each tCurrent (mCurrent) I have to retain the average of the previous t values (mean values), which is updated at each line in constant time, in order to compute tNew (mNew). So, for each line I also retain a vector of averages of the previous values => O(z) memory.
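The constant-time average update can be sketched as an incremental (online) mean. This is an assumption about how the running average is maintained, not the confirmed implementation, and the helper name is mine:

```python
def update_averages(averages, t_current, n):
    """Fold the current line's t values into the per-column running means.

    Hypothetical sketch: each column's mean is updated in O(1) via the
    standard incremental formula, where n is the number of lines already
    averaged. One mean per column keeps total memory at O(z).
    """
    return [avg + (tc - avg) / (n + 1)
            for avg, tc in zip(averages, t_current)]
```

So both schemes end up with the same O(z) footprint: one stores the previous line, the other stores one running mean per column.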
It would thus be ideal if we could combine the two. This is what I've been working on this week. Unfortunately, it's not an easy task. After trying many variations, the problem seems to be that discontinuities appear when switching from simple smoothing to average smoothing. I think this happens because average smoothing uses the averages of the tCurrent values when smoothing, and these averages no longer 'fit' after a stretch of simple smoothing. Here is how the t parameter looks:
In this example I used a block size of 400 lines, applying average smoothing for the first 256 lines and simple smoothing for the rest. After 400 lines this procedure repeats. The discontinuities appear around multiples of 400, indicating that the problem occurs when starting a new block.
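The block layout from the experiment can be sketched as a small helper that tells which scheme is active for a given line. The constants come from the description above; the function itself is a hypothetical illustration:

```python
BLOCK_SIZE = 400   # lines per block, as in the experiment above
AVG_LINES = 256    # lines smoothed with the averaged scheme per block

def smoothing_mode(line_index):
    """Return the smoothing scheme active at a given line.

    Hypothetical sketch of the block structure: the first 256 lines of
    each 400-line block use average smoothing, the remaining 144 use
    simple smoothing, and the pattern repeats. The discontinuities show
    up near the block boundaries (multiples of 400).
    """
    return "average" if line_index % BLOCK_SIZE < AVG_LINES else "simple"
```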
I will try to improve this, using something other than tCurrent in the average computation.
In parallel, I will also be working on the lossless case, which has been ignored so far. Everything we've talked about until now has been related to the lossy implementation. For now, I've represented the image after a simple application of lossless POT, which can be found here. But more on this topic will be covered in the next blog post.