There is a method called Total Variation denoising that minimizes the sum of the gradient magnitudes over the whole image, while staying within a fixed Euclidean distance of the noisy image. This is effectively an assumption of Gaussian noise coupled with an exponential (Laplacian) prior on the gradient magnitudes of the signal.

It's a nice method. It's better than saying the gradient magnitudes are Gaussian, which tends to nuke edges. Exponential is *ambivalent* about edges: a smooth gradient, a sharp edge, it doesn't care either way. It tends to produce staircase artifacts, though, putting in edges where there shouldn't be any.
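As a rough illustration, here is a minimal Total Variation denoiser in the unconstrained (Rudin–Osher–Fatemi style) form, where the fixed-distance constraint is replaced by a quadratic penalty. The parameter values are arbitrary guesses, the borders are handled crudely, and a real implementation would use a proper solver (Chambolle's projection algorithm or a primal-dual method) rather than raw gradient descent:

```python
import numpy as np

def tv_denoise(noisy, lam=1.0, step=0.01, iters=1000, eps=0.1):
    """Minimise  sum |grad u|  +  (lam/2) * ||u - noisy||^2
    by plain gradient descent.  |.| is smoothed by eps so it stays
    differentiable; lam, step, iters, eps are illustrative, not tuned."""
    u = noisy.astype(float).copy()
    for _ in range(iters):
        # forward differences, replicating the last row/column at the border
        ux = np.diff(u, axis=1, append=u[:, -1:])
        uy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(ux**2 + uy**2 + eps**2)
        px, py = ux / mag, uy / mag
        # backward-difference divergence (periodic at the border, for brevity)
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        # descent step: -grad of the TV term is div(p), -grad of the
        # fidelity term is -lam * (u - noisy)
        u += step * (div - lam * (u - noisy))
    return u
```

On a piecewise-constant test image this keeps the big edge while flattening the noise, which is exactly the exponential prior's indifference at work.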

[Image: a smooth gradient vs. a sharp edge.] "Eh. Whatever." says the Total Variation method.

Now if one modelled the magnitude of gradients using a Lévy alpha-stable distribution, it would smooth out small edges, like the Gaussian, and instead tend to concentrate any edges into one big leap. With an alpha-stable distribution, once you need to make a leap above a certain size, the tails are so heavy that a much bigger leap costs barely any more, so one large jump is cheaper than several medium ones. And alpha-stable distributions are nice mathematically: they obey the generalized central limit theorem, which is like the central limit theorem used to justify the Gaussian distribution, only done right this time.
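To see that concretely, treat the penalty on a gradient jump of height g as the negative log-density of the prior (up to constants). Alpha-stable densities have no closed form in general, so the Cauchy law, the alpha = 1 stable distribution that does have one, stands in here; the jump heights 5 and 10 are arbitrary:

```python
import numpy as np

def gauss_pen(g):   return g ** 2             # Gaussian prior, up to scale
def tv_pen(g):      return abs(g)             # exponential/Laplacian prior (TV)
def cauchy_pen(g):  return np.log1p(g ** 2)   # Cauchy: the alpha=1 stable law

# Cost of one jump of height 10 vs. two jumps of height 5.
for name, pen in [("Gaussian", gauss_pen), ("TV", tv_pen), ("Cauchy", cauchy_pen)]:
    one_big = pen(10.0)
    two_small = 2 * pen(5.0)
    print(f"{name:8s} one jump of 10: {one_big:7.3f}   two jumps of 5: {two_small:7.3f}")
```

The Gaussian charges 100 for one big jump versus 50 for two small ones, so it shatters edges into gradients; TV charges 10 either way, so it shrugs; the Cauchy penalty charges about 4.6 versus 6.5, so it actively prefers concentrating the change into one big leap.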

I'm guessing this could even be used to fill in missing areas of an image, *including reasonable reconstructions of edges*.

That seems rather nice behaviour indeed. I shall have to try it.