
### Author Topic: (Question) sampling


#### quaz0r

• Fractal Feline
• Posts: 153
##### (Question) sampling
« on: April 06, 2020, 08:32:53 PM »
So while the world turns into the zombie apocalypse, I'm trying to distract myself by working on my Mandelbrot program some more.

I've been reading a bit about the Nyquist sampling theorem and the sinc function for interpolating samples. Usually with something like a Mandelbrot renderer, people program it so that one sample = one pixel. If you want to do better than that, you're often forced to render a large image that way and then downsample it in another program to effectively get more samples per pixel. I always thought that was a major shortcoming, so one of the first features I implemented in my own program was a set of supersampling controls: render the larger image internally, downsample, and output the desired final image. I use a typical image processing library for this, which I guess implements the usual convolution kernels for such things.

I was thinking, though, that it would tickle my fancy even more to not rely on image processing libraries and instead implement something more directly. This is where I started reading about the Nyquist sampling theorem, but immediately it's all super technical and over my head, and it's often discussed in terms of audio. One thing I could somewhat relate to was the sinc function, which I sort of recognize from material on image convolution kernels.

So, if I understand this right: say you take the location of the center of a pixel, around which you have a set of samples at varying distances. Is the idea that you plug the (normalized, I guess) distance of each sample into the sinc function, and then use the result to scale the "value" of that sample? Then what? Take the average of all the scaled values? Anything else? I can't seem to find anything clearly showing an example like this...

Another thing I'm not sure about is how exactly one would scale the sinc function. For instance, the central "lobe" should equate to what, exactly? The size of a pixel? Something more specific than that?

I gather that actually doing all this is considered computationally impractical, though I'm still interested in the theory at least. On that front, I think I read that an infinite sinc function would render a "perfect" result. So if one were to render the "most perfect" result theoretically possible given a set of samples computed for an image, does that mean that for each pixel in the rendered image you could extend the sinc function out from the center of that pixel to include all the samples computed for the whole image, so that they all (sort of) contribute to each pixel?
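For what it's worth, a minimal sketch of the scheme being asked about, assuming distances are measured in pixel units (so the sinc's central lobe spans one pixel spacing) and using a Lanczos window to keep the support finite — a common practical stand-in for the ideal infinite sinc:

```python
import numpy as np

def lanczos_weight(dx, dy, a=3):
    """Separable Lanczos (windowed sinc) weight for an offset (dx, dy)
    measured in pixel units from the pixel center. np.sinc(x) is the
    normalised sinc sin(pi*x)/(pi*x); the window zeros it beyond |d| >= a."""
    def l1d(d):
        if abs(d) >= a:
            return 0.0
        return np.sinc(d) * np.sinc(d / a)
    return l1d(dx) * l1d(dy)

def reconstruct_pixel(samples):
    """samples: list of (dx, dy, value) offsets in pixel units.
    Weighted average: sum(w * v) / sum(w)."""
    num = den = 0.0
    for dx, dy, v in samples:
        w = lanczos_weight(dx, dy)
        num += w * v
        den += w
    return num / den if den != 0.0 else 0.0
```

The normalisation by the sum of weights is what turns the scaled values into a proper weighted average; without it, pixels whose samples happen to fall in the sinc's side lobes would drift in brightness.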

#### hobold

• Fractal Frogurt
• Posts: 430
##### Re: sampling
« Reply #1 on: April 06, 2020, 10:46:54 PM »
Fractals have (potentially) infinite detail. That is equivalent to there being unboundedly high frequencies in the signal's spectrum. In other words, a fractal signal can have unlimited bandwidth. So there will always be some loss, and perfection is probably not a reachable goal.

The good news is that thanks to the high resolution of modern display screens, and thanks to the sheer brute computing force available, one can get very nice results with rather sloppy sampling and filtering strategies. For example, I never had to improve on a box filter once I corrected for display gamma (without gamma correction you get jagged edges almost regardless of filter kernel).
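A sketch of that gamma-corrected box filter, assuming a simple power-law gamma of 2.2 rather than the exact sRGB transfer curve:

```python
import numpy as np

def downsample_box_gamma(img, factor, gamma=2.2):
    """Box-filter downsample in linear light. `img` is an (H, W) array of
    display-gamma values in [0, 1]; H and W must be multiples of `factor`.
    Averaging gamma-encoded values darkens edges and leaves them jagged,
    so decode to linear, average, then re-encode."""
    linear = img ** gamma                       # decode display gamma
    h, w = linear.shape
    blocks = linear.reshape(h // factor, factor, w // factor, factor)
    avg = blocks.mean(axis=(1, 3))              # box filter = block mean
    return avg ** (1.0 / gamma)                 # re-encode for display
```

Note that a half-black, half-white block comes out at 0.5 ** (1/2.2) ≈ 0.73, not 0.5 — that brighter value is exactly the correction being described.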

In a very real sense, the box filter is the worst possible, when judging with sine waves in mind. But on the other hand, the box filter is a close relative to the Haar wavelet - and suddenly its acceptable performance isn't such a surprise anymore. At least for video signals - audio is another matter.

The point, I guess, is this: the topic is worthy and interesting. But don't expect dramatic differences between a "good enough" filter and some "near perfect" one.

#### v

• Fractal Phenom
• Posts: 56
##### Re: sampling
« Reply #3 on: April 08, 2020, 12:24:11 PM »
Downsampling and interpolating is essentially compression: low-frequency components, i.e. slowly changing colours that vary slower than half the sampling frequency, are preserved, while the high-frequency detail is lost, as the members above explained.

Here is an image of the Mandelbrot set segmented into boxes, with the 2D cosine transform of each plotted in colour. You can see how the edges, the detailed parts, have a lot of high-frequency components, resulting in less monocoloured boxes. In the next image we chop off the high frequencies, analogous to a low-pass filter or downsampling, and reconstruct the image, losing a ton of detail. This is a few steps short of JPEG compression; it works nicely for natural images and photographs, but it will destroy fractals, which are anything but bandlimited. You can do this without segmenting the image, but it is really computation-intensive. This is an analogous but not precise demonstration — I just happened to have this code on hand. I can post more frequency-analysis pictures of fractals if there is interest.
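A rough sketch of that blockwise experiment, using a hand-rolled orthonormal DCT-II so nothing beyond NumPy is needed; the block size, the `keep` cutoff, and the function names are illustrative choices, not the poster's actual code:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix: row k is the k-th cosine frequency."""
    k = np.arange(n)
    m = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    m[0, :] *= 1 / np.sqrt(2)
    return m * np.sqrt(2.0 / n)

def dct_truncate(img, block=8, keep=4):
    """Blockwise 2D DCT low-pass in the spirit of JPEG: transform each
    block x block tile, zero every coefficient outside the top-left
    keep x keep corner (the low frequencies), and invert. Image height
    and width are assumed to be multiples of `block`."""
    d = dct_matrix(block)
    h, w = img.shape
    out = np.zeros((h, w))
    for i in range(0, h, block):
        for j in range(0, w, block):
            tile = img[i:i+block, j:j+block].astype(float)
            coeffs = d @ tile @ d.T          # 2D DCT-II of the tile
            coeffs[keep:, :] = 0.0           # drop high vertical frequencies
            coeffs[:, keep:] = 0.0           # drop high horizontal frequencies
            out[i:i+block, j:j+block] = d.T @ coeffs @ d  # inverse transform
    return out
```

On a smooth gradient this is nearly lossless; on a fractal boundary, where the energy sits in the discarded high frequencies, the detail smears away.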

#### mrrudewords

• Fractal Frogurt
• Posts: 473
• Dat Mandel!
##### Re: sampling
« Reply #4 on: April 11, 2020, 12:23:10 PM »
I thought the typical way to do this was to calculate, e.g., 25 samples for a pixel, add up the RGB values across them, and divide by 25, so that your pixel is the average colour of the 5x5 sample grid.
Z = Z² + C (obvs)
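That scheme, as a sketch — the `sample` callback is a hypothetical stand-in for the Mandelbrot iteration and colouring:

```python
def render_pixel_ss(px, py, n, sample):
    """Average an n x n grid of subsamples inside pixel (px, py).
    `sample(x, y)` maps fractional pixel coordinates to a greyscale value."""
    total = 0.0
    for i in range(n):
        for j in range(n):
            # subsample centers at (i + 0.5)/n, (j + 0.5)/n within the pixel
            x = px + (i + 0.5) / n
            y = py + (j + 0.5) / n
            total += sample(x, y)
    return total / (n * n)
```

With n = 5 this is exactly the 25-sample box average described above.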

#### 3DickUlus

• Posts: 2213
##### Re: sampling
« Reply #5 on: April 11, 2020, 09:29:55 PM »
Personally I'm not a big fan of oversampling more than 5 spots (center and 4 corners). I think this wonderfully hand-crafted example can help illustrate my thought...

Speculation: in this pixel only one sample is on (or extremely close to) the border. Taking 25 samples and averaging, I feel, doesn't accurately represent the pixel; because the center is on the border, this pixel should be very dark?

The inner group immediately next to the central sample should only influence it by some small percentage, like 1%, because there are 8 of them, and likewise for the outer group but even less, like 0.25%, because there are 16 of them — rather than adding them all together and dividing by 25... just my un-edumakated 2 cents.
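One possible reading of that weighting, as a sketch; the ring weights are the percentages guessed at above, and the normalisation by the total weight keeps a constant region constant:

```python
def ring_weighted_5x5(samples):
    """samples: 5x5 nested list of values. Weight the center heavily and
    the two surrounding rings lightly (the 1% / 0.25% figures from the
    post), then normalise by the total weight. Chebyshev distance from
    the center cell picks the ring."""
    weights = {0: 1.0, 1: 0.01, 2: 0.0025}
    num = den = 0.0
    for i in range(5):
        for j in range(5):
            ring = max(abs(i - 2), abs(j - 2))
            w = weights[ring]
            num += w * samples[i][j]
            den += w
    return num / den
```

Under these weights a bright central sample dominates the result (its share is 1 / 1.12 ≈ 89%), whereas a plain 25-sample average would give it only 4%.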


#### C0ryMcG

• Posts: 263
##### Re: sampling
« Reply #7 on: April 11, 2020, 10:10:52 PM »

Serious Statistics: The Aliasing Adventure - Super-Sampling
Interesting! I just read through this one, since it was the first one I saw, and I think the system you describe not liking is what they call Box Sampling... and I've definitely been a bit disappointed with my implementation of box sampling for supersampling AA in the past. Their examples are very convincing! I was already leaning towards Gaussian after reading your pitch, since that's a formula that often gets along well with graphics, but even Triangle does a pretty good job.
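For comparison, the three kernels under discussion reduce to simple weight functions of a sample's distance from the pixel center; the radii and sigma here are illustrative defaults, not values from the linked article:

```python
import math

def box_w(d, r=1.0):
    """Box: every sample within radius r counts equally."""
    return 1.0 if abs(d) <= r else 0.0

def triangle_w(d, r=1.0):
    """Triangle (tent): linear falloff to zero at radius r."""
    return max(0.0, 1.0 - abs(d) / r)

def gaussian_w(d, sigma=0.5):
    """Gaussian: smooth falloff; sigma of about half a pixel is common."""
    return math.exp(-0.5 * (d / sigma) ** 2)

def filter_samples(samples, weight):
    """Normalised weighted average of (distance, value) samples."""
    num = sum(weight(d) * v for d, v in samples)
    den = sum(weight(d) for d, v in samples)
    return num / den
```

Swapping `weight` is the whole difference between the filters: box treats a far sample the same as a near one, triangle and Gaussian progressively discount it.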

#### 3DickUlus

• Posts: 2213
##### Re: sampling
« Reply #8 on: April 11, 2020, 11:00:05 PM »
I think the best one is a trapezoid, or twisted offset-box sampling of 4 locations, but for simplicity I mentioned center + corners.

#### xenodreambuie

• Fractal Friar
• Posts: 119
##### Re: sampling
« Reply #9 on: April 12, 2020, 12:07:04 AM »
In sampling theory, the idea is that if you take a bunch of samples of an input signal, what is the best way to recombine them to recreate the input signal as accurately as possible? For this purpose, results are objective in terms of measurable artifacts. Box filtering has a lot of artifacts. Gaussian filtering is ideal. The choice of spatial distribution of samples depends on the nature of the input, and whether your preference is to minimize moire effects or minimize noise or somewhere in between.

Fractals are more challenging than most fields because you can have areas of smooth coloring, areas of relatively uncorrelated noise (high density fractal boundary), and areas of correlated patterns (eg high frequency repetitions of a gradient, or fractal correlations) that do cause moire patterns. Box filtering gives the most blurring of dense regions and is simple, but it does lose some detail and is not ideal. Some people like to see more detail and are happy with 5 samples, but that leaves a lot of undesirable noise in places that doesn't satisfy other people.

Gaussian and other filters are slower to compute as each sample has to be multiplied by a weight, but this is negligible compared to the iteration time per sample.

I use a kind of compromise combination of approximate Gaussian filter and dithered sampling. It has equal weights, but biases the distribution towards the center. Not strictly accurate but very convenient to use and I'm happy with it.
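A sketch of one plausible reading of that compromise: draw the sample positions from a center-biased (Gaussian) distribution clipped to the pixel, then average them with equal weights, so the filter shape lives in the sample density rather than in per-sample weights. The sigma and rejection scheme here are assumptions, not the poster's actual method:

```python
import random

def centre_biased_samples(n, sigma=0.35, seed=0):
    """Offsets within a pixel, biased toward the center: draw from a
    Gaussian (sigma in pixel units) and reject points outside the pixel.
    Equal-weight averaging over these approximates a Gaussian filter."""
    rng = random.Random(seed)
    pts = []
    while len(pts) < n:
        dx = rng.gauss(0.0, sigma)
        dy = rng.gauss(0.0, sigma)
        if abs(dx) <= 0.5 and abs(dy) <= 0.5:
            pts.append((dx, dy))
    return pts

def average_pixel(sample, px, py, offsets):
    """Equal-weight average of sample(x, y) at the jittered positions,
    measured from the pixel center at (px + 0.5, py + 0.5)."""
    vals = [sample(px + 0.5 + dx, py + 0.5 + dy) for dx, dy in offsets]
    return sum(vals) / len(vals)
```

Because every weight is 1, there is no per-sample multiply at filtering time, which matches the "equal weights, biased distribution" description.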

#### quaz0r

• Fractal Feline
• Posts: 153
##### Re: sampling
« Reply #10 on: April 14, 2020, 04:19:07 PM »
Thanks for the replies. So a Gaussian curve makes the most sense here? Fair enough, though like I say, I thought some of the material said a sinc function is ideal... does that not really apply here?

#### hobold

• Fractal Frogurt
• Posts: 430
##### Re: sampling
« Reply #11 on: April 14, 2020, 05:24:00 PM »
Gaussian is best at suppressing artifacts, while sinc is best at preserving signal features.

Gaussian tends to introduce blurring - or I should rather say, results of Gaussian filtering will tend to lead human test subjects to judge images as slightly blurry (i.e. the blur effect is something related to human visual perception, not necessarily an objective loss of details).

Sinc tends to introduce edge ringing, or double contours. This is something also often disliked by human observers.
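The contrast shows up directly in the kernels themselves: sinc has negative side lobes, which are the source of the ringing and double contours, while a Gaussian is strictly positive, so it cannot ring but does blur. A tiny numerical illustration:

```python
import math

def sinc_w(d):
    """Normalised sinc, sin(pi*d)/(pi*d): central lobe spans one pixel
    spacing; the side lobes alternate in sign."""
    if d == 0.0:
        return 1.0
    x = math.pi * d
    return math.sin(x) / x

def gauss_w(d, sigma=0.5):
    """Gaussian: strictly positive everywhere."""
    return math.exp(-0.5 * (d / sigma) ** 2)
```

At 1.5 pixels from the center, sinc weights a sample negatively (it subtracts from the pixel — hence overshoot at edges), while the Gaussian weight is merely small.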

The rabbit hole goes even deeper than that, though, because the display device, when presenting pixels to human eyes, also inevitably (and implicitly) applies a reconstruction filter. The analog cathode ray tubes of old would project round pixels with blurry borders, so a Gaussian filter kernel was a good approximation. The liquid crystal displays of today project sharply delineated, rectangular subpixels - not at all like a Gaussian filter kernel.

Oh, and I recall that somebody developed an adaptive filter based on measured fractal dimension. That one would preserve thin lines better, but not overemphasize other edges. And it could implicitly differentiate between textures of small points on the one hand, and textures of bigger blotches on the other hand. I tried to find the reference again for citation here, but my google-fu has left me.

#### claude

• 3f
• Posts: 1850
##### Re: sampling
« Reply #12 on: April 14, 2020, 05:33:45 PM »

#### hobold

• Fractal Frogurt
• Posts: 430
##### Re: sampling
« Reply #13 on: April 14, 2020, 08:31:24 PM »
The website looks nothing like I remember, but the idea presented there seems to be the one.

