
Author Topic: (Question) Sampling  (Read 392 times)


Offline quaz0r

  • Fractal Feline
  • **
  • Posts: 157
(Question) Sampling
« on: September 01, 2019, 04:26:02 AM »
I have just added some code to my Mandelbrot program to "jitter" the sampling grid (is that the right terminology?) using simplex noise. One thing I'm unsure of is how people tend to implement this for such a purpose. The simplex function returns a single value, so right now I'm using two separate instances of the simplex noise: one to generate a set of values to apply horizontally, and another set to apply vertically. Generating two different sets of values instead of reusing the same set for both the horizontal and vertical offsets seemed like it might be the most proper approach, but maybe that's dumb and of no value?
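If I understand the two-stream idea, a minimal sketch would look something like this, with `random.Random` standing in for the two independent simplex-noise instances (names and the `amount` parameter are mine, just for illustration):

```python
import random

# Two independent sources, one for horizontal and one for vertical offsets,
# so the x and y jitter are decorrelated. random.Random stands in for the
# two simplex-noise instances described above.

def jittered_grid(nx, ny, seed=0, amount=0.5):
    rx = random.Random(seed)       # stream for x offsets
    ry = random.Random(seed + 1)   # independent stream for y offsets
    pts = []
    for j in range(ny):
        for i in range(nx):
            dx = (rx.random() - 0.5) * amount
            dy = (ry.random() - 0.5) * amount
            pts.append((i + 0.5 + dx, j + 0.5 + dy))  # cell center plus jitter
    return pts
```

Each sample stays within `amount/2` of its cell center, so no two samples can collapse onto the same spot from neighboring cells.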

I found some material extolling Poisson disc sampling as perhaps the best approach, but when I read more it seemed to indicate that you simply pick jitter values at random, confined to a disc region. Isn't something like simplex supposed to be better than pure random? I thought that was the whole idea in the first place. Is the Poisson thing simply about confining the region of possible offsets to a disc instead of a square? I guess I could use my two sets of values as an angle and a radius instead of an x,y offset, if that would actually be "better"?

This stuff also got me thinking about some of the fancier math discussed on here sometimes, like the ball arithmetic that was discussed for (if I remember back that far) generating an error bound on the series approximation for use as a bail condition. Anyway, I was curious: is this sort of thing only capable of producing an error bound? For things like the Mandelbrot set, could one iterate a ball or something along those lines and get the usual "answers" out the other end, things like escape time and distance? If it is possible at all, how expensive might it be? At what point would it become worthwhile to do that versus jittering points and iterating a quadzillion of them to average together, and all of this endless screwing around?

Linkback: https://fractalforums.org/index.php?topic=3032.0

Offline pauldelbrot

  • 3f
  • ******
  • Posts: 2899
Re: Sampling
« Reply #1 on: September 01, 2019, 05:23:47 AM »
Quote from quaz0r: "i guess i could use my two sets of values as an angle and a radius instead of an x,y offset if that would actually be 'better'?"

That will result in a nonuniform point distribution. Consider e.g. that the central part of the disk out to half the radius represents a quarter of the disk's total area but would receive half the points. The radial coordinate would have to be the square root of the uniform value (r = R·sqrt(u)) so that area, rather than radius, is uniformly distributed. Or, you pick x,y in a square with the disk inscribed in it and discard-and-reroll points outside the disk (x² + y² > r²). The latter requires doing roughly 4/π ≈ 1.27 times as much work per point, plus the multiplies in the disk membership test, versus just the one square root for the radial coordinate. On the other hand I am more certain that the discard-and-reroll method will give a truly uniform distribution on the disk.
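A sketch of the two options described above, assuming a unit-radius disk (function names are mine):

```python
import math, random

def disk_sqrt(rng):
    # Radial transform: r = sqrt(u) makes *area*, not radius, uniform.
    r = math.sqrt(rng.random())
    theta = 2.0 * math.pi * rng.random()
    return r * math.cos(theta), r * math.sin(theta)

def disk_reject(rng):
    # Discard-and-reroll: sample the bounding square, keep points in the disk.
    while True:
        x, y = rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:
            return x, y
```

Both should give the same distribution; for a uniform unit disk, the mean distance from the center works out to 2/3, which makes a handy sanity check.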

Offline marcm200

  • 3d
  • ****
  • Posts: 977
Re: Sampling
« Reply #2 on: September 01, 2019, 08:02:49 AM »
Quote from quaz0r: "for things like mandelbrot, could one iterate a ball or something along those lines, and get the usual 'answers' out the other end? things like escape time and distance and whatever. if possible at all, how expensive might it be?"
Not sure about noise, but that sounds like the interval arithmetic in the paper by Figueiredo et al. on trustworthy Julia sets. However, they iterate small squares only once and then break the iterated square (ball) into standard-sized squares (tiling the complex plane) to iterate those again; otherwise, outside the unit disc the squares would get blown up pretty fast and become non-informative, I guess.

As for the cost: calculating a valid bounding box for a ball/square using interval arithmetic is quite a task, with a lot of multiplications and max/min determinations, though those get compiler-parallelized quite well. However, with that breaking-down step you effectively get an escape tree, and you have to follow the fate of ALL paths to be certain whether the initial point escapes or not, which means a lot of tree traversals. So using intervals here might only be applicable for certain seed values/formulas where the ball/square iteration shrinks the iterated ball below the resolution-based minimal standard size.
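The bounding-box step can be sketched with plain endpoint intervals; this is only one interval iteration of z² + c, not the full tiling/tree machinery described above:

```python
# Each coordinate is tracked as a (lo, hi) interval. Squaring via the
# generic product is conservative (an interval spanning 0 keeps its
# negative part), which is fine for a bounding box.

def imul(a, b):
    # Interval product: the true range is bounded by the corner products.
    ps = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(ps), max(ps))

def iadd(a, b):
    return (a[0] + b[0], a[1] + b[1])

def isub(a, b):
    return (a[0] - b[1], a[1] - b[0])

def box_step(zx, zy, cx, cy):
    # One iteration of (zx + i*zy)^2 + c with interval coordinates.
    nzx = iadd(isub(imul(zx, zx), imul(zy, zy)), cx)
    nzy = iadd(imul((2.0, 2.0), imul(zx, zy)), cy)
    return nzx, nzy
```

For a tiny box around the origin with c = 1, the iterates run 1, 2, 5, 26, so after four steps the entire box provably escapes.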

Offline claude

  • 3f
  • ******
  • Posts: 1915
    • mathr.co.uk
Re: Sampling
« Reply #3 on: September 01, 2019, 02:52:21 PM »
Here's how KF does it: https://code.mathr.co.uk/kalles-fraktaler-2/blob/d685883599f7ae795bd7667834b0d2a47c00c7ef:/fraktal_sft/fraktal_sft.cpp#l3314 (lines 3314 through 3379).
In short, it generates a hash of (i, j, seed) for each of the dx, dy offsets (the seed is different for x and y) and uses that to jitter.
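A sketch of that hash-based scheme; the mixing function below is a generic murmur3-style integer finalizer and the coordinate mixing constants are mine, not KF's actual hash:

```python
def hash_u32(x):
    # One round of a common 32-bit integer finalizer (murmur3-style).
    x &= 0xFFFFFFFF
    x ^= x >> 16
    x = (x * 0x7FEB352D) & 0xFFFFFFFF
    x ^= x >> 15
    x = (x * 0x846CA68B) & 0xFFFFFFFF
    x ^= x >> 16
    return x

def jitter(i, j, seed):
    # Different seeds decorrelate the x and y offsets, as described above.
    h = (i * 73856093) ^ (j * 19349663)
    dx = hash_u32(h ^ seed) / 2.0**32 - 0.5
    dy = hash_u32(h ^ (seed + 1)) / 2.0**32 - 0.5
    return dx, dy  # uniform offsets within [-0.5, 0.5) of a pixel
```

The appeal is that the jitter is stateless and reproducible: any pixel's offset can be recomputed from (i, j, seed) alone, with no stored noise field.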

When I first added jitter to KF it was Gaussian distributed, but gerrit found that it still let Moiré artifacts through in certain locations (e.g. deep minis near -2+0i). Uniform jitter seems much better; see some threads in this forum. I don't know if a disc would be better than rectangular.

Simplex noise is designed to be smooth, so I don't know if it's a good thing to use?  I suspect "blue noise" might be the best...

Offline quaz0r

  • Fractal Feline
  • **
  • Posts: 157
Re: Sampling
« Reply #4 on: September 05, 2019, 07:04:09 AM »
After doing more research, I think I understand that Poisson disk sampling isn't about sampling within a disk instead of a square (lol); it is in fact about creating blue noise, as claude mentioned, with samples evenly distributed, avoiding clusters and voids. It also still sounds like this blue noise is considered "the best" for sampling?

As far as simply jittering a grid goes, it seems to me that using something like simplex noise for the grid-point offsets makes a lot of sense versus something more random. Again, random offsets will produce clusters and voids (though less so for a randomly jittered grid than for purely random points), whereas the "smoothness" of simplex noise in this context means adjacent points stay more uniformly spaced (no clusters and voids) while remaining "random" relative to any other part of the image, which sounds similar to, or the same as, how we are defining blue noise here?
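For reference, the classic "dart throwing" construction of such a blue-noise set, in a deliberately naive O(n²) form (Bridson's grid-accelerated algorithm is the practical choice; this only illustrates the minimum-distance property):

```python
import random

# Naive Poisson disk sampling: accept a random candidate only if it keeps
# a minimum distance r to every sample accepted so far.

def poisson_disk(width, height, r, tries=3000, rng=None):
    rng = rng or random.Random(0)
    samples = []
    for _ in range(tries):
        x, y = rng.uniform(0, width), rng.uniform(0, height)
        if all((x - sx) ** 2 + (y - sy) ** 2 >= r * r for sx, sy in samples):
            samples.append((x, y))
    return samples
```

By construction no two samples are closer than r, which is exactly the "no clusters" half of the blue-noise property; the fixed try budget is what prevents it from filling every void.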

I came across this guy's work, which all looks very interesting:


He makes available what looks to be a proper, high-quality implementation (a modern C++ template library explicitly designed to be fast and efficient) of various things, including what appears to be a novel way to generate a Poisson disk sample set: it takes a larger sample set as input (he simply populates the input with purely random values in his examples) and works by eliminating samples, reducing the set down to a smaller subset with Poisson disk properties.
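The elimination idea can be sketched roughly like this; the weight function and the r_max formula follow my reading of the description, and the O(n²) greedy loop stands in for the library's heap-plus-kd-tree implementation:

```python
import math

# Greedily remove the most "crowded" sample until `target` points remain.

def eliminate(samples, target, alpha=8.0):
    # r_max: radius of an ideal Poisson disk set of `target` points in a unit square.
    r_max = math.sqrt(1.0 / (2.0 * math.sqrt(3.0) * target))

    def w(d):
        # Close neighbors contribute large weights; beyond 2*r_max they contribute 0.
        return (1.0 - min(d, 2.0 * r_max) / (2.0 * r_max)) ** alpha

    alive = list(range(len(samples)))
    while len(alive) > target:
        # Weight of a sample = summed contributions of all remaining neighbors.
        weights = [sum(w(math.dist(samples[i], samples[j]))
                       for j in alive if j != i) for i in alive]
        alive.pop(weights.index(max(weights)))  # drop the most crowded sample
    return [samples[i] for i in alive]
```

A useful property for checking this: since elimination only removes points, the minimum pairwise distance of the output can never be smaller than that of the input.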

One thought I had was to start by creating a larger sample set from a simplex-jittered grid and feed that as a sort of conditioned input to this sample elimination algorithm, ending up with a potentially higher-quality sample set of a given size? Assuming, of course, that doing so is worthwhile, or that doing it in this manner is even the best way.

An interesting feature of this guy's implementation of his sample elimination algorithm is that you can set a flag to have it sort the output, as he says, "such that when the samples are introduced one by one in this order, each subset in the sequence becomes a Poisson disk sample set," facilitating, for instance, a progressive sampling scheme.

So right now I'm envisioning an adaptive sampling scheme that uses this sample elimination to generate a progressive (sorted) Poisson disk sample set, large enough to roughly equate to some maximum number of samples per pixel area. You would start by solving some minimum number of these points, then use the information learned from them to drive the adaptive scheme, adding more samples (by simply stepping through the precomputed progressive sample set) in regions that exceed some threshold measure of density or complexity, hopefully resulting in fairly uniform image quality.
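That loop might look something like this; `f`, the warm-up budget, and the nearest-sample complexity test are all placeholders for whatever fractal evaluation and refinement criterion one actually uses:

```python
# Walk a presorted progressive sample list; past a warm-up budget, only keep
# samples whose value differs enough from the nearest already-solved sample.

def adaptive_pass(samples, f, warmup, threshold):
    solved = {}                      # (x, y) -> solved value
    for k, (x, y) in enumerate(samples):
        if k >= warmup:
            # Crude local-complexity test: compare against the nearest solved point.
            nx, ny = min(solved, key=lambda p: (p[0] - x) ** 2 + (p[1] - y) ** 2)
            if abs(solved[(nx, ny)] - f(x, y)) < threshold:
                continue             # smooth region: skip this sample
        solved[(x, y)] = f(x, y)     # warm-up or busy region: keep it
    return solved
```

With a huge threshold everything past the warm-up is skipped; with a zero threshold every sample is kept, so the knob interpolates between the two extremes.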

Anyway, I know adaptive sampling schemes are not a new idea. Some of the particulars of what I have in mind might be a little different, though it doesn't even have to be new to be interesting to me. Everyone wants to make 3D stuff as great as humanly possible, but the enthusiasm for making 2D stuff as great as humanly possible seems to be orders of magnitude less, if it exists at all. Search the internet for high-quality 3D renders and you'll be presented with a lifetime's worth of material to wade through. Try to find a handful of even semi-decently rendered images of the Mandelbrot set and, well, we all know the caliber of the material we are likely to find (unless perhaps it was rendered by one of maybe half a dozen people from fractalforums).   >:(

Offline quaz0r

  • Fractal Feline
  • **
  • Posts: 157
Re: Sampling
« Reply #5 on: September 05, 2019, 12:00:59 PM »

Here is a link to his paper about his algorithm.

Getting further into the weeds, a few ideas popped into my head when I was looking at this stuff, bearing in mind of course that I'm an ignorant rube and know little to nothing about 3D rendering. I'm thinking here about different ways of potentially informing the distribution and placement of new samples in an adaptive sampling scheme.

On the author's website (I forget if it's in the paper too), where he describes sampling surfaces in 3D, he says, "Notice that the WeightedSampleElimination object was created as a 3D sampler (working on samples in 3D), but we specify the dimensionality of the sampling domain as 2D, since we would like to sample a surface." Looking at the example image of the rabbit with the sampling points overlaid on its surface, I noticed how the 3D sample set squashes down to 2D, creating different densities of points throughout the 2D image (obviously). This led me to envision a potentially new way of sampling something like the Mandelbrot set.

What if we generate a 3D representation of a location from the escape time or distance (complex distance?), like how people make Mandelbrot-set heightmaps or meshes (I've never actually messed around with any of that myself), create a Poisson disk sample set on our new 3D surface, and then squash this 3D sample set down to 2D?

That's just my random idea anyway; maybe it's totally stupid or it wouldn't work. It seems like there could be some benefits, though. By sampling in three dimensions, it seems like both the density distribution and the exact placement of the samples would be more fine-grained and smoothly changing compared to any more typical approach. These favorable (I think?) properties would be in addition to (I guess?) the already favorable properties inherent in Poisson disk sampling, so might a sample set arrived at in this manner be some new pinnacle in sample-set quality for this particular problem?

As far as actual implementation goes, I guess it wouldn't really be any different from what I was envisioning before for an adaptive sampling approach, as far as solving some initial minimum number of points to kickstart the process, except in this case you would use your initial set of solved points to construct the 3D surface on which to create the 3D sample set. From there you could just solve the remaining points normally, or you could repeat the process some number of times for further refinement.

Offline hobold

  • Fractal Frogurt
  • ******
  • Posts: 465
Re: Sampling
« Reply #6 on: September 05, 2019, 02:03:32 PM »
The idea with a uniformly sampled height field informing a non-uniformly sampled 2D image plane is just one special case of warping a 2D distribution. You can get locally differing densities, but globally retain desired qualities of, say, Poisson disk sampling.

The local slope of the height field controls an anisotropic density modification: if you project points on a "ramp" down to the 2D plane, they will be spaced more densely in the up/down direction of the ramp.
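The compression described above can be made concrete: a step of length s taken along a ramp inclined at angle theta projects down to a horizontal step of s·cos(theta), so steeper slopes pack the projected points more densely in the slope direction (a trivial sketch, names mine):

```python
import math

# Horizontal component of a step taken along an inclined surface: points
# spaced evenly along a ramp of slope angle theta land cos(theta) times
# closer together when projected onto the 2D plane.

def projected_spacing(step_along_ramp, theta):
    return step_along_ramp * math.cos(theta)
```

At 60 degrees of slope, the projected spacing is half the on-surface spacing, which is exactly the density boost a steep Mandelbrot heightmap region would get.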
