after doing more research i think i understand that poisson disk sampling isn't about sampling within a disk instead of a square (lol); it is in fact about creating blue noise, as claude mentioned, with samples being evenly distributed while avoiding clusters and voids. it does also still sound like this blue noise is considered "the best" for sampling?

as far as simply jittering a grid goes, it seems to me that using something like simplex noise to generate the offsets for your grid points makes a lot of sense, versus something purely random. again, random offsets will still produce clusters and voids (though i suppose less so for a randomly-jittered grid than for pure random sampling), whereas the "smoothness" of something like simplex noise in this context means there will be more uniformity to the spacing of adjacent points (no clusters and voids) while still being "random" relative to any other part of the image, which sounds to me to be similar to or the same as how we are defining blue noise here?
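to make the jittered-grid idea concrete, here's a minimal sketch. it uses cheap lattice value noise as a stand-in for simplex noise (any smooth noise works for the purpose of the argument); the function names, amplitude, and frequency are my own choices, just for illustration:

```python
import math
import random

def smooth_noise(x, y, seed=0):
    """cheap lattice value noise: hash the integer corners of the cell,
    then smoothstep-blend between them. a stand-in for simplex noise."""
    x0, y0 = math.floor(x), math.floor(y)
    fx, fy = x - x0, y - y0
    def corner(ix, iy):
        # deterministic per-lattice-point value in [-1, 1]
        return random.Random((ix * 73856093) ^ (iy * 19349663) ^ seed).uniform(-1.0, 1.0)
    sx = fx * fx * (3 - 2 * fx)   # smoothstep
    sy = fy * fy * (3 - 2 * fy)
    top = corner(x0, y0) * (1 - sx) + corner(x0 + 1, y0) * sx
    bot = corner(x0, y0 + 1) * (1 - sx) + corner(x0 + 1, y0 + 1) * sx
    return top * (1 - sy) + bot * sy

def jittered_grid(nx, ny, amplitude=0.4, freq=0.37):
    """regular grid over [0,1)^2 with smooth-noise offsets.
    amplitude < 0.5 cells keeps neighbors from ever swapping places,
    which is exactly the no-clusters-no-voids property discussed above."""
    pts = []
    for j in range(ny):
        for i in range(nx):
            dx = amplitude * smooth_noise(i * freq, j * freq, seed=1)
            dy = amplitude * smooth_noise(i * freq, j * freq, seed=2)
            pts.append(((i + 0.5 + dx) / nx, (j + 0.5 + dy) / ny))
    return pts

points = jittered_grid(16, 16)
```

because adjacent grid points sample the noise at nearby coordinates, their offsets are correlated, so spacing between neighbors stays near-uniform while the pattern as a whole has no repeating structure.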

i came across this guy's work, which all looks very interesting:

http://www.cemyuksel.com/cyCodeBase/soln/poisson_disk_sampling.html

he makes available here what looks to be a proper, high-quality implementation (a modern C++ template library explicitly designed to be fast and efficient) of various things, including what appears to be a novel way to generate a poisson-disk sample set: it takes a larger sample set as its input (he simply populates the input with purely random values in his examples), and the algorithm works by eliminating samples, reducing the sample set down and outputting a smaller subset with poisson-disk properties.

one thought i had was to start by creating a larger sample set from a simplex-jittered grid and feed that, as a sort of conditioned input, to this sample-elimination algorithm to end up with a potentially higher-quality sample set of a given size? assuming, of course, that doing so is worthwhile, or that doing it in this manner is even the best way to do it.
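as a rough illustration of the elimination idea (this is not his implementation: his library uses a kd-tree and a heap to make it fast, and the weight falloff and radius heuristic below are my assumptions), a naive version of the pipeline might look like:

```python
import math
import random

def eliminate(points, target):
    """naive sample elimination: repeatedly remove the most 'crowded'
    sample (largest sum of a falloff weight over nearby neighbors)
    until only `target` samples remain. O(n^2) per step; just the idea."""
    pts = list(points)
    # heuristic radius: roughly the poisson-disk radius for `target`
    # samples in the unit square (hexagonal-packing estimate)
    r = 2.0 * math.sqrt(1.0 / (2.0 * math.sqrt(3.0) * target))
    def crowding(i):
        xi, yi = pts[i]
        w = 0.0
        for j, (xj, yj) in enumerate(pts):
            if j == i:
                continue
            d = math.hypot(xi - xj, yi - yj)
            if d < r:
                w += (1.0 - d / r) ** 8   # steep falloff near neighbors
        return w
    while len(pts) > target:
        pts.pop(max(range(len(pts)), key=crowding))
    return pts

# the input could just as well be a simplex-jittered grid instead of pure random
random.seed(0)
raw = [(random.random(), random.random()) for _ in range(200)]
subset = eliminate(raw, 50)
```

since the output is a subset of the input, its minimum pairwise spacing can only be better than the input's, which is what the elimination is buying you.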

an interesting feature of this guy's implementation of his sample-elimination algorithm is that you can set a flag to have it sort the output, as he says, "such that when the samples are introduced one by one in this order, each subset in the sequence becomes a Poisson disk sample set," facilitating, for instance, a progressive sampling scheme.
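the trick behind that sorted output can be sketched independently of the fast implementation: keep eliminating all the way down, record the elimination order, and reverse it (a simplified take on the idea; his implementation also shrinks the disk radius as the set shrinks, which this sketch skips for brevity):

```python
import math
import random

def progressive_order(points):
    """order samples so that every prefix is itself well spaced:
    eliminate the most crowded sample one at a time down to a single
    survivor, record the elimination order, then reverse it."""
    pts = list(points)
    removed = []
    def crowding(i):
        xi, yi = pts[i]
        return sum(1.0 / (1e-9 + math.hypot(xi - xj, yi - yj))
                   for j, (xj, yj) in enumerate(pts) if j != i)
    while len(pts) > 1:
        removed.append(pts.pop(max(range(len(pts)), key=crowding)))
    return pts + removed[::-1]   # survivor first, then reverse elimination order

random.seed(1)
pts = [(random.random(), random.random()) for _ in range(64)]
ordered = progressive_order(pts)
```

the points that survive longest are the best spaced, so putting them first means any prefix of the ordering is itself a well-distributed set, which is exactly what makes progressive sampling work.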

so right now i'm kind of envisioning an adaptive-sampling scheme that uses this sample elimination to generate a progressive (sorted) poisson-disk sample set. this sample set would be large enough that it at least roughly equates to some maximum number of samples per pixel area. you could start off by solving some minimum number of these points, then use the information learned from solving them to drive an adaptive-sampling scheme which adds more samples (by simply stepping through your pre-computed progressive sample set) in regions that exceed some threshold measure of density or complexity or whatever, hopefully resulting in fairly uniform image quality.
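the scheme above might be sketched like this (everything here is hypothetical: the names, the minimum fraction, and the toy refinement metric are mine; a real renderer would plug in its own error estimate and a properly sorted progressive set):

```python
import math
import random

def adaptive_refine(ordered, solve, needs_refinement, min_frac=0.25):
    """walk a progressive (prefix-valid) sample set in order: solve the
    first `min_frac` of the samples unconditionally as a baseline pass,
    then solve the remaining samples only where `needs_refinement`
    decides, from what has been learned so far, that the region around
    a sample deserves more work."""
    values = {}
    n_min = max(1, int(len(ordered) * min_frac))
    for p in ordered[:n_min]:            # baseline pass, uniform coverage
        values[p] = solve(p)
    for p in ordered[n_min:]:            # adaptive pass
        if needs_refinement(p, values):
            values[p] = solve(p)
    return values

# toy usage: refine where a stand-in "scene" changes quickly
random.seed(2)
ordered = [(random.random(), random.random()) for _ in range(400)]  # stand-in for a sorted set
solve = lambda p: math.sin(40 * p[0]) * math.sin(40 * p[1])
def needs_refinement(p, values):
    near = [v for q, v in values.items()
            if math.hypot(p[0] - q[0], p[1] - q[1]) < 0.08]
    return len(near) > 1 and (max(near) - min(near)) > 1.0   # local contrast
result = adaptive_refine(ordered, solve, needs_refinement)
```

because the sample set is pre-sorted, "adding more samples to a region" is just continuing to step through the list and keeping the ones that land where refinement is requested, with no new point generation at render time.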

anyway, i know adaptive-sampling schemes are not a new idea. some of the particulars of what i have in mind might be a little different, though it doesn't even have to be new to be interesting to me. everyone wants to make 3D stuff as great as humanly possible, but the enthusiasm for making 2D stuff as great as humanly possible seems to be orders of magnitude less, if existent at all. search the internet for high-quality 3D renders and you'll be presented with a lifetime's worth of material to wade through. try to find a handful of even semi-decently rendered images of the mandelbrot set and, well, we all know the caliber of the material we are likely to find (unless perhaps it was rendered by one of maybe half a dozen people from fractalforums).