
It looks like none of the proposed approaches work well, and the problem seems to be much more complicated than it looks.

I think what might work properly is:

- A "fractal" dither pattern so that it can be zoomed out and in smoothly and is scale invariant

- Doing things in texel space so that both camera movement and object movement works properly

- Doing bilinear filtering (perhaps keeping all samples instead of storing the weighted average) or perhaps supersampled rendering of the dithered pattern, and then using some sort of error diffusion pass in screen space (with a compute shader)
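One concrete candidate for the "fractal", scale-invariant pattern is the classic Bayer ordered-dither matrix: it is built recursively, so each quadrant of the 2^n matrix is a copy of the next-smaller matrix, and power-of-two zoom levels reveal the same structure at a finer scale. A minimal sketch (my own illustration, not from the comment):

```python
def bayer(n):
    """Recursively build a 2^n x 2^n Bayer ordered-dither matrix.

    The pattern is self-similar: each quadrant is a scaled copy of the
    next-smaller matrix, which is what makes it behave "fractally"
    under power-of-two zooms.
    """
    if n == 0:
        return [[0]]
    prev = bayer(n - 1)
    size = len(prev)
    out = [[0] * (2 * size) for _ in range(2 * size)]
    for y in range(size):
        for x in range(size):
            v = 4 * prev[y][x]
            # quadrant offsets follow the classic 0, 2, 3, 1 order
            out[y][x] = v                     # top-left:     +0
            out[y][x + size] = v + 2          # top-right:    +2
            out[y + size][x] = v + 3          # bottom-left:  +3
            out[y + size][x + size] = v + 1   # bottom-right: +1
    return out
```

For example, `bayer(1)` is the familiar `[[0, 2], [3, 1]]`, and thresholding against `bayer(n)` values (normalized by `4**n`) gives an ordered dither at any power-of-two scale.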

But I'm not actually sure whether this works in practice.
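The screen-space error-diffusion pass mentioned above could be something like Floyd-Steinberg dithering. A minimal CPU sketch, assuming a grayscale image stored as nested lists of floats in [0, 1] (the function name and layout are my own):

```python
def error_diffuse(gray, threshold=0.5):
    """Floyd-Steinberg error diffusion over a grayscale image.

    CPU sketch of the screen-space pass suggested above; a real
    implementation would run as a compute shader.
    """
    h, w = len(gray), len(gray[0])
    img = [row[:] for row in gray]  # work on a copy
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            new = 1 if img[y][x] >= threshold else 0
            out[y][x] = new
            err = img[y][x] - new
            # distribute the quantization error to unvisited neighbors
            if x + 1 < w:
                img[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1][x - 1] += err * 3 / 16
                img[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1][x + 1] += err * 1 / 16
    return out
```

Because the pass is serial (each pixel's output depends on errors from earlier pixels), mapping it onto a compute shader is nontrivial; that is part of why the screen-space approach gets complicated.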

If that's not enough, an alternative would be to do things in screen space "naively", then reverse-map the screen-space rendering to texel space (in a resolution-preserving way), use that texel-space information on the next frame to create a screen-space solution consistent with it, map that back to texel space, and so on, effectively building up the fractal per-texel pattern incrementally at runtime. This might be the best solution, but it seems very expensive in terms of memory, computation and complexity.



I've got a fractal approach working pretty well as of last week:

https://x.com/dgant/status/1851840125065453894 https://x.com/dgant/status/1851835968342446576?t=kCUSWCtJEc_...

How it works:

- World space dithering

- Apply 2D dithering along triplanar-mapped surfaces

- Choose the coordinate scale based on depth so that the dither pattern's projected period stays in [1px, 2px)

- Paint sufficiently distant surfaces using spherical coordinates

So there's some repainting at depth thresholds, but it's not very noticeable in practice.
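The depth-based scale choice could be sketched as snapping a world-space dither period to the next power of two so that its projected size lands in [1px, 2px); the `fov_scale` parameter (pixels per world unit at unit depth, i.e. a pinhole-camera projection factor) is my assumption, not from the post:

```python
import math

def dither_scale(depth, fov_scale=1.0):
    """Pick a power-of-two world-space period for the dither pattern.

    Assumes a pinhole model where a length L at distance `depth`
    projects to roughly fov_scale * L / depth pixels. Snapping the
    period up to a power of two keeps its projected size in [1px, 2px),
    which is why the pattern "repaints" at discrete depth thresholds.
    """
    # world-space length that projects to exactly one pixel at this depth
    one_px = depth / fov_scale
    # snap up to the next power of two
    return 2.0 ** math.ceil(math.log2(one_px))
```

For example, at `depth=3` with `fov_scale=1` the period becomes 4 world units, which projects to 4/3 px; crossing `depth=4` bumps it to 8 units, i.e. one of the repainting thresholds mentioned above.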


Doesn't seem fractal: the dots on the door get bigger as the camera approaches rather than revealing underlying fine-grained structure.


Valid critique. I think nothing prevents the technique from being applied at multiple levels of detail to make it do what you describe.


That is wild! It's not every day I get to see a completely new, unique visual effect. Kudos.

I'd love to see a video with vastly slower movement, so I can pay attention to what's actually happening. The fast movement turns it all into a blur (literally).



