|
hana
|
2026-02-14 03:34:29
|
I also used the latest nightly build with XL Converter
I only replaced the `cjxl.exe` there |
|
|
RaveSteel
|
2026-02-14 03:35:35
|
According to exiftool your TIFFs contain lossy JPEGs |
|
2026-02-14 03:35:42
|
so of course the lossless JXL will be larger |
|
|
hana
|
2026-02-14 03:36:06
|
oh so it didn't jpeg-transcode it |
|
|
|
ignaloidas
|
2026-02-14 03:36:28
|
not from inside the tiff, doesn't do that |
|
|
RaveSteel
|
2026-02-14 03:36:30
|
magick and vips do not support lossless transcode sadly |
|
2026-02-14 03:36:51
|
you could try to extract the JPEG from the TIFF and then transcode that |
|
|
hana
|
2026-02-14 03:37:15
|
oh interesting thx |
|
|
VcSaJen
|
2026-02-14 03:45:36
|
We need more tools for lossless transcode. One of which was published on reddit recently: https://discord.com/channels/794206087879852103/805722506517807104/1471601926884954306 . It's only for normal jpeg files, tho. |
|
|
hana
|
2026-02-14 03:50:49
|
would this work?
```
ffmpeg -i scan1.tif -c:v copy scan1.jpg
``` |
|
2026-02-14 03:51:38
|
I think it worked. It is almost the same file size
ah nvm
```
cjxl -d 0 -e 9 scan1.jpg scan1.jxl
JPEG XL encoder v0.12.0 b2edc77 [_AVX2_,SSE4,SSE2] {Clang 19.1.5}
Getting pixel data failed.
``` |
|
|
RaveSteel
|
2026-02-14 04:07:37
|
No. If you want to preserve the quality you'll have to find a way to extract the JPEG from the TIFF without reencoding the JPEG |
|
|
Quackdoc
|
2026-02-14 04:26:50
|
I use oiiotool |
|
|
Demiurge
|
2026-02-15 05:55:47
|
https://unix.stackexchange.com/questions/740074/muxing-a-jpeg-bitstream-inside-a-tiff-container |
|
2026-02-15 05:56:43
|
If a tiff contains jpeg binary data, it is tricky to extract the jpeg binary from the tiff container unless you like bit fiddling and reading bitstream spec documents |
|
2026-02-15 05:58:11
|
But if there's a way to extract the JPEG binary from the TIFF file, that would be what you would want to do first. Then convert whatever metadata to JFIF/Exif/XMP/whatever and then finally transcode that file to jxl losslessly :) |
|
2026-02-15 05:59:20
|
The hard part is extracting the JPEG binary from the tiff without decoding it into pixels |
|
2026-02-15 05:59:43
|
Most software just assumes you want to decode the JPEG data and turn it into a lossy pixmap |
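A minimal sketch of the extraction Demiurge describes, for the simplest possible case: a classic (non-Big) TIFF whose image data is one new-style JPEG strip (Compression=7). The single-strip, LONG-typed-tag handling is an assumption for illustration, not general TIFF parsing, and old-style JPEG-in-TIFF files are laid out differently:

```python
import struct

def extract_jpeg_strip(data: bytes) -> bytes:
    """Pull the raw image strip out of a classic TIFF without decoding.

    Simplifications (assumptions, not general TIFF handling): first IFD
    only, no BigTIFF, a single strip, and StripOffsets/StripByteCounts
    stored as single LONG values. With new-style JPEG-in-TIFF
    (Compression=7) the strip is a complete JPEG stream; tag 347
    (JPEGTables), if present, holds shared tables that would still need
    to be merged in for a standalone .jpg.
    """
    endian = {b'II': '<', b'MM': '>'}[data[:2]]
    magic, ifd_off = struct.unpack(endian + 'HI', data[2:8])
    assert magic == 42, 'not a classic TIFF'
    n_entries, = struct.unpack(endian + 'H', data[ifd_off:ifd_off + 2])
    tags = {}
    for i in range(n_entries):
        entry = data[ifd_off + 2 + 12 * i: ifd_off + 14 + 12 * i]
        tag, typ, count, value = struct.unpack(endian + 'HHII', entry)
        tags[tag] = (typ, count, value)

    def single_long(tag):  # 273 = StripOffsets, 279 = StripByteCounts
        typ, count, value = tags[tag]
        assert typ == 4 and count == 1, 'sketch handles one LONG-typed strip only'
        return value

    offset, length = single_long(273), single_long(279)
    return data[offset:offset + length]
```

On a real scan you would first check that tag 259 (Compression) is 7, then dump the returned bytes to a `.jpg` and feed that to cjxl's lossless JPEG recompression path.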
|
|
AccessViolation_
|
2026-02-15 11:40:18
|
I've been thinking about a lossy compression method for screen content (in general, not for JXL) that combines the color precision of a palette (no ringing or decoloration artifacts) with the spatial compression of a DCT. So for a tile like this, for example, some pixels along those shaded lines might become dark instead of light, or vice versa, when an index tips a threshold, but every pixel will still be one of those two colors |
|
2026-02-15 11:42:33
|
This could be pretty easy to implement by just compressing it with DCT like normal, and then mapping every pixel to the closest one from the accompanying signaled palette |
|
2026-02-15 11:48:39
|
I'm curious how well it'd compress compared to lossless compression of the pixels. I'm also not sure if mapping the messy pixels after IDCT to the nearest palette entries is the best approach. I feel like bits must be wasted on trying to model palette colors as continuous three-channel color values in the DCT, only to then map them back to palette indices... |
|
2026-02-15 11:54:39
|
You could also run the DCT on a single channel of integer indices (they can be mapped to be between 0 and 1 at the end if the DCT requires it, not sure). In that case, it's more likely a 1 becomes a 2 than that a 1 becomes a 3, so you'd want to sort the indices ahead of time so that colors that are close numerically are touching in the image. That way, when a pixel becomes a different color index, it just becomes part of the other shape; if it became an entirely different color from the palette, it would look much more out of place. If you only have two colors though, you wouldn't have to worry about that |
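The first variant (DCT as normal, then snap every decoded pixel to the palette) is easy to mock up. This is only an illustrative sketch: `dct_matrix`/`palette_dct_roundtrip` are made-up names, and `q` is a hypothetical flat quantizer step rather than any real codec's table:

```python
import numpy as np

def dct_matrix(n=8):
    # Orthonormal DCT-II basis: rows are frequencies, columns are samples.
    k = np.arange(n)
    d = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    d *= np.sqrt(2 / n)
    d[0] *= np.sqrt(0.5)
    return d

def palette_dct_roundtrip(tile, palette, q=8.0):
    """Lossy-compress a grayscale tile with a coarsely quantized DCT,
    then snap every decoded pixel to the nearest signaled palette entry
    (the idea described above)."""
    d = dct_matrix(tile.shape[0])
    coeffs = d @ tile @ d.T                    # forward 2-D DCT
    coeffs = np.round(coeffs / q) * q          # crude uniform quantization
    decoded = d.T @ coeffs @ d                 # inverse 2-D DCT
    # map each messy decoded pixel to the closest palette color
    idx = np.abs(decoded[..., None] - palette[None, None, :]).argmin(-1)
    return palette[idx]
```

The output is guaranteed to contain only palette colors, so the ringing that would normally smear the edges can only flip a pixel between palette entries, exactly the failure mode described in the message.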
|
|
MissBehavior
|
2026-02-16 11:03:39
|
interesting idea for sure |
|
|
Lumen
|
2026-02-20 10:14:23
|
I am having some issue with the CID22 dataset
all the cld_avif compressed versions are in format rgb48be BUT for some reason it's bgr in reality, and I have no idea why that is or how ffmpeg gets it right
because vapoursynth bestsource and ffms are completely fooled by them |
|
2026-02-20 10:14:31
|
and that's only specifically for cld_avif |
|
2026-02-20 10:14:33
|
is it normal? |
|
|
_wb_
|
2026-02-20 10:25:00
|
huh? they should be just 8-bit png files? |
|
|
Lumen
|
2026-02-20 10:36:57
|
sources are |
|
2026-02-20 10:37:03
|
not compressed cld_avif at least |
|
2026-02-20 10:37:19
|
oh wait |
|
2026-02-20 10:37:26
|
cld_heif is also 8 bit png |
|
2026-02-20 10:37:31
|
only cld_avif is rgb48be (bgr) |
|
|
_wb_
|
2026-02-20 10:39:27
|
oh interesting, you're right, for some reason those are 16-bit png files |
|
|
Lumen
|
2026-02-20 10:39:58
|
I skipped them in my current test because the bgr thing breaks a lot of the tools the metrics I am testing use |
|
2026-02-20 10:40:02
|
but I wanted to report it |
|
|
_wb_
|
2026-02-20 10:41:34
|
they're likely not _really_ 16-bit png files, in the sense that probably only 256 values per channel are actually used so you can just treat them as 8-bit images |
|
|
Lumen
|
2026-02-20 10:42:06
|
when opening them in vapoursynth, they seem to be actual 16 bits bgr |
|
|
_wb_
|
2026-02-20 10:42:13
|
those avif files were encoded using 8-bit yuv420, no idea how we ended up with 16-bit png files |
|
2026-02-20 10:42:27
|
PNG should be RGB, not BGR |
|
2026-02-20 10:42:44
|
but it is big-endian |
|
|
Lumen
|
2026-02-20 10:43:00
|
indeed, it's odd
but ffmpeg gets it right somehow |
|
2026-02-20 10:43:08
|
`rgb48be(pc, gbr/unknown/unknown)` |
|
2026-02-20 10:43:16
|
ffprobe says rgb but bgr ^^' |
|
2026-02-20 10:43:17
|
somehow |
|
2026-02-20 10:43:49
|
and vapoursynth ffms2 or bestsource do not get the bgr
I don't even know where it comes from since mediainfo doesnt show it either |
|
2026-02-20 10:44:09
|
when viewing in vapoursynth, I get blue and red channel swapped |
|
|
_wb_
|
2026-02-20 10:44:19
|
anyway you can just do something like `convert bla.png -depth 8 bla.ppm` and work with that |
|
|
Lumen
|
2026-02-20 10:44:33
|
indeed, I could do that |
|
|
_wb_
|
2026-02-20 10:45:38
|
strange that it swaps channels since PNG does not have any options, in 8-bit it's interleaved RGBRGBRGB... and in 16-bit it's RrGgBbRrGgBb... (where capital letter is msb and lowercase is lsb) |
|
|
Lumen
|
2026-02-20 10:47:04
|
wait, so this big endian should be decoded not by reversing bits, but by reversing bytes? |
|
2026-02-20 10:48:15
|
but yes, this bgr thing is very obscure to me |
|
|
_wb_
|
2026-02-20 11:20:12
|
endianness is always about byte order. How bits are ordered within a byte is generally not really defined, a byte is just a number between 0 and 255 and when doing operations like bitshift it is always conceptually big-endian (msb on the left, lsb on the right) |
|
|
VcSaJen
|
2026-02-20 11:40:25
|
Basically, in little endian, if you take the address of a variable whose value is less than 256, it doesn't matter if you read a byte, word, int32 or int64 at that address; the result will always be the same number.
In big endian you instead have the "normal" order, but you lose that convenience. |
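That property can be demonstrated without touching C pointers by slicing serialized bytes; a small sketch using `int.from_bytes` in place of differently-sized loads at the same address:

```python
# The little-endian property described above: a small value reads back
# identically no matter how wide a load you do at its address.
value = 42

for endian in ('little', 'big'):
    # serialize as a 64-bit integer, then "load" prefixes of 1/2/4/8 bytes
    raw = value.to_bytes(8, endian)
    loads = [int.from_bytes(raw[:width], endian) for width in (1, 2, 4, 8)]
    if endian == 'little':
        assert loads == [42, 42, 42, 42]   # every width sees the same number
    else:
        assert loads[0] == 0               # narrow big-endian loads see the
        assert loads[3] == 42              # zero padding, not the value
```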
|
|
Lumen
|
2026-02-20 12:00:27
|
I've been coding in C++ with logical operators and such for quite a while now
and I didn't even know that
that's shameful |
|
2026-02-20 12:00:48
|
though I never tried to cut an integer manually by address |
|
2026-02-20 12:01:20
|
I thought the machine worked in little endian (true) and that that was the way logical operations in C are done (wrong) |
|
2026-02-20 12:01:43
|
it's very disturbing |
|
|
lonjil
|
2026-02-20 12:37:48
|
nitpick: "left" and "right" are not "earlier" and "later". Endianness doesn't care how we display bytes, only which addresses have which meaning.
Almost always, people consider bytes to be little endian (that is, the least significant bit is "first", it has "address 0" within the byte, etc). This can be seen in bit by bit wire protocols like Ethernet. Network byte order is big endian, but Ethernet bit order is little endian, lsb always sent first!
Another example is hardware design in VHDL. The msb is always to the left in a bit array, but you can choose how arrays are indexed. If you say `(0 to 7)`, then the msb is bit 0, and the lsb is bit 7. But if you say `(7 downto 0)`, the lsb is bit 0, and the msb is bit 7. And you always always use the latter form, because people get really confused if you make anything but the lsb have the lowest index 😄 |
|
|
spider-mario
|
2026-02-20 01:08:09
|
by the way, you will save yourself a lot of endianness bugs if, instead of conditionally swapping bytes, you maintain a strict separation between serialized bytes (LE or BE) and integers, similar to the `str`/`bytes` divide in Python, and cross the border using (say):
```c++
// int to LE bytes
bytes[0] = integer & 0xFF;
bytes[1] = (integer >> 8) & 0xFF;
// (etc.)
// LE bytes to integer
integer = bytes[0] | (bytes[1] << 8) | (bytes[2] << 16) | (bytes[3] << 24);
```
which works independently of native endianness (edit: or would have worked if I hadn’t messed up the amounts of shifting – but at least it would have been buggy on all platforms, which would have made the bug easier to discover) |
|
2026-02-20 01:08:16
|
(see this classic if you haven’t read it: https://commandcenter.blogspot.com/2012/04/byte-order-fallacy.html ) |
|
|
lonjil
|
2026-02-20 01:33:48
|
I presume compilers are good at translating these patterns into simple uses of the CPU's native byte swapping instructions? |
|
|
spider-mario
|
2026-02-20 01:42:01
|
yes: https://godbolt.org/z/6q5Y1WsT7 (although not when I replace `uint8_t` with `char`; I guess signedness is annoying here) |
|
2026-02-20 01:43:19
|
oops, I somehow managed to use indices 1..4 instead of 0..3 for one of them |
|
2026-02-20 01:43:42
|
well, it was still optimised – it just reads from `bytes[1..]` instead of `bytes` |
|
|
Exorcist
|
2026-03-01 05:08:29
|
An old article about quantization parameter ≠ quality
https://regex.info/blog/lightroom-goodies/jpeg-quality |
|
|
juliobbv
|
2026-03-01 05:27:36
|
it's interesting to see how we all independently seem to find a contrasting pair of images that prove beyond any doubt that global AQ is necessary for image encoding |
|
2026-03-01 05:31:27
|
this my go-to pair:
https://cdn.discordapp.com/attachments/992029418644054086/1396359453590425620/mountain.png?ex=69a5c554&is=69a473d4&hm=b187f635e77ec80b5b399c60a41fa9c8518e6337e3749d30569e9672195b7476&
https://cdn.discordapp.com/attachments/992029418644054086/1396359456438095932/peacock.png?ex=69a5c555&is=69a473d5&hm=3d32e5a36d2bd319bb05aca885045715aba3342a35c7a47ccae6a53dcd45bd4d& |
|
|
jonnyawsom3
|
2026-03-01 08:03:54
|
That gave me an idea related to this https://discord.com/channels/794206087879852103/805176455658733570/1477120032080396381
Lossy encoding tends to have an exponential falloff at the bottom and top of quality ranges, so instead of obliterating dark areas and blue details, I think we could recover quite a lot with very little cost |
|
2026-03-02 04:28:57
|
Is it just me, or does the blue noise not look very... Blue? Maybe it's just because I've stared at so much of it while adding it to libjxl https://youtu.be/kT4p1GXq4HY |
|
|
AccessViolation_
|
2026-03-02 04:32:01
|
he made a second video titled "which random generator is best" after finding out the random generator they used wasn't very good
(I have actually seen videos showcasing bad random number generator libraries creating noticeable patterns in noise) |
|
|
jonnyawsom3
|
2026-03-02 04:33:10
|
Maybe it's just because he's using 4x4 'pixels' to demo it on the video, but all the examples of the blue noise dithering in action look really clumped up to me |
|
2026-03-02 05:24:23
|
This is the 'blue noise' he showed in the video, versus djxl set to 1 bit output. Matched the 'pixel' size to 2x2 which he used for that segment too |
|
2026-03-02 05:25:23
|
You can see a pattern from the 32x32 LUT we used, but it has far more detail and averages far better when downscaled |
|
2026-03-02 05:26:19
|
I love that he made a video on it, but such a shame he used broken examples |
|
2026-03-02 05:51:51
|
Sent him a message about it, we'll see if he gets back to me |
|
|
awxkee
|
2026-03-02 08:45:26
|
I've got an issue about jpegli in my repo where the user states that somewhere jpegli supports lossless JPEG re-encoding https://github.com/awxkee/jpegli-coder/issues/1 |
|
2026-03-02 08:45:41
|
But I know nothing about it |
|
2026-03-02 08:46:01
|
Is there lossless re-encoding in jpegli? |
|
|
A homosapien
|
2026-03-02 09:03:44
|
No, Jpegli is not a transcoder like jpegtran. |
|
|
jonnyawsom3
|
2026-03-02 10:05:41
|
They even say
> the default cjxl behavior
cjxl is not cjpegli.... |
|
|
DZgas Ж
|
2026-03-03 07:40:43
|
Bruh noise. So small tile of blue noise |
|
|
jonnyawsom3
|
2026-03-03 07:42:28
|
It was already a 1KiB LUT, obvious when set to 1 bit, but in a normal image it works well enough |
|
|
DZgas Ж
|
2026-03-03 07:45:20
|
There's a program for generating blue noise, including on the GPU, and including noise generation that takes the input image into account, which makes it even closer to ideal. What it normally uses is a small tile of blue noise, which limits the noise quality. It's not seamless; there will always be gaps at the joints. But you can generate noise at a size of 1280x720, which is a completely insane size; it doesn't even exist on the internet. I generated a 1024x1024 super sample in about 6 hours. It might be faster on an RTX 5090; if anyone has one, let me know. I'd be happy to create noise at 4k x 4k. And you don't even need to generate it for each frame; at that size, you can just move it around randomly. But no one has done this yet. |
|
2026-03-03 07:49:14
|
A year ago, when I was studying Minecraft maps, I noticed that 3D maps are beautiful while 2D maps are ugly. But you can get close to good quality in 2D by generating blue noise directly for the image. |
|
2026-03-03 07:50:20
|
If pre-generated blue noise files were simply available on websites, it would solve all dithering problems and completely displace every dithering method other than blue noise. |
|
2026-03-03 07:53:27
|
10 mb limit discord moment |
|
2026-03-03 07:54:57
|
1024x1024 blue noise tile |
|
2026-03-03 07:55:28
|
|
|
2026-03-03 07:56:47
|
— Original art
— 2D art without dithering
— 2D art with Floyd–Steinberg
— my Blue Noise 2D art
— 3D art without dithering
— 3D art with Floyd–Steinberg |
|
2026-03-03 07:59:31
|
Basically, blocks have colors, and you can set a color table. So, instead of wasting a lot of effort on creating 3D art in Minecraft, you can create the highest possible quality 2D flat art, which will look great thanks to blue noise. You can generate your own noise for the art, which will be perfect, but you can also use a ready-made texture; it will be just as good, better than nothing. |
|
2026-03-03 08:01:32
|
By the way, I thought the screenshot was taken incorrectly, but no, what is this? This is definitely NOT blue noise, it's NOT THAT. |
|
2026-03-03 08:03:44
|
Here's a 1-bit image I generated with blue noise using that 1024 tile |
|
2026-03-03 08:04:48
|
smaller |
|
2026-03-03 08:05:13
|
even smaller |
|
2026-03-03 08:07:03
|
blue noise generated specifically for the image, takes 1 minute to calculate 320x180 on gpu |
|
2026-03-03 08:08:26
|
This is what a sample of blue noise generated specifically for an image looks like |
|
2026-03-03 08:09:34
|
I posted my program above, but the core of the generator itself is the "Tellusim Blue Noise" program |
|
2026-03-03 08:18:34
|
just do `ts_noise.exe -size 4096 -o blue_noise_4096.png`
<:BlobYay:806132268186861619> |
|
|
Adrian The Frog
|
2026-03-03 08:48:34
|
That's clearly biased though because the dark areas should not be black |
|
2026-03-03 08:53:19
|
I think it's best to just use one of these
https://github.com/Calinou/free-blue-noise-textures
Or if you have a specific use case maybe
https://github.com/electronicarts/fastnoise |
|
|
DZgas Ж
|
2026-03-03 09:09:13
|
The truth is, there are tradeoffs when generating any blue noise, even with the highest-quality algorithm currently available. I actually figured this all out a year ago, but now I'm thinking: hmm, why not create my own super-perfect blue noise tile... |
|
2026-03-03 09:57:34
|
there is some work to do |
|
2026-03-04 04:26:50
|
nope it should be completely black |
|
2026-03-04 04:37:14
|
I spent seven hours fiddling around with it. There are other methods for obtaining it, but overall nothing works better than void and clusters, even though it's a simplification of the real problem. It works well on a pixel grid. The only problem is the sigma parameter: it's just a constant that doesn't depend on anything, just like dark matter in space. It's just like that, and everything works, so think of it as that. |
|
2026-03-04 04:39:42
|
I was also able to create something very close to blue noise by using neurons as pixels and then collapsing them into pixels with gradient descent. This also works, but there's very weak interference at low frequencies, so it's still worse. Alas. It's still funny, though; nobody's ever done this before, but the idea of using a neuron as a pixel (the cell color may be unknown) to create blue noise works surprisingly well. The cool thing is you can create gigantic blue noise textures with thousands of pixels, but the problem is it's not perfect, so it's probably not worth it. |
|
|
Demiurge
|
2026-03-04 12:33:06
|
What do you mean generated specifically for the image? I think the generic blue noise looks better actually... |
|
|
DZgas Ж
|
2026-03-04 01:48:41
|
The program adjusts blue noise to the image, its relief, and its spectrum. I've sent a sample below; you can look closely at this blue noise sample and see the sharp transitions in the image. I haven't delved into how exactly it works, but the result is different; sometimes it's a little better, but these are purely subjective. |
|
|
Demiurge
|
2026-03-04 01:50:40
|
I dunno if it's still considered blue noise if it has low frequency features and correlations with the image... |
|
2026-03-04 01:51:13
|
That sounds like a novel concept |
|
|
DZgas Ж
|
2026-03-04 01:51:19
|
I spent a lot of time coming to the conclusion that void-and-cluster is the single best algorithm for pixels, but it also has many variables, such as the sigma number, so today I will be trying out different parameters of the algorithm to understand what's going on. Perhaps I can create better blue noise iteratively. |
|
|
Demiurge
|
2026-03-04 01:52:59
|
Also I think 1024x1024 is overkill. |
|
2026-03-04 01:53:19
|
After a certain size it's big enough |
|
|
DZgas Ж
|
2026-03-04 01:53:33
|
This is an experiment anyway, no one will generate blue noise for image in real time, it's too expensive |
|
2026-03-04 01:54:30
|
Maybe this 1024 image is good, but it is definitely not perfect. I feel. |
|
|
Demiurge
|
2026-03-04 01:56:55
|
Error diffusion dithering can give slightly better results than blue noise but it depends on neighboring pixels whereas blue noise doesn't |
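The neighbor dependence Demiurge mentions is visible in the classic Floyd–Steinberg loop, where each pixel's quantization error is pushed forward onto unvisited neighbors. A minimal 1-bit sketch (serpentine scanning and value clamping deliberately omitted):

```python
import numpy as np

def floyd_steinberg_1bit(img):
    """Classic Floyd–Steinberg error diffusion to 1 bit.
    `img` is float grayscale in [0, 1]; returns an array of 0.0/1.0."""
    out = img.astype(float).copy()
    h, w = out.shape
    for y in range(h):
        for x in range(w):
            old = out[y, x]
            new = 1.0 if old >= 0.5 else 0.0   # nearest of the two levels
            out[y, x] = new
            err = old - new
            # push the quantization error onto unvisited neighbors
            if x + 1 < w:
                out[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    out[y + 1, x - 1] += err * 3 / 16
                out[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    out[y + 1, x + 1] += err * 1 / 16
    return out
```

That running error term is exactly why it can't be evaluated per-pixel in parallel the way a blue-noise threshold texture can.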
|
|
Adrian The Frog
|
2026-03-04 02:07:01
|
His shirt should not be black
It's not black in the random noise image |
|
|
jonnyawsom3
|
2026-03-04 02:11:58
|
It shouldn't be completely black |
|
|
DZgas Ж
|
2026-03-04 02:52:01
|
<@238552565619359744> <@755210713357877281> it can't be black because void-and-cluster blue noise doesn't have a low-frequency region. It can't just draw a couple of pixels there; that would ruin the essence of how it works. There are many other dithering algorithms that solve this problem, just not blue noise |
|
2026-03-04 02:57:14
|
Like Electrostatic Halftoning |
|
2026-03-04 03:01:57
|
It is precisely the presence of a dot in the middle of the black on the T-shirt, which is supposedly "gray", that is the problem with all the algorithms; a white dot on a black background looks bad, even though mathematically it conveys a slightly lighter black. |
|
2026-03-04 04:21:05
|
To be honest, I've never liked error diffusion dithering. The best existing dithering algorithm is called Serial Gaussian Blue Noise Stippling https://discord.com/channels/794206087879852103/794206087879852106/1478609643247898644 . But it has a fundamental problem: it's analog, not digital. That is, the result is circles positioned with float32 precision or even more, and it can't be pixelated. I'm currently exploring things that might be better than Void and Cluster blue noise, like Electrostatic Halftoning https://www.mia.uni-saarland.de/Research/Electrostatic_Halftoning/images.shtml or Dot-Diffused Halftoning https://arxiv.org/pdf/1508.05373 but there is simply nothing faster than using a blue noise texture to calculate dithering; if the texture is ready, it's as fast as it gets; faster is impossible. But is it possible to do better...?... |
|
2026-03-04 04:22:54
|
Since we live in a world where Bilinear has won, here's an interesting question: is it possible to create dithering that won't produce interpolation artifacts? Hmm. |
|
2026-03-04 05:34:27
|
<@238552565619359744> <@755210713357877281> You probably mentioned this, but this dithering through blue noise does not correspond to the original brightness of the image; it is a loss of contrast, loss of information |
|
2026-03-04 05:37:20
|
|
|
|
jonnyawsom3
|
2026-03-04 05:37:37
|
You don't need to ping me |
|
|
DZgas Ж
|
2026-03-04 05:37:55
|
I won't ask you for permission |
|
2026-03-04 05:43:38
|
hmm, this algorithm makes a slight increase in the brightness of the overall image, interestingly, apparently dithering can’t be done simply either <:Thonk:805904896879493180> |
|
2026-03-04 05:53:02
|
Wow I never thought I'd run into a gamma 2.2 srgb encoding problem like 1-bit dithering |
|
2026-03-04 06:07:28
|
So, I have:
- My dithering.
- Standard blue noise dithering
- Standard blue noise dithering that exactly matches the visual brightness of the original image, emitting an identical amount of light when viewed pixel-perfect. |
|
2026-03-04 06:23:12
|
I have to admit, I've been deceiving you all. The best dithering image is the result of my work on the megaGEN-bluenoise project I created a year ago, which I posted above and have posted before.
The essence of the algorithm is that I perform a ternary search for the dithering sharpness ratio, iterating through options and comparing them to the original image using Butteraugli. But I had forgotten about that. That's why it was so important for the blue noise to remove the white dots on black that other algorithms generate. I forgot that I had already solved all these problems back then. <:SadOrange:806131742636507177> |
|
2026-03-04 07:00:16
|
While I was studying all this, I couldn't shake the thought that blue noise is just some kind of noise. Trying different examples, generating my own, and trying options from the internet, I noticed that things sometimes weren't as good as I'd like. Not only does the sigma parameter differ; in general, different people generating the "same" blue noise do it differently. And clearly something better, cleaner, more elegant could be done. |
|
2026-03-04 07:03:43
|
undoubtedly, need to create my own |
|
2026-03-04 07:33:22
|
One of my key observations, while reviewing dozens of different types of noise from different authors, is the violation of the locality principle, as defined by blue noise. That is, if you cut out any part of the image, all the frequencies in it should be the same as in the other part, but this isn't always true, and it reveals a large number of artifacts in the low-frequency range.
The first is a clean 64x64 blue noise. The second and third are 64x64 crops from 128x128 images by two different authors. |
|
|
Exorcist
|
|
DZgas Ж
|
2026-03-04 07:53:28
|
<:monkaMega:809252622900789269> |
|
2026-03-04 07:55:51
|
Well you can't really convert an 8 bit image into an 8 bit image with noise... by adding noise..... |
|
|
Exorcist
|
2026-03-04 08:06:52
|
The textbook dithering for uniform quantization is:
1. add TPDF noise
2. noise shaping (also known as error diffusion)
There is no such thing as "directly adding blue noise";
[white noise becoming blue] is the result of noise shaping
---
> compare the decimal against a blue noise LUT and round up or down depending on the LUT's value
[stochastic rounding by remainder] is the same as [add 1-LSB noise then floor]
and you need to prove your "blue noise LUT" is better than standard RPDF noise |
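The bracketed equivalence above ([stochastic rounding by remainder] vs [add 1-LSB noise then floor]) is quick to check numerically; a sketch assuming uniform (RPDF) noise of one quantization step:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 255.0, size=100_000)   # signal with fractional parts
u = rng.uniform(0.0, 1.0, size=x.shape)     # 1-LSB uniform (RPDF) noise
frac = x - np.floor(x)

# stochastic rounding by remainder: round up with probability frac(x)
stochastic = np.floor(x) + (u < frac)

# add 1-LSB noise, then floor
noisy_floor = np.floor(x + u)

# both processes are unbiased: the mean of the dithered signal matches
# the input mean, which is the whole point of dithered quantization
print(x.mean(), stochastic.mean(), noisy_floor.mean())
```

Individual samples differ between the two (the noise enters with opposite sign), but the output distributions are identical, which is the sense in which the two formulations are "the same".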
|
2026-03-04 08:07:09
|
this is full copy of my text😅 |
|
|
DZgas Ж
|
2026-03-04 08:10:15
|
Surprisingly, it became less clear. Alas, I am self-taught. The more precise terminology is not entirely clear to me. |
|
2026-03-04 08:11:38
|
There is noise, a noise picture; the image is passed through that noise picture for dithering, well. |
|
2026-03-04 08:16:13
|
MPC-HC has good dithering noise when viewed in 10 bit (I have no idea which one) |
|
|
Adrian The Frog
|
2026-03-04 09:20:39
|
Make sure you remember the gamma correction for the perceptual one
I would consider blue noise by default to have a perfectly even histogram |
|
|
DZgas Ж
|
2026-03-04 09:23:03
|
In fact, this is true; this is the definition of blue noise in an image. The gamma problem is due to the input image. Because the monitors have a 2.2 sRGB gamma, the light output is uneven. https://discord.com/channels/794206087879852103/794206087879852106/1478816163713646727 |
|
2026-03-04 09:26:48
|
I solved this in an external way, simply using ternary search
via Butteraugli to find the ideal gamma ratio for the generated area |
|
2026-03-04 09:30:35
|
Each noise and image has its own result
Since the point of dithering is to preserve the original, I think this is a great approach, but it might be possible to get by with something simpler like ssim
because in this case this analysis takes about 5-10 seconds per image, not fast at all, but it looks great |
|
|
AccessViolation_
|
2026-03-04 10:13:15
|
`upsampling_mode=1`
never tried it out before; this is turning one pixel into 8x8 pixels with `--already_downsampled` |
|
2026-03-04 10:15:27
|
also some interesting blocking artifacts with the default upsampling `-1` |
|
2026-03-04 10:16:59
|
aw, nearest neighbor upsampling makes it go from 241 bytes to 662 |
|
|
DZgas Ж
|
2026-03-04 11:06:43
|
I live in 1988 |
|
2026-03-04 11:14:12
|
https://momentsingraphics.de/BlueNoise.html#BlueNoise64FFT Since the image is seamless, it analyzes it beyond the boundaries, but I think this is a mistake |
|
2026-03-04 11:40:20
|
Let me summarize the results of two days of research and videocoding.
Blue noise is one big magic number, the essence of which is that to the human eye and perception, it looks better than anything else. All generators are made of magic numbers, mostly Gaussin, but this can be done by directly analyzing the FFT spectrum with fitting, and it works. Blue noise parameters are purely subjective, which is why Void and Cluster defines sigma as 1.5, but in reality, it can be anywhere from 1.1 to 2.0, and each blue noise algorithm has its own sigma. Blue noise can be generated in many ways, but Void and Cluster use it because of its simplicity. I've done it and moved on. Quality blue noise exists, and it's true. For example, "Tellusim Blue Noise", which I used and generates large textures on the GPU, really does have the worst quality. Of all the ones I tested, I was able to generate better noise myself. The main quality criterion, of course, is FFT perfection and the brightness deviation of each pixel, taking into account their number relative to the total number of pixels. But there's a parameter that directly determines noise quality: how far apart pixels of the same color are in the image; each pixel should be as far away as possible from other pixels of the same color. It turns out that my algorithm is superior to the standard Void-and-Cluster algorithm in this regard. Although I must admit, it takes an extremely long time to run. Considering that Void-and-Cluster technology was extremely expensive in 1993, and work on this topic is completely irrelevant today, I can assume that I've created the best blue noise. The problem is that I can't find any other algorithms specifically for blue noise, because all amateurs simply implement Void-and-Cluster with different parameters. My algorithm is Void-and-Cluster without the sigma parameter. |
|
|
Adrian The Frog
|
2026-03-04 11:45:04
|
For gamma correction, you can just inverse srgb the image before you threshold it with the noise, you don't need to do anything complicated
Or pow 1/2.2 although I don't think that is what modern monitors are calibrated for, idk really |
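Adrian's "inverse-sRGB before thresholding" can be sketched directly. The piecewise sRGB decode below is the standard transfer function; `noise` stands in for whatever threshold texture (blue noise or otherwise) you use, and the function name is just for illustration:

```python
import numpy as np

def srgb_to_linear(s):
    # standard piecewise sRGB decode (EOTF), inputs in [0, 1]
    return np.where(s <= 0.04045, s / 12.92, ((s + 0.055) / 1.055) ** 2.4)

def dither_1bit_linear(img_srgb, noise):
    """Threshold in linear light, as suggested above: linearize the sRGB
    image first, so a dithered area averages to the right *emitted*
    light rather than the right code value. `noise` is any threshold
    texture in [0, 1), e.g. a tiled blue-noise LUT."""
    return (srgb_to_linear(img_srgb) > noise).astype(float)
```

With this ordering, a flat sRGB 0.5 patch dithers to about 21% white pixels (its linear-light value), not 50%, which is the brightness mismatch being discussed.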
|
|
DZgas Ж
|
2026-03-04 11:49:46
|
Gamma is a bit different problem here. The problem with gamma in a 1-bit image isn't even the actual gamma, but that the image isn't uniform. It needs to be normalized, preferably in a way that preserves as much information as possible, because the result will be a white and a black pixel and nothing more. When dithering a multicolor image, my program uses K-means to find colors for the palette, and gamma problem doesn't arise at all. |
|
|
Adrian The Frog
|
2026-03-04 11:49:53
|
https://copyparty.adrian.place/u/?doc=code.py
This is the code for the stupid text dithering thing that I did recently
(Only looks correct if your system has a font for the unicode symbols for legacy computing supplement) |
|
2026-03-04 11:51:17
|
I think the most important thing to ensure with dithering is that a large dithered area averages to exactly the same final brightness as if the whole area was showing the exact color |
|
|
DZgas Ж
|
2026-03-04 11:52:35
|
https://discord.com/channels/794206087879852103/794206087879852106/1478816163713646727 Well, it actually works, but then the image must be on your site and must be at 100% scale, otherwise the interpolation will break everything. |
|
|
Adrian The Frog
|
2026-03-04 11:53:40
|
Yes, it's very annoying that every single viewer of everything decided collectively to interpolate things wrong... |
|
2026-03-04 11:54:00
|
And now it's too late to change it I guess |
|
|
DZgas Ж
|
2026-03-04 11:54:06
|
There is no problem in creating a 1-bit image that is the same in terms of the physical light flux emitted, it's just not the best option for the eye, as my research has shown |
|
|
Adrian The Frog
|
2026-03-04 11:55:08
|
I think in general dithered images up close look better with a ton of contrast and sharpness filters first
But I wouldn't consider it as accurate
It all depends on the use case |
|
|
DZgas Ж
|
2026-03-04 11:56:17
|
If it doesn't look very good, you can preprocess the image with the "STRESS" algorithm; it's also built into GIMP under some funny name |
|
2026-03-04 11:57:50
|
And of course, following GIMP's documentation and sources, I implemented it in Python |
|
2026-03-04 11:59:36
|
radius=100, samples=3 |
|
2026-03-05 12:00:24
|
If details are more important than the actual color of the image, you can do dithering like that |
|
|
Adrian The Frog
|
2026-03-05 12:00:53
|
I remember using that when I was trying to extract images of sheet music from a picture I think |
|
|
DZgas Ж
|
2026-03-05 12:03:13
|
https://gitlab.gnome.org/GNOME/gegl/-/blob/master/operations/common/stress.c |
|
2026-03-05 12:03:59
|
https://pippin.gimp.org/publications/Kolaas_11_jist_preprint.pdf
Colors → Desaturate → Color to Gray. |
|
2026-03-05 12:23:49
|
the distance of each pixel from every other,
and of each pixel's color from pixels of the same color
1 - 470x470 https://momentsingraphics.de/BlueNoise.html#BlueNoise470
2 - 256x256 https://github.com/Atrix256/VoidAndCluster/blob/master/bluenoise256.png
3 - 64x64 https://momentsingraphics.de/BlueNoise.html#BlueNoise64Magnified
4 - 128x128 https://github.com/Calinou/free-blue-noise-textures/blob/master/128_128/HDR_LA_0.png
5 - My blue noise 128x128 |
|
2026-03-05 12:24:56
|
I feel like I've created something, but it's not 1990 anymore. |
|
|
Demiurge
|
2026-03-05 12:57:40
|
I think this might be too obvious but, hasn't anyone tried computing blue noise textures quickly by starting with the desired frequency components and doing a Fourier transform? Is it because it won't tile well? |
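The spectral approach described above can be sketched in a few lines. This is a hypothetical illustration, not anyone's production generator: high-pass-filter white noise in the frequency domain, then rank-equalize the histogram. It tiles by construction (the DFT is periodic), but the point distribution comes out less even than void-and-cluster output, which is the usual objection. The `cutoff` value is an arbitrary choice.

```python
import numpy as np

def spectral_blue_noise(n=64, cutoff=0.15, seed=0):
    """Rough blue-noise-like texture: high-pass-filter white noise in the
    frequency domain, then equalize the histogram. Tiles seamlessly because
    the DFT is periodic, but the point distribution is not as even as
    void-and-cluster output."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal((n, n))
    spectrum = np.fft.fft2(white)
    fy = np.fft.fftfreq(n)[:, None]
    fx = np.fft.fftfreq(n)[None, :]
    radius = np.hypot(fx, fy)
    spectrum *= (radius > cutoff)          # keep only high frequencies
    tex = np.real(np.fft.ifft2(spectrum))
    # Histogram equalization: rank every pixel -> uniform values in [0, 1]
    ranks = tex.ravel().argsort().argsort()
    return (ranks / (n * n - 1)).reshape(n, n)
```

The rank step is what fixes the "uneven histogram" problem the filtered noise would otherwise have: every threshold level gets exactly its share of pixels.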
|
|
DZgas Ж
|
2026-03-05 01:18:08
|
You make a logical conclusion, and I thought so too. But I literally can't find anything like it anywhere. All the existing blue noises I've found are classic void-and-cluster; on the pixel-distribution graph they have an obvious artifact at the extreme ends of the color range, and overall they are less effective in terms of average pixel distribution than my algorithm. I'll mention again that perhaps no one needs this anymore; maybe all the research ended 30 years ago and no one was interested after that. |
|
|
Demiurge
|
2026-03-05 01:19:12
|
Perhaps no one needs it? Blue noise seems to have a lot of practical uses |
|
|
DZgas Ж
|
2026-03-05 01:19:50
|
and yet, you google blue noise and you end up on all sorts of dithering sites, and any blue noise is VoidAndCluster |
|
2026-03-05 01:22:20
|
There are developments in other areas, such as GBN or BNOT algorithms, but this is not blue noise texture generation |
|
2026-03-05 02:05:55
|
stable |
|
2026-03-05 02:07:03
|
equally distributed noise of any size and with any number of VoidAndCluster starting points |
|
2026-03-05 02:24:43
|
I noticed that at certain generation parameters the grid and crystal artifacts are not filtered out at all, so I created an algorithm that shifts the noise onto itself and looks for pixel self-similarities at each shift |
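The shift-and-compare idea can be sketched like this (a guess at the approach, not the actual code from this thread): compare the noise against toroidally shifted copies of itself; dips in the difference map expose repeating grid/crystal structure.

```python
import numpy as np

def shift_similarity(noise):
    """For every toroidal shift (dy, dx), measure how different the noise
    is from a shifted copy of itself. Dips in this map reveal repeating
    grid/crystal structure; good noise has a uniformly high difference
    everywhere except the zero shift."""
    h, w = noise.shape
    diff = np.empty((h, w))
    for dy in range(h):
        for dx in range(w):
            shifted = np.roll(np.roll(noise, dy, axis=0), dx, axis=1)
            diff[dy, dx] = np.mean(np.abs(noise - shifted))
    return diff
```

This is O(N^2) per shift over N^2 shifts, so it's only practical for small tiles or sampled shifts.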
|
2026-03-05 03:26:31
|
In the original VoidAndCluster, 10% of the pixels were initially selected for initialization. I have no idea why 10% was chosen; everyone simply used it as a constant. Overall, I can see that by creating 10 tables, overlaying them, and initializing from 2 to 29 pixels, there's a subtle tendency toward worse final noise, but it's so insignificant, and yet it's there. The horizontal grid is made up of noises with almost identical parameters, which I added for quantity. This demonstrates how subtle this tendency is. |
|
2026-03-05 03:28:24
|
Most likely, few people in the 1990s thought of measuring the grid problem by overlaying noises on each other with a shift, but now I can say that 10% is not an optimal parameter; the best values are 3 to 4%. |
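For reference, here's a minimal sketch of the void-and-cluster initialization phase with the percentage under discussion as an explicit parameter. This is a textbook-style reconstruction of Ulichney's prologue step, not the code used in this thread; the sigma and iteration count are arbitrary choices.

```python
import numpy as np

def gaussian_energy(pattern, sigma=1.5):
    """Toroidal Gaussian-filtered density of the binary pattern, built in
    the frequency domain so the filter wraps seamlessly."""
    n = pattern.shape[0]
    f = np.fft.fftfreq(n)
    fy, fx = np.meshgrid(f, f, indexing="ij")
    kernel = np.exp(-2 * (np.pi * sigma) ** 2 * (fx ** 2 + fy ** 2))
    return np.real(np.fft.ifft2(np.fft.fft2(pattern) * kernel))

def initial_pattern(n=32, fraction=0.03, iters=200, seed=0):
    """Phase 1 of void-and-cluster: scatter `fraction` of the pixels at
    random, then repeatedly move the pixel in the tightest cluster into
    the largest void. `fraction` is the parameter discussed above
    (classically ~10%; defaulting here to the ~3% suggested in chat)."""
    rng = np.random.default_rng(seed)
    k = max(1, round(fraction * n * n))
    pattern = np.zeros((n, n))
    pattern.ravel()[rng.choice(n * n, size=k, replace=False)] = 1
    for _ in range(iters):
        e = gaussian_energy(pattern)
        cluster = np.unravel_index(
            np.argmax(np.where(pattern == 1, e, -np.inf)), (n, n))
        pattern[cluster] = 0
        e = gaussian_energy(pattern)
        void = np.unravel_index(
            np.argmin(np.where(pattern == 0, e, np.inf)), (n, n))
        if void == cluster:   # converged: tightest cluster is the largest void
            pattern[cluster] = 1
            break
        pattern[void] = 1
    return pattern
```

The full algorithm then ranks the remaining pixels by the same energy rule; only the seeding phase, where the percentage matters, is shown here.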
|
2026-03-05 03:30:21
|
I'm still working on refining it, and that requires generating a lot of noise, but at least it's already clear that a parameter above 10% can't be better. |
|
2026-03-05 03:56:57
|
bad noise looks interesting in nearest neighbor |
|
2026-03-05 03:58:03
|
|
|
2026-03-05 04:09:09
|
Iterative analysis showed that for 32x 64x 128x blue noise the best parameter for the number of initial pixel placements is 3% of the total number of pixels. |
|
2026-03-05 04:10:58
|
As the noise size increases, the effect of this parameter fades, but it still exists |
|
2026-03-05 04:18:32
|
|
|
|
Adrian The Frog
|
2026-03-05 05:08:46
|
That's what this was lol
https://discord.com/channels/794206087879852103/794206087879852106/1463055218609623074
Partially because I think it's hard to get an even histogram this way
I don't really know, I haven't done much noise making |
|
2026-03-05 05:09:35
|
I'm sure you can get better results than that, iirc that was with only the extremely high frequency components and I thought it looked cool |
|
2026-03-05 05:09:49
|
Might have been only the corners actually... |
|
2026-03-05 05:12:25
|
Interleaved gradient noise is also very nice for dithering imo (and you don't need a texture for it, it's insanely cheap to compute) |
|
|
DZgas Ж
|
2026-03-05 05:24:21
|
I'm still looking for the ideal noise parameter, but I'll probably provide the noise itself next day |
|
|
Demiurge
|
2026-03-05 10:56:29
|
It looks good |
|
|
jonnyawsom3
|
2026-03-05 11:26:33
|
Was checking BBC for updates on the war, and noticed an image load as blocks of pixels. For a moment I thought it was a JXL, but no.
It's a 16 MB 41 MP JPEG...
https://flo.uri.sh/visualisation/27893615/embed?auto=1 |
|
|
DZgas Ж
|
2026-03-05 12:46:29
|
So, after just 8 hours of calculations, I got the statistics: 1000 iterations for 64x64 noise and 200 iterations for 128x128, two images in total. The graphs show the distance values for each noise color (0-255), averaged |
|
2026-03-05 12:49:33
|
The statistics immediately show that, in theory, the fewer pixels initialized at the start, the better; but at 2%, artifacts appear where some pixels apparently snap to strictly defined points during initialization. You can also see pixel-grid artifacts, which introduce some inaccuracy; that's not critical, although I pick the parameter as a percentage of the pixel count. Judging by these statistics, for each size of generated noise there is a certain number of initialization points at which the probability of an extra artifact is minimal <:Thonk:805904896879493180> |
|
2026-03-05 12:52:31
|
The red color visually indicates self-similarity, that is, pixels arranged in a pattern rather than completely at random. On the left the graininess is coarser, which is bad for visual perception. You could say P is an analogue of the sigma value in void-and-cluster, but it's a different parameter, though it likewise governs the scale of the noise. |
|
2026-03-05 12:59:58
|
a very interesting artifact curve that depends on the number of pixels selected at the beginning |
|
2026-03-05 01:04:42
|
https://static.laszlokorte.de/blue-noise/ Here's a good online visual program to help you understand what "initially inserted pixels" I'm talking about |
|
2026-03-05 01:20:40
|
The most important thing is that I found the range that gives the most ideal arrangement of dots at each color level, relative to each other overall |
|
2026-03-05 01:21:40
|
All the blue noise I found had this parameter between 9 and 10 |
|
2026-03-05 01:40:20
|
All that worries me is the blue noise graph; it doesn't match any of the blue noise that void-and-cluster generates, so I don't even have anything to compare it to. It looks like half of a normal distribution |
|
2026-03-05 01:42:11
|
Any blue noise has a circle with a fairly sharp edge in its FFT spectrum; mine doesn't have that |
|
2026-03-05 01:48:57
|
The funniest thing is that Tellusim Blue Noise, which I used in my program before and which is computed on the GPU, is the worst noise of all that I found |
|
2026-03-05 01:57:10
|
If I state blue noise as a problem (every pixel of every color should be at maximum distance from other pixels of the same color), it stops being a problem about the human eye and becomes a purely mathematical one |
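That mathematical formulation can be measured directly. A sketch, assuming a grayscale noise array and toroidal wrap-around (the distance metric and thresholding convention are my assumptions, not the exact ones used in the thread):

```python
import numpy as np

def mean_nn_distance(noise, level):
    """Mean toroidal nearest-neighbour distance between the pixels that are
    'on' at a given threshold level. Higher is better: in ideal blue noise,
    every level's points are spread as far apart as possible."""
    h, w = noise.shape
    ys, xs = np.nonzero(noise <= level)
    pts = np.stack([ys, xs], axis=1).astype(float)
    if len(pts) < 2:
        return float("inf")
    # Pairwise coordinate deltas with wrap-around (torus), shape (k, k, 2)
    d = np.abs(pts[:, None, :] - pts[None, :, :])
    d = np.minimum(d, np.array([h, w]) - d)
    dist = np.hypot(d[..., 0], d[..., 1])
    np.fill_diagonal(dist, np.inf)       # ignore self-distance
    return dist.min(axis=1).mean()
```

Sweeping `level` over 0..255 and plotting the result gives per-color distance curves like the ones being compared in this thread.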
|
2026-03-05 02:13:28
|
chatgpt pompously said that my algorithm has complexity O(N^2 log N)
Well, I already created one 512x512 texture for testing, but I didn't time it. But it's generally possible. I'll make all the textures as soon as I find the perfect values |
|
2026-03-05 06:54:59
|
In general, after a huge number of tests, I found that the best percentage for the initial number of points is 2%, but this creates an unexpected artifact on the pixels of the central color, so it's better to use 2.01% (because in iterative generation the parameters unexpectedly behave like a wave, and it adds up to a central "frequency", so to speak, at exactly 2%) |
|
2026-03-05 08:25:31
|
Okay, an observation: at 2%, the FFT is more square. Which seems to be a problem... |
|
2026-03-05 08:29:44
|
It's a bit weird, on the one hand it's bad because... but wait, aren't pixels square... |
|
2026-03-05 08:33:27
|
look like Squircle |
|
2026-03-05 08:34:00
|
|
|
2026-03-05 08:35:34
|
<:monkaMega:809252622900789269> ideal form of blue noise is a squircle fft not a circle <:FeelsReadingMan:808827102278451241> |
|
2026-03-05 08:56:13
|
It seems like this really isn't a mistake |
|
2026-03-05 08:59:47
|
I used a noise-shifting algorithm and a difference search to find self-similarities and frequency duplicates. And since this algorithm is pixel-based, ideal for pixels, it seems that this really isn't a circle |
|
2026-03-05 09:45:47
|
HDR_LA_0.png
my noise
The original 1-bit was scaled to 60% with bilinear to make blue noise artifacts visible |
|
2026-03-05 09:46:50
|
original
HDR_LA_0.png blue noise
my |
|
2026-03-05 09:59:06
|
between the blurred images there is a beautiful xor |
|
2026-03-05 10:01:44
|
although the difference is even more beautiful. Here I am again in the waves and frequency spectrum |
|
|
dogelition
|
2026-03-05 10:50:55
|
found a relevant comment
```
// This is a probably-temporary internal workaround for the lack of access
// to mTransferFunction - BT2020 seems to always be used with PQ transfer
// function defined by BT2100 and SMPTE 2084, we've been making this same
// assumption on macOS for quite some time, so if it was not universally
// true, hopefully bugs would have been filed.
```
<https://searchfox.org/firefox-main/source/gfx/webrender_bindings/DCLayerTree.cpp#2671> |
|
|
juliobbv
|
2026-03-05 11:25:30
|
:ohno: |
|
|
Adrian The Frog
|
2026-03-05 11:37:05
|
You should check out that fast noise paper, they made a thing to optimize blue noise for various filters and situations, and found that different looking FFTs were produced
https://media.contentapi.ea.com/content/dam/ea/seed/presentations/seed-id3-2024-spatio-temporal-sampling-paper.pdf |
|
|
DZgas Ж
|
2026-03-05 11:55:52
|
I've seen several similar articles
https://jcgt.org/published/0014/01/08/paper.pdf
https://diglib.eg.org/items/a96087bb-abe8-4851-968c-cccc7f17e08c
They all focus on fast 3D generation that will be applied to each new image frame. And various forms of noise can algorithmically speed up generation significantly, but they won't be perfect. My algorithm is precisely "How to Create the Most Ideal Image Noise," but everything new I've found is about making noise suitable for antialiasing dynamic graphics. Because my noise is unsuitable <:banding:804346788982030337> |
|
2026-03-05 11:58:06
|
I wouldn't even bother doing this if such "ideal" noise just existed, but for some reason it doesn't <:Thonk:805904896879493180> |
|
2026-03-05 11:59:18
|
In any case, I'm almost done. |
|
2026-03-06 12:49:47
|
I've done everything and started generating the textures, it might take half a day.
In any case, I don't know where else to write or post about this, who even needs it lol. And I don't want to write an article about it. And anyway, now that I've run all the tests, seen everything in its entirety, and the math, it's even become trivial <:galaxybrain:821831336372338729> |
|
2026-03-06 12:50:58
|
can make minecraft art dithering even better yay |
|
|
monad
|
2026-03-06 03:14:28
|
> who even needs it
Maybe your result can be incorporated into libjxl, then it can be the first and only codec with ideal blue noise dithering. |
|
2026-03-06 03:28:10
|
or maybe [this guy](https://discord.com/channels/794206087879852103/794206087879852106/1478066596697346191) needs it to improve his future videos |
|
|
solomoncyj
|
2026-03-06 07:45:05
|
ello. how bad does jxl respond to noisy images, eg how muach should i denoise this image? https://www.pixiv.net/en/artworks/138942898 |
|
|
Demiurge
|
2026-03-06 07:49:23
|
jxl is surprisingly very good at dealing with noise. Although, at this early stage, the encoder does not do any noise detection or noise replacement. |
|
2026-03-06 07:49:37
|
There are many unused features that are not in any encoder. |
|
2026-03-06 07:50:54
|
But despite this, even without those features, somehow it still handles noise very well. There is more blurring than there should be. |
|
2026-03-06 07:51:27
|
But other codecs have a similar amount of blurring. |
|
2026-03-06 07:51:54
|
Once someone adds noise detection to the encoder, or makes a new encoder with that feature, then it will be even better at dealing with noise. |
|
|
Meow
|
2026-03-06 07:52:01
|
Many artists add noises to artworks for artistic reasons and reducing colour banding |
|
|
Demiurge
|
2026-03-06 07:52:33
|
Color banding is almost non-existent in JXL |
|
|
A homosapien
|
2026-03-06 07:52:44
|
Are you targeting a file size or certain quality? |
|
|
solomoncyj
|
2026-03-06 07:53:26
|
yup, and file size goes up by 20 times |
|
|
Meow
|
2026-03-06 07:54:15
|
For lossless images not that much |
|
|
solomoncyj
|
2026-03-06 07:55:58
|
im just worried about file size, thats all usually |
|
|
Meow
|
2026-03-06 07:59:36
|
I don't think other formats can handle that better |
|
|
_wb_
|
2026-03-06 07:59:51
|
Any lossless codec will have a hard time compressing noise. Unless it's not really random, noise is inherently pure entropy so not compressible. |
|
|
solomoncyj
|
2026-03-06 08:01:32
|
i mean av1 encoders usually have a denoiser built in |
|
|
_wb_
|
2026-03-06 08:01:41
|
If you like to add noise yourself to an image you authored yourself, you can use lossless compression and add generated noise to it; however lossless works in RGB and generated noise in RGB does not look as good as noise generated in XYB (the space used for lossy). |
|
2026-03-06 08:05:08
|
I consider image preprocessing (like denoise or sharpen) not something that should be in the scope of an encoder. In my view, an encoder should just aim to preserve the image as it was provided, and not make assumptions about what kind of "enhancements" would be desirable. |
|
|
Meow
|
2026-03-06 08:08:31
|
Better to use a software to denoise if you really want |
|
|
solomoncyj
|
2026-03-06 08:11:31
|
https://www.pixiv.net/en/artworks/141731973 ran this through both cjxl and cavif... cjxl only reduced from 8mb to 6, cavif dropped it all the way to 400kb... |
|
|
Fahim
|
2026-03-06 08:12:50
|
What options are you using for cjxl and avifenc? |
|
|
solomoncyj
|
2026-03-06 08:13:14
|
cavif, not avifenc, default for both |
|
|
Fahim
|
2026-03-06 08:14:18
|
As in plain `cjxl input.jpg output.jxl`? |
|
|
solomoncyj
|
|
Fahim
|
2026-03-06 08:14:30
|
cjxl does lossless JPEG transcoding by default, that'd be why |
|
2026-03-06 08:15:17
|
It's not processing the raw pixels, it's converting JPEG DCT blocks to JXL VarDCT (+ the other things involved in lossless JPEG transcoding that I don't know the details of) |
|
|
solomoncyj
|
2026-03-06 08:16:02
|
how do i force it to re-encode? |
|
|
Fahim
|
2026-03-06 08:16:12
|
`--lossless_jpeg=0` |
|
|
NovaZone
|
2026-03-06 08:16:20
|
Cause jxl does auto grain/dither xD |
|
|
Fahim
|
2026-03-06 08:17:27
|
But you should also set the quality values, and keep in mind that those numbers are **NOT** portable across codecs |
|
|
Meow
|
2026-03-06 08:17:43
|
For evaluating size differences, quality should be nearly identical |
|
|
Fahim
|
2026-03-06 08:17:44
|
which is why metrics like https://github.com/cloudinary/ssimulacra2 exist |
|
|
Demiurge
|
2026-03-06 08:20:15
|
cjxl compression is exactly reversible. You can get an exact, perfect copy of the original file with djxl file.jxl file.jpg |
|
2026-03-06 08:20:28
|
If the original file is a JPEG |
|
2026-03-06 08:21:10
|
And you're using JPEG re-compress mode (default for cjxl when input is JPEG) |
|
|
Fahim
|
2026-03-06 08:21:31
|
Also cavif seems to only use rav1e and the recent AVIF improvements are exclusive to libaom, I believe? I'm not sure |
|
|
Demiurge
|
2026-03-06 08:22:46
|
`-j 0` for short |
|
2026-03-06 08:23:08
|
`cjxl -j 0 -d 1 in.jpg lossy.jxl` |
|
2026-03-06 08:23:54
|
Don't forget the `-d 1` otherwise it will assume `-d 0` |
|
|
Fahim
|
2026-03-06 08:24:21
|
As for what that means, it's what JXL's "quality" values are actually based off of - butteraugli distance |
|
|
Demiurge
|
2026-03-06 08:24:40
|
You can also use `-q 90` instead of `-d 1` if you prefer |
|
|
Fahim
|
2026-03-06 08:24:42
|
`-q 90` is mapped to `-d 1`, so it's very high quality lossy |
|
|
Demiurge
|
2026-03-06 08:25:43
|
I don't have a pixiv account and don't know how to get the original file |
|
2026-03-06 08:26:44
|
But the libjxl encoder is tuned for photographs at the expense of other types of images. |
|
2026-03-06 08:27:03
|
So digital line art is not tuned or optimized |
|
2026-03-06 08:27:10
|
Except for lossless |
|
|
Fahim
|
2026-03-06 08:28:01
|
You don't need a pixiv account for non-NSFW, and https://github.com/qsniyg/maxurl will get you as good as you can get out of Pixiv |
|
|
Demiurge
|
2026-03-06 08:28:33
|
At this stage of libjxl development, the current version of libjxl lacks a lot of tuning and optimization for many different types of images and scenarios. |
|
2026-03-06 08:29:08
|
And it leaves a lot of features unused and a lot of untapped potential on the table. |
|
2026-03-06 08:30:24
|
Hell yeah! I was looking for something like this. I am so turned off by every website expecting me to make an account and a password just for them when I have zero intention of spending my life online or making any posts. |
|
|
solomoncyj
|
2026-03-06 08:39:31
|
yeah |
|
2026-03-06 08:40:11
|
after running ssimulacra2, getting 80 from cavif, and plugging that back into cjxl, its still 1 mb |
|
|
Demiurge
|
2026-03-06 08:41:50
|
I expect AVIF will be better at this image since libjxl currently doesn't have as much tuning for digital line art. |
|
2026-03-06 08:42:35
|
the AVIF codec is an older and more mature codebase that has had more effort put into tuning for different kinds of images, including line art. |
|
2026-03-06 08:43:27
|
As JXL encoders become more mature and receive more psychovisual tuning, this is expected to change in the future. |
|
2026-03-06 08:47:52
|
At `-d 6.4` or `-q 30` quality setting, the file size of the JXL is about 480 kb |
|
2026-03-06 08:50:41
|
But at such an extremely high compression level, there are going to be wave-like artifacts (known as "ringing") of both colors and shading if you view at full size 1:1 |
|
2026-03-06 08:52:22
|
Certain shading details will also be blurred. |
|
2026-03-06 08:59:54
|
If you add `-m 1` to the settings, then the file size is reduced slightly further, to 460 kb, and the distortion is less severe, especially less blurring. |
|
|
Froozi
|
2026-03-06 09:00:04
|
Okay, but what about "re–compiling" the image to have digital noise instead of the one "baked in"?
Lossy mode already does minute changes that can be ignored as they don't much detract from the overall quality of the image, depending on the viewer. Denoising option would then have to be included as it's a stepping stone towards further compression.
I wouldn't consider that kind of repackaging of an image to be undesirable in an encoder, as grain would still be an integral part of the finished product. Just in a different form. |
|
|
Demiurge
|
2026-03-06 09:00:06
|
At the cost of slightly slower encoding |
|
2026-03-06 09:01:07
|
Yes, this is called noise detection... It is one of the core features of the JXL format and decoder... |
|
2026-03-06 09:01:10
|
But currently there is no encoder that makes use of this feature. |
|
2026-03-06 09:02:10
|
It is one of the low-hanging fruit of untapped potential left on the table for future encoders to use. |
|
|
Froozi
|
2026-03-06 09:02:41
|
So denoising by itself would be possible… why don't we want to allow the user to denoise but not re–noise? |
|
|
Demiurge
|
2026-03-06 09:03:08
|
We do, but no one has gotten around to it yet. |
|
2026-03-06 09:03:35
|
It is untapped potential for future JXL encoders to use. |
|
2026-03-06 09:05:24
|
Currently, libjxl removes a lot of noise by accident, as an undesirable side-effect of the encoding process. It would not be that complicated to add a step to the encoder where it compares the final result to the original file, and measures how much noise was lost... so the decoder could re-add the noise back. |
|
2026-03-06 09:05:35
|
But no one has added that feature yet. |
|
2026-03-06 09:06:08
|
It is one of the unfinished parts of the encoder, at this early stage of development. |
|
2026-03-06 09:06:53
|
Making sure the decoder is 100% compliant is most important at this early, pre-release stage. |
|
2026-03-06 09:07:48
|
So the encoder is in a somewhat quick-n-dirty state |
|
2026-03-06 09:09:12
|
The encoder is already very good and competitive at most things but it lacks a lot of polish |
|
2026-03-06 09:09:55
|
Future encoders will be able to do a lot of amazing things with all of the currently-unused features that are only in the decoder. |
|
2026-03-06 09:11:25
|
Personally, I hope, as much as anyone, that the encoder will improve and become competitive with libaom for lossy compression. |
|
2026-03-06 09:12:06
|
It's important to make a good impression and "wow" people with good results early on, in my view. |
|
|
Meow
|
2026-03-06 09:45:43
|
So does cjxl have the ability to denoise (even unintentionally)? |
|
|
DZgas Ж
|
2026-03-06 09:49:22
|
wow |
|
2026-03-06 09:56:55
|
Okay, I've finished generating 1024x1024 textures. Now. I'm generating smaller sizes. |
|
|
Demiurge
|
2026-03-06 10:18:23
|
Unintentionally, yes. Like I said, it's an undesirable and unintended side effect of compression, partially due to optimizing for metrics like SSIM which give high scores to blurry splotches and washed-out details. |
|
|
Meow
|
2026-03-06 10:20:33
|
If it's controllable it could be a huge feature |
|
|
Demiurge
|
2026-03-06 10:21:17
|
I notice that adding `-m 1` to the encode settings often greatly reduces the amount of distortion and improves the quality/bitrate of libjxl compared to the default DCT-based compression mode. |
|
2026-03-06 10:21:58
|
It increases the encode time but hugely improves the final result |
|
2026-03-06 10:22:21
|
It's pretty shocking |
|
2026-03-06 10:24:58
|
I notice it's pretty consistent. The default DCT compression has very exaggerated distortion by comparison, especially destructive levels of obliterative blurring. |
|
2026-03-06 10:25:21
|
And colorful wavy ringing artifacts which are slightly reduced with -m 1 |
|
|
DZgas Ж
|
2026-03-06 10:25:32
|
It's funny to see that my 1024x1024 noise is 1,049,654 bytes as BMP and 1,050,078 bytes as PNG.
it seems it really is noise <:Stonks:806137886726553651> |
|
|
Demiurge
|
2026-03-06 10:25:34
|
Still present but not as large |
|
2026-03-06 10:26:18
|
I heard it takes days to generate 1024x1024 textures in python |
|
|
DZgas Ж
|
2026-03-06 10:27:24
|
And this is using a simpler generation algorithm. Well, I have an r5 7600 and it took 12 hours |
|
2026-03-06 10:29:09
|
it speaks for itself |
|
|
Demiurge
|
2026-03-06 10:30:13
|
Is that a GPU? |
|
|
DZgas Ж
|
2026-03-06 10:32:56
|
I'm not sure, TellusimBlueNoise does this on the GPU. |
|
|
Demiurge
|
2026-03-06 10:33:28
|
Does anyone know why DCT mode utterly destroys libjxl quality? |
|
2026-03-06 10:34:58
|
Is it because modular mode is designed to preserve more detail since it's used for the DC compression? |
|
|
DZgas Ж
|
2026-03-06 10:36:15
|
I'm too lazy to draw a proper graph, but here's a comparison of TellusimBlueNoise 1024x1024 and my noise, in terms of quality (mine is pink) |
|
|
Demiurge
|
2026-03-06 10:36:39
|
I mean an r5 7600, that's a CPU? |
|
2026-03-06 10:37:10
|
What do the axes represent? |
|
|
DZgas Ж
|
2026-03-06 10:37:47
|
the average distance of each pixel from each other |
|
2026-03-06 10:38:16
|
|
|
2026-03-06 10:38:43
|
In ideal blue noise, every pixel should be as far away from every other pixel as possible |
|
2026-03-06 10:38:58
|
and that's what I did |
|
2026-03-06 10:40:08
|
The main artifact I noticed is on a 254-255 color surface. If you dither it, it will be much more shaded than it should be ideally |
|
2026-03-06 10:47:15
|
I'm so exhausted these three days, but it was definitely very interesting
Well, the reason no one has done anything similar is most likely because the most popular blue noise algorithm came out in 1993, and everyone just used it for its speed. The original algorithm has a complexity of O(N^2), which is extremely high, 30 years ago, 20 years ago, and still now. My noise is O(N^2 log N) |
|
2026-03-06 10:48:37
|
I mean specifically that generating my perfect best 16-bit RGBA 2048x2048 blue noise is literally impossible on a regular computer right now. But the old algorithm does it, even though the noise isn't perfect |
|
2026-03-06 10:52:27
|
I don't know, if anyone has a supercomputer in their garage, it would come in handy. But then again, it doesn't solve the gaming problem. Nvidia came up with their own algorithm for gaming and frame smoothing, while mine is only good for static images, or stylized videos using one noise alone |
|
2026-03-06 10:54:34
|
Due to the homogeneity of my noise, it seems I've solved the tiling problem; the places where the tiling breaks are simply not visible within a single noise, hmmm |
|
|
Demiurge
|
2026-03-06 10:54:51
|
That's just... bayer at that point. There needs to be some amount of randomness and imperfection. |
|
|
DZgas Ж
|
2026-03-06 10:55:54
|
you are right and that is what i have been looking for for the last 2 days, an absolute(%) number of pixels for the initial generation of the image |
|
2026-03-06 10:56:16
|
and I found it was 2% +- 0.01 |
|
2026-03-06 10:57:09
|
the number of pixels in the initial image that void-and-cluster starts from, which then creates a perfectly fitting field for dithering pixels |
|
2026-03-06 11:00:16
|
But I'll rightly note that for this ideal I had to pay a price: my blue noise doesn't look like "classic normal" blue noise. That's not bad; it's just a consequence of the pixels being square. It doesn't make the noise worse; ideal blue noise simply looks like that now |
|
2026-03-06 11:02:09
|
there's no problem creating noise of the "correct" look with the same ideal distribution, but all my tests showed that the "correct" look is a mistake and was never right for a pixel raster grid |
|
2026-03-06 11:02:41
|
https://discord.com/channels/794206087879852103/794206087879852106/1479233757293576243 |
|
2026-03-06 11:04:40
|
the ideal form of noise for a raster grid turned out not to be a sigmoid |
|
2026-03-06 11:06:28
|
I'm almost ready, I'll send the generated noises within an hour |
|
|
Demiurge
|
2026-03-06 11:11:30
|
Blue noise just looks really good to our eyes. Since it doesn't obscure what's behind it. Same reason noise shaping works in audio. |
|
|
DZgas Ж
|
2026-03-06 11:14:15
|
Yes, but there's another thing that's fundamentally being glossed over: blue noise is wave-like, and placing it on a pixel grid is a problem |
|
|
Demiurge
|
2026-03-06 11:14:52
|
You mean aliasing? |
|
|
DZgas Ж
|
2026-03-06 11:15:26
|
If you take square pixels out of the equation, everything will be perfect, but as things stand there's an analog problem. |
|
2026-03-06 11:18:50
|
Along the way, I tested different noise parameters and found one artifact in my noise, barely noticeable but still present, at color 128: a problem that looks like a physics issue. It's almost unnoticeable, and the noise is still perfect, but the effect is very similar to the moiré effect in real life; it's just there, a property of light, and I ran into it while generating ideal blue noise. Funnily enough, due to the curvature of regular blue noise placement, this doesn't happen there |
|
2026-03-06 11:22:42
|
If you view it pixel by pixel, you might see it |
|
2026-03-06 11:24:59
|
At the micro level it looks like this: there is a "grid" on top, there is a "grid" on the bottom, and there is no grid in the center |
|
2026-03-06 11:28:17
|
The problem is solved by adding imprecision to the noise... um... lol. but I decided not to do that |
|
|
Demiurge
|
2026-03-06 11:55:09
|
Yeah, the noise correlates to the pixel grid too strongly because there isn't enough randomness probably |
|
2026-03-06 11:55:27
|
These visualizations are beautiful btw |
|
2026-03-06 11:55:44
|
What is this halftoning method? |
|
|
DZgas Ж
|
2026-03-06 11:57:29
|
some very ordinary one meh, I simply asked the neural network to write one that doesn't perform any gamma correction and doesn't threshold for extreme black and extreme white. This method is purely for testing |
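A plain test halftone like that is just per-pixel thresholding against the tiled noise. A minimal sketch, assuming float images in [0, 1] and deliberately no gamma handling (matching the test method described above):

```python
import numpy as np

def ordered_dither(image, noise):
    """1-bit ordered dithering with an arbitrary threshold texture: tile
    the noise over the image and compare per pixel. `image` and `noise`
    are float arrays in [0, 1]; no gamma correction is applied."""
    h, w = image.shape
    nh, nw = noise.shape
    ty = np.arange(h) % nh
    tx = np.arange(w) % nw
    threshold = noise[np.ix_(ty, tx)]    # noise tiled to image size
    return (image > threshold).astype(np.uint8)
```

Any square noise texture, including the ones being generated in this thread, can be dropped in as `noise`.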
|
2026-03-06 11:58:18
|
It looks good but the true color range is not preserved |
|
2026-03-06 11:59:05
|
By the way, I don’t know any programs where you can just upload your noise to test dithering, maybe you can tell me |
|
|
Demiurge
|
2026-03-06 12:06:15
|
Yeah, there's one, written in Go on GitHub I think, that lets you use a custom ordered dither matrix |
|
|
jonnyawsom3
|
2026-03-06 12:09:01
|
Well, there is `--noise 1`, but it doesn't subtract noise from the original image, just makes a guess and adds it back after the lossy encoding |
|
|
Demiurge
|
2026-03-06 12:09:48
|
Is that new? |
|
|
jonnyawsom3
|
|
Demiurge
|
2026-03-06 12:10:13
|
It's in the latest release? |
|
|
jonnyawsom3
|
|
Demiurge
|
2026-03-06 12:11:24
|
https://github.com/makew0rld/didder |
|
|
jonnyawsom3
|
2026-03-06 12:11:57
|
2023 https://github.com/libjxl/libjxl/pull/2621 |
|
|
DZgas Ж
|
2026-03-06 12:16:43
|
decorated |
|
|
jonnyawsom3
|
2026-03-06 12:17:48
|
Photon noise is a decode-side feature that uses a LUT to map brightness to noise intensity. It works on a per-pixel basis, so it can't reproduce film grain or chroma noise, but it allows the lossy encoder to quantize away the high-frequency sensor noise from cameras and add a fake version on top. If the quality is high enough to preserve the original noise, it will try to do that instead |
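As a rough illustration of the idea (not libjxl's actual implementation, which works in its own color space and noise model), a brightness-to-sigma LUT could be applied on the decode side like this:

```python
import numpy as np

def apply_noise_lut(image, lut, seed=0):
    """Illustrative sketch of a decode-side noise feature: a small LUT
    maps pixel brightness to a noise standard deviation, and per-pixel
    Gaussian noise of that strength is added back."""
    rng = np.random.default_rng(seed)
    # Interpolate the LUT over brightness in [0, 1]
    xs = np.linspace(0.0, 1.0, len(lut))
    sigma = np.interp(image, xs, lut)
    noisy = image + rng.standard_normal(image.shape) * sigma
    return np.clip(noisy, 0.0, 1.0)
```

The point of the LUT is that sensor noise varies with brightness, so a handful of table entries is enough to approximate its shape.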
|
|
Demiurge
|
2026-03-06 12:20:37
|
Hmm hmm |
|
2026-03-06 12:21:08
|
I had no idea there was a `--noise` option |
|
2026-03-06 12:21:20
|
Sounds like a pretty big deal |
|
|
DZgas Ж
|
2026-03-06 12:21:53
|
I also noticed that less tiling helps with the moire effect, surprisingly |
|
2026-03-06 12:25:15
|
I cooked |
|
2026-03-06 12:26:13
|
I don't want to deal with this crap anymore I'll go play balatro have fun yourselves |
|
2026-03-06 12:34:07
|
|
|
|
monad
|
2026-03-06 01:17:31
|
amazing! |
|
|
DZgas Ж
|
|
A homosapien
|
2026-03-06 07:39:03
|
I got it integrated with libjxl, initial testing looks promising. |
|
2026-03-06 07:55:31
|
Okay, metrics seem to agree with my eyes. This blue noise is better than the old one. |
|
2026-03-06 07:57:13
|
Also I just found out I can get away with a slightly smaller LUT, the padding was a bit excessive and I can trim it down with no downsides. |
|
|
monad
|
2026-03-06 08:38:23
|
of course it's better, it's perfect best blue noise. if metrics say otherwise, they are wrong |
|
|
A homosapien
|
2026-03-07 12:40:01
|
Hmm, there are some minor tiling issues, I have some ideas to mitigate that. Otherwise it's a straight up improvement. |
|
2026-03-07 12:41:45
|
I'll make a PR later today if I have the time |
|
2026-03-07 12:43:43
|
It's wild, this is the third version of blue noise in libjxl. It will be the bluest ever! |
|
|
Demiurge
|
2026-03-07 05:16:56
|
Tiling issues? Sounds like a problem with the generation |
|
|
monad
|
2026-03-07 09:53:22
|
nope, try it yourself |
|
|
DZgas Ж
|
2026-03-07 12:36:04
|
i spilled my perfect best blue noise |
|
2026-03-07 12:40:49
|
There's a problem with the metrics. According to my observations, when encoding 1-bit dither, butteraugli shows quality that's approximately 1% worse than regular blue noise. But the value floats between +1.5% and -1.5% across different noise samples, which means these values are extremely inaccurate on such a noisy surface when using a subjective metric like butteraugli. Perhaps more objective metrics are needed to compare dithering properly.
Instead of generating new noise from scratch every time, you can take my ready-made ones and shift them by 1 pixel to get a new sample for dithering, for testing |
|
2026-03-07 12:42:26
|
the noise is a seamless tile, so you can shift it anywhere as long as pixels wrap around to the other side |
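With NumPy that shift is just `np.roll`, which wraps pixels around the edges, so any shifted copy of a seamless tile is an equally valid dither sample:

```python
import numpy as np

def shifted_sample(noise, dy, dx):
    """Toroidal shift of a seamless noise tile: pixels pushed off one edge
    re-enter on the opposite side, so the histogram and point distribution
    are unchanged."""
    return np.roll(np.roll(noise, dy, axis=0), dx, axis=1)
```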
|
2026-03-07 12:45:56
|
There are no tiling problems in the generation of my noise |
|
2026-03-07 12:46:56
|
Even better, due to the extreme uniformity, if you use a small tile size, it will be much less noticeable than regular blue noise. |
|
|
Demiurge
|
2026-03-07 12:53:32
|
I love your quirky sense of humor |
|
|
Adrian The Frog
|
2026-03-07 01:37:25
|
https://github.com/IntelLabs/cgvqm maybe? |
|
2026-03-07 01:38:35
|
Meant for video but I would be surprised if it can't do images |
|
|
DZgas Ж
|
2026-03-07 01:46:17
|
for GIFs I think, but not otherwise |
|
2026-03-07 01:59:08
|
I want to know: is it a problem that my noise is grayscale and not RGB? Does this have a big impact on anything? Was the previous noise the same? |
|
2026-03-07 02:01:05
|
I was just thinking: there's YUV and RGB, but JPEG XL uses XYB, and here I don't quite understand what the requirements for the noise should be |
|
2026-03-07 02:30:25
|
```
didder_1.3.0_windows_64-bit.exe -i test.png -o output_bw.png --palette "black white" odm PBBN_256.json
didder_1.3.0_windows_64-bit.exe -i test.png -o output_16color.png --palette "black white red lime blue yellow cyan magenta silver gray maroon olive green purple teal navy" odm PBBN_256.json
didder_1.3.0_windows_64-bit.exe -i test.png -o output_16gray.png --palette "0 17 34 51 68 85 102 119 136 153 170 187 204 221 238 255" odm PBBN_256.json
```
nuh uh I didn't figure it out. |
|
2026-03-07 02:40:58
|
There's a `--strength 15%` parameter that reduces the strength, but I still don't understand why black isn't black. Something's clearly wrong here, hmmm |
|
2026-03-07 02:43:19
|
making blue noise dithering for anything other than 1-bit images suddenly turned out to be a non-obvious task |
|
|
jonnyawsom3
|
2026-03-07 03:19:58
|
In my testing I'm seeing more repetition and a loss of detail, but it probably depends on the image and amount of dithering required |
|
|
DZgas Ж
|
2026-03-07 03:23:03
|
What do you use for dithering? |
|
|
jonnyawsom3
|
2026-03-07 03:23:39
|
I'm using the current djxl dithering against the build he made with your noise instead |
|
|
DZgas Ж
|
2026-03-07 03:25:02
|
All I see are absolutely disgusting dithering programs that can't do anything. So I sat down to create my own special dithering method, and so far it looks pretty good |
|
2026-03-07 03:26:19
|
5 colors test |
|
2026-03-07 04:00:04
|
meh
Why doesn't anyone do this anymore? Are all dither developers lazy? |
|
2026-03-07 04:00:43
|
just 16 classic colors |
|
2026-03-07 04:03:21
|
For the last 4 days that I've been working with dithering, including the days of developing blue noise, I've seen something like this |
|
2026-03-07 04:04:41
|
Does anyone know what this is? In general, it looks ridiculous |
|
2026-03-07 04:12:20
|
Well, it seems that besides the Perfect noise and Perfect dithering, all that's left is to make the Perfect choice of colors for the palette |
|
2026-03-07 04:18:24
|
gamma 2.2
inverse gamma 0.45
PaintDotNet has a gamma parameter for Gaussian blur, so I can quickly check whether a given dithering works correctly <:banding:804346788982030337>; a bilinear downscale also works |
|
2026-03-07 04:27:23
|
"amount of dithering" - this sounds like something that shouldn't exist, doesn't it? Do you know which specific file is responsible for implementing dithering? Could you please send a link to it? |
|
|
jonnyawsom3
|
2026-03-07 04:28:48
|
I meant how quantized the pixels are, since less precision will require more dithering to give a pleasing result <https://github.com/libjxl/libjxl/blob/main/lib/jxl/render_pipeline/stage_write.cc#L63> |
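For readers: the mechanism linked above (quantize after adding a per-pixel threshold from a small tiled LUT) can be sketched outside libjxl roughly like this; the function and parameter names are illustrative, not libjxl's.

```python
import numpy as np

def ordered_dither_quantize(img, lut, levels):
    """Quantize a float image in [0, 1] to `levels` values, breaking up
    banding by adding a tiled threshold offset before rounding.

    img: 2D float array in [0, 1]
    lut: 2D float array of thresholds in [0, 1) (e.g. a 32x32 blue-noise tile)
    """
    h, w = img.shape
    th, tw = lut.shape
    # Tile the LUT across the image (toroidal indexing).
    thresholds = lut[np.arange(h)[:, None] % th, np.arange(w)[None, :] % tw]
    scaled = img * (levels - 1)
    # A pixel rounds up wherever its fractional part exceeds the local threshold.
    quantized = np.floor(scaled + thresholds)
    return np.clip(quantized, 0, levels - 1) / (levels - 1)
```

With fewer levels (coarser quantization), the threshold pattern decides far more pixels, which is why the noise quality matters most at low bit depths.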
|
|
DZgas Ж
|
2026-03-07 04:30:21
|
hmmmmmmmm it's quite small |
|
2026-03-07 04:31:36
|
Does it need noise of exactly this 32x32 size? |
|
|
jonnyawsom3
|
2026-03-07 04:32:49
|
No, but we don't want the LUT to be too large for performance (and code readability) reasons |
|
|
DZgas Ж
|
2026-03-07 04:33:50
|
You know, sufficiently small noise tiles vary much more randomly between generations. I could probably generate 100,000 32x32 noises, since they're microscopic, and then find the best one using my own metrics |
|
2026-03-07 04:36:25
|
To be honest, I didn't even think such a small picture size would be in demand. That's why for the original I just chose the best of the 50 that I manually reviewed based on my metrics |
|
2026-03-07 04:39:20
|
I also see that float values are used directly. I can generate them that way, without the 8-bit PNG |
|
2026-03-07 04:51:35
|
I haven't started testing yet, but here's a float32 version of the 32x32 noise |
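A sketch of how an 8-bit rank PNG maps to float thresholds like those mentioned above. The half-step centering is a common convention for threshold tables, an assumption here rather than the exact libjxl mapping.

```python
import numpy as np

def ranks_to_float_thresholds(ranks8):
    """Convert an 8-bit blue-noise rank image (values 0..255) into float32
    thresholds in (0, 1), centering each level inside its quantization bin
    so no threshold is exactly 0 or 1."""
    return (ranks8.astype(np.float32) + 0.5) / np.float32(256.0)
```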
|
2026-03-07 05:00:20
|
There is something beautiful about the fact that no matter the bit depth of the final quantization, the spacing and number of points will be almost exactly the same at each color level |
|
2026-03-07 05:01:58
|
|
|
2026-03-07 05:07:08
|
I hmmm <:Thonk:805904896879493180> thought that since only 2 percent of the pixels are involved in the initialization, that would only be 21 pixels. Why not generate all possible combinations of initial states? Oh ..., it's on a 32x32 image, so whatever... |
|
2026-03-07 05:19:01
|
I haven't used this analysis method before, but when working with 32x32 I can analyze the distance from each pixel to every other pixel, that's 1024*1024 checks |
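That all-pairs check, with toroidal wrap, is only about a million distance evaluations, small enough to do exhaustively; a sketch:

```python
import numpy as np

def pairwise_toroidal_distances(size=32):
    """Distance from every pixel to every other pixel of a size x size tile,
    measured on the torus. Returns an (N, N) matrix with N = size * size.
    """
    ys, xs = np.mgrid[0:size, 0:size]
    pts = np.stack([xs.ravel(), ys.ravel()], axis=1)   # (N, 2) coordinates
    d = np.abs(pts[:, None, :] - pts[None, :, :])      # (N, N, 2) per-axis gaps
    d = np.minimum(d, size - d)                        # wrap-around distance
    return np.sqrt((d ** 2).sum(axis=-1))

dists = pairwise_toroidal_distances(32)
```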
|
2026-03-07 05:28:08
|
I'll probably also release it as a 16-bit PNG |
|
2026-03-07 05:42:45
|
|
|
2026-03-07 05:49:04
|
Looking at the distance from each pixel to every other pixel, it becomes clearer that the generation of blue noise, plus the fact that it sits on a 32 by 32 grid, imposes its own noise on the noise <:KekDog:805390049033191445> |
|
2026-03-07 06:07:15
|
I also noticed this asymmetry. It's an inaccuracy caused by the pixel grid (the value above is also a metric), and it occurs at the most extreme values: for example, the distance between pure white and pure black may not exist. This doesn't interfere with the definition of blue noise; on a larger grid the effect will dissolve |
|
2026-03-07 06:25:16
|
there is a good candidate |
|
2026-03-07 06:32:37
|
I will check 100 thousand candidates, it will take about 3 hours |
|
2026-03-07 07:25:46
|
Yep standard 1D R G B A dithering, I have nothing to offer here, since it is important to maintain speed, not quality, in this case |
|
2026-03-07 07:33:52
|
https://discord.com/channels/794206087879852103/794206087879852106/1479871428471029961 here I'm using a rather complex probability function on a 3D RGB cube, using blue noise as the probability for each pixel. I'm not entirely sure there's a name for it, I just figured out how it should work and vibecoded it. It looks perfect. |
|
2026-03-07 10:24:08
|
The grid-metric algorithm I used turned out to be useless for finding noise that doesn't have autocorrelation problems from tiling, but I'm working on it |
|
2026-03-07 11:44:56
|
Okay, there's good news and bad news. The good news is that I created a special algorithm that can find the blue noise tiles that look as seamless as possible out of all the existing ones. The bad news is that 100,000 variants seem a bit low, so I'll continue in 10 hours when I've generated another 300,000 <:JXL:805850130203934781> |
|
|
Meow
|
2026-03-08 04:39:03
|
I actually added some noises on my avatar |
|
2026-03-08 04:39:52
|
Of course file size would significantly increase |
|
|
DZgas Ж
|
2026-03-08 12:18:56
|
Well I found my best 32x32 |
|
2026-03-08 12:26:30
|
a complete absence of noticeable "lines", out of 400 thousand candidates |
|
2026-03-08 12:28:38
|
<@238552565619359744><@207980494892040194> |
|
|
Adrian The Frog
|
2026-03-08 06:43:37
|
It seems like there are probably some interesting things that could be done with RGB blue noise for 1-bit dithering: you could try to optimize for maximal smoothness of luminance, for example, since that probably matters more than chroma
or maybe the perceptual metrics could find some optimal balance, idk |
|
2026-03-08 06:46:11
|
having correct luminance and even just naively dithered chrominance looks pretty good
(image from https://en.wikipedia.org/wiki/Spatiotemporal_reservoir_resampling) |
|
2026-03-08 06:46:43
|
(idk if that's actually perfect luminance but that's what it looks like) |
|
|
DZgas Ж
|
2026-03-08 07:18:26
|
Well, it doesn't look very good, but as far as I know these are the basics of RTX rendering and, in general, of optimized rendering in projects and games: everything directly related to frame motion and 3D rendering <:banding:804346788982030337> <:Thonk:805904896879493180> |
|
2026-03-08 07:19:57
|
It's a good idea to use blue noise to select which pixels get rays rendered |
|
|
VcSaJen
|
2026-03-09 05:22:53
|
Moiré seems to be the problem |
|
|
Dunda
|
2026-03-09 07:28:07
|
Unfortunately, that will happen regardless of the pattern |
|
2026-03-09 07:29:13
|
However an image viewed at native resolution, or a dither shader applied per-pixel, will not suffer moiré |
|
2026-03-09 07:33:08
|
I'm not sure how you're generating your images, but you may have better luck if you treated your image like a toroidal surface during generation such that the borders of the image are in no way distinct to the center. This would actively prevent seams instead of getting lucky from guess-and-check |
|
2026-03-09 07:34:44
|
I.e. for a torus you wouldn't only check a distance in the current space, but also check the distance for the points mapped to each bordering repeated tile, then take the minimum |
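That minimum-over-neighboring-tiles rule reduces to a per-axis wrap-around, e.g.:

```python
def toroidal_distance_sq(p, q, width, height):
    """Squared distance between two points on a torus of size width x height.

    For each axis, take the shorter of the direct distance and the
    wrap-around distance through the tile border.
    """
    dx = abs(p[0] - q[0])
    dy = abs(p[1] - q[1])
    dx = min(dx, width - dx)
    dy = min(dy, height - dy)
    return dx * dx + dy * dy
```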
|
|
DZgas Ж
|
2026-03-09 07:57:55
|
Thanks for the advice, but I'm already doing that; it has been the standard for generating blue noise since 1993. Moreover, without it, it's impossible to generate blue noise at all <:SadCheems:890866831047417898> |
|
|
Dunda
|
2026-03-09 08:00:31
|
If you mean to say you are using void and cluster, toroidal distance is compatible |
|
2026-03-09 08:02:27
|
This demo page about void and cluster blue noise even explicitly mentions toroidal surfaces to make the textures tileable
https://blog.demofox.org/2019/06/25/generating-blue-noise-textures-with-void-and-cluster/ |
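For reference, the void-and-cluster algorithm discussed here can be sketched compactly. This is a minimal, unoptimized NumPy version with toroidal Gaussian filtering via FFT; parameter names and defaults are illustrative, not taken from the linked post or libjxl.

```python
import numpy as np

def _energy(mask, sigma):
    """Toroidal Gaussian-filtered density of a binary mask (FFT convolution)."""
    n = mask.shape[0]
    ax = np.minimum(np.arange(n), n - np.arange(n))  # wrap-around axis distance
    kernel = np.exp(-(ax[:, None] ** 2 + ax[None, :] ** 2) / (2.0 * sigma ** 2))
    return np.real(np.fft.ifft2(np.fft.fft2(mask.astype(float)) * np.fft.fft2(kernel)))

def void_and_cluster(n=16, frac=0.1, sigma=1.5, seed=0):
    """Return an n x n array of dither ranks 0..n*n-1 (minimal form)."""
    rng = np.random.default_rng(seed)
    mask = np.zeros((n, n), dtype=bool)
    mask.ravel()[rng.choice(n * n, max(1, int(frac * n * n)), replace=False)] = True

    # Relax the initial pattern: move the tightest cluster into the largest
    # void until the pixel we remove is the one we would re-insert.
    for _ in range(n * n):
        e = _energy(mask, sigma)
        cluster = np.unravel_index(np.where(mask, e, -np.inf).argmax(), mask.shape)
        mask[cluster] = False
        e = _energy(mask, sigma)
        void = np.unravel_index(np.where(mask, np.inf, e).argmin(), mask.shape)
        mask[void] = True
        if void == cluster:
            break

    ranks = np.full((n, n), -1)
    ones = int(mask.sum())

    # Phase 1: repeatedly remove the tightest cluster, ranking downward.
    work = mask.copy()
    for r in range(ones - 1, -1, -1):
        e = _energy(work, sigma)
        p = np.unravel_index(np.where(work, e, -np.inf).argmax(), work.shape)
        work[p] = False
        ranks[p] = r

    # Phase 2: repeatedly fill the largest void, ranking upward.
    work = mask.copy()
    for r in range(ones, n * n):
        e = _energy(work, sigma)
        p = np.unravel_index(np.where(work, np.inf, e).argmin(), work.shape)
        work[p] = True
        ranks[p] = r
    return ranks
```

The toroidal kernel construction is what makes the result seamlessly tileable, which is the point being discussed.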
|
2026-03-09 08:03:16
|
Oh excuse me I missed that you said you're already using it |
|
|
DZgas Ж
|
2026-03-09 08:03:53
|
I'm using a modified version of void and cluster, my own development, but based on the old one. It's a toroidal surface, but I call it a seamless texture. Of course I use it, otherwise it would be mathematically impossible for each pixel to have an equal number of neighbors |
|
2026-03-09 08:07:45
|
This is a problem for 1 bit. And in a world where bilinear interpolation has won, if you use more colors and bicubic or better, then there is no problem |
|
|
Dunda
|
2026-03-09 08:16:39
|
Unfortunately it's not just 1 bit, we still use mipmaps in 3D graphics after all. For example this 8-bit tiled blue noise has some Moiré with non-integer scaling |
|
2026-03-09 08:18:14
|
1-bit is just very noticeable because there is so much contrast |
|
|
A homosapien
|
2026-03-09 09:35:20
|
Moiré is just unavoidable regardless of the dithering pattern |
|
|
DZgas Ж
|
2026-03-09 10:28:11
|
Larger sizes of noise textures are available here https://discord.com/channels/794206087879852103/794206087879852106/1479454816991182930 |
|
2026-03-09 10:29:08
|
At least it reduces the tiling problem, 32x32 is too small for me, but I was interested in finding the best one that I could |
|
|
Dunda
|
2026-03-09 10:31:41
|
It actually looks fairly different, your blue noise is indeed very regular, with much less large-scale structure |
|
|
DZgas Ж
|
2026-03-09 10:31:42
|
This is not entirely true in general, but for blue noise it is true |
|
2026-03-09 10:35:47
|
At 1024x tiling, I noticed a moiré effect in the noise itself, which is likely caused by some physical effect that becomes noticeable at this size. Therefore, I would recommend using 256 or 512 tiling when testing in pure color. This specific issue with 1024 tiling requires separate study, but I don't want to deal with that anymore; I use 256 for my images and tests. |
|
2026-03-09 10:37:58
|
What I mean is that the noise is so precisely placed that it causes an interference pattern in certain gray colors, which is a funny and unexpected effect at PBBN_1024.png |
|
|
Dunda
|
2026-03-09 10:41:58
|
Some ridges seem to show up around little cells as you say, what a peculiar effect |
|
2026-03-09 10:43:59
|
A snippet of PBBN_1024 at 89% zoom |
|
|
DZgas Ж
|
2026-03-09 10:44:50
|
This image is completely unrelated to my actual blue noise. But I got it when I was trying to create a blue noise generation algorithm using neural networks. It was almost successful, but I wasn't happy with the quality, and this image appeared when the neural network overfitted |
|
|
Dunda
|
2026-03-09 10:46:07
|
It's almost like Perlin noise with impulse noise atop it |
|
|
DZgas Ж
|
2026-03-09 10:51:29
|
Given the infinite number of possible blue noise variations, there's likely some as-yet-undiscovered algorithm that could generate ideal blue noise, just as ideal for solving interpolation problems. But there are no researchers in this field. |
|
2026-03-09 10:52:25
|
All I can do is generate 400,000 noises and poke at each one to check how good they are at it, and it still takes days of calculations |
|
|
Dunda
|
2026-03-09 10:52:51
|
I suppose most people consider there to be not much left to do with blue noise |
|
|
DZgas Ж
|
2026-03-09 10:53:42
|
Most people have been using void-and-cluster from 1993 for 30 years now, and everything is fine |
|
|
Dunda
|
2026-03-09 10:53:43
|
But there's a certain qualitative difference here that suggests there could be another angle to look at the problem through that is even smoother, only incidentally generated by void and cluster |
|
|
DZgas Ж
|
2026-03-09 10:54:49
|
Unfortunately, there's no way to create an initial grid other than generating a couple percent of random pixels and arranging them correctly, which is what I do. Any other initial grid produces a perfect Bayer grid instead of noise. |
|
|
Dunda
|
2026-03-09 10:56:30
|
Interesting how that works out |
|
2026-03-09 10:57:06
|
I am curious to see how perturbing one point in an initial grid that makes bayer would turn out |
|
2026-03-09 10:57:32
|
Perhaps there would be a cool effect of some chaos emerging within an ordered grid |
|
|
DZgas Ж
|
2026-03-09 10:57:42
|
like https://discord.com/channels/794206087879852103/794206087879852106/1479427896136302592 |
|
|
Dunda
|
2026-03-09 10:59:25
|
I mean taking a grid that would make bayer, slightly randomising one or a few points, and then continuing with the regular full algorithm |
|
|
DZgas Ж
|
2026-03-09 10:59:35
|
In fact, I described every discovery in great detail when creating my blue noise, you can start reading from here https://discord.com/channels/794206087879852103/794206087879852106/1478477243025199245 or at least scroll through, maybe some pictures will catch your eye |
|
|
Dunda
|
2026-03-09 10:59:40
|
Not for practical purposes, just because it might look interesting |
|
2026-03-09 11:00:54
|
I suspected it might have been related to that Numberphile dither video; his "blue noise" looked really terrible and some people were likely bothered by that |
|
|
DZgas Ж
|
2026-03-09 11:03:17
|
Okay, let me satisfy your interest in about a couple of minutes |
|
2026-03-09 11:04:54
|
Since I created my own noise, I have a debug program for checking all the options for the number of points and the attraction strength of each pixel during generation in my void-and-cluster algorithm. |
|
|
Dunda
|
2026-03-09 11:05:01
|
Thanks, hopefully it is interesting |
|
2026-03-09 11:06:18
|
This is interesting, how did you influence edges being formed in this noise? |
|
|
DZgas Ж
|
2026-03-09 11:07:57
|
Tellusim Blue Noise does this, but I don't care anymore, my noise turned out to be even better than the one generated specifically for the image |
|
|
Dunda
|
2026-03-09 11:11:07
|
Thanks for mentioning these other dithering types, I think your showing of SGBNS is especially beautiful |
|
|
DZgas Ж
|
2026-03-09 11:12:44
|
Due to the 10MB limit on Discord, I cut out a small, uninteresting part |
|
2026-03-09 11:14:59
|
Don't pay attention to the noisiness of the red, I needed more than 1000 generations for each parameter to draw conclusions after averaging them all |
|
2026-03-09 11:15:13
|
like |
|
|
Dunda
|
2026-03-09 11:16:07
|
Sorry, but what are the two axes? |
|
2026-03-09 11:17:33
|
The very left has an interesting effect, it is like bayer with region boundaries. Almost like a crystal structure |
|
|
DZgas Ж
|
2026-03-09 11:18:34
|
From top to bottom: the number of pixels at initialization, as a percentage, S=N//2 (2 percent of all pixels at initialization). Classic void and cluster uses 10%, but I found that this is not the optimal value for a raster pixel grid.
From left to right: the force by which pixels are attracted to each other, similar to sigma in void and cluster. However, I completely removed the sigma value from the algorithm; for me this is P, Power (absolute strength). 1.0 means everyone is attracted to everyone else equally, regardless of distance. This turned out to be the ideal final parameter. |
|
2026-03-09 11:21:07
|
Another discovery was that when you downscale any noise pattern, mine or others', with the nearest-neighbor method, you can see the pattern. This is one of the interesting observations: if you take any image with 1-bit dithering and downscale it using nearest-neighbor, you will also see the pattern created by the noise. |
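Why that happens: nearest-neighbor downscaling is plain subsampling, so the dither's high-frequency energy aliases into visible low-frequency structure. A sketch:

```python
import numpy as np

def nearest_downscale(img, factor):
    """Nearest-neighbor downscale by an integer factor: plain subsampling.

    Keeping only every `factor`-th pixel aliases a dither pattern's
    high-frequency energy down into visible low-frequency structure,
    which is why the hidden pattern becomes easy to see.
    """
    return img[::factor, ::factor]
```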
|
|
Dunda
|
2026-03-09 11:21:43
|
It seems like higher powers cause more low frequency signals, but you found 1.0 to be most ideal? |
|
2026-03-09 11:22:59
|
This is a very pretty pattern, it's like crystals again |
|
2026-03-09 11:23:04
|
Perhaps like bismuth |
|
|
DZgas Ж
|
2026-03-09 11:23:25
|
Yes, unfortunately, this is exactly what our eyes see. Despite the best metrics, this noise looks terrible when dithered, even though it has all the other ideal parameters. You can actually check this for yourself. |
|
|
Dunda
|
2026-03-09 11:25:47
|
I don't have any tools to formally check the noise, but it does seem to cluster similar to white noise |
|
|
DZgas Ж
|
2026-03-09 11:26:57
|
well noticed |
|
2026-03-09 11:27:45
|
I hadn't thought about it, but it seems to be true. After all, white noise has minimal possible artifacts during interpolation and scaling, and zero self-similarity. |
|
|
Dunda
|
2026-03-09 11:28:05
|
Well, actually comparing to white noise, the blue noise still seems more even despite clusters |
|