JPEG XL

on-topic

Whatever else

DZgas Ж
2026-03-09 11:29:08 There's something interesting about blue noise being the perfect middle ground between a Bayer grid and white noise
Dunda
2026-03-09 11:29:50 What does the frequency plot of bayer actually look like?
DZgas Ж
2026-03-09 11:31:24
2026-03-09 11:34:01 Even minimal attraction between the points yields almost perfect noise, precisely because of the number of starting points. In other words, that answers the question of what would happen if we placed only the starting points and chose all the other points at random.
Dunda
2026-03-09 11:34:39 How strange that the FFT also resembles Bayer
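The Bayer frequency plot asked about above can be generated directly. A minimal Python sketch, using the standard recursive construction of the ordered-dither matrix (names are illustrative):

```python
import numpy as np

def bayer(n):
    """Standard recursive 2^n x 2^n Bayer (ordered-dither) matrix."""
    m = np.array([[0, 2], [3, 1]])
    for _ in range(n - 1):
        m = np.block([[4 * m,     4 * m + 2],
                      [4 * m + 3, 4 * m + 1]])
    return m

# The 2D FFT magnitude of a Bayer matrix concentrates energy at a few
# sparse, regular peaks (a lattice), unlike blue noise's diffuse
# high-frequency ring or white noise's flat spectrum.
spectrum = np.abs(np.fft.fft2(bayer(3) / 64.0))
```

Plotting `np.fft.fftshift(spectrum)` makes the lattice of peaks easy to see next to a blue-noise spectrum.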
2026-03-09 11:41:43 Sorry to have given you "advice" earlier, you clearly know very well what you are developing and I misinterpreted what you meant by "lines" as tiling borders
DZgas Ж
2026-03-09 11:50:12 I don't really understand what I'm doing, since I work from practice, not theory. There are probably dozens of specific names and designations for all of this. I don't care about any of that; I create things based on testing, not by thinking about how they theoretically should be. I'm not a theoretician, and besides, I'm self-taught. BUT this field, the creation of blue noise <:galaxybrain:821831336372338729> , is so small that no one even knows what to tell me, and no one knows anything deeper than the basics themselves hehe
Dunda
2026-03-09 11:51:38 I am strangely pulled in two directions, half the time I am reading theory, and the other half I am forgetting theory when I want to make a project
DZgas Ж
2026-03-09 11:54:42 Well, I've been doing *vibecoding* for four years now. I have zero programming skills, but I've developed a strong ability to describe a task to a neural network accurately, so you could say I'm a professional pro(mpt)grammer. Every program I've made over the past four years was created exactly like this. All the programs for testing and generation I often create from scratch in Python. No problem, it's fast and interesting.
Dunda
2026-03-09 11:55:48 Ah, I'm not a fan of vibe coding, but at least you are going in an interesting direction
DZgas Ж
2026-03-09 11:57:35 I can understand. It's ultimately associated with bad code, and that's absolutely true. Vibecoding is as difficult as programming, but programming specifically doesn't come to me. I've been looking at code for 10 years now and I see nothing; I just look and my head is empty. Unfortunately, it's something personal.
Elias
2026-03-09 01:43:17 I know the correct answer to the question is probably "it depends on the image", but roughly speaking, what technique improves compression the most compared to JPEG? the variable DCT, predictions or ANS? Or something else entirely? Or is there an easy way to test it for a dataset?
DZgas Ж
2026-03-09 01:44:01 Are you asking about jpeg xl?
Elias
2026-03-09 01:44:12 yes
DZgas Ж
2026-03-09 01:46:18 Well, the basis is the smart estimation of importance and the division into varDCT blocks, plus better compression of the data itself within the DCT blocks. The second is the XYB color space instead of YUV. But that's very brief, considering all the differences from JPEG. I think it's better to read this: https://docs.google.com/presentation/d/1LlmUR0Uoh4dgT3DjanLjhlXrk_5W2nJBDqDAMbhe8v8/edit#slide=id.gae1d3c10a0_0_17
2026-03-09 01:52:00
Exorcist
2026-03-09 01:57:33 > improves compression the most
intra prediction
_wb_
2026-03-09 02:56:27 Assuming you're talking about lossy JPEG and lossy JXL: in ballpark numbers, I'd say better entropy coding gives ~20% improvement (not just ANS but also context modeling and the numzero approach for encoding HF and the modular LF encoding), varblocks with adaptive quantization gives another 15% maybe, XYB and chroma from luma give a few percent each, say together 5-10%, and then gaborish/EPF also help a bit.
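As a back-of-envelope check, those ballpark figures can be composed multiplicatively, under the loose assumption that they act as independent size reductions (in practice the gains interact, so this only bounds a rough total):

```python
# Hedged back-of-envelope from the ballpark numbers above.
# Gaborish/EPF ("help a bit") are omitted; the 5-10% range for
# XYB + chroma-from-luma is taken at its midpoint.
reductions = {
    "entropy coding (ANS, context modeling, numzero, modular LF)": 0.20,
    "varblocks + adaptive quantization":                           0.15,
    "XYB + chroma-from-luma":                                      0.075,
}

remaining = 1.0
for r in reductions.values():
    remaining *= 1.0 - r  # each step keeps (1 - r) of the bytes

print(f"combined size: {remaining:.1%} of the JPEG baseline")
```

This lands at roughly 63% of the original size, i.e. in the vicinity of the commonly quoted ~30-40% lossy savings of JXL over JPEG.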
Elias
2026-03-09 03:05:35 yes exactly, thank you that helps a lot!
ziemek.z
2026-03-09 09:31:13 Thanks <@532010383041363969> for merging my (and couple of others') PRs so Knusperli can be built again with modern Bazel 🙂
2026-03-09 09:32:33 > MP3 Pro Max
Will somebody tell him that mp3PRO already exists? 😉 https://www.rarewares.org/rrw/ctmp3pro.php https://web.archive.org/web/20080713034003/http://www.mp3prozone.com/basics.htm
2026-03-10 12:47:22 https://github.com/libjxl/libjxl/issues/477
Jyrki Alakuijala
2026-03-10 01:00:05 ((I wrote a relatively nice two-pass dithering method around 1997 but didn't open-source it; I proposed it to HP for their printers, but they didn't want it))
2026-03-10 01:01:43 makes me wonder what the JPEG XL noise generation algorithm (4-connected Laplacians) looks like in FFT space
DZgas Ж
2026-03-10 01:01:56 not related to blue noise ?
2026-03-10 01:03:43 Well, it's most likely random. It's just that blue noise is the most non-random noise; it can't be generated from a single pixel, and the algorithm is extremely complex. Therefore, it's used only as a ready-made tileable texture
2026-03-10 01:07:00 Using my blue noise texture as a random generator, I created this complex algorithm that projects each image pixel onto a three-dimensional rhombus inside the RGB cube and selects the closest color based on the blue noise value. It's an extremely complex algorithm, and I'm starting to doubt anyone has ever done anything like this. For some reason, I can't find it... have I really invented something else? https://discord.com/channels/794206087879852103/794206087879852106/1479871428471029961 Can you tell me if this already existed, and where?
2026-03-10 01:08:15 I found the simplest three-color algorithm, similar R/G/B 3-color dithering, but I didn't find anything like it for a palette, for a completely custom color palette
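For reference, the common baseline for dithering to an arbitrary palette with a tiled noise texture can be sketched as below. This is *not* the rhombus-projection algorithm described above (which isn't public); it's the simple perturb-then-quantize approach, and all names are illustrative:

```python
import numpy as np

def dither_to_palette(img, palette, noise, strength=32.0):
    """Hedged sketch: perturb each pixel with a tiled, zero-centered
    noise value (e.g. a blue-noise texture), then snap to the nearest
    palette color by Euclidean distance in RGB."""
    h, w, _ = img.shape
    # tile the noise texture over the image and center it around 0
    reps = (h // noise.shape[0] + 1, w // noise.shape[1] + 1)
    tiled = np.tile(noise, reps)[:h, :w, None] - 0.5
    perturbed = img.astype(np.float64) + strength * tiled
    # distance from every pixel to every palette entry -> (h, w, P)
    dist = np.linalg.norm(
        perturbed[:, :, None, :] - palette[None, None, :, :], axis=-1)
    return palette[np.argmin(dist, axis=-1)]
```

With a good blue-noise texture in `noise`, the quantization error is pushed into high frequencies; with white noise the result is grainy, which is exactly the difference being discussed in this thread.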
Dunda
2026-03-10 01:24:07 It's hard to tell what exactly you mean
DZgas Ж
2026-03-10 01:28:09 I mean dithering algorithms that can do this quality https://discord.com/channels/794206087879852103/794206087879852106/1479871428471029961 using 16 colors
Exorcist
2026-03-10 01:28:37 Where is the source image?
DZgas Ж
2026-03-10 01:29:21 https://discord.com/channels/794206087879852103/794206087879852106/1479457050835685397 part at the bottom of the image
2026-03-10 01:33:57 16 colors [0, 0, 0], [255, 0, 0], [0, 255, 0], [0, 0, 255], [255, 255, 0], [255, 0, 255], [0, 255, 255], [192, 192, 192], [128, 128, 128], [128, 0, 0], [0, 128, 0], [0, 0, 128], [128, 128, 0], [128, 0, 128], [0, 128, 128], [255, 255, 255]
2026-03-10 01:34:45 full image dither
2026-03-10 01:36:31 PBBN_256.png which I used for dithering here https://discord.com/channels/794206087879852103/794206087879852106/1479454816991182930
A homosapien
2026-03-10 10:15:29 https://github.com/libjxl/libjxl/pull/4669
monad
2026-03-11 03:22:12 > clamped
Is it actually clamped, or is this misstated?
A homosapien
2026-03-11 03:45:54 Would "scaled" be a more appropriate term? Only ±0.5 got changed to ±0.49219
monad
2026-03-11 03:58:39 Sounds like it is not misstated then, but clamping seems a blunt approach rather than scaling all values to accommodate the range limit.
A homosapien
2026-03-11 04:24:23 I'll scale the values when I'm not afk
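The two options under discussion can be sketched side by side. This is an illustration only, not libjxl's actual code; 0.49219 is the limit mentioned above:

```python
import numpy as np

LIMIT = 0.49219  # the bound from the PR discussion

def clamp_offsets(noise):
    # blunt: everything beyond the limit collapses onto +-LIMIT,
    # flattening the tails of the dither distribution
    return np.clip(noise, -LIMIT, LIMIT)

def scale_offsets(noise):
    # gentler: shrink the whole +-0.5 range uniformly, so the
    # relative ordering of all offsets is preserved
    return noise * (LIMIT / 0.5)

offsets = np.linspace(-0.5, 0.5, 11)
# either way, |offset| <= LIMIT < 0.5, so adding an offset to an
# integer and rounding can never reach the next integer
assert np.abs(clamp_offsets(offsets)).max() <= LIMIT
assert np.abs(scale_offsets(offsets)).max() <= LIMIT
```

The trade-off monad is pointing at: clamping maps distinct extreme values onto the same output, while scaling keeps them distinct at the cost of slightly compressing the whole range.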
spider-mario
2026-03-11 09:27:06 https://www.keithcirkel.co.uk/whats-my-jnd/
Tirr
2026-03-11 09:44:29 I got 0.0054
2026-03-11 09:45:47 the whole experience is something like "I can clearly see it" and "I looked at it for a very long time and got it wrong"
_wb_
2026-03-11 09:46:01 nice, I only got here
2026-03-11 09:46:18 I mostly noticed my screen is dirty, I should clean it and retry 🙂
Tirr
2026-03-11 09:47:54 I'm on a somewhat clean iPhone 17
veluca
2026-03-11 10:14:54
2026-03-11 10:15:17 (not sure if my display is properly calibrated though, using a AW3423DW)
lonjil
2026-03-11 10:15:18 I wonder how different my results might be between my budget screen from 2010 vs my midrange screen from 2020.
veluca
2026-03-11 10:16:48 wonder if this is what spending years staring at compressed images does to you 😄
Lucas Chollet
2026-03-11 10:18:16 This has to be dependent on your screen and ambient luminosity too
lonjil
2026-03-11 10:18:26 it says that the colors will get closer and closer but that's clearly not true in a perceptual sense
veluca
2026-03-11 10:18:58 yeah not all the color pairs with the same deltaE are visually equally apart
2026-03-11 10:22:10 yeah got 0.0047 on my phone, I suspect the desktop screen oversaturates a bit (or possibly it's just randomness)
lonjil
2026-03-11 10:33:01
Fahim
2026-03-11 10:35:47 I struggled a lot with the magentas and bright reds (Gigabyte M32U, should've probably switched to the sRGB mode lol)
2026-03-11 10:42:13 0.0042 on my phone, similar struggles but with touches registering quite far off alongside
spider-mario
2026-03-11 10:44:51 managed to optimise the viewing conditions to
2026-03-11 10:46:59 (link: https://www.keithcirkel.co.uk/whats-my-jnd/?r=AHokKP___O_3) (dark room, maximum screen brightness, no calibration, assigned a P3 profile to a roughly P3 screen, occasionally wiggled the window to make sure I wasn’t mistaking a screen non-uniformity for the actual transition)
DZgas Ж
2026-03-11 11:48:24 A strange but funny test. My 20-year-old IPS monitor did well: frosted glass, sun shining on the wall, low contrast. 0.0115
2026-03-11 11:49:53 Oh yea
2026-03-11 11:50:52 Well, either I just have good eyesight, or... I still don't understand what this test measures: my eyes or my monitor...
2026-03-11 11:58:31 I think the old description was more intuitive, because the phrase "to prevent rounding" begs the question of why the values aren't 0.49999 instead
veluca
2026-03-11 12:22:08 What's My JND? 0.0021 Can you beat it? https://www.keithcirkel.co.uk/whats-my-jnd/ (This on the phone)
_wb_
2026-03-11 01:05:11 What's My JND? 0.0033 Can you beat it? https://www.keithcirkel.co.uk/whats-my-jnd/?r=AUkgKP__8v4u On my phone
Kleis Auke
2026-03-11 01:58:35 What's My JND? 0.0037 Can you beat it? https://www.keithcirkel.co.uk/whats-my-jnd/?r=AXMgKP__8fTP (with my old SDR (Dell UltraSharp U2414H) monitor, I should probably test this on my phone as well)
jonnyawsom3
2026-03-11 02:17:53 I discovered 2 things. The first is that my desaturation testing basically trained me for this. The other is that my decade-old phone just doesn't have a display capable of showing all the colors; a few times it had 'noise' because the range wasn't high enough. What's My JND? 0.0115 Can you beat it? <https://www.keithcirkel.co.uk/whats-my-jnd/?r=BHofKP_7_Dh_>
Meow
2026-03-11 02:30:38 What's My JND? 0.0032 Can you beat it? https://www.keithcirkel.co.uk/whats-my-jnd/?r=AT0iKP__v8-P
2026-03-11 02:30:52 Late at night, exhausted
awxkee
2026-03-11 03:27:12 What's My JND? 0.0029 Can you beat it? https://www.keithcirkel.co.uk/whats-my-jnd/?r=ASEjKP__33-n
2026-03-11 03:28:00 It's fairly complicated on my MacBook
AccessViolation_
2026-03-11 04:22:20 ([Q1572121](<https://www.wikidata.org/entity/Q1572121>)) image file format ([Q138645418](<https://www.wikidata.org/entity/Q138645418>)) general-purpose image coding system `general-purpose image coding system` **is subclass of** `image file format`
2026-03-11 04:22:58 It's unfortunate there's no distinction between *file* formats and formats in general
2026-03-11 04:23:56 e.g. JPEG XL defines the file format and the bitstream, but they're not one and the same
2026-03-11 04:36:48 it seems like generalizing data formats or coding systems to all be file formats means things now can't be classified properly. I'm not about to change up a bunch of entities but it irks me a little
Kampidh
2026-03-11 07:55:49 https://www.keithcirkel.co.uk/whats-my-jnd/?r=AMUiKP___3Gf LG 27GL850, clamped to sRGB @ 130 cd/m2
DZgas Ж
2026-03-11 08:13:15 0.0070 at night
monad
2026-03-11 08:22:24 What's My JND? 0.0027 Can you beat it? <https://www.keithcirkel.co.uk/whats-my-jnd/?r=AQ0jKP__33f5> XB273K GP in bright room
2026-03-11 09:23:42 What's My JND? 0.00032 Can you beat it? <https://www.keithcirkel.co.uk/whats-my-jnd/?r=ACAkKP____o_> Second attempt, tryharding in a dark room
Adrian The Frog
2026-03-11 10:27:18 .0034 on my Oneplus 12, .0039 on cheap gaming IPS monitor
A homosapien
2026-03-11 10:32:17 Done
2026-03-11 10:32:44 I just tried it and a bunch of tests fail; 0.49999 still rounds to 0.5, which causes issues. I chose 0.49219 since it was also the limit of the older dithering implementation. It passes all tests, which is good enough for me.
lonjil
2026-03-11 11:00:17 I did my previous one when the sun was blasting this room. Tried again now with just regular indoor lighting and got 0.0021. Both times on my cheap Lenovo monitor.
A homosapien
2026-03-11 11:01:57 I initially got 0.0018. The more I retake the test, the worse my scores become, I think my cones are getting fatigued.
jonnyawsom3
2026-03-11 11:06:05 Did better on my desktop
AccessViolation_
2026-03-11 11:47:38 What's My JND? 0.0029 Can you beat it? https://www.keithcirkel.co.uk/whats-my-jnd/?r=ASEhKP__-181
2026-03-11 11:56:08 second time, I assume higher is better because I took more time to allow my eyes to adjust and look around carefully before assuming I couldn't see the difference
2026-03-11 11:56:44 if lower is better then I don't have an explanation
ignaloidas
2026-03-12 12:03:33 Mostly struggled with bright colors, the darker ones were all basically free (in a dark room, mid-tier LG monitor)
Adrian The Frog
2026-03-12 02:54:31 lower is better. I had a decent amount of variance between attempts though
2026-03-12 03:11:40 got a new best, .0025 on my ips monitor after cleaning it
AccessViolation_
2026-03-12 09:05:12 there's no way any of these results here are comparable btw xD
2026-03-12 09:05:33 unless people here have the same display or viewing conditions
DZgas Ж
2026-03-12 09:22:16 Literally everyone has normal monitors and normal vision, wow, that's a test <:PepeOK:805388754545934396>
monad
2026-03-12 09:24:10 what if we're all the same person
DZgas Ж
2026-03-12 09:24:55 <:monkaMega:809252622900789269> I get it, we all just have pixels on our monitors, so the tests are similar
2026-03-12 09:27:37 I have the worst results, but they're so microscopic that it's not even worth it. My monitor is from 2009, it's 17 years old, IPS VGA, I bought it for literally $10
2026-03-12 09:28:07 Probably, to do poorly on the test you need to have a CRT monitor or just a terribly calibrated monitor
AccessViolation_
2026-03-12 10:00:24 careful, you're going to be invited to a philosophy podcast if you keep this up <:KekDog:805390049033191445>
Jyrki Alakuijala
2026-03-12 02:01:00 oklab color model doesn't have a gamma compression bias, i.e., no linear range for small values, so it will be very confused there
ignaloidas
2026-03-12 02:09:53 tbh it wasn't even that dark for the easy ones, I'd say anything around 60% or below (oklab) lightness was easy
Jyrki Alakuijala
2026-03-12 03:07:02 in dark values oklab is totally off
2026-03-12 03:07:28 in bright values it is a log (human eye) vs. cubic root (oklab)
2026-03-12 03:07:47 in JPEG XL we also approximate the log as a cubic root, but at least it is biased for the low values
2026-03-12 03:08:44 Butteraugli XYB color compression is log, JPEG XL XYB cbrt, SSIMULACRA 2 cbrt
2026-03-12 03:11:04 I think SSIMULACRA cbrt is also biased, but never looked if it is really the case
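The difference between a pure cube root and a biased one shows up directly in the slope at black. A numerical illustration (the bias value here is illustrative, not the exact XYB constant):

```python
import numpy as np

def cbrt_pure(x):
    # oklab-style: plain cube root; the slope blows up as x -> 0,
    # which is why it "is totally off" in the dark values
    return np.cbrt(x)

def cbrt_biased(x, bias=0.0038):  # bias magnitude is an assumption
    # XYB-style: add a small bias before the cube root, giving a
    # finite slope at zero (closer to a linear segment for darks)
    return np.cbrt(x + bias) - np.cbrt(bias)

eps = 1e-6
slope_pure = (cbrt_pure(eps) - cbrt_pure(0.0)) / eps
slope_biased = (cbrt_biased(eps) - cbrt_biased(0.0)) / eps
# slope_pure is orders of magnitude larger than slope_biased,
# so tiny dark differences get wildly exaggerated without a bias
```

The same shape argument applies to any cbrt-based metric evaluated near black, which is the caveat being raised about using such models to judge this JND test's darkest pairs.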
Adrian The Frog
2026-03-12 03:18:56 On the 3 very different screens I've tried in very different viewing conditions, I'm relatively consistently getting .0045-.0025
AccessViolation_
2026-03-12 03:19:03 this incidentally could also be a pretty good test of how well image encoders are able to represent different but similar colors at different quality settings
Jyrki Alakuijala
2026-03-13 09:49:51 I got 0.0021 JND (usual office viewing environment)
Froozi
2026-03-13 09:51:43 I've managed 0.0057 on an incredibly old monitor that has a few dead pixels and two dead LEDs on the back panel. But I doubt I'd manage a better score on a proper modern display.
Jyrki Alakuijala
2026-03-13 09:52:40 if you move the window around the screen, you can escape some unevenness in the monitor that you cannot escape by just moving your eyes
Dunda
2026-03-13 11:07:21 I got about 0.0038 which is a little disappointing
Meow
2026-03-14 12:18:29 Maybe it means you should buy a better display
DZgas Ж
2026-03-14 12:36:58 lol
HCrikki
2026-03-15 04:30:44 dng sdk received an update for security/other fixes and jxl library update (1.7.1 Build 2502, March 10, 2026)
2026-03-15 04:31:24 anyone checked how meaningful the change is?
RaveSteel
2026-03-15 07:19:12 No, but apparently there is now a license file included. Seems the Immich devs have contacted a lawyer to check for compatibility with FOSS licenses
Kleis Auke
2026-03-15 11:17:46 It looks like the "Adobe DNG SDK License Agreement" is not [OSI-approved](https://opensource.org/licenses), so it is unlikely to be included in Immich (though IANAL).
spider-mario
2026-03-15 01:53:42 ``` Example used in this integration: - File: `camera_raw/libjxl/client_projects/mac/Configs/jxl.xcconfig` - Added: `HWY_DISABLED_TARGETS=25952256` - Purpose: disable problematic SVE targets in this environment. ``` curious way to write `(HWY_SVE|HWY_SVE2|HWY_SVE_256|HWY_SVE2_128)`
2026-03-15 01:54:06 or even 0x18C0000
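The two spellings of the mask can be checked mechanically; which Highway target each set bit corresponds to is spider-mario's reading above, taken here as an assumption:

```python
mask = 25952256
assert mask == 0x18C0000  # same number, just more readable in hex

# decompose into individual target bits
set_bits = [i for i in range(32) if (mask >> i) & 1]
print(set_bits)  # [18, 19, 23, 24] -- four targets disabled,
# matching the four SVE variants named above (assumed mapping)
```

Writing the xcconfig value in hex (or as an OR of the `HWY_SVE*` macros) would make the intent self-documenting instead of requiring this kind of reverse engineering.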
jonnyawsom3
2026-03-15 08:21:38 Library update will just be 0.11.2, so nothing new
HCrikki
2026-03-15 08:59:25 really? i thought it shipped some old slow libjxl
Froozi
2026-03-15 09:20:47 Noo, me want 0.12… <:FeelsSadMan:808221433243107338>
Ryoumiya
2026-03-16 08:48:03 Hello everyone, I was recommended to share this here... as part of SIGBOVIK, a few friends and I decided to yeet a bunch of model weights into image compression (lossy and lossless) algorithms, and found that JXL performed pretty well for small models (0.15 MiB). Please focus on the top left part around 0.15 MiB, which is lossless: JXL achieves a small 0.02 MiB drop from the baseline.
spider-mario
2026-03-17 01:09:10 there is now apparently a “hard mode” https://www.keithcirkel.co.uk/whats-my-jnd-hard/
lonjil
2026-03-17 02:27:22 0.0058 Lack of gradient made it harder, but having differences in two axes made it easier to not get fooled by monitor weirdness.
Weaselslider
2026-03-17 06:33:08 a fun side effect of this is finding that 100% quality mode is consistently worse than lossless mode lol
HCrikki
2026-03-17 06:39:04 jpg is vague. could be handy comparing mozjpeg to jpegli
Weaselslider
2026-03-17 06:41:15 it's whatever PIL has packaged in it
Fahim
2026-03-17 06:41:31 Should be plain libjpeg-turbo at best
jonnyawsom3
2026-03-17 07:21:11 One day we'll get into Blender...
RaveSteel
2026-03-17 08:03:36 JXL payload in EXR when? /s
_wb_
2026-03-17 08:13:09 TBH I would prefer it if JXL wouldn't be used as a payload codec inside application-specific formats, but specific applications would just use JXL files with their application-specific stuff in extra boxes. So instead of JXL in DICOM, or JXL in DNG, or JXL in GeoTIFF, etc, you would just have valid JXL files (even though you could give them a different filename extension if you like) with some extra box(es) in them with all the stuff needed for DICOM/DNG/GeoTIFF/etc. That way any viewer including browsers could just render the image natively with zero effort.
Exorcist
2026-03-17 08:14:23 > with their application-specific stuff in extra boxes
The same mistake as APNG 🤮
2026-03-17 08:16:20 Even today, many viewers still show only the first frame (and this is conformant)
spider-mario
2026-03-17 10:55:29 if the alternative is no support at all (like MNG), though?
Demiurge
2026-03-18 12:06:17 People will do what they want. Not everyone is smart enough to ask for consultation...
2026-03-18 12:17:43 Sometimes you have to reach out, send an email to the right person, and they're usually appreciative of the advice.
Mine18
2026-03-20 04:34:44 0.0189
AccessViolation_
2026-03-20 07:48:25
2026-03-20 07:53:19 I bet I'll do much better when I do it on my phone which can be set to stretch the sRGB range into amoled wide gamut
lonjil
2026-03-20 07:59:18 isn't that cheating?
AccessViolation_
2026-03-20 08:19:25 certainly, I wouldn't submit it
2026-03-20 08:28:35
DZgas Ж
2026-03-21 09:00:57 hm
A homosapien
2026-03-21 09:03:15 Yes?
DZgas Ж
2026-03-21 09:16:15 I just think it's a funny, long delay, given that not a single line of executable code that could need checking has been changed, nothing of importance
KonPet
2026-03-26 02:24:26 https://booth.pm/ja/items/5245660 lmao
2026-03-26 02:24:55 it's almost like the jpegxl equivalent of llvm.moe
hana
2026-03-26 07:54:10 bruh
Meow
2026-03-26 10:12:12 We need JXLchan
jonnyawsom3
2026-03-30 03:01:38 <@1156997134445461574> How's the mobile photo upload quality been going by the way? Last time you said you started measuring quality with PSNR and SSIM and saw room for improvement. Currently (I think) Discord lowers quality based on resolution, but I'd use a fixed quality and lower the resolution instead. Then modern 50MP phones aren't uploading worse photos than a 1MP desktop resize Also if you hadn't seen, JXL support is now behind a flag in Chrome, with progressive loading implemented in Canary At some point you might even be able to replace the blurhashes and WebP previews with progressive JXL thumbnails instead (Quick comparison against an existing WebP preview from Discord's CDN. 31 KB of the WebP vs 4 KB, 11 KB and 31 KB of the same JXL)