
jxl

Anything JPEG XL related

dogelition
2026-01-20 06:51:23 my jxr->png converter, reshade and skiv all just use bt.2100 pq and nobody's ever complained about that choice afaik
Quackdoc
2026-01-20 06:51:27 it will also mean I need to implement a few functions I'm too lazy to do lol
dogelition
2026-01-20 06:51:38 i don't think those wasted bits have a big impact on compression but i could be wrong
Quackdoc
2026-01-20 06:52:23 the other issue with PQ is that it's actually kinda annoying to work with, as it's entirely reference-based vs HLG which "just works"
dogelition
2026-01-20 06:52:46 well the data you're working with is already entirely reference-based, just linear and with different primaries
Quackdoc
2026-01-20 06:54:53 also the last thing is djxl tonemapping just seems to do better when fed HLG content vs pq
dogelition
2026-01-20 06:56:42 is that with the target_intensity set properly on the pq image? i would have assumed it does the same then
Quackdoc
2026-01-20 06:59:01 I think I set it yeah
dogelition
2026-01-20 07:00:40 can you send the jxl files and the commands that you're using?
Quackdoc
2026-01-20 07:01:59 it's been a while but I'll look for them
jonnyawsom3
2026-01-20 08:05:10 The API is already meant to output sRGB when the requested format is unsigned ints, so I think it'd make sense to rescale it then. Otherwise, it says keep it in Linear, where out of range is valid <https://github.com/libjxl/libjxl/blob/53042ec537712e0df08709524f4df097d42174bc/lib/include/jxl/codestream_header.h#L164>
ignaloidas
2026-01-20 09:39:33 I'm honestly a bit confused by the units used. Isn't nits an "emitted light per unit *area*" measurement? Because the sun is far away you'd get a very different measure; shouldn't you instead be looking at "how much light from a certain solid angle" for a comparison with displays?
Traneptora
2026-01-20 09:40:30 nits are actually both
spider-mario
2026-01-20 09:40:44 as you get further away, you also get more area in a given solid angle, so it cancels out
Traneptora
2026-01-20 09:41:29 lumens is total quantity of light; lux is lumens per square meter; candela is lumens per steradian; nits is lumens per square meter per steradian
spider-mario
2026-01-20 09:42:56 think about taking a photo of your monitor displaying a white rectangle: how bright it is doesn’t depend on how close or how far you are
Traneptora
2026-01-20 09:44:04 more specifically, the sun is very far but it emits a constant amount of light per steradian regardless of your distance
2026-01-20 09:44:42 however you are correct that the apparent surface brightness of the sun in nits depends on our distance from it
spider-mario
2026-01-20 09:45:13 (does it?)
Traneptora
2026-01-20 09:45:43 yea, if you are farther from the sun, one square meter has fewer candelas
spider-mario
2026-01-20 09:46:03 but you get more square metres
Traneptora
2026-01-20 09:46:34 not really, that's luminous intensity
2026-01-20 09:47:33 luminous intensity doesn't depend on distance, but luminance does
2026-01-20 09:48:29 since luminous intensity of the sun is how much light you have per angle
ignaloidas
2026-01-20 09:50:06 this is very annoying to wrap your head around but I guess I get the point
Traneptora
2026-01-20 09:50:37 It doesn't particularly matter the difference between Lux and Nits though in the context of a monitor
2026-01-20 09:50:57 they're effectively just one another with a constant factor
spider-mario
2026-01-20 09:52:11 are you not thinking of illuminance?
Traneptora
2026-01-20 09:53:47 luminance is nits isn't it
2026-01-20 09:54:10 illuminance is lux
2026-01-20 09:54:21 the terms definitely don't help
spider-mario
2026-01-20 09:55:07 as far as I can find, illuminance depends on distance, luminance doesn’t
2026-01-20 10:00:27 https://en.wikipedia.org/wiki/Surface_brightness#Calculating_surface_brightness > For astronomical objects, surface brightness is analogous to photometric luminance and is therefore constant with distance: as an object becomes fainter with distance, it also becomes correspondingly smaller in visual area. […] [7]
Quackdoc
2026-01-20 10:08:41 so to send normalized scrgb to jxl, I would need to apply a transfer to it right? or could I just normalize it 0 to 1.0 and then set intensity_target and linear?
spider-mario
2026-01-20 10:15:39 if it’s linear and originally went from 0 to 8, you could in principle normalise to 0 to 1 and set the intensity_target to 8 times what you would otherwise have set it to, but applications may be less likely to treat it as HDR than if you were to convert it to PQ or HLG (and out of the two, the conversion to PQ is probably more straightforward to define)
2026-01-20 10:16:17 `exr_to_pq` from libjxl’s devtools can take care of that
2026-01-20 10:17:06 it will take an input EXR of arbitrary range, as well as an optional parameter to clarify the luminance (by default, it assumes that 1.0 = 100 nits), and will output a PQ PNG and the intensity_target parameter to use
2026-01-20 10:17:36 (as in, it will tell you “you should encode this with `--intensity_target=1812`”)
Quackdoc
2026-01-20 10:17:47 thats nifty for sure
2026-01-20 10:22:12 I'll probably just normalize and then linear for now, and then add pqification later, or just point people to output exr then use exr_to_pq
jonnyawsom3
2026-01-20 10:40:56 Any thoughts on that? Right now it's just clipping the range at 0-1, I think it'd make sense to rescale it too
Quackdoc
2026-01-21 01:04:07
2026-01-21 01:04:14 hmmm
2026-01-21 01:04:34 got sent this, setting intensity nits doesn't seem to help, or I messed up normalization, both are possible
2026-01-21 01:04:40 jxl-oxide produces a very different image
2026-01-21 01:05:12 jxl-oxide:
2026-01-21 01:06:43 djxl: https://files.catbox.moe/we78bz.png
2026-01-21 01:07:08
```
box: type: "JXL " size: 12, contents size: 4
JPEG XL file format container (ISO/IEC 18181-2)
box: type: "ftyp" size: 20, contents size: 12
box: type: "jxlc" size: 637157, contents size: 637149
JPEG XL image, 2560x1440, lossy, 32-bit float (8 exponent bits) RGB
num_color_channels: 3
num_extra_channels: 0
intensity_target: 1012.500000 nits
min_nits: 0.000000
relative_to_max_display: 0
linear_below: 0.000000
have_preview: 0
have_animation: 0
Intrinsic dimensions: 2560x1440
Orientation: 1 (Normal)
Color space: RGB, D65, sRGB primaries, Linear transfer function, rendering intent: Relative
```
Traneptora
2026-01-21 01:49:09 out of curiosity how does jxlatte handle it
2026-01-21 01:49:44 it does linear transform before clipping if you output a PQ PNG
jonnyawsom3
2026-01-21 01:50:17 I could be wrong, but it looks like Oxide did the 0-1 rescaling for 8-bit sRGB output
Traneptora
2026-01-21 01:50:32 Yes, peak detection
2026-01-21 01:51:22 jxlatte will also output a PFM without color management if that can be helpful
jonnyawsom3
2026-01-21 01:54:43 Just doing `test-lossy-1012.jxl Test.png` gave a 16bit file that looks like this
Quackdoc
2026-01-21 01:57:31 immensely, I will try that later. I also just used a really basic normalization that doesn't take gamut or anything else into consideration, which I probably should have done; I haven't pondered on that very much yet
2026-01-21 01:58:18 im more focused on trying to get the brightness at least close to good enough
2026-01-21 02:00:40 mmm, opening it in tev isn't pretty either, maybe I'll have to do more than just a basic normalization after all
2026-01-21 02:04:34 pretty basic stuff https://codeberg.org/Quackdoc/jxr_to_jxl/commit/57313ff2720a2f6a5f682f6ca8b5848719974ab5 with this jpegxl-rs patch https://github.com/inflation/jpegxl-rs/issues/185#issuecomment-3775053622 I'll probably not actually bother with the normalization and just go straight to a basic sdrification or pqification
jonnyawsom3
2026-01-21 02:14:54 Turns out I have a f32 image with uh.... Yeah
2026-01-21 02:18:47 Turns out highlights in Blender renders get quite bright
2026-01-21 02:23:05 exr_to_pq understandably gives up `WARNING: the image is too bright for PQ (would need (1, 1, 1) to be 1.14358e+09 cd/m^2)`
2026-01-21 02:24:19 Though also fails to output the PNG even when it does work `JPEGXL_TOOLS_CHECK: jpegxl::tools::Encode(*image, output_filename, &encoded, pool.get())`
Traneptora
2026-01-21 02:48:25 PQ profile?
jonnyawsom3
2026-01-21 02:52:22 No, but it gave the same output with `--png-hdr`, in fact none of the arguments seem to be doing anything...
Traneptora
2026-01-21 02:53:57 It's supposed to attach a PQ profile if you use --png-hdr=yes
2026-01-21 02:54:00 bug?
2026-01-21 02:54:11 latest version etc.
jonnyawsom3
2026-01-21 02:57:12 `--png-compression` and `--draw-varblocks` work, but `--png-depth`, `--png-hdr` and `--png-peak-detect` don't seem to change the final image. Binary is dated November 5th, v0.1.0
Traneptora
2026-01-21 02:58:10 15 commits since then
2026-01-21 03:10:56 when I use `java -jar jxlatte.jar --png-hdr test-lossy-1012.jxl test.png` it produces a PQ PNG that renders fine
2026-01-21 03:12:22 note that `--png-hdr` defaults to auto, and `--png-depth` defaults to 16 when outputting HDR, so it should be outputting a PQ profile
jonnyawsom3
2026-01-21 03:35:41 Strange, I tried opening the PNG in both an image viewer and Chrome but it was washed out
Traneptora
2026-01-21 03:36:03 mpv displayed it correctly. I did have some recent PNG writer changes
2026-01-21 03:36:30 mpv performs peak detection on HDR when outputting to an SDR display, which is necessary
2026-01-21 03:36:40 you can't display HDR images on an SDR display properly without doing so
jonnyawsom3
2026-01-21 03:37:10 I checked and the PNG had no CICP chunk so that was probably it
Traneptora
2026-01-21 03:40:11 yea it just had an iCCP
Quackdoc
2026-01-21 06:22:13 I added clipping negative values out to prevent the image from darkening and now jxl-oxide looks perfectly like what I wanted `jxl-oxide --target-colorspace "wp=d65,primaries=srgb,tf=srgb,intent=rel" /tmp/test.jxl -o /tmp/test.jxl.png`
2026-01-21 06:24:01 djxl is still really dark though https://files.catbox.moe/l7qlm2.png
2026-01-21 06:24:22 honestly, I think if someone is using my tool, they can also use jxl-oxide, so I'll call that fine lmao
Exorcist
2026-01-21 07:20:41 Is there any browser polyfill based on jxl-rs?
HCrikki
2026-01-21 07:58:42 https://github.com/hjanuschka/jxl-rs-polyfill
2026-01-21 07:59:47 seems much better than the outdated slow wasm from years back
Quackdoc
2026-01-21 08:12:52 ~~extension when~~
AccessViolation_
2026-01-21 08:43:19 okay I am relieved it's still doing the actual decoding in Wasm haha
2026-01-21 08:44:07 I was worried it was JS judging by the yellow bar, which would have been a lot of performance left on the table compared to Wasm
chocolatkey
2026-01-21 10:45:19 https://github.com/hjanuschka/jxl-rs-polyfill/blob/main/src/jxl-polyfill.js#L48 I guess this is the smallest possible JXL that can be decoded for testing support?
2026-01-21 10:45:39 Always useful to know for each image format
A homosapien
2026-01-21 11:00:10 The smallest jxl is 12 bytes `data:image/jxl;base64,/wr/BwiDBAwASyAY`
jonnyawsom3
2026-01-21 11:01:36 Nope https://x.com/jonsneyers/status/1452727349405966341
2026-01-21 11:01:56 Damn, Sapien beat me while I was fighting the mobile UI
2026-01-21 11:04:40 There's also this github repo you might find useful https://github.com/mathiasbynens/small/pull/125
AccessViolation_
2026-01-21 11:21:40 set your username to `BwiDBAwASyAY`, and your profile picture to that image, and know the sysadmin will freak out when they see your username and profile image are the same value in the database
Quackdoc
2026-01-22 12:42:01 xD
chocolatkey
2026-01-22 01:03:14 Oh nice, I like having the smallest ones when testing for support
2026-01-22 01:04:37 definitely smaller than my current PNG "pixel" in base64
TheBigBadBoy - 𝙸𝚛
2026-01-22 09:25:38 funny how "1 pixel" image is not always the smallest size across codecs https://cdn.discordapp.com/attachments/673202643916816384/1204546007610949652/image.png
Exorcist
2026-01-22 09:50:51 the HEIF container is bloated; you can try `.obu` or `.ivf` for AV1
_wb_
2026-01-22 02:30:24 in avif/heic the payload codec can only do an integer number of macroblocks iirc, so to make a 1-pixel image you need to encode an 8x8 image and add container-level syntax to crop it to 1x1
2026-01-22 02:31:35 in jxl there's a compact sizeheader when the dimensions are a multiple of 8, this saves one byte
mincerafter42
2026-01-22 07:24:11 ?? how can GIF possibly have different sizes for single grey pixel and single yellow pixel
2026-01-22 07:25:44 if i recall i think a few GIF implementations might have a default palette of just #000 and #fff if no palette is specified, but i don't think that's common
AccessViolation_
2026-01-22 08:02:30 the most surprising thing about this to me is that making the pixel transparent adds around 200 bytes for AVIF and 300 bytes for HEIC
2026-01-22 08:03:39 I guess that's because transparency is a separate losslessly encoded frame/layer? at least for AVIF iirc, idk how HEIC does it
_wb_
2026-01-22 08:18:48 yes, both avif and heic have to do alpha as a separate codestream with a pretty large amount of container overhead
2026-01-22 08:23:27 for example you have to include the literal ascii string `urn:mpeg:mpegB:cicp:systems:auxiliary:alpha` to mark that separate codestream as being of type alpha
AccessViolation_
2026-01-22 08:38:44 actually, I wonder if this is lossless or lossy mode for JPEG XL?
TheBigBadBoy - 𝙸𝚛
2026-01-22 08:39:35 should be lossless
_wb_
2026-01-22 08:47:05 lossless indeed
jonnyawsom3
2026-01-22 08:48:37 Yeah, it's JXL art
```
Bitdepth 8
Width 256
Height 128
Gaborish
16BitBuffers
XYB
XYBFactors 4096 512 256
EPF 2
- Set 0
```
AccessViolation_
2026-01-22 08:58:49 if a single black pixel is 13 bytes, what is the smallest possible image, at 12 bytes? a 0 x 0 image?
2026-01-22 09:01:42 ah, is that it
2026-01-22 09:01:57 it doesn't work on the jxl art website
2026-01-22 09:02:18 I think `16BitBuffers` might be the thing that breaks it, because it works without that
jonnyawsom3
2026-01-22 09:16:46 Yeah, 16bit is on libjxl main but not the site yet
Traneptora
2026-01-22 09:19:50 256x512 iirc
2026-01-22 09:20:11 wait no it's 512x256
2026-01-22 09:20:25 cheap size header
AccessViolation_
2026-01-22 09:25:57 gotcha
2026-01-22 09:26:20 the semitransparent HEIC pixel being almost a kilobyte is actually so funny
2026-01-22 09:26:40 oops, sorry for the ping
2026-01-22 09:34:47 wait, why are we setting EPF and Gaborish in Modular mode <:Thonk:805904896879493180>
2026-01-22 09:37:20 I mean I get why - so that it doesn't need to spend bits on deviating from default options. I'm just surprised it's possible to set those in Modular mode at all
jonnyawsom3
2026-01-22 09:40:21 But yeah, that's the smallest possible JXL. 12 bytes
TheBigBadBoy - 𝙸𝚛
2026-01-22 09:51:00 I don't care about pings lol
AccessViolation_
2026-01-22 09:54:39 me: I sure hope my tiny images are being compressed efficiently today
the nefarious 1 kilobyte single semitransparent HEIC pixel:
username
2026-01-22 10:25:17 something random I just remembered is that the old libjxl implementation in Chromium would skip doing progressive decoding if an image had an alpha channel
2026-01-22 10:26:43 still a bit unsure as to why it explicitly did that since I remember trying progressive decoding with libjxl in Waterfox/patched-Firefox and images with alpha seemed to look fine while decoding progressively
2026-01-22 10:30:23 oh huh I'm looking back a few years and it was a patched version of Chromium I tested to show that progressive decoding with an alpha channel in the image looked fine not Gecko
2026-01-22 10:33:05 hmm was probably a bug and not explicit then
whatsurname
2026-01-23 01:16:36 They are trying to address that with an amendment https://www.iso.org/standard/90273.html
Orum
2026-01-23 01:39:36 AVIF header about to ruin your day <:PeepoDiamondSword:805394101340078092>
jonnyawsom3
2026-01-23 03:27:03 I tried to check when LQIP decoding got broken, but it was v0.7-base, before Chrome got progressive support in 2022 <https://github.com/libjxl/libjxl/commit/27d1aec3461e467e3035cbfdb5413bba924a249f> <https://chromium-review.googlesource.com/c/chromium/src/+/3865432>
gawi.
2026-01-23 03:55:40 Hey, is there a certain recommended way to handle creating animated jxl files from frames currently? So far it seems muxing to apng with ffmpeg before converting with cjxl is generally effective; however, the conversion between duration formats gets truncated without compensation, resulting in an inaccurate duration. Any tips on this front yet?
monad
2026-01-23 05:12:27 https://discord.com/channels/794206087879852103/794206170445119489/1458297038310736067
2026-01-23 05:13:06 no need to go through apng if you're using ffmpeg
gawi.
2026-01-23 05:25:54 ah gotcha, didn't find the right codec, thanks for the pointer
jonnyawsom3
2026-01-23 05:35:27 Do note, the animation encoder isn't very 'smart' yet, it doesn't do differential storage between frames, etc
gawi.
2026-01-23 04:24:38 although, now that I'm testing it, it does seem that jxl can support tick-per-second definition and isn't limited to ms/f (and the encoder handles this already). I'm guessing cjxl code was mainly built for gif conversion, and apng is less of a priority.
jonnyawsom3
2026-01-23 04:28:59 APNG is actually better supported with cjxl, not sure why the timing was off. But yeah, it supports setting a tick as a fraction of a second, and then ticks per frame
gawi.
2026-01-23 04:31:33 gotcha, neat. in that case, should I maybe open an issue on github? can't find one mentioning this yet (only found a comment in a discussion [mentioning](https://github.com/libjxl/libjxl/discussions/3495#discussioncomment-15111631) the issue)
Traneptora
2026-01-23 09:59:12 they're part of the frame header and they are used for lossy modular
AccessViolation_
2026-01-25 10:47:15 question about lossless JPEG transcoding
2026-01-25 10:47:32 it's fast enough that a webserver can do it just-in-time before serving the request
2026-01-25 10:48:32 could you make it "lossy" by ditching some information, while keeping it just as fast? transcoding it from pixel data is usually much slower
2026-01-25 10:50:16 like, quantizing some coefficients more without ever touching the pixel data?
2026-01-25 10:51:17 I'm thinking about a bandwidth-constrained server that can dynamically reduce the quality of transcoded JPEG images to adapt to the load, but if you VarDCT encode from pixel data that might be too slow to be practical
2026-01-25 10:54:12 I know this wouldn't be lossless anymore, I'm just wondering if there's a faster way of transcoding JPEG images to a lower quality, that uses some of the coding tools from lossless JPEG recompression 🤔
jonnyawsom3
2026-01-25 11:04:50 Yet another feature that's been discussed in theory but not tried yet. No reason why it shouldn't be possible
AccessViolation_
2026-01-25 11:06:38 ooo, awesome
_wb_
2026-01-25 11:14:21 You can requantize DCT coeffs, I made a proprietary fork of jpegtran that does just that and it is used in Cloudinary when you do q_auto with JPEG in, JPEG out, no pixels changed. In jxl you could do something similar in principle.
AccessViolation_
2026-01-25 06:00:49 I was invited to be a member of staff for some unrelated community, and naturally my first order of business is transcoding all the assets on our community website (I didn't ask but I'm pretty sure I was selected for my JPEG XL evangelism :clueless:)
```
5.9M ┌── .jxl   │█████                         │  17%
 28M ├── .png   │█████████████████████████     │  83%
 34M ┌─┴ (total)│█████████████████████████████ │ 100%
```
username
2026-01-25 06:02:24 any JPEGs on the site?
AccessViolation_
2026-01-25 06:02:40 nope, just PNGs
2026-01-25 06:05:05 there was one image I considered turning from a PNG into a lossy JXL, because it had a lot of noise (some sort of ray-traced render of a Minecraft skin), and because of that a larger file size, so it would benefit from progressive loading in the future. but upon closer inspection it turned out to be one of those lossy PNGs that had only 1177 unique colors, so I just went with a lossless version for that too. JXL's palette compression did pretty well (PNG did well too, to be fair). VarDCT was not competitive
username
2026-01-25 06:05:32 I wonder what size those lossless JXLs would be if encoded to be progressive with nightly libjxl
AccessViolation_
2026-01-25 06:06:31 last time I tried progressive lossless using squeeze it massively inflated the file size, though that was years ago. presumably you mention it because things have improved somewhat? 👀
username
2026-01-25 06:06:53 yes size has improved majorly since then
2026-01-25 06:07:21 libjxl was making some really **really** bad choices when encoding progressive lossless
2026-01-25 06:07:55 they are still going to be larger than non-progressive ones but way less so now
AccessViolation_
2026-01-25 06:08:03 oh cool 👀
2026-01-25 06:08:34 I'll check that out for sure. is that just the current main branch?
username
2026-01-25 06:08:42 yep
2026-01-25 06:09:04 here's the PR that improved things: https://github.com/libjxl/libjxl/pull/4201
jonnyawsom3
2026-01-25 06:25:52 Roughly 40% smaller, but I know we can get another 20% on top of that. The heuristics seem to break so we had to hardcode it to YCoCg and no predictor
username
2026-01-25 06:27:09 no predictor? was that done just for speed or did none of the predictors give a size improvement?
2026-01-25 06:27:31 was it a case of just lz77 on the raw pixels did better?
jonnyawsom3
2026-01-25 06:29:51 Only P15 made it smaller, which is why we wanted to make a new predictor set for progressive but never got round to it
2026-01-25 06:31:24 The squeeze steps function like the HF and the LF, so some are mostly noise and some should be relatively smooth, forcing 1 predictor would only help one half of it
username
2026-01-25 06:34:13 speaking of, I don't think we ever found out exactly why/how the order of predictors in a set affects things in libjxl
runr855
2026-01-25 07:10:55 To enable progressive with lossless it's the `-R` switch, right? On the current release version 0.11.1 I got an image which, if I remember correctly, became multiple times bigger than even the original image. Does that sound like a bug, or reasonable for the current implementation in the release version?
2026-01-25 07:11:21 I don't remember what my test image was
jonnyawsom3
2026-01-25 07:28:45 `-p` just like lossy, and use a build from here <https://artifacts.lucaversari.it/libjxl/libjxl/latest/>, should be 30-40% smaller than v0.11
_wb_
2026-01-25 07:30:10 It depends on the image content. For photographic images, the gap between progressive and non-progressive lossless shouldn't be too large. For nonphoto, the gap can be very large since things like palette and lz77 cannot be used when doing progressive.
jonnyawsom3
2026-01-25 07:34:58 I assume you mean they wouldn't be very effective, since the residuals would be mostly noise. IIRC right now it's using LZ77 but palette was doubling the filesize
_wb_
2026-01-25 07:49:26 lz77 and palette might still be useful for progressive lossless too, but I don't think they will be very effective typically
jonnyawsom3
2026-01-25 09:00:36 In future we could allow setting how progressive it is too. 3 squeeze steps would match VarDCT's 1:8 LF, right now it goes down to 8 pixels regardless of original size. Definitely a lot of testing to be done
runr855
2026-01-25 11:13:58 What is the difference? On the release build I get the same file size with both `-p` and `-R 1`
2026-01-25 11:15:00 On a main build it gives over twice the size of an oxipng-optimized screenshot
2026-01-25 11:15:27 I know you recommend using latest main
jonnyawsom3
2026-01-26 07:34:07 I couldn't remember if we did anything special for the progressive flag or if it was only tied to the squeeze transform. Oxipng isn't progressive by default, so that's expected. Size should improve in the future, or you can try `-e 10 --patches 0`
TheBigBadBoy - 𝙸𝚛
2026-01-26 12:58:01 you should take a look at [jpegdropscan.pl](https://github.com/MegaByte/jpegultrascan/blob/master/jpegdropscan.md) > JPEG lossy recompressor that removes least informative scans to reach a target quality > Rather than recoding an image, jpegdropscan iteratively removes scan information starting with least significant bits and moving to least significant coefficients. Originally developed using butteraugli for quality evaluation, but other metrics may be used. > > Best results can be achieved by processing an image with jpegultrascan first, running jpegdropscan, then jpegultrascan once more.
jonnyawsom3
2026-01-26 04:55:51 I only just remembered, but `-p` also sets the group order to centre-first instead of top to bottom. Same filesize but different results
Orum
2026-01-27 04:14:55 `Brotli-compressed Exif metadata: 57 compressed bytes` how do I view all of this (after decompression)? <:Thonk:805904896879493180>
VcSaJen
2026-01-27 09:16:59 I heard that lossless patches are using too much memory on hi-res images, is there a reason for that?.. Can't you just grab every x pixels (both vertically and horizontally), essentially lowering memory usage by x² times? Then just verify that those regions are actually equal after matches are found.
A homosapien
2026-01-27 09:20:24 Patches did get memory optimizations a while back, but this improvement is only on main not v0.11
jonnyawsom3
2026-01-27 09:27:13 How do you decide what x is though?
2026-01-27 09:28:42 And how do you know if it's a match if you only have x amount of pixels to compare against
VcSaJen
2026-01-27 09:48:27 It would work fine for high-res high-density images like digital pictures for printing and stuff, and non-small patches. It's basically taking bird's eye of view before bothering with details, and patches being "non-small" would keep amount of false-positives low. It would of course break down on high-res low-density images like satellite imaging, or for very small patches. > How do you decide what x is though? Minimal value to keep resulting resolution at or below 2048 (current limit, AFAIK). So, 2 for 4096x4096, 3 for 6144x6144, etc.
jonnyawsom3
2026-01-27 10:06:08 Pictures for printing generally don't have any identical content though, Patches is mostly for text, tiles and icons
TheBigBadBoy - 𝙸𝚛
2026-01-27 11:07:00 you can even use `exiftool`; it needs `perl-io-compress-brotli` as a dep though
monad
2026-01-27 12:55:12 If I understand correctly, it seems not as robust because high-res matches can diverge in shape in the low-res representation. Otherwise, assuming heuristics would scale, it could reduce the memory cost of unmatched quantized patch candidates.
Traneptora
2026-01-27 02:08:32 exiftool and ffmpeg both support exif reporting
2026-01-27 05:32:02 (for example, `ffprobe -i input.jxl -show_frames` will dump the EXIF tags)
nicosemp
2026-01-27 09:32:10 What's the best way to detect `image/jxl` support from a client? Zen Browser has the JPEG XL flag enabled by default, but it returns the same `navigator.userAgent` as Firefox (of course). What's the next most reliable way to tell them apart?
Quackdoc
2026-01-27 09:33:56 accept headers and then html tag fallbacks are the only reliable solution
nicosemp
2026-01-27 09:35:26 thanks, i'll look into that
AccessViolation_
2026-01-27 10:16:47 in addition to what's been said, if you just want an image element that tries to render a JXL first and falls back to a different format, you can use the `<picture>` HTML element, which natively supports this and doesn't require looking at request headers:
```html
<picture>
  <source srcset="assets/screenshot.jxl" type="image/jxl" />
  <source srcset="assets/screenshot.webp" type="image/webp" />
  <img src="assets/screenshot.png" alt="A screenshot of a video game" />
</picture>
```
this will try to load the JXL first; if that's not supported it'll try the WebP, and if that's not supported it'll try the PNG. it's a bit confusing, but the reason the last one is an `<img>` rather than a `<source>` is that it also provides a fallback if the `<picture>` element itself is not supported. it's also where you attach any additional attributes, like `alt="..."`; these apply to whichever of the above formats gets loaded. it's not what you asked, but it's a common use case so I thought I'd mention it
missaustraliana
2026-01-27 10:58:35 hiii
nicosemp
2026-01-27 11:14:05 Yes I believe that would be the best approach! However I was looking into this for Immich, that added `image/jxl` [support on Safari](https://github.com/immich-app/immich/pull/24766/changes). Their current approach - [browser detection](https://github.com/immich-app/immich/blob/main/web/src/lib/utils/asset-utils.ts#L304-L318) and [asset selection](https://github.com/immich-app/immich/blob/main/web/src/lib/utils.ts#L223-L228) - seems to use client-side logic to decide whether to load the original file, a fullsize version or a preview. The latter two are both previously generated for compatibility and speed. Using the `<picture>` tag would indeed be ideal, but would also require a bit of a rework of their logic.
AccessViolation_
2026-01-27 11:16:10 ah, interesting
2026-01-27 11:17:48 also re: accept header, I think this changes depending on the type of request. for example a request for the main document of a page won't list all the supported image formats in the accept header, but a request initiated from an `<img>` element will
2026-01-27 11:18:30 so you'd have to get the browser to send a dummy request for an image to get the browser to tell you which image types it supports, I think
2026-01-27 11:20:41 my knowledge on this is limited, I'm not a web dev by trade, I just happened to have been working on adding JXL support to a site myself as of recent
nicosemp
2026-01-27 11:37:56 no worries! thanks for the additional insights
Exorcist
2026-01-28 12:11:01
```js
var img = new Image();
img.onload = () => alert('yes');
img.onerror = () => alert('no');
img.src = 'data:image/jxl;base64,/woIAAAMABKIAgC4AF3lEgA=';
```
nicosemp
2026-01-28 06:24:06 This is a really nice solution, thanks! I was trying to do the same with HEIC. Using `magick -size 1x1 xc:white -strip smallimg.heic` the smallest I could get is a 500-byte image. I'm sure JPEG XL holds the crown, but do you know if it's possible to get a HEIC image base64 string smaller than what I got?
Exorcist
2026-01-28 06:51:14 <@124665624176623617> ⬆️
TheBigBadBoy - 𝙸𝚛
2026-01-28 10:19:57 btw does anyone know where this comes from? https://cdn.discordapp.com/attachments/673202643916816384/1204546007610949652/image.png I remember getting it from this server a while ago...
username
2026-01-28 10:21:14 one of the slides from here: https://docs.google.com/presentation/d/1LlmUR0Uoh4dgT3DjanLjhlXrk_5W2nJBDqDAMbhe8v8/view
2026-01-28 10:21:42 I would have linked to the actual slide but my internet always struggles loading this
TheBigBadBoy - 𝙸𝚛
2026-01-28 10:22:20 that's already really great thanks
nicosemp
2026-01-28 10:23:30 slide 90! it would be amazing to find that 386-byte valid HEIC image for browser support detection
AccessViolation_
2026-01-28 10:24:03 the return of the nefarious 1 kilobyte single semitransparent HEIC pixel
2026-01-28 10:26:01 that's a lot of fun to say, by the way, I can highly recommend it
MissBehavior
2026-01-30 02:52:59 what is "Lossy modular"?
2026-01-30 02:53:22
Orum
2026-01-30 03:14:37 modular encoding for lossy images
2026-01-30 03:14:53 seems pretty self-explanatory
_wb_
2026-01-30 03:21:03 the original source for this was https://cloudinary.com/blog/one_pixel_is_worth_three_thousand_words, but that's from before JPEG XL existed
monad
2026-01-30 10:24:23 One tool for lossy encoding is VarDCT, which transforms data in a similar way to traditional JPEG. This is generally effective for representing photographic content at some desired visual fidelity. JXL doesn't require using this tool, there are other so-called modular tools which enable lossless encoding, but can also intake alternatively transformed data. A hypothetical encoder could make lossy decisions which are more desirable for representing non-photographic content or mathematical data, but I am not sure libjxl is very mature here.
Orum
2026-01-30 11:09:18 yeah, it's basically like lossy PNG
_wb_
2026-01-31 09:10:22 except Squeeze and Delta palette are tools in the modular toolkit that were explicitly designed with lossy in mind, even though they can also be used in a lossless way
2026-01-31 09:11:34 I am sure libjxl is not very mature in this area 🙂
intelfx
2026-02-01 05:29:34 Hey. Quick question. What am I doing wrong, and why does `cjxl -j1` produce an image that does not yield a butteraugli metric of 0 when compared to the original jpg? ```
$ /usr/lib/jxl/butteraugli IMG_3975.JPG IMG_3975.my.q100j0.jxl
0.0000000000
3-norm: 0.000000
$ /usr/lib/jxl/butteraugli IMG_3975.JPG IMG_3975.my.q100j1.jxl
1.5599813461
3-norm: 0.329101
``` (all cjxl args encoded in the file names)
RaveSteel
2026-02-01 05:32:31 Expected behaviour due to differences in decoding, the underlying image data is still identical though. ~~You can verify this by comparing the original JPEG and JXL using ssimulacra2~~
intelfx
2026-02-01 05:32:51 Same thing, unless you mean something else: ```
$ /usr/lib/jxl/ssimulacra2 IMG_3975.JPG IMG_3975.my.q100j1.jxl
93.73162939
```
RaveSteel
2026-02-01 05:34:30 Alternatively (the better solution) is to use imagemagick's `identify`, like so: `/usr/bin/identify -format "%#" SOURCE.JPEG` `/usr/bin/djxl TRANSCODED.JXL --output_format jpg - | /usr/bin/identify -format "%#" -`
2026-02-01 05:34:58 The best way is to use a script which does these steps for you so you can always verify the hashes
intelfx
2026-02-01 05:43:24 Okay, yeah, this gives identical hashes for the JPEG and `cjxl -j1` JXLs: ```
$ for f in IMG_3975*; do
    printf "%s\t%s\n" "$(case "${f:l}" in
      *.jxl) djxl $f --output_format jpg - 2>/dev/null;;
      *) cat $f;;
    esac | identify -format "%#" -)" "$f"
  done | sort
0768dff3e09d90f2bc97915987acdc7567bd40429b3a3196d784954250dbf626  IMG_3975.my.q100j0.jxl
0768dff3e09d90f2bc97915987acdc7567bd40429b3a3196d784954250dbf626  IMG_3975.my.q100j0p.jxl
ca9652a5b6fbbfddecad925a69a810c9fb447bc1588393a48a6e217fc90e4705  IMG_3975.JPG
ca9652a5b6fbbfddecad925a69a810c9fb447bc1588393a48a6e217fc90e4705  IMG_3975.my.j1.jxl
ca9652a5b6fbbfddecad925a69a810c9fb447bc1588393a48a6e217fc90e4705  IMG_3975.my.q100j1.jxl
ca9652a5b6fbbfddecad925a69a810c9fb447bc1588393a48a6e217fc90e4705  IMG_3975.my.q100j1p.jxl
``` BUT. 1. ~~Why does `cjxl -q100 -j0` produce a different hash?~~ Ah, nevermind, we are doing an unconditional roundtrip to JPEG, so naturally if the input was not losslessly decoded into JPEG, this step will introduce an error. Duh. 2. What exactly is meant by differences in decoding? How does that happen?
jonnyawsom3
2026-02-01 05:47:33 Original JPEG spec wasn't fully defined; different decoders can give different outputs. Transcoding to JXL does all the maths in full float precision, so it's more accurate than normal but doesn't match the old JPEG decoder pixel-for-pixel. Transcoding is bit-accurate reversible though, so you don't lose any data
_wb_
2026-02-01 06:59:54 Actually original JPEG spec is just like the JXL spec: all the arithmetic is defined mathematically (infinite precision math), and implementations can decide how they want to approximate it. The original JPEG conformance criteria are pretty loose though, which made sense because this was designed in a time when 16-bit integer arithmetic was basically the only thing you could use — hardware float arithmetic was still something fancy.
3DJ
2026-02-02 01:11:03 sView (a stereoscopic 3D image/video viewer) now supports JXL <https://github.com/gkv311/sview/issues/108#issuecomment-3822306954>
2026-02-02 01:11:31 Does the format support any flags to specify the 3D layout (like Side-By-Side, Top-And-Bottom, Interlaced, etc.)?
2026-02-02 01:14:07 and does it happen to support efficient multiview recompression to save even more space by sharing redundant data between the views like MVC (the codec used in Blu-ray 3D)?
jonnyawsom3
2026-02-02 01:14:21 > Well, this libjxl library looks pretty huge compared to FFmpeg itself :). A sentence to strike fear into the hearts of programmers everywhere
ignaloidas
2026-02-02 01:21:16 not currently as far as I know?
2026-02-02 01:22:34 This could be done by compressing the two views as separate frames (marking them as left/right) and reusing data from the first when encoding the second, but current encoders don't do that (or at least not well)
jonnyawsom3
2026-02-02 01:25:59 In the raw image I don't think it does, but there are EXIF, XMP and JUMBF as metadata options which could signal it. As said above, multiview could be done by storing them as separate frames and subtracting one from the other to get differential storage
spider-mario
2026-02-02 11:47:00 does that actually save as much as one might hope, though?
AccessViolation_
2026-02-02 01:06:49 interlaced reference frame, then patch copy every line :)
2026-02-02 01:07:43 I wonder if that's predictable enough that the patch signaling has relatively little overhead
2026-02-02 01:08:22 though not a serious suggestion for obvious reasons
2026-02-02 01:14:01 I don't know if multiple frames can share an MA tree, but the textures and patterns which might be captured in MA trees are probably going to apply equally well to both images even if parts are misaligned. though because of that misalignment, decisions about the absolute positions of a pixel might throw things off
2026-02-02 01:15:58 I'm pretty sure groups can share MA trees though, so if you instruct the encoder to not use position related decisions, and the left and right view each fit in a single group on the same frame, that could work
couleur
2026-02-03 10:52:05 doesnt it sound reasonable to compress my image library to jpeg, and when I share it to non jxl compliant, decompress it on the fly?
2026-02-03 10:52:18 are there android share apps that let you do that per example
2026-02-03 10:52:47 where you'd share a .jxl to it, and it would trigger another share with the jpeg that you could then share wherever
AccessViolation_
2026-02-03 11:37:51 compress to JXL you mean? that sounds reasonable. and with lossless JPEG recompression, you get the original exact JPEG back
2026-02-03 11:38:47 I don't think there's an app that does that currently, though I like the idea. maybe Image Toolbox lets you do that? I'm not sure, but I seem to remember it lets you decode JPEG XL, though not as swiftly as you describe
couleur
2026-02-03 11:56:24 can recompression be bit perfect?
2026-02-03 11:56:31 and under what circumstances
jonnyawsom3
2026-02-04 12:08:31 ImageToolbox has JPEG Transcoding to and from JXL, but it's more steps than just a share menu. Transcoding is bit-perfect and will error otherwise
VcSaJen
2026-02-04 03:18:05 Specifically, it's not in Edit->Format Conversion, it's in Tools->JXL Tools->JPEG to JXL.
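A minimal sketch of the bit-perfect roundtrip check discussed above, assuming the libjxl CLI tools (`cjxl`/`djxl`) are installed and using placeholder file names; the byte comparison itself needs nothing beyond POSIX `cmp`:

```shell
# Byte-for-byte comparison helper (pure POSIX, no image tools needed)
same_bytes() {
  cmp -s "$1" "$2" && echo "bit-perfect" || echo "mismatch"
}

# Roundtrip sketch, guarded so it degrades gracefully without the libjxl tools.
# --lossless_jpeg=1 (JPEG recompression) is the default for JPEG input.
if command -v cjxl >/dev/null 2>&1 && command -v djxl >/dev/null 2>&1; then
  cjxl input.jpg out.jxl --lossless_jpeg=1  # recompress the JPEG losslessly
  djxl out.jxl rec.jpg                      # reconstruct the original JPEG
  same_bytes input.jpg rec.jpg              # expect: bit-perfect
fi
```

As noted above, this should either print `bit-perfect` or cjxl should already have errored out at encode time (e.g. on the unsupported JPEG variants discussed later).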
perk
2026-02-04 09:16:19 Is there an image viewer yet that can display the layers/frames of a jxl file as separate images?
2026-02-04 09:24:21 nvm xnview can do it
couleur
2026-02-04 10:00:52 ooo, it has layers? could you share how they look?
jonnyawsom3
2026-02-04 10:11:22 Well, that depends on how you saved them
awxkee
2026-02-04 10:28:30 I think I had errors on some jpegs when I was implementing this, but I'm not quite sure. Is there a defined criterion when it may fail?
Fahim
2026-02-04 10:32:21 Closest from what I recall off the top of my head is <https://github.com/libjxl/libjxl/issues/1172#issuecomment-1037246945> > Other things that are in the original JPEG spec but are not considered part of the 'de facto' standard are outside the scope of lossless JPEG recompression in JPEG XL, for the simple reason that they are many orders of magnitude more rare, and it would have been a significant complication to extend the bitstream reconstruction procedure to also accomodate for that. This includes arithmetic coding, lossless, hierarchical, strange subsampling factors, and CMYK (JXL does support CMYK, but not lossless recompression of CMYK JPEGs).
AccessViolation_
2026-02-04 10:32:33 iirc one of the reasons it can fail is if there's more than 4 MB of metadata
awxkee
2026-02-04 10:39:30 Hm, out of curiosity, it seems MPF (multi-picture JPEGs) such as UHDR should immediately fail as well, do I get it right?
_wb_
2026-02-04 10:40:13 4 MB of tail data, specifically (stuff that comes after the end-of-image marker). You can have more than 4 MB of metadata _inside_ the jpeg file, stored in various APP markers or whatever.
2026-02-04 10:42:09 They are currently getting treated as the first JPEG with the rest being tail data, so it will fail if the rest is > 4 MB. And even if it doesn't fail, it's currently not quite ideal since it just treats the non-first JPEGs as "arbitrary tail data" so it's basically semantically not there in the JXL file, it will only return when you reconstruct the JPEG.
2026-02-04 10:43:06 We have had discussions on how to deal better with MPF, which includes gain maps btw, but it gets tricky and there have always seemed to be more urgent things 🙂
VcSaJen
2026-02-04 10:45:14 Krita (image editor) can import layers from jxl. I dunno if any of the viewers support viewing layers separately
awxkee
2026-02-04 10:45:34 Well, then it's clear where the failures I saw with the author of ImageToolbox came from: we probably had some UHDRs for testing, and that case seems to be a tricky one
2026-02-04 10:45:58 Also, it seems some Android cameras now take photos in UHDR, so those will fail seemingly at random sometimes
_wb_
2026-02-04 10:46:21 Philosophically, the main point of the lossless JPEG recompression is to provide a transition path for dealing with legacy content. The annoying thing is that things like gain maps are basically retrofitting stuff onto the legacy format to try to make it support stuff it really doesn't support, and if we have to keep adapting our definition of what is "legacy content" to also include these things, we end up creating complications and compromises that shouldn't really be there, imo.
awxkee
2026-02-04 10:47:02 Yeah, I'm not insisting on supporting these or something
2026-02-04 10:48:46 It's just not very obvious since it isn't documented, so I ran into some seemingly random failures whose origin wasn't clear.
_wb_
2026-02-04 10:49:10 In theory, JXL now does have a gain map box defined in its file format, so in principle you could make a conversion tool that takes a UHDR JPEG as input, recompresses the base JPEG using regular JPEG recompression, does the same for the gain map JPEG, and puts the latter in a gain map box of the base JXL; semantically that should be equivalent to the UHDR JPEG and reconstructible to it.
2026-02-04 10:50:47 Emphasis on "in principle". Nobody seems to have had the energy or desire to produce such a tool or to make libjxl / jxl-rs actually use the stuff in the gain map box and do the transformation of baseline+gainmap->HDR so it gets rendered in HDR instead of just as the fallback SDR baseline.
2026-02-04 10:53:51 To me, the current state of things is OK. JPEG XL supports it in principle, and I don't mind that it doesn't work in practice since it's a cursed approach to begin with. If you want HDR, just forget about the old JPEG, and encode an actual HDR image, using a baseline HDR jxl image. If you really want you can add a gainmap that specifies a custom local tonemapping to SDR (this is the only thing these gainmaps do actually bring as an extra functionality), but that's entirely optional.
VcSaJen
2026-02-04 11:00:01 We've seen encoding time/compression graphs comparing various image formats, but what about decoding time/compression graphs? It seems like some lossless images' decode time can be quite slow compared to PNG. Readers are not going to wait 3 seconds just to see the next comic page on their 4K tablet.
monad
2026-02-04 11:03:26 Yes, I have been wanting to look into this myself for some time.
VcSaJen
2026-02-04 11:05:39 (Of course, comic readers in theory can decode the next few pages ahead-of-time, but I dunno if they do that right now, and it's iffy to rely on that because not every software will support it)
monad
2026-02-04 11:06:37 Readers can already be slow with full page images in existing formats.
awxkee
2026-02-04 11:16:04 Thanks for the explanation! I agree the UHDR approach is a nightmare, but there's nothing to be done about it. When I was handling UHDR gainmaps, I did some research and discovered a complete zoo of formats: UHDR itself has several ways to store gainmap metadata, Adobe has its own standard, and Apple has its own gainmaps with their own storage method. Compared to all that, just having PQ and HLG didn't seem so bad.
_wb_
2026-02-04 11:23:26 A good viewer for anything where it's pretty predictable what needs to be shown next should do this, but I suspect many viewers still don't and only start decoding the next page after you press the button to go there.
monad
2026-02-04 12:55:37 The downside with using identify like this is it can ignore render-affecting information like color profiles. When incorporating djxl to produce a reconstructed JPEG, butteraugli would be the more robust option. For that specific case you can also just compare the file binaries, which should be the fastest and most thorough.
Quackdoc
2026-02-04 03:40:31 yeah, if there was only one way of handling gainmaps I would even prefer them, but for now good old HDR and tonemapping seems to win for me. this is why I like jxl: you can just request the decoder to send sRGB, in the case of jxl-oxide
2026-02-04 03:41:04 ofc that has its own issues, but at least it works
_wb_
2026-02-04 08:16:27 There's now also an ISO standard to do it, I've discussed it with the folks in TC 42 who created that standard. In the JPEG committee some people were not amused that basically everyone reinvented the wheel while all of this stuff was already standardized in a better way in 2016 (JPEG XT part 2) but nobody used that standard.
jonnyawsom3
2026-02-04 08:23:25 Reinventing the JPEG wheel to carry 2 extra bits as the float32 JXL freight train barrels past
nol
2026-02-05 06:42:05 Hey all, I have a question about Exif. In the draft of the JPEG XL specification shared in this server there is a reference to the Exif 2.32 specification ("The Exif payload is as described in.."). Does this mean that Exif in JPEG XL is "locked" to that version, meaning that I shouldn't produce a JXL with Exif 3.0 features? Or is it simply the case that version 2.32 was the most recent when writing the spec, and JXL readers/writers should just give a best effort to interpret the Exif data?
Tirr
2026-02-05 06:45:28 I guess it's just a spec thing and it simply means "the payload contains Exif data"
2026-02-05 06:46:05 but I didn't write the spec so I can't say for sure
_wb_
2026-02-05 07:58:51 18181-2 refers to the Exif spec via the bibliography, not as a dated normative reference, so it's not "locked" to that version but that's just the most recent reference we could provide. Interpreting the Exif payload is outside the scope of 18181-2, so applications can do what they want, the only normative thing a jxl decoder has to be able to do is to extract the payload and return it as a blob.
nol
2026-02-05 08:41:40 Excellent, thanks for your answers
monad
2026-02-05 02:15:17 I can collect measurements from cli tools now, I just need to display them in a nice way. I tend to post tables with rows for each encode setting and am planning to add decode speed per decoder as columns. Is that the kind of breakdown you had in mind? To test, I decoded 270 web photos of 27 unique image contents from Scope ITAP. Increased encode effort implied increased decode effort on this set. ```
djxl_0.11.0_num_threads0 Mpx/s real (mean) | encoder
45.4   cjxl_0.11.0_d0e1
36.6   cjxl_0.11.0_d0e2
19.96  cjxl_0.11.0_d0e3
17.40  cjxl_0.11.0_d0e4
14.13  cjxl_0.11.0_d0e5
12.39  cjxl_0.11.0_d0e6
11.31  cjxl_0.11.0_d0e7
10.97  cjxl_0.11.0_d0e8
10.73  cjxl_0.11.0_d0e9
10.18  cjxl_0.11.0_d0e10
```
VcSaJen
2026-02-05 03:05:24 Thanks. Add bits per pixel compression rate column.
Traneptora
2026-02-05 11:13:55 I've actually wondered about this. A way to provide an HDR image that has SDR fallback. Would this be in the scope of annex N extensions since it specifies SDR fallback, or would it be in the scope of 18181-2?
2026-02-05 11:16:08 you should for libjxl too if it's xyb. if not that's a bug (fwiw jxlatte also supports this but I haven't had the time to fix known decoding issues)
Quackdoc
2026-02-05 11:54:12 libjxl tonemapping has some weird oddities that don't make it work great, for instance for sRGB input
Traneptora
2026-02-06 12:12:17 jxlatte fwiw doesn't tone map
2026-02-06 12:12:30 it just gamut maps, peak detects, and clips
2026-02-06 12:12:39 idk if peak detection counts as tone mapping
Orum
2026-02-06 05:40:36 Yeah, I'll have graphs for this too as soon as we have a release that isn't over a year old...
jonnyawsom3
2026-02-06 05:56:16 While I'd like a release ASAP, I think it should wait until we have the buffering sorted out https://discord.com/channels/794206087879852103/1464417869470371912/1468288380730085387 (it's still libjxl, discussion just happened in the jxl-rs channel)
FujiwaraChika
2026-02-06 07:20:02 Interesting discovery: My iPad mini5 running iOS13 can actually decode JXL. It's so cool.
2026-02-06 07:21:36
_wb_
2026-02-06 07:35:18 The current gain map box that is in 18181-2 was actually only added to the spec because that's the only thing it really _adds_ in terms of functionality: the possibility to provide a custom tone mapping to SDR, in case you want to make artistic choices on that and the default tonemapping doesn't suffice. In JPEG XL, there is no need for a gainmap as a workaround for precision limitations in the main format, so the "SDR base + gainmap to bring it to HDR" scenario is not needed, and it is inefficient in terms of compression and quality. That's why I made sure this comment was added to the 18181-2 spec when we added `jghm` to it:
spider-mario
2026-02-06 08:25:31 oh, what happens for those? if the `intensity_target` is at most the requested target luminance, the EETF *should* be the identity, from what I remember
monad
2026-02-06 09:28:32 ```
                          colors/Mpx (mean)
images       Mpx (mean)   |       e3/e1 (mean)
|   share    |   share    colors (mean)   |     tag
27  100.00%  0.71 100.00% 143861  208059  0.87  ITAP
25   92.59%  0.70  91.37% 143384  209634  0.87  visibly_opaque
 2    7.41%  0.83   8.63% 149820  188360  0.83  visibly_transparent

djxl~0.11.0~num_threads0 Mpx/s real (mean)
|     jxl-oxide~0.12.5~j1 Mpx/s real (mean)
|     |     jxl_cli~0.3.0 Mpx/s real (mean)
Mpx/s CPU (mean)    |     |     |     included in 'all'
B        bpp (mean) |     Mpx/s real (mean)     |     |     |     densest of
19350294 8.1038437  0.0853700 0.0953113  10.43  9.470 4.311 · all
19369070 8.1116712  0.112763  0.121330   10.18  9.306 4.239 A cjxl~0.11.0~d0e10
19481373 8.1544545  0.36689   0.46680    10.73  9.639 4.423 A cjxl~0.11.0~d0e9
19583839 8.1964819  0.54253   0.67578    10.97  9.936 4.503 · cjxl~0.11.0~d0e8
19736715 8.2558884  1.2341    1.7184     11.31  10.26 4.635 · cjxl~0.11.0~d0e7
20027213 8.3722803  1.6778    2.498      12.39  11.38 4.995 · cjxl~0.11.0~d0e6
20208408 8.4455025  2.2010    3.601      14.13  13.03 5.690 · cjxl~0.11.0~d0e5
20770086 8.6647737  3.614     6.242      17.40  15.28 7.197 · cjxl~0.11.0~d0e4
20938326 8.7340667  6.361     21.7       19.93  15.79 8.549 · cjxl~0.11.0~d0e3
22628973 9.4555266  9.383     25.3       36.6   40.8  15.17 · cjxl~0.11.0~d0e2
23948419 9.9971001  29.7      51.8       45.4   47.5  16.80 · cjxl~0.11.0~d0e1
```
VcSaJen
2026-02-06 09:49:31 What's the fourth column? Encoding speed? I'm surprised that djxl is slower than jxl-oxide; is it a clang or MSVC build? Will you do the same for PNG and lossless WebP with the most common parameters, for comparison's sake?
monad
2026-02-06 10:00:15 Yes, third and fourth columns are encode speed. djxl is built with clang. I too was suspicious about the speed difference (first time trying jxl-oxide), so I tried writing output too and saw both decoders take more time while jxl-oxide was still faster. IDK if it is strictly a fair comparison, but jxl-oxide does actually do more work as encode effort increases. Such difference may be mitigated with multithreading. Yes, I will see about comparing other formats. I am not so sure about what is a representative tool for PNG decode.
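For reference, the Mpx/s figures in tables like the above are just total decoded megapixels divided by wall-clock time; a hypothetical sketch of the arithmetic (the corpus size, file glob, and `date +%s.%N` resolution are all assumptions, not monad's actual harness):

```shell
# Sketch: decode throughput in Mpx/s = total megapixels / wall-clock seconds.
total_mpx=192.6                 # e.g. 270 photos at ~0.71 Mpx each (illustrative)

start=$(date +%s.%N)            # %N (nanoseconds) is a GNU date extension
# ... run the decoder over the corpus here, e.g.:
#   for f in *.jxl; do djxl "$f" /dev/null; done
end=$(date +%s.%N)

elapsed=$(awk -v a="$start" -v b="$end" 'BEGIN { print b - a }')
awk -v mpx="$total_mpx" -v t="$elapsed" \
  'BEGIN { if (t > 0) printf "%.2f Mpx/s\n", mpx / t }'
```

Whether "real" wall time or summed per-process CPU time is used changes the number a lot once threading enters the picture, which is part of what the thread-count discussion below is about.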
veluca
2026-02-06 10:01:44 is that all single threaded?
monad
2026-02-06 10:02:18 Decode is all single-threaded.
2026-02-06 10:02:32 (num_threads0, j0)
veluca
2026-02-06 10:02:58 ok, the numbers surprise me
2026-02-06 10:03:20 I guess I should ask how you measure those mp/s
VcSaJen
2026-02-06 10:03:51 > Sets the number of threads to use. The default 0 value means the machine default. Is "machine default" single-threaded?
veluca
2026-02-06 10:04:08 I don't think so! probably # cores
monad
2026-02-06 10:04:36 Oh, reading error?
veluca
2026-02-06 10:05:35 (have to say that the djxl to jxl-rs gap is also interesting, I probably should dig more into it)
monad
2026-02-06 10:07:13 Where is this written? I double-checked `jxl-oxide --help` which just says `Number of parallelism to use`
2026-02-06 10:08:05 Granted, poor assumption that behavior should match libjxl.
veluca
2026-02-06 10:08:33 I think djxl's thread count setting is very unusual 😛
VcSaJen
2026-02-06 10:10:51 https://manpages.debian.org/unstable/libjxl-tools/djxl.1.en.html (I just launched `djxl --help -v`, looks like that that webpage is wrong)
monad
2026-02-06 10:13:05 Okay, updated that table with `jxl-oxide -j 1`.
veluca
2026-02-06 10:14:46 ok, then the numbers make more sense 🙂
2026-02-06 10:14:59 mind trying if latest git version of jxl-rs is better?
monad
2026-02-06 10:26:31 including latest jxl-rs ```
djxl~0.11.0~num_threads0 Mpx/s real (mean)
|     jxl-oxide~0.12.5~j1 Mpx/s real (mean)
|     |     jxl_cli~0.3.0 Mpx/s real (mean)
|     |     |     jxl_cli~0.3.0-20-g69e7892 Mpx/s real (mean)
Mpx/s CPU (mean)    |     |     |     |     included in 'all'
B        bpp (mean) |     Mpx/s real (mean)     |     |     |     |     densest of
19350294 8.1038437  0.0853700 0.0953113  10.43  9.470 4.311 4.419 · all
19369070 8.1116712  0.112763  0.121330   10.18  9.306 4.239 4.339 A cjxl~0.11.0~d0e10
19481373 8.1544545  0.36689   0.46680    10.73  9.639 4.423 4.530 A cjxl~0.11.0~d0e9
19583839 8.1964819  0.54253   0.67578    10.97  9.936 4.503 4.607 · cjxl~0.11.0~d0e8
19736715 8.2558884  1.2341    1.7184     11.31  10.26 4.635 4.760 · cjxl~0.11.0~d0e7
20027213 8.3722803  1.6778    2.498      12.39  11.38 4.995 5.162 · cjxl~0.11.0~d0e6
20208408 8.4455025  2.2010    3.601      14.13  13.03 5.690 5.669 · cjxl~0.11.0~d0e5
20770086 8.6647737  3.614     6.242      17.40  15.28 7.197 7.522 · cjxl~0.11.0~d0e4
20938326 8.7340667  6.361     21.7       19.93  15.79 8.549 9.015 · cjxl~0.11.0~d0e3
22628973 9.4555266  9.383     25.3       36.6   40.8  15.17 18.56 · cjxl~0.11.0~d0e2
23948419 9.9971001  29.7      51.8       45.4   47.5  16.80 18.36 · cjxl~0.11.0~d0e1
```
veluca
2026-02-06 10:28:49 so fairly minor improvements
2026-02-06 10:28:55 well, more optimization needed 😄
Quackdoc
2026-02-06 03:49:05 last time I tried it it just didn't do anything but I can try again tonight hopefully.
Orum
2026-02-09 09:36:29 this is a bit weird... when I screenshot (spectacle with PPM output piped to cjxl) and set the metadata correctly to what my display is (`-x color_space=DisplayP3`), the image looks oversaturated when viewing it, but if I *don't* set the metadata it looks fine? 🤔
2026-02-09 09:37:33 I guess it's doing some double-conversion, where spectacle is capturing it after some display adjustment has been applied?
RaveSteel
2026-02-09 09:38:37 I think spectacle only supports 8bpc RGB 709
Orum
2026-02-09 09:42:36 looks like flameshot might work?
spider-mario
2026-02-09 12:20:45 or perhaps the opposite, it somehow captures an sRGB buffer before it has been converted to the Display P3 of your display? (is this on Wayland?)
Orum
2026-02-09 12:21:27 yeah, wayland
2026-02-09 12:23:11 I'm just looking for any tool that will capture the currently active window (with no interaction) and use the same gamut as my display, but I've yet to find anything that will do this <:FeelsSadMan:808221433243107338>
spider-mario
2026-02-09 12:24:19 is the content actually wide-gamut?
Orum
2026-02-09 12:30:51 sometimes
2026-02-09 12:31:39 not sure how to tell easily if I'm using color outside of sRGB's primaries or not...
spider-mario
2026-02-09 12:42:59 you could try screenshotting “WebKit logo” from https://webkit.org/blog-files/color-gamut/
Orum
2026-02-09 01:13:08 well I don't expect that to work (at least, not in my browser anyway)
2026-02-09 01:13:20 I suppose I can download it and try in a different app though...
2026-02-09 01:20:48 huh, I just noticed cjxl seems to change the metadata even when you feed it a jxl as input
2026-02-09 01:22:02 e.g. input file is: `Color space: RGB, D65, sRGB primaries, gamma(0.454545) transfer function, rendering intent: Relative` and recompressing it with `-d 0 -e <whatever>` then shows the new output as: `Color space: 508-byte ICC profile, CMM type: "jxl ", color space: "RGB ", rendering intent: 1`
2026-02-09 01:22:14 why are you like this cjxl? <:NotLikeThis:805132742819053610>
2026-02-09 01:23:48 *anyway* that logo is visible if I open the png in mpv, and still visible when I screenshot it either using spectacle or mpv's internal screenshot function
2026-02-09 01:23:59 though I think mpv might be tonemapping to sRGB? <:Thonk:805904896879493180>
2026-02-09 01:24:06 not sure how to tell TBH
jonnyawsom3
2026-02-09 01:25:17 This? https://github.com/libjxl/libjxl/issues/2581
Orum
2026-02-09 01:26:06 yeah, is that still broken?
jonnyawsom3
2026-02-09 01:39:37 Says it's fixed, what version are you using?
Orum
2026-02-09 01:39:56 did you read the comments, because it sounds like it wasn't fixed
2026-02-09 01:40:11 and `cjxl v0.11.1`
jonnyawsom3
2026-02-09 01:41:59 Try main https://github.com/libjxl/libjxl/pull/2635
RaveSteel
2026-02-09 01:42:05 same with main
2026-02-09 01:42:07 just tried
jonnyawsom3
2026-02-09 01:42:14 Huh, interesting
Orum
2026-02-09 01:42:29 reopen pls 🙏
2026-02-09 01:45:40 anyway it looks like the target colorspace in mpv *might* not be configured correctly...
2026-02-09 01:45:59 impossible to tell because the mpv docs are vague on the subject
2026-02-09 01:52:11 yeah, the info in MPV is showing my display as using sRGB primaries, while I've configured display P3 on the display itself and in kwin
2026-02-09 01:52:38 so something is not configured correctly, and I think it's mpv
RaveSteel
2026-02-09 01:54:38 ffmpeg also does not recognise the color information properly in a reencoded file
2026-02-09 01:55:13 Original: jpegxl (libjxl), rgb48le(10 bpc, pc, gbr/bt2020/arib-std-b67) Reencode: jpegxl (libjxl), rgb48le(10 bpc, pc, gbr/unknown/unknown)
Orum
2026-02-09 02:18:12 well, I'm pretty sure P3 isn't working properly on my display anyway (it actually just says 'DCI', which I *assume* is P3, but they don't explicitly state P3 anywhere), so I'm just going to configure it for 2020 instead
2026-02-09 02:18:41 now to figure out how to prevent mpv from tonemapping this test image...
2026-02-09 02:30:47 Okay, this is rather interesting. If I screenshot with spectacle, I get this, which (when viewed in nomacs, which I think has color management now) appears the same as when I actually look at it in mpv:
2026-02-09 02:31:19 ...*but* if I use mpv's internal screenshot function, I get this, which does not look the same:
2026-02-09 02:32:00 in the former (and within the mpv window itself) I can see the logo, but in the internal screenshot I can't
2026-02-09 02:33:20 which I guess makes sense, seeing as it claims: `Color space: RGB, D65, sRGB primaries, gamma(0.454545) transfer function, rendering intent: Relative`, so it's presumably just clipping to sRGB
RaveSteel
2026-02-09 02:46:36 you need to modify the mpv config to get proper screenshots of non-srgb content
2026-02-09 02:47:05 try these ``` screenshot-sw=yes screenshot-tag-colorspace=yes ```
Orum
2026-02-09 02:49:34 thanks, I'll give that a shot
2026-02-09 02:49:51 also realized I forgot to uncomment the transfer characteristics so it wasn't using 1886
2026-02-09 02:52:36 I still get `Color space: RGB, D65, sRGB primaries, sRGB transfer function, rendering intent: Relative` even with those options
RaveSteel
2026-02-09 02:54:26 hm, true. but the pixel data is identical. Probably because the original file has an ICC, which mpv does not pass to the screenshot
Orum
2026-02-09 02:55:14 well even if the pixel data is correct it won't display properly anywhere
RaveSteel
2026-02-09 02:55:19 Ye, exiftool shows the original has an attached ICC for P3
Orum
2026-02-09 02:56:19 as annoying as it is, I think I should just keep using spectacle; even if it limits me to sRGB, at least it appears to do some tone mapping
RaveSteel
2026-02-09 02:56:34 You could try FFmpeg for screenshots
Orum
2026-02-09 02:56:53 well what I really want is a way to screenshot *any* window, with wide gamut, not just video
RaveSteel
2026-02-09 02:57:36 I think it may be possible, in theory at least, to capture a wayland window with FFmpeg and capture a single frame as a screenshot
2026-02-09 02:57:47 No idea if that is possible in practice though
Orum
2026-02-09 03:00:51 well at least I have it outputting to the display in wide gamut with the proper transfer now
2026-02-09 03:01:44 I can't say the same for nomacs though
RaveSteel
2026-02-09 03:01:46 do you have target-colorspace-hint=auto target-peak=auto in your config?
Orum
2026-02-09 03:02:43 no, as using `target-colorspace-hint` set to *anything* other than 'no' would *always* give me 709 primaries & sRGB transfer
2026-02-09 03:02:59 so to get it to work I had to use: ``` target-colorspace-hint=no target-prim=bt.2020 target-trc=bt.1886 ```
RaveSteel
2026-02-09 03:03:02 ah, yes, you need to have an hdr display for it to work properly
Orum
2026-02-09 03:03:13 well I have a HDR display but I leave HDR off
2026-02-09 03:03:20 I *do* want WCG though
2026-02-09 03:05:07 spectacle is really the only annoyance now (well internal screenshots are still bugged on mpv but that's an issue for another day)
RaveSteel
2026-02-09 03:05:40 feature request for hdr for spectacle https://bugs.kde.org/show_bug.cgi?id=502053
2026-02-09 03:05:50 no movement currently though
Orum
2026-02-09 03:06:39 reported: 2025-03-27 💀
RaveSteel
2026-02-09 03:06:49 yeah
Orum
2026-02-09 03:06:50 instead they are adding OCR, which I don't think anyone asked for?
RaveSteel
2026-02-09 03:07:15 I think OCR is more useful for most than HDR, but both are rather niche atm
Orum
2026-02-09 03:07:26 OCR can be done in a separate app
2026-02-09 03:07:41 once the HDR and/or WCG info is lost, you can't recover it
2026-02-09 03:08:28 and worst part is most people probably don't even realize it's getting lost as it does some tone mapping
RaveSteel
2026-02-09 03:11:35 of course
Orum
2026-02-09 03:12:12 >Windows >PipeWire <:WhatThe:806133036059197491>
Quackdoc
2026-02-09 07:17:24 lmao
Orum
2026-02-09 07:21:13 humans making artificial stupidity when we already have more than enough natural stupidity in the world 😮‍💨
ox
2026-02-09 08:31:28 Memory safety … _"We would welcome contributions to integrate a performant and memory-safe JPEG XL decoder in Chromium."_ https://www.januschka.com/chromium-jxl-resurrection.html
Laserhosen
2026-02-09 10:41:46 Yes, found the same issue in December and discussed in <#804324493420920833>
2026-02-09 10:41:59
2026-02-09 10:43:58 Reverting the change to jxl.cc seems to be enough.
Demiurge
2026-02-10 01:13:11 I dunno if that's the entire reason... It was the VP8/webp/avif guy who basically decided on his own to kill JXL and it was someone else from a different part of the Chrome project who stepped in and announced that JXL is coming back. The left hand is not aware of what the right hand is doing, in an organization of this size.
whatsurname
2026-02-10 02:04:16 Ah, conspiracy. Not being the first one to ship it doesn't mean trying to kill it. It was never a final decision and can always be revisited in the future, which is what happened now. At the time Chrome dropped JXL support (technically it was never supported, because behind a flag is considered an experiment and subject to change at any time), neither Firefox nor Safari wanted it. Then one year later, Safari shipped it, and another year later, Firefox changed their position.
Demiurge
2026-02-10 04:49:46 By kill it I mean have all the existing code ripped out, despite chromium having the most complete and solid jxl support of any software at the time, with animation and HDR support, which was rare even in dedicated image viewers
2026-02-10 04:54:12 It wasn't "revisited," it was literally over-ruled by a different part of the Chrome team. The Chrome Codec Team, led by the VP8/webp guy, wanted to axe it completely with an utterly bizarre announcement that made everyone think he saw the existence of jxl as some sort of personal attack on his very pride and life work.
2026-02-10 04:55:50 I guess in a sense it was revisited, but not by the same team that axed it.
2026-02-10 05:05:45 Some people at Google even made a cool demo of progressive region-of-interest middle-out decoding right before the VP8 team announced:
> - There is not enough interest from the entire ecosystem to continue experimenting with JPEG XL
> - The new image format does not bring sufficient incremental benefits over existing formats to warrant enabling it by default
Like I said, with an organization this size, the left hand truly doesn't know what the right hand is doing. It's not a conspiracy either, because the word "conspiracy" means multiple people are conspiring to do something, but this decision was made by the Chrome Codec Team, who also pushed the webp and avif formats despite a lack of interest from the broader ecosystem and the lack of incremental improvements over their predecessors.
2026-02-10 05:06:37 When a single faction or individual chooses to do something and acts unilaterally, that isn't a conspiracy, that's just some dude.
2026-02-10 05:10:40 Walking out-of-step with everyone else, to his own separate rhythm
whatsurname
2026-02-10 07:46:10 Ew, I don't know where you get all those personal attacks from. And I don't know about the lack of interest and incremental improvements part. WebP offers smaller size (which mozjpeg caught up with later), alpha, lossless, and animation. People had been asking for a JPEG replacement with those features for years, and it was competing against JPEG XR for this back then. For AVIF, it's smaller size + HDR. Firefox decided to implement it, and YouTube, Netflix, Facebook* all showed interest in it.
* Facebook is a little different, they also showed interest in WebP before but didn't fully adopt it. They didn't adopt AVIF either, only did some experiments.
derberg🛘
2026-02-10 11:49:24 The issue is also the communication from Google.
2026-02-10 11:49:49 They published some measurements but then never really followed up with any response or am I missing something?
2026-02-10 11:50:18 And even the interop process has been opaque
2026-02-10 11:50:51 (By design but well)
username
2026-02-10 11:51:45 there's what seems like a mix of some miscommunication and ¿misinformation? here across some of these messages that I don't even know how to begin to try attempting to respond
2026-02-10 11:51:46 uhh
2026-02-10 11:51:47 hmm
derberg🛘
2026-02-10 11:51:55 Also it was known for years that PDF prefers JXL
2026-02-10 11:52:12 They said they are evaluating a new format with preference for that
whatsurname
2026-02-10 12:02:19 I think that's just people misunderstanding how interop works
derberg🛘
2026-02-10 12:03:03 Message below yes
2026-02-10 12:03:35 It is still annoying when there is basically zero info on why it did not make it
username
2026-02-10 12:04:00 the WPT Interop stuff is for making sure that things that all browsers have or agree on having works consistently between them
2026-02-10 12:04:26 JXL didn't get accepted because Chrome did not want it and Mozilla didn't care unless Chrome did
derberg🛘
2026-02-10 12:04:37 Yeah, but that is an assumption
2026-02-10 12:04:45 We don't know who voted how
username
2026-02-10 12:05:48 first message still stands tho. Interop isn't for making browsers support things it's for making sure that things all browsers already agree on supporting work between them the same
2026-02-10 12:06:24 a lot of people seemingly didn't get that and were confused as to why JXL kept getting declined
derberg🛘
2026-02-10 12:07:39 Really? Huh
Quackdoc
2026-02-10 02:08:23 *a* member of the chrome team posted a severely biased and either intentionally misrepresentative or extremely incompetent comparison of AVIF vs JXL that can only really be taken as lies about JXL in order to justify its removal from chrome, then they lied about the lack of industry interest in jxl
2026-02-10 02:10:29 that's wrong there are features in there specific to some browsers, WEBUSB is an example of this
Demiurge
2026-02-10 02:32:23 I don't know why people want to sugarcoat this. Specifically it was removed by the Codec Team led by the supervisor responsible for pushing webp/avif despite a severe lack of ecosystem interest for his crippled image formats like webp that was literally a step backwards from JPEG, or avif which is a step backwards from PNG, and in webp's case other browsers were hostile to its adoption until youtube suddenly started using webp for thumbnails with no fallback, essentially forcing all other browsers to adopt it unless they want their users to notice youtube looks broken. That decision was something Jim was in charge of too.
2026-02-10 02:34:30 He was hired after the On2 acquisition to be put in charge of what codecs are used by YouTube and Chrome
2026-02-10 02:38:03 And he used that position to push his own personal projects, which is fine, until he made that unexpected announcement about jxl as if he perceived the format to be some sort of personal threat to his life work
username
2026-02-10 02:38:40 attributing big/(bad) decisions to named singular individuals is really not the way to go about things, this was a group/team failure in decision making whether said decision had ulterior motives or not. going about this the way you are is just going to cause people to not take you seriously, I'm sorry.
Demiurge
2026-02-10 02:39:19 I dunno, I was raised to think that leaders are supposed to take responsibility for their team
2026-02-10 02:41:23 That's kinda the whole point of being a leader is taking responsibility
2026-02-10 02:41:43 At least that's how I was brought up.
jonnyawsom3
2026-02-10 02:43:42 Can we just... Not? Every time Chrome is mentioned, you give the same speech about how the codec team is bad and led by 1 guy... There's only a dozen of us active here, so believe us, we know
Demiurge
2026-02-10 02:48:59 I just wanted to clarify that it was a different team, the Audit Team, that seems to have made the reversal.
2026-02-10 02:49:36 There are separate teams and factions developing chrome who aren't all on the same page
VcSaJen
2026-02-10 02:54:45 Blaming a visible scapegoat is not helpful, but blaming a nebulous "corporation" is also not helpful. I dunno how to prevent big corpos from abusing their power, but in this case we don't need to think about that, because it already happened: JPEG XL <:jxl:1300131149867126814> is being added back to Chrome.
whatsurname
2026-02-10 02:57:50 > a severely biased and either intentionally misrepresentative or extremely incompetent comparison of AVIF vs JXL that can only really be taken as lies about JXL in order to justify its removal from chrome
It has a detailed verification process and you can reproduce it yourself. The data is correct and it does show some advantages of JXL; it's just a question of how you interpret them and whether those advantages can justify introducing a new format.
> then they lied about lack of industry interest in jxl
They said "the ecosystem", and I think that means the web specifically, rather than other industries. Like JPEG 2000 is popular in the film industry, yet no web browser wants to support it. As I said, Firefox and Safari didn't want JXL at that time.
> that's wrong there are features in there specific to some browsers, WEBUSB is an example of this
WebUSB was never part of interop.
Quackdoc
2026-02-10 03:00:53 is webusb not part of it? I thought they added it. but in any case
> the test
the data set used was tiny, the libjxl version was old, the metrics they used make no sense. And even the data they did collect was represented with extreme bias.
> web use
tons of major web devs were asking for it, meta, shopify, adobe etc
username
2026-02-10 03:00:54 > They said "the ecosystem", and I think that means the web specifically, rather than other industries. Like JPEG 2000 is popular in the film industry, yet no web browser wants to support it. As I said, Firefox and Safari didn't want JXL at that time.
there were big companies within the web ecosystem directly asking for JXL support, and Mozilla was working on adding support to their browser at around the same time
2026-02-10 03:03:34 also I feel there's probably a conflict of interest if the people behind deciding that JXL gets removed are supporters/devs of AVIF and labeled their tests (which were published a month or two after JXL was removed) as being from the "AVIF team"
whatsurname
2026-02-10 03:04:04 I won't say Firefox was working on it, they had an initial implementation locked in nightly and refused patches to improve it
username
2026-02-10 03:04:39 I would have to look back but I thought Firefox stopped working on it after Chrome removed support?
2026-02-10 03:04:56 IIRC that's when they started refusing patches
Demiurge
2026-02-10 03:04:58 Safari supported j2k for many years until they only very recently removed it, after adding JXL support.
VcSaJen
2026-02-10 03:05:23 Firefox was working on it, until it wasn't. The decision was just silent, not even the dev responsible for integration knew.
Demiurge
2026-02-10 03:05:28 But no other browser wanted to follow Safari with J2K
username
2026-02-10 03:06:36 It's very likely they were gearing up to add support for when Chrome did, and then when Chrome suddenly dropped JXL the people at Mozilla probably thought "oh we don't need to care about this anymore"
whatsurname
2026-02-10 03:06:52 https://phabricator.services.mozilla.com/D119700#3977128 2021/08, the Chrome removal was 2022/11
username
2026-02-10 03:12:35 thanks for the dates ! I had been meaning to verify how accurate my memory was, seems like it was pretty off for the timeline around that.
Demiurge
2026-02-10 03:22:08 I think Steam uses Chromium for its UI engine. As well as many other applications. It's only a matter of time before jxl starts appearing literally everywhere now. And then the plan for world domination will finally be complete
2026-02-10 03:22:45 Everyone underneath the tight iron grip of compression
username
2026-02-10 03:23:05 the sheer reach of Chromium is horrifying tbh
Quackdoc
2026-02-10 03:23:36 that's what happens with no real alternatives
Demiurge
2026-02-10 03:23:43 Agreed... Someone needs to make a better UI toolkit...
2026-02-10 03:24:12 I can see it now. JPEG-UI
Quackdoc
2026-02-10 03:24:16 lol
whatsurname
2026-02-10 03:24:24 It's not that easy, steam still needs to do something in their backend
username
2026-02-10 03:25:28 eh, only halfway. you can still smuggle through Chromium-supported files that Steam doesn't recognize if you try hard enough
2026-02-10 03:25:45 I've seen people do it and I've done so a few times myself for some things
_wb_
2026-02-10 03:26:42 I think it was just incompetence, since the biggest flaws were:
- they were looking at a largely insignificant quality range, just treating each quantizer setting (0..63) of avifenc as equally important, while the high quantizer settings are just irrelevant for still images;
- they measured speed and quality mostly independently, basically claiming AVIF is fast _and_ good because s9 is fast and s0 is good;
- they used BD rate to summarize results, which further emphasizes irrelevant qualities (most metrics have way more range at super low qualities than at usable qualities)
These are all quite common mistakes, particularly among video codec developers; depending on who you ask, it may not even be considered a mistake but just "the way we always do this".
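[For context: a minimal sketch of the classic Bjøntegaard-delta rate computation being criticized above. The cubic log-rate fit and uniform averaging over the quality axis are the standard textbook construction, not the AVIF team's exact script; it illustrates why rate-distortion points in an irrelevant quality range get equal weight in the summary number.]

```python
import numpy as np

def bd_rate(rates_a, quals_a, rates_b, quals_b):
    """Bjontegaard-delta rate: average % bitrate difference of codec B vs
    codec A over the overlapping quality range. Every quality level in that
    range contributes equally, whether or not it is visually relevant."""
    # Fit log(rate) as a cubic polynomial of the quality metric for each codec.
    pa = np.polyfit(quals_a, np.log(rates_a), 3)
    pb = np.polyfit(quals_b, np.log(rates_b), 3)
    # Integrate both fits over the shared quality interval.
    lo = max(min(quals_a), min(quals_b))
    hi = min(max(quals_a), max(quals_b))
    int_a = np.polyval(np.polyint(pa), hi) - np.polyval(np.polyint(pa), lo)
    int_b = np.polyval(np.polyint(pb), hi) - np.polyval(np.polyint(pb), lo)
    # Difference of mean log-rates -> percent rate difference.
    avg_log_diff = (int_b - int_a) / (hi - lo)
    return (np.exp(avg_log_diff) - 1) * 100

# Codec B spending exactly twice the bits of codec A at every quality
# comes out as a +100% BD rate.
print(bd_rate([1, 2, 4, 8], [30, 40, 50, 60],
              [2, 4, 8, 16], [30, 40, 50, 60]))
```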
Demiurge
2026-02-10 03:29:00 I thought it was funny how you pointed out that their own graphs and charts show avif consistently losing in butteraugli and ssimu2
2026-02-10 03:29:30 Yet they claimed victory
2026-02-10 03:30:39 Crossing the finish line last but telling everyone that you won
whatsurname
2026-02-10 03:31:19 I mean they didn't claim AVIF won, what they said is JXL needs to win by more
2026-02-10 03:32:38 That just leads to how much is enough
Demiurge
2026-02-10 03:43:28 Or the two formats would just coexist and fill two separate roles and niches, like everyone assumed would be the case at first
whatsurname
2026-02-10 03:48:38 Yeah, AVIF alone never suffices, so there's gonna be another format eventually
username
2026-02-10 03:50:08 my problem with AVIF is that it could never fully replace older formats like PNG, JPEG, and WebP because of its downsides
2026-02-10 03:50:48 AVIF2/AVIF-with-AV2 seems like it's going to have the same issue with not being able to fully fit the roles of older formats
2026-02-10 03:51:57 better lossy compression, yes, but it just carries too many limitations IMO
2026-02-10 03:53:25 I still wonder exactly what's going to happen with AV2? do they plan to make a AVIF2 or just start allowing AV2 streams in AVIFs? or I guess it's possible they could just decide not to do anything image format related
whatsurname
2026-02-10 03:55:32 That's just what I thought. It has so many regressions compared to WebP: no incremental decoding, bad lossless, and even animation isn't that good considering the increased complexity and resource consumption
2026-02-10 03:57:49 They'll probably just treat AVIF as a container format and put AV2 payloads in it, at least that's what the current implementation in libavif does
2026-02-10 03:59:03 But I wonder how that's gonna work with the accept header
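[For context: the Accept header only names media types, so an AV1-only decoder and an AV2-capable one would advertise the same `image/avif` token. The header values below are illustrative, not a proposed mechanism.]

```
# What a browser sends today -- the media type names the container, not the codec:
Accept: image/avif,image/webp,image/png,image/*;q=0.8

# Video MIME types can disambiguate via the RFC 6381 "codecs" parameter
# (e.g. video/mp4; codecs="av01.0.05M.08"), but there is no established
# convention for doing that in Accept for image formats, so a server can't
# tell whether the client can decode an AV2 payload inside AVIF.
```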
username
2026-02-10 04:00:59 they might try to do a similar rollout to when they added YCgCo-R to the format: https://github.com/AOMediaCodec/libavif/issues/2077
2026-02-10 04:02:17 add decoder support then wait until "enough" software has support for it then enable it encoder side
2026-02-10 04:02:51 which IMO feels kinda like ¿cheating?
2026-02-10 04:03:45 imagine if the same thing happened to PNG and then loads of software would just fail saying something like "unsupported PNG type!"
Demiurge
2026-02-10 04:07:12 This is a cool idea because nothing stops people from adding decoder support and amending the standard, and then just waiting and observing until the old unsupported decoders become extinct.
2026-02-10 04:08:03 It's a good way to evolve standards imo. The "wait and observe" strategy
2026-02-10 04:08:47 Like for example the current JXL decoders support many features that no encoder actually uses
whatsurname
2026-02-10 04:08:49 YCgCo-R is a bit different, it's just color transforms and technically can also be used by other formats
Demiurge
2026-02-10 04:10:23 Why not add one more feature to the decoder? And just never release an encoder that uses it until the old decoders are extinct
2026-02-10 04:10:41 Doesn't seem to have a downside
2026-02-10 04:15:33 As long as you can observe what decoders are out there in the wild
AccessViolation_
2026-02-10 04:15:49 one potential issue is that people will make their own decoders and not bother to implement those features as they're not used
Demiurge
2026-02-10 04:16:22 That's why you also clarify the spec "if the file has this, then do this"
2026-02-10 04:17:38 So people building their own decoders know what they should do if they encounter one of those files