|
jonnyawsom3
|
2025-10-21 06:35:36
|
AFAIK it's similar to lossy PNG, tweaking values to better match prediction, or not storing residuals and only using prediction. I think at one point lossy palette worked that way, but the AvgAll predictor was found to work best, so it was hardcoded (I wanted to expose the other predictors again, but code order changed making it impractical)
|
|
2025-10-21 06:36:27
|
Obviously with lower quality since it was forced to a palette, but same idea
|
|
|
_wb_
|
2025-10-21 06:37:15
|
lossy delta palette is yet another beast, what I was talking about is doing it without delta palette, just on the RGB or YCoCg channel data
|
|
2025-10-21 06:42:15
|
in PNG you can only use the AvgN+W predictor meaningfully (if you don't want to turn gradients into banding), and if you do too much quantization you'll get diagonal blur in that direction.
in JXL there are averaging predictors in various directions from AvgW+NW which is West-heavy to AvgN+NE and AvgAll which are both NorthEast-heavy, so possibly you could get away with more residual quantization before it becomes a noticeable blur (since the blur direction can adapt to the image content)
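A minimal sketch of those predictor shapes (plain Python; the names follow the chat, and the exact AvgAll weighting in libjxl differs, this just shows the directionality):
```python
# Directional averaging predictors as described above. N, W, NW, NE are the
# already-decoded neighbors of the current pixel.
def avg_n_w(n, w, nw, ne):   # PNG's only usable averaging predictor
    return (n + w) // 2

def avg_w_nw(n, w, nw, ne):  # West-heavy
    return (w + nw) // 2

def avg_n_ne(n, w, nw, ne):  # NorthEast-heavy
    return (n + ne) // 2
```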
|
|
|
jonnyawsom3
|
|
_wb_
that's not hard to do, just do squeeze in YCoCg (not XYB, if you want to limit the max error per RGB channel) and quantize away the least significant bit of the last few squeeze residuals. We should probably add an option to do this to cjxl/libjxl.
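Roughly, the idea is (a toy 1-D sketch; the real JXL squeeze step also adds a smoothness "tendency" correction, omitted here):
```python
import numpy as np

# Toy 1-D squeeze: split samples into averages (kept exactly) and residuals,
# then drop the residuals' least significant bit(s). Not the actual JXL
# transform; (avg, res) is still enough to reconstruct the sample pair.
def squeeze_and_quantize(x, bits=1):
    a = x[0::2].astype(np.int32)
    b = x[1::2].astype(np.int32)
    avg = (a + b) >> 1                      # low-pass half, coded losslessly
    res = a - b                             # high-pass residuals
    q = 1 << bits
    res_q = ((res + np.where(res >= 0, q // 2, -(q // 2))) // q) * q
    return avg, res_q                       # res_q costs ~`bits` less per sample
```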
|
|
2025-10-21 06:59:57
|
Would that be a good idea compared to just doing `--override_bitdepth` or quantizing the least significant bit? Lossless squeeze usually adds around 25% to the filesize, so it might add more overhead than it saves
|
|
|
lonjil
|
2025-10-21 07:08:32
|
Looking at libwebp's source code, the algorithm seems to be:
1. Copy the first row and last row exactly, no quantization.
2. Copy the first and last pixels of each row exactly.
3. For each pixel in the current row, check whether its neighborhood is "smooth", if so copy exactly.
4. Simple rounding to the nearest multiple of 2, 4, 8, 16, or 32 (depending on the chosen quality), for every other pixel.
It seems like it applies this algorithm repeatedly when you specify lower qualities, e.g. starting with 32, then 16, then 8, etc.
However, I feel like I must be misunderstanding the code, as rounding to the nearest multiple of 32 does not sound particularly "near" lossless.
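If that reading is right, the whole thing would look roughly like this (a sketch under that reading; the function name and the simplistic smoothness test are illustrative, not libwebp's actual code):
```python
import numpy as np

def near_lossless(img, bits):
    """Round interior, non-smooth pixels to a multiple of 2**bits."""
    out = img.copy()
    q = 1 << bits
    for y in range(1, img.shape[0] - 1):        # 1. keep first/last row exact
        for x in range(1, img.shape[1] - 1):    # 2. keep first/last pixel exact
            n, s = img[y - 1, x], img[y + 1, x]
            w, e = img[y, x - 1], img[y, x + 1]
            if n == s == w == e:                # 3. "smooth" neighborhood: copy
                continue
            # 4. round to the nearest multiple of 2**bits
            out[y, x] = min(255, (int(img[y, x]) + q // 2) & ~(q - 1))
    return out
```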
|
|
|
A homosapien
|
|
_wb_
we should probably at some point implement a cheaper rendering pipeline that does everything or at least most stuff in fixedpoint int16 arithmetic (except maybe the largest inverse dct transforms which require more precision than that), to reduce memory pressure and double the simd lanes... at least for Level 5 conformance this should be possible I think.
|
|
2025-10-21 07:53:52
|
Is this planned to be in jxl-rs in some capacity?
|
|
|
Tirr
|
2025-10-21 08:07:53
|
render pipeline interface is designed to support such cases if I got it right
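For context, the int16 fixed-point arithmetic _wb_ mentions boils down to something like this (a toy Q8.8 sketch, not jxl-rs or libjxl code):
```python
# Q8.8 fixed point: an int16 holds value * 256, halving memory traffic vs
# float32 and doubling the number of SIMD lanes per register.
def to_fix(x: float) -> int:
    return int(round(x * 256))            # float -> Q8.8

def fix_mul(a: int, b: int) -> int:
    return (a * b + 128) >> 8             # Q8.8 * Q8.8 -> Q8.8, with rounding

a, b = to_fix(1.5), to_fix(0.25)
print(fix_mul(a, b) / 256)                # 0.375
```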
|
|
|
|
veluca
|
|
_wb_
just doing normal lossless and quantizing predictor residuals (carefully though, making sure the next prediction is based on the decoded values, not the original ones) should work too, I wonder what that looks like actually in jxl where the predictor choice can vary every pixel
|
|
2025-10-21 08:21:54
|
pretty sure we had that at some point, in some sense delta palette is a better version of that
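The closed-loop constraint _wb_ describes, as a toy sketch (trivial West predictor for brevity; not libjxl code):
```python
# Quantize each residual, but always predict from the *decoded* samples, so
# the per-pixel error stays bounded and can't drift.
def encode_row(row, bits=1):
    q = 1 << bits
    coded, recon, prev = [], [], 0
    for v in row:
        pred = prev                              # predict from decoded data
        r = v - pred
        rq = ((r + q // 2) // q) * q if r >= 0 else ((r - q // 2) // q) * q
        coded.append(rq // q)                    # what gets entropy-coded
        recon.append(pred + rq)                  # decoder-side reconstruction
        prev = recon[-1]                         # closed loop: no error drift
    return coded, recon
```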
|
|
|
Tirr
render pipeline interface is designed to support such cases if I got it right
|
|
2025-10-21 08:22:27
|
you did
|
|
2025-10-21 08:22:39
|
now I just need to finish writing the code without going insane 🙂
|
|
|
jonnyawsom3
|
2025-10-21 08:24:39
|
Believe me, we're praying for you haha
|
|
|
lonjil
|
|
RaveSteel
Even "high quality lossy raw" is noticeably worse than full lossless if zooming in to spot edit.
Give people the option for lossy raw, but full lossless will remain the better solution if every detail matters
|
|
2025-10-21 10:18:05
|
ok so with my test image (16 bit linear RGB, big_building.ppm from imagecompression.info), vardct e7 d0.05 is a 3x density improvement over lossless and the loss of fidelity is essentially imperceptible even at 8x zoom and my face real close to the monitor
|
|
2025-10-21 10:22:56
|
huh, I tried d0.01, and got two identical files
|
|
|
jonnyawsom3
|
2025-10-21 10:25:27
|
Minimum distance is 0.05 to avoid going past Level 5 of the spec
|
|
|
lonjil
|
|
jonnyawsom3
|
2025-10-21 10:25:37
|
https://github.com/libjxl/libjxl/pull/4238
|
|
2025-10-21 10:26:50
|
Though, I think lossy modular would work fine? Will have to ask when they're awake again, but right now both are limited to 0.05
|
|
|
Minimum distance is 0.05 to avoid going past Level 5 of the spec
|
|
2025-10-21 10:33:22
|
Though thinking about it... That means this was even worse than I thought https://discord.com/channels/794206087879852103/803574970180829194/1428742173356064880. OIIO would only change from Quality 1-20 and 25x higher quality than normal
|
|
|
lonjil
|
2025-10-21 10:45:00
|
this is making me think q99 should map to d0.05 instead of 0.19
d0.1 was the first I tried for that image, and the quantization was quite perceptible up close. d0.19 is still really really high quality, but maybe not suitable for using lossy encoding for what's supposed to be the high quality original masters of your image, like with raw files.
|
|
|
jonnyawsom3
|
|
lonjil
this is making me think q99 should map to d0.05 instead of 0.19
d0.1 was the first I tried for that image, and the quantization was quite perceptible up close. d0.19 is still really really high quality, but maybe not suitable for using lossy encoding for what's supposed to be the high quality original masters of your image, like with raw files.
|
|
2025-10-21 10:46:26
|
Quality has decimals too for libjxl, though using Distance is generally advised
|
|
|
|
veluca
|
2025-10-21 10:46:26
|
> quantization was quite perceptible up close
could have been gaborish, not sure we turn that off at d0.1 yet
|
|
|
lonjil
|
|
veluca
> quantization was quite perceptible up close
could have been gaborish, not sure we turn that off at d0.1 yet
|
|
2025-10-21 10:46:48
|
I'll try without it
|
|
|
Quality has decimals too for libjxl, though using Distance is generally advised
|
|
2025-10-21 10:47:21
|
yeah, I'm thinking more of tools that present a 0-100 integer quality slider to users.
|
|
|
jonnyawsom3
|
|
veluca
> quantization was quite perceptible up close
could have been gaborish, not sure we turn that off at d0.1 yet
|
|
2025-10-21 10:47:49
|
Gaborish turns off at 0.5 https://github.com/libjxl/libjxl/pull/3563
|
|
2025-10-21 10:48:40
|
Unrelated, but we still think there's been a regression with Gaborish between the encoder and decoder, but it's hard to test
|
|
|
lonjil
|
2025-10-21 10:50:09
|
oh, I can add, the perceptible quantization was mostly in dark areas
|
|
|
jonnyawsom3
|
|
lonjil
this is making me think q99 should map to d0.05 instead of 0.19
d0.1 was the first I tried for that image, and the quantization was quite perceptible up close. d0.19 is still really really high quality, but maybe not suitable for using lossy encoding for what's supposed to be the high quality original masters of your image, like with raw files.
|
|
2025-10-21 10:51:41
|
After checking, q99.99 only goes to d0.1 anyway, which may be good enough for most use cases but could probably be tweaked a bit. The real solution would be to avoid quantizing dark areas so harshly
|
|
2025-10-21 10:52:59
|
I think part of it is not taking into account higher bitdepth, assuming the image won't get brightened. Having it boost the blacks as bitdepth goes up would be nice
|
|
|
lonjil
|
2025-10-21 10:54:00
|
unless something has changed, I feel like dark areas are always more quantized (according to subjective perception of what is more noticeable) than light areas
|
|
|
jonnyawsom3
|
2025-10-21 10:55:26
|
Yeah, but it's especially noticeable in higher bitdepths where you usually want to change the brightness after
|
|
|
lonjil
|
2025-10-21 10:55:49
|
mm, that makes sense
|
|
|
_wb_
|
2025-10-22 06:13:55
|
the lossy encoder doesn't care about bitdepth, it gets float32 input with a nominal range of [0,1] and it doesn't know what bit depth that has been
|
|
2025-10-22 06:16:14
|
there is intensity_target which defaults to 255 and corresponds to the number of nits the color (1,1,1) is assumed to get rendered at
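Concretely (a toy illustration assuming a pure power-law transfer for brevity; libjxl's actual color handling is more involved):
```python
# How a nominal [0,1] value maps to emitted light for a given intensity_target.
def nominal_to_nits(v, intensity_target=255.0, gamma=2.2):
    return intensity_target * (v ** gamma)   # (1,1,1) -> intensity_target nits

print(nominal_to_nits(1.0))     # 255.0 nits, the assumed peak
print(nominal_to_nits(0.05))    # ~0.35 nits: deep-shadow detail
```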
|
|
2025-10-22 06:18:36
|
probably 255 is a bit on the low side for current displays. E.g. my laptop goes to 600 nits in SDR mode when setting brightness to max, and newer ones even go to 1000 nits now in SDR mode
|
|
2025-10-22 06:20:34
|
so if you're viewing dark areas with display brightness cranked up (and even more importantly: zoomed in so there are no bright parts visible anymore, which allows your eyes to adapt), you will see more than what the encoder is assuming you can see
|
|
|
|
veluca
|
|
veluca
now I just need to finish writing the code without going insane 🙂
|
|
2025-10-24 09:23:41
|
slowly making progress, but some preliminary benchmarks are promising 😄
|
|
|
jonnyawsom3
|
2025-10-24 09:32:38
|
Oooh, looking forward to testing it haha
|
|
|
lonjil
|
2025-10-24 10:49:36
|
Woo!
|
|
2025-10-25 12:30:46
|
incidentally, I haven't been following the SIMD saga for a while, how'd the additions to Rust go, and what approach did y'all end up taking in jxl-rs?
|
|
|
A homosapien
|
|
KKT
|
|
_wb_
the lossy encoder doesn't care about bitdepth, it gets float32 input with a nominal range of [0,1] and it doesn't know what bit depth that has been
|
|
2025-10-27 02:30:16
|
So this post inspired me to take a run at writing an article about it. Cause people clearly don't understand.
https://www.fractionalxperience.com/ux-ui-graphic-design-blog/why-jpeg-xl-ignoring-bit-depth-is-genius
|
|
|
Demiurge
|
2025-10-27 05:30:23
|
Wow nice
|
|
2025-10-27 05:36:48
|
Well written. Laudable job
|
|
|
spider-mario
|
2025-10-27 08:33:03
|
would have come in handy for this exchange: https://www.dpreview.com/news/5829652105/jpeg-xl-image-format-promises-smaller-files-backwards-compatibility-and-more?comment=4250301462
|
|
|
Lilli
|
|
KKT
So this post inspired me to take a run at writing an article about it. Cause people clearly don't understand.
https://www.fractionalxperience.com/ux-ui-graphic-design-blog/why-jpeg-xl-ignoring-bit-depth-is-genius
|
|
2025-10-27 09:52:08
|
Enjoyed the read. I'm a little confused maybe at the argument that there is "less things to worry about": if you need to have different versions of your jxl for different monitor brightness levels, there is still some parameter to worry about. Certainly it's more adaptable and not tied to bit depth, but it still is a parameter you have to select at encode time
|
|
|
spider-mario
would have come in handy for this exchange: https://www.dpreview.com/news/5829652105/jpeg-xl-image-format-promises-smaller-files-backwards-compatibility-and-more?comment=4250301462
|
|
2025-10-27 10:02:31
|
Wooh, a tough nut this one. Quite a bit of bad faith and even strawmanning. He has a very one dimensional view of image encoding, sounds like he comes from one industry and doesn't really see outside of his niche. Congratz for your patience and for letting it go
|
|
|
|
ignaloidas
|
|
Lilli
Enjoyed the read. I'm a little confused maybe at the argument that there is "less things to worry about" but if you need to have different version of your jxl for different monitor brightness levels, there is still some parameter to worry about. Certainly it's more adaptable and not tied to bit depth, but it still is a parameter you have to select at encode time
|
|
2025-10-27 10:12:48
|
HDR video has the same thing with reference brightness levels, you *shouldn't* need to have different versions for different monitor brightness levels - instead, when an image/video has higher brightness levels than the monitor can represent, there should be some relatively decent mapping between the different brightness levels - usually by compressing the high end a bit
|
|
2025-10-27 10:14:46
|
That's exactly what the rendering intent bit of JXL is about - whether it should be perceptual, relative, saturation-focused, or absolute. It doesn't matter much for encoding but does for display
|
|
|
jonnyawsom3
|
2025-10-27 10:16:01
|
If you're more focused on quality than size, there's no harm in overshooting the encode-side intensity target and requesting a dimmer decode
|
|
|
lonjil
|
|
ignaloidas
That's exactly what the rendering intent bit of JXL is about - whether it should be perceptual, relative, saturation-focused, or absolute. It doesn't matter much for encoding but does for display
|
|
2025-10-27 10:16:11
|
It matters for encoding because JXL will smooth out details it thinks you can't see.
|
|
|
|
ignaloidas
|
|
lonjil
It matters for encoding because JXL will smooth out details it thinks you can't see.
|
|
2025-10-27 10:20:33
|
at least right now libjxl doesn't care about what value it's set to for encoding as far as I can see?
|
|
|
jonnyawsom3
|
2025-10-27 10:21:20
|
It does, but it mostly affects dark areas, from what I can tell the brightest areas aren't any more lossy at higher values
|
|
2025-10-27 10:22:00
|
In fact, in light of that article and this comment by Jon https://discord.com/channels/794206087879852103/822105409312653333/1430440186340311061, I might look into raising the default intensity in libjxl from 255 to 300 nits. Did a quick test and it works very well with our blue noise PR that got merged, retaining gradients where it was quantized to just black before
|
|
|
|
ignaloidas
|
2025-10-27 10:23:24
|
hmm, can't find it by searching the code but I'll trust you
|
|
|
Lilli
|
|
If you're more focused on quality than size, there's no harm in overshooting the encode-side intensity target and requesting a dimmer decode
|
|
2025-10-27 10:24:32
|
That's my feeling as well, it means either you have a version of your file with specific nits values, or you overshoot and count on the decoder to adjust it accordingly
This also means you always have a bigger file, since having a higher target intensity means less compression most of the time
|
|
2025-10-27 10:25:22
|
I hope I'm wrong on that last point, but in my tests this was quite true
|
|
|
jonnyawsom3
|
2025-10-27 10:25:52
|
Yeah, for a nearly entirely dark image, it was around a 4% size increase going from 255 to 300 nits. Unfortunately extra quality isn't free, which is why it's so harsh on darks by default
|
|
|
|
ignaloidas
|
|
It does, but it mostly affects dark areas, from what I can tell the brightest areas aren't any more lossy at higher values
|
|
2025-10-27 10:27:02
|
I think we're talking past each other a bit? I meant that the rendering intent doesn't change encoding, not that intensity doesn't change encoding (these are different things)
|
|
|
jonnyawsom3
|
|
ignaloidas
That's exactly what the rendering intent bit of JXL is about - whether it should be perceptual, relative, saturation-focused, or absolute. It doesn't matter much for encoding but does for display
|
|
2025-10-27 10:28:38
|
Ahh right, I missed that message, sorry
|
|
|
Lilli
|
2025-10-27 10:29:05
|
The rendering intent on the decode side can't invent data that's not in the file because it has been compressed away, right? So I'm not sure I'm seeing how that solves the issue, have I missed something?
|
|
|
|
ignaloidas
|
2025-10-27 10:30:09
|
rendering intent is basically deciding what happens when you have a 10000 nit image and want to display it on a 300 nit monitor
|
|
2025-10-27 10:31:50
|
absolute = display what you can fit in 300 nits exactly, clip everything else, media-relative = display 10000 nit white as if it was a 300 nit white, scale everything accordingly, saturation = keep saturation as is, clip brightness as needed, perceptual = try to have something that looks somewhat close to human eye (loosely defined)
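As a toy sketch of the two simplest intents (my naming, a single luminance value in nits; perceptual and saturation intents need real gamut/tone mapping and are not this simple):
```python
def absolute(nits, display_max):
    return min(nits, display_max)            # clip what doesn't fit

def media_relative(nits, image_white, display_max):
    return nits * display_max / image_white  # map white to white, scale the rest

print(absolute(10000, 300))              # 300: everything brighter is clipped
print(media_relative(5000, 10000, 300))  # 150: scaled relative to white
```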
|
|
|
Exorcist
|
|
spider-mario
would have come in handy for this exchange: https://www.dpreview.com/news/5829652105/jpeg-xl-image-format-promises-smaller-files-backwards-compatibility-and-more?comment=4250301462
|
|
2025-10-27 11:43:48
|
bit depth doesn't mean anything in lossy mode
|
|
2025-10-27 11:45:56
|
multiply the quantization step by 2^N and the extra N bits are eliminated
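A quick check of that point: scale both the sample and the quantization step by 2^N and the quantized value is identical, so extra input bits are free in lossy mode.
```python
def quantize(x, step):
    return round(x / step)

x8 = 200                 # an 8-bit sample
x16 = x8 << 8            # the same sample at 16-bit precision (N = 8)
assert quantize(x8, 4) == quantize(x16, 4 << 8) == 50
```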
|
|
|
_wb_
|
2025-10-27 12:09:30
|
> As DELIVERY formats go, 9 bits (with γ=2 compression) would be completely sufficient for ALL practical purposes. With γ=2.2, this grows to 10 bits. Using 16 (and 32!) bits a complete lunacy!
This paragraph is wrong on several levels. Even if you restrict things to delivery, 10-bit displays are becoming mainstream and have been widely used for a very long time in specific use cases like medical, so obviously the image format needs an internal precision that at least matches the display precision. If the display space and the image space match perfectly, then 10-bit suffices, but if they don't, then some additional precision can be useful.
I don't understand why gamma=2 would require one bit less precision than gamma=2.2; it's more like the other way around.
I would say that for HDR, 12-bit internal precision is the minimum if you want to have enough for all practical delivery purposes. Assuming that there are delivery use cases where you want to be sure there is no visible color banding.
But then of course the point is that JPEG XL is not only a delivery format, but can be used also in production workflows, where it is useful to have way more internal precision. Capture precision is typically around 14-bit and having some additional precision during editing is useful, so I'd say 16-bit is the bare minimum. Not complete lunacy.
|
|
|
Lilli
|
|
ignaloidas
absolute = display what you can fit in 300 nits exactly, clip everything else, media-relative = display 10000 nit white as if it was a 300 nit white, scale everything accordingly, saturation = keep saturation as is, clip brightness as needed, perceptual = try to have something that looks somewhat close to human eye (loosely defined)
|
|
2025-10-27 12:17:29
|
Oh I see, that's good! I didn't know that's how it worked 🙂 Thank you for the details. So if you want an HDR output, but only encoded with 300 nits, that means you'll probably end up with subpar results. The only option to be compatible is to deliver high-nits jxl, such that everyone decodes to whatever precision and dynamic range they want?
|
|
|
|
ignaloidas
|
|
Lilli
Oh I see, that's good ! I didn't know that's how it worked 🙂 Thank you for the details. So if you want an hdr output, but only encoded with 300nits, that means you'll probably end up with subpar results. The only option to be compatible is deliver high-nits jxl, such that everyone decodes to whatever precision and dynamic range they want ?
|
|
2025-10-27 12:19:15
|
If you have dynamic range for that in your source, then ideally yes, but for SDR content it's best to have it at its native nits. Though the handling between different systems can be quite different and sometimes it's really bad...
|
|
|
Tirr
|
2025-10-27 12:19:35
|
I thought rendering intents is about gamut mapping and only affects colors that are out of gamut?
|
|
|
Lilli
|
|
_wb_
> As DELIVERY formats go, 9 bits (with γ=2 compression) would be completely sufficient for ALL practical purposes. With γ=2.2, this grows to 10 bits. Using 16 (and 32!) bits a complete lunacy!
This paragraph is wrong on several levels. Even if you restrict things to delivery, 10-bit displays are becoming mainstream and have been widely used for a very long time in specific use cases like medical, so obviously the image format needs an internal precision that at least matches the display precision. If the display space and the image space match perfectly, then 10-bit suffices, but if they don't, then some additional precision can be useful.
I don't understand why gamma=2 would require one bit less precision than gamma=2.2; it's more like the other way around.
I would say that for HDR, 12-bit internal precision is the minimum if you want to have enough for all practical delivery purposes. Assuming that there are delivery use cases where you want to be sure there is no visible color banding.
But then of course the point is that JPEG XL is not only a delivery format, but can be used also in production workflows, where it is useful to have way more internal precision. Capture precision is typically around 14-bit and having some additional precision during editing is useful, so I'd say 16-bit is the bare minimum. Not complete lunacy.
|
|
2025-10-27 12:20:00
|
I use it for production workflows and 16 bits were barely enough (though, other issues like bandwidth and memory were non-negotiable constraints), so I concur 😄
|
|
|
|
ignaloidas
|
|
Tirr
I thought rendering intents is about gamut mapping and only affects colors that are out of gamut?
|
|
2025-10-27 12:20:34
|
usually, not necessarily
|
|
|
dogelition
|
|
Tirr
I thought rendering intents is about gamut mapping and only affects colors that are out of gamut?
|
|
2025-10-27 12:21:11
|
gamut mapping inherently means you also need to compress in-gamut colors to make space
also there's absolute/relative colorimetric with the difference being whether the white point should be the image's or your display's
|
|
|
|
ignaloidas
|
2025-10-27 12:21:27
|
e.g. printing is something that's explicitly considered with rendering intents, and you may want to shift the whole colorspace to align the white points
|
|
|
Tirr
|
2025-10-27 12:33:17
|
uh maybe my wording wasn't quite right, my point was that tone mapping is orthogonal to rendering intent, but yeah I now think it can affect tone mapping because colors with excess luminance are also out of gamut
|
|
|
spider-mario
|
2025-10-27 12:44:52
|
the perceptual rendering intent absolutely can affect in-gamut colours to “make room” for those that aren’t
|
|
2025-10-27 12:45:10
|
it’s the gamut equivalent of tone mapping
|
|
2025-10-27 12:45:44
|
(but yeah, relative and absolute just clip)
|
|
|
|
ignaloidas
|
2025-10-27 12:47:29
|
You can have colors with excess saturation as well, e.g. basically no display can show you something that's pure 500nm light, you have to bring down the saturation to fit in the display's color volume. And what do you do if it's in a saturation gradient that crosses into representable - for perceptual accuracy you would need to desaturate the whole gradient to keep it as a gradient
|
|
|
Quackdoc
|
|
KKT
So this post inspired me to take a run at writing an article about it. Cause people clearly don't understand.
https://www.fractionalxperience.com/ux-ui-graphic-design-blog/why-jpeg-xl-ignoring-bit-depth-is-genius
|
|
2025-10-27 12:58:23
|
was jxl limited to [0.0 , 1.0]? I thought it allowed greater range
|
|
2025-10-27 12:58:34
|
or is this output vs input?
|
|
|
|
ignaloidas
|
2025-10-27 01:02:33
|
This article reached HN and the comment quality is as expected https://news.ycombinator.com/item?id=45718490
|
|
|
lonjil
|
2025-10-27 01:09:24
|
People on Lobste.rs think that the article is LLM generated.
|
|
|
|
veluca
|
2025-10-27 01:27:52
|
> I've had exactly the opposite outcome with AVIF vs JPEG-XL. I've found that jxl outperforms AVIF quite dramatically at low bitrates.
uh, interesting observation...
|
|
|
Quackdoc
|
|
veluca
> I've had exactly the opposite outcome with AVIF vs JPEG-XL. I've found that jxl outperforms AVIF quite dramatically at low bitrates.
uh, interesting observation...
|
|
2025-10-27 01:36:00
|
I mean, if he was using a rav1e based encoder then I could see it. avif encoding is still kinda a mess right now
|
|
|
jonnyawsom3
|
2025-10-27 01:50:41
|
If they're testing main then *maybe* that makes sense with the auto resampling? Still surprising though, especially with the quality regression
|
|
2025-10-27 02:21:08
|
Back when Blender support was being worked on, I realised it could just always encode lossy in f32, since Blender downsamples to other bitdepths. Issue being lossless *does* have bitdepths, so the UI would have to change depending on if you pick lossy or lossless, which could confuse a lot of people
|
|
|
username
|
2025-10-27 03:24:40
|
couldn't you just have the bitdepth setting for lossy change what the intended output bitdepth of the final JXL lossy file is, but still use the full bitdepth data from Blender for encoding?
|
|
|
spider-mario
|
2025-10-27 04:05:39
|
as in reproduce cjxl’s behaviour?
|
|
|
Lilli
|
|
username
couldn't you just have the bitdepth setting for lossy change what the intended output bitdepth of the final JXL lossy file is but still use the full bitdepth data from Blender be used for encoding?
|
|
2025-10-28 03:01:31
|
I guess Blender downsamples before passing the data to the encoder
|
|
|
jonnyawsom3
|
2025-10-28 03:18:36
|
Probably easiest to just always encode as float32, since JXL decoders can output to requested bitdepths anyway
|
|
|
AccessViolation_
|
|
KKT
So this post inspired me to take a run at writing an article about it. Cause people clearly don't understand.
https://www.fractionalxperience.com/ux-ui-graphic-design-blog/why-jpeg-xl-ignoring-bit-depth-is-genius
|
|
2025-10-28 03:55:30
|
this is actually really cool. this is the first time I'm reading about it, thanks for the writeup
|
|
2025-10-28 03:56:59
|
storing everything as a float value between 0 and 1 and making bit depth just metadata to decide what to decode to, and using an intensity target to guide the compressor, is such a delightfully aesthetic solution
|
|
|
KKT
|
2025-10-28 06:12:02
|
That's a good way to summarize it.
|
|
|
Meow
|
2025-10-29 06:51:24
|
https://grokipedia.com/page/JPEG_XL
|
|
2025-10-29 06:52:03
|
Yes Grokipedia includes a page of JXL
|
|
2025-10-29 06:55:01
|
> Browser adoption progressed unevenly; experimental support appeared in Chrome via flags, but Google terminated this in June 2025, citing insufficient ecosystem readiness and prioritizing alternatives like AVIF for web deployment. Conversely, Apple committed to native Safari integration by late 2025, driven by JPEG XL's advantages in lossless JPEG recompression and animation handling.
|
|
|
jonnyawsom3
|
2025-10-29 07:26:37
|
Those 3 references at the bottom are the Chromium rigged benchmarks, a website testing v0.3.2, and (thankfully) Gianni's benchmark blog post in 2023
|
|
|
HCrikki
|
2025-10-29 11:02:01
|
It's no wonder. Everything of any importance is either walled off to discord like a state secret or a temporarily pinned blog entry on cloudinary's site that's only visible 3 days until new blogs sink it
|
|
|
_wb_
|
2025-10-29 11:58:06
|
I dunno, there are good links on jpegxl.info, so it's not like good info is impossible to find. But yes, generally search engines and chatgpt etc don't have a clue about how substantial something is, so a blogpost with a flawed methodology based on very little data to draw far-reaching and wrong conclusions can get put on the same level as a nuanced article based on a large amount of data and solid methodology.
|
|
|
AccessViolation_
|
|
Meow
https://grokipedia.com/page/JPEG_XL
|
|
2025-10-29 01:30:47
|
I don't think this article is any different from the wikipedia version. afaik when grokipedia does deviate, there's an 'edits' button which shows you the changes
|
|
2025-10-29 01:32:22
|
oh, no, it is different. weird
|
|
|
Meow
|
2025-10-29 01:36:01
|
Grokipedia is supposed to be entirely managed by Grok
|
|
|
_wb_
|
2025-10-29 01:41:25
|
but Grok does read wikipedia, right?
|
|
|
jonnyawsom3
|
2025-10-29 01:43:05
|
As far as I can tell, it uses Wikipedia as a base and then tries to add/edit information using additional sources. For example it mentioned my changes to Downsampling by reading the GitHub changelog, but got the details wrong
|
|
|
HCrikki
|
2025-10-29 01:48:34
|
comparisons and tables should be putting real reproducible numbers, not vague slowest/fastest. kinda infuriating to read someone saying jxl is 2x slower in one particular scenario (flawed with improper decode, i.e. singlethreaded) without the elaboration that it's 8 milliseconds versus 14 milliseconds - local loading being literally instant so it doesn't matter
|
|
2025-10-29 01:50:51
|
a 200 kilobyte webp still consumes the exact same storage and bandwidth as a 200kb jxl, so it's dubious to link potential to actual efficiency when it's a matter of human creators predetermining the target bandwidth/filesize themselves
|
|
|
AccessViolation_
|
|
_wb_
but Grok does read wikipedia, right?
|
|
2025-10-29 02:41:00
|
yeah it's fact checking wikipedia using wikipedia (among others) <:KekDog:805390049033191445>
|
|
2025-10-29 02:42:43
|
this does not seem like a bad use case of language models. except of course they should flag potential issues for human review and not rewrite things themselves
|
|
2025-10-29 02:45:06
|
they probably shouldn't produce facts and send those to human reviewers, but language models seem like a good tool for very quickly flagging cases where statements with a citation don't line up with the source
|
|
|
jonnyawsom3
|
2025-10-29 02:51:42
|
The problem is it's fact checking against *a* source, not necessarily one that's verified to be correct
|
|
|
_wb_
|
2025-11-01 04:31:11
|
https://www.404media.co/grokipedia-is-the-antithesis-of-everything-that-makes-wikipedia-good-useful-and-human/?ref=platformer.news
|
|
|
Exorcist
|
2025-11-01 04:44:47
|
He already regrets quitting OpenAI, missing the chance to control ChatGPT
|
|
|
dogelition
|
2025-11-01 04:45:06
|
i do think there's value in the idea of running an advanced LLM over wikipedia to look for factual errors and inconsistencies
|
|
2025-11-01 04:45:13
|
obviously don't agree with the execution here though
|
|
|
Exorcist
|
2025-11-01 04:47:03
|
the people who blindly believe ChatGPT also blindly believe Wikipedia
|
|
|
username
|
2025-11-01 05:09:03
|
something I've noticed about Wikipedia is that random numbers on articles will just be incorrect
|
|
2025-11-01 05:10:24
|
no clue if it's just a common thing for people to write down wrong numbers or if it's some group or otherwise intentionally fucking around by slightly changing random numbers across Wikipedia but whatever it is it's annoying
|
|
|
Quackdoc
|
|
_wb_
https://www.404media.co/grokipedia-is-the-antithesis-of-everything-that-makes-wikipedia-good-useful-and-human/?ref=platformer.news
|
|
2025-11-01 05:10:41
|
I don't like grok/grokpedia, but this article pretending wikipedia is good is major kekw
|
|
|
lonjil
|
2025-11-01 05:29:15
|
wikipedia is a lot better than most things in this world
|
|
|
Quackdoc
|
2025-11-01 05:33:55
|
I can probably count on one hand how many times wikipedia has actually had good info when I have checked it over the last 2 years, but I would need at least 4 hands to count the times I've seen trash info
|
|
|
AccessViolation_
|
2025-11-01 06:01:10
|
what do you mean by "good info?" factual accuracy, or the specific usefulness of the information to you?
|
|
2025-11-01 06:01:13
|
or something else?
|
|
|
tokyovigilante
|
|
_wb_
probably 255 is a bit on the low side for current displays. E.g. my laptop goes to 600 nits in SDR mode when setting brightness to max, and newer ones even go to 1000 nits now in SDR mode
|
|
2025-11-01 06:43:20
|
Also makes the DICOM use-case very compelling, usually aiming for SDR brightness of 350-500 nits.
|
|
|
Demiurge
|
|
lonjil
wikipedia is a lot better than most things in this world
|
|
2025-11-01 06:46:19
|
Only if you have a very pessimistic view of the world lol
|
|
|
Quackdoc
|
|
AccessViolation_
what do you mean by "good info?" factual accuracy, or the specific usefulness of the information to you?
|
|
2025-11-01 07:00:36
|
a reasonable degree of accuracy
|
|
|
AccessViolation_
|
2025-11-01 08:08:24
|
wikipedia's accuracy is pretty good
|
|
2025-11-01 08:10:12
|
iirc there's a lower error rate on average than published academic books
|
|
2025-11-01 08:13:24
|
here, have a look at this article about the reliability of wikipedia, from wikipedia <:KekDog:805390049033191445>
https://en.wikipedia.org/wiki/Reliability_of_Wikipedia
|
|
|
lonjil
|
|
AccessViolation_
iirc there's a lower error rate on average than published academic books
|
|
2025-11-01 08:13:54
|
certainly fewer errors than you see in traditional encyclopedias
|
|
|
Quackdoc
I can probably count on one hand how many times wikipedia has actually had good info when I have checked it over the last 2 years, but I would need at least 4 hands to count the times I've seen trash info
|
|
2025-11-01 08:14:23
|
wikipedia's accuracy isn't even across topics so it's possible we've simply been reading very different sorts of articles
|
|
|
Quackdoc
|
|
AccessViolation_
here, have a look at this article about the reliability of wikipedia, from wikipedia <:KekDog:805390049033191445>
https://en.wikipedia.org/wiki/Reliability_of_Wikipedia
|
|
2025-11-01 08:15:01
|
[omegalul~1](https://cdn.discordapp.com/emojis/885026577618980904.webp?size=48&name=omegalul%7E1)
|
|
|
lonjil
wikipedia's accuracy isn't even across topics so it's possible we've simply been reading very different sorts of articles
|
|
2025-11-01 08:15:20
|
that's possible, I've seen some extraordinarily dumb articles
|
|
|
lonjil
|
2025-11-01 08:15:51
|
the worst I've come across this year is the article on Opus Dei, which im like 99% sure is maintained by Opus Dei members
|
|
2025-11-01 08:16:15
|
And the articles on Welsh history are often pretty bad.
|
|
2025-11-01 08:17:11
|
Though tbf like 75% of the time they're bad because they cite articles from the BBC or from actual Welsh history orgs that are just straight wrong, lol.
|
|
|
Quackdoc
|
2025-11-01 08:17:20
|
I mean, that ultra HDR one was horrid before it was fixed
|
|
2025-11-01 08:17:27
|
and now it's just kinda... a page
|
|
|
lonjil
|
2025-11-01 08:17:36
|
Same thing happens with Lithuanian paganism
|
|
2025-11-01 08:17:54
|
wiki articles full of nonsense, but only because the entire research field is full of nonsense
|
|
2025-11-01 08:18:17
|
when the historians are saying things that are completely made up, it's hard for an encyclopedia to do better
|
|
|
AccessViolation_
|
2025-11-01 08:20:52
|
I'm very grateful to have such a massive collaborative and open collection of knowledge maintained by a nonprofit, especially considering that many other public goods are in the hands of profit-driven corporations
|
|
|
lonjil
the worst I've come across this year is the article on Opus Dei, which im like 99% sure is maintained by Opus Dei members
|
|
2025-11-01 08:28:56
|
this isn't even allowed
|
|
2025-11-01 08:30:32
|
though it's generally true that there's somewhat of a positive bias, especially for niche topics. for example, an article about a certain anime is more likely to be maintained by people who like the anime in the first place
|
|
|
lonjil
|
|
AccessViolation_
this isn't even allowed
|
|
2025-11-01 08:30:58
|
it's not allowed but im pretty sure it's true
|
|
|
AccessViolation_
|
2025-11-01 08:31:38
|
oh yeah I wasn't contesting that to be clear
|
|
2025-11-01 08:32:56
|
I bet there's a template for that that you could add to the page
|
|
2025-11-01 08:38:02
|
found it
https://en.wikipedia.org/wiki/Template:POV
|
|
2025-11-01 08:38:42
|
It'll look like this
|
|
2025-11-01 08:40:15
|
edit article (visual editor) > insert ('+' button) > template > search for "pov"
|
|
2025-11-01 08:41:29
|
which reminds me, I should rework the part of the JPEG XL article that sounds like marketing slop before someone comes in with that template <:KekDog:805390049033191445>
|
|
2025-11-01 08:41:41
|
I was intending to do so a while ago but never got to it
|
|
|
Exorcist
|
|
AccessViolation_
here, have a look at this article about the reliability of wikipedia, from wikipedia <:KekDog:805390049033191445>
https://en.wikipedia.org/wiki/Reliability_of_Wikipedia
|
|
2025-11-01 09:37:24
|
trust me bro
|
|
|
Demiurge
|
|
lonjil
certainly fewer errors than you see in traditional encyclopedias
|
|
2025-11-02 02:59:16
|
Traditional encyclopedias are far better written and hence far more valuable. Having a low error rate is useless if it doesn't actually teach you anything...
|
|
|
lonjil
|
2025-11-02 03:00:58
|
I don't recall ever reading a traditional encyclopedia that was particularly good.
|
|
2025-11-02 03:01:41
|
The only part of Wikipedia that's hard to understand is math Wikipedia and traditional encyclopedias don't exactly cover a lot of math anyhow.
|
|
|
Demiurge
|
2025-11-02 03:04:06
|
Wikipedia doesn't usually read like teaching materials but more like nerds trying to flex to other nerds
|
|
|
Dunda
|
2025-11-02 03:04:22
|
Wikipedia tends to catalogue information about people pretty well, but when it comes to technical topics it can be either severely lacking or written too densely
|
|
|
Demiurge
|
2025-11-02 03:04:45
|
Real encyclopedias are written to teach the reader
|
|
2025-11-02 03:05:14
|
And yeah the Ultra HDR article was basically AI generated
|
|
2025-11-02 03:05:38
|
Hilariously bad
|
|
|
Dunda
|
2025-11-02 03:07:53
|
Unfortunately it's kind of fundamental to huge public undertakings like wikipedia
|
|
2025-11-02 03:09:37
|
People do get banned from wikipedia, even IP banned, but you can't guarantee vandalism and bad behaviour will be prevented since you can't evict somebody like in a physical project
|
|
2025-11-02 03:10:01
|
Letting as many people as possible in allows for a huge coverage, but not necessarily a great coverage, and so on
|
|
|
Meow
|
2025-11-02 06:33:31
|
As a long-time (21 years) Wikipedian I would admit that Grokipedia's issues can be applied to Wikipedia as well
|
|
2025-11-02 06:34:55
|
The first month of Wikipedia is really fun to read
|
|
|
AccessViolation_
|
2025-11-02 12:54:37
|
I have no doubt traditional encyclopedias are better written (in terms of article structure and writing style) *on average* but they also have far, far fewer entries. it'd be interesting to take some established traditional encyclopedia, and compare the various aspects of the set of articles that they both have
|
|
2025-11-02 12:56:48
|
I expect Wikipedia's articles to be more extensive and factually correct, while the traditional encyclopedia's articles are probably better structured and consistent in their encyclopedic tone
|
|
|
Meow
|
2025-11-02 02:38:37
|
As editors have bias on political views, they also have bias on choosing what to write
|
|
|
lonjil
|
2025-11-02 02:51:06
|
I took a look at the Encyclopedia Britannica (online-only since 2011) and the first thing I saw was a button suggesting I ask their chatbot instead of reading an article 😆
|
|
|
AccessViolation_
|
2025-11-02 03:29:33
|
oh damn I didn't know they were online-only
|
|
|
lonjil
|
2025-11-02 03:37:49
|
the 2010 printing of the 15th edition was the last time it was printed
|
|
|
_wb_
probably 255 is a bit on the low side for current displays. E.g. my laptop goes to 600 nits in SDR mode when setting brightness to max, and newer ones even go to 1000 nits now in SDR mode
|
|
2025-11-03 09:28:23
|
hm, actually, my monitor only goes up to 250 nits, and the room I have it in is very well lit, and the extra quantization in dark areas is still visible to me.
|
|
|
_wb_
|
2025-11-03 09:43:03
|
I guess things depend a lot on how the darks actually get sent to the monitor and how it translates to actual light, with all the confusion around pure gamma or linear segment etc. Possibly the actual brightness of the darks varies a lot, not just relative to the max nits of the display but rather in how it interprets those first few 8-bit values.
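The difference is not small; a quick comparison (standard transfer-function formulas, nothing libjxl-specific):
```python
# The first few 8-bit codes land at very different light levels under a pure
# 2.2 gamma vs the sRGB piecewise curve with its linear segment.
def pure_gamma(code, gamma=2.2):
    return (code / 255) ** gamma

def srgb_eotf(code):
    v = code / 255
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

for c in (1, 2, 5):
    print(c, pure_gamma(c), srgb_eotf(c))   # code 1 differs by roughly 60x
```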
|
|
|
AccessViolation_
|
2025-11-03 09:46:17
|
my display is 435 nits, if I set the intensity target to that, will it decrease the file size because it's assuming I can't see detail in certain dark parts next to bright parts, moreso than if I left the intensity target at the default?
|
|
2025-11-03 09:50:36
|
hmm no it made it bigger
|
|
2025-11-03 09:57:55
|
ahh, it seems instead it estimates how much of the dark areas you can see at all, and will compress them more if the intensity target is low
|
|
2025-11-03 09:59:17
|
I experimented with an intensity target of 1, and setting my display brightness to the lowest, and it's actually crazy how I can't see a difference between it and the default intensity encoding, and only when I turn the brightness up do I see how much it was compromising in the darker areas
|
|
|
jonnyawsom3
|
|
AccessViolation_
hmm no it made it bigger
|
|
2025-11-03 10:28:28
|
That's because the current default is 255 nits, so you doubled the perceivable brightness with your display
|
|
2025-11-03 10:30:05
|
Oh right, you said dark parts next to bright. Unfortunately the encoder doesn't understand local contrast, so it's just a global level for the blacks
|
|
|
AccessViolation_
|
2025-11-03 10:30:22
|
yeah, I was wrong about how it worked. I thought the logic was "if the intensity target is high, then dark things next to very bright things will be hard to make out and we can decrease the quality" while instead it's more like "if the intensity target is low, dark details are harder to make out so we can decrease the quality" from the looks of it
|
|
2025-11-03 10:31:30
|
I'm very impressed with how well it actually works though, especially at very low intensity targets paired with very low display brightness
|
|
|
jonnyawsom3
|
2025-11-03 10:32:44
|
There used to be a tool bundled with the binaries that allowed you to tonemap an image to emulate a different display brightness
|
|
2025-11-03 10:32:59
|
djxl had it too, but it's broken
|
|
|
AccessViolation_
|
2025-11-03 10:33:47
|
ah, well setting changing the display brightness works well enough for me
|
|
|
jonnyawsom3
|
2025-11-03 10:33:51
|
```--display_nits=N
If set to a non-zero value, tone maps the image the given peak display luminance.```
<https://github.com/libjxl/libjxl/issues/4085>
|
|
|
AccessViolation_
|
2025-11-03 10:34:55
|
what's the philosophy behind the intensity target? I imagine more often than not you can't be sure on which devices an image will be viewed?
|
|
|
monad
|
2025-11-03 10:46:17
|
probably it's for use cases where you can be sure
|
|
|
AccessViolation_
|
2025-11-03 10:48:27
|
hmm
|
|
2025-11-03 10:48:38
|
oh I bet it's great for like, printed images
|
|
2025-11-03 10:49:10
|
if you know how shitty the printer is, you can save on file size
|
|
2025-11-03 10:50:04
|
actually no what am I saying, it doesn't make sense to reencode images to a worse quality to then print them
|
|
2025-11-03 10:52:15
|
baked video game assets, if you're in a situation where it's dark and the brightness can't change, you can compress the assets that appear in that area for their brightness
|
|
|
jonnyawsom3
|
2025-11-03 10:52:20
|
Consider this, no other image format has had target nits before, so now we can control it at all
|
|
2025-11-03 10:53:13
|
Obviously local contrast detection would be great, but for now we can just tune and balance the target to find a good quality/density tradeoff
|
|
|
AccessViolation_
|
2025-11-03 10:56:13
|
I assume this is also an important part of supporting HDR without having to hard-code the tuning for specific bit depths and brightness
|
|
|
jonnyawsom3
|
2025-11-03 11:02:45
|
Yeah, PQ sets the target to 10K, HLG sets it to 1K nits
|
|
|
spider-mario
|
|
```--display_nits=N
If set to a non-zero value, tone maps the image the given peak display luminance.```
<https://github.com/libjxl/libjxl/issues/4085>
|
|
2025-11-03 01:21:24
|
`--display_nits=1` should, on the contrary, produce a very bright image (if shown on a monitor of normal brightness), since it will take very little to produce full output; but indeed, no effect at all is not good
|
|
|
jonnyawsom3
|
2025-11-03 01:23:09
|
Right, it's been at least a year so I probably forgot
|
|
|
spider-mario
|
|
```--display_nits=N
If set to a non-zero value, tone maps the image the given peak display luminance.```
<https://github.com/libjxl/libjxl/issues/4085>
|
|
2025-11-05 12:30:10
|
it seems it still works for PQ images, just not for SDR images (it’s plausible we skip tone mapping on purpose then)
|
|
|
jonnyawsom3
|
2025-11-05 12:30:32
|
Ah right, makes sense
|
|
|
spider-mario
|
2025-11-05 12:32:39
|
ah, looking at the code, that does seem to be the case
|
|
2025-11-05 12:33:49
|
if source is PQ: construct PQ tone mapper https://github.com/libjxl/libjxl/blob/3a4f18bbc74939ddc4481a0395e1f6599a68bf44/lib/jxl/render_pipeline/stage_tone_mapping.cc#L42-L43
if source, but not destination, is HLG: construct HLG OOTF https://github.com/libjxl/libjxl/blob/3a4f18bbc74939ddc4481a0395e1f6599a68bf44/lib/jxl/render_pipeline/stage_tone_mapping.cc#L50-L55
`bool IsNeeded() const`: returns whether there is a PQ tone mapper or an HLG OOTF
https://github.com/libjxl/libjxl/blob/3a4f18bbc74939ddc4481a0395e1f6599a68bf44/lib/jxl/render_pipeline/stage_tone_mapping.cc#L65
|
|
|
HCrikki
|
2025-11-10 05:20:44
|
http://theregister.com/2025/11/10/another_chance_for_jpeg_xl/
|
|
2025-11-10 05:27:08
|
of interest: all browsers include a pdf reader (and edge's preinstalled reader is from adobe). in total, there must be more than 3x more pdf reader installs than browser installs (acrobat reader alone has over a billion installs between mobile and desktop)
|
|
|
Meow
|
2025-11-11 03:51:12
|
It will be fun when pages on Chrome can't display JXL but PDFs on Chrome can
|
|
|
HCrikki
|
2025-11-11 04:00:38
|
(mobile) apps could've led adoption since they can fully bypass browsers lagging behind. would've been handy so mobile versions of web services load faster, sooner and consume less bandwidth despite showing ads
|
|
2025-11-11 04:01:29
|
serve jxls to your app, let chrome load the jpg, png and gifs with gainmaps
|
|
|
jonnyawsom3
|
2025-11-11 04:37:24
|
Issue is most apps are Chromium based anyway these days
|
|
|
Jarek Duda
|
2025-11-11 07:17:16
|
https://news.ycombinator.com/item?id=45884454
"Finally. I was wondering how long it would take them. For scanned documents with OCR, it makes a lot of sense. At same quality you get significantly smaller files, especially when looking at above 300dpi.
JPEG keeps getting extensions like ultra HDR (which really is a hack more than anything else), XYB JPEG (significantly higher compression efficiency than classic jpeg), but all of these have very limited support tbh, hence introducing jpeg xl seems more logical.
The only downside of jpeg xl to me is that it doesn’t support gain maps (yet?[1]). Without gain map, you have limited control over what the SDR image looks like, which can be very problematic in environments where you don’t know what the hardware presenting the image will be in the end.
But the list of upsides is so long: * higher bit depth * native HDR (no hack) * better compression (still better than xyb jpeg, at high resolutions, the difference becomes huge) * Bit depth independence (ok, that’s better compression again, I admit) * Better at Multispectral/hyperspectral imagery with lots of sub-images.
It is becoming the preferred format on so many levels: * iPhones use its encoding (not the file format itself) in its RAW files. * Medicine scanners are starting to use it (and that’s not really an area that is fast moving when it comes to software…) * now pdf (also not really known for being a break-neck-evolution environment…) * Safari supports it as well
I really do not understand why it’s taking so long for Chromium. I do believe they will support it at some point, as jxl-rs [2] is progressing, but really, it already took too long.
[1] https://github.com/libjxl/libjxl/discussions/3505 [2] https://github.com/libjxl/jxl-rs"
|
|
|
A homosapien
|
2025-11-11 07:26:47
|
> Without gain map, you have limited control over what the SDR image looks like
JXL is fully color managed, do you really need a gain map? It just seems unnecessary.
|
|
|
username
|
2025-11-11 07:28:02
|
I think the gainmap support in the JXL spec recommends people use inverse gainmaps (HDR base image with the gainmap defining the SDR look)?
|
|
|
_wb_
|
|
username
I think the gainmap support in the JXL spec recommends people use inverse gainmaps (HDR base image with the gainmap defining the SDR look)?
|
|
2025-11-11 07:46:46
|
Correct.
|
|
|
A homosapien
> Without gain map, you have limited control over what the SDR image looks like
JXL is fully color managed, do you really need a gain map? It just seems unnecessary.
|
|
2025-11-11 07:54:07
|
In general I tend to agree. Though there should be a better standardization on how to tone map HDR images to SDR displays or HDR displays with less headroom. If every platform just does whatever they like, images will look bad on some of them. That's the current situation.
|
|
|
jonnyawsom3
|
2025-11-11 11:36:49
|
I suppose that GitHub discussion could be closed as fixed, correct?
|
|
|
Jim
|
|
HCrikki
http://theregister.com/2025/11/10/another_chance_for_jpeg_xl/
|
|
2025-11-11 02:44:36
|
Sadly, most embedded PDF readers in Chrome-based browsers are compiled third-party and include their own image format decoders. So if you think it will lead to browser adoption, not necessarily. I think Firefox's PDF reader uses Firefox's own image decoding (which is why it only supports a subset of formats), but most do not rely on the browser for media decoding.
|
|
|
_wb_
|
2025-11-11 03:02:12
|
Isn't the default chromium pdf reader a javascript-based thing? I think it does have a jpeg 2000 decoder inside, iirc...
|
|
|
AccessViolation_
|
2025-11-11 03:47:43
|
browser pdf readers have many compatibility issues from what I remember. even if adding JXL support to their PDF readers would lead to support in the browser itself too, I'm not necessarily expecting JXL support in their PDF readers to happen any time soon
|
|
|
lonjil
|
|
Jim
Sadly, most embedded PDF readers in Chrome-based browsers are compiled third-party and include their own image format decoders. So if you think it will lead to browser adoption, not necessarily. I think Firefox's PDF reader uses Firefox's own image decoding (which is why it only supports a subset of formats), but most do not rely on the browser for media decoding.
|
|
2025-11-11 03:53:53
|
most oddball image formats in PDF aren't very commonly used, especially in most of the kinds of PDFs people view in the browser. On the other hand, JXL will be the only format in PDF for HDR, and with both Safari and Firefox on board with JXL, users will ask why Chrome is so slow at rendering HDR PDFs, while Safari and Firefox are so fast.
|
|
|
Jim
|
2025-11-11 10:01:10
|
Chrome uses PDFium. It was initially made by a third party but Google bought it and open-sourced it later.
https://github.com/chromium/pdfium
|
|
2025-11-11 10:04:28
|
Edge has a PDF reader as well, but can't find much info on what it is so it is likely a proprietary reader that Microsoft makes.
https://learn.microsoft.com/en-us/deployedge/microsoft-edge-pdf
|
|
2025-11-11 10:06:07
|
Pretty sure Safari uses Apple's PDF reader that comes with the operating system.
|
|
2025-11-11 10:07:16
|
Pretty sure Firefox is the only one using a Javascript-based PDF reader and it uses the image formats supported by the browser. The others use compiled readers that come with their own image format support.
|
|
|
lonjil
|
|
Jim
Chrome uses PDFium. It was initially made by a third party but Google bought it and open-sourced it later.
https://github.com/chromium/pdfium
|
|
2025-11-11 10:23:16
|
So when they add JXL to PDFium, it'll be with a native library (presumably jxl-rs), and thus full speed. But interestingly, this means that the cost arguments against JXL (xxx KiB extra to ship and attackable surface area that needs to be kept around in the browser forever) go away.
|
|
|
HCrikki
|
|
Jim
Edge has a PDF reader as well, but can't find much info on what it is so it is likely a proprietary reader that Microsoft makes.
https://learn.microsoft.com/en-us/deployedge/microsoft-edge-pdf
|
|
2025-11-11 10:31:58
|
edge adopted a pdf reader from adobe 2-3 years ago. afaik starting this october, a modernized version is now default for everyone including enterprise/ltsc and webview2
a public announcement said to be bringing it to 1.4 billion windows users through installs (edge/webview2 are preinstalled and autoupdate)
|
|
|
AccessViolation_
|
2025-11-11 10:51:05
|
it is a bit silly that browser developers are worried about potential security issues arising from adding support for an image format while they're playing an eternal game of catch-up with vulnerabilities because they decided to JIT compile user-controlled code and make assumptions about it that eventually don't hold and lead to an exploit
except edge which somewhat addressed this
|
|
2025-11-11 10:51:50
|
there are only exceptional cases where not JIT compiling javascript leads to noticeable performance issues, from my own testing by having JIT compilation disabled for a month or two
|
|
2025-11-11 10:54:14
|
I'm not saying it's not fair to be conservative with formats you support for this reason, it's just silly that they're okay with JIT compiling JS then imo
|
|
|
Exorcist
|
2025-11-11 11:04:58
|
|
|
2025-11-11 11:05:07
|
|
|
|
|
veluca
|
2025-11-11 11:13:15
|
I do understand the desire to stop adding C(++) with potential RCE bugs that don't get discovered for 10+ years 😛
|
|
|
lonjil
|
|
AccessViolation_
it is a bit silly that browser developers are worried about potential security issues arising from adding support for an image format while they're playing an eternal game of catch-up with vulnerabilities because they decided to JIT compile user-controlled code and make assumptions about it that eventually don't hold and lead to an exploit
except edge which somewhat addressed this
|
|
2025-11-11 11:27:41
|
iOS has a no-JIT mode
|
|
|
|
veluca
|
2025-11-11 11:34:10
|
also, the issue with JIT compilers is not that they get assumptions on the *code they JIT* wrong, it's usually that they get assumptions on the compiler's invariants wrong (or perhaps they don't have actual sensible semantics for what optimizations are possible -- LLVM is full of that kind of miscompilations)
|
|
2025-11-11 11:35:12
|
hah, found the issue I had in mind: https://github.com/rust-lang/rust/issues/107975
|
|
|
AccessViolation_
|
|
lonjil
iOS has a no-JIT mode
|
|
2025-11-11 11:48:46
|
android too btw
|
|
2025-11-11 11:49:11
|
in firefox you can disable jit tiers individually in about:config
|
|
|
veluca
also, the issue with JIT compilers is not that they get assumptions on the *code they JIT* wrong, it's usually that they get assumptions on the compiler's invariants wrong (or perhaps they don't have actual sensible semantics for what optimizations are possible -- LLVM is full of that kind of miscompilations)
|
|
2025-11-11 11:51:16
|
that's interesting. from what I remember about the article about JIT security issues from the microsoft edge team, it was like a three step plan to get the JIT in vulnerable state, and many JIT vulnerabilities were variations or implementations of this process
|
|
2025-11-11 11:51:59
|
https://microsoftedge.github.io/edgevr/posts/Super-Duper-Secure-Mode/#security-vs-performance-reconsidering-the-trade-offs
|
|
2025-11-11 11:52:17
|
I'm not sure if this is the blog post I read originally, but this briefly mentions it
|
|
|
|
veluca
|
2025-11-11 11:52:26
|
I mean, they might *also* make wrong assumptions about the input code (and possibly even more), but I believe pretty much every compiler is internally inconsistent, sometimes in very obvious ways
|
|
|
AccessViolation_
|
2025-11-11 11:53:58
|
oh yeah I bet
|
|
|
lonjil
|
2025-11-11 11:55:05
|
surely if we use Cranelift, all problems will go away
|
|
|
AccessViolation_
|
2025-11-11 11:55:32
|
Wasm all the things
|
|
|
lonjil
|
2025-11-11 11:56:06
|
i mean most wasm runtimes'll jit your stuff, no?
|
|
2025-11-11 11:56:23
|
no added security compared to JITing something else directly
|
|
|
|
veluca
|
2025-11-11 11:57:17
|
I do believe jit-ting wasm should be much simpler than jit-ting js
|
|
2025-11-11 11:57:44
|
wasm is a much lower level language
|
|
|
jonnyawsom3
|
2025-11-11 11:58:00
|
jit-MA trees
|
|
|
AccessViolation_
|
2025-11-11 11:58:07
|
yes, but the issue with traditional JS jit is that they make assumptions about the code to speed it up, and have to catch when they're wrong.
"JS has no static types at all, this function can be passed literally any object, but we've only seen this object so fast so we'll compile it with that assumption and break out when we're wrong"
Wasm, on the other hand, is strongly typed and "JIT" compiling Wasm can be done entirely ahead of time with full optimization
|
|
|
lonjil
i mean most wasm runtimes'll jit your stuff, no?
|
|
2025-11-11 11:58:31
|
re: this
|
|
|
lonjil
|
2025-11-11 11:59:08
|
> "JS has no static types at all, this function can be passed literally any object, but we've only seen this object so fast so we'll compile it with that assumption and break out when we're wrong"
tbh, I don't think I've heard of this being much of an issue in practice
|
|
2025-11-12 12:00:12
|
though js just in general sucks, so eh
|
|
|
AccessViolation_
|
2025-11-12 12:00:16
|
well regardless, a past version of Wasmtime (which at that point was JIT-only) was formally verified to be safe
|
|
2025-11-12 12:00:23
|
so it can be done
|
|
|
lonjil
|
2025-11-12 12:00:30
|
hooray
|
|
2025-11-12 12:00:54
|
though I severely dislike the design of wasm
|
|
2025-11-12 12:01:55
|
designing something to use raw integer offsets for memory access, but also with only structured high level control flow. really baffling.
|
|
|
AccessViolation_
|
|
veluca
I mean, they might *also* make wrong assumptions about the input code (and possibly even more), but I believe pretty much every compiler is internally inconsistent, sometimes in very obvious ways
|
|
2025-11-12 12:02:17
|
I remember when Rust enabled generating LLVM IR with the `noalias` attribute. traditionally languages rarely emitted it, but because of Rust's memory model the no-aliasing invariant holds everywhere, so `noalias` was *correctly* scattered all over the IR, which led to an avalanche of LLVM bugs being discovered <:KekDog:805390049033191445>
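for context, a sketch of why Rust can emit it so pervasively (illustrative only, not one of the actual miscompiled cases):
```rust
// `&mut` references are guaranteed not to alias, so rustc can tag both
// parameters `noalias` in the LLVM IR it emits (like C's `restrict`,
// but automatic and everywhere).
fn store_and_sum(a: &mut i32, b: &mut i32) -> i32 {
    *a = 1;
    *b = 2; // cannot overwrite *a: the two references never alias
    *a + *b // so the compiler may fold this to 1 + 2 = 3
}

fn main() {
    let (mut x, mut y) = (0, 0);
    println!("{}", store_and_sum(&mut x, &mut y)); // prints 3
}
```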
|
|
|
lonjil
|
|
AccessViolation_
I remember when Rust enabled generating LLVM IR with the `noalias` attribute. traditionally languages rarely emitted it, but because of Rust's memory model the no-aliasing invariant holds everywhere, so `noalias` was *correctly* scattered all over the IR, which led to an avalanche of LLVM bugs being discovered <:KekDog:805390049033191445>
|
|
2025-11-12 12:02:40
|
that was hilarious / sad
|
|
|
|
veluca
|
2025-11-12 12:02:49
|
yup, they disabled writing noalias annotations like 3 times
|
|
|
lonjil
|
|
lonjil
designing something to use raw integer offsets for memory access, but also with only structured high level control flow. really baffling.
|
|
2025-11-12 12:03:38
|
what I dislike about flat memory like that is that, sure, you can sandbox the memory unsafety, but it's still there, and now you have no good way of sharing data other than copying buffers, which sucks
|
|
2025-11-12 12:04:41
|
safe sandboxing has always been very easy as long as you don't have to share stuff between the sandbox and the outside world
|
|
|
AccessViolation_
|
|
lonjil
designing something to use raw integer offsets for memory access, but also with only structured high level control flow. really baffling.
|
|
2025-11-12 12:05:04
|
I honestly really like it. I don't know what the control flow being "high level" means, but structured control flow is an important aspect of the security model.
note also that Wasm is basically *expected* to be JIT compiled, which it safely can be thanks to that model
|
|
|
lonjil
|
2025-11-12 12:05:35
|
it's just really annoying as a compilation target
|
|
2025-11-12 12:06:29
|
like if you have goto in your language (like C or Lisp) you have to go through all kinds of contortions to convert it into massive weird-looking structured control flow
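the standard trick is one big loop dispatching on a label variable; a toy sketch of the shape (not what any particular compiler emits):
```rust
// Lowering goto-style flow to structured control flow: each match arm
// is one "basic block", and assigning to `label` is the goto.
fn collatz_steps(mut n: u64) -> u64 {
    enum Label { Check, Even, Odd, Done }
    let mut steps = 0;
    let mut label = Label::Check;
    loop {
        match label {
            Label::Check => {
                label = if n == 1 {
                    Label::Done
                } else if n % 2 == 0 {
                    Label::Even
                } else {
                    Label::Odd
                };
            }
            Label::Even => { n /= 2; steps += 1; label = Label::Check; }
            Label::Odd => { n = 3 * n + 1; steps += 1; label = Label::Check; }
            Label::Done => return steps,
        }
    }
}

fn main() {
    assert_eq!(collatz_steps(6), 8); // 6→3→10→5→16→8→4→2→1
}
```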
|
|
|
lonjil
safe sandboxing has always been very easy as long as you don't have to share stuff between the sandbox and the outside world
|
|
2025-11-12 12:07:58
|
a model where data was accessed via opaque capability objects would've allowed 100% safe sharing between the host and the sandbox, and even between many different sandboxed pieces of code.
MS-Wasm implemented this idea, but it was just a research project that came after Wasm as designed had already been embraced, so it never went anywhere.
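very roughly, the idea looks like this (a hypothetical host API just to show the shape, not MS-Wasm's actual design):
```rust
use std::collections::HashMap;

// The guest never holds raw addresses, only opaque handles; every
// access goes through the host, which validates the handle and bounds.
#[derive(Clone, Copy, PartialEq, Eq, Hash)]
struct Handle(u64);

#[derive(Default)]
struct Host {
    segments: HashMap<Handle, Vec<u8>>,
    next: u64,
}

impl Host {
    // Share a buffer with the sandbox without copying it in: the guest
    // receives a capability, not a pointer.
    fn grant(&mut self, bytes: Vec<u8>) -> Handle {
        let h = Handle(self.next);
        self.next += 1;
        self.segments.insert(h, bytes);
        h
    }

    // A forged or out-of-bounds access fails safely instead of reading
    // someone else's memory.
    fn load(&self, h: Handle, offset: usize) -> Option<u8> {
        self.segments.get(&h)?.get(offset).copied()
    }
}

fn main() {
    let mut host = Host::default();
    let cap = host.grant(vec![1, 2, 3]);
    assert_eq!(host.load(cap, 2), Some(3));
    assert_eq!(host.load(cap, 99), None); // out of bounds
    assert_eq!(host.load(Handle(42), 0), None); // forged handle
}
```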
|
|
|
AccessViolation_
|
2025-11-12 12:08:16
|
yeah, goto is by definition unstructured. it was an intentional decision to not allow things like goto
|
|
|
|
veluca
|
2025-11-12 12:08:54
|
I believe that was to ensure that you can statically verify stack usage
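e.g. with typed instructions and no arbitrary jumps, a validator can bound the operand stack in one linear pass; a toy sketch of the idea (vastly simplified from real Wasm validation):
```rust
// Each instruction has a fixed stack effect, so the depth at every
// point is known statically, before anything runs.
enum Op {
    ConstI32, // pushes 1
    AddI32,   // pops 2, pushes 1
    Drop,     // pops 1
}

fn max_stack(code: &[Op]) -> Option<usize> {
    let (mut depth, mut max) = (0usize, 0usize);
    for op in code {
        match op {
            Op::ConstI32 => depth += 1,
            Op::AddI32 => depth = depth.checked_sub(2)? + 1,
            Op::Drop => depth = depth.checked_sub(1)?,
        }
        max = max.max(depth);
    }
    Some(max)
}

fn main() {
    use Op::*;
    assert_eq!(max_stack(&[ConstI32, ConstI32, AddI32, Drop]), Some(2));
    assert_eq!(max_stack(&[AddI32]), None); // underflow: rejected statically
}
```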
|
|
2025-11-12 12:09:07
|
but maybe there were other reasons
|
|
|
AccessViolation_
|
2025-11-12 12:10:22
|
iirc there have been things going on relating to memory sharing, but I haven't been following wasm development very closely for a while
|
|
|
lonjil
|
|
veluca
I believe that was to ensure that you can statically verify stack usage
|
|
2025-11-12 12:10:35
|
how would goto prevent that, unless you have an alloca primitive?
|
|
2025-11-12 12:11:52
|
I remember several years back when some Lisp folks proposed a very nice exception mechanism for Wasm, that would've made exceptions from many different languages easy to represent, but it was rejected because, and this was explicit, not implied, they didn't really care about anything but C++.
|
|
|
Exorcist
|
2025-11-12 12:14:02
|
> very nice exception mechanism
What is this?
|
|
|
AccessViolation_
|
2025-11-12 12:16:13
|
I'll have to read more about that because I know exception handling has been completely overhauled (the original behavior is legacy).
and also "they didn't really care about anything but C++" is surprising to hear given that Wasm presents itself as an ISA independent of specific languages
|
|
2025-11-12 12:17:05
|
they've since added support for garbage-collected objects (so GC languages don't have to ship their own GC as part of the module) and tail calls (for functional languages). if that was true at some point, it's not true now
|
|
|
lonjil
|
2025-11-12 12:17:39
|
sorry, I'm airing my several year old grievances 🙂
|
|
2025-11-12 12:17:57
|
wasm is useful... I'll be using it myself soon to write a plugin for Typst
|
|
2025-11-12 12:18:11
|
I'm mostly salty over how it could've been better
|
|
|
AccessViolation_
|
|
lonjil
sorry, I'm airing my several year old grievances 🙂
|
|
2025-11-12 12:19:25
|
you're good ^^ it's an interesting perspective because from what I've heard most people agree that wasm is the "we *can* have nice things for a change" platform and has applied decades of hindsight into something with fair compromises and many good decisions from the start
|
|
|
lonjil
|
2025-11-12 12:20:33
|
How it feels to me, is that they implemented the easy thing we've known how to do for decades, which made better things much harder to retrofit into it (like sharing)
|
|
2025-11-12 12:21:17
|
> and also "they didn't really care about anything but C++" is surprising to hear given that Wasm presents itself as an ISA independent of specific languages
also I think this was *specifically* with regards to exceptions
|
|
2025-11-12 12:21:24
|
not wasm in general
|
|
|
AccessViolation_
|
2025-11-12 12:21:44
|
oh interesting
|
|
|
lonjil
|
2025-11-12 12:22:09
|
I guess that is what is now the legacy behavior?
|
|
|
AccessViolation_
|
2025-11-12 12:23:16
|
it could be. you can check active, proposed, etc. features here
https://webassembly.org/features/
|
|
2025-11-12 12:23:40
|
legacy exception handling is at the bottom and links to a github issue describing it
|
|
|
lonjil
|
|
Exorcist
> very nice exception mechanism
What is this?
|
|
2025-11-12 12:33:11
|
I was going to write something but it's 1:30 AM and I honestly don't remember enough details to give a useful answer.
|
|
2025-11-12 12:33:48
|
One thing I remember though is that it was a lot more general than just exceptions
|
|
|
AccessViolation_
|
2025-11-12 12:34:32
|
I think Wasm has so many use cases and I really enjoy that it exists. one use case I like is that games can use WIT/the Component Model to basically create imports/exports that programming language tooling can hook into. this would allow modders to create mods for the game in any language that can compile to Wasm, and the game's functions would be exposed as native functions in that language
|
|
|
lonjil
|
2025-11-12 12:34:33
|
Important because some languages, like Common Lisp, 1) have lots of ways of unwinding the stack without exceptions, and 2) don't unwind the stack when exceptions are thrown
|
|
2025-11-12 12:35:10
|
(For example, in Common Lisp, you can unwind the stack with goto 🙂 )
|
|
|
AccessViolation_
|
2025-11-12 12:37:00
|
I'd love to discuss Wasm further but it's getting hard to keep my eyes open...
good night 💤
|
|
|
lonjil
|
2025-11-12 12:37:07
|
good night!
|
|
|
HCrikki
|
2025-11-16 11:33:16
|
https://kampidh.com/miniblog/reshade-jpegxl-saga
|
|
2025-11-16 11:35:13
|
tentative addition of jxl saving to Reshade.
SpecialK might have resolved certain potential pain points
|
|
2025-11-16 11:37:58
|
for browser support, outside safari/apple there's always waterfox (ships JXL enabled by default with HDR, with parity between Windows and Android)
|
|
|
jonnyawsom3
|
|
HCrikki
tentative addition of jxl saving to Reshade.
SpecialK might have resolved certain potential pain points
|
|
2025-11-16 12:00:34
|
Considering they default to quality 99.95 because they refuse to believe lossless is any good, I wouldn't trust them on other aspects either
|
|
2025-11-16 12:00:54
|
Also, here's the actual PR https://github.com/crosire/reshade/pull/399
|
|
2025-11-16 03:59:16
|
> This looks like a good addition, it doesn't increase complexity and size too much.
Yippee
|
|
|
RaveSteel
|
2025-11-16 04:11:04
|
gamescope JXL PR when 🤔
|
|
2025-11-16 04:11:06
|
xd
|
|
|
Kampidh
|
|
jonnyawsom3
> This looks like a good addition, it doesn't increase complexity and size too much.
Yippee
|
|
2025-11-16 04:28:29
|
wheeee~
|
|
|
Demiurge
|
2025-11-21 12:33:40
|
It looks like he offered some constructive criticism about how better code organization could make it easier for third-party devs to integrate and use libjxl.
|
|
2025-11-21 12:34:54
|
Rather than forcing people to fork and modify it manually, libjxl really should make it easier on devs by being more modularly organized.
|
|
2025-11-21 12:37:51
|
It doesn't require multiple GitHub repos either, with redundant copies of the same source files. That just makes things much more complicated and harder to maintain for no reason. No, the source files themselves should just be more organized, modular, and easy to selectively compile or embed.
|
|
|
username
|
2025-11-21 12:38:49
|
this isn't full libjxl. it's the simple lossless encoder which is a single file
|
|
|
Demiurge
|
2025-11-21 12:39:05
|
It's very important to observe how libjxl is being used "in the wild" and learn from it how to make it more frictionless for future users
|
|
2025-11-21 12:39:59
|
That's key to adoption.
|
|
Demiurge
|
|
username
this isn't full libjxl. it's the simple lossless encoder which is a single file
|
|
2025-11-21 12:41:27
|
Sure but it looks like the guy provided some constructive critique of the way the code is organized. This is a real-world case of people trying to adopt libjxl code for use in third party software
|
|
2025-11-21 12:42:36
|
It's important to observe how to make it easier for developers to add jxl support to their software. If adoption is to succeed.
|
|
|
lonjil
|
2025-11-21 12:43:07
|
but it literally isn't libjxl code
|
|
|
Demiurge
|
2025-11-21 12:44:54
|
It literally is. Wow, and apparently there is also yet another redundant github repo with a redundant copy of source files. Multiplying the maintenance workload yet again for no reason.
|
|
|
AccessViolation_
|
2025-11-21 12:47:23
|
it doesn't. it's a fork so it can pull in changes from libjxl/libjxl
|
|
2025-11-21 12:47:28
|
assuming you're talking about the repo I just shared
|
|
|
Demiurge
|
2025-11-21 12:47:30
|
That's 3 different repos so far with 3 copies of the same source files that have to be managed and updated separately.
|
|
Demiurge
|
|
AccessViolation_
assuming you're talking about the repo I just shared
|
|
2025-11-21 12:48:28
|
libjxl/libjxl, google/jpegli, and libjxl/simple-lossless-encoder
|
|
|
AccessViolation_
|
2025-11-21 12:48:33
|
oh ok
|
|
|
Jyrki Alakuijala
|
2025-11-23 08:53:59
|
Google Revisits JPEG XL in Chromium After Earlier Removal | Hacker News https://share.google/Ho0v88WEGeI7sanqG
Probably a good idea to upvote
|
|
Jyrki Alakuijala
|
|
Demiurge
libjxl/libjxl, google/jpegli, and libjxl/simple-lossless-encoder
|
|
2025-11-23 08:58:44
|
Debian etc. maintainers wish for relatively small repositories. One way to reduce repo sizes is to have a clear scope for what a repository is for.
|
|
|
Demiurge
|
2025-11-23 09:56:56
|
Debian and most linux distros have "split packages" where one repo and one compile produces multiple packages. Like for example in debian the header files are usually separated into a separate -headers package.
In libjxl's case it makes no sense to have 3 copies of the same source files in 3 separate repos... It just makes things way more confusing to maintain and compile. Distros like Debian will want to compile everything in 1 `make all` command anyways and then separately copy and install each thing into separate "split" packages.
|
|
2025-11-23 10:08:28
|
Whatever debian decides to do is Debian's business, and I think you should do whatever is best for libjxl itself and all the actual developers that use libjxl in their own software. What's good for debian is not what's important to worry about... but I think it would be good for them, since they would not have to re-download and re-compile the same source/object files multiple times if they can compile everything once and then cd and install each module in separate steps into separate split packages.
|
|
2025-11-23 10:10:27
|
That would be the simplest for literally everyone
|
|
|
_wb_
|
2025-11-23 12:37:22
|
https://www.phoronix.com/news/JPEG-XL-Possible-Chrome-Back
|
|
|
jonnyawsom3
|
2025-11-23 12:50:29
|
Quoting "JPEG XL" while using JPEG-XL for the rest of the article hurts xD
|
|
|
Meow
|
2025-11-23 12:57:41
|
Why are people so obsessed about that hyphen?
|
|
|
Jim
|
2025-11-23 12:57:43
|
Not really. If you look back at their other articles they always use a hyphen. So it's probably just policy to keep names the same and possibly to help find all the articles in their own internal search. They can't control how others outside the site write but it really doesn't matter. The web search engines don't really care about a space or hyphen.
|
|
|
Demiurge
|
|
Meow
Why are people so obsessed about that hyphen?
|
|
2025-11-23 03:57:06
|
No idea. I think it's just a common thing with nerdy people to either have some mild OCD or to just enjoy wasting time nitpicking irrelevant details like that sometimes.
|
|
|
Meow
|
2025-11-23 04:04:52
|
I meant such a hyphen should never exist and pollute people's knowledge
|
|
|
_wb_
|
2025-11-23 06:17:34
|
If I had named it, I wouldn't have put a space in it, because spaces are often annoying: line breaks if you forget to use nbsp, escaping or quotes needed in filenames or URLs, etc. But it was already named before it was created. The issue can be avoided by using JXL (or jxl).
|
|
|
AccessViolation_
|
2025-11-23 06:22:42
|
I think I'm going to tell people it has a hyphen that's silent in speech and in writing
|
|
|
Moritz Firsching
|
2025-11-23 08:03:59
|
https://de.wiktionary.org/wiki/Deppenleerzeichen (German for "idiot's space": a spurious space inside a compound word)
|
|
|
lonjil
|
2025-11-24 01:15:21
|
you ever notice that Phoronix forum commenters are illiterate?
|
|
|
monad
|
2025-11-24 01:28:59
|
you can generalize that to "forum commenters"
|
|
|
|
veluca
|
|
lonjil
you ever notice that Phoronix forum commenters are illiterate?
|
|
2025-11-24 05:57:23
|
is that surprising?
|
|
|
AccessViolation_
|
2025-11-24 09:22:06
|
I sometimes read the phoronix comments on articles about rust for a laugh
|
|
|
Cacodemon345
|
|
AccessViolation_
I sometimes read the phoronix comments on articles about rust for a laugh
|
|
2025-11-24 09:23:28
|
"Moronix"
|
|
|
|
Squid Baron
|
2025-11-24 04:49:35
|
HN again: https://news.ycombinator.com/item?id=46033330
|
|
|
lonjil
|
2025-11-24 04:53:46
|
Responding to people on Phoronix and HN is bad for my soul
|
|
2025-11-24 04:54:11
|
It's hard to resist the temptation to start calling people illiterate morons.
|
|
|
AccessViolation_
|
2025-11-24 05:05:15
|
I've also seen a surprising number of posts confidently claim JPEG XL is a lossy format when others bring up its lossless capabilities
|
|
|
HCrikki
|
2025-11-24 05:13:14
|
lots of myths and misunderstandings keep getting respread cuz correct information is walled off to discord and only seen again when someone remembers when it was posted
|
|
2025-11-24 05:16:26
|
reversible jpeg transcoding is way more amazing than people imagine. losslessly converting a lossy jpeg to webp or avif would for instance produce a file *multiple times larger* than the input, in addition to taking a lot of time (which is the normal situation for codecs).
jxl's behaviour is the media world's unique anomaly: it not only converts *almost instantly* and losslessly, but also reliably gives around a 20% filesize reduction even at the fastest effort.
|
|
|
lonjil
|
2025-11-24 05:33:43
|
> HDR should not be "typical web" anything. It's insane that websites are allowed to override my system brightness setting through HDR media. There's so much stuff out there that literally hurts my eyes if I've set my brightness such that pure white (SDR FFFFFF) is a comfortable light level.
>
> I want JXL in web browsers, but without HDR support.
|
|
|
jonnyawsom3
|
2025-11-24 05:36:59
|
Crazy idea. Turn off HDR in your settings
|
|
|
AccessViolation_
|
|
HCrikki
lots of myths and misunderstandings keep getting respread cuz correct information is walled off to discord and only seen again when someone remembers when it was posted
|
|
2025-11-24 05:42:44
|
the humble *The JPEG XL Image Coding System: History, Features, Coding Tools, Design Rationale, and Future*:
|
|
|
_wb_
|
|
lonjil
> HDR should not be "typical web" anything. It's insane that websites are allowed to override my system brightness setting through HDR media. There's so much stuff out there that literally hurts my eyes if I've set my brightness such that pure white (SDR FFFFFF) is a comfortable light level.
>
> I want JXL in web browsers, but without HDR support.
|
|
2025-11-24 05:46:14
|
I get that point, but it's like saying YouTube shouldn't be allowed to make sound because your volume setting might be such that some videos will be uncomfortably loud
|
|
|
VcSaJen
|
2025-11-24 05:47:04
|
I remember the wild west when every site was allowed to randomly blare sound. Those ads were abusing it.
|
|
|
lonjil
|
2025-11-24 05:47:09
|
I guess the annoying thing is if you set your brightness so that the normal maximum white level is comfortable, HDR can be uncomfortable.
|
|
|
_wb_
|
2025-11-24 05:47:24
|
If a page is blasting you with way too bright images, that's imo similar to a page using very annoying flashy GIFs or a page with loud music on autoplay.
|
|
2025-11-24 05:47:44
|
HDR when used right should not be uncomfortable when SDR white is comfortable.
|
|
|
lonjil
|
2025-11-24 05:48:20
|
In practice devices need heuristics to tone down HDR content depending on the users settings. Apple does that. I think originally they didn't and you could get a screen that was set to the lowest brightness level to suddenly emit its highest brightness level with a video file.
|
|
|
AccessViolation_
|
|
jonnyawsom3
Crazy idea. Turn off HDR in your settings
|
|
2025-11-24 05:49:02
|
but how will they get upvotes if a simple solution means they can't put their foot down and stand up against being oppressed by bright images
|
|
|
VcSaJen
|
2025-11-24 05:50:17
|
Ideally you should be able to block sites abusing HDR from using it with a couple of clicks.
|
|
|
AccessViolation_
|
|
AccessViolation_
but how will they get upvotes if a simple solution means they can't put their foot down and stand up against being oppressed by bright images
|
|
2025-11-24 05:51:09
|
I say this, but to be fair I'm not willing to admit the amount of times I've emotionally felt like starting a revolution when confronted with websites that don't offer a dark mode <:KekDog:805390049033191445>
|
|
2025-11-24 05:57:18
|
right, I forgot you could do this
|
|
2025-11-24 06:00:32
|
hackernews is also the only site where I need to increase the default zoom because the text is uncomfortably small
|
|
|
VcSaJen
|
|
AccessViolation_
right, I forgot you could do this
|
|
2025-11-24 06:18:23
|
You can just use the "Enable Darkmore" browser extension. Personally I use it to forcibly enable Light Mode on sites with dark backgrounds (I get headaches from them), but it's intended for the opposite.
|
|
|
AccessViolation_
|
2025-11-24 06:21:14
|
I used to use one of those years ago, unfortunately they had quite a performance impact back then
|
|
|
spider-mario
|
2025-11-24 07:34:37
|
for user content that can potentially be HDR, presumably, websites will use https://www.w3.org/TR/css-color-hdr-1/#the-dynamic-range-limit-property for thumbnail galleries and the like, reserving full HDR for when the image is viewed fullscreen
|
|
2025-11-24 07:35:10
|
instead of complaining about HDR support, maybe that HN user should spread the word about `dynamic-range-limit`
|
|
2025-11-24 07:35:32
|
or have it in a user stylesheet or something
|
|
2025-11-24 07:36:16
|
> The last discussion in libjxl about this was seemingly taking the stance it wasn't necessary since JXL has "native HDR" which completely fails to understand the problem space entirely.
😂
|
|
2025-11-24 07:37:00
|
> Nonsense. It has a lossy mode (which is its primary mode so to speak), so of course it has banding. Only lossless codecs can plausibly be claimed to be "immune to banding".
okay, I can’t
|
|
|
Cacodemon345
|
2025-11-24 07:37:16
|
Maybe UltraHDR should get encoded to native JXL HDR.
|
|
|
spider-mario
> Nonsense. It has a lossy mode (which is its primary mode so to speak), so of course it has banding. Only lossless codecs can plausibly be claimed to be "immune to banding".
okay, I can’t
|
|
2025-11-24 07:38:02
|
Isn't the native color format of the image format like float or double?
|
|
|
AccessViolation_
|
|
spider-mario
> Nonsense. It has a lossy mode (which is its primary mode so to speak), so of course it has banding. Only lossless codecs can plausibly be claimed to be "immune to banding".
okay, I can’t
|
|
2025-11-24 08:30:34
|
oh I just replied to that one <:galaxybrain:821831336372338729>
|
|
2025-11-24 08:39:58
|
I made an account just now to fight JXL misinformation ~~and slander~~
|
|
|
lonjil
|
2025-11-24 08:43:03
|
are you spaceducks?
|
|
2025-11-24 08:43:23
|
because in that case, all of your comments have been killed
|
|
|
AccessViolation_
|
2025-11-24 08:45:37
|
yes I am
|
|
2025-11-24 08:46:25
|
because I'm a new account I guess?
|
|
2025-11-24 08:46:37
|
oh I'm also on a VPN which I'm sure doesn't help
|
|
2025-11-24 08:52:59
|
so is this permanent or how does this work
|
|
|
AccessViolation_
|
2025-11-24 09:03:11
|
is this what you saw?
> ** What does [dead] mean?**
>
> The post was killed by software, user flags, or moderators.
>
> Dead posts aren't displayed by default, but you can see them all by turning on 'showdead' in your profile.
>
> If you see a [dead] post that shouldn't be dead, you can vouch for it. Click on its timestamp to go to its page, then click 'vouch' at the top. When enough users do this, the post is restored. There's a small karma threshold before vouch links appear.
|
|
|
lonjil
|
|
AccessViolation_
is this what you saw?
> ** What does [dead] mean?**
>
> The post was killed by software, user flags, or moderators.
>
> Dead posts aren't displayed by default, but you can see them all by turning on 'showdead' in your profile.
>
> If you see a [dead] post that shouldn't be dead, you can vouch for it. Click on its timestamp to go to its page, then click 'vouch' at the top. When enough users do this, the post is restored. There's a small karma threshold before vouch links appear.
|
|
2025-11-24 09:32:37
|
I didn't realize the vouch button is hidden like that. I vouched for your comments.
|
|
|
Quackdoc
|
|
lonjil
> HDR should not be "typical web" anything. It's insane that websites are allowed to override my system brightness setting through HDR media. There's so much stuff out there that literally hurts my eyes if I've set my brightness such that pure white (SDR FFFFFF) is a comfortable light level.
>
> I want JXL in web browsers, but without HDR support.
|
|
2025-11-24 09:56:38
|
Now that's a system issue if I've ever heard one.
|
|
2025-11-24 09:57:04
|
If I had a dime for every botched HDR implementation, I would be absurdly rich.
|
|
|
dogelition
|
|
Quackdoc
If I had a dime for every botched HDR implementation, I would be absurdly rich.
|
|
2025-11-24 10:00:38
|
do you know that the system ui on lg tvs uses the full hdr brightness? (only when the content is hdr)
|
|
2025-11-24 10:00:47
|
don't know if any other brands also do that
|
|
2025-11-24 10:01:00
|
but it's funny how they keep making brighter tvs so the ui gets more and more painful to use
|
|
|
AccessViolation_
|
2025-11-24 10:01:27
|
that's so bad...
|
|
|
Quackdoc
|
|
dogelition
do you know that the system ui on lg tvs uses the full hdr brightness? (only when the content is hdr)
|
|
2025-11-24 11:05:42
|
that's hilarious
|
|
|
spider-mario
|
2025-11-24 11:19:59
|
but also sad
|
|
2025-11-24 11:20:01
|
but also hilarious
|
|
|
Exorcist
|
|
Squid Baron
HN again: https://news.ycombinator.com/item?id=46033330
|
|
2025-11-25 01:20:22
|
> It's not just Google, Mozilla has no desire to introduce a barely supported massive C++ decoder for marginal gains either:
> Mozilla said over a year ago that they would support JXL as soon as there's a fast memory safe decoder that will be supported.
> If that's the case, let it be a feature of image editing packages that can output formats that are for the web.
But:
https://hacks.mozilla.org/2019/05/firefox-brings-you-smooth-video-playback-with-the-worlds-fastest-av1-decoder/
|
|
|
Meow
|
|
AccessViolation_
hackernews is also the only site where I need to increase the default zoom because the text is uncomfortably small
|
|
2025-11-25 02:14:41
|
Old-school nerds don't like Dark Mode and BIGGER texts
|
|
|
Cacodemon345
|
|
Meow
Old-school nerds don't like Dark Mode and BIGGER texts
|
|
2025-11-25 09:21:46
|
Old-school nerds also call the GitLab interface childish while they code away like cavemen in their 80x25 text modes and use mailing lists.
|
|
|
VcSaJen
|
|
Cacodemon345
Old-school nerds also call GitLab interface childish while they code away like a caveman in their 80x25 text modes and using mail lists.
|
|
2025-11-25 09:46:56
|
To be fair, GitLab's interface is fixed/sticky-abusing garbage. Nowadays everything is either widescreen or ultra widescreen, don't waste vertical space with panels when you're not even a desktop app.
|
|
|
lonjil
|
2025-11-25 09:49:34
|
Most of my browser windows are narrow, it's very annoying how wide most websites want to be.
|
|
|
Quackdoc
|
|
VcSaJen
To be fair, GitLab's interface is fixed/sticky-abusing garbage. Nowadays everything is either widescreen or ultra widescreen, don't waste vertical space with panels when you're not even a desktop app.
|
|
2025-11-25 01:13:17
|
gitlab UI is just horrid, complex and unusable on mobile to actually do anything related to code. forgejo is far better
|
|
2025-11-25 01:13:39
|
also "works" mostly with noscript so massive plus there
|
|
|
VcSaJen
|
2025-11-26 01:12:53
|
https://youtu.be/iTxUFsWLtrA?t=1216
|
|
|
Adrian The Frog
|
2025-11-26 05:15:07
|
https://m.youtube.com/watch?v=uDydCrCXHyU
|
|