|
jonnyawsom3
|
2024-06-17 10:47:39
|
All the gain maps lately make me wonder how different things could be if they had adopted JXL a little sooner, seems like everyone has agreed on a workaround right as a real solution is here
|
|
|
Quackdoc
|
2024-06-17 10:51:26
|
JXL doesn't have a solution for gain maps; traditional HDR and gain maps are pretty much two separate things. Gain maps give you a very "close to traditional HDR" experience while also being fully compatible with SDR, which is not something you can do with a transfer function + tone mapping. You are simply missing the required data
|
|
|
spider-mario
|
|
lonjil
?
|
|
2024-06-17 11:19:55
|
dog whistle
|
|
|
lonjil
|
2024-06-17 11:20:48
|
Yeah. I wanted to see how he'd explain it. He has posted and "explained" several dog whistles in the past.
|
|
|
Demiurge
|
2024-06-18 12:30:15
|
How's it a dog whistle if that guy is pretty famous for saying that lol
|
|
2024-06-18 12:30:34
|
I was pretty sure you guys would get the joke
|
|
2024-06-18 12:32:31
|
Is it considered a "dog whistle" now to make fun of WEF and Klaus Schwab?
|
|
2024-06-18 12:51:04
|
Wow shame on me, I'm gunna cry in the corner now :(
|
|
|
|
okydooky_original
|
2024-06-18 01:08:29
|
If you guys want to take a break from "um, yikes"-ing about state-enforced letter capitalization, then check this out:
After adding support for MPEG-1, H.264, Opus, modern compression methods, and much more, the chad Rasky mentioned he could (and might) add JXL support to the N64 console via LibDragon, the FOSS SDK he's now lead developer of.
|
|
2024-06-18 01:08:36
|
|
|
|
Quackdoc
|
2024-06-18 01:21:27
|
I played with libdragon a bit, it's neat, could be nice to have
|
|
|
|
okydooky_original
|
2024-06-18 01:23:07
|
I'm really interested to see initial and then optimized benchmarks. Most JXL features wouldn't be useful in this context, I think. But, I'm suspecting that the frames/layers could be leveraged somehow.
|
|
|
Quackdoc
|
2024-06-18 01:24:10
|
it could be neat to make custom roms that have the kinda dynamic resolution stuff
|
|
2024-06-18 01:24:38
|
partially decoded textures for n64, and on emulators that use higher rendering resolutions have the full 4k or 8k or whatever textures
|
|
|
|
okydooky_original
|
2024-06-18 01:24:53
|
Are you a part of the N64 Homebrew Discord?
|
|
|
Quackdoc
|
2024-06-18 01:24:57
|
nope
|
|
2024-06-18 01:25:06
|
I don't really do homebrew stuff since like, gba
|
|
2024-06-18 01:25:27
|
I still tinker every now and then, but only as far as examples
|
|
|
|
okydooky_original
|
2024-06-18 01:25:27
|
It might be useful to have your input on these matters.
|
|
|
Quackdoc
|
2024-06-18 01:26:05
|
I wouldn't even know if it would be possible on libdragon to do something like that
|
|
|
|
okydooky_original
|
|
Quackdoc
I don't really do homebrew stuff since like, gba
|
|
2024-06-18 01:26:10
|
I think a lot of the members don't. They're just there to see what's going on. I don't code or do any development, right now. But, I sometimes ask questions or give input anyways.
|
|
|
Quackdoc
I wouldn't even know if it would be possible on libdragon to do something like that
|
|
2024-06-18 01:27:14
|
From what I've seen, it may not be out of the realm of possibility. The only issue is just getting X feature to run on hardware.
|
|
|
Quackdoc
|
2024-06-18 01:28:15
|
well, ill check it out eventually
|
|
|
|
okydooky_original
|
2024-06-18 01:30:47
|
It'd be cool to see you or any other JXL people there. It's been really amazing seeing new developments almost literally every day.
Some of the tech demos are pretty awe-inspiring. Like, the guy who was working on Portal 64 did a successful implementation of John Carmack's megatextures running on real hardware, as just one recent example.
|
|
|
Demiurge
|
|
Quackdoc
JXL doesn't have a solution for gain maps; traditional HDR and gain maps are pretty much two separate things. Gain maps give you a very "close to traditional HDR" experience while also being fully compatible with SDR, which is not something you can do with a transfer function + tone mapping. You are simply missing the required data
|
|
2024-06-18 02:22:36
|
Yeah, it's a feature that allows HDR images to be displayed and scaled appropriately on monitors with different amounts of headroom or brightness settings... From the sound of it, it seems like a pretty essential feature and Apple is even trying to get it to be part of the ICC standard. Which is funny considering how their devices utterly bomb these tests: https://littlecms.com/blog/2020/09/09/browser-check/
|
|
|
Quackdoc
|
2024-06-18 02:23:03
|
oh no, not more ICC fuckery
|
|
2024-06-18 02:23:07
|
[av1_monkastop](https://cdn.discordapp.com/emojis/720662879367332001.webp?size=48&quality=lossless&name=av1_monkastop)
|
|
|
Demiurge
|
2024-06-18 02:25:43
|
https://www.iso.org/standard/86775.html
|
|
|
Quackdoc
|
2024-06-18 02:27:25
|
this is gonna cause some headaches lol
|
|
|
_wb_
|
|
Quackdoc
JXL doesn't have a solution for gain maps; traditional HDR and gain maps are pretty much two separate things. Gain maps give you a very "close to traditional HDR" experience while also being fully compatible with SDR, which is not something you can do with a transfer function + tone mapping. You are simply missing the required data
|
|
2024-06-18 07:13:00
|
We are adding a box for gain maps, it will be discussed at the upcoming JPEG meeting. I am in favor of this, if only just for feature parity. But fundamentally, I don't like the idea of essentially having the option that the SDR image and the HDR image are potentially very different — I am more in favor of standardizing tone mapping with some parameters for artistic control, and having only one master image at the highest fidelity from which the renderings on less-capable displays are automatically derived.
|
|
2024-06-18 07:16:01
|
Gain maps might be nice from the point of having artistic control over delivered images, but for a processing pipeline it is a nightmare. Cropping and rescaling is still doable, but what does it even mean to apply operations like sharpening or color adjustments on an image that is really two images?
|
|
|
CrushedAsian255
|
2024-06-18 07:20:36
|
i agree and think that makes sense, although i am not very knowledgeable in this field. it does seem more sensible to have a system for converting HDR to SDR from the image, instead of having 2 images that create one.
|
|
|
_wb_
|
2024-06-18 07:23:54
|
There are many tone mapping algorithms to go from HDR to SDR — from simple global curves to fancy local tone mapping. Some of them are available in the libjxl repo, and the libjxl decode API does implement some of it if you request to get an HDR image at a lower max_nits than the image.
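As a toy example of the "simple global curves" end of that spectrum, here is an extended-Reinhard global tone mapper sketched in Python (NumPy only; the function name, parameters, and defaults are illustrative, not libjxl's actual API):

```python
import numpy as np

def reinhard_global(lum_nits, hdr_peak=1000.0, sdr_peak=100.0):
    """Extended Reinhard global tone curve:
    L_out = L * (1 + L / Lw^2) / (1 + L), with luminance normalized so
    SDR white == 1.0 and Lw the normalized HDR white point. It maps
    hdr_peak exactly onto sdr_peak and compresses highlights smoothly."""
    L = np.asarray(lum_nits, dtype=np.float64) / sdr_peak
    Lw = hdr_peak / sdr_peak
    out = L * (1.0 + L / (Lw * Lw)) / (1.0 + L)
    return np.clip(out, 0.0, 1.0) * sdr_peak
```

A local tone mapper would vary the curve per image region instead of applying one curve everywhere; this global version is the kind of thing a decoder can apply automatically without any signaled metadata.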
|
|
|
CrushedAsian255
|
|
_wb_
There are many tone mapping algorithms to go from HDR to SDR — from simple global curves to fancy local tone mapping. Some of them are available in the libjxl repo, and the libjxl decode API does implement some of it if you request to get an HDR image at a lower max_nits than the image.
|
|
2024-06-18 07:25:15
|
what's your opinion on HLG gamma curve as a backwards compatible solution?
|
|
|
_wb_
|
2024-06-18 07:26:30
|
The issue with tone mapping algorithms is that they might make artistically 'wrong' decisions, which is why gain maps can bring an advantage since they can spell out the exact way a local tone mapping has to be done. But realistically, I think very few people will actually generate custom gain maps, and instead they'll use whatever algorithm their software is implementing — so we're spending bytes on encoding the result of an algorithm, as opposed to just signaling "use this algorithm".
|
|
|
CrushedAsian255
|
|
_wb_
The issue with tone mapping algorithms is that they might make artistically 'wrong' decisions, which is why gain maps can bring an advantage since they can spell out the exact way a local tone mapping has to be done. But realistically, I think very few people will actually generate custom gain maps, and instead they'll use whatever algorithm their software is implementing — so we're spending bytes on encoding the result of an algorithm, as opposed to just signaling "use this algorithm".
|
|
2024-06-18 07:27:46
|
i guess there's no harm in supporting gain maps, although aren't they already an option by using another channel?
|
|
|
_wb_
|
|
CrushedAsian255
what's your opinion on HLG gamma curve as a backwards compatible solution?
|
|
2024-06-18 07:28:21
|
HLG does define a generic tone mapping that takes display brightness and ambient light into account, which is both more and less good than what you can do with gain maps. Gain maps cannot take ambient light into account, but HLG only defines a global tone mapping.
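For reference, the display adaptation in HLG is a single global exponent: BT.2100 scales the OOTF's system gamma with the display's peak luminance (the ambient-light extensions live in BT.2390 and are not sketched here):

```python
import math

def hlg_system_gamma(peak_nits):
    """BT.2100 HLG reference OOTF system gamma, adapted to the display's
    peak luminance: gamma = 1.2 + 0.42 * log10(Lw / 1000). At the
    nominal 1000-nit reference display this is exactly 1.2; dimmer
    displays get a lower gamma, brighter ones a higher gamma."""
    return 1.2 + 0.42 * math.log10(peak_nits / 1000.0)
```

That one scalar applied to scene luminance is the whole adaptation, which is why HLG's tone mapping is global rather than local.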
|
|
|
CrushedAsian255
i guess there's no harm in supporting gain maps, although aren't they already an option by using another channel?
|
|
2024-06-18 07:32:07
|
We could simply define an extra channel (or multiple, in case of 3-channel gain maps) to represent gain maps, but we're going for the separate codestream in a separate box approach, which makes it easier to add/change/strip gain maps. The idea is that in JXL, gain maps would mainly be used in the inverse direction, i.e. the main JXL image is an HDR image, and the gain map only describes a local tone mapping for rendering a nice SDR image — so it is in a way not really needed, and the 'real image' is just the main image. This is different from how it is done in JPEG (UltraHDR) or iPhone HEICs, where the main image is an SDR image and the gain map is needed to reconstruct the HDR image. Though the gain map spec TC 42 is working on will allow both directions, and so will JXL.
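A rough sketch of the SDR-to-HDR direction, loosely following the UltraHDR-style log2 interpolation; the parameter names are mine, and the real spec's per-channel maps and offset terms are omitted:

```python
import numpy as np

def apply_gain_map(sdr, gain_map, min_boost=1.0, max_boost=4.0, weight=1.0):
    """Recover an HDR rendition from an SDR base plus a single-channel
    gain map: interpolate in log2 space between the signaled min/max
    boosts, then scale the result by `weight` in [0, 1], representing
    how much headroom the target display actually has."""
    log_boost = np.log2(min_boost) + gain_map * (np.log2(max_boost) - np.log2(min_boost))
    return sdr * np.exp2(weight * log_boost)
```

The inverse direction swaps the roles: the stored image is the HDR master, and the map describes how to attenuate it down to a nice SDR rendering.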
|
|
|
CrushedAsian255
|
2024-06-18 07:33:59
|
oh, so hdr will act sort of like JPEG bitstream reconstruction
|
|
|
_wb_
|
2024-06-18 07:40:13
|
Gain maps remind me of this feature of PNG to store a palette (and even a histogram that goes with it) in a truecolor PNG. This dates back to the time when many displays could only render 256 colors (or even fewer), so displaying a truecolor image often required reducing the colors and doing dithering and all that. So these PNG chunks for palette/histogram were useful since a viewer could use them to render the image more efficiently: instead of doing its own 'pngquant'-like thing, it could just directly use the signaled palette. Though I think most software would just ignore the signaled palette and do its own thing anyway, since that had to be implemented anyway. So it basically ended up being a rather useless chunk that takes up extra bytes without anything really using it. Once HDR displays become more common, I think gain maps might become something similar.
|
|
2024-06-18 07:47:16
|
The whole concept of spelling out explicitly in an image file how to render an image on a device that is not fully capable of rendering the image is just not a very good idea imo. You could also start adding auxiliary images to have a custom 16-shade grayscale image included for e-readers, or custom local mappings to CMYK for printing the image, or low-framerate versions of animations, etc etc. I think it will only add bloat and be likely to not be used anyway, since there will always have to be a generic rendering implemented too for such devices, and in terms of code plumbing it will generally be highly nontrivial to have such custom "bypass the generic thing and use this custom thing instead" methods.
|
|
2024-06-18 07:57:11
|
The internal colorspace used in lossy JXL is general enough to represent anything human eyes can see — it's not based on current display technology, it's based on the limits of the human visual system. I very much like the concept of having image files that just represent the visual information with the highest possible fidelity (gamut, dynamic range, precision only limited by the capturing technology, the limits of human vision, and when doing lossy also the amount of acceptable distortion in the anticipated range of viewing conditions), and then it is up to the rendering technology to figure out the best way to reproduce the 'ideal' image with whatever limited means it has to render images — although I am not opposed to having some signaled info for high-level artistic control over how trade-offs should ideally be made (e.g. like the rendering intent field in ICC profiles, but possibly with some parameters and preferably with normatively defined behavior so there is consistency between implementations).
|
|
2024-06-18 08:06:33
|
To me, gain maps go too far in the direction of just storing two separate images (an SDR one and an HDR one). If you have a 3-channel, full-resolution gain map, you can effectively make the two images completely unrelated. If you only have a 1-channel, heavily downsampled gain map (as is now done in UltraHDR and iPhone's HDR HEICs), then this comes at a cost in fidelity, at least if the main image is SDR. Either way, SDR+gainmap is a less-than-ideal way to do HDR, and I think the only reason they're doing it is to have "graceful degradation" (i.e. it "works" in non-HDR-aware applications, which will just end up ignoring the gainmap). But in my experience, graceful degradation usually ends up paying bytes and mostly getting only the degraded experience in practice, since the incentive to upgrade software is too small (it already "works", so mostly they won't even realize they need to do something).
|
|
|
Orum
|
2024-06-18 08:08:57
|
I agree--it's better to just store them as 2 separate images (or as separate channels in a single image)
|
|
2024-06-18 08:10:00
|
if you want a single image that displays well in both SDR and HDR, allow storing some tone mapping parameters/metadata to show how to convert, but that's it
|
|
2024-06-18 08:11:57
|
with over 4K channels it seems like there's little reason to flag something specifically as a gain map when you could just store an already computed image there
|
|
|
VcSaJen
|
|
_wb_
The whole concept of spelling out explicitly in an image file how to render an image on a device that is not fully capable of rendering the image is just not a very good idea imo. You could also start adding auxiliary images to have a custom 16-shade grayscale image included for e-readers, or custom local mappings to CMYK for printing the image, or low-framerate versions of animations, etc etc. I think it will only add bloat and be likely to not be used anyway, since there will always have to be a generic rendering implemented too for such devices, and in terms of code plumbing it will generally be highly nontrivial to have such custom "bypass the generic thing and use this custom thing instead" methods.
|
|
2024-06-18 08:12:53
|
*.ico files support all that
|
|
2024-06-18 08:14:28
|
I suppose it was a transition era
|
|
|
_wb_
|
2024-06-18 11:10:09
|
ICO is pretty much what you do if you cannot get SVG to render nicely to small dimensions, so you just store several rasterized versions. It's like raster fonts. Maybe justifiable when there is not enough compute oomph available to do something better, but that era is decades in the past.
|
|
|
lonjil
|
2024-06-18 11:22:06
|
The Haiku Vector Icon Format supports selectively adding or removing details at different sizes. Each shape has two parameters for this, "min LOD" and "max LOD".
|
|
2024-06-18 11:22:34
|
Very little compute oomph needed
|
|
|
spider-mario
|
2024-06-18 12:25:34
|
it also becomes less necessary as screens get so dense that even small physical sizes can render plenty of detail
|
|
|
Quackdoc
|
|
_wb_
We are adding a box for gain maps, it will be discussed at the upcoming JPEG meeting. I am in favor of this, if only just for feature parity. But fundamentally, I don't like the idea of essentially having the option that the SDR image and the HDR image are potentially very different — I am more in favor of standardizing tone mapping with some parameters for artistic control, and having only one master image at the highest fidelity from which the renderings on less-capable displays are automatically derived.
|
|
2024-06-18 01:07:34
|
I'm not a fan of the implementation either, but at this point in time, we really don't have anything else. And we do need something pretty much now. It's been a long time coming; creatives have wanted something, really anything, to give them a real solution, and gain maps are the only viable thing that's actually cropped up.
|
|
2024-06-18 01:08:25
|
Thankfully, at most it will be a pain because it will either stick around and tools will develop for it or it will die off and you'll have to use a special tool to convert it. But once the conversion is done, it's done.
|
|
2024-06-18 01:11:01
|
and ofc, its only really a stop-gap until HDR actually takes off
|
|
|
_wb_
|
2024-06-18 02:16:51
|
Arguably, HDR has already taken off. Linux is lagging behind, software support in general is still rather flimsy, and obviously many computer screens currently out there are still SDR, but for TVs HDR is already the norm, and the same is true for recent Apple devices, high-end phones, monitors for gamers, and high-end monitors in general.
|
|
2024-06-18 02:25:04
|
What do you mean by "we don't have anything else"? Basically every image format except for webp can just represent HDR images directly. Gain maps are mostly a way to encode > 8-bit using an 8-bit image + an 8-bit gain map (kind of like Radiance's RGBE), but even JPEG in 8-bit mode actually has enough precision to do HDR, and certainly JPEG 2000, HEIC, AVIF, JXL can just store an HDR image directly. The other reason for doing it the gain maps way is to make it "work" on legacy software that does everything with 8-bit buffers (of course it doesn't really work, you only get an SDR image, but at least that SDR image looks OK), but to me that seems like a strategy that will delay the push to get everything HDR-ready rather than accelerate it.
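The RGBE trick mentioned here (one shared 8-bit exponent stretching three 8-bit mantissas across a huge dynamic range) fits in a few lines; this is a simplified sketch that ignores the run-length encoding the actual Radiance .hdr format uses:

```python
import math

def rgbe_encode(r, g, b):
    """Pack linear RGB into Radiance's RGBE format: three 8-bit
    mantissas sharing one 8-bit exponent (bias 128)."""
    m = max(r, g, b)
    if m < 1e-32:
        return (0, 0, 0, 0)
    mant, exp = math.frexp(m)            # m == mant * 2**exp, mant in [0.5, 1)
    scale = mant * 256.0 / m
    return (int(r * scale), int(g * scale), int(b * scale), exp + 128)

def rgbe_decode(rm, gm, bm, e):
    """Inverse mapping; an exponent byte of 0 means pure black."""
    if e == 0:
        return (0.0, 0.0, 0.0)
    f = math.ldexp(1.0, e - 128 - 8)     # 2**(e - 128) / 256
    return (rm * f, gm * f, bm * f)
```

The precision cost is the same kind of trade-off as a gain map: the brightest channel gets full 8-bit mantissa precision, and dimmer channels in the same pixel get less.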
|
|
|
Cacodemon345
|
2024-06-18 02:38:44
|
True HDR monitors are still expensive, unfortunately.
|
|
2024-06-18 02:39:02
|
Even my "HDR" monitor is merely 120% sRGB.
|
|
|
lonjil
|
2024-06-18 02:39:24
|
That's gamut though, not dynamic range.
|
|
|
jonnyawsom3
|
2024-06-18 02:39:51
|
I'm still not convinced my phone is HDR and isn't just boosting saturation and setting brightness to max
|
|
|
Quackdoc
|
2024-06-18 02:40:14
|
the issue is that legacy software and legacy displays are still king. around 25% of all android devices ("mobile and tablet") are still android 10 or lower according to apilevels.com, which pulls from statcounter, as of april 27th. Windows 11 still doesn't have good global HDR support, firefox has zero support, etc.
SDR is still massively important, and needs to be a first-class citizen; it cannot be a second-class one.
currently as a creator you have three options. you distribute both SDR and HDR separately, which most creators, at least the ones I deal with, simply won't do; even mastering aside, the actual delivery of images in this regard is just too great a hassle for many.
you serve just HDR and treat a large majority of people as second-class citizens; keep in mind that android only actually got good SDR/HDR interop in A13, as A12 support was mediocre at best and relied on subpar tone mapping, and A13 accounts for less than half of current android users.
or you just serve SDR, which is what most creators do.
Software is still a massive bottleneck, regardless of whether or not I agree about the state of HDR displays (many "HDR" displays are still not even HDR400 compliant). SDR will need to be treated as a first-class citizen for many years to come, and gain maps let creators treat both HDR and SDR consumers as first-class citizens.
|
|
|
Cacodemon345
|
|
lonjil
That's gamut though, not dynamic range.
|
|
2024-06-18 02:40:45
|
Hence HDR in quotes.
|
|
|
Quackdoc
|
2024-06-18 02:40:46
|
Windows 11 HDR experience is a joke. it's just *really bad*
|
|
|
lonjil
That's gamut though, not dynamic range.
|
|
2024-06-18 02:41:27
|
HDR is a "general" term anyways; at minimum it should be treated as the transfer, but HDR generally means both a large-range transfer and a large gamut
|
|
2024-06-18 02:41:44
|
in the end HDR is just a bad marketing term
|
|
|
lonjil
|
|
_wb_
Arguably, HDR has already taken off. Linux is lagging behind, software support in general is still rather flimsy, and obviously many computer screens currently out there are still SDR, but for TVs HDR is already the norm, and the same is true for recent Apple devices, high-end phones, monitors for gamers, and high-end monitors in general.
|
|
2024-06-18 02:42:53
|
HDR isn't very widely deployed on Linux yet, but arguably, HDR is already better on Linux than on Windows. (Idk about MacOS). KDE has a good HDR implementation that first shipped on the Steam Deck OLED.
|
|
|
Cacodemon345
|
2024-06-18 02:42:58
|
All that I got from that is that this monitor is supposed to have ~306 levels of sRGB.
|
|
|
Quackdoc
|
|
lonjil
HDR isn't very widely deployed on Linux yet, but arguably, HDR is already better on Linux than on Windows. (Idk about MacOS). KDE has a good HDR implementation that first shipped on the Steam Deck OLED.
|
|
2024-06-18 02:43:14
|
even disregarding linux, because its safe to disregard, the state of HDR is poor
|
|
|
Cacodemon345
|
|
lonjil
HDR isn't very widely deployed on Linux yet, but arguably, HDR is already better on Linux than on Windows. (Idk about MacOS). KDE has a good HDR implementation that first shipped on the Steam Deck OLED.
|
|
2024-06-18 02:44:00
|
There's a reason why it isn't widely deployed on Linux: Xorg does not support it, and many people want to stick with it because Wayland is still a mess, and then you get Wayland fanatics.
|
|
|
Quackdoc
|
2024-06-18 02:44:25
|
also it's not like "HDR" displays that aren't really HDR are a *massive* issue; even on a bad HDR10 display, as long as the display handles it somewhat OK, HDR should still look decent enough, at least as good as SDR in most cases
|
|
|
lonjil
|
2024-06-18 02:44:40
|
Technically speaking you can do HDR with Xorg, it just sucks a lot.
|
|
|
Quackdoc
|
2024-06-18 02:45:09
|
even when using a massively overcooked video like this one https://www.youtube.com/watch?v=pYt1TFQNi8Q HDR still looks more or less fine on my crappy LG HDR10 monitor
|
|
|
lonjil
|
2024-06-18 02:45:22
|
The solution on Wayland is to treat apps as sRGB by default and map that to the HDR output.
|
|
|
Quackdoc
|
|
lonjil
The solution on Wayland is to treat apps as sRGB by default and map that to the HDR output.
|
|
2024-06-18 02:45:37
|
that's not a solution, thats a hack lol
|
|
2024-06-18 02:45:57
|
wayland has no solution; wayland itself is totally treated as sRGB, and whether the compositor does something special or not is up to the compositor
|
|
|
lonjil
|
2024-06-18 02:46:15
|
So legacy apps that expect sRGB should magically be HDR somehow??
|
|
|
Quackdoc
|
2024-06-18 02:46:35
|
there is *no* solution yet
|
|
|
lonjil
|
2024-06-18 02:46:37
|
That's not a hack, that's just what you need to do to support old apps...
|
|
|
Quackdoc
|
2024-06-18 02:46:40
|
as in, no HDR at all
|
|
2024-06-18 02:46:48
|
it just outright doesn't exist right now
|
|
|
lonjil
|
2024-06-18 02:46:56
|
There literally is though
|
|
|
Quackdoc
|
2024-06-18 02:47:04
|
KDE implements their own hacks
|
|
2024-06-18 02:47:04
|
it's not a part of wayland
|
|
|
lonjil
|
2024-06-18 02:47:14
|
Those aren't "hacks"
|
|
|
Quackdoc
|
2024-06-18 02:47:40
|
yes it is, they use a vulkan layer shim to get the metadata and forward that to the compositor
|
|
|
lonjil
|
2024-06-18 02:47:43
|
There is no agreed upon standard yet, but you can literally do HDR today
|
|
|
Quackdoc
|
2024-06-18 02:47:44
|
thats about as hacky as it gets
|
|
|
lonjil
There is no agreed upon standard yet, but you can literally do HDR today
|
|
2024-06-18 02:48:09
|
*you* can do HDR, Wayland cannot
|
|
2024-06-18 02:48:24
|
there is a very significant difference
|
|
2024-06-18 02:49:16
|
also afaik only kwin can do it
|
|
|
lonjil
|
2024-06-18 02:49:22
|
*I* don't need to do anything. Apps that want to do HDR can do it that way and it'll work. If you're on KDE ;)
|
|
|
Quackdoc
|
2024-06-18 02:49:39
|
*some* apps can do it on KDE
|
|
|
dogelition
|
|
Quackdoc
Windows 11 HDR experience is a joke. it's just *really bad*
|
|
2024-06-18 02:50:13
|
what specifically do you think is wrong with it?
|
|
|
Quackdoc
|
|
dogelition
what specifically do you think is wrong with it?
|
|
2024-06-18 02:50:48
|
SDR apps look like crap, the default peak for HDR on windows is 1500 nits (and it's a wee bit of a pain to fix), and it also uses a fixed transfer that all applications are supposed to map to, but often don't
|
|
|
lonjil
*I* don't need to do anything. Apps that want to do HDR can do it that way and it'll work. If you're on KDE ;)
|
|
2024-06-18 02:51:24
|
https://wiki.archlinux.org/title/KDE#HDR
currently limited to vulkan applications
|
|
|
lonjil
|
|
Quackdoc
|
2024-06-18 02:51:43
|
which granted, a good amount of HDR applications are vulkan aware, but that's still very limiting
|
|
2024-06-18 02:51:57
|
for *wayland* to support HDR, there needs to be a protocol
|
|
|
lonjil
|
|
Quackdoc
*some* apps can do it on KDE
|
|
2024-06-18 02:52:11
|
Yeah. But that'd be the case literally no matter how it's solved. *Any* solution will require apps adding support.
|
|
|
Quackdoc
|
2024-06-18 02:52:46
|
there is a stark difference between "wayland" supporting it globally, and using a custom protocol and a custom vulkan layer that adds support for just VK apps
|
|
|
lonjil
|
2024-06-18 02:53:39
|
Wayland can't support anything globally since it'd be optional
|
|
|
Quackdoc
|
2024-06-18 02:53:40
|
although I think the layer now uses the draft PR protocol
|
|
|
lonjil
|
2024-06-18 02:53:55
|
But yes, having a standard way of doing it would be good
|
|
|
Quackdoc
|
2024-06-18 02:54:02
|
either way, we need a real protocol or else we get more fragmentation
|
|
|
dogelition
|
|
Quackdoc
SDR apps look like crap, the default peak for HDR on windows is 1500 nits (and it's a wee bit of a pain to fix), and it also uses a fixed transfer that all applications are supposed to map to, but often don't
|
|
2024-06-18 02:56:04
|
SDR applications "only" look wrong if you're used to something other than the sRGB EOTF (but to be fair, most people are used to something like ~2.2 gamma)
i think most monitors nowadays have proper static hdr metadata in the EDID, so you should get the proper peak brightness by default. though for whatever reason, TVs usually just have `0` in those fields... still, i'd argue that using the windows 11 hdr calibration app isn't really any more of a pain than the equivalent calibration menus on consoles
not entirely sure what you mean by the last point
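As I understand the CTA-861.3 HDR static metadata block referenced here, the EDID luminance fields are coded bytes rather than raw nits, with max luminance decoding as 50 * 2^(CV/32); a sketch under that assumption:

```python
def cta861_max_luminance(code_value):
    """Decode the 'desired content max luminance' byte from the
    CTA-861.3 HDR static metadata block in an EDID:
    L = 50 * 2**(CV / 32) cd/m^2, where a code of 0 means
    'not specified' (which is what TVs reporting 0 are, strictly
    speaking, saying)."""
    if code_value == 0:
        return None
    return 50.0 * 2.0 ** (code_value / 32.0)
```

So a code of 96 advertises a 400-nit panel, and an OS that falls back to a fixed default when the field is 0 is technically behaving as the spec allows.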
|
|
|
Quackdoc
|
|
dogelition
SDR applications "only" look wrong if you're used to something other than the sRGB EOTF (but to be fair, most people are used to something like ~2.2 gamma)
i think most monitors nowadays have proper static hdr metadata in the EDID, so you should get the proper peak brightness by default. though for whatever reason, TVs usually just have `0` in those fields... still, i'd argue that using the windows 11 hdr calibration app isn't really any more of a pain than the equivalent calibration menus on consoles
not entirely sure what you mean by the last point
|
|
2024-06-18 02:56:44
|
i've yet to find an sRGB application that doesn't look like trash when HDR is enabled on windows 11
|
|
2024-06-18 02:57:12
|
also I've yet to see any HDR monitor on windows not have windows default to 1500 nits
|
|
2024-06-18 02:58:09
|
a real easy way to see what I mean by the last issue is to take an HDR movie, play it in MPV with metadata on while windowed, then let MPV take over; oftentimes you will notice a shift in color/luminance
|
|
|
dogelition
|
|
Quackdoc
i've yet to find an sRGB application that doesn't look like trash when HDR is enabled on windows 11
|
|
2024-06-18 02:58:15
|
if your monitor displays HDR signals accurately (within its gamut/brightness limitations), it should look pretty much exactly the same as in SDR mode (when the monitor is in an sRGB emulation mode). though that will most likely look brighter near black than you're used to
|
|
|
Quackdoc
|
|
dogelition
if your monitor displays HDR signals accurately (within its gamut/brightness limitations), it should look pretty much exactly the same as in SDR mode (when the monitor is in an sRGB emulation mode). though that will most likely look brighter near black than you're used to
|
|
2024-06-18 02:58:36
|
I have never seen any monitor do that properly
|
|
2024-06-18 02:58:54
|
and i've tested quite a few
|
|
|
dogelition
|
2024-06-18 02:59:17
|
which models specifically?
|
|
|
Quackdoc
|
2024-06-18 02:59:54
|
I dunno, it's not something I've ever gone out to check; the latest one that I remember was some 600-nit gaming monitor
|
|
|
dogelition
|
|
Quackdoc
also I've yet to see any HDR monitor on windows not have windows default to 1500 nits
|
|
2024-06-18 02:59:58
|
afaik DisplayHDR certification mandates that those values are present in the EDID, so that surprises me
|
|
|
lonjil
|
2024-06-18 03:01:21
|
Speaking of Windows. My BenQ monitor looks really weird under Windows. Not using HDR or anything. Just, the colors are wrong. On Linux, and on my other monitor, they look alright.
|
|
|
Quackdoc
|
2024-06-18 03:02:15
|
just on KDE?
|
|
2024-06-18 03:02:25
|
iirc kde swapped to using a gamma 2.2 transfer, maybe that's it?
|
|
|
w
|
|
lonjil
Speaking of Windows. My BenQ monitor looks really weird under Windows. Not using HDR or anything. Just, the colors are wrong. On Linux, and on my other monitor, they look alright.
|
|
2024-06-18 03:02:39
|
is windows downloading the display profile and the others not?
|
|
|
lonjil
|
|
Quackdoc
just on KDE?
|
|
2024-06-18 03:07:10
|
? No HDR anywhere here. It just looks wrong with Windows and always correct with Linux.
|
|
|
Quackdoc
|
2024-06-18 03:07:19
|
but yeah, loads of people still have issues with windows 11 HDR mode, and it doesn't look like they will be fixing it any time soon.
Android 13 finally fixed the HDR tonemapping stuff and has two implementations for it now (you can see the updated tonemapper here https://cs.android.com/android/platform/superproject/main/+/main:frameworks/native/libs/tonemap/tonemap.cpp;drc=7a577450e536aa1e99f229a0cb3d3531c82e8a8d;l=437)
OSX and apple have decent support; I never tested it in depth and have no apple devices to test on now
Linux is, well, linux
So supporting HDR as first class and SDR as second class is a ways off
|
|
|
lonjil
? No HDR anywhere here. It just looks wrong with Windows and always correct with Linux.
|
|
2024-06-18 03:07:44
|
not referencing HDR, just making note of the fact that KDE as a compositor defaults to a pure 2.2 transfer instead of an sRGB one now
|
|
|
lonjil
|
2024-06-18 03:08:02
|
I think Windows is getting a profile from the monitor or something that's just wrong. While eternally wise Linux ignores / doesn't support using that data.
|
|
|
Quackdoc
|
2024-06-18 03:08:25
|
is possible
|
|
|
dogelition
|
2024-06-18 03:09:09
|
windows doesn't do any color management by itself (except when using HDR, or the newer SDR Auto Color Management)
|
|
|
lonjil
|
|
Quackdoc
not referecing HDR, just making note of the fact that KDE as a compositor defaults to a pure 2.2 transfer instead of an sRGB one now
|
|
2024-06-18 03:09:12
|
I don't understand the relevance. If you're not in HDR mode, the pixels that the application renders just get sent to the screen unaltered.
|
|
|
w
|
2024-06-18 03:10:32
|
are you sure windows isnt the correct one and everything else isnt wrong
|
|
|
Quackdoc
|
|
lonjil
I don't understand the relevance. If you're not in HDR mode, the pixels that the application renders just get sent to the screen unaltered.
|
|
2024-06-18 03:13:11
|
I believe kwin does color management internally, but I might be wrong; it's something I could probably test later
|
|
|
_wb_
|
2024-06-18 03:13:34
|
Sending application pixels directly to the screen is not a great approach, that means all applications need to 1) have some way to figure out the display space, and 2) implement proper color management. Which is almost surely just not going to happen.
|
|
|
Quackdoc
|
|
_wb_
Sending application pixels directly to the screen is not a great approach, that means all applications need to 1) have some way to figure out the display space, and 2) implement proper color management. Which is almost surely just not going to happen.
|
|
2024-06-18 03:14:32
|
depends on the application, a lot of higher end applications like MPV, or professional applications like NLEs let you set this, but yeah general consumer applications...
|
|
|
w
|
2024-06-18 03:14:45
|
white point and gamma curves should get applied
|
|
2024-06-18 03:15:10
|
(things that aren't handled by app color management)
|
|
|
dogelition
|
2024-06-18 03:15:32
|
also, the problem with trying to switch from the sRGB EOTF to something like 2.2 gamma is that the end result will always be varying degrees of wrong... e.g. if you're doing photo editing in an editor that supports color management, and you want sRGB images to look like 2.2 instead, you basically have to lie to the application about the gamma of your monitor. so images with a transfer function other than sRGB will then also be rendered too dark. similar issue with web browsers, as almost all content on the web is in sRGB
while windows's approach to using the sRGB EOTF for all SDR applications in HDR can look bad subjectively, it's at least technically correct for all scenarios
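A quick sketch of the curves being argued about here (a hypothetical comparison, not from any of the linked sources): the piecewise sRGB EOTF and pure 2.2 gamma agree closely in the midtones but diverge sharply near black, which is exactly why swapping one for the other shifts shadow detail.

```python
# Compare the piecewise sRGB EOTF (IEC 61966-2-1) against pure 2.2 gamma.
# The two nearly match in the mids but differ a lot near black.

def srgb_eotf(v: float) -> float:
    """Piecewise sRGB EOTF: linear segment near black, 2.4 power above."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def gamma22_eotf(v: float) -> float:
    """Pure power-law 2.2 gamma."""
    return v ** 2.2

for code in (0.02, 0.1, 0.5, 0.9):
    s, g = srgb_eotf(code), gamma22_eotf(code)
    print(f"signal {code:.2f}: sRGB -> {s:.5f}, 2.2 -> {g:.5f}")
```

Near black (signal 0.02), pure 2.2 outputs roughly an eighth of the luminance the sRGB curve does, so content graded on one and displayed on the other gets crushed or lifted shadows.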
|
|
|
lonjil
|
|
w
are you sure windows isnt the correct one and everything else isnt wrong
|
|
2024-06-18 03:15:49
|
The Windows colors are way off from how things look on my phone, my iPad, my laptop. My Lenovo screen under Windows matches those other things. Under Linux both screens match the other things. It's just that one BenQ monitor only under Windows that doesn't.
|
|
|
Quackdoc
|
2024-06-18 03:16:03
|
the general approach that should be done is application sends out a standard colorspace, and compositor maps it to whatever it needs.
higher end applications can send out to what it's supposed to be, and notify the compositor to not do anything
|
|
|
w
|
|
lonjil
The Windows colors are way off from how things look on my phone, my iPad, my laptop. My Lenovo screen under Windows matches those other things. Under Linux both screens match the other things. It's just that one BenQ monitor only under Windows that doesn't.
|
|
2024-06-18 03:16:24
|
but you could have changed the settings on the monitor to counter the wrongness that the profile would have been correcting
|
|
|
dogelition
|
|
lonjil
The Windows colors are way off from how things look on my phone, my iPad, my laptop. My Lenovo screen under Windows matches those other things. Under Linux both screens match the other things. It's just that one BenQ monitor only under Windows that doesn't.
|
|
2024-06-18 03:16:31
|
i assume you checked if the HDMI signal range is set correctly? (if you're using HDMI)
|
|
|
w
|
2024-06-18 03:17:34
|
yeah or gpu setting is wack
|
|
|
lonjil
|
|
_wb_
Sending application pixels directly to the screen is not a great approach, that means all applications need to 1) have some way to figure out the display space, and 2) implement proper color management. Which is almost surely just not going to happen.
|
|
2024-06-18 03:17:34
|
But if you assume that the input is sRGB, and that the monitor is sRGB, there's nothing to do. Doesn't stop you from doing something else in other situations. Applications don't get direct access to the screen, compositors still have full control over the pipeline and what is done.
|
|
|
dogelition
i assume you checked if the HDMI signal range is set correctly? (if you're using HDMI)
|
|
2024-06-18 03:18:10
|
How do you do that on Windows?
|
|
|
dogelition
|
2024-06-18 03:18:30
|
it's part of the GPU driver settings, not windows itself, so it depends on what GPU you have
|
|
|
Quackdoc
|
|
lonjil
But if you assume that the input is sRGB, and that the monitor is sRGB, there's nothing to do. Doesn't stop you from doing something else in other situations. Applications don't get direct access to the screen, compositors still have full control over the pipeline and what is done.
|
|
2024-06-18 03:18:39
|
iirc what kwin does now is assume the monitor is 2.2 and the application is putting out sRGB
|
|
2024-06-18 03:18:44
|
because sRGB is the norm for apps
|
|
|
lonjil
|
2024-06-18 03:20:17
|
Kwin in HDR mode has the goal of making SDR content look the same in HDR mode as in SDR mode
|
|
|
Quackdoc
|
2024-06-18 03:20:57
|
im not talking about HDR mode
|
|
|
lonjil
|
2024-06-18 03:21:42
|
But it doesn't do anything in SDR mode
|
|
|
Quackdoc
|
2024-06-18 03:21:58
|
does it not? as I said, i thought when I checked it, it did
|
|
|
lonjil
|
2024-06-18 03:22:22
|
With HDR, it needs to decide how to display SDR content, and it does so by pretending sRGB = 2.2
|
|
|
Quackdoc
|
2024-06-18 03:22:41
|
I may have been mistaken, but I opened old kwin and new kwin since the change and noticed a difference when swapping between the two
|
|
|
lonjil
|
2024-06-18 03:23:15
|
Maybe they did something weird in SDR mode too. I only heard about HDR stuff.
|
|
2024-06-18 03:24:35
|
The idea is that monitors are 2.2, and "sRGB" content is already mastered for being viewed on 2.2 displays (because that's what everyone is already using). So converting from sRGB to 2.2 seems like it'd undermine that.
|
|
|
Quackdoc
|
2024-06-18 03:28:15
|
the issue is that content is not being mastered for 2.2 displays, content is still being mastered for sRGB in many cases
|
|
2024-06-18 03:28:41
|
hence the headache between 2.2 and sRGB, if it was just one or the other, so many problems would be solved
|
|
|
dogelition
|
|
lonjil
The idea is that monitors are 2.2, and "sRGB" content is already mastered for being viewed on 2.2 displays (because that's what everyone is already using). So converting from sRGB to 2.2 seems like it'd undermine that.
|
|
2024-06-18 03:28:48
|
treating sRGB content as ~2.2 gamma does often look better, but the problem is that that hack falls apart as soon as color management (e.g. via ICC profiles) is involved
|
|
|
lonjil
|
2024-06-18 03:29:46
|
Why does it fall apart?
|
|
|
Quackdoc
|
2024-06-18 03:29:50
|
I mean it falls apart depending on the monitor itself
|
|
2024-06-18 03:29:59
|
some monitors are sRGB like mine, some are gamma 2.2
|
|
|
dogelition
|
2024-06-18 03:30:00
|
personally, i have my OLED TV calibrated to BT.709 (with the proper 2.4 gamma), which means that movies/videos are displayed accurately, while sRGB content is objectively too dark (but still looks good, subjectively)
and for HDR games and movies i can just enable windows HDR, and that's displayed accurately too
|
|
|
w
|
2024-06-18 03:30:08
|
false
|
|
2024-06-18 03:30:10
|
you cant calibrate oled
|
|
|
Quackdoc
|
2024-06-18 03:30:16
|
what?
|
|
2024-06-18 03:30:36
|
https://tenor.com/view/math-lady-old-gif-18904946
|
|
|
w
|
2024-06-18 03:31:32
|
maybe if you have it on blast on max white for 10 hours before profiling
|
|
2024-06-18 03:32:39
|
<a:trollform:771932238748712979>
|
|
|
lonjil
|
2024-06-18 03:32:59
|
On KDE, apps that don't do their own HDR rendering are treated as 2.2, which indirectly means that stuff like sRGB images are treated like 2.2
|
|
|
dogelition
|
|
lonjil
Why does it fall apart?
|
|
2024-06-18 03:33:10
|
e.g. if you're telling your browser that your display is sRGB (to make it pass web content through 1:1, without modifying the gamma) while your display is 2.2, you do get the desired effect on sRGB content. but any content with a different transfer function will also be color managed under the assumption that your display uses the sRGB EOTF, while in reality it uses 2.2 gamma -> the end result will be too dark
|
|
2024-06-18 03:33:52
|
it's probably a bigger issue for things like photo editing, as in practice almost all web content should be using the sRGB EOTF
firefox, by default, treats videos as also using the sRGB EOTF (i.e. it doesn't modify the gamma if it thinks your display is sRGB), don't know what exactly chrome does there
|
|
|
Quackdoc
|
2024-06-18 03:34:18
|
> browser that your display is sRGB (to make it pass web content through 1:1, without modifying the gamma) while your display is 2.2, you do get the desired effect on sRGB content
this is wrong, if you are viewing sRGB content on a 2.2 display the blacks will be messed up
|
|
2024-06-18 03:34:34
|
this is a significant issue for a lot of folk
|
|
|
dogelition
|
2024-06-18 03:35:04
|
by desired effect i mean if we're working under the assumption that sRGB content "should" in reality be viewed on a 2.2 display
|
|
2024-06-18 03:35:15
|
as there is some controversy around that, due to some ambiguity in the sRGB standard
|
|
|
Quackdoc
|
2024-06-18 03:35:33
|
ah you mean "I graded my content for gamma 2.2 but tagged it in sRGB"?
|
|
|
dogelition
|
|
w
|
2024-06-18 03:36:08
|
well it's a good thing that the people making things are calibrated or using a mac
|
|
|
Quackdoc
|
2024-06-18 03:36:11
|
also there is no ambiguity in the sRGB standard, it's actually really clear. Troy Sobotka talks about this a lot, even in the "color management for linux developers" material
|
|
2024-06-18 03:36:37
|
the repo being here https://gitlab.freedesktop.org/pq/color-and-hdr/
|
|
|
dogelition
|
2024-06-18 03:39:55
|
<https://community.acescentral.com/t/srgb-piece-wise-eotf-vs-pure-gamma/4024>
this thread has some interesting discussion, including one of the people who actually worked on the standard
it certainly doesn't help that the standard literally says that the reference display is characterized by 2.2 gamma
|
|
|
Quackdoc
the repo being here https://gitlab.freedesktop.org/pq/color-and-hdr/
|
|
2024-06-18 03:41:12
|
which page there specifically?
|
|
|
Quackdoc
|
2024-06-18 03:43:36
|
I want to say it was the eotf issue? https://gitlab.freedesktop.org/pq/color-and-hdr/-/issues/12 ???
|
|
2024-06-18 03:43:43
|
it's been a while so i would need to dig for it
|
|
2024-06-18 03:45:29
|
also, on the headache itself: just in general, for anyone reading, this video is worth watching, https://www.youtube.com/watch?v=NzhUzeNUBuM
|
|
|
dogelition
|
|
Quackdoc
I want to say it was the eotf issue? https://gitlab.freedesktop.org/pq/color-and-hdr/-/issues/12 ???
|
|
2024-06-18 03:53:13
|
some interesting discussion in there, thanks
IMO, the biggest issue with interpreting "sRGB" as having 2.2 gamma involved in the colorimetry is that it would mean everyone's been doing color management wrong for decades. in an ICC color managed workflow, using the official sRGB profile and an accurate profile of your display, you're effectively creating content for a display which uses the sRGB EOTF. so i think you can argue that, even if the original intent of the spec might have been different, the sRGB EOTF has become the reference standard
|
|
|
Quackdoc
|
2024-06-18 03:54:55
|
I think regardless everyone has been doing it wrong since it's really complicated [av1_dogelol](https://cdn.discordapp.com/emojis/867794291652558888.webp?size=48&quality=lossless&name=av1_dogelol)
|
|
|
spider-mario
|
|
Cacodemon345
Hence HDR in quotes.
|
|
2024-06-18 06:28:06
|
they’re orthogonal things
|
|
2024-06-18 06:28:19
|
having a somewhat narrow gamut doesn’t preclude being a true HDR display
|
|
2024-06-18 06:28:27
|
in principle, you could even have greyscale HDR
|
|
|
Orum
|
2024-06-18 07:26:17
|
exactly, HDR is simply the difference between light and dark (and everything in between), while gamut coverage is about what colors can be displayed
|
|
|
_wb_
|
2024-06-18 07:27:03
|
Isn't grayscale HDR something that has been used in medical imaging for a long time now?
|
|
2024-06-18 07:28:25
|
HDR is basically extended luma range, WCG is basically extended chroma range.
|
|
|
Quackdoc
|
|
spider-mario
having a somewhat narrow gamut doesn’t preclude being a true HDR display
|
|
2024-06-18 07:29:18
|
does HDR even *have* a proper definition? I've seen many people consider HDR to require both the transfer and a large gamut together.
|
|
|
_wb_
|
2024-06-18 07:29:52
|
They're often confused because both things tend to require more precision (higher bit depth) and proper color management.
|
|
|
Quackdoc
|
2024-06-18 07:30:35
|
I personally consider just the transfer, but I have seen at least the statement "HDR refers to a specific type of colorspace" more than once, which would necessitate taking gamut into consideration
|
|
|
Orum
|
2024-06-18 07:30:36
|
really HDR, in the capture sense, is about how light is quantized on a medium
|
|
2024-06-18 07:31:43
|
and while you can have 8-bit HDR, it's prone to banding, so it's preferable to have higher bit depth
|
|
|
Quackdoc
|
2024-06-18 07:32:18
|
I would like to see a proper hard definition from some industry standard group
|
|
|
_wb_
|
2024-06-18 07:32:36
|
It's hard to define a specific criterion that separates SDR from HDR. Something like DisplayHDR-400 can have a smaller dynamic range than some SDR displays.
|
|
|
Quackdoc
|
2024-06-18 07:33:11
|
I know, this is why I personally try to be specific in listing HLG or PQ when relevant
|
|
2024-06-18 07:33:29
|
but for me "HDR" means about as much as "RAW" does
|
|
2024-06-18 07:33:35
|
AKA next to nothing
|
|
|
dogelition
|
|
Quackdoc
does HDR even *have* a proper definition? I've seen many people consider HDR to require both the transfer and a large gamut together.
|
|
2024-06-18 07:34:35
|
at least in the consumer (video) world, "HDR" usually refers to BT.2100 ("HDR-TV") or related standards like HDR10+ or Dolby Vision, which all use the BT.2020 primaries
|
|
2024-06-18 07:34:55
|
but the technical term HDR by itself doesn't say anything about the gamut involved
|
|
|
Quackdoc
|
|
dogelition
at least in the consumer (video) world, "HDR" usually refers to BT.2100 ("HDR-TV") or related standards like HDR10+ or Dolby Vision, which all use the BT.2020 primaries
|
|
2024-06-18 07:35:22
|
ive seen it in reference to dcip3 as well
|
|
|
dogelition
but the technical term HDR by itself doesn't say anything about the gamut involved
|
|
2024-06-18 07:35:45
|
that's what im asking does HDR have a technical definition anyone can point to? because I have not yet found one
|
|
2024-06-18 07:35:59
|
as far as I can tell, it's just "kinda a thing"
|
|
|
_wb_
|
2024-06-18 07:36:00
|
Also there is the signal space and then there is the part of it that the display can actually reproduce or that the actual image content actually contains.
|
|
|
A homosapien
|
|
Quackdoc
that's what im asking does HDR have a technical definition anyone can point to? because I have not yet found one
|
|
2024-06-18 07:36:17
|
I agree, I would also like to know
|
|
|
Quackdoc
|
2024-06-18 07:36:57
|
HDR gives the same feelings to me that I get when I hear "raw content"
|
|
|
dogelition
|
|
Quackdoc
that's what im asking does HDR have a technical definition anyone can point to? because I have not yet found one
|
|
2024-06-18 07:37:42
|
the problem is that it's kind of a vague technical term that's used to mean many different things
in the context of bt.2100, i'd say the point is to have a higher maximum luminance while also having sufficiently many steps near black to preserve detail there (and across the rest of the range, too)
|
|
|
_wb_
|
2024-06-18 07:38:42
|
A display can understand signals in Rec2100 PQ but e.g. have a gamut that is 90% of P3 and goes only to 1000 nits.
|
|
|
Quackdoc
|
|
dogelition
the problem is that it's kind of a vague technical term that's used to mean many different things
in the context of bt.2100, i'd say the point is to have a higher maximum luminance while also having sufficiently many steps near black to preserve detail there (and across the rest of the range, too)
|
|
2024-06-18 07:39:36
|
this is exactly my point, it's like `raw`: what does `raw` mean in photography or video? next to nothing unless you quantify it. to give meaning to this example: video in general is a nightmare, `raw` could mean unmassaged RGB values, a raw sensor reading, ProRes output from a camera, DNG, J2K output from a camera, etc.
|
|
|
dogelition
|
2024-06-18 07:40:09
|
https://downloads.bbc.co.uk/rd/pubs/papers/HDR/BBC_HDRTV_FAQ.pdf
|
|
|
Quackdoc
|
2024-06-18 07:40:17
|
like the term `raw`, `HDR` I would argue has no real meaning unless quantified, outside of that, it's more or less just marketing
|
|
2024-06-18 07:42:04
|
this is even worse because of the last bit
> A better definition of dynamic range is the ratio between blackest black and the brightest white that can be seen on the display without significant artefacts.
|
|
2024-06-18 07:42:24
|
~~now we need to define significant artifacts to use it technically~~
|
|
|
dogelition
|
2024-06-18 07:43:28
|
the "high" part of "high dynamic range" is already subjective
|
|
|
Quackdoc
|
2024-06-18 07:44:25
|
man... I just want a technical definition to refer to, at least as much as "a type of transfer function that can, when given enough data, represent a luminance range greater than X or Y"
|
|
2024-06-18 07:45:04
|
linear would be classified as HDR under that though... which I guess...
|
|
2024-06-18 07:45:16
|
but that really fits the bill for most transfer functions
|
|
|
A homosapien
|
2024-06-18 07:46:49
|
I agree, it's kind of a nebulous definition
> Conventional 8-bit SDR TV has a displayed dynamic range of about 32:1
Also this is wrong, the best 8-bit SDR TVs can reach upwards of 5000:1
|
|
|
Quackdoc
|
2024-06-18 07:47:50
|
I hate the current situation around this because it's more or less like, I feel like that's HDR and thus it's HDR.
|
|
|
A homosapien
|
2024-06-18 07:52:05
|
As far as I'm aware, when people talk about HDR, they mean 3 things: It has a higher bit-depth (>8 bits), has a wider color gamut (anything better than rec.709/sRGB), and something about peak brightness, nits, cd/m², or whatever.
|
|
2024-06-18 07:52:30
|
It's a mess tbh
|
|
|
Quackdoc
|
2024-06-18 07:52:55
|
generally, the minimum common agreement is the transfer, i.e., how bright the video is encoded to be at a given value
|
|
|
Orum
|
2024-06-18 07:53:01
|
peak brightness is only part of HDR
|
|
|
Quackdoc
|
2024-06-18 07:53:46
|
color gamut is the next thing; it isn't universally agreed upon as a criterion for whether something is "HDR" or not. I personally don't believe it's necessary, but some people do
|
|
|
Orum
|
2024-06-18 07:54:02
|
it means nothing if you don't have a decent black level and easily discernible levels of grey in between
|
|
2024-06-18 07:54:42
|
but yeah, color gamut still has nothing to do with HDR or not
|
|
|
Quackdoc
|
2024-06-18 07:55:10
|
the next thing would be bit depth, many people believe you require 10 bits or greater per color to be HDR, which may have merit quality-wise, but technically speaking I don't agree with it.
the last thing would be white point, if you subscribe to the idea that HDR refers to a `colorspace` then you also need to consider white point, as the three components that make up a colorspace are gamut, transfer, and white point
|
|
|
Orum
but yeah, color gamut still has nothing to do with HDR or not
|
|
2024-06-18 07:55:19
|
the issue is that this isn't defined anywhere that matters
|
|
|
Orum
|
2024-06-18 07:55:44
|
literally any B&W "film" can be HDR (but they *cannot* be WG)
|
|
|
Quackdoc
|
2024-06-18 07:55:54
|
you can disagree with it, but you cannot definitively say it has nothing to do with it, because there is no industry consensus or technical definition
|
|
2024-06-18 07:56:14
|
It's a term that has no real meaning unless quantified
|
|
2024-06-18 07:57:10
|
needless to say, I don't like the term HDR, as I said, it's just as bad as the term raw
|
|
|
Orum
|
2024-06-18 07:57:20
|
maybe, but it's like the definition of space
|
|
2024-06-18 07:58:08
|
we define space to be above the Karman line, but does space just start and end exactly at 100Km? No...
|
|
|
A homosapien
|
|
Quackdoc
needless to say, I don't like the term HDR, as I said, it's just as bad as the term raw
|
|
2024-06-18 07:58:14
|
Hell, look at the Wikipedia page: https://en.wikipedia.org/wiki/High_dynamic_range
|
|
2024-06-18 07:58:19
|
There are *10 definitions* of HDR depending on context
|
|
|
dogelition
|
2024-06-18 07:58:39
|
movies/shows are generally considered to be "proper" hdr if they make good use of the luminance headroom, not necessarily the wider gamut
|
|
|
Orum
|
2024-06-18 07:58:55
|
it's just a convenient point to pick, and what matters is that you can see the difference between older standards for dynamic range, and modern ones
|
|
|
Quackdoc
|
2024-06-18 07:59:21
|
~~all I know is that we should abandon transfer functions and just use linear 32bit data for everything~~
|
|
|
Orum
|
2024-06-18 08:00:25
|
linear should be forever banned from representing any form of luma
|
|
|
Quackdoc
|
|
dogelition
movies/shows are generally considered to be "proper" hdr if they make good use of the luminance headroom, not necessarily the wider gamut
|
|
2024-06-18 08:00:36
|
I mean, I personally agree that "HDR" refers to the effective range of the video's luminance given the current data limitations and transfer, but I cannot definitively say that
|
|
|
dogelition
|
|
dogelition
movies/shows are generally considered to be "proper" hdr if they make good use of the luminance headroom, not necessarily the wider gamut
|
|
2024-06-18 08:00:37
|
a lot of the "hdr" content on netflix, e.g. the one piece series, doesn't even go beyond 100 nits
|
|
|
Quackdoc
|
|
Orum
linear should be forever banned from representing any form of luma
|
|
2024-06-18 08:00:44
|
exr chads say hello
|
|
|
Orum
|
2024-06-18 08:00:55
|
EXR uses floats
|
|
2024-06-18 08:01:07
|
and floats do not have linear resolution
|
|
|
Quackdoc
|
2024-06-18 08:01:07
|
yeah, it's linear float
|
|
2024-06-18 08:01:56
|
specifically there are multiple types of `linear` but meh, too technical for me right now
|
|
|
Orum
|
2024-06-18 08:02:05
|
it's the same with audio; it uses floats, but floats make sense, as you don't care about a tiny difference at high amplitude, but you do at low amplitudes
|
|
|
Quackdoc
|
2024-06-18 08:02:34
|
team chad float vs team virgin ints
|
|
2024-06-18 08:04:01
|
also scRGB is linear too
|
|
|
Orum
|
2024-06-18 08:06:49
|
honestly integer is fine as long as you're using a transform (e.g. HLG, gamma curves, etc.)
|
|
2024-06-18 08:07:02
|
but then you're still non-linear
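A small illustration (hypothetical numbers, Python) of Orum's point about integers needing a transform: with linear 8-bit codes, almost no code values land in the shadows, while a 2.2 gamma encoding spends a big chunk of the range there.

```python
# Why integer storage wants a transfer function: count how many of the
# 256 possible 8-bit codes fall in the darkest 1% of luminance under
# linear encoding vs 2.2 gamma encoding.

BITS = 8
LEVELS = 2 ** BITS  # 256 codes

def linear_lum(code: int) -> float:
    """Luminance if codes are stored linearly."""
    return code / (LEVELS - 1)

def gamma22_lum(code: int) -> float:
    """Luminance if codes are stored with 2.2 gamma encoding."""
    return (code / (LEVELS - 1)) ** 2.2

dark_linear = sum(1 for c in range(LEVELS) if linear_lum(c) <= 0.01)
dark_gamma = sum(1 for c in range(LEVELS) if gamma22_lum(c) <= 0.01)
print(f"codes below 1% luminance: linear={dark_linear}, gamma 2.2={dark_gamma}")
```

Linear gets only 3 codes for the darkest 1% of luminance (hence shadow banding), while 2.2 gamma gets 32; floats sidestep this because their precision is already denser near zero.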
|
|
|
Quackdoc
|
2024-06-18 08:07:47
|
scRGB uses 16bit integer and works fine [av1_akkoShrug](https://cdn.discordapp.com/emojis/654080960492732435.webp?size=48&quality=lossless&name=av1_akkoShrug)
|
|
2024-06-18 08:08:01
|
as long as you have 16 or more bits per color it's fine
|
|
|
lonjil
|
|
dogelition
movies/shows are generally considered to be "proper" hdr if they make good use of the luminance headroom, not necessarily the wider gamut
|
|
2024-06-18 08:08:11
|
hah, they make very poor use of the volume headroom when doing sound mixing, so maybe we shouldn't let movie people do visual HDR...
|
|
|
Orum
|
|
Quackdoc
as long as you have 16 or more bits per color it's fine
|
|
2024-06-18 08:08:34
|
sure, but that's not terribly efficient
|
|
|
Quackdoc
|
2024-06-18 08:08:55
|
who needs efficiency when you can have **moar data**
|
|
|
Orum
|
2024-06-18 08:08:58
|
"just throw more bits at it" should never be the first response
|
|
|
A homosapien
|
2024-06-18 08:11:41
|
What do you mean? Its always my first response.
It solved my marriage, got me a job, and fixed my back pain. <:AV1:805851461774475316>
|
|
|
Quackdoc
|
|
spider-mario
|
|
Orum
it's the same with audio; it uses floats, but floats make sense, as you don't care about a tiny difference at high amplitude, but you do at low amplitudes
|
|
2024-06-18 08:19:53
|
audio is often 24-bit integers
|
|
2024-06-18 08:20:41
|
typically, it’s recorded this way, then processed as such or as 32-bit floats, then dithered to 16 bits for delivery
|
|
|
_wb_
|
2024-06-18 08:25:15
|
SDR is when the max-value color is a white you can use as a background color for something like a web page or a text document. HDR is when you cannot do that because it destroys your eyes and/or your display will not be able to sustain it for a long time.
|
|
|
Quackdoc
|
2024-06-18 08:40:57
|
thats one way of putting it [av1_dogelol](https://cdn.discordapp.com/emojis/867794291652558888.webp?size=48&quality=lossless&name=av1_dogelol)
|
|
|
Orum
|
|
spider-mario
typically, it’s recorded this way, then processed as such or as 32-bit floats, then dithered to 16 bits for delivery
|
|
2024-06-18 10:53:10
|
high end gear will basically always offer f32 these days
|
|
2024-06-18 10:53:32
|
which is the gold standard for capturing as you never have to worry about getting levels correct pre-recording
|
|
|
spider-mario
|
2024-06-18 10:54:00
|
you shouldn’t have to worry with 24-bit integers either
|
|
|
Orum
|
2024-06-18 10:55:09
|
depends if you're trying to record a mouse's squeak and an atomic bomb in the same recording
|
|
|
spider-mario
|
2024-06-18 10:55:37
|
144 dB of dynamic range gets you all the way from the quietest environments you will likely encounter to sounds that are instantly damaging to the ears (https://web.archive.org/web/20211114090548/https://www.noisehelp.com/noise-dose.html)
|
|
2024-06-18 11:06:10
|
(not to mention, the microphone itself is unlikely to exceed that DR anyway)
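The 144 dB figure falls straight out of the usual rule of thumb that each bit of integer PCM adds 20·log10(2) ≈ 6.02 dB of dynamic range; a quick check:

```python
import math

# Theoretical dynamic range of integer PCM: each bit contributes
# 20 * log10(2) ~= 6.02 dB.

def pcm_dynamic_range_db(bits: int) -> float:
    return bits * 20 * math.log10(2)

for bits in (16, 24):
    print(f"{bits}-bit PCM: ~{pcm_dynamic_range_db(bits):.1f} dB")
```

16-bit lands at about 96 dB and 24-bit at about 144 dB, matching the range from near-silence to instantly damaging levels mentioned above.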
|
|
|
Quackdoc
|
2024-06-18 11:08:41
|
>me recording whispers and gun shots in the same audio track
|
|
|
CrushedAsian255
|
|
Quackdoc
>me recording whispers and gun shots in the same audio track
|
|
2024-06-18 11:09:40
|
>me recording an ant crawling while on a bomber jet
|
|
|
Quackdoc
|
2024-06-18 11:09:54
|
[av1_woag](https://cdn.discordapp.com/emojis/852007419474608208.webp?size=48&quality=lossless&name=av1_woag)
|
|
2024-06-18 11:16:13
|
I never did understand how hearing damage actually works, or rather how audio degrades through open air. this chart says 140 dB is immediate damage, but I've had full size rifles go off right near my head (public range, idiot now banned) and there is some minor pain, but nothing lasting; however, shooting a suppressed 22 indoors was just about as bad
|
|
|
lonjil
|
2024-06-18 11:19:47
|
a large amount of the noise actually comes from the supersonic shockwave
|
|
2024-06-18 11:19:59
|
suppressors work best with subsonic bullets
|
|
2024-06-18 11:20:30
|
and if you're indoors more of the noise might bounce back at you rather than dissipate
|
|
2024-06-18 11:20:50
|
> but I've had full size rifles go off right near my head (public range, idiot now banned) and, there is some minor pain, but nothing lasting, however shot a suppressed 22 indoors and it was just about as bad
both of those were permanent damage
|
|
|
Quackdoc
|
2024-06-18 11:20:53
|
I am aware of that, It's just the rifle was *really* close to my head when it went off, I got the whole shebang
|
|
|
lonjil
> but I've had full size rifles go off right near my head (public range, idiot now banned) and, there is some minor pain, but nothing lasting, however shot a suppressed 22 indoors and it was just about as bad
both of those were permanent damage
|
|
2024-06-18 11:21:23
|
maybe, but zero measurable effects
|
|
|
CrushedAsian255
|
2024-06-18 11:21:30
|
no ears
|
|
2024-06-18 11:21:40
|
https://tenor.com/view/my-ears-make-it-stop-loud-noise-quiet-gif-17369234
|
|
|
lonjil
|
|
Quackdoc
I am aware of that, It's just the rifle was *really* close to my head when it went off, I got the whole shebang
|
|
2024-06-18 11:21:57
|
I would posit that ears probably have a maximum volume they can register
|
|
2024-06-18 11:22:14
|
so that anything above that volume sounds the same
|
|
|
CrushedAsian255
|
|
lonjil
I would posit that ears probably have a maximum volume they can register
|
|
2024-06-18 11:22:17
|
like they clip?
|
|
|
lonjil
|
2024-06-18 11:22:25
|
yeah, I would guess so
|
|
|
Quackdoc
|
2024-06-18 11:22:41
|
possibly, but in the end, I still regularly get hearing tests and they're about the same as they've always been
|
|
2024-06-18 11:23:26
|
kinda like welding with no mask [av1_dogelol](https://cdn.discordapp.com/emojis/867794291652558888.webp?size=48&quality=lossless&name=av1_dogelol)
|
|
|
lonjil
|
2024-06-18 11:25:54
|
I got sunburned on my eyelids when I did that
|
|
|
Quackdoc
|
2024-06-18 11:26:30
|
that must have been uncomfortable
|
|
|
lonjil
|
|
Quackdoc
possibly, but in the end, I still regularly get hearing tests and they are about as they have always ever been
|
|
2024-06-18 11:26:45
|
permanent cumulative damage isn't necessarily more than just a very small amount of damage. Just don't have guns go off next to your head too many more times!
|
|
|
Quackdoc
that must have been uncomfortable
|
|
2024-06-18 11:26:49
|
quite
|
|
|
Quackdoc
|
|
lonjil
permanent cumulative damage isn't necessarily more than just a very small amount of damage. Just don't have guns go off next to your head too many more times!
|
|
2024-06-18 11:27:16
|
I don't plan on it [av1_kekw](https://cdn.discordapp.com/emojis/758892021191934033.webp?size=48&quality=lossless&name=av1_kekw)
|
|
2024-06-18 11:27:32
|
though occasionally, I do forget earpro, or they aren't on quite right
|
|
|
_wb_
|
2024-06-21 02:17:53
|
https://en.wikipedia.org/wiki/Aritzia — these guys are serving quite a lot of jxl images right now
|
|
|
yoochan
|
2024-06-21 02:26:32
|
which means most of their customers have an iphone ?
|
|
|
w
|
2024-06-21 02:27:06
|
vancouver mentioned
|
|
2024-06-21 02:27:10
|
most certainly chinese + iphone
|
|
|
_wb_
|
2024-06-21 02:30:47
|
> targeted towards young North American women
I think that means mostly iPhones, yes 🙂
|
|
2024-06-21 02:31:36
|
but I meant in absolute terms, I haven't checked what percentage of their images are jxl, I just see they are serving a lot of jxls
|
|
|
w
|
2024-06-21 02:34:53
|
looks like all product images
|
|
2024-06-21 02:35:05
|
for a store i guess it's all
|
|
|
username
|
2024-06-21 02:35:22
|
and also all the temp placeholder images as well
|
|
2024-06-21 02:35:36
|
so basically all images doubled
|
|
2024-06-21 02:38:15
|
all the loading/placeholder images are the same res as the final images
|
|
|
_wb_
|
2024-06-21 02:51:15
|
yeah I noticed they do a weird kind of loading thing with large placeholders that are explicitly pixelated — there are better ways to do that (use small images, make the browser upscale it with NN)
|
|
2024-06-21 02:51:43
|
I wonder if they're going for some kind of artistic effect with that, or if it's just a poorly implemented LQIP thing
|
|
|
jonnyawsom3
|
2024-06-21 03:29:25
|
`https://assets.aritzia.com/image/upload/f_auto/q_10/e_pixelate:50/c_limit,w_1200/su24-wk10-06-19-hp-pt2-c`
`https://assets.aritzia.com/image/upload/f_auto/q_auto/c_limit,w_1200/su24-wk10-06-19-hp-pt2-c`
Just seems like a strange way of thumbnailing
|
|
2024-06-21 03:40:20
|
I assume they did it for image formats without progressive loading, but in the case of JXL it just makes loading take even longer for worse results
|
|
|
username
|
2024-06-21 03:42:46
|
speaking of that reminds me how the update that fixed Shopify's logic for choosing JXL files was also the same update where Shopify forced hashblur for every image (including JXLs) and only shows the full image once it's fully done downloading
|
|
2024-06-21 03:43:59
|
a lot of CDNs and websites do things that just completely nullify JPEG XL's progressive decoding abilities by only assuming full images should be shown once they are 100% downloaded
|
|
|
w
|
2024-06-21 03:46:39
|
proof jxl isnt needed
|
|
2024-06-21 03:46:43
|
just like webp and mozjpeg
|
|
|
username
|
2024-06-21 03:49:04
|
not really, it's just proof that websites assume all images are only presentable once fully downloaded and feel like they have to dual-download placeholders generated from the full image(s)
|
|
|
w
|
2024-06-21 03:53:10
|
and that problem is solved
|
|
2024-06-21 03:53:12
|
just not with jxl
|
|
|
username
|
2024-06-21 03:55:50
|
horrible solution imo, I don't like websites downloading a bunch of separate images/files/data that then has to be processed through javascript just so it can be discarded and switched out with something I was already downloading
|
|
2024-06-21 03:57:48
|
just adding unnecessary layers to solve a problem that wouldn't be a problem if image formats/decoders did the work themselves
|
|
|
w
|
2024-06-21 03:58:52
|
yeah just like webp
|
|
|
username
|
2024-06-21 04:01:46
|
WebP's problem was having downsides that previous formats did not, while also not providing enough benefits over them, which is why Mozilla fought against it back in the day
|
|
|
w
|
2024-06-21 04:02:48
|
exactly
|
|
2024-06-21 04:02:54
|
why use jxl when you can use multiple jpgs
|
|
2024-06-21 04:03:34
|
and it performs faster
|
|
|
_wb_
|
2024-06-21 04:07:27
|
Safari doesn't implement progressive loading of jxl at all 😦 — I assume they went for a simple integration of just passing a full bitstream to CoreMedia and getting pixels back, rather than doing a deeper integration like they have for jpeg
|
|
|
w
why use jxl when you can use multiple jpgs
|
|
2024-06-21 04:09:08
|
all browsers support progressive jpeg so there's no need to use multiple jpegs
|
|
2024-06-21 04:10:49
|
the whole "placeholder images" thing is basically an anti-pattern and web devs should just stop doing that, and use progressive jpeg or jxl instead
|
|
|
Quackdoc
|
|
_wb_
Safari doesn't implement progressive loading of jxl at all 😦 — I assume they went for a simple integration of just passing a full bitstream to CoreMedia and getting pixels back, rather than doing a deeper integration like they have for jpeg
|
|
2024-06-21 08:02:05
|
To be fair, it's probably not that high of a priority for them, given their user base and the current state of the web
|
|
|
VcSaJen
|
|
username
a lot of CDNs and websites do things that just completely nullify JPEG XL's progressive decoding abilities by only assuming full images should be shown once they are 100% downloaded
|
|
2024-06-22 09:53:15
|
Might be intentional. With progressive formats it's not always obvious whether the image has fully loaded or the source is just blurry, especially on the last couple of "progressions"
|
|
2024-06-22 09:56:32
|
JS-based loading bar might be better, though
|
|
|
lonjil
|
2024-06-22 10:16:59
|
that's why it's nice that jxl only has 2 passes by default
|
|
|
Foxtrot
|
2024-06-22 11:22:05
|
2 passes means 1. blurry 2. fully loaded?
|
|
|
Demiurge
|
|
_wb_
The issue with tone mapping algorithms is that they might make artistically 'wrong' decisions, which is why gain maps can bring an advantage since they can spell out the exact way a local tone mapping has to be done. But realistically, I think very few people will actually generate custom gain maps, and instead they'll use whatever algorithm their software is implementing — so we're spending bytes on encoding the result of an algorithm, as opposed to just signaling "use this algorithm".
|
|
2024-06-22 11:22:29
|
Do we know what algorithm Apple is using to generate their gain maps? I thought their camera captures multiple images at different exposure levels to generate an HDR image. But if the gain map can be generated from the HDR image with an algorithm, then why is Apple choosing to do something absurd like storing the result of an algorithm instead of just signalling what algorithm to use?
|
|
|
lonjil
|
|
Foxtrot
2 passes means 1. blurry 2. fully loaded?
|
|
2024-06-22 11:24:00
|
yeah
|
|
|
Foxtrot
|
2024-06-22 11:26:59
|
Sounds good. I hate when websites are badly coded and I try to click some button while media is loading, then an image appears, the whole site shifts, and I click on an advertisement accidentally 😄
|
|
|
CrushedAsian255
|
|
Demiurge
Do we know what algorithm Apple is using to generate their gain maps? I thought their camera captures multiple images at different exposure levels to generate an HDR image. But if the gain map can be generated from the HDR image with an algorithm, then why is Apple choosing to do something absurd like storing the result of an algorithm instead of just signalling what algorithm to use?
|
|
2024-06-22 11:32:51
|
I think the way it works is that it’s an SDR image and the gain map signifies the HDR brightness, so like RGBE. I’m not 100% sure though so could be wrong
|
|
|
dogelition
|
2024-06-22 12:02:24
|
yeah, afaik the idea is to store a regular SDR image that can be displayed by any software, and then have the extra HDR data in there that can be used to recover the "full" HDR image (if the software supports it)
presumably, whatever computational photography magic they use to generate the SDR image from the HDR data is much more expensive than applying a gain map when displaying an image, so i guess that's also a good reason to pre-compute it
|
|
|
lonjil
|
2024-06-22 12:05:38
|
I'm pretty sure it isn't
|
|
2024-06-22 12:06:00
|
They do it that way so that old software that doesn't support HDR images will have an SDR image to show
|
|
2024-06-22 12:06:33
|
And you can't actually recover the full HDR image, since there will be a lot of quantization errors leading to a lower quality image.
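A toy numeric sketch of the quantization point (not Apple's actual pipeline; the [0, 4] stop range and 8-bit depth for the gain map are illustrative assumptions):

```python
# Toy illustration: storing a log2 gain in an 8-bit map and reconstructing
# HDR from SDR introduces quantization error, so the "full" HDR image is
# never recovered exactly.
import math

def quantize(x, lo, hi, bits=8):
    """Quantize x in [lo, hi] to an integer code and back."""
    steps = (1 << bits) - 1
    code = round((x - lo) / (hi - lo) * steps)
    return lo + code / steps * (hi - lo)

sdr = 0.18                        # SDR pixel value (linear, 0..1)
hdr = 2.75                        # "true" HDR linear value for the same pixel
gain = math.log2(hdr / sdr)       # log2 gain, as typical gain-map specs use
gain_q = quantize(gain, 0.0, 4.0) # assume a [0, 4] stop range, 8-bit map
hdr_rec = sdr * 2 ** gain_q
rel_err = abs(hdr_rec - hdr) / hdr
print(f"gain={gain:.4f}  quantized={gain_q:.4f}  relative error={rel_err:.4%}")
```

The error here is small, but it compounds with the lossy compression applied to the gain map image itself.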
|
|
|
Demiurge
|
2024-06-22 12:26:10
|
Yeah, exactly. It just seems pretty insane actually, to store a (degraded) output of an algorithm instead of just signalling what algorithm to apply...
|
|
2024-06-22 12:28:08
|
Like wb says, I doubt ANYONE is actually manually editing a gain map by hand. And having 2 separate images that could potentially be utterly different scenes if you wanted them to be, seems like a bad idea in general
|
|
|
dogelition
|
|
lonjil
They do it that way so that old software that doesn't support HDR images will have an SDR image to show
|
|
2024-06-22 12:28:21
|
yes, but the SDR image is what the vast majority of people will see, so it has to look as good as possible
though i suppose the relationship between the SDR and HDR image could be relatively simple, depending on where in the pipeline they generate the two variants of the image (or how much they care about the HDR image looking significantly better)
|
|
|
Demiurge
|
2024-06-22 12:29:52
|
It defeats the purpose of switching to a more efficient lossy format like HEIC. Wasting bytes storing 2 separate images for every image...
|
|
2024-06-22 12:31:13
|
Well, maybe not, if they're doing it smart...
|
|
|
jonnyawsom3
|
2024-06-22 01:09:30
|
Out of all the formats JXL is probably best for gainmaps too. Could use a single greyscale 16 bit extra channel alongside the normal 8 bit VarDCT image to keep the best of both worlds... Although I've just reinvented RGBE...
|
|
|
CrushedAsian255
|
|
Out of all the formats JXL is probably best for gainmaps too. Could use a single greyscale 16 bit extra channel alongside the normal 8 bit VarDCT image to keep the best of both worlds... Although I've just reinvented RGBE...
|
|
2024-06-23 01:48:39
|
Yeah that’s basically RGBE lol
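For the curious, the RGBE idea mentioned here (Ward's shared-exponent packing from the Radiance HDR format) can be sketched like this; the exact byte values are just from this toy implementation:

```python
# Sketch of classic RGBE: 8 bits per channel plus a shared 8-bit exponent
# covers a huge dynamic range in only 4 bytes per pixel.
import math

def rgbe_encode(r, g, b):
    """Pack linear RGB floats into four bytes (R, G, B, E)."""
    m = max(r, g, b)
    if m < 1e-32:
        return (0, 0, 0, 0)
    frac, exp = math.frexp(m)          # m = frac * 2**exp, frac in [0.5, 1)
    scale = frac * 256.0 / m
    return (int(r * scale), int(g * scale), int(b * scale), exp + 128)

def rgbe_decode(r, g, b, e):
    """Invert the packing back to linear floats."""
    if e == 0:
        return (0.0, 0.0, 0.0)
    scale = math.ldexp(1.0, e - 136)   # 2**(e - 128 - 8)
    return (r * scale, g * scale, b * scale)

packed = rgbe_encode(3.2, 0.5, 0.01)
print(packed, rgbe_decode(*packed))
```

Note the downside: all three channels share one exponent, so precision in the dim channels suffers when one channel is bright.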
|
|
|
_wb_
|
2024-06-23 06:55:25
|
The gain maps approach is mostly about interoperability. It makes it easy for applications that don't support HDR to gracefully degrade. Too easy, if you ask me. The downside of graceful degradation is that there's not much pressure on the wider ecosystem to implement the non-degraded feature.
|
|
|
Quackdoc
|
2024-06-23 06:58:11
|
I see it the other way: currently you have a lot of people who don't "get" HDR, and the few times they're exposed to HDR-capable devices it's almost always just plain SDR content, so a lot of people don't actually care about HDR since their only real exposure is YT videos or Netflix on their phones.
I think that once people start understanding why HDR is great, the ecosystem will naturally progress towards HDR; it's just a matter of properly exposing folks to HDR.
|
|
|
_wb_
|
2024-06-23 08:32:24
|
Sure, but there will also always still be SDR images around. It is hard for a naive end-user to know if an image they're looking at is just an SDR image, or an HDR image gracefully degraded to an SDR image because their software doesn't know how to apply gain maps.
|
|
|
VcSaJen
|
2024-06-23 09:41:40
|
Where wide-gamut images fit into this?
|
|
2024-06-23 09:45:56
|
HDR-to-wide-gamut conversion is kinda an obscure thing, for some reason.
|
|
|
Quackdoc
|
2024-06-23 09:56:21
|
HDR to wide gamut isn't really a thing afaik; you go standard range to high range, and standard gamut to wide gamut. an "HDR to wide gamut" image would simply be tonemapping a "typical" HDR image to SDR without doing gamut mapping
though I suppose it could be possible that HDR is better to extrapolate data from than SDR...
|
|
|
_wb_
|
2024-06-23 11:00:22
|
Gamut and dynamic range are orthogonal things (though in both cases you need more precision if you want a larger range)
|
|
|
CrushedAsian255
|
|
_wb_
Gamut and dynamic range are orthogonal things (though in both cases you need more precision if you want a larger range)
|
|
2024-06-23 11:02:55
|
so HDR is more variance in luma, whereas WCG is more variance in chroma, correct? and since both involve having more variance, they both require higher bit depths.
|
|
|
_wb_
|
2024-06-23 11:05:10
|
Yes. You can have grayscale HDR, while grayscale is the narrowest color gamut you can have 🙂
|
|
|
CrushedAsian255
|
|
_wb_
Yes. You can have grayscale HDR, while grayscale is the narrowest color gamut you can have 🙂
|
|
2024-06-23 11:19:56
|
Grayscale = Ultra-NCG (Narrow Colour Gamut) 🤣
|
|
|
dogelition
|
|
CrushedAsian255
so HDR is more variance in Luma, where as WCG is like more variance in Chroma, correct? and since both involve having more variance, they both require higher bit depths.
|
|
2024-06-23 11:33:02
|
also note that PQ (the most widely used EOTF/"gamma" for HDR content) is specifically optimized to keep banding just below a visible threshold across the entire range, which is why it only requires 10 bits to look good despite the huge luminance range (0 - 10k nits)
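That claim is easy to check numerically. This is a sketch of the ST 2084 PQ EOTF using the constants from the spec; the "step between adjacent 10-bit codes" printout is just one sample point:

```python
# The PQ (SMPTE ST 2084) EOTF maps a normalized signal to absolute
# luminance in nits; its design keeps the relative step between adjacent
# 10-bit codes small even at dark levels.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(e):
    """Map a normalized PQ signal e in [0, 1] to luminance in nits."""
    p = e ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

# Relative luminance jump between adjacent 10-bit codes at a dark level
lo, hi = pq_eotf(100 / 1023), pq_eotf(101 / 1023)
print(f"{lo:.4f} -> {hi:.4f} nits, step = {(hi - lo) / lo:.2%}")
```

Code 1023 lands exactly on 10,000 nits; down near code 100 the steps are a few percent of the signal, which the PQ design keeps just around the visibility threshold.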
|
|
|
CrushedAsian255
|
|
dogelition
also note that PQ (the most widely used EOTF/"gamma" for HDR content) is specifically optimized to keep banding just below a visible threshold across the entire range, which is why it only requires 10 bits to look good despite the huge luminance range (0 - 10k nits)
|
|
2024-06-23 12:58:56
|
i really should learn about these things properly
|
|
|
Orum
|
|
_wb_
Gamut and dynamic range are orthogonal things (though in both cases you need more precision if you want a larger range)
|
|
2024-06-23 02:00:37
|
well, you don't *need* more precision if you want a larger range, but without it you'll get <:banding:804346788982030337> artifacts
|
|
2024-06-23 02:02:49
|
or, theoretically, with enough spatial resolution you could simply have 1 bit of depth and dither between levels to get every possible value, but that's not exactly practical in the real world (for displays) <:NotLikeThis:805132742819053610>
|
|
2024-06-23 02:03:39
|
...but it is how printers with only a single color of ink work
|
|
|
jonnyawsom3
|
2024-06-23 02:25:40
|
Technically, the same precision but with a palette would work fine, assuming the image doesn't use most of the range anyway
|
|
|
_wb_
|
2024-06-23 04:13:43
|
I mean the effective precision you need is higher, however you implement it (spatial or temporal dithering or just having more shades, in the end that's an implementation detail).
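The spatial-dithering idea above can be sketched in a few lines. This is simple one-dimensional error diffusion (a cousin of Floyd-Steinberg, which diffuses in 2-D) on a constant gray level, purely illustrative:

```python
# 1-bit output plus spatial dithering preserves the average level:
# quantize each sample to {0, 1} and push the residual onto the next one.
def dither_1bit(row):
    """Quantize values in [0, 1] to {0, 1}, diffusing the error forward."""
    out, err = [], 0.0
    for v in row:
        v += err
        q = 1.0 if v >= 0.5 else 0.0
        err = v - q          # residual carried to the next sample
        out.append(q)
    return out

row = [0.3] * 1000           # a flat 30% gray
bits = dither_1bit(row)
print(sum(bits) / len(bits)) # local average recovers the gray level
```

This is essentially how single-ink printers render shades: the eye integrates the dot pattern back into the intended average.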
|
|
|
spider-mario
|
|
Orum
or, theoretically, with enough spatial resolution you could simply have 1 bit of depth and dither between levels to get every possible value, but that's not exactly practical in the real world (for displays) <:NotLikeThis:805132742819053610>
|
|
2024-06-23 07:46:35
|
in audio, DSD (used in SACD) kind of does that – it’s 2.82 MHz, 1-bit
|
|
2024-06-23 07:47:25
|
(from what I understand, this is combined with heavy noise shaping to push the noise to the higher frequencies)
|
|
2024-06-23 07:47:48
|
https://sjeng.org/ftp/SACD.pdf is quite critical of the approach, though
|
|
2024-06-23 07:48:44
|
> Single-stage, 1-bit sigma-delta converters are in principle imperfectible. We prove this fact. The reason, simply stated, is that, when properly dithered, they are in constant overload. Prevention of overload allows only partial dithering to be performed. The consequence is that distortion, limit cycles, instability, and noise modulation can never be totally avoided. We demonstrate these effects, and using coherent averaging techniques, are able to display the consequent profusion of nonlinear artefacts which are usually hidden in the noise floor. Recording, editing, storage, or conversion systems using single-stage, 1-bit sigma-delta modulators, are thus inimical to audio of the highest quality. In contrast, multi-bit sigma-delta converters, which output linear PCM code, are in principle infinitely perfectible. (Here, multi-bit refers to at least two bits in the converter.) They can be properly dithered so as to guarantee the absence of all distortion, limit cycles, and noise modulation. The audio industry is misguided if it adopts 1-bit sigma-delta conversion as the basis for any high-quality processing, archiving, or distribution format to replace multi-bit, linear PCM.
|
|
|
lonjil
|
2024-06-23 07:52:18
|
1-bit audio, I've done that when I've wanted audio output from a microcontroller
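The sigma-delta idea the SACD paper criticizes can be sketched minimally. This is a toy first-order modulator with none of the dithering or noise-shaping refinements the paper discusses:

```python
# First-order sigma-delta modulation: a 1-bit (+/-1) output stream whose
# local average tracks the input, with the quantization error integrated
# and fed back.
def sigma_delta(samples):
    """samples in [-1, 1]; returns a +/-1.0 bitstream."""
    s, out = 0.0, []
    for x in samples:
        y = 1.0 if s >= 0 else -1.0   # 1-bit quantizer
        s += x - y                     # integrate the quantization error
        out.append(y)
    return out

bits = sigma_delta([0.25] * 1000)      # a DC input at 25%
print(sum(bits) / len(bits))           # average of the 1-bit stream
</antml>```

For a constant input the average of the bitstream converges to the input level; the error ends up as high-frequency content, which DSD then relies on filtering away.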
|
|
|
Demiurge
|
|
_wb_
Sure, but there will also always still be SDR images around. It is hard for a naive end-user to know if an image they're looking at is just an SDR image, or an HDR image gracefully degraded to an SDR image because their software doesn't know how to apply gain maps.
|
|
2024-06-23 10:22:42
|
That's a good reason for JXL to only support HDR base images...
|
|
|
_wb_
|
2024-06-24 05:01:53
|
Well, we intend to support both but suggest applications do it that way.
|
|
|
Demiurge
|
2024-06-24 08:37:20
|
Maybe if it's a losslessly transcoded JPEG it would make sense...
|
|
2024-06-24 08:38:07
|
But otherwise it seems like a bad idea and even for JPEG it's probably a better idea to just use a color profile eh?
|
|
2024-06-24 08:38:24
|
btw if JPEG can do HDR with a color profile why can't webp?
|
|
|
_wb_
|
2024-06-24 11:33:52
|
JPEG can only do direct HDR when using its 12-bit mode (which is not part of the de facto format) or when using a sufficiently precise implementation of its 8-bit mode that can work with higher bit depth buffers (like jpegli).
WebP only has an 8-bit mode and moreover it has obligatory 4:2:0, obligatory limited-range yuv, and an internal precision that is lower than the internal precision of JPEG's 8-bit mode. So even if you would write a "webpli" that works with higher bit depth buffers, I don't think you would have enough wiggle room to squeeze enough effective precision out of it for HDR.
|
|
2024-06-24 11:35:01
|
Something like UltraHDR (8-bit SDR image + separate gain map) could be applied to WebP, though it would have all of the disadvantages of UltraHDR JPEG and none of its advantages.
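The limited-range precision loss is easy to demonstrate numerically. A minimal sketch, assuming BT.601-style scaling, 8-bit limited range, and round-to-nearest, looking only at the luma of gray levels (the chroma path loses even more, as noted above):

```python
# Just the full-range -> limited-range luma round trip already merges
# gray codes: 256 inputs squeeze into the 16..235 limited range.
def to_limited_luma(v):
    """8-bit full-range gray (R=G=B=v) -> 8-bit limited-range luma."""
    return round(16 + 219 * v / 255)

def from_limited_luma(y):
    """Limited-range luma back to full-range gray."""
    return round((y - 16) * 255 / 219)

recovered = {from_limited_luma(to_limited_luma(v)) for v in range(256)}
print(len(recovered))   # fewer than 256 distinct gray levels survive
```

Only about 220 distinct levels come back, i.e. a fraction of a bit is lost before the codec has even touched the data.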
|
|
|
spider-mario
|
2024-06-24 02:31:08
|
yeah, technically possible but probably hopeless banding-wise with PQ; HLG might have a bit more of a chance
|
|
|
novomesk
|
2024-06-25 08:03:47
|
libjxl was added to Freedesktop SDK, so it will be easy to use with Flatpak applications.
https://gitlab.com/freedesktop-sdk/freedesktop-sdk/-/merge_requests/19916/commits
Previously libjxl was available inside KDE runtime and GNOME runtime.
The presence in the "standard runtime" is even better, because ffmpeg can use libjxl in Flatpak too:
https://gitlab.com/freedesktop-sdk/freedesktop-sdk/-/commit/277f0d65b1f587fc28b81b846ceecf195d5acefc
|
|
|
CrushedAsian255
|
2024-06-27 12:41:15
|
macOS still dies trying to view large JXL files (at the very least it could process the 1:8 downsample and then tile-decode the high resolution data)
|
|
2024-06-27 12:41:44
|
i'm using this as my "large image"
|
|
2024-06-27 12:41:48
|
JPEG XL image, 13284x7473, lossy, 8-bit RGB+Alpha
|
|
2024-06-27 12:41:54
|
although i can probably drop the alpha
|
|
|
Quackdoc
|
2024-06-27 12:43:43
|
it is a decently sized image 0.0
|
|
|
CrushedAsian255
|
2024-06-27 12:44:22
|
here's version with no alpha
|
|
|
Meow
|
|
CrushedAsian255
macOS still dies trying to view large JXL files (at least it can process the 1:8 downsample and then tiled decode the high resolution data)
|
|
2024-06-27 03:31:06
|
Similar to the first year of AVIF on macOS
|
|
|
dogelition
|
2024-06-27 03:56:16
|
not sure how new this is, but `WindowsCodecs.dll` from windows 11 24h2 has functions for creating jpeg xl decoder/encoder instances, and also for reading/writing animation frames
it references `msjpegxl_store.dll`, which is not present on the image, so the actual codec is probably going to be distributed via the microsoft store (similar to `mswebp_store.dll` etc.)
|
|
|
jonnyawsom3
|
2024-06-27 04:09:26
|
We were aware of references to an encoder and decoder for a few months now, but the DLLs are new as far as I'm aware
|
|
2024-06-27 04:09:48
|
<@98957339243073536> soon
|
|
|
KKT
|
|
CrushedAsian255
here's version with no alpha
|
|
2024-06-27 09:47:19
|
I'm getting crazy corruption in Preview with images over 8 bits. Have you tried that?
|
|
|
Foxtrot
|
2024-06-29 07:32:34
|
unfortunate that there isn't any MS dev here who could confirm that they really are cooking a jpeg xl plugin for the ms store
|
|
|
jonnyawsom3
|
2024-06-29 10:25:24
|
Seeing as we saw registry entries months ago, now actual DLL calls in 24H2 and reference to the MS store, I think it's safe to say they're getting close on finishing support
|
|
|
damian101
|
|
Demiurge
btw if JPEG can do HDR with a color profile why can't webp?
|
|
2024-07-01 01:26:50
|
it can
|
|
|
_wb_
|
2024-07-01 08:01:22
|
If you don't mind banding
|
|
|
damian101
|
2024-07-01 08:27:08
|
I think 10'000 nits PQ has significantly less perceptual banding than 300 nits SDR gamma when there are dark regions. sRGB definitely, it's a terrible transfer function...
|
|
|
Quackdoc
|
2024-07-01 08:57:29
|
pq 8 bit would be pretty rough
|
|
|
I think 10'000 nits PQ has significantly less perceptual banding than 300 nits SDR gamma when there are dark regions. sRGB definitely, it's a terrible transfer function...
|
|
2024-07-01 09:14:26
|
I think for SDR, with its intended range of around 80-203~ nits, sRGB is fine. The issue you run into with PQ and 8-bit can be played around with using this script. Don't mind it working in bt2020; despite everything it won't actually affect the luminance. You could try for yourself with `RGB_COLOURSPACE_BT709`
```py
import colour
import numpy as np

r = 100
g = 100
b = 100
# divide by 1023 for 10-bit, 65535 for 16-bit
rl = r / 255
gl = g / 255
bl = b / 255
rgb = np.array([rl, gl, bl])
BT2020 = colour.models.RGB_COLOURSPACE_BT2020.matrix_RGB_to_XYZ  # constants
RGB_lin = colour.models.eotf_ST2084(rgb)
# RGB_lin = colour.models.eotf_BT2100_HLG(rgb)
xyz = colour.algebra.vector_dot(BT2020, RGB_lin)
srgb_xyz = colour.sRGB_to_XYZ(rgb)
print("PQ is: ", xyz)
print("sRGB is: ", srgb_xyz)
```
|
|
2024-07-01 09:17:15
|
but even a quick test of 100 vs 150 shows a harsh showcase
```ps
➜ scripts venv/bin/python3 lum.py 2> /dev/null
PQ is: [ 28.28771187 29.76225521 32.41281472]
sRGB is: [ 0.12112952 0.12743768 0.13877963]
➜ scripts venv/bin/python3 lum.py 2> /dev/null
PQ is: [ 207.36297014 218.17210481 237.60202175]
sRGB is: [ 0.28989044 0.30498731 0.33213119]
```
|
|
2024-07-01 09:17:41
|
the one we care about is the middle number
|
|
2024-07-01 09:23:30
|
the last value we could reliably use is (191, 191, 191), as (192, 192, 192) is over the 1k-nit range, whereas with sRGB you are only at "half" the intensity of whatever nit range you are grading for
```ps
➜ scripts venv/bin/python3 lum.py 2> /dev/null
PQ is: [ 926.28780074 974.5720705 1061.36526705]
sRGB is: [ 0.49520629 0.52099557 0.56736418]
➜ scripts venv/bin/python3 lum.py 2> /dev/null
PQ is: [ 960.21908551 1010.2720791 1100.24463812]
sRGB is: [ 0.50102293 0.52711513 0.57402837]
```
|
|
2024-07-01 09:24:39
|
ofc you could just say "give me the PQ transfer, but make my max nit range 0..300~" or something like that, which to my knowledge JXL actually supports
|
|
2024-07-01 09:24:54
|
have fun breaking 99% of software tho
|
|
|
damian101
|
2024-07-01 09:24:59
|
sRGB transfer is never fine in the dark regions with 8-bit, even at 80 nits peak
|
|
|
Quackdoc
|
2024-07-01 09:25:23
|
I disagree, with an 80 nit peak it's fine
|
|
|
damian101
|
2024-07-01 09:25:35
|
for gamma 2.2 maybe
|
|
|
Quackdoc
|
2024-07-01 09:26:00
|
you would be extremely hard pressed to find issues with sRGB on any true 80 nit display, let alone an 80 nit CRT (which is what sRGB was designed for)
|
|
2024-07-01 09:30:21
|
for modern SDR content which is more like 300~ nits I do agree that sRGB is bad, but PQ is not the answer for that
|
|
|
Oleksii Matiash
|
2024-07-01 10:15:45
|
Just noticed this in the foobar2000 (audio player) beta change log:
2024-05-31
...
Fixed JXL/AVIF/HEIC external album covers not being recognized, even if relevant system image codecs are present.
|
|
2024-07-01 10:16:04
|
https://www.foobar2000.org/changelog-win-2.2-preview
|
|
|
TheBigBadBoy - 𝙸𝚛
|
2024-07-01 10:25:49
|
and then FFmpeg only support PNG, JPEG, GIF album arts…
|
|
|
_wb_
|
2024-07-01 12:23:02
|
The thing with lossy WebP is that it's effectively 6.5-bit, not 8-bit. The green channel is still around 7-bit, but the red and blue channels lose more than one bit of precision due to the conversion to limited-range yuv.
|
|
|
Rasky
|
2024-07-01 08:51:43
|
I noticed that libmagic is a bit shy on jxl files
|
|
2024-07-01 08:51:50
|
```
$ file reference_image.jxl
reference_image.jxl: JPEG XL codestream```
|
|
2024-07-01 08:51:52
|
vs
|
|
2024-07-01 08:51:59
|
```
$ file reference_image.jpg
reference_image.jpg: JPEG image data, JFIF standard 1.01, aspect ratio, density 1x1, segment length 16, baseline, precision 8, 3600x4200, components 3```
|
|
2024-07-01 08:52:25
|
is there any background context on this? Or is it just a matter of somebody contributing to libmagic?
|
|
|
lonjil
|
2024-07-01 09:01:05
|
the latter
|
|
|
spider-mario
|
2024-07-01 09:11:31
|
can magic use arbitrary logic to extract such information?
|
|
|
Demiurge
|
2024-07-02 12:15:13
|
https://man.openbsd.org/OpenBSD-7.2/magic.5
|
|
2024-07-02 12:15:46
|
It uses this logic
|
|
|
spider-mario
|
2024-07-02 08:45:38
|
I seemed to remember something along those lines
|
|
2024-07-02 08:46:11
|
I’m not sure it’s flexible enough for that
|
|
|
Demiurge
|
2024-07-02 08:34:40
|
It is capable of finding matches and going down conditional branches to find additional matches to display more information about a file. It should be able to tell you everything about a file based on the header info as long as it is capable of finding a match
|
|
2024-07-02 08:35:47
|
Like it should be able to tell you if it's a raw jxl or a jxl container with multiple boxes, etc, as long as it's not compressed in an encoding that prevents finding matches
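For reference, a rough sketch of what a magic(5) fragment for the two JXL flavours could look like (the signature bytes are the real ones from the JXL spec; the description strings and exact rule layout are just illustrative):

```
# JPEG XL raw codestream: first two bytes 0xFF 0x0A
0	beshort		0xff0a		JPEG XL codestream
# JPEG XL container: 12-byte signature box 0x0000000C 'JXL ' 0x0D0A870A
0	belong		0x0000000c
>4	string		JXL\x20
>>8	belong		0x0d0a870a	JPEG XL container
```

Going beyond that (dimensions, bit depth) would mean parsing the entropy-coded header, which is where byte-offset-based magic rules start to struggle.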
|
|
|
Traneptora
|
2024-07-02 09:01:27
|
offset is in bytes
|
|
2024-07-02 09:01:34
|
which strikes me as difficult to make work with JXL
|
|
|
Demiurge
|
2024-07-02 09:50:06
|
At the very least it should be able to tell the difference between raw and containerized jxl
|
|
2024-07-02 11:44:51
|
It can do variable offset by searching for a string or a regex
|
|
2024-07-03 05:04:13
|
It can also use offsets read from the file... there's a lot of flexibility so unless everything is brotli compressed or something it should be able to print lots of info
|
|
|
Quackdoc
|
2024-07-04 05:10:07
|
I hope fossify accepts the jxl pr, most images i've tested work. some like my woag testing image I think are too large and trigger resource protection
|
|
|
TheBigBadBoy - 𝙸𝚛
|
|
Quackdoc
I hope fossify accepts the jxl pr, most images i've tested work. some like my woag testing image I think are too large and trigger resource protection
|
|
2024-07-04 06:36:19
|
give me one file like that, I want to test
|
|
|
Quackdoc
|
|
TheBigBadBoy - 𝙸𝚛
give me one file like that, I want to test
|
|
2024-07-04 06:51:56
|
|
|
2024-07-04 06:53:53
|
I think you already have the apk but for anyone else https://files.catbox.moe/7dllzt.apk
|
|
|
TheBigBadBoy - 𝙸𝚛
|
2024-07-04 07:58:33
|
mmmmh I just have a "black" screen
|
|
2024-07-04 07:58:41
|
at least it doesn't crash the app
|
|
2024-07-04 07:59:55
|
unlike the other Gallery app with the package name `fr.oupson.pocjxlgallery`
|
|
|
Quackdoc
|
2024-07-04 08:08:06
|
yeah, I think glide has resource protections when I was looking at the logs but they weren't super clear
|
|
2024-07-04 08:08:55
|
it could be that decoding is taking too much RAM; granted, anything more than 100 MiB on a phone is a hard swallow.
could also simply be render limitations
|
|
2024-07-04 08:11:46
|
Actually, PNG decodes fine, so it's likely a RAM limitation
|
|
2024-07-09 01:27:33
|
well, it's some progress https://developer.mozilla.org/en-US/docs/Mozilla/Firefox/Releases/128#experimental_web_features
|
|
2024-07-09 03:04:50
|
this is still a nightly change
|
|
2024-07-09 03:05:14
|
I think [Hmm](https://cdn.discordapp.com/emojis/1113499891314991275.webp?size=48&quality=lossless&name=Hmm)
|
|
|
HCrikki
|
2024-07-09 03:33:19
|
firefox's libjxl code was apparently updated to this june's snapshot that covers gainmaps
|
|
2024-07-09 03:34:16
|
maybe nightly's support was improved in the meantime
|
|
2024-07-09 03:34:37
|
are any websites doing an origin trial (basically force activates certain flags for those sites, handy for testing and especially cdns doing enabled by default experiments)?
|
|
|
username
|
2024-07-09 03:36:58
|
it wasn't improved, and also libjxl is just set to somewhat auto update itself in the codebase
|
|
|
Quackdoc
|
2024-07-09 03:38:47
|
havent seen anything major, but I didnt scan much https://hg.mozilla.org/mozilla-unified/log?rev=jxl
|
|
|
VcSaJen
|
2024-07-09 10:42:15
|
It's more like documentation change, to describe what's been happening all this time
|
|
|
|
okydooky_original
|
|
Quackdoc
I hope fossify accepts the jxl pr, most images i've tested work. some like my woag testing image I think are too large and trigger resource protection
|
|
2024-07-12 10:56:12
|
The last PR merge seems to be a month ago, so it's possible they're just not helming the wheel, right now. And I hope so, too. Thank you for doing that, btw.
Re: large images, what's the "smallest" large image resolution it fails on, that you've seen?
|
|
|
Quackdoc
|
2024-07-12 10:57:57
|
4k works, 16k doesn't, don't have an 8k image on hand
|
|
|
|
okydooky_original
|
2024-07-12 11:12:05
|
Cool, so it should still handle the vast majority of images people might have.
|
|
|
Quackdoc
|
2024-07-12 11:30:52
|
correct
|
|
|
The last PR merge seems to be a month ago, so it's possible they're just not helming the wheel, right now. And I hope so, too. Thank you for doing that, btw.
Re: large images, what's the "smallest" large image resolution it fails on, that you've seen?
|
|
2024-07-13 03:19:54
|
oh btw, reporting on the issue if it works or not would probably be helpful, I can provide an apk if necessary
|
|
|
|
okydooky_original
|
2024-07-13 05:16:25
|
Is it the same as linked above?
|
|
|
Quackdoc
|
2024-07-13 05:19:08
|
oh I forgot I linked it, but yeah it is, if you can give it a whirl and reply on the PR it would be nice
|
|
|
HCrikki
|
2024-07-13 04:33:19
|
https://www.fastly.com/blog/level-up-your-images-jpeg-xl-now-supported-by-image-optimizer/
|
|
|
jonnyawsom3
|
2024-07-13 04:54:45
|
> SSIM
> We used ssimulacra2 to generate these scores.
Well, at least they used the right tool
|
|
2024-07-13 04:58:56
|
> Migrating to JPEG XL reduces storage costs by serving both JPEG and JPEG XL clients with a single file.
Finally sounds like a CDN is actually using the proposed transcoding for legacy clients
|
|
|
|
SwollowChewingGum
|
|
> Migrating to JPEG XL reduces storage costs by serving both JPEG and JPEG XL clients with a single file.
Finally sounds like a CDN is actually using the proposed transcoding for legacy clients
|
|
2024-07-13 04:59:54
|
Is there something more than just bitstream reconstruction ?
|
|
|
jonnyawsom3
|
2024-07-13 05:03:31
|
No, that's it, but so far everyone has just been serving either jpeg or JXL instead of transcoding to save the storage
|
|
|
HCrikki
|
2024-07-13 05:42:05
|
reconstruction in both directions is literally instant at effort 7 and below (here every img below 12 megapixels converted in like 30ms). well doable live by any server, cache or web app. higher effort saves extra kilobytes, but any reconstruction consistently saves 20% already
|
|
|
Quackdoc
|
2024-07-13 05:43:27
|
speaking of the reconstruction stuff, does imagemagick support the lossless transcoding?
|
|
|
|
okydooky_original
|
|
Quackdoc
oh I forgot I linked it, but yeah it is, if you can give it a whirl and reply on the PR it would be nice
|
|
2024-07-13 05:54:53
|
Will do. Did you also want to share the link in the issue or PR ticket itself, like Oupson did with his test?
|
|
|
Quackdoc
|
2024-07-13 05:55:35
|
nah, it's pretty sketchy to share apks in the first place and some projects really frown on that
|
|
|
|
okydooky_original
|
2024-07-13 06:07:07
|
Alright. Well, I just tried it and, so far, it works well! Viewing is fairly smooth (no major performance issue like in Oupson's build) and no crash. The only thing I've noticed is that it takes a few seconds to initially load the files in a folder and begin generating thumbnails.
But, a HUGE improvement is it detects jxl files natively and will show a folder containing only them, whereas the other build required a non-JXL image to be present for the gallery to show the folder in the first place.
|
|
2024-07-13 06:10:33
|
Update: I tried a potentially breaking action, using the "resize" option to convert a non-jxl file into one. It seems it kept it as a jpeg with a jxl extension and bumped the size from 25 kB to 25.7 kB, but nothing exploded. That's good, at least.
|
|
|
Quackdoc
|
|
Alright. Well, I just tried it and, so far, it works well! Viewing is fairly smooth (no major performance issue like in Oupson's build) and no crash. The only thing I've noticed is that it takes a few seconds to initially load the files in a folder and begin generating thumbnails.
But, a HUGE improvement is it detects jxl files natively and will show a folder containing only them, whereas the other build required a non-JXL image to be present for the gallery to show the folder in the first place.
|
|
2024-07-13 06:53:01
|
yeah, I found perf to be mostly fine. libjxl in general still struggles with some images unless you play around with manual CPU affinity stuff, but generally it shouldn't have too many issues
|
|
|
|
okydooky_original
|
2024-07-13 07:00:47
|
Cool. Well, I'm going to be daily driving this for a while, I think. I'm also using https://play.google.com/store/apps/details?id=ebusky.jpeg.xl.image.viewer.converter to do conversions on mobile, since it has a number of exposed settings, is user friendly, and is more up to date than the other option on the playstore. But I don't think it has the 0.10.1 version included, since it crashed on a larger image (the RAM issue), and it also doesn't let me select between VarDCT and Modular. It should be good enough until we get a better solution for Android, though.
|
|
|
jonnyawsom3
|
|
Cool. Well, I'm going to be daily driving this for a while, I think. I'm also using https://play.google.com/store/apps/details?id=ebusky.jpeg.xl.image.viewer.converter to do conversions on mobile, since it has a number of exposed settings, is user friendly, and is more up to date than the other option on the playstore. But, I don't think it has the 0.10.1 version included, since crashed on a larger image (the RAM issue), and it also doesn't let me select between VarDCT and Modular, but should be good enough until we get a better solution for Android.
|
|
2024-07-13 07:36:34
|
Might want to take a look at this https://github.com/T8RIN/ImageToolbox
|
|
2024-07-13 07:37:00
|
Also in the play store but I have a bug that stops compressed file sizes from showing in the preview
|
|
2024-07-13 07:37:44
|
Has JXL transcoding, effort levels, RGB or RGBA selection, custom resizing filters... Almost too many options honestly
|
|
|
|
okydooky_original
|
2024-07-13 07:38:59
|
Nice. I was just comparing the output from the app I just linked to with the original images and... it appears to do some forced downsizing of the resolution. :/
|
|
2024-07-13 07:39:40
|
<@184373105588699137> I think you already mentioned this, but the Gallery app does not get the resolution info from jxl files.
|
|
|
jonnyawsom3
Also in the play store but I have a bug that stops compressed file sizes from showing in the preview
|
|
2024-07-13 07:45:28
|
I'm going to use Obtainium to get it. But, I can't see what the difference between the Play Store version and the FOSS version is... do you have an idea? Maybe no Firebase or something?
|
|
|
jonnyawsom3
|
|
I'm going to use Obtainium to get it. But, I can't see what the difference between the PkayStore version and the FOSS version is... do you have an idea? Maybe no Firebase or something?
|
|
2024-07-13 07:49:52
|
If I recall, the FOSS version replaces some system-dependent features with open source variants, but I was confused too
|
|
|
|
okydooky_original
|
2024-07-13 08:46:46
|
Huh. Well, as long as it still works. Lol
|
|
2024-07-13 08:47:20
|
Thanks for the recommendation. I had no real clue about it, yet it appears to be, by far, the most developed app of its kind.
|
|
|
_wb_
|
|
Quackdoc
speaking of the reconstruction stuff, does imagemagick support the lossless transcoding?
|
|
2024-07-14 02:14:04
|
No, libraries like that have a pipeline that goes decode -> pixels -> processing -> encode, so the "first-class citizen" is an image buffer (with metadata), not a bitstream. That makes it hard to fit something like jpeg bitstream recompression / reconstruction in such libraries.
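As a sketch of the bitstream-level path being contrasted here: the libjxl CLI tools can recompress a JPEG without ever going through a pixel buffer. The filenames below are placeholders.

```shell
# Lossless JPEG -> JXL recompression with the libjxl CLI tools.
# cjxl keeps the original JPEG bitstream data by default for JPEG input
# (--lossless_jpeg=1), so djxl can later reconstruct the original file.
cjxl photo.jpg photo.jxl --lossless_jpeg=1

# Writing to a .jpg target asks djxl to reconstruct the original JPEG.
djxl photo.jxl photo_restored.jpg

# The round trip should be bit-exact.
cmp photo.jpg photo_restored.jpg && echo "bit-exact reconstruction"
```

This is exactly what a decode-to-pixels pipeline loses: once the JPEG is decoded to an image buffer, the original DCT coefficients and entropy-coded bitstream are gone, so a byte-identical reconstruction is no longer possible.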
|
|
|
Quackdoc
|
2024-07-14 02:20:31
|
ah I see, looks like I'll have to find an open source cdn software I can ~~bastardize~~ mod to use libjxl or cjxl directly then
|
|
|
HCrikki
|
2024-07-14 05:46:16
|
https://github.com/Pixboost/transformimgs
|
|
|
Quackdoc
|
2024-07-14 05:50:42
|
seems like that's only an image cdn [Hmm](https://cdn.discordapp.com/emojis/1113499891314991275.webp?size=48&quality=lossless&name=Hmm)
|
|
|
HCrikki
|
2024-07-14 05:51:54
|
any issue with its jxl support?
|
|
|
Quackdoc
|
2024-07-14 05:54:34
|
no idea, I would need to look at its implementation, but since it is only an image host it's a bit of a lowpri
|
|
2024-07-14 05:55:19
|
seems like it uses imagemagick
|
|
|
HCrikki
|
2024-07-14 05:55:53
|
there's jxl-ready fediverse image hosting scripts if it's not CDNs with conversion caps you need
|
|
|
Quackdoc
|
2024-07-14 05:56:36
|
I am aware of things like pict-rs but they don't support accept headers
|
|
2024-07-14 05:56:45
|
afaik
|
|
2024-07-14 06:40:13
|
asked if pict-rs will have support for accept headers in the future
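For reference, the Accept-header negotiation being asked about can be sketched server-side. This hypothetical nginx fragment serves a pre-converted .jxl variant only to clients that advertise image/jxl support; the paths and file layout are assumptions, not anything pict-rs actually does.

```nginx
# Hypothetical sketch: pick a ".jxl" suffix when the client's Accept header
# advertises image/jxl support, otherwise serve the original file.
# Assumes pre-converted variants sit next to the originals
# (e.g. /srv/www/images/photo.jpg and /srv/www/images/photo.jpg.jxl)
# and that mime.types maps the jxl extension to image/jxl.
map $http_accept $jxl_suffix {
    default      "";
    "~image/jxl" ".jxl";
}

server {
    listen 80;

    location /images/ {
        root /srv/www;
        # Caches must key on Accept, since the response body varies with it.
        add_header Vary Accept;
        try_files $uri$jxl_suffix $uri =404;
    }
}
```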
|
|
|
Demiurge
|
2024-07-15 07:24:32
|
I'm pretty excited about how the momentum seems to be slowly building behind jxl despite the intentional and cartoonishly selfish sabotage by the webp tech lead.
|
|
|
VcSaJen
|
|
Demiurge
I'm pretty excited about how the momentum seems to be slowly building behind jxl despite the intentional and cartoonishly selfish sabotage by the webp tech lead.
|
|
2024-07-15 07:29:38
|
Don't you mean AOM?
|
|
|
|
SwollowChewingGum
|
2024-07-15 07:30:05
|
Yeah, I don’t think WebP is the problem now
|
|
|
Demiurge
|
2024-07-15 07:33:50
|
Well, the Chrome codec team tech lead, formerly the On2 tech lead and the man behind webp and webm. The guy who unilaterally decided for everyone else that "the entire ecosystem" doesn't have "enough interest" in jxl because it hurts his pride or something. Jim Bankowski.
|
|
|
|
SwollowChewingGum
|
2024-07-15 07:34:48
|
I don’t particularly see anything wrong with the webM format but WebP can go burn in a fire lol
|
|
|
VcSaJen
|
2024-07-15 07:35:37
|
Well, Android didn't adopt JPEG XL, either, so I suspect the problem is higher in the hierarchy.
|
|
|
Demiurge
|
2024-07-15 07:36:16
|
And unilaterally removed all the code from chrome in a cartoonishly transparent and flailing bid to derail and snuff the unstoppable and growing momentum of jxl
|
|
|
|
SwollowChewingGum
|
2024-07-15 07:36:37
|
The chrome file format team and the android file format team are probably either the same team or in close communication with each other though
|
|
|
Demiurge
|
|
VcSaJen
Well, Android didn't adopt JPEG XL, either, so I suspect the problem is higher in the hierarchy.
|
|
2024-07-15 07:37:00
|
I don't think so. To be fair, Android hasn't adopted anything. They're still using JPEG...
|
|
|
VcSaJen
|
2024-07-15 07:37:15
|
They adopted AVIF.
|
|
|
Quackdoc
|
|
VcSaJen
Well, Android didn't adopt JPEG XL, either, so I suspect the problem is higher in the hierarchy.
|
|
2024-07-15 07:37:25
|
I havent even seen anyone make a PR for it or anything
|
|
|
|
SwollowChewingGum
|
|
Demiurge
I don't think so. To be fair, Android hasn't adopted anything. They're still using JPEG...
|
|
2024-07-15 07:37:37
|
I guess that makes sense. What else are they gonna use? (RE: using jpeg)
|
|
|
Demiurge
|
2024-07-15 07:37:37
|
afaik the android open source distribution doesn't even have a built-in camera and gallery app
|
|