|
VcSaJen
|
|
Quackdoc
I haven't even seen anyone make a PR for it or anything
|
|
2024-07-15 07:37:56
|
There's an open issue by Jon
|
|
|
Demiurge
|
|
VcSaJen
They adopted AVIF.
|
|
2024-07-15 07:37:58
|
adopted it where?
|
|
|
Quackdoc
|
2024-07-15 07:38:16
|
they have thousands of open requests
|
|
2024-07-15 07:38:27
|
unless someone actually makes a PR for it, it's not gonna do anything
|
|
|
Demiurge
|
2024-07-15 07:38:34
|
In the browser? Because that's decided on by Jim Bankoski, lead of the Chrome codec team
|
|
|
VcSaJen
|
2024-07-15 07:38:47
|
In the library
|
|
|
|
SwollowChewingGum
|
|
VcSaJen
They adopted AVIF.
|
|
2024-07-15 07:38:49
|
Well, I don’t think they’d want to use AVIF as an image capture format unless they know that the phones will all have hardware encoding chips. The encoding speed is just too bad.
|
|
|
Demiurge
|
2024-07-15 07:39:00
|
like I said afaik there isn't even a standard camera or gallery app included in the base android distribution
|
|
|
Quackdoc
|
|
SwollowChewingGum
Well, I don’t think they’d want to use AVIF as an image capture format unless they know that the phones will all have hardware encoding chips. The encoding speed is just too bad.
|
|
2024-07-15 07:39:57
|
as far as capture goes, it mostly just relies on the camera firmware itself for handling that, so that won't change anything anytime soon
|
|
|
|
SwollowChewingGum
|
|
Quackdoc
as far as capture goes, it mostly just relies on the camera firmware itself for handling that, so that won't change anything anytime soon
|
|
2024-07-15 07:40:41
|
Theoretically, the camera app could request the raw pixels from the camera sensor and then do the encoding itself
|
|
|
Quackdoc
|
2024-07-15 07:40:55
|
I know like, 2 camera apps that do that and one is defunct
|
|
|
VcSaJen
|
|
Quackdoc
as far as capture goes, it mostly just relies on the camera firmware itself for handling that, so that won't change anything anytime soon
|
|
2024-07-15 07:42:55
|
I thought the JPEG committee was busy with exactly that. Or was it standalone camera devices only?
|
|
|
Quackdoc
|
2024-07-15 07:44:01
|
no idea, gonna take a long while before they land in any android phone though
|
|
|
Demiurge
|
|
VcSaJen
In the library
|
|
2024-07-15 07:46:50
|
Ah, I see... Yeah, the AVIF codec seems to be built into the OS with Android 14. But it still doesn't look like it's anything more than this one guy who for some reason decided that jxl must be stopped...
|
|
|
Quackdoc
|
2024-07-15 07:47:53
|
aosp is very much a "if a vendor wants it we can do it if they submit code" kinda thing
|
|
|
Demiurge
|
2024-07-15 07:47:56
|
He is the project lead responsible for the decision and he has not named anyone else after all
|
|
|
|
SwollowChewingGum
|
2024-07-15 07:48:36
|
Also, given the original Chrome bug report, it seems like the vendors do want to use the format
|
|
|
Quackdoc
|
2024-07-15 07:49:00
|
android vendor is a different beast
|
|
2024-07-15 07:49:17
|
like I said, I haven't seen any traffic at all in a PR or anything
|
|
|
Demiurge
|
2024-07-15 07:49:33
|
Yeah, vendors like Samsung are already including a jxl codec on their own
|
|
|
|
SwollowChewingGum
|
2024-07-15 07:49:39
|
Has Jon made a pr?
|
|
|
VcSaJen
|
|
Quackdoc
aosp is very much a "if a vendor wants it we can do it if they submit code" kinda thing
|
|
2024-07-15 07:50:57
|
Are there any examples where they accepted a big feature like that from a random user? Not just an optimization or a bugfix.
|
|
2024-07-15 07:54:00
|
I would think apps abandoned by Google during the Jelly Bean era would be not-dead if they accepted PRs...
|
|
|
Quackdoc
|
|
Demiurge
Yeah, vendors like Samsung are already including a jxl codec on their own
|
|
2024-07-15 07:54:20
|
are they? I haven't heard from anyone that Samsung is going to have JXL support 0.0
|
|
|
VcSaJen
Are there any examples where they accepted a big feature like that from a random user? Not just an optimization or a bugfix.
|
|
2024-07-15 07:54:55
|
I can't think of any times a user has added a big feature at all
|
|
2024-07-15 07:55:51
|
then again, AOSP is a massive project that has tons of sub-dependencies, so it really wouldn't surprise me if it has happened
|
|
|
Oleksii Matiash
|
|
Quackdoc
are they? I haven't heard from anyone that Samsung is going to have JXL support 0.0
|
|
2024-07-15 09:26:54
|
Samsung is using jxl-in-dng, afair
|
|
|
Quackdoc
|
2024-07-15 09:27:55
|
ah so not anything platform related
|
|
|
HCrikki
|
|
Quackdoc
are they? I haven't heard from anyone that Samsung is going to have JXL support 0.0
|
|
2024-07-15 12:10:49
|
The Gallery app has a working decoder for jxl codestreams but only loads it for DNG 1.7 raw images. Even budget phones' gallery app received it, so that demonstrates it'd be smart to bypass Google and go for vendors with their own apps preinstalled and auto-updating
|
|
|
jonnyawsom3
|
|
SwollowChewingGum
Theoretically, the camera app could request the raw pixels from the camera sensor and then do the encoding itself
|
|
2024-07-15 12:13:38
|
Unfortunately the only options are completely raw bayer data, or jpeg at quality 100. There's no way to get processed but uncompressed data from the sensor
|
|
|
Demiurge
I don't think so. To be fair, Android hasn't adopted anything. They're still using JPEG...
|
|
2024-07-15 12:20:16
|
They've had WebP since 4.2.1 and AVIF since Android 12
|
|
|
VcSaJen
|
2024-07-15 12:54:52
|
Just tested it in Android 14 with this: https://github.com/link-u/avif-sample-images (pretty cool that AVIF has this, *unlike* jxl). The built-in Files app doesn't recognize them; Google's Photos app shows some of them, very slowly, but most are very broken. Looks like those files are too advanced.
|
|
|
Oleksii Matiash
|
2024-07-15 12:55:34
|
WebP is also used in Android development, i.e. as an in-app resource format
|
|
|
_wb_
|
2024-07-15 01:17:31
|
The AVIF in Android 12 is quite a limited and poor implementation though. No alpha, only 8-bit 420, old version of the file format. Rather rushed, if you ask me. Which is annoying now, because we are seeing Android app devs assuming they can use AVIF in Android 12+ and then they complain that some images don't decode or look wrong.
|
|
2024-07-15 01:20:22
|
I suppose it's the same people who rushed avif into Chrome who also rushed it into Android. Problem is: bugs in the initial rushed version can easily be fixed if it's Chrome, since most Chrome installs get updated regularly. But in Android, there's a really long tail of old Android versions out there, and they don't get updated. So the bugs 'stick' for a really long time.
|
|
2024-07-15 01:21:35
|
(it's for that reason that many apps and app dev frameworks don't rely on Android system libraries but just ship their own copy of everything they need)
|
|
|
0xC0000054
|
|
_wb_
The AVIF in Android 12 is quite a limited and poor implementation though. No alpha, only 8-bit 420, old version of the file format. Rather rushed, if you ask me. Which is annoying now, because we are seeing Android app devs assuming they can use AVIF in Android 12+ and then they complain that some images don't decode or look wrong.
|
|
2024-07-15 01:22:52
|
Interesting, I did not know that. That would explain a bug report I received a few years ago on my Paint.NET AVIF plugin about Android 12.
|
|
|
_wb_
|
2024-07-15 01:27:33
|
I think we made a wise decision not to pursue getting system-level jxl support in Android back when AVIF was boasting about it. It creates more problems than it solves. Maybe it's good PR, but it's not a good approach from an engineering point of view.
|
|
|
0xC0000054
|
2024-07-15 01:37:47
|
I don't know if it has already been mentioned, but the Windows 11 24H2 preview SDK has JpegXL support in the OS imaging framework (WIC). Credit for spotting it and the following image goes to Paint.NET developer Rick Brewster:
|
|
2024-07-15 01:38:49
|
Looking at the other SDK headers, it appears that they may also support animations.
|
|
|
username
|
|
0xC0000054
I don't know if it has already been mentioned, but the Windows 11 24H2 preview SDK has JpegXL support in the OS imaging framework (WIC). Credit for spotting it and the following image goes to Paint.NET developer Rick Brewster:
|
|
2024-07-15 01:41:30
|
it's been brought up in here a few times and there have even been registry entries spotted in fresh installs as well, although it seems like there isn't any actual decoder or encoder shipped with Windows or on the MS store yet.
|
|
|
0xC0000054
|
2024-07-15 01:44:06
|
I think it is a feature in the Win11 preview releases, or perhaps Microsoft is shipping the public headers in the preview SDK while they work to finish the OS implementation. I have no idea how their release process works.
|
|
|
jonnyawsom3
|
2024-07-15 01:44:11
|
It was first spotted in the SDK, then registry entries on Windows 11, and most recently Wincodecs.dll can create instances for the encoder and decoder, but the MS Store codec hasn't been released yet so they don't do anything
|
|
2024-07-15 01:48:02
|
SDK https://discord.com/channels/794206087879852103/803574970180829194/1201015191492640818
Registry https://discord.com/channels/794206087879852103/803574970180829194/1216042795643568278
Wincodecs https://discord.com/channels/794206087879852103/803574970180829194/1255914607235829852
|
|
|
0xC0000054
|
2024-07-15 01:50:07
|
Ok, so they are using the store to deliver the actual implementation. I guess that makes sense given that they did the same thing with AV1/AVIF. I just hope that this time around they release and support a fully working codec implementation. IIRC their AV1/AVIF implementation is effectively abandoned and doesn't even support transparency, which is ironic as that means it can't load all of the images that they contributed to the AVIF test suite.
|
|
|
jonnyawsom3
|
2024-07-15 01:53:18
|
It's set to support Animation, Lossless and 32 bpp with HDR10 based on the linked messages
|
|
2024-07-15 02:01:36
|
I'm mostly hoping they use multithreaded decoding, WebP feels pretty sluggish on my CPU due to slow single core speeds with 15 threads doing nothing
|
|
|
Cacodemon345
|
2024-07-15 05:56:53
|
I really really hope they actually release the codec and don't cave in to Google.
|
|
|
Demiurge
|
2024-07-15 09:40:19
|
jxl must be stopped! Nooo! Stop adding support! There's not enough interest! The future is webp and avif!
|
|
|
|
SwollowChewingGum
|
|
Demiurge
jxl must be stopped! Nooo! Stop adding support! There's not enough interest! The future is webp and avif!
|
|
2024-07-15 09:48:51
|
The future is non-progressive decoding yuv420p images!!
|
|
|
Demiurge
|
2024-07-15 09:51:06
|
We need search engines to punish progressive images and reward serial-scanline webp instead!
|
|
|
HCrikki
|
2024-07-15 10:04:36
|
generate your photographs as jpegs - with the now traditional visual artifacts and huge filesize - warning, must choose a high quality % to look acceptable
|
|
2024-07-15 10:06:13
|
still salty anyone thought just adding a lite gainmap makes it better than capturing straight to hdr/jxl
|
|
|
Quackdoc
|
|
HCrikki
still salty anyone thought just adding a lite gainmap makes it better than capturing straight to hdr/jxl
|
|
2024-07-15 10:12:18
|
people with SDR screens or devices would be salty at you for saying this [av1_dogelol](https://cdn.discordapp.com/emojis/867794291652558888.webp?size=48&quality=lossless&name=av1_dogelol)
|
|
|
_wb_
The AVIF in Android 12 is quite a limited and poor implementation though. No alpha, only 8-bit 420, old version of the file format. Rather rushed, if you ask me. Which is annoying now, because we are seeing Android app devs assuming they can use AVIF in Android 12+ and then they complain that some images don't decode or look wrong.
|
|
2024-07-15 10:28:02
|
I did do a little bit of investigation, and I didn't see anything spectacularly dumb in the AOSP source code, so I'm thinking it might be a bug in libgav1 itself.
|
|
|
Demiurge
|
|
Quackdoc
people with SDR screens or devices would be salty at you for saying this [av1_dogelol](https://cdn.discordapp.com/emojis/867794291652558888.webp?size=48&quality=lossless&name=av1_dogelol)
|
|
2024-07-15 11:25:44
|
I have an sdr screen but gainmaps seem like a smoothbrain solution that creates more problems than it solves. Especially when you consider that nobody generates gainmaps by hand, so why save a bitmap when you could just save the instructions on how to generate the bitmap?
|
|
|
Quackdoc
|
|
Demiurge
I have an sdr screen but gainmaps seem like a smoothbrain solution that creates more problems than it solves. Especially when you consider that nobody generates gainmaps by hand, so why save a bitmap when you could just save the instructions on how to generate the bitmap?
|
|
2024-07-15 11:38:16
|
when creating a gainmap you do so in the mastering process so you can verify it looks right
|
|
|
Demiurge
|
2024-07-15 11:43:24
|
I'm pretty sure gainmaps are chiefly created by computers and not humans, and even if a human was involved in the creation, different regions/pixels of the gainmap aren't being edited independently from each other.
|
|
|
Quackdoc
|
|
Demiurge
I'm pretty sure gainmaps are chiefly created by computers and not humans, and even if a human was involved in the creation, different regions/pixels of the gainmap aren't being edited independently from each other.
|
|
2024-07-15 11:45:38
|
a gain map is more or less a bad difference map between two images; the two images are still independently mastered, and the gain map fills in the difference
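(As an aside: a minimal sketch of that "difference map" idea, assuming linear-light float buffers; the epsilon and function name are made up for illustration, not taken from any particular spec:)
```python
import numpy as np

def make_gain_map(sdr, hdr, eps=1/64):
    """Per-pixel log2 ratio between two independently mastered renditions.

    sdr, hdr: linear-light float arrays of the same shape.
    This ratio map is what gets (typically downscaled and) encoded
    alongside the base image.
    """
    return np.log2((hdr + eps) / (sdr + eps))
```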
|
|
|
Demiurge
|
2024-07-15 11:45:57
|
In other words the gainmap is the output of some algorithm and it makes no sense to store it as a separate bitmap instead of just describing what algorithm was used to generate the bitmap. Using less bits.
|
|
2024-07-15 11:47:17
|
Why store 2 separate bitmaps when you could just store 1 image and a tiny algorithm that efficiently describes how to derive the 2nd image
|
|
|
Quackdoc
|
2024-07-15 11:47:47
|
except now you have to have a really complex algo you need to compute, which causes massive complexity and security risk, all for saving maybe a few KB
|
|
2024-07-15 11:48:21
|
you pretty much are just making a computer generated shader at that point
|
|
|
Demiurge
|
2024-07-15 11:48:30
|
As mentioned previously, having 2 separate bitmaps creates lots of headaches with editing, and it just plain doesn't make any sense that the HDR image could be a picture of a dog and the SDR image a picture of a banana.
|
|
2024-07-15 11:49:26
|
It's just a really dumb solution
|
|
|
Quackdoc
|
2024-07-15 11:49:38
|
sure it could be that, but is that *really* an issue?
|
|
|
Demiurge
|
2024-07-15 11:50:31
|
Well, the issue is, why have such a stupid solution to a problem, when there's a far more obvious solution that doesn't have any of those problems?
|
|
|
Quackdoc
|
2024-07-15 11:50:58
|
the far more obvious solution you present is far more complicated to implement
|
|
2024-07-15 11:51:16
|
gain maps give per pixel, and possibly per channel tonemapping
|
|
2024-07-15 11:51:29
|
if you want to do that with a single algo, that's gonna be one long complex algo
|
|
|
Demiurge
|
|
Quackdoc
except now you have to have a really complex algo you need to compute, which causes massive complexity and security risk, all for saving maybe a few KB
|
|
2024-07-15 11:52:06
|
No reason why it needs to have unbounded complexity either. I'm pretty sure the algorithms that generate gainmaps all run at realtime speed in order for them to be viable for use in cameras
|
|
|
Quackdoc
|
2024-07-15 11:52:29
|
you need to parse and render that at client side
|
|
2024-07-15 11:52:35
|
that's an issue
|
|
|
Demiurge
|
2024-07-15 11:52:52
|
Pretty sure gainmaps are typically generated at realtime speed
|
|
2024-07-15 11:53:10
|
In editing software and camera firmware
|
|
|
Quackdoc
|
2024-07-15 11:53:17
|
yeah, but I don't need to implement an entire shader parser just to decode a gainmap
|
|
|
Demiurge
|
2024-07-15 11:53:54
|
It's just dumb to encode a gainmap in the first place when it's the output of a simple, realtime-speed algorithm.
|
|
2024-07-15 11:54:02
|
There's no need to
|
|
|
Quackdoc
|
2024-07-15 11:54:02
|
it's not simple
|
|
2024-07-15 11:54:13
|
not if you want the flexibility gainmaps provide
|
|
|
Demiurge
|
2024-07-15 11:54:40
|
You don't want that flexibility. That flexibility is the problem and the source of editing headaches.
|
|
2024-07-15 11:55:15
|
You don't want the SDR and HDR images to be completely separate images with no automatic relationship with each other and require separate editing.
|
|
|
Quackdoc
|
|
Demiurge
You don't want that flexibility. That flexibility is the problem and the source of editing headaches.
|
|
2024-07-15 11:55:21
|
that's outright wrong
|
|
2024-07-15 11:55:37
|
gainmaps are the *minimum* that people are willing to concede
|
|
|
Demiurge
|
2024-07-15 11:55:59
|
You want to edit the same image twice, one for HDR and one for SDR
|
|
2024-07-15 11:56:03
|
?
|
|
2024-07-15 11:56:17
|
And not have any portion of the work automatically carry over?
|
|
|
Quackdoc
|
2024-07-15 11:57:17
|
Artists have already been willing to do this; distribution is the issue. Gain maps allow generating two properly distinct images from one base and having it look actually good. Gainmaps are perfectly fine from a creator's standpoint as long as the application you use isn't trash
|
|
|
Demiurge
|
2024-07-16 12:13:37
|
Still a waste of bits to encode a whole bitmap that could be described in a small algebraic formula...
|
|
2024-07-16 12:14:01
|
With reduced resolution too
|
|
|
Quackdoc
|
2024-07-16 12:14:18
|
you lose per-pixel or per-pixel-group luma mapping with that
|
|
2024-07-16 12:14:22
|
so it's less flexible
|
|
2024-07-16 12:14:26
|
and more prone to errors
|
|
|
Demiurge
|
2024-07-16 12:14:36
|
A reduced resolution bitmap that could be algebraically generated in full resolution
|
|
|
Quackdoc
you lose per-pixel or per-pixel-group luma mapping with that
|
|
2024-07-16 12:15:42
|
I don't think it's common for individual regions of the gainmap to be edited independently of one another.
|
|
2024-07-16 12:16:02
|
At that point you have 2 separate images. A dog and a banana.
|
|
|
Quackdoc
|
2024-07-16 12:16:05
|
you wouldn't edit a gainmap, you edit two different pictures
|
|
|
Demiurge
|
2024-07-16 12:16:25
|
At that point it's not the same image anymore.
|
|
|
Quackdoc
|
2024-07-16 12:16:37
|
that's literally the point
|
|
|
Demiurge
|
2024-07-16 12:16:45
|
It would make more sense to store them in separate files since it's 2 separate bitmaps anyways
|
|
2024-07-16 12:17:11
|
It's 2 separate uncorrelated images, not 1
|
|
|
Quackdoc
|
2024-07-16 12:17:15
|
gainmaps let you efficiently serve two images, an HDR and SDR rendition, to the user without the user needing to worry which is which
|
|
2024-07-16 12:17:49
|
it's not some limited algo, it's fully fledged: two separate images in one
|
|
|
Demiurge
|
2024-07-16 12:18:00
|
But if those images are not correlated with each other anymore, it's deceptive to have them in the same file and have a different image show up depending on the environment.
|
|
2024-07-16 12:18:15
|
That's like the ambiguous multithreaded PNG decoding
|
|
|
Quackdoc
|
|
Demiurge
But if those images are not correlated with each other anymore, it's deceptive to have them in the same file and have a different image show up depending on the environment.
|
|
2024-07-16 12:18:51
|
if the images aren't correlated, it's just going to look really stupid when an app tries to decode it
|
|
|
Demiurge
|
2024-07-16 12:19:27
|
https://www.da.vidbuchanan.co.uk/widgets/pngdiff/
|
|
|
Quackdoc
|
2024-07-16 12:19:41
|
the gainmap builds on top of the base image, so it's just a souped-up modification of it. of course it's possible to do really stupid things with that, but just don't distribute stupid images
|
|
|
Demiurge
|
2024-07-16 12:19:50
|
They're 2 separate uncorrelated images pretending to be 1 image
|
|
2024-07-16 12:19:59
|
It's kinda deceptive
|
|
|
Quackdoc
|
2024-07-16 12:20:14
|
who cares? not artists, and not general users who get the image
|
|
|
Demiurge
|
2024-07-16 12:20:19
|
It doesn't make sense from a technical point of view
|
|
|
Quackdoc
|
2024-07-16 12:20:26
|
people have "abused" gifs and apngs with the same thing
|
|
2024-07-16 12:20:33
|
in the end it never became a real issue
|
|
|
Demiurge
|
2024-07-16 12:20:37
|
To design a format like that doesn't make sense
|
|
2024-07-16 12:20:49
|
It's a dumb idea
|
|
|
Quackdoc
|
2024-07-16 12:20:54
|
well, here's the thing: we had no alternatives, and we still have no alternatives
|
|
|
Demiurge
|
2024-07-16 12:21:04
|
Especially when there's a more obvious solution that doesn't have those problems
|
|
|
Quackdoc
|
2024-07-16 12:21:21
|
gainmaps are the best we have, and no one bothered with this "obvious solution" so clearly it's not that obvious
|
|
|
Demiurge
|
2024-07-16 12:22:23
|
Well, the alternative is tone mapping... and most gainmaps are just the bitmap output of a tonemapping algorithm. There should just be an agreed upon standard way to encode tonemapping parameters.
|
|
|
Quackdoc
|
2024-07-16 12:22:43
|
tonemapping sucks and is extremely limited in comparison to gainmaps
|
|
|
Demiurge
|
2024-07-16 12:22:53
|
Tone mapping is something that's built into image viewers already and runs at realtime speed. There should just be a standard for specifying custom parameters for the tonemapping algorithm
|
|
2024-07-16 12:23:38
|
since that's what gainmaps are being used for
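(To make the "parameters instead of a bitmap" idea concrete, a tiny sketch of a global tone curve with one shippable parameter; it uses the extended Reinhard operator purely as an illustration, not as anything a spec mandates:)
```python
import numpy as np

def reinhard_extended(luma, white=4.0):
    """Global tone curve that maps luminance `white` to exactly 1.0.

    `white` is the kind of single parameter that could be shipped as
    metadata instead of a whole gain map bitmap (illustrative only).
    """
    luma = np.asarray(luma, dtype=float)
    return luma * (1.0 + luma / white**2) / (1.0 + luma)
```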
|
|
|
Quackdoc
|
2024-07-16 12:24:40
|
gain maps give you per region tonemapping, ofc you could do this with an "algo" but then at that point it's just a bad gainmap anyways
|
|
|
Demiurge
|
2024-07-16 12:25:07
|
It would make a lot more sense and take up less bandwidth than encoding a separate low-resolution bitmap to generate an HDR image from an SDR base image...
|
|
|
Quackdoc
|
2024-07-16 12:25:27
|
you use less bandwidth but also get less flexibility/detail
|
|
|
_wb_
|
2024-07-16 12:26:40
|
Apart from manually generated bitstreams, all gain maps will be the result of _some_ local tone mapping algorithm. At least at the moment I am not aware of any editing tool that lets artists produce custom gain maps manually.
|
|
|
Demiurge
|
|
Quackdoc
gain maps give you per region tonemapping, ofc you could do this with an "algo" but then at that point it's just a bad gainmap anyways
|
|
2024-07-16 12:27:10
|
Is it really that common for people to generate an SDR image that has strong region-specific differences from the HDR counterpart?
|
|
|
_wb_
|
2024-07-16 12:27:41
|
Yes, local tone mapping does provide much better results than global tone mapping
|
|
|
Demiurge
|
2024-07-16 12:28:04
|
In a way that can't be described by an automated process I mean...
|
|
|
Quackdoc
|
|
Demiurge
Is it really that common for people to generate an SDR image that has strong region-specific differences from the HDR counterpart?
|
|
2024-07-16 12:28:08
|
Imagine, say, the northern lights at night: the SDR rendition will need a much stronger or weaker rendition for the stars vs the northern lights
|
|
2024-07-16 12:28:30
|
you also have content specific tonemapping too
|
|
2024-07-16 12:29:10
|
many people will find that some images look better with BT.2446a, some look better with ACES' mapper, some with BT.2390, etc.,
because dumb algorithms inherently suck for this
|
|
2024-07-16 12:29:41
|
sometimes you need to make a muted section of an image in SDR when it should really be popping like sun + screen + shade
|
|
|
Demiurge
|
2024-07-16 12:29:52
|
If there's no editing tool that allows custom gain maps, then the gain map is generated by custom parameters to a tonemapping algorithm, which would be more efficient to embed as metadata than an entirely new bitmap
|
|
2024-07-16 12:30:40
|
I'm sure these tonemapping kernels all run at realtime speed too
|
|
|
Quackdoc
|
|
_wb_
Apart from manually generated bitstreams, all gain maps will be the result of _some_ local tone mapping algorithm. At least at the moment I am not aware of any editing tool that lets artists produce custom gain maps manually.
|
|
2024-07-16 12:30:55
|
> At least at the moment I am not aware of any editing tool that lets artists produce custom gain maps manually.
Isn't Adobe Lightroom working on doing diffs? I thought I read that
|
|
|
Demiurge
|
2024-07-16 12:32:15
|
I guess the problem is that gainmaps exist right now but standardized algorithms and parameter encoding does not
|
|
|
Quackdoc
|
|
Demiurge
I guess the problem is that gainmaps exist right now but standardized algorithms and parameter encoding does not
|
|
2024-07-16 12:32:51
|
even with parameter encoding, I fail to see how this could be done in a way that is competitive with encoding a secondary image when it comes to regions
|
|
|
_wb_
|
2024-07-16 12:33:02
|
I don't see a way to manually adjust the gain map locally in Lightroom. I can adjust global parameters to the presumably local tone mapping algorithm it implements, but I don't see a way to adjust the result in specific regions.
|
|
|
Demiurge
|
2024-07-16 12:33:37
|
When you encode a gainmap bitmap, does it even take advantage of redundancy and prediction from the base image data?
|
|
2024-07-16 12:34:23
|
What's the advantage of having 2 images pretending to be 1 file?
|
|
|
Quackdoc
|
|
_wb_
I don't see a way to manually adjust the gain map locally in Lightroom. I can adjust global parameters to the presumably local tone mapping algorithm it implements, but I don't see a way to adjust the result in specific regions.
|
|
2024-07-16 12:34:37
|
I wouldn't be surprised if they just haven't added it yet
|
|
|
_wb_
|
|
Demiurge
When you encode a gainmap bitmap, does it even take advantage of redundancy and prediction from the base image data?
|
|
2024-07-16 12:40:27
|
No
|
|
|
Quackdoc
|
2024-07-16 12:40:59
|
would there even be much redundancy at all?
|
|
|
_wb_
|
|
Demiurge
What's the advantage of having 2 images pretending to be 1 file?
|
|
2024-07-16 12:42:39
|
Conceptually I don't like it. It is like shipping movies as a Blu-ray disk and VHS tape taped together.
|
|
|
Demiurge
|
2024-07-16 12:42:58
|
At that point just have 2 separate images, it's more honest and logical and predictable that way...
|
|
2024-07-16 12:43:12
|
I mean 2 separate files, it's already 2 separate images...
|
|
2024-07-16 12:43:19
|
No reason to pretend it's 1
|
|
|
Quackdoc
|
|
Demiurge
I mean 2 separate files, it's already 2 separate images...
|
|
2024-07-16 12:43:39
|
except now you have to worry about everyone distributing it wrong
|
|
2024-07-16 12:44:33
|
does html even support distributing both SDR and HDR and swapping depending on a tag or something?
|
|
|
Demiurge
|
2024-07-16 12:44:41
|
I dunno, it's just my 2 cents. I don't have all the answers either, but on the surface gainmaps seem like a smoothbrain idea to me
|
|
2024-07-16 12:45:16
|
Especially after hearing wb's explanation of them
|
|
|
Quackdoc
|
2024-07-16 12:46:04
|
I mean the only real solution is to force everyone to migrate to HDR, full stop. but that's not happening any time soon
|
|
|
_wb_
|
2024-07-16 12:52:13
|
The thing is, gain maps is an approach that tries to kill two birds with one stone:
1. Graceful degradation/interoperability: making things work in legacy SDR-only applications, while also offering an enhanced experience for applications that do support HDR. This is the main focus of the UltraHDR approach (and JPEG XT a decade earlier).
2. Artistic control over the tone mapping.
Point 1 is very much about solving a temporary, transitional problem, while point 2 is fundamentally changing the concept of an image to be such that pixels do not have fixed sample values but the actual values depend on the viewing conditions and display capabilities, being interpolated between two variants of the image.
|
|
2024-07-16 01:05:53
|
But these are kind of contradictory goals to some extent. Implementing gain maps based image rendering properly is very complicated, since you cannot just produce a single decoded image buffer: you have to keep both the base image and the alternate image in memory, and dynamically update the interpolation between them since the viewing conditions will also change dynamically (background illumination and display brightness can change over time). So it's more complicated than just having one HDR image.
So my feeling is that very few applications will actually do that. Most will just use point 1: not implement it at all, since that also 'works'. This does give artists control over the tone mapping, but it does not give them HDR images. Other applications will partially implement it: use the gain map to reconstruct an HDR image, and then consider the image as if it is just that HDR image. That means artists do not get control over the tone mapping, but they do get HDR.
I don't really see it happening in practice that points 1 and 2 are actually both achieved in a satisfactory way.
|
|
|
Quackdoc
|
2024-07-16 01:15:51
|
I disagree that it's really that complicated for the majority of consumer applications. Most applications won't need to balance between SDR and HDR. It is true that brightness changes and context need to be accounted for, but even then, with a gainmap you have pretty much an "HDR" mode and an "SDR" mode. It's far more likely that applications would only need to, at most, render the rendition they need, then leave mapping up to the operating system as per the current status quo.
Applications that do juggle between them (such as the iOS browser) would far more likely simply render out to HDR, then do SDR -> HDR mapping for the rest of the content, much like how HDR is currently handled.
Currently "SDR only application" applies to the vast majority of systems and operating modes, and that will stay true for the foreseeable future; if we have learned anything from technology progression, "SDR only consumers" will be around for at least another decade.
|
|
|
_wb_
|
2024-07-16 01:21:15
|
The gain map spec does specify interpolation between the two images, and that would be quite important if you have e.g. a PQ image that actually goes up to 10k nits and you want to render it on a 1k nits display that is moving between outdoors viewing conditions and indoors.
|
|
2024-07-16 01:24:21
|
But yes, I don't think applications will actually implement that kind of dynamic interpolation, and they will rather be just decoding the SDR image when using an SDR display and reconstructing an HDR image just once when using an HDR display.
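(Roughly the kind of display-adaptive interpolation being described; a sketch of the Adobe-style math as I read it, where the offsets, capacity bounds and names are illustrative rather than normative:)
```python
import numpy as np

def render(sdr, gain_log2, headroom_log2,
           cap_min=0.0, cap_max=2.0, off_sdr=1/64, off_hdr=1/64):
    """Interpolate between the SDR base and the full HDR rendition.

    sdr:           linear-light SDR base image
    gain_log2:     gain map in log2 units, upsampled to image size
    headroom_log2: log2(display peak / SDR white) at render time;
                   this changes with display brightness, which is why
                   a full implementation has to update dynamically.
    """
    # w = 0 gives the pure SDR rendition, w = 1 the full HDR rendition
    w = np.clip((headroom_log2 - cap_min) / (cap_max - cap_min), 0.0, 1.0)
    return (sdr + off_sdr) * np.exp2(gain_log2 * w) - off_hdr
```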
|
|
|
Quackdoc
|
2024-07-16 01:24:45
|
I thought the "android specification" for gainmaps more or less mapped to PQ with a hard nit value. I'll have to re-read it.
|
|
|
_wb_
|
2024-07-16 01:25:28
|
I was talking about the Adobe specification for gain maps
|
|
|
Quackdoc
|
2024-07-16 01:26:12
|
oh right, I forgot we need to deal with this, it really is "HDR 2.0" [av1_cheems](https://cdn.discordapp.com/emojis/720670067091570719.webp?size=48&quality=lossless&name=av1_cheems)
|
|
|
_wb_
|
2024-07-16 01:29:47
|
Anyway, the goal of the current ISO 21496 is to have just a single gain maps spec instead of all the proprietary single vendor approaches we have now, which I think is a noble goal. But I think the gain maps approach in general is just not a good approach if the goal is to get consistent rendering of HDR images.
|
|
2024-07-16 01:37:20
|
In practice, the actual image an end-user will see will not be consistent at all:
- they may see the SDR base image (in applications that don't implement it)
- they may see the HDR image, tone mapped with whatever default tone mapping the application or OS or display is doing
- they may see an interpolated image based on the display brightness at decode time
- they may see an interpolated image that dynamically changes when the display brightness is changing
|
|
|
Quackdoc
|
|
_wb_
In practice, the actual image an end-user will see will not be consistent at all:
- they may see the SDR base image (in applications that don't implement it)
- they may see the HDR image, tone mapped with whatever default tone mapping the application or OS or display is doing
- they may see an interpolated image based on the display brightness at decode time
- they may see an interpolated image that dynamically changes when the display brightness is changing
|
|
2024-07-16 02:06:32
|
this seems like a much larger scope than is really necessary or wanted. The latter 2 parts shouldn't even be in scope IMO; that should be left entirely up to platform handling.
The second point I suppose *could* be an issue, but that is the current status quo anyways. Unless you mean HDR->SDR, I don't foresee that being an issue, since any application that is gainmap-aware in the first place should be platform-aware anyways.
|
|
2024-07-16 02:07:27
|
IMO simple is solid, and solid is good. a gainmap's scope should be limited to taking an SDR image to a typical HDR transfer, or vice versa (though HDR -> SDR I suppose might be harder)
|
|
|
|
SwollowChewingGum
|
|
_wb_
Anyway, the goal of the current ISO 21496 is to have just a single gain maps spec instead of all the proprietary single vendor approaches we have now, which I think is a noble goal. But I think the gain maps approach in general is just not a good approach if the goal is to get consistent rendering of HDR images.
|
|
2024-07-16 02:08:15
|
wait, with gain maps, is it SDR+Gainmap=HDR or HDR+Gainmap=SDR?
|
|
2024-07-16 02:08:30
|
i.e. is the source image in SDR or HDR?
|
|
|
Quackdoc
|
|
SwollowChewingGum
wait, with gain maps, is it SDR+Gainmap=HDR or HDR+Gainmap=SDR?
|
|
2024-07-16 02:09:09
|
in theory it is both ways, but IMO SDR->HDR is the only one that makes sense
|
|
2024-07-16 02:11:57
|
no idea about the spec tho, I'm not sure if it's public or not
|
|
|
_wb_
|
2024-07-16 02:16:26
|
Both ways are possible in the current spec. For 8-bit formats like de facto JPEG or Main profile HEIC, only SDR+gainmap->HDR makes sense. For formats that natively support HDR, I think the other direction is the one that makes the most sense.
|
|
|
Quackdoc
|
2024-07-16 02:19:32
|
does it make sense? I think the likelihood of an application supporting HDR being gainmap-aware is significantly higher than the alternative. Even if the potential quality is higher, the loss in app support is far more significant, which pretty much breaks gainmaps' single greatest usecase
|
|
|
|
SwollowChewingGum
|
|
Quackdoc
does it make sense? I think the likelihood of an application supporting HDR being gainmap-aware is significantly higher than the alternative. Even if the potential quality is higher, the loss in app support is far more significant, which pretty much breaks gainmaps' single greatest usecase
|
|
2024-07-16 02:20:26
|
but most software supporting new formats (ie jxl) should probably be hdr aware, and if not, the decoder can take that burden
|
|
|
_wb_
Both ways are possible in the current spec. For 8-bit formats like de facto JPEG or Main profile HEIC, only SDR+gainmap->HDR makes sense. For formats that natively support HDR, I think the other direction is the one that makes the most sense.
|
|
2024-07-16 02:21:07
|
are iPhone HEIC photos Main profile?
|
|
|
Quackdoc
|
|
SwollowChewingGum
but most software supporting new formats (ie jxl) should probably be hdr aware, and if not, the decoder can take that burden
|
|
2024-07-16 02:21:17
|
this is not true due to how libraries work in the first place; many applications "just work" with JXL when their decoding library adds support for it
|
|
2024-07-16 02:21:46
|
also very little linux software is HDR aware in the first place anyways
|
|
2024-07-16 02:22:14
|
Developing an HDR application is often not just "update your app's gui framework", sadly
|
|
|
|
SwollowChewingGum
|
|
Quackdoc
this is not true due to how libraries work in the first place; many applications "just work" with JXL when their decoding library adds support for it
|
|
2024-07-16 02:22:46
|
shouldn't the decoding library support tonemapping? Couldn't you have the libjxl (or whatever) decode function take a `bool is_hdr` argument which defaults to `false`?
|
|
|
Quackdoc
|
|
SwollowChewingGum
shouldn't the decoding library support tonemapping? Couldn't you have the libjxl (or whatever) decode function take a `bool is_hdr` argument which defaults to `false`?
|
|
2024-07-16 02:23:09
|
this is a highly unique feature to JXL
|
|
2024-07-16 02:23:13
|
the answer is no
|
|
2024-07-16 02:24:00
|
you may have rudimentary support in your colour management system, but well, tonemapping sucks
|
|
2024-07-16 02:24:25
|
I suppose it could be a decent fallback, but it's not reliable at all
|
|
|
_wb_
|
|
SwollowChewingGum
are iPhone HEIC photos Main profile?
|
|
2024-07-16 02:49:24
|
Yes, as far as I understand. In principle, HEIC is always Main profile — if you want to use other profiles, you need HEIX.
|
|
|
Quackdoc
I suppose it could be a decent fallback, but it's not reliable at all
|
|
2024-07-16 03:00:44
|
Basically the question boils down to: do you want the SDR image to be reliable/well-defined, but you don't really know if and how the HDR image will be rendered since that will depend on the application (that's what you get with the SDR+gainmap->HDR approach), or do you want the HDR image to be well-defined, but you don't really know if SDR renditions will look good (that's what you get when encoding just an HDR image)?
With inverse gain maps (HDR+gainmap->SDR), in principle you can have both images well-defined and looking good, but I think it's overkill, and it would be better to have a well-defined parametrized "rendering intent" for HDR images, which basically normatively defines how to do the tone mapping (i.e., parameters of a global or local tone mapping algorithm) so it is consistent.
|
|
|
Demiurge
|
2024-07-16 03:04:05
|
But then people have to agree on those parameters and on an on-disk format to serialize them.
|
|
|
Quackdoc
|
2024-07-16 03:04:30
|
if the specification properly defines how to render the gainmap out, then the SDR rendition should be more or less as well defined, and any applications failing to do so would simply be spec-noncompliant
|
|
|
_wb_
|
|
Demiurge
But then people have to agree on those parameters and on an on-disk format to serialize them.
|
|
2024-07-16 04:11:31
|
I think ICC profiles could be extended with a more elaborate rendering intent description than the current 4-value one (which is meant to describe gamut mapping rendering intent, not dynamic range mapping).
|
|
|
Demiurge
|
2024-07-16 05:57:05
|
iccMAX? https://www.color.org/iccmax.xalter
|
|
2024-07-16 05:58:17
|
https://github.com/InternationalColorConsortium/DemoIccMAX
|
|
2024-07-16 06:01:11
|
I don't see anything related to localized tone mapping or tone mapping of any kind though
|
|
2024-07-16 06:02:49
|
`Programmable transforms (e.g. direct encoding of device models) will be supported, with functional operators, conditional evaluation, persistent variables and vectorized operations for improved performance.`
So, executable code in ICC profiles? Great idea
|
|
|
_wb_
|
2024-07-16 06:04:15
|
no no, ICC 5 is not something anyone really wants to implement. They're working on a new revision of ICC 4 which should get some kind of tone mapping intent, but this is still very preliminary work
|
|
|
Quackdoc
|
|
Demiurge
https://github.com/InternationalColorConsortium/DemoIccMAX
|
|
2024-07-16 06:07:16
|
iccmax is a crapshoot, ~~feels a lot like tiff tbh~~
|
|
|
Demiurge
|
2024-07-16 08:43:58
|
Interesting...
|
|
2024-07-16 08:44:35
|
I guess in the meantime gainmaps are in a much more mature and usable state despite its flaws
|
|
|
_wb_
|
2024-07-16 10:17:46
|
Not really, there are different single-vendor ways of doing gain maps, there's the JPEG XT way that was defined 10 years ago but nobody really uses it, and then there's the new TC 42 spec but that one is only at CD stage so I wouldn't call it mature yet.
|
|
|
Demiurge
|
2024-07-16 10:54:47
|
I wonder why XT isn't more common...
|
|
|
|
SwollowChewingGum
|
2024-07-16 11:00:32
|
How does XT's work?
|
|
|
Demiurge
|
2024-07-16 11:03:18
|
It's just a regular JPEG file, compatible with all existing JPEG decoders. But it adds extensions for alpha channel, lossless, and floating point HDR, and cool stuff like that.
|
|
2024-07-16 11:04:58
|
https://github.com/thorfdbg/libjpeg
|
|
2024-07-16 11:05:33
|
Looks like it's literally just a drag and drop replacement for libjpeg too. Seems like a no brainer, I'm kinda surprised it didn't get any traction
|
|
2024-07-16 11:06:08
|
And instead people are literally re-inventing the wheel like with Android JPEG HDR which does the exact same thing but doesn't follow the existing standard way of doing it
|
|
2024-07-16 11:13:19
|
Hmm, it doesn't actually seem like such a drop-in replacement after all...
|
|
|
TheBigBadBoy - 𝙸𝚛
|
2024-07-16 11:34:00
|
I wish this was resolved <:PepeHands:808829977608323112>
https://github.com/mozilla/mozjpeg/issues/437
|
|
|
_wb_
|
|
Demiurge
Hmm, it doesn't actually seem like such a drop-in replacement after all...
|
|
2024-07-16 11:57:01
|
no, it's not a drop-in replacement like mozjpeg and jpegli. But it does implement the full JPEG standard and also JPEG XT.
|
|
2024-07-16 11:58:41
|
I guess the main problem with JPEG XT has been that if you're doing things that won't work with non-updated existing jpeg decoders, you can just as well use a newer codec that will compress better.
|
|
|
novomesk
|
2024-07-16 01:15:38
|
https://www.digikam.org/news/2024-07-14-8.4.0_release_announcement/
|
|
|
Demiurge
|
|
_wb_
I guess the main problem with JPEG XT has been that if you're doing things that won't work with non-updated existing jpeg decoders, you can just as well use a newer codec that will compress better.
|
|
2024-07-16 01:44:39
|
Problem is JPEG is still king...
|
|
|
yoochan
|
2024-07-16 01:44:55
|
not for long !
|
|
|
Demiurge
|
2024-07-16 01:47:09
|
It's the gold standard. You can even use newer codecs and still be compatible with JPEG. :)
|
|
2024-07-16 01:47:47
|
Since people are still struggling to improve upon it today, especially with the new arithmetic coder patents expiring
|
|
2024-07-16 01:48:35
|
And new codecs like jpegli or even pik and lepton
|
|
|
username
|
2024-07-16 01:49:11
|
jpegli isn't a new codec? (**EDIT:** I'm trying to say that jpegli isn't its own codec, it's just a new encoder for an old codec)
|
|
|
HCrikki
|
2024-07-16 01:49:24
|
I wonder how viable it'd be to generate reconstructible jxl directly, without requiring a jpg already created
|
|
|
yoochan
|
2024-07-16 01:49:59
|
lepton devs push users to turn to JpegXL instead, and pik became JpegXL
|
|
|
Demiurge
|
2024-07-16 01:59:07
|
Android just made "HDR JPEG" and I struggle to see how or why it's different than jpeg-xt part 2 or part 7
|
|
|
username
jpegli isn't a new codec? (**EDIT:** I'm trying to say that jpegli isn't it's own codec it's just a new encoder for an old codec)
|
|
2024-07-16 01:59:40
|
It's a new codec for an old format 😂
|
|
|
lonjil
|
2024-07-16 02:02:16
|
pretty weird to put in the same list as Pik or even Lepton
|
|
|
Demiurge
|
2024-07-16 02:30:23
|
pik and lepton are "new" ish codecs that leverage existing jpeg compatibility
|
|
|
lonjil
|
2024-07-16 02:32:40
|
Lepton and Brunsli were new formats for losslessly packing JPEGs
|
|
2024-07-16 02:32:52
|
Pik was an entirely new image format meant to replace JPEG
|
|
|
Demiurge
|
2024-07-16 02:33:03
|
sometimes people use the word codec to mean "format" and I think that's a valid use for the word but I think it's even more commonly used to refer to a "coder/decoder"
|
|
|
jonnyawsom3
|
|
HCrikki
I wonder how viable it'd be to generate reconstructible jxl directly, without requiring a jpg already created
|
|
2024-07-16 02:33:39
|
It was thought of back at launch, but not implemented yet
|
|
2024-07-16 02:33:39
|
https://res.cloudinary.com/cloudinary-marketing/image/upload/Web_Assets/blog/Encoder_diagram.png
|
|
|
Demiurge
|
2024-07-16 02:33:43
|
I haven't heard of Brunsli, I'll look it up. As for Pik, I'm pretty sure it was designed to be compatible with JPEG too?
|
|
|
lonjil
|
2024-07-16 02:33:44
|
jpegli is just a new encoder and decoder pair, not a new format at all, and so quite different from those three above.
|
|
|
jonnyawsom3
|
|
Demiurge
I haven't heard of Brunsli, I'll look it up. As for Pik, I'm pretty sure it was designed to be compatible with JPEG too?
|
|
2024-07-16 02:34:16
|
No, PIK was essentially what JXL is now. Jpegs could be 'upgraded' to PIK with a 20% reduction but it was a new format
|
|
|
lonjil
|
2024-07-16 02:34:35
|
Brunsli was Google's JPEG packer
|
|
|
Demiurge
|
2024-07-16 02:34:55
|
a coder/decoder pair is often properly called a codec
|
|
|
lonjil
|
2024-07-16 02:35:22
|
Early versions of JXL used Brunsli for JPEG compat instead of VarDCT + extra metadata.
|
|
2024-07-16 02:35:31
|
https://github.com/google/brunsli
|
|
2024-07-16 02:36:07
|
I haven't looked at the development history of Pik, but presumably it also used Brunsli for JPEG compat.
|
|
|
jonnyawsom3
|
2024-07-16 02:36:10
|
Although, while looking into PIK, it pointed out low generation loss from encoding JPEGs lossily with the same 8x8 DCT blocks. Makes me wonder how much of an impact that would have in VarDCT if they were matched
|
|
|
Demiurge
|
|
jonnyawsom3
No, PIK was essentially what JXL is now. Jpegs could be 'upgraded' to PIK with a 20% reduction but it was a new format
|
|
2024-07-16 02:36:17
|
And also losslessly converted back to JPEG
|
|
2024-07-16 02:36:30
|
So it's a new way of storing jpegs like lepton was
|
|
|
lonjil
|
2024-07-16 02:37:40
|
but that's irrelevant to the point
|
|
2024-07-16 02:38:12
|
like you're putting "new" in scare quotes because a new, much more capable format also has some compat features
|
|
|
jonnyawsom3
|
|
Demiurge
And also losslessly converted back to JPEG
|
|
2024-07-16 02:44:26
|
From what I can see, it was only one-way, and then it was moved to JpegXL instead
https://github.com/google/pik/issues/40
|
|
|
Demiurge
|
2024-07-16 02:57:49
|
lol I'm putting "new" in "scare quotes" because it's debatable if lepton should be called "new" when it's already considered "obsolete do not use"
|
|
2024-07-16 02:59:25
|
but it's still a "new codec" relatively speaking
|
|
2024-07-16 02:59:44
|
A new and already-obsolete codec
|
|
|
HCrikki
|
2024-07-16 06:27:19
|
anyone checked librewolf? oddly people think it has jxl but it's disabled by default and as bad as in firefox nightly (bad alpha, no animation)
|
|
|
username
|
|
HCrikki
anyone checked librewolf? oddly people think it has jxl but it's disabled by default and as bad as in firefox nightly (bad alpha, no animation)
|
|
2024-07-16 06:30:23
|
librewolf has patches in its repo for better JXL support but they aren't included/compiled into librewolf
|
|
2024-07-16 06:31:05
|
funny thing is they don't seem to test if the browser works with them, they just check if they can apply
|
|
|
HCrikki
|
2024-07-16 06:31:17
|
tried stable librewolf 128, had to enable the jxl flag but it works
|
|
2024-07-16 06:31:43
|
any way to get em merged?
|
|
2024-07-16 06:32:30
|
it's just lw is gaining mindshare over floorp and waterfox so might as well win the derivatives
|
|
|
username
|
2024-07-16 06:40:58
|
the one they have in their repo is currently messed up a bit but I made a fixed version
|
|
2024-07-16 06:41:21
|
although it seems that's not the reason they don't include it by default
|
|
2024-07-16 06:42:58
|
it's been discussed in here before but there's a chance librewolf won't ever enable JXL support unless regular firefox does, which kinda ruins the point of them even having patches for it in the first place if they aren't ever going to actually use them
|
|
|
HCrikki
|
2024-07-16 06:50:57
|
odd given improvements like jxl could diminish the performance impact of the other compromises they make in the name of security
|
|
2024-07-16 06:51:30
|
ie sure jit is disabled but we still load faster than upstream
|
|
|
username
|
2024-07-16 06:51:47
|
my theory is they are afraid of fingerprinting if they enable JXL support
|
|
2024-07-16 06:52:07
|
because regular firefox does not support JXL
|
|
|
Demiurge
|
2024-07-16 08:18:44
|
The problem is http accept headers. https://wiki.whatwg.org/wiki/Why_not_conneg
|
|
2024-07-16 08:21:35
|
Browsers are an abomination and their poorly designed standards practically encourage and invite fingerprinting and privacy violations at every corner. Because the people in charge of making browsers and writing the standards all profit from it
|
|
|
Quackdoc
|
2024-07-16 08:30:56
|
accept headers themselves are great though, it's a shame CDNs are bloody retarded with them: around 70% of CDNs I come across will just crap themselves if you remove webp from the accept header, almost guaranteed because whoever implemented it thought it would be good to just chain `if not then break` statements
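(A sketch of the chaining bug being described, next to the obvious fallthrough fix; hypothetical CDN-side logic, not any vendor's actual code:)
```python
def pick_broken(accept: str) -> str:
    # Ladder that bails the moment one entry is missing: strip webp
    # from the Accept header and avif/jxl are never even checked.
    if "image/webp" not in accept:
        return "jpeg"
    if "image/avif" not in accept:
        return "webp"
    if "image/jxl" not in accept:
        return "avif"
    return "jxl"

def pick_correct(accept: str) -> str:
    # Check each format independently, in order of preference.
    for fmt in ("jxl", "avif", "webp"):
        if f"image/{fmt}" in accept:
            return fmt
    return "jpeg"

assert pick_broken("image/avif,*/*") == "jpeg"   # the reported failure
assert pick_correct("image/avif,*/*") == "avif"
```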
|
|
2024-07-16 08:35:54
|
Also a small part of me wonders, if someone were to submit a PR to rip out libjxl and replace it with jxl-oxide, whether they would be less hesitant, since they like Rust and all.
|
|
|
WAZAAAAA
|
|
Quackdoc
accept headers themselves are great though, it's a shame CDNs are bloody retarded with them: around 70% of CDNs I come across will just crap themselves if you remove webp from the accept header, almost guaranteed because whoever implemented it thought it would be good to just chain `if not then break` statements
|
|
2024-07-16 08:45:22
|
huh? I don't think I've ever had websites break with this installed https://addons.mozilla.org/firefox/addon/dont-accept-webp worst that's gonna happen is they ignore it and still give you a WEBP
...it does mention they've had to add exceptions for Patreon and Reddit though
|
|
|
Quackdoc
|
|
WAZAAAAA
huh? I don't think I've ever had websites break with this installed https://addons.mozilla.org/firefox/addon/dont-accept-webp worst that's gonna happen is they ignore it and still give you a WEBP
...it does mention they've had to add exceptions for Patreon and Reddit though
|
|
2024-07-16 08:57:30
|
one example I have saved
```
➜ ~ curl -s 'https://www.hollandamerica.com/content/dam/hal/inventory-assets/ships/RN/gallery/rotterdam-main-dinning-rom-c10.jpg' -H 'Accept: image/avif,image/webp,*/*' -o - | file -
/dev/stdin: ISO Media, AVIF Image
➜ ~ curl -s 'https://www.hollandamerica.com/content/dam/hal/inventory-assets/ships/RN/gallery/rotterdam-main-dinning-rom-c10.jpg' -H 'Accept: image/avif,*/*' -o - | file -
/dev/stdin: JPEG image data, JFIF standard 1.01, aspect ratio, density 0x0, segment length 16, progressive, precision 8, 1542x867, components 3
```
|
|
2024-07-16 08:58:55
|
I come across a lot of CDNs like this
|
|
|
WAZAAAAA
|
2024-07-16 09:32:51
|
hmm is the problem that the 2nd command is prioritizing JPEG over AVIF?
|
|
|
Quackdoc
|
2024-07-16 09:34:50
|
no, or else the top one would serve a webp
|
|
2024-07-16 09:35:10
|
the issue is that it checks for webp in the accept header, then bails before it can check for avif
|
|
2024-07-17 12:23:05
|
neat, betterdiscord has a plugin to embed images, so if you use bdbrowser with an enabled fork like thorium or a custom-built electron with JXL support you can make discord embed the images. it's pretty buggy, since for some reason if you open a jxl image it will just open it 1:1, which makes nothing clickable on larger images
EDIT: sadly it doesn't work with jxl-crx
|
|
|
_wb_
|
2024-07-17 01:04:35
|
```
bash-3.2$ curl -s 'https://res.cloudinary.com/jon/f_auto/sample' -H 'Accept: image/avif' -o - | file -
/dev/stdin: ISO Media, AVIF Image
bash-3.2$ curl -s 'https://res.cloudinary.com/jon/f_auto/sample' -H 'Accept: image/jxl' -o - | file -
/dev/stdin: JPEG XL codestream
bash-3.2$ curl -s 'https://res.cloudinary.com/jon/f_auto/sample' -H 'Accept: image/webp' -o - | file -
/dev/stdin: RIFF (little-endian) data, Web/P image, VP8 encoding, 864x576, Scaling: [none]x[none], YUV color, decoders should clamp
bash-3.2$ curl -s 'https://res.cloudinary.com/jon/f_auto/sample' -H 'Accept: image/jpg' -o - | file -
/dev/stdin: JPEG image data, JFIF standard 1.02, resolution (DPI), density 100x100, segment length 16, progressive, precision 8, 864x576, components 3
```
|
|
|
Quackdoc
|
2024-07-17 01:38:04
|
it's always nice to see good CDNs [av1_PepeHappy](https://cdn.discordapp.com/emojis/654081052012314643.webp?size=48&quality=lossless&name=av1_PepeHappy)
|
|
2024-07-17 03:06:01
|
btw <@106423637103353856>, one of the CDNs I'm talking about is Shopify, which is one of the larger ones
```ps
➜ ~ curl -s 'https://shop.vote.org/cdn/shop/products/mockup-of-a-baby-wearing-a-sublimated-onesie-30023.png?v=1652303987&width=360' -H 'Accept: image/jxl,image/avif,image/webp,*/*' -o - | file -
/dev/stdin: JPEG XL codestream
➜ ~ curl -s 'https://shop.vote.org/cdn/shop/products/mockup-of-a-baby-wearing-a-sublimated-onesie-30023.png?v=1652303987&width=360' -H 'Accept: image/jxl,image/avif,*/*' -o - | file -
/dev/stdin: PNG image data, 360 x 360, 8-bit/color RGBA, non-interlaced
➜ ~ curl -s 'https://premioinc.com/cdn/shop/products/ACO-3000_1_50x.jpg' -H 'Accept: image/jxl,image/avif,image/webp,*/*' -o - | file -
/dev/stdin: JPEG XL codestream
➜ ~ curl -s 'https://premioinc.com/cdn/shop/products/ACO-3000_1_50x.jpg' -H 'Accept: image/jxl,image/avif,*/*' -o - | file -
/dev/stdin: JPEG image data, Exif standard: [TIFF image data, little-endian, direntries=6, orientation=upper-left, xresolution=86, yresolution=94, resolutionunit=2], progressive, precision 8, 50x28, components 3
```
|
|
|
WAZAAAAA
huh? I don't think I've ever had websites break with this installed https://addons.mozilla.org/firefox/addon/dont-accept-webp worst that's gonna happen is they ignore it and still give you a WEBP
...it does mention they've had to add exceptions for Patreon and Reddit though
|
|
2024-07-17 03:06:45
|
so this will prevent a LOT of websites from serving avif or jxl
|
|
|
_wb_
|
2024-07-17 09:01:47
|
I informally passed the above bug report to someone I happen to know at Shopify
|
|
|
Quackdoc
|
2024-07-17 09:05:13
|
I hope this is fixed 0.0.
|
|
|
Demiurge
|
2024-07-17 01:00:48
|
looks like it checks for webp support first before *conditionally* checking for other formats
|
|
|
Quackdoc
accept headers themselves are great though, it's a shame CDNs are bloody retarded with them: around 70% of CDNs I come across will just crap themselves if you remove webp from the accept header, almost guaranteed because whoever implemented it thought it would be good to just chain `if not then break` statements
|
|
2024-07-17 01:06:53
|
why is an accept header better than a <picture> element?
|
|
|
_wb_
|
2024-07-17 01:38:11
|
If you want to make variants based on both format and responsive breakpoints, it causes a bit of a combinatorial explosion that will make your html pretty tedious to write (or if automated, still pretty verbose and adding some overhead)
|
|
2024-07-17 01:39:35
|
Ideally, with Accept headers and a client hint that says something about the viewport width, you can just use a single img tag and let the server worry about which variant to serve for each case.
|
|
|
VcSaJen
|
|
Demiurge
looks like it checks for webp support first before *conditionally* checking for other formats
|
|
2024-07-17 01:44:42
|
AFAIR, it also checks for AVIF support second before *conditionally* checking for JXL.
|
|
|
Demiurge
|
2024-07-17 01:58:26
|
I dunno, the arguments in "why not conneg" sound pretty strong to me. Especially wrt cache/cdn
|
|
2024-07-17 06:06:03
|
I just tried completely stripping all "Accept" and "Accept-Language" headers from my browser's requests. So far I don't notice any issues... Imagine how much useless data could be saved and stripped from HTTP GET requests if browsers stopped sending it for every request...
|
|
2024-07-17 06:31:40
|
User agent crap should be slimmed down a lot too and contain less fingerprint data like CPU and kernel type...
|
|
|
Quackdoc
|
|
Demiurge
why is an accept header better than a <picture> element?
|
|
2024-07-17 07:48:39
|
accept headers mean you don't need to go back and update stuff to serve new and better images
|
|
|
Demiurge
I just tried completely stripping all "Accept" and "Accept-Language" headers from my browser's requests. So far I don't notice any issues... Imagine how much useless data could be saved and stripped from HTTP GET requests if browsers stopped sending it for every request...
|
|
2024-07-17 07:49:10
|
you are likely to get a lot more jpegs and pngs without accept headers, which would really increase the data used
|
|
|
Demiurge
|
2024-07-17 07:51:57
|
You are likely to get less webp, which in itself is a victory...
|
|
|
Quackdoc
|
2024-07-17 07:57:20
|
can't deny that [av1_dogelol](https://cdn.discordapp.com/emojis/867794291652558888.webp?size=48&quality=lossless&name=av1_dogelol)
|
|
|
Demiurge
|
2024-07-17 08:00:50
|
Let the browser choose a variant. That's what <picture> is for after all.
|
|
2024-07-17 08:02:42
|
There ought to be a standard HTML tag for describing a simple TOC for a progressive image too. So browsers know how much data to request to get a certain size.
|
|
|
Quackdoc
|
2024-07-17 08:08:27
|
the picture element requires you to go back and edit every single HTML page to add support for new images, and requires you to have dedicated paths for each image. accept headers are a lot simpler for web developers and way better for websites that can simply set and forget.
accept headers are why the majority of sites are actually serving JXL; without them, maybe you would have a hundred sites or so, maybe less. thanks to accept headers, literally millions of sites (4,794,511 made with shopify according to builtwith, for example) support and serve JXL.
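The set-and-forget behaviour is easy to verify from the command line, in the same spirit as the curl test above: same URL, different Accept header (URL hypothetical; the exact output depends on what the CDN supports).
```
curl -s 'https://cdn.example-shop.com/products/photo.jpg' \
     -H 'Accept: image/jxl,image/avif,image/webp,*/*' -o - | file -
curl -s 'https://cdn.example-shop.com/products/photo.jpg' \
     -H 'Accept: */*' -o - | file -
```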
|
|
2024-07-17 08:09:02
|
accept headers save a LOT more data than picture tags, thanks to every single web site developer not needing to update every single page/template
|
|
|
jonnyawsom3
|
|
Demiurge
There ought to be a standard HTML tag for describing a simple TOC for a progressive image too. So browsers know how much data to request to get a certain size.
|
|
2024-07-17 09:20:37
|
I remember trying to manually send a partial request to fetch half a JXL image, but I couldn't find a way to do it from the browser tools, only scripts for video players, etc. where you can buffer a few seconds
|
|
|
Quackdoc
|
2024-07-17 09:36:51
|
does JXL have a method to signify that you only need to download x amount of bytes if you want y image?
|
|
|
Demiurge
|
|
Quackdoc
accept headers save a LOT more data than picture tags, thanks to every single web site developer not needing to update every single page/template
|
|
2024-07-17 10:34:57
|
But the different formats are often at different URLs anyways and so the web server has to serve a different page depending on what the headers say.
|
|
|
Quackdoc
does JXL have a method to signify that you only need to download x amount of bytes if you want y image?
|
|
2024-07-17 10:35:17
|
Yeah, the JXL header has a table of contents
|
|
|
Quackdoc
|
|
Demiurge
But the different formats are often at different URLs anyways and so the web server has to serve a different page depending on what the headers say.
|
|
2024-07-17 10:35:30
|
in some cases
|
|
|
Demiurge
|
2024-07-17 10:35:42
|
But you need to already be reading the JXL file to read the header/toc
|
|
|
|
SwollowChewingGum
|
2024-07-17 10:35:55
|
Is that only when using the container?
|
|
|
Demiurge
|
2024-07-17 10:36:13
|
I don't think so no
|
|
2024-07-17 10:36:18
|
It's all JXL files
|
|
|
jonnyawsom3
|
2024-07-17 10:36:35
|
If I recall someone was doing tests to figure out the average percentage needed for the first progressive pass in a set of files
|
|
|
Demiurge
|
2024-07-17 10:38:08
|
The web page itself would need to contain some information in an html tag to tell the browser how many bytes to request to get a preview
|
|
2024-07-17 10:39:50
|
Not all JXL is progressive either. Most lossless jxl is tile-by-tile
|
|
2024-07-17 10:40:34
|
One tile at a time
|
|
|
_wb_
|
2024-07-17 11:06:06
|
The TOC is part of the frame header, it is obligatory. Decoders can use it to do parallel or ROI decoding. It could also be used to efficiently determine truncation offsets for the progressive passes.
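A rough offline demonstration of truncation, assuming libjxl's djxl tool and its --allow_partial_files flag (flag name from memory; byte count arbitrary):
```
# keep only the first 20 kB of a progressive JXL, then decode what's there
head -c 20000 photo.jxl > truncated.jxl
djxl truncated.jxl preview.png --allow_partial_files
```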
|
|
|
Quackdoc
|
|
Demiurge
Not all JXL is progressive either. Most lossless jxl is tile-by-tile
|
|
2024-07-17 11:10:34
|
well that's still "progressive" compared to other formats
|
|
|
Demiurge
|
|
jonnyawsom3
If I recall someone was doing tests to figure out the average percentage needed for the first progressive pass in a set of files
|
|
2024-07-17 11:22:01
|
This is an excellent "first step" I would say. You would STILL need a way to signal to the browser beforehand that the image is progressive-coded. But if a web browser knows that a file is progressively coded, and knows the size of the file, then it can request the first x% of the file, x being a simple heuristic informed by statistics and the file size and dimensions. There would need to be some statistical analysis based on a large corpus of images at various dimensions and bitrate levels to see what the average ratios are.
|
|
2024-07-17 11:22:51
|
But again... there would STILL need to be a standard way to signal whether or not an image is progressive, in the HTML
|
|
2024-07-17 11:26:22
|
I don't think there's any way for the browser to know whether an image it's about to fetch is progressive or not, and no one did any statistical analysis to establish a good heuristic to determine what % to fetch based on known information about the dimensions and filesize. Maybe there's a way for the browser to tell a server to stop sending the rest of the file?
|
|
2024-07-17 11:27:05
|
The HTTP "I don't need any more thanks" signal
|
|
2024-07-17 11:27:19
|
Is that a thing?
|
|
|
Quackdoc
|
|
_wb_
The TOC is part of the frame header, it is obligatory. Decoders can use it to do parallel or ROI decoding. It could also be used to efficiently determine truncation offsets for the progressive passes.
|
|
2024-07-17 11:29:32
|
yeah this would be the best, if a browser could easily query how many bits are needed for the various offsets it could trivially stop the download.
|
|
|
_wb_
|
2024-07-18 12:06:23
|
Browsers could just fetch the first few kB of any image, and this would always be useful. For non-progressive images, it tells you the image dimensions which is useful to avoid layout shifts, for progressive jxl it tells you the offsets of the passes and perhaps you already get the first pass too, for small images.
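In HTTP terms that initial probe is just a Range request; a range-capable server answers 206 Partial Content (hypothetical URL, 16 KiB chosen arbitrarily):
```
# -r limits the byte range, -D - dumps the response headers so the 206 is visible
curl -s -D - -r 0-16383 'https://cdn.example.com/photo.jxl' -o head.jxl
```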
|
|
|
Demiurge
|
|
Quackdoc
yeah this would be the best, if a browser could easily query how many bits are needed for the various offsets it could trivially stop the download.
|
|
2024-07-18 12:32:37
|
You don't want too many round trip requests before displaying content. That creates unacceptable latency on certain connections.
|
|
2024-07-18 12:33:10
|
The information should be included as part of the embed tag.
|
|
|
Quackdoc
|
2024-07-18 12:33:18
|
it can be done asynchronously
|
|
|
Demiurge
|
2024-07-18 12:33:41
|
If you have to wait on a round trip before content is displayed, it's a bad idea.
|
|
|
Quackdoc
|
2024-07-18 12:37:14
|
the other thing that would work out well would be a browser simply telling the decoder what resolution is needed, then letting the decoder send a signal to stop the transfer. that would need some kind of state manager, but well, browsers are full of that shit anyways
|
|
|
_wb_
|
2024-07-18 12:40:12
|
The problem with that is that http does not really have a way to abort a transfer. You can break the connection but that's a rather drastic thing to do since there will be many transfers going on through the same connection.
|
|
2024-07-18 12:44:59
|
The best way to do it is server-side, prioritizing the transfer of the initial segment of each image above the fold (or close to the fold) corresponding to the first pass of each image. This is very nontrivial to do though, since servers generally serve requests in a stateless way and this requires per-client coordination of how all the requests coming from the same client are prioritized, i.e. it requires state.
|
|
|
|
SwollowChewingGum
|
|
_wb_
The problem with that is that http does not really have a way to abort a transfer. You can break the connection but that's a rather drastic thing to do since there will be many transfers going on through the same connection.
|
|
2024-07-18 12:55:45
|
Can’t the servers just send 206 partial content and then terminate the connection?
|
|
|
jonnyawsom3
|
|
Demiurge
I don't think there's any way for the browser to know whether an image it's about to fetch is progressive or not, and no one did any statistical analysis to establish a good heuristic to determine what % to fetch based on known information about the dimensions and filesize. Maybe there's a way for the browser to tell a server to stop sending the rest of the file?
|
|
2024-07-18 01:06:19
|
This is what I was trying to use, as just said above https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/206
|
|
|
_wb_
|
|
SwollowChewingGum
Can’t the servers just send 206 partial content and then terminate the connection?
|
|
2024-07-18 01:11:04
|
Sure, but that doesn't allow the client to choose where to stop.
|
|
|
|
SwollowChewingGum
|
2024-07-18 01:14:06
|
Theoretically some new header could work
|
|
|
_wb_
|
2024-07-18 01:14:13
|
206 responses would be a way to prioritize the initial pass of each image, but it still requires multiple requests. Server-side prioritizing of the initial pass would be a way to have only a single request per image.
|
|
|
|
SwollowChewingGum
|
2024-07-18 01:14:45
|
What do you mean exactly by “prioritising the initial pass”?
|
|
|
Demiurge
|
|
_wb_
Sure, but that doesn't allow the client to choose where to stop.
|
|
2024-07-18 01:20:42
|
It does once you have the TOC and LQIP and make a new request.
|
|
2024-07-18 01:22:51
|
Then it can at least load what was sent and make a new request if it wants more. But then the server determines how much to send at first
|
|
2024-07-18 01:23:03
|
And the server doesn't have dpi info
|
|
2024-07-18 01:23:41
|
Or any kind of idea what the client wants
|
|
|
_wb_
|
2024-07-18 01:44:01
|
that's where Client Hints are useful, they can inform the server about dpr and viewport width
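For illustration, those hints ride along as ordinary request headers, much like Accept (hypothetical URL; Sec-CH-DPR and Sec-CH-Viewport-Width are the current names for those hints, and servers have to opt in to receiving them via Accept-CH):
```
curl -s 'https://cdn.example.com/img/hero' \
     -H 'Accept: image/jxl,image/avif,*/*' \
     -H 'Sec-CH-DPR: 2' \
     -H 'Sec-CH-Viewport-Width: 1280' -o - | file -
```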
|
|
|
SwollowChewingGum
What do you mean exactly by “prioritising the initial pass”?
|
|
2024-07-18 01:49:00
|
http stream prioritization can be used to send e.g. the first progressive pass with a high priority and the rest with a low priority, so if there are multiple images on a page, instead of sending them mostly sequentially, you would first get all the previews and then all the detail
|
|
|
|
SwollowChewingGum
|
|
Demiurge
Then it can at least load what was sent and make a new request if it wants more. But then the server determines how much to send at first
|
|
2024-07-18 02:48:31
|
But having multiple requests for each image causes a lot of unnecessary overhead
|
|
|
_wb_
|
2024-07-19 09:55:25
|
Anyone with an Apple device: please open a Feedback Assistant (https://developer.apple.com/bug-reporting/) bug report about HDR images not working in Safari and in Preview.
I have been told that the relevant Apple team is aware of the issue but has a hard time getting this prioritized, and for that it apparently helps if the bug gets reported by many people.
|
|
|
spider-mario
|
2024-07-19 11:15:53
|
this is not JXL-specific, right? it’s all HDR images?
|
|
2024-07-19 11:16:03
|
(including Apple-produced ones?)
|
|
|
VcSaJen
|
2024-07-19 12:34:37
|
Is it Safari-specific, or webkit-specific?
|
|
|
|
SwollowChewingGum
|
|
VcSaJen
Is it Safari-specific, or webkit-specific?
|
|
2024-07-19 12:54:50
|
Could be Core Image specific
|
|
|
spider-mario
|
2024-07-19 01:03:16
|
it works in the Photos app
|
|
|
|
SwollowChewingGum
|
|
spider-mario
it works in the Photos app
|
|
2024-07-19 01:03:59
|
Does preview use WebKit?
|
|
2024-07-19 01:04:20
|
HDR images don’t seem to work in preview
|
|
|
spider-mario
|
2024-07-19 01:05:36
|
webkit, I doubt it, but probably Core Image?
|
|
|
|
SwollowChewingGum
|
|
spider-mario
webkit, I doubt it, but probably Core Image?
|
|
2024-07-19 01:05:55
|
That’s why I was thinking it was CoreImage
|
|
|
spider-mario
|
2024-07-19 01:06:24
|
and why I mentioned Photos which probably also does (?) and in which it works
|
|
|
|
SwollowChewingGum
|
2024-07-19 01:06:44
|
Maybe it’s an issue with how WebKit uses Core Image (?)
|
|
2024-07-19 01:06:53
|
If it does at all?
|
|
|
spider-mario
|
2024-07-19 01:06:54
|
that would be my guess
|
|
2024-07-19 01:07:05
|
that Core Image doesn’t automatically make it work, but also doesn’t inherently prevent it
|
|
2024-07-19 01:07:18
|
that it’s up to the application to use it properly
|
|
|
|
SwollowChewingGum
|
2024-07-19 01:07:23
|
So photos is probably using Core Image + something else?
|
|
2024-07-19 01:08:03
|
To view HDR JXLs I use chromium because finder and preview tonemap them to SDR
|
|
|
_wb_
|
|
spider-mario
this is not JXL-specific, right? it’s all HDR images?
|
|
2024-07-19 01:44:57
|
Correct. I can't get any image format to render in HDR in Preview or Safari.
|
|
2024-07-19 01:47:36
|
My guess would be that CoreImage by default returns 8-bit tone mapped pixel buffers, and only with a specific option returns HDR image buffers, as a way to avoid breaking existing applications that use it. And then they probably updated Photos to use the new option or new API, but not the rest.
|
|
|
Demiurge
|
2024-07-21 10:58:34
|
Update: Ever since setting my browser to stop sending HTTP "Accept" and "Accept-Language" headers, most things still seem to work just fine, but I noticed that certain websites are a little bit weird if you aren't logged in. Like github. I guess some HTTP servers assume bot traffic if they don't like the HTTP headers.
|
|
|
_wb_
|
2024-07-21 11:38:21
|
Everything should work if you don't send Accept headers, you should just get the fallback for everything. No gz/br compression on your html/css/js, only jpeg/png/gif for images. It works, it just uses more bandwidth.
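Easy to see with the encoding header specifically (real-site output will vary):
```
# no Accept-Encoding: a compliant server sends the uncompressed fallback
curl -s -D - 'https://example.com/' -o /dev/null | grep -i content-encoding
# advertising support: the same server may now answer with gzip or brotli
curl -s -D - -H 'Accept-Encoding: gzip, br' 'https://example.com/' -o /dev/null | grep -i content-encoding
```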
|
|
|
Demiurge
|
2024-07-21 07:37:24
|
I kept the accept-encoding header, just not the other two.
|
|
2024-07-21 07:44:10
|
Compression should be a mandatory part of http that's implied by the version string. The other two are even more useless and dumb
|
|
2024-07-21 07:45:05
|
The problems it creates for caches and CDNs seem like the biggest issue with the accept header
|
|
2024-07-21 08:06:16
|
A few servers actually ignore your requests if you don't have webp in your accept header.
|
|
|
Quackdoc
|
2024-07-21 08:11:04
|
yeah, shopify T.T
|
|
|
Demiurge
|
2024-07-21 08:13:40
|
If you don't like webp, you don't deserve to access their services :)
|
|
|
Crite Spranberry
|
2024-07-22 12:02:26
|
https://github.com/Eclipse-Community/r3dfox/releases/tag/v128.0.2
|
|
2024-07-22 12:16:26
|
I should mention, it has a JXL decoder
|
|
|
Demiurge
|
2024-07-22 02:18:09
|
theoretically, `image/webp;q=0` in the accept header should tell websites that you do not accept webp, but in practice I think most web servers just see if webp is mentioned in the accept string and assume webp is supported if it's found, without checking the q=0
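Per the spec, q=0 is an explicit refusal, so a fully compliant server should never pick WebP for a request like this (hypothetical URL):
```
curl -s 'https://cdn.example.com/photo.jpg' \
     -H 'Accept: image/jxl,image/webp;q=0,*/*;q=0.5' -o - | file -
```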
|
|
2024-07-22 05:28:41
|
This seems to work pretty well. Less webp, you advertise/ask for jxl, and servers that look for and require `Accept: text/html` are happy. Best of all worlds.
|
|
2024-07-22 05:30:11
|
idk if it actually matters but you can change the bottom one to `text/html,image/jxl,*/*` if any server actually cares about that. I don't think so, though.
|
|
2024-07-22 05:31:39
|
unfortunately if you omit the `Accept:` header entirely, some websites will occasionally behave weirdly. They're just expecting `Accept: text/html`
|
|
|
𐑛𐑦𐑕𐑣𐑸𐑥𐑩𐑯𐑦 | 最不調和の伝播者 | 異議の元素
|
|
Demiurge
theoretically, `image/webp;q=0` in the accept header should tell websites that you do not accept webp, but in practice I think most web servers just see if webp is mentioned in the accept string and assume webp is supported if it's found, without checking the q=0
|
|
2024-07-22 07:22:59
|
correct, since checking for the existence of substrings is much faster than fully parsing the Accept header
|
|
|
Quackdoc
|
2024-07-22 07:24:53
|
ehhh, it's not really any actual perf savings, more like dev time savings. this might be an issue if you are running on a pentium II or something xD
|
|
|
𐑛𐑦𐑕𐑣𐑸𐑥𐑩𐑯𐑦 | 最不調和の伝播者 | 異議の元素
|
|
Quackdoc
ehhh, it's not really any actual perf savings, more like dev time savings. this might be an issue if you are running on a pentium II or something xD
|
|
2024-07-22 07:48:40
|
faster for the devs to code XD
|
|
2024-07-22 07:49:53
|
also some web servers may not have modules to parse Accept headers at all, so for those web servers a `*image/example*` substring match might be your only option
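What the substring-style check typically boils down to, sketched as a CGI-style shell fragment (HTTP_ACCEPT is the classic CGI variable carrying the Accept header). Note that `image/webp;q=0` still matches the webp glob, which is exactly the behaviour described above:
```
case "$HTTP_ACCEPT" in
  *image/jxl*)  ext=jxl  ;;
  *image/avif*) ext=avif ;;
  *image/webp*) ext=webp ;;
  *)            ext=jpg  ;;
esac
```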
|
|
|
Demiurge
|
2024-07-22 09:30:50
|
https://developer.mozilla.org/en-US/docs/Web/HTML/Element/source
|
|
2024-07-22 09:31:04
|
Who needs to parse accept headers anyways?
|
|
|
Traneptora
|
|
Demiurge
Who needs to parse accept headers anyways?
|
|
2024-07-22 05:17:45
|
CDNs that don't have access to upstream HTML
|
|
2024-07-22 05:18:12
|
it's been explained why content negotiation is better than <source> elements
|
|
|
Demiurge
|
2024-07-22 10:11:52
|
Accept headers cripple caches...
|
|
|
Traneptora
CDNs that don't have access to upstream HTML
|
|
2024-07-22 10:12:39
|
The CDN doesn't need to change the HTML...
|
|
2024-07-22 10:13:31
|
Parsing accept headers and serving dynamic html is a waste of time and bad design
|
|
2024-07-22 10:14:13
|
You're putting way too much work on the http server
|
|
|
Demiurge
|
2024-07-22 10:15:30
|
This isn't new or radical information here
|
|
2024-07-22 10:15:40
|
Everyone already knows this
|
|
|
Traneptora
|
|
Demiurge
The CDN doesn't need to change the HTML...
|
|
2024-07-22 10:16:03
|
well if you want to add a new image option, how do you do that without changing the HTML
|
|
2024-07-22 10:16:27
|
the CDN cannot do that without changing the HTML unless it has content negotiation
|
|
|
Demiurge
Everyone already knows this
|
|
2024-07-22 10:17:07
|
"Everyone already knows this" and yet many people here disagree with you, including professionals
|
|
|
Demiurge
|
2024-07-22 10:18:06
|
I see, so if they add support for a new format, they have to modify the page to add it... the CDN has no control over that, it becomes the website owner's responsibility.
|
|
|
Traneptora
|
2024-07-22 10:18:10
|
Shopify added JXL images to its repertoire and all the apple devices that can view them started seeing them. None of the shopify sites had to update their HTML to allow this to happen.
|
|
2024-07-22 10:18:31
|
If you had it your way, literally all the individual sites would have to do that, even though they hire shopify to deliver what's best
|
|
2024-07-22 10:18:53
|
There is a reason the experts do it the way they do it
|
|
|
Demiurge
|
2024-07-22 10:19:23
|
Changing responsibility around isn't really a good or a bad thing, unless you assume website owners should have more or less responsibility
|
|
|
lonjil
|
2024-07-22 10:20:53
|
but it isn't just "changing responsibilities"
|
|
2024-07-22 10:21:01
|
it's limiting the agency of website owners
|
|
|
Demiurge
|
|
Traneptora
"Everyone already knows this" and yet many people here disagree with you, including professionals
|
|
2024-07-22 10:21:11
|
Regardless of your position on this, no one can dispute that dynamic web pages are the root cause of performance problems with web servers, and that the first thing to get optimized is reducing the amount of work the server has to do.
|
|
2024-07-22 10:21:18
|
And making things more static
|
|
|
lonjil
|
2024-07-22 10:21:29
|
right now, website operators can *choose* whether to use imagesets or whether to let the server/a CDN go off of the Accept header
|
|
|
Traneptora
|
2024-07-22 10:21:30
|
most of the performance issues with dynamic webpages are due to the number of roundtrips
|
|
|
lonjil
|
2024-07-22 10:21:39
|
You're saying that website owners shouldn't have that choice
|
|
|
Traneptora
|
2024-07-22 10:21:41
|
not due to processing time from the accept header
|
|
2024-07-22 10:22:20
|
why does it feel like every time I have a discussion with you I have to defend accepted industry practice developed by experts to counteract your opinion that everyone should just do things your way instead and everything would be better
|
|
|
Demiurge
|
|
Traneptora
most of the performance issues with dynamic webpages are due to the number of roundtrips
|
|
2024-07-22 10:22:51
|
I'm not talking about latency, I am talking about performance problems due to increased demand and waiting on CPU
|
|
|
Traneptora
|
2024-07-22 10:23:07
|
parsing an accept header is fairly cpu-light
|
|
2024-07-22 10:23:20
|
I think you're vastly overestimating the amount of cpu cycles that content negotiation requires
|
|
2024-07-22 10:23:29
|
pcre is very fast, and you don't even need to use regex to do it
|
|
|
Demiurge
|
2024-07-22 10:23:43
|
Making everything as static as possible is the universal first step to solving those problems
|
|
|
Traneptora
|
2024-07-22 10:23:56
|
your perceived problems*
|
|
|
lonjil
|
2024-07-22 10:23:58
|
which problems though
|
|
|
Demiurge
|
2024-07-22 10:24:23
|
DDoS problems or server can't keep up with demand
|
|
|
Traneptora
|
2024-07-22 10:24:34
|
DDoS is not going to be fixed by avoiding an accept header
|
|
2024-07-22 10:25:02
|
servers that can't keep up with demand aren't going to fix it by ignoring the accept headers
|
|
2024-07-22 10:25:15
|
in fact, they can! they just need to serve one image for one URL
|
|
2024-07-22 10:25:33
|
which is already possible to do
|
|
2024-07-22 10:26:05
|
things that are slow on serverside are things that can't be made static, like database lookups
|
|
2024-07-22 10:26:26
|
for me to log into a website requires the http server to contact its SQL server and compare the received credentials, generate a cryptographic token, and send it over
|
|
2024-07-22 10:26:31
|
that's the kind of stuff that is slow
|
|
2024-07-22 10:26:35
|
accept header parsing is not
|
|
2024-07-22 10:27:33
|
anonymous fetches are not very CPU-intensive even if they have to parse the accept header. as you are aware, most sites don't do a fully compliant parse and just search for specific things like `image/webp`
|
|
|
Demiurge
|
2024-07-22 10:27:56
|
I mean, things aren't absolute either, and there is a point and merit to what you're saying, just like there's also merit in saying that there are downsides to dynamic web pages and that accept headers are kinda stupid.
|
|
|
Traneptora
|
2024-07-22 10:28:21
|
accept headers aren't "kinda stupid"
|
|
2024-07-22 10:28:37
|
there's no merit to saying that anyway, as it's fairly unproductive and means nothing
|
|
|
Demiurge
|
2024-07-22 10:31:53
|
I think <source> is a good alternative to them and most of the time they are just wasted bytes and another source of errors and wasted time in configuration.
|
|
|
w
|
2024-07-22 10:32:19
|
if the page is a server side render, an accept header isn't going to make it not a server side render
|
|
|
Traneptora
|
2024-07-22 10:33:00
|
20 bytes saved on request is entirely negligible
|
|
|
w
|
2024-07-22 10:33:20
|
and if they use a source tag without server side rendering, it's not going to magically disappear
|
|
|
Demiurge
|
2024-07-22 10:34:25
|
There is merit to what you're saying that they provide choice and some benefit when serving up dynamic web pages, but dynamic web pages are something that everyone optimizing for performance wants to avoid in the first place.
|
|
|
w
|
2024-07-22 10:35:10
|
people aren't worrying about performance for server side rendering
|
|
|
Demiurge
|
2024-07-22 10:36:03
|
True, but server side rendering is something that's avoided, cached, and usually protected against spam
|
|
|
w
|
2024-07-22 10:36:24
|
web 2 relies on server side rendering
|
|
2024-07-22 10:36:44
|
people want to avoid client side rendering because that's the slow one
|
|
|
Demiurge
|
2024-07-22 10:37:44
|
What?
|
|
|
Traneptora
|
2024-07-22 10:38:07
|
phones are slower than servers
|
|
2024-07-22 10:38:16
|
any cpu time that can be moved from client to server is good
|
|
|
w
|
2024-07-22 10:38:21
|
wikipedia for example, server side rendering, works without javascript
|
|
2024-07-22 10:38:28
|
all the device has to do is render it once
|
|
|
Traneptora
|
2024-07-22 10:38:46
|
avoiding JS as much as possible is the biggest bonus to perf
|
|
|
Demiurge
|
2024-07-22 10:39:35
|
You don't need javascript to render a basic article with some text and pictures... But they do use some dynamic elements for logged in users.
|
|
|
w
|
2024-07-22 10:39:46
|
YouTube, client side render, constant loading boxes, horrible on mobile and requires the app to be usable
|
|
2024-07-22 10:40:16
|
the dynamic elements you are thinking of take like no processing power
|
|
2024-07-22 10:40:20
|
It's super fast for servers
|
|
2024-07-22 10:40:30
|
this has been solved
|
|
|
Demiurge
|
|
w
YouTube, client side render, constant loading boxes, horrible on mobile and requires the app to be usable
|
|
2024-07-22 10:40:32
|
Jokes on you, their app is unusable too ;)
|
|
|
Traneptora
|
2024-07-22 10:40:52
|
the UI might not be amazing but it's not slow
|
|
|
Demiurge
|
|
w
this has been solved
|
|
2024-07-22 10:41:05
|
And it's only done for logged in users. Regular users are served a cached static page.
|
|
|
w
|
2024-07-22 10:41:14
|
no they aren't
|
|
2024-07-22 10:41:33
|
well you can say everything is cached
|
|
2024-07-22 10:41:36
|
Even for logged in
|
|
2024-07-22 10:41:46
|
But it's not an issue
|
|
|
Traneptora
|
|
Demiurge
And it's only done for logged in users. Regular users are served a cached static page.
|
|
2024-07-22 10:41:48
|
that's definitely not true
|
|
|
w
|
2024-07-22 10:41:58
|
accept header isn't going to make it faster or anything
|
|
|
Traneptora
|
2024-07-22 10:42:05
|
you can find out yourself by loading up an incognito/private window and then shift reloading
|
|
|
Demiurge
|
2024-07-22 10:42:55
|
Really? I haven't examined the software that closely but I assume it does what everyone does and tries to make as many things as static as possible and limit the number of server side calculations it has to do before sending the web page
|
|
|
w
|
2024-07-22 10:43:18
|
why would it need to do that when it's already so fast and takes no processing power
|
|
2024-07-22 10:43:35
|
The only reason for caching is for bandwidth
|
|
2024-07-22 10:44:16
|
I know because I've done it a lot
|
|
|
Demiurge
|
2024-07-22 10:44:49
|
Because of how many people visit wikipedia, even a 5% difference in speed could be pretty useful
|
|
|
w
|
2024-07-22 10:44:58
|
and how would it do that when the content is dynamic
|
|
2024-07-22 10:45:48
|
And having a non-conforming accept header will, if anything, miss the cache
|
|
|
Demiurge
|
2024-07-22 10:46:12
|
Servers that serve dynamic pages usually try to cache and serve static pages whenever they get identical requests. Parsing more headers and junk in the requests means there are fewer matches.
|
|
|
w
|
2024-07-22 10:46:36
|
one of the fields for cache is the entire header
|
|
2024-07-22 10:46:48
|
generally
|
|
2024-07-22 10:46:56
|
at least aws has that
|
|
2024-07-22 10:47:16
|
so if you modify your header it will do the opposite of your goal
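For reference, the standard way a negotiated response keeps shared caches correct is the Vary header, which tells the cache to key on the named request headers rather than on the whole request:
```
HTTP/1.1 200 OK
Content-Type: image/jxl
Vary: Accept
```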
|
|
|
Demiurge
|
2024-07-22 10:47:46
|
The User-Agent header is probably the worst offender
|
|
2024-07-22 10:47:51
|
Way too long and detailed
|
|
|
w
|
2024-07-22 10:47:57
|
That doesn't matter
|
|
2024-07-22 10:48:02
|
It just hashes it
|
|
2024-07-22 10:48:07
|
The entire thing
|
|
|
Demiurge
|
2024-07-22 10:48:27
|
Then it still matters because the more detailed it is the more variants there will be
|
|
|
w
|
2024-07-22 10:48:43
|
yeah and you think only putting one thing isn't unique?
|
|
|
Demiurge
|
2024-07-22 10:48:58
|
It gives details about your CPU and whether you're using X11...
|
|
|
w
|
2024-07-22 10:49:07
|
what are you talking about
|
|
2024-07-22 10:49:10
|
holy shit
|
|
|
Demiurge
|
2024-07-22 10:49:16
|
The User-Agent header
|
|
2024-07-22 10:49:56
|
Which part is confusing you exactly? Do you not know what x11 is?
|
|
2024-07-22 10:50:22
|
I'm just saying that's too much info in the headers...
|
|
|
w
|
2024-07-22 10:50:28
|
As someone who had to deal with this exactly for work, I'll let you know this is not an issue (on AWS)
|
|
|
lonjil
|
2024-07-22 10:50:41
|
Wikipedia does in fact cache pages and fetch already-rendered pages from geographically close servers when you're not logged in.
|
|
|
Demiurge
|
2024-07-22 10:50:49
|
Mostly for fingerprinting/privacy reasons
|
|
|
lonjil
|
2024-07-22 10:51:34
|
And if you're logged in, *most* of the page is pre-rendered, but the response is not from a geographically close server, unless you happen to live in the right place.
|
|
2024-07-22 10:51:53
|
So CPU-wise logged in and logged out are basically identical
|
|
2024-07-22 10:52:07
|
But logged out users get a lower latency response
|
|
|
w
|
2024-07-22 10:52:43
|
if you modify your accept header, you are making yourself more unique
|
|
|
Demiurge
|
2024-07-22 10:53:06
|
It would be good if browsers could agree on common user agent and accept strings just for the sake of less fingerprinting. With the bare minimum of info.
|
|
|
w
|
2024-07-22 10:53:18
|
they already do?
|
|
2024-07-22 10:53:23
|
what is your user agent
|
|
|
Demiurge
|
2024-07-22 10:53:37
|
Like "iPhone" or "Firefox" instead of the whole shebang
|
|
2024-07-22 10:54:02
|
No need for version numbers, cpu type, etc
|
|
|
w
|
2024-07-22 10:54:16
|
they're meant to indicate... the user agent
|
|
|
Demiurge
|
2024-07-22 10:54:41
|
They could be made a lot more generalized
|
|
2024-07-22 10:54:49
|
Look at curl's user agent
|
|
|
w
|
2024-07-22 10:54:49
|
they're already as generalized as they get
|
|
2024-07-22 10:55:18
|
if you are using a normal browser
|
|
|
Demiurge
|
2024-07-22 10:55:29
|
No...? They include a lot of unnecessary details.
|
|
|
w
|
2024-07-22 10:55:34
|
like what
|
|
2024-07-22 10:55:37
|
You're making stuff up
|
|
|
Demiurge
|
2024-07-22 10:55:52
|
???
|
|
2024-07-22 10:56:47
|
Do I have to repeat myself again? Or is there no point because I'm making it up anyhow? Do I need to go find and print out my user agent string for you?
|
|
|
w
|
2024-07-22 10:56:53
|
Yeah sure
|
|
|
Demiurge
|
2024-07-22 10:57:55
|
`Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0`
|
|
|
w
|
2024-07-22 10:58:34
|
X11; Linux x86_64 is the name of the os
|
|
|
Demiurge
|
2024-07-22 10:58:45
|
My OS isn't X11...
|
|
|
w
|
2024-07-22 10:58:53
|
Yeah it's generalizing
|
|
|
Demiurge
|
2024-07-22 10:59:18
|
That's not generalizing, that's being awfully specific about my machine...
|
|
|
w
|
2024-07-22 10:59:25
|
how
|
|
2024-07-22 10:59:32
|
It's a Linux machine; I think that's useful
|
|