JPEG XL

coverage

Post links to articles, blog posts, reddit / hackernews / forum posts, media coverage about or related to JXL here!

5peak
2025-03-12 04:47:24
0.5 $K for the entry price? Just kidding, right?
RaveSteel
2025-03-12 04:52:54
Their tiers lmao
2025-03-12 04:53:02
VcSaJen
_wb_ would this be a fair/correct description of OS support for jxl?
2025-03-12 07:36:08
IMHO, there's not much of a point of talking about manually installable linux packages, that's not "support", that's just how external non-OS software works there. And third-party plugins are not OS itself.
5peak
VcSaJen IMHO, there's not much of a point of talking about manually installable linux packages, that's not "support", that's just how external non-OS software works there. And third-party plugins are not OS itself.
2025-03-12 08:17:44
Definition of the support might be blurred. But the real problem is creating sustainable open source software.
_wb_
2025-03-12 08:22:07
it's not so clear where to draw the line in some linux distros — in many distros, technically almost everything is "optional" and the bare minimum install doesn't include much at all...
2025-03-12 08:28:31
I think the main thing is to be in the default repos of the default package manager so either you'll end up having libjxl installed automatically in many cases because something dragged it in as a dependency (e.g. gimp or imagemagick), or it's easy enough to get it installed (just `apt install jpeg-xl` or whatever).
5peak
2025-03-12 08:47:03
SYN+ACK
CrushedAsian255
2025-03-12 09:16:51
ACK
0xC0000054
RaveSteel Their tiers lmao
2025-03-12 11:47:09
I am wondering if the whole site is a spoof of those types of developer subscription services.
CrushedAsian255
2025-03-13 12:30:22
> By using our free tier, you agree to `console.log` our sponsor message. This requirement is waived for Enterprise customers, giving you complete control over your browser console.

Me:
```js
console.log(response.sponsor)
console.clear()
```
0xC0000054 I am wondering if the whole site is a spoof of those types of developer subscription services.
2025-03-13 12:34:37
i think it is
Traneptora
0xC0000054 I am wondering if the whole site is a spoof of those types of developer subscription services.
2025-03-13 12:55:58
it's absolutely that
Tirr
Huh https://techbookfest.org/product/kPqEF2hKyKBrcAHcKazFK3?productVariantID=fEiVwDjgz1XkYaViaY95At
2025-03-13 03:26:50
looking at the TOC, it seems to start by briefly explaining about coding tools of lossy compression, like DCT, quantization, entropy coding and chroma subsampling
2025-03-13 03:27:39
maybe I'll give it a read
Demiurge
_wb_ would this be a fair/correct description of OS support for jxl?
2025-03-13 07:14:02
Maybe it would be less wordy if it merely mentioned the support being added to the backend libraries gdk-pixbuf/kimageformats used by gnome/kde and distributed by Red Hat, Debian, Arch, BSD, Slack, whatev version 123.45
2025-03-13 07:15:07
That would be the simplest way to say it and more laconic.
2025-03-13 07:17:54
Instead of listing so many different small distros, just focus on "support was quickly added to the backend libraries gdk-pixbuf and kimageformats used by GNOME/KDE, as early as X, as distributed by Red Hat version Y, Debian Z, BSD XYZ..."
Huh https://techbookfest.org/product/kPqEF2hKyKBrcAHcKazFK3?productVariantID=fEiVwDjgz1XkYaViaY95At
2025-03-13 07:20:16
We found JXL-tan
5peak
Demiurge Instead of listing so many different small distros, just focus on "support was quickly added to the backend libraries gdk-pixbuf and kimageformats used by GNOME/KDE, as early as X, as distributed by Red Hat version Y, Debian Z, BSD XYZ..."
2025-03-13 07:53:26
Please do not forget OIIO = https://github.com/AcademySoftwareFoundation/OpenImageIO
VcSaJen
_wb_ it's not so clear where to draw the line in some linux distros — in many distros, technically almost everything is "optional" and the bare minimum install doesn't include much at all...
2025-03-13 07:59:24
Some distros have several "tiers": minimal, full, etc. If jxl pixbuf is installed by default on full but not minimal, and minimal is the main tier, it is still support, just not on all tiers.
CrushedAsian255
Tirr looking at the TOC, it seems to start by briefly explaining about coding tools of lossy compression, like DCT, quantization, entropy coding and chroma subsampling
2025-03-13 08:51:13
Is it only in Japanese or is it English
Demiurge
2025-03-13 09:07:23
I wouldn't spend too much space talking about Linux workstations that are not very common to begin with. It seems kinda desperate to me.
2025-03-13 09:07:48
I would keep the linux support as simple and brief and direct as you can.
2025-03-13 09:08:58
Of course it's nice that kde and gnome have great support sooner than anyone else. But leave it at that, no need to spend too many words on it since it's not a common workstation product
2025-03-13 09:09:30
Good for Steam Deck users
2025-03-13 09:09:42
The Steam Deck comes with it ootb
AccessViolation_
2025-03-13 09:09:49
in which ways does it seem desperate?
Demiurge
2025-03-13 09:10:21
Well, mentioning so many different obscure desktop Linux configurations such as Solus OS which seems to be semi-abandoned
2025-03-13 09:11:55
It seems a little too desperate to list so many different Linux distros instead of keeping it short
AccessViolation_
2025-03-13 09:12:12
I don't get it, desperate for what
Demiurge
2025-03-13 09:12:31
Btw I was surprised to find libjxl is preinstalled on the Steam Deck
2025-03-13 09:13:07
I don't know if most people here are aware but you can add that to the list of commercial products and platforms that ship with jxl support
AccessViolation_
2025-03-13 09:13:48
as I see it, it's just documentation, it doesn't seem desperate for anything, at worst it's going to not apply to a lot of people, but that doesn't mean it's not interesting to read or useful to document
Demiurge
AccessViolation_ I don't get it, desperate for what
2025-03-13 09:16:33
It's not that noteworthy at all to mention if a very obscure Linux distro packages and distributes libjxl
2025-03-13 09:17:57
There are countless very obscure Linux distros that package a very wide, broad scope of software and mentioning each one is not necessary.
jonnyawsom3
Demiurge Btw I was surprised to find libjxl is preinstalled on the Steam Deck
2025-03-13 09:29:32
You reminded me https://github.com/ValveSoftware/gamescope/issues/1525
RaveSteel
2025-03-13 09:51:23
Trust this will be implemented at some point
_wb_
Demiurge There are countless very obscure Linux distros that package a very wide, broad scope of software and mentioning each one is not necessary.
2025-03-13 04:35:45
I wasn't trying to mention all Linux distros, for example there are a lot that are derived from Debian but I didn't mention any of them (except Ubuntu) since it's kind of implicit that they also have a libjxl package. The smaller ones I listed are independent (as far as I understand) so it's somewhat relevant that they have libjxl packages.
Demiurge
2025-03-13 04:49:21
Well, it's just my opinion. Brevity is a virtue, and I think the sentence could be restructured for brevity by focusing on support being added to the underlying libraries in GNOME/KDE, as distributed by major distributions.
_wb_
2025-03-13 07:18:06
Yes, good point to add something about gnome/KDE/qt/gdk-pixbuf/kimageformats
Demiurge
2025-03-14 12:21:31
Yep. Mention the upstream first before the downstream distributors, of which there are too many to list them all.
novomesk
2025-03-14 09:39:44
JXL support in kimageformats is optional. Sometimes packagers build it without the libjxl dependency. For example, recently I used `pkg install kf6-kimageformats` on FreeBSD and only a limited version of kimageformats was installed. However, when I compiled it from the Ports using `cd /usr/ports/graphics/kf6-kimageformats/ && make install clean`, JXL started to work in KDE.
_wb_
2025-03-16 08:22:43
Cursed jxl logo here: https://www.leproton.com/2025/03/le-jpeg-xl-sur-apple-et-windows-11-pour-l-instant.html
2025-03-16 08:23:21
https://image.over-blog.com/LBZze1EdbyMhMqosIycioj5Ec0s=/filters:no_upscale()/image%2F0576035%2F20250309%2Fob_287611_presse-papiers-1.jpg
RaveSteel
2025-03-16 08:24:13
A nice apple green xdd
_wb_
2025-03-16 08:24:15
How does that even happen, I wonder
TheBigBadBoy - 𝙸𝚛
2025-03-16 08:39:35
"Le JPEG XL" feels so weird (and my mother tongue is French)
_wb_
2025-03-16 09:00:03
Le jépègue excellent
couleur
2025-03-16 09:34:49
we pronounce it gipeg
spider-mario
2025-03-16 09:58:43
from saying it so often in English, I say it that way in French too
2025-03-16 09:59:01
djèï pègue excèle
2025-03-16 10:00:33
well, not quite – I think that ‘L’ in “XL” is dark? (https://en.wikipedia.org/wiki/Voiced_dental,_alveolar_and_postalveolar_lateral_approximants#Velarized_or_pharyngealized_alveolar_lateral_approximant)
2025-03-16 10:01:01
which French doesn’t have
2025-03-16 10:01:06
so “excèle” is not quite it
2025-03-16 10:04:31
jonnyawsom3
2025-03-16 10:24:58
<@245794734788837387> new soundboard just dropped, and it's a silky smooth one
username
<@245794734788837387> new soundboard just dropped, and it's a silky smooth one
2025-03-16 10:36:28
took me a bit to understand what you meant lol (I've been awake quite long)
jonnyawsom3
2025-03-16 10:40:26
Same
CrushedAsian255
_wb_ Cursed jxl logo here: https://www.leproton.com/2025/03/le-jpeg-xl-sur-apple-et-windows-11-pour-l-instant.html
2025-03-16 10:57:30
got the colour space wrong maybe?
Meow
CrushedAsian255 got the colour space wrong maybe?
2025-03-17 02:37:53
le JPEG XR
AccessViolation_
2025-03-18 08:57:49
2025-03-18 08:57:50
<@794205442175402004> maybe this is something that could be addressed about the graphic?
2025-03-18 09:00:54
it is implied that what's being shown is how many bytes it takes to encode the gradient. it might be good to specify whether the header or other data not used specifically to encode the visuals is included, and if it is, maybe you could do something along the lines of
> 523 bytes + 12 bytes (header)
also: it might be interesting to show what that gradient would have looked like in JPEG XL without adaptive LF smoothing :3
Demiurge
2025-03-18 09:13:38
I think the least confusing way would be
**535 bytes (12 byte minimal jxl header)**
**700 bytes (150 byte minimal avif header)**
AccessViolation_
2025-03-18 09:18:12
just putting it in parentheses doesn't make it all that clear whether the header is included in the first byte count
Demiurge
2025-03-18 09:19:08
(inc. X bytes minimal Y header)
AccessViolation_
2025-03-18 09:30:08
that works but then people are gonna have to do math when they wanna know how many bytes it took to encode the gradient. this graphic is a visual aid specifically for the section about how adaptive LF smoothing can preserve slow gradients at low quality settings; IMO it doesn't make much sense to include header bytes (or other metadata if applicable) at all. the paper has a different section which specifically talks about the header size
Demiurge
2025-03-18 09:35:13
It's debatable whether including the header size at all is even relevant and expected in the first place
2025-03-18 09:35:48
I think it's more confusing not to put the total file size first.
2025-03-18 09:38:17
X bytes (y payload + z minimal header)
This might be closer to what you like.
X bytes (y-byte minimal header)
This keeps the focus on the total file size, with insight into the header overhead for the curious.
_wb_
2025-03-18 09:54:42
The way I see it, it's the problem of formats with verbose headers that they waste bytes on them, not my problem 🙂
AccessViolation_
2025-03-18 10:07:17
to be honest that does bother me a bit, it feels like that's nearing "intentionally misleading" territory 🫤
CrushedAsian255
2025-03-18 10:14:47
however the thing about the headers is that it only really affects small images
2025-03-18 10:15:19
even if AVIF's header is 1 kB, if it's a 2 MB image that's 0.049% of the filesize
Demiurge
2025-03-18 10:39:57
Wouldn't 1K be exactly 0.05% of 2M?
2025-03-18 10:40:28
JEDEC/IEC units don't exist and should never be used ever
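(The 0.049% vs. 0.05% gap above is purely the choice of unit base; a quick sketch, assuming a 1 kB/KiB header in a 2 MB/MiB file:)

```python
# Header overhead as a fraction of file size, SI vs. binary units.
si_overhead = 1_000 / 2_000_000            # 1 kB header in a 2 MB file
binary_overhead = 1_024 / (2 * 1_024**2)   # 1 KiB header in a 2 MiB file

print(f"{si_overhead:.3%}")      # 0.050%
print(f"{binary_overhead:.3%}")  # 0.049%
```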
AccessViolation_
2025-03-18 10:47:07
no they are based because we should all be using binary instead of decimal in our daily lives anyway :)
_wb_
2025-03-18 11:01:01
It is only misleading if you assume small images do not matter. If you look at histograms of image file sizes on the web, that's not a valid assumption.
CrushedAsian255
2025-03-18 11:03:10
*fine*, even if AVIF's header is 1 K**i**B, if it's a 2 M**i**B image that's 0.049% of the filesize
_wb_ It is only misleading if you assume small images do not matter. If you look at histograms of image file sizes on the web, that's not a valid assumption.
2025-03-18 11:04:09
sure, the web has many small images but I would say small would be around 10-100 KiB, not < 1 KiB
_wb_
2025-03-18 11:05:07
the median web image is 0.058 megapixels: https://almanac.httparchive.org/en/2024/media#image-dimensions
2025-03-18 11:05:48
the median file size is 12 kilobytes: https://almanac.httparchive.org/en/2024/media#byte-sizes
CrushedAsian255
2025-03-18 11:06:20
What size header is AVIF actually again? Was it around 300 bytes?
_wb_
2025-03-18 11:06:26
that is excluding the 6% of images on the web that are just 1 pixel tracking images / spacer images
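(Taking the ~300-byte header figure floated above as a hypothetical, and the 12-kilobyte median from the Web Almanac, the container overhead on a typical web image would be:)

```python
# Container overhead at the median web image size.
# 12 kB median is the 2024 Web Almanac figure quoted above; the
# 300-byte AVIF header is the guess floated in the chat, not measured.
median_image_bytes = 12_000
header_bytes = 300  # assumed

print(f"{header_bytes / median_image_bytes:.1%}")  # 2.5%
```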
CrushedAsian255
2025-03-18 11:06:49
what is the smallest possible 1x1 image?
2025-03-18 11:07:02
I guess it would just be "which image has the smallest header"?
username
CrushedAsian255 I guess it would just be "which image has the smallest header"?
2025-03-18 11:08:41
https://docs.google.com/presentation/d/1LlmUR0Uoh4dgT3DjanLjhlXrk_5W2nJBDqDAMbhe8v8/edit#slide=id.gde87dfbe27_0_43
CrushedAsian255
2025-03-18 11:10:21
what is the point of a tracking image anyway?
2025-03-18 11:10:49
what does it track?
2025-03-18 11:11:27
is it a way to snatch headers without js?
Demiurge
CrushedAsian255 *fine*, even if AVIF's header is 1 K**i**B, if its a 2 M**i**B image that's 0.049% of the filesize
2025-03-18 11:40:06
Only SI multiples of 1000 are a valid way to count
2025-03-18 11:41:32
1024 nonsense is the dumbest mistake of computers. Dumber than null terminated strings.
2025-03-18 11:42:07
Dumber than Objective C
2025-03-18 11:42:28
Dumber than GNU/Linux
_wb_
CrushedAsian255 what is the point of a tracking image anyway?
2025-03-18 11:42:54
usually it's basically to let some third party analytics thing get logs on your traffic, including e.g. things like html email where adding a tracking pixel is a simple way to get an idea of how many times it gets read
Demiurge
2025-03-18 11:43:05
Kilo always =1000
_wb_
2025-03-18 11:45:34
yes, kilo should always be 1000. It does make sense to design stuff like cpu cache lines etc in multiples of 1024 though (or whatever other power of two) since many things are more convenient that way.
spider-mario
Demiurge JEDEC/IEC units don't exist and should never be used ever
2025-03-18 11:57:28
they should be used with their correct prefixes “I bought a 500GB [SI] hard drive to complement my 4 GiB [IEC] of RAM”
2025-03-18 11:57:34
“that’s about 465.7 GiB”
2025-03-18 11:58:49
kB = 1000B
KiB = 1024B
KB = abomination
Demiurge
2025-03-18 12:00:00
There's almost never a good reason to ever use kiB instead of kB. Especially for file sizes, it should never ever be used
spider-mario
2025-03-18 12:00:20
(KiB, uppercase)
Demiurge
2025-03-18 12:00:22
File sizes are properly expressed in bytes. Not in some bizarre multiple
AccessViolation_
Demiurge There's almost never a good reason to ever use kiB instead of kB. Especially for file sizes, it should never ever be used
2025-03-18 02:57:06
i raise you: https://www.youtube.com/watch?v=rDDaEVcwIJM
Demiurge
2025-03-18 03:03:04
Eh. Technically the best radix is the natural base, *e*
2025-03-18 03:03:13
2.7...
2025-03-18 03:03:32
So ternary is actually closer to ideal than binary
AccessViolation_
_wb_ It is only misleading if you assume small images do not matter. If you look at histograms of image file sizes on the web, that's not a valid assumption.
2025-03-18 03:07:54
I think I misinterpreted your intent. it would be intentionally misleading if you were counting on the idea that people would think those bytes go towards the encoding of the gradient directly. if your intent was to show how large those are as actual image files then it's not intentionally misleading :)
Demiurge Eh. Technically the best radix is the natural base, *e*
2025-03-18 03:08:47
chapter 1 of this video goes into how this is a misconception, if you're interested
2025-03-18 03:10:06
I will also say that there are many more subjective and objective benchmarks in addition to radix economy, and the authors found binary ticks the most of these for any base, making it the overall best
_wb_
AccessViolation_ I think I misinterpreted your intent. it would be intentionally misleading if you were counting on the idea that people would think those bytes go towards the encoding of the gradient directly. if your intent was to show how large those are as actual image files then it's not intentionally misleading :)
2025-03-18 03:19:41
If the implicit assumption is that the gradient would be a part of a larger image, then comparing at same payload size (ignoring headers) would be the more correct way to do it, but if you're just comparing that particular small image of a gradient, then comparing at the same file size is not wrong. Either way: if you'd make a comparison at same payload size, excluding headers, the result would not be hugely different: jxl would still look better than the rest, the difference would just be a little smaller. So in that sense I don't think it's misleading: it's not like the avif gradient will look better than the jxl one if the avif header overhead is ignored...
Demiurge
AccessViolation_ I will also say that there are many more subjective and objective benchmarks in addition to radix economy, and the authors found binary ticks the most of these for any base, making it the overall best
2025-03-18 03:29:51
I see... But the Soviet Union found ternary computers to be ridiculously more efficient than the counter-revolutionary plot by the bourgeoisie American capitalist pigs to promote binary
2025-03-18 03:32:21
This video is clearly another one of their schemes to keep the working class divided
jonnyawsom3
2025-03-18 03:36:59
Perhaps <#806898911091753051> now that the Soviet Union is involved? xD
Demiurge
AccessViolation_ chapter 1 of this video goes into how this is a misconception, if you're interested
2025-03-18 04:06:50
I took a closer look at that section. They are actually cheating by saying the first digit carries no information since the first digit is always one.
2025-03-18 04:07:57
If the first digit is always one, then this only applies for numbers of the same magnitude. Not for numbers of different digit lengths
2025-03-18 04:08:13
You can't just cheat by getting a free digit
2025-03-18 04:10:03
If it carries no information and is always the same value then you can't compare it and count it in the comparison because otherwise the comparison is only valid with numbers of the same magnitude/number of digits
2025-03-18 04:11:23
Then you're right back where you started anyways, with leading zeroes
AccessViolation_
Demiurge If the first digit is always one, then this only applies for numbers of the same magnitude. Not for numbers of different digit lengths
2025-03-18 04:15:54
how do you mean?
2025-03-18 04:16:04
wait let's move to off topic
A homosapien
spider-mario
2025-03-20 12:23:17
Transcribed & Translated "Bienvenue sur ce serveur Discord dédié à Jpeg XL." "Welcome to this Discord server dedicated to Jpeg XL."
jonnyawsom3
2025-03-20 12:56:20
It's a single line mention of JPEG XL, but I stumbled across this <https://assets.publishing.service.gov.uk/media/67d1abd1a005e6f9841a1d94/Final_decision_report1.pdf#page=153>
2025-03-20 12:56:40
spider-mario
A homosapien Transcribed & Translated "Bienvenue sur ce serveur Discord dédié à Jpeg XL." "Welcome to this Discord server dedicated to Jpeg XL."
2025-03-20 08:28:41
(small correction: dédié à JPEG XL)
A homosapien
2025-03-20 08:33:37
I got pretty close for somebody who doesn't speak French 😅
lonjil
Demiurge Eh. Technically the best radix is the natural base, *e*
2025-03-21 03:44:24
so you're saying we should write the number 3 as 100200112000001...[it goes on forever]?
DZgas Ж
2025-03-21 05:50:26
No mention of Jpeg xl <:PepeSad:815718285877444619>
RaveSteel
2025-03-21 05:57:30
Make an edit😎
Demiurge
lonjil so you're saying we should write the number 3 as 100200112000001...[it goes on forever]?
2025-03-22 03:43:37
Hmm, depends on how much precision you want I suppose 😂
2025-03-22 03:45:31
The most efficient integer base would be 3 since it's closer to *e* than 2... This applies to telephone driven menu systems for example. If you want to waste the least amount of time listening to the options, give 3 choices for each level of the menu.
2025-03-22 03:46:25
You'll waste less time getting to the desired choice compared to a binary tree
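(The menu argument is the usual radix-economy model; a rough sketch, assuming the cost of reaching one of n leaves is the b options you listen to per level times the log_b(n) levels:)

```python
import math

# Radix economy: cost of distinguishing n outcomes in base b,
# modeled as b * log_b(n) -- e.g. a phone menu where you listen to
# b options at each of log_b(n) levels. The continuous optimum is b = e.
def menu_cost(b: int, n: int) -> float:
    return b * math.log(n, b)

n = 1_000_000
for b in (2, 3, 4):
    print(b, round(menu_cost(b, n), 2))

# Ternary edges out binary: 3/ln 3 ~ 2.731 < 2/ln 2 ~ 2.885 per unit of ln n
```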
KKT
2025-03-25 05:20:07
https://freedium.cfd/https://medium.com/@arnoldgunter/your-website-needs-this-image-format-in-2025-not-png-jpeg-740978a2526f
_wb_
2025-03-25 05:25:58
https://discord.com/channels/794206087879852103/803574970180829194/1354054294533308427
AccessViolation_
KKT https://freedium.cfd/https://medium.com/@arnoldgunter/your-website-needs-this-image-format-in-2025-not-png-jpeg-740978a2526f
2025-03-25 05:45:47
new slop article just dropped
> Basically, it's like going from a flip phone camera to a DSLR — without slowing down your website.
- unnecessary analogy to explain the concept of "better"
> Let me prove you wrong. There's an image format called AVIF that outperforms all the others by 10 times — faster, smaller, and sharper!
- it doesn't
> AVIF stores high-quality images at much smaller file sizes than JPEG or PNG
- it doesn't, this is only true for JPEG, not PNG
> since the ྖs, AV1
- since the what
Quackdoc
2025-03-25 05:49:19
this article is horrid lol
KKT
2025-03-25 06:08:45
It is terrible. Like most Medium posts.
spider-mario
2025-03-25 06:27:12
> And they thought, "Hey, if we can make videos better, why not images too?"
they thought wrong
AccessViolation_
2025-03-25 06:47:22
that weird symbol seems to return pages that include '1990' when searched. I'm not sure AV1 has been around for quite that long
_wb_
2025-03-25 06:47:33
Why do these numbers feel made-up?
2025-03-25 06:51:43
If those numbers are close to something real, then it cannot be a photographic image, since then the PNG would be much larger compared to lossy formats with SSIM values that low
2025-03-25 06:52:03
But I suspect they're just completely made up
2025-03-25 06:56:18
On photographic images, JPEG is actually slightly better than WebP in most of the quality range — you need to go down to ridiculously low quality for WebP to beat JPEG, since at some point JPEG becomes a blocky mess while WebP is a more acceptable blurry mess.
AccessViolation_
2025-03-25 06:56:59
huh, genuinely surprised JPEG beats WebP like at all
_wb_
2025-03-25 06:57:56
I mean good jpeg encoders like jpegli and mozjpeg
2025-03-25 06:58:16
Unoptimized libjpeg-turbo doesn't beat webp
2025-03-25 07:01:08
AVIF does beat JPEG by 15% or so on average, more for non-photographic images or photos with lots of high-contrast edges. On some images though, it doesn't beat JPEG and performs slightly worse.
2025-03-25 07:05:54
For example on that bottom image, AVIF is actually 30% larger than JPEG at 1 JND
Laserhosen
AccessViolation_ that weird symbol seems to return pages that include '1990' when searched. I'm not sure AV1 has been around for quite that long
2025-03-25 07:30:24
> U+0F96 Tibetan Subjoined Letter Cha
> "since the Chas, AV1"
Hope that clears it up.
Demiurge
_wb_ But I suspect they're just completely made up
2025-03-25 09:09:34
They aren't consciously made up
2025-03-25 09:10:39
Someone made an LLM to write an article about this and the LLM generated those numbers based on marketing materials like "avif is 40% the size of JPEG and higher quality!"
2025-03-25 09:11:33
It's not an act of conscious deception, just your standard issue AI generated slop
2025-03-25 09:14:49
1 JND = the threshold of visual transparency? In a side by side test?
2025-03-25 09:15:03
Or flicker test?
_wb_
2025-03-25 09:27:19
In this test, 1 JND is in a pairwise comparison with in-place switching between orig and distorted, so you see (A vs B), you can switch between that view and (orig vs orig) as many times as you want but no faster than 2 Hz, and have to select which is the most distorted, A or B.
2025-03-25 09:27:42
(that's called the PTC protocol)
2025-03-25 09:30:49
But the BTC protocol was also used, which is similar but with 10 Hz flickering, 2x zoom and a 2x artifact amplification (per-pixel sample differences between orig and distorted are artificially made larger). This allows getting precise results below 1 JND, since it makes more artifacts noticeable. But the scale is based on PTC conditions, not BTC.
2025-03-25 09:31:09
PTC = plain triplet comparison BTC = boosted triplet comparison
CrushedAsian255
2025-03-25 09:32:18
So how many PTC is one BTC?
2025-03-25 09:33:02
All I know is 1 BTC is 88047.85 USD
2025-03-25 09:33:20
Or are they not units specifically
_wb_
2025-03-25 09:33:34
It depends on the overall distortion, but roughly 1 JND in BTC conditions is 0.5 JND or so in PTC conditions.
CrushedAsian255
2025-03-25 09:34:16
so around 2:1 on avg?
_wb_
2025-03-25 09:34:39
Yeah but the gain is larger at low distortion and lower at high distortion
Demiurge
2025-03-25 09:35:10
I don't see how limiting the speed of switching between them would matter for the test at all. Best is to avoid flickering altogether and just have 3 images side by side, the first image is always the original and the last 2 are shuffled so you have to choose which image does not match the others.
_wb_
2025-03-25 09:35:31
And it also depends on the type of boosting that was done, e.g. in the HDR experiment we didn't do artifact amplification, only flicker and zoom.
2025-03-25 09:36:15
Doing in-place switching makes artifacts a lot easier to detect than if your eyes need to travel
CrushedAsian255
Demiurge I don't see how limiting the speed of switching between them would matter for the test at all. Best is to avoid flickering altogether and just have 3 images side by side, the first image is always the original and the last 2 are shuffled so you have to choose which image does not match the others.
2025-03-25 09:36:39
Flicker makes it easier for a human to distinguish
_wb_
2025-03-25 09:37:06
Demiurge
2025-03-25 09:37:31
The test would be too easy with flicker.
_wb_
2025-03-25 09:38:58
Easy is good, the goal is to measure quality, not to punish the test subjects 🙂
Demiurge
2025-03-25 09:42:06
Well if you can't tell which one is more distorted or which image doesn't match when you have 3 of them side by side, then it's probably good quality!
CrushedAsian255
2025-03-25 09:42:57
Not for production workflows!
Demiurge
2025-03-25 09:43:10
Basically you would have 3 identical views side by side, the first one would always be the original, the last two would be shuffled, and you have to guess which one is the bad one. If you can't guess then it's not bad!
2025-03-25 09:43:26
"One of these things is not like the others" is easy
CrushedAsian255
2025-03-25 09:43:28
You need a quality score that doesn’t end at “humans can’t see the difference”
Demiurge
CrushedAsian255 You need a quality score that doesn’t end at “humans can’t see the difference”
2025-03-25 09:44:48
Then you can have 2 views where one side is the original and the other side is the flicker view, like what _wb_ described. That's the flicker test which is easier.
_wb_
2025-03-25 09:44:51
The problem with side-by-side protocols for still images is that they cannot provide a fine grained quality scale, especially not in the range above say libjpeg q50. Basically anything above q50 will be noisy data with confidence intervals that largely overlap.
Demiurge
2025-03-25 09:45:08
That's only a slight modification
2025-03-25 09:45:36
Instead of 3 views you have 2 views where the last two shuffled images are combined into one flicker view.
2025-03-25 09:46:10
A test like that would be trivially easy for all but the most indistinguishably transparent algorithms
_wb_
2025-03-25 09:46:23
Many variations are possible
Demiurge
2025-03-25 09:49:44
If you want the 3 view method I described to be even harder, for judging images that are lower quality, you can make it harder by resizing the images smaller. Or start fading the image out with alpha transparency on a white background, so errors and distortion are less significant.
2025-03-25 09:50:09
As long as you do the exact same thing to all 3 images.
_wb_
2025-03-25 09:51:16
But median test subjects tend to be less good at detecting artifacts than professionals, that's also something to take into account. 1 JND is a difference that can be noticed by half of the observers and not by the other half, or in a forced choice pairwise comparison, you have 75% selecting the distorted image and 25% selecting the original image. So we call it "visually lossless" at 1 JND or below, but actually by definition half of the population can actually see a visual difference.
2025-03-25 09:52:07
I am always surprised at how low quality 1 JND actually is, even in testing conditions that make it easy to notice differences.
Demiurge
2025-03-25 09:52:21
Why white? It makes all parts of the image evenly harder to see, instead of putting darker areas at a greater disadvantage
_wb_ I am always surprised at how low quality 1 JND actually is, even in testing conditions that make it easy to notice differences.
2025-03-25 09:54:56
Hmm, yeah that's a problem. The test methods I am proposing assume the test subject knows what differences to look for.
_wb_
2025-03-25 09:55:04
You don't need to make it harder, standard bt.500 test protocols like DSIS already work well enough if you're interested in very low to medium quality images
Demiurge
_wb_ You don't need to make it harder, standard bt.500 test protocols like DSIS already work well enough if you're interested in very low to medium quality images
2025-03-25 09:56:32
You might need to make it harder for yourself if you are tuning a codec by yourself and want to give yourself a disadvantage, since you know what to look for and are better at spotting differences than the median observer.
_wb_
2025-03-25 09:57:03
The challenge is to test in the high fidelity range while being able to use naive test subjects — paying professional photographers or other eagle eyes to do quality assessment tasks is quite expensive 🙂
Demiurge
2025-03-25 09:57:19
Double blind testing is sacred
_wb_
2025-03-25 09:57:44
Also naive test subjects are more representative for the general population so more relevant for most use cases
Demiurge
_wb_ Also naive test subjects are more representative for the general population so more relevant for most use cases
2025-03-25 09:58:38
But it's not good enough to build a product that caters to the utterly clueless.
2025-03-25 09:59:03
Setting standards so low is not enough to be competitive. A good product satisfies professionals too.
2025-03-25 09:59:45
You never know who's going to depend on your technology.
_wb_
2025-03-25 10:00:13
Which is why we are using the AIC-3 methodology which can go below 1 JND
Demiurge
2025-03-25 10:02:48
Like for example, people with medical or astronomical sensors or even cameras shooting RAW, they need a codec that doesn't only perform well perceptually at a single predefined intensity scale, but rather, something that performs equally well at different scales when they adjust the levels and gamma curve.
_wb_
2025-03-25 10:03:07
It took me a couple of years to get the rest of the JPEG committee to understand that "below 1 JND" is actually a very relevant quality range. They understand it now, but it took time to get the AIC-3 project going
Demiurge Like for example, people with medical or astronomical sensors or even cameras shooting RAW, they need a codec that doesn't only perform well perceptually at a single predefined intensity scale, but rather, something that performs equally well at different scales when they adjust the levels and gamma curve.
2025-03-25 10:03:31
Yeah or just normal photographers who want to keep some room for postproduction
Demiurge
_wb_ It took me a couple of years to get the rest of the JPEG committee to understand that "below 1 JND" is actually a very relevant quality range. They understand it now, but it took time to get the AIC-3 project going
2025-03-25 10:04:59
So in other words you convinced them that there actually is demand for high fidelity, by pointing out, for example, that the average JPEG bitrate on the internet is surprisingly high.
_wb_ Yeah or just normal photographers who want to keep some room for postproduction
2025-03-25 10:06:49
I think libjxl has a lot of room for improvement when it comes to making it simpler for people who need gamma-scalable compression
2025-03-25 10:07:03
Idk if you would call it that.
2025-03-25 10:07:14
But you know what I mean
2025-03-25 10:07:32
Like the recent question from the one with astronomical TIFF
2025-03-25 10:07:59
In that situation perceptual coding is a clumsy tool
2025-03-25 10:08:28
There should be an obvious "scalable mode" for those users
_wb_
2025-03-25 10:08:37
Maybe we should add some option to indicate a range of max nits that has to be taken into account, not just a single value. It's a subtle difference from using the highest max nits value as the intensity target
Demiurge
2025-03-25 10:09:21
There's an option called "disable perceptual" but idk if that's the same thing
2025-03-25 10:09:38
idk what that actually does in libjxl
2025-03-25 10:10:17
Is there a mode that performs equal at high and low intensity scaling?
2025-03-25 10:10:54
Like a "logarithmic quantization" mode or whatever you would name that type of idea
2025-03-25 10:11:32
Floating point quantization behaves like this. The larger the number the larger the error.
2025-03-25 10:12:10
Human eyes also. Less sensitive to large changes at high intensity, more sensitive to small changes at low intensity.
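As an aside, the floating-point behaviour described above is easy to check: Python's `math.ulp` gives the spacing between adjacent representable doubles, which grows with magnitude, so absolute quantization error grows while relative error stays flat, much like a logarithmic quantizer. A minimal illustration (the sample values are arbitrary):

```python
import math

# The "unit in the last place" (spacing between adjacent doubles) grows
# proportionally with magnitude: absolute error is larger for larger
# values, while *relative* error stays roughly constant.
for v in (1.0, 256.0, 65536.0):
    print(f"ulp({v:>8}) = {math.ulp(v)}")

assert math.ulp(256.0) > math.ulp(1.0)
assert math.ulp(65536.0) > math.ulp(256.0)
# Relative step size is identical across scales (exact for powers of two):
assert math.ulp(256.0) / 256.0 == math.ulp(1.0) / 1.0
```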
_wb_
2025-03-25 10:18:36
Disable perceptual is what you may want to do if the data is not visual at all. It is supposed to just optimize for PSNR. But that's not really what you want in use cases where the image is visual but it may be viewed at (or postprocessed to) different brightness levels...
Demiurge
2025-03-25 10:20:16
optimize_psnr would be a clearer and more accurate name. Maybe even a new option like x264's "tune=" knob
_wb_ Disable perceptual is what you may want to do if the data is not visual at all. It is supposed to just optimize for PSNR. But that's not really what you want in use cases where the image is visual but it may be viewed at (or postprocessed to) different brightness levels...
2025-03-25 10:23:32
What would be better than PSNR for that?
2025-03-25 11:19:20
I think psnr should scale at high and low intensities, right?
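For what it's worth, PSNR computed with the peak scaled alongside the signal is invariant under a pure linear gain, but not under a gamma or tone-curve adjustment, which redistributes the errors. A small sketch (the pixel values and gamma exponent are made up):

```python
import math

def psnr(ref, test, peak):
    """Peak signal-to-noise ratio in dB."""
    mse = sum((a - b) ** 2 for a, b in zip(ref, test)) / len(ref)
    return 10 * math.log10(peak * peak / mse)

ref  = [0.20, 0.45, 0.60, 0.80]
test = [0.21, 0.44, 0.62, 0.79]

base = psnr(ref, test, peak=1.0)

# Linear gain: errors and the peak scale together, so PSNR is unchanged.
g = 4.0
gained = psnr([g * v for v in ref], [g * v for v in test], peak=g)
assert abs(base - gained) < 1e-9

# Gamma adjustment: a nonlinear tone curve redistributes the errors,
# so PSNR measured after the curve no longer matches.
gamma = psnr([v ** 0.45 for v in ref], [v ** 0.45 for v in test], peak=1.0)
assert abs(base - gamma) > 0.1
```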
jonnyawsom3
AccessViolation_ huh, genuinely surprised JPEG beats WebP like at all
2025-03-26 01:22:42
As Jon said, sometimes it actually beats AVIF <https://giannirosato.com/blog/post/jpegli-xyb/>
2025-03-26 01:26:54
Should probably make cjpegli use 444 by default to avoid the drop-off, and give it JXL transcoding compatibility
A homosapien
2025-03-26 01:28:00
It's funny because the API has different defaults compared to the CLI
Mine18
2025-03-26 01:28:32
2025-03-26 01:28:41
<@862810094734934036>
jonnyawsom3
_wb_ Maybe we should add some option to indicate a range of max nits that has to be taken into account, not just a single value. It's a subtle difference from using the highest max nits value as the intensity target
2025-03-26 01:35:18
I mentioned it before, but having an option like `--intensity_target` that only influences the encoder, not the set brightness in the resulting JXL, would be a decent workaround until something more thorough can be done
Quackdoc
2025-03-26 01:49:31
it seems like it would be nifty
damian101
Should probably make cjpegli use 444 by default to avoid the drop-off, and give it JXL transcoding compatibility
2025-03-26 01:52:28
I think it should just scale in linear light for chroma subsampling...
2025-03-26 01:52:54
like the sharpyuv option does for libwebp and libavif
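For reference, a quick sketch of why averaging in linear light matters when downsampling (the sharpyuv idea); the conversion functions are the standard sRGB transfer curves per IEC 61966-2-1, and the pixel values are made up:

```python
def srgb_to_linear(v):
    # sRGB EOTF (IEC 61966-2-1).
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def linear_to_srgb(v):
    return 12.92 * v if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

# Downsampling is averaging neighbours. Average a black and a white pixel:
naive  = (0.0 + 1.0) / 2                      # averaged in gamma space: 0.5
linear = linear_to_srgb((srgb_to_linear(0.0) + srgb_to_linear(1.0)) / 2)

print(f"gamma-space average:  {naive:.3f}")
print(f"linear-light average: {linear:.3f}")  # noticeably brighter
assert linear > naive + 0.2
```

Averaging in gamma space darkens high-contrast edges, which is part of the drop-off subsampling causes.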
Quackdoc
2025-03-26 01:54:00
does it not already?
jonnyawsom3
2025-03-26 02:16:28
Before I forget again https://github.com/libjxl/libjxl/issues/4163
Meow
As Jon said, sometimes it actually beats AVIF <https://giannirosato.com/blog/post/jpegli-xyb/>
2025-03-26 02:16:31
When using XYB
jonnyawsom3
2025-03-26 02:16:36
Yeah
Meow
2025-03-26 02:17:20
I tested Jpegli + XYB before and the file size actually decreased further more significantly
Demiurge
2025-03-26 04:01:57
It's surprising that the cjpegli bug still remains and it still doesn't use 444 by default in XYB mode, when it's a one-liner fix
2025-03-26 04:02:26
I think the problem is that it's not clear where the decision making happens because of the confusing layout of the spaghetti code
2025-03-26 04:03:12
But still it's not hard to find with basic ctrl+f skills
2025-03-26 04:03:58
It kinda boggles my mind sometimes, libjxl
2025-03-26 04:06:15
I wish the official reference software for the ultimate new format, wasn't such a mess...
2025-03-26 04:07:20
Not to undermine my appreciation and gratitude that it exists at all.
HCrikki
2025-03-26 05:10:09
https://old.reddit.com/r/Android/comments/1jjq3cr/when_will_google_stop_holding_back_innovation_and/ Clueless people who blindly trusted the online dark propaganda to suppress and slow adoption of jxl really want the world stuck on decades old JPEG for more decades...
2025-03-26 05:13:35
Odd that it's not well known that Apple ecosystem support isn't just iOS or DNG. According to napkin math, and thanks to Apple's long-term support for devices, it's now closer to *almost 90% of all Apple devices* in active use that have successfully connected to the App Store at least once in the last 12 months. That includes iPadOS, macOS, visionOS, and *not just Safari*: every app on those OSes can read, use and generate jxl without needing to add its own library
2025-03-26 05:16:26
According to caniuse, almost 43% of all **mobile** devices actively used in the US can decode jxl (the UK, Canada, Japan and Germany trail closely; all big markets whose digital sectors and datacenters need more efficiency, so if your content is local or accessible only through your own apps, you would gain from earlier adoption). Focusing on the total % overlooks the fact that an implementation in Chromium would gain it an extra 65%
Quackdoc
HCrikki https://old.reddit.com/r/Android/comments/1jjq3cr/when_will_google_stop_holding_back_innovation_and/ Clueless people who blindly trusted the online dark propaganda to suppress and slow adoption of jxl really want the world stuck on decades old JPEG for more decades...
2025-03-26 05:24:40
wait for jxl-rs
2025-03-26 05:24:48
realistically I dont see much happening until it does come
HCrikki
2025-03-26 05:25:29
why wait? like what does it change to the discussion compared to what we already got ?
2025-03-26 05:26:36
is there an ambassador committee working behind the scenes to support flawless integrations into big business' workflows and web services?
Quackdoc
2025-03-26 05:26:57
well to begin with, I doubt the AOSP or Chromium teams will actually care until jxl-rs releases, because c++ is a security issue
Meow
2025-03-26 05:27:25
jxl-rs is the saviour library for the entire world of image formats 🤔
HCrikki
2025-03-26 05:28:15
c++ was always an excuse. its just they had no idea a rewrite could actually be completed quick and destroy even more of the excuses before they could prepare new ones
Quackdoc
2025-03-26 05:28:18
well mozilla has already said they would implement jxl support when jxl-rs lands, aosp is pushing hard for rust right now and is in a mad dash to rustify everything they possibly can, as for chromium, wait until the pressure builds up, but that wont happen until jxl-rs
HCrikki c++ was always an excuse. its just they had no idea a rewrite could actually be completed quick and destroy even more of the excuses before they could prepare new ones
2025-03-26 05:28:42
this is not true, like it or not, history has proven without a doubt that C/C++ **are** security issues
2025-03-26 05:29:11
and images have historically been a massive issue for security
2025-03-26 05:29:16
that and videos
HCrikki
2025-03-26 05:29:46
code aside, they could always have given vocal support and flattering blog posts in support - jxl is a format, not specific libraries
Quackdoc
2025-03-26 05:30:47
why bother with that?
HCrikki
2025-03-26 05:31:45
its the silence and innuendos that cause the rest of the imaging ecosystem to stick with what they got. like, 'maybe they know something we dont - adopting could be bad'
2025-03-26 05:32:15
myths should be debunked
Quackdoc
2025-03-26 05:32:51
I dont think so, I mean, look at what is already happening, apple implement jxl support and its been working fine, mozilla will be implementing jxl soon, windows and osx have it, linux has it depending on distro and config ofc
2025-03-26 05:33:08
right now the bottleneck is jxl-rs
Meow
Quackdoc I dont think so, I mean, look at what is already happening, apple implement jxl support and its been working fine, mozilla will be implementing jxl soon, windows and osx have it, linux has it depending on distro and config ofc
2025-03-26 05:39:10
Not really working fine. Opening JXL is significantly slower on macOS (not that f*cking "osx")
2025-03-26 05:39:57
Using "osx" is as horrible as using "JPEG-XL"
spider-mario
2025-03-26 08:34:39
> The iPhone only supports it because of DNG. riiight
2025-03-26 08:34:59
which is why they went to all the trouble of implementing non-DNG jxl support throughout their OS and browser
Cacodemon345
2025-03-26 08:49:10
Most people aren't waiting for JXL IMO; they're instead waiting for the day they can upload HEIC/HEIF images to social media instead.
CrushedAsian255
spider-mario which is why they went to all the trouble of implementing non-DNG jxl support throughout their OS and browser
2025-03-26 09:15:39
And throughout the entire ecosystem
HCrikki
2025-03-26 09:18:32
more efficient storage of photos on their phones is a pretty big concern id say. thats why derivatives of bloated jpeg dont exactly thrill (including hdr at no extra filesize isnt a win when jxl can also include hdr, as either true hdr or a gainmap, and still be half as small with higher visual quality)
2025-03-26 09:19:41
people cant keep getting even larger capacity mobiles and ever-bigger sdcards to compensate
CrushedAsian255
HCrikki people cant keep getting even larger capacity mobiles and ever-bigger sdcards to compensate
2025-03-26 09:20:25
Like how almost all 4K content is HEVC/AV1 ?
Meow
Cacodemon345 Most people aren't waiting for JXL IMO; they're instead waiting for the day they can upload HEIC/HEIF images to social media instead.
2025-03-26 10:21:03
People can't wait for something they don't even know
Demiurge
Meow Not really working fine. Opening JXL is significantly slower on macOS (not that f*cking "osx")
2025-03-26 10:52:26
jxl-rs is going to be mostly SIMD code
2025-03-26 10:53:00
If it's faster than libjxl then maybe apple will upgrade to it later
2025-03-26 10:53:28
Along with everyone else... libjxl might be left in the dust before long
2025-03-26 10:54:04
Which is not a bad thing. It was due for a major rewrite/refactor anyways...
Cacodemon345 Most people aren't waiting for JXL IMO; they're instead waiting for the day they can upload HEIC/HEIF images to social media instead.
2025-03-26 10:56:21
heic is such a meme format... I hope apple's camera app starts capturing jxl soon instead. I notice apple heic is much larger than an equivalent jxl.
Cacodemon345
Demiurge Along with everyone else... libjxl might be left in the dust before long
2025-03-26 10:57:06
libjxl should reach 1.0 first; projects are still waiting for a fully-stable 1.0 release.
Demiurge
Cacodemon345 libjxl should reach 1.0 first; projects are still waiting for a fully-stable 1.0 release.
2025-03-26 10:58:59
Eh, it might not ever? And it would be perhaps less effort at this point to rewrite from scratch compared to cleaning up all the messiness.
2025-03-26 10:59:51
It's the messiest and most disorganized codec library I have ever seen honestly
2025-03-26 11:00:26
Compared to the libraries for other audio and image codecs
Cacodemon345
Demiurge Eh, it might not ever? And it would be perhaps less effort at this point to rewrite from scratch compared to cleaning up all the messiness.
2025-03-26 11:01:01
It NEEDs to at the very least; projects will just move on from this JPEG standard as well like JXR if it takes way too long to release 1.0.
Demiurge
2025-03-26 11:01:10
Not counting motion video. Haven't taken a look at those
Cacodemon345 It NEEDs to at the very least; projects will just move on from this JPEG standard as well like JXR if it takes way too long to release 1.0.
2025-03-26 11:02:38
We don't necessarily need a libjxl 1.0. A jxl-rs 1.0 would be almost the same thing. Or a libjxl-ng fork/rewrite
Cacodemon345
2025-03-26 11:02:49
At the very least this image format should not suffer the same fate that befell JPEG 2000 and JPEG XR.
2025-03-26 11:03:05
Even lossless JPEG support is not exactly guaranteed.
Demiurge
2025-03-26 11:03:08
Lots of people abandon the reference encoder. No one uses the MP3 reference encoder.
2025-03-26 11:04:33
Agreed, but if something shows up that completely outclasses the reference library in every way, people typically just abandon the reference library. Like people abandoning libjpeg for libjpeg-turbo
2025-03-26 11:05:44
As long as a very strong replacement shows up.
2025-03-26 11:06:01
If it's strong enough, the reference library eventually fades into irrelevance.
jonnyawsom3
Demiurge We don't necessarily need a libjxl 1.0. A jxl-rs 1.0 would be almost the same thing. Or a libjxl-ng fork/rewrite
2025-03-26 11:20:37
A decoder only rust implementation isn't 'almost the same thing'
Demiurge
2025-03-26 11:26:20
libjxl is not easy to compile with all of its dependencies and lack of separation in the build system. The rust rewrite might actually be more convenient to compile, let alone contribute to, simply by virtue of being written simpler from scratch.
2025-03-26 11:27:02
And you also forget that writing a decoder is a lot more work than an encoder.
2025-03-26 11:28:04
It's not guaranteed that jxl-rs will remain "decoder-only" for long
2025-03-26 11:29:11
It makes sense to focus on a complete, compliant decoder first.
2025-03-26 11:31:48
A decoder has to be complete and thorough and flexible enough to handle all possible valid jxl files, whereas an encoder gets to define its own rules and complexity.
spider-mario
Demiurge And you also forget that writing a decoder is a lot more work than an encoder.
2025-03-26 11:57:08
writing “an” encoder, sure; writing a good one is another question
2025-03-26 11:57:16
in a sense, writing a decoder is more “mechanical”
2025-03-26 11:57:26
whereas an encoder is more of an open-ended problem
Demiurge
2025-03-26 12:01:43
libjxl includes multiple different encoders, you could say. The lossy modular+dct encoder is merely "okayish" compared to what the format is capable of. The fast-lossless encoder is phenomenal. And the regular lossless encoder is also "decent but lacking many specific special case optimizations"
2025-03-26 12:02:23
Actually you could say the regular lossless encoder is very inefficient with time.
2025-03-26 12:04:12
Lacking special-case optimizations, you get terrible performance and surprisingly ineffective compression sometimes.
2025-03-26 12:05:36
The only encoder I would say is phenomenal is fast-lossless and it's much, much simpler and smaller than a decoder. And it was written by Luca who is also in charge of jxl-rs
2025-03-26 12:07:37
The other encoders in libjxl are what you would expect from a rough first draft that gets replaced later.
2025-03-26 12:09:06
A good first-exercise in some of the main strengths and capabilities of the new format, but something you replace with a second draft.
jonnyawsom3
2025-03-26 12:34:11
I mean, that's what libjxl is. A testbed of the past 6 years turned into a viable product. It's up to future implementations to focus on their requirements
CrushedAsian255
2025-03-26 12:37:35
Maybe someone should make like some kind of "JXL encoder boilerplate" library for people to experiment with designing different possible encoding models
2025-03-26 12:37:47
Wouldn't be the most performant but would allow for experimentation
2025-03-26 12:38:57
it would have functions for things that are effectively "intrinsics" for JXL, such as the structure of the data, writing an ANS codestream, etc
2025-03-26 12:39:11
while leaving high level decisions like Varblock size and modular tree to the developer
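A sketch of what the entropy-coding "intrinsic" in such a boilerplate library might look like. This is a generic byte-wise rANS coder, not the actual JPEG XL ANS bitstream format; the precision constants, frequency table, and message are all made up for the demo:

```python
# Minimal byte-oriented rANS coder (ryg-style). NOT the JPEG XL ANS
# format -- just the core entropy-coding primitive such a library would wrap.
PROB_BITS = 12                 # frequencies sum to 1 << PROB_BITS
RANS_L = 1 << 23               # lower bound of the normalized state

def rans_encode(symbols, freq, cum):
    out = []
    x = RANS_L
    for s in reversed(symbols):                 # rANS encodes in reverse
        f, c = freq[s], cum[s]
        x_max = ((RANS_L >> PROB_BITS) << 8) * f
        while x >= x_max:                       # renormalize: emit bytes
            out.append(x & 0xFF)
            x >>= 8
        x = (x // f) * (1 << PROB_BITS) + (x % f) + c
    for _ in range(4):                          # flush final 32-bit state
        out.append(x & 0xFF)
        x >>= 8
    return bytes(reversed(out))                 # decoder reads forward

def rans_decode(data, n, freq, cum):
    pos, x = 4, int.from_bytes(data[:4], "big")
    out = []
    for _ in range(n):
        slot = x & ((1 << PROB_BITS) - 1)
        s = next(t for t in freq if cum[t] <= slot < cum[t] + freq[t])
        out.append(s)
        x = freq[s] * (x >> PROB_BITS) + slot - cum[s]
        while x < RANS_L and pos < len(data):   # renormalize: pull bytes
            x = (x << 8) | data[pos]
            pos += 1
    return out

freq = {"a": 2048, "b": 1024, "c": 1024}        # sums to 1 << PROB_BITS
cum = {"a": 0, "b": 2048, "c": 3072}
msg = list("aabacabcacba")
blob = rans_encode(msg, freq, cum)
assert rans_decode(blob, len(msg), freq, cum) == msg
```

A real encoder library would layer symbol modeling, clustering, and the JXL-specific stream layout on top of a primitive like this.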
A homosapien
Cacodemon345 At the very least this image format should not suffer the same fate that befell JPEG 2000 and JPEG XR.
2025-03-26 02:05:42
JPEG 2000 and JPEG XR had different problems which hampered their adoption. For J2K, it was the lack of a fast encoder which gave people the false perception that it was slow. For JXR the encoder just sucked, which gave the perception that the format itself was limited (being made by Microsoft didn't help things either). JPEG XR just wasn't production ready.
2025-03-26 02:07:30
JPEG XL is in a unique position where the only thing holding it back is basically company politics (chromium team 😔).
jonnyawsom3
2025-03-26 02:10:33
Company politics and complex integration I'd say. Seems to be a lot of build/config issues making adoption lag behind in GitHub repos
A homosapien
2025-03-26 02:12:24
We can always overcome technical hurdles, but we can't do anything about company politics.
jonnyawsom3
2025-03-26 02:15:03
Of all the people here, Youtubers, Discord staff, Researchers. It's a shame we don't have anyone from inside the Chromium team to give real answers
Meow
A homosapien JPEG 2000 and JPEG XR had different problems which hampered their adoption. For J2K, it was the lack of a fast encoder which gave people the false perception that it was slow. For JXR the encoder just sucked, which gave the perception that the format itself was limited (being made by Microsoft didn't help things either). JPEG XR just wasn't production ready.
2025-03-26 02:53:44
JXR has regained some popularity thanks to some gaming utilities and HDR wallpapers for Windows 11
jonnyawsom3
2025-03-26 02:55:25
Namely because it was the only native HDR format in Windows, so games had to save using it
Meow
2025-03-26 03:12:36
Hmm does Photos support HDR JXL?
dogelition
Meow Hmm does Photos support HDR JXL?
2025-03-26 03:39:44
i asked someone to test it before photos actually supported .jxl (so files had to be renamed to e.g. .png) and it was only displaying it in sdr
2025-03-26 03:39:56
might be different now
Quackdoc
HCrikki more efficient storage of photos on their phones is a pretty big concern id say. thats why derivatives of bloated jpeg dont exactly thrill (including hdr at no extra filesize isnt a win when jxl can also include hdr, as either true hdr or a gainmap, and still be half as small with higher visual quality)
2025-03-26 05:28:36
I save literal gigabytes on my phone by using jxl
Company politics and complex integration I'd say. Seems to be a lot of build/config issues making adoption lag behind in GitHub repos
2025-03-26 05:30:41
indeed, a lot of people dont want to also wind up pulling other deps like hwy
couleur
_wb_
2025-03-26 08:36:26
is jpegxl good at deduplicating two or more same spots
username
couleur is jpegxl good at deduplicating two or more same spots
2025-03-26 08:41:28
format/spec wise yes (via "patches" and such) however the current libjxl encoder can't really deduplicate stuff that well
A homosapien
2025-03-26 08:57:34
The code for better patch detection was lost 😭
w
couleur is jpegxl good at deduplicating two or more same spots
2025-03-27 04:48:28
No
jonnyawsom3
2025-03-27 05:06:58
It can, but not currently https://discord.com/channels/794206087879852103/803645746661425173/1280929625555603480 https://discord.com/channels/794206087879852103/803645746661425173/1247714711777054730
Demiurge
2025-03-27 06:53:06
I thought it was good at that
2025-03-27 06:53:09
Like screenshots
2025-03-27 06:53:41
Like that discord screenshot where it deduplicated almost everything but <@386612331288723469>'s avatar
Meow JXR has regained some popularity thanks to some gaming utilities and HDR wallpapers for Windows 11
2025-03-27 06:55:20
Is there even a JXR encoder that doesn't have far worse quality than jpeg-turbo?
monad
2025-03-27 11:45:13
it is good at extracting areas of contrast completely surrounded by flat color, often text. if you had an image of a script with connected letters, they wouldn't be individually extracted.
jonnyawsom3
2025-03-29 07:18:48
Creator of TestUFO left a comment https://issues.chromium.org/issues/40168998#comment471
2025-03-29 07:19:07
I tried to clear up some misconceptions they made
A homosapien
2025-03-29 07:35:24
AVIF as a successor to... PNG? 💀 God where does misinformation like this spread?
jonnyawsom3
2025-03-29 07:37:30
I was gonna link the Valve gamescope issue and AVIF being awful at lossless, but I'm on my phone and didn't want to lose the draft switching tabs
CrushedAsian255
A homosapien AVIF as a successor to... PNG? 💀 God where does misinformation like this spread?
2025-03-29 09:16:15
Would it be accurate to say AVIF is successor to WebP
A homosapien
2025-03-29 09:16:56
That would be a more accurate statement
jonnyawsom3
2025-03-29 09:20:04
> Even though AVIF (usually) tends to be a better off-the-web format than WEBP is generally as a potential successor to PNG that has more capabilities (e.g. HDR).
HCrikki
2025-03-29 09:20:38
going back to the original videos for animations also isn't conversion, it's substituting the source. jxl losslessly converts gifs, avif is inefficient at that
spider-mario
2025-03-29 09:42:21
> I'd love to play a bit more with JXL now that HDR JXL is supported by Safari Tech Preview 125 it is? 🥰
Tirr
2025-03-29 09:47:26
oh really?
2025-03-29 09:51:47
Preview on 15.4 beta doesn't support HDR JXL, it even shows severe banding which was not there before. maybe something is broken at some point. but I can update to 15.4 RC, I hope this is fixed there
CrushedAsian255
Tirr Preview on 15.4 beta doesn't support HDR JXL, it even shows severe banding which was not there before. maybe something is broken at some point. but I can update to 15.4 RC, I hope this is fixed there
2025-03-29 10:11:41
Can you send me the hdr jxl
_wb_
2025-03-29 01:44:20
Was this already posted? https://www.reddit.com/r/Android/comments/1jjq3cr/when_will_google_stop_holding_back_innovation_and/
Azteca
2025-03-29 01:47:30
https://arstechnica.com/science/2025/03/scientists-are-storing-light-we-cannot-see-in-formats-meant-for-human-eyes/
2025-03-29 01:49:16
The paper was posted in here earlier this month but Ars gets a lot of readers.
Meow
2025-03-29 01:53:54
Popular comment > Of course, Google in its infinite wisdom decided to deprecate support for JPEG XL in Chrome.
jonnyawsom3
2025-03-29 01:54:03
Really wish they'd contacted us for assistance on that paper... Could've been even better
Meow
2025-03-29 01:54:58
Discord doesn't look academic
jonnyawsom3
2025-03-29 01:57:28
> And while Spectral JPEG XL dramatically reduces file sizes, its lossy approach may pose drawbacks for some scientific applications. Or, just set JPEG XL to Lossless....
Meow
2025-03-29 01:59:10
Is "Spectral JPEG XL" a formal term for JPEG XL?
Quackdoc
_wb_ Was this already posted? https://www.reddit.com/r/Android/comments/1jjq3cr/when_will_google_stop_holding_back_innovation_and/
2025-03-29 02:01:57
still havent seen a PR from anyone for aosp, and sadly, we wont be able to see internal development going forwards, so we wont even know if there is internal work until the next AOSP release
2025-03-29 02:02:38
its funny, if you submit a PR to aosp now, google devs will tell you if it is accepted or not, but you won't actually see it land until the next code drop
2025-03-29 02:04:56
<:kekw:808717074305122316>
HCrikki
2025-03-29 02:25:40
Upstreaming in AOSP would be ideal and would reduce duplicated work for OEMs, but OEMs' gallery apps can implement the support and it'd be just as good, since they're both preinstalled and separately updated as apps
Quackdoc
2025-03-29 03:49:09
not really, you also need aosp to support it so things like documentsui has thumbnails, nearby share so on and so forth
2025-03-29 03:49:14
you really do need the whole shebang
Meow
2025-03-29 05:14:59
> And they've been blasted for it in github for years, and they kept marching on with that. > > They moved to support a newer "format", jpeg_r (ultrahdr) instead. > > Its baffling.
2025-03-29 05:17:29
And this disgusting comment > OpenEXR has some really great lossy formats too - DWAA and DWAB. And OpenEXR can support 100s of channels. Not sure why they have to jerry-rig JPEG-XL to do that.
Quackdoc
Meow And this disgusting comment > OpenEXR has some really great lossy formats too - DWAA and DWAB. And OpenEXR can support 100s of channels. Not sure why they have to jerry-rig JPEG-XL to do that.
2025-03-29 05:27:02
where is the context for this?
2025-03-29 05:27:34
oh nvm
Demiurge
2025-03-29 06:03:44
It's just too bad they didn't combine the channels into one file using an RGB visual base image with extra "spectral separation" channels in between
2025-03-29 06:04:38
But there were libjxl limitations in the number of extra channels I think
2025-03-29 06:08:17
Blue spectral channels can be encoded as "difference from blue" etc
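As an illustration of the "difference from blue" idea (channel names and pixel values are hypothetical): a spectral band near blue is strongly correlated with the blue base channel, so storing it as a residual leaves mostly small values that entropy-code cheaply.

```python
# Hypothetical data: a near-blue spectral band stored as a residual
# against the RGB base image's blue channel.
blue_base  = [120, 122, 125, 130, 128, 126]
band_450nm = [118, 121, 126, 129, 127, 124]   # strongly correlated

residual = [s - b for s, b in zip(band_450nm, blue_base)]
print(residual)                                # small values near zero

# Lossless reconstruction at decode time:
restored = [b + r for b, r in zip(blue_base, residual)]
assert restored == band_450nm
assert max(abs(r) for r in residual) <= 2
```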
Quackdoc
2025-03-29 06:14:52
can other channels be paired in xyb too? as in xyb+xyb+xyb+xyb? to get the same encoding benefits or is that better left to layers?
lonjil
Quackdoc still havent seen a PR from anyone for aosp, and sadly, we wont be able to see internal development going forwards so I we wont even know if there is internal work until the next AOSP release
2025-03-29 06:28:59
that was already the case for years lol
2025-03-29 06:29:27
going "fully private" just means going from 99% private to 100% private..
Quackdoc
2025-03-29 06:30:01
nah, for aosp most of the stuff was in the open, only proprietary stuff like gms was private
lonjil
2025-03-29 06:30:37
it wasn't developed in the open, they just did periodic code dumps
Quackdoc
2025-03-29 06:42:01
that's not true though ?
2025-03-29 06:42:41
a lot of review can be found here https://android-review.googlesource.com/q/status:open+-is:wip
Demiurge
Quackdoc can other channels be paired in xyb too? as in xyb+xyb+xyb+xyb? to get the same encoding benefits or is that better left to layers?
2025-03-29 07:45:45
You don't use layers, you put spectral info in extra channels. But there isn't a specified way to do it.
Quackdoc
2025-03-29 08:19:40
Im not necesairly talking about spectral info
Lucius
Demiurge You don't use layers, you put spectral info in extra channels. But there isn't a specified way to do it.
2025-03-29 08:45:39
My understanding was that extra channels are always losslessly compressed. Am I mistaken? If not, how would you produce a lossy multispectral image?
Demiurge
2025-03-29 08:53:30
No, they can still be lossy, and you can specify the quality separately. The extra channels just aren't DCT though.
2025-03-29 08:53:40
So you have all the modular coding tools but not DCT
2025-03-29 08:54:21
Which is fine because jxl is still very powerful even without DCT and has a lot of useful lossy compression tools without DCT, like a haar wavelet transform.
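A toy illustration of the average/difference idea behind a Haar-style transform (this simplifies JPEG XL's actual Squeeze transform, which also predicts the differences); quantizing the detail coefficients is where a lossy modular encoding would spend its loss:

```python
def haar_forward(x):
    """One level: pairwise averages plus half-differences."""
    avgs  = [(a + b) / 2 for a, b in zip(x[0::2], x[1::2])]
    diffs = [(a - b) / 2 for a, b in zip(x[0::2], x[1::2])]
    return avgs, diffs

def haar_inverse(avgs, diffs):
    out = []
    for a, d in zip(avgs, diffs):
        out += [a + d, a - d]
    return out

signal = [10.0, 12.0, 11.0, 13.0, 50.0, 52.0, 49.0, 51.0]
avgs, diffs = haar_forward(signal)

# Lossless round trip:
assert haar_inverse(avgs, diffs) == signal

# Coarsely quantize the detail coefficients (step = 2) for lossy coding:
q_diffs = [round(d / 2) * 2 for d in diffs]
lossy = haar_inverse(avgs, q_diffs)
assert all(abs(a - b) <= 2 for a, b in zip(signal, lossy))
```

The averages also double as a half-resolution preview, which is where the progressive behaviour comes from.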
Lucius
Demiurge No, they can still be lossy, and you can specify the quality separately. The extra channels just aren't DCT though.
2025-03-29 09:05:11
I was misremembering the discussion in connection with an observation some weeks ago about extra channels of pixeltype float. The compressed extra channels turned out to be larger than they would have been if they had been stored losslessly, so compressing them was pointless. The argument was that extra channels are mainly used for stuff which compresses well, like alpha and masks.
Demiurge
2025-03-29 09:11:37
Yep, it's lossless by default and modular mode is mostly designed for lossless compression but it's good at lossy too.
2025-03-29 09:11:54
libjxl has an option to set the quality of extra channels
2025-03-29 09:12:32
For photographic data lossy compression makes sense even for modular mode
Lucius
Demiurge Yep, it's lossless by default and modular mode is mostly designed for lossless compression but it's good at lossy too.
2025-03-29 09:21:31
There's an open issue on the large size of compressed float extra channels on github (#4082)
Demiurge
2025-03-29 09:27:35
It makes sense that sometimes lossless is better than lossy. Lossy uses a progressive wavelet algorithm by default, which is bad on some types of non-photo data
2025-03-29 09:28:12
Lossless is not progressive by default and uses a typical sequential lossless compression
jonnyawsom3
Creator of TestUFO left a comment https://issues.chromium.org/issues/40168998#comment471
2025-03-30 12:53:25
<@532010383041363969> In regards to [your comment](<https://issues.chromium.org/issues/40168998#comment473>) replying to mine. > I agree that there are deviations from the usual JPEG XL use, but their compromises are well-judged, particularly given the typical time constraints of such research. I believe their compression approach is significantly superior to the most straight-forward application of JPEG XL. When I said 'a standard JXL file', I meant using a single file with the required extra channels storing their spectral data, as opposed to separate files for every channel that they ended up doing. I agree they took the right steps, but it could have been even better given a brief discussion here or with anyone knowledgeable with the format. Using a single file would allow referencing previous channels in MA trees. A lossless test could have been done relatively easily. Seemingly lots of low hanging fruit ripe for the picking if they had been informed.
Kaldaien
Mine18
2025-03-30 12:49:19
Ah, it's still very problematic. AVIF HDR tonemapped by Discord: HDR PNG (tonemapped? or left alone?) by Discord: SKIV tonemapping the AVIF HDR to SDR: Still needs a lot of work; the last time I saw tonemapping that bad with such wildly wrong colors, the Rec 2020 to Rec 709 matrix was transposed.
Mine18
2025-03-30 12:58:58
ok nice, Scott is still working on it a bit more so hopefully it should look properly later
Meow
2025-03-31 08:15:16
When I searched "openexr vs jpeg xl" on DuckDuckGo, that _JPEG XL, the New Image Format Nobody Wanted or Needed_ was the third result
2025-03-31 08:18:38
But the article mentions neither OpenEXR nor EXR
dogelition
2025-04-09 06:55:45
https://rec98.nmlgc.net/blog/2025-04-09
HCrikki
2025-04-09 03:58:55
phoronix covered ubuntu 25.04's inclusion of jxl enabled by default https://www.phoronix.com/news/Ubuntu-25.04-JPEG-XL-Default
2025-04-09 03:59:39
less known was that all other editions of *buntu (kubuntu, xubuntu, etc) apparently already had it out of the box. The reason for the inconsistency is that only base ubuntu swapped around the minimal and normal installs, turning them into normal and full
2025-04-09 04:01:55
odd how pro critics interested in photography aren't aware that almost all of ios and macos supports jxl, and that the top photo managers and image editors (ie pixelmator, the serif suite, lightroom subscription and classic, zoner) do too
2025-04-09 04:03:03
zoner has even supported it since mid-2022, even though it never mentioned it in any changelog
5peak
Meow But the article mentions neither OpenEXR nor EXR
2025-04-10 12:00:17
Guess who provides the search disservices 4 DDG?
jonnyawsom3
Quackdoc unfortunately he deleted the repo with the jxl-oxide update, tho I have an archived build here https://cdn.discordapp.com/attachments/673202643916816384/1331819433487765526/jxl-oxide-mv3.7z?ex=6798efc0&is=67979e40&hm=23549a6550579eeaafc3597b37ae758bdf1ff648c191251fa718e0f24193f6d5&
2025-04-25 07:11:24
I don't suppose you remember roughly when that version was made? Curious if I can wedge a newer oxide version in and get better performance. Tried just a WASM swap but it errored saying something about `__wbg_instanceof_Window_def73ea0955fc569`
Quackdoc
2025-04-25 07:15:27
no it was quite a while ago sadly
jonnyawsom3
2025-04-25 07:31:20
Ah well, still works, just a few bugs that would've been nice to fix. I did change the blob mime type to PNG though, since the browser was reading it as text
_wb_
2025-05-11 01:22:06
https://vqeg.org/umbraco/surface/FolderList/GetFile?directory=2025_05_Meta_USA&filename=VQEG_SAM_2025_117.pdf&pageId=1669&m=0
2025-05-11 01:22:45
^ VQEG presentation I did last week with Mohsen
CrushedAsian255
_wb_ ^ VQEG presentation I did last week with Mohsen
2025-05-11 05:08:09
What is the VQEG? Video quality expert group or something?
_wb_
2025-05-11 05:40:23
Yes. https://vqeg.org
novomesk
2025-05-13 09:20:05
https://discussion.fedoraproject.org/t/upcoming-switch-to-jxl-format-for-default-wallpaper/153086
veluca
2025-05-13 09:24:12
I guess we should suggest the new and improved `--faster_decoding` options...
jonnyawsom3
veluca I guess we should suggest the new and improved `--faster_decoding` options...
2025-05-13 09:43:00
Those aren't released yet though, so not sure if they would
veluca
2025-05-13 09:44:27
that can be fixed 😄
2025-05-13 09:44:53
but also the decoder doesn't need to change for that
jonnyawsom3
2025-05-13 09:45:17
Also true, they could just encode with a main version
_wb_
2025-05-13 09:45:39
Are they using libjxl with multithreading?
2025-05-13 09:47:53
If decode speed is an issue and lossless is required, besides faster_decoding the other big thing that affects decode speed is the encode effort
2025-05-13 09:48:20
e1-3 encodes will decode much faster than e9+ encodes
jonnyawsom3
2025-05-13 09:51:14
Well, that's because faster decoding is effectively the same as lower effort settings: it just does more thorough checks with the same coding tools. If someone has an account, could they link to the faster decoding PR and ask them to repeat the tests using it? We've seen decode speeds faster than lossy, but it depends on multithreading and image size
2025-05-13 09:52:49
Maybe we should've made a bigger comparison, against all effort levels too. Since it also beats lossless e1 and e2 now
veluca
2025-05-13 09:54:07
I'll make an account 😄
spider-mario
2025-05-15 09:00:50
> JPEG-XL’s support for lossless transcoding is impressive, but requires the input data to be a lossy JPEG image. When using non-JPEG input data, lossless JXL imposes some serious penalties: Much bigger files and slower decoding. That’s a crappy combo. ??? are they holding the size of lossless JXL to the standard of recompressed JPEGs?
jonnyawsom3
2025-05-15 09:16:30
"The lossless file is bigger" In other news, the more water you drink, the more it costs
2025-05-15 09:26:00
If anyone has accounts for these already, might be worth mentioning the PR here too. Should be up to 4x faster with a 30% density penalty, or 2x and 5%. <https://gitlab.gnome.org/GNOME/gnome-shell/-/issues/6886#note_2440128> <https://discussion.fedoraproject.org/t/f43-change-proposal-switch-to-jxl-format-for-default-wallpaper-self-contained/145923/3>
Lumen
2025-05-15 09:26:14
it is not intuitive for people that re-encoding a low-quality avif into lossless anything would increase the filesize. they think the avif is lossless compared to itself, so the result should be equal or smaller than the avif in some way, maybe
2025-05-15 09:26:43
while the lossless size reduction is compared to the pixel array size
jonnyawsom3
2025-05-15 09:26:56
This is using PNG input
Lumen
spider-mario > JPEG-XL’s support for lossless transcoding is impressive, but requires the input data to be a lossy JPEG image. When using non-JPEG input data, lossless JXL imposes some serious penalties: Much bigger files and slower decoding. That’s a crappy combo. ??? are they holding the size of lossless JXL to the standard of recompressed JPEGs?
2025-05-15 09:28:00
I was answering to that
2025-05-15 09:28:04
just to be sure
spider-mario
2025-05-15 10:04:41
the originals are PNG as far as I understand
Foxtrot
Lumen it is not intuitive for people the fact that reencoding a low quality avif into lossless anything would increase filesize they think that the avif is lossless compared to itself so the result should be equal or smaller to the avif in some way maybe
2025-05-15 12:32:32
They think it's like zip. Put in anything and it gets losslessly compressed.
damian101
2025-05-15 01:43:12
or at least not larger, like putting a zip in a zip will never increase size much
2025-05-15 01:44:30
But this is simply because they don't see how large the image becomes after decoding, before it can be reencoded.
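That expansion step can be sketched with a toy stand-in (a hedged illustration only: zlib plays the role of any lossless coder, and the byte counts are made-up stand-ins, not real codec numbers):

```python
import os
import random
import zlib

random.seed(0)

# Stand-in for a small lossy (AVIF/JPEG) file: already-compressed data looks random.
lossy_file = os.urandom(8_000)

# Decoding expands it into a much larger pixel array that still has *some*
# redundancy left (here: bytes drawn uniformly from a 32-symbol alphabet).
decoded = bytes(random.randrange(32) for _ in range(256_000))

# A lossless re-encode shrinks the pixel array, but is measured against the
# pixel-array size, so it stays far above the original lossy file size.
recompressed = zlib.compress(decoded, 9)
assert len(recompressed) < len(decoded)      # lossless does compress the pixels...
assert len(recompressed) > len(lossy_file)   # ...but cannot beat the lossy file
```

The same logic explains why "lossless JXL from a lossy source" looks bad if you compare it to the lossy input instead of to the decoded pixels.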
HCrikki
2025-05-15 02:43:02
they lack context. take a jpg: losslessly compressing it into any other format is *supposed* to make it several times larger in filesize - jxl is the unique outlier that manages a conversion at the highest possible quality (100% pixel-exact) while also guaranteeing a smaller filesize than the original
2025-05-15 02:45:56
many conversion scripts and utilities mess up conversions by doing full encodes the old way instead of leveraging reversible lossless transcoding's superfast path when the source is a jpeg in an unmodified state (i.e. not edited after being opened)
Mine18
2025-05-15 03:06:37
i think people expect newer formats to take less space than the original, even at lossless
Demiurge
_wb_ e1-3 encodes will decode much faster than e9+ encodes
2025-05-15 03:19:45
Actually I think only e1 and e2 decode fast for lossless, e3 and above are always the same decode speed when I checked
jonnyawsom3
Mine18 i think people expect newer formats to take less space than the original, even at lossless
2025-05-15 03:33:40
It does, when the input is originally lossless too
Demiurge Actually I think only e1 and e2 decode fast for lossless, e3 and above are always the same decode speed when I checked
2025-05-15 03:34:46
It's because 3 is a fixed tree, but it's still the weighted predictor, so it's twice as slow
_wb_
2025-05-15 03:35:28
I get something like this for a random photographic image: ``` Encoding kPixels Bytes BPP E MP/s D MP/s Max norm SSIMULACRA2 PSNR pnorm BPP*pnorm QABPP Bugs ---------------------------------------------------------------------------------------------------------------------------------------- jxl:d0:1 1084 1382376 10.1963931 768.200 140.849 nan 100.00000000 99.99 0.00000000 0.000000000000 10.196 0 jxl:d0:2 1084 1271944 9.3818477 51.116 185.763 nan 100.00000000 99.99 0.00000000 0.000000000000 9.382 0 jxl:d0:3 1084 1120205 8.2626222 39.143 103.180 nan 100.00000000 99.99 0.00000000 0.000000000000 8.263 0 jxl:d0:4 1084 1106477 8.1613646 4.607 67.287 nan 100.00000000 99.99 0.00000000 0.000000000000 8.161 0 jxl:d0:7 1084 1062487 7.8368947 1.236 44.334 nan 100.00000000 99.99 0.00000000 0.000000000000 7.837 0 Aggregate: 1084 1182855 8.7247286 24.458 95.763 0.00000000 100.00000000 99.99 0.00000000 0.000000000000 8.725 0 ```
Demiurge
2025-05-15 03:36:12
So then only e1 and e2 decode super fast. e3 decodes just as fast as e9
2025-05-15 03:36:24
For lossless
Tirr
2025-05-15 03:36:48
the last row is not for e9
_wb_
2025-05-15 03:36:50
well 103 Mpx/s is a bit faster than the 37 Mpx/s I get for an e9 encoded image
Demiurge
_wb_ I get something like this for a random photographic image: ``` Encoding kPixels Bytes BPP E MP/s D MP/s Max norm SSIMULACRA2 PSNR pnorm BPP*pnorm QABPP Bugs ---------------------------------------------------------------------------------------------------------------------------------------- jxl:d0:1 1084 1382376 10.1963931 768.200 140.849 nan 100.00000000 99.99 0.00000000 0.000000000000 10.196 0 jxl:d0:2 1084 1271944 9.3818477 51.116 185.763 nan 100.00000000 99.99 0.00000000 0.000000000000 9.382 0 jxl:d0:3 1084 1120205 8.2626222 39.143 103.180 nan 100.00000000 99.99 0.00000000 0.000000000000 8.263 0 jxl:d0:4 1084 1106477 8.1613646 4.607 67.287 nan 100.00000000 99.99 0.00000000 0.000000000000 8.161 0 jxl:d0:7 1084 1062487 7.8368947 1.236 44.334 nan 100.00000000 99.99 0.00000000 0.000000000000 7.837 0 Aggregate: 1084 1182855 8.7247286 24.458 95.763 0.00000000 100.00000000 99.99 0.00000000 0.000000000000 8.725 0 ```
2025-05-15 03:37:18
Hold on I haven't looked at this yet lol
_wb_
2025-05-15 03:37:21
ran it again with e9 included now: ``` Encoding kPixels Bytes BPP E MP/s D MP/s Max norm SSIMULACRA2 PSNR pnorm BPP*pnorm QABPP Bugs ---------------------------------------------------------------------------------------------------------------------------------------- jxl:d0:1 1084 1382376 10.1963931 821.328 155.343 nan 100.00000000 99.99 0.00000000 0.000000000000 10.196 0 jxl:d0:2 1084 1271944 9.3818477 52.250 179.652 nan 100.00000000 99.99 0.00000000 0.000000000000 9.382 0 jxl:d0:3 1084 1120205 8.2626222 41.265 105.087 nan 100.00000000 99.99 0.00000000 0.000000000000 8.263 0 jxl:d0:4 1084 1106477 8.1613646 4.330 63.568 nan 100.00000000 99.99 0.00000000 0.000000000000 8.161 0 jxl:d0:7 1084 1062487 7.8368947 1.149 40.994 nan 100.00000000 99.99 0.00000000 0.000000000000 7.837 0 jxl:d0:9 1084 1052224 7.7611949 0.234 37.039 nan 100.00000000 99.99 0.00000000 0.000000000000 7.761 0 ```
Demiurge
2025-05-15 03:38:26
But when I was doing testing on my old cpu my experience is that for lossless e3 and above was all the same speed
2025-05-15 03:41:39
My old CPU decoded all effort levels at the same speed. Except the first two.
2025-05-15 03:42:09
Idk if it being old has something to do with that
2025-05-15 03:42:11
Pre haswell
jonnyawsom3
2025-05-15 04:17:14
We had similar hardware-dependent results for the faster decoding PR, sometimes making FD3 faster than FD4. The same might be happening for e3
2025-05-15 04:21:30
I'll run a test too when I'm home. <@794205442175402004> if you can upload the image and send the command you ran, we can make a direct comparison
_wb_
2025-05-15 04:22:13
`benchmark_xl --input=001.png --codec=jxl:d0:1,jxl:d0:2,jxl:d0:3,jxl:d0:4,jxl:d0:7,jxl:d0:9 --encode_reps=5 --decode_reps=30 --num_threads 0`
2025-05-15 04:22:44
AccessViolation_
spider-mario > JPEG-XL’s support for lossless transcoding is impressive, but requires the input data to be a lossy JPEG image. When using non-JPEG input data, lossless JXL imposes some serious penalties: Much bigger files and slower decoding. That’s a crappy combo. ??? are they holding the size of lossless JXL to the standard of recompressed JPEGs?
2025-05-15 04:22:59
lol. most informed tech journalist?
HCrikki
Mine18 i think people expect newer formats to take less space than the original, even at lossless
2025-05-15 04:48:58
this is only a reasonable expectation when converting from a *lossless* input to a *lossless* output (ie png->lossless webp/jxl). lossy to lossless never made sense, only jxl's reversible transcoding exists at all for media content afaik (nothing for video or audio)
jonnyawsom3
_wb_ ran it again with e9 included now: ``` Encoding kPixels Bytes BPP E MP/s D MP/s Max norm SSIMULACRA2 PSNR pnorm BPP*pnorm QABPP Bugs ---------------------------------------------------------------------------------------------------------------------------------------- jxl:d0:1 1084 1382376 10.1963931 821.328 155.343 nan 100.00000000 99.99 0.00000000 0.000000000000 10.196 0 jxl:d0:2 1084 1271944 9.3818477 52.250 179.652 nan 100.00000000 99.99 0.00000000 0.000000000000 9.382 0 jxl:d0:3 1084 1120205 8.2626222 41.265 105.087 nan 100.00000000 99.99 0.00000000 0.000000000000 8.263 0 jxl:d0:4 1084 1106477 8.1613646 4.330 63.568 nan 100.00000000 99.99 0.00000000 0.000000000000 8.161 0 jxl:d0:7 1084 1062487 7.8368947 1.149 40.994 nan 100.00000000 99.99 0.00000000 0.000000000000 7.837 0 jxl:d0:9 1084 1052224 7.7611949 0.234 37.039 nan 100.00000000 99.99 0.00000000 0.000000000000 7.761 0 ```
2025-05-15 05:29:17
What version was that? I'm getting quite different results, particularly for e3...
Mine18
HCrikki this is only a reasonable expectation when converting from a *lossless* input to a *lossless* output (ie png->lossless webp/jxl). lossy to lossless never made sense, only jxl's reversible transcoding exists at all for media content afaik (nothing for video or audio)
2025-05-15 05:30:31
yeah but people generally aren't cognizant of that fact
_wb_
What version was that? I'm getting quite different results, particularly for e3...
2025-05-15 05:38:36
Some recent git version, didn't check
jonnyawsom3
2025-05-15 05:39:54
Well, this is what a build from the 28th of March gave me. Faster than usual because of clang ``` Encoding kPixels Bytes BPP E MP/s D MP/s Max norm SSIMULACRA2 PSNR pnorm BPP*pnorm QABPP Bugs ---------------------------------------------------------------------------------------------------------------------------------------- jxl:d0:1 1084 1380528 10.1827623 212.832 75.570 nan 100.00000000 99.99 0.00000000 0.000000000000 10.183 0 jxl:d0:2 1084 1272032 9.3824968 18.957 75.326 nan 100.00000000 99.99 0.00000000 0.000000000000 9.382 0 jxl:d0:3 1084 1120295 8.2632860 15.972 45.585 nan 100.00000000 99.99 0.00000000 0.000000000000 8.263 0 jxl:d0:4 1084 1106408 8.1608556 2.499 42.932 nan 100.00000000 99.99 0.00000000 0.000000000000 8.161 0 jxl:d0:7 1084 1064685 7.8531071 0.557 33.294 nan 100.00000000 99.99 0.00000000 0.000000000000 7.853 0 jxl:d0:9 1084 1050079 7.7453734 0.085 31.107 nan 100.00000000 99.99 0.00000000 0.000000000000 7.745 0 ```
2025-05-15 06:16:11
Using a build from a week ago. Effort 1 encode is so slow because this is from GitHub, so no clang ``` Encoding kPixels Bytes BPP E MP/s D MP/s Max norm SSIMULACRA2 PSNR pnorm BPP*pnorm QABPP Bugs ---------------------------------------------------------------------------------------------------------------------------------------- jxl:d0:1 1084 1380528 10.1827623 95.371 67.659 -nan(ind) 100.00000000 99.99 0.00000000 0.000000000000 10.183 0 jxl:d0:2 1084 1272028 9.3824673 17.253 69.709 -nan(ind) 100.00000000 99.99 0.00000000 0.000000000000 9.382 0 jxl:d0:3 1084 1120289 8.2632417 13.716 35.423 -nan(ind) 100.00000000 99.99 0.00000000 0.000000000000 8.263 0 jxl:d0:4 1084 1106561 8.1619841 1.918 31.634 -nan(ind) 100.00000000 99.99 0.00000000 0.000000000000 8.162 0 jxl:d0:7 1084 1062571 7.8375143 0.447 24.742 -nan(ind) 100.00000000 99.99 0.00000000 0.000000000000 7.838 0 jxl:d0:9 1084 1052308 7.7618145 0.100 22.359 -nan(ind) 100.00000000 99.99 0.00000000 0.000000000000 7.762 0 ``` So Weighted seems very dependent on CPU architecture
Traneptora
spider-mario > JPEG-XL’s support for lossless transcoding is impressive, but requires the input data to be a lossy JPEG image. When using non-JPEG input data, lossless JXL imposes some serious penalties: Much bigger files and slower decoding. That’s a crappy combo. ??? are they holding the size of lossless JXL to the standard of recompressed JPEGs?
2025-05-16 03:49:52
> It takes over half a second for my machine to load f42-NIGHT_frompng_q100.jxl into a GdkPixbuf.Pixbuf, more than 3× as long as it takes to load the PNG or JPG versions! That’s an awfully stiff performance penalty to pay just for a 50% smaller file. This is very slow! What's up with that?
2025-05-16 03:53:05
``` $ time djxl f42-01-night.jxl --disable_output JPEG XL decoder v0.12.0 6bd5a2ce [_AVX2_] Decoded to pixels. 4032 x 3024, 68.130 MP/s [68.13, 68.13], , 1 reps, 8 threads. real 0m0.190s user 0m1.361s sys 0m0.040s ```
2025-05-16 03:53:56
it has an alpha channel too which is a bit strange
2025-05-16 03:54:32
also this is strange
2025-05-16 03:54:34
``` $ umbrielpng -v night.png PNG signature found: night.png Chunk: IHDR, Size: 25, Offset: 8, CRC32: c0f5cea0 Chunk: iCCP, Size: 8729, Offset: 33, CRC32: 0164f360 Chunk: IDAT, Size: 8204, Offset: 8762, CRC32: 71bbb7ff Chunk: 536 more IDAT chunks Chunk: IEND, Size: 12, Offset: 4407155, CRC32: ae426082 Size: 4032x3024, Color: 8-bit RGB + Alpha ICC Profile Length: 9080 ICC profile represents a non-sRGB profile with sRGB primaries iCCP: wp: 34570, 35853, r: 64843, 33086, g: 32117, 59787, b: 15589, 6605 ```
2025-05-16 03:55:45
hm, libjxl detects it as an sRGB profile
2025-05-16 03:55:47
huh
2025-05-16 03:55:55
my own code is lacking then
2025-05-16 04:01:06
displaycal says it's a 4096-entry curve, I wonder if that means it's a LUT
_wb_
2025-05-16 04:09:59
could be a very inefficient sRGB profile. 9080 bytes is kind of silly to just represent sRGB.
2025-05-16 04:10:33
avoiding trivial alpha is a simple way to get some decode speedup
Traneptora
2025-05-16 04:10:43
the question was more "why does my code not detect it as sRGB but libjxl does", but upon inspection it uses a 4096-entry LUT for rTRC/gTRC/bTRC
2025-05-16 04:10:59
and my code only looks at `para`, not `curv` LUTs
2025-05-16 04:11:07
and libjxl probably handles the `curv` case
_wb_
2025-05-16 04:11:41
libjxl does handle tabled transfer curves and checks if it's close enough to the parametric curve, yes
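The idea of that check can be sketched roughly as follows (a hedged toy sketch, not libjxl's actual code: the function names are made up, the direction of the curve and real ICC handling are simplified):

```python
def srgb_curve(x: float) -> float:
    """Parametric sRGB transfer curve (piecewise linear/power form)."""
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1 / 2.4) - 0.055

def close_to_parametric(table, tol=1e-3):
    """Decide whether a tabled `curv` TRC is close enough to parametric sRGB."""
    n = len(table)
    return all(abs(table[i] - srgb_curve(i / (n - 1))) <= tol for i in range(n))

# A 4096-entry LUT sampled from the parametric curve passes the check...
lut = [srgb_curve(i / 4095) for i in range(4096)]
assert close_to_parametric(lut)

# ...while a plain linear (gamma 1.0) ramp does not.
assert not close_to_parametric([i / 4095 for i in range(4096)])
```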
Traneptora
2025-05-16 04:12:09
so that mystery is solved, but it's got a trivial (i.e. fully opaque) alpha channel
jonnyawsom3
2025-05-16 04:12:15
I'm not sure what's more worrying, not encoding it properly to a JXL, or the original PNGs for distribution not being optimized themselves
Traneptora
2025-05-16 04:12:37
the `.src.rpm` contains a lossless JXL
2025-05-16 04:13:02
I just decoded it to `night.png` so I could inspect it
2025-05-16 04:13:21
ideally they should strip the trivial alpha channel though
_wb_
2025-05-16 04:14:39
yeah they should — it's a pity we cannot make libjxl do it automatically, since that would break layered images with an opaque background layer (which is a pretty common case for layered images) and animations that start with an opaque frame.
2025-05-16 04:16:44
trivial alpha doesn't cost much in terms of compression, but it does cost quite a bit in terms of decode speed. Maybe we could do something clever in the decoder to detect this case and avoid allocating buffers that will only store trivial alpha, but that's probably not so easy to do...
jonnyawsom3
2025-05-16 04:17:58
I was exploring how to minimize alpha impact a few weeks ago, trying squeeze, different group sizes, predictors, etc. to find the minimal decode speed and density impact. Since it's empty anyway, more tricks can be used
_wb_
2025-05-16 04:41:11
I think for the coding itself, probably best is to use an empty subtree for the alpha channel with a leaf node that represents `- Set 255`, and then a decoder fast path that detects this case and does a `memset`. But some of the decode speed impact is also coming from the extra buffer (first an int32 one, which then gets converted to a float32 one, and then to an interleaved RGBA buffer with whatever sample type the application requests).
jonnyawsom3
2025-05-16 05:20:14
AFAIK Oxide has a single/no predictor fastpath while libjxl doesn't. Could be a speedup for many cases and open new avenues for faster decoding
Traneptora
_wb_ trivial alpha doesn't cost much in terms of compression, but it does cost quite a bit in terms of decode speed. Maybe we could do something clever in the decoder to detect this case and avoid allocating buffers that will only store trivial alpha, but that's probably not so easy to do...
2025-05-17 01:32:15
you'd probably have to both check the entropy stream for one-symbol and also check the MA tree
2025-05-17 01:32:47
of course it wouldn't catch all scenarios but it could be a fast-path
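The fast path being discussed could look roughly like this (purely illustrative Python; the function names are hypothetical and this is not how libjxl or jxl-oxide are actually structured):

```python
def decode_constant_plane(width: int, height: int, leaf_value: int) -> bytearray:
    """Fast path: if the channel's MA tree is a single leaf with a constant
    prediction and zero residuals, the 'decode' is just a memset-style fill."""
    return bytearray([leaf_value]) * (width * height)

def is_trivial_alpha(plane, opaque: int = 255) -> bool:
    """Check whether an alpha plane is fully opaque (and thus droppable)."""
    return all(sample == opaque for sample in plane)

# A constant-255 alpha plane is produced with one fill and detected as trivial.
alpha = decode_constant_plane(4, 4, 255)
assert is_trivial_alpha(alpha)
assert len(alpha) == 16
```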
jonnyawsom3
Traneptora > It takes over half a second for my machine to load f42-NIGHT_frompng_q100.jxl into a GdkPixbuf.Pixbuf, more than 3× as long as it takes to load the PNG or JPG versions! That’s an awfully stiff performance penalty to pay just for a 50% smaller file. This is very slow! What's up with that?
2025-05-17 09:43:14
Found it https://github.com/libjxl/libjxl/issues/1097
2025-05-17 09:43:27
Seems like at some point it became singlethreaded
_wb_
2025-06-02 10:32:59
https://cloudinary.com/labs/aic-3-and-hdr
spider-mario
2025-06-02 10:38:08
uh oh, confidence intervals
2025-06-02 10:38:18
my nemeses
2025-06-02 10:38:24
the windmills to my Don Quixote
_wb_
2025-06-03 05:18:52
What's wrong with confidence intervals? Assuming you interpret them correctly I think they do convey useful information.
spider-mario
2025-06-03 09:08:19
sadly not https://link.springer.com/article/10.3758/s13423-015-0947-8
2025-06-03 09:09:44
> The benefits that modern proponents see CIs as having are considerations outside of confidence interval theory; hence, if used in the way CI proponents suggest, CIs can provide severely misleading inferences. For many CIs, proponents have not actually explored whether the CI supports reasonable inferences or not. For this reason, we believe that appeal to CI theory is redundant in the best cases, when inferences can be justified outside CI theory, and unwise in the worst cases, when they cannot. > […] > Once one has collected data and computed a confidence interval, how does one then interpret the interval? The answer is quite straightforward: one does not – at least not within confidence interval theory. [8] As Neyman and others pointed out repeatedly, and as we have shown, confidence limits cannot be interpreted as anything besides the result of a procedure that will contain the true value in a fixed proportion of samples. Unless an interpretation of the interval can be specifically justified by some _other_ theory of inference, confidence intervals must remain uninterpreted, lest one make arbitrary inferences or inferences that are contradicted by the data.
_wb_
2025-06-03 12:12:18
> Bayesian credible intervals differ from frequentist confidence intervals by two major aspects: > > - credible intervals are intervals whose values have a (posterior) probability density, representing the plausibility that the parameter has those values, whereas confidence intervals regard the population parameter as fixed and therefore not the object of probability. Within confidence intervals, confidence refers to the randomness of the very confidence interval under repeated trials, whereas credible intervals analyse the uncertainty of the target parameter given the data at hand. > - credible intervals and confidence intervals treat nuisance parameters in radically different ways. (https://en.wikipedia.org/wiki/Credible_interval#Contrasts_with_confidence_interval)
2025-06-03 12:20:32
In an application like this, it is reasonable to assume the population parameter (i.e. the 'real' JND value you would get hypothetically by getting pairwise comparison responses from the entire world population) is indeed fixed: for every distorted image, the actual JND distance to the source image in principle has a fixed value and is not the object of probability. So I think that in this particular application, confidence intervals coincide with credible intervals...
spider-mario
2025-06-03 01:31:09
Bayesians also regard the parameter as fixed, just unknown
2025-06-03 01:31:19
and they allow themselves to use probability to express that uncertainty
2025-06-03 01:31:53
frequentists don’t, so they come up with this sort of workaround where they answer a different question than the one they really would like to have answered
2025-06-03 01:34:49
p-values: instead of “probability that the hypothesis is true, given the data” (nonsensical for a frequentist: either the hypothesis is true or it isn’t), compute “frequency at which we would obtain data that extreme, given that the null hypothesis is true”

confidence intervals: instead of an interval such that the true parameter has a given probability of being in the interval (nonsensical for a frequentist if both the interval and the parameter are fixed), come up with a method such that in repeated sampling, a given proportion of samples will lead to the creation of an interval that encompasses the true parameter; such intervals are called “confidence intervals” by virtue of how they were created but the individual intervals are not intrinsically imbued with any property whatsoever
2025-06-03 01:36:04
https://sami.boo/jaynes/whats-wrong-with-bayesian-methods/#:~:text=For%20decades%20Bayesians,to%20be%20made%2E > For decades Bayesians have been accused of “supposing that an unknown parameter is a random variable”; and we have denied, hundreds of times and with increasing vehemence, that we are making any such assumption. We have been unable to comprehend why our denials have no effect, and that charge continues to be made.
_wb_
2025-06-03 01:48:32
Right. So it generally makes more sense to consider credible intervals (an interval such that the probability that the real value is within the interval is N%) than to consider confidence intervals (an interval that is constructed in a way that has a N% probability of being an interval that contains the real value).
spider-mario
2025-06-03 01:49:50
I am at least of that opinion 😄
2025-06-03 01:50:07
https://sami.boo/jaynes/confidence-intervals-vs-bayesian-intervals/ has a couple of examples where they might diverge (I especially like the truncated exponential)
_wb_
2025-06-03 01:58:23
Question is how different these intervals are in this case (confidence interval computed by bootstrapping; parameter estimation procedure based on maximum likelihood estimation of a JND scale according to Thurstonian Case V). It may very well boil down to the same thing. I do agree though that it makes more sense to talk about credible intervals instead of confidence intervals; the only reason why confidence intervals were computed is that "this is what is usually done in the literature" but that's of course a bad argument — there are plenty of examples of "tradition" being just wrong.
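For reference, the two constructions under discussion can be sketched on a toy normal-mean example (a hedged illustration only, unrelated to the actual Thurstonian JND fitting; under a flat prior on a normal mean, the two intervals nearly coincide):

```python
import random
import statistics

random.seed(42)
data = [random.gauss(3.0, 1.0) for _ in range(200)]
mean = statistics.fmean(data)

# Bootstrap percentile 95% *confidence* interval for the mean:
# resample the data with replacement and take the 2.5th/97.5th percentiles.
boot = sorted(
    statistics.fmean(random.choices(data, k=len(data))) for _ in range(2000)
)
ci = (boot[int(0.025 * 2000)], boot[int(0.975 * 2000)])

# Flat-prior normal *credible* interval: mean +/- 1.96 standard errors.
stderr = statistics.stdev(data) / len(data) ** 0.5
cred = (mean - 1.96 * stderr, mean + 1.96 * stderr)

assert ci[0] < mean < ci[1]
assert cred[0] < mean < cred[1]
```

In this well-behaved case the two intervals are nearly identical; the divergent examples (like the truncated exponential) arise when the sampling procedure and the likelihood pull in different directions.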
2025-06-09 01:19:58
The long paper on JXL is publicly available now: https://arxiv.org/abs/2506.05987