JPEG XL

on-topic

Whatever else

Traneptora
2024-01-29 03:50:20
including mpv?
2024-01-29 03:50:29
or plplay
diskorduser
2024-01-29 03:53:08
2024-01-29 03:55:34
even gimp also opens it with wrong colors
Traneptora
diskorduser even gimp also opens it with wrong colors
2024-01-29 03:58:02
versions?
diskorduser
2024-01-29 03:58:25
gimp 2.10 from arch repo
Traneptora
2024-01-29 03:58:32
of cjpegli
2024-01-29 03:58:42
can you upload the sample PNG?
2024-01-29 03:59:36
ah wait
2024-01-29 03:59:43
the input has an alpha channel doesn't it
diskorduser
2024-01-29 03:59:52
https://wallhaven.cc/w/2ydj6m
Traneptora
2024-01-29 04:00:31
cjpegli doesn't support input with alpha
2024-01-29 04:00:45
the bug here is that it's silently producing the wrong output
2024-01-29 04:00:56
rather than failing
diskorduser
2024-01-29 04:01:34
It should ignore the alpha channel, right?
Traneptora
2024-01-29 04:01:44
theoretically yes but some bug is preventing it from doing so
2024-01-29 04:02:01
it looks like it's decoding the alpha image to a pixel buffer and reinterpret-casting `RGBA[4]` to `RGB[3]`
2024-01-29 04:02:15
which is causing the colors to be incorrect
diskorduser
2024-01-29 04:02:41
mpv shows it as gbrp
Traneptora
2024-01-29 04:02:53
Yea, what's happening is the original image has an alpha channel
2024-01-29 04:03:04
cjpegli calls libpng, gets the decoded pixel buffer
2024-01-29 04:03:09
which is an `rgba` buffer
2024-01-29 04:03:14
but it interprets it as `rgb24` which is incorrect
2024-01-29 04:03:35
if you open the original in GIMP and strip the alpha channel
2024-01-29 04:03:46
and then save it as a PNG, and then encode with cjpegli it works fine
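The stride mismatch described above can be reproduced in miniature: reading a 4-bytes-per-pixel RGBA buffer with a 3-byte RGB stride makes the channels slide, so alpha bytes leak into the colors. A hypothetical sketch, not cjpegli's actual code:

```python
# Sketch of the bug described above: an RGBA pixel buffer read with an
# RGB (3-byte) stride. Hypothetical illustration, not cjpegli's code.

def misread_rgba_as_rgb(rgba: bytes, num_pixels: int):
    """Interpret a 4-bytes-per-pixel buffer as if it were 3 bytes per pixel."""
    return [tuple(rgba[i * 3:i * 3 + 3]) for i in range(num_pixels)]

# Two opaque pixels: pure red, pure green.
buf = bytes([255, 0, 0, 255,   # R G B A
             0, 255, 0, 255])  # R G B A

# Correct read (stride 4, drop the alpha byte):
correct = [tuple(buf[i * 4:i * 4 + 3]) for i in range(2)]
# Misread (stride 3): only the first pixel survives intact.
wrong = misread_rgba_as_rgb(buf, 2)

print(correct)  # [(255, 0, 0), (0, 255, 0)]
print(wrong)    # [(255, 0, 0), (255, 0, 255)]
```

After the first pixel, every subsequent pixel picks up bytes from the wrong channels, which is exactly the kind of systematic color skew reported here.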
diskorduser
2024-01-29 04:03:58
okay. i will test it again
Traneptora
2024-01-29 04:04:07
but this is definitely a bug in cjpegli
diskorduser
2024-01-29 04:06:58
now everything opens it fine after removing alpha channel
2024-01-29 04:07:02
except mpv
2024-01-29 04:08:45
Maybe the mpv doesn't work with xyb jpegs or something.
Traneptora
diskorduser Maybe the mpv doesn't work with xyb jpegs or something.
2024-01-29 04:11:20
it does, what is your mpv and avcodec version
2024-01-29 04:12:04
and libplacebo version
2024-01-29 04:12:30
I put a lot of bug fixes into both avcodec's mjpegdec and libplacebo specifically to make that work
diskorduser
2024-01-29 04:14:32
I turned off my laptop. I will check the version later. I installed fresh arch like 3 days ago. So it should have the latest mpv I think.
Traneptora
2024-01-29 04:15:29
https://github.com/libjxl/libjxl/issues/3219
diskorduser I turned off my laptop. I will check the version later. I installed fresh arch like 3 days ago. So it should have the latest mpv I think.
2024-01-29 04:16:11
are you using `--vo=gpu-next`?
2024-01-29 04:16:25
`--vo=gpu --gpu-api=opengl` doesn't color manage properly
diskorduser
2024-01-29 04:16:51
I will check and report tomorrow
Traneptora
2024-01-29 04:17:54
I ask because out of the box mpv i.e. `mpv --no-config --pause xyb.jpg` fails
2024-01-29 04:18:05
but `mpv --no-config --pause --vo=gpu-next xyb.jpg` does work
2024-01-29 04:18:37
eventually they'll make `vo=gpu-next` the default and a lot of problems will go away overnight
Posi832
Traneptora https://github.com/libjxl/libjxl/issues/3219
2024-01-29 10:21:56
There's already an issue for this : / https://github.com/libjxl/libjxl/issues/2671
Traneptora
2024-01-29 10:23:09
huh, TIL
Posi832 There's already an issue for this : / https://github.com/libjxl/libjxl/issues/2671
2024-01-29 10:23:44
closed mine as duplicate and mentioned it in the comments, thanks for pointing it out
Posi832
2024-01-29 10:26:23
You're welcome
yurume
2024-01-30 10:02:30
question: is it even possible to detect whether a particular lossless file actually went through JPEG?
veluca
2024-01-30 10:03:08
not detect for sure, but there's a few heuristics you can do
yurume
2024-01-30 10:03:23
assuming that the quality was low enough to be considered "that's a JPEG artifact" by someone who knows what to look for
veluca
2024-01-30 10:03:29
which basically boil down to "do a DCT and see if a few values are suspiciously close to 0"
yurume
2024-01-30 10:04:10
that was indeed my first instinctive answer, but I wondered if other techniques exist
2024-01-30 10:05:06
my question stems from the possibility of treating such lossless input as if it were lossily compressed
jonnyawsom3
yurume my question stems from a possibility to treat such lossless input as if lossily compressed
2024-01-30 10:37:21
In what way? Using lossy encoding on already lossy images?
yurume
2024-01-30 10:37:51
yeah, essentially a reconstruction of JPEG from PNG, so to say
jonnyawsom3
2024-01-30 10:38:43
Hmm, so detect the 8x8 blocks, the 0 DCT, and then use the same 8x8 blocks to avoid more artefacts?
lonjil
2024-01-30 11:04:06
I think I've seen something like that, but google is unhelpful.
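The heuristic veluca describes can be illustrated: transform an 8×8 block with a 2-D DCT and check whether suspiciously many coefficients are (almost) exactly zero, which is the fingerprint left by JPEG quantization. A toy sketch with synthetic blocks, not production detection code:

```python
import numpy as np

# 8x8 orthonormal DCT-II basis matrix.
N = 8
n = np.arange(N)
C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
C[0, :] = np.sqrt(1.0 / N)

def dct2(block):
    """2-D DCT of an 8x8 block."""
    return C @ block @ C.T

def near_zero_fraction(pixels, thresh=1e-6):
    """Fraction of DCT coefficients that are (almost) exactly zero."""
    return np.mean(np.abs(dct2(pixels)) < thresh)

rng = np.random.default_rng(0)

# A block that went through JPEG-style processing: the high-frequency
# coefficients were quantized away entirely, then inverse-transformed.
coeffs = rng.normal(size=(N, N)) * 50
coeffs[4:, :] = 0
coeffs[:, 4:] = 0
jpeg_like = C.T @ coeffs @ C   # inverse DCT back to "pixels"

noise = rng.normal(size=(N, N)) * 50  # a block that never touched a DCT

print(near_zero_fraction(jpeg_like))  # high: the zeroed coeffs come back ~0
print(near_zero_fraction(noise))      # ~0: noise has no such structure
```

A real detector would also have to find the 8×8 grid alignment first, and look for clustering near multiples of a quantization step rather than only at zero.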
diskorduser
Traneptora but `mpv --no-config --pause --vo=gpu-next xyb.jpg` does work
2024-01-30 04:51:29
with gpu-next mpv works correctly.
Traneptora
2024-01-30 05:11:46
👍
2024-01-30 05:11:56
I want us to make it the default already tbh
Quackdoc
2024-01-30 05:23:58
isn't it mandatory now anyways?
Traneptora
2024-01-30 05:24:07
is what mandatory
Quackdoc
2024-01-30 05:29:15
libplacebo
Traneptora
2024-01-30 05:29:40
yes, but `--vo=gpu --gpu-api=opengl` doesn't use it
Quackdoc
2024-01-30 05:30:20
yes I know, I'm just saying: now that it's mandatory, is there even a real reason not to have it default?
Traneptora
2024-01-30 05:30:33
still "experimental" or someth
Quackdoc
2024-01-30 05:31:41
the most stable experimental experience in the world lol
Traneptora
2024-01-30 05:48:21
zerover
diskorduser
2024-01-31 10:55:45
I don't see cjxl in windows zips. Is it supposed to be like that? https://artifacts.lucaversari.it/libjxl/libjxl/latest/
Moritz Firsching
2024-01-31 11:57:58
are they present in the static ones?
Olav
2024-01-31 03:58:16
Yes it is
gb82
2024-01-31 10:06:06
Hey, I had a question about licensing with proprietary codecs. Specifically, I am curious about VVC, which to my understanding is a patented technology. Despite this, it appears that FFmpeg is able to ship with their native ffvvc decoder for VVC decoding. How is this possible? Web browsers have not incorporated built-in decoding for other patented codecs like HEVC, but it seems like it is not only somehow possible but also GPL compatible. Is there a specific reason for this discrepancy? And then for codecs like JPEG-XL, which are published behind a paid ISO spec, I take it there aren't any restrictions for writing code around the format, even if you haven't bought the specification. How do patent pools differ from this? VVC's spec is open if I recall, but the codec itself is patented. Seems confusing to understand
afed
2024-01-31 10:45:56
it's mostly related to commercial use and binaries, and not in all countries. videolan binaries and compiled ffmpeg are hosted in france for this reason. free open source code and other codec implementations typically don't require licensing and royalties <https://www.videolan.org/legal.html>
_wb_
2024-01-31 11:28:34
There is copyright and then there are patents. Code has a copyright license that can be FOSS or not (e.g. x265/x266/libjxl are FOSS so you can freely copy them). The ISO standards have a copyright that does not allow free distribution. But that only matters for the standard text itself. Patents apply to ideas. JXL is not patent-encumbered so anyone can use it without having to worry about royalties. HEVC and VVC are patent-encumbered so you may have to worry about royalties, regardless of what implementation you use. In terms of copyright you can use x265/x266 freely but in terms of patents, you may still have to pay royalties when you are using it.
lonjil
2024-02-01 12:01:46
I don't think the phrasing about patents is entirely correct. Patents apply to *inventions*. Which I suppose you can say is a kind of an idea, but if I have the idea "a machine that makes pancakes", I can't patent that. If there's some kind of problem that affects pancake making machines, and I invent a solution to that problem, I could patent that, and anyone who wants to build better pancake making machines than were previously possible would have to license my patent. Well, that's how it works in theory, anyway. Patent offices are very incompetent when it comes to evaluating patent applications involving software and computer hardware. Did you know SiFive has a patent for something Intel and others have been doing for decades, just because they wrote something like "in the context of RISC-V" in their patent application?
Quackdoc
2024-02-01 01:04:58
rather than inventions, it would be more accurate to say implementations, since you can iterate on ideas and patent the iterations.
gb82 Hey, I had a question about licensing with proprietary codecs. […]
2024-02-01 01:10:52
ffmpeg doesn't distribute anything; however, in regards to what you are asking, ffvvc being included in lgpl and gpl builds is IMO a clear violation of lgpl and gpl, since they explicitly require that no patent fees are charged, and both via-la's and access advance's patent pools require payments for decoders (well, technically via-la uses the term codec so that may not actually apply here, but advance does use "decoder and/or encoder")
gb82
_wb_ There is copyright and then there are patents. […]
2024-02-01 01:20:30
Ah okay, so as far as I am understanding, a patent may demand that a user pays royalties, and is agnostic to the software they are using, so implementation-wise it doesn't actually matter if the code is FOSS or not, depending on your use case as a corporation or individual? The copyright part I understand
Quackdoc ffmpeg doesn't distribute anything, however in regards to what you are asking […]
2024-02-01 01:21:13
where is the specific language around that? I believe you, but there might be some sort of loophole since FFmpeg has a native HEVC decoder too (afaik)
Quackdoc
2024-02-01 01:22:03
check the pdf here https://accessadvance.com/licensing-programs/vvc-advance/
gb82
2024-02-01 01:22:19
I mean in the GPL
2024-02-01 01:22:48
ah I'll look for it
Quackdoc
2024-02-01 01:24:47
```
You may not impose any further restrictions on the exercise of the rights
granted or affirmed under this License. For example, you may not impose a
license fee, royalty, or other charge for exercise of rights granted under
this License, and you may not initiate litigation (including a cross-claim
or counterclaim in a lawsuit) alleging that any patent claim is infringed
by making, using, selling, offering for sale, or importing the Program or
any portion of it.
```
2024-02-01 01:26:07
also
2024-02-01 01:26:12
```
12. No Surrender of Others' Freedom.

If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
```
2024-02-01 01:27:17
this was from gplv3; gplv2 may have had some different terms, but the gist is the same
afed
2024-02-01 01:32:23
vvc is no different from aac, avc, hevc and other non-royalty-free formats as far as code distribution goes, but implementations may not be compatible with gpl/lgpl. free open source implementations just don't grant patents
lonjil
Quackdoc ``` You may not impose any further restrictions on the exercise of the rights granted or affirmed under this License. […] ```
2024-02-01 01:33:31
I don't believe either of these really apply
2024-02-01 01:33:49
The restrictions aren't imposed by whoever wrote the code, they're imposed by the patent holders.
2024-02-01 01:34:18
And the authors of the code presumably have not entered any agreement with them to impose those restrictions on anyone else.
2024-02-01 01:34:53
Distribution is most likely illegal, but not due to any violation of copyright law.
afed
2024-02-01 01:37:51
it's the same as there are non-free and free implementations of aac encoders/decoders for ffmpeg
Traneptora
gb82 Hey, I had a question about licensing with proprietary codecs. […]
2024-02-01 01:40:45
so here is where you get into the difference between copyrights and patents
2024-02-01 01:40:59
code can be copyrighted, and algorithms can be patented, but you can't patent code or copyright an algorithm
2024-02-01 01:41:10
they're governed by separate, but related, branches of intellectual property law
2024-02-01 01:41:38
with regard to the laws in the United States, implementing a patented algorithm in open-source code is entirely legal
2024-02-01 01:42:35
the precedent that judges have ruled is that source code is governed under first-amendment free speech, and providing source code for software that implements a patented algorithm is functionally equivalent to describing how an invention works - which is also public information, since patents are public record
2024-02-01 01:43:05
so it's not illegal in the US to implement a patented algorithm in source code
2024-02-01 01:43:34
distributing *binaries* that implement a patented algorithm *would* be subject to American patent law
2024-02-01 01:43:43
this is why, for years, LAME (the mp3 encoder) was source-only.
2024-02-01 01:44:02
this is all as it pertains to American IP law
2024-02-01 01:44:17
in Europe it's much simpler - algorithms and software patents don't exist in the EU
2024-02-01 01:45:26
that said, in the US, if the technologies are patented, then it's up to the patent holder how to handle that
2024-02-01 01:45:38
in the case of Fraunhofer's patents on mp3, they chose to be restrictive
2024-02-01 01:45:48
the JXL devs chose explicitly to do the opposite
afed
2024-02-01 01:52:27
also, i hope the xhe-aac/usac decoder will be ready for the next ffmpeg release <:H266_VVC:805858038014672896>
gb82
2024-02-01 02:00:54
So overall, what I'm getting is:
- Copyright applies to code, patents apply to inventions/implementations. ISO standards have a copyright on the text, but this doesn't restrict implementing the ideas
- Patents apply regardless of the code implementation. So using a FOSS codec implementation doesn't absolve you of potential patent licensing fees if the codec is patented
- The GPL explicitly prohibits imposing additional restrictions like patent royalties on exercising the GPL rights. But patent holders can still demand royalties separately from the GPL license
- It's legal in the US to implement patented algorithms in source code due to free speech protections. Distributing binaries may still incur patent liability
- In the EU software patents don't really exist. In the US it's up to the patent holder how to enforce them

My conclusion is that it's legally permissible to implement patented codecs in open source software, but patent holders can still demand licensing fees for usage depending on jurisdiction. And it seems like the GPL doesn't provide protection against separate patent claims
2024-02-01 02:11:04
> for exercise of rights granted under this License
this is the important part with GPL I think, considering the software is separate from the patented invention
Traneptora
gb82 So overall, what I'm getting is: […]
2024-02-01 03:25:31
this is all correct yes
Quackdoc
gb82 So overall, what I'm getting is: […]
2024-02-01 06:22:17
GPL does provide protections against it. When you develop GPL software and you use GPL libraries, there can be no restrictions applied as per gpl's terms, and that includes the need to pay royalties
lonjil I don't believe either of these really apply
2024-02-01 06:25:04
it does, this is why fedora/RHEL needed to strip the still-patented codecs from libfdk-aac before distribution
gb82
2024-02-01 06:34:47
oh right, so Red Hat agrees with your take here
sklwmp
2024-02-01 08:33:05
trying to libvips-convert some XSane-generated PNM files led me to discover that libvips cannot handle PNM files with two or more lines of comments lol
2024-02-01 08:33:10
had to use imagemagick
2024-02-01 08:33:16
time to open an issue
2024-02-01 08:34:46
simply using the following PBM file breaks libvips:
```pbm
P1
#
#
1 1
0
```
2024-02-01 08:35:00
output:
```
(vips:2151): GLib-GObject-CRITICAL **: 16:34:01.737: value "0" of type 'gint' is invalid or out of range for property 'width' of type 'gint'
(vips:2151): GLib-GObject-CRITICAL **: 16:34:01.743: value "0" of type 'gint' is invalid or out of range for property 'width' of type 'gint'
```
2024-02-01 08:35:56
adding numbers to the comments breaks libvips even further:
Image:
```
P1
# 0
# 0
1 1
0
```
libvips output:
```
(vips:2280): GLib-GObject-CRITICAL **: 16:35:49.029: value "0" of type 'gint' is invalid or out of range for property 'width' of type 'gint'
(vips:2280): GLib-GObject-CRITICAL **: 16:35:49.029: value "0" of type 'gint' is invalid or out of range for property 'height' of type 'gint'
(vips:2280): GLib-GObject-CRITICAL **: 16:35:49.034: value "0" of type 'gint' is invalid or out of range for property 'width' of type 'gint'
(vips:2280): GLib-GObject-CRITICAL **: 16:35:49.034: value "0" of type 'gint' is invalid or out of range for property 'height' of type 'gint'
```
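The failure mode above suggests the header tokenizer chokes once a second comment line appears, so width/height come back as 0. A comment-tolerant plain-PBM header parser is roughly what's needed; a minimal sketch (hypothetical illustration, not libvips's actual code or fix):

```python
def parse_pbm_header(data: bytes):
    """Parse a plain (P1) PBM header, skipping '#' comment lines anywhere
    between tokens. Returns (width, height, offset_of_pixel_data)."""
    pos = 0
    tokens = []
    while len(tokens) < 3:               # need magic, width, height
        if pos >= len(data):
            raise ValueError("truncated header")
        ch = data[pos:pos + 1]
        if ch == b"#":                   # comment: skip to end of line
            while pos < len(data) and data[pos:pos + 1] != b"\n":
                pos += 1
        elif ch.isspace():
            pos += 1
        else:                            # accumulate one whitespace-delimited token
            start = pos
            while pos < len(data) and not data[pos:pos + 1].isspace():
                pos += 1
            tokens.append(data[start:pos])
    if tokens[0] != b"P1":
        raise ValueError("not a plain PBM file")
    return int(tokens[1]), int(tokens[2]), pos

# The sample that tripped libvips: two comment lines, one containing a digit.
sample = b"P1\n# 0\n# 0\n1 1\n0\n"
print(parse_pbm_header(sample)[:2])  # (1, 1)
```

The key point is that comments are consumed in the same loop that scans for tokens, so any number of them, with any content, can sit between the magic number and the dimensions.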
2024-02-01 09:51:38
https://twitter.com/CCBalert/status/1752357113618165975?s=19 what exactly is this talking about?
2024-02-01 09:52:38
oh, i assume this https://twitter.com/the_yellow_fall/status/1752157114292949007?s=19
yoochan
2024-02-01 09:55:03
ouch !
Quackdoc
2024-02-01 10:23:27
I love how people act like it's some kind of major issue, man. RCE exploits happen and get patched all the time
2024-02-01 10:23:31
bros need to chill
sklwmp
sklwmp trying to libvips-convert some XSane-generated PNM files led me to discover that libvips cannot handle PNM files with two or more lines of comments lol
2024-02-01 10:30:09
aaaaand it's a one-line fix <:KekDog:805390049033191445>
yurume
Quackdoc I love how people act like its some kind of major issue, man RCE exploits happen and get patched all the time
2024-02-01 10:30:59
correct but still not a good thing to have
Quackdoc
yurume correct but still not a good thing to have
2024-02-01 10:31:29
~~so when the rust rewrite?~~
2024-02-01 10:31:31
https://cdn.discordapp.com/emojis/867794291652558888.webp?size=48&name=dogelol&quality=lossless
yurume
2024-02-01 10:38:41
seriously, do you want one?
2024-02-01 10:39:10
more accurately: will core devs ever want to migrate to Rust if feasible?
Quackdoc
2024-02-01 10:41:21
well it would be nice to have something other than ffmpeg for sure at the very least, and typically rust is just in general a really solid target right now. rust-av iirc at one point was going to do something like that but plans fell through
2024-02-01 10:43:38
> do not submit
im assuming this is CI testing or something
Tirr
2024-02-01 10:45:34
core devs using CI env to try some potential fix
jonnyawsom3
2024-02-01 11:06:28
"Nothing to see here!"
_wb_
2024-02-01 12:49:51
I'm struggling — I'm trying to find a way to somehow visualize the set of Rec.2100 HLG colors that are _not_ within the sRGB range (either because they're out of gamut or because they're too bright), but I can't seem to find a good way to do that
2024-02-01 12:50:12
<@604964375924834314> any ideas? have you done something like this at some point?
spider-mario
2024-02-01 12:51:50
in a sense, it depends on how bright you consider sRGB to be; and if you render the “HLG way”, none will really be too bright _per se_ (although they can be too (bright×saturated))
2024-02-01 12:53:01
if you target 80 cd/m², then you render HLG by applying the OETF⁻¹ followed by a gamma of 0.81
2024-02-01 12:53:25
this 0.81 gamma may push some colours even out of Rec. 2020 gamut
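For reference, that gamma figure can be derived: BT.2390 extrapolates the HLG system gamma as 1.2 · 1.111^log2(Lw/1000), which comes out around 0.81-0.82 at an 80 cd/m² target, i.e. below 1. A sketch of the "HLG way" rendering described above (inverse OETF, then system gamma), using the BT.2100 constants; my own illustration, with the OOTF simplified to a per-value gamma:

```python
import math

# BT.2100 HLG inverse-OETF constants.
A = 0.17883277
B = 1 - 4 * A
C_ = 0.5 - A * math.log(4 * A)

def hlg_oetf_inverse(v):
    """HLG signal value [0, 1] -> normalized scene-linear light (1/12 at v=0.5)."""
    if v <= 0.5:
        return v * v / 3.0
    return (math.exp((v - C_) / A) + B) / 12.0

def hlg_system_gamma(peak_nits):
    """BT.2390 extrapolated HLG system gamma (exactly 1.2 at 1000 cd/m2)."""
    return 1.2 * 1.111 ** math.log2(peak_nits / 1000.0)

def render_hlg(v, peak_nits=80.0):
    """'HLG way' display-linear value: OETF^-1 followed by the system gamma.
    (Simplified: BT.2100 applies the gamma to luminance, not per channel.)"""
    return hlg_oetf_inverse(v) ** hlg_system_gamma(peak_nits)

print(round(hlg_system_gamma(80.0), 2))    # ~0.82 for an 80 cd/m2 target
print(round(hlg_system_gamma(1000.0), 2))  # 1.2
```

Since the gamma is below 1 at 80 cd/m², it brightens saturated colors relative to a plain inverse OETF, which is how some of them can be pushed outside even the Rec. 2020 gamut.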
damian101
spider-mario in a sense, it depends on how bright you consider sRGB to be; and if you render the “HLG way”, none will really be too bright _per se_ (although they can be too (bright×saturated))
2024-02-01 01:31:44
HLG targets a Bt.1886 SDR screen with 250 nits peak brightness.
2024-02-01 01:31:57
aside from a compatible HDR display, of course
2024-02-01 01:34:35
Most SDR Blu-rays nowadays are tonemapped with a target of 160 nits btw, a Dolby recommendation. Another recommended target is 203 nits.
2024-02-01 01:34:59
And 100 nits of course, from the BT.709 specification.
2024-02-01 01:35:52
The DCI peak white was close to sRGB iirc
2024-02-01 01:36:17
And people say HDR is complicated...
lonjil
Quackdoc it doesz this is why fedora/RHEL needed to strip the still patented codecs from libfdk aac before distribution
2024-02-01 01:36:50
I think they do that because they don't want to violate patents, not because of GPL violations. I mean, presumably they'd still strip patented stuff from anything licensed MIT, right?
spider-mario
Most SDR Blu-rays nowadays are tonemapped with a target of 160 nits btw, Dolby recommendation. Another recommended target is 203 nits.
2024-02-01 01:42:57
is that specific to HLG or is that target also used for PQ?
The DCI peak white was close to sRGB iirc
2024-02-01 01:43:09
iirc, it’s even lower (around 48 cd/m² or so)
damian101
spider-mario is that specific to HLG or is that target also used for PQ?
2024-02-01 01:43:13
PQ does not target SDR screens, it has to be tonemapped
spider-mario
2024-02-01 01:43:24
yes, my question was about tone mapping
damian101
spider-mario iirc, it’s even lower (around 48 cd/m² or so)
2024-02-01 01:44:08
you are absolutely correct
spider-mario yes, my question was about tone mapping
2024-02-01 01:45:24
SMPTE 2084 PQ itself always has a peak of 10'000 nits. Which can then be tonemapped to a peak of your liking, ideally to whatever the peak of your target display is.
SMPTE 2084 PQ itself always has a peak of 10'000 nits. Which can then be tonemapped to a peak of your liking, ideally to whatever the peak of your target display is.
2024-02-01 01:45:58
Tonemapping can be improved by taking metadata into account to get the actual peak brightness of the source, ideally per scene. This results in more of the target display's dynamic range being used.
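The fixed 10,000-nit peak mentioned above comes straight from the curve itself: the SMPTE ST 2084 EOTF maps a signal of 1.0 to 10,000 cd/m² regardless of the display, which is why PQ content must be tonemapped for SDR. A sketch with the standard constants (my own illustration):

```python
# SMPTE ST 2084 (PQ) EOTF constants.
M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal):
    """Nonlinear PQ signal [0, 1] -> absolute luminance in cd/m2."""
    p = signal ** (1 / M2)
    return (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1) * 10000.0

print(pq_eotf(1.0))  # 10000.0: the peak is baked into the curve
print(pq_eotf(0.5))  # ~92 cd/m2: mid-signal is already near SDR white
```

Note how strongly nonlinear the curve is: half the signal range covers only about 92 of the 10,000 nits, which is what makes PQ perceptually efficient for HDR.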
2024-02-01 01:47:51
Btw, I recently noticed cjxl now targets 255 nits by default, instead of 80 nits.
_wb_
spider-mario in a sense, it depends on how bright you consider sRGB to be; and if you render the “HLG way”, none will really be too bright _per se_ (although they can be too (bright×saturated))
2024-02-01 01:47:54
Let's say sRGB is max 80 nits and with its normal standard-gamut primaries. Now consider "all of HLG", assuming max 1000 nits (which puts the SDR white at around 80 nits since the max HLG white is 12 times as bright as the corresponding SDR white) and with its normal Rec2020 primaries. Then what I'd like to see is some visualization of the boundary between "all of HLG" and "sRGB"
Btw, I recently noticed cjxl now targets 255 nits by default, instead of 80 nits.
2024-02-01 01:48:55
255 nits for "SDR white" is not an unreasonable assumption, many SDR displays are 300-400 nits nowadays so 80 nits is quite dim by today's standards...
damian101
_wb_ Let's say sRGB is max 80 nits and with its normal standard-gamut primaries. Now consider "all of HLG", assuming max 1000 nits (which puts the SDR white at around 80 nits since the max HLG white is 12 times as bright as the corresponding SDR white) and with its normal Rec2020 primaries. Then what I'd like to see is some visualization of the boundary between "all of HLG" and "sRGB"
2024-02-01 01:49:04
Well, in linear light, such a comparison would be very much unfair...
_wb_ 255 nits for "SDR white" is not an unreasonable assumption, many SDR displays are 300-400 nits nowadays so 80 nits is quite dim by today's standards...
2024-02-01 01:49:51
I totally agree, for the average user this is very much an improvement. Many people used to complain that distance 1.0 was not transparent.
Well, in Linear light, such a comparison would be very much unfair...
2024-02-01 01:51:48
By which I mean, a perceptually uniform transfer curve should ideally be used, suited for HDR, like PQ.
_wb_
2024-02-01 01:52:30
i'm not saying the visualization should be based on linear light, that would show too little detail in the darks. I was thinking more like a Rec2100 HLG RGB cube or something (or something like XYB or Lab would also work)
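One hedged starting point for the gamut half of that boundary: sample linear Rec.2020 RGB (HLG's container), convert to linear sRGB through XYZ, and flag samples with any negative or >1 component. A rough numpy sketch of only the chromaticity test; the HLG transfer and the brightness matching discussed above are left out, and the matrices are the standard ones rounded to 4 decimals:

```python
import numpy as np

# Linear Rec.2020 RGB -> XYZ (D65), and XYZ -> linear sRGB,
# standard matrices rounded to 4 decimal places.
BT2020_TO_XYZ = np.array([[0.6370, 0.1446, 0.1689],
                          [0.2627, 0.6780, 0.0593],
                          [0.0000, 0.0281, 1.0610]])
XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                        [-0.9689,  1.8758,  0.0415],
                        [ 0.0557, -0.2040,  1.0570]])

def out_of_srgb_gamut(rgb2020):
    """True where a linear Rec.2020 color has no sRGB representation
    (a negative or >1 component after conversion)."""
    srgb = rgb2020 @ (XYZ_TO_SRGB @ BT2020_TO_XYZ).T
    return np.any((srgb < 0) | (srgb > 1), axis=-1)

# The pure Rec.2020 green primary falls well outside sRGB...
print(out_of_srgb_gamut(np.array([[0.0, 1.0, 0.0]])))  # [ True]
# ...while a mid grey is representable in both.
print(out_of_srgb_gamut(np.array([[0.2, 0.2, 0.2]])))  # [False]
```

Sampling a dense grid of the cube with this predicate (after the HLG decode and a chosen brightness normalization) and rendering the boundary surface in a perceptual space like Lab or XYB would give one version of the visualization being asked for.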
Quackdoc
lonjil I think they do that because they don't want to violate patents, not because of GPL violations. I mean, presumably they'd still strip patented stuff from anything licensed MIT, right?
2024-02-01 01:52:56
the entire reason that fdk-aac is not considered GPL-compatible is explicitly that it doesn't grant patent license use. if someone were to create a VVC library that gives all users the appropriate patent licenses and protections to comply with GPL, it would be possible, but as far as I know this has pretty much never been done before. and yes, that is exactly what they did: fdk-aac-free is libfdk with the patented bits removed to make it gpl compatible
lonjil
2024-02-01 01:54:15
fdk-aac is developed by the patent holder though, isn't it?
2024-02-01 01:54:41
That's a very different situation from a 3rd party developing some software.
2024-02-01 01:55:44
the GPL prevents you, the person giving someone else software, from imposing any additional requirements. But for patents, such impositions may come from someone else, which means the person distributing the software isn't the one imposing any additional requirements, and thus aren't violating the GPL.
_wb_
2024-02-01 02:12:38
Both HEVC and VVC are heavily patent-encumbered but you can still write GPL-licensed software (or software with any other license) that implements those codecs. Patents can prevent you from legally _using_ such software though, since as soon as you run it on an actual machine then it becomes an "embodiment" of the patent-protected invention and for that you need a license from the patent holder (who may decide to grant such a license only if you pay royalties).
lonjil
lonjil the GPL prevents you, the person giving someone else software, from imposing any additional requirements. But for patents, such impositions may come from someone else, which means the person distributing the software isn't the one imposing any additional requirements, and thus aren't violating the GPL.
2024-02-01 02:13:07
The fdk-aac license specifically imposes such requirements (and so presumably doesn't count as FOSS, actually)
> You may use this FDK AAC Codec software or modifications thereto only for purposes that are authorized by appropriate patent licenses.
Quackdoc
2024-02-01 02:26:36
GPLv2 implies a patent license grant, https://copyleft.org/guide/comprehensive-gpl-guidech7.html GPLv3 explicitly requires a patent grant. Just because you write the code as GPL and distribute it as such doesn't mean you actually have the right to give a patent license grant. assuming a 3rd party creates a library that uses patented technologies and says "here, have a patent grant", that doesn't mean you actually have one. now I'm sure you may have some degree of protection under the law in some states, I think Canada does have a degree of protection? But regardless, you could for sure be penalized for continuing to use the patents without an appropriate license from that point forwards.
afed
2024-02-01 02:36:55
most of the ffmpeg encoders and decoders are patent encumbered; for example x264/x265 have lgpl compatibility even though they are commercial encoders and the formats require royalties. so vvc is in the same boat, but implementations may have specific or extra licensing requirements
2024-02-01 02:41:04
in theory there could be even patented open source encoders for av1, even if the format is royalty free because they use some proprietary techniques or preprocessing to improve encoding quality, which require royalties
Quackdoc
2024-02-01 02:43:47
yeah I don't know why they have it listed as (l)gpl compatible, fedora/rhel actually just outright disables everything then re-enables things piecemeal https://src.fedoraproject.org/rpms/ffmpeg/blob/rawhide/f/ffmpeg.spec
afed
2024-02-01 02:45:00
https://mailman.videolan.org/pipermail/x264-devel/2010-July/007508.html
Quackdoc
2024-02-01 02:46:51
yeah... I always just chalked this up to videolan not caring because they don't need to care
2024-02-01 02:47:38
also because gplv2 only *implies* a patent license, there is ofc legal gray area, if it was gplv3 it would be a lot more cut and dry
afed
2024-02-01 02:58:01
there are no gray areas for source code, but for binary files, or when it's distributed in software by a company providing commercial services, some special licensing may be required
spider-mario
SMPTE 2084 PQ itself always has a peak of 10'000 nits. Which can then be tonemapped to a peak of your liking, ideally to whatever the peak of your target display is.
2024-02-01 03:12:16
I’m aware of that, but if Dolby has a recommendation as to which peak to use when mapping HLG, it would seem plausible that it might also have one when tone mapping PQ – does it?
lonjil
2024-02-01 03:35:14
> GPLv2 implies a patent license grant, https://copyleft.org/guide/comprehensive-gpl-guidech7.html GPLv3 explicitly requires a patent grant, Just because you write the code as GPL and distribute it as such doesn't mean you actually have the right to give a patent license grant. That's so that a company like Microsoft can't contribute code and then later add restrictions by claiming to hold relevant patents. It isn't relevant if the code contributor doesn't have any relevant patents.
2024-02-01 03:36:17
> assuming a 3rd party creates a library that uses patented technologies and says "here have a patent grant", doesn't mean you actually have one, now I'm sure you may have some degree of protection under the law in some states, I think Canada does have a degree of protections? But regardless, you could for sure be penalized for continuing to use the patents without an appropriate license from that point forwards. Yes, the patent holder could sue you. But that doesn't really affect GPL compatibility.
damian101
spider-mario I’m aware of that, but if Dolby has a recommendation as to which peak to use when mapping HLG, it would seem plausible that it might also have one when tone mapping PQ – does it?
2024-02-01 03:37:08
Oh, the recommendation is a general one. What SDR peak to master towards for general distribution. And since PQ maps brightness absolutely, the 160 nits recommendation remains. HLG kind of does, too, since 1000 nits is the default peak brightness, but there's a different HLG transfer function for any peak brightness > 250 nits, which might have been used during creation, so you never know for sure.
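The absolute-vs-relative distinction is easy to make concrete: the PQ (SMPTE ST 2084) EOTF maps a code value straight to a luminance in nits, independent of the display. A minimal Python sketch using the constants from the published spec:

```python
# SMPTE ST 2084 (PQ) EOTF: normalized signal in [0, 1] -> absolute luminance.
# Constants as published in the spec; the peak is always 10,000 nits.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Map a PQ code value (0..1) to luminance in cd/m^2 (nits)."""
    e = signal ** (1 / M2)
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)
    return 10000.0 * y
```

`pq_eotf(1.0)` is 10,000 nits by construction, and a code value of about 0.75 lands near 1000 nits; tone mapping down to a real display's peak is a separate step applied on top of this absolute mapping.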
Quackdoc
lonjil > assuming a 3rd party creates a library that uses a patented technologies and says "here have a patent grant", doesn't mean you actually have one, now I'm sure you may have some degree of protection under the law in some states, I think Canada does have a degree of protections? But regardless, you would for sure be able to be penalized for continuing to use the patents without an appropriate license from that point forwards. Yes, the patent holder could sue you. But that doesn't really affect GPL compatiblity.
2024-02-01 03:43:57
> Yes, the patent holder could sue you. But that doesn't really affect GPL compatibility. it does. for GPLv3 it's explicit; for gplv2, keep this in mind from section 7 > **If, as a consequence of a court judgment** or allegation of patent infringement or for any other reason (not limited to patent issues), conditions are imposed on you (whether by court order, agreement or otherwise) **that contradict the conditions of this License**, they do not excuse you from the conditions of this License. If you cannot distribute so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not distribute the Program at all. **For example, if a patent license would not permit royalty-free redistribution of the Program by all those who receive copies directly or indirectly through you, then the only way you could satisfy both it and this License would be to refrain entirely from distribution of the Program.**
2024-02-01 03:45:13
if the end user (indirectly) or distributor (directly) needs to pay royalties when they receive a copy, or distribute one, then it's no bueno
lonjil
2024-02-01 03:45:39
So you're saying that if someone shows up with a patent for something in Linux, suddenly every person who does anything with Linux is a GPL violator?
2024-02-01 03:46:35
But actually
> conditions are imposed on you
this only matters *if conditions have been imposed*
Quackdoc
2024-02-01 03:46:37
if someone shows up with a patent, they win a court case, and make it so distributors need to pay royalties, then yes
lonjil
2024-02-01 03:47:09
which is not something that has been brought up as actually being the case w.r.t. anything we've talked about
Quackdoc
lonjil But actually > conditions are imposed on you this only matters *if conditions have been imposed*
2024-02-01 03:47:10
that's like saying murder is illegal only if you get caught
lonjil
2024-02-01 03:47:15
not at all
Quackdoc
2024-02-01 03:47:20
it is.
lonjil
2024-02-01 03:48:03
Didn't we establish that transmitting source code is protected by the first amendment in the US?
Quackdoc
2024-02-01 03:48:07
Patent usage has terms; you abide by those terms, or you don't. But you don't get to pretend you are in line with those terms when you aren't
lonjil Didn't we establish that transmitting source code is protected by the first amendment in the US?
2024-02-01 03:48:23
GPL is about binaries too not just code.
lonjil
2024-02-01 03:48:31
Sure
2024-02-01 03:48:55
But you're sorta also proving my point?
Quackdoc
2024-02-01 03:48:56
anybody can say "just don't compile and distribute binaries" but practicing that is different
2024-02-01 03:49:12
a GPL violation is a violation if it makes the binaries undistributable
2024-02-01 03:49:29
thats literally the point of GPL
lonjil
Quackdoc Patent usage has terms, you abide by those terms, or you don't. But you don't get to pretend you are in line with those terms when you arent
2024-02-01 03:49:37
so it doesn't actually matter what license it is. Those patent concerns apply regardless.
Quackdoc
2024-02-01 03:50:23
correct, GPL is one of the only licenses that turns that on its head and says if there are these concerns, then it isn't GPL compatible.
2024-02-01 03:51:09
ofc GPL makes explicitly sure to say that it applies to binary distribution; AGPL goes a bit of a step further in limiting the actual use of the binary, GPLv3 also *kinda* does
lonjil
2024-02-01 03:53:26
AGPL does not limit usage
Quackdoc
2024-02-01 03:53:47
ah, maybe there is confusion here. when something is GPL incompatible, it means you cannot distribute the binary. GPL, even v3, explicitly states that the restrictions of the license do not apply to "internal" usage where an entity uses the incompatible code in house
lonjil
2024-02-01 03:54:19
It states that because copyright law can't control internal use.
Quackdoc
lonjil AGPL does not limit usage
2024-02-01 03:54:29
> If your software can interact with users remotely through a computer network, you should also make sure that it provides a way for users to get its source
2024-02-01 03:54:44
I dunno, seems like it does to me
lonjil
2024-02-01 03:55:03
that's not in the text of the license
2024-02-01 03:55:21
it says END OF TERMS AND CONDITIONS just a bit higher up
Quackdoc ah, maybe there is confusion here. when something is GPL incompatible, it means you cannot distribute the binary, GPL, even v3 explicitly states the the restrictions of the license do not apply to "internal" usages where an entity will use the incompatible code in house
2024-02-01 03:56:03
If I take the code of something with a GPL-incompatible license and mash it up with the code of a GPL'd project, distributing that code would almost certainly be a violation.
Quackdoc
2024-02-01 03:56:12
> Section 13. Remote Network Interaction; Use with the GNU General Public License.
>
> Notwithstanding any other provision of this License, if you modify the Program, your modified version must prominently offer all users interacting with it remotely through a computer network (if your version supports such interaction) an opportunity to receive the Corresponding Source of your version by providing access to the Corresponding Source from a network server at no charge, through some standard or customary means of facilitating copying of software. This Corresponding Source shall include the Corresponding Source for any work covered by version 3 of the GNU General Public License that is incorporated pursuant to the following paragraph.
lonjil
2024-02-01 03:56:22
yeah, says literally nothing about usage
2024-02-01 03:56:26
that talks about modification
2024-02-01 03:56:42
if you modify the code, you must ensure that your modified version behaves in a certain way
2024-02-01 03:56:54
doesn't impose any requirement on how you use it
lonjil If I take the code of something with a GPL-incompatible license and mash it up with the code of a GPL'd project, distributing that code would almost certainly be a violation.
2024-02-01 03:58:28
There's just a lot of ambiguity about what counts as a derivative work. Binaries are easy to talk about because obviously putting a bunch of stuff into a single binary makes for a derivative work. Having clearly separate parts inside a single repo probably doesn't. But you can absolutely end up with a combined derivative work in source form.
Quackdoc
2024-02-01 04:00:46
the issue is that this section denotes "remote network interaction", as shown by this paragraph, to not fall under internal use and thus outside of GPL's restrictions, and it may or may not run afoul of patent licensing terms. Now it is arguable because of the sentence "if you modify the Program", but that's for lawyer shenanigans to prove or disprove
lonjil
2024-02-01 04:02:22
it's just an extremely poorly written license trying to be clever to hack the law
Quackdoc
2024-02-01 04:03:21
exactly, and that creates ambiguity, and ambiguity breeds trouble
Traneptora
sklwmp https://twitter.com/CCBalert/status/1752357113618165975?s=19 what exactly is this talking about?
2024-02-01 04:40:07
the risk is that the parser crashes, not RCE fwiw the bug has already been fixed and it's been included in the 6.1.1 release
_wb_
2024-02-01 04:41:49
You can never rule out that some troll comes out of his cave, looks at any piece of software (GPL'ed or whatever), considers it to infringe on their patents, tracks anyone using it and sues them. Regardless of whether they actually have a valid claim or not, this litigation is expensive and unless it is extremely clearly a bogus claim (e.g. the software in question is more than 20 years old), it will likely be cheaper to settle than to try to win in court. This is always true regardless of how hard you try to avoid this scenario — that's just how the patent system works. It is silly and it sucks.
Traneptora
2024-02-01 04:47:23
GPL is also exceptionally verbose
_wb_
2024-02-01 04:49:20
To try to somewhat mitigate this patent troll risk, for jxl we did the following things:
- try very hard not to put any known patent-encumbered or potentially patent-encumbered thing in the spec (besides patents we filed ourselves)
- defensively patent stuff related to jxl just so we can explicitly grant a universal license to anyone. This serves two purposes (compared to not patenting anything in the first place):
1. makes it harder for anyone else to claim patents relevant to jxl
2. a defensive clause that revokes the license in case someone else litigates, so if a patent troll makes claims on jxl, Google/Cloudinary can counter-sue them
afed
2024-02-01 04:54:32
yeah, even opus or vp8 faced patent claims and litigation may take decades to resolve
Quackdoc
2024-02-01 04:56:10
thankfully with GPL software as a business, you can always just punt it up the chain. so say opus has been found to infringe on some patents: as a business you can usually just stop using opus and the story will end there. ofc this isn't a happy ending, but it prevents smaller businesses from getting *royally* screwed over
2024-02-01 04:56:11
hehe
Traneptora
2024-02-01 05:04:01
well, the clearest example is x264
2024-02-01 05:04:04
a GPL H.264 encoder
afed
2024-02-01 05:13:57
also that's why hevc only recently started being used as a streaming format. even though the biggest patent holders allowed it without royalties back in 2018 (for free public streaming), some of the patents were claimed by Velos, which was not part of that patent pool and had different terms and royalty payments, so most companies were still afraid to use hevc for streaming. but, recently, Velos ended its patent pool https://rethinkresearch.biz/articles/velos-ends-its-patent-pool-for-hevc-simplifying-video-codec-licensing-picture/
Quackdoc
2024-02-01 05:20:55
x264 binaries are considered non-distributable by a few folk; technically they would be "non-distributable" in any country that upholds software patent law, and distributable in countries that don't. This is kinda complicated because gplv2 only implies the patent rights. x264 and x265 are completely gplv3 incompatible though
2024-02-01 05:21:55
ofc most people don't care nor do they need to, ~~I'm the kind of person who would pirate the windows HEVC licence even if it's 99% off~~
2024-02-01 05:21:57
https://cdn.discordapp.com/emojis/867794291652558888.webp?size=48&name=dogelol&quality=lossless
Traneptora
Quackdoc I love how people act like its some kind of major issue, man RCE exploits happen and get patched all the time
2024-02-01 05:22:46
it's not even RCE though, it's an out-of-array-access read
Quackdoc
2024-02-01 05:23:33
well, serves me right for not looking into it myself
Fraetor
2024-02-02 10:23:20
Is anyone else at FOSDEM this weekend?
Traneptora
Fraetor Is anyone else at FOSDEM this weekend?
2024-02-02 11:21:40
A lot of the ffmpeg folk are. I couldn't justify going from North America
yurume
2024-02-03 05:47:29
one of the worst uses of raster images: https://million.dev/blog/million-3.en-US (see the hero image, which is an extremely huge PNG image with a simple text and a very faint gradient background with heavy dithering, resulting in 18 MB of hard-to-compress data)
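The dithering really is the culprit here: a smooth gradient is almost free to compress, while the same gradient with 1-bit random dithering carries roughly a bit per pixel of noise that no generic compressor can remove. A quick demonstration with Python's zlib standing in for PNG's DEFLATE (synthetic data, just to show the effect):

```python
import random
import zlib

random.seed(0)
W, H = 1024, 256

# A smooth horizontal 8-bit gradient: every row is identical, so DEFLATE
# compresses it almost to nothing.
smooth = bytes(x * 255 // (W - 1) for x in range(W)) * H

# The same gradient with 1-bit random dithering: the noise adds roughly
# 1 bit/pixel of incompressible entropy, so the compressed size balloons.
dithered = bytes(
    min(255, x * 255 // (W - 1) + random.randint(0, 1))
    for _ in range(H)
    for x in range(W)
)

s = len(zlib.compress(smooth, 9))
d = len(zlib.compress(dithered, 9))
print(f"smooth gradient:   {s:>6} bytes compressed")
print(f"dithered gradient: {d:>6} bytes compressed")
```

The dithered buffer can't compress below the entropy of the noise itself (about W*H/8 bytes here), which is why an 18 MB PNG of "a simple text and a very faint gradient" is even possible.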
Traneptora
2024-02-03 06:30:58
It does not surprise me in the slightest that React devs don't consider an 18MB front and center PNG to be an issue
2024-02-03 06:32:31
React is slow garbage and that will never change
190n
yurume one of the worst uses of raster images: https://million.dev/blog/million-3.en-US (see the hero image, which is an extremely huge PNG image with a simple text and a very faint gradient background with heavy dithering, resulting in 18 MB of hard-to-compress data)
2024-02-03 06:43:36
💀
2024-02-03 06:44:00
it's not even good quality
2024-02-03 06:44:05
like all the edges are super fuzzy
yurume
2024-02-03 06:44:39
if you can artificially increase its contrast enough, you can also see that it is not even a uniform gradient
monad
2024-02-03 06:47:33
At least it's not implemented as a small, scaled logo on every page. 👀
190n
2024-02-03 07:10:08
wait it has a fuzzy transparent edge too lmao
2024-02-03 07:10:09
wtf
2024-02-03 07:10:55
my working theory was that this was exported from a vector graphics program (probably illustrator) at way too high a dpi (and as raster instead of vector for some reason)
2024-02-03 07:11:07
but now i think it must have gotten a gaussian blur over the whole thing
monad
yurume one of the worst uses of raster images: https://million.dev/blog/million-3.en-US (see the hero image, which is an extremely huge PNG image with a simple text and a very faint gradient background with heavy dithering, resulting in 18 MB of hard-to-compress data)
2024-02-03 08:59:53
Harder for jxl--it's an e9 fail. (timings are inconsistent, cpu was doing other light work) ```u, r = Pareto frontier for time/size (u+s, r); a = part of 'all'
best of |    bpp    | MP/s (u+s) | MP/s (r)
ura       1.6169350    0.085161    0.56606   oxipng_9.0.0_o5
          1.6169350    0.085161    0.56606   all
          1.6169350    0.066307    0.52039   oxipng_9.0.0_o6
ur        1.6299835    0.17804     0.6382    oxipng_9.0.0_o4
ur        1.6507810    0.9757      1.148     oxipng_9.0.0_o2
          1.6507810    0.26721     0.9094    oxipng_9.0.0_o3
ur        1.7207080    1.907       1.906     oxipng_9.0.0_o1
          1.7296946    0.8081      1.598     cwebp_1.2.4_z9
ur        1.9944319    35.4        34.8      cwebp_1.2.4_z0
          2.0253719    4.933       4.933     cwebp_1.2.4_z4
          2.1887485    3.324       3.324     cwebp_1.2.4_z7
          2.2142698    5.343       5.339     cwebp_1.2.4_z3
          2.2626757    6.32        6.31      cwebp_1.2.4_z2
          2.2928884    3.700       3.698     cwebp_1.2.4_z6
          2.3195678    8.74        8.73      cwebp_1.2.4_z1
          2.4399981    0.14235     0.15266   cjxl_v0.9.0-a16a3e22_e9
ur        2.4784559    >6.0e+03    >6.0e+03  original_none_
          2.4784559    21.1        21.1      oxipng_9.0.0_o0
          2.4874057    4.664       4.660     cwebp_1.2.4_z5
          2.5751008    2.542       2.542     cwebp_1.2.4_z8
          2.6041731    0.52547     0.6346    cjxl_v0.9.0-a16a3e22_e8
          2.7807762    1.350       1.831     cjxl_v0.9.0-a16a3e22_e7
          3.0606478    2.362       5.431     cjxl_v0.9.0-a16a3e22_e4
          3.2489964    4.592       20.9      cjxl_v0.9.0-a16a3e22_e3
          3.4137870    2.106       3.230     cjxl_v0.9.0-a16a3e22_e6
          3.6974985    7.53        22.6      cjxl_v0.9.0-a16a3e22_e2
          3.7474613    2.876       5.111     cjxl_v0.9.0-a16a3e22_e5
          4.7557438    66          1.0e+02   cjxl_v0.9.0-a16a3e22_e1```
Traneptora
2024-02-03 05:16:55
at the least, difficult images like this are helpful to improve the tree learning
2024-02-03 05:17:04
it may be possible that the tree learning is overfitting
2024-02-03 05:17:22
when the ideal thing to do is to encode the residuals
veluca
Traneptora it may be possible that the tree learning is overfitting
2024-02-03 05:20:07
I'm not sure what you mean here 😛
Traneptora
2024-02-03 05:20:44
I mean, the image itself has a slight randomized dither pattern
2024-02-03 05:20:45
2024-02-03 05:21:03
it's possible that e9 is attempting to deduce the pattern, rather than just predicting the average
2024-02-03 05:21:09
and encoding the residuals
veluca
2024-02-03 05:21:54
hard to say what it's actually doing, but I would not expect "it learns too big of a tree" to be an issue
2024-02-03 05:22:38
it *might* mess up the subsequent lz77 pass though, which I assume is why PNG and webp do better
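The "predict the average, encode the residuals" intuition can be sketched in a few lines: with even a trivial left-neighbor predictor, a clean gradient's residuals cost almost nothing, while a dithered gradient's residuals still contain the dither itself, which the entropy coder has to pay for no matter how clever the predictor tree gets. This is a toy model, not libjxl's actual MA-tree machinery:

```python
import math
import random
from collections import Counter

random.seed(1)
N = 4096

gradient = [x * 255 // (N - 1) for x in range(N)]
dithered = [min(255, v + random.randint(0, 1)) for v in gradient]

def residuals(pixels):
    # Simple left-neighbor predictor: residual = actual - predicted.
    return [pixels[i] - (pixels[i - 1] if i else 0) for i in range(len(pixels))]

def entropy_bits(values):
    # Zeroth-order entropy of the residual stream, in bits per symbol.
    counts = Counter(values)
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Clean gradient: residuals are almost all zeros, well under 1 bit/symbol.
# Dithered gradient: the dither survives prediction, ~1.5 bits/symbol.
print(entropy_bits(residuals(gradient)))
print(entropy_bits(residuals(dithered)))
```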
Traneptora
2024-02-03 05:28:42
I'm wondering how well squeeze works on this content
veluca
2024-02-03 05:39:58
you can easily try no? `cjxl --responsive -e9 -d0` or somesuch
diskorduser
2024-02-03 05:57:14
anyone using arch linux with gnome? for me, newly encoded jxl files cannot be set as wallpaper. old jxl files work fine.
veluca
diskorduser anyone using arch linux with gnome? for me, newly encoded jxl files cannot be set as wallpaper. old jxl files work fine.
2024-02-03 06:06:27
uh, weird...
diskorduser
2024-02-03 06:08:55
HCrikki
2024-02-03 06:09:25
is that a jxl with layers?
diskorduser
2024-02-03 06:09:32
No
2024-02-03 06:10:23
2024-02-03 06:11:48
old jxl file work fine
Quackdoc
2024-02-03 06:17:19
is it a simple parser issue I wonder?
diskorduser
2024-02-03 06:20:30
this file works. like this all old jxls encoded last year work fine.
diskorduser
2024-02-03 06:21:00
this does not.
HCrikki
2024-02-03 06:22:13
one is generated as a bitstream that gnome background wasn't updated yet to parse (iinm support is supposed to be in 46)?
diskorduser
2024-02-03 06:26:19
I think the bug depends on the source files. I encoded 3 pngs to jxl. one works fine. the other two do not.
2024-02-03 06:27:09
HCrikki
2024-02-03 06:29:06
sdcs has an alpha channel
2024-02-03 06:29:36
compared properties using jxlinfo -v
diskorduser
2024-02-03 06:33:15
I think the pngs with alpha work fine as jxl. pngs without alpha do not work.
2024-02-03 06:35:25
I added alpha channel to non working png with gimp, saved it and encoded it as jxl. now they are working.
Traneptora
veluca you can easily try no? `cjxl --responsive -e9 -d0` or somesuch
2024-02-03 10:46:43
I tried `cjxl -d 0 -e 9 --progressive input.png test2.jxl`
2024-02-03 10:46:53
and it turned out to be significantly lower density
2024-02-03 10:46:55
28M vs 18M
2024-02-03 10:47:03
which doesn't really surprise me, lossless squeeze is kinda expensive
diskorduser anyone using arch linux with gnome? for me, newly encoded jxl files cannot be set as wallpaper. old jxl files work fine.
2024-02-03 10:47:40
I'm using arch linux with MATE. can you post a sample that doesn't work?
veluca
Traneptora which doesn't really surprise me, lossless squeeze is kinda expensive
2024-02-03 10:53:47
yeah, I am not very surprised either 😄
Traneptora
2024-02-04 06:39:31
I believe the matrix bridge was intentionally removed because it wasn't possible to police the spam coming from it
diskorduser
Traneptora I'm using arch linux with MATE. can you post a sample that doesn't work?
2024-02-04 07:02:08
it is just that jxls without alpha do not work.
Traneptora
diskorduser it is just jxls without alpha does not work.
2024-02-04 07:08:24
any JXL without alpha?
2024-02-04 07:16:22
I ask because my current desktop background is JXL without alpha
2024-02-04 07:16:35
diskorduser
Traneptora
2024-02-04 02:37:12
Does it work on your computer with arch+gnome?
Traneptora
2024-02-04 03:57:21
I'm not installing gnome to test, but it works fine in MATE
2024-02-04 03:57:33
which is a gtk based gnome2 fork
DZgas Ж
2024-02-04 07:17:02
hm
yoochan
DZgas Ж hm
2024-02-04 08:54:24
Interesting. Doesn't seem exactly compatible with brotli though
sklwmp
2024-02-05 12:00:18
> Existing optimized Brotli decompression functions (CPU implementations) should be able to decompress the Brotli-G bitstream, while more optimal data-parallel implementations on hosts or accelerators can further improve performance.
2024-02-05 12:00:23
from: https://gpuopen.com/brotli-g-sdk-announce/
2024-02-05 12:00:51
it should at least be able to be decompressed by regular brotli implementations
2024-02-05 12:00:53
just not the other way around
2024-02-05 12:00:57
iiuc
Traneptora
2024-02-05 12:01:39
it's a specific subset of brotli that has a very fast decompression path if used
2024-02-05 12:01:48
brotli decoders that don't have that check can still decode it normally
2024-02-05 12:01:55
at least that's how I understand it
2024-02-05 12:02:42
examples being like, 8-bit lossless JXL files with one frame can be decoded directly to a buffer
2024-02-05 12:03:15
jxlatte doesn't do this - it normalizes modular files as floats, performs frame blending, (a no-op for these files), and then denormalizes them back into integers
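The round trip Traneptora describes is lossless for 8-bit data as long as the denormalization rounds to nearest; the direct path just skips the detour entirely. A tiny sketch of the idea (illustrative only, not jxlatte's actual code; `bit_depth` is an assumed parameter):

```python
def to_float(samples, bit_depth=8):
    # Normalize integer samples to [0.0, 1.0].
    scale = (1 << bit_depth) - 1
    return [s / scale for s in samples]

def to_int(normalized, bit_depth=8):
    # Denormalize back, rounding to the nearest integer sample value.
    scale = (1 << bit_depth) - 1
    return [int(round(v * scale)) for v in normalized]

# The float round trip reproduces every 8-bit value exactly; a fast path
# that decodes straight into an integer buffer simply avoids doing it at all.
samples = list(range(256))
print(to_int(to_float(samples)) == samples)
```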
spider-mario
2024-02-05 09:01:57
I wonder how compression density compares
2024-02-05 09:02:04
and how much faster decompression gets
yoochan
2024-02-05 09:57:17
meanwhile, chrome intends to push Zstd because _Zstd is roughly three times faster than Brotli for decompression_ https://chromestatus.com/feature/6186023867908096
username
2024-02-05 11:20:24
https://github.com/GPUOpen-LibrariesAndSDKs/brotli_g_sdk/issues/1
Quackdoc
yoochan meanwhile, chrome intends to push Zstd because _Zstd is roughly three times faster than Brotli for decompression_ https://chromestatus.com/feature/6186023867908096
2024-02-05 02:03:33
at least it shouldn't be faster than brotli-g when the gpu is used for decoding... hopefully
Traneptora
2024-02-05 05:00:39
> Dr. Mark Adler foresaw this kind of extensions and demanded that we add a possibility to add comment blocks that will be skipped by the normal decoder.
sounds very much like the JXL extensions block
_wb_
yoochan meanwhile, chrome intends to push Zstd because _Zstd is roughly three times faster than Brotli for decompression_ https://chromestatus.com/feature/6186023867908096
2024-02-05 05:04:26
In what world does it matter that decompression can be done at 1.5 GB/s instead of 0.5 GB/s? Compared to typical internet connection speeds, and especially in the bigger picture when you consider all the other stuff that needs to be done with the stuff you want to apply general-purpose compression to (whether it's html, css, js or whatever), this is very much _not_ a performance bottleneck that matters, imo.
Traneptora
2024-02-05 05:11:05
it's also not clear that Zstd is 3x faster than brotli at decompression
_wb_
2024-02-05 05:22:27
I mean, facebook's benchmarks say zstd decodes at 1.5 GB/s while brotli and gzip do 'only' 0.5 GB/s. I just fail to see how this is relevant for a _web transfer encoding_. It matters maybe for something like filesystem-level compression or whatever, but for web transfer?
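A back-of-the-envelope calculation makes the point concrete. Assuming a 100 Mbit/s connection and the decode speeds quoted above (illustrative numbers only):

```python
# Time to fetch and then decompress 1 MB of compressed payload, assuming a
# 100 Mbit/s link and the 0.5 vs 1.5 GB/s decode speeds quoted above.
payload_mb = 1.0
download_s = payload_mb / 12.5    # 100 Mbit/s ~= 12.5 MB/s
brotli_s = payload_mb / 500.0     # 0.5 GB/s decode
zstd_s = payload_mb / 1500.0      # 1.5 GB/s decode

print(f"download: {download_s * 1000:.1f} ms")  # 80.0 ms
print(f"brotli:   {brotli_s * 1000:.2f} ms")    # 2.00 ms
print(f"zstd:     {zstd_s * 1000:.2f} ms")      # 0.67 ms
# The 3x decode speedup saves ~1.3 ms against an 80 ms transfer.
```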
lonjil
2024-02-05 05:24:53
The only thing I've seen for zstd is that CDNs love Brotli when they can pre-compress stuff, but don't like it at all when they have something they can't recompress and need to do it as fast as possible while servicing a request. Zstd was proposed as being better at such compression speeds.
Traneptora
2024-02-05 05:30:37
zstd having better compression speeds may be relevant as a content-encoding for on-the-fly compression, that's true
2024-02-05 05:30:48
but end-user decompression speed would not be the reason
_wb_
2024-02-05 05:31:50
https://raw.githubusercontent.com/facebook/zstd/master/doc/images/DCspeed5.png
2024-02-05 05:32:01
facebook published this plot
Traneptora
2024-02-05 05:33:14
what were the parameters used though?
Quackdoc
2024-02-05 05:33:46
I guess it may be relevant on something like an rpi3... maybe...
_wb_
2024-02-05 05:33:53
i'm assuming just effort 0 to 11?
2024-02-05 05:37:22
I don't know what kind of speed you want for on-the-fly compression, but as long as you can do streaming encode, I would assume you don't really care about speeds that are faster than the speed at which the input is being produced (assuming there is some html spitting program providing the input) or faster than the typical end-user network speeds.
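Streaming encode here just means compressing and flushing each chunk as the backend produces it, so compressor throughput only needs to keep pace with content generation and the client's link. A sketch using Python's stdlib zlib as a stand-in for any streaming content-encoding:

```python
import zlib

def stream_compress(chunks):
    # Incremental compressor: emit compressed bytes as each input chunk
    # arrives, using Z_SYNC_FLUSH so the client can start decoding
    # without waiting for the end of the response.
    comp = zlib.compressobj(level=6, wbits=31)  # wbits=31 -> gzip framing
    for chunk in chunks:
        out = comp.compress(chunk) + comp.flush(zlib.Z_SYNC_FLUSH)
        if out:
            yield out
    yield comp.flush()  # finalize the stream

html_chunks = [
    b"<html><body>" + b"hello " * 100,
    b"world " * 100 + b"</body></html>",
]
compressed = b"".join(stream_compress(html_chunks))
print(len(b"".join(html_chunks)), "->", len(compressed), "bytes")
```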
gb82
2024-02-05 05:51:31
https://giannirosato.com/blog/post/lossless-data-comp/
2024-02-05 05:51:52
not sure how applicable this is for the web, but I disclose my parameters here
afed
2024-02-05 06:29:41
there are also more accurate tools for benchmarks, with less impact from i/o and other stuff <https://github.com/powturbo/TurboBench> <https://github.com/inikep/lzbench> also a lot depends on the implementation; it's like comparing jxl-oxide vs the highly asm-optimized dav1d for avif with some fast settings and concluding that avif is 20x faster than jxl at decoding as a format. and yeah, for web use decoding speed is not that important when it is much higher than typical bandwidth; streaming efficiency and low memory consumption are much more important, at least until 10-50gb/s is common and most servers can also provide such speeds
yoochan
_wb_ In what world does it matter that decompression can be done at 1.5 GB/s instead of 0.5 GB/s? Compared to typical internet connection speeds, and especially in the bigger picture when you consider all the other stuff that needs to be done with the stuff you want to apply general-purpose compression to (whether it's html, css, js or whatever), this is very much _not_ a performance bottleneck that matters, imo.
2024-02-05 07:42:53
Agreed. I love brotli, especially its concept of a pre-shared dictionary. As usual, Google takes some doubtful benchmarks as its sole justification.
lonjil
2024-02-05 07:44:19
zstd also has that concept
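The pre-shared dictionary idea is the same in both: the two ends agree on a dictionary of expected byte sequences out of band, so even the first bytes of a payload can be coded as references into it. Python's stdlib zlib exposes the same mechanism via `zdict`, which makes a convenient illustration of the concept (brotli and zstd have their own dictionary APIs):

```python
import zlib

# A dictionary of byte sequences we expect in payloads, shared out of band.
shared_dict = b'{"status": "ok", "error": null, "data": '

payload = b'{"status": "ok", "error": null, "data": [1, 2, 3]}'

# Without a dictionary, the compressor has seen none of this before.
plain = zlib.compress(payload, 9)

# With the dictionary preloaded, the opening of the payload is one match.
comp = zlib.compressobj(9, zlib.DEFLATED, zlib.MAX_WBITS, zdict=shared_dict)
with_dict = comp.compress(payload) + comp.flush()

print(len(plain), "vs", len(with_dict), "bytes")

# The decompressor must be primed with the same dictionary.
decomp = zlib.decompressobj(zlib.MAX_WBITS, zdict=shared_dict)
restored = decomp.decompress(with_dict)
```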
2024-02-05 07:50:31
Actually, I believe this is false:
> chrome intends to push Zstd because Zstd is roughly three times faster than Brotli for decompression
almost everything they talk about is compression, not decompression. The Chrome status page itself says this:
> Supporting zstd content-encoding in the browser would allow sites to spend less time and CPU/power on compression on their servers, resulting in reduced server costs. There are several published benchmarks and existing research showing promising potential wins. Zstd is roughly three times faster than Brotli for decompression. Combined with zstd being faster at compression, this will result in faster page load times.
Which mentions both compression and decompression, but then the first documentation link mentions only compression:
> Having the browser support it as an Content-Encoding could provide a benefit for CDNs and origins that want to compress their dynamic content with higher compression than gzip but less CPU cost than Brotli. For dynamic (non-cacheable) content (i.e. HTML, JSON (API calls), etc.), it is unlikely that Brotli 11 is used, due to high CPU and TTFB cost. Zstd could play an interesting role in dynamic content scenarios, as it often offers better compression ratios than Brotli (at equivalent CPU costs), or better CPU costs (at equivalent ratios).
The second link mentions some current users of it:
> For example, Facebook uses zstd for all of its content-coding across its app and resources. Akamai has been experimenting internally with it and uses it for at-rest log storage, observing improved speed compared to gzip with smaller files. Android also uses zstd widely.
2024-02-05 07:51:35
The second link also talks about compression performance, with a very small mention of decompression speed:
> For static content, it is more reasonable to spend time/bandwidth on compression. However, for dynamic content, it is less reasonable because we can't cache the compressed content. For browsers, it is important to consider what the user observes (i.e. page load times), so for dynamic content, it is more observable if we spend longer on compression, as opposed to static content, where we can do the compression in advance. Supporting zstd content-encoding in the browser would allow sites to spend less time and CPU/power on compression on their servers, resulting in reduced server costs. Zstd is roughly three times faster than Brotli for decompression. Combined with zstd being faster at compression, this will result in faster page load times.
>
> Meta and Akamai have both found zstd to be useful as striking a good balance between CPU usage and compression ratios. If we were to compare plain zstd and plain brotli for first views and dynamic scenarios, dynamic content still contributes a significant number of bytes and milliseconds to a page load, so we expect wins from zstd in these scenarios. For static scenarios, brotli 11 is probably still the best, since we typically care less about compression CPU cost when the results can be reused across many users.
afed
2024-02-05 08:25:30
yeah, decompression is not that important for web use at such speeds. but even for compression speed, it would be better to improve current encoder implementations or use some more optimized ones, and then zstd doesn't look that good, like here https://github.com/powturbo/TurboBench/issues/43
2024-02-05 08:32:57
if zstd brought any significant benefits or features as a format, that would be good, but it's just a more optimized implementation for some use cases
lonjil
2024-02-05 08:47:59
it's just a content encoding anyway, those are easier to get rid of than image formats if they don't work out
_wb_
2024-02-05 08:48:42
We got rid of JPEG XR quite painlessly
2024-02-05 08:49:54
Internet Explorer supported that back when it was still a major browser, yet getting rid of it was not a significant problem.
Cacodemon345
2024-02-05 09:17:18
JXR only got dropped because no camera devices ever actually supported it, not to mention the patent licensing woes.
_wb_
2024-02-05 09:26:59
Was there a patent licensing thing with JXR? I don't think so. Just no decent FOSS implementation, only some Microsoft abandonware...
diskorduser
2024-02-06 04:32:53
https://www.reddit.com/r/GalaxyS24Ultra/s/4OHQAnNWWi
2024-02-06 04:33:09
Display scientist.... 🤔
_wb_
2024-02-06 04:55:42
this does look pretty bad, I must say
2024-02-06 04:56:33
the amount of variation in intensity between those subpixels, when it is supposed to be displaying a uniform gray, I mean
2024-02-06 04:59:44
unless it's an artifact of how the picture was taken, if the exposure time was too short to cover enough cycles of the PWM that I assume is used to dim them (that would also explain why there is a bigger problem in the darker grays)
2024-02-06 05:08:26
but I guess this person knows how to do it properly and those S24 displays are really not properly "de-muraed"
yurume
lonjil Actually, I believe this is false: > chrome intends to push Zstd because Zstd is roughly three times faster than Brotli for decompression almost everything they talk about is compression, not decompression. The Chrome status page itself says this: > Supporting zstd content-encoding in the browser would allow sites to spend less time and CPU/power on compression on their servers, resulting in reduced server costs. There are several published benchmarks and existing research showing promising potential wins. Zstd is roughly three times faster than Brotli for decompression. Combined with zstd being faster at compression, this will result in faster page load times. Which mentions both compression and decompression, but then the first documentation link mentions only compression: > Having the browser support it as an Content-Encoding could provide a benefit for CDNs and origins that want to compress their dynamic content with higher compression than gzip but less CPU cost than Brotli. For dynamic (non-cacheable) content (i.e. HTML, JSON (API calls), etc.), it is unlikely that Brotli 11 is used, due to high CPU and TTFB cost. Zstd could play an interesting role in dynamic content scenarios, as it often offers better compression ratios than Brotli (at equivalent CPU costs), or better CPU costs (at equivalent ratios). The second link mentions some current users of it: > For example, Facebook uses zstd for all of its content-coding across its app and resources. Akamai has been experimenting internally with it and uses it for at-rest log storage, observing improved speed compared to gzip with smaller files. Android also uses zstd widely.
2024-02-07 12:23:33
that's a very interesting argument. so it is essentially claiming that brotli's specialization on very short inputs (via the preset dictionary) doesn't work for the dynamic content scenario, right?
2024-02-07 12:24:15
I wonder whether that could instead be fixed by a separate compression mode for brotli.
lonjil
2024-02-07 12:24:24
wdym?
yurume
2024-02-07 12:24:53
ah sorry for some rambling, so let me explain a bit more
2024-02-07 12:25:55
technically speaking brotli uses a more sophisticated prediction scheme than zstd, so brotli should offer better compression at its extreme (and we frequently observe this at brotli level 11)
2024-02-07 12:26:24
but that prediction scheme is only useful when there is a large enough amount of data, which is not the case in the web situation
2024-02-07 12:26:52
so brotli came up with a preset dictionary that can be used to compress a short amount of data
2024-02-07 12:27:43
ergo, if the preset dictionary is properly used, zstd can't really compete with brotli at the compression ratio because zstd has no such specialization
2024-02-07 12:29:01
but they are talking about the equivalent CPU costs, so both zstd and brotli are used at the lower level
2024-02-07 12:29:24
and I guess brotli's lower level might not be that effective at using the preset dictionary?
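The preset-dictionary idea above can be sketched with the Python stdlib. Brotli's built-in web dictionary isn't exposed there, so this uses zlib's `zdict` parameter instead; the dictionary contents and payload are made-up examples, but the mechanism (both sides pre-agree on likely byte sequences, so short messages compress much better) is the same one being discussed:

```python
import zlib

# Hypothetical shared dictionary: boilerplate both ends agree on in advance.
zdict = b'{"status": "ok", "result": '

# A short dynamic payload, like a small JSON API response.
data = b'{"status": "ok", "result": [1, 2, 3, 4, 5]}'

# Compress with the preset dictionary...
comp = zlib.compressobj(level=6, zdict=zdict)
with_dict = comp.compress(data) + comp.flush()

# ...and without it, for comparison.
without_dict = zlib.compress(data, 6)

# The dictionary turns most of the payload into one back-reference,
# so the short input compresses noticeably better.
print(len(with_dict), len(without_dict))

# The decompressor must supply the same dictionary.
decomp = zlib.decompressobj(zdict=zdict)
assert decomp.decompress(with_dict) == data
```

Both brotli and zstd support custom dictionaries in the same spirit; the open question in the discussion is only how well the cheaper encoder levels exploit them.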
lonjil
2024-02-07 12:32:27
idk how the tests were done, but since the brotli dictionary is already in web browsers, I would assume that they'll use it for zstd too.
yurume
2024-02-07 12:32:48
of course that's also possible
2024-02-07 12:33:03
(both zstd and brotli accept a custom dictionary AFAIK)
2024-02-07 12:33:39
I would like to know the exact brotli and zstd levels they are using for dynamic contents
2024-02-07 12:33:53
because that would hint at the *acceptable* level of CPU usage for their use case
2024-02-07 12:39:30
I've looked at the brotli source code and there seem to be four major thresholds, giving five groups: levels 0--2, level 3, levels 4--6, levels 7--9, levels 10--11
2024-02-07 12:42:02
each of these groups uses a substantially different algorithm from the previous one
2024-02-07 12:42:16
so that might be the issue for web uses?
2024-02-07 12:44:52
zstd in comparison has 22 (positive) compression levels
2024-02-07 12:45:27
the number of algorithms is roughly the same, but the number of levels is larger, so it can adapt better
2024-02-07 12:45:52
(of course, assuming that they are not really tweaking other compression parameters that much)
Oleksii Matiash
2024-02-07 04:50:20
Files shared via such services are mostly already compressed, no?
Traneptora
2024-02-07 06:14:16
most files shared on such sites have some sort of compression built-in, yes
2024-02-07 06:14:30
even zipfiles do, and compressing already compressed data is very difficult
yurume
2024-02-07 07:05:36
especially when that already (but less optimally) compressed data has to be reproduced bit-exactly
Traneptora
2024-02-08 01:38:37
ye, decoding and repacking isn't an option
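The point about already-compressed data is easy to demonstrate with the stdlib: compressed output looks like noise to a general-purpose compressor, so a second pass gains nothing and usually costs a few bytes of framing overhead. Random bytes stand in here for compressed data:

```python
import os
import zlib

# Random bytes model already-compressed data: both are near-maximum entropy.
noise = os.urandom(4096)
text = b"the quick brown fox jumps over the lazy dog " * 100

# Redundant text compresses well; noise actually *grows* slightly,
# because deflate still has to pay for block headers and the checksum.
print(len(zlib.compress(text)), len(text))
print(len(zlib.compress(noise)), len(noise))

# Recompressing compressed output gains nothing either:
once = zlib.compress(text)
twice = zlib.compress(once)
print(len(once), len(twice))
```

This is why file-sharing services gain little from transparently compressing uploads, and why (per the message above) losslessly shrinking them further requires format-aware recompression rather than another generic pass.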
diskorduser
2024-02-11 10:01:31
https://x.com/DirectX12/status/1755659515368980579?t=bMjsZEIjtC9mNwz9gvacUg&s=33
2024-02-11 04:43:57
why does this jpg compress so much in jxl? The file size is around 62% after being converted to jxl.
2024-02-11 04:44:11
lossless transcode btw
afed
2024-02-11 04:51:30
optimized jpeg
DZgas Ж
2024-02-11 04:53:30
1. compressing the color component in the usual way
2. manually compressing the component and then replacing it in the jpeg image
Traneptora
diskorduser why does this jpg compress so much in jxl? The file size is around 62% after compressed to jxl.
2024-02-11 05:25:40
the huffman tables aren't optimized, so there's a lot to be gained in the entropy compression department
2024-02-11 05:26:11
since a jxl transcode takes the DCT coeffs and compresses them using JXL's native entropy coding, a lot is saved that way
2024-02-11 05:26:26
however just optimizing the actual huffman table can shave almost 25% from the JPEG itself
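A rough, hypothetical illustration of that entropy-coding headroom (this is not libjxl's actual coder, just the information-theoretic intuition): a fixed-length code wastes bits on a skewed symbol distribution, and the Shannon bound shows roughly how much a code tuned to the real frequencies can recover.

```python
import math
from collections import Counter

# Made-up skewed symbol stream, loosely like quantized DCT
# coefficients (mostly zeros). Not real JPEG data.
symbols = [0] * 900 + [1] * 60 + [2] * 30 + [3] * 10

counts = Counter(symbols)
n = len(symbols)

# Cost with a fixed-length code: 2 bits/symbol for a 4-symbol alphabet,
# standing in for a generic, unoptimized table.
fixed_bits = n * math.ceil(math.log2(len(counts)))

# Shannon lower bound: what a code tuned to the actual distribution
# (an optimized Huffman table, or an arithmetic/ANS coder) approaches.
entropy_bits = sum(-c * math.log2(c / n) for c in counts.values())

print(fixed_bits)            # 2000
print(round(entropy_bits))   # ~599
```

The more skewed the real distribution, the bigger the gap, which is why both `jpegtran -optimize` and a JXL transcode can shave so much off an unoptimized camera JPEG.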
_wb_
2024-02-11 05:29:10
Unoptimized JPEG is relatively common, cameras tend to produce that. It is also often such JPEGs that people like to compare to when showing off a new codec 🙂
190n
2024-02-11 07:46:21
even `xz -9` is able to make that file 13% smaller
spider-mario
2024-02-11 09:22:40
you can get a smaller, same-quality jpeg using:
```console
jpegtran -optimize -copy all [-progressive] < input.jpg > output.jpg
```
2024-02-11 09:22:47
(`-progressive` is up to you)
damian101
2024-02-12 09:19:52
Can't install the JPEG XL Plasma plugin from the AUR on Arch Linux: `libjxl_cms was not found`
Traneptora
2024-02-13 12:14:22
the AUR module might not list that under provides
damian101
2024-02-13 12:00:21
yeah, I was wrong, not a Plasma-related plugin, I meant the qt5/qt6 JPEG XL Plugin
novomesk
yeah, I was wrong, not a Plasma-related plugin, I meant the qt5/qt6 JPEG XL Plugin
2024-02-13 02:46:56
Try to send me your version of libjxl and a complete log, so I may see what failed.
damian101
2024-02-13 04:34:27
cjxl v0.9.0 4e4f49c5
2024-02-13 04:35:59
```
-- Checking for module 'libjxl_cms>=0.9.0'
--   Package 'libjxl_cms', required by 'virtual:world', not found
CMake Error at src/CMakeLists.txt:28 (message):
  libjxl_cms was not found!

-- The following OPTIONAL packages have been found:
 * Qt6CoreTools (required version >= 6.6.1)
 * Qt6Core
 * OpenGL
 * XKB (required version >= 0.5.0), XKB API common to servers and clients., <http://xkbcommon.org>
 * WrapVulkanHeaders
 * Qt6GuiTools (required version >= 6.6.1)
 * Qt6DBusTools (required version >= 6.6.1)

-- The following REQUIRED packages have been found:
 * ECM (required version >= 5.89.0), Extra CMake Modules., <https://commits.kde.org/extra-cmake-modules>
 * Qt6Gui (required version >= 5.14.0)

-- Configuring incomplete, errors occurred!
==> ERROR: A failure occurred in build().
    Aborting...
 -> error making: qt6-jpegxl-image-plugin-exit status 4
```
novomesk Try to send me your version of libjxl and a complete log, so I may see what failed.
2024-02-13 04:37:03
issue occurs with libjxl-metrics-git, not with libjxl from repository
novomesk
issue occurs with libjxl-metrics-git, not with libjxl from repository
2024-02-13 05:05:41
It seems it is an old pre-release 0.9 snapshot from Jun 19, 2023. If you really want to use that unreleased snapshot I can give you tips on how to edit the cmake scripts. However I recommend the official releases.
damian101
novomesk It seems it is an old pre-release 0.9 snap-shot from Jun 19, 2023. If you really want to use that unrelease snapshot I can give you tips how to edit the cmake scripts. However I recommend the official releases.
2024-02-13 05:06:58
I don't think you can trust the version number from the AUR for git packages
2024-02-13 05:07:56
AUR: 0.8.2.r404 actual: 0.9.2.r0
spider-mario
2024-02-13 06:17:20
yeah, the pkgver is updated at build time
2024-02-13 06:17:34
it’s not automatically tracked by the AUR
2024-02-13 06:17:48
and it’s frowned upon to bump the pkgver manually with no other change
damian101
2024-02-13 06:18:18
why not just disable the version then...
spider-mario
2024-02-13 06:19:16
what do you mean?
damian101
spider-mario what do you mean?
2024-02-13 06:47:31
the version entry
2024-02-13 06:47:54
no information is better than false information
spider-mario
2024-02-13 06:50:56
it’s not outright false, though, just potentially misleading if one is not familiar with VCS packages
damian101
2024-02-13 07:07:23
VCS?
Traneptora
2024-02-13 07:15:24
version-control-system
2024-02-13 07:15:30
it means `-git` and other types
2024-02-13 07:15:45
they're usually `-git` but sometimes `-svn` or `-hg`
spider-mario
2024-02-13 07:27:51
rarely `-bzr`
2024-02-13 07:28:02
and I don’t remember encountering a `-darcs` package but there might be a few
2024-02-13 07:28:35
well, it seems there is _one_ https://aur.archlinux.org/packages/hikari-darcs
lonjil
2024-02-13 07:29:24
How long until -pijul
Traneptora
2024-02-13 07:35:05
we'll wrap around back to `-cvs`
lonjil How long until -pijul
2024-02-13 07:36:25
is it actually better than git
lonjil
2024-02-13 07:37:34
I think it lacks some stuff but the fundamentals seem way better than Git
Traneptora
2024-02-13 07:38:25
it says it's based on "patches"
2024-02-13 07:38:28
but that just sounds like svn
spider-mario
2024-02-13 07:43:39
no, it’s “patches” as opposed to “snapshots” as in git, svn, etc.
2024-02-13 07:43:40
(most other than darcs)
2024-02-13 07:44:11
see their FAQ https://pijul.org/faq
_wb_
2024-02-13 07:55:44
I remember when we used cvs and it was so much better than what we did before, which was exchanging emails or floppies with source code in a tarball
spider-mario
2024-02-13 08:10:36
Traneptora
spider-mario no, it’s “patches” as opposed to “snapshots” as in git, svn, etc.
2024-02-13 08:38:26
I'm not really sure what "patches" means in this context tho
2024-02-13 08:38:34
since patches are already a thing used by other VCS
2024-02-13 08:39:34
also their idea is that patches are unordered, but some patches depend on others to make sense, and I'm not sure what they do in that case when they're unordered
2024-02-13 08:44:10
they also cite these fairly pathological examples as what git does wrong: https://tahoe-lafs.org/~zooko/badmerge/concrete-good-semantics.html
2024-02-13 08:44:23
when this will never happen in practice because git tags diffs with function names
2024-02-13 08:44:57
it's also not a fundamental problem with git itself that requires a whole new VCS, when you could just patch the merge algorithm
2024-02-13 08:45:18
they also haven't explained how a "patch" doesn't produce the same issue
2024-02-13 08:45:53
especially in the pathological case
lonjil
Traneptora I'm not really sure what "patches" means in this context tho
2024-02-13 08:45:54
in most VCSs, the only things that are stored are snapshots, and diffs are synthesized on demand. With Pijul, imagine that you only store diffs, and to check the repo out, you applied all the diffs. Except, instead of diffs, you have mathematically rigorous structures that encode changes and act as a good conflict-free replicated data type.
Traneptora
2024-02-13 08:46:53
the pathological example was attempting to apply
```
@@ -1,5 +1,5 @@
 A
 B
-C
+X
 D
 E
```
to
```
A
B
C
D
E
G
G
G
A
B
C
D
E
```
2024-02-13 08:47:09
patching directly applies only the first one
2024-02-13 08:47:19
but they claimed in the context of the full tree, it should replace the second
2024-02-13 08:47:31
I don't see how switching to unordered patches fixes this problem
2024-02-13 08:48:38
the whole concept of "patches always commute" doesn't make sense when a patch modifies code that another patch hasn't added yet
2024-02-13 08:48:47
something about that is missing from their description
lonjil
2024-02-13 08:49:00
Patches that depend on other patches aren't unordered
Traneptora
2024-02-13 08:49:28
so they have an entire dependency tree of patches?
2024-02-13 08:49:35
how is this not just like a commit history
lonjil
2024-02-13 08:50:14
two patches that modify different things (like different lines) are unordered.
Traneptora
2024-02-13 08:50:25
sure, but you can reorder git commits the same way
2024-02-13 08:50:35
they're ordered but you can easily change the order
2024-02-13 08:50:39
it's generally not an issue
lonjil
2024-02-13 08:50:50
no, you have to create new commits using a diff
Traneptora
2024-02-13 08:51:09
two existing commits can frequently have their order swapped in git
2024-02-13 08:51:17
`git rebase -i` does that
lonjil
2024-02-13 08:51:57
you're creating new commits based on the computed diffs.
Traneptora
2024-02-13 08:52:21
not sure why that's any different than "swapping the order" from an end user standpoint
2024-02-13 08:53:38
git plumbing is ugly, I'm aware but from a user standpoint git works fine
lonjil
2024-02-13 08:54:54
say you have different branches. Like a develop branch, a stable branch, or whatever. Try merging from branch A into branch B, and then do the same some time later after more development in the two branches. Won't work very well. So you try cherry picking instead. Works better, but can still create lots of unnecessary conflicts.
2024-02-13 08:56:00
> it's also not a fundamental problem with git itself that requires a whole new VCS, when you could just patch the merge algorithm
yes and no. The only way to make git merge work as well as pijul would be to make merges work like this:
1. check out the most recent common ancestor of all parents going into the merge.
2. compute all the diffs along all paths leading to the current merge.
3. using all that information, apply Pijul's merge algorithm.
lonjil say you have different branches. Like a develop branch, a stable branch, or whatever. Try merging from branch A into branch B, and then do the same some time later after more development in the two branches. Won't work very well. So you try cherry picking instead. Works better, but can still create lots of unneccesary conflicts.
2024-02-13 08:59:05
While in Pijul, you can just apply any patch to any number of branches, and if there is a conflict, the resulting conflict-resolution patch will be valid any time the same conflict happens again, in any other branch. git has rerere for this, but it's just heuristics, not mathematical certainty.
Traneptora
lonjil say you have different branches. Like a develop branch, a stable branch, or whatever. Try merging from branch A into branch B, and then do the same some time later after more development in the two branches. Won't work very well. So you try cherry picking instead. Works better, but can still create lots of unneccesary conflicts.
2024-02-13 09:01:54
at least as I see it there is no "development" branch typically but rather a branch for a specific thing to do
2024-02-13 09:02:13
and then that code gets merged into master, and the branch is obsoleted
2024-02-13 09:02:42
backporting specific fixes to certain stable release branches can be done with cherry picking, but you generally won't be merging master into stable
lonjil
2024-02-13 09:07:05
I meant 'develop' as in the bleeding edge "master" branch
2024-02-13 09:07:33
it's a somewhat common way to name branches
2024-02-13 09:08:27
I do wonder how much of projects having long running branches other than the main one is due to how bad git is at it, though.
MSLP
2024-02-13 09:10:57
Pijul just needs 15 years of "git-plumbing" worth of development and it will be great 😄
spider-mario
Traneptora when this will never happen in practice because git tags diffs with function names
2024-02-13 09:12:04
doesn’t that require git to know about the development language?
2024-02-13 09:12:17
might work for C, but what if you’re not writing C?
Traneptora
2024-02-13 09:12:18
it does, but it does know about it
2024-02-13 09:12:36
in either case I find the examples to be fairly pathological
2024-02-13 09:12:58
especially considering they can be fixed by a better merge algorithm and not by a new VCS from scratch
spider-mario
2024-02-13 09:18:46
they seem to have their reasons not to have gone that route (there’s a FAQ entry)
lonjil
2024-02-13 09:25:07
a new merge algorithm that is basically "import your project to Pijul, let Pijul figure the merge out, then port that back to git", does sound a little silly.
spider-mario
2024-02-13 10:34:15
https://github.com/boolean-option stop_doing_Nix.png
Traneptora
2024-02-13 10:37:52
I feel like the fact that this has to exist is a design flaw
username
spider-mario
2024-02-13 11:41:00
this is an actual folder structure that exists on my computer for a fork of 7-Zip that I made ~~I personally think it's more funny than it is shameful~~
HCrikki
2024-02-14 11:11:47
is it possible to guarantee a server will serve jxl in priority to applications that support the feature? or more specifically, that a client app will prioritize fetching jxl if those files exist
2024-02-14 11:12:27
regardless of any webmaster server's preferred priority
_wb_
2024-02-14 12:01:36
No, there is no way to force a server to do anything, most will not return jxl at all, whatever you do. But if they do support jxl, then you might be able to force it to return a jxl if you send an Accept header in your request that only allows image/jxl.
VcSaJen
2024-02-14 02:52:48
It's possible if you MITM yourself with a proxy+self-signed root certificate. Proxy can do conversion to JXL on-the-fly, kinda like old Opera Turbo.
HCrikki
2024-02-14 04:26:15
i meant the latter scenario where it's a user trying to consume more jxl, not doing anything with servers themselves.
Traneptora
HCrikki is it possible to guarantee a server will serve jxl in priority to applications that support the feature? or more specifically, that a client app will prioritize fetching fetching jxl if those files exist
2024-02-14 05:10:53
generally speaking the best a client can do (like a browser) is declare that it supports JXL
2024-02-14 05:11:07
you can use the `accept` header for HTTP to tell the server what you prefer
2024-02-14 05:11:17
naturally the server still chooses what to give to you
2024-02-14 05:11:52
browsers already do this so that's as good as it gets tbh
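The Accept-header mechanism described above can be sketched with Python's stdlib urllib; the URL and the exact q-values here are placeholder examples, but the shape (client states a preference, server still decides) is standard HTTP content negotiation:

```python
import urllib.request

# example.com is a placeholder. The Accept header is the client's only
# lever: it tells a content-negotiating server which image formats this
# client prefers. q-values rank the fallbacks.
req = urllib.request.Request(
    "https://example.com/photo",
    headers={"Accept": "image/jxl,image/avif;q=0.9,*/*;q=0.5"},
)

# No request is actually sent here; we just inspect what would go out.
print(req.get_header("Accept"))
```

Sending `Accept: image/jxl` alone is the closest a client can get to "force" jxl, and only against a server that both supports jxl and honors the header.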
CrushedAsian255
2024-02-15 10:35:21
is this a real mirror? https://artifacts.lucaversari.it/libjxl/libjxl/latest/ i found it in [this slightly old bilibili video](https://www.bilibili.com/video/av337259856/)
spider-mario
CrushedAsian255 is this a real mirror? https://artifacts.lucaversari.it/libjxl/libjxl/latest/ i found it [this slightly old bilibili video](https://www.bilibili.com/video/av337259856/)
2024-02-15 10:45:49
yes, this is <@179701849576833024>’s
CrushedAsian255
2024-02-16 02:35:33
I only jumped on the jpeg xl train at v0.8.2 so I don’t know
2024-02-16 02:35:59
Currently I’m using the master branch head 0.10.0 and it feels faster than 0.8.2
afed
2024-02-16 02:38:49
probably because it's still gcc for these builds, not clang https://canary.discord.com/channels/794206087879852103/803645746661425173/1052978962055307357
yurume
2024-02-16 02:37:45
currently blocked by several issues, the major issue being that it became too complex to develop in a (single) C file
2024-02-16 02:38:04
I was also quite busy last year but that's the major factor
ProfPootis
2024-02-16 08:18:38
mpv appears to have inherited this as well. No clue how to open an issue for this.
2024-02-16 08:20:20
same things happens with other animated jxls not just that one
Traneptora
ProfPootis mpv appears to have inherited this as well. No clue how to open an issue for this.
2024-02-16 09:44:24
what issue is being described here, exactly?
2024-02-16 09:44:48
I'm not sure what you're describing
ProfPootis
2024-02-16 09:45:12
on the left, the animated jxl is looping correctly. in ffplay, it tries and fails to loop
Traneptora
2024-02-16 09:45:30
ffplay never loops input
2024-02-16 09:45:43
mpv also doesn't loop input videos regardless of any kind of tag. it does that for GIF animations as well.
2024-02-16 09:45:49
you can use `--loop-file` in mpv to make it loop
spider-mario
Traneptora ffplay never loops input
2024-02-16 10:02:36
(it does if you pass it `-loop -1`)
ProfPootis
2024-02-16 10:26:40
2024-02-16 10:34:31
same thing for mpv
Traneptora
spider-mario (it does if you pass it `-loop -1`)
2024-02-16 10:38:50
that's a specific avoption iirc for some demuxers
2024-02-16 10:39:28
If mpv won't loop with --loop-file tho, then that's a regression that I will have to investigate
2024-02-16 10:39:40
in the meantime --loop-playlist should work
ProfPootis
2024-02-16 10:41:11
yep, that works
HCrikki
2024-02-17 12:25:03
lineageos 21 will be shipping a new default gallery app that looks powered by coil. would be cool if they included the coil decoder for jpegxl - looks like it'd be a very easy integration to make
2024-02-17 12:27:05
https://github.com/lineage-next/android_packages_apps_Glimpse
Simulping
2024-02-17 12:42:34
https://twitter.com/animeterrorist/status/1750662322786636161 gosh I feel terrible for the guy
2024-02-17 12:42:43
entire thread rallied up against him
2024-02-17 12:42:50
because their favorite tools don't support webp
jonnyawsom3
2024-02-17 12:47:57
We had a day long discussion about it before
Simulping
2024-02-17 12:49:59
care to give a link?
jonnyawsom3
2024-02-17 01:04:40
I could've sworn there was a longer thread, but that's all I can find too
damian101
2024-02-17 01:14:38
what even doesn't support webp?
lonjil
2024-02-17 01:19:21
windows ships with libwebp but refuses to open webp files in anything but paint unless you install the webp addon from the store
username
what even doesn't support webp?
2024-02-17 03:00:28
there's quite a lot of software that unknowingly supports WebP flawlessly yet doesn't expose it to the user at all, meaning the only way you can use WebP is to pretend it's another format like PNG or JPEG so the software lets it through. The only personal example I can think of off the top of my head is Steam chat. However, I remember watching a friend scroll through the replies on a random WebP hate tweet, and there was a very large number of people saying stuff like "just rename it to .png and then it works fine", and this was on a post that didn't mention any particular piece of software or hardware at all
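The "rename it to .png" trick works because much software identifies files by their leading magic bytes rather than by extension. A tiny hypothetical sniffer (the `sniff` helper is made up for illustration, but the signatures themselves are the real PNG, WebP/RIFF, and JPEG ones):

```python
def sniff(data: bytes) -> str:
    """Identify an image format from its leading magic bytes."""
    if data.startswith(b"\x89PNG\r\n\x1a\n"):
        return "png"
    # WebP is a RIFF container whose form type at offset 8 is "WEBP".
    if data[:4] == b"RIFF" and data[8:12] == b"WEBP":
        return "webp"
    # JPEG streams start with the SOI marker 0xFFD8.
    if data[:2] == b"\xff\xd8":
        return "jpeg"
    return "unknown"

# A WebP renamed to .png is still a WebP to anything that sniffs bytes:
print(sniff(b"RIFF\x24\x00\x00\x00WEBPVP8 "))  # webp
```

So software that "supports WebP when you lie about the extension" is sniffing correctly at the decode layer while gating on the filename at the UI layer.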
damian101
2024-02-17 03:01:11
right, also lots of web services