|
fab
|
2023-08-30 08:32:53
|
And studied at Britannic school
|
|
2023-08-30 08:33:03
|
I'm anglosaxon
|
|
|
uis
|
2023-09-01 11:23:10
|
Anglo-
|
|
|
yoochan
|
2023-09-09 01:53:48
|
For French speakers, JPEG XL is briefly mentioned in this very confusing blog post on LinuxFr: https://linuxfr.org/users/tisaac/journaux/du-format-et-de-la-taille-des-images
|
|
|
Moritz Firsching
|
2023-09-20 06:47:11
|
https://news.ycombinator.com/item?id=37580490
|
|
2023-10-10 07:29:17
|
https://blog.adobe.com/en/publish/2023/10/10/hdr-explained
"If your favorite browser doesn’t support AVIF, JPEG XL, or HDR display, make sure you have the latest version and check back often for updates. You may even want to send the developers a note indicating that you’d really like them to support these features!"
|
|
|
spider-mario
|
2023-10-10 07:43:57
|
interestingly, https://helpx.adobe.com/content/dam/help/en/camera-raw/using/gain-map/jcr_content/root/content/flex/items/position/position-par/table/row-3u03dx0-column-4a63daf/download_section/download-1/Gain_Map_1_0d12.pdf explains how to store a gain map in a JXL file
|
|
|
Moritz Firsching
|
2023-10-11 01:02:04
|
https://news.ycombinator.com/item?id=37839875
|
|
|
|
Squid Baron
|
2023-10-11 05:13:36
|
I didn't realize they added JPEG XL compression to DNG
|
|
|
Foxtrot
|
2023-10-11 05:20:14
|
Now they should add it into PSD files 🙂
|
|
|
HCrikki
|
2023-10-11 05:21:08
|
and PDF. perhaps sooner than people think...
|
|
|
gb82
|
|
Squid Baron
I didn't realize they added JPEG XL compression to DNG
|
|
2023-10-11 05:30:31
|
Wait, since when?
|
|
|
|
Squid Baron
|
2023-10-11 05:31:07
|
1.7, the latest version of the spec
|
|
2023-10-11 05:31:33
|
released in June 2023
|
|
2023-10-11 05:31:39
|
https://helpx.adobe.com/camera-raw/digital-negative.html
|
|
|
|
afed
|
2023-10-11 05:36:02
|
<:FeelsReadingMan:808827102278451241>
|
|
|
Foxtrot
|
2023-10-11 07:07:39
|
Hmm, I didn't know it's possible to have lossless JPEG
|
|
|
lonjil
|
2023-10-11 07:09:17
|
https://en.wikipedia.org/wiki/Lossless_JPEG
|
|
|
HCrikki
|
2023-10-13 01:37:50
|
https://imageengine.io/jpeg-xl-a-new-era-for-image-optimization/
|
|
|
_wb_
|
2023-10-13 07:40:11
|
all those marketing superlatives, lol — but I'm not complaining 🙂
|
|
|
spider-mario
|
2023-10-13 08:34:52
|
> This format also boasts a flexible modular mode structure, an adaptive quantization field, and context modeling to fine-tune image compression to an extraordinary level.
|
|
2023-10-13 08:35:01
|
a flexible modular mode structure 😎
|
|
2023-10-13 08:35:21
|
> Its superior compression efficiency outperforms current image formats, and it caters to a range of image features, including the elusive alpha channel and animated images.
|
|
2023-10-13 08:35:27
|
the elusive alpha channel
|
|
|
Tirr
|
2023-10-13 08:59:11
|
Modular encoding has a meta-adaptive feature, so it's somewhat flexible 😉
|
|
|
jonnyawsom3
|
2023-10-13 02:01:58
|
My favourite part was the first 4 bullet points being
It's good
It's small
It's small
It's small
|
|
|
spider-mario
|
2023-10-13 02:27:35
|
> ImageEngine, excited by Apple’s announcement, worked tirelessly to integrate support for JPEG XL into ImageEngine in June 2023, ensuring that any user of iOS 17 or macOS Sonoma with Safari browser will automatically receive a JPEG XL image.
the excitement is palpable (and appreciated), got to hand it to them
|
|
|
jonnyawsom3
|
2023-10-16 01:09:58
|
https://foolip.github.io/interop-reactions/
|
|
2023-10-16 01:10:15
|
Safe to say we're making a mark
|
|
|
username
|
2023-10-16 01:24:03
|
how much do the reactions affect/sway the actions/choices of the interop team?
|
|
|
jonnyawsom3
|
2023-10-16 01:32:24
|
The decisions come down to the arguments for/against more than reactions, but it does show there's community interest above all the rest
|
|
|
_wb_
|
2023-10-17 02:33:28
|
https://unthinking.photography/articles/history-and-environmental-impact-of-digital-image-formats
|
|
|
jonnyawsom3
|
2023-10-17 03:41:27
|
I know it's just coincidence, but after <#803574970180829194> I can't help but feel responsible for the mention of DNG haha
|
|
|
lonjil
|
2023-10-17 04:06:48
|
I checked out the sources for the energy-of-storage claims. It seems they found an estimate of the energy usage of the Internet, and an estimate of how much data is transferred each year on the Internet, both from 2011, and simply divided the figures.
However, the former reports that most "Internet power" comes from desktops and laptops simply being plugged in, not actual transfer and storage on the cloud. It also reports that the total energy usage of the Internet is so low compared to everything else that efficiency improvements have little impact on humanity's energy usage.
|
|
|
Moritz Firsching
|
2023-11-02 08:32:30
|
https://indieweb.social/@stoyan/111340812717487490
|
|
|
jonnyawsom3
|
2023-11-03 12:44:04
|
Direct link https://youtu.be/VJaa1Le4W7c
|
|
|
190n
|
2023-11-03 03:00:51
|
i guess "jexel" is canon? <:WTF:805391680538148936>
|
|
2023-11-03 03:01:25
|
ah i hadn't heard chorus #2 yet
|
|
|
yoochan
|
2023-11-03 08:48:26
|
someone should post it on
https://bugs.chromium.org/p/chromium/issues/detail?id=1178058 ?
|
|
|
_wb_
|
2023-11-09 09:48:41
|
https://www.computerweekly.com/blog/CW-Developer-Network/Cloudinary-The-dramatic-story-of-JPEG-XL-support-so-far
|
|
2023-11-15 04:37:21
|
https://www.businesswire.com/news/home/20231115698449/en/Cloudinary-Uses-AI-to-Further-Improve-Image-Quality-Online-Announces-JPEG-XL-Support-for-Apple-Ecosystem/?feedref=JjAwJuNHiystnCoBq_hl-W8j9Oi60kFTomUxRDB8jhBsNpnJw7dvSfTyni2BOVIOrCOi9QzgjCezTS3Nw_X6kJUrpSBm-Hav1w-UkdSlG3nDlC87j5HoE75BMeA8LaVacVjEfZIyqgxRRsuxHCTK2w%3D%3D&utm_source=dlvr.it&utm_medium=twitter
|
|
|
|
afed
|
2023-11-15 04:55:23
|
is ssimulacra2 also used or is it a completely different ai?
|
|
|
_wb_
|
2023-11-15 05:03:21
|
completely different, ssimulacra2 is not ai, it's a classic metric
|
|
2023-11-15 05:04:40
|
but also based on CID22 data
|
|
|
Eugene Vert
|
2023-11-18 01:40:47
|
https://github.com/typst/typst/pull/2701
|
|
2023-11-18 01:41:27
|
> Typst is a new markup-based typesetting system that is designed to be as powerful as LaTeX while being much easier to learn and use.
|
|
|
diskorduser
|
|
Eugene Vert
https://github.com/typst/typst/pull/2701
|
|
2023-11-18 01:58:14
|
Isn't this suitable for <#803574970180829194> ?
|
|
|
username
|
2023-12-02 10:06:43
|
https://aspect.bildhuus.com/blog/posts/aspect-preview-33
|
|
2023-12-11 04:27:18
|
JPEG XL is one of the formats that is being considered to be added to PDF https://pdfa.org/pdf-moves-ahead-with-hdr/
|
|
|
HCrikki
|
2023-12-11 04:36:34
|
It's the only one that makes sense. Old PDFs without JPEG 2000 images could be losslessly recompressed to newer PDF specs with zero data loss and higher storage/bandwidth gains
|
|
2023-12-11 04:38:36
|
IMO it's way more crucial for the format to be enshrined in the PDF spec than in browsers directly. That way, everything that ships a PDF reader (including browsers) would have to support decoding the files, and websites serving manuals would at least use wasm decoders. Together, the install count of PDF/ebook readers must approach that of browsers (Android's Acrobat Reader alone is like 600+ million, and AFAIK the desktop reader is almost as ubiquitous as Flash Player once was)
|
|
|
Quackdoc
|
2023-12-11 05:04:16
|
AV2 is really expensive; I doubt it will be viable for PDF any time soon. dav1d is actually just insanely well optimized, which is why AVIF is even usable
|
|
|
Oleksii Matiash
|
|
HCrikki
its the only one that makes sense. old pdfs without jpeg2000 images could be losslessly recompressed to newer pdf specs with zero data loss and higher storage/bandwidth gains
|
|
2023-12-11 05:05:50
|
JBIG2 is still better for b/w images, unfortunately
|
|
|
a goat
|
|
Quackdoc
AV2 is really expensive, I doubt it will be viable for PDF any time soon, dav1d is actually just insanely well optimized which is why avif is even usable
|
|
2023-12-11 08:08:50
|
Wait we're on AV2 now?
|
|
|
Quackdoc
|
2023-12-11 08:09:40
|
I think they're calling it AVM. I'm not sure if it's actually supposed to be a codec that will be useful in applications, or just a testbed for experimentation and development
|
|
|
|
veluca
|
|
Quackdoc
I think they are calling it AVM, im not sure if its actually supposed to be a codec that will be useful in application, or just a bed for experimentation and development
|
|
2023-12-11 08:23:02
|
it will be a codec sooner or later
|
|
|
Cacodemon345
|
2023-12-11 09:55:39
|
Sounds like we can forget about JPEG XL getting added to PDF then.
|
|
2023-12-11 09:56:51
|
(Assuming the JPEG XL supporters crowd gets outnumbered by the AVIF crowd).
|
|
|
Quackdoc
|
2023-12-11 10:02:52
|
I doubt it; AVIF is already harsh enough to decode even with how miraculously good dav1d is. AV2 would be panic mode
|
|
|
Cacodemon345
|
2023-12-11 10:36:36
|
Then you will need hard metrics for JPEG XL vs. AVIF (the latter with dav1d+libavif, the former with libjxl).
|
|
|
Quackdoc
|
2023-12-11 10:40:01
|
hard metrics for what?
|
|
|
fab
|
|
Quackdoc
I doubt it, avif is already harsh enough to decode even with how good dav1d miraculously is AV2 would be panick mode
|
|
2023-12-11 10:53:37
|
AV1 at 1280x720, 529 kbps, of a Romanian singer makes the phone hot
|
|
2023-12-11 10:53:57
|
It can stress the CPU if you watch the video for more than 8-10 seconds
|
|
2023-12-11 10:54:27
|
FB could use easier-to-decode fonts
|
|
2023-12-11 10:55:06
|
|
|
2023-12-11 10:55:25
|
The fonts available now are not easy to decode
|
|
2023-12-11 10:55:51
|
Discord font looks more rounded now
|
|
2023-12-11 10:56:06
|
You know that rounded fonts are not easy to decode
|
|
|
HCrikki
its the only one that makes sense. old pdfs without jpeg2000 images could be losslessly recompressed to newer pdf specs with zero data loss and higher storage/bandwidth gains
|
|
2023-12-11 11:09:57
|
JPEG XL will be bad before 2024 and anyway it won't be able to handle adoption
|
|
2023-12-11 11:25:00
|
|
|
2023-12-11 11:25:30
|
This same as discord font 11:56am 2023-12-11
|
|
2023-12-11 11:25:55
|
If you would need that design in the future you'll have
|
|
2023-12-11 11:32:40
|
|
|
2023-12-11 11:33:03
|
That one is my favourite, Input; it will increase legibility
|
|
2023-12-11 11:34:47
|
|
|
2023-12-11 11:35:02
|
https://github.com/zhimoe/programming-fonts
|
|
2023-12-11 12:49:22
|
|
|
2023-12-11 12:49:26
|
Input Sans is perfect, not narrow
|
|
|
Traneptora
|
|
Cacodemon345
Then you will need hard metrics for JPEG XL vs. AVIF (the latter with dav1d+libavif, the former with libjxl).
|
|
2023-12-11 01:27:09
|
Keep in mind that Adobe is already on the JXL train. JXL web adoption is being blocked almost exclusively by the chromium codecs team
|
|
|
Cacodemon345
|
|
Quackdoc
hard metrics for what?
|
|
2023-12-11 01:27:56
|
Decode performance.
|
|
|
_wb_
|
2023-12-11 02:17:42
|
who cares about decode performance for something like PDF? any raster format decode will in any case be pretty fast compared to complex vector rendering...
|
|
|
lonjil
|
2023-12-11 02:27:43
|
depends on what you're using PDF for
|
|
2023-12-11 02:28:07
|
Some people put very high quality raster images in PDFs with nothing else.
|
|
|
_wb_
|
2023-12-11 02:37:43
|
yes — and how fast that will render will depend more on the image resolution than on the format (and in case of formats that allow multithreaded decode, on the number of threads you're using for the decode)
|
|
2023-12-11 02:42:19
|
my point is: who cares if it's 50 MPx/s per thread or 70 MPx/s per thread — I don't see how that makes a fundamental difference for putting it in the PDF spec or not. In normal uses it will be "fast enough" anyway.
|
|
|
jonnyawsom3
|
2023-12-11 06:46:59
|
Yeah... The metrics are good and fun to look at, but as soon as you look at the real world, where most images are under 12 MP, 600 MP/s vs 60 MP/s is a lot less significant
|
|
|
Quackdoc
|
|
_wb_
yes — and how fast that will render will depend more on the image resolution than on the format (and in case of formats that allow multithreaded decode, on the number of threads you're using for the decode)
|
|
2023-12-11 06:48:34
|
There are some PDFs that are 200-plus pages of just images. If you open one of those on a lower-end device and scroll, and you have bad decode performance, it can be a very bad experience, in some cases even locking up the device.
|
|
2023-12-12 04:16:48
|
that depends on the book; Western comics can often be full colour, same with Korean and Chinese comics
|
|
|
_wb_
|
|
Quackdoc
There are some PDFs that are about 200 plus pages of just images. If you open one of those on a lower end device and scroll and you have bad decode performance It can be a very bad experience in some cases even locking up the device.
|
|
2023-12-12 07:01:42
|
Sure, but whether those images are j2k or avif or jxl will not make much of a difference.
|
|
2023-12-12 07:07:44
|
Compared to whether they are 2 Mpx or 200 Mpx per page
|
|
|
Oleksii Matiash
|
|
_wb_
Sure, but whether those images are j2k or avif or jxl will not make much of a difference.
|
|
2023-12-12 07:17:34
|
In my experience, a PDF with large J2K images decodes much slower than one with equal-sized JPGs. Where a JPG PDF shows pages almost immediately, a J2K PDF gives a visible delay. I have a modern CPU, but it doesn't help J2K much, as its decoder is single-threaded according to Task Manager
|
|
|
_wb_
|
2023-12-12 07:19:35
|
A lot depends on the quality of the implementation of the viewer — e.g. is it pre-decoding upcoming pages or does it only start decoding when you navigate, does it use multiple threads or not, etc.
|
|
2023-12-12 07:25:46
|
It depends — if it's lossless jxl without squeeze, you have to decode the whole image; if it's lossy then you should be able to get a 1:8 preview fast.
But for something like PDF, I think it's also possible to embed explicit page thumbnails for quick navigation — that's probably even better for a snappy UX.
|
|
2023-12-12 07:29:51
|
in general, there are many ways to make PDF files load faster or slower and it depends on the use case what kind of trade-off you want — if it's for casual viewing then you probably don't need a very high image resolution but you may want to have thumbnails and navigation indices for fast random access; if it's for archival or printing then you need a higher resolution but you maybe care less about random access and decode speed (as long as it decodes faster than the printer can print it's ok)
|
|
|
Oleksii Matiash
|
|
_wb_
A lot depends on the quality of the implementation of the viewer — e.g. is it pre-decoding upcoming pages or does it only start decoding when you navigate, does it use multiple threads or not, etc.
|
|
2023-12-12 07:30:33
|
I'm using Foxit, and it definitely decodes "just in time"
|
|
|
Quackdoc
|
|
_wb_
A lot depends on the quality of the implementation of the viewer — e.g. is it pre-decoding upcoming pages or does it only start decoding when you navigate, does it use multiple threads or not, etc.
|
|
2023-12-12 07:43:30
|
PDF is popular enough that slow hardware and low-quality viewer applications IMO should be considered. I'm talking about hardware that is close to a decade old in performance. For instance, at Bliss there is a collaboration with EIDU to provide tablets to low-income countries (in the case of that specific program, Kenya), using the Taifa Elimu tabs. Performance is actually very significant in this use case.
I was using a 2018 A8 to read full-colour comics encoded in AVIF with Tachiyomi, and while it worked, AVIF decode times were very noticeable at times, making the device a bit sluggish. While it's fine to say "it depends on the implementation of the viewer", realistically the PDF spec should account for the fact that PDFs are still going to be used on really ancient hardware and with viewers that simply aren't of superb quality.
|
|
|
fab
|
|
Quackdoc
PDF is popular enough that viewers and very slow hardware and low quality applications imo should be considered. I'm talking about hardware that is nearly decades old in performance. for instance at bliss, there is a collaboration with EIDU to provide tablets to low income countries, in the case of the specific program, kenya. using the taifa elimu tabs. Performance is actually very significant in this use case.
I was using a 2018 A8 to read full colored comics encoded in avif with tachiyomi, and while it worked, avif decode times were very noticeable at times making the device a bit sluggish. while it's fine to say "it depends on the implementation of the viewer" realistically PDF spec should account for the fact that PDFs are still going to be used on really ancient hardware and with viewers that simply aren't of superb quality.
|
|
2023-12-12 07:57:37
|
The Snapdragon 780G is a decade old in performance
|
|
2023-12-12 07:57:43
|
Like an i3-330M
|
|
2023-12-12 07:57:51
|
But it's ARMv8
|
|
|
_wb_
|
|
Quackdoc
PDF is popular enough that viewers and very slow hardware and low quality applications imo should be considered. I'm talking about hardware that is nearly decades old in performance. for instance at bliss, there is a collaboration with EIDU to provide tablets to low income countries, in the case of the specific program, kenya. using the taifa elimu tabs. Performance is actually very significant in this use case.
I was using a 2018 A8 to read full colored comics encoded in avif with tachiyomi, and while it worked, avif decode times were very noticeable at times making the device a bit sluggish. while it's fine to say "it depends on the implementation of the viewer" realistically PDF spec should account for the fact that PDFs are still going to be used on really ancient hardware and with viewers that simply aren't of superb quality.
|
|
2023-12-12 12:04:22
|
Sure, and a spec should certainly allow making PDFs that will work well even on low-end devices. I don't think a spec should aim to make it impossible to make files that will render slowly — at least for the current PDF spec that is certainly not the case.
|
|
|
Quackdoc
|
|
_wb_
Sure, and a spec should certainly allow making PDFs that will work well even on low-end devices. I don't think a spec should aim to make it impossible to make files that will render slowly — at least for the current PDF spec that is certainly not the case.
|
|
2023-12-12 06:55:26
|
I don't mean to say that they shouldn't allow it. I just mean that when you look at the benefits of JXL versus AVIF for PDF, I don't think AVIF offers enough at all to consider it over JXL, considering the performance cost.
|
|
|
Demiurge
|
2023-12-15 12:52:32
|
https://wiki.archlinux.org/title/JPEG_XL
|
|
|
novomesk
|
2024-01-19 02:17:23
|
https://blogs.gentoo.org/mgorny/2023/12/07/a-format-that-does-one-thing-well-or-one-size-fits-all/
|
|
|
jonnyawsom3
|
2024-01-19 02:34:12
|
A small mention of JXL, but one nonetheless
>>> Modern image formats go even further. WebP, AVIF and JPEG XL all support both lossless and lossy compession, high color depths, alpha channel, animation. Therefore, they are suitable both for computer-generated images and for photography. Effectively, they can replace all their predecessors with a “one size fits all” format.
|
|
|
novomesk
|
2024-01-19 02:43:43
|
While it is a small mention of JXL, it is an important mention in its consequences. Michał Górny is one of the most important Gentoo Linux developers, and because he is aware of JXL, he has already taken several steps to improve support for JPEG XL in Gentoo.
|
|
|
yurume
|
|
novomesk
https://blogs.gentoo.org/mgorny/2023/12/07/a-format-that-does-one-thing-well-or-one-size-fits-all/
|
|
2024-01-20 02:05:33
|
Good read indeed. The most notable failure in this topic IMO is layered pseudo-formats like .tar.gz, which were ultimately proven inefficient for various reasons. When your concern is cross-cutting, the solution also has to be cross-cutting.
|
|
|
_wb_
|
2024-01-20 05:20:47
|
For .tar.gz I think indeed the separation went a bit too far, but for something like .svg.gz it does make sense imo.
|
|
2024-01-20 05:24:34
|
(I also like to use layered extensions to keep track of what happened, e.g. I tend to make files with names like `screenshot.png.jxl`, `photo.jpg.jxl`, `image.png.d2e6.jxl`)
|
|
|
MSLP
|
2024-01-20 05:25:51
|
* svg.br 😉
|
|
|
Traneptora
|
2024-01-20 05:34:01
|
I don't bother with `image.png.jxl` if I'm doing a lossless recompression of a PNG image
|
|
2024-01-20 05:34:06
|
since it's the same pixel data
|
|
2024-01-20 05:34:14
|
for jpeg reconstructions I can see that though
|
|
2024-01-20 05:34:34
|
to let you instantly scan which `.jxl` files are JPEG reconstructions and which ones are not
|
|
|
jonnyawsom3
|
2024-01-20 06:14:12
|
Since currently encode options can't be saved as metadata, putting it in the name to reference later helps. For lossless and jpeg reconstruction it matters less though, since the jpegs naturally add reconstruction data by default and lossless is the same pixels as said above.
I say this, doing none of it myself and just having "Name.jxl" xP
Although, I'm also only testing for now until I find applications with support that I can adapt to quickly, especially since I frequently share to others and they usually can't open JXL files at all (<https://embed.moe/> does help though)
|
|
|
Traneptora
|
2024-01-20 06:52:08
|
It may be a good idea to define an Encode Settings extension bundle
|
|
2024-01-20 06:52:20
|
in annex N
|
|
2024-01-20 06:52:56
|
Or possibly an encode settings BMFF box
|
|
|
jonnyawsom3
|
2024-01-20 06:56:45
|
I was assuming a simple XML tag, although something more specific to JXL would work
|
|
|
_wb_
|
2024-01-20 08:22:44
|
Not sure if it makes sense to standardize encode settings, since encoding isn't standardized anyway.
I would just do it as a plaintext box, could call it `encs` or something (for encode settings), which consists of two lines:
encoder name and version
encode parameters
You could `brob` it if it gets a bit large, though probably that's not needed if the params are only mentioning the non-default stuff for that encoder version.
So e.g. something like
```
libjxl 0.9.1
-d 2 -e 6 --epf 0
```
where we write the encode settings in a syntax that can be copypasted to a cjxl command (even if the encode wasn't done by cjxl; it would be libjxl writing the box, not cjxl)
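Parsing such a hypothetical `encs` payload would be trivial; a minimal sketch (the box and the function name are assumptions from this discussion, not anything in the spec or libjxl):

```python
def parse_encs(payload: bytes) -> tuple[str, str]:
    """Split a hypothetical plaintext 'encs' box payload into
    (encoder name and version, encode parameters)."""
    lines = payload.decode("utf-8").splitlines()
    encoder = lines[0].strip() if lines else ""
    params = lines[1].strip() if len(lines) > 1 else ""
    return encoder, params

# Example payload matching the sketch above:
encoder, params = parse_encs(b"libjxl 0.9.1\n-d 2 -e 6 --epf 0\n")
```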
|
|
|
Traneptora
|
|
_wb_
Not sure if it makes sense to standardize encode settings, since encoding isn't standardized anyway.
I would just do it as a plaintext box, could call it `encs` or something (for encode settings), which consists of two lines:
encoder name and version
encode parameters
You could `brob` it if it gets a bit large, though probably that's not needed if the params are only mentioning the non-default stuff for that encoder version.
So e.g. something like
```
libjxl 0.9.1
-d 2 -e 6 --epf 0
```
where we write the encode settings in a syntax that can be copypasted to a cjxl command (even if the encode wasn't done by cjxl; it would be libjxl writing the box, not cjxl)
|
|
2024-01-20 10:44:45
|
if there's going to be a plaintext metadata box, it might make sense to have a `jxlm` box, where the first four bytes are a metadata ID, e.g. `encs`, identifying the metadata type
|
|
2024-01-20 10:45:09
|
that way you don't need to register a billion box types with the m4reg
|
|
2024-01-20 10:45:24
|
we can just consider plaintext metadata to all go in `jxlm`
|
|
2024-01-20 10:45:41
|
kind of like the equivalent of PNG's `iTXt`
|
|
|
_wb_
|
2024-01-20 10:59:57
|
It shouldn't have a name starting with jxl though, because iirc we specified `brob` to not be allowed on boxes with a name that starts with `jxl`
|
|
|
Traneptora
|
2024-01-20 11:00:30
|
sure, but the idea stands
|
|
|
lonjil
|
2024-01-20 11:01:09
|
`mxlj`
|
|
|
_wb_
|
2024-01-20 11:03:06
|
But yes, something like `text` would be useful, which in the payload first has a 4-byte identifier (or a zero-byte or newline terminated arbitrary length identifier, maybe) followed by arbitrary UTF-8 text
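As a sketch, the payload of such a hypothetical `text` box (a 4-byte identifier followed by arbitrary UTF-8) could be read like this; the box name, field layout, and function name are all assumptions from this discussion, nothing specified:

```python
def parse_text_box(payload: bytes) -> tuple[str, str]:
    """Split a hypothetical 'text' box payload into its 4-byte
    identifier and the UTF-8 body that follows it."""
    if len(payload) < 4:
        raise ValueError("payload too short for a 4-byte identifier")
    ident = payload[:4].decode("ascii")
    body = payload[4:].decode("utf-8")
    return ident, body

# e.g. an 'encs'-tagged text payload:
ident, body = parse_text_box(b"encslibjxl 0.9.1\n-d 2 -e 6 --epf 0\n")
```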
|
|
|
Traneptora
|
|
_wb_
But yes, something like `text` would be useful, which in the payload first has a 4-byte identifier (or a zero-byte or newline terminated arbitrary length identifier, maybe) followed by arbitrary UTF-8 text
|
|
2024-01-21 04:54:59
|
a null-terminated string is awkward to parse IMO; even PNG's `iCCP` has a cap on the name length
|
|
2024-01-21 04:55:13
|
I think a 4-byte id just makes more sense than an arbitrary-size ID
|
|
|
Oleksii Matiash
|
2024-01-21 07:50:18
|
Maybe I'm wrong, but to me it looks useful to have the distance used saved somewhere, in a way that it can be read as a value, not as text to be parsed. It would be very useful for **fast** detection (by a script, for example) of whether the file was saved losslessly or not, without the necessity of adding something like .png.jxl
|
|
|
Traneptora
|
2024-01-21 07:56:16
|
It's generally not possible to determine that fwiw
|
|
2024-01-21 07:56:59
|
Lossless requires modular mode without XYB
|
|
2024-01-21 07:57:12
|
typically lossy modular uses squeeze
|
|
2024-01-21 07:57:36
|
but the absence of Squeeze is not an indication that loss wasn't used
|
|
2024-01-21 07:58:38
|
Basically im saying that just because a codestream format can store pixel data losslessly doesn't mean the encoder actually did
|
|
2024-01-21 07:59:36
|
a modular image without XYB, gaborish, or Squeeze is probably lossless but there's no guarantee
|
|
|
Oleksii Matiash
|
2024-01-21 08:19:53
|
Got it, thank you
|
|
|
_wb_
|
|
Traneptora
a modular image without XYB, gaborish, or Squeeze is probably lossless but there's no guarantee
|
|
2024-01-21 08:22:03
|
And without delta palette. Regular palette is probably lossless (at least libjxl doesn't use that in a lossy way) but delta palette is most likely lossy (at least libjxl doesn't use it in a lossless way)
|
|
2024-01-21 08:23:33
|
But of course you never know if someone first did pngquant or something before converting to jxl, or decoded a jpeg to pixels and then losslessly encoded that. Lossless is in the end a property of a whole workflow, not of a bitstream.
|
|
|
Oleksii Matiash
|
|
_wb_
But of course you never know if someone first did pngquant or something before converting to jxl, or decoded a jpeg to pixels and then losslessly encoded that. Lossless is in the end a property of a whole workflow, not of a bitstream.
|
|
2024-01-21 08:27:43
|
Well, I can disagree here. Surely nothing can prevent people from losslessly resaving an already lossy image, but for personal use I can be confident that I did not do that
|
|
2024-01-21 08:29:09
|
But yes, in most cases it would be useless
|
|
2024-01-21 08:30:22
|
Just got the idea to register "new extension" locally - jxll for lossless 🙂 Just a joke, of course
|
|
|
Fraetor
|
|
yurume
Good read indeed. The most notable failure in this topic IMO is layered pseudo-formats like .tar.gz, which were ultimately proven inefficient for various reasons. When your concern is cross-cutting, the solution also has to be cross-cutting.
|
|
2024-01-21 10:05:45
|
Where have they been proven inefficient? I understand that in many cases a zip-like format is better due to supporting random access, but for use cases where you'll need the majority of the contents (such as code archives), I've always found tar.[something] files win.
|
|
|
spider-mario
|
2024-01-21 11:21:24
|
something like 7z would let you e.g. use different compression for different files, or group them by extension
|
|
2024-01-21 11:21:49
|
(`-ms=e`)
|
|
|
Traneptora
|
|
spider-mario
something like 7z would let you e.g. use different compression for different files, or group them by extension
|
|
2024-01-22 12:42:51
|
~~just sort before you tar, ez~~
|
|
2024-01-22 12:47:43
|
this has created my favorite shell function I've ever written
|
|
2024-01-22 12:47:46
|
```sh
sortbyext(){
    # "dir/name.ext" -> ".ext/name/dir/" so that sort groups by extension first
    sed 's_^\(\([^/]*/\)*\)\(.*\)\(\.[^\./]*\)$_\4/\3/\1_' |
        # extensionless names -> "/name/dir/" (empty extension field sorts first)
        sed 's_^\(\([^/]*/\)*\)\([^\./]\+\)$_/\3/\1_' |
        # sort by extension, then basename, then directory
        sort -t/ -k1,1 -k2,2 -k3,3 |
        # restore the original "dir/name.ext" form
        sed 's_^\([^/]*\)/\([^/]*\)/\(.*\)$_\3\2\1_'
}
```
|
|
|
yurume
|
|
Fraetor
Where have they proven inefficient? I understand that in many cases a zip like format is better due to supporting random access, but for use cases where you'll need the majority of the contents (such as code archives), I've always found tar.[something] files win.
|
|
2024-01-22 01:45:56
|
As spider-mario said, I was thinking about 7z or rar in this case (zip is not there, sadly). Thinking about .tar.gz: tar knows nothing about compression, and gzip can see the whole archive, which looks like a win-win. However, you can't easily extract *some* files from the archive, and gzip also knows nothing about archival, so it doesn't understand that file ordering (mostly) does not matter. So even for .tar.gz, the best implementation would do both the compression and the archival at once. And at that point you don't really have to distinguish tar from gzip.
|
|
|
jonnyawsom3
|
|
_wb_
But yes, something like `text` would be useful, which in the payload first has a 4-byte identifier (or a zero-byte or newline terminated arbitrary length identifier, maybe) followed by arbitrary UTF-8 text
|
|
2024-01-22 01:59:32
|
Would also allow lossless preservation of PNG text chunks in their original form, if the user doesn't override it with other data (Such as the encode settings)
|
|
|
spider-mario
something like 7z would let you e.g. use different compression for different files, or group them by extension
|
|
2024-01-22 02:02:10
|
`qs` is the "group by type" parameter, and can help a lot with almost no cost if on an SSD
|
|
|
Traneptora
|
|
yurume
As spider-mario said, I was thinking about 7z or rar in this case (zip is not there, sadly). Thinking about .tar.gz, tar knows nothing about compression and gzip can see the whole archive, which looks like the win-win. However you can't easily extract *some* files from the archive, and gzip also knows nothing about archival so that it doesn't understand file ordering (mostly) do not matter. So even for .tar.gz, the best implementation would do both the compression and archival at once. And at that point you don't really have to distinguish tar from gzip.
|
|
2024-01-22 04:52:25
|
this is why `pixz` was invented fwiw
|
|
2024-01-22 04:53:35
|
it's an xz-file-format compressor (and decompressor) that can create .tar.xz archives with good random access
|
|
|
yurume
|
2024-01-22 05:08:56
|
does pixz require pixz-produced files for fast partial decompression? then that's effectively a new format (defined as a subset of the existing format of course)
|
|
|
Traneptora
|
2024-01-22 06:48:39
|
yea
|
|
2024-01-22 06:48:54
|
if someone compressed a tar archive as a solid block there's not much you can do about that
|
|
|
Oleksii Matiash
|
|
yurume
As spider-mario said, I was thinking about 7z or rar in this case (zip is not there, sadly). Thinking about .tar.gz, tar knows nothing about compression and gzip can see the whole archive, which looks like the win-win. However you can't easily extract *some* files from the archive, and gzip also knows nothing about archival so that it doesn't understand file ordering (mostly) do not matter. So even for .tar.gz, the best implementation would do both the compression and archival at once. And at that point you don't really have to distinguish tar from gzip.
|
|
2024-01-22 07:25:49
|
[offtopic] A bit late again: gzip uses a very small window, while 7z and RAR5 have it up to 3.8 GB / 1 GB respectively. Also, 7z compresses the catalog, while RAR does not.
|
|
|
yurume
|
2024-01-22 07:27:13
|
window is an important factor, but you need to somehow model the distance distribution so even a larger window would be disadvantaged there.
|
|
|
Oleksii Matiash
|
|
yurume
window is an important factor, but you need to somehow model the distance distribution so even a larger window would be disadvantaged there.
|
|
2024-01-22 07:32:17
|
I've never seen such a case; in my experience a larger window is always better. But that is not scientific, just statistics. I also forgot to mention that AFAIK gzip does not create solid archives, and in many cases it also compresses significantly less
|
|
|
Traneptora
|
2024-01-22 07:35:39
|
gzip does make solid archives
|
|
2024-01-22 07:36:07
|
you can improve the ratio for large tar archives and small-window compressors like gzip by sorting by file extension before you tar
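A quick sketch of that trick in Python (the helper name and paths are made up): sort the member list by extension so similar files sit next to each other inside the compressor's small window.

```python
import os
import tarfile

def tar_sorted_by_extension(paths, out_path):
    """Add files grouped by extension so that a small-window
    compressor like gzip sees similar data back to back."""
    ordered = sorted(paths, key=lambda p: (os.path.splitext(p)[1], p))
    with tarfile.open(out_path, "w:gz") as tar:
        for p in ordered:
            tar.add(p)
    return ordered

# e.g. tar_sorted_by_extension(["a.html", "b.css", "c.html"], "out.tar.gz")
# adds b.css first, then keeps a.html and c.html adjacent.
```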
|
|
|
spider-mario
|
|
`qs` is the "group by type" parameter, and can help a lot with almost no cost if on an SSD
|
|
2024-01-22 07:56:17
|
I meant “grouping by type” in the sense of making solid blocks
|
|
2024-01-22 07:56:56
|
in between tar’s “one solid block for everything” (`7z -ms=on`) and zip’s “no multi-file solid block at all” (`7z -ms=off`), 7z is also capable of making one solid block per file extension (`7z -ms=e`)
|
|
2024-01-22 07:57:35
|
which can be a nice compromise in terms of giving good compression while not having to decompress absolutely everything when you need just one file
|
|
|
username
|
2024-01-22 08:04:17
|
I'm sad that rar5 doesn't support having multiple solid blocks, it's either the whole archive is non-solid or solid
|
|
|
Oleksii Matiash
|
|
Traneptora
you can improve the ratio for large tarchives and small-window compressors like gzip by sorting by file extension before you tar
|
|
2024-01-22 08:14:15
|
I do not accept the idea of tar'ing and then compressing; for large archives it is just a waste of time and resources. So archivers that compress many files into one archive are my choice 🤷♂️ Regarding gzip and solid - really? Ok, google did not give an answer
|
|
|
username
I'm sad that rar5 doesn't support having multiple solid blocks, it's either the whole archive is non-solid or solid
|
|
2024-01-22 08:14:30
|
7z does
|
|
|
username
|
2024-01-22 08:16:34
|
I know but 7z lacks features that I need/want
|
|
2024-01-22 08:19:53
|
one of which being the "recovery record" feature of rar. I know for 7z I could use Parchive, however that isn't as pleasant/nice as having support built right into the format
|
|
2024-01-22 08:20:59
|
I have a friend who backed up a project they made into a 7z file; some amount of it got corrupted and made them lose all the data
|
|
|
Traneptora
|
2024-01-22 08:21:06
|
gzip is a single file compression algorithm
|
|
2024-01-22 08:21:18
|
like all of them it is solid
|
|
|
Oleksii Matiash
|
2024-01-22 08:26:22
|
Well, that's not what I call solid, but ok, I got it
|
|
|
Traneptora
|
2024-01-22 08:28:24
|
solid means you have to decide from the start
|
|
2024-01-22 08:28:33
|
no random access
|
|
|
jonnyawsom3
|
|
spider-mario
I meant “grouping by type” in the sense of making solid blocks
|
|
2024-01-22 08:37:28
|
Ah right, I had been playing with per-file compression types in 7z a while ago so had that stuck in my head
|
|
|
username
|
|
Ah right, I had been playing with per-file compression types in 7z a while ago so had that stuck in my head
|
|
2024-01-22 08:40:09
|
i'm curious, did your testing include or make use of this https://www.tc4shell.com/en/7zip/smart7z/ (or any of the other plugins on that site) ?
|
|
|
jonnyawsom3
|
2024-01-22 08:44:20
|
Yeah, was my first time looking at the plugins so tried nearly all of them to see what I liked/what worked
|
|
|
Oleksii Matiash
|
|
Traneptora
solid means you have to decide from the start
|
|
2024-01-22 08:52:01
|
I mean another "feature" of being solid - many files are treated as one "stream", increasing the compression ratio. I.e. like tar but without tar. The fact that all files prior to the needed one must be decompressed first - well, it is not a big price for such a compression improvement. At least for me
|
|
|
yurume
|
2024-01-22 09:04:48
|
a better choice would be a learned preset dictionary that is usable for each file type
|
|
2024-01-22 09:05:54
|
solid compression does show that similar files can be compressed further by making the compressor learn their pattern, but there are better ways to teach that
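What yurume describes can be sketched with zlib's existing preset-dictionary support (the dictionary bytes here are made up, not an actually trained one): seed both the compressor and decompressor with bytes typical of the file type, and short files of that type compress better without any solid grouping.

```python
import zlib

# A made-up "learned" dictionary of byte patterns common to one file type.
ZDICT = b'{"name": "", "type": "image/jpeg", "size": 0, "tags": []}'

def compress_with_dict(data, zdict=ZDICT):
    # The compressor can reference matches inside the preset dictionary.
    c = zlib.compressobj(level=9, zdict=zdict)
    return c.compress(data) + c.flush()

def decompress_with_dict(blob, zdict=ZDICT):
    # The decompressor must be primed with the exact same dictionary.
    d = zlib.decompressobj(zdict=ZDICT)
    return d.decompress(blob) + d.flush()

sample = b'{"name": "cat.jpg", "type": "image/jpeg", "size": 4096, "tags": ["pet"]}'
plain = zlib.compress(sample, 9)
primed = compress_with_dict(sample)
assert decompress_with_dict(primed) == sample
print(len(plain), len(primed))  # the primed stream should be smaller
```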
|
|
|
jonnyawsom3
|
2024-01-22 09:23:24
|
Yeah, that's what I meant with the `qs` parameter. Since it groups types it can learn the format
|
|
|
username
|
2024-01-22 09:40:22
|
seems like there's been some recent JPEG XL related posts on Lemmy.World (Reddit alternative):
https://lemmy.world/post/11004635 (post on "memes")
https://lemmy.world/post/11021604 (post on "Programmer Humor@lemmy.ml")
|
|
|
spider-mario
|
2024-01-22 10:05:51
|
> Let’s rename JXL to GPEJ to give GIF pronunciation a run for its money
😁
|
|
|
yoochan
|
2024-01-22 10:40:23
|
gépèje ?
|
|
2024-01-22 10:41:05
|
(with south of france accent)
|
|
|
MSLP
|
|
username
seems like there's been some recent JPEG XL related posts on Lemmy.World (Reddit alternative):
https://lemmy.world/post/11004635 (post on "memes")
https://lemmy.world/post/11021604 (post on "Programmer Humor@lemmy.ml")
|
|
2024-01-22 04:21:55
|
there seem to be consequences: https://github.com/thunder-app/thunder/issues/1073
🤪
|
|
|
HCrikki
|
2024-01-22 05:14:17
|
tried on Cromite: this jxl is served as a jpg? image viewers fail to load the jpeg but open the jxl fine
|
|
2024-01-22 05:15:40
|
i wonder if it was a jpeg losslessly converted to jxl
|
|
|
jonnyawsom3
|
2024-01-22 05:18:28
|
It's like github, labelled Jpg to upload on the site apparently
|
|
|
Traneptora
|
|
HCrikki
trimed on cromite, this jxl is served as a jpg ? image viewers fail to load the jpeg but open the jxl fine
|
|
2024-01-22 05:37:04
|
if you add `?format=jxl` to the URL it will serve a JXL
|
|
|
_wb_
|
2024-01-23 06:46:18
|
Nothing new but just a short blogpost on the Samsung adoption thing: https://cloudinary.com/blog/samsung-now-supports-dng-1-7-including-jpeg-xl
|
|
|
Cacodemon345
|
2024-01-25 01:40:40
|
Google sure is facing intensified pressure to add JXL support to Chrome.
|
|
|
diskorduser
|
2024-01-25 02:03:25
|
Samsung should add jxl support in Samsung browser
|
|
|
HCrikki
|
2024-01-25 02:17:03
|
even without browsers, any web service that has a mobile app would benefit from going mostly jxl
|
|
2024-01-25 02:17:21
|
or even exclusively if their web service can only be accessed from within a mobile app
|
|
|
_wb_
|
2024-02-01 10:08:49
|
Tom Claburn from The Register asked me about the Interop 2024 thing so I expect he'll write about it...
|
|
|
MSLP
|
2024-02-02 04:26:45
|
I hope he'll contact the parties behind the decision too, and someone will give him some insight, at least on the stance of the party they're associated with.
It's interesting what Microsoft's plans are (in terms of Edge support) - eg. AVIF integration may have been done differently than in upstream Chromium, so the future for JPEG XL in Edge may be different than in Chrome
|
|
|
Jim
|
2024-02-03 12:50:08
|
It will be different. There are code suggestions that they are working on a JXL extension for the Windows Imaging Component, which would also make it work in Edge. That's why AVIF and AV1 didn't work in Edge in the same version they were available in Chrome: because Microsoft replaced the image & video backend in Chromium with the one built into Windows. So it looks like support is going Safari -> Edge -> ?
No idea who will be next, possibly Firefox, but that would likely be quite a bit later. My guess is that Google is going to push back on it right up until the point where they absolutely can't anymore.
|
|
|
HCrikki
|
2024-02-03 01:00:31
|
firefox's code in nightly should be updated despite the position on the standard. sure, it's disabled out of the box, but just integrating the working up-to-date patches that firefox derivatives already ship would be a potential precursor to out-of-box support in the *Developer Edition* (based on nightly)
|
|
|
MSLP
|
2024-02-03 03:06:05
|
Ye, since it is disabled by default they could at least merge the already existing patches for animation & other features, as it won't affect their stable distributed binaries, and other builders could have the full-featured support
|
|
|
VcSaJen
|
|
HCrikki
firefox's code in nightly should be updated despite the position on standard. sure its disabled out of the box but just integrating the working uptodate patches firefox derivatives already ship would be a potential precursor to out of box support in the *Developper edition* (based on nightly)
|
|
2024-02-03 04:33:16
|
Firefox Developer Edition and Firefox Beta are essentially the same. Weirdly, Firefox docs wrongly indicated that JXL support was available in Beta; it was fixed years later.
|
|
|
MSLP
|
2024-02-03 07:33:21
|
hmmm... was it fixed? the patch <https://github.com/uriesk/firefox-jxl-rpmspec/blob/main/firefox-enable-jxl.patch> suggests that it's in nightly only, or is it that beta also uses the `is_nightly` flag?
|
|
|
username
|
2024-02-03 07:34:08
|
Firefox Beta is not compiled with JXL support
|
|
2024-02-03 07:34:27
|
I think they mean the docs were fixed
|
|
|
MSLP
|
2024-02-03 07:35:10
|
ah, so "fixed" the other way around than I thought
|
|
|
username
|
|
_wb_
Tom Claburn from The Register asked me about the Interop 2024 thing so I expect he'll write about it...
|
|
2024-02-03 11:57:14
|
seems like it's out now https://www.theregister.com/2024/02/03/jpeg_xl_interop_2024/
|
|
|
spider-mario
|
2024-02-03 12:22:06
|
> JPEG XL, an ISO/IEC 18181 standard,
well, _the_ ISO/IEC 18181 standard
|
|
2024-02-03 12:22:10
|
an ISO/IEC standard
|
|
|
Traneptora
|
2024-02-03 05:12:53
|
comments on that are funny
|
|
2024-02-03 05:12:58
|
"JPEG XL was killed by patent trolls"
|
|
2024-02-03 05:13:02
|
hm
|
|
|
_wb_
|
2024-02-03 07:29:51
|
No trolls yet — fingers crossed it remains like that...
|
|
2024-02-03 07:56:00
|
Those comments are mostly cringe...
|
|
|
.vulcansphere
|
2024-02-04 07:33:27
|
Knowing El Reg, I usually avoid the comments section (except for laughing)
|
|
|
Moritz Firsching
|
2024-02-04 05:44:31
|
https://news.ycombinator.com/item?id=39250938
|
|
|
Jim
|
2024-02-04 06:40:44
|
The trolls are now being released.
|
|
|
username
|
2024-02-04 07:17:53
|
I keep seeing the security argument appear. And it's like, yeah? The exact same thing goes for literally any other format. So wouldn't it be nice to have something that isn't made to be thrown away after 7 or so years? Wouldn't it be nice to have something that's valid for current **AND** future use cases, instead of something that's already facing, and will keep facing, use-case issues due to its limitations?
|
|
2024-02-04 07:21:05
|
the security argument applies to future formats as well, so wouldn't it be nice to have something that negates the need for future formats, so this decision wouldn't have to be made again and again?
|
|
|
_wb_
|
2024-02-04 07:41:50
|
I think the security argument is quite overrated tbh. Everything can have bugs but at least codecs are integrated in much more than just browsers so there are a lot of eyes on it. There's a ton of web-only code that gets added to browsers all the time, which imo is a much bigger security risk...
|
|
|
Quackdoc
|
2024-02-04 07:55:03
|
I don't necessarily think it's overrated, but those are issues which could be solved.
|
|
|
lonjil
|
2024-02-04 08:05:58
|
Just use jxl-oxide
|
|
|
Quackdoc
|
2024-02-04 08:06:50
|
rust is nice but it's no magic bullet, it still needs the typical fuzzing and what not
|
|
|
_wb_
|
2024-02-04 08:28:53
|
I mean, most of the serious bugs will allow you to craft an image that will cause the browser tab to crash. But there are plenty of other ways to do that anyway.
Exploiting a bug in a codec in a way that does something more sneaky/malicious than crashing the tab is probably in principle possible but has anyone seen such exploits? In the end it can still only affect the rendering process which should be isolated from most of the really sensitive stuff, right?
|
|
|
Quackdoc
|
2024-02-04 08:30:06
|
I've seen a couple before that allow execution of code; sandbox escapes are a different story
|
|
|
_wb_
|
2024-02-04 08:31:56
|
Compared to new browser functionality like entire new web apis that can be called directly from Javascript and that may have bugs since the code is brand new and gets written _only_ for browsers, I think codecs are kind of not _that_ risky
|
|
2024-02-04 08:34:58
|
I mean, I don't want to downplay the risks, any code should get thorough security review etc. But it just feels weird that for image formats this is a big argument while for all the other code that goes into browsers it doesn't seem to be used as often as an argument to block new features...
|
|
2024-02-04 08:37:17
|
(also for webp and avif this argument didn't stop Chrome from integrating them very early on, even before these codecs were integrated in anything else)
|
|
|
Quackdoc
|
2024-02-04 08:38:07
|
well I haven't investigated it myself, but iirc the latest libwebp vuln was actually really significant. I can't say how much for chromium, but at the very least, the libwebp bug in android allowed for arbitrary code execution
|
|
2024-02-04 08:39:38
|
I'm not sure how Chrome handles images or anything, so I can't really comment on to what degree Chrome/Electron apps may or may not have been affected, but at the very least, it did prove that attacking media decoders is still a potential attack vector
|
|
|
spider-mario
|
2024-02-04 08:41:03
|
true, but considering that it’s one such incident, in code that’s about 12 years old by now, it’s not very strong evidence of a general trend
|
|
|
yoochan
|
2024-02-04 08:41:43
|
Someone in the hackernews thread mentions wuffs. Did you know about this project? https://github.com/google/wuffs?tab=readme-ov-file
|
|
|
Quackdoc
|
|
spider-mario
true, but considering that it’s one such incident, in code that’s about 12 years old by now, it’s not very strong evidence of a general trend
|
|
2024-02-04 08:49:26
|
I'm not sure it would be considered out of trend. However, at the very least, AWS found at least some degree of reason to believe dav1d could be a potential issue, and reached out to Prossimo, funding them $1M to work on 4 projects: rustls, ntpd-rs, sudo/su-rs and, important to the discussion at hand, rav1d, a direct Rust port of dav1d.
There have been other cases of denial-of-service attacks, but nothing major I can remember, though I think something happened fairly recently with some audio codec? It was also some really old code, however.
|
|
|
|
veluca
|
2024-02-04 09:50:35
|
I am not sure what to think of rav1d
|
|
2024-02-04 09:50:47
|
last I heard, they plan to still keep the raw asm files, which... yeah
|
|
2024-02-04 09:51:10
|
the thing about codecs is that bugs can be a lot more subtle, which makes finding them by fuzzing or other means a lot harder
|
|
2024-02-04 09:51:15
|
and sandboxes are not perfect
|
|
2024-02-04 09:51:28
|
but it's IMO not *enough* of a reason
|
|
|
yoochan
Someone in the hackernews thread mentions wuffs. Did you know about this project? https://github.com/google/wuffs?tab=readme-ov-file
|
|
2024-02-04 09:58:11
|
on wuffs: I have mixed opinions... on one hand, more efforts for safe codecs are good. On the other hand, I would prefer that it didn't come via a *completely new language*
|
|
|
_wb_
|
2024-02-04 10:00:20
|
No language protects against brainos.
|
|
|
|
veluca
|
|
_wb_
No language protects against brainos.
|
|
2024-02-04 10:01:45
|
well, a few languages can protect against silly mistakes giving random people on the internet remote code execution 😉
|
|
|
_wb_
|
2024-02-04 10:10:51
|
Sure, but as long as the language is Turing complete it is impossible to make it fully safe by design. Though of course there is a spectrum of how easy it makes it to shoot yourself in the foot.
|
|
|
|
veluca
|
2024-02-04 10:15:22
|
yeah, and it can be pretty wide 😛
|
|
|
Traneptora
|
2024-02-04 11:04:14
|
yea, though some languages make mistakes easy and silent
|
|
2024-02-04 11:04:19
|
such as SHELL
|
|
2024-02-04 11:05:25
|
relatively innocuous stuff like `cd $(dirname $0)` is not safe in shell
|
|
2024-02-04 11:06:02
|
you have to do `cd "$(dirname "$0")"`
|
|
2024-02-04 11:06:30
|
in Shell, the language itself permits you to do these bad things, which are expressly legal, but basically never what you want
|
|
2024-02-04 11:07:00
|
in C (since '89), if you do something weird it will usually emit a compiler warning and in C11 it will probably be a compiler error
|
|
2024-02-04 11:08:18
|
languages like shell and php often do the wrong thing with no warning
|
|
2024-02-04 11:08:22
|
php is especially bad in this regard
|
|
|
spider-mario
|
|
Traneptora
you have to do `cd "$(dirname "$0")"`
|
|
2024-02-04 11:20:48
|
to be even safer, I suspect one should perhaps even do `cd -- "$(dirname -- "$0")"`
|
|
2024-02-04 11:21:04
|
(although it starts to verge on paranoia)
|
|
|
Traneptora
|
2024-02-04 11:21:53
|
`--` is a more recent extension but yes
|
|
|
spider-mario
|
2024-02-04 11:21:56
|
(but being at least a little paranoid is required for robust shell scripts)
|
|
|
Traneptora
|
2024-02-04 11:22:13
|
ye, the fact that you have to do `rm -f -- "$foo"` to be safe is kinda silly
|
|
2024-02-04 11:23:05
|
the fact that differences in the name of a file can break a script is a thing relatively unique to shell
languages with variables and functions, like C, Java, Python, etc. don't have this issue
|
|
|
spider-mario
|
2024-02-04 11:23:10
|
don’t forget having to do
```
while IFS= read -r -d '' f; do
    ...
done < <(find . ... -print0)
```
to safely iterate through the results of `find`
|
|
2024-02-04 11:23:27
|
(reading line-by-line would break if some filenames contain `\n`)
|
|
2024-02-04 11:23:59
|
honestly, I’d rather just write a perl script
|
|
|
Traneptora
|
2024-02-04 11:24:04
|
you don't actually need process substitution here
|
|
|
spider-mario
|
2024-02-04 11:24:41
|
you do if your loop is supposed to modify variables from outside the loop
|
|
2024-02-04 11:24:47
|
a pipe would spawn a subshell
|
|
|
Traneptora
|
2024-02-04 11:24:50
|
ye, that's true
|
|
|
spider-mario
|
2024-02-04 11:25:10
|
always doing it this way spares you from having to think about whether it applies
|
|
2024-02-04 11:25:24
|
and getting bitten if you forget about it
|
|
|
Traneptora
|
2024-02-04 11:26:01
|
ye, but idk if you can do that in posix SH
|
|
2024-02-04 11:26:20
|
the point is that your code is unsafe but will still work most of the time
|
|
2024-02-04 11:26:24
|
it's just very fragile
|
|
|
spider-mario
|
|
Traneptora
|
2024-02-04 11:27:23
|
the problem with php and shell is that both actively encourage incorrect code
|
|
2024-02-04 11:27:32
|
or rather, don't make it obvious that you've written something incorrect
|
|
2024-02-04 11:28:25
|
like, consider Java's `String.indexOf` function
|
|
2024-02-04 11:29:04
|
`"foo".indexOf('b')` returns `-1`, which is not a legal array index in Java. attempting to index an array with -1 will crash the program
|
|
2024-02-04 11:29:33
|
in Python, `"foo".index('b')` raises a `ValueError`
|
|
2024-02-04 11:29:47
|
if you wrote code that didn't handle that scenario, when it shows up, your program crashes
|
|
2024-02-04 11:29:49
|
and you go "right, whoops" and fix it
|
|
2024-02-04 11:30:21
|
in C if you call `strchr("foo", 'b')` it returns `NULL`, and doing anything interesting with that as though it were a real string segfaults your program
|
|
2024-02-04 11:30:23
|
and you go "oops"
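The same contrast exists side by side in Python's own string API, which makes a neat demo of the "loud vs. silent" failure modes being discussed: `find` returns a Java-style -1 sentinel that happens to be a valid index, while `index` raises.

```python
s = "foo"

# str.find mirrors Java's indexOf: -1 on failure...
assert s.find("b") == -1
# ...but -1 is a *legal* (negative) index in Python, so an unchecked
# result silently gives you the last character instead of an error:
assert s[s.find("b")] == "o"

# str.index raises instead, so the forgotten check fails immediately:
try:
    s.index("b")
    outcome = "no error"
except ValueError:
    outcome = "ValueError"
print(outcome)  # ValueError
```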
|
|
|
spider-mario
|
2024-02-04 11:30:29
|
in Haskell, you get a `Maybe Int`, and you have to handle both cases explicitly
|
|
2024-02-04 11:31:00
|
(or use Maybe as the monad that it is, or whatever)
|
|
|
Traneptora
|
2024-02-04 11:31:04
|
ye, point is, even if the language doesn't force you to check for the failure case, your program fails hard and immediately if you, as a programmer, forget
|
|
2024-02-04 11:31:18
|
which makes these sorts of bugs show up fairly noisily
|
|
2024-02-04 11:31:22
|
and gets them fixed
|
|
2024-02-04 11:31:42
|
in PHP, `strpos` returns a nonnegative integer on success, but returns `FALSE` on failure
|
|
2024-02-04 11:31:59
|
so if you, as a programmer, forgot to deal with that case, then most language functions will just silently convert it to 0 for you
|
|
2024-02-04 11:35:05
|
```
php -r '$foo = "xyz"; var_dump($foo[strpos($foo, "b")]);'
```
this prints `"x"`
|
|
2024-02-04 11:35:26
|
in super recent versions of PHP it also prints a warning to the console
|
|
2024-02-04 11:35:28
|
but that's it
|
|
2024-02-04 11:35:44
|
there's no indication that it didn't work
|
|
|
spider-mario
|
2024-02-04 11:38:38
|
yeah, returning a different type is already kind of a smell in itself
|
|
|
Traneptora
|
2024-02-04 11:41:14
|
"For us at OpenBSD, we tend to rely a lot on exploit-mitigation techniques in the operating system. We tend to know that, we try to review our code thoroughly, and stuff we bring in from things like ports, and upstream maintainers, at least if it has mistakes, usually, when you get to the fact that our address space is aggressively randomized, is aggressively put to a place where... malloc moves around. You end up next to dead pages. It's usually likely that if you run a piece of software for any given time on OpenBSD, eventually if it's got issues, it starts to crash. And when it starts to crash, we tend to find these things, find these bugs, and push them upstream."
|
|
2024-02-04 11:41:17
|
-Bob Beck, 2014 (an OpenBSD developer)
|
|
2024-02-04 11:41:41
|
he was at a talk about the OpenSSL vulnerability called heartbleed
|
|
2024-02-04 11:41:53
|
and why it prompted them to fork it and create LibreSSL
|
|
2024-02-04 11:42:05
|
(spoiler: the reason was not heartbleed)
|
|
2024-02-04 11:47:41
|
"Their malloc-replacement library was really the final straw for us. ... Eventually, they decided 'that malloc is slow on some platforms, so let's assume malloc is slow everywhere, and lets implement our own cache that never frees anything and just re-uses the objects. Better yet, the way that it reuses objects is it keeps a last-in-first-out queue.' So if you actually are doing a use-after-free, chances are excellent that that object is still there. Matter of fact, it's almost certain that if you free something and use it immediately, that thing is still there, and it doesn't matter that you freed it. Okay, as you can image, this made Heartbleed that much worse. Anything that would have attempted to free memory, you know, recently used keys, bits like that, well that's still all there, it's still all there in a last-in-first-out manner, sitting there right in the same chunk of memory it was allocated in."
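A toy sketch (nothing to do with the real OpenSSL code) of the failure mode described in that quote: a LIFO free list that never wipes returned buffers means a use-after-free almost certainly sees the old contents intact.

```python
# Toy model of a LIFO free-list allocator that never clears
# returned buffers, as described in the quote above.
class FreeListAllocator:
    def __init__(self):
        self._free = []  # last-in-first-out stack of "freed" buffers

    def alloc(self, size):
        if self._free:
            return self._free.pop()  # reuse without clearing
        return bytearray(size)

    def free(self, buf):
        self._free.append(buf)       # contents are NOT wiped

a = FreeListAllocator()
key = a.alloc(16)
key[:6] = b"secret"
a.free(key)             # caller believes the secret is gone
leak = a.alloc(16)      # a use-after-free-style reuse...
print(bytes(leak[:6]))  # ...still sees the old contents: b'secret'
```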
|
|
2024-02-04 11:47:56
|
makes you wonder how a globally used cryptographic library could have this issue
|
|
|
|
veluca
|
|
Traneptora
```
php -r '$foo = "xyz"; var_dump($foo[strpos($foo, "b")]);'
```
this prints `"x"`
|
|
2024-02-04 11:57:51
|
IIRC at some point you couldn't `var_dump` an expression, just a variable, so at least that was an improvement 😛
|
|
|
Traneptora
|
2024-02-05 12:00:31
|
<:kek:857018203640561677>
|
|
|
Jim
|
2024-02-05 01:38:44
|
The thing I find most odd is the sheer amount of scrutiny jxl gets compared to the rest of the ecosystem.
When AV1/AVIF was being readied for release, there was no massive pushback with people saying it needed X and Y and Z. They just stuck it in, even with its performance issues, not-yet-solidified standard, and lack of web-related features.
But as JXL is getting ready for release there are always people claiming
> - "welllll, there is always reluctance because it needs more implementations."
> - "Oh, there are implementations? There are security and memory issues to think about."
> - "All image formats have that? Well, well, nobody wants features like progressive rendering. Just make your photos postage-stamp sized and your problems are solved."
> - "Oh, the majority of people want that? Well, that's a matter of opinion."
> - "Ok, ok, ok, I see your points... but we don't want an all-encompassing format. We want one format for each type of image. One for lossy, one for ultra-lossy (don't mind that nobody actually asked for that one), one for lossless, one for animations, and so on."
> - "I see your point, but we already have a format for lossless, PNG. We have one for animations... well, two actually, GIF and WEBP. We have one for photos, JPG... and one for really tiny images and video stills, AVIF. So there are too many formats already, people are getting confused by all of them..."
> - "What do you mean I just said we need multiple formats? Don't put words in my mouth by looking at my post history."
The bar just keeps getting moved further and further.
Google can put BS in their browser that seriously threatens privacy and possibly security, that's wildly unpopular and that other browsers refuse to even touch, but a new image format that has the potential to finally retire some old formats? No.
|
|
|
Quackdoc
|
2024-02-05 02:02:36
|
IMO it's understandable why avif wasn't scrutinized: everything needed for AVIF was already there. They didn't actually need to add anything for avif (well, apparently firefox did, because firefox doesn't make sense). All the scrutiny happened when debating whether to add AV1 to browsers, which wasn't a hard decision at all; AVIF is a mostly free benefit that came bundled with that
|
|
|
w
|
2024-02-05 02:16:17
|
I hate that argument, avif is not just av1
|
|
|
Jim
|
2024-02-05 02:18:01
|
Firefox had trouble because it had no way to do video in an image tag. Adding AVIF was easy; making it play in an img element was not. It worked initially when AVIF's animation was separate from AV1, but when they changed it to be a full AV1 video, it broke in FF until they figured out the video part.
As for AVIF or AV1, nobody said "well, we already have webp, we don't need another format" - they say that about JXL. I don't remember any real pushback on security either, yet jxl is gone over with a fine-tooth comb.
|
|
2024-02-05 02:20:01
|
My point is that the amount of scrutiny is 1000x what other formats had when they were being introduced.
|
|
|
Quackdoc
|
|
w
I hate that argument avif is not just av1
|
|
2024-02-05 02:21:07
|
true, avif is av1 in a stupid container <:yep:721359241113370664>
|
|
|
Jim
Firefox had trouble because it had no way to do video in an image tag. Adding AVIF was easy, making it play in an img element was not. It worked initially when AVIF's animation was separate from AV1, but when they changed it to be a full AV1 video it broke in FF until they figured out the video part.
As for AVIF or AV1, nobody said "well we already have webp, we don't need another format" - they say that about JXL. I don't remember any real pushback as far as security yet jxl is run over with a fine-tooth comb.
|
|
2024-02-05 02:26:56
|
> AVIF's animation was separate from AV1, but when they changed it to be a full AV1
wasn't it that way since some of the first revisions of the avif spec?
|
|
|
w
|
2024-02-05 02:28:07
|
Avif takes a lot more than just playing it as av1
|
|
2024-02-05 02:28:37
|
Even in Chrome
|
|
2024-02-05 02:29:00
|
It's like saying webp is vp8
|
|
|
Quackdoc
|
2024-02-05 02:29:52
|
?
|
|
2024-02-05 02:30:16
|
treating avif images as av1 videos works fairly well when you add some simple brand parsing to determine how to treat it
|
|
|
w
Avif takes a lot more than just playing it as av1
|
|
2024-02-05 02:31:47
|
are there any cases where it is more than that? I haven't really encountered any myself, outside ofc the simple brand parsing I mentioned before.
|
|
|
w
|
2024-02-05 02:32:01
|
idk like when there's a damn icc profile attached
|
|
|
Quackdoc
|
2024-02-05 02:32:43
|
ICC handling isn't really that special, but aside from ICC and alpha (in which similar techniques have been used before) it's not really all that different
|
|
|
w
|
2024-02-05 02:33:31
|
it is different because you can't just stick it in the GPU overlay decoder/compositor
|
|
|
Quackdoc
|
2024-02-05 02:33:45
|
also it's worth noting ffmpeg has supported mp4 files with icc for a while now, though at the time it may have been a bit novel I suppose
|
|
|
username
|
|
Quackdoc
ICC handling isn't really that special, but aside from ICC and alpha (in which similar techniques have been used before) it's not really all that different
|
|
2024-02-05 02:35:35
|
I mean you could *kinda* say the same thing about lossy WebP... like, iirc, I don't think they really changed anything from VP8, and also the changes they did make weren't really to the core of VP8 (EDIT AFTER THIS POINT) and were more so container-level stuff
|
|
|
w
|
2024-02-05 02:36:33
|
I guess the decoding of the image data may be the same but what I mean is the image is more than just the image data
|
|
|
Quackdoc
|
2024-02-05 02:36:52
|
yeah, the vast majority of avif can be handled properly when you treat it as a simple video, at least. ICC will ofc, as with all container-based things, be a bit sketchy depending on whether or not your player can handle it. Same with alpha. Using a second track for alpha isn't exactly novel, but it's not super common either.
|
|
|
w
|
2024-02-05 02:38:04
|
sure it may be possible to just treat it as video but obviously nobody is going to do that
|
|
|
Quackdoc
|
2024-02-05 02:38:07
|
the PR for handling ICC in mov/mp4 was initially shared on the ML in 2019 or 2020 iirc. so it for sure would have been an issue for the earlier days, at least for ffmpeg
|
|
|
w
|
2024-02-05 02:38:27
|
like have you seen the chromium avif module
|
|
2024-02-05 02:38:33
|
there's so much it's doing
|
|
|
Quackdoc
|
2024-02-05 02:39:38
|
isn't it only like 600loc?
|
|
2024-02-05 02:40:25
|
ah it's 1.4kloc now
|
|
2024-02-05 02:42:28
|
but yeah, personally for the longest time, I was just using MPV to view avif files since it worked quite well before everything supported avif
|
|
|
Jim
|
2024-02-05 02:45:14
|
I've heard you can just take an AV1 keyframe, wrap it with an AVIF header, and it will work fine.
|
|
|
Quackdoc
|
2024-02-05 02:45:36
|
but in terms of actual technology to bolt onto the browser, avif added quite little; the decoders were the same anyway, since it's not like there is any change in attack surface so long as you are using the same decoder version for avif and video handling. so there isn't really much burden there. for chromium they did wind up electing to properly support it with libavif iirc
|
|
|
Jim
I've heard you can just take an AV1 keyframe, wrap it with a AVIF header and it will work fine.
|
|
2024-02-05 02:45:48
|
you can take an entire av1 video and wrap it with an avif header and it works fine
|
|
|
w
|
2024-02-05 02:46:05
|
if they use ffmpeg anyway why not support the cum video format and the pee poo image format
|
|
|
Quackdoc
|
|
Quackdoc
you can take an entire av1 video and wrap it with an avif header and it works fine
|
|
2024-02-05 02:46:22
|
I made some really nasty animated images with that lol
|
|
|
Jim
|
|
Quackdoc
> AVIF's animation was separate from AV1, but when they changed it to be a full AV1
wasn't it that way since some of the first revisions of the avif spec?
|
|
2024-02-05 02:46:47
|
No, I remember initially it was just a sequence of frames. Once browsers started implementing it, they changed it to be an AV1 video. Chrome made the change within one version, but Firefox took many months, with animated AVIFs broken, before they fixed it.
|
|
|
Quackdoc
|
|
Jim
No, I remember initially it was just a sequence of frames. Once browsers started implementing it, they changed it to be an AV1 video. Chrome made the change within 1 version but Firefox took many months with broken animated AVIFs no longer working before they fixed it.
|
|
2024-02-05 02:53:36
|
I don't think this was the case; I'd seen it claimed a lot, but even in early versions of the spec it was just "animated". In fact `avio` was added afterwards to denote intra-sequence-only files. I did quickly look at the git commit history and it doesn't seem like this was the case
|
|
|
Jim
|
2024-02-05 02:58:35
|
I distinctly remember it happening, mostly in Firefox. That you saw it claimed should tell you it wasn't people's imagination. Chrome came out with a new version pretty much right at the time they made the decision, so it was hardly noticeable at all. Firefox initially worked: I remember testing it, and the original "sequence of images" files worked, but then the next version of FF came out with an updated AVIF decoder that eliminated sequence-of-images (not sure why they didn't just deprecate it, but probably, again, because nobody was using it yet). Firefox devs said it would take quite a bit of new code to get videos in img tags working. So for quite a while, still-image AVIFs worked, but both the old and new video AVIFs were broken until they could get video images to work.
|
|
|
Quackdoc
|
2024-02-05 03:02:45
|
just because people said it was true doesn't make it true. I'm looking back through the various commits to the file and indeed, I'm not able to find any info that states that images using the `avis` brand were ever meant to be intra-only. This is the committed bit where it was first introduced. It says nothing specific as to whether or not the sequence(s) should be intra-only.
```html
<h3>AVIF image sequence brand</h3>
Files conformant with the brand-independent restrictions in this document (sections [[#image-item-and-properties]], [[#image-sequences]] and [[#alpha-images]]) shall include the brand
<dfn value="" export="" for="AVIF Image Sequence brand">avis</dfn> in the [=compatible_brands=] field of the [=FileTypeBox=].
<p>Files should also carry a compatible brand to identify the AVIF profile (see section [[#profiles]]), if any, with which the file complies.</p>
<p>If 'avis' is specified in the major_brand field of the FileTypeBox, the file extension should be ".avifs".
The MIME Content Type for files with the ".avifs" file extension shall be "image/avif-sequence".</p>
```
this test file from 2019 has i frames and p frames https://github.com/AOMediaCodec/av1-avif/pull/50
this 2018 commit states that `As of today, you have to inspect the track and detect that all samples are 'sync' samples. This might be a bit late and could result in files being downloaded while not being playable, if the decoder only supports intra.` https://github.com/AOMediaCodec/av1-avif/issues/27
EDIT: oops copied the wrong bit
|
|
|
Jim
I distinctly remember it happening, mostly in Firefox. That you saw it claimed should tell you it wasn't people's imagination. Chrome came out with a new version pretty much right at the time they made the decision, so it was hardly noticeable at all. Firefox initially worked: I remember testing it, and the original "sequence of images" files worked. But then the next version of FF came out with the updated AVIF decoder that eliminated sequences of images (not sure why they didn't just deprecate it, but probably again because nobody was using it yet). Firefox devs said it would take quite a bit of new code to get videos in img tags working. So for quite a while still-image AVIFs worked, but both the old and new video AVIFs were broken until they could get video images to work.
|
|
2024-02-05 03:06:35
|
yeah, the avis part was added October 3rd, 2018 if the git history is to be believed; the git issue to add avio was created October 16th, 2018. Even if there was a commit in there that specified it was intra-only, that is far too short a time for this to become a significant issue. That, or the avif spec guys royally messed with the git tracking dates
|
|
2024-02-05 03:13:36
|
you can see the commit range here https://github.com/AOMediaCodec/av1-avif/commits/ae382b8967bd4e5dd7b7f94fea05d06df907b0e1/index.bs just go down to `oct 3`
|
|
|
Jim
|
2024-02-05 03:19:13
|
Just because they committed something doesn't mean it instantly landed in a new version and was added to browsers at the same instant.
> even if there was a commit in there that specified it was intra only, that is far too short of a time for this to become a significant issue
I already said that Chrome updated very quickly and had video in img tags working shortly after they announced the change. Virtually nobody was using it in the wild outside of the test sites at that point. Hell, even today I don't think I've seen any AVIF video images yet. They all just use gif or h264. Not sure why you are acting like I was saying it caused the end of the Internet as we know it.
|
|
|
Quackdoc
|
2024-02-05 03:21:04
|
this was before avif was even a proper spec, and there was only about a month between when `avis` was introduced, and when `avio` was introduced
|
|
|
Jim
|
2024-02-05 03:24:50
|
I don't remember when exactly it happened, don't really care. Hell, they still have sites claiming AVIF has progressive rendering support. I wouldn't trust that. I didn't make up what happened. But I'm also not going to go back and try to find some specific commits in the software to prove that it happened.
|
|
|
Quackdoc
|
2024-02-05 03:26:13
|
I'm not saying it was never an issue in software, but it *was* never an issue with avif itself, and I think being honest is critical when it comes to comparing one thing to another, especially when we're making comparisons to get something/someone to add support to an app or device
|
|
|
Jim
|
2024-02-05 03:27:42
|
You should stop telling me that and go tell the team that made up the BS benchmarks that nobody has been able to replicate. Or the team that's claiming AVIF has progressive rendering. You're going to go after me for being dishonest after all the things Google has been dishonest about?
|
|
|
Quackdoc
|
2024-02-05 03:29:03
|
I'm not going after you? I'm letting you know so you don't spread false information, spreading false information when trying to compare something to something else is one of the most damaging things you can do to a cause you support.
|
|
|
Jim
|
2024-02-05 03:29:57
|
Again, Google does it all the time.
|
|
2024-02-05 03:30:52
|
Yet they're one of the most followed companies out there.
|
|
2024-02-05 03:32:24
|
I'm not spreading false information. I know it happened; maybe when I have time I'll go back and find what, where, how. I just don't appreciate you telling me I imagined something and that I have to have some git commit for every sentence I write.
|
|
|
Quackdoc
|
2024-02-05 03:33:10
|
I'm not saying you imagined it, I'm saying you were told wrong if you believe the issue was with the spec. Also, Google has one of the worst reputations in tech circles, if not the worst.
|
|
2024-02-05 03:33:19
|
I don't see why we should try and emulate that
|
|
|
Jim
|
2024-02-05 03:44:08
|
Again, I would have to go back and see if I can find the forum posts from the devs - I remember reading at the time about the change being made. What I meant was not a change to how the spec was written, but that they changed from requiring it be a sequence of images to a video which would likely be a change in what browsers require. I don't mean that the spec itself was rewritten, my point was they were requiring a change in what the browsers were going to support.
|
|
2024-02-05 03:45:11
|
I'm guessing that is where you got hung up. I don't really care what the spec says, it probably supported videos or sequences early on. But as far as browsers were concerned, the sequences of images got removed and replaced with a video.
|
|
|
_wb_
|
2024-02-05 06:33:46
|
AVIF was intra-only initially, just like WebP. Which makes sense for an image format: why require everyone to implement all inter coding tools if it's just to encode still images?
Back when AVIF was a candidate to become JPEG XL, it was assumed that inter was not going to be part of it.
|
|
|
Quackdoc
|
2024-02-05 06:48:36
|
I'm still a bit conflicted. On one hand, the av1 spec is a bloody goliath. It's a lot for an image format. However, as an "animated image" avif is still really nice. But as far as I know, when "animated avif" got pushed into the public spec, it was as "video"
|
|
2024-02-05 06:49:06
|
it's kind of funny that this issue was actually acknowledged though
```
As discussed last week, we should consider whether we want to define a brand to identify when image sequences are intra-only coded.
As of today, you have to inspect the track and detect that all samples are 'sync' samples. This might be a bit late and could result in files being downloaded while not being playable, if the decoder only supports intra.
It might be better defined also within MPEG, as this is codec-independent.
```
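A rough sketch of what that inspection amounts to (my own illustration, not from the spec; the function names are hypothetical): in ISO-BMFF, a track in which every sample is a sync sample simply omits the `stss` (sync sample) box, so a reader has to walk the box tree just to learn whether a sequence is intra-only.

```python
import struct

def iter_boxes(data, offset=0, end=None):
    """Yield (type, payload_start, payload_end) for ISO-BMFF boxes in a byte range."""
    end = len(data) if end is None else end
    while offset + 8 <= end:
        size, btype = struct.unpack_from(">I4s", data, offset)
        if size == 1:  # 64-bit "largesize" follows the compact header
            size = struct.unpack_from(">Q", data, offset + 8)[0]
            header = 16
        else:
            header = 8
        if size < header or offset + size > end:
            break  # malformed box; stop rather than over-read
        yield btype.decode("latin-1"), offset + header, offset + size
        offset += size

# Container boxes on the path from the file root down to the sample table.
CONTAINERS = {"moov", "trak", "mdia", "minf", "stbl"}

def has_stss(data, offset=0, end=None):
    """True if any track carries an 'stss' box, i.e. not all samples are sync samples."""
    for btype, start, stop in iter_boxes(data, offset, end):
        if btype == "stss":
            return True
        if btype in CONTAINERS and has_stss(data, start, stop):
            return True
    return False

def box(btype, payload=b""):
    """Build a minimal box for demonstration purposes."""
    return struct.pack(">I4s", 8 + len(payload), btype) + payload

# A track whose stbl has no 'stss' box: every sample is a sync sample.
intra_only = box(b"moov", box(b"trak", box(b"mdia", box(b"minf", box(b"stbl")))))
with_inter = box(b"moov", box(b"trak", box(b"mdia", box(b"minf",
                 box(b"stbl", box(b"stss", b"\x00" * 8))))))

print(has_stss(intra_only))  # False -> intra-only sequence
print(has_stss(with_inter))  # True  -> some samples are not sync samples
```

Which is exactly the complaint in the issue: you may have fetched a good chunk of the file before you can even make that determination.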
|
|
2024-02-05 06:49:57
|
I think an intra-only decoder would still be viable, but I'm not sure if it would be worth supporting intra-only sequences or not
|
|
|
_wb_
|
2024-02-05 07:41:15
|
In my opinion it's mostly a question of expected life span of a codec.
If you aim for short cycles, a new codec every 5 years or so, and assume content negotiation, then you can just include whatever in a codec even if it gives relatively little "bang for buck" (e.g. inter coding tools for an image format, just to get better compression of animation) since the codec will be retired anyway — e.g. once AV3 development starts, the browser can retire AV1F support since everyone is using AV2F anyway.
If you aim for long cycles, a new codec every 30 years or so, and don't assume content negotiation, then it makes more sense to keep the spec simple since decoders will have to be kept around forever.
|
|
|
lonjil
|
2024-02-05 08:54:56
|
Browsers will never retire AV1F. They'll just add future AVn versions to the AVIF spec and expect browsers to support them forever.
|
|
|
username
|
2024-02-05 10:17:41
|
seems like the interop 2024 retrospective will bring up JPEG XL https://github.com/web-platform-tests/interop/issues/611#issuecomment-1925850571
|
|
|
sklwmp
|
2024-02-05 10:25:03
|
ngl, i wonder if by this point developers and/or the internet have gotten annoyed at us JXL "activists" pushing on every platform we find
|
|
2024-02-05 10:25:17
|
but it's good to see pressure having *some* effect
|
|
2024-02-05 10:33:42
|
if only public pressure could push Chromium over the edge 🙂
|
|
|
yoochan
|
2024-02-05 10:46:07
|
and firefox be less lazy
|
|
|
|
veluca
|
|
sklwmp
if only public pressure could push Chromium over the edge 🙂
|
|
2024-02-05 10:57:22
|
this sentence with an extra capital letter is funnier 😛
|
|
|
Jim
|
2024-02-05 11:09:48
|
Firefox is more lacking resources than lazy. They had multiple rounds of layoffs over the past couple years yet I'm surprised that they are still largely keeping up with new features.
|
|
|
yoochan
|
2024-02-05 11:16:32
|
but for the jpeg xl, code is contributed by the community, they would just need to review and unflag... but I agree it would be nice if they could hire some devs again
|
|
|
lonjil
|
2024-02-05 11:48:35
|
They did I think, Firefox dev budget went up and I saw a Firefox dev on mastodon saying they have more resources now.
|
|
|
HCrikki
|
2024-02-05 11:57:13
|
Review and merge only concern upstreaming. It'd be the best outcome, but 3rd-party builds including that support as one extra patch built in would be more immediately useful
|
|
2024-02-05 11:58:00
|
Without intending to fork, like with mandriva's build of chromium with jxl
|
|
|
Jim
|
|
username
seems like the interop 2024 retrospective will bring up JPEG XL https://github.com/web-platform-tests/interop/issues/611#issuecomment-1925850571
|
|
2024-02-05 11:58:40
|
I don't expect much more than the talking points they already talked about along with some fluff about how the selections move the web forward. Here is that user's github bio. Is there anywhere where Google isn't the gatekeeper of the web?
|
|
|
lonjil
|
2024-02-05 12:00:09
|
It's called interop, not the new standards for the web committee
|
|
|
Jim
|
|
lonjil
They did I think, Firefox dev budget went up and I saw a Firefox dev on mastodon saying they have more resources now.
|
|
2024-02-05 12:01:46
|
They laid off 25% of their staff a few years ago. They are also pushing heavily into AI. If they hired largely AI developers that isn't likely to add significant resources for general browser development.
|
|
2024-02-05 12:02:06
|
So public input doesn't matter?
|
|
|
lonjil
|
|
Jim
They laid off 25% of their staff a few years ago. They are also pushing heavily into AI. If they hired largely AI developers that isn't likely to add significant resources for general browser development.
|
|
2024-02-05 12:03:19
|
Well I was talking about Firefox devs saying Firefox development is getting more support so idk how that relates to AI.
|
|
|
Jim
|
|
lonjil
It's called interop, not the new standards for the web committee
|
|
2024-02-05 12:04:11
|
Safari has support for jxl, so it is not a new feature.
|
|
|
lonjil
|
|
Jim
So public input doesn't matter?
|
|
2024-02-05 12:04:12
|
It matters insofar as the public is likely to notice interop problems.
|
|
|
Jim
Safari has support for jxl, so it is not a new feature.
|
|
2024-02-05 12:05:41
|
It would be a new standard for the web, though? Safari supports J2K and mp4 in img tags, but no one thinks it's an interop problem.
|
|
|
Jim
|
2024-02-05 12:09:26
|
Have you read Mozilla's blog lately? They are pushing AI for various things like translations, fake and malicious content detection, etc.
https://www.techspot.com/news/101370-mozilla-pivots-towards-ai-focus-firefox-market-share.html
|
|
|
lonjil
It would be a new standard for the web, though? Safari supports J2K and mp4 in img tags, but no one thinks it's an interop problem.
|
|
2024-02-05 12:11:01
|
J2K is not a web-focused format, WebKit is removing it, and mp4 is a file container. I assume you mean h265, but that has patents around it and is the reason for AV1.
|
|
|
lonjil
|
2024-02-05 12:11:17
|
No, I mean h264
|
|
2024-02-05 12:11:48
|
If you give it an MP4 containing an h264 stream, Safari will play it like it was a gif.
|
|
2024-02-05 12:12:09
|
Like AVIF but without inventing a new format for it.
|
|
|
Jim
|
2024-02-05 12:13:32
|
h264 did have patents around it originally. I believe they removed it but support in all browsers is not likely considering there is still the threat of lawsuit and AV1 is already there as a replacement. You still need 3rd party support to get it working in certain browsers.
|
|
2024-02-05 12:14:03
|
Well, VP8 and AV1
|
|
|
lonjil
|
2024-02-05 12:14:24
|
All browsers do support h264 as far as I know.
|
|
2024-02-05 12:14:56
|
I'm not saying that these features should be adopted, I'm just saying, one browser having a certain feature doesn't really say much.
|
|
|
Jim
|
2024-02-05 12:15:05
|
No, all browsers support the mp4 container (it's used for AV1 and others). Not all browsers support the h264 codec out of the box.
|
|
|
lonjil
|
2024-02-05 12:16:01
|
Correction: all browsers anyone actually uses support h264 OOB.
|
|
2024-02-05 12:16:22
|
In any case, not really relevant.
|
|
2024-02-05 12:17:03
|
Putting a regular video in an img tag is a great feature that both Chrome and Firefox devs really hate, so we're never getting it.
|
|
|
Jim
|
2024-02-05 12:17:28
|
You really seem anti-jxl. Is that why you're here?
|
|
|
lonjil
|
2024-02-05 12:18:05
|
I'm incredibly pro JXL
|
|
2024-02-05 12:18:34
|
Never once have I argued against JXL here
|
|
|
Jim
|
|
lonjil
Putting a regular video in an img tag is a great feature that both Chrome and Firefox devs really hate, so we're never getting it.
|
|
2024-02-05 12:18:37
|
That's a good thing. It's not meant for an actual video. It supports videos as long as they are looped and muted (like a gif).
|
|
|
lonjil
|
2024-02-05 12:19:38
|
I have argued about what I believe are incorrect notions and misunderstandings, that will not actually lead to JXL being adopted.
|
|
|
Jim
|
|
lonjil
Correction: all browsers anyone actually uses support h264 OOB.
|
|
2024-02-05 12:20:43
|
You seem to be defending Google, defending monopolies, pushing back against public opinion. Those ideals are what is likely to get jxl adopted.
|
|
|
lonjil
|
|
Jim
That's a good thing. It's not meant for an actual video. It supports videos as long as they are looped and muted (like a gif).
|
|
2024-02-05 12:22:39
|
That's what the Safari feature is. Put an h264 in an MP4 in an img tag on Safari, and it'll play looping and without audio, as if it was a gif. This could easily be extended to other video formats and containers. But Firefox and Chrome hate it. So to put AV1 video into img tags, they introduced animated AVIF.
|
|
|
Jim
|
2024-02-05 12:23:54
|
They hated the original idea which was to allow any video to play (with audio, unlooped) in an img tag. The compromise to gain support was to require it be muted and looped to prevent it from being used to just play long videos with audio.
|
|
2024-02-05 12:26:01
|
Granted, there are still many people that would rather just require using a video tag, which is fine.
|
|
|
lonjil
|
|
Jim
You seem to be defending Google, defending monopolies, pushing back against public opinion. Those ideals are what is likely to get jxl adopted.
|
|
2024-02-05 12:27:57
|
No. I explained what the point of interop is. The Chrome team has disproportionate control over the web regardless. I'm very annoyed by the situation, and always tell people that Chrome is the new IE, but I try to be realistic and accurate in my critiques and complaints. If someone has the same goal as me, but I think they're presenting incorrect facts, or relying on faulty logic, or what have you, I argue against that.
|
|
|
Jim
They hated the original idea which was to allow any video to play (with audio, unlooped) in an img tag. The compromise to gain support was to require it be muted and looped to prevent it from being used to just play long videos with audio.
|
|
2024-02-05 12:29:05
|
I'm pretty sure Jon has advocated for just putting videos in img tags to replace gif, and been rebuffed. This much more recently than Safari's support.
|
|
|
Jim
|
2024-02-05 12:32:04
|
I understand the issues: having audio playing automatically from an img tag that can't be stopped is a huge issue. I think there should be a length limit too. On the other side, using a video tag would be a better solution, except that browsers have different rules for which videos can autoplay, plus it's a user preference. So replacing gifs with video tags that simply won't play in some people's browsers is also an issue. People will complain, be told to enable autoplay, then be annoyed when other, larger videos autoplay when they don't want them to. So there are valid points on both sides of the issue.
|
|
|
lonjil
|
2024-02-05 12:35:29
|
I keep saying without audio, and you keep saying it's an issue.
Is there a length limit for AVIF?
|
|
|
Jim
|
2024-02-05 12:38:02
|
There are issues other than just no audio. I mean a length limit for any video played in an img tag. For example, 60 seconds. Since it is used as a replacement for animated gifs that should be more than enough time yet not allow people to stick 3 hour long videos in an image tag and eat up an enormous amount of bandwidth unknown to the user.
|
|
2024-02-05 12:39:06
|
Also, I'm not saying you're wrong. You're acting like I am.
|
|
|
lonjil
|
2024-02-05 12:40:19
|
I'm extremely confused about what you're trying to say or argue now.
|
|
|
Jim
|
2024-02-05 12:40:41
|
Ditto for you.
|
|
|
lonjil
|
2024-02-05 12:41:44
|
I'm saying that it would be essentially identical to what we have with AVIF. Arbitrary video stream in an img tag, autoplay, and no audio, so any argument Chrome and Firefox devs have against doing it with video in general seems hollow and nonsensical.
|
|
|
Jim
|
2024-02-05 12:43:42
|
I'm in support of it as well. Why Chrome & Firefox would be against it, I don't know.
|
|
|
Quackdoc
|
|
yoochan
but for the jpeg xl, code is contributed by the community, they would just need to review and unflag... but I agree it would be nice if they could hire some devs again
|
|
2024-02-05 02:05:22
|
reviewing a PR for a feature that's not even enabled in stable builds? too hard mate.
|
|
|
Cacodemon345
|
|
lonjil
No. I explained what the point of interop is. The Chrome team has disproportionate control over the web regardless. I'm very annoyed by the situation, and always tell people that Chrome is the new IE, but I try to be realistic and accurate in my critiques and complaints. If someone has the same goal as me, but I think they're presenting incorrect facts, or relying on faulty logic, or what have you, I argue against that.
|
|
2024-02-05 02:35:58
|
Chrome is more than the new IE. There's no abundance of security issues and vulnerabilities associated with it, unlike the latter, whose vulnerabilities contributed to its downfall.
|
|
|
_wb_
|
|
lonjil
It would be a new standard for the web, though? Safari supports J2K and mp4 in img tags, but no one thinks it's an interop problem.
|
|
2024-02-05 02:40:28
|
Actually, we've wanted mp4 in img tags for ages; it makes GIF/APNG/AWebP redundant, but somehow Chrome blocks that too. They allow arbitrary AV1 in an img tag as long as you package it nicely in an AVIF, but not any other video codec, and certainly not in a video container. It would make things a LOT easier if people could just use normal video containers in an img tag (which would then play muted and looping by default so it behaves just like a GIF).
|
|
2024-02-05 04:09:41
|
There are no length limits on GIFs, so why put a length limit on other formats in an img tag?
|
|
2024-02-05 04:11:32
|
It's not the scope/purpose of web standards to make it impossible to make websites that annoy people. The goal is to make it possible to make websites that use existing technology effectively.
|
|
|
lonjil
|
|
_wb_
Actually, we've wanted mp4 in img tags for ages; it makes GIF/APNG/AWebP redundant, but somehow Chrome blocks that too. They allow arbitrary AV1 in an img tag as long as you package it nicely in an AVIF, but not any other video codec, and certainly not in a video container. It would make things a LOT easier if people could just use normal video containers in an img tag (which would then play muted and looping by default so it behaves just like a GIF).
|
|
2024-02-05 04:14:54
|
Aye. The first time I heard of this feature was when you mentioned it, I think a couple of years after Safari added support. If I recall correctly, Firefox also outright rejected it, rather than doing the "we're neutral" thing like with JXL.
|
|
|
_wb_
|
2024-02-05 04:15:49
|
https://calendar.perfplanet.com/2017/animated-gif-without-the-gif/
|
|
2024-02-05 04:17:27
|
Colin was a colleague of mine back then at Cloudinary (now he works at Shopify)
|
|
|
lonjil
|
2024-02-05 04:19:08
|
I recognize that title from somewhere... https://cloudinary.com/blog/evolution_of_img_gif_without_the_gif
|
|
|
_wb_
|
2024-02-05 04:20:35
|
I guess that article was published more than once 🙂
Anyway, the arguments are still valid today, and it's silly that Chrome/Firefox have still not allowed it.
|
|
|
lonjil
|
2024-02-05 04:23:57
|
But yeah, I just wanted to use it as an example of a useful feature stonewalled by Chrome that I've never seen anyone call an interoperability issue. Don't think people should be so harsh on the interop group just because they aren't advocating for JXL.
|
|
|
_wb_
|
2024-02-05 04:28:13
|
Yes, JXL is not the only thing that Chrome is blocking for no good reason.
|
|
2024-02-05 04:30:40
|
(a cynic could say though that both MP4 and JXL are blocked from the <img> tag for the same reason: forcing AVIF down our throats)
|
|
|
lonjil
|
2024-02-05 04:31:25
|
Would've made more sense to use AVIF for stills and webm for animation
|
|
|
_wb_
|
2024-02-05 04:34:39
|
yes, or even better just any video codec/format they support in <video>
|
|
2024-02-05 04:37:06
|
in terms of tooling for end-users, it's just so much easier to produce regular video formats than it is to wrap an AV1 in an AVIF container
|
|
|
Traneptora
|
|
Jim
You seem to be defending Google, defending monopolies, pushing back against public opinion. Those ideals are what is likely to get jxl adopted.
|
|
2024-02-05 04:40:54
|
How do you come to this conclusion in response to a statement about all browsers supporting H.264
|
|
2024-02-05 04:56:11
|
https://caniuse.com/?search=h.264
|
|
|
Jim
|
2024-02-05 05:10:45
|
Last I heard, Firefox did not support it. In fact, on Linux you have to install the codec separately to get it to work. You used to have to do that on Windows too, but I haven't used Windows in a while. I never heard any announcement from the developers that they were going to support it or is it one of the cases of "if the hardware decoding works, then someone must have paid a license for it" ?
|
|
2024-02-05 05:12:14
|
I know Google, Microsoft, and Apple would just pay any royalty for it just so it works on their browsers, so it wasn't an issue for them.
|
|
|
lonjil
|
2024-02-05 05:15:02
|
Last you heard must've been a hell of a long time ago
|
|
|
Traneptora
|
2024-02-05 05:16:14
|
firefox has supported it without plugins on Linux since January 2015
|
|
2024-02-05 05:16:55
|
it supported it with gstreamer's plugin in 2013
|
|
|
lonjil
|
2024-02-05 05:17:25
|
In 2014, Cisco just straight up started giving out an unrestricted h264 decoder for free, with them paying all the patent fees.
|
|
|
Traneptora
|
2024-02-05 05:17:48
|
OpenH264 has been in firefox for a while now, yea
|
|
|
lonjil
|
2024-02-05 05:19:39
|
Oh, right, and it's an encoder too, since it's for WebRTC
|
|
|
Jim
|
2024-02-05 05:23:07
|
There were a lot of issues with streaming services/DRM content with it early on. I remember on Linux a lot of discussions telling people to just uninstall it and use one of the codecs that might have legal issues.
|
|
|
Traneptora
|
2024-02-05 05:29:14
|
"early on" refers to ten years ago but yes
|
|
2024-02-05 05:29:23
|
and even earlier
|
|
|
Jim
|
2024-02-05 05:33:28
|
Linux documentation tends to stay the same for decades. Now I think Firefox uses ffmpeg to decode video and support hardware decoding. I will have to do some debugging on it to see which codec is being used. We install a package that just has hundreds of codecs included with it. There used to be a lot of issues with it, but it generally works well these days.
|
|
|
lonjil
|
2024-02-05 05:38:23
|
You need different sources of documentation, jeez
|
|
|
Jim
|
2024-02-05 05:41:10
|
If needed, yes. However, since most how-tos will generally work across many different distros, most will just refer to the documentation from one distro, and it will often not change for quite some time. Same with things specific to the distro: they write it once, then the steps tend to work fine for decades. It's better than Microsoft, which has documentation floating around for things that no longer work, yet it's still linked to for years.
|
|
|
Traneptora
|
2024-02-05 05:41:25
|
idk, if I want to know whether a browser supports a feature I just go to caniuse
|
|
2024-02-05 05:42:27
|
trying to search up old forum posts for how to enable h.264 in firefox doesn't make any sense when caniuse tells me it has worked out of the box since 2015
|
|
|
Jim
|
2024-02-05 05:42:33
|
For the most well-known features that is true, but there are sometimes caveats. And lesser-known features are not always kept up-to-date.
|
|
|
Traneptora
|
2024-02-05 05:42:45
|
H.264 is not a lesser-known feature
|
|
2024-02-05 05:42:54
|
and the caveats are typically documented in caniuse
|
|
2024-02-05 05:44:27
|
what bothers me is not that you didn't know how common H.264 is, but that when Lon mentioned that all browsers support it you responded incredibly aggressively about defending monopolies
when all they did was point out (correctly) that browsers do indeed support H.264
|
|
|
Jim
|
2024-02-05 05:44:30
|
Not always. Years ago when there were many issues with getting certain sites to work, it was only documented on various blogs how to get around them (usually by using another codec). I never saw links added to caniuse.
|
|
|
Traneptora
|
2024-02-05 05:44:56
|
misinformation serves nobody
|
|
2024-02-05 05:45:12
|
so pointing out that browsers do actually support H.264 doesn't undermine anyone's case
|
|
|
Jim
|
2024-02-05 05:46:23
|
But just because it has built-in support doesn't mean it works well everywhere. Now, most of the issues have been fixed and it isn't an issue but when it first got added there were a lot of frustrated people asking for help.
|
|
|
Traneptora
|
2024-02-05 05:46:51
|
yea I too remember the internet ten years ago but I don't believe that's particularly relevant in a discussion about interop 2024
|
|
2024-02-05 05:47:37
|
I learned to write websites on an O'Reilly book that talks about implementation differences between Netscape and Internet Explorer
|
|
|
Jim
|
2024-02-05 05:47:47
|
I never brought up h264 until Lonnie did.
|
|
2024-02-05 05:48:09
|
misinformation serves nobody
|
|
|
Traneptora
|
2024-02-05 05:49:19
|
you know you don't have to be weird and aggressive about things
|
|
|
Jim
|
|
Traneptora
misinformation serves nobody
|
|
2024-02-05 05:49:28
|
same
|
|
2024-02-05 05:50:04
|
Anyway, heading out now. Have a great day/evening.
|
|
|
_wb_
|
2024-02-05 08:07:06
|
If I recall correctly, I was in this podcast: https://siimcast.libsyn.com/s7e02-decoding-dicom-the-evolution-of-image-compression-in-medical-imaging
|
|
|
spider-mario
|
2024-02-05 10:12:20
|
https://www.reddit.com/r/programming/comments/1ajq7bj/google_is_once_again_accused_of_snubbing_the_jpeg/
|
|
2024-02-05 10:31:54
|
I see at least two people in that thread peddling the idea that no one was interested in JXL until Chrome removed it
|
|
2024-02-05 10:32:09
|
> Basically no one in the real world cared about jpegxl until it became a "killed by google" meme, but in reality it was already dead.
|
|
2024-02-05 10:32:37
|
masterful Dunning-Kruger in action
|
|
2024-02-05 10:34:05
|
thinking themself above the “ ‘killed by google’ meme”
|
|
2024-02-05 10:34:27
|
“_I_ have figured out the truth, you see”
|
|
2024-02-05 10:38:42
|
similar pattern in the other one
|
|
2024-02-05 10:38:49
|
> There was also zero interest in JXL (even in chrome) until they decided to remove it and internet haters picked it up as another rant about Google.
|
|
|
lonjil
|
2024-02-05 10:39:07
|
I've seen this claim a few times now on different discussion forums
|
|
2024-02-05 10:39:27
|
Hacker News, IIRC, maybe the comment section on Phoronix?
|
|
2024-02-05 10:39:38
|
I wonder where the idea originated.
|
|
|
spider-mario
|
2024-02-05 10:42:05
|
if I had to speculate, I’d suspect that they(=A) were annoyed by those(=B) complaining about the JXL removal and wanted to convince (A)selves that (B) were wrong
|
|
2024-02-05 10:43:06
|
the idea that there was no interest was a convenient way to achieve that
|
|
|
jonnyawsom3
|
2024-02-05 10:48:45
|
I think to the average user, they just see us complaining about it, not knowing about the multiple corporations and developers asking for it in the past
|
|
|
spider-mario
|
2024-02-05 10:49:41
|
yeah, there’s probably some of that
|
|
2024-02-05 10:50:00
|
they only heard about it then, so clearly there can’t have been interest before
|
|
2024-02-05 10:50:25
|
the https://en.wikipedia.org/wiki/Argument_from_ignorance#Argument_from_self-knowing
|
|
2024-02-05 10:50:43
|
if there had been interest in it before, they’d have known about it, but they didn’t, so there wasn’t
|
|
|
_wb_
|
2024-02-05 10:52:04
|
Also I have seen some people pretend that the experimental Chrome support was a way to gauge interest, and it turned out web devs didn't start using it so they removed it again. Leaving out the minor detail that it was disabled by default so couldn't actually be used in any meaningful way by web devs.
|
|
|
lonjil
|
2024-02-05 11:36:59
|
if anyone here uses lobsters: https://lobste.rs/s/jfjjcp/jpeg_xl_rejected_for_interop_2024
please do not brigade, but if anyone has something useful to add to their discussion, I can provide an invitation link if needed.
|
|
|
spider-mario
https://www.reddit.com/r/programming/comments/1ajq7bj/google_is_once_again_accused_of_snubbing_the_jpeg/
|
|
2024-02-06 12:02:48
|
> That's one aspect of it, another one is that Mountain View is its own echo chamber. I used to work at Google Zurich and it was a common complaint that devs in Mountain View often had a misplaced sense of superiority versus all the other offices.
|
|
|
sklwmp
|
2024-02-06 05:42:55
|
https://x.com/t3dotgg/status/1754689598234722756?s=20
|
|
2024-02-06 05:43:50
|
referencing: https://twitter.com/luciascarlet/status/1754638648455122985
|
|
|
yoochan
|
|
lonjil
if anyone here uses lobsters: https://lobste.rs/s/jfjjcp/jpeg_xl_rejected_for_interop_2024
please do not brigade, but if anyone has something useful to add to their discussion, I can provide an invitation link if needed.
|
|
2024-02-06 07:35:49
|
They don't like the abort() and the 0.x.x, understood... But I fear these people won't change their minds after the 1.0
|
|
|
_wb_
|
2024-02-06 07:58:07
|
Do they know what versions libgav1/libavif/dav1d and libwebp were at when Chrome integrated them? 🙂
|
|
2024-02-06 07:59:58
|
WebP was integrated in Chrome and enabled by default before the bitstream was even frozen. Which was pretty annoying because you had to sniff UA strings to know if you could use alpha or animation...
|
|
2024-02-06 08:03:34
|
(also I think the AVIF spec was still evolving when Chrome enabled it)
|
|
2024-02-06 08:10:48
|
The version thing is basically a confusion: historically people would start counting at version 1, which would be the first version, and it would take until version 3 or 4 before the API is anywhere close to final and the code is anywhere close to robust and secure. With semantic versioning, 1.0 is a major milestone that corresponds to something like libjpeg version 6, for example.
|
|
|
spider-mario
|
2024-02-06 08:25:12
|
https://lobste.rs/s/jfjjcp/jpeg_xl_rejected_for_interop_2024#c_ttif7r oof, yet another “I compressed lossily with default settings and the file was larger” comparison
|
|
2024-02-06 08:26:51
|
> Okay, so how well do these even work??? Well… that depends on your priorities. So, let’s discuss my own priorities: I **suck** at picking out fine visual details, so I can NOT judge these by quality.
so how did they nevertheless reach the conclusion that they were qualified to conduct such a test?
|
|
2024-02-06 08:27:04
|
I don’t want to sound too harsh, but “not doing it” was also an option
|
|
|
lonjil
|
2024-02-06 09:45:01
|
> “jpeg to JXL with lossy re-encoding” is a nonsensical conversion. If the source file is a jpeg, you’re going to get vastly better results by converting the source DCT to JXL (for better compression), then dropping smaller coefficients until the file is the size you want it to be. The reason for this is simple: by avoiding the DCT -> pixel values -> new DCT conversion, you avoid adding any new artifacts. It’s also dramatically faster.
would this be a useful option?
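(For reference, a minimal numpy/scipy sketch of the "drop smaller coefficients" idea quoted above — illustrative only: real JPEG coefficients are quantized per 8×8 block with per-frequency quantization tables, which this toy threshold ignores.)

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
block = rng.normal(size=(8, 8))     # stand-in for one 8x8 pixel block

# Forward DCT, as in JPEG; "ortho" normalization keeps it invertible as-is.
coeffs = dctn(block, norm="ortho")

# Zero out coefficients below a magnitude threshold to shrink the file,
# instead of doing DCT -> pixels -> new DCT (which would add fresh artifacts).
threshold = 0.5
pruned = np.where(np.abs(coeffs) < threshold, 0.0, coeffs)

# Reconstruction straight from the pruned coefficients.
recon = idctn(pruned, norm="ortho")
```

The point of the quoted approach is that `pruned` stays in the original DCT domain, so no new round-trip error is introduced beyond the zeroed terms.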
|
|
|
_wb_
|
2024-02-06 10:13:53
|
it depends on the jpeg quality, but generally no. If the jpeg is low quality, you just want to losslessly recompress it. If it is high quality, you better decode to pixels and re-encode with the full lossy jxl. Only for a narrow range of mid-quality JPEGs would it make sense to do lossy requantization in the DCT domain followed by lossless recompression.
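(A hedged sketch of the two paths described above, assuming the cjxl flags `--lossless_jpeg` and `-d`/`--distance` as documented for libjxl's encoder; exact defaults may vary by version:)

```shell
# Low-quality source JPEG: losslessly recompress the existing DCT data
# (lossless JPEG transcoding is cjxl's default behaviour for .jpg input).
cjxl low_quality.jpg low_quality.jxl --lossless_jpeg=1

# High-quality source JPEG: decode to pixels and re-encode with full
# lossy JPEG XL at a chosen visual distance (d=1 is roughly visually lossless).
cjxl high_quality.jpg high_quality.jxl --lossless_jpeg=0 -d 1
```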
|
|
|
spider-mario
|
2024-02-06 01:15:06
|
> Google actually had a lot of input to JPEG XL, I believe they backported some aspects of AVIF into the JXL standard.
did we?
|
|
|
yoochan
|
2024-02-06 01:35:31
|
😄 it would be interesting to invite them here, or to write the genesis of jxl on the jpegxl.info website, for reference
|
|
|
_wb_
|
|
spider-mario
> Google actually had a lot of input to JPEG XL, I believe they backported some aspects of AVIF into the JXL standard.
did we?
|
|
2024-02-06 02:00:42
|
Not AFAIK.
|
|
|
jonnyawsom3
|
|
yoochan
😄 it would be interesting to invite them here, or to write the genesis of jxl on the jpegxl.info website, for reference
|
|
2024-02-06 03:00:04
|
I've mentioned the idea of a "Resources" channel or similar before, to keep useful info or files in easy reach. Such as Jon's old hand decoded JXL file with explanations for each bit, or the 12 byte smallest JXL file that people frequently want
|
|
|
_wb_
|
2024-02-06 03:21:00
|
maybe something like a wiki or github repo would be more useful for that — ideally something low-effort to add stuff to (but with some protection against spam/abuse)
|
|
|
yoochan
|
|
I've mentioned the idea of a "Resources" channel or similar before, to keep useful info or files in easy reach. Such as Jon's old hand decoded JXL file with explanations for each bit, or the 12 byte smallest JXL file that people frequently want
|
|
2024-02-06 03:23:42
|
hand decoded jxl ? I want to see this ! it could be a not so accessible page of jpegxl.info ?
|
|
|
username
|
|
_wb_
maybe something like a wiki or github repo would be more useful for that — ideally something low-effort to add stuff to (but with some protection against spam/abuse)
|
|
2024-02-06 03:25:33
|
if it were to be a github repo/wiki, would it be put under this github organization? https://github.com/jxl-community
|
|
|
jonnyawsom3
|
|
yoochan
hand decoded jxl ? I want to see this ! it could be a not so accessible page of jpegxl.info ?
|
|
2024-02-06 03:25:42
|
https://discord.com/channels/794206087879852103/824000991891554375/901846536487579658
|
|
|
Traneptora
|
|
spider-mario
> Okay, so how well do these even work??? Well… that depends on your priorities. So, let’s discuss my own priorities: I **suck** at picking out fine visual details, so I can NOT judge these by quality.
so how did they nevertheless reach the conclusion that they were qualified to conduct such a test?
|
|
2024-02-06 07:14:36
|
no, it gets better, they took a JPEG and recompressed it with --distance=0 --effort=3 and got a larger file and acted surprised pikachu
|
|