|
Exorcist
|
2025-11-07 03:34:07
|
What is "smartphone photo"?
Shot by smartphone camera or display for smartphone screen?
|
|
|
|
cioute
|
2025-11-07 04:56:19
|
thanks, i just tested webp 100% quality in opencamera, somehow it looks less noisy and weighs less, the only problem is there's no zoom available for webp/avif/jpegxl in fossify gallery
|
|
|
AccessViolation_
|
2025-11-07 05:02:10
|
are you sure 100% quality doesn't turn it into a lossless WebP?
|
|
2025-11-07 05:02:41
|
it may be that if you want the highest lossy quality, you need to select 99%
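A quick way to sanity-check that locally with the libwebp CLI tools (filenames are placeholders; note that desktop cwebp stays lossy at -q 100 unless you pass -lossless, so OpenCamera's 100% behavior may come from Android's own WebP path instead):
```
cwebp -q 100 input.png -o q100.webp    # desktop cwebp: still lossy at q100
cwebp -q 99 input.png -o q99.webp
cwebp -lossless input.png -o ll.webp
ls -l q100.webp q99.webp ll.webp       # compare the sizes
webpinfo q99.webp | grep VP8           # VP8 chunk = lossy, VP8L = lossless
```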
|
|
|
|
cioute
|
2025-11-07 05:07:00
|
not sure
|
|
|
AccessViolation_
|
|
cioute
not sure
|
|
2025-11-07 05:07:46
|
I tested it in OpenCamera, it seems like a 100% quality WebP is lossless, so it's going to store a lot of information to preserve every pixel value exactly.
A 100% quality WebP is 10.5 MB for me, while a 99% quality WebP is only 3.5 MB. And note that even that 100% lossless WebP is not really lossless, since it's derived from a lossy JPEG that the camera outputs
|
|
|
jonnyawsom3
|
|
AccessViolation_
are you sure 100% quality doesn't turn it into a lossless WebP?
|
|
2025-11-07 05:07:49
|
If it was lossless, I highly doubt it would weigh less
|
|
2025-11-07 05:08:25
|
> somehow it looks less noisy and weighs less
hmm
|
|
2025-11-08 06:45:03
|
Has anyone tried using ROI for foveated encoding with AV1? I did some tests with x264 back in the day but was having trouble finding a good balance
|
|
|
DZgas Ж
|
|
monad
shrek-to-jxl is good enough
|
|
2025-11-08 08:13:26
|
<:BlobYay:806132268186861619>
|
|
|
lonjil
|
2025-11-09 08:05:14
|
I got completely sidetracked implementing an FPGA DCT encoder. I've spent several days just reading about different sine and cosine approximation techniques.
|
|
2025-11-09 08:05:50
|
Guess I should probably organize some notes and put them up somewhere, considering how obscure some of it was.
|
|
|
daniilmaks
|
|
Demiurge
|
|
2025-11-10 05:02:12
|
https://cdn.discordapp.com/attachments/673202643916816384/1415149093075812493/webpisshub.webp?ex=6912934b&is=691141cb&hm=b5bf1697de917536d40352218d080c188d7fa4a478e8a5485440bc3f8cf47a59
|
|
2025-11-10 05:03:11
|
https://github.blog/changelog/2025-08-28-added-support-for-webp-images/
|
|
2025-11-10 05:04:35
|
webp is a jpeg sidegrade that was outdated on arrival
|
|
|
Meow
|
2025-11-10 05:10:35
|
A format is widely adopted when it's outdated
|
|
|
daniilmaks
|
2025-11-10 05:11:46
|
a format is widely adopted when a browser monopoly shoves it into a forced adoption
|
|
2025-11-10 05:13:34
|
webp has no business gaining support in 2025, it should be phased out, not added in.
|
|
2025-11-10 05:15:18
|
webp is the definition of a codec that had no market interest. it sucks for high-density, it sucks for high fidelity, it sucks for detail preservation and is prone to colorshift and the worst generational loss that has ever been achieved.
|
|
|
Meow
|
2025-11-10 05:15:28
|
https://en.wikipedia.org/wiki/WebP#Support
|
|
2025-11-10 05:16:16
|
The lossless part done by <@532010383041363969> is good
|
|
|
daniilmaks
|
2025-11-10 05:17:08
|
lossless webp good. I agree. but it is a travesty that it got bundled with the worst mainstream lossy codec in current age.
|
|
|
Meow
https://en.wikipedia.org/wiki/WebP#Support
|
|
2025-11-10 05:18:55
|
right at the first paragraph:
> Google actively promotes WebP, and Google Chrome and all Chromium-based browsers support the format. The proprietary PageSpeed Insights tool suggests that webmasters switch from JPEG and PNG to WebP in order to improve their website speed score.
the rest of the market was essentially forced into adding support for the format because incompatibilities drive consumers away from a given product, especially if the format is encountered often enough.
|
|
2025-11-10 05:19:56
|
https://en.wikipedia.org/wiki/WebP#Disadvantages_and_criticism
|
|
2025-11-10 05:20:50
|
> In September 2010, Fiona Glaser, a developer of the x264 encoder, wrote a very early critique of WebP. Comparing different encodings (JPEG, x264, and WebP) of a reference image, she stated that the quality of the WebP-encoded result was the worst of the three, mostly because of blurriness on the image. Her main remark was that "libvpx, a much more powerful encoder than ffmpeg's jpeg encoder, loses because it tries too hard to optimize for PSNR" (peak signal-to-noise ratio), arguing instead that "good psycho-visual optimizations are more important than anything else for compression". In October 2013, Josh Aas from Mozilla Research published a comprehensive study of current lossy encoding techniques[109] and was not able to conclude that WebP outperformed JPEG by any significant margin.
|
|
|
Meow
|
2025-11-10 05:32:26
|
This may have prompted the creation of MozJPEG
|
|
2025-11-10 05:35:59
|
How to prove that WebP isn't really better than JPEG? Create a new JPEG encoder/decoder
|
|
|
daniilmaks
|
2025-11-10 06:01:45
|
mozjpeg and jpegli both obsoleted webp a while ago
|
|
|
Meow
|
2025-11-10 07:27:11
|
Jpegli can even make XYB JPEG on par with AVIF sometimes
|
|
|
Exorcist
|
2025-11-10 07:47:59
|
> How to prove that WebP isn't really better than JPEG?
- VP8 has only `4*4` DCT block, no `8*8`
- VP8 has only YUV420, no YUV444
These limits mean WebP can only fit small, low-quality targets
|
|
|
AccessViolation_
|
2025-11-10 08:37:08
|
[TECHNICAL OVERVIEW OF VP8, AN OPEN SOURCE VIDEO CODEC FOR THE WEB (Google Research)](<https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/37073.pdf>):
> VP8 uses 4x4 block-based discrete cosine transform (DCT) for all luma and chroma residual signal. Depending on the prediction mode, the DC coefficients from a 16x16 macroblock may then undergo a 4x4 Walsh-Hadamard transform.
[VP8 vs VP9: 8 Key Differences and How to Choose (Cloudinary)](<https://cloudinary.com/guides/video-formats/vp8-vs-vp9-8-key-differences-and-how-to-choose#2-block-sizes>)
> While both VP8 and VP9 use block-based motion compensation and intra-frame prediction, VP9 supports larger block sizes than VP8. VP8 uses a fixed block size of 16×16 pixels, whereas VP9 supports block sizes ranging from 4×4 to 64×64 pixels. This flexibility allows VP9 to adapt better to different types of video content, resulting in more efficient compression and improved visual quality.
<:DogWhat:806133035786829875>
|
|
|
|
cioute
|
2025-11-10 03:33:46
|
differences between vp9 and h265? maybe you know something chatgpt don't
|
|
|
Exorcist
|
2025-11-10 03:35:09
|
https://forum.doom9.org/showthread.php?t=168947
|
|
|
lonjil
|
|
lonjil
new jp2000 decoder https://github.com/LaurenzV/hayro/tree/main/hayro-jpeg2000
|
|
2025-11-11 03:34:39
|
Typst's JPEG 2000 decoder I linked to earlier is now feature complete, though very slow.
|
|
|
Cacodemon345
|
2025-11-13 01:55:14
|
We should look to HEIF instead when it gets inevitably adopted in 2035 or something.
|
|
|
AccessViolation_
|
|
lonjil
Typst's JPEG 2000 decoder I linked to earlier is now feature complete, though very slow.
|
|
2025-11-13 09:44:28
|
woah, pure rust PDF is nice
|
|
|
DZgas Ж
|
|
cioute
differences between vp9 and h265? maybe you know something chatgpt don't
|
|
2025-11-16 04:43:11
|
Hi, I've been studying codecs for practical use for over 5 years, and here's what I'll say:
VP9 is more primitive
HEVC is overloaded with technology
VP9 is supported by browsers and has hardware support everywhere
HEVC isn't fully supported by browsers, but it has hardware support everywhere, which is useful, for example, in Telegram
For real-time encoding, due to its simplicity, VP9 will always be better than HEVC.
When encoding with longer and more complex presets, HEVC will always be better than VP9.
VP9 requires less computation for decoding, from 30 to 50% less than HEVC, depending on the content type. This means, for example, that if a device poorly decodes 1080p HEVC, it might play 1080p VP9 perfectly.
Content that offers exceptional advantages in any situation:
VP9: Minecraft, very active gameplay
HEVC: Anime, animation, static flat lines
Ideologically, the use of VP9 may be motivated by open-source fanaticism. But the patents place no restriction on HEVC use by private individuals anywhere.
In 90% of cases, there is no point in using HEVC if AV1 is available.
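For reference, a minimal sketch of the two cases with ffmpeg (bitrate and CRF values are illustrative placeholders, not recommendations):
```
# fast realtime VP9, using the libvpx flags mentioned later in this thread
ffmpeg -i in.mp4 -c:v libvpx-vp9 -deadline realtime -cpu-used 8 -row-mt 1 -b:v 3M out.webm
# slow, efficiency-oriented HEVC for offline encoding
ffmpeg -i in.mp4 -c:v libx265 -preset slower -crf 22 out.mp4
```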
|
|
2025-11-16 04:46:15
|
It's important to note that VP9 encoding isn't implemented on GPUs. If you need fast encoding for work or streaming on a GPU, HEVC is the only real option.
|
|
2025-11-16 04:50:00
|
Webp always better than Jpeg <:FrogSupport:805394101528035328> in web use
|
|
2025-11-16 04:54:17
|
We didn't have time to implement progressive decoding. The internet has become too fast for that. Now only decoding cost per watt matters. Time is wasted. <:avif:1280859209751334913> <:JXL:805850130203934781> <:pancakexl:1283670260209156128>
|
|
|
lonjil
|
2025-11-16 05:09:11
|
the only video codec worth using is 10-bit h264
|
|
|
DZgas Ж
|
2025-11-16 05:10:56
|
a video that can't be played anywhere
|
|
|
lonjil
the only video codec worth using is 10-bit h264
|
|
2025-11-16 05:21:41
|
To be fair, under very specific and well-defined conditions, you can squeeze decent quality out of AVC. About a year ago, I set a goal to find the best combination of parameters and resolution, specifically 10-bit, and encoded AV1 for comparison at identical encoding speed on a single core. Overall, it's not bad, and without a good eye the difference seems unnoticeable, but the codec is still outdated enough that it's only used when full legacy support is required.
|
|
|
lonjil
|
|
DZgas Ж
a video that can't be played anywhere
|
|
2025-11-16 05:36:51
|
any good video playing software can handle it
|
|
2025-11-16 05:37:27
|
I have an old GPU, VP9 and AV1 are no-go for me.
|
|
2025-11-16 05:38:06
|
(and ofc, I'm a sucker for very high quality)
|
|
2025-11-16 05:38:52
|
(that is to say, I want it to look very good even to a trained eye)
|
|
|
DZgas Ж
|
|
lonjil
I have an old GPU, VP9 and AV1 are no-go for me.
|
|
2025-11-16 06:58:37
|
so
|
|
2025-11-16 07:00:00
|
This is precisely one of the reasons why I tried and succeeded in creating my own easyhevc preset, so that I could compress Telegram videos without decoding problems anywhere, both on smartphones and on older hardware.
|
|
|
lonjil
|
2025-11-16 07:02:12
|
I have an RX 580 😄
|
|
|
DZgas Ж
|
2025-11-16 07:02:33
|
🥵 🔥 👍
|
|
|
Quackdoc
|
|
lonjil
I have an RX 580 😄
|
|
2025-11-16 07:06:14
|
I got one of those sitting on my desk lmao
|
|
2025-11-16 07:06:17
|
4gb one
|
|
|
lonjil
|
2025-11-16 07:06:38
|
im glad i have the 8gb one
|
|
|
Quackdoc
|
2025-11-16 07:07:29
|
rn im rocking an a380, wish rx580 had better support for wlroots, no vulkan t.t
|
|
|
lonjil
|
2025-11-16 07:10:23
|
what's missing?
|
|
|
Quackdoc
|
2025-11-16 07:11:03
|
https://gitlab.freedesktop.org/mesa/mesa/-/issues/5882
|
|
2025-11-16 07:11:23
|
also required for zero copy vaapi + vulkan
|
|
|
lonjil
|
|
daniilmaks
|
|
DZgas Ж
Webp always better than Jpeg <:FrogSupport:805394101528035328> in web use
|
|
2025-11-17 07:22:07
|
lol
|
|
|
DZgas Ж
|
|
lol
|
|
2025-11-17 08:31:13
|
The reaction of a person who only knows about webp because of negative memes
|
|
|
daniilmaks
|
|
DZgas Ж
The reaction of a person who only knows about webp because of negative memes
|
|
2025-11-17 08:34:08
|
I hated webp for purely technical reasons way before there were memes about it
|
|
2025-11-17 08:34:32
|
webp was trash back when it was created, it is trash today.
|
|
|
lonjil
|
2025-11-17 08:35:04
|
At all useful quality levels, jpeg is better
|
|
2025-11-17 08:35:22
|
At least if we're talking lossy
|
|
|
daniilmaks
|
2025-11-17 08:35:37
|
the only thing that webp should ever be used for is lossless mode. nothing more.
|
|
2025-11-17 08:36:16
|
and even then it's only for compatibility reason. sans compatibility you should be on jxl by now.
|
|
|
DZgas Ж
The reaction of a person who only knows about webp because of negative memes
|
|
2025-11-17 08:38:55
|
also... *I am the one who makes the memes.*
|
|
2025-11-17 08:41:23
|
.
https://www.youtube.com/watch?v=ujBp5B35el4 webp was always a meme regardless
|
|
2025-11-17 08:42:40
|
https://www.youtube.com/watch?v=_h5gC3EzlJg
|
|
|
DZgas Ж
|
|
.
https://www.youtube.com/watch?v=ujBp5B35el4 webp was always a meme regardless
|
|
2025-11-17 08:54:14
|
Jon's comparisons are truly ridiculous. Quantization losses are completely normal, and the fact that flif, which is positioned as a lossless codec, is being compared here is also ridiculous. What's not funny is your lack of arguments. JPEG XL also has significant loss during recompression, which I especially noticed in my tests when shifting the image by ±3 pixels and when scaling it by ±1%, which is exactly what happens in practice. But what's your argument? WebP gets posted to Reddit, for example, encoded from downloaded originals. And at what point does the problem occur? Yes, this video shows the truth, but for me it's a self-evident fact, like don't re-record a VHS 10 times from a copy of a copy of a copy of a copy
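(One generation of that kind of recompression test can be scripted; a sketch with ImageMagick + libwebp, quality chosen arbitrarily:)
```
magick gen0.png -roll +3+0 shifted.png    # shift by 3 px, as in the ±3 px test
cwebp -q 80 shifted.png -o gen1.webp      # re-encode the shifted image
dwebp gen1.webp -o gen1.png               # decode it as input for the next generation
```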
|
|
2025-11-17 08:55:35
|
We're living in a time when JPEG XL hasn't won. It's already seemingly everywhere except on the internet. And on the internet, you have a choice: PNG, JPEG, and WebP, which have 100% support everywhere. And WebP is the best choice
|
|
|
lonjil
At all useful quality levels, jpeg is better
|
|
2025-11-17 08:56:48
|
"useful"
|
|
|
webp was trash back when it was created, it is trash today.
|
|
2025-11-17 08:57:40
|
I don't see any arguments in your words, you're a webp meme man
|
|
|
daniilmaks
|
|
DZgas Ж
I don't see any arguments in your words, you're a webp meme man
|
|
2025-11-17 09:04:02
|
https://discord.com/channels/794206087879852103/805176455658733570/1437311015451754546
|
|
2025-11-17 09:04:54
|
no 444 mode in lossy mode = meme format
|
|
|
DZgas Ж
Jon's comparisons are truly ridiculous. Quantization losses are completely normal, and the fact that flif, which is positioned as a lossless codec, is being compared here is also ridiculous. What's not funny is your lack of arguments. JPEG XL also has significant loss during recompression, which I especially noticed in my tests when shifting the image by ±3 pixels and when scaling it by ±1%, which is exactly what happens in practice. But what's your argument? WebP gets posted to Reddit, for example, encoded from downloaded originals. And at what point does the problem occur? Yes, this video shows the truth, but for me it's a self-evident fact, like don't re-record a VHS 10 times from a copy of a copy of a copy of a copy
|
|
2025-11-17 09:06:53
|
iirc flif was a dual lossy-lossless codec but jon can correct me.
|
|
|
DZgas Ж
|
|
no 444 mode in lossy mode = meme format
|
|
2025-11-17 09:07:02
|
Excellent argument.
Could you provide statistics on the use of 444 jpeg on the internet compared to 420 jpeg?
|
|
|
daniilmaks
|
|
DZgas Ж
Excellent argument.
Could you provide statistics on the use of 444 jpeg on the internet compared to 420 jpeg?
|
|
2025-11-17 09:10:17
|
that's an ad-populum argument, and while it is not inherently fallacious, it is flawed in this instance because we're studying the quality achieved by each codec, not how most social media perform shortcuts to reduce bandwidth usage.
|
|
2025-11-17 09:11:45
|
"almost nobody uses this higher quality mode therefore it's pointless showing how it defeats my argument"
|
|
|
DZgas Ж
|
|
https://discord.com/channels/794206087879852103/805176455658733570/1437311015451754546
|
|
2025-11-17 09:13:28
|
Referring to a 15-year-old quality report. Apparently everyone uses 1.0, which is 15 years old?
Even though WebP has been in development for all these years and continues to improve its quality, you should compare the old and current encoders. No one does such tests. It's the same gap as between the original JPEG from 1992 and mozjpeg/jpegli.
|
|
2025-11-17 09:14:58
|
I just recently vibecoded it using the original documentation
|
|
|
daniilmaks
|
2025-11-17 09:16:07
|
any sort of lineart, graphs, screenshots, and downscaled photos are inherently butchered by 420 mode and you already know that.
why downscaled and not full res you may ask? about 99% of photos are taken on Bayer sensors which have reduced chroma resolution vs luma, so 444 gives only marginal improvements unless the photo is downscaled.
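(a sketch of how to check this with cjpegli, filenames hypothetical; same quality, only the subsampling differs:)
```
cjpegli --chroma_subsampling=420 -q 85 lineart.png out420.jpg   # chroma halved in both axes
cjpegli --chroma_subsampling=444 -q 85 lineart.png out444.jpg   # full-resolution chroma
```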
|
|
2025-11-17 09:16:40
|
(just adding that at the end because it's something you might bring up)
|
|
|
DZgas Ж
Referring to a 15-year-old quality report. Apparently everyone uses 1.0, which is 15 years old?
Even though WebP has been in development for all these years and continues to improve its quality, you should compare the old and current encoders. No one does such tests. It's the same gap as between the original JPEG from 1992 and mozjpeg/jpegli.
|
|
2025-11-17 09:18:24
|
yes, it's a 15-year old report, that's the whole point: I literally said it was a meme back then.
|
|
|
DZgas Ж
|
|
that's an ad-populum argument, and while it is not inherently fallacious, it is flawed in this instance because we're studying the quality achieved by each codec, not how most social media perform shortcuts to reduce bandwidth usage.
|
|
2025-11-17 09:18:34
|
Quality? Ultimate? Why? We're studying reality. The reality is that WebP was created for the internet and nothing else—not for film archiving, not for family photos on disk, but for the internet, and nothing else. 444 isn't an argument on the internet because it's not needed here.
|
|
|
daniilmaks
|
2025-11-17 09:19:11
|
as encoders advanced it's not just webp that got better but also the ones that came to compete with it.
|
|
2025-11-17 09:19:51
|
in the end you got what you got: a codec that was not competitive when it released, and it's not competitive today.
|
|
|
DZgas Ж
|
|
in the end you got what you got: a codec that was not competitive when it released, and it's not competitive today.
|
|
2025-11-17 09:21:34
|
words to the wind
|
|
2025-11-17 09:22:53
|
You know, in arguments like these, they usually do something like: you're wrong, here's a picture, and jpeg is 60 times better than webp.
|
|
|
daniilmaks
|
|
DZgas Ж
Quality? Ultimate? Why? We're studying reality. The reality is that WebP was created for the internet and nothing else—not for film archiving, not for family photos on disk, but for the internet, and nothing else. 444 isn't an argument on the internet because it's not needed here.
|
|
2025-11-17 09:22:57
|
webp was created for the internet and fails at that: a good chunk of the images on the internet are gui elements, which get butchered by 420. so we've got to pick a poison: jpeg with no transparency, webp with poor chroma resolution, or png with higher bandwidth use.
|
|
2025-11-17 09:23:44
|
or gif, if we're feeling old school.
|
|
|
DZgas Ж
|
2025-11-17 09:23:47
|
gui elements
Images
|
|
2025-11-17 09:23:54
|
Bruh
|
|
|
daniilmaks
|
2025-11-17 09:24:19
|
yes, I see plenty of those, as surprised as you may be
|
|
2025-11-17 09:24:25
|
yes, in 2025
|
|
2025-11-17 09:24:41
|
don't ask me why
|
|
|
DZgas Ж
|
2025-11-17 09:25:08
|
Using codecs for other purposes is their problem
|
|
|
daniilmaks
|
2025-11-17 09:25:53
|
>format made for the internet
>uses it for the internet
>you weren't supposed to do that
|
|
|
DZgas Ж
|
2025-11-17 09:26:09
|
It reminded me of the Samsung news page, where an entire news post was one giant image.
|
|
|
daniilmaks
|
|
DZgas Ж
It reminded me of the Samsung news page, where an entire news post was one giant image.
|
|
2025-11-17 09:26:35
|
ok that's taking it too far, we both agree here lol
|
|
2025-11-17 09:26:41
|
<:KekDog:805390049033191445>
|
|
|
DZgas Ж
|
2025-11-17 09:27:40
|
But this is brilliant, why design something on the page if you can do everything in Photoshop?
|
|
2025-11-17 09:28:18
|
https://www.samsung.com/sec/event/GalaxyBook4Edge/comp/ just great
|
|
|
daniilmaks
|
2025-11-17 09:29:52
|
iirc, the reason webp got transparency support was specifically to use it for gui decorations.
As you can imagine, back then images were extensively used for gui elements.
|
|
|
DZgas Ж
|
|
webp was created for the internet and fails at that: a good chunk of the images on the internet are gui elements, which get butchered by 420. so we've got to pick a poison: jpeg with no transparency, webp with poor chroma resolution, or png with higher bandwidth use.
|
|
2025-11-17 09:29:58
|
There is a problem: even at high quality, webp 420 demonstrates better quality than jpeg 444 at the same file size
|
|
|
DZgas Ж
I wanted to make a small yuv420 comparison webp jpegli mozjpeg but already on the first picture it is absolutely obvious that all these tests do not make sense, webp is completely better than jpeg in any implementation. With identical file size
```
webp -q 58 -m 6 -af -sharp_yuv
mozjpeg -baseline -notrellis -sample 2x2 -quality 33 -tune-psnr -quant-table 1
cjpegli --noadaptive_quantization --chroma_subsampling=420 -p 0 -d 3
```
all files out 28 KB
Original | jpegli | mozjpeg | webp
|
|
2025-11-17 09:30:48
|
Here, low quality
|
|
2025-11-17 09:31:45
|
There is a point, around Q90-95, where JPEG becomes better than WebP. This is true for 444, but again, this is not an internet use case.
|
|
|
daniilmaks
|
|
DZgas Ж
There is a problem: even at high quality, webp 420 demonstrates better quality than jpeg 444 at the same file size
|
|
2025-11-17 09:31:56
|
this needs a few asterisks: it will depend on the target bpp and the type of content. because at low bpp, 444 is expected to lose efficiency on jpeg.
|
|
|
DZgas Ж
|
|
DZgas Ж
It's even painfully amazing that the webp 420 looks even better than the guetzli 444, in my opinion
```
guetzli --quality 82
webp -q 58 -m 6 -af -sharp_yuv
```
Original | webp | guetzli
|
|
2025-11-17 09:33:01
|
Here. q82 jpeg here and webp 420 looks better than jpeg 444
|
|
|
daniilmaks
|
2025-11-17 09:33:04
|
I will do some testing on webp next weekend since you got me in the mood. what is the most up to date build for webp?
|
|
|
DZgas Ж
Here. q82 jpeg here and webp 420 looks better than jpeg 444
|
|
2025-11-17 09:34:25
|
I'm on mobile rn so I can't do honest 1:1 checks rn but will check later, sure.
|
|
|
DZgas Ж
|
2025-11-17 09:34:44
|
Builds... I know Google posts cwebp builds somewhere, but I forgot where <:FrogSupport:805394101528035328>
|
|
|
A homosapien
|
|
I will do some testing on webp next weekend since you got me in the mood. what is the most up to date build for webp?
|
|
2025-11-17 09:35:20
|
https://developers.google.com/speed/webp/download
|
|
|
daniilmaks
|
2025-11-17 09:35:31
|
I think I downloaded cwebp from google couple months ago
|
|
|
A homosapien
|
2025-11-17 09:35:31
|
Here are the latest builds
|
|
|
daniilmaks
|
2025-11-17 09:36:44
|
is there a changelog somewhere
|
|
|
DZgas Ж
|
2025-11-17 09:36:50
|
I'll be sure to send you the image I'm using for testing, it's a sound wave, a 3D spectrogram from my spectrogram drawing project, but I call it something simpler: killer jpeg xl
|
|
|
jonnyawsom3
|
|
DZgas Ж
There is a point, around Q90-95, where JPEG becomes better than WebP. This is true for 444, but again, this is not an internet use case.
|
|
2025-11-17 09:36:59
|
In our testing, 4:2:0 was better at Quality 80 and below
|
|
|
daniilmaks
|
2025-11-17 09:37:54
|
yeah that's roughly what I would expect
|
|
|
DZgas Ж
|
2025-11-17 09:39:59
|
https://discord.com/channels/794206087879852103/805176455658733570/1380112200190005258
|
|
|
daniilmaks
|
|
DZgas Ж
https://www.samsung.com/sec/event/GalaxyBook4Edge/comp/ just great
|
|
2025-11-17 09:40:23
|
https://tenor.com/view/skeleton-falling-gif-27355771
|
|
|
DZgas Ж
|
|
DZgas Ж
I'll be sure to send you the image I'm using for testing, it's a sound wave, a 3D spectrogram from my spectrogram drawing project, but I call it something simpler: killer jpeg xl
|
|
2025-11-17 10:12:33
|
https://drive.google.com/file/d/1qdMpZrij0YTD-2zuEbniTmt9cAdETGz3/view?usp=sharing
|
|
2025-11-17 10:14:01
|
cwebp -q 58 -m 6 -af -sharp_yuv
Or a different quality, for testing.
The main thing here is that JPEG XL completely loses to WebP in this image. All the problems of JPEG XL are clearly visible: fading, excessive smoothing, everything.
|
|
2025-11-17 10:15:13
|
JPEG loses at 444 because it doesn't have enough bitrate, and at 420 because it doesn't have enough technology
|
|
|
A homosapien
|
2025-11-17 10:29:43
|
I should really get sharpyuv working in jpegli
|
|
|
DZgas Ж
|
|
A homosapien
I should really get sharpyuv working in jpegli
|
|
2025-11-17 10:33:41
|
Are you sure this is possible? I didn't go into much detail about the algorithm, but I have a similar program that independently compresses the color channel in PNG so that the JPEG encoder compresses the colors less.
But WebP has a major advantage: decoding is done entirely by libwebp, meaning the color interpolation algorithm is performed by the codec itself, which outputs the post-processed interpolated color channels. In JPEG, the color channel interpolation is done by the viewing program, and can vary.
|
|
2025-11-17 10:34:30
|
In some JPEG programs, colors are interpolated using nearest neighbor or bilinear interpolation; webp has completely solved this problem with its own interpolation at the decoding stage.
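That decoder-side upsampler is easy to inspect, since dwebp lets you switch it off (a sketch, placeholder filenames):
```
dwebp pic.webp -o fancy.png             # default: libwebp's built-in "fancy" YUV420 upsampler
dwebp -nofancy pic.webp -o plain.png    # cruder upsampling, for comparison
```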
|
|
|
A homosapien
|
2025-11-17 10:34:32
|
sjpeg already does this, I personally just want it in jpegli
|
|
|
DZgas Ж
|
2025-11-17 10:34:53
|
<:PepeOK:805388754545934396>
|
|
2025-11-17 10:35:38
|
I really can't understand how this can be done. Webp has one decoder, but JPEG...
|
|
|
A homosapien
|
2025-11-17 10:36:45
|
Almost all programs use libjpeg-turbo, as they should
|
|
|
_wb_
|
|
iirc flif was a dual lossy-lossless codec but jon can correct me.
|
|
2025-11-17 10:58:52
|
In those generation loss comparisons, of course FLIF was used in a lossy way, and the bitrates of the first generations were matched to make it a more or less fair comparison. Of course it would be meaningless to put FLIF there in lossless mode, generation loss is not a thing for lossless.
FLIF was exceptionally generation loss resistant (as long as you don't start cropping or otherwise moving around pixel positions between generations) because its lossy mode isn't doing any frequency transform, it's just quantizing precision of residuals in a way that is idempotent (no change if you recompress a decoded image at the same encode settings).
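(A minimal way to write that idempotence down, notation mine, with Δ the quantization step:)
```latex
Q(x) = \Delta \cdot \operatorname{round}(x/\Delta), \qquad Q(Q(x)) = Q(x)
% Q(x)/\Delta is already an integer, so re-quantizing at the same settings changes nothing
```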
|
|
|
daniilmaks
|
2025-11-17 11:04:13
|
yeah that's roughly what I thought, I found it weird that dzgas said it was lossless.
|
|
|
DZgas Ж
|
|
yeah that's roughly what I thought, I found it weird that dzgas said it was lossless.
|
|
2025-11-17 11:05:42
|
I didn't say it was lossless in the test, I'm saying that the codec itself isn't designed to be used lossy; that's an additional feature, not its primary one. FLIF was promoted as a lossless codec.
|
|
|
daniilmaks
|
2025-11-17 11:06:22
|
that clears things, thanks.
|
|
|
DZgas Ж
|
2025-11-17 11:08:29
|
Same as lossy modular in jpeg xl. It's possible, but why? This isn't something that can be shown as: look, it can... it can. It's bad, but it can. Why...? The main thing is that it exists, it's possible... maybe you want to compress 10% more than lossless so that it's almost lossless but losslessn't.
|
|
|
daniilmaks
|
2025-11-17 11:09:12
|
vibe-losslessness
|
|
|
_wb_
In those generation loss comparisons, of course FLIF was used in a lossy way, and the bitrates of the first generations were matched to make it a more or less fair comparison. Of course it would be meaningless to put FLIF there in lossless mode, generation loss is not a thing for lossless.
FLIF was exceptionally generation loss resistant (as long as you don't start cropping or otherwise moving around pixel positions between generations) because its lossy mode isn't doing any frequency transform, it's just quantizing precision of residuals in a way that is idempotent (no change if you recompress a decoded image at the same encode settings).
|
|
2025-11-17 11:10:30
|
sounds a bit like lossywav, you can recompress it as many times as you want, but the lossy part only happens the first time.
|
|
|
DZgas Ж
|
2025-11-17 11:11:24
|
Well, technically, wav has bit quantization, which is also compression, if the original is, for example, 32-bit float and the resulting file is 16 bit
|
|
2025-11-17 11:12:08
|
lossy in essence
|
|
|
daniilmaks
|
2025-11-17 11:12:41
|
I get where you're going but in that context lossless is a meaningless word with no real world use.
|
|
2025-11-17 11:13:20
|
in the digital media realm*
|
|
2025-11-17 11:15:09
|
but it's a fair point you're bringing up, people tend to forget lossless only means "lossless after this very particular stage in the processing chain"
|
|
|
DZgas Ж
|
2025-11-17 11:16:07
|
<:galaxybrain:821831336372338729><:This:805404376658739230>
|
|
|
daniilmaks
|
2025-11-17 11:18:17
|
fun anecdote: I once had to do a whole lesson on compression types for some dude who thought he was increasing the quality of screenshots by naively converting them from jpg to png
|
|
|
DZgas Ж
|
|
I get where you're going but in that context lossless is a meaningless word with no real world use.
|
|
2025-11-17 11:18:50
|
The main thing is which data is the original. There are people whose master track is 32-bit float 192 kHz just because they're paranoid audiophiles, yet it ends up as flac 16-bit 44.1, which is "definitely lossless".
|
|
|
daniilmaks
|
|
fun anecdote: I once had to do a whole lesson on compression types for some dude who thought he was increasing the quality of screenshots by naively converting them from jpg to png
|
|
2025-11-17 11:19:14
|
he was incredibly stubborn at first but I had live evidence to help him change his mind.
|
|
|
DZgas Ж
|
|
fun anecdote: I once had to do a whole lesson on compression types for some dude who thought he was increasing the quality of screenshots by naively converting them from jpg to png
|
|
2025-11-17 11:20:49
|
Well, it's not that bad, it's much harder to explain to others that they shouldn't compress images into archives, because it doesn't make sense, even though the size becomes a couple hundred kilobytes smaller thanks to redundant structural metadata inside the files 🥹
|
|
|
daniilmaks
|
2025-11-17 11:23:33
|
I only put media onto archives for organization (and rarely so), and even then it's usually either fast compression or no compression. there's also technical reasons: copying large numbers of files between android and windows has been buggy for over a decade, zipping stuff makes some copying operations faster and more reliable.
|
|
|
DZgas Ж
|
2025-11-17 11:31:20
|
For some reason, 7zip forks lack a COPY mode; I have no idea why. The original 7zip has a clear and understandable "7z copy" that doesn't perform any compression at all, packing files as is.
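A sketch of that store-only mode, plus zip's equivalent (archive names are placeholders):
```
7z a -mx=0 photos.7z photos/    # -mx=0 selects the Copy method: packing without compression
zip -0 -r photos.zip photos/    # zip's store-only equivalent
```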
|
|
|
DZgas Ж
I've been studying compression for about eight years now, I think. But only now have I learned about Zopfli, which compresses data into pure Deflate, but much better. I'm really surprised to learn about this, because when encoding APNG, I wondered how Zopfli, from 2013, could have been added to a format from 2004. Well, it turns out that's exactly it. It simply compressed my test animation better than 7zip (lzma2), which was a bit surprising (~0.05%). (I'm talking about APNG Assembler)
|
|
2025-11-17 11:45:01
|
Yes, that's really interesting, although it's not better than 7zip in this case. I'm surprised no one talks about this anywhere at all
https://www.advancemame.it/download
https://github.com/google/zopfli/issues/29#issuecomment-77830614
advzip --recompress -4 --iter 100 my.zip
|
|
2025-11-17 11:46:11
|
although you know it's still very close
|
|
|
TheBigBadBoy - 𝙸𝚛
|
|
DZgas Ж
Yes, that's really interesting, although it's not better than 7zip in This case, I'm surprised no one talks about this anywhere at all
https://www.advancemame.it/download
https://github.com/google/zopfli/issues/29#issuecomment-77830614
advzip --recompress -4 --iter 100 my.zip
|
|
2025-11-17 12:12:46
|
best Deflate optimizer is ECT
and it's even multithreaded <:YEP:808828808127971399>
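a hedged usage sketch (flags from memory, check ect --help):
```
ect -9 --mt-deflate big.zip    # re-optimize the zip's deflate streams, multithreaded
ect -9 image.png               # also re-optimizes PNG's deflate stream
```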
|
|
|
Exorcist
|
|
fun anecdote: I once had to do a whole lesson on compression types for some dude who thought he was increasing the quality of screenshots by naively converting them from jpg to png
|
|
2025-11-17 01:57:27
|
https://github.com/victorvde/jpeg2png <:galaxybrain:821831336372338729>
|
|
|
daniilmaks
|
|
Exorcist
https://github.com/victorvde/jpeg2png <:galaxybrain:821831336372338729>
|
|
2025-11-17 02:03:02
|
HA, yeah I'm familiar, I wish I knew of it back then.
|
|
|
username
|
|
daniilmaks
|
2025-11-17 02:06:18
|
obviously nowadays I'd use this instead https://github.com/ilyakurdyukov/jpeg-quantsmooth
|
|
|
username
|
2025-11-17 02:09:05
|
I use both jpeg2png (modified) and quantsmooth since in some cases one does better than the other
|
|
|
daniilmaks
|
|
username
|
|
2025-11-17 02:13:37
|
you're thio?
|
|
|
username
|
|
you're thio?
|
|
2025-11-17 02:15:07
|
no, also the sharpness part of that fork I sent isn't in Thio's fork
|
|
|
Exorcist
|
|
username
|
|
2025-11-17 02:15:14
|
it may sharpen the noise dots
|
|
|
daniilmaks
|
2025-11-17 02:16:15
|
I'm thinking why not combine the algorithms of each so that you can tune the smoothing methods with a switch instead of a different program.
|
|
2025-11-17 02:17:04
|
also I find it neater having a jpg output since it's more space efficient for the source material.
|
|
|
username
|
|
also I find it neater having a jpg output since it's more space efficient for the source material.
|
|
2025-11-17 02:19:08
|
jpeg2png's output is fine if I plan on putting it into editing software since it will get converted to raw pixels either way
|
|
|
daniilmaks
|
|
Exorcist
it may sharpen the noise dots
|
|
2025-11-17 02:19:34
|
interesting, I'll add salt pepper noise to some tests.
|
|
|
username
jpeg2png's output is fine if I plan on putting it into editing software since it will get converted to raw pixels either way
|
|
2025-11-17 02:20:29
|
true, it's specially unimportant if it's just an intermediate file that will get deleted.
|
|
|
Exorcist
|
2025-11-17 02:21:21
|
jpeg2png is also a high bit-depth decoder when you set iteration = 0
|
|
|
daniilmaks
|
2025-11-17 02:21:48
|
what does iteration = 0 mean
|
|
2025-11-17 02:22:15
|
in this context
|
|
|
Exorcist
|
2025-11-17 02:22:24
|
only decode, do not smooth by gradient descent
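In CLI terms, something like this (filenames are placeholders; -i sets the gradient-descent iteration count per jpeg2png's README):
```
jpeg2png photo.jpg -i 0 -o decoded.png    # plain high-bit-depth decode, no smoothing
jpeg2png photo.jpg -i 50 -o smooth.png    # deblocking via gradient descent
```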
|
|
|
daniilmaks
|
|
Exorcist
only decode, do not smooth by gradient descent
|
|
2025-11-17 02:22:51
|
<:FeelsReadingMan:808827102278451241> good to know
|
|
|
username
|
|
Exorcist
jpeg2png is also a high bit-depth decoder when you set iteration = 0
|
|
2025-11-17 02:24:34
|
how does it compare to the JPEG decoding in jpegli/libjxl?
|
|
|
daniilmaks
|
|
Exorcist
only decode, do not smooth by gradient descent
|
|
2025-11-17 02:24:49
|
ok just so I didn't misread you, jpeg2png always works in high bitdepth, you're just saying it can be useful if I want the high precision decoding without the smoothing. correct?
|
|
2025-11-17 02:25:55
|
nice, I gained 1 singular neuron today.
|
|
|
DZgas Ж
|
|
TheBigBadBoy - 𝙸𝚛
best Deflate optimizer is ECT
and it's even multithreaded <:YEP:808828808127971399>
|
|
2025-11-17 09:26:54
|
can it .zip?
|
|
|
TheBigBadBoy - 𝙸𝚛
|
2025-11-17 09:32:56
|
yeah
|
|
2025-11-17 09:33:11
|
well
|
|
2025-11-17 09:33:21
|
it can only optimize zip
|
|
2025-11-17 09:33:29
|
it cannot create it from scratch <:KekDog:805390049033191445>
|
|
|
|
cioute
|
|
DZgas Ж
Hi, I've been studying codecs for practical use for over 5 years, and here's what I'll say:
VP9 is more primitive
HEVC is overloaded with technology
VP9 is supported by browsers and has hardware support everywhere
HEVC isn't fully supported by browsers, but it has hardware support everywhere, which is useful, for example, in Telegram
For real-time encoding, due to its simplicity, VP9 will always be better than HEVC.
When encoding with longer and more complex presets, HEVC will always be better than VP9.
VP9 requires less computation for decoding, from 30 to 50% less than HEVC, depending on the content type. This means, for example, that if a device poorly decodes 1080p HEVC, it might play 1080p VP9 perfectly.
Content that offers exceptional advantages in any situation:
VP9: Minecraft, very active gameplay
HEVC: Anime, animation, static flat lines
Ideologically, the use of VP9 may be motivated by open-source fanaticism. But the patents place no restriction on HEVC use by private individuals anywhere.
In 90% of cases, there is no point in using HEVC if AV1 is available.
|
|
2025-11-19 02:52:35
|
very good explanation
|
|
|
lonjil
the only video codec worth using is 10-bit h264
|
|
2025-11-19 02:57:58
|
sadly no hw decoder
|
|
|
DZgas Ж
|
|
obviously nowadays I'd use this instead https://github.com/ilyakurdyukov/jpeg-quantsmooth
|
|
2025-11-21 07:17:21
|
Well, I use SCUNet-GAN and it's the best option 99% of the time. Only 1% of the time does SCUNet smooth things out more than I'd like, and my jpeg is the original, so only in those cases would jpeg2png actually be better. But over the past year, I've only encountered a couple of situations where scaling isn't an option, and SCUNet performed poorly. It's about ~10 times slower, but it's not 2^N times slower.
In 99% of cases, this is an example of when everything is fine:
Original jpeg | jpeg2png | SCUNet-GAN
|
|
|
daniilmaks
|
2025-11-21 07:19:49
|
nowadays I try to avoid ai-style solutions when it comes to well-defined problems like these but I appreciate the existence of those tools nonetheless
|
|
2025-11-21 07:20:33
|
I do like the results there, it's almost oversmooth but still looks fine.
|
|
|
DZgas Ж
|
|
I do like the results there, it's almost oversmooth but still looks fine.
|
|
2025-11-21 07:22:29
|
It's true, it's too perfect, as if it was shot on a camera 20 times more expensive than it could have been in this scene. And it's absolutely true that sometimes the result is so good that you have to add noise in post-processing, it's all too good <:galaxybrain:821831336372338729>
|
|
2025-11-21 07:23:26
|
This is not a problem because it will still be compressed 10 times wherever I send it, anywhere, Reddit, Telegram, Instagram, it doesn’t matter
|
|
|
daniilmaks
|
|
DZgas Ж
It's true, it's too perfect, as if it was shot on a camera 20 times more expensive than it could have been in this scene. And it's absolutely true that sometimes the result is so good that you have to add noise in post-processing, it's all too good <:galaxybrain:821831336372338729>
|
|
2025-11-21 07:24:35
|
https://tenor.com/view/game-day-gif-1421612078979008371
|
|
|
DZgas Ж
|
2025-11-21 07:24:51
|
So let the clear lines be compressed with less artifacts, rather than creating more blur via jpeg2png or more artifacts by sending the original
|
|
|
username
|
|
2025-11-21 10:48:51
|
a large number of iterations, which gave better quality in the original code, creates artifacts in this one:
-i 1000
|
|
|
gb82
|
|
DZgas Ж
Webp always better than Jpeg <:FrogSupport:805394101528035328> in web use
|
|
2025-11-22 04:20:03
|
i think i agree more generally for what is considered "web quality" but at high fidelity jpegli can edge out wins against libwebp in metrics and visually
|
|
2025-11-22 04:20:56
|
i don't think libwebp hill climbed very hard on high fidelity, because it is a genuinely very difficult problem
|
|
|
DZgas Ж
|
|
gb82
i don't think libwebp hill climbed very hard on high fidelity, because it is a genuinely very difficult problem
|
|
2025-11-22 12:08:23
|
This is a problem for people like **us**. Regular users don't think that way.
There's an SD TV... and an HD one.
There are "lossy images" for heavy "compression", and "lossless images" ""without"" compression. There's nothing to think about. Webp implements this simple approach perfectly, creating a clear **gap** between its lossless codec and its lossy codec. It's hard to explain why people need "Very High Quality but it's Not the Source"... like aac 640 kbps stereo
|
|
|
Exorcist
|
2025-11-23 11:19:27
|
> Nokia has a legal complaint against HP related to H.264 and H.265 filed October 2023. It even mentions H.265 on consumer devices being an infringement.
https://news.ycombinator.com/item?id=46019283
https://www.courtlistener.com/docket/67928650/1/nokia-technologies-oy-v-hp-inc/
|
|
2025-11-23 11:21:19
|
> HEIF itself is a container that may not be subject to additional royalty fees for commercial ISOBMFF licensees. Nokia also grants its patents on a royalty-free basis for non-commercial purposes.
https://en.wikipedia.org/wiki/High_Efficiency_Image_File_Format#Patent_licensing
https://github.com/nokiatech/heif/blob/master/LICENSE.TXT
|
|
2025-11-23 11:22:17
|
Trust me bro, we won't sue you<:avif:1280859209751334913>
|
|
|
lonjil
|
2025-11-23 11:27:22
|
> H.264
> 2023
the fuck?
|
|
|
spider-mario
|
2025-11-24 08:09:07
|
> It even mentions H.265 on consumer devices being an infringement. I sort of doubt this is about paying $0.04 more per laptop - there must be some uncertainty or legal risk maybe in certain chipsets or something.
nothing of the sort according to their link:
> Dozens of companies have taken licenses to Nokia’s essential patent claims at rates that are reasonable and non-discriminatory. Yet, despite receiving multiple offers from Nokia, HP has refused to take a license to Nokia’s H.264 and H.265 essential decoding patent claims. HP’s failure to negotiate in good faith to reach an agreement on terms for a license to Nokia’s standard essential patents for the relevant standards (including Nokia’s patented H.264 and H.265 technology) has forced Nokia to institute this lawsuit.
|
|
|
Cacodemon345
|
2025-11-24 08:39:47
|
Now that explains why Dell and HP disabled HEVC in their later laptops.
|
|
|
lonjil
|
2025-11-24 09:22:44
|
how can there still be essential patents on h.264 decoding ??
|
|
2025-11-24 09:22:53
|
shouldn't they have expired already?
|
|
|
Exorcist
|
2025-11-24 09:36:33
|
> reasonable and non-discriminatory
<:ugly:805106754668068868>
|
|
|
spider-mario
|
|
lonjil
shouldn't they have expired already?
|
|
2025-11-24 10:58:03
|
wikipedia says some of them don’t expire until 2030
|
|
|
lonjil
|
2025-11-24 10:58:26
|
How completely ridiculous
|
|
2025-11-24 10:59:21
|
since it's been around since 2003
|
|
|
spider-mario
|
2025-11-24 10:59:50
|
maybe not quite as ridiculous as copyright (death + 70 years in most countries), but yeah
|
|
|
lonjil
|
2025-11-24 11:00:06
|
yeah, but patents are usually 20 years
|
|
|
spider-mario
|
2025-11-24 11:00:12
|
(on that note, thanks to Jim Morrison’s relatively early death, I may live to see The Doors’ discography enter the public domain!)
|
|
|
lonjil
|
2025-11-24 11:00:49
|
like, what, did some patents sit in patent office bureaucracy hell for 7 years and then get granted with the 20 year time period? 😆
|
|
|
spider-mario
(on that note, thanks to Jim Morrison’s relatively early death, I may live to see The Doors’ discography enter the public domain!)
|
|
2025-11-24 11:01:12
|
dang
|
|
|
spider-mario
|
2025-11-24 11:01:54
|
it’s already the case now in countries where it’s death + 50 years, which includes Canada https://en.wikipedia.org/wiki/2022_in_public_domain#Countries_with_life_+_50_years
|
|
|
_wb_
|
|
lonjil
yeah, but patents are usually 20 years
|
|
2025-11-24 01:28:54
|
even so, it can happen that the codec itself is old enough to not be patent-encumbered anymore, but e.g. specific implementation techniques that were patented later are used... Also there could be patents on recent extensions/revisions of h.264
|
|
|
AccessViolation_
|
2025-11-24 02:54:15
|
I feel like every application of a patented thing should be exempt if that thing was made before the date of the patent application
|
|
2025-11-24 03:03:25
|
you shouldn't be able to retroactively demand people pay licensing fees for things they were doing before you got the exclusive rights to them. that's like making something illegal and prosecuting people that did it while it wasn't
|
|
2025-11-24 03:04:36
|
hmm no, it's not like that because I think you should also be able to continue making and selling patented things if you were doing so before the patent application
|
|
2025-11-24 03:06:46
|
I like that software patents aren't really a thing in the EU 🙂
|
|
|
_wb_
|
|
AccessViolation_
I feel like every application of a patented thing should be exempt if that thing was made before the date of the patent application
|
|
2025-11-24 03:22:18
|
in principle if you can demonstrate that your thing predates the patent application, it's prior art and the patent is invalid. But invalidating a patent is very nontrivial and costly
|
|
|
DZgas Ж
|
|
lonjil
> H.264
> 2023
the fuck?
|
|
2025-11-24 03:27:14
|
🥹
|
|
2025-11-24 03:30:08
|
AVC is still the fastest codec for encoding and decoding. For 1080p resolution I would certainly recommend libvpx vp9 -deadline realtime -cpu-used 8 -row-mt 1, but avc is still an option -- And of course, AVC is used in a lot of places, and no one will just release the patents, haha; they released the entire EVS codec that no one needs, but they didn't dare release the AVC patents
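(spelled out as a full command, a sketch with a placeholder bitrate:)
```
ffmpeg -i in.mp4 -c:v libvpx-vp9 -deadline realtime -cpu-used 8 -row-mt 1 -b:v 2M out.webm
```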
|
|
|
lonjil
how can there still be essential patents on h.264 decoding ??
|
|
2025-11-24 03:30:34
|
<:Stonks:806137886726553651>
|
|
|
lonjil
|
|
DZgas Ж
AVC is still the fastest codec for encoding and decoding. For 1080p resolution I would certainly recommend libvpx vp9 -deadline realtime -cpu-used 8 -row-mt 1, but avc is still an option -- And of course, AVC is used in a lot of places, and no one will just release the patents, haha; they released the entire EVS codec that no one needs, but they didn't dare release the AVC patents
|
|
2025-11-24 03:35:45
|
Indeed. AVC is the only codec I use.
|
|
|
spider-mario
|
|
DZgas Ж
🥹
|
|
2025-11-24 07:48:45
|
I believe “the fuck” was a reaction to the patent status
|
|
|
DZgas Ж
|
|
spider-mario
I believe “the fuck” was a reaction to the patent status
|
|
2025-11-24 08:08:55
|
I realized this when I replied to the next message. I sincerely believe that AVC is outdated in all respects. I created my own EASYHEVC preset that fully solves the decoding issue for clients. DISCORD and Telegram both support HEVC, which is enough to encode all "original quality" memes in hevc, which is what I do. For streaming, for example on my personal website, I use VP9/AV1, which can be encoded on any processor bought these days... hmmm. VP9 also solves the decoding issue, and for AV1, you just need to keep the resolution at 1 megapixel at 30 fps <:AV1:805851461774475316>
|
|
|
Smegas
|
2025-11-24 09:00:47
|
All my home movies are in AVC. HEVC and AV1, in order to obtain a better compression ratio, blur small details, which, with stronger compression, results in a blurry image. To achieve the same level of detail in HEVC, I had to produce a file the same size as AVC. AVC is faster and eco-friendly.
|
|
|
spider-mario
|
2025-11-24 09:20:14
|
HEVC has broader support for hardware-accelerated 10-bit decoding, though
|
|
2025-11-24 09:20:32
|
which comes in handy for HDR 4K
|
|
|
DZgas Ж
|
|
Smegas
All my home movies are in AVC. HEVC and AV1, in order to obtain a better compression ratio, blur small details, which, with stronger compression, results in a blurry image. To achieve the same level of detail in HEVC, I had to produce a file the same size as AVC. AVC is faster and eco-friendly.
|
|
2025-11-24 10:31:25
|
All problems are solved by fine-tuning presets, but unfortunately, normal people will never do this
|
|
|
Lumen
|
|
Smegas
All my home movies are in AVC. HEVC and AV1, in order to obtain a better compression ratio, blur small details, which, with stronger compression, results in a blurry image. To achieve the same level of detail in HEVC, I had to produce a file the same size as AVC. AVC is faster and eco-friendly.
|
|
2025-11-25 10:03:45
|
that is an extremely uneducated statement
|
|
2025-11-25 10:03:58
|
(in the encoding field)
|
|
|
_wb_
|
2025-11-25 04:11:20
|
Will jxl-rs be the first codec implementation in a browser that is written in Rust?
|
|
|
monad
|
2025-11-25 04:14:46
|
"The first memory-safe codec."
|
|
|
Cacodemon345
|
|
_wb_
Will jxl-rs be the first codec implementation in a browser that is written in Rust?
|
|
2025-11-25 04:29:28
|
Find out in the current episode.
|
|
2025-11-25 04:29:50
|
--- Browser News Network
|
|
|
|
veluca
|
2025-11-25 05:09:03
|
no, chrome uses the Rust png crate nowadays
|
|
|
monad
|
2025-11-25 05:23:52
|
"The first memory-safe codec since PNG."
|
|
|
lonjil
|
2025-11-25 05:55:09
|
I'm playing around with various software verification tools, so maybe if all goes well we can brag about it being the first formally verified codec implementation in a browser 😄
|
|
|
|
veluca
|
2025-11-25 06:01:44
|
you're not the first one to tell me that, funnily enough
|
|
2025-11-25 06:02:03
|
(at least, for formal verification for unsafe)
|
|
|
AccessViolation_
|
2025-11-25 06:06:11
|
that would be the final nail in the coffin for the all too common "just that it's written in rust doesn't mean it's secure"
|
|
|
lonjil
|
2025-11-25 06:07:34
|
There are so many different verification tools for Rust now, it almost feels like the verification community has bailed on C in favor of Rust
|
|
|
AccessViolation_
|
2025-11-25 06:10:03
|
I saw you comment about a JPEG XL decoder in WUFFS on hackernews and that would be *hell* to implement but really cool
|
|
|
lonjil
There are so many different verification tools for Rust now, it almost feels like the verification community has bailed on C in favor of Rust
|
|
2025-11-25 06:13:32
|
that's good to hear. I'm not surprised, rust is probably already really appealing if you need some level of verifiability
|
|
|
|
veluca
|
|
AccessViolation_
I saw you comment about a JPEG XL decoder in WUFFS on hackernews and that would be *hell* to implement but really cool
|
|
2025-11-25 06:14:01
|
yeah no
|
|
|
AccessViolation_
|
2025-11-25 06:15:49
|
right I need to remember to not bring up these silly ideas near decoder writers <:KekDog:805390049033191445>
|
|
2025-11-25 06:16:41
|
sorry I scared you with the mere thought of having to write that XD
|
|
|
|
veluca
|
2025-11-25 06:23:07
|
tbh I don't like wuffs much
|
|
|
AccessViolation_
|
2025-11-25 06:24:10
|
hmm why is that?
|
|
2025-11-25 06:25:12
|
I like it conceptually, but it's a shame it compiles to C. I guess that's a nice thing for compatibility, but I don't like it in terms of design aesthetics
|
|
|
|
veluca
|
2025-11-25 06:25:44
|
I might like it more once I see a decoder for a "serious" format written in it
|
|
2025-11-25 06:27:03
|
but the general feeling I have is that it's unlikely that you can get a good special-purpose language for this stuff and the effort would be better spent making Rust better at writing decoders 😛
|
|
|
AccessViolation_
|
|
veluca
I might like it more once I see a decoder for a "serious" format written in it
|
|
2025-11-25 06:31:20
|
I guess that's a testament to how much of a pain it is to write. we care about safety, but not [glares at wuffs] *that* much
|
|
2025-11-25 06:31:49
|
or there might be other reasons too. I wouldn't know
|
|
|
|
veluca
|
2025-11-25 06:32:41
|
keep in mind that writing a custom language means you need to write a custom compiler/stdlib
|
|
2025-11-25 06:32:47
|
and you need to trust *those* too
|
|
|
AccessViolation_
|
|
veluca
but the general feeling I have is that it's unlikely that you can get a good special-purpose language for this stuff and the effort would be better spent making Rust better at writing decoders 😛
|
|
2025-11-25 06:34:42
|
I don't know how much this helps decoders specifically but I'm quite excited about ranged integers. creating them requires proving or checking that they're within the range, but then wherever they're used, the compiler knows those integers can only ever be within that range which in theory allows it to make some novel optimizations that it wouldn't otherwise
|
|
|
lonjil
|
2025-11-25 06:35:13
|
brb writing a decoder in a theorem prover language with a verified core
|
|
|
|
veluca
|
2025-11-25 06:35:19
|
https://github.com/rust-lang/rfcs/issues/671 😛
|
|
|
AccessViolation_
|
2025-11-25 06:40:15
|
yeah those ^^
I haven't really been paying attention to the development of it, I seem to remember there was syntax to create them on nightly but they didn't do anything
|
|
|
lonjil
brb writing a decoder in a theorem prover language with a verified core
|
|
2025-11-25 06:42:25
|
gosh that reminds me, someone made some sort of proof system that would generate a spec *and* reference implementation from the same source representation. I don't remember what it's called
|
|
2025-11-25 06:43:49
|
one idea was that it would eliminate the problem of inconsistencies between a specification and its reference implementation
|
|
2025-11-25 06:48:42
|
SpecTec!
https://webassembly.org/news/2025-03-27-spectec/
|
|
|
_wb_
|
2025-11-25 07:15:27
|
I like programming languages with a clear operational semantics, that's the kind of stuff I did in a previous life
|
|
|