JPEG XL

other-codecs

DZgas Ж
I also have no clue how it follows that mp3 wasting bits in the bass means I am somehow incorrect.
2022-12-12 04:28:35
no one complains about low frequency artifacts
daniilmaks
2022-12-12 04:29:29
that means they are pretty much *not hearable*
2022-12-12 04:29:53
can you translate the graph?
2022-12-12 04:30:06
I mean literally, I don't speak russian
2022-12-12 04:31:05
I can't even type the characters to find out myself
DZgas Ж
can you translate the graph?
2022-12-12 04:39:36
distortions % frequencies hz
daniilmaks
2022-12-12 04:42:11
then it's as I thought. opus still has the least amount of distortion in the bass range, just like all the others, despite being the most pruned in that area.
DZgas Ж
then it's as I thought. opus still has the least amount of distortion in the bass range, just like all the others, despite being the most pruned in that area.
2022-12-12 04:43:43
what? how do you look at the graph? Lower - less distortion - better
daniilmaks
DZgas Ж what? how do you look at the graph? Lower - less distortion - better
2022-12-12 04:44:39
you seem to be confused about how psychoacoustic models work
DZgas Ж
you seem to be confused about how psychoacoustic models work
2022-12-12 04:45:08
and what do you mean by that?
2022-12-12 04:46:21
it is quite obvious that OPUS is better than all the other codecs, that is already clear, and the graph just shows it even more clearly
daniilmaks
2022-12-12 04:46:41
Because the distortion in the bass, after a certain point, gets masked by the distortions in the higher frequencies. Therefore, if you move those wasted bits out of the bass and into the higher frequencies you get a net improvement of quality at given bitrate.
DZgas Ж
2022-12-12 04:47:12
I just show that due to the features of the codec itself, it is not so perfectly excellent in the low frequency region, and this can be seen on the graph
daniilmaks
Because the distortion in the bass, after a certain point, gets masked by the distortions in the higher frequencies. Therefore, if you move those wasted bits out of the bass and into the higher frequencies you get a net improvement of quality at given bitrate.
2022-12-12 04:47:47
This is because the distortion is way higher in the higher frequencies, even in the opus model.
DZgas Ж I just show that due to the features of the codec itself, it is not so perfectly excellent in the low frequency region, and this can be seen on the graph
2022-12-12 04:49:00
The only thing your graph shows is that opus is the only codec that isn't *wastefully overkill* in the bass allocation.
2022-12-12 04:50:54
less bass distortion is only better when it's not already masked by the much higher distortions in the more hearable ranges, which is the case for all current codecs.
DZgas Ж
2022-12-12 04:52:29
it is interesting that using such a large block at high frequencies does not improve the situation in the end, although it is only hundredths of a percent of quality compared to the other codecs
less bass distortion is only better when it's not already masked by the much higher distortions in the more hearable ranges, which is the case for all current codecs.
2022-12-12 04:53:45
but I don't care about all this, I just shared an interesting tangential observation that all the past codecs preserve the low frequencies so accurately.
daniilmaks
DZgas Ж but I don't care about all this, I just shared an interesting tangential observation that all the past codecs preserve the low frequencies so accurately.
2022-12-12 04:54:20
*at the expense of the higher frequencies
2022-12-12 04:54:49
but yeah, I agree it's an interesting observation!
2022-12-12 04:57:45
also please note that, IIRC, artifacts are not the only source of distortion. *any* deviation from the source, hearable or otherwise, is measured as distortion.
DZgas Ж it is interesting that using such a large block at high frequencies does not improve the situation in the end, although it is only hundredths of a percent of quality compared to the other codecs
2022-12-12 05:03:16
those blocks are only large in this linear scale, which is very poor at proportionally displaying human hearing.
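For anyone wanting to reproduce a distortion-vs-frequency figure like the one being argued over, here is a minimal sketch, assuming the lossy file has already been decoded back to a WAV that is mono, time-aligned with the original, and at the same sample rate (file names and band edges are placeholders). It measures raw spectral deviation with no psychoacoustic weighting, i.e. exactly the "any deviation counts as distortion" caveat above.

```python
# Rough per-band "distortion %": compare the spectrum of the error
# (decoded - original) to the spectrum of the original, band by band.
# Assumes mono, time-aligned WAVs at the same sample rate; names are placeholders.
import numpy as np
from scipy.io import wavfile

rate, orig = wavfile.read("original.wav")
rate2, dec = wavfile.read("decoded.wav")
assert rate == rate2, "sample rates must match"

n = min(len(orig), len(dec))
orig = orig[:n].astype(np.float64)
dec = dec[:n].astype(np.float64)

spec_orig = np.abs(np.fft.rfft(orig))
spec_err = np.abs(np.fft.rfft(dec - orig))
freqs = np.fft.rfftfreq(n, d=1.0 / rate)

# Coarse, roughly octave-spaced bands; the edges are arbitrary placeholders.
edges = [20, 60, 120, 250, 500, 1000, 2000, 4000, 8000, 16000, 20000]
for lo, hi in zip(edges[:-1], edges[1:]):
    band = (freqs >= lo) & (freqs < hi)
    sig = np.sum(spec_orig[band] ** 2)
    err = np.sum(spec_err[band] ** 2)
    pct = 100.0 * np.sqrt(err / sig) if sig > 0 else float("nan")
    print(f"{lo:>5}-{hi:<5} Hz  distortion ~ {pct:6.2f} %")
```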
Demiurge
2022-12-12 08:37:00
Well the x axis is freq in hertz, that's easy to figure out. The y axis is a percent of some kind.
DZgas Ж I just show that due to the features of the codec itself, it is not so perfectly excellent in the low frequency region, and this can be seen on the graph
2022-12-12 08:39:15
He's saying that, just like all the other codecs, opus has less distortion in the lower freq than the high freq.
2022-12-12 08:39:24
Not that it has less distortion than other codecs.
2022-12-12 08:39:50
But that it still distorts high freqs more than low
DZgas Ж it is quite obvious that OPUS is better than all the other codecs, that is already clear, and the graph just shows it even more clearly
2022-12-12 08:42:26
I don't think it's obvious opus is better at all bitrates for all types of music.
2022-12-12 08:42:48
opus really, really has a problem with harpsichord music as of right now.
2022-12-12 08:43:01
And that's my favorite instrument :(
DZgas Ж
Demiurge opus really, really has a problem with harpsichord music as of right now.
2022-12-12 08:54:11
what is the bitrate?
Demiurge I don't think it's obvious opus is better at all bitrates for all types of music.
2022-12-12 08:56:00
and yet, there is no better codec that can sound just as good at a 128 kbit bitrate. and for large bitrates like 256+ there are always the old codecs
Demiurge He's saying that, just like all the other codecs, opus has less distortion in the lower freq than the high freq.
2022-12-12 08:56:46
that's understandable enough.
Demiurge
DZgas Ж what is the bitrate?
2022-12-12 09:53:30
I tested as high as 128kbps I think.
DZgas Ж
2022-12-12 09:55:14
<:Thonk:805904896879493180> u thnk
2022-12-12 09:56:12
by default, ffmpeg encodes opus at 96 kbps
Demiurge
2022-12-12 10:06:32
I used opus tools, not ffmpeg, and I made a point to increase the bitrate to a realistic level.
2022-12-12 10:06:41
I don't think I tested above 128kbps
2022-12-12 10:07:41
it was quite a while back since I did the abx tests but it was with the latest release.
pshufb
2022-12-13 04:08:08
at 320 kbps, the residuals with opus are basically nonexistent within the practical ranges of human hearing
2022-12-13 04:09:36
128 kbps not so much, but by 192 kbps it's already beyond my ability to tell a difference
uis
Demiurge opus really, really has a problem with harpsichord music as of right now.
2022-12-13 06:07:13
Can you report it to encoder devs?
Demiurge
2022-12-13 06:59:03
Is libopus actively developed still? it's been on 1.3 for a long time...
2022-12-13 06:59:31
Also they were aware of the harpsichord problem and 1.1 was supposed to "fix" it according to them...
2022-12-13 07:00:04
If I'm the only one who's bothered by it maybe it's not worth reporting...
VEG
2022-12-13 07:44:05
Yes, there is a lot of activity in the repository of the codec
2022-12-13 07:44:36
It doesn't need frequent releases when it is stable
2022-12-13 07:47:02
FLAC had 4 updates in 8 years...
2022-12-13 07:47:19
And this year alone 4 updates also 🙂
2022-12-13 07:47:29
With some significant improvements (a bit better compression, other changes according to the newer stricter spec)
DZgas Ж
2022-12-13 08:35:17
FLAC <:Stonks:806137886726553651>
pshufb 128 kbps not so much, but by 192 kbps it's already beyond my ability to tell a difference
2022-12-13 08:35:53
other codecs work better at these bitrates? for example Vorbis
Demiurge
2022-12-13 10:10:24
Perceptual improvements are a good cause for a new release.
2022-12-13 10:10:40
FLAC does not have perceptual improvements since it's lossless compression
diskorduser
Demiurge Perceptual improvements are a good cause for a new release.
2022-12-13 10:30:44
Does opus have harpsichord problem even at higher bitrates? (>200K)
Demiurge
2022-12-13 10:33:34
I don't think I tested that high but at that point it kinda defeats the purpose of using opus
diskorduser
2022-12-13 10:34:40
So what codec would you use at such bitrates like 192 or 256k?
Demiurge
2022-12-13 10:35:18
Probably lossywav?
diskorduser
2022-12-13 10:36:26
🤔 Does it sound better than aac or vorbis at 256k?
Demiurge
2022-12-13 10:36:42
Have not tested yet...
pshufb
DZgas Ж other codecs work better at these bitrates? for example Vorbis
2022-12-13 02:01:47
I could not reproduce this claim and I am not sure why it would be true
DZgas Ж
pshufb I could not reproduce this claim and I am not sure why it would be true
2022-12-13 07:43:12
<:Thonk:805904896879493180> <:Thonk:805904896879493180> <:Thonk:805904896879493180> <:Thonk:805904896879493180>
spider-mario
2022-12-13 09:14:45
I recently learned that FLAC is limited to 8 channels per stream, which is kind of a shame
2022-12-13 09:14:51
WavPack doesn’t have that limitation
daniilmaks
pshufb I could not reproduce this claim and I am not sure why it would be true
2022-12-13 10:25:09
it sounded more like a question, rather than a claim.
pshufb
2022-12-14 06:11:38
ah
VEG
spider-mario I recently learned that FLAC is limited to 8 channels per stream, which is kind of a shame
2022-12-14 07:34:39
Why would you need more?
2022-12-14 07:36:07
It should be possible to put a few multiplexed FLAC streams into one Ogg container if required, but 8 sounds like enough for all normal use cases
spider-mario
2022-12-14 07:39:44
higher-order ambisonics would be one example
Traneptora
Demiurge I don't think I tested that high but at that point it kinda defeats the purpose of using opus
2022-12-15 02:20:22
I wouldn't say it defeats the point of using Opus at 160 kbps
2022-12-15 02:20:29
since it's *still* going to outperform Vorbis and AAC at those bitrates
2022-12-15 02:21:08
even if all three achieve transparency there's still no downside to using Opus
Demiurge
pshufb I could not reproduce this claim and I am not sure why it would be true
2022-12-15 05:13:24
You could not reproduce the scratchy harpsichord?
2022-12-15 05:13:40
I remember I used the same harpsichord sample on the opus 1.1 demo page
2022-12-15 05:14:18
encoded with the latest version of opus and ABX tested with squishyball
pshufb
Demiurge You could not reproduce the scratchy harpsichord?
2022-12-15 08:18:00
I could not reproduce the claim that Vorbis did better than Opus at high bitrates
Demiurge
2022-12-15 08:27:17
I don't think I tested higher than 128kbps
2022-12-15 08:27:28
If I did it would not have been much higher
daniilmaks
2022-12-15 08:47:04
are there any high quality harpsichord samples in public domain?
2022-12-15 08:47:19
or at least free to download
Demiurge
2022-12-15 08:54:11
The 48khz sampled one on the opus 1.1 page works well for abx testing
daniilmaks
2022-12-15 09:03:25
oh I misread what you said earlier then
afed
2022-12-19 04:16:28
so is 12-bit jpeg really needed and used? <https://github.com/libjpeg-turbo/libjpeg-turbo/issues/199#issuecomment-1066163024> `12-bit support can be added to the TurboJPEG API as part of the TurboJPEG v3 overhaul, for which I'm currently trying to obtain funding. The new API structure will make it a lot easier to support 12-bit JPEGs, since it would only require a separate set of buffer set/get functions for 12-bit samples rather than a completely separate API`
Fraetor
2022-12-19 04:21:13
Only if you can't have JPEG XL.
_wb_
2022-12-20 08:47:00
I just checked, YouTube really does send WebP images for its video thumbnails even if the user agent clearly indicates it does not support WebP (e.g. by requesting the url with `Accept: image/jpeg`).
2022-12-20 08:48:24
That is quite nasty behavior imo, effectively forcing all browsers to support WebP since otherwise YouTube completely breaks.
2022-12-20 08:51:48
this is what YouTube looks like when webp doesn't work
2022-12-20 08:52:28
only the ad image works, it's a jpeg
2022-12-20 08:53:08
the channel icons work too, they are jpegs too
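The Accept-header check above is easy to reproduce; a small sketch assuming the `requests` package, where the thumbnail URL pattern is only an example and the video ID is a placeholder:

```python
# Ask for a YouTube thumbnail while only advertising JPEG support, then see
# what Content-Type actually comes back. VIDEO_ID is a placeholder.
import requests

url = "https://i.ytimg.com/vi/VIDEO_ID/hqdefault.jpg"
resp = requests.get(url, headers={"Accept": "image/jpeg"}, timeout=10)
print(resp.status_code, resp.headers.get("Content-Type"))
# If the server ignores content negotiation, this can report image/webp
# even though the client only asked for image/jpeg.
```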
yurume
2022-12-20 09:08:24
to be fair there is no actual requirement about supported image formats, probably apart from PNG (even JPEG is not required AFAIK)
2022-12-20 09:38:32
and yet everyone is using JPEG, resulting in a de facto standardization (people will serve JPEG even though you don't give `Accept: image/jpeg`, won't they)
2022-12-20 09:39:13
WebP is now heading the same direction (like it or not) and I guess youtube's WebP only policy is actually quite late
_wb_
2022-12-20 09:40:26
https://www.w3.org/Graphics/ indeed says "There is no limit in the Web specifications to the graphical formats that can be used on the Web."
yurume
2022-12-20 09:40:42
PNG is required by the canvas specification
2022-12-20 09:41:00
(or more accurately, PNG *writing*, not sure about PNG reading)
_wb_
2022-12-20 09:41:11
isn't JPEG writing also required? I recall something like that
yurume
2022-12-20 09:42:21
https://html.spec.whatwg.org/multipage/canvas.html#serialising-bitmaps-to-a-file "User agents must support PNG ("`image/png`"). User agents may support other types. If the user agent does not support the requested type, then it must create the file using the PNG format. [PNG]"
2022-12-20 09:42:51
it does have an API support for lossy formats, but doesn't require any particular lossy format
_wb_
2022-12-20 09:44:24
right
2022-12-20 09:45:11
so spec-wise, only png is OK. But all browsers do support jpeg too so I suppose that's de facto also required, likely some pages will break if that gets removed.
2022-12-20 09:45:49
even writing image/webp is supported by all major browsers except safari, it looks like...
2022-12-20 09:48:46
I didn't know chrome and firefox came with a webp encoder too, even on mobile. That must add some binary size and possibly security surface for quite questionable functionality, imo.
Fraetor
2022-12-20 09:52:38
They heavily reference JPEG in the spec for <img>, even if they don't explicitly say it has to be supported.
2022-12-20 09:52:58
https://html.spec.whatwg.org/multipage/embedded-content.html#the-img-element
_wb_
yurume WebP is now heading the same direction (like it or not) and I guess youtube's WebP only policy is actually quite late
2022-12-20 09:54:15
I wonder when exactly they started doing that. But even today, still 2.7% of browsers in use don't support WebP according to https://caniuse.com/webp so I think it's still nasty to use WebP without fallback.
Fraetor
2022-12-20 09:56:23
Does the rest of youtube work on those browsers?
yurume
2022-12-20 09:57:10
in the past youtube used their power to discourage IE6 for users (https://blog.chriszacharias.com/a-conspiracy-to-kill-ie6), and I guess they are trying to do that again
_wb_
2022-12-20 10:32:03
IE6 was kind of universally agreed to be horrible and I think they did the world a favor by helping to kill it. But the merits of (lossy) WebP were controversial, there was no consensus that it was actually a worthwhile permanent addition to the web platform, and Firefox/Safari resisted for as long as they could.
Fraetor Does the rest of youtube work on those browsers?
2022-12-20 10:38:01
No idea, I just used Chrome devtools to show how it looks when 'Disable WebP image format' is checked. But I would expect the videos themselves to be available in h264, vp9 and av1 and use content negotiation to select a codec that works for the user agent.
2022-12-20 10:39:28
The only reason to keep vp8 around in browsers is for webp now, I think. At least I don't think anyone is using it for video anymore, especially not without fallback.
yurume
_wb_ IE6 was kind of universally agreed to be horrible and I think they did the world a favor by helping to kill it. But the merits of (lossy) WebP were controversial, there was no consensus that it was actually a worthwhile permanent addition to the web platform, and Firefox/Safari resisted for as long as they could.
2022-12-20 10:43:26
I mean, they can convince themselves that they are doing the right thing, whether it's true or not
joppuyo
_wb_ this is what YouTube looks like when webp doesn't work
2022-12-20 01:20:45
I can say as a web developer that this is very common practice; since webp is supported by over 95% of browsers, it can be argued that it's not worth the effort to support the very few users that can't use it
2022-12-20 01:21:45
if this was done before firefox got webp support, yeah I could argue it could be seen as something malicious but if they did this after firefox got on board I don't think they did it on purpose
2022-12-20 01:22:28
always having a jpeg fallback is of course nice but in youtube's case this could mean terabytes of extra data to store
2022-12-20 01:27:20
when I want to use a new feature or API in a web project I usually look at caniuse and if it's supported by less than 90% of browsers I will use a polyfill or a fallback so it doesn't break on old browsers. if the support is over 90% I usually don't bother with any sort of fallback if: 1. there are only minor issues for browsers that don't support the feature 2. it's more effort than it's worth to implement a fallback
_wb_
2022-12-20 01:33:15
getting broken images is more than a minor issue imo
2022-12-20 01:34:11
as to the storage: I cannot imagine that the storage for the thumbnails is in any way important compared to the storage for the videos themselves
joppuyo
2022-12-20 01:41:08
I can't really speak for youtube regarding their prioritisation, but aiming for 100% compatibility is very difficult and often impossible. and we are talking about one of the world's most visited websites and not some hobby project
2022-12-20 01:41:18
well actually, in the case of the world's most visited site, you could argue that compatibility is more important, but on the other hand you have to prioritize stuff like new features over perfect compatibility, since the new features and improvements bring in money while perfect compatibility uses a lot of development hours without bringing much immediate benefit
Fraetor
_wb_ The only reason to keep vp8 around in browsers is for webp now, I think. At least I don't think anyone is using it for video anymore, especially not without fallback.
2022-12-20 02:07:38
Is VP8 part of the webRTC spec, or is that not an issue with openh264?
afed
2022-12-20 02:26:14
<@886264098298413078> so now there will be dav1d in tdesktop for windows (after merging), but it's not used for av1 video decoding yet? <https://github.com/telegramdesktop/tdesktop/pull/25572>
fab
2022-12-20 03:51:09
https://artifacts.videolan.org/x264/release-win64/
novomesk
afed <@886264098298413078> so now there will be dav1d in tdesktop for windows (after merging), but it's not used for av1 video decoding yet? <https://github.com/telegramdesktop/tdesktop/pull/25572>
2022-12-20 04:18:39
Not yet, but it should be easy to do it later.
BlueSwordM
_wb_ That is quite nasty behavior imo, effectively forcing all browsers to support WebP since otherwise YouTube completely breaks.
2022-12-20 04:55:54
Well, if Google does this for WebP, I believe we could do this for JPEG-XL on Chrome browsers. Note that word *could*. Since we hold ourselves above the Chrome/AOM devs, we won't be doing this.
_wb_
2022-12-20 05:07:51
I was wrong: YouTube actually does work on browsers that don't support webp
2022-12-20 05:08:15
it just does it in a weird way — when trying "disable webp support" in chrome devtools, it does break
2022-12-20 05:08:33
but e.g. using an old firefox version from before firefox supported webp, it does work
Fraetor
2022-12-20 07:35:44
I guess there is some user agent stuff then. Probably because it polyfills some JS for older browsers or similar.
fab
2022-12-20 08:16:57
https://www.mpeg.org/standards/Explorations/41/
JendaLinda
2022-12-23 04:38:16
It seems that the most reliable way to view animated AVIF is to play the file in VLC.
spider-mario
2022-12-23 11:08:46
> Looking at the latest incarnation of HDR in Adobe camera raw I am mildly disappointed
>
> A number of considerations
>
> 1. The choice of AVIF and Jpeg XL is not particularly exciting. Of course those are license free and google is backing AVIF however from compression point of view they lag quite a bit from other formats within HEIF containers. Not necessarily HEIC there are others
2022-12-23 11:08:49
(https://www.dpreview.com/forums/thread/4688341?page=2#forum-post-66723006)
2022-12-23 11:08:59
<:Thonk:805904896879493180>
BlueSwordM
spider-mario > Looking at the latest incarnation of HDR in Adobe camera raw I am mildly disappointed > > A number of considerations > > 1. The choice of AVIF and Jpeg XL is not particularly exciting. Of course those are license free and google is backing AVIF however from compression point of view they lag quite a bit from other formats within HEIF containers. Not necessarily HEIC there are others
2022-12-24 12:17:03
This is stupid lmao.
2022-12-24 12:17:25
Also, Google is backing JXL, but Chrome/AOM are backing AV1 as well.
_wb_
2022-12-24 07:20:38
Besides HEVC (HEIC), the other payloads you can currently have in HEIF are:
- AVC (h264)
- JPEG
- AV1
- JPEG XR
- JPEG 2000
- JPEG XS
2022-12-24 07:23:22
At one of the previous JPEG meetings they asked us if we wanted to define jxl as a possible payload for heif — we said "no thanks" since unlike for those other payloads, there's basically nothing the heif container can offer jxl that it doesn't already have, it would just be an unnecessary duplication of functionality.
2022-12-24 07:26:59
HEIF is designed to be a wrapper for 'dumb' payloads that just compress numbers without semantics, which is how things were done in the past: the codec doesn't know what the numbers mean, it just compresses them, and meaning is left as a metadata/file format/application responsibility.
2022-12-24 07:30:32
So heif has metadata for colorspace (enum or icc) and orientation, and it does tiling, cropping, layering, extra channels like alpha etc at the file format level.
2022-12-24 07:31:49
That's great for payload codecs that don't support those things yet, but it's not the optimal way to do things.
2022-12-24 07:33:17
In jxl, all render-impacting info goes into the codestream (payload), and the jxl file format is a very shallow thing that just stores some optional metadata that doesn't influence how to display the image.
2022-12-24 07:34:48
Main advantage of the jxl approach: the encoder always knows what it is doing, since it has all the information. It knows the colorspace, it knows which pixels are visible or not because of alpha.
2022-12-24 07:39:36
Since it can internally work in an absolute colorspace (XYB), the encoder configuration of quality can use a scale that is the same regardless of the image colorspace.
BlueSwordM
_wb_ Since it can internally work in an absolute colorspace (XYB), the encoder configuration of quality can use a scale that is the same regardless of the image colorspace.
2022-12-24 03:47:06
Yes! This is actually a massive issue with classical video encoders: there have to be specific optimizations for specific color spaces.
spider-mario
2022-12-24 07:58:11
I asked the person in the thread but the response I got was not very informative
2022-12-24 07:58:29
https://www.dpreview.com/forums/post/66723770
Fraetor
2022-12-24 08:05:15
Maybe they are confused about jxl vs JPEG?
_wb_
2022-12-24 08:44:16
Or maybe it's one of those persons who don't realize lossy compression can reach various trade-offs between quality and size? Maybe comparing default settings of every format and getting the smallest files with heic?
fab
2022-12-24 08:51:15
2022-12-24 08:51:44
Usually Google looks at those images and wants good compression
2022-12-24 08:51:53
Don't care about the quality
Sebastian Merca
2022-12-29 11:44:50
hello, is it possible to convert a JPEG XL sequence to ProRes?
sklwmp
Sebastian Merca hello, is it possible to convert a JPEG XL sequence to ProRes?
2022-12-29 12:53:57
you could probably just use ffmpeg
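A hedged sketch of that ffmpeg route, assuming an ffmpeg build with JPEG XL (libjxl) decoding enabled and a numbered frame sequence; the frame rate, file pattern and ProRes profile are placeholders to adjust:

```python
# Turn a numbered JPEG XL sequence into a ProRes 422 HQ .mov via ffmpeg.
# Assumes ffmpeg was built with libjxl support; frame_0001.jxl etc. are placeholders.
import subprocess

subprocess.run([
    "ffmpeg",
    "-framerate", "24",          # source frame rate (assumption)
    "-i", "frame_%04d.jxl",      # numbered input pattern (placeholder)
    "-c:v", "prores_ks",         # ffmpeg's ProRes encoder
    "-profile:v", "3",           # 3 = ProRes 422 HQ
    "-pix_fmt", "yuv422p10le",   # 10-bit 4:2:2, typical for ProRes
    "out.mov",
], check=True)
```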
DZgas Ж
2023-01-02 01:43:43
telegram desktop (windows) now supports AV1
2023-01-02 03:14:07
svt-av1 does not have the ability to limit the block size.
afed
2023-01-02 03:27:35
<@886264098298413078> is it hard to also add this for windows? it feels like the telegram devs don't have windows <:SadCat:805389277247701002> https://github.com/telegramdesktop/tdesktop/pull/25623
novomesk
afed <@886264098298413078> is it hard to also add this for windows? it feels like the telegram devs don't have windows <:SadCat:805389277247701002> https://github.com/telegramdesktop/tdesktop/pull/25623
2023-01-02 04:41:52
It would work only in Qt6 builds as they made special patches to force using lcms in Qt 6.3 and Qt 6.4
DZgas Ж
2023-01-04 04:07:58
and didn't attach any commands. I'm using spline interpolation. and besides, why doesn't anyone tell people about the existence of 540p
2023-01-04 04:09:39
for the last 4 days I have been researching the best AV1 encoding option for Telegram, and it turned out that it can be even faster than AVC veryslow with extremely good output quality
2023-01-04 04:10:48
In fact, it's even strange that I spent so many days doing this, considering how much "no one" wants people to change coding parameters
2023-01-04 04:14:37
in fact, I have never encountered this kind of problem before: the codec does its job much better while using the same amount of time, but at the same time it's impossible to show it visually, people just don't see it. for example, which one is AV1 and which is AVC? And be sure that if you don't know exactly how each of these codecs works, you won't be able to tell.
2023-01-04 09:47:08
SVT-AV1 slightly disappoints with its quality. Well, of course it's fast and generally what I need. But its algorithms for smoothing noise and block jitter just don't seem to work, or work very poorly. In aomenc, even at speed 8 (the fastest), all the algorithms that destroy any noise and block shaking do work.... but unfortunately it is 6 times slower than SVT-AV1 speed 10
2023-01-04 09:50:51
SVT-AV1 speed 9-10 is about the same as AVC veryslow-placebo || I can't give an exact answer because the encoding time differs depending on the complexity of the film scene ||
sklwmp
2023-01-05 05:54:58
3DJ
2023-01-06 01:26:40
let's play a game I like to call Guess the MP3 <:kekw:808717074305122316>
2023-01-06 01:27:41
what are the best (and free) lossy encoders out there right now?
2023-01-06 01:29:07
I'm trying to compare AC4-IMS (new Dolby format for binaural audio) to other lossy codecs like opus and ofc mp3
Traneptora
3DJ what are the best (and free) lossy encoders out there right now?
2023-01-06 02:14:41
libopus should be the best free lossy audio encoder
3DJ
Traneptora libopus should be the best free lossy audio encoder
2023-01-06 02:17:01
that's the official one right? I used opusenc.exe from opus-tools-0.2-opus-1.3
Traneptora
2023-01-06 02:17:19
opusenc is the frontend to libopus, yes
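For reference, a minimal sketch of such an opusenc invocation, wrapped in Python; the file names are placeholders and `--bitrate` is in kbit/s:

```python
# Encode a WAV to Opus with opusenc (the opus-tools frontend to libopus).
import subprocess

subprocess.run(
    ["opusenc", "--bitrate", "128", "input.wav", "output.opus"],  # placeholder names
    check=True,
)
```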
3DJ
2023-01-06 02:17:31
any runner-ups?
2023-01-06 02:17:46
i'm gonna try vorbis next just for shits and giggles
2023-01-06 02:18:08
and I can't find an MQA encoder, guess it's proprietary?
Traneptora
2023-01-06 02:18:12
libvorbis outperforms LAME by a good deal, the FFmpeg built-in AAC encoder is something solid to try. I'm assuming by free you mean FOSS and not just free to download
2023-01-06 02:18:35
if you want something just free to download then quicktime's AAC encoder is gratis
3DJ
Traneptora libvorbis outperforms LAME by a good deal, the FFmpeg built-in AAC encoder is something solid to try. I'm assuming by free you mean FOSS and not just free to download
2023-01-06 03:02:20
nah I meant free as in free download, not necessarily FOSS cuz I just don't wanna have to pay for a quick experiment lol
2023-01-06 03:03:36
and is ffmpeg's AAC the best encoder for that format (as in highest quality within the 72kbps constraint)? or is there a better reference encoder?
Traneptora
2023-01-06 03:04:24
72 kbps is 36 kbps/channel which is well within HE-AAC range, and FFmpeg's encoder is only LC-AAC
2023-01-06 03:04:46
I'd use quicktime's AAC encoder if you're trying to test "AAC" itself not a specific encoder's implementation of it
2023-01-06 03:04:54
I don't know how to obtain it but I do know it's free
2023-01-06 03:05:29
it's at least one of the best if not the best AAC encoder that's freely available
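If qaac is awkward to get hold of, one hedged alternative is ffmpeg's libfdk_aac wrapper, which exposes the HE-AAC profile mentioned above; this assumes an ffmpeg build compiled with libfdk_aac (most default builds are not), and the file names are placeholders:

```python
# HE-AAC at ~72 kbps via ffmpeg's libfdk_aac wrapper.
# Assumes ffmpeg was compiled with --enable-libfdk-aac.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "input.wav",
    "-c:a", "libfdk_aac",
    "-profile:a", "aac_he",   # HE-AAC (SBR); aac_he_v2 adds parametric stereo
    "-b:a", "72k",
    "output.m4a",
], check=True)
```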
3DJ
2023-01-06 03:11:07
👌
sklwmp
Traneptora it's at least one of the best if not the best AAC encoder that's freely available
2023-01-06 03:33:05
how about Fraunhofer FDK AAC? i don't know which among the two is better
2023-01-06 03:33:14
although, fdk-aac is harder to find an easy-to-use binary for
Traneptora
2023-01-06 03:37:11
fdk-aac for a while was source-only
2023-01-06 03:37:19
unsure how it compares to qaac nowadays anyway
3DJ
2023-01-06 04:20:20
<@557099078337560596> <@853026420792360980> is there a tool to measure quality loss by comparing these lossy conversions to the original lossless WAV file?
sklwmp
2023-01-06 04:20:40
I was just searching for something like that a few days ago
2023-01-06 04:21:21
I think the closest thing I found was either just inverting the audio and checking how much it differed (not very smart or perceptual), or Visqol by Google (https://github.com/google/visqol), a "Perceptual Quality Estimator for speech and audio"
3DJ
2023-01-06 04:28:43
yeah I usually invert the 2nd track in audacity to hear the difference but I was hoping there would be something more mathematical that checks how much the bits/samples have changed compared to the raw source. is that what visqol does?
sklwmp
2023-01-06 04:31:06
I guess so? I haven't tried it yet.
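A minimal sketch of the "invert and measure" idea, with one refinement: estimate and remove the codec delay first (encoders often add leading silence, which is exactly the desync complained about later in this thread). It assumes mono WAVs at the same sample rate, uses placeholder file names, and is not a perceptual metric:

```python
# Crude, non-perceptual check: align the decoded file to the original via
# cross-correlation, then report an SNR in dB.
import numpy as np
from scipy.io import wavfile
from scipy.signal import correlate

_, ref = wavfile.read("reference.wav")   # placeholder names
_, deg = wavfile.read("degraded.wav")
ref = ref.astype(np.float64)
deg = deg.astype(np.float64)

# Estimate the encoder/decoder delay from the peak of the cross-correlation.
lag = int(np.argmax(correlate(deg, ref, mode="full"))) - (len(ref) - 1)
deg = deg[lag:] if lag > 0 else np.concatenate([np.zeros(-lag), deg])

n = min(len(ref), len(deg))
diff = deg[:n] - ref[:n]
snr_db = 10 * np.log10(np.sum(ref[:n] ** 2) / np.sum(diff ** 2))
print(f"estimated delay: {lag} samples, SNR: {snr_db:.2f} dB")
```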
afed
2023-01-06 05:24:05
https://youtu.be/3zaq56QsX28
3DJ
sklwmp I guess so? I haven't tried it yet.
2023-01-07 02:41:15
couldn't figure out how to build/use visqol, but found a tool that does exactly what I was looking for. I packaged it with scripts that make it even easier to use here: <https://github.com/ThreeDeeJay/eaqual/releases/latest>
2023-01-07 02:42:37
here's a test I made comparing quality and filesizes: <https://airtable.com/shrA5PHQEomEZXKZb> > An ODG of -4 means a very annoying disturbance, while an ODG of 0 means that there is no perceptible difference.
2023-01-07 02:58:51
ugh, no wonder stuff like qaac AAC-HE got a worse ODG than MP3. it seems that for some dumb reason, they desync (leading silence added?) during conversion
sklwmp
3DJ here's a test I made comparing quality and filesizes: <https://airtable.com/shrA5PHQEomEZXKZb> > An ODG of -4 means a very annoying disturbance, while an ODG of 0 means that there is no perceptible difference.
2023-01-07 09:55:10
what bitrate are these targeting? also, thank you so much for the comparison!
spider-mario
3DJ yeah I usually invert the 2nd track in audacity to hear the difference but I was hoping there would be something more mathematical that checks how much the bits/samples have changed compared to the raw source. is that what visqol does?
2023-01-07 02:36:42
perhaps my tool https://github.com/sboukortt/intersect-lv2 could be of help? It’s not made specifically for this purpose, but by using it with the original as the “left” channel and the lossily-compressed version as the “right” channel, the output would have most of the audio in the center output channel, what the lossy codec removed in the left output channel, and what it added in the right output channel
2023-01-07 02:37:00
since by just computing the difference, you can’t know whether it was added or removed
2023-01-07 02:37:44
(well, in some cases, you _can_ know, but not generally)
3DJ
sklwmp what bitrate are these targeting? also, thank you so much for the comparison!
2023-01-07 03:10:35
72kbps, which I think is the target used by AC4-IMS (a new format by Dolby for binaural stereo), so I just wanted to check if this new format was actually more efficient than the other good lossy standards that have actually reached somewhat decent support. tho I also wanted to check if there was anything better than opus right now
spider-mario perhaps my tool https://github.com/sboukortt/intersect-lv2 could be of help? It’s not made specifically for this purpose, but by using it with the original as the “left” channel and the lossily-compressed version as the “right” channel, the output would have most of the audio in the center output channel, what the lossy codec removed in the left output channel, and what it added in the right output channel
2023-01-07 03:13:52
so like inverting channels in audacity? I already tried that and noticed the difference, but I was just looking for a way to quantify how *much* difference there is between 2 (lossless and lossy) stereo tracks. Thanks for sharing tho 👍
spider-mario
3DJ so like inverting channels in audacity? I already tried that and noticed the difference, but I was just looking for a way to quantify how *much* difference there is between 2 (lossless and lossy) stereo tracks. Thanks for sharing tho 👍
2023-01-07 04:12:19
not quite like inverting it, because if you invert in Audacity and notice a difference, it could be either something that was present in the original signal and lost by the lossy codec, or some noise/distortion that was _added_ by the lossy codec
2023-01-07 04:12:24
my tool lets you know which of the two it is
3DJ
2023-01-07 04:21:45
ahhh I see. I'll check it out then 👌
JendaLinda
2023-01-08 11:25:06
Lately I was testing the jpeg-quantsmooth filter. It does a pretty good job cleaning up drawings that were "accidentally" saved as jpeg.
sklwmp
2023-01-08 01:23:22
> As part of a larger story about Apple's plans to allow third-party app stores on the iPhone and iPad in EU countries, Bloomberg's Mark Gurman claimed that Apple is also considering removing its requirement for iPhone and iPad web browsers to use WebKit, the open source browser engine that powers Safari.
2023-01-08 01:23:28
That would be amazing. Unlikely, but amazing.
Demiurge
2023-01-08 04:52:45
lol. We will consider it, mortal peon.
2023-01-08 04:56:28
You're lucky we allow you to pay us 1000 dollars to use our phone at all. Don't go thinking that it's your phone just because you bought it.
afed
2023-01-10 01:39:50
<:FeelsReadingMan:808827102278451241> https://github.com/nigeltao/boring-libjpeg-turbo
sklwmp
2023-01-10 05:00:07
Wait, libjpeg-turbo allows for arithmetic-coded JPEG?
2023-01-10 05:00:13
That's cool.
_wb_
2023-01-10 06:14:38
It does, and there are also variants that support 12-bit and alpha, but the problem is that using such features will create big interoperability issues since we cannot upgrade all jpeg decoders...
joppuyo
JendaLinda Lately I was testing the jpeg-quantsmooth filter. It does a pretty good job cleaning up drawings that were "accidentally" saved as jpeg.
2023-01-10 03:24:34
I’ve used a program called “cupscale” along with a machine learning model called “1x_JPEG” to remove JPEG artifacts very successfully
JendaLinda
2023-01-10 03:36:45
The convenience of jpeg-quantsmooth filter is that it's available in IrfanView. It's not perfect and it needs the actual JPEG data to undo the artifacts. It worked pretty well on some pictures before downscaling them. The downscaled pictures are pretty much spotless. I'm still keeping the original JPEG data, losslessly transcoded to JXL.
DZgas Ж
2023-01-12 04:21:25
Firefox doesn't support animated AVIF lol
JendaLinda
2023-01-12 05:14:14
The funny thing about animated AVIF is, if the program doesn't support animation, it can't decode the file at all.
Traneptora
2023-01-12 06:58:07
that's because AVIF is not forward compatible like PNG
DZgas Ж
2023-01-13 03:11:26
mpc-hc 1.9 supported animated avif
JendaLinda
2023-01-13 09:36:49
VLC can play avif animation as well. It's acting as a video file. Why not just use mp4 or webm/mkv?
sklwmp
2023-01-13 01:11:08
How does this fast "block shrinking" (shrink-on-load) work for JPEG? Is it a libvips thing or is there something inherent to JPEG that allows for quarter size or half size decodes, like I know CineForm can do?
spider-mario
2023-01-13 01:14:03
perhaps something along the lines of bypassing the inverse DCT since you would be throwing away the spatial information anyway?
yurume
2023-01-13 01:17:37
I guess shrinking happens during zigzag decoding, so that you drop high-frequency components?
veluca
2023-01-13 01:22:30
AFAIU they do a slightly modified 4x4 DCT on the top 4x4 DCT coefficients
2023-01-13 01:22:36
or 2x2
_wb_
2023-01-13 01:44:41
We could in principle do the same in jxl, just we have more block types to do it for (but codewise, I think you could basically just treat e.g. a dct32x16 as a dct16x8 on the top coeffs to get the 1:2 pixels, so only for stuff like AFV it gets tricky)
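The libjpeg-level trick being described here is exposed by common decoders; a small sketch using Pillow, whose `draft()` call requests a reduced-scale DCT decode (1/2, 1/4 or 1/8) before any pixels are produced. The file name is a placeholder:

```python
# Ask libjpeg (via Pillow) for an approximately 1/8-scale decode.
# draft() only applies to JPEG and picks the nearest supported DCT scaling factor.
from PIL import Image

with Image.open("photo.jpg") as im:        # placeholder file name
    print("full size:", im.size)
    im.draft("RGB", (im.width // 8, im.height // 8))
    im.load()                              # decodes at the reduced size
    print("decoded size:", im.size)
```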
DZgas Ж
_wb_ JPEG is working on JPEG AI, mostly driven by Huawei and Bytedance. But I doubt this kind of thing will be ready for practical deployment anytime soon.
2023-01-13 07:56:25
It is correct to assume that JPEG XL is de facto the last *real* image codec of the coming decades
2023-01-13 07:57:29
(Okay, there will still be an ultra-heavy AVIF2 in about 5 years)
Fraetor
2023-01-13 10:28:41
I doubt it, because lossless is always going to be a usecase.
DZgas Ж
2023-01-13 10:40:43
it just doesn't seem to me that anyone will ever again take on a project as large and all-encompassing as JPEG XL.
2023-01-13 10:42:02
it will be much more profitable in the future to invest in improving the Encoder, making it better and better every year
Fraetor
2023-01-14 02:29:06
True, there is a lot of space for improved encoders using the JXL bitstream.
Demiurge
JendaLinda VLC can play avif animation as well. It's acting as a video file. Why not just use mp4 or webm/mkv?
2023-01-14 04:47:13
avif is ALREADY a valid mp4 container file
daniilmaks
JendaLinda The funny thing about animated AVIF is, if the program doesn't support animation, it can't decode the file at all.
2023-01-15 09:59:57
I low-key *prefer* this behavior
_wb_
2023-01-15 10:56:53
well I prefer it actually supporting animation, but I kind of agree that 'graceful degradation' to just showing the first frame is rather confusing, since people will just think it's a still image that happens to be a big file for some reason.
daniilmaks
2023-01-15 11:02:37
pretty much. I have said before that an animated sequence 99% of the time requires itself to be opened fully to convey its message unambiguously. showing just the first frame of an animated sequence and calling that graceful degradation is not unlike sharing a video and only being able to play the first GOP
_wb_
2023-01-15 11:05:53
yeah, generally speaking graceful degradation for these kind of things just doesn't work because people will just assume the degraded experience is all there is — better to fail clearly and get application developers to do things properly
2023-01-15 11:07:57
it's the same with transparency: if you're going to just ignore the alpha channel or blend poorly (say reducing things to 1-bit alpha instead of doing proper blending), you can better not show anything
2023-01-15 11:10:45
I think this was the main problem with JPEG XT: sure it is nice that you can add extensions to JPEG that add alpha, HDR, lossless etc, but if most applications are just going to show you the 'fallback' lossy SDR image with alpha ignored, people will think that's all there is, with near-zero pressure on implementors to add any support for those extensions.
JendaLinda
2023-01-15 12:19:21
It seems I was wrong: some programs will show the first frame (Firefox, XnView MP, Windows Photos), while other programs (IrfanView, GIMP, Krita) report an unknown file format.
2023-01-15 12:29:50
Another story is AVIF with transparency. Only IrfanView and Windows Photos loaded the picture, with a black background (both using WIC); all other programs, including Firefox, refused to open the file.
sklwmp
2023-01-15 02:41:08
What's a good codec like PNG for lossless compression but faster decoding? I know QOI is a thing, but are there any decompression speed comparison graphs? I know there are ones for encode (fjxl vs fpnge vs QOI for example)
yurume
2023-01-15 02:51:04
http://www.radgametools.com/oodlelimage.htm ?
2023-01-15 02:51:40
of course, non-free (though I think there is a source code available because of the integration to UE)
pshufb
2023-01-15 03:18:05
Hi <@794205442175402004>, I hope this @ isn't an annoyance. I just read this: https://twitter.com/eddychik/status/1614642129397231616 And was wondering if you knew much about their comments regarding HEIF testing.
yurume
2023-01-15 03:24:33
I don't know much about the experiment, but there are many axes for codec comparison; in which axis is HEIF 10% better than JPEG XL?
pshufb
2023-01-15 03:31:11
I asked him a similar question in a reply.
_wb_
2023-01-15 04:27:51
The answer is in the twitter thread already. We did test heic (libheif with x265), and at the settings we used, it was somewhat better than jxl (and avif) at the low to medium qualities, but then got worse than jxl as quality goes up. This was with an encode speed that was 3x slower for heic than for jxl iirc, and libjxl 0.6 iirc.
2023-01-15 04:30:59
We haven't tested Apple's hardware encoder of heic, which is maybe more relevant in practice since this is how most heic files are currently created in the wild. I would assume it's significantly worse in quality/byte than x265, though of course much faster.
pshufb
_wb_ The answer is in the twitter thread already. We did test heic (libheif with x265), and at the settings we used, it was somewhat better than jxl (and avif) at the low to medium qualities, but then got worse than jxl as quality goes up. This was with an encode speed that was 3x slower for heic than for jxl iirc, and libjxl 0.6 iirc.
2023-01-15 09:27:27
Yeah, they ended up linking the source, so sorry for the ping. I didn't know about the encode speed stuff, though. Thank you!
grey-torch
2023-01-17 02:52:28
https://www.cast-inc.com/blog/800-mpixelss-260-luts-implementation-qoi-lossless-image-compression-algorithm-and-its
2023-01-17 02:54:27
discussion on implementing a qoi encoder/decoder on an fpga
w
2023-01-17 03:15:00
w-why
2023-01-17 03:15:33
oh this was from last year. I was wondering why anyone is still talking about qoi in current year
sklwmp
2023-01-17 04:38:08
Actually, what *are* the benefits of animated JXL versus animated AVIF (i.e. AV1 video)?
2023-01-17 04:38:28
coming from a convo over on the AV1 discord, it is a good question
veluca
2023-01-17 04:44:28
are there any? 😛
2023-01-17 04:44:39
a video codec will do better at video
2023-01-17 04:44:53
if you have animated pixel art, then maybe jxl will be better
2023-01-17 04:44:59
(if using lossless)
2023-01-17 04:45:26
or in general animated-whatever where you want to do lossless
2023-01-17 04:45:35
otherwise, nah
sklwmp
2023-01-17 04:45:57
so it's mostly there as a feature because... completeness sake?
veluca
2023-01-17 04:46:25
that, and they needed to give me an internship project 😛
2023-01-17 04:46:38
well, tbf
2023-01-17 04:46:48
this is on the delivery side
2023-01-17 04:47:35
for the content production pipeline usecase, you probably don't want inter prediction anyway, i.e. a video there is just a sequence of (high-quality) images
2023-01-17 04:47:53
and there you do get benefits from jxl, I think
2023-01-17 04:47:56
(I did not test this, or even verify my assumptions)
sklwmp
2023-01-17 04:48:41
so something like a replacement for jpeg2000 intermediates?
veluca
2023-01-17 04:48:47
... maybe?
sklwmp
2023-01-17 04:49:04
aight aight thanks for the insights
afed
2023-01-17 04:50:23
avif is better for lossy animation/short videos, jxl is better for true lossless images. for example, I need to merge several lossless images into a single animated one, and since jxl is better for lossless, it will be better for that, and also for higher bpc
BlueSwordM
sklwmp Actually, what *are* the benefits of animated JXL versus animated AVIF (i.e. AV1 video)?
2023-01-17 05:01:38
Mostly better intra lossless. Otherwise, just go with AV1 for anything even closely related to video.
joppuyo
sklwmp Actually, what *are* the benefits of animated JXL versus animated AVIF (i.e. AV1 video)?
2023-01-17 05:05:32
I would say for animation JXL would be better but for video you would want to use AV1
2023-01-17 05:06:06
and by animation I mean something like a loading indicator in the UI, something you would traditionally use a GIF for
pshufb
veluca for the content production pipeline usecase, you probably don't want inter prediction anyway, i.e. a video there is just a sequence of (high-quality) images
2023-01-17 05:25:48
there is all-intra for avif
veluca
pshufb there is all-intra for avif
2023-01-17 05:54:11
Yeah, I know, but I'd argue that animated JXL is probably better than all intra avif
_wb_
2023-01-17 09:01:12
When things are basically a slideshow of unrelated pictures, so inter doesn't help much at all, jxl should be better.
2023-01-17 09:01:47
Also when quality is very high (lossless or close to it), jxl should be better.
2023-01-17 09:08:48
If it's a synthetic animation (something like screen content with sprites moving around as perfect translations with integer pixel offsets), in principle jxl could be better (patches could be used to encode sprites only once, though we currently only look for that within a frame, not across frames)
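As a concrete illustration of the lossless-animation case, a hedged sketch: cjxl can take an existing animated GIF and re-encode it as a lossless animated JXL. This assumes a cjxl build with GIF input support; file names are placeholders.

```python
# Losslessly re-encode an animated GIF as an animated JPEG XL file.
import subprocess

subprocess.run(
    ["cjxl", "animation.gif", "animation.jxl",  # placeholder names
     "-d", "0",    # distance 0 = mathematically lossless
     "-e", "9"],   # higher effort = denser but slower
    check=True,
)
```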
Demiurge
2023-01-18 06:14:05
This guy is extremely verbose...
DZgas Ж
grey-torch discussion on implementing a qoi encoder/decoder on an fpga
2023-01-18 09:12:51
but FJXL killed QOI
grey-torch
2023-01-18 09:47:39
They don't compete unless someone starts implementing FJXL on an FPGA
yurume
2023-01-18 09:55:30
QOI will probably live on as long as people prefer a shorter code
2023-01-18 09:56:03
the problem is that, of course, if all you are willing to look at is a short codebase and modest performance, there will be a better format than QOI 😉
w
2023-01-18 10:00:01
yeah just use png
2023-01-18 10:00:08
import zlib
2023-01-18 10:00:08
done
2023-01-18 10:00:31
if i'd have anything on hardware i'd like zstd
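In the "import zlib, done" spirit, a minimal sketch of what that actually takes: an unoptimized PNG writer (fixed filter type 0, 8-bit RGB only), just to show how little sits between zlib and a valid file.

```python
import struct, zlib

def write_png(path, width, height, rgb_rows):
    """Minimal 8-bit truecolor PNG writer: signature + IHDR + IDAT + IEND."""
    def chunk(tag, data):
        return (struct.pack(">I", len(data)) + tag + data
                + struct.pack(">I", zlib.crc32(tag + data) & 0xFFFFFFFF))
    ihdr = struct.pack(">IIBBBBB", width, height, 8, 2, 0, 0, 0)  # 8-bit RGB
    raw = b"".join(b"\x00" + bytes(row) for row in rgb_rows)      # filter 0 per row
    with open(path, "wb") as f:
        f.write(b"\x89PNG\r\n\x1a\n" + chunk(b"IHDR", ihdr)
                + chunk(b"IDAT", zlib.compress(raw, 6)) + chunk(b"IEND", b""))

# 2x2 test image: red, green / blue, white
write_png("tiny.png", 2, 2, [[255, 0, 0, 0, 255, 0], [0, 0, 255, 255, 255, 255]])
```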
yurume
2023-01-18 10:03:09
well one can argue that zlib (or to the point, even miniz) is longer than qoi
2023-01-18 11:28:09
a modest proposal: make a new image format which i) can be encoded & decoded in ~500 lines of C and ii) roughly as speedy as libpng (TYPO) but iii) results in a smaller size.
2023-01-18 11:28:46
and it'd be named MOI (More than OK Image format)
grey-torch
2023-01-18 01:10:59
There's actually a deflate encode/decode library in <1000 lines of C; afaik raylib uses it as part of the game framework https://github.com/vurtun/lib
afed
2023-01-21 01:55:41
https://youtu.be/WeM2D1-yTzo
Demiurge
2023-01-24 03:42:19
Anyone else having trouble decoding this file? libwebp says it's a lossy animated webp with no errors but ffmpeg is unable to decode it.
Traneptora
Demiurge Anyone else having trouble decoding this file? libwebp says it's a lossy animated webp with no errors but ffmpeg is unable to decode it.
2023-01-24 03:44:24
> [webp @ 0x559661416700] skipping unsupported chunk: ANIM > [webp @ 0x559661416700] skipping unsupported chunk: ANMF
2023-01-24 03:44:28
looks like it's not supported
Demiurge
2023-01-24 03:56:42
Well that's pretty weird...
2023-01-24 03:57:16
ffmpeg does not support animated webp eh?
Traneptora
2023-01-24 03:57:24
on decode, that's correct
2023-01-24 03:57:32
it does on encode (via libwebp)
Demiurge
2023-01-24 03:57:51
Yeah... That seems really weird...
2023-01-24 03:58:06
Ah well
Demiurge
2023-01-25 12:01:54
DjVu can have a DPI of at least up to 6000 according to this encoder I'm testing.
2023-01-25 12:06:34
The default DPI is 300 but you can set it to anything between 72 and 6000 just by passing the --dpi option to the encoder
jjido
2023-01-27 01:15:08
SQOA is an image format derived from QOI. Like the original, it is a single header (two if you enable ANS compression). https://github.com/jido/seqoia
2023-01-27 01:16:18
There was an annoying bug in the rANS code which is why I never advertised it, but I finally fixed it so it should be usable now.
VEG
2023-01-27 03:48:37
https://habr.com/en/post/456476/ - about improved Bluetooth SBC support in Android
2023-01-27 03:48:46
https://issuetracker.google.com/issues/136342164 - a pity that Google doesn't want to implement this small but really useful change
JendaLinda
2023-01-27 06:59:30
Alright, viewing PNGs and JPGs in DOS is easy. Now I've tried something more modern. I've managed to install Windows 98 on 486DX2 66MHz with 16MB of RAM, it took only two hours. I've found out that IrfanView 4.44 should work on Win98 and it has WebP plugin. IrfanView successfully loaded PNG, so far so good, but WebP causes IrfanView to crash immediately. I didn't expect miracles, loading PNGs is slow on this poor little CPU, but it's not even trying. I'm disappointed.
spider-mario
2023-01-27 08:23:22
has anyone looked into the difference in compression efficiency between limited (16-235) and full (0-255) range with x264/x265?
2023-01-27 08:24:24
maybe with a 10-bit source to isolate the effect of converting 8-bit content from one to the other
_wb_
2023-01-27 08:31:49
I suppose it's similar to using a wide gamut colorspace to encode sRGB images in 8-bit. Smaller range of values means smaller files (all else like the quantizer setting being the same). But it also means more banding and lower precision in general...
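A sketch of one way to run that experiment: encode the same (ideally 10-bit) source twice with libx264 at the same CRF, once converted to limited range and once to full range, and compare sizes. Caveat: equal CRF over different value ranges is not the same perceptual operating point, so a real comparison would also need a quality metric. Source name and settings are placeholders.

```python
# Encode the same source with x264 in limited vs full range and compare sizes.
import os
import subprocess

SRC = "source_10bit.mkv"   # placeholder

for name, vf in [("limited", "scale=out_range=limited"),
                 ("full", "scale=out_range=full")]:
    out = f"test_{name}.mkv"
    subprocess.run([
        "ffmpeg", "-y", "-i", SRC,
        "-vf", vf,                                   # convert the pixel range
        "-c:v", "libx264", "-crf", "18", "-preset", "medium",
        "-an", out,
    ], check=True)
    print(name, os.path.getsize(out), "bytes")
```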
jjido
JendaLinda Alright, viewing PNGs and JPGs in DOS is easy. Now I've tried something more modern. I've managed to install Windows 98 on 486DX2 66MHz with 16MB of RAM, it took only two hours. I've found out that IrfanView 4.44 should work on Win98 and it has WebP plugin. IrfanView successfully loaded PNG, so far so good, but WebP causes IrfanView to crash immediately. I didn't expect miracles, loading PNGs is slow on this poor little CPU, but it's not even trying. I'm disappointed.
2023-01-27 09:06:13
Does the 486dx have SSE?
JendaLinda
2023-01-27 09:06:50
That might be an issue.
2023-01-27 09:10:53
I suppose if it was working somehow, it would be very slow. I was just curious how slow it would be.
Demiurge
2023-01-30 04:41:59
I cannot find any information about the alleged patent issues there supposedly were at the beginning of DjVu.
2023-01-30 04:42:59
All I can see is that it was a free format that is older than PDF and available for free earlier than PDF.
2023-01-30 04:43:19
And it has a GPL reference implementation called DjVuLibre
2023-01-30 04:43:41
The reasons for its failure can be attributed to the lack of decent freely available encoding tools.
2023-01-30 04:44:50
The encoding tools available in DjVuLibre do not take advantage of any of DjVu's main advantages.
2023-01-30 04:47:11
Even to this very day, if you want to create an acceptable DjVu file, you have to pay money, or use image editing software to manually divide an image into layers yourself and use the command line tools and inferior, less-optimized GPL encoding tools to combine the layers into a single file.
2023-01-30 04:47:25
And you have to do that for every page.
2023-01-30 04:52:32
There is a `pdf2djvu` tool and a `djvudigital` tool for converting PDF to DjVu but `djvudigital` is illegal to distribute in binary form because that would violate Ghostscript's copyright license. Both tools produce slightly different output and neither of them produce ideal output. One produces output that mangles page metadata such as bookmarks/ToC and the other produces output that mangles page margins and often a larger filesize than the input PDF. Both tools apply lossy compression to the image data in the PDF files. Neither tools produce acceptable output in my opinion.
2023-01-30 04:53:16
With such a poor tooling situation, it's impossible to expect DjVu to gain widespread use and success if it's such a pain in the ass to create a DjVu file.
2023-01-30 04:54:03
No matter how innovative and awesome and capable the format is in theory, if no one can use it and take advantage of that innovation, it might as well not exist at all.
sklwmp
Demiurge There is a `pdf2djvu` tool and a `djvudigital` tool for converting PDF to DjVu but `djvudigital` is illegal to distribute in binary form because that would violate Ghostscript's copyright license. Both tools produce slightly different output and neither of them produce ideal output. One produces output that mangles page metadata such as bookmarks/ToC and the other produces output that mangles page margins and often a larger filesize than the input PDF. Both tools apply lossy compression to the image data in the PDF files. Neither tools produce acceptable output in my opinion.
2023-01-30 07:00:29
To clarify, it's actually the Ghostscript *driver* by AT&T `GSDjVu` that has a license (CPLv1) incompatible with the GPL. That's why they cannot distribute binaries.
Demiurge
2023-01-30 07:12:09
`djvudigital` is not functional without that ghostscript driver, and distributing the driver in binary form theoretically violates ghostscript's copyright license. And since neither copyright holder has publicly clarified or granted a license exception, computer-illiterates (and time-constrained people, and people working in artificially-constrained software environments, etc) who rely on precompiled binary distribution of software, and are incapable or unwilling to compile source, can't even use djvudigital regardless of whether it even has good performance in the first place.
2023-01-30 07:12:37
In my experience `djvudigital` is fast but is constrained to single-threaded operation.
2023-01-30 07:13:11
It also uses the djvulibre encoders, which have not received the same amount of tuning as the commercial encoders.
2023-01-30 08:07:27
It's extremely fast despite being single threaded. It's also mangling the chapter titles in the table of contents.
2023-01-30 08:08:13
Overall I would say it produces unacceptable output.
2023-01-30 08:08:23
If only because of that.
afed
2023-01-31 05:29:39
<@310374889540550660> <@208917283693789185> what about considering adding fpnge to libvips for fast modes? <https://github.com/veluca93/fpnge> https://discord.com/channels/794206087879852103/803645746661425173/1069395996473311242 I didn't do very accurate comparisons, but it was at least about twice as fast as `vips compression=1`, with much better compression for non-palette images, and not just for screenshots. However, I don't know how hard it would be or how much effort it would take to fully implement, but in my opinion that speed and compression is worth it
2023-01-31 05:29:54
a small test on spng images <https://github.com/libspng/benchmark_images/>
```
  884777  large_palette.png
 2877559  large_palette.png.fpnge.png
13730755  large_palette.png.vips0.png
 2493372  large_palette.png.vips1.png
 1259778  large_palette.png.vips6.png
 1218769  large_palette.png.vips9.png
13107563  large_rgb8.png
13806093  large_rgb8.png.fpnge.png
25932709  large_rgb8.png.vips0.png
21406843  large_rgb8.png.vips1.png
15309969  large_rgb8.png.vips6.png
15205517  large_rgb8.png.vips9.png
14874463  large_rgba8.png
15895344  large_rgba8.png.fpnge.png
34575485  large_rgba8.png.vips0.png
25566328  large_rgba8.png.vips1.png
15940463  large_rgba8.png.vips6.png
15879244  large_rgba8.png.vips9.png
 2285461  medium_rgb8.png
 2347682  medium_rgb8.png.fpnge.png
 3950106  medium_rgb8.png.vips0.png
 3663753  medium_rgb8.png.vips1.png
 2838908  medium_rgb8.png.vips6.png
 2827241  medium_rgb8.png.vips9.png
 2934436  medium_rgba8.png
 3093087  medium_rgba8.png.fpnge.png
 6034599  medium_rgba8.png.vips0.png
 4816040  medium_rgba8.png.vips1.png
 3250939  medium_rgba8.png.vips6.png
 3240970  medium_rgba8.png.vips9.png
  216531  small_rgb8.png
  221353  small_rgb8.png.fpnge.png
  375198  small_rgb8.png.vips0.png
  359452  small_rgb8.png.vips1.png
  289068  small_rgb8.png.vips6.png
  288357  small_rgb8.png.vips9.png
  248924  small_rgba8.png
  257845  small_rgba8.png.fpnge.png
  500157  small_rgba8.png.vips0.png
  411361  small_rgba8.png.vips1.png
  291282  small_rgba8.png.vips6.png
  290519  small_rgba8.png.vips9.png
```
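the files above were produced with invocations roughly like these (from memory, not the exact commands; the fpnge binary is the demo encoder built from the repo):
```
vips copy large_rgb8.png large_rgb8.png.vips1.png[compression=1]   # likewise for levels 0/6/9
fpnge large_rgb8.png large_rgb8.png.fpnge.png
```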
DZgas Ж
2023-01-31 12:15:48
It's a pity that Microsoft chose the wrong path for JPEG XR back then
tufty
afed <@310374889540550660> <@208917283693789185> what about considering adding fpnge to libvips for fast modes? <https://github.com/veluca93/fpnge> https://discord.com/channels/794206087879852103/803645746661425173/1069395996473311242 I didn't do very accurate comparisons, but it was at least about twice as fast as `vips compression=1`, with much better compression for non-palette images, and not just for screenshots. However, I don't know how hard it would be or how much effort it would take to fully implement, but in my opinion that speed and compression is worth it
2023-01-31 12:58:42
Hi <@1034873369314730065> , we've just switched to spng as our default PNG library https://libspng.org
2023-01-31 12:59:28
oic, fpnge is an AVX encoder, interesting ... I'll ask the spng author
2023-01-31 01:01:34
thanks for pointing this out!
afed
tufty Hi <@1034873369314730065> , we've just switched to spng as our default PNG library https://libspng.org
2023-01-31 07:08:47
yeah I know, I meant fpnge just for faster speeds and spng for the rest (and for decoding). fpnge also supports SSE4.1 after some updates; not as fast as with AVX2, but still the fastest encoder at this compression level https://github.com/veluca93/fpnge/pull/3
2023-01-31 07:15:00
```
i7-12700K         AVX2   244.678 MP/s   10.787 bits/pixel
i7-12700K         SSE4   171.628 MP/s   10.770 bits/pixel
Ryzen 3900X (VM)  AVX2   156.358 MP/s   10.787 bits/pixel
Ryzen 3900X (VM)  SSE4    92.838 MP/s   10.770 bits/pixel
Atom C2350        SSE4    10.159 MP/s   10.770 bits/pixel
```
3DJ
2023-01-31 08:16:52
any idea which one of these 2 would have (at least in theory) higher quality? 🤔
2023-01-31 08:19:40
hmm if I download the (youtube) video (in JDownloader), I can select an opus track with higher bitrate, but still not sure which one would be better
w
2023-01-31 08:28:04
I don't recall yt having AAC 192
2023-01-31 08:28:18
for normal videos -f 251 is always the best (opus 160)
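e.g. with yt-dlp (URL is a placeholder):
```
yt-dlp -F "https://www.youtube.com/watch?v=..."      # list available formats
yt-dlp -f 251 "https://www.youtube.com/watch?v=..."  # download the ~160 kbps Opus track
```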
afed
2023-01-31 08:33:20
192 for youtube music
3DJ
2023-01-31 08:34:11
ahh, that's right. guess it used some inaccurate estimation of the bitrate, maybe because it uses VBR
2023-01-31 08:42:15
ok so I'm trying to compare the audio quality when uploading lossless WAV vs NVIDIA Shadowplay, which according to this uses Windows' built-in AAC encoder:
> *** As of version 2.4.1.21 the bitrate is now 192 Kbps by default ***
> The information below applies to ver. 2.2.2.0 and older.
> Background information: Shadowplay uses the AAC Encoder included with Windows for audio encoding and by default it uses 96 Kbps as the bitrate. More information is available on the MSDN page.
> <https://msdn.microsoft.com/en-us/library/windows/desktop/dd742785(v=vs.85).aspx>
> The bitrate is variable, but may not be sufficient for every user at 96 Kbps. Fortunately we don’t have to stay at that setting! With a simple hex edit you can change that value to one of the 4 available: 96, 128, 160, 192
<https://linustechtips.com/topic/331710-guide-shadowplay-mods/>
2023-01-31 08:44:41
so the question is: is there a way to use this encoder (or replicate settings using ffmpeg/reference encoder) to convert a reference test WAV file so I can upload it and compare it to uploading the WAV directly?
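I guess I could try ffmpeg, though its native AAC encoder isn't the same as the Windows Media Foundation one, so it'd only be a rough approximation:
```
ffmpeg -i reference.wav -c:a aac -b:a 192k reference_aac192.m4a
# Windows builds of ffmpeg may also expose the Media Foundation encoder as aac_mf:
ffmpeg -i reference.wav -c:a aac_mf -b:a 192k reference_aac192_mf.m4a
```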
2023-01-31 08:46:43
like, I wanna download the (Opus 251/160kbps) audio track and check just how much quality is lost between the WAV and the AAC 192kbps upload
2023-01-31 09:05:59
found this but getting an error when converting <:NotLikeThis:805132742819053610> <https://github.com/SvenGroot/MFEncode>
2023-02-01 02:16:28
figured it out, it requires 48 kHz 16-bit input exclusively
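so converting the reference first with something like this (made-up filenames) should make MFEncode accept it:
```
ffmpeg -i reference.wav -ar 48000 -sample_fmt s16 reference_48k16.wav
```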
yurume
2023-02-02 02:51:17
https://phoboslab.org/log/2023/02/qoa-time-domain-audio-compression oh well
veluca
2023-02-02 02:54:04
knee-jerk reaction: whyyyy
yurume
2023-02-02 02:55:07
while I've been a bit harsh on QOI, I do think he has a knack for picking schemes that somehow appeal to the average programmer through their simplicity
Traneptora
2023-02-02 03:16:41
> with a constant bitrate of 277 kbits/s for Stereo 44100hz.
2023-02-02 03:16:45
;p
2023-02-02 03:17:26
> It does not cut off higher frequencies, like many transform codecs do
2023-02-02 03:17:27
;p
2023-02-02 03:17:35
someone doesn't realize that low-pass at 20 kHz is a good thing
_wb_
2023-02-02 03:18:24
the concept itself does make some amount of sense though, it's basically like a gpu texture format but for audio
Traneptora
2023-02-02 03:18:50
if opus didn't already exist, with a permissive open-source library that works well
2023-02-02 03:18:54
then it would make more sense
_wb_
2023-02-02 03:19:32
does opus decode fast enough to do it on the fly for playing game sound effects?
Traneptora
2023-02-02 03:22:53
yea especially cause it's super low delay
2023-02-02 03:23:22
opus decodes at like hundreds of times realtime
2023-02-02 03:24:33
at least FFmpeg decodes it at 417x realtime, on a quick test sample
2023-02-02 03:25:28
I get about 300x realtime with that one instead
2023-02-02 03:25:58
certainly there are a lot of variables that could be controlled here, but this should give you a rough estimate
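if you want to check on your own machine, something like this with any .opus sample:
```
ffmpeg -benchmark -i sample.opus -f null -
# the progress line reports speed=NNNx, i.e. the realtime multiple
```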
_wb_
2023-02-02 03:55:41
for something like game sounds you need not only super low delay but also low enough complexity to play many samples at the same time without hogging the cpu
yurume
_wb_ for something like game sounds you need not only super low delay but also low enough complexity to play many samples at the same time without hogging the cpu
2023-02-02 04:33:37
this. if you only have a handful of samples playing at any time, opus will be fine. if you have 100s of them, you really want to decode them up front and preload buffers.
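i.e. keep the assets as .opus on disk and decode them to raw PCM buffers at load (or asset-build) time; the CLI equivalent of that step would be something like this, with made-up filenames:
```
opusdec effect.opus effect.wav
# or the same idea with ffmpeg:
ffmpeg -y -i effect.opus effect.wav
```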