JPEG XL

other-codecs

DZgas Ж
2022-08-21 12:56:42
For aomenc in ffmpeg I use
```
ffmpeg.exe -i ... -pix_fmt yuv420p10le -map 0 -ac 2 -c:a libopus -b:a 87k -vbr on -application audio -apply_phase_inv 0 -c:s copy -vf "scale=704:396:flags=spline" -c:v libaom-av1 -cpu-used 4 -crf 25 -lag-in-frames 64 -tune ssim -g 1200 -threads 1 ...
```
or, without audio:
```
ffmpeg.exe -i ... -pix_fmt yuv420p10le -an -vf "scale=704:396:flags=spline" -c:v libaom-av1 -cpu-used 4 -crf 25 -lag-in-frames 64 -tune ssim -g 1200 -threads 1 ...
```
2022-08-21 01:00:43
I've already spent a week compressing 6 seasons of different anime with these parameters, and I'm watching them now. they all have the same problem: the keyframe is placed one frame BEFORE the actual scene cut, and it's not fucking funny
2022-08-21 01:03:10
SVT-AV1 acted even funnier: at speed 6 it simply did not detect any correct keyframes, and at speed 4 it did not detect any at all and encoded all 17 seconds of video with a single keyframe.
2022-08-21 01:27:25
ahahah
2022-08-21 01:27:34
Rav1e just Work
2022-08-21 01:28:01
<:AV1:805851461774475316>
DZgas Ж Rav1e just Work
2022-08-21 02:59:36
but Rav1e just has ugly coding quality.
DZgas Ж 11.886 sec -- the keyframe is SET on one frame before the real key frame
2022-08-21 03:01:16
I was able to solve this problem; now I'll find out which encoding parameter is the reason. it's actually just a **fatal bug**, but I don't know where to report it
2022-08-21 03:09:14
-lag-in-frames 64
2022-08-21 03:09:55
this parameter breaks the keyframe analysis algorithm, but I want to note that the default of 25 is very small.
2022-08-21 03:43:47
I found: another effect of lag-in-frames is the kind of scene detection the encoder decides to use. 0-18: no scene detection. 19-32: scene detection mode 1 is active (due to limited future frame prediction). 33 and higher: scene detection mode 2 is active, since the large number of future references allows the highest level of scene detection present in aomenc, and more information is gathered.
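Those thresholds can be sketched as a tiny lookup, purely for illustration (the ranges come from the quoted description; the function name is made up):

```python
def scene_detection_mode(lag_in_frames: int) -> int:
    """Map aomenc's -lag-in-frames value to the scene-detection mode
    described in the quoted text (0 = none, 1 = limited, 2 = full)."""
    if lag_in_frames <= 18:
        return 0  # 0-18: no scene detection
    if lag_in_frames <= 32:
        return 1  # 19-32: mode 1, limited future-frame prediction
    return 2      # 33+: mode 2, full look-ahead scene detection

# The default of 25 lands in mode 1; -lag-in-frames 64 selects mode 2.
print(scene_detection_mode(25), scene_detection_mode(64))  # → 1 2
```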
2022-08-21 03:51:27
WELL, aomenc scene detection mode 2 does not work correctly - some programmer was off by ONE somewhere in the code, and the keyframe gets assigned shifted back by one frame
BlueSwordM
DZgas Ж -lag-in-frames 64
2022-08-21 03:52:21
Wait, how are you using this setting? <:kekw:808717074305122316>
2022-08-21 03:52:32
This is not present in mainline aomenc, just in aom-av1-psy.
DZgas Ж
BlueSwordM Wait, how are you using this setting? <:kekw:808717074305122316>
2022-08-21 03:53:14
until recently there was a maximum of 48, but about six months ago it was expanded to 96
2022-08-21 03:53:30
but it doesn't matter because this shit is BROKEN
BlueSwordM
DZgas Ж until recently, there was a maximum of 48, but recently, six months ago, it was expanded to 96
2022-08-21 03:53:58
Uh, that's only a thing in aom-av1-psy. It was never expanded to 128 max in mainline aomenc... It's still 48.
DZgas Ж
BlueSwordM Uh, that's only a thing in aom-av1-psy. It was never expanded to 128 max in mainline aomenc... It's still 48.
2022-08-21 03:54:16
128 in aom-av1-psy
BlueSwordM Wait, how are you using this setting? <:kekw:808717074305122316>
2022-08-21 03:54:36
i just set and just use
2022-08-21 03:54:40
all works
BlueSwordM
2022-08-21 03:55:01
Anyway, I'm surprised you're having issues with scene detection honestly. Never had a problem with it.
DZgas Ж
2022-08-21 03:55:54
damn, I'm so angry, because I fkd up a week of work and the whole result has a MISTAKE at every KEY scene
2022-08-21 03:57:07
Okay, I've tested it now, 32 works well, and I'll stop there.
DZgas Ж I found: Another effect of lag-in-frames is the kind of scene detection the encoder decides to choose. 0-18: No scene-detection. 19-32: Scene detection mode 1 is active(due to limited future frame prediction) 33 and higher: Scene detection mode 2 is active due to large number of future references allowing for the highest level of scene detection present in aomenc and more information is gathered.
2022-08-21 04:02:28
by the way, detector 2 really does work better - for example, on a transition to black, the last frame before there is nothing in the frame was correctly identified as the key. BUT the error shifting all the keyframes makes this useless; I hope someone notices it and fixes it.
hotsauce
2022-08-22 10:00:41
Do you think there will be a VIF (Versatile Image Format) a la HEIF (High Efficiency Image Format)?
190n
2022-08-23 02:02:51
would that be needed now that AVIF and JXL exist?
diskorduser
2022-08-23 03:50:18
who knows. they will still release it just for format wars...
The_Decryptor
2022-08-23 03:59:34
HEIF is already used for 3 codecs, another wouldn't "hurt"
_wb_
2022-08-23 05:50:06
VVC in HEIF, why not. But I don't think anyone really believes that patent-encumbered still image formats can get widespread adoption.
DZgas Ж
2022-08-23 09:23:56
vvc🤣 hevc<:ReeCat:806087208678588437> heif<:kekw:808717074305122316> heic
190n
The_Decryptor HEIF is already used for 3 codecs, another wouldn't "hurt"
2022-08-23 09:46:48
HE**V**C is even used in 2 image formats <:PeepoDiamondSword:805394101340078092>
w
2022-08-23 09:47:38
man what is the point of heif
2022-08-23 09:49:03
if none of the heifs are interchangeable
2022-08-23 09:49:25
i guess maybe it's kind of like bmp
2022-08-23 09:49:28
and how jpeg and png can be in bmp
_wb_
2022-08-23 09:50:25
heif is just a container that allows you to do some stuff like layering, tiling, orientation, embedding ICC profiles, etc with payloads that do not necessarily support those things
2022-08-23 09:51:00
also worth noting: Nokia claims it has patents on heif itself (the container format, regardless of payload)
2022-08-23 09:51:52
heif does have ways to signal what will be the payload, so it's relatively easy to see what kind of decoder you'll need
2022-08-23 09:52:19
in a way it's not that different from how video containers work: they also typically allow you to use various payload codecs
w
2022-08-23 09:52:33
will we see jxl in heif
_wb_
2022-08-23 09:52:47
that was a question actually at the previous jpeg meeting
2022-08-23 09:53:01
my opinion was: no, we don't need that
2022-08-23 09:53:45
all the functionality of heif is already supported natively by jxl so there really is no point having an orthogonal mechanism to do things like layers, tiles, orientation etc
2022-08-23 09:54:45
also heif is patent-encumbered and quite verbose - it has a pretty large overhead, which for video is fine but for small images is an issue
w
2022-08-23 09:55:13
now i want jxl in mkv
_wb_
2022-08-23 09:55:21
so the conclusion was: no, we're not going to do anything to define jxl in heif
190n
2022-08-23 09:55:35
thank god
_wb_ also heif is patent-encumbered and quite verbose - it has a pretty large overhead, which for video is fine but for small images is an issue
2022-08-23 09:55:52
yeah making 1482 byte avifs was kinda hard
_wb_
2022-08-23 09:57:04
jxl tracks in a video container could make sense... why not have a nice picture or slideshow of pictures that comes with an audio track and maybe even some subtitles
w
2022-08-23 09:58:02
it looks like mkv facilitates anything given a codec id/name and just the data
2022-08-23 09:58:40
how is adding a codec done for heif?
2022-08-23 09:59:37
or will i have to pay to find out
_wb_
2022-08-23 10:00:00
ironically the heif spec is not behind a paywall
2022-08-23 10:00:15
somehow they managed to convince iso to drop the paywall on it
2022-08-23 10:01:02
(but if you want to deploy it, presumably you need to pay licensing fees to Nokia)
w
w and how jpeg and png can be in bmp
2022-08-23 10:05:50
bmp is kinda funny but the wikipedia page is my favorite among formats <https://en.wikipedia.org/wiki/BMP_file_format>
JendaLinda
2022-08-23 10:58:30
AVI is partially based on BMP.
hotsauce
2022-08-23 06:47:17
VVC in HEIF as VIF/VIC would be interesting performance wise though. Not sure how much of the claimed ~50% improvement would be relevant once you take away the tricks for video and only have the still image improvements.
_wb_
2022-08-23 07:17:44
Everything always claims 50% improvement but that's usually only using some irrelevant metric like PSNR, and comparing the slowest possible encoder of [new thing] against the worst possible encoder of [old thing].
BlueSwordM
_wb_ Everything always claims 50% improvement but that's usually only using some irrelevant metric like PSNR, and comparing the slowest possible encoder of [new thing] against the worst possible encoder of [old thing].
2022-08-23 07:30:21
It's also only true at very high resolutions (4K+), but that's perfectly fine for images since they can easily be as large or much larger than that. To be fair though, they compare reference implementations vs reference implementations. To get 50% coding gain from AVC to HEVC in video, for example, MPEG members tested HM (h.265) vs JM (h.264) at 4K, which is how they got the 45-50% better coding gains (according to PSNR and SSIM). The gains are lower for 1080p (25-30%).
_wb_
2022-08-23 08:49:39
ok, but still, PSNR and SSIM are pretty easy to fool
BlueSwordM
2022-08-23 09:06:24
Oh I know, but it makes sense if you look at how they evaluate stuff 😄
2022-08-23 09:07:28
What you said on Twitter is very interesting in that regard though.
DZgas Ж
2022-08-25 07:54:14
are there any reasons why you are seriously talking about AVC and HEVC? lol
2022-08-25 07:56:39
only Apple and *torrents* use hevc
2022-08-25 07:57:35
for obvious reasons, some TVs support HEVC
2022-08-25 07:59:01
the latest versions of SVT-AV1 at speeds of 9-13 are ahead of AVC in both speed and quality, almost comparable to HEVC
2022-08-25 08:00:05
the gap is so gigantic that now I don't even know why HEVC continues to exist
2022-08-25 08:01:12
AVC remains the fastest encoder even on its slowest presets - SLOW and below
w
2022-08-25 08:02:28
uhdbd uses hevc
DZgas Ж
w uhdbd uses hevc
2022-08-25 08:04:32
well, it will be difficult to change: decoding 4k AV1 needs a more powerful player, and there are already ten years of usage history behind HEVC
2022-08-25 08:08:20
(In fact, I have never touched BD discs in my life, so I have nothing to say here; high-speed Internet appeared in Russia so early that after DVD we immediately switched to using only the Internet)
JendaLinda
2022-08-25 10:48:30
TV broadcasters are very good friends with MPEG, so they won't change their mind any time soon. They will be using their beloved AVC, HEVC, VVC and AAC forever. That means support for those codecs is mandatory in TVs to decode terrestrial and satellite TV signals. The situation is similar in digital radio, which relies on AAC and HE-AAC.
The_Decryptor
2022-08-25 10:55:24
I wonder how slow the rollout of VVC will be, would broadcast TV even be relevant by then?
2022-08-25 10:55:49
Like here (Australia) the rollout has been so slow that it's been eclipsed by streaming
JendaLinda
2022-08-25 11:21:36
Here in the middle of Europe, terrestrial TV is still relevant and it's a reliable way to let people watch ads. Terrestrial TV is free to air. Most channels are in SD; only the state TV is HD. Whoever needs more HD channels can get cable, satellite, or an IPTV service. I suppose there will be motivation to switch to a newer codec so more TV channels fit in the spectrum and TV stations can sell more ads. The next logical step is VVC, though most of the content would still be in SD.
_wb_
2022-08-25 11:23:23
How much compression advantage does vvc really bring though in SD? Assuming realistic encoders, no superslow software encoders...
JendaLinda
2022-08-25 11:24:17
Not much, but as long as people are willing to watch it, it's worth it.
2022-08-25 11:25:51
HEVC is used for SD as well.
2022-08-25 11:28:28
After switching to VVC they will just cut the bitrate to half and call it a day.
2022-08-25 11:40:50
Most people are using some kind of paid TV service anyway. Internet providers usually provide Internet, TV and telephone in one service. So FTA TV is meant for people who don't use internet or don't want to pay for TV.
diskorduser
2022-08-25 12:02:50
In my country satellite TV is popular and the satellite set-top boxes come with HEVC support.
2022-08-25 12:03:54
only the fhd channels use h265 though.
JendaLinda
2022-08-25 12:07:49
Satellite TV is paid as well. Getting TV and internet in one service is usually better value.
diskorduser
2022-08-25 12:10:39
yeah I'm going for internet+tv from next year.
JendaLinda
2022-08-25 12:27:13
Speaking of Bluray, this seems to be pretty niche product as optical drives are disappearing from computers and game consoles. I wouldn't use Bluray movies as a good example of codec usage. After all, a Bluray movie can be also encoded in MPEG2 or VC-1 aka Windows Media Video.
Demez
DZgas Ж only Apple and *torrents* use hevc
2022-08-25 03:51:28
I personally use x265 for encoding my own videos
2022-08-25 03:52:01
my phone can record to hevc as well
spider-mario
2022-08-25 03:57:36
4K Blu-rays (required for HDR) use H.265
JendaLinda
2022-08-25 04:41:00
My old camera was recording 720p video in MJPEG. The video files were huge. I needed some way to shrink those videos using "visually lossless" compression. There were few codecs to choose from. There was XviD but it was already too obsolete. Also hardware decoders were pretty buggy with XviD. Then there was AVC with decent hardware support and x264 encoder was already well developed. Then there was HEVC, very new, poorly supported in HW. And AV1 was not a thing yet. So AVC was the perfect choice. At that point, ffmpeg didn't have a stable AAC encoder yet, there was only some experimental one. So I just used libmp3lame for audio. MP3 doesn't seem to be a problem in MP4 files.
spider-mario
2022-08-25 05:17:33
nowadays, my go-to AAC encoder is fdk, but unfortunately it means having to build ffmpeg from source because of the license
JendaLinda
2022-08-25 06:05:31
Yeah there were some external AAC codecs available but using MP3 was easy enough. Honestly at 128kbps for mono audio from the camera mic it doesn't make much difference.
190n
DZgas Ж only Apple and *torrents* use hevc
2022-08-26 12:05:49
i think a lot of streaming services use hevc for 4k and/or hdr
JendaLinda
2022-08-26 10:14:16
That makes sense, it's pretty much the only option that will work in most TVs. CPUs in TVs are pretty underpowered, so relying on software decoding wouldn't be a good idea.
2022-08-26 10:26:28
I can imagine that another option to deal with Motion-JPEG would be lossless transcoding to Motion-JXL. I'm not sure if MJXL is a thing or if it would be useful at all. I also suppose it wouldn't include JPEG reconstruction data, as that would make things much more complicated.
Fraetor
2022-08-27 01:46:13
There was some talk about it a few months ago here. It was mostly decided that ffv1 is better for most lossless video purposes, but there may be a place for a stream of jxl frames for low latency applications, especially with something like fjxl's encode speed.
DZgas Ж
The_Decryptor I wonder how slow the rollout of VVC will be, would broadcast TV even be relevant by then?
2022-08-28 01:54:33
too expensive, I read about the progress of EVC in the TV industry
JendaLinda TV broadcasters are very good friends with MPEG so they won't change their mind any time soon. They will be using their beloved AVC, HEVC, VVC and AAC forever. That means support for those codecs is mandatory in TVs to decode terrestrial and satellite TV signals. Similar situation is in digital radio which relies on AAC and AAC HE.
2022-08-28 01:55:39
read about EVC
_wb_ How much compression advantage does vvc really bring though in SD? Assuming realistic encoders, no superslow software encoders...
2022-08-28 02:02:11
at release, an advantage of 1 percent over AV1 was claimed, but now we have this graph for 4k video, and here you can see that VVC's efficiency only shows up at ultra-slow encoding, with about 10% better compression than av1. this is so slow that it would take about a day for 60 seconds of 360p video on a 64-core threadripper
2022-08-28 02:05:26
I am simply amazed by the metrics claiming VVC is "faster" at roughly SVT-AV1 speed 4 level. I use speed 4 as the maximum-quality setting, beyond which encoding already takes unbearably long; and with all this there is no point in VVC, because at its "FAST" settings it loses to AV1 in quality
yurume
2022-08-28 02:06:21
(slight off-topic: do all of them scale well to 96 cores after all?)
DZgas Ж
yurume (slight off-topic: do all of them scale well to 96 cores after all?)
2022-08-28 02:07:00
none of them scales well at all
2022-08-28 02:08:44
AVC scales well and HEVC is also not bad, but starting with VP9, codecs began to take the whole image into account for compression efficiency, so parallelization now always means directly splitting the frame into parts, which noticeably degrades quality
2022-08-28 02:09:44
large video encoding jobs are now done like this - divide the video into 1-minute pieces, encode each piece single-threaded on one core, then glue everything back together and get an ideal video
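A minimal sketch of that chunked workflow with ffmpeg (file names, the 60-second chunk length, and the encoder flags are placeholders, not a tested pipeline):

```shell
# 1. Split the source into ~1-minute pieces without re-encoding.
#    Note: -c copy can only cut at existing keyframes, so boundaries
#    are approximate; tools like av1an split at scene changes instead.
ffmpeg -i input.mkv -map 0 -c copy -f segment -segment_time 60 \
       -reset_timestamps 1 chunk_%04d.mkv

# 2. Encode every chunk single-threaded, one chunk per core
#    (GNU parallel shown here as one way to fan the jobs out).
ls chunk_*.mkv | parallel "ffmpeg -i {} -c:v libaom-av1 -cpu-used 4 -crf 25 -threads 1 enc_{}"

# 3. Concatenate the encoded chunks back into one file.
for f in enc_chunk_*.mkv; do echo "file '$f'"; done > list.txt
ffmpeg -f concat -safe 0 -i list.txt -c copy output.mkv
```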
JendaLinda
2022-08-28 02:12:15
I would be very surprised if next generation DVB used AV1 and Opus. They are using industrial hardware encoders.
DZgas Ж
JendaLinda I would be very surprised if next generation DVB used AV1 and Opus. They are using industrial hardware encoders.
2022-08-28 02:13:03
oh, if it made sense to change the DVB standard at all
JendaLinda
2022-08-28 02:14:32
There will be need for higher density codecs as frequencies used for TV broadcast are repurposed for mobile data.
DZgas Ж
2022-08-28 02:15:05
people want HD channels, and this is no longer a problem: if there were some kind of bandwidth deficit, 4 ordinary SD channels could run instead of one HD channel. besides, replacing an entire system that has been running for two decades is, well, so-so; for SD content AVC performs perfectly
JendaLinda
2022-08-28 02:17:35
They changed the DVB standard once, so they can do it again.
DZgas Ж
JendaLinda They changed the DVB standard once, so they can do it again.
2022-08-28 02:18:16
it's been so long<:H264_AVC:805854162079842314>
JendaLinda
2022-08-28 02:18:24
We have SD in HEVC now.
DZgas Ж
JendaLinda We have SD in HEVC now.
2022-08-28 02:18:44
ha ha, no
2022-08-28 02:19:29
AVC is better than HEVC on SD content, that's the bitter truth. and on top of that, AVC encodes about 10 times faster
JendaLinda
2022-08-28 02:19:57
Yes but they're using HEVC anyway.
DZgas Ж
2022-08-28 02:20:19
well money moment
JendaLinda
2022-08-28 02:20:49
Indeed, it forced people to replace their TVs or buy conversion boxes.
DZgas Ж
2022-08-28 02:20:50
new technologies mean the need to change everything, replace everything, rebuild everything, and pay for everything
2022-08-28 02:21:31
<:Stonks:806137886726553651>
JendaLinda
2022-08-28 02:22:39
Still better than MPEG2 that was used before. AVC was in fact completely skipped.
DZgas Ж
JendaLinda Still better than MPEG2 that was used before. AVC was in fact completely skipped.
2022-08-28 02:23:14
what are you talking about now?
JendaLinda
2022-08-28 02:24:32
Before HEVC, DVB was running in MPEG2.
DZgas Ж
diskorduser only the fhd channels use h265 though.
2022-08-28 02:24:40
Yes, that's right. That's how it should be, it's wise
JendaLinda Before HEVC, DVB was running in MPEG2.
2022-08-28 02:25:18
no
2022-08-28 02:27:46
DVB-T (1997) uses MPEG2; DVB-T2 (2003) uses AVC aka h.264
2022-08-28 02:29:38
DVB-S2X, an extension of the DVB-S2 satellite digital broadcasting standard (2014), can use HEVC
JendaLinda
2022-08-28 02:30:10
Yes it was. As you said, it's necessary to change everything, so they were using DVB-T MPEG2 until recently and then they switched to DVB-T2 HEVC. Completely fresh start.
DZgas Ж
JendaLinda Yes it was. As you said, it's necessary to change everything, so they were using DVB-T MPEG2 until recently and then they switched to DVB-T2 HEVC. Completely fresh start.
2022-08-28 02:31:08
DVB-T2 came out in 2003 and hevc in 2013 what's wrong with you?
JendaLinda
2022-08-28 02:32:10
Because my country was using DVB-T long after its obsolescence.
DZgas Ж
2022-08-28 02:32:16
u from france?
JendaLinda
2022-08-28 02:32:55
I'm from Czechia
DZgas Ж
2022-08-28 02:33:28
https://www.mpo.cz/en/guidepost/for-the-media/press-releases/czechia-switched-to-2nd-generation-of-digital-broadcasting-and-completed-the-transition-to-dvb-t2--257698/#:~:text=The%20Czech%20Republic%20has%20completely,a%20better%2Dquality%20digital%20broadcasting.&text=It%20concerned%20at%20least%202.5,used%20a%20free%20terrestrial%20television
2022-08-28 02:33:42
this is extremely unique
2022-08-28 02:34:58
yes....so before that you didn't even have HD TV
JendaLinda
2022-08-28 02:36:26
We did not. HD is considered premium, you have to pay for it. Only the state TV is HD in DVB-T2.
DZgas Ж
JendaLinda We did not. HD is considered premium, you have to pay for it. Only the state TV is HD in DVB-T2.
2022-08-28 02:39:55
well...okay
Demez I personally use x265 for encoding my own videos
2022-08-28 02:42:16
Me too - I used to, but then I just thought, why am I doing this, and started recording original videos on my smartphone at 640x480 resolution, because I don't need more anymore
2022-08-28 02:43:47
the value of a pixel is more important than the quantity of them
DZgas Ж the value of a pixel is more important than the quantity of them
2022-08-28 02:45:05
that's also true when designing televisions
JendaLinda Yeah there were some external AAC codecs available but using MP3 was easy enough. Honestly at 128kbps for mono audio from the camera mic it doesn't make much difference.
2022-08-28 02:46:23
128 for mono is really more than enough, and there is not much of a problem with the sound because the quality of the microphones is also so-so
JendaLinda
2022-08-28 02:48:12
I actually used only 96 kbps as the camera mic was recording only 22050 Hz
fab
DZgas Ж well...okay
2022-08-28 02:49:38
What is this font? i seem to know it
JendaLinda
2022-08-28 02:50:36
It's LCD font used by those basic character LCD screens.
DZgas Ж
JendaLinda I can imagine that another option to deal with Motion-JPEG would be lossless transcoding to Motion-JXL. I'm not sure if MJXL is a thing of if it would be useful at all. I also suppose it wouldn't include JPEG reconstruction data as it would make things much more complicated.
2022-08-28 02:52:13
Motion-JPEG is the fastest way to create video in general, which is why it was used in video surveillance in the 1990s - it was cheap. now, of course, we can say that AVC is the fastest at its low-power ULTRAFAST-VERYFAST presets. my OLD processor has hardware JPEG encoding, and I used it when I experimented with rendering in SONY VEGAS 13: with a giant-sized video, AVC encoded extremely slowly, while JPEG coped several times faster. then I stitched hundreds of thousands of pictures into MJPEG via FFMPEG, attached an AAC track, and sent it to YouTube. surprisingly, there was not even any desynchronization.
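That JPEG-to-MJPEG stitching step can be sketched as a single ffmpeg invocation (file names and frame rate are placeholders; -c:v copy wraps the existing JPEG frames as an MJPEG stream without re-encoding):

```shell
# Wrap a numbered JPEG sequence into an MJPEG video stream without
# re-encoding, then mux in a separate AAC audio track.
ffmpeg -framerate 30 -i frame_%06d.jpg -i audio.aac \
       -c:v copy -c:a copy -shortest output.avi
```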
fab What is this font i seem to know
2022-08-28 02:53:00
https://t.me/my_content/5514
2022-08-28 02:53:18
JendaLinda
fab What is this font i seem to know
2022-08-28 02:54:06
Font used in HD44780
DZgas Ж
2022-08-28 02:54:15
pf
fab
2022-08-28 02:54:34
Don't say insult
DZgas Ж
2022-08-28 02:54:55
<:PeepoDiamondSword:805394101340078092>
JendaLinda
DZgas Ж Motion-JPEG this is the fastest way to create video in general, which is why it was used in video surveillance in the 1990s because of its cheapness. now, of course, we can say that AVC is the fastest on its low-power ULTRAFAST-VERYFAST. my OLD processor has JPEG hardware Encoding and I used it when I experimented with rendering in SONY VEGAS 13 when a giant-sized video and AVC just encodes it extremely slowly, jpeg coped several times faster, then stitched hundreds of thousands of pictures into MJPEG via FFMPEG, attached a aac track, and sent it to YouTube. surprisingly, there was not even any desynchronization.
2022-08-28 02:55:58
I guess the reason was the camera already had JPEG codec so they used it for video as well.
DZgas Ж
Fraetor There was some talk about it a few months ago here. It was mostly decided that ffv1 is better for most lossless video purposes, but there may be a place for a stream of jxl frames for low latency applicatons, especially with something like fjxl's encode speed.
2022-08-28 02:56:11
I don't think JPEG XL can be fast enough to compete in speed with jpeg for creating hundreds of thousands of images in a video
JendaLinda I guess the reason was the camera already had JPEG codec so they used it for video as well.
2022-08-28 02:56:59
the reason was rather the absence of alternative technologies, or their complexity
fab
DZgas Ж https://t.me/my_content/5514
2022-08-28 02:57:24
Watched but didn't join
2022-08-28 02:57:47
I don't know czech
DZgas Ж
2022-08-28 02:57:59
heh
2022-08-28 02:58:19
2022-08-28 02:58:36
2.0.0 - Applejack
3.0.0 - Braeburn
3.1.0 - Celestia
3.2.0 - DaringDo
3.3.0 - Electric-Sky
3.4.0 - Fluttershy
(libaom version naming)
JendaLinda
2022-08-28 03:01:00
Minecraft indeed uses font based on HD44780.
yurume
2022-08-28 03:01:46
(and GNU Unifont I think?)
JendaLinda
2022-08-28 03:03:18
It seems so.
DZgas Ж
JendaLinda Minecraft indeed uses font based on HD44780.
2022-08-28 03:09:48
too much difference
2022-08-28 03:10:31
JendaLinda
2022-08-28 03:11:16
It's the same idea. There's not much you can do with 5x8 pixels.
DZgas Ж
DZgas Ж
2022-08-28 03:12:31
in fact, the minecraft font is now a giant unicode database, with characters for all European languages and Asian languages, and a lot of things in it are funny - that's why it weighs so much
2022-08-28 03:13:14
and the minecraft font exists only as a picture in minecraft game, but here are the instructions
2022-08-28 03:14:45
in general, the font itself cannot be bought anywhere, and since minecraft > mojang > microsoft owns it all, it cannot be massively distributed
JendaLinda
2022-08-28 03:23:54
The unicode font in Minecraft actually seems to be based on https://unifoundry.com/unifont/
Fraetor
DZgas Ж I don't think JPEG XL can be fast enough to compete in speed with jpeg for creating hundreds of thousands of images in a video
2022-08-28 05:06:43
fjxl can do about 75 fps at 4K based on the numbers veluca posted with their CPU. I can't find any benchmarks of libjpeg-turbo on similar hardware, though I imagine it will beat fjxl by a bit. It isn't really a direct comparison anyway, as fjxl is lossless.
Demez
DZgas Ж Me too, I used to, then I just thought why am I doing this. and I started recording original videos on my smartphone in 640x480 resolution because I don't need to anymore
2022-08-28 05:06:46
interesting choice
2022-08-28 05:09:11
except for me, it's nearly all clips recorded with replay buffer from obs, usually with 2 monitors in the video, leading to 3840x1080, so I don't think I would ever record in a lower res for that
_wb_
2022-08-28 05:25:34
libjpeg-turbo doesn't do multithreaded encode afaik, and I don't think it does more than ~200 Mpx/s
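Putting the two throughput figures above into the same units (frame size and fps taken from the messages; this glosses over single- vs multi-threaded operation):

```python
# fjxl: ~75 fps at 4K (3840x2160), per the numbers quoted above.
fjxl_mpx_per_s = 3840 * 2160 * 75 / 1e6   # ≈ 622 Mpx/s, multithreaded

# libjpeg-turbo: ~200 Mpx/s single-threaded, per the estimate above.
turbo_mpx_per_s = 200.0

print(f"fjxl ~{fjxl_mpx_per_s:.0f} Mpx/s, libjpeg-turbo ~{turbo_mpx_per_s:.0f} Mpx/s")
```

On those numbers, multithreaded fjxl lands at roughly 3x the single-threaded libjpeg-turbo figure, which is consistent with turbo still winning per core.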
DZgas Ж
2022-08-28 05:34:03
🤔
Demez interesting choice
2022-08-28 05:37:16
almost everywhere on the Internet I can upload these videos at source quality; for YouTube I upscaled them 2x with spline. better algorithms were available, but the important thing was that YouTube did not squeeze my amazing 640x480 into a block-pixel mess
2022-08-28 05:39:36
by experimentally uploading dozens of videos, I found out that vertical 480x640 magnified 2x to 960x1280 is recognized by YouTube as "1080p"
JendaLinda
2022-08-28 08:52:58
I was looking into TIFF as some people still use it. TIFF offers multiple codecs, so I've tried them out. There are multiple codecs optimized for B&W images, but those are all pretty much worthless; some of them produced an even bigger file than the uncompressed image. JPEG encapsulated in TIFF doesn't make much sense to me. Of the supported lossless codecs, Deflate seems to be the best, although TIFF doesn't seem to use filtering the way PNG does, and the compression suffers from that.
_wb_
2022-08-28 09:02:46
TIFF only has West (Left) filtering, like jpeg DC
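The difference even that single West/Left predictor makes for Deflate can be shown on a synthetic scanline (a sketch, not TIFF code; the ramp-plus-noise data just stands in for a smooth gradient):

```python
import random
import zlib

# Synthetic 8-bit scanline: a slow ramp plus a little noise,
# standing in for a smooth photographic gradient.
random.seed(42)
raw = bytes((i // 16 + random.randrange(3)) % 256 for i in range(4096))

# West/Left prediction: store each byte as the difference from its
# left neighbour (mod 256) - TIFF's horizontal-differencing predictor.
filtered = bytes([raw[0]] + [(raw[i] - raw[i - 1]) % 256
                             for i in range(1, len(raw))])

# The deltas cluster near zero, so Deflate has far less entropy to code
# and the filtered line compresses noticeably better.
print(len(zlib.compress(raw)), len(zlib.compress(filtered)))
```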
yurume
2022-08-29 11:31:12
> WebP still only has one implementation (libwebp) as far as I know, and I don't know of any efforts to change that situation.
2022-08-29 11:31:42
Jon said this in the Chromium issue, and I want to mention that image-rs does have its own webp decoder implementation (I'm very surprised about that)
2022-08-29 11:34:48
unfortunately (for webp) though, that decoder doesn't do chroma yet and has caused lots of headaches for years (cf. https://github.com/image-rs/image/issues/1648)
_wb_
2022-08-30 06:01:04
Oh, interesting, I didn't know there was a WebP reimplementation there.
Morpholemew
2022-08-30 12:28:23
There's a webp decoder written in Go too: https://pkg.go.dev/golang.org/x/image/webp (with https://pkg.go.dev/golang.org/x/image/vp8 and https://pkg.go.dev/golang.org/x/image/vp8l )
yurume
2022-08-30 12:34:12
wait, not just webp but also the entirety of vp8?
2022-08-30 12:34:21
heh
_wb_
2022-08-30 12:35:54
how were these created though? independently from spec, or by taking libwebp source code and transpiling it to a different programming language?
0xC0000054
2022-08-30 12:36:44
ImageSharp has a WebP decoder/encoder written entirely in C#. https://github.com/SixLabors/ImageSharp
_wb_
2022-08-30 12:43:54
oh wow indeed
2022-08-30 12:44:19
and even looks like they did it from spec
Morpholemew
yurume wait, not just webp but also the entirety of vp8?
2022-08-30 01:16:38
Looks like just the necessary bits for webp, but I only skimmed, I didn't read the whole thing.
_wb_ how were these created though? independently from spec, or by taking libwebp source code and transpiling it to a different programming language?
2022-08-30 01:19:44
The spec is referenced, but I don't know if it's a reimplementation from scratch or a manual conversion of the C code. It's far too clean to be an automatic transpile.
_wb_
2022-08-30 01:23:20
automatic transpiling is improbable for anything except emscripten-style stuff. But it makes a big difference if you start from the reference code (using the spec only in a secondary way, e.g. to help understand the reference code), or if you start from the spec and basically don't look at the reference code
Morpholemew
_wb_ automatic transpiling is improbable for anything except emscripten-style stuff. But it makes a big difference if you start from the reference code (using the spec only in a secondary way, e.g. to help understand the reference code), or if you start from the spec and basically don't look at the reference code
2022-08-30 01:56:37
Looking through the git history, I found an instance where the author filed a bug against the spec where it didn't match libwebp. So I'm thinking golang.org/x/image/webp et al was probably implemented from the spec rather than the code. https://bugs.chromium.org/p/webp/issues/detail?id=205
2022-08-30 01:57:27
(at least for lossless)
improver
2022-08-30 02:03:44
important point though is that these implementations were done post-browser-adoption, i think
_wb_
2022-08-30 02:43:29
certainly post chrome adoption
DZgas Ж
2022-08-31 06:16:44
modern x264 beats vp9 at any settings up to 720p at the same encoder speeds 🍖
2022-08-31 06:21:24
modern x264 beats vp9 and x265 and svt-av1 at any settings up to 720p at the same encoder speeds 🍖 🍖 🍖
2022-08-31 06:24:03
svt-av1 at speed 10-12 has so many problems: even at CRF "1" quality there are gross block-motion errors - even on an almost lossless picture there can be a grossly noticeable artifact of incorrect block movement, which makes SVT-AV1 completely useless for encoding **intermediates** ANYWHERE
2022-08-31 06:28:07
VP9 has a giant problem with block motion. I would say it's a lack of motion information: the encoder spends a lot of bytes on the information inside the blocks, the image itself, and there are extremely few bytes left for correct motion vectors. I don't even know how it could be fixed
2022-08-31 06:29:37
x265 is just a shame: even comparing placebo presets, x265 turns out worse than x264. How? And it's much slower at the same time.
2022-08-31 06:30:15
all this applies to frame sizes of 540p, 480p, 360p, 240p, 144p
2022-08-31 06:31:28
AOMENC starts coding better at speeds of 4-6 while being hundreds of times slower
2022-08-31 06:33:09
for fullHD quality/speed content, it is better to use SVT-AV1 at speed 6 than AOMENC at speed 8
Fox Wizard
2022-09-01 06:59:28
Aomenc above speed 6 is... ew anyways <:KekDog:884736660376535040> ~~personally I don't want to go lower than 4 and prefer to stay at 2 or 3 which is slow af XD~~
DZgas Ж
2022-09-01 12:14:48
288x144, same FILE size, same encoding time, the last frame before the new keyframe: aomenc speed 5 / hevc slow / avc placebo+
2022-09-01 12:16:58
_wb_
2022-09-01 12:23:55
sigh, so much oversmoothing in both av1 and hevc (but more in hevc)
2022-09-01 12:25:27
I think video codec devs have been way too focused on appeal (avoiding visible compression artifacts) and on bad objective metrics like PSNR (which is very tolerant for smoothing)
DZgas Ж
_wb_ sigh, so much oversmoothing in both av1 and hevc (but more in hevc)
2022-09-01 12:30:40
Yeah, it's a giant problem. But the worst thing is that a large number of old, fast techniques in AVC work better than the few Powerful New ones in AV1 and HEVC. VP9 is built on this (several big royalty-free techniques), but in the end it looks worse than a seemingly overloaded AVC
_wb_
2022-09-01 12:32:12
it's quite similar to how a good jpeg encoder (mozjpeg) can be quite competitive with some way more modern formats (webp,avif)
DZgas Ж
_wb_ it's quite similar to how a good jpeg encoder (mozjpeg) can be quite competitive with some way more modern formats (webp,avif)
2022-09-01 12:35:06
webp is much more complicated than jpeg if we compare them; it's like XVID vs AVC: webp can be faster than JPEG while compressing better
2022-09-01 12:36:26
I am now testing webp at maximum speed against mozjpeg at the same
Fraetor
2022-09-01 12:45:17
How old is AVC? Would AV2 or 3 be able to take its best bits?
DZgas Ж
_wb_ it's quite similar to how a good jpeg encoder (mozjpeg) can be quite competitive with some way more modern formats (webp,avif)
2022-09-01 12:46:10
after squeezing 1000 photos, I found that the fastest webP is 1.7 times slower than "slow" jpeg. The quality is mostly better with WEBP, but in this picture, for example, it's not so clear-cut; for photos there is no question. Jpeg q30 Webp q30
Fraetor How old is AVC? Would AV2 or 3 be able to take its best bits?
2022-09-01 12:46:25
AVC? maybe AV1?
_wb_
2022-09-01 12:47:05
AVC = h264 is from 2004
DZgas Ж
Fraetor How old is AVC? Would AV2 or 3 be able to take its best bits?
2022-09-01 12:53:01
I have no confidence that AV2 will be released in the next 10 years at all. There are many signs that progress is stalling, as inevitably as in processor manufacturing, where the most cost-effective node was 28 nanometers (after all, the best speed-per-price you can buy now are those very 28 nm Xeons that are already 7+ years old), when boards were cheapest to produce. Since then processors have become more expensive and need more cooling; things are so bad that a powerful laptop will switch off 3 of its 4 cores just so it lasts at least a day and doesn't burn out. Oh, how difficult everything is now. The problem with AV1 is simply terrifying: at the speeds where its compression Power is effective, it is **hundreds** of times less effective in speed. And that's already the case now, at parameters FAR from the maximum
2022-09-01 12:54:14
as it was 4 years ago, so it remains now: the real power of av1 stays only in the hands of the 50+ corporations that made it, who have enough compute and who serve their content to millions of people at once, like Netflix and YouTube
2022-09-01 01:00:41
I made a deal with a friend and borrowed his 16-core Ryzen to compress several movies and anime to AV1. A whole week of work. It really is good, wonderful quality and file size, but the compute is so expensive that I don't even know if it made sense; the same power could have compressed 100 times more hours of video that would weigh 50% more at the same quality...
2022-09-01 01:03:52
I don't even know what to do with this content now
Fraetor
_wb_ AVC = h264 is from 2004
2022-09-01 01:13:56
Given that, there shouldn't be any patent issue with nabbing the best bits of AVC for AV2 then.
2022-09-01 01:14:05
Which will be nice, though I don't know if the underlying models are compatible in a way where that would be useful.
DZgas Ж
Fraetor Given that, there shouldn't be any patent issue with nabbing the best bits of AVC for AV2 then.
2022-09-01 01:16:21
you're confusing things
_wb_
2022-09-01 01:16:30
I think AV1 probably already includes the best bits of AVC — it was published in 2004 but any patents related to it would be older, most probably old enough to already have been expired (or close enough to being expired) when AV1 was created
DZgas Ж
_wb_ I think AV1 probably already includes the best bits of AVC — it was published in 2004 but any patents related to it would be older, most probably old enough to already have been expired (or close enough to being expired) when AV1 was created
2022-09-01 01:17:29
this is completely wrong, even HEVC does not include many parts from AVC
Fraetor
2022-09-01 01:18:06
Ah, good to know.
DZgas Ж
2022-09-01 01:21:11
this is the problem: new codecs are thoroughly not built on the old ones, they take only the most powerful and Expensive new tools. Obviously all codecs have DCT as a base, but, for example, some 4x4-pixel prediction and analysis that gave 1-2% efficiency, the new codec just doesn't Have it; we hope that another BIG tool will do better
2022-09-01 01:23:51
for a very long time I personally couldn't understand why everything is like this??? Now they have made the MPEG-5 EVC codec which, in theory, uses all the royalty-free Old technologies and is a competitor to HEVC, but even so AVC is still faster and better at its speeds
2022-09-01 01:24:57
but it annoys me that NO ONE supports it, it's a free codec with open documentation, but I can't even play it anywhere
2022-09-01 01:25:05
2 years
2022-09-01 01:25:21
https://github.com/mpeg5/xeve
_wb_
2022-09-01 01:26:06
I am talking about bitstream coding tools, not about clever encoder implementation stuff.
DZgas Ж
DZgas Ж https://github.com/mpeg5/xeve
2022-09-01 01:26:32
I would say it's **literally** AVC FOR FULL HD
_wb_ I am talking about bitstream coding tools, not about clever encoder implementation stuff.
2022-09-01 01:27:16
Wait, what's the difference?
2022-09-01 01:28:18
for example, AOMENC, SVT-AV1 and RAV1E are encoders that use the same tools in different ways
2022-09-01 01:30:05
AVC has a lot of proprietary encoders, but as you know, the most powerful free one at this time is x264
2022-09-01 01:36:52
do you know what annoys me? The principle of building codecs now. How was it before? They took a technology and improved it over time, made it faster and more efficient, squeezing the maximum out of the algorithms; JPEG is an example. And now they take extremely powerful, excessive technology and **cut** it, they try to **cut** it down so much that it can at least be used. It seems to me that it's because of this approach that there are so many problems with codec performance right now
2022-09-01 01:38:14
av1 is a good codec, but you need a supercomputer to unleash its real abilities, because that's exactly the level of technology in it
_wb_
2022-09-01 01:57:31
I don't know much about video codecs, but e.g. in image codecs: DCT and coefficient quantization are bitstream-level coding tools, but how an encoder makes use of it (just naively, doing deadzone quantization, trellis quantization, etc) is something else.
2022-09-01 01:57:59
E.g. mozjpeg is doing some fancy encoder tricks that could also be applied in libjxl but are currently not used (yet)
2022-09-01 01:59:24
so basically there's 1) the bitstream expressivity and 2) the clever heuristics and tricks encoders use to perform good compression (perceptually optimized etc)
2022-09-01 02:00:26
1) is something that tends to only improve, but 2) is something that tends to get reset to zero in each new codec generation
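To make the bitstream-vs-encoder distinction concrete, here is a minimal sketch with made-up numbers: the quantizer step size is fixed by the bitstream, but the rounding rule is the encoder's free choice. Plain rounding and deadzone quantization produce different levels from the same coefficient:

```shell
# sketch: same bitstream-level quantizer step q, two encoder-side rounding rules
out=$(awk 'BEGIN {
  q = 10                        # quantizer step size (a bitstream parameter)
  c = 16                        # a hypothetical DCT coefficient
  uniform  = int(c/q + 0.5)     # plain rounding of 1.6 -> level 2
  deadzone = int(c/q + 0.3)     # smaller offset biases toward zero -> level 1
  print uniform, deadzone
}')
echo "$out"
```

Both levels decode through the same spec-defined dequantizer; the deadzone variant just trades a little distortion for cheaper-to-code small levels, which is purely an encoder decision.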
DZgas Ж
_wb_ E.g. mozjpeg is doing some fancy encoder tricks that could also be applied in libjxl but are currently not used (yet)
2022-09-01 02:01:40
it would be nice to have a lot of algorithms and brute-force which one compresses each individual block or fragment best
_wb_
2022-09-01 02:01:55
better wrt what metric?
DZgas Ж
2022-09-01 02:02:23
I would like to understand what "better" means
_wb_
2022-09-01 02:02:49
also: full bruteforce is computationally not feasible anymore for the modern codecs, the combinatorial explosion is too much
DZgas Ж
2022-09-01 02:03:53
when you compare images, you immediately see errors that would be better not to see. AV1 is very good at destroying Noticeable artifacts; what metric does it use? standard PSNR? I don't think so
_wb_ also: full bruteforce is computationally not feasible anymore for the modern codecs, the combinatorial explosion is too much
2022-09-01 02:04:44
that's right, it's enough to see once how long guetzli takes to run on a JPEG
JendaLinda
2022-09-08 08:56:26
Why are so many programs so dumb that they force an alpha channel in PNG even when it's not actually used?
Petr
2022-09-08 09:15:57
I've seen that so many times. 🤦‍♂️ And that's where optimization comes in. 😀
JendaLinda
2022-09-08 09:52:25
Indeed, I always optimize PNGs I've got.
_wb_
2022-09-08 09:52:33
Apple's built-in screenshot thing does that iirc. It's very silly since screenshots are always opaque.
JendaLinda
2022-09-08 09:54:41
The same thing in Windows.
yurume
_wb_ Apple's built-in screenshot thing does that iirc. It's very silly since screenshots are always opaque.
2022-09-08 10:11:21
it has the ability to capture a single window, in which case the borders are generally transparent. but I agree that when you don't have a single transparent pixel it's silly to signal an alpha channel.
_wb_
2022-09-08 10:23:42
ah right well at least they could do something different for screenshots of (crops) of the screen than for screenshots of windows
JendaLinda
2022-09-08 11:03:00
There's also the option to use the tRNS chunk for single-color transparency, kinda like in GIF.
The_Decryptor
2022-09-08 11:12:24
It's actually possible to use the "rectangle selection" screenshot mode in macOS and end up with an image with transparent pixels (because of bugs of course), it basically just dumps the buffer from the compositor and whatever alpha values it has survive as-is
JendaLinda
2022-09-08 11:59:40
tRNS can do neat tricks in paletted images, it practically expands palette entries from RGB to RGBA.
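A sketch of that trick, with made-up palette values: tRNS stores one alpha byte per palette entry, and entries past the end of tRNS default to fully opaque:

```shell
# expand a 3-entry RGB palette plus a 2-entry tRNS chunk into RGBA
out=$(awk 'BEGIN {
  n_pal  = split("255,0,0 0,255,0 0,0,255", pal, " ")  # PLTE: red, green, blue
  n_trns = split("0 128", trns, " ")                   # tRNS: alpha for first 2 entries
  for (i = 1; i <= n_pal; i++) {
    a = (i <= n_trns) ? trns[i] : 255                  # missing entries are opaque
    printf "%s,%s%s", pal[i], a, (i < n_pal ? " " : "")
  }
}')
echo "$out"
```

So red becomes fully transparent, green half-transparent, and blue (no tRNS entry) stays opaque.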
DZgas Ж
2022-09-09 12:48:15
in cWEBP.exe, which I looked at closely for the first time, I discovered a new experimental option -af (auto-adjust filter strength), which uses a new algorithm for the deblocking filter. It works better than the regular filter, but it's slow: in the time it processes one image, you could have encoded 3 more of the same webp. It's good that the format hasn't been abandoned and is kept up to date with things like this
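For reference, a hedged invocation comparing the two filters (assumes cwebp and ImageMagick are on PATH; the gradient input is just a stand-in):

```shell
# compare cwebp's default deblocking filter with the auto-adjusted one
command -v cwebp  >/dev/null 2>&1 || exit 0    # skip if tools are missing
command -v magick >/dev/null 2>&1 || exit 0
magick -size 64x64 gradient:blue-white in.png  # throwaway test image
cwebp -quiet -q 75     in.png -o plain.webp    # default filter strength
cwebp -quiet -q 75 -af in.png -o auto.webp     # auto-adjust filter strength
ls -l plain.webp auto.webp
```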
2022-09-09 12:51:52
2022-09-09 03:27:24
**As of November 2021, web browsers that support WebP had 96% market share.**
_wb_
2022-09-09 04:20:06
Is the webp bug in safari fixed already?
DZgas Ж
2022-09-10 01:22:27
I can't understand why there is not a single image in the entire AV1 documentation to understand how and what works
2022-09-10 01:28:24
but in AVC **patents** there are hundreds of detailed images and principles of operation
_wb_ Is the webp bug in safari fixed already?
2022-09-10 01:29:50
bug?
spider-mario
DZgas Ж but in AVC **patents **there are hundreds of detailed images and principles of operation
2022-09-10 02:04:04
just to clarify for those who are not aware (since it could be a source of confusion), AVC = H.264
2022-09-10 02:04:39
the name “AVC” is to H.264 as “HEVC” is to H.265
2022-09-10 02:04:55
(and “VVC” to H.266)
_wb_
DZgas Ж bug?
2022-09-10 02:26:06
a small fraction of webp images just don't display at all in safari due to some weird bug
DZgas Ж I can't understand why there is not a single image in the entire AV1 documentation to understand how and what works
2022-09-10 02:27:45
patent clerks like to see diagrams, while spec editors usually prefer to avoid them
DZgas Ж
_wb_ a small fraction of webp images just don't display at all in safari due to some weird bug
2022-09-10 02:28:06
okay. I've never seen a safari in my life <:Windows:806135372298977342>
_wb_ patent clerks like to see diagrams, while spec editors usually prefer to avoid them
2022-09-10 02:29:56
what a problem. here comes a new young mind, and instead of understanding how everything works by reading the AV1 doc, or at least VP9's, he will have to descend to the patents of 2003, where these beautiful, understandable pictures are
_wb_
2022-09-10 02:35:03
https://bugs.webkit.org/show_bug.cgi?id=219977 — looks like they fixed it, so I guess WebP can _really_ be used on iOS/Safari when everyone has upgraded (which should be quite soon, iOS doesn't have a long tail of old versions staying around like Android has)
spider-mario
2022-09-10 02:36:32
it helps that they are still providing software updates for 7-year-old devices
_wb_
2022-09-10 02:37:37
it probably also helps that their typical demographic mostly include people who don't keep devices around for 7 years 🙂
spider-mario
2022-09-10 02:38:13
fair
w
2022-09-11 02:24:18
the windows built-in snipping tool can do this
novomesk
2022-09-13 09:49:07
One of our family iPhones got iOS 16.0. My first impression was that AVIF doesn't work at all, but finally I was able to create one AVIF that worked in Safari. There is huge potential for improvement; Apple's implementation appears very limited.
_wb_
2022-09-13 12:18:41
any idea what the limitations are?
2022-09-13 12:19:00
stills only, I assume? Does alpha work? Is it 4:2:0 only or also 4:4:4?
spider-mario
2022-09-13 12:45:24
does it support HDR stills? (e.g. https://spider-mar.io/hdr/salon.avif)
2022-09-13 12:46:45
oh, I can update my own phone to iOS 16
novomesk
_wb_ any idea what the limitations are?
2022-09-13 12:56:36
Alpha works, YUV444 works. 8bit and 10bit AVIF works. testfile in 12bit didn't work for me, AVIF with limited range didn't work. Animation didn't work. RGB AVIF (not YUV) is rendered incorrectly.
JendaLinda
2022-09-13 12:57:50
Didn't they use the implementation from Microsoft?
novomesk
JendaLinda Didn't they used the implementation from Microsoft?
2022-09-13 12:59:59
While one bug is common with M$, they have also different bugs. So I think it is not the same implementation.
0xC0000054
2022-09-13 01:00:25
I have noticed an increase in the number of people starring my Photoshop AVIF plugin on GitHub in the last 48 hours. I have no idea what is causing this increase in attention. I doubt that it could be Apple's AVIF support, as the plugin is currently only available for Windows.
novomesk
spider-mario does it support HDR stills? (e.g. https://spider-mar.io/hdr/salon.avif)
2022-09-13 01:05:29
salon.avif has a somewhat hallucinogenic look in my Safari.
BlueSwordM
JendaLinda Didn't they used the implementation from Microsoft?
2022-09-13 03:44:47
Nope. They're using dav1d internally.
2022-09-13 03:45:15
If they're willing to, they can let all features go loose.
spider-mario
novomesk salon.avif have somehow hallucinogenic look in my Safari.
2022-09-13 03:53:20
oof, I see what you mean
2022-09-13 03:54:42
novomesk
2022-09-13 03:55:19
Post Chrome too (on Win or Linux) so people can see the difference.
diskorduser
2022-09-13 03:56:19
salon.avif has banding-like artifacts in dark areas on Chrome for Android.
spider-mario
2022-09-13 03:56:20
Chrome on Linux, SDR display
2022-09-13 03:57:10
might have slightly messed up the conversion of the screenshot to sRGB as it looks a bit more saturated than the AVIF
2022-09-13 03:57:15
but you get the general idea
_wb_
2022-09-13 03:59:10
the safari version looks funky
2022-09-13 03:59:25
what is that? discarded msb or something?
spider-mario
2022-09-13 03:59:34
I’m going to update my MacBook and try on desktop Safari
novomesk
_wb_ the safari version looks funky
2022-09-13 04:23:45
Looks like Alien vs Predator game
paperboyo
novomesk Looks like Alien vs Predator game
2022-09-13 04:34:48
The ones from Rebellion (on Atari Jaguar and, esp., first one on the PC) were excellent. Sorry for off-topic, but I just **had** to.
improver
2022-09-14 12:25:05
https://xeiaso.net/blog/avif-help-requested heh
The_Decryptor
2022-09-14 12:48:11
I've tried 3 different AVIF encoders (ImageMagick, GIMP and go-avif), and none of them can do lossless encoding
2022-09-14 12:48:39
I thought I was using them wrong, but go-avif has a specific `--lossless` flag that still produces colour fringing (And a larger output file than the input)
0xC0000054
The_Decryptor I've tried 3 different AVIF encoders (ImageMagick, GIMP and go-avif), and none of them can do lossless encoding
2022-09-14 01:12:39
What OS are you using? If you are on Windows, Paint.NET has lossless AVIF support.
The_Decryptor
2022-09-14 01:13:39
I'll give that a try
2022-09-14 01:17:51
Yep, that's worked, nice
2022-09-14 01:17:58
Easy to tell, the colours are all wrong
190n
2022-09-14 02:03:37
unless you're testing or something, i wouldn't bother with lossless avif
The_Decryptor
2022-09-14 02:10:13
Yeah it's purely for testing, and it basically agreed with everything I'd read about it
novomesk
2022-09-14 06:17:36
GIMP 2.10.x does not save lossless AVIF; GIMP 2.99.x can. MS and Apple handle RGB AVIF incorrectly, hence the wrong colors. Those bugs are trivial to fix, so maybe they will fix them in the future. The efficiency of AVIF lossless compression depends on the AV1 encoders. I believe it could be improved, but I think encoder developers have a lot of priorities more important than still lossless right now.
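For anyone trying this from the CLI: libavif's avifenc has a lossless mode. A hedged sketch (assumes avifenc and ImageMagick are installed; filenames are placeholders):

```shell
# lossless AVIF via libavif's avifenc (--lossless, alias -l)
command -v avifenc >/dev/null 2>&1 || exit 0   # skip if tools are missing
command -v magick  >/dev/null 2>&1 || exit 0
magick -size 32x32 xc:red in.png               # throwaway test image
avifenc --lossless in.png out.avif
ls -l out.avif
```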
_wb_
2022-09-14 06:28:39
Did YCoCg end up in AVIF or do you need to do RGB without reversible color transform?
2022-09-14 06:30:43
I think avif lossless might not have that much margin for improvement, entropy coding is probably too weak and it doesn't really have very useful coding tools for lossless as far as I can tell.
veluca
2022-09-14 06:31:07
still working on ycocg IIRC
_wb_
2022-09-14 06:36:37
Can they even still add it at this point without introducing a major interop problem?
veluca
2022-09-14 06:50:46
AFAIU they'd add it to cicp
2022-09-14 06:50:52
which is interesting
2022-09-14 06:51:22
no idea about how interop would look like though
_wb_
2022-09-14 06:56:25
my guess is it would work in chrome and nowhere else for quite a while
veluca
2022-09-14 07:01:04
well, it would work when something updates libavif
_wb_
2022-09-14 07:18:55
`s/when/if/`
improver
2022-09-15 10:51:29
https://xiph.org/flac/2022/09/09/flac-1-4-0-released.html
DZgas Ж
2022-09-17 12:42:54
I wrote a giant post about video codecs here https://encode.su/threads/3953-Video-Codecs-user-experience?p=76263#post76263
Demez
2022-09-18 08:43:44
interesting read
JendaLinda
2022-09-19 04:36:27
I've noticed that services like Facebook or Telegram automatically convert GIF to AVC.
yurume
2022-09-20 05:56:34
https://matthias-buehlmann.medium.com/stable-diffusion-based-image-compresssion-6f1f0a399202 of course.
_wb_
2022-09-20 06:53:18
> It’s interesting however how the artifacts introduced by this compression scheme are affecting the image content more so than the image quality, and it’s important to keep in mind that images compressed in such a way may contain these kinds of compression artifacts.
2022-09-20 06:55:49
As expected. This is the most scary part of this kind of technology: appeal without fidelity, combined with black box nonlinear behavior, means it's basically guided hallucination and not image preservation.
The_Decryptor
2022-09-20 06:56:02
Such interesting image artifacts though, like faces dissolving
_wb_
2022-09-20 06:56:48
> It’s just that the kind of artifacts introduced are a lot less notable, since they affect image content more than image quality — which however is also a bit of a danger of this method: One must not be fooled by the quality of the reconstructed features — the content may be affected by compression artifacts, even if it looks very clear.
2022-09-20 06:58:27
Very interesting, but no way this kind of technology should ever be allowed in applications where fidelity is critical, such as evidence in court or for insurance companies, medical, journalism, etc.
2022-09-20 07:00:05
2022-09-20 07:00:19
yuck
JendaLinda
2022-09-20 07:03:42
What about AI upscalers? Kinda similar thing, creating detail from nothing.
DZgas Ж
JendaLinda I've noticed that services like Facebook or Telegram automatically convert GIF to AVC.
2022-09-20 10:39:05
and this is fine, but I note that Telegram doesn't generate YUV444 and compresses heavily. It also gives me complete freedom: I can make pixel art or high-quality animation in the High profile, or full 1280x720 60 fps video with the baseline profile and the fastdecode tune (optimized for maximum playability). It should be understood that gif-videos consume computation, and you Can Not just take any video and put it in a gif; 640x360 is optimal for everything, but if you understand the topic, you can create an optimized AVC and use a larger frame size
yurume https://matthias-buehlmann.medium.com/stable-diffusion-based-image-compresssion-6f1f0a399202 of course.
2022-09-20 10:42:14
better to have compression artifacts than content that is not in the original
2022-09-20 10:42:41
AND OF COURSE no AVIF tests
2022-09-20 10:43:58
was the post made 4 years ago? no, today; therefore the author deliberately keeps silent about the fact that AVIF would probably be even better at some of this<:Thonk:805904896879493180>
The_Decryptor
2022-09-20 10:54:43
6KB AVIF (via Paint.NET)
DZgas Ж
2022-09-20 10:58:16
The_Decryptor 6KB AVIF (via Paint.NET)
2022-09-20 10:58:22
bad compress
yurume
_wb_ As expected. This is the most scary part of this kind of technology: appeal without fidelity, combined with black box nonlinear behavior, means it's basically guided hallucination and not image preservation.
2022-09-20 01:44:04
I actually was disappointed with the entire result. the fidelity issue is technically fixable with patching critical positions, but the compression is not strong enough to support that. ~~the post only makes use of the VAE component of SD so maybe there can be stronger systems possible in the future though.~~ (someone in HN later noted that this is not the case and it does make use of U-Net-based denoising as well)
_wb_
2022-09-20 01:52:54
From a bitstream-design point of view, fixing fidelity with patching is indeed a possibility. From an encoder design point of view, I think it is hard to detect where patches are needed for fidelity and where pixel-level differences are 'innocent' — in the end that is mostly a semantical question: if the codec turns some grass into different but similar looking grass, it's probably fine, while if it changes a face it's probably not fine. Fixing everything that is different at the pixel level will probably require patching nearly everything, but selecting what to patch and what to keep as is will be quite tricky. Of course relying on AI to decide where to fix fidelity and where not to just moves the problem 🙂
yurume
2022-09-20 01:56:13
with a strong enough system, I can indeed see that this can be useful for specific use cases, for example when you wear HMD everything not in focus can be "faked", not really requiring full fidelity. but this system is not strong enough for that.
JendaLinda
2022-09-20 03:21:59
Whoever uses GIF to encode video apparently isn't interested in quality anyway. Encoding to a lossy codec is an effective way to deal with those large blobs several megabytes in size. I've seen GIFs using random noise as dithering. I get it, random noise looks more appealing to the eye, and Floyd-Steinberg dithering looks kinda funny in motion. However, random noise is very hard to compress. Another round of lossy compression will look bad, but it's a smaller file. Problem solved.
3DJ
2022-09-20 09:24:18
RTX 40 for content creators (AV1): https://www.nvidia.com/en-us/geforce/news/rtx-40-series-and-studio-updates-for-content-creation/
brooke
2022-09-22 03:26:01
does anyone know how i can encode with exhale's aac? the app requires a .wav source file, but i was wondering if either .flac was usable elsewhere or there was a good method to transcode to .wav before encoding
2022-09-22 03:28:24
freac only has fdk-aac but i wanna try xhe
Diamondragon
brooke does anyone know how i can encode with exhale's aac? the app requires a .wav source file, but i was wondering if either .flac was usable elsewhere or there was a good method to transcode to .wav before encoding
2022-09-22 05:23:40
You could use foobar2000 to do the conversion. That's what I do, anyhow.
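If you'd rather stay on the command line, ffmpeg can handle the FLAC-to-WAV step too. A sketch (assumes ffmpeg is installed; filenames are placeholders, and exhale then takes the WAV as its input):

```shell
command -v ffmpeg >/dev/null 2>&1 || exit 0    # skip if ffmpeg is missing
# make a 1-second test tone so the example is self-contained
ffmpeg -y -loglevel error -f lavfi -i "sine=frequency=440:duration=1" -c:a flac input.flac
# decode FLAC to 16-bit/48 kHz PCM WAV for the AAC encoder
ffmpeg -y -loglevel error -i input.flac -ar 48000 -c:a pcm_s16le input.wav
```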
brooke
2022-09-22 01:10:25
<@284160100510466048> another question, though - what works best for conversion? should i downsample before encoding? i have to at least transcode 24/192 files to 24/96 before encoding, but i wanted to know if there was any difference in transcoding to 24/44.1 instead
JendaLinda
2022-09-22 02:35:31
Wouldn't it be better to resample to the native rate of your DAC to prevent additional resampling during playback? All DACs are capable of at least 48 kHz these days.
brooke
JendaLinda Wouldn't be better to resample to the native resolution of your DAC to prevent additional resampling during playback? All DACs are capable of at least 48kHz these days.
2022-09-22 02:37:07
i'm talking from a purely digital standpoint here, like i only use headphones, no fancy hardware or anything
2022-09-22 02:39:00
i just play from foobar2000 now so just wanna know what's best
JendaLinda
2022-09-22 02:41:09
I would go with 48 kHz because it's a nice round number. 44.1 kHz was chosen for the audio CD as a compromise; later standards use 48 kHz minimum.
brooke
2022-09-22 02:41:27
alright
2022-09-22 02:42:35
<@688076786525143117> talking about xHE-AAC in particular here though, are you familiar at all with AAC codecs?
2022-09-22 02:43:11
just to ask
JendaLinda
2022-09-22 02:46:00
Kinda. HE-AAC is intended for low bitrates AFAIK. That's not that interesting for me, I prefer higher quality encoding.
brooke
2022-09-22 02:47:48
i'm trying to find a good middle ground between a decent-sounding desktop library and a general mobile library
2022-09-22 02:48:08
found xHE-AAC 96kbps SBR to be the best fit so at this point i'm just trying to learn more about the process
JendaLinda
2022-09-22 02:52:33
96 kbps is not that low; even MP3 can sound passable if a decent encoder is used. I would consider Opus as well, it performs pretty well at low bitrates.
brooke
2022-09-22 02:53:06
i'm actually switching off of opus
2022-09-22 02:53:13
opus was my go-to for a long time
BlueSwordM
brooke i'm trying to find a good middle ground between a decent-sounding desktop library and a general mobile library
2022-09-22 02:53:36
Just use Opus.
brooke
brooke i'm actually switching off of opus
2022-09-22 02:54:02
[2]
BlueSwordM
2022-09-22 02:54:06
At the 96-160kbps level, Opus through libopus generally wins because of higher fidelity targets.
brooke
2022-09-22 02:54:36
it's personal preference really, i did an ABX and found xHE-AAC generally performed better for me
2022-09-22 02:55:17
there's no real net loss since i already losslessly archive the source folders
2022-09-22 02:55:24
it's just experimentation
JendaLinda
2022-09-22 02:57:04
My only experience with AAC is that I'm using AAC LC alongside AVC video, 128kbps for mono and 192kbps for stereo, so that's it.
brooke
2022-09-22 02:57:35
for video i think AV1 / .opus is just the standard so that's how i adjusted
2022-09-22 02:57:43
opus being free and all
JendaLinda
2022-09-22 02:59:39
I didn't have a motivation to replace AVC as I'm working mostly with 1080p content, so I'm sticking to it. AVC has excellent hardware compatibility.
2022-09-22 03:11:24
I was using MP3 alongside AVC as well. After the stable AAC encoder was released in ffmpeg, I just switched the audio codec and kept the bitrate. There's not much difference at the high end. AAC can use even higher bitrates than MP3, but that's presumably intended for hi-res/multichannel audio.
improver https://xiph.org/flac/2022/09/09/flac-1-4-0-released.html
2022-09-22 03:30:57
The higher sample rates in FLAC could actually be handy. I can imagine FLAC being used to encode any one-dimensional data, like signals recorded with an oscilloscope etc.
w
2022-09-22 04:07:45
just use bopus
Traneptora
brooke i'm talking from a purely digital standpoint here, like i only use headphones, no fancy hardware or anything
2022-09-22 06:15:20
there's no such thing as a purely digital standpoint
2022-09-22 06:15:41
all audio output devices have a DAC, since a DAC is required to convert digital audio to analog
2022-09-22 06:15:53
for most people most of the time the DAC will be built into the motherboard's sound card. but it still exists
brooke
2022-09-22 06:15:57
oh
2022-09-22 06:15:58
whoops
Traneptora
2022-09-22 06:16:16
and most of those are going to be at 48 kHz, which is why that's ideal for lossy
2022-09-22 06:16:22
to prevent resampling on playback
brooke
2022-09-22 06:16:43
Traneptora
2022-09-22 06:17:08
calling it "cd quality" vs "dvd quality" is misleading because DVD audio is usually compressed with mp2 or ac3
2022-09-22 06:17:14
and CD audio is uncompressed PCM
brooke
2022-09-22 06:17:20
yeah i get the idea
2022-09-22 06:17:37
well i mean at the same time i do frequently see DVD-A releases with 16/48 audio
2022-09-22 06:17:44
or even DVD-V sometimes
Traneptora
2022-09-22 06:19:02
in either case, for purely playback purposes, you don't want 96 kHz, that's at best identical to 48 kHz and at worst it can mess with the DAC
brooke
2022-09-22 06:19:40
i can't really show spek input since it doesn't support AAC but i'm assuming it's empty above a certain point
2022-09-22 06:20:26
.opus automatically set the sample rate to 48kHz but in this case it just keeps the source's sample rate
2022-09-22 06:20:47
so i'm guessing i should be downsampling from 24/96 to 24/48 then to xHE-AAC based on that assertion
2022-09-22 06:20:58
in the case of 24/96+ files
Traneptora
2022-09-22 06:21:48
most lossy formats have a hard low-pass at 20 kHz
2022-09-22 06:22:02
so there's actually no point in higher sample rates
brooke
2022-09-22 06:22:08
yeah i know opus did
Traneptora
2022-09-22 06:22:34
don't forget the nyquist-shannon sampling theorem, which says that a sampling rate of 2N is enough to perfectly reconstruct a signal within the 0-N band
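The theorem can be sanity-checked numerically: sampled at 48 kHz, a 30 kHz tone (above Nyquist) lands on exactly the same sample values as a phase-inverted 18 kHz tone, i.e. it reflects around fs/2 = 24 kHz, which is why content above the band is unrepresentable rather than merely degraded:

```shell
# aliasing sketch: sin(2*pi*30k*n/48k) == -sin(2*pi*18k*n/48k) at every sample n
out=$(awk 'BEGIN {
  pi = 3.141592653589793; fs = 48000; ok = 1
  for (n = 0; n < 16; n++) {
    a = sin(2*pi*30000*n/fs)    # 30 kHz tone, above Nyquist
    b = -sin(2*pi*18000*n/fs)   # phase-inverted 18 kHz alias
    d = a - b; if (d < 0) d = -d
    if (d > 1e-9) ok = 0
  }
  print (ok ? "aliased" : "distinct")
}')
echo "$out"
```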
brooke
2022-09-22 06:22:36
i 100% agree but i've always kept lossless files intact just for the sake of archival
Traneptora
2022-09-22 06:23:12
a bit of extra working sample rate is better though, because there's a corollary which says the reconstruction filter can have compact support and will decay rapidly given a bit of extra room
2022-09-22 06:23:21
but 48 kHz is plenty of extra room
2022-09-22 06:23:34
for archival/storage whatever, preserve the recorded original
brooke
brooke so i'm guessing i should be downsampling from 24/96 to 24/48 then to xHE-AAC based on that assertion
2022-09-22 06:23:34
so would this be the way to go?
Traneptora
2022-09-22 06:23:55
I don't believe you'll be able to hear the difference between 16 and 24-bit samples
2022-09-22 06:24:10
especially without specialized hardware
2022-09-22 06:24:16
I'd probably do 16/48 if you're compressing lossily
brooke
2022-09-22 06:24:23
yeah i fell for the 24/44.1 placebo early on
2022-09-22 06:24:28
i was like
2022-09-22 06:24:31
whoa better bass!
Traneptora
2022-09-22 06:24:31
you can try abx testing but I doubt that you'll score more than 50%
brooke
2022-09-22 06:24:38
but then i checked out the 16/44.1 version of the track way later on
2022-09-22 06:24:39
same dynamics
2022-09-22 06:24:45
so i was like... fuck
Traneptora I'd probably do 16/48 if you're compressing lossily
2022-09-22 06:25:13
i don't have a sound method for downsampling to a specific rate, what do you recommend?
Traneptora
2022-09-22 06:28:57
isn't 48 kHz one of the options provided
2022-09-22 06:29:08
if you're looking for software, SoX has a good algorithm for it
brooke
2022-09-22 06:29:17
https://sox.sourceforge.net/
2022-09-22 06:29:17
?
Traneptora
2022-09-22 06:29:38
yes that
brooke
2022-09-22 06:29:46
alright
2022-09-22 06:29:53
last updated in 2015
2022-09-22 06:29:54
ouch
Traneptora
2022-09-22 06:30:39
yea it's not a super up to date project
brooke
2022-09-22 06:31:39
i've got it up and running, looks pretty complicated though
2022-09-22 06:35:26
<@853026420792360980> okay i'm like completely lost could you guide me through this lmao
2022-09-22 06:35:55
like say i wanna convert a whole folder at a time
2022-09-22 06:36:12
i don't know if `-b` would be 16 in this case since 16/48 and all
2022-09-22 06:36:22
pretty sure `-r` would be 48000
Traneptora
brooke <@853026420792360980> okay i'm like completely lost could you guide me through this lmao
2022-09-22 06:41:10
`sox input.flac -b 16 output.flac rate 48k`
brooke
2022-09-22 06:41:22
so i can only do individual files?
Traneptora
2022-09-22 06:41:38
yea, but you can always use find and parallel to parallelize it
brooke
2022-09-22 06:43:27
i'm not following sorry
2022-09-22 06:46:05
based on what i saw online, do you mean something like this?```for i in flac/*.flac; do sox input.flac -b 16 output.flac rate 48k "$i" "$i_modified" done```
yurume
2022-09-22 06:46:29
`input.flac` and `output.flac` are placeholders for file names
brooke
2022-09-22 06:47:03
yeah i'm trying to figure out as i go
2022-09-22 06:47:13
i hate CLI apps with lots of parameters for this reason lmao
2022-09-22 06:48:25
how about this then?```for i in flac/*.flac; do sox -b 16 rate 48k "$i" "$i_modified" done```
2022-09-22 06:48:51
mind was trying to fill in the blanks and i didn't think to omit the placeholders
2022-09-22 06:54:38
```i was unexpected at this time.``` well now i'm completely lost
Traneptora
2022-09-22 06:56:30
input.flac and output.flac are the actual names of the files
2022-09-22 06:56:36
you need to use them, don't put those names there literally
brooke
2022-09-22 06:56:52
like i said this tool is brand new to me so i have no idea what'll work
2022-09-22 06:57:18
if doing a whole folder isn't straightforward i'll find something else
Traneptora
2022-09-22 06:58:29
it is straightforward
2022-09-22 06:58:42
you're just not reading the syntax we already described
brooke
Traneptora input.flac and output.flac are the actual names of the files
2022-09-22 06:59:02
yeah but this means i'd have to do them individually right?
Traneptora
2022-09-22 06:59:07
.... no
brooke
2022-09-22 06:59:11
what
2022-09-22 06:59:29
i don't have a combined .flac or anything i just have a tracklist
2022-09-22 07:00:25
and you said "you can always use find and parallel to parallelize it" which i have no clue what that means
Traneptora
2022-09-22 07:00:31
```sh
for i in input/*.flac; do
    sox "$i" -b 16 "output/$(basename "$i")" rate 48k
done
```
2022-09-22 07:00:45
for example
brooke
2022-09-22 07:00:57
huh, alright
Traneptora
2022-09-22 07:01:54
alternatively
```sh
find input/ -name '*.flac' | parallel sox {} -b 16 output/{/} rate 48k
```
2022-09-22 07:01:57
or something like that
brooke
brooke ```i was unexpected at this time.``` well now i'm completely lost
2022-09-22 07:02:38
i just realized i ran this in command prompt and not bash
2022-09-22 07:02:40
amazing
2022-09-22 07:03:18
that's entirely on me then i made it way harder than it was supposed to oops
spider-mario
Traneptora I'd probably do 16/48 if you're compressing lossily
2022-09-22 07:06:17
if compressing lossily, it makes sense to keep the source 24-bit if it originally was, otherwise you just end up compressing some additional dither noise
2022-09-22 07:06:51
compressing pianoteq output with fdk-aac and `-vbr 5`, I got slightly smaller files starting from the 24-bit output than from the 16-bit quantized version
spider-mario if compressing lossily, it makes sense to keep the source 24-bit if it originally was, otherwise you just end up compressing some additional dither noise
2022-09-22 07:11:13
perhaps even if it wasn’t, if you’re resampling
2022-09-22 07:11:47
resampling and keeping 24 bits from the resampled signal shouldn’t be worse than resampling and then dithering again to 16 bits
2022-09-22 07:12:36
(probably not that much better either, in all fairness)
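A minimal numpy sketch of the quantize-with-dither step being discussed (my own illustration, assuming TPDF dither; function and variable names are hypothetical): rounding contributes at most ±0.5 LSB of error and the triangular dither adds up to ±1 LSB of noise on top, which is the "additional dither noise" a lossy encoder then has to spend bits on.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize_tpdf(x, bits=16):
    """Quantize float samples in [-1, 1) to `bits`-bit PCM with TPDF dither."""
    full = 2 ** (bits - 1)
    # TPDF dither: sum of two independent uniform +/-0.5 LSB sources
    dither = rng.uniform(-0.5, 0.5, x.shape) + rng.uniform(-0.5, 0.5, x.shape)
    q = np.round(x * full + dither)
    return np.clip(q, -full, full - 1)

t = np.arange(4800) / 48000
x = 0.5 * np.sin(2 * np.pi * 440 * t)
q16 = quantize_tpdf(x)          # error floor ~1.5 LSB at 16 bits
q8 = quantize_tpdf(x, bits=8)   # same LSB-level noise, but a 256x coarser grid
```

The total error per sample stays under 1.5 LSB in either case; what changes with bit depth is how large an LSB is relative to the signal, which is why the noise that was inaudible at 16 bits became intrusive at 8.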
brooke
2022-09-22 07:26:06
<@853026420792360980> tysm for the help btw 🙏
2022-09-22 07:26:18
got what i was trying to do working
Traneptora
spider-mario if compressing lossily, it makes sense to keep the source 24-bit if it originally was, otherwise you just end up compressing some additional dither noise
2022-09-22 07:29:39
not all lossy codecs work internally in high precision
2022-09-22 07:29:48
for opus this would be true, but brooke's using HE-AAC which doesn't
spider-mario resampling and keeping 24 bits from the resampled signal shouldn’t be worse than resampling and then dithering again to 16 bits
2022-09-22 07:30:17
it wouldn't be worse, but it would take up a lot more space
2022-09-22 07:30:29
in practice this happens even if in theory encoders shouldn't do that
spider-mario
Traneptora it wouldn't be worse, but it would take up a lot more space
2022-09-22 07:31:51
aren’t we talking temporary files here?
2022-09-22 07:31:59
or am I misunderstanding?
brooke
Traneptora for opus this would be true, but brooke's using HE-AAC which doesn't
2022-09-22 07:33:58
ironically both ideal codecs (audio and cover image) don't work on my main music player while they do on foobar even though i hate foobar
2022-09-22 07:34:33
now that i've been experimenting with xHE-AAC for audio and JXL for covers
2022-09-22 07:35:09
but i mean the results are there after having followed everyone's advice
2022-09-22 07:35:13
this is pretty fucking good for 8.6MB
spider-mario
2022-09-22 07:35:37
not playing through Discord either, it seems
brooke
2022-09-22 07:35:46
yeah discord doesn't support m4a in general i think
spider-mario
2022-09-22 07:35:58
it does with standard AAC
brooke
2022-09-22 07:36:09
oh?
spider-mario
2022-09-22 07:36:16
let me try an example
brooke
2022-09-22 07:36:22
well it has the player controls it'd show on an .mp3 / .flac
2022-09-22 07:36:28
on codecs like .opus it just doesn't show anything
2022-09-22 07:36:33
it might just be the specific encoder i think
spider-mario
2022-09-22 07:37:26
brooke
2022-09-22 07:38:37
brooke it might just be the specific encoder i think
2022-09-22 07:39:03
probably the case
JendaLinda
2022-09-22 07:44:28
Speaking of dithering when lowering the bit depth of audio, is it actually beneficial? When i was converting some sounds for an old game to 8-bit PCM, the dithering was very intrusive; a conversion without any dithering sounded much better.
spider-mario
2022-09-22 07:47:27
I would say yes
2022-09-22 07:47:36
I think I would take broadband noise over distortion
JendaLinda
2022-09-22 07:48:31
To be fair, the game uses sounds in 22050Hz and 11025Hz, so everything was in the audible range.
2022-09-22 07:50:08
In the end, in such cases, the distortion is more tolerable than the dither noise.
spider-mario
2022-09-22 08:08:27
for what it’s worth, I used to use spek, but then switched to Sonic Visualiser to have a logarithmic frequency axis
brooke
2022-09-22 08:13:49
i use spek because it looks cool
2022-09-22 08:14:12
https://tenor.com/view/gus-fring-gus-fring-chicken-man-breaking-bad-gif-24625386
2022-09-22 08:14:14
We are not the same.
spider-mario
2022-09-22 08:17:19
Sonic Visualiser can also play the audio, which is nice to have as the playback line goes through the spectrogram
Traneptora
2022-09-22 10:09:22
~~just use ffplay~~
brooke
2022-09-22 10:50:54
after a grand total of 1 day using foobar2000 i've now switched to using a m-ab-s build
2022-09-22 10:50:59
👏 👏 👏
spider-mario
Traneptora ~~just use ffplay~~
2022-09-22 10:59:08
(also not logarithmic IIRC, and you don’t get to see the spectrogram in advance)
Traneptora
2022-09-23 01:17:01
(that's why it was a joke)