JPEG XL

jxl

Anything JPEG XL related

fab
2021-07-07 11:59:38
for %i in (C:\Users\User\Documents\kki9\*.png) do cjxl -q 79.141 -s 5 -I 0.99 %i %i.jxl
2021-07-07 11:59:45
resulting file 0.791 bpp
2021-07-07 12:00:06
1:20 compression ratio vs png
2021-07-07 12:01:58
how do i use the same settings for an animation?
2021-07-07 12:02:05
wb
2021-07-07 12:02:45
i want to do a 3:28-long animation, is there any limit on animation length?
2021-07-07 12:04:56
i read I need APNG
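(One plausible way to get an APNG out of a video is ffmpeg's apng muxer; a minimal sketch, where the file names are placeholders:)
```
ffmpeg -i input.mp4 -plays 0 output.apng
```
`-plays 0` asks the apng muxer for infinite looping; the muxer is selected by the .apng extension.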
diskorduser
2021-07-07 12:52:59
Dot MP4
fab
2021-07-07 12:56:54
encoder and decoder changed too much
Jyrki Alakuijala
2021-07-07 01:21:20
I didn't understand what value Mono would bring me -- when compared to just writing C or C++ natively
lithium
2021-07-07 01:34:09
Yes. You can also use ASP.NET Core for cross-platform.
diskorduser
fab i want to do a 3:28-long animation, is there any limit on animation length?
2021-07-07 01:38:38
Why don't you use libx264?
lithium
Jyrki Alakuijala if I learn something I want to have some probability of using it in 10 years
2021-07-07 01:46:18
Probably Rust is worth learning?
> Mozilla: Rust
> Microsoft: C#
> Google: Go, Dart
fab
2021-07-07 02:09:11
cjxl doesn't read apng
2021-07-07 02:09:12
why
2021-07-07 02:09:16
failed to read the file
2021-07-07 02:09:23
i wasted 2 hours creating this file
2021-07-07 02:09:29
and cjxl refuses to read it
2021-07-07 02:10:51
<@!794205442175402004>
eddie.zato
2021-07-07 02:23:24
Works fine for me:
```
PS > cjxl zato-winz2.apng zato-winz2.jxl -m
JPEG XL encoder v0.3.7 [SSE4]
Read 162x290 image, 238.1 MP/s
Encoding [Modular, lossless, squirrel], 4 threads.
Compressed to 27197 bytes (0.056 bpp/frame).
162 x 290, 0.16 MP/s [0.16, 0.16], 1 reps, 4 threads.
```
fab
2021-07-07 02:26:53
the file size is big
2021-07-07 02:27:13
original mp4: Bit rate: 160 kb/s, Width: 720 pixels, Height: 900 pixels
2021-07-07 02:27:20
426 KB (437,114 bytes)
2021-07-07 02:27:32
2.86 MB (3,007,108 bytes) jxl
2021-07-07 02:27:40
for %i in (C:\Users\User\Documents\kki9\*.png) do cjxl -q 79.141 -s 5 -I 0.99 %i %i.jxl
2021-07-07 02:28:04
so 1020 kbps for an hd file
2021-07-07 02:28:19
don't know how it compares with real test samples
2021-07-07 02:28:36
maybe jpeg xl in that case is a bit better
2021-07-07 02:28:54
1.6 GB of RAM for that
2021-07-07 02:29:10
10 frames per second
2021-07-07 02:32:02
i see lag and evident loss of quality compared to the original
2021-07-07 02:32:10
in chrome the mp4 colours look better
2021-07-07 02:46:27
i think there is a bug
2021-07-07 02:47:26
-I doesn't work in animation
2021-07-07 02:47:28
serious bug
2021-07-07 02:47:37
that's why the video weighs so much
2021-07-07 02:47:57
the bpp is 0.247/frame
2021-07-07 02:52:21
changing from -d 1.977 (hare) to -d 7 doesn't change anything
2021-07-07 02:52:33
it weighs 1.6 MB and 0.133 bpp/frame
2021-07-07 02:52:58
also quality appears the same in every frame
2021-07-07 02:53:00
no inter-frame coding
2021-07-07 02:53:21
it's still 4 times bigger than the original
2021-07-07 02:54:46
the second file got invalid
2021-07-07 02:54:50
it does not open in chrome canary
2021-07-07 02:57:00
it shows this
2021-07-07 02:57:14
maybe my cpu is not strong enough
2021-07-07 03:00:16
creating the series of .png images is using 1.6 GB of RAM
2021-07-07 03:00:31
0.03 MP/s
2021-07-07 03:01:06
150 files, 48.9 MB
2021-07-07 03:03:44
png images were all valid
2021-07-07 03:03:46
apng failed
diskorduser
2021-07-07 03:16:15
You should be using a video codec for such long duration. I don't understand why you want to convert a video to apng / animated jxl.
2021-07-07 03:16:23
You're just wasting your time
fab
2021-07-07 03:38:39
for now that's the only use case for jpeg xl that i see
2021-07-07 03:40:24
unfortunately images from reddit are a lot bigger than 6 MP
2021-07-07 03:40:26
usually 18 MP
2021-07-07 03:42:12
like for example this can't be encoded
2021-07-07 04:03:54
Try to compress this
2021-07-07 04:05:01
update: the new encoder foxwizard is using is a bit better
2021-07-07 04:05:10
and he got 0.040 bpp on an image
2021-07-07 04:05:16
70 KB for 12 MP
2021-07-07 04:23:16
compile latest jpeg xl encoder
2021-07-07 04:24:37
for %i in (C:\Users\User\Documents\a10\*.jpg) do cjxld1 -j -q 78.83 --faster_decoding=3 --use_new_heuristics -s 9 -I 0.95 %i %i.jxl
2021-07-07 04:24:42
all those options work in this
2021-07-07 04:25:58
06/07/2021 04:39 eddiezato
2021-07-07 04:26:13
yesterday encoder
2021-07-07 04:30:53
honestly i won't use old jxl
eddie.zato
2021-07-07 05:11:43
Sometimes, reading Fabian's posts, I imagine that he is just an artificial neural network. These statements about jxl use cases... <:NotLikeThis:805132742819053610>
2021-07-07 05:11:48
No offense, please. 😄
Jyrki Alakuijala
fab yesterday encoder
2021-07-07 05:36:21
you like the new changes? 😄
fab
2021-07-07 05:45:58
User watch <#794206087879852106>
2021-07-07 05:46:22
There is a txt about what i think about jxl
2021-07-07 05:46:34
5 hours of evaluation
raysar
eddie.zato Sometimes, reading Fabian's posts, I imagine that he is just an artificial neural network. These statements about jxl use cases... <:NotLikeThis:805132742819053610>
2021-07-07 06:17:41
It's not a very good neural network, and maybe an alpha version 😄
w
2021-07-07 06:34:43
and he somehow has regressed in over a year
fab
2021-07-07 06:49:19
https://www.reddit.com/r/modhelp/comments/370n08/spam_domains_a_bunch_of_domains_that_are_almost/
2021-07-07 06:49:24
why can't i post pastebin
2021-07-07 06:49:37
i just want to update people on jpeg xl situation
2021-07-07 06:49:39
how do i do it
Jyrki Alakuijala
fab There is a txt about what i think about jxl
2021-07-07 06:50:40
I looked through the last few days and didn't find it
fab
Jyrki Alakuijala I looked through the last few days and didn't find it
2021-07-07 06:50:57
https://discord.com/channels/794206087879852103/794206087879852106/862379679549620234
Jyrki Alakuijala
2021-07-07 07:02:49
ah, I didn't see the expand/collapse thing -- learning to use discord 😄
lithium
2021-07-07 07:34:14
jxl-16e8ea3: tiny ringing has improved somewhat at d0.5 s8 and s9, but there is still some tiny ringing spread over smooth areas. Not every image has this issue; I guess images with some dark colors and certain specific features have a chance of hitting it.
fab
2021-07-07 07:44:43
upvote on reddit
2021-07-07 07:57:28
added dilvish's link to eddiezato's builds
diskorduser
2021-07-08 04:40:02
Is it normal for a debug build to encode very slowly? -O2 takes less than a second; the debug build takes around ~~40~~ 80 seconds.
2021-07-08 04:40:34
Same image & same default settings.
BlueSwordM
2021-07-08 05:14:59
Yes, as debug builds have a lot of optimizations disabled.
diskorduser
2021-07-08 05:42:57
Oh I see. Time to build my DE and window manager with -O2 and CPU-specific optimizations.
fab
2021-07-08 09:23:26
<@!794205442175402004> there is another post on reddit but the notification didn't arrive
2021-07-08 09:23:54
good it arrived
2021-07-08 02:37:23
https://doublebench.com/doublebench-software/?ckattempt=1
Pieter
2021-07-09 01:31:22
<@456226577798135808> I think one question that matters here is what is more important: adoption of jpeg-xl, or adoption of libjxl?
2021-07-09 01:32:18
Ideally, both, of course. But I think that licensing the encoder under a copyleft license would additionally hurt adoption of the format itself.
2021-07-09 01:34:05
(I'm speculating here; I don't know the actual reason for the license choice)
Diamondragon
2021-07-09 01:36:07
Permissive licenses are very much the norm for reference software for free codecs. libaom is also an example of this: https://source.chromium.org/chromium/chromium/src/+/master:media/test/data/licenses/AOM-LICENSE
2021-07-09 01:36:56
Like sipa said, it is a trade-off meant to encourage adoption.
2021-07-09 01:37:57
If integrating libJXL meant re-licensing your application as GPL 3, the majority of devs would not want to do that.
BlueSwordM
2021-07-09 01:45:26
It just means closed commercial projects can very easily integrate libjxl stuff.
190n
2021-07-09 04:42:46
bit late to the party but even stallman himself supported a permissive license (BSD) for libvorbis, for most of the reasons people have given here: https://lwn.net/2001/0301/a/rms-ov-license.php3
2021-07-09 04:44:16
the situation is a bit different, because there doesn't seem to be any proprietary format that can really challenge jxl, but we still want everything possible -- even proprietary stuff -- to support it
raysar
2021-07-09 08:59:31
If you look at the jpeg example, the best encoder is open source; for h264 it's also x264; for h265 the best is a proprietary encoder but x265 is near the best. Maybe a company will create the best lossy modular encoder, but for vardct i think we are near the limit of quality.
fab
2021-07-09 09:14:15
i think 17.36% comes from the 5th february video heuristics; i noticed drops after -d 0.6, the quality looks very bad to me, in that video i see strong pixel movement
_wb_
raysar If you look at the jpeg example, the best encoder is open source; for h264 it's also x264; for h265 the best is a proprietary encoder but x265 is near the best. Maybe a company will create the best lossy modular encoder, but for vardct i think we are near the limit of quality.
2021-07-09 12:55:35
I think we are far from the limit of what can be done with vardct, but I also think improvements will likely end up in libjxl and not in some proprietary third party encoder
raysar
_wb_ I think we are far from the limit of what can be done with vardct, but I also think improvements will likely end up in libjxl and not in some proprietary third party encoder
2021-07-09 01:38:42
Oh really? More than a few % of file size, like 10%? Are there solutions for that not explored yet?
Scope
2021-07-09 01:58:14
Not all of the encoder features are even implemented in the current VarDCT, and even without them there is still room for efficiency improvements. Btw, are large (64x, 128x, 256x) block sizes (which are not used yet, as I understand, except for the recent initial implementation of 64) useful only for large resolutions, or can they also be better for quality at low bpp?
fab
2021-07-09 02:11:38
it lacks floating blocks
2021-07-09 02:11:53
something that probably in 2022 it will come in jxl
2021-07-09 02:11:57
encoder
2021-07-09 02:12:11
but it was announced few times by both jyrki and jon
2021-07-09 02:12:38
also the decoder is superfast with medium encoded vardct
2021-07-09 02:12:47
like even faster than jpg
2021-07-09 02:13:13
mediocre/ very average speed on older modular images
2021-07-09 02:13:28
super slow on a single core with 16 MP images
2021-07-09 02:13:39
and it uses 435 MB of RAM even with one core
2021-07-09 02:14:13
i'm talking about the prerelease https://github.com/saschanaz/jxl-winthumb/releases/tag/v0.1.10
2021-07-09 02:14:50
it doesn't have this fix
2021-07-09 02:14:51
https://github.com/libjxl/libjxl/commit/a307c57469b824acc55e72d115b3d106eda24221
2021-07-09 02:15:03
committed on 8th july 2021
retr0id
2021-07-09 02:18:33
if libjxl was copyleft, then someone would just make a permissive (or proprietary!) implementation from scratch
fab
2021-07-09 02:34:14
format used for youtube stories: jpg
2021-07-09 02:37:05
is stored on google photos
lithium
Scope Not all of the encoder features are even implemented in the current VarDCT, and even without them there is still room for efficiency improvements. Btw, are large (64x, 128x, 256x) block sizes (which are not used yet, as I understand, except for the recent initial implementation of 64) useful only for large resolutions, or can they also be better for quality at low bpp?
2021-07-09 02:49:07
Yes, vardct quality improvement is still a work in progress; probably some important quality improvements will be implemented this month.
> Jyrki Alakuijala:
> I have at least three more ideas on how to reduce
> (better handling of large transforms,
> more aggressive smoothing,
> flatter quantization matrices).
raysar
2021-07-09 06:54:21
New build for windows: commit a64f5768. There is the new noise synthesis! You need to choose the amount of noise manually, example: cjxl.exe -d 3 --photon_noise ISO1600 (no need to use --noise 1) https://1drv.ms/u/s!Aui4LBt66-MmlGScGgU6NENFO_UZ?e=gFxD5Y
fab
2021-07-09 07:05:04
i didn't use noise 1 but the size changed
2021-07-09 07:05:26
only with this parameter
2021-07-09 07:05:40
using only -q 75 -s 7 or -s 9 gives the same size as 2 days ago
_wb_
2021-07-09 07:12:51
This noise is added on top of the decoded image. The encoder is not trying to do noise estimation or denoising though, right <@604964375924834314> ?
spider-mario
2021-07-09 07:13:23
indeed, it’s purely from the parameter given
2021-07-09 07:13:34
also, if it’s used, there is no need to also say --noise=1, it implies it
Deleted User
2021-07-09 07:16:15
What photon_noise is recommended to get similar results to dithering?
_wb_
2021-07-09 07:18:13
You mean equivalent to 9-bit random lsb?
2021-07-09 07:19:06
Noise doesn't really follow the sRGB transfer curve though, I guess, so it would only be approximate
raysar
spider-mario also, if it’s used, there is no need to also say --noise=1, it implies it
2021-07-09 07:52:18
ok 🙂 i don't understand the code ^^ does --noise 1 now use a fixed ISO?
spider-mario
2021-07-09 07:52:59
`--noise 1` is the way it always used to work, trying to determine the noise parameters by analyzing the image
2021-07-09 07:53:43
if not using `--photon_noise=…`, then `--noise 1` is unchanged from what it was a week ago
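(A minimal sketch of the two noise modes described above; the file names and the -d value are placeholders:)
```
cjxl input.png out_auto.jxl -d 3 --noise=1               # estimate noise parameters from the input image
cjxl input.png out_iso.jxl  -d 3 --photon_noise=ISO1600  # explicit photon-noise model; implies noise synthesis
```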
raysar
spider-mario if not using `--photon_noise=…`, then `--noise 1` is unchanged from what it was a week ago
2021-07-09 07:54:09
ok 🙂
w
2021-07-09 08:17:43
i need help! cjxl can't read this with the color profile
improver
2021-07-09 08:20:09
```
% cjxl v4esrgb.jpg v4esrgb.jxl
JPEG XL encoder v0.3.7 [AVX2]
/home/anon/.cache/yay/libjxl-git/src/libjxl/lib/jxl/fields.cc:809: JXL_FAILURE: No feasible selector for 4887761
/home/anon/.cache/yay/libjxl-git/src/libjxl/lib/jxl/fields.cc:554: JXL_RETURN_IF_ERROR code=1: ok_
/home/anon/.cache/yay/libjxl-git/src/libjxl/lib/jxl/color_encoding_internal.cc:270: JXL_FAILURE: Unable to encode XY -2.443881 0.320882
/home/anon/.cache/yay/libjxl-git/src/libjxl/lib/jxl/color_encoding_internal.cc:463: JXL_RETURN_IF_ERROR code=1: red_.Set(xy.r)
/home/anon/.cache/yay/libjxl-git/src/libjxl/lib/jxl/enc_color_management.cc:674: JXL_RETURN_IF_ERROR code=1: IdentifyPrimaries(profile, wp_unadapted, this)
/home/anon/.cache/yay/libjxl-git/src/libjxl/lib/jxl/jpeg/enc_jpeg_data.cc:299: JXL_RETURN_IF_ERROR code=1: SetColorEncodingFromJpegData(*jpeg_data, &io->metadata.m.color_encoding)
/home/anon/.cache/yay/libjxl-git/src/libjxl/lib/extras/codec_exr.cc:146: JXL_FAILURE: OpenEXR failed to parse input
/home/anon/.cache/yay/libjxl-git/src/libjxl/lib/extras/codec.cc:124: JXL_FAILURE: Codecs failed to decode
/home/anon/.cache/yay/libjxl-git/src/libjxl/lib/extras/codec.cc:137: JXL_RETURN_IF_ERROR code=1: SetFromBytes(Span<const uint8_t>(encoded), io, pool, orig_codec)
Failed to read image v4esrgb.jpg.
/home/anon/.cache/yay/libjxl-git/src/libjxl/tools/cjxl_main.cc:58: JXL_RETURN_IF_ERROR code=1: LoadAll(args, &pool, &io, &decode_mps)
```
spider-mario
2021-07-09 08:53:39
apparently it’s “e-sRGB” which someone described thus: > e-sRGB is an early, primitive and basically failed attempt at a wide gamut specification. It is long obsolete and today we use AdobeRGB or ProPhoto.
2021-07-09 08:54:04
what is kind of strange is that it’s nevertheless an ICCv4 profile
2021-07-09 08:54:20
maybe we could substitute a v2 version for it and we would have more luck with cjxl
w
2021-07-09 08:54:38
it's still part of the main examples on <https://www.color.org/version4html.xalter>
_wb_
2021-07-09 10:08:51
does it have wildly imaginary primaries or something?
spider-mario
2021-07-09 11:05:09
it doesn’t seem to have explicit primaries at all, but those that we infer do seem wild
2021-07-09 11:05:26
the `JXL_FAILURE` from neknekk’s log reads: “Unable to encode XY -2.443881 0.320882”
raysar
2021-07-11 09:37:44
Does cjxl copy all the exif data from png? PNG:
2021-07-11 09:38:14
jxl:
2021-07-11 09:40:16
original png from tiff: https://1drv.ms/u/s!Aui4LBt66-MmlUxKFdSuo7qz-v80?e=kiaOdQ
_wb_
2021-07-11 10:09:39
You can see what blobs cjxl is putting in the container. If it shows exif or xmp, it always just copies the whole blob from the original
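(A minimal sketch, in C, of listing the top-level boxes in a .jxl container file to see which metadata blobs, e.g. "Exif" or "xml ", are present; it assumes the standard ISOBMFF layout of a 4-byte big-endian size followed by a 4-byte type, and the file name handling is illustrative:)
```
#include <stdio.h>
#include <stdint.h>

/* List top-level ISOBMFF boxes in a .jxl container file.
   Assumes plain 32-bit box sizes; size == 1 would mean a 64-bit
   extended size and size == 0 "to end of file", which this sketch
   does not handle. */
int main(int argc, char** argv) {
    if (argc != 2) { fprintf(stderr, "usage: %s file.jxl\n", argv[0]); return 1; }
    FILE* f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }
    uint8_t hdr[8];
    long pos = 0;
    while (fread(hdr, 1, 8, f) == 8) {
        uint32_t size = (uint32_t)hdr[0] << 24 | (uint32_t)hdr[1] << 16 |
                        (uint32_t)hdr[2] << 8  | (uint32_t)hdr[3];
        printf("%8ld  %.4s  %u bytes\n", pos, (char*)hdr + 4, (unsigned)size);
        if (size < 8) break;  /* extended or to-EOF sizes: stop here */
        pos += size;
        if (fseek(f, pos, SEEK_SET) != 0) break;
    }
    fclose(f);
    return 0;
}
```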
w
2021-07-11 10:10:23
is it possible to have no color profile in jxl?
_wb_
2021-07-11 10:16:46
It's either enum values or an icc profile, cannot have nothing
2021-07-11 10:17:43
You can do "Unknown" in the enum, but cjxl does not produce that (on input without colorspace info, it assumes sRGB and signals that)
raysar
_wb_ You can see what blobs cjxl is putting in the container. If it shows exif or xmp, it always just copies the whole blob from the original
2021-07-11 10:57:49
But here it doesn't display the same metadata as the png. It's a problem.
2021-07-11 11:03:35
it's my previous link
2021-07-11 11:24:33
yes: https://1drv.ms/u/s!Aui4LBt66-MmlRP3RmVo56pF2K3i?e=qiCpWQ
Cagelight
2021-07-11 12:24:36
when I use imagemagick to convert the tif to the png, the tiff metadata is preserved, shouldn't the same be true for JXL?
_wb_
2021-07-11 12:28:05
the jxl file format has defined how to attach Exif, XMP (or rather any XML), and JUMBF to a jxl image
2021-07-11 12:29:48
cjxl does not take TIFF as input
2021-07-11 12:31:02
ImageMagick tends to create all kinds of non-standard metadata in an effort to preserve whatever it can
2021-07-11 12:31:43
in the case of PNG, that would end up in txt chunks that cjxl does not understand so it will just get stripped
2021-07-11 12:34:49
in principle, the ISOBMFF container format jxl is using could be extended easily - the decoder just ignores anything that is not a jxl codestream
2021-07-11 12:38:04
likely exiv2 knows how to parse tiff (it has to do that anyway if it wants to parse exif, which is basically the same syntax as tiff except it doesn't have to contain an image), but it doesn't know how to parse ImageMagick's non-standard PNG metadata chunks
Cagelight
2021-07-11 12:38:28
yeah I was using exiftool, which has a lot more extensive support
2021-07-11 12:41:18
for what it's worth, while ImageMagick doesn't seem to know how to translate TIFF metadata properly, `exiftool` does, I can do `exiftool -tagsfromfile Irlande1_034.tif Irlande1_034.png` and the TIFF metadata gets reasonably translated into EXIF
_wb_
2021-07-11 12:41:31
metadata in image formats is basically a huge amount of historically evolved fragile stuff that hangs together with some duct tape and glue
raysar
2021-07-11 01:05:07
If i convert the tiff to jpg with imagemagick, the exif is destroyed :/ I understand that the exif in this tiff is not standard (it's metadata from a film scanner), but it's text metadata; there are solutions to keep it, like the png does. People need to keep all the metadata of their files when they convert them to jxl. Is it hard to copy text metadata even if it's not standard?
Cagelight
raysar If i convert the tiff to jpg with imagemagick, the exif is destroyed :/ I understand that the exif in this tiff is not standard (it's metadata from a film scanner), but it's text metadata; there are solutions to keep it, like the png does. People need to keep all the metadata of their files when they convert them to jxl. Is it hard to copy text metadata even if it's not standard?
2021-07-11 01:30:21
EXIF isn't being destroyed, there was no EXIF in the first place, just TIFF metadata. You need something that converts TIFF metadata to EXIF metadata. `exiftool` and `exiv2` can do this
2021-07-11 01:31:05
probably some image editors too
raysar
2021-07-11 01:34:13
Ok, it's tiff metadata. With "exiftool -tagsfromfile Irlande1_034.tif Irlande1_034.png" i get:
Cagelight
2021-07-11 01:37:52
not everything will be preservable, that's probably as much as can be transferred
raysar
2021-07-11 01:47:18
the binary Tone Reproduction Curve cannot be stored? It's useless to keep that data?
2021-07-11 01:48:31
ok thank you i will read it 🙂
2021-07-11 01:52:16
It was not my scanner, maybe not calibrated, i think it's used out of the box.
Maiki3
2021-07-11 08:03:35
anyone know how I can losslessly rotate a JXL?
_wb_
2021-07-11 08:59:28
There's no cli tool for it yet, but it is just a header field change
raysar
_wb_ There's no cli tool for it yet, but it is just a header field change
2021-07-11 09:17:10
We need a `cjxl input.jxl output.jxl --rotate 90`
2021-07-11 09:18:57
And a --mirror option 😜
BlueSwordM
2021-07-11 09:19:09
We also need `cjxl input.PNG output.jxl --dither=50`
raysar
2021-07-11 09:23:40
We have so much work for the devs 😆 , another option: removing noise synthesis and adding new noise losslessly to an input jxl.
BlueSwordM
raysar We have so much work for the devs 😆 , another option: removing noise synthesis and adding new noise losslessly to an input jxl.
2021-07-11 09:24:48
Yeah, that kind of bitstream editing would be mental.
spider-mario
2021-07-11 11:27:30
yes, something more or less like `jpegtran` but for jxl
2021-07-11 11:27:32
`jxltran`
2021-07-11 11:28:12
or a bit like swfmill
2021-07-11 11:28:41
why is this my reference
2021-07-11 11:28:57
it’s 2021
2021-07-11 11:29:03
flash is officially dead
BlueSwordM
2021-07-11 11:38:33
This is some very weird artifacting at D1 <:kekw:808717074305122316> https://github.com/libjxl/libjxl/issues/314
2021-07-11 11:39:35
Metadata makes all of the difference, doesn't it?
Orum
2021-07-12 02:03:35
I hope they're not using 0.3BPP for medical imaging...
2021-07-12 02:06:02
once RAM usage goes down I'll try it on some of my slide scans though
w
2021-07-12 04:44:10
how many levels of faster_decoding are there?
yllan
2021-07-12 04:58:14
excuse me, how do you assign distance < 0.1? I tried `cjxl -d 0.001 input.png output.jxl` But it always shows `Encoding [VarDCT, d0.100, squirrel], 6 threads.`
BlueSwordM
yllan excuse me, how do you assign distance < 0.1? I tried `cjxl -d 0.001 input.png output.jxl` But it always shows `Encoding [VarDCT, d0.100, squirrel], 6 threads.`
2021-07-12 05:12:50
I think there might be either a UI limit, a hard cap at 1 decimal of precision, or a hard cap at 0.1.
_wb_
2021-07-12 05:51:45
At some point lossless becomes smaller
BlueSwordM Metadata makes all of the difference, doesn't it?
2021-07-12 05:55:13
It's more about passing input correctly to cjxl. A perceptual encoder needs to know what the image looks like, which wasn't the case here (very dark gray was supposed to be white).
w how many levels of faster_decoding are there?
2021-07-12 05:55:32
Currently it is 0-4
Maiki3
_wb_ There's no cli tool for it yet, but it is just a header field change
2021-07-12 08:19:40
So what do I do? I need to rotate some images. 🙂
_wb_
2021-07-12 08:21:49
Write a `jxltran` tool 🙂
Maiki3
2021-07-12 08:23:25
ah, unfortunately i don't possess the skillset for that
w
2021-07-12 09:06:22
just change byte 64
_wb_
2021-07-12 10:20:57
it's trickier if the original image uses default headers and has no orientation yet, and the position of the field is not fixed
fab
2021-07-12 03:42:33
-q 50.883 -s 9 -p --gaborish=1 --epf=3 --use_new_heuristics
BlueSwordM
2021-07-12 03:47:32
Yeees.
2021-07-12 03:47:41
SIMD noise generation <:Poggers:805392625934663710>
w
2021-07-12 04:06:46
is there some macro in the API to pick between lossless, lossy, (for JPG) lossless lossy transcode depending on which one will compress better? Or do I have to try all 3
fab
2021-07-12 04:10:49
current best settings jxl
2021-07-12 04:11:15
that's the build date and exe, download this
2021-07-12 04:11:24
and use all these parameters
2021-07-12 04:11:25
for %i in (C:\Users\User\Documents\enc\*.jpg) do cjxld1 -s 8 -d 8.839 --use_new_heuristics -I 0.488 --faster_decoding=4 --epf=1 -p %i %i.jxl
2021-07-12 04:11:30
they are all supported
2021-07-12 04:12:29
the encoder probably will cause a lot of face distortions
2021-07-12 04:12:37
but it's an easy setting you can use
2021-07-12 04:12:43
without thinking about what you can do
2021-07-12 04:13:16
https://discord.com/channels/794206087879852103/794206170445119489/862368770085486622
2021-07-12 04:16:21
06/07/2021 04:39 eddiezato
raysar
w is there some macro in the API to pick between lossless, lossy, (for JPG) lossless lossy transcode depending on which one will compress better? Or do I have to try all 3
2021-07-12 05:34:29
As i understand, not for now; you need to brute-force it yourself, testing several encodings and choosing the best. But in the future another effort could do that.
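(A minimal brute-force sketch of that, assuming a JPEG input and the cjxl flags used elsewhere in this chat; file names and the -d 1 distance are placeholders:)
```
#!/bin/sh
in=input.jpg
cjxl "$in" out_transcode.jxl -j     # lossless transcode of the JPEG data
cjxl "$in" out_lossless.jxl -d 0    # pixel-lossless re-encode
cjxl "$in" out_lossy.jxl -d 1       # lossy VarDCT re-encode
# print the smallest of the three (ls -S sorts largest first)
ls -S out_transcode.jxl out_lossless.jxl out_lossy.jxl | tail -n 1
```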
improver
2021-07-12 11:58:40
depends on source. if source are pngs, then do lossless modular
w
2021-07-13 12:09:35
i find that for grayscale manga from jpg, -j is way better
2021-07-13 12:10:05
lossless
2021-07-13 12:11:03
what i did for my library was check manually
2021-07-13 12:11:14
actually
2021-07-13 12:11:18
i just do both
2021-07-13 12:11:20
-j and no -j
2021-07-13 12:11:25
and pick the smaller one
2021-07-13 12:11:59
it's just for myself
2021-07-13 12:14:56
i find that these types of errors are insignificant when taking into account that jpg being decoded to a screen has the same problem
improver
2021-07-13 12:21:44
yeah. `-j` ends up lossless since the source is lossy; and modular is so good for manga
2021-07-13 12:22:00
if the jpeg artifacts aren't that bad then it's very often better
w
2021-07-13 12:22:06
but most cases of color jpg are better without -j
improver
2021-07-13 12:22:28
depends whether its digital source or scans kinda
2021-07-13 12:22:50
better just do both if you wanna optimize the most
2021-07-13 12:23:41
depends on encoding flags i guess
2021-07-13 12:24:46
this looks like digital source so png is reasonable
2021-07-13 12:25:40
ehh, default flags are not something I'd care about (I prefer `-e 9 -I 1 -E 3` just because i can)
2021-07-13 12:27:17
for compressing things to serve to customers then yeah latency is important but for bulk recompression of existing things i just ramp up to max (or cap it to the point when it takes too long)
w
2021-07-13 12:35:38
-d 0
2021-07-13 12:35:49
or -q 100
2021-07-13 12:39:10
1276065  cjxl 11.jpg -j -d 0 11.j.jxl
1649037  Jul 12 17:38 11.jpg
1421669  cjxl 11.jpg 11.jxl
2021-07-13 12:39:17
it's very random that's why i check both
2021-07-13 12:43:25
i did not keep stats for that
improver
2021-07-13 01:00:51
in my experience it IS the case that often `-j -d 0` can be quite a bit smaller for anime art and manga stuff
2021-07-13 01:01:59
if it's scanned, not digital source, then yes. can you give a link to show how images look like
2021-07-13 01:04:28
also uhh I'd also do `-e 9 -I 1 -E 3` too just cuz it's a maximum setting and normal jpeg transcoding can't reach these normally
w
2021-07-13 01:06:21
1119951 Jul 12 18:06 11.j.s9.jxl
Diamondragon
improver depends on source. if source are pngs, then do lossless modular
2021-07-13 01:06:28
Worth noting that lossless modular decoding is quite slow. Much slower than VarDCT.
improver
2021-07-13 01:07:26
fast enough for me :^)
2021-07-13 01:08:01
yeah this certainly looks like a lot of jpeg artifacts, which modular wouldn't be able to pick up properly
Diamondragon
improver fast enough for me :^)
2021-07-13 01:12:11
Really? It's been tough for me to justify using. Opening lossless modular images takes multiple, full seconds for me. Nearly four seconds on the dot in ImageGlass. Admittedly the images are large (about 2260 x 3200 pixels), but it is still really jarring.
improver
2021-07-13 01:13:12
guess my hardware is just fast enough... also I think progressive decoding could probably make it a bit less painful
2021-07-13 01:13:20
also do you have images to share
2021-07-13 01:13:27
the most painful ones
w
2021-07-13 01:14:01
anything <2 mp is really fast/not noticeable when scrolling through them compared to jpg
2021-07-13 01:14:02
on my phone
2021-07-13 01:14:40
4mp takes 160ms
2021-07-13 01:14:44
on my phone
improver
2021-07-13 01:18:49
years ago I didn't. that was kinda important experience for me as software dev, taught me importance of optimization
2021-07-13 01:19:07
but, like, now i have money so...
2021-07-13 01:20:00
not that much money but enough to get something somewhat decent
Diamondragon
improver the most painful ones
2021-07-13 01:20:12
Nothing on hand that would be legal to share, I'm afraid. Only other people's copyrighted stuff.
improver
2021-07-13 01:20:29
there's fair use for this kind of usage
w
2021-07-13 01:20:39
me with 0$ phone with the 2 year contract plan
2021-07-13 01:21:36
just share in dm
improver
2021-07-13 01:22:19
recent enough low-end stuff does save money, but if you use really old stuff then it'll probably eat into your electricity bill a bit
2021-07-13 01:23:00
it kinda depends and these cost tradeoffs should be calculated beforehand though before making any decision
2021-07-13 01:26:29
regarding content stuff i just blast em wherever i go. me posting things doesn't mean that I own them or that people can use them commercially. i think such usage, like for encoder testing, is actually covered by <https://www.copyright.gov/fair-use/more-info.html>
w
2021-07-13 01:27:34
actually fair use is not a catch all excuse
2021-07-13 01:27:39
it's really up to the copyright holder
improver
2021-07-13 01:27:50
it's up to courts
w
2021-07-13 01:28:07
but this is the internet and there is a minimum level of anonymity so 🤷‍♂️
improver
2021-07-13 01:28:42
this clearly falls into non-commercial education and research kind of thing imo
2021-07-13 01:29:12
especially since im not hired to work on jxl so it's really non-commercial
w
2021-07-13 01:29:35
yeah but you wouldn't be sending others your copy of photoshop in a non-commercial education setting
2021-07-13 01:29:41
piracy is piracy
improver
2021-07-13 01:30:26
it's kinda complicated but for some antivirus deving I actually would
w
2021-07-13 01:30:46
fan scanlations is still piracy
improver
2021-07-13 01:30:54
if who you send to only use for non-commercial antivirus deving
w
2021-07-13 01:31:06
that's why md should not give scanlators rights. they are not human
2021-07-13 01:31:27
yeah im a pirate
2021-07-13 01:31:29
im also canadian
improver
2021-07-13 01:31:53
it depends on whether that piracy actually allows one to replace the original paid experience
2021-07-13 01:32:05
if it's obtained for free then it's a lot more lax
w
2021-07-13 01:32:27
but that's up to the owner of the work
improver
2021-07-13 01:32:28
sharing whole complete manga would probably be bad but sharing cuts is fine
2021-07-13 01:32:43
sharing images given out for free is fine too
w
2021-07-13 01:32:59
not necessarily
2021-07-13 01:33:15
only if you have a license to redistribute it
2021-07-13 01:33:16
even if it's free
improver
2021-07-13 01:33:43
it depends why you're redistributing it
Diamondragon
improver there's fair use for this kind of usage
2021-07-13 01:33:47
I'll try to find permissive licensed lossless images to test, and see if I can replicate the slowness. I'll then come back to share those.
w
2021-07-13 01:33:53
it doesnt make them less illegal
2021-07-13 01:34:17
just because you "feel" like it's "fair"
improver
2021-07-13 01:35:00
idk idk lossless licenses for encoding testing of content sounds hella stupid, unless you wanna include it in conformance repo or something like that
2021-07-13 01:35:36
actually if you read the law it depends on courts
w
2021-07-13 01:36:02
yeah and are you going to pay for a lawyer
improver
2021-07-13 01:37:15
I'd actually rather guide my decisions by what I observe in the wild (everyone just posts stuff whatever) than theoretical paranoia
2021-07-13 01:38:30
just ensure you don't piss someone off and you should be fine
2021-07-13 01:39:30
and if you piss someone off you can be bitten even if you don't break any law
2021-07-13 01:40:20
that's why i like anonymity in general
2021-07-13 01:40:49
but this is kinda going offtopic tbh and i should be sleeping at this time too so ill just shut up
2021-07-13 01:41:57
btw lenna wasn't copyright free :^)
w
2021-07-13 01:58:19
is it possible to have a gray icc profile with a non-gray number of channels for jxl?
improver
2021-07-13 01:58:23
phoneposting in bed, but if images are something like actually not widely published at all (but may be in the future) and somewhat scarce then yes them getting out could be economically damaging and fair use could probably not apply
w is it possible to have a gray icc profile with a non-gray number of channels for jxl?
2021-07-13 01:59:42
as in 3 channels but ICC makes stuff gray?
w
2021-07-13 02:00:10
as in does JxlDecoderGetColorAsICCProfile return cmsSigGrayData if and only if the number of color channels = 1
2021-07-13 02:26:15
never mind i found it
2021-07-13 02:26:18
> * If and only if this is 1, the JxlColorSpace in the JxlColorEncoding is
> * JXL_COLOR_SPACE_GRAY.
2021-07-13 03:41:54
ok i didn't really find it.
2021-07-13 03:51:26
I have 2 questions: If there is only 1 color channel, is it guaranteed that JxlDecoderGetColorAsICCProfile will never return a color profile with a color space other than cmsSigGrayData? Does JXL_DEC_COLOR_ENCODING always come before JXL_DEC_NEED_IMAGE_OUT_BUFFER? I can't find where the order is documented, or at least in decode.h.
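(For what it's worth, a minimal sketch of the event loop, assuming the JxlDecoderSetInput/JxlDecoderProcessInput shape of the libjxl API and treating the ordering of JXL_DEC_COLOR_ENCODING before JXL_DEC_NEED_IMAGE_OUT_BUFFER as an assumption to verify, not documented behaviour; error handling is elided:)
```
#include <stdio.h>
#include "jxl/decode.h"

/* Sketch: subscribe to both events and observe the order they arrive in.
   `data`/`size` hold the complete .jxl file. */
void inspect(const uint8_t* data, size_t size) {
    JxlDecoder* dec = JxlDecoderCreate(NULL);
    JxlDecoderSubscribeEvents(dec, JXL_DEC_COLOR_ENCODING |
                                   JXL_DEC_NEED_IMAGE_OUT_BUFFER);
    JxlDecoderSetInput(dec, data, size);
    for (;;) {
        JxlDecoderStatus st = JxlDecoderProcessInput(dec);
        if (st == JXL_DEC_COLOR_ENCODING) {
            printf("color encoding event\n");      /* assumed to come first */
        } else if (st == JXL_DEC_NEED_IMAGE_OUT_BUFFER) {
            printf("output buffer requested\n");   /* a real decoder sets the buffer here */
            break;
        } else if (st == JXL_DEC_SUCCESS || st == JXL_DEC_ERROR ||
                   st == JXL_DEC_NEED_MORE_INPUT) {
            break;
        }
    }
    JxlDecoderDestroy(dec);
}
```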
Diamondragon
2021-07-13 04:04:55
There are few apps as yet which can open jxl files, but I downloaded the portable version of Firefox Nightly from Mozilla's website. Modular Lossless files (2260 x 3200 in size) are completely drawn in about a second. Still slow, but not completely terrible. There must be something wrong with ImageGlass.
diskorduser
2021-07-13 05:02:16
It takes 2 to 3 seconds on Firefox to load a 2260x3390 modular. IMO it's not bad.
Diamondragon
2021-07-13 05:28:27
Yeah. Closer to two seconds for colour pages for me, after all (on an i7-6820HK.) Just over a second for greyscale pages. No good way for me to precisely measure that I know of.
190n
2021-07-13 06:30:50
i know JPEG XL supports many channels in addition to color and alpha. is there a mechanism to standardize certain use cases for the additional channels so you could open a file with compliant software without having to tell the software what the channels are for?
2021-07-13 06:31:51
e.g. for the scheme in https://www.reddit.com/r/jpegxl/comments/oi9azb/will_jpeg_xl_replace_all_raw_camera_formats/h4u2sig/, could some metadata in the file specify "custom channel #0 is for the difference between 2 green pixels"?
yurume
190n i know JPEG XL supports many channels in addition to color and alpha. is there a mechanism to standardize certain use cases for the additional channels so you could open a file with compliant software without having to tell the software what the channels are for?
2021-07-13 08:36:30
there is a predefined list of extra channel types https://github.com/libjxl/libjxl/blob/89889a1f/lib/include/jxl/codestream_header.h#L44-L62
fab
2021-07-13 11:01:02
jpeg xl hasn't improved in lossless encoding, the max it can do is 1:5; from 06072021 to 13072021 with only -s 9 -q 100 the file size is the same. Big improvement at -s 9 -d 1 in file size and everything; super big improvement in denoising, sharpening, compression. Very low file size: it gets 730 kb for a 7000x4000 upscale, that's at -s 9 -d 1.17 --use_new_heuristics
_wb_
yurume there is a predefined list of extra channel types https://github.com/libjxl/libjxl/blob/89889a1f/lib/include/jxl/codestream_header.h#L44-L62
2021-07-13 11:43:35
Yes, there's a list of the things we could anticipate (things that already exist today in one form or another). There are some reserved values for specific things we might want to add later, and then there are the generic "Unknown" and "Optional", where the intention is that you use the (optional) channel name to describe the meaning of the channel, and where "Optional" means that a decoder that doesn't understand it, can safely just ignore it, while "Unknown" means that a decoder should _not_ ignore it and preferably either refuse to decode or show an error or at least a warning if it doesn't understand it.
2021-07-13 11:46:29
For camera raw we have the "CFA" extra channel type, but we haven't specified yet what convention to use to signal the bayer pattern configuration etc. One approach could be to use DNG or something like that for that, and give it a jxl payload that just happens to have a CFA extra channel (and the DNG metadata specifies what it means). Another could be to define a box for it at the jxl container level.
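(A minimal sketch of reading those extra channel types through the decoder API, assuming JxlDecoderGetBasicInfo and JxlDecoderGetExtraChannelInfo behave as in the linked codestream_header.h; error handling is elided:)
```
#include <stdio.h>
#include "jxl/decode.h"

/* Sketch: after JXL_DEC_BASIC_INFO fires, enumerate the extra channels
   and print their declared types. */
void list_extra_channels(JxlDecoder* dec) {
    JxlBasicInfo info;
    JxlDecoderGetBasicInfo(dec, &info);
    for (uint32_t i = 0; i < info.num_extra_channels; ++i) {
        JxlExtraChannelInfo ec;
        JxlDecoderGetExtraChannelInfo(dec, i, &ec);
        /* ec.type is a JxlExtraChannelType: JXL_CHANNEL_ALPHA, JXL_CHANNEL_CFA,
           JXL_CHANNEL_OPTIONAL, JXL_CHANNEL_UNKNOWN, ... */
        printf("extra channel %u: type %d\n", (unsigned)i, (int)ec.type);
    }
}
```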
fab
2021-07-13 07:42:57
Fabian — Today at 21:34: ok, i'd say -d 0.97 -s 7 --use_new_heuristics with the next build should look the same as default cavif-rs in most cases. we should also compare file sizes. anyway this build is good: 768x959, 153 KB (157,434 bytes) from twitter. this type of already over-compressed jpg i wouldn't compress. i wouldn't use -s 9 -d 1 on everything that is photos with this build, as the modular is at max; so for photos specifically i'd say wait for the next build and use -d 0.97 -s 7 --use_new_heuristics, or wait at least 6 months until jpeg xl gets automatic lossless/vardct/lossy
2021-07-13 07:43:45
...
2021-07-13 07:44:20
i think is a bit improvement 6 days ago was comparable to webp
Jyrki Alakuijala
2021-07-13 08:25:30
good news, more ringing reduction is coming
Fox Wizard
2021-07-13 08:27:55
<:Poggers:805392625934663710>
eddie.zato
2021-07-14 05:08:02
I think we already need "ringing" emoji <:CatSmile:805382488293244929>
190n
2021-07-14 05:21:27
💍 🪐
fab
2021-07-14 09:03:42
also i'm curious to try -d 1.055 -s 6 --use_new_heuristics after a couple more builds. i think the next libjxl release will come in september at the earliest
2021-07-14 09:05:26
libjxl v0.3.7-241-g89889a1 win_x64 2021.07.13
2021-07-14 09:05:36
the last build i have
Jyrki Alakuijala
2021-07-15 10:41:11
I ended up delaying the better but slower ringing reduction I have -- and I'm trying to reach the same quality with faster and simpler heuristics
2021-07-15 10:41:34
the first step is in https://github.com/libjxl/libjxl/pull/322
2021-07-15 10:42:20
but it is much less effective than the other way of doing it; still a step in the right direction without any slowdown
2021-07-15 10:48:18
example improvement from 322, before:
2021-07-15 10:48:37
after:
2021-07-15 11:44:42
in the mean time... AVIF has delivered a huge improvement
2021-07-15 11:44:50
https://storage.googleapis.com/demos.webmproject.org/webp/cmp/2021_07_13/index.html#08-2011-panthera-tigris-tigris-texas-park-lanzarote-tp04&AVIF-AOM=s&AVIF-AOM=s&subset1
2021-07-15 11:47:20
vs. 2021_07_02
2021-07-15 11:50:46
the 'end of show' image shows a big improvement, for example
2021-07-15 11:51:08
cecret lake, also
2021-07-15 11:51:34
white dune
2021-07-15 11:51:40
probably all the pics are better now
_wb_
2021-07-15 11:51:50
this kind of encoder improvement race is good
Jyrki Alakuijala
2021-07-15 11:51:58
yes, perfect!
_wb_
2021-07-15 11:53:01
it's probably not a super fair race though - I suspect more people are working on av1/avif encoder improvement than on jxl encoder improvement
2021-07-15 11:58:26
the encode time of AOM is lower than BPG/WebP2 now
2021-07-15 11:58:52
big difference with the plot from February:
2021-07-15 12:00:02
BPG hasn't changed, but WebP2 became slower and libaom became a lot faster
2021-07-15 12:02:06
still an order of magnitude slower than jxl, but an order of magnitude faster than what it was half a year ago
2021-07-15 12:06:10
We really need more subjective evaluations done, and done correctly. We have re-established the JPEG AIC adhoc group to work on image quality assessment methodology, and that will be much needed.
improver
2021-07-15 12:08:18
I'd still like to have that slow ringing reduction on slowest effort option perhaps. because it really looks so much better
_wb_
2021-07-15 12:08:41
I don't believe objective metrics, but <@!826537092669767691>'s DSSIM is perhaps quite 'neutral' since no codec explicitly optimizes for it or internally uses the colorspace it operates in. It's interesting to see the evolution in those plots:
2021-07-15 12:08:51
February 2021:
2021-07-15 12:09:31
AVIF better (lower curve) than JXL for most of the bitrate range, JXL only better than AVIF above 1.3 bpp or so, JXL better than BPG above 1 bpp or so.
2021-07-15 12:09:48
July 2021:
2021-07-15 12:12:05
JXL consistently better than AVIF, AVIF has become worse than BPG, JXL better than BPG above 0.5 bpp or so.
2021-07-15 12:12:59
No idea if these objective metric results correspond to subjective results, that is hard to tell.
2021-07-15 12:14:48
But it does look like the order-of-magnitude speedup of libaom of the past half year may come at some sacrifice in compression density, at least according to this metric. It would be somewhat weird if such a speedup would *not* come at a cost imo (for a codec that is already a few years old).
2021-07-15 12:19:14
the other metrics seem to agree that AVIF has become worse compared to 6 months ago; then again visually it does look better so it might be things like grain synthesis that tend to mess up metrics results but can be good visually...
Jyrki Alakuijala
_wb_
2021-07-15 12:20:01
I just look at the images; in this last comparison AVIF's weakest points improved
2021-07-15 12:20:08
sky doesn't look terrible now
2021-07-15 12:20:23
red objects are getting more bits, no longer utterly terrible
2021-07-15 12:20:41
dark shadows are better, not yet ideal
_wb_
2021-07-15 12:20:47
yes, I looked at a few and I also see improvements (the file size is not always the same though)
Jyrki Alakuijala
2021-07-15 12:21:10
textures don't disappear, the original noise is more respected -- it may still disappear or reduce in places, especially close to boundaries
2021-07-15 12:21:31
BPG doesn't look good to me, I'd never use it for my own images
_wb_
2021-07-15 12:22:31
cecret lake looks a lot better in aom now at 0.94 bpp than in February at 1 bpp
Jyrki Alakuijala
2021-07-15 12:22:33
AVIF used to have a tendency of deciding somewhat arbitrarily that some large area of the image (perhaps at 64x64 resolution) didn't deserve bits, and just smoothed it out
2021-07-15 12:22:49
yep, it is a huge achievement from the AVIF/AV1 team
2021-07-15 12:23:09
as far as I understood they did this within the same flags, i.e., it is not from --tune=ssim changed to --tune=butteraugli
_wb_
2021-07-15 12:23:54
possibly as part of the speedup, they also do less search for things that can lead to big blurry blocks
Scope
2021-07-15 12:24:17
https://aomedia-review.googlesource.com/c/aom/+/141702
_wb_
2021-07-15 12:26:06
I hope it will also lead to better av1 video, i.e. that the improvements are not restricted to intra-only mode
paperboyo
2021-07-15 12:32:35
Interestingly, this improvement in detail preservation at sensible bpps came at a cost visible in eg. degraded patterns on balloons between https://storage.googleapis.com/demos.webmproject.org/webp/cmp/2021_06_08/index.html#clovisfest&AVIF-AOM=t&AVIF-AOM=t&subset1 & https://storage.googleapis.com/demos.webmproject.org/webp/cmp/2021_07_13/index.html#clovisfest&AVIF-AOM=t&AVIF-AOM=t&subset1
_wb_
2021-07-15 12:45:48
that might also be the aom encoder speedups making it try fewer prediction modes or something
Jyrki Alakuijala
paperboyo Interestingly, this improvement in detail preservation at sensible bpps came at a cost visible in eg. degraded patterns on balloons between https://storage.googleapis.com/demos.webmproject.org/webp/cmp/2021_06_08/index.html#clovisfest&AVIF-AOM=t&AVIF-AOM=t&subset1 & https://storage.googleapis.com/demos.webmproject.org/webp/cmp/2021_07_13/index.html#clovisfest&AVIF-AOM=t&AVIF-AOM=t&subset1
2021-07-15 12:48:07
previously they spent all the bits on the balloons, now when they spend some on the sky, too, there are not enough bits left for the balloons
2021-07-15 12:49:31
In my opinion these <0.1 bpp comparisons are not driven by practicality, even less than 1 bpp is questionable to me -- do we need/benefit from that risk
fab
2021-07-15 12:52:12
is this done at slowest speed for each encoder?
Jyrki Alakuijala
2021-07-15 12:58:03
no
2021-07-15 12:58:25
I like how VMAF thinks JXL is better than AVIF at lowest bpp, and worse at highest: https://storage.googleapis.com/demos.webmproject.org/webp/cmp/2021_07_13/plots.html
2021-07-15 12:58:36
all evidence of viewing suggests the other way around
fab
2021-07-15 01:02:46
when vvc will be added to the comparison?
2021-07-15 01:03:01
at slowest possible speed
paperboyo
Jyrki Alakuijala In my opinion these <0.1 bpp comparisons are not driven by practicality, even less than 1 bpp is questionable to me -- do we need/benefit from that risk
2021-07-15 01:04:33
> these <0.1 bpp comparisons are not driven by practicality, even less than 1 bpp is questionable to me Fair and agreed; up to a point. With my usual bias towards the low-end spectrum, I would only offer two semi-counterarguments: 1. the codec allows those low settings; if it could know that by trying to be “faithful” at those low qualities it will be neither faithful nor pleasing, it could steer itself more into the “pleasing” territory 2. to my naive mind, in an image of average 1bpp there are areas that are effectively 1.5bpp and areas that are effectively 0.4bpp. I would prefer those lower quality areas to look better than worse, so that’s why, for me, these comparisons at absurdly low qualities are useful: to see how this particular codec “breaks down”. Actually, in my (limited!) experience, every single codec starts breaking in low quality areas much earlier than exhausting its ability to reduce quality in higher quality areas. As I tried (badly) to express in this here comment [3rd para from bottom]: https://github.com/GoogleChromeLabs/squoosh/issues/270
Jyrki Alakuijala
2021-07-15 01:06:41
jpeg xl is run at default speed (7) in those comparisons -- avif is also run at a reasonable speed, possibly about 7x slower than jpeg xl's encoding speed
2021-07-15 01:07:04
avifenc -a end-usage=q -a tune=ssim -a deltaq-mode=3 -a sharpness=3 -y 420 --min 0 --max 63 -a cq-level=$quality -o $target $origpng
2021-07-15 01:07:29
cjxl $origpng $target -v -q $quality
fab
2021-07-15 01:10:15
webp2 is slower? how to find parameters?
2021-07-15 01:10:39
https://storage.googleapis.com/demos.webmproject.org/webp/cmp/2021_07_13/recipes.json
2021-07-15 01:11:35
ah all default speed
Jyrki Alakuijala
2021-07-15 01:14:00
possibly some differences in parallelism
2021-07-15 01:20:37
how image codecs are used in practice:
2021-07-15 01:20:55
1. find what is the compression level that produces acceptable results
2021-07-15 01:21:18
2. give it some margin so that there will not be surprises with difficult material
2021-07-15 01:22:42
the amount of margin necessary may vary from codec to codec, and because of this, the correct adaptiveness within the codec is important since it affects the margin
2021-07-15 01:25:08
very few organizations are able to hire an image quality person to review every served image, or to build an understanding what (surprising) image quality degradations cause to their business
2021-07-15 01:26:02
big organizations like Facebook do have mechanisms in place to be able to trace impact on user behaviour, and they can find the speed spot in quality for users
2021-07-15 01:26:41
I believe most of the organizations do it manually and then add margin
2021-07-15 01:26:49
or they use 'standards' in the field
2021-07-15 01:27:09
like 'photoshop for dummies' saying which quality to use for a website
2021-07-15 01:27:57
((or a more prestigious user guide))
2021-07-15 01:37:07
and some smart companies externalize image quality to Jon 😄
paperboyo > these <0.1 bpp comparisons are not driven by practicality, even less than 1 bpp is questionable to me Fair and agreed; up to a point. With my usual bias towards the low-end spectrum, I would only offer two semi-counterarguments: 1. the codec allows those low settings; if it could know that by trying to be “faithful” at those low qualities it will be neither faithful nor pleasing, it could steer itself more into the “pleasing” territory 2. to my naive mind, in an image of average 1bpp there are areas that are effectively 1.5bpp and areas that are effectively 0.4bpp. I would prefer those lower quality areas to look better than worse, so that’s why, for me, these comparisons at absurdly low qualities are useful: to see how this particular codec “breaks down”. Actually, in my (limited!) experience, every single codec starts breaking in low quality areas much earlier than exhausting its ability to reduce quality in higher quality areas. As I tried (badly) to express in this here comment [3rd para from bottom]: https://github.com/GoogleChromeLabs/squoosh/issues/270
2021-07-15 01:39:05
The item 2. -- Could you explain more?
_wb_
2021-07-15 01:41:20
I think there are three models of encoder setting that are widely used: 1. Manual: use the "export to..." dialog in Gimp or Photoshop and play with the quality slider until the preview looks good. 2. Simplistic automatic: what Jyrki describes above, basically "set it to a safe value", e.g. Wordpress by default just recompresses everything to libjpeg q82, 4:2:0. 3. Fancy automatic: like Cloudinary's q_auto (and there are others who do similar things), have an automated system to do per-image setting adjustment based on heuristics.
2021-07-15 01:45:46
An encoder that has a good internal perceptual model and adaptation, has the main benefit that it can make local choices (instead only having a global quality slider) and that it has a much narrower/reliable output range subjectively for a given encoder setting. This means that for 1. it saves time for users because you don't have to play with the slider for every image, just once will be OK. 2. it saves bandwidth because "safe" can be lower than before while still having a better "worst case" than before. 3. it saves cpu effort (no more need for iterative search and complicated heuristics) and likely produces better results.
Jyrki Alakuijala
2021-07-15 01:46:44
none of the processes that I know are centered on 'let's make it 0.8 bpp'
2021-07-15 01:47:17
the processes are centered on quality, how the image looks -- the bpp is only a consequence
2021-07-15 01:50:34
ideally codecs would be compared against each other through the same process as actual use
2021-07-15 01:51:13
all of 1, 2, and 3.
2021-07-15 01:51:55
particularly it shouldn't be matching a bpp produced by some other codec's faulty quality heuristics and special coding characteristics
_wb_
2021-07-15 01:56:19
the only reason why "specific bpp" encoding is a thing is because in subjective eval the only fair thing to do is to evaluate different codecs at the same bpp
2021-07-15 01:57:17
configuring an encoder to obtain a specific bpp or filesize in practical usage scenarios (not for testing) is a horrible idea
paperboyo
Jyrki Alakuijala The item 2. -- Could you explain more?
2021-07-15 01:59:35
In short: by looking at low quality for a given codec, it’s easier to see how it’s gonna look when this codec “breaks down” (differs depending on the codec). Also: when a codec breaks down on an image with differing characteristics (areas), it’s often that “a sky” breaks down much earlier than “a forest” will (differs depending on the codec).
2021-07-15 02:01:59
I’ve been in a situation, having only access to Jon’s second option (no fancy automatic), where I would have to unacceptably raise the weight of 95% of imagery for no useful reason other than to protect breaking down some image areas in 5% of imagery. I decided it’s not worth it. I hate to have to be in such a position whenever I happen to be looking at an image from these 5%.
Jyrki Alakuijala
paperboyo I’ve been in a situation, having only access to Jon’s second option (no fancy automatic), where I would have to unacceptably raise the weight of 95% of imagery for no useful reason other than to protect breaking down some image areas in 5% of imagery. I decided it’s not worth it. I hate to have to be in such a position whenever I happen to be looking at an image from these 5%.
2021-07-15 02:11:06
In other words: quality evaluation should be able to identify how things look when 'breaking down' happens?
2021-07-15 02:13:09
why is an overly low bpp version a good proxy of that?
2021-07-15 02:13:26
why not just gather evidence where the breaking down happens and look at it?
paperboyo
Jyrki Alakuijala In other words: quality evaluation should be able to identify how things look when 'breaking down' happens?
2021-07-15 02:26:37
> quality evaluation should be able to identify how things look when 'breaking down' happens Not sure if it _should be_. It just does it, no? JPEG: posterisation and blockiness, WebP/AVIF: torrent of plastic, JXL: not sure yet? blockiness? Going too low makes it easier to see. Going too low for multiple codecs also gives an idea about at which point the “breaking up” happens for each.
2021-07-15 02:34:34
Of course, the above only aids proper quality evaluation. And my obsession around “breaking down” point is because I’m, again, thinking solely about web use scenario (by far the one that will decide the overall success of new codecs, IMHO).
BlueSwordM
Jyrki Alakuijala I like how VMAF thinks JXL is better than AVIF at lowest bpp, and worse at highest: https://storage.googleapis.com/demos.webmproject.org/webp/cmp/2021_07_13/plots.html
2021-07-15 02:34:49
One of the things in general is not trusting VMAF for anything related to images.
2021-07-15 02:37:05
Currently, VMAF's SSIM implementation does not seem to actually take chroma into account, interestingly enough...
_wb_
paperboyo > quality evaluation should be able to identify how things look when 'breaking down' happens Not sure if it _should be_. It just does it, no? JPEG: posterisation and blockiness, WebP/AVIF: torrent of plastic, JXL: not sure yet? blockiness? Going too low makes it easier to see. Going too low for multiple codecs also gives an idea about at which point the “breaking up” happens for each.
2021-07-15 02:48:17
A big difference between JPEG and current-generation codecs like AVIF and JXL is that JPEG uniformly does the same thing to the entire image, while current-gen codecs make decisions about where to do what
2021-07-15 02:48:45
they also have way more coding tools to choose from, so artifacts are more related to encoder choices than to the codec itself
BlueSwordM
2021-07-15 02:49:17
In that regard, in what other ways can s3 be improved without hurting encoding speed in varDCT?
Jyrki Alakuijala
2021-07-15 03:00:29
what is s3?
improver
2021-07-15 03:00:57
`-s 3`, aka `-e 3`
Jyrki Alakuijala
2021-07-15 03:01:08
aaah
2021-07-15 03:02:22
I haven't been thinking about that much, I'm over focused on -s 7 (default) with some testing on -s 6
2021-07-15 03:02:55
simplest improvement of -s 3 today is to start using -s 6
raysar
2021-07-16 01:48:45
Hello, is it time to go to jxl 0.4.0 in github?
_wb_
2021-07-16 05:36:14
Yes, but the more recent the codec, the more room there is for making different choices in different regions. WebP has something, but it's limited.
fab
2021-07-16 07:25:08
-+
YAGPDB.xyz
2021-07-16 07:25:08
``` GiveRep <User:User> [Num:Whole number] ``` Invalid arguments provided: Not enough arguments passed
fab
2021-07-16 07:54:15
https://jpeg.org/participate.html
veluca
raysar Hello, is it time to go to jxl 0.4.0 in github?
2021-07-16 09:14:50
we're likely going to skip that and go for 0.5.0 directly...
raysar
2021-07-16 11:45:35
https://c.tenor.com/New6ITYbsQoAAAAM/get-ready-excited.gif
fab
2021-07-17 12:23:18
animations aren't working
diskorduser
2021-07-17 02:17:28
Animation with PNG?
2021-07-17 02:17:39
Isn't it a normal png?
fab
2021-07-17 05:36:10
it's the build from jamaika, a doom9 user
2021-07-17 05:36:19
so it could be that the plugin is obsolete
yurume
2021-07-18 10:06:31
try to define a macro named `NOMINMAX`?
Jyrki Alakuijala
raysar https://c.tenor.com/New6ITYbsQoAAAAM/get-ready-excited.gif
2021-07-18 11:09:49
We don't have a magical reserve of improvements that will surface in 0.5 if you have been testing at head. 😛
2021-07-18 11:10:47
But of course it allows a significant collection of encoder improvements and decoder hardening to surface for serious use that depends on releases
2021-07-18 11:12:49
... any thoughts on cjxl switching from VarDCT to modular for lower quality?
2021-07-18 11:13:00
Currently that seems to happen in external testing
2021-07-18 11:14:21
https://storage.googleapis.com/demos.webmproject.org/webp/cmp/2021_07_13/index.html#clovisfest&JXL=t&JXL=s&subset1
2021-07-18 11:14:39
tiny looks like modular, small looks like VarDCT
2021-07-18 11:14:54
...
2021-07-18 11:16:30
we also see a wiggle in the performance charts suggesting that that might be due to the same internal codec change:
2021-07-18 11:16:50
that's for encoding speed
2021-07-18 11:18:22
it looks like at that non-linearity we get a 3x slowdown in encoding and a 2x increase in bitrate
2021-07-18 11:18:56
it can be confusing for users if the bitrate increases that much when they drop quality from say distance 30 to distance 31
Scope
2021-07-18 11:19:47
I think modular at low quality requires some sort of smoothing or upscaling to avoid very noticeable pixelization
Jyrki Alakuijala
2021-07-18 11:19:56
... butteraugli bpp chart
_wb_
2021-07-18 11:21:01
It's probably better to switch to --resampling 2 and staying in vardct than to switch to modular, imo
2021-07-18 11:21:51
Also tricky to do without discontinuity but at least it will be fast to encode and I think the results will be better
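(As a command-line sketch of that idea, with an illustrative distance value; --resampling is the cjxl option mentioned above:)
```
cjxl input.png output.jxl -d 20 --resampling=2
```
This encodes at half resolution and signals 2x upsampling at decode, instead of switching to modular.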
Jyrki Alakuijala
2021-07-18 11:22:06
butteraugli metrics from same benchmark in february:
2021-07-18 11:23:14
march:
2021-07-18 11:24:29
june the same:
2021-07-18 11:25:08
1st of July:
2021-07-18 11:25:48
13 of July:
_wb_
2021-07-18 11:26:11
Not sure if BA is very meaningful around the range of dist=10, and it's a metric where jxl has an advantage from XYB just like the others have an advantage in metrics like VMAF from using YCbCr420. Encoding in the same space as what the metric uses is always good to get good scores with that metric.
Jyrki Alakuijala
2021-07-18 11:28:00
in the neural compression effort butteraugli correlated better in 0.07 - 0.3 bpp category than ssimulacra and dssim -- it cannot be just completely off there
_wb_
2021-07-18 11:29:08
What pnorm is this though?
Jyrki Alakuijala
2021-07-18 11:29:11
I have improved it there to some extent from data that I got from a tensorflow butteraugli experiments, so past experience might not be useful there
2021-07-18 11:29:22
it looks like a very high pnorm or max
2021-07-18 11:29:49
definitely not 6, because that would have about 1 - 1.5 bpp * pnorm
2021-07-18 11:30:34
(the team performing this test did not take advice from me)
2021-07-18 11:31:12
there are couple of things we can see in the charts
2021-07-18 11:31:14
...
2021-07-18 11:31:24
we are improving across the qualities
2021-07-18 11:31:57
we used to reach butteraugli = 1.0 at 3.2 bpp, now at about 3 bpp
2021-07-18 11:32:52
we used to reach butteraugli = 10.0 at 0.32 bpp, now at about 0.28 bpp
2021-07-18 11:33:55
hmmm, we probably cannot use these charts for anything....
2021-07-18 11:34:18
I changed butteraugli in between 😅 -- I normalized it to stay the same at my corpus, but that is focused around butteraugli distance 1.0 (has a lot of content from 0.5 to 1.5 distances, but the rest is extrapolated)
2021-07-18 11:35:20
...
2021-07-18 11:35:27
let's forget that and look at CIEDE
2021-07-18 11:35:39
February 2021
2021-07-18 11:36:11
Current
2021-07-18 11:37:55
avif-aom got much worse at 1 bpp, from 40.3 down to 39.3
2021-07-18 11:38:22
the whole curve shifted down
2021-07-18 11:38:38
that is a bit weird since it looks much better 😄
2021-07-18 11:40:13
jxl went up by 0.5 (also looks better than before), webp, webp2 and bpg did not move
2021-07-18 11:44:06
I think we should do something to our 'Harry Potter' curve and try to find the optimal switch-spot for modular vs. vardct (if we still need to switch)
fab
2021-07-18 11:46:45
v0.99.2 —
Jyrki Alakuijala
_wb_ It's probably better to switch to --resampling 2 and staying in vardct than to switch to modular, imo
2021-07-18 11:47:45
could you take ownership on exploring that? unfortunately people look at images on these qualities even if it is irrelevant in real use...