|
Demiurge
|
2025-06-06 06:05:39
|
I remember when imgur wasn't evil
|
|
|
Meow
|
2025-06-06 06:09:47
|
Imgur doesn't make a statement about this
|
|
|
CrushedAsian255
|
2025-06-06 11:38:02
|
do they block traffic from mainland china?
|
|
2025-06-06 11:38:14
|
or does Chinese GFW block imgur?
|
|
|
Meow
|
2025-06-07 03:06:36
|
Of course the GFW blocks Imgur
|
|
|
Demiurge
|
2025-06-08 03:24:31
|
There are a lot of things there that are "export-only" and not allowed for their own subjects
|
|
2025-06-08 03:25:07
|
Idk if imgur is owned by the CCP but I wouldn't be surprised...
|
|
2025-06-08 03:25:24
|
Seems the entire world is these days.
|
|
|
Meow
|
2025-06-08 03:51:38
|
Simply overused by services in Taiwan as they all directly display images from Imgur
|
|
|
pixyc
|
|
Meow
An online converter that lets you choose jpg and jpeg
|
|
2025-06-10 01:56:14
|
i remember when i was like 10 years old
|
|
2025-06-10 01:56:19
|
i was on a website
|
|
2025-06-10 01:56:24
|
and tried to upload an image
|
|
2025-06-10 01:56:37
|
the extension was *.jpeg
|
|
2025-06-10 01:56:42
|
but they only accepted *.jpg
|
|
2025-06-10 01:56:57
|
so my dumb ass at the time
|
|
2025-06-10 01:57:05
|
looked up a web based converter
|
|
|
Meow
|
2025-06-10 01:57:06
|
This still happens nowadays
|
|
|
pixyc
|
2025-06-10 01:57:15
|
cause I didn't know I could just change the extension
|
|
|
Meow
|
2025-06-10 02:13:19
|
The system may warn you
|
|
|
gb82
|
2025-06-11 06:12:06
|
How does anyone use HDR-VDP-2 like it's used [here](https://cloudinary.com/labs/aic-3-and-hdr) if it's written in MATLAB?
|
|
|
_wb_
|
2025-06-11 02:28:48
|
It runs in Octave too, that's what I used
|
|
|
damian101
|
|
looked up a web based converter
|
|
2025-06-12 12:42:24
|
meanwhile I thought image formats were all the same because a website accepted a PNG after I changed the extension to .jpg
|
|
|
jonnyawsom3
|
|
meanwhile I thought image formats were all the same because a website accepted a PNG after I changed the extension to .jpg
|
|
2025-06-12 12:54:23
|
Many a year I spent renaming extensions thinking I had converted them
|
|
|
juliobbv
|
2025-06-13 09:39:18
|
https://www.rachelplusplus.me.uk/blog/2025/06/evaluating-image-compression-tools/
|
|
2025-06-13 09:39:25
|
Interesting comparison between codecs
|
|
2025-06-13 09:39:53
|
it's AVIF-centric, but it has <:JXL:805850130203934781> tested as well!
|
|
2025-06-13 09:43:01
|
it also has a section for multi-resolution encoding
|
|
|
_wb_
It runs in Octave too, that's what I used
|
|
2025-06-13 10:36:23
|
how's performance (in rough terms) btw?
|
|
2025-06-13 10:37:13
|
e.g. 1 MP/s, 0.1 MP/s, 0.01 MP/s...
|
|
|
_wb_
|
2025-06-13 11:37:32
|
I haven't measured, I think pretty slow but fast enough to compute it for something like a dataset of subjective results in a reasonable time. Probably too slow to use it inside an encoder or something though.
|
|
|
jonnyawsom3
|
|
juliobbv
https://www.rachelplusplus.me.uk/blog/2025/06/evaluating-image-compression-tools/
|
|
2025-06-13 01:27:00
|
> there's just no way to get lossless 4:2:0 format images
This hurts slightly to read xD
> Ideally the command-line tools for JPEGli/JPEG-XL would add support for a lossless 4:2:0 format like YUV4MPEG2, as that would allow a fully fair comparison.
Makes sense when you're using video input, to give 4:2:0 to the encoder, but generally subsampling shouldn't be used at all above quality ~80, so you're shooting yourself in the foot. It could probably be done *relatively* easily though, given all the piping for JPEG input and JPEG transcoding is already there for jpegli and JPEG XL
|
|
2025-06-13 01:41:56
|
Interesting results though, and the multires testing makes me think my horrible `2:2:0` subsampling idea might not actually be that horrible....
|
|
2025-06-13 01:42:17
|
Assuming any decoder would even decode it
|
|
|
juliobbv
|
|
Interesting results though, and the multires testing makes me think my horrible `2:2:0` subsampling idea might not actually be that horrible....
|
|
2025-06-13 07:40:56
|
like, luma being shrunk horizontally by half?
|
|
|
jonnyawsom3
|
|
juliobbv
like, luma being shrunk horizontally by half?
|
|
2025-06-13 07:47:45
|
Pretty much yeah. Ideally the entire image being halved, but via subsampling in the decoder so that the result is the same size for compatibility
The same as my libjxl PR, at distance 10 the resolution is halved internally
|
|
|
juliobbv
|
|
Pretty much yeah. Ideally the entire image being halved, but via subsampling in the decoder so that the result is the same size for compatibility
The same as my libjxl PR, at distance 10 the resolution is halved internally
|
|
2025-06-14 09:22:02
|
yeah, I can see how doing that could work
|
|
2025-06-14 09:23:15
|
at least at quality levels in between a 1x and a full 2x downscale
|
|
|
AccessViolation_
|
2025-06-16 03:41:45
|
I think every JXL file is suspicious...
|
|
2025-06-16 03:42:53
|
filescan.io is usually amazing, I guess it hasn't quite been updated to support JXL properly yet... this must be some generic logic then, and I wonder why it's wrong about it? like the extension and mimetype are literally the same
|
|
|
Meow
|
2025-06-17 02:09:12
|
Approved by the Chrome team
|
|
|
CrushedAsian255
|
2025-06-18 07:55:07
|
*Renames my_photo.jxl to my_photo.image/jxl*
|
|
|
_wb_
|
2025-06-18 10:01:09
|
nah we just need to convince IANA that jxl should become a top-level mimetype, not something under `image/`
|
|
|
DZgas Π
|
2025-06-18 04:45:35
|
great, just the thing to upload a single 1280x picture to telegram
|
|
2025-06-20 02:05:17
|
SVD Vt+U
DHT
PCA+PCA-matrix
|
|
2025-06-21 01:26:59
|
drunk
|
|
|
CrushedAsian255
|
|
DZgas Π
drunk
|
|
2025-06-21 01:37:07
|
looks like you got yourself some DC offset
|
|
|
DZgas Π
|
|
CrushedAsian255
looks like you got yourself some DC offset
|
|
2025-06-21 03:04:15
|
Yes, I experimented with saving WAV audio into a JPEG image. But in this case I experiment not with the actual signal but with the difference of the signal at each sample; it turns out to be much more resistant to interference, much stronger than all the other algorithms, though it produces a shift along the entire length of the signal. But it sounds great
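The difference-of-samples trick described here is essentially delta encoding. A minimal sketch of the idea (illustrative only, not the actual code from the experiment):

```python
import numpy as np

def delta_encode(samples: np.ndarray) -> np.ndarray:
    """Store the first sample, then the difference between consecutive samples."""
    deltas = np.empty_like(samples)
    deltas[0] = samples[0]
    deltas[1:] = np.diff(samples)
    return deltas

def delta_decode(deltas: np.ndarray) -> np.ndarray:
    """Reconstruct by cumulative sum. Any per-delta quantization error
    (e.g. from JPEG compression of the stored image) accumulates as a
    slow drift, which is the DC offset mentioned above."""
    return np.cumsum(deltas)

# round trip on a tiny example
x = np.array([0.0, 1.0, 0.5, -0.25])
restored = delta_decode(delta_encode(x))
```

The round trip is exact without quantization; the drift only appears once the stored deltas are lossily compressed.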
|
|
2025-06-21 03:06:24
|
Unfortunately, ADPCM turned out to be extremely unsuitable for image noise of this type, where all the data coefficients change, although it is technically more efficient, and the freed-up chunk of data should in theory be usable for data recovery. But it's all too complicated. Not for me.
|
|
|
DZgas Π
SVD Vt+U
DHT
PCA+PCA-matrix
|
|
2025-06-21 03:08:55
|
all these algorithms also give too much noise or distortion at the wav output if the images are compressed into jpeg
|
|
2025-06-21 03:09:53
|
I'm starting to understand why everyone uses DCT, it's really simple and just works
|
|
2025-06-21 03:11:43
|
but I can't understand why the Wikipedia article about the DCT doesn't even mention the DHT, and the article about the DHT doesn't mention the DCT. And why do they have different names at all, when in essence they are near-identical algorithms, one with only cos and the other cos+sin?
It seems like they are not connected in any way, although that is not so. Strange.
|
|
|
DZgas Π
Yes, I experimented with saving WAV audio into a JPEG image. But in this case I experiment not with the actual signal but with the difference of the signal at each sample; it turns out to be much more resistant to interference, much stronger than all the other algorithms, though it produces a shift along the entire length of the signal. But it sounds great
|
|
2025-06-21 05:33:38
|
Haar wavelet does a great job of this, much better than other algorithms, including Lifting-Scheme DWT
|
|
|
DZgas Π
SVD Vt+U
DHT
PCA+PCA-matrix
|
|
2025-06-21 05:38:18
|
Haar wavelet (DWT)
Lifting-Scheme (DWT)
FWHT (Hadamard transform)
difference between wav samples (with logarithmic normalization)
raw wav samples
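For reference, the Haar DWT in the list above is just pairwise averages and differences. A minimal single-level sketch (assuming an even-length signal):

```python
import numpy as np

def haar_forward(x: np.ndarray):
    """One level of the orthonormal Haar wavelet transform."""
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # low-pass: pairwise sums
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # high-pass: pairwise differences
    return approx, detail

def haar_inverse(approx: np.ndarray, detail: np.ndarray) -> np.ndarray:
    """Exact inverse of haar_forward."""
    x = np.empty(approx.size * 2)
    x[0::2] = (approx + detail) / np.sqrt(2.0)
    x[1::2] = (approx - detail) / np.sqrt(2.0)
    return x

# lossless round trip on an even-length signal
sig = np.arange(8, dtype=float)
approx, detail = haar_forward(sig)
restored = haar_inverse(approx, detail)
```

A multi-level transform just applies the same step recursively to the `approx` half, which is where the lifting-scheme variant also comes in.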
|
|
|
la .varik. .VALefor.
|
2025-06-21 07:41:51
|
jxlatex jank
https://github.com/varikvalefor/drawingstuff/blob/702ba3c2c5915c592fca9bac09c7c5b646fb1482/zbas
|
|
|
A homosapien
|
2025-06-21 08:18:26
|
An inside joke?
|
|
|
jonnyawsom3
|
2025-06-21 08:28:14
|
A British one more like, though I'm not sure why
|
|
|
Quackdoc
|
2025-06-23 05:30:15
|
has anyone thought about picking this up? https://github.com/colour-science/colour/issues/1118
|
|
|
Lumen
|
2025-06-24 08:22:47
|
1 GP/s SSIMU2 throughput is real
|
|
2025-06-24 08:23:31
|
Only about 360 MP/s for butter though
(Single high-end consumer computer)
|
|
|
jonnyawsom3
|
2025-06-27 06:36:32
|
Random thought, we could make a JXL server tag with the feature Discord added recently
|
|
2025-06-27 06:36:41
|
Nevermind, it's paywalled as usual
|
|
|
_wb_
|
2025-06-27 07:21:20
|
Don't we have a Discord dev here somewhere? Maybe they can arrange us some free Nitro or whatever the DiscordCoin is called
|
|
|
Mine18
|
2025-06-27 07:31:41
|
Scott Kidder, but I don't think an arrangement would be possible
|
|
|
jonnyawsom3
|
2025-06-27 07:41:54
|
I'm still curious about the empty EXIF tags https://discord.com/channels/794206087879852103/805176455658733570/1385721153116569850
|
|
|
juliobbv
|
|
_wb_
Don't we have a discord dev here somewhere? Maybe they can arrange us some free Nitro or whatever the DiscordCoin is called
|
|
2025-06-27 07:53:11
|
just turn on the server tag feature flag (TM)
|
|
2025-06-27 07:53:20
|
https://tenor.com/blflC.gif
|
|
2025-06-27 07:54:10
|
but I must say, having a server tag is definitely worth it for discoverability purposes
|
|
|
Fox Wizard
|
2025-06-28 08:44:33
|
Think it costs 3 server boosts <:KekDog:884736660376535040>
|
|
2025-06-28 08:46:53
|
Wanted to suggest the same thing, but then saw it isn't free and don't think there will suddenly be people who will want to boost the server since I've been the only person doing so for a pretty long time
|
|
|
CrushedAsian255
|
|
Fox Wizard
Wanted to suggest the same thing, but then saw it isn't free and don't think there will suddenly be people who will want to boost the server since I've been the only person doing so for a pretty long time
|
|
2025-06-28 08:58:08
|
I just don't want to give Discord any money °-°
|
|
|
jonnyawsom3
|
|
Fox Wizard
Wanted to suggest the same thing, but then saw it isn't free and don't think there will suddenly be people who will want to boost the server since I've been the only person doing so for a pretty long time
|
|
2025-06-28 09:32:21
|
I think the problem is, it goes from 2 boosts to 7 for level 2. So as long as one person is boosting, another booster is pointless, and then you need a third and half of a fourth for any benefit
|
|
2025-06-28 09:34:04
|
If the tags were 2 boosts instead of 3, I'd almost say it's worth it to get the tag instead of level 1, since we don't really make use of it anyway
|
|
|
HCrikki
|
2025-06-28 02:01:22
|
past the minimum for server tags, there's no point giving Discord any more money. Give devs and projects like XL Converter and Thorium your loose change or a regular $1/month, they need it more
|
|
2025-06-28 02:02:18
|
while a mere $100 bounty would get JXL adopted inside software with actual users
|
|
2025-06-28 02:03:06
|
peanuts for guaranteed results compared to potential visibility increase with no guaranteed uptake
|
|
|
Meow
|
2025-06-28 02:12:35
|
Discord's Nitro is less useful compared to Telegram Premium
|
|
|
lonjil
|
2025-06-28 09:45:39
|
<:Ashley_WHAAAAA:800762210833006622>
|
|
2025-06-28 09:45:52
|
using custom emoji from other servers is so worth it
|
|
|
π°πππ
|
|
lonjil
using custom emoji from other servers is so worth it
|
|
2025-06-28 11:53:50
|
Vesktop
|
|
2025-06-28 11:54:03
|
https://github.com/Vencord/Vesktop
|
|
2025-06-28 11:54:36
|
|
|
2025-06-28 11:55:13
|
and it's more minimal, private, and secure compared to the standard Discord client
|
|
2025-06-28 11:55:26
|
especially better on Linux
|
|
|
jonnyawsom3
|
2025-06-29 12:41:37
|
Posting that in the channel with Discord staff is a bold move...
|
|
|
gb82
|
2025-06-29 12:54:29
|
I'm sure they know about it. It's not like we created it
|
|
|
Quackdoc
|
|
Posting that in the channel with Discord staff is a bold move...
|
|
2025-06-29 12:59:01
|
people have actively pinged him with images and questions about stuff like this lol
|
|
|
Cacodemon345
|
2025-06-29 07:30:17
|
This honestly seems too excessive to post here.
|
|
|
Jyrki Alakuijala
|
|
DZgas Π
Yes, I experimented with saving WAV audio into a JPEG image. But in this case I experiment not with the actual signal but with the difference of the signal at each sample; it turns out to be much more resistant to interference, much stronger than all the other algorithms, though it produces a shift along the entire length of the signal. But it sounds great
|
|
2025-06-30 09:20:19
|
sometimes I play (in my mind) with the idea of a JPEG XL vinyl. A JPEG XL image where the audio would be stored as a spiral circling (in a square way) towards the center, and the absolute center would have the label
|
|
|
DZgas Π
|
|
Jyrki Alakuijala
sometimes I play (in my mind) with the idea of a JPEG XL vinyl. A JPEG XL image where the audio would be stored as a spiral circling (in a square way) towards the center, and the absolute center would have the label
|
|
2025-06-30 09:21:38
|
huh
|
|
2025-06-30 09:22:40
|
it is not entirely clear how you can twist the spiral squarely, because at 90 degree bends the signal will be completely ruined
|
|
|
Jyrki Alakuijala
|
2025-06-30 09:23:23
|
one sample per pixel of course
|
|
2025-06-30 09:24:07
|
the usual disk plays 77 rpm, i.e., 37000 pixels per round (for 48 kHz sampling), a 10k x 10k image could operate roughly 77 rpm at 48 kHz
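A quick back-of-the-envelope check of these numbers (one sample per pixel, taking the 77 rpm and 48 kHz figures at face value):

```python
SAMPLE_RATE = 48_000                              # Hz, one sample per pixel
RPM = 77                                          # claimed rotation speed
samples_per_rev = SAMPLE_RATE * 60 / RPM          # pixels per "groove" revolution, ~37,400
image_side = 10_000                               # a 10k x 10k image
total_samples = image_side ** 2                   # 100 M pixels = 100 M samples
playtime_min = total_samples / SAMPLE_RATE / 60   # ~34.7 minutes of mono audio
```

So the ~37,000 pixels-per-revolution figure checks out, and a 10k x 10k image holds a bit over half an hour of mono 48 kHz audio.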
|
|
|
DZgas Π
|
|
Jyrki Alakuijala
one sample per pixel of course
|
|
2025-06-30 09:25:06
|
if JPEG XL can generate pixel dots along a given shift algorithm, why not; it would be an effective recording of a sound wave, and if the image is of a known size and has a pre-known generation frame, it will be possible to decode it normally as WAV
|
|
|
Jyrki Alakuijala
|
2025-06-30 09:25:10
|
33 rpm would need more than twice as big of an image
|
|
|
DZgas Π
|
|
Jyrki Alakuijala
the usual disk plays 77 rpm, i.e., 37000 pixels per round (for 48 kHz sampling), a 10k x 10k image could operate roughly 77 rpm at 48 kHz
|
|
2025-06-30 09:27:01
|
I don't think it makes sense to talk about revolutions per, it's not appropriate, the JXL disk is not physical. You can read as many pixels as you want
|
|
|
Jyrki Alakuijala
|
2025-06-30 09:27:06
|
a disk without sound pressure changes would be fully gray (32768 is a 0 pressure), each pixel is a sample from the 48 kHz wav file
|
|
2025-06-30 09:27:24
|
well, revolutions per second will relate to how the final disc looks
|
|
2025-06-30 09:27:34
|
will it look similar to a traditional disk or not
|
|
2025-06-30 09:28:38
|
Y channel would contain the R+L signal and the X could contain the R-L
|
|
|
DZgas Π
|
2025-06-30 09:28:44
|
4 bits, or 16 amplitude levels, is enough for a wave. But each sample needs 1 pixel. So the main problem is that the image will be about 5000x or more, just for mono at 32000 Hz
|
|
|
Jyrki Alakuijala
|
2025-06-30 09:29:17
|
no, we need 16 bits of course (the same as CD) and 44.1 and 48 kHz, to losslessly store cd-quality
|
|
|
DZgas Π
|
|
Jyrki Alakuijala
a disk without sound pressure changes would be fully gray (32768 is a 0 pressure), each pixel is a sample from the 48 kHz wav file
|
|
2025-06-30 09:29:57
|
I understand what you mean
|
|
|
Jyrki Alakuijala
|
2025-06-30 09:30:23
|
exactly, there will be somewhat random correlations that will be visible
|
|
2025-06-30 09:31:05
|
to make the correlations happen most consistently, the "rpm" should be ideally constant -- but that of course makes the resampling "interesting"
|
|
|
DZgas Π
|
2025-06-30 09:32:01
|
the big problem here is how to generate it natively in jpeg xl. I don't know anything about it
|
|
|
Jyrki Alakuijala
to make the correlations happen most consistently, the "rpm" should be ideally constant -- but that of course makes the resampling "interesting"
|
|
2025-06-30 09:33:14
|
the problem with reading the wave is secondary, it is unimportant, if the sample misses it will not be noticeable, because it is a world of waves
|
|
|
Jyrki Alakuijala
|
2025-06-30 09:33:45
|
you just create it in a simple lossless format like PNG or PPM https://en.wikipedia.org/wiki/Netpbm
|
|
2025-06-30 09:34:43
|
if you rely on uncompressed stereo wav with 16-bit per sample (or 32 bit float perhaps more common nowadays), the IO there is rather easy, too
|
|
|
DZgas Π
|
|
Jyrki Alakuijala
to make the correlations happen most consistently, the "rpm" should be ideally constant -- but that of course makes the resampling "interesting"
|
|
2025-06-30 09:35:09
|
making revolutions per second constant is a complication from the physical world; as I said, there is no need for this. You are not making vinyl, but a CD-like WAV disk, because there is no problem in reducing the rotation speed as the file is read, since all the variables are known in advance
|
|
|
Jyrki Alakuijala
|
2025-06-30 09:35:37
|
yes, it is just that then correlations radially will not be related to time
|
|
2025-06-30 09:35:59
|
it will be another "hash function" that will destroy any visual patterns that would possibly emerge otherwise
|
|
2025-06-30 09:36:19
|
think of a beat in a song, if they align radially, they will form a pattern
|
|
2025-06-30 09:36:36
|
but if the radial speed changes, that pattern will continue to be skewed as the song proceeds
|
|
2025-06-30 09:36:42
|
(it might not matter much)
|
|
2025-06-30 09:37:24
|
probably better to start without it
|
|
|
DZgas Π
|
|
Jyrki Alakuijala
you just create it in a simple lossless format like PNG or PPM https://en.wikipedia.org/wiki/Netpbm
|
|
2025-06-30 09:38:18
|
Well, let's say I already did this, literally the day before yesterday, only using an 8-bit channel more natively
I also used the alternative wave difference method on the second image. It worked perfectly when I did tests after compressing it to jpeg.
|
|
|
Jyrki Alakuijala
|
2025-06-30 09:39:08
|
nice
|
|
2025-06-30 09:40:29
|
another possible (and efficient) encoding would be to use 64 sample DCT, store the coefficients into the 8x8 block
|
|
2025-06-30 09:41:00
|
ringli is not far from that
|
|
2025-06-30 09:41:39
|
https://github.com/google/ringli/
|
|
2025-06-30 09:42:24
|
shorter DCT than traditional in audio works better for higher quality audio -- and no need for the "MDCT", that just messes things up
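The 64-sample-DCT idea above could be sketched like this: a naive orthonormal DCT-II (not Ringli's actual code), with the 64 coefficients reshaped into an 8x8 pixel block:

```python
import numpy as np

def dct_ii(x: np.ndarray) -> np.ndarray:
    """Orthonormal DCT-II of a 1-D signal (naive O(n^2), fine for n = 64)."""
    n = x.size
    k = np.arange(n)[:, None]            # frequency index
    t = np.arange(n)[None, :]            # time index
    basis = np.cos(np.pi * (2 * t + 1) * k / (2 * n))
    coeffs = basis @ x
    coeffs[0] *= np.sqrt(1.0 / n)        # DC scaling for orthonormality
    coeffs[1:] *= np.sqrt(2.0 / n)       # AC scaling
    return coeffs

# 64 consecutive samples of a 440 Hz test tone at 48 kHz
samples = np.sin(2 * np.pi * 440 * np.arange(64) / 48_000)
block = dct_ii(samples).reshape(8, 8)    # coefficients stored as an 8x8 "pixel" block
```

Because the transform is orthonormal, the block carries exactly the signal's energy, so an image codec quantizing the block is effectively quantizing DCT audio coefficients.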
|
|
|
DZgas Π
|
|
Jyrki Alakuijala
another possible (and efficient) encoding would be to use 64 sample DCT, store the coefficients into the 8x8 block
|
|
2025-06-30 09:42:44
|
This is acceptable only for the method where I encode the difference of the wave. I have tested dozens of all possible types of transformations invented by mankind, just to test their compression tolerance. Simply recording the wave directly performs poorly, but the difference of the wave is excellent. And so is the Haar wavelet, by the way.
|
|
|
Jyrki Alakuijala
|
2025-06-30 09:43:22
|
what is a difference of the wave?
|
|
2025-06-30 09:43:34
|
difference between consecutive samples?
|
|
|
DZgas Π
|
|
Jyrki Alakuijala
https://github.com/google/ringli/
|
|
2025-06-30 09:45:23
|
no builds, no links, nothing. Looks like an abandoned internal project developed by a couple of people just for the sake of it... looks bad... Even Meta's Encodec looks better. And I want to note that in extreme compression, Encodec demonstrates excellent sound quality at 12 kbps
|
|
|
Jyrki Alakuijala
what is a difference of the wave?
|
|
2025-06-30 09:47:57
|
seems to be correct. Only ADPCM turned out to be too digital and became completely unreadable when compressed with the DCT. I had to write my own algorithm that took 16 levels of the wave with logarithmic scaling, and at each reading compared which wave would be closer to the original. That is, the information says how much the amplitude needs to be shifted at the next reading, up or down
|
|
2025-06-30 09:48:25
|
after compression in jpeg the error accumulated, but the sound sounded fine
|
|
2025-06-30 09:51:32
|
after compression to jpeg
|
|
2025-06-30 09:53:11
|
the main problem, as I said, is that each sample needs a pixel. For music, and even more so stereo, and even more so at 40,000 hertz, you need a lot of pixels; so many that it is better not to consider the option of 44 or 48
|
|
2025-06-30 09:53:48
|
even if pixels have excellent 32 bit float quality
|
|
2025-06-30 09:54:21
|
or just 16 shades in 4 bits. The pixel count will not decrease
|
|
|
A homosapien
|
2025-06-30 09:54:48
|
mono 24000 hertz is cool, retro is in fashion these days
|
|
|
DZgas Π
|
|
DZgas Π
after compression to jpeg
|
|
2025-06-30 09:55:07
|
this is 10500 hz
|
|
2025-06-30 09:55:25
|
to upload exactly in 1280x1280 telegram jpeg pic
|
|
2025-06-30 09:58:49
|
It's a pity that no algorithm gave more of an advantage, in quality or in size; the best compression for the analog turned out to be the analog
|
|
2025-06-30 10:00:46
|
|
|
2025-06-30 10:03:25
|
Unfortunately, I have never succeeded in successfully recording sound on paper. No analog or digital algorithms
|
|
|
Jyrki Alakuijala
|
|
DZgas Π
no builds, no links, nothing. Looks like an abandoned internal project developed by a couple of people just for the sake of it... looks bad... Even Meta's Encodec looks better. And I want to note that in extreme compression, Encodec demonstrates excellent sound quality at 12 kbps
|
|
2025-06-30 01:37:37
|
it is an unannounced audio compression project, and right now inactive
|
|
|
DZgas Π
after compression in jpeg the error accumulated, but the sound sounded fine
|
|
2025-06-30 01:39:12
|
you could add a small bias towards zero in decoding, like 5 Hz high pass filter
|
|
|
DZgas Π
|
|
Jyrki Alakuijala
you could add a small bias towards zero in decoding, like 5 Hz high pass filter
|
|
2025-06-30 05:35:44
|
I can take a ready-made sound wave and draw a line at the top; since the limit values are known, I can align it... But these are all small things, my project is not serious
|
|
|
Demiurge
|
|
Jyrki Alakuijala
https://github.com/google/ringli/
|
|
2025-07-01 05:25:18
|
why does it say it's archived?
|
|
2025-07-01 05:28:06
|
Is there any advantage opus has over ringli aside from hardware/software support?
|
|
|
Jyrki Alakuijala
|
2025-07-01 07:03:32
|
no advantage to using Opus that I can think of; Ringli compresses 60% more in the high-quality audio range, 1000 kB of Opus becomes 400 kB of Ringli
|
|
|
Demiurge
why does it say it's archived?
|
|
2025-07-01 07:04:31
|
archiving is a reversible process that Google uses to indicate the activity of an open-source project -- I think it helps security efforts not to stress about this codec (I'm guessing here, not really sure what we want to signal with 'archiving')
|
|
|
Demiurge
|
2025-07-01 07:36:15
|
I wish there was a demo page with some samples of music and voice in various bitrates... I think you mentioned it's optimized for transparency. Can it compete at ultra crunchy bitrates as well? Right now they're trying to make opus better at very low bitrates, plus dropped packets.
|
|
|
Jyrki Alakuijala
|
2025-07-01 07:56:40
|
I don't know what it does at ultra crunchy bitrates -- internet speeds go up 10x every 10 years, so it is not clear to me why we would want to make audio worse right now
|
|
|
Demiurge
|
2025-07-01 08:00:23
|
Well I wonder if it could eventually be used in telephones for example.
|
|
|
Jyrki Alakuijala
|
2025-07-01 08:05:30
|
there is no real reason any more why telephones cannot be transparent in audio quality, or even supernatural -- more clear than the original
|
|
2025-07-01 08:06:28
|
I'm practicing violin over the internet, and it would be nice if the quality was better, but it is already good enough for the practice to be useful
|
|
|
Quackdoc
|
|
Jyrki Alakuijala
I'm practicing violin over the internet, and it would be nice if the quality was better, but it is already good enough for the practice to be useful
|
|
2025-07-01 08:07:22
|
I use sonobus for audio over internet, musicians are actually an intended audience of it
> Simply choose a unique group name (with optional password), and instantly connect multiple people together to make music, remote sessions, podcasts, etc.
|
|
2025-07-01 08:07:31
|
in my experience, it works extremely well
|
|
|
Demiurge
Is there any advantage opus has over ringli aside from hardware/software support?
|
|
2025-07-01 08:08:31
|
~~time for a rust ringli decoder~~
|
|
|
Demiurge
|
2025-07-01 08:17:47
|
How would a supernatural codec work or be useful in practice?
|
|
2025-07-01 08:17:54
|
Wouldn't it just hallucinate?
|
|
2025-07-01 08:18:15
|
Even worse than lossy
|
|
|
Jyrki Alakuijala
|
|
Demiurge
How would a supernatural codec work or be useful in practice?
|
|
2025-07-01 08:33:52
|
the same as a BBC documentary does for insects, they just look better on TV than in nature
|
|
2025-07-01 08:34:42
|
a supernatural audio codec would slightly reverse the great vowel shift so that it would be fully reversed in the next 500 years
|
|
|
Demiurge
|
2025-07-01 08:35:05
|
lmao
|
|
2025-07-01 08:35:15
|
It would also correct people's bad grammar
|
|
|
Jyrki Alakuijala
|
2025-07-01 08:35:29
|
but also make audio slightly more clear, by adding minor emphasis on end consonants etc., perhaps also fix such things
|
|
|
Demiurge
|
2025-07-01 08:35:44
|
And it would make all politicians mute
|
|
2025-07-01 08:36:14
|
And make your parents say "I approve of your lifestyle"
|
|
|
jonnyawsom3
|
2025-07-01 02:30:32
|
I was trying to see if there was any news about JPEG XL support... A little outside my price range xD
|
|
|
Fox Wizard
|
2025-07-01 05:05:47
|
"Tickets include coffee" that better be some good coffee <:KekDog:884736660376535040>
|
|
|
lonjil
|
|
Jyrki Alakuijala
no advantage to using Opus that I can think of; Ringli compresses 60% more in the high-quality audio range, 1000 kB of Opus becomes 400 kB of Ringli
|
|
2025-07-02 03:12:30
|
what Opus bit rates are you counting as being in the "high quality audio" range?
|
|
|
DZgas Π
|
|
Jyrki Alakuijala
no advantage to using Opus that I can think of; Ringli compresses 60% more in the high-quality audio range, 1000 kB of Opus becomes 400 kB of Ringli
|
|
2025-07-04 08:57:18
|
opus has an encoder and a decoder so i don't see the advantage in using ringli <:SadCheems:890866831047417898>
|
|
2025-07-04 09:00:30
|
I don't know why it's not obvious, but no one has written anything about Ringli anywhere, not because it's unknown, but because no one has been able to try it.
Isn't it obvious that no one really gives a shit about the source code? If there is no executable file, then the program does not exist. The Linux era never came.
|
|
|
Demiurge
Well I wonder if it could eventually be used in telephones for example.
|
|
2025-07-04 09:08:54
|
Actually, VoLTE technology already has its own EVS codec, so.
|
|
2025-07-04 09:13:34
|
for Bluetooth there is a new codec, LC3 (which is claimed to be better than Opus, but this is a complete lie)
in 2025, the best codec for 40-160 kbps is the undisputed leader Opus, without any competition at all. The fact is that at these bitrates there is no point in using anything else at all. This is a lossy monopoly. The exception is Bluetooth, to be fair, since Opus is expensive to encode and decode compared to other codecs. LC3 only makes sense for Bluetooth.
|
|
2025-07-04 09:16:46
|
USAC sounds better at bitrates below 40 kbps. Even lower, the best is Meta's Encodec neural network at 12 kbps.
Above 160 kbps, it is better not to use Opus, depending on the task: for mass distribution, MP3 at 320 is enough for everyone; for games, Vorbis 192+; for video, AAC 192+.
|
|
|
Jyrki Alakuijala
|
2025-07-05 07:34:34
|
I could hear differences between the original and Opus at 512 kbps or so (a shaker losing variance in dynamics/emphasis)
|
|
|
lonjil
what Opus bit rates are you counting as being in the "high quality audio" range?
|
|
2025-07-05 07:35:33
|
I believe this was with Opus at 256 kbps
|
|
2025-07-05 07:55:39
|
I don't like 128 kbps Opus; it sounds like it's playing from a distance or through a thin wall, missing a lot
|
|
|
lonjil
|
|
Jyrki Alakuijala
I believe this was with Opus at 256 kbps
|
|
2025-07-05 09:54:14
|
Very nice. I'm looking forward to testing Ringli and giving feedback :)
|
|
|
diskorduser
|
2025-07-05 10:19:08
|
My phone supports some form of hdr but hdr images look bad on chrome. The OEM probably did something bad.
|
|
|
RaveSteel
|
2025-07-05 10:38:14
|
I have the same issue with my phone
|
|
|
diskorduser
|
|
RaveSteel
I have the same issue with my phone
|
|
2025-07-05 10:51:06
|
Is that xiaomi or mi?
|
|
|
RaveSteel
|
2025-07-05 10:51:16
|
Samsung
|
|
|
diskorduser
|
2025-07-05 10:51:55
|
Oh. I thought it was xiaomi's fault
|
|
|
RaveSteel
|
|
diskorduser
|
|
RaveSteel
|
|
2025-07-05 10:52:12
|
Webpage link?
|
|
|
RaveSteel
|
2025-07-05 10:52:37
|
https://jpegxl.info/resources/hdr-test-page.html
|
|
|
diskorduser
|
2025-07-05 11:01:40
|
HDR AVIF looks funny on Google Photos
|
|
|
jonnyawsom3
|
|
gb82
https://github.com/gianni-rosato/photodetect2
C impl is linked in README
|
|
2025-07-10 12:08:09
|
We tried compiling the C version, but <@207980494892040194> kept running into dependency issues. Don't suppose you could help figure it out?
|
|
|
gb82
|
2025-07-10 12:09:12
|
I don't know if it actually compiles by itself
|
|
2025-07-10 12:09:16
|
What dep is missing?
|
|
|
A homosapien
|
2025-07-10 10:12:23
|
I got the zig version to build and it works like a charm
|
|
2025-07-10 10:15:38
|
However, the C implementation spits out this `undefined reference to 'WinMain'` and a few other errors as well
|
|
2025-07-10 10:33:40
|
My environment is msys2 (mingw64) using gcc `gcc -Wall -O3 screen_content_detect.c -o scd.exe`
|
|
|
LMP88959
|
2025-07-10 10:58:07
|
<@207980494892040194> the reference doesn't have a main function
|
|
2025-07-10 10:58:19
|
<@297955493698076672>
|
|
|
A homosapien
|
2025-07-10 10:59:03
|
So it can't even be built as an executable right?
|
|
|
LMP88959
|
2025-07-10 10:59:08
|
right
|
|
|
A homosapien
|
2025-07-10 11:00:14
|
I could have AI hallucinate me some code to get it working, but the zig version works for me.
|
|
|
juliobbv
|
|
LMP88959
<@207980494892040194> the reference doesn't have a main function
|
|
2025-07-10 11:43:39
|
yeah, the C implementation is just the helper functions
|
|
2025-07-10 11:44:00
|
maybe I'll shape it up into a proper executable in the future
|
|
|
gb82
|
2025-07-11 05:25:59
|
I don't think you necessarily need a main function; it's a reference implementation with example code, for all intents and purposes
|
|
2025-07-11 05:26:22
|
the Zig impl is not a reference implementation, it is based on the C code and bundled into an actual tool (if you could call it that)
|
|
|
Demiurge
|
2025-07-13 06:37:24
|
https://engineering.fb.com/2015/08/06/android/the-technology-behind-preview-photos/
|
|
2025-07-13 07:42:25
|
https://medium.com/@duhroach/reducing-jpg-file-size-e5b27df3257c
|
|
2025-07-13 07:49:30
|
Blurring the chroma channels before compression... It made me think, maybe encoders like libjxl can apply several different pre-processing techniques to the image before compressing it, to make the data more ideal for the type of compression being used.
|
|
2025-07-13 07:51:33
|
Like subtly re-arranging some of the barely-perceptible noise in the image to make it less random and more aligned with the compressor.
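The chroma-blurring trick from the linked article can be sketched as follows (a simple box blur standing in for whatever filter the article used; `preblur_chroma` is a made-up helper name, not any library's API):

```python
import numpy as np

def box_blur(channel: np.ndarray, radius: int = 1) -> np.ndarray:
    """Separable box blur; a cheap stand-in for a Gaussian blur."""
    k = 2 * radius + 1
    kernel = np.ones(k) / k
    # blur rows, then columns ('same' keeps the original size, zero-padded edges)
    rows = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, channel)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, rows)

def preblur_chroma(ycbcr: np.ndarray, radius: int = 1) -> np.ndarray:
    """Blur only the Cb/Cr planes, leaving luma untouched,
    before handing the image to a DCT-based encoder."""
    out = ycbcr.astype(float).copy()
    out[..., 1] = box_blur(out[..., 1], radius)
    out[..., 2] = box_blur(out[..., 2], radius)
    return out

# demo on random "chroma noise": luma passes through unchanged
rng = np.random.default_rng(0)
img = rng.random((16, 16, 3))
out = preblur_chroma(img, radius=1)
```

The point is exactly the one made here: smoothing the channels the eye is least sensitive to removes high-frequency chroma energy before quantization, so the encoder spends fewer bits on it.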
|
|
|
jonnyawsom3
|
2025-07-13 12:46:48
|
Days without Pashi mentioning noise: 0
|
|
2025-07-13 12:47:41
|
I often think of ideas like that, but preprocessing shouldn't be needed, the encoder itself should be improved with the same ideas
|
|
|
Demiurge
|
2025-07-13 12:53:20
|
You know the entire, the ENTIRE premise behind lossy compression is separating intelligible "signal" from unintelligible "noise" in an image, and preserving the "signal" without wasting bits on noise.
|
|
2025-07-13 12:53:57
|
So yeah of course I talk about noise a lot because you kind of have to in order to talk about lossy compression at all
|
|
2025-07-13 12:58:19
|
The reason DCT works so well is that it preserves what our brain sees as important, while changing what our brain sees as noise into macroblocking and other DCT artifact patterns that sometimes blend into photographs
|
|
2025-07-13 01:04:30
|
Being able to recognize what data our brains don't care about, and re-arranging that data so it looks the same but is much easier to compress, well, that's the whole premise of lossy
|
|
|
gb82
|
2025-07-14 06:45:48
|
https://engineering.fb.com/2024/03/26/android/instagram-threads-hdr-photos/
listening to the podcast, I guess Meta landed on using Ultra HDR JPEG for their HDR images on Instagram & Threads
|
|
|
juliobbv
|
2025-07-14 10:11:25
|
*sigh*
|
|
2025-07-14 10:12:05
|
we'll never be able to escape 30 year old image tech I guess?
|
|
|
Quackdoc
|
2025-07-14 10:27:58
|
not any time soon
|
|
2025-07-14 10:28:29
|
android doesn't support jxl at all yet xD
|
|
|
HCrikki
|
2025-07-14 11:20:22
|
big apps can force adoption since they handle their own entire stack
|
|
2025-07-14 11:21:22
|
ie, browse facebook slowly in a browser or fast if you use our jxl-serving app - consume even less bandwidth and cycles than a desktop browser with adblock
|
|
2025-07-14 11:23:09
|
lot of potential for bypassing browsers
|
|
2025-07-14 11:25:42
|
for android, imo it'd be smart getting the preinstalled gallery apps onboard. samsung's already includes the jxl decoding routines of *dng 1.7* and there's like 200+ million installs of a version that includes that support
|
|
2025-07-14 11:27:56
|
imo people are focusing too much on websites and browsers whereas spreading decoders within apps (yours, your web service+galleries) should *precede* any use
|
|
|
Meow
|
|
juliobbv
we'll never be able to escape 30 year old image tech I guess?
|
|
2025-07-15 01:36:36
|
and some 30-year old tech called "PNG" just got updated
|
|
|
juliobbv
|
2025-07-15 01:46:43
|
mfw realizing 1995 was 30 years ago
|
|
2025-07-15 01:46:49
|
https://tenor.com/b0CbE.gif
|
|
2025-07-15 01:49:14
|
PNG is kind of an exception though -- you can convert from and to other lossless formats without generation loss
|
|
2025-07-15 01:51:46
|
I guess we could have Ultra HDR JPEG to JXL compression heh
|
|
|
Meow
|
2025-07-15 02:21:32
|
Isn't Ultra HDR just a gain map?
|
|
|
juliobbv
|
2025-07-15 02:23:32
|
that's my understanding
|
|
|
username
|
2025-07-15 02:23:42
|
gain map with a stupid name
|
|
|
juliobbv
|
2025-07-15 02:24:14
|
but I'm not sure if JXL can preserve the gain map if it is big enough
|
|
2025-07-15 02:24:24
|
IIRC there was a metadata limit?
|
|
|
username
|
2025-07-15 02:29:55
|
iirc there was some talk about doing a spec extension or something for gainmap support with transcoding in JXL. IMO it should also be built around supporting JPEG XT as well if they are going through that effort
|
|
|
Meow
|
2025-07-15 02:49:26
|
PNG 4th spec is to add gainmap as well
|
|
|
Mine18
|
|
HCrikki
imo people are focusing too much on websites and browsers whereas spreading decoders within apps (yours, your web service+galleries) should *precede* any use
|
|
2025-07-15 05:24:20
|
you say that but we haven't seen a major app do this, no?
|
|
|
Quackdoc
|
2025-07-15 09:41:51
|
some do but it's not too common as that increases attack surface
|
|
2025-07-15 09:44:17
|
flutter apps sometimes ship their own decoders, (no jxl decoder/binding that is drop in compatible)
electron/webview apps obviously support what the chromium/webview supports
and some kotlin java apps use coil and that other api, but it has performance limitations and explicit integration that not many are willing to do
|
|
|
jonnyawsom3
|
|
username
iirc there was some talk about doing a spec extension or something for gainmap support with transcoding in JXL. IMO it should also be built around supporting JPEG XT as well if they are going through that effort
|
|
2025-07-15 12:27:44
|
https://discord.com/channels/794206087879852103/803574970180829194/1308032222112120854
|
|
|
username
|
|
https://discord.com/channels/794206087879852103/803574970180829194/1308032222112120854
|
|
2025-07-15 12:40:06
|
after a quick glance over this it seems like it's generic enough to handle JPEG XT data (such as alpha channels and whatever else) though I'm a bit tired ATM so I haven't looked too hard
|
|
|
la .varik. .VALefor.
|
|
la .varik. .VALefor.
jxlatex jank
https://github.com/varikvalefor/drawingstuff/blob/702ba3c2c5915c592fca9bac09c7c5b646fb1482/zbas
|
|
2025-07-16 02:24:37
|
moar
```
(export CMENE=shininglikecrystaltiaras && cd $CMENE && djxl $CMENE.jxl --output_format apng - | ffmpeg -y -f apng -i - -vf 'select=eq(2\,n)' -vframes 1 $CMENE-ctino2.png)
```
|
|
|
jonnyawsom3
|
2025-07-16 01:03:16
|
|
|
2025-07-16 01:03:16
|
A game dev asked me for help with their texture processing, asking about using PSNR and SSIM. I pointed them towards SSIMULACRA2, and I'm happy to see they're already implementing it
|
|
|
Lumen
|
|
A game dev asked me for help with their texture processing, asking about using PSNR and SSIM. I pointed them towards SSIMULACRA2, and I'm happy to see they're already implementing it
|
|
2025-07-16 01:13:10
|
why did they need to reimplement it?
|
|
|
jonnyawsom3
|
|
Lumen
why did they need to reimplement it?
|
|
2025-07-16 01:13:38
|
They're making a wrapper for it
|
|
|
Lumen
|
2025-07-16 01:13:49
|
oh I see
|
|
|
username
|
|
A game dev asked me for help with their texture processing, asking about using PSNR and SSIM. I pointed them towards SSIMULACRA2, and I'm happy to see they're already implementing it
|
|
2025-07-17 01:22:13
|
for diffuse/color/base textures SSIMULACRA2 would be excellent, however for stuff like specular masks or normal maps it might not be the best choice, since those kinds of textures are in linear space, not meant to be viewed directly, and serve just as data for a shader
|
|
2025-07-17 01:22:41
|
similar thing goes for image resizing/downscaling with or without gamma correct scaling for generating mipmaps and whatever as well I assume
|
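A quick illustration of why gamma-correct scaling matters for mipmap generation, as mentioned above: averaging gamma-encoded values directly darkens high-contrast regions. A minimal sketch, assuming a simple power-2.2 approximation of sRGB (the function name and constant are illustrative):

```python
# Gamma-correct 2x downscale: convert to linear light before averaging,
# then re-encode. Averaging gamma-encoded values directly is too dark.

GAMMA = 2.2  # simple power approximation of the sRGB curve

def downscale_pair(a, b):
    lin = (a ** GAMMA + b ** GAMMA) / 2   # average in linear light
    return lin ** (1 / GAMMA)

naive = (0.0 + 1.0) / 2                   # 0.5: too dark
correct = downscale_pair(0.0, 1.0)        # ~0.73 with gamma 2.2
assert correct > naive
```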
|
|
jonnyawsom3
|
|
username
for diffuse/color/base textures SSIMULACRA2 would be excellent, however for stuff like specular masks or normal maps it might not be the best choice, since those kinds of textures are in linear space, not meant to be viewed directly, and serve just as data for a shader
|
|
2025-07-17 07:06:13
|
Normal maps are DXTnm only right now anyway, so there's nothing to compare against
|
|
2025-07-17 07:37:58
|
https://github.com/Yellow-Dog-Man/Ssimulacra2.NET
|
|
|
spider-mario
|
2025-07-17 12:39:39
|
https://www.energuide.be/en/questions-answers/how-much-power-does-a-computer-use-and-how-much-co2-does-that-represent/54/
> A desktop uses an average of 200 W/hour when it is being used
> A complete desktop uses an average of 200 Watt hours (Wh).
A site named "energuide" with such sloppy use of units doesn't really inspire confidence
|
|
|
Lumen
|
2025-07-17 12:42:40
|
especially when both units are wrong XDD
|
|
2025-07-17 12:43:07
|
they probably meant 200W and that's it
|
|
|
spider-mario
|
2025-07-17 12:43:14
|
it's like Goldilocks and the Three Bears
|
|
|
Lumen
|
2025-07-17 12:43:17
|
but 200W is huge for a standard laptop
|
|
|
spider-mario
|
2025-07-17 12:43:30
|
yeah, this is for desktops
|
|
|
Lumen
|
2025-07-17 12:43:42
|
oh yes nevermind, then it's correct
|
|
|
spider-mario
|
2025-07-17 12:43:45
|
or do you mean the 50-100W they cite for those?
|
|
|
Lumen
|
2025-07-17 12:44:03
|
well 200W definitely seems a lot for an average of a consumer desktop
|
|
2025-07-17 12:44:24
|
50W would be more realistic probably, right? I never did an actual measurement so I don't know for sure
|
|
|
spider-mario
|
2025-07-17 12:44:27
|
my 16" MacBook Pro idles at 10-20W so their laptop estimate seems a bit high
|
|
2025-07-17 12:44:35
|
my desktop, around 150W including the screen
|
|
|
Lumen
|
2025-07-17 12:44:42
|
ah yes the screen
|
|
|
spider-mario
|
2025-07-17 12:44:52
|
(and the screen alone seems to be about 30-40W at the brightness I use it at)
|
|
|
Lumen
|
2025-07-17 12:45:07
|
my mini-pc with a celeron inside is at 2W even under load ahah what a ~~beast~~
|
|
2025-07-17 12:45:31
|
it doesn't even have a fan
|
|
2025-07-17 12:45:38
|
no moving part
|
|
|
spider-mario
|
2025-07-17 12:47:24
|
no noise
|
|
2025-07-17 12:47:33
|
no dust accumulation
|
|
|
Lumen
|
2025-07-17 12:48:42
|
don't ask it to encode av1 though ahah
|
|
|
spider-mario
|
2025-07-17 12:48:59
|
or run an LLM
|
|
|
Lumen
|
2025-07-17 12:49:05
|
poor celeron
|
|
2025-07-17 12:49:15
|
it has 8GB of RAM though! more than 4
|
|
|
spider-mario
|
2025-07-17 12:49:19
|
although perhaps gemma 3n could be worth a try?
|
|
|
Lumen
|
2025-07-17 12:49:35
|
I don't use any AI, I hate them ahah
|
|
2025-07-17 12:49:40
|
so I will not try sorry
|
|
|
diskorduser
|
2025-07-17 05:57:22
|
gaming laptop may use 200 watts during heavy gaming.
|
|
|
Demiurge
|
2025-07-18 08:24:06
|
While it's plugged in
|
|
2025-07-18 08:24:44
|
When unplugged they don't drain the battery that fast
|
|
2025-07-19 02:16:39
|
I wonder why people don't ever use kilojoules
|
|
2025-07-19 02:18:12
|
1 Wh = 3.6 kJ
|
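The conversion is just a factor of 3600 (seconds per hour), sketched here as a one-liner:

```python
def wh_to_joules(wh):
    # 1 Wh = 1 W sustained for 3600 s = 3600 J = 3.6 kJ
    return wh * 3600

assert wh_to_joules(1) == 3600        # 3.6 kJ
assert wh_to_joules(200) == 720_000   # a 200 W desktop running for an hour
```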
|
|
Lumen
|
|
Demiurge
I wonder why people don't ever use kilojoules
|
|
2025-07-19 06:09:33
|
Usually we know our average electric consumption per hour better than per second for an object.
"I use my computer for 30 min" -- we want to divide to get 100 Wh for a 200 W object,
not... multiply by 1800 seconds
|
|
|
Demiurge
|
2025-07-19 06:12:09
|
And what the heck is a calorie? 4184 joules per kilocalorie? Why?
|
|
2025-07-19 06:12:32
|
Why can't people just use standard SI units?
|
|
2025-07-19 06:14:57
|
1 terabyte = 1,000,000,000,000 bytes! Not 1,099,511,627,776 bytes, like wtf?
|
|
2025-07-19 06:16:05
|
Ounces? Pints? Dry ounce? Wet ounce? Gallon and quart? Cups? Are liters and grams just not good enough for people?!
|
|
2025-07-19 06:16:40
|
It drives me crazy when trying to compare prices, and one item is priced per ounce and the other is per some other stupid measure
|
|
2025-07-19 06:18:06
|
You can't even tell what price something is and when you read the nutrition label, it's not nutrition "per 100 grams" but rather it's "per whatever arbitrary amount we decide to manipulate the numbers"
|
|
2025-07-19 06:18:25
|
America sucks basically
|
|
2025-07-19 06:18:35
|
Sorry friends
|
|
2025-07-19 06:25:58
|
I know Americans are always either really proud of their country, or really ashamed for some reason... Being proud or ashamed are both cringe. But this drives me mad when trying to shop.
|
|
|
Quackdoc
|
2025-07-22 07:06:31
|
any good tools that are good for specifically checking for corruption? wanting to batch encode a whole wackload of images to lossy, but them not being corrupt is kinda critical and verifying each one is not really an option.
I'll be encoding with e7 d0.5, maybe using progressive_dc, dunno yet, so I'm not concerned with fidelity.
|
|
2025-07-22 07:07:11
|
just care about super obvious corruption, even if its a small portion of the frame
|
|
|
_wb_
|
2025-07-22 07:16:46
|
what file format are the images you want to check?
|
|
|
NovaZone
|
|
Quackdoc
any good tools that are good for specifically checking for corruption? wanting to batch encode a whole wackload of images to lossy, but them not being corrupt is kinda critical and verifying each one is not really an option.
I'll be encoding with e7 d0.5, maybe using progressive_dc, dunno yet, so I'm not concerned with fidelity.
|
|
2025-07-22 07:56:00
|
iirc https://github.com/qarmin/czkawka can scan for broken files but as for "corruption" idk what could do that
|
|
2025-07-22 08:13:37
|
ah yea it can scan for corruption, dunno if it catches jxl tho
|
|
|
Jyrki Alakuijala
|
|
Lumen
my mini-pc with a celeron inside is at 2W even under load ahah what a ~~beast~~
|
|
2025-07-22 10:50:47
|
I bought a ryzen 7 fanless pc, with 6 or 12 cores, don't remember -- it feels really nice -- I plan to use it for my spatial audio experiments / robotic symphonic orchestra. I think it works with 12 W or so.
|
|
|
diskorduser
|
|
Jyrki Alakuijala
I bought a ryzen 7 fanless pc, with 6 or 12 cores, don't remember -- it feels really nice -- I plan to use it for my spatial audio experiments / robotic symphonic orchestra. I think it works with 12 W or so.
|
|
2025-07-22 01:14:44
|
Hello, could you allocate some time and fix that desaturation problem in jxl.
|
|
|
Quackdoc
|
|
_wb_
what file format are the images you want to check?
|
|
2025-07-22 01:40:17
|
just jxl, I am doing jxl lossless > jxl lossy. I doubt I will hit an issue, but reliability is key here.
|
|
|
Jyrki Alakuijala
|
|
diskorduser
Hello, could you allocate some time and fix that desaturation problem in jxl.
|
|
2025-07-22 01:57:23
|
I made my best effort ~month ago
|
|
2025-07-22 01:58:33
|
what distance ranges is the desaturation problem at its strongest in the current head of libjxl? (in relation to the luma quality)
|
|
|
jonnyawsom3
|
|
Jyrki Alakuijala
what distance ranges is the desaturation problem at its strongest in the current head of libjxl? (in relation to the luma quality)
|
|
2025-07-22 02:10:28
|
I now have a checkerboard pattern stuck in my vision, but I just confirmed it's the AC of the B channel. We need to create a new quant table with a shallower falloff.
Though subtle, it's present even at distance 0.1 with flicker tests
|
|
2025-07-22 02:11:05
|
Distance 1
|
|
2025-07-22 02:12:41
|
Jon had an interesting theory the other day https://discord.com/channels/794206087879852103/1288790874016190505/1395052409105154168
|
|
2025-07-22 02:12:59
|
|
|
2025-07-22 02:14:56
|
Huh... Effort 4 really shows the B channel bleeding
|
|
2025-07-22 02:20:55
|
Main (176 bytes) vs 0.8 (141 bytes) at Distance 1 Effort 7
|
|
2025-07-22 02:21:44
|
This is just a 9 x 9 yellow and black checkerboard. It's the same results for larger images, but they hurt my eyes....
|
|
2025-07-22 02:22:33
|
Interesting that in all encodes, the bottom right pixel is always full saturation, but the rest are dull
|
|
|
spider-mario
|
2025-07-22 02:37:39
|
I kind of like the idea of artificially making the input a bit more saturated so that the decoded version is correct -- a sort of "chroma gaborish", in a sense
|
|
2025-07-22 02:37:51
|
(not quite, but you get what I mean)
|
|
|
Tirr
|
2025-07-22 02:38:36
|
or chroma-preserving filter
|
|
|
jonnyawsom3
|
2025-07-22 02:40:38
|
I'd definitely say people would prefer the output be slightly more saturated than slightly more dull. But there's a few different ways to fix it: increasing the saturation internally, adjusting the quant table, different rounding... Would need to try them and see what's best, but 0.8 seemed to be pretty close already
|
|
|
spider-mario
|
2025-07-22 02:42:28
|
increasing the saturation has the advantage, over quant tables, of not having to signal different quant tables
|
|
|
jonnyawsom3
|
2025-07-22 02:42:35
|
Realised I should try Hydrium... It was not happy
|
|
|
spider-mario
increasing the saturation has the advantage, over quant tables, of not having to signal different quant tables
|
|
2025-07-22 02:47:54
|
Signalling a different quant table for B should only be a few bytes when parameterised though, and right now main is already 20% larger than 0.8 while looking worse
|
|
2025-07-22 02:50:52
|
570 KB Main and 450 KB v0.8
|
|
|
_wb_
|
2025-07-22 03:03:29
|
I think maybe it is possible to do RGB -> XYB in a clever way so that X and B are smoother than what you get from simple RGB2XYB, while the result after XYB -> RGB and clamping is the same. After all, there are many (X,Y,B) combinations that map to black after clamping, and naive RGB2XYB will only use (0,0,0).
|
|
|
spider-mario
|
2025-07-22 03:06:10
|
sounds a bit mozjpeg-like
|
|
|
_wb_
|
2025-07-22 03:16:31
|
kind of, since it's exploiting clamping, but here the aim would be to avoid chroma desaturation, not luma ringing. Though it might end up also helping with that, since you'd do something like mapping black to something like (0.2, -0.3, -0.5) instead of (0,0,0) if the surrounding pixels are close to (0.2, whatever_Y, -0.5)
|
|
2025-07-22 03:17:49
|
(so it would seemingly stretch contrast in Y, which will reduce visible ringing after clamping)
|
|
2025-07-22 04:41:21
|
In any luma-chroma space, e.g. XYB or YCbCr, there's a large part of the color cube that is left unused if you just apply the forward transform -- close to black, chroma always gets close to zero, and close to white, the same happens. So there are "subnormal" YCC values that also have to map to some RGB values (either to some in-gamut color or to an out-of-gamut color that after clamping does become an in-gamut color) and these values could be used in a clever forward color transform. Basically instead of converting RGB to YCC pixel per pixel and using only the "normal" part of the cube, you could choose "subnormal" YCC values that convert back to the same RGB values if those subnormal YCC values create less overall entropy than the normal ones (e.g. maybe increase Y entropy but reduce entropy in both chroma channels more).
How to do this efficiently is nontrivial though. I don't know how to find the set of "subnormal" YCC values for a given RGB value -- I suppose the set is empty for intermediate-luma colors, and gets larger as luma gets closer to 0 (and I suppose also as luma gets closer to the maximum value, though I'm not sure if we're supposed to clamp colors brighter than rgb(1,1,1) or if those colors are supposed to be actually brighter, i.e. rendered as HDR even if the image is nominally in an SDR space).
|
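The "subnormal" observation is easy to check with a toy BT.601-style YCbCr decode (values in [0,1], chroma centered on 0). Several distinct YCC triples decode, after clamping, to exactly the same RGB -- here, black -- so a clever forward transform could pick whichever triple makes the chroma planes smoothest. The coefficients and the example triple are illustrative only:

```python
# Toy decode + clamp: show that "subnormal" YCC triples with nonzero
# chroma can map to the same RGB as the "normal" encoding of black.

def ycc_to_rgb_clamped(y, cb, cr):
    r = y + 1.402 * cr
    g = y - 0.344136 * cb - 0.714136 * cr
    b = y + 1.772 * cb
    clamp = lambda v: min(max(v, 0.0), 1.0)
    return clamp(r), clamp(g), clamp(b)

# The "normal" encoding of black:
assert ycc_to_rgb_clamped(0.0, 0.0, 0.0) == (0.0, 0.0, 0.0)
# A "subnormal" triple with nonzero chroma that also decodes to black:
assert ycc_to_rgb_clamped(-0.1, -0.05, -0.05) == (0.0, 0.0, 0.0)
```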
|
|
Jyrki Alakuijala
|
2025-07-22 10:13:38
|
are these results from libjxl head?
|
|
2025-07-22 10:17:24
|
I can make this better (at the cost of some compression density elsewhere)
|
|
|
Demiurge
|
2025-07-22 10:30:43
|
Don't desaturate my dragon
|
|
|
spider-mario
I kind of like the idea of artificially making the input a bit more saturated so that the decoded version is correct -- a sort of "chroma gaborish", in a sense
|
|
2025-07-22 10:33:28
|
It doesn't matter how it's done, as long as it works and is cheap to do. Even if it's a pre-processing hack. Isn't there a way to predict and compensate for desaturation effect so the final result is the same color as the original?
|
|
|
jonnyawsom3
|
|
Jyrki Alakuijala
are these results from libjxl head?
|
|
2025-07-22 11:02:40
|
Yes
|
|
|
Jyrki Alakuijala
I can make this better (at the cost of some compression density elsewhere)
|
|
2025-07-23 01:04:14
|
I'd try the new quant table with increased B AC first. It should have minimal impact on other image properties with minimal overhead while being fast to implement and test
|
|
|
Demiurge
|
2025-07-23 04:25:50
|
I don't think that's the right approach, to increase the precision of the quant table's higher frequency coefficients. The human eye is not that sensitive to blurring of the chroma channel. The problem is that the estimation of the final color vs the original color is way, way off. If that can be corrected by predicting and compensating for the error, either during quantization or beforehand during a pre-processing filter step (like <@604964375924834314> said) that would make a lot more sense and you would not have to pay the price of changing the quant tables
|
|
2025-07-23 04:28:13
|
Ideally it would be compensated for DURING the quantization step, not beforehand, but either way, whatever works and gets us there.
|
|
2025-07-23 04:30:48
|
Even if the B channel AC is to blame, you don't need to change the quant tables to predict and compensate for the color shift
|
|
2025-07-23 04:33:19
|
You just have to predict and factor it in, either before or during DCT
|
|
|
Jyrki Alakuijala
|
2025-07-23 08:00:18
|
I'll try to reduce it with more traditional approaches first
|
|
2025-07-23 08:00:35
|
the blue component desaturation is now the problem, and red-green is ok?
|
|
|
diskorduser
|
|
Jyrki Alakuijala
the blue component desaturation is now the problem, and red-green is ok?
|
|
2025-07-23 08:18:41
|
I noticed yellows getting desaturated.
|
|
|
A homosapien
|
|
Jyrki Alakuijala
the blue component desaturation is now the problem, and red-green is ok?
|
|
2025-07-23 08:19:33
|
If red-green desaturation is present, it is not perceptible (as far as I can see), which is what matters. Blue-yellow is quite notable even at lower distances.
|
|
2025-07-23 08:29:06
|
Here is a good problem case image
Left = Original, Right = JXL d2
|
|
|
Jyrki Alakuijala
|
|
diskorduser
I noticed yellows getting desaturated.
|
|
2025-07-23 09:32:50
|
that is too much blue
|
|
|
Demiurge
|
|
Jyrki Alakuijala
the blue component desaturation is now the problem, and red-green is ok?
|
|
2025-07-23 09:33:24
|
I noticed greens getting slightly desaturated but it's harder to notice compared to orange and red and yellow.
|
|
2025-07-23 09:34:42
|
I think it might affect all colors... but it's easiest for me to notice with yellows and oranges and reds simply because those colors have more cone receptors in my eye
|
|
|
Jyrki Alakuijala
|
|
A homosapien
Here is a good problem case image
Left = Original, Right = JXL d2
|
|
2025-07-23 09:35:03
|
I don't see it with this one
|
|
|
Demiurge
|
2025-07-23 09:36:31
|
I can definitely see the difference side by side even on my iPhone
|
|
2025-07-23 09:36:57
|
The left image is way more saturated and the one on the right is pale
|
|
2025-07-23 09:37:24
|
It looks more deeply red and orange, even without zooming in
|
|
2025-07-23 09:38:08
|
I would rank it "crazy obvious" level of difference
|
|
2025-07-23 09:38:54
|
It looks like the reds are replaced with a pale washed out vomit yellow
|
|
2025-07-23 09:40:10
|
I think the Detroit factory image is still probably one of the best examples of the problem
|
|
2025-07-23 09:40:28
|
Or anything with deep red-orange
|
|
|
NovaZone
|
2025-07-23 09:42:27
|
in testing it's actually a problem with green primarily, followed by yellow
|
|
2025-07-23 09:45:08
|
ez to tell with some basic subtraction
|
|
|
Demiurge
|
2025-07-23 09:48:42
|
Are there any colors that aren't affected? Like... Blue?
|
|
|
NovaZone
|
2025-07-23 09:49:18
|
i mean considering one of the primes is off i doubt it
|
|
|
Demiurge
|
2025-07-23 09:49:21
|
I have a feeling it would be hard to tell even if deep blues were desaturated since our eyes are much less sensitive to deep blues
|
|
|
NovaZone
ez to tell with some basic subtraction
|
|
2025-07-23 09:51:32
|
I don't think this is a valid way to tell since it doesn't represent how sensitive the human visual system is to the difference
|
|
|
NovaZone
|
2025-07-23 09:52:16
|
i mean, is 255 then accurate? if not, then well
|
|
|
Demiurge
|
2025-07-23 09:52:58
|
For example the color of the diff/subtraction map is not the same color as the original image
|
|
2025-07-23 09:54:02
|
And for a variety of reasons it doesn't give us valid insight in terms of difference to the human visual system
|
|
2025-07-23 09:54:57
|
It tells us more about objective/digital/mathematical difference but not psychovisual
|
|
2025-07-23 09:55:37
|
It would be better to do an A/B flicker test to see which areas change the most visually
|
|
2025-07-23 09:55:53
|
For valid insight
|
|
|
NovaZone
|
2025-07-23 09:57:05
|
sure but for lossy it should artifact not color shift
|
|
2025-07-23 09:58:22
|
examining the range of that green block shows wild inconsistencies
|
|
2025-07-23 09:59:39
|
granted even on stupid high zoom it's impossible to perceive
|
|
2025-07-23 09:59:54
|
but thats with the assistance of algos to smooth it out
|
|
2025-07-23 10:01:50
|
for reference, ye olde jpeg
|
|
2025-07-23 10:02:21
|
also has problems with yellow clearly but 0 with green
|
|
2025-07-23 10:04:43
|
webp/avif largely the same story
|
|
|
Demiurge
|
|
NovaZone
for reference, ye olde jpeg
|
|
2025-07-23 10:05:12
|
Looks like olde JPEG has a problem with AC details around the text but the background color itself matches
|
|
2025-07-23 10:05:39
|
Meanwhile JXL screws up the background color and not just the text details
|
|
2025-07-23 10:06:01
|
Also the red looks just as bad or worse than the green
|
|
|
NovaZone
|
2025-07-23 10:06:56
|
yep, while i'll admit the testing methodology is nowhere near perfect, it does have a decently accurate representation of harsh flaws
|
|
|
Demiurge
|
2025-07-23 10:08:16
|
Yeah red and green is screwy
|
|
|
NovaZone
|
2025-07-23 10:12:57
|
ref img btw https://htmlcolorcodes.com/color-chart/
|
|
2025-07-23 10:13:08
|
web safe color chart one
|
|
|
Demiurge
|
2025-07-23 10:18:48
|
Still, in order to get a true idea about how bad the problem is and where it exists, is an A/B flicker test at the bare minimum...
|
|
|
NovaZone
|
2025-07-23 10:19:42
|
yea mostly just wanted to share a ez way for regular ppl to see the problems very clearly XD
|
|
2025-07-23 10:20:53
|
throw img in pipeline > convert > subtract > use eyeballs
|
|
|
A homosapien
|
|
Jyrki Alakuijala
I don't see it with this one
|
|
2025-07-23 10:49:15
|
Did you do a flicker test? Side by side is more difficult to spot. But still perceptible for me.
|
|
|
jonnyawsom3
|
|
Distance 1
|
|
2025-07-23 11:10:59
|
You can replicate it within a minute or two using a checkerboard, since it's all AC/HF. Interestingly I can't see it on my phone for the checker patterns, but it's obvious for Sapien's image. Likely depends on how saturated the display can be too
|
|
2025-07-23 02:39:38
|
|
|
|
A homosapien
Here is a good problem case image
Left = Original, Right = JXL d2
|
|
2025-07-23 02:40:44
|
To see if this is a problem with Jyrki's monitor, or eyes
|
|
|
spider-mario
|
2025-07-23 03:22:13
|
at first, I thought I couldn't see it because I was looking at the blue parts (where you can see some detail blurring but it's not really clear that anything happens to the saturation)
|
|
2025-07-23 03:22:17
|
it's much more visible in the yellow parts
|
|
|
_wb_
|
2025-07-23 04:08:06
|
flipping between the two images, the yellow clearly becomes more grayish
|
|
|
jonnyawsom3
|
|
spider-mario
at first, I thought I couldn't see it because I was looking at the blue parts (where you can see some detail blurring but it's not really clear that anything happens to the saturation)
|
|
2025-07-23 04:18:20
|
The issue is that blue gets more saturated due to quantizing/rounding, which desaturates yellows and reds
|
|
|
Demiurge
|
2025-07-23 08:48:41
|
Yellow is on the opposite side of the spectrum from blue. It's weird saying it gets "more blue" rather than just saying it gets closer to zero or less saturated. Though technically "closer to blue" and "closer to zero" are the same thing in this situation
|
|
2025-07-23 08:49:30
|
Anyways it's really weird how Jyrki seems to be the only one unable to see what we're seeing
|
|
|
Jyrki Alakuijala
|
|
Demiurge
Anyways it's really weird how Jyrki seems to be the only one unable to see what we're seeing
|
|
2025-07-23 09:17:07
|
I can see yellow become grayer, but I wasn't seeing the red-green getting less saturated. I'll get Sami's (spider-mario's) help tomorrow to see what I should be looking at.
|
|
|
To see if this is a problem with Jyrki's monitor, or eyes
|
|
2025-07-23 09:18:01
|
I use two monitors (macbook pro and a rare dell 8k, but I run it in 4k mode now)
|
|
2025-07-23 09:18:45
|
I acknowledge that it can be that my eyes are less sensitive for this than other people's eyes
|
|
2025-07-23 09:19:46
|
we were smart enough to add two 3 bit values for X quantization adjustment and B quantization adjustment -- the easiest fix is to just ramp up the precision by adjusting those
|
|
|
Demiurge
|
2025-07-23 09:23:09
|
Ramping up precision is expensive though
|
|
2025-07-23 09:23:14
|
You have to pay for it
|
|
|
jonnyawsom3
|
|
Demiurge
Yellow is on the opposite side of the spectrum as blue. It's weird saying it gets "more blue" rather than just saying it gets closer to zero or less saturated. While technically "closer to blue" and "closer to zero" are the same thing in this situation
|
|
2025-07-23 09:23:30
|
Correct, but due to the XYB transform, 0 on the B channel can actually be more blue for saturated colors like yellow. This was an example of increasing only the B channel by 2. The result was the RGB image having an increase of 24, desaturating the yellow https://discord.com/channels/794206087879852103/794206170445119489/1375373628199272550
|
|
|
Jyrki Alakuijala
we were smart enough to add two 3 bit values for X quantization adjustment and B quantization adjustment -- the easiest fix is to just ramp up the precision by adjusting those
|
|
2025-07-23 09:23:58
|
Do you mean this? https://discord.com/channels/794206087879852103/794206170445119489/1375384819231490048 Because we tried and it didn't work
|
|
|
Demiurge
|
2025-07-23 09:24:45
|
If you can somehow predict and compensate for the predicted color-shift error, you don't have to pay for additional precision
|
|
|
Jyrki Alakuijala
|
|
Demiurge
You have to pay for it
|
|
2025-07-23 09:30:11
|
during the more active butteraugli/JPEG XL days I used NEC and EIZO photography-quality monitors, now everything is more usual
|
|
|
jonnyawsom3
|
2025-07-23 09:30:52
|
Original, Head/Main, Maximum boosted B precision via 3 bit adjustment, v0.8
|
|
2025-07-23 09:31:35
|
v0.8 is still the most even and saturated result, while being 20% smaller, but still has some desaturation too
|
|
|
Jyrki Alakuijala
|
2025-07-23 09:32:58
|
very interesting
|
|
2025-07-23 09:33:32
|
we do quite a bit of increased 0 quantization for chromaticity, and those heuristics have changed over time
|
|
2025-07-23 09:33:48
|
perhaps it would be a good idea to bring them back to the 0.8 kind
|
|
|
jonnyawsom3
|
2025-07-23 09:35:11
|
That would explain it, as for yellow 0 in the B channel results in more blue instead of less as I mentioned before
|
|
2025-07-23 09:37:10
|
An interesting note, the bottom right pixel always has the original color. For the recent images, that's just black, but in my last test it's quite obvious https://discord.com/channels/794206087879852103/794206087879852106/1397219407650095227
|
|
2025-07-23 09:45:36
|
Hmm, I can bisect it somewhat down to a change between v0.8 and v0.9, but before your AC strategy PR
|
|
2025-07-23 09:46:24
|
Though, it'd likely be good to aim for even better saturation, than just rolling it back to v0.8's level
|
|
2025-07-23 09:47:17
|
My tiny test image (It works at higher resolutions, but this saves on encode time and my eyes hurting from moiré)
|
|
|
Jyrki Alakuijala
|
2025-07-23 10:06:33
|
I'm thinking it is likely in quantization
|
|
|
Demiurge
Ramping up precision is expensive though
|
|
2025-07-23 10:07:11
|
true
|
|
|
juliobbv
|
2025-07-24 01:19:53
|
did JXL at some point during its development implement a trellis quantization strategy that can decimate near-0 coefficients to 0? sounds like that strategy might be malfunctioning for these kinds of images if that's the case
|
|
2025-07-24 01:22:27
|
mainline SVT-AV1 has a similar issue where near-greyscale HDR content looks blotchy because the RDOQ process can be too greedy at decimating small coefficients, therefore causing periodic undershoots/overshoots in the chroma channels
|
|
2025-07-24 01:25:07
|
my fix/workaround was to make RDOQ be less aggressive for chroma blocks that only have a couple small coefficients <:kekw:808717074305122316>
|
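That workaround can be sketched as a guard in front of coefficient decimation. Everything here (names, the coefficient count, the threshold) is made up for illustration and is not SVT-AV1's actual API: chroma blocks carrying only a couple of small coefficients are left alone, since zeroing them shifts the block's average chroma.

```python
# Sketch: decimation that is less aggressive on sparse chroma blocks.

def decimate(coeffs, is_chroma, thresh=1):
    nonzero = [c for c in coeffs if c != 0]
    # Chroma block with few, small coefficients: leave it untouched,
    # otherwise the undershoot shows up as a visible color shift.
    if is_chroma and len(nonzero) <= 2 and all(abs(c) <= thresh for c in nonzero):
        return list(coeffs)
    return [0 if abs(c) <= thresh else c for c in coeffs]

assert decimate([1, 0, 0, -1], is_chroma=True) == [1, 0, 0, -1]   # preserved
assert decimate([1, 0, 0, -1], is_chroma=False) == [0, 0, 0, 0]   # decimated
```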
|
|
CrushedAsian255
|
2025-07-24 08:07:18
|
I feel like, given how colour data is more patches and gradients compared to luminance, if someone were to make another new format, Modular feels like a better algorithm for chroma data
|
|
|
Jyrki Alakuijala
|
|
juliobbv
did JXL at some point during its development implement a trellis quantization strategy that can decimate near-0 coefficients to 0? sounds like that strategy might be malfunctioning for these kinds of images if that's the case
|
|
2025-07-24 09:46:58
|
I don't like trellis -- it feels just too slow and similar benefits can be obtained with other heuristics, like mozjpeg has trellis, but jpegli doesn't
|
|
|
juliobbv
did at some point during JXL's development implement a trellis quantization strategy that can decimate near-0 coefficients to 0? sounds like that strategy might be malfunctioning for these kinds of images if that's the case
|
|
2025-07-24 09:47:48
|
There have been several changes to near-0 coefficients, particularly related to chromaticity -- I think these are the reason for the degradation
|
|
|
juliobbv
my fix/workaround was to make RDOQ be less aggressive for chroma blocks that only have a couple small coefficients <:kekw:808717074305122316>
|
|
2025-07-24 09:49:13
|
yes, this is the way -- I have some of this in place for y already where I calculate 4 quadrants of energy within each dct block, and try to maintain the energy by even 'rounding' a very small value up to 1 (or -1) to maintain energy
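A rough sketch of that quadrant-energy idea (hypothetical names and threshold; the real logic lives around `AdjustQuantBlockAC` in `enc_group.cc`):

```python
import numpy as np

def preserve_quadrant_energy(coeffs, quantized, threshold=0.5):
    """Hypothetical sketch: if quantization dropped too much energy in a
    4x4 quadrant of an 8x8 DCT block, bump the largest zeroed coefficient
    to +/-1 to keep that quadrant's energy closer to the original."""
    out = quantized.copy()
    for qi in range(2):
        for qj in range(2):
            sl = (slice(qi * 4, qi * 4 + 4), slice(qj * 4, qj * 4 + 4))
            orig_e = np.sum(coeffs[sl] ** 2)
            quant_e = np.sum(out[sl].astype(float) ** 2)
            if quant_e < threshold * orig_e:
                # largest coefficient that quantization zeroed out
                zeroed = np.where(out[sl] == 0, np.abs(coeffs[sl]), 0)
                if zeroed.max() > 0:
                    i, j = np.unravel_index(np.argmax(zeroed), zeroed.shape)
                    block = out[sl]  # view into out
                    block[i, j] = 1 if coeffs[sl][i, j] > 0 else -1
    return out
```

The point is not the exact threshold but the invariant: a block full of small-but-coherent detail should not quantize to pure flatness, which is exactly what produces the desaturation being discussed.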
|
|
|
jonnyawsom3
|
2025-07-24 03:02:24
|
I think that would fix it, internally the B values were off by 1 or 2. Does it have much of a speed penalty?
|
|
|
juliobbv
|
|
Jyrki Alakuijala
There have been several changes to near-0 coefficients, particularly related to chromaticity -- I think these are the reason for the degradation
|
|
2025-07-24 05:35:06
|
yeah, that sounds like the smoking gun
|
|
|
Demiurge
|
2025-07-25 04:21:51
|
Maintaining the same energy... Sounds like error diffusion.
|
|
2025-07-25 04:22:11
|
Nice
|
|
|
Jyrki Alakuijala
|
|
Demiurge
Maintaining the same energy... Sounds like error diffusion.
|
|
2025-07-25 08:56:29
|
there is an analogy with it, just more hacky than usual error diffusion
|
|
|
Demiurge
|
2025-07-25 09:03:28
|
The hackier the better when it comes to this kind of thing
|
|
2025-07-25 09:03:47
|
As long as it works
|
|
|
jonnyawsom3
|
2025-07-25 09:04:07
|
As if we need libjxl to be any more confusing...
|
|
|
Demiurge
|
2025-07-25 09:06:54
|
What's confusing is the over(ab)use of C++ template hell, and the messy structure of the source tree.
|
|
|
Jyrki Alakuijala
|
2025-07-25 09:07:36
|
this part of the code is just honestly confusing, no additional confusion was necessary to be added
|
|
|
Demiurge
|
2025-07-25 09:07:58
|
Clever hacks to preserve fidelity are a good thing
|
|
2025-07-25 09:08:14
|
They are inevitable when making a good codec
|
|
|
Jyrki Alakuijala
|
2025-07-25 09:08:59
|
starting from https://github.com/libjxl/libjxl/blob/73beeb5409cd805cf8e957b2f33467be74dfbfe2/lib/jxl/enc_group.cc#L58
|
|
2025-07-25 09:09:15
|
there is a bit of additional confusion mixed in
|
|
2025-07-25 09:09:58
|
my estimate of human population who deeply understand AdjustQuantBlockAC function: 1
|
|
2025-07-25 09:10:55
|
also, it is not clear to many why we need QuantizeRoundtripYBlockAC
|
|
|
Demiurge
|
2025-07-25 10:32:37
|
Anyways the color desaturation thing is really severe, so I am glad you thought of using that "energy preservation" trick you use for luma and applying it to chroma as well...
|
|
2025-07-25 10:32:54
|
Currently jxl is probably the worst codec of all when it comes to preserving color fidelity
|
|
2025-07-25 10:33:04
|
And it affects jpegli too
|
|
2025-07-25 10:33:25
|
Not even libjpeg has such an issue
|
|
2025-07-25 10:34:06
|
Sure other codecs have other severe problems, but not this color shift issue
|
|
2025-07-25 10:34:58
|
It's unique to libjxl (and jpegli)
|
|
|
jonnyawsom3
|
2025-07-25 11:00:26
|
That's because XYB is unique to libjxl and jpegli
|
|
|
spider-mario
|
2025-07-25 11:09:34
|
does it affect jpegli without XYB?
|
|
|
jonnyawsom3
|
2025-07-25 11:24:07
|
Well, I discovered that Adaptive Quantization really hates small images
Original, jpegli, XYB, jpegli-NAQ, XYB-NAQ
|
|
2025-07-25 11:25:29
|
Even then, the final XYB image is still slightly desaturated compared to YCbCr next to it though
|
|
|
spider-mario
does it affect jpegli without XYB?
|
|
2025-07-25 11:33:39
|
Very slightly for non-XYB, but still less than JXL currently
Original, jpegli, XYB, JXL
|
|
|
Demiurge
|
|
spider-mario
does it affect jpegli without XYB?
|
|
2025-07-26 12:10:56
|
idk, I haven't tested.
|
|
|
Very slightly for non-XYB, but still less than JXL currently
Original, jpegli, XYB, JXL
|
|
2025-07-26 12:12:38
|
I cannot tell the difference between those 4 side-by-side...
|
|
|
jonnyawsom3
|
2025-07-26 12:13:04
|
Let's roll it by again then
|
|
|
Demiurge
|
2025-07-26 12:13:44
|
But I notice there is a lot more blue and yellow chromatic ringing in the 3rd image
|
|
|
jonnyawsom3
|
|
Demiurge
|
2025-07-26 12:14:07
|
Also I prefer 1x scale comparisons
|
|
2025-07-26 12:14:19
|
Especially for this color issue, you do not need to zoom in to see the color shift
|
|
2025-07-26 12:15:02
|
The thing is, I notice the color shift in almost every image. But it's not as obvious to me here.
|
|
|
jonnyawsom3
|
2025-07-26 12:15:23
|
I zoom in because it's in the areas of HF content, which are small details
|
|
|
juliobbv
|
|
Very slightly for non-XYB, but still less than JXL currently
Original, jpegli, XYB, JXL
|
|
2025-07-26 12:26:45
|
I do see the desaturation; that said, luma artifacts appear to be the most noticeable to me
|
|
|
Demiurge
|
2025-07-26 12:45:31
|
The desaturation I've noticed and complained about were all very obvious from a distance
|
|
2025-07-26 12:59:39
|
And affected the entire image, not just small details
|
|
|
jonnyawsom3
|
2025-07-26 02:39:45
|
|
|
|
Very slightly for non-XYB, but still less than JXL currently
Original, jpegli, XYB, JXL
|
|
2025-07-27 04:15:44
|
Definitely depends on screen brightness too. I couldn't see it in sunlight, or with the brightness low, but turning it up in a dim room makes it obvious
|
|
|
pixyc
|
2025-07-27 05:12:46
|
pegj lx
|
|
|
Jyrki Alakuijala
|
|
Definitely depends on screen brightness too. I couldn't see it in sunlight, or with the brightness low, but turning it up in a dim room makes it obvious
|
|
2025-07-28 03:26:46
|
true -- this makes a big difference in observing this
|
|
|
jonnyawsom3
|
|
Jyrki Alakuijala
true -- this makes a big difference in observing this
|
|
2025-07-28 03:30:55
|
It's likely for a similar reason. Such saturated yellow is in a very narrow range, so lower screen brightness peaks at the lower saturation, making them look the same
The AQ heuristics aren't aware of the narrow range, so B also gets clipped to a lower value (or zeroed as you said before)
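The failure mode described above can be shown with a toy scalar quantizer (the numbers are made up for illustration, not libjxl's actual XYB scaling):

```python
def quantize(value, step):
    # plain round-to-nearest scalar quantization
    return round(value / step) * step

# A saturated yellow may only deviate by ~0.015 in the narrow B channel,
# while a quantizer step tuned for wider-range channels could be 0.04:
# the entire chroma detail then rounds away to zero, i.e. desaturation.
small_b = quantize(0.015, 0.04)   # rounds to 0.0
larger  = quantize(0.05, 0.04)    # survives as one quantizer step
```

If the AQ heuristics knew the channel's range was this narrow, they could pick a proportionally finer step for B instead of zeroing it.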
|
|
|
spider-mario
|
2025-07-28 04:06:21
|
https://youtu.be/hCQCP-5g5bo
|
|
|
jonnyawsom3
|
2025-07-28 04:18:07
|
I've been sent that a few times, pretty disappointing for me. Didn't even use an image file, just drew onto a spectral synthesiser. Easily could've used a tiny PNG or a JXL art and gotten it to work
|
|
2025-07-29 12:14:04
|
|
|
|
Quackdoc
|
2025-08-01 05:10:33
|
can some folk test https://github.com/Quackdoc/cosmic-files/commits/jxl-preview-gallery ? need to know if it's usable enough to make a PR. I've had some images fail in weird ways, but my images may just be outliers.
press `space` to open a gallery view or `ctrl + space` to open the details pane and just focus a jxl image and use arrows to navigate around
|
|
2025-08-01 05:11:01
|
I'm mostly concerned about performance and compatibility, I dunno if I need to put some limits in or anything
|
|
2025-08-01 05:11:55
|
it may also crash; if it crashes on an image, that's probably hitting https://github.com/tirr-c/jxl-oxide/issues/473
|
|
|
Traneptora
|
2025-08-04 09:01:43
|
so I just discovered something very funny
|
|
2025-08-04 09:02:09
|
if you type subprocess.run into your browser, trying to google the python function, it takes you to `http://subprocess.run/` instead
|
|
2025-08-04 09:02:33
|
but <http://subprocess.run/> apparently just returns a 301 to <https://docs.python.org/3/library/subprocess.html#subprocess.run>
|
|
2025-08-04 09:02:36
|
which is very very funny
|
|
|
_wb_
|
2025-08-04 09:59:27
|
someone registered that domain just to do that? lol
|
|
|
Quackdoc
|
2025-08-04 09:59:41
|
I wish I had that kinda cash lmao
|
|
|
spider-mario
|
2025-08-04 07:30:36
|
same for https://systemd.network/
|
|
|
CrushedAsian255
|
|
Traneptora
if you type subprocess.run into your browser, trying to google the python function, it takes you to `http://subprocess.run/` instead
|
|
2025-08-05 01:33:02
|
LOL the network blocked it as 'malware'
|
|
|
π°πππ
|
|
spider-mario
same for https://systemd.network/
|
|
2025-08-05 01:57:19
|
Wow, I just realized why I love OpenRC / S6 even more
There is literally a book, just for the network module
|
|
|
Quackdoc
|
|
π°πππ
Wow, I just realized why I love OpenRC / S6 even more
There is literally a book, just for the network module
|
|
2025-08-05 05:00:14
|
systemd is some of the most over engineered crap ever lol
|
|
|
|
afed
|
2025-08-06 02:33:04
|
https://github.com/w3c/png/issues/39
|
|
2025-08-06 02:33:53
|
https://news.ycombinator.com/item?id=44801027
|
|
|
jonnyawsom3
|
2025-08-06 02:37:06
|
<#805176455658733570> but <https://github.com/w3c/png/issues/426#issuecomment-2053907919>
> We probably also need to have a discussion about PNG updates vs. PNG2 (better name suggestions welcome). Anything that would significantly break existing PNGs should perhaps instead go into PNG2. The trade off is "old stuff broke" vs. "new adoption is extremely hard, so it was in vain." If the goal is to improve the world('s data situation), both make a strong case.
>
> The challenge of a PNG2 is that it wouldn't just be competing with the original PNG format, but also all the subsequent lossless formats. Lossless WebP has 14 different filters and the ability to de-correlate channels, a number of formats use CABAC to get better entropy coding, etc.
>
> I think that's okay. There probably should be multiple solutions.
> Even if the initial release of the hypothetical PNG2 is just a big catch-up and not yet fully competitive, I think it would be a good thing to do.
|
|
2025-08-06 02:37:53
|
I left a comment over a month ago when it was discussed here previously
|
|
|
|
afed
|
2025-08-06 12:42:17
|
yeah, it's already been discussed and it's <#805176455658733570>, but it just came back again as a new discussion on hn
and on-topic, I meant that this is basically no different from introducing a new format, even if it's based on an existing one. Why do we need a new format that's still worse than many other more modern alternatives, especially jxl?
And is "less new code" really so important that nothing else matters?
|
|
|
spider-mario
|
2025-08-06 10:15:07
|
why did the brotli comment get downvoted so much
|
|
2025-08-06 10:15:12
|
it would have seemed worth considering, if the overall idea had been
|
|
2025-08-06 10:15:42
|
likewise for the content-encoding
|
|
|
|
afed
|
2025-08-06 10:44:09
|
less hyped, less well-known, not many people have tested it and know how effective it is, most people even think that it is only good for text compression and bad for everything else
|
|
|
jonnyawsom3
|
2025-08-07 01:48:08
|
Chicken and the egg, but the encoder has been hardly improved so there's even less reason to use it, which then means no one wants to improve the encoder
|
|
|
|
afed
|
2025-08-07 01:58:54
|
brotli?
there have been massive improvements, but the stable version has not been released yet
the last release was a long time ago
`brotli encodes 26% faster (lz compression tricks)`
<https://encode.su/threads/4408-brotli-encodes-26-faster-(lz-compression-tricks)>
|
|
|
jonnyawsom3
|
2025-08-07 03:06:54
|
Last release was 2020. If people can't see the improvements, they practically don't exist
|
|
|
spider-mario
|
2025-08-07 10:38:36
|
a lot of people also seem to just assume (without testing) that because compression is slow (by default), decompression must also be
|
|
2025-08-07 10:39:06
|
someone on reddit seemed to believe that it decompressed slower than LZMA
|
|
|
jonnyawsom3
|
2025-08-07 10:39:45
|
Ironic thing is, ZSTD is the same if you use the higher settings. It's just because brotli defaults to better compression
|
|
|
spider-mario
a lot of people also seem to just assume (without testing) that because compression is slow (by default), decompression must also be
|
|
2025-08-07 10:46:51
|
This is a great demo I found where I can test arbitrary files on any device with a browser. Tested compressing DNGs on my phone in the past
https://bench.nickb.dev/
|
|
2025-08-07 11:27:45
|
It does show Brotli as twice as slow as ZSTD there though...
|
|
|
spider-mario
|
2025-08-07 11:53:39
|
that it is, but itβs hardly βslowβ, and certainly not as much as LZMA
|
|
|
TheBigBadBoy - πΈπ
|
2025-08-07 02:23:43
|
nothing is slow when we compare to `cjxl -e 11` <:KekDog:805390049033191445>
|
|
|
Kupitman
|
|
TheBigBadBoy - πΈπ
nothing is slow when we compare to `cjxl -e 11` <:KekDog:805390049033191445>
|
|
2025-08-07 03:54:13
|
nothing is slow when we compare to PAQ...
|
|
|
HCrikki
|
2025-08-09 03:46:45
|
https://github.com/web-platform-dx/developer-signals/issues/215
|
|
2025-08-09 03:47:28
|
proposal could use a signal boost and any relevant feedback about usefulness
|
|
|
Jyrki Alakuijala
|
|
Well, I discovered that Adaptive Quantization really hates small images
Original, jpegli, XYB, jpegli-NAQ, XYB-NAQ
|
|
2025-08-11 10:43:02
|
is that a 9x9 pixel image that is zoomed up?
|
|
|
jonnyawsom3
|
|
Jyrki Alakuijala
is that a 9x9 pixel image that is zoomed up?
|
|
2025-08-11 10:49:53
|
Yes, it applied to larger versions of the same pattern too, but the 9x9 stopped my eyes from hurting due to moiré. It's an easy way to replicate the desaturation found in natural images too
|
|
2025-08-11 10:55:01
|
Well that's interesting... If I encode it as a JXL with `--disable_perceptual_optimizations` the yellow remains saturated but the B channel bleeds into the black pixels too
|
|
2025-08-11 10:55:29
|
Default lossy for comparison
|
|
|
Jyrki Alakuijala
|
|
Well that's interesting... If I encode it as a JXL with `--disable_perceptual_optimizations` the yellow remains saturated but the B channel bleeds into the black pixels too
|
|
2025-08-11 01:13:19
|
interesting, one would almost assume the opposite to happen by the flag name alone
|
|
|
jonnyawsom3
|
|
Jyrki Alakuijala
interesting, one would almost assume the opposite to happen by the flag name alone
|
|
2025-08-11 01:19:43
|
I realised it's because it disables XYB, eliminating the narrow range of the B channel that causes the desaturation
|
|
|
DZgas Π
|
2025-08-12 08:20:03
|
https://jeremylee.sh/bins/ the only one on the internet
|
|