JPEG XL

color

w
2024-09-26 09:14:56
first
CrushedAsian255
2024-09-26 09:16:20
Second
_wb_
2024-09-26 09:17:15
https://en.m.wikipedia.org/wiki/LMS_color_space
Traneptora
2024-09-26 09:17:52
Not only that <@386612331288723469>, but the transformation from one gamut to another is a linear transformation
2024-09-26 09:18:51
so if I have colors represented in sRGB between 0.0 and 1.0, and have accurate out-of-gamut colors represented by having numbers less than 0.0 or more than 1.0, then I can take that RGB weight vector, and multiply it by a matrix, and get a vector in another space, say, BT.2020's gamut
CrushedAsian255
2024-09-26 09:18:52
So it’s just a change of basis?
Traneptora
2024-09-26 09:18:57
that's exactly what it is
2024-09-26 09:19:07
choosing different primaries is literally a change of basis
CrushedAsian255
2024-09-26 09:19:26
So that’s why they’re called matrix coefficients
Traneptora
2024-09-26 09:19:33
matrix coefficients are something else
_wb_
2024-09-26 09:19:35
XYB is basically: Y is M+L (luma-ish, but a bit more yellowish) X is M-L (red-green chroma) B is S (but usually encoded as B-Y, so blue-yellow chroma)
CrushedAsian255
Traneptora matrix coefficients are something else
2024-09-26 09:19:41
Nevermind LMAO
Traneptora
2024-09-26 09:19:43
matrix coefficients refers to YCbCr <-> RGB
2024-09-26 09:19:47
which is also a linear transform
CrushedAsian255
2024-09-26 09:19:51
So kinda similar?
Traneptora
2024-09-26 09:19:55
just a different one
CrushedAsian255
_wb_ XYB is basically: Y is M+L (luma-ish, but a bit more yellowish) X is M-L (red-green chroma) B is S (but usually encoded as B-Y, so blue-yellow chroma)
2024-09-26 09:20:33
So it’s a linear transform from LMS?
Traneptora
2024-09-26 09:20:49
"matrix, primaries, and transfer" are the three things in cICP matrix refers to the YCbCr matrix, which is just a different representation of the RGB data. It's the same data. JXL doesn't use matrix because it doesn't deal with YCbCr outside of restructured jpegs, which uses the jpeg matrix
2024-09-26 09:21:02
"primaries" refers to the choice of primaries and white point, or gamut
CrushedAsian255
2024-09-26 09:21:03
[XYB] doesn’t really have a concept of a gamut or range or midpoints?
Traneptora
2024-09-26 09:21:27
XYB is approximately biased LMS but not quite
_wb_
CrushedAsian255 So it’s a linear transform from LMS?
2024-09-26 09:21:31
Not quite linear, there is also a transfer function involved that models how cone activation is not linear
CrushedAsian255
CrushedAsian255 [XYB] doesn’t really have a concept of a gamut or range or midpoints?
2024-09-26 09:22:23
Is it just arbitrary values in no specific scale/range?
Traneptora
2024-09-26 09:22:26
if you start with linear light colors with rec709, the way you get XYB is by multiplying the [r,g,b] vector by a matrix to convert it to "approximately lms" and then applying a biased cubic transfer
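A structural sketch (Java, to match the other code in this thread) of the procedure described above: linear RGB mixed to "approximately LMS", a biased cube-root transfer, then L/M/S recombined into X, Y, B as _wb_ outlines. The mixing matrix and bias constants are defined in the JPEG XL spec and are deliberately left as placeholders here, so this shows only the shape of the computation, not the real numbers:
```java
public final class XybSketch {
    // Placeholders -- the real opsin matrix and bias values come from the JPEG XL spec.
    static final double[][] OPSIN_MATRIX = new double[3][3];
    static final double OPSIN_BIAS = 0.0;

    // linearRgb: linear-light rec709/sRGB-primaries values, roughly in [0, 1]
    static double[] rgbToXyb(double[] linearRgb) {
        double[] lms = new double[3];
        for (int i = 0; i < 3; i++) {
            for (int j = 0; j < 3; j++) {
                lms[i] += OPSIN_MATRIX[i][j] * linearRgb[j]; // mix to "approximately LMS"
            }
            // biased, roughly cubic transfer: cube root with a small bias
            lms[i] = Math.cbrt(lms[i] + OPSIN_BIAS) - Math.cbrt(OPSIN_BIAS);
        }
        double l = lms[0], m = lms[1], s = lms[2];
        // recombine: X is the L/M difference (red-green), Y is L+M (luma-ish), B follows S
        return new double[] { (l - m) / 2.0, (l + m) / 2.0, s };
    }
}
```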
_wb_
CrushedAsian255 [XYB] doesn’t really have a concept of a gamut or range or midpoints?
2024-09-26 09:22:38
Grayscale colors have X and B (or rather B-Y) at zero
CrushedAsian255
2024-09-26 09:23:53
Does JXL always store B as B-Y?
Traneptora
CrushedAsian255 Is it just arbitrary values in no specific scale/range?
2024-09-26 09:23:57
it doesn't really have one. It's defined in the spec to be a linear transform of rec709 colors. But remember that colors can be out of range, and to transform between two spaces is also linear. so it could have been defined as a linear transform of rec2020 primaries, with a different matrix, and actually just be the same definition
_wb_
2024-09-26 09:24:10
https://discord.com/channels/794206087879852103/824000991891554375/1282281959946584186
Traneptora
2024-09-26 09:24:43
i.e. if you take the rec709 -> XYB matrix, compose it with the rec2020 -> rec709 matrix, you get a rec2020 -> XYB matrix
2024-09-26 09:24:55
which is equally valid to be used directly
2024-09-26 09:25:26
this is also why it's very easy to decode XYB to any primaries, because you figure out the XYB -> rec709 matrix and the rec709 -> foo matrix, and then compose the matrices, so you can then just use that new one to decode XYB->foo
2024-09-26 09:25:46
without any extra computing
2024-09-26 09:26:07
cause you only need to compute the conversion matrix once, and then you use it for every pixel
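A minimal Java sketch of that point: compose the two 3×3 matrices once, then reuse the single composed matrix for every pixel. The helper names are made up and the matrices are left as arguments; only the compose/apply logic is shown:
```java
public final class MatrixCompose {
    // c = a * b, so applying c is the same as applying b and then a
    static double[][] multiply(double[][] a, double[][] b) {
        double[][] c = new double[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++)
                    c[i][j] += a[i][k] * b[k][j];
        return c;
    }

    static double[] apply(double[][] m, double[] v) {
        double[] out = new double[3];
        for (int i = 0; i < 3; i++)
            for (int k = 0; k < 3; k++)
                out[i] += m[i][k] * v[k];
        return out;
    }

    // Compose once: (rec709 -> foo) * (XYB -> rec709) = (XYB -> foo),
    // then call apply() with the composed matrix on every pixel.
    static double[][] xybToFoo(double[][] xybToRec709, double[][] rec709ToFoo) {
        return multiply(rec709ToFoo, xybToRec709);
    }
}
```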
_wb_
2024-09-26 09:27:55
2024-09-26 09:28:56
These are some slices, four different values of Y, with X on the horizontal axis and B on the vertical one
Traneptora
2024-09-26 09:28:59
for example, the conversion matrix to convert colors in sRGB to colors in BT2020 is this:
```
[[0.6273918,  0.32928905, 0.043318883],
 [0.06910729, 0.919537,   0.01135542],
 [0.01638554, 0.0880046,  0.89560986]]
```
2024-09-26 09:29:05
like it's just this matrix
2024-09-26 09:30:15
if `(r, g, b)` is a vector with colors in the rec709 space, and you multiply by this matrix, you get a new vector, which is the same color, but represented in the rec2020 space
2024-09-26 09:30:17
it's a change of basis
2024-09-26 09:30:41
I'm so so glad you called it a "change of basis" cause that really hits the concept on the head
2024-09-26 09:31:01
this is the change of basis matrix for rec709 -> rec2020
dogelition
Traneptora for example, the conversion matrix to convert colors in sRGB to colors in BT2020 is this: ``` [[0.6273918, 0.32928905, 0.043318883], [0.06910729, 0.919537, 0.01135542], [0.01638554, 0.0880046, 0.89560986]] ```
2024-09-26 09:31:10
note: only works with linearized, i.e. not "gamma"-encoded values
Traneptora
2024-09-26 09:31:20
Yes, I should specify, this only works in *linear light*
CrushedAsian255
2024-09-26 09:31:49
Can you just apply the EOTF?
Traneptora
2024-09-26 09:31:57
yes, that's called linearizing
2024-09-26 09:32:06
but if you do not do that, none of what I said works
2024-09-26 09:32:23
all of what I said about primaries is a linear-light-specific thing
2024-09-26 09:32:37
generally to do any kind of color space transitioning you have to linearize everything first
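A short sketch of that linearize-first rule in Java. The class, method, and parameter names are invented for illustration, and the transfer functions and the 3×3 primaries matrix are assumed inputs rather than real constants:
```java
public final class ConvertPrimaries {
    interface TransferFunction {
        double toLinear(double encoded);
        double fromLinear(double linear);
    }

    // Convert one encoded pixel between spaces with different primaries:
    // 1) undo the source transfer, 2) apply the 3x3 primaries matrix in linear light,
    // 3) re-encode with the destination transfer.
    static double[] convert(double[] srcEncoded, TransferFunction srcTf,
                            double[][] srcToDstMatrix, TransferFunction dstTf) {
        double[] linear = new double[3];
        for (int i = 0; i < 3; i++) linear[i] = srcTf.toLinear(srcEncoded[i]);

        double[] dstLinear = new double[3];
        for (int i = 0; i < 3; i++)
            for (int k = 0; k < 3; k++)
                dstLinear[i] += srcToDstMatrix[i][k] * linear[k];

        double[] dstEncoded = new double[3];
        for (int i = 0; i < 3; i++) dstEncoded[i] = dstTf.fromLinear(dstLinear[i]);
        return dstEncoded;
    }
}
```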
dogelition
2024-09-26 09:32:54
the only operation i'm aware of that "just works" in gamma space is scaling, i.e. multiplying by `a` in gamma space is equivalent to multiplying by some `f(a)` in linear space
CrushedAsian255
2024-09-26 09:33:07
Is XYB linear or gamma?
Traneptora
dogelition the only operation i'm aware of that "just works" in gamma space is scaling, i.e. multiplying by `a` in gamma space is equivalent to multiplying by some `f(a)` in linear space
2024-09-26 09:33:12
also not true
CrushedAsian255 Is XYB linear of gamma?
2024-09-26 09:33:35
XYB is defined as you take rec709 linear, transform it, and then apply an almost-cubic transfer
dogelition the only operation i'm aware of that "just works" in gamma space is scaling, i.e. multiplying by `a` in gamma space is equivalent to multiplying by some `f(a)` in linear space
2024-09-26 09:33:57
more specifically, if the "gamma space" is actually a pure gamma curve, sure
dogelition
2024-09-26 09:34:15
that's why i didn't put gamma in quotes
Traneptora
2024-09-26 09:34:16
because `(ab)^x = a^x b^x`
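Spelling that identity out for a pure power curve (a quick side derivation, not taken from either spec): if the encoding is $E = L^{1/\gamma}$, then scaling the encoded value by $a$ scales the linear value by $a^{\gamma}$:
```latex
E = L^{1/\gamma}
\quad\Longrightarrow\quad
(aE)^{\gamma} = a^{\gamma}\,E^{\gamma} = a^{\gamma} L = f(a)\,L,
\qquad f(a) = a^{\gamma}
```
so multiplying by a in "gamma space" equals multiplying by a^γ in linear space, which is exactly the (ab)^x = a^x b^x identity above, and it no longer holds once the curve isn't a pure power law (sRGB, BT.709, PQ).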
2024-09-26 09:34:24
ah, I missed that
2024-09-26 09:34:33
most spaces are not actually those true gamma curves though
2024-09-26 09:34:43
sRGB is notably *NOT* equivalent to a pure gamma curve
2024-09-26 09:34:49
and it's the most common space
2024-09-26 09:35:08
neither is rec709, rec601, or rec2020
2024-09-26 09:35:09
or PQ
2024-09-26 09:35:11
actually nothing really is
2024-09-26 09:35:18
except DCI ig
dogelition
2024-09-26 09:36:20
afaik when doing color space math, you usually use 2.4 for bt.709
Traneptora
2024-09-26 09:36:30
nah
2024-09-26 09:36:54
bt709 has a linear segment like srgb
dogelition
2024-09-26 09:37:12
that's the OETF
Traneptora
2024-09-26 09:37:47
what you're describing is bt.1886, which is supposed to be how you invert bt709 for the purpose of *display*
2024-09-26 09:38:07
but if you're inverting bt.709 for the purpose of *encoding* then you should invert it by using the inverse of the OETF
2024-09-26 09:38:25
because the EOTF and OETF for bt709 are not inverses of each other
2024-09-26 09:38:31
the EOTF is specified in bt.1886
2024-09-26 09:38:37
and the OETF is specified in bt.709
dogelition
2024-09-26 09:38:49
a calibrated (oled or similar) bt.709 display is calibrated to 2.4 gamma, so to be clear i'm only talking about display-referred (?) math related to that
Traneptora
2024-09-26 09:39:42
if you take a video that's encoded in bt709 (like, say, most dvds) and try to do any kind of arithmetic on it though, like extract a frame, convert to srgb, edit the image, etc. then ideally you just invert the OETF
2024-09-26 09:40:00
this is my code for it (fairly basic, I know)
2024-09-26 09:40:04
```java
public static TransferFunction TF_BT709 = new TransferFunction() {
    // BT.709 OETF (linear scene light -> encoded signal); constants from H.273
    @Override
    public double fromLinear(double f) {
        if (f < 0.018053968510807807336D)
            return 4.5D * f;
        else
            return 1.0992968268094429403D * Math.pow(f, 0.45D) - 0.0992968268094429403D;
    }

    // inverse OETF (encoded signal -> linear scene light)
    @Override
    public double toLinear(double f) {
        if (f < 0.081242858298635133011D)
            return f * 0.22222222222222222222D;
        else
            return Math.pow((f + 0.0992968268094429403D) * 0.90967241568627260377D, 2.2222222222222222222D);
    }
};
```
2024-09-26 09:40:22
it has a linear segment just like SRGB
2024-09-26 09:40:32
(values taken from H.273)
dogelition
Traneptora if you take a video that's encoded in bt709 (like, say, most dvds) and try to do any kind of arithmetic on it though, like extract a frame, convert to srgb, edit the image, etc. then ideally you just invert the OETF
2024-09-26 09:41:04
but that's not going to give you the same absolute color between bt.709 and srgb, each on a properly calibrated display
_wb_
2024-09-26 09:43:32
When doing image processing, what the best space to do it in is kind of depends on what you're doing. Linear light models the physics of photons, so if you're e.g. resampling, it makes the most sense to do that in linear. But transfer functions model perception, so if you're e.g. trying to make a nice-looking gradient, a perceptual space will work better than just doing the interpolation in linear space.
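A tiny Java illustration of that point, using a pure gamma 2.2 power curve as a stand-in for "a perceptual encoding" (the class name and the choice of curve are just for this example): the 50/50 mix of black and white lands at very different encoded values depending on where you average.
```java
public final class MixDemo {
    public static void main(String[] args) {
        double gamma = 2.2;
        double black = 0.0, white = 1.0; // encoded values

        // average in linear light, then re-encode (physically correct light mixing)
        double linearMix = (Math.pow(black, gamma) + Math.pow(white, gamma)) / 2.0;
        double encodedFromLinear = Math.pow(linearMix, 1.0 / gamma); // ~0.73

        // average the encoded values directly (often the smoother-looking gradient)
        double encodedMix = (black + white) / 2.0; // 0.5

        System.out.printf("linear-light mix, re-encoded: %.3f%n", encodedFromLinear);
        System.out.printf("encoded-space mix:            %.3f%n", encodedMix);
    }
}
```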
Traneptora
dogelition a calibrated (oled or similar) bt.709 display is calibrated to 2.4 gamma, so to be clear i'm only talking about display-referred (?) math related to that
2024-09-26 09:44:24
that's actually not mentioned in Rec 709
2024-09-26 09:44:30
https://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.709-6-201506-I!!PDF-E.pdf
2024-09-26 09:44:38
(bt709 is a free pdf from itu.int)
CrushedAsian255
2024-09-26 09:45:44
Is any of this an annoying edge case in jxl?
Traneptora
2024-09-26 09:45:57
It does say this
2024-09-26 09:45:58
> In typical production practice the encoding function of image sources is adjusted so that the final picture has the desired look, as viewed on a reference monitor having the reference decoding function of Recommendation ITU-R BT.1886, in the reference viewing environment defined in Recommendation ITU-R BT.2035.
2024-09-26 09:46:19
but this is straight up a statement that production practice ignores the spec
2024-09-26 09:47:44
(bt.1886 is just gamma2.4)
dogelition
Traneptora that's actually not mentioned in Rec 709
2024-09-26 09:48:34
right, that's technically specified in bt.2035 along with peak white being 100 nits
Traneptora
2024-09-26 09:49:59
However, rec 1886 does specifically say that if program interchange is not required, they suggest an alternative EOTF to use
2024-09-26 09:50:08
(the alternative they suggest is the inverse of the one in rec709 btw)
2024-09-26 09:50:33
> that an alternative EOTF may be used in some cases where programme interchange is not required, a suggested equation is contained in informative Appendix 1.
dogelition
2024-09-26 10:00:46
but the "regular" bt.1886 eotf is the one that's recommended for HDTV production, and that is what's used in hollywood etc.
Traneptora
2024-09-26 10:01:41
It makes it seem like rec709 is entirely useless doesn't it <:KEK:1283989161111715913>
dogelition
2024-09-26 10:02:13
idk, is the OETF actually used by cameras?
Demiurge
_wb_ XYB is basically: Y is M+L (luma-ish, but a bit more yellowish) X is M-L (red-green chroma) B is S (but usually encoded as B-Y, so blue-yellow chroma)
2024-09-26 10:18:49
sounds a lot like LAB
_wb_
2024-09-26 10:23:19
The concept is quite similar, yes.
2024-09-26 10:28:53
Main difference is Lab was created by letting people compare relatively large patches of colors to try to make it perceptually uniform, while XYB is more suitable for pixel-sized patches of color
lonjil
2024-09-26 11:02:59
https://discord.com/channels/794206087879852103/806898911091753051/1288787358740779028
> you also realize here that "white" is relative. There's no perfect concept of pure white, so you have to pick one. The most common choice is the color that an object at a temperature of 6500 kelvin would radiate, or D65.
nit: since D65 is meant to be daylight, it has a slight green tint off of the ideal black body radiator of the same temperature.
_wb_
2024-09-26 12:23:48
https://en.m.wikipedia.org/wiki/DCI-P3#/media/File%3ACIE1931xy_gamut_comparison_of_sRGB_P3_Rec2020.svg
damian101
Traneptora and it's the most common space
2024-09-26 12:34:30
not even sure it is, considering how common it is to treat sRGB as gamma 2.2 on both the encode and decode side
dogelition afaik when doing color space math, you usually use 2.4 for bt.709
2024-09-26 12:36:20
for decoding, yes, as specified in bt.1886
dogelition
_wb_ https://en.m.wikipedia.org/wiki/DCI-P3#/media/File%3ACIE1931xy_gamut_comparison_of_sRGB_P3_Rec2020.svg
2024-09-26 12:36:49
note that the red primary of p3 is just barely outside of the bt.2020 gamut
2024-09-26 12:37:01
probably doesn't really matter in practice but i do find it interesting that it's technically not a subset
2024-09-26 12:37:31
(despite hdr content commonly being "p3 in a bt.2020 container")
damian101
Traneptora but if you're inverting bt.709 for the purpose of *encoding* then you should invert it by using the inverse of the OETF
2024-09-26 12:37:50
No. Unless it's live TV capture, which is the only place where the scene-referred Bt.709 OETF is used.
Traneptora if you take a video that's encoded in bt709 (like, say, most dvds) and try to do any kind of arithmetic on it though, like extract a frame, convert to srgb, edit the image, etc. then ideally you just invert the OETF
2024-09-26 12:38:29
No, absolutely not.
2024-09-26 12:39:01
I mean, that's often just not correct.
2024-09-26 12:40:49
The bt.709 OETF is just a hack to boost dark areas for direct camera-to-screen live TV, and has probably not been in normal use for many years now.
2024-09-26 12:41:38
Since we now have better, more advanced real-time processing, and customers have better TVs.
dogelition idk, is the OETF actually used by cameras?
2024-09-26 12:45:48
Very uncommon for anything that applies DSP to the video capture. Very common for old analog TV cameras.
_wb_
dogelition note that the red primary of p3 is just barely outside of the bt.2020 gamut
2024-09-26 12:58:58
Yes, though I doubt that the difference between pure P3 red and the closest you can get to that in Rec2020 is very big. While the difference between pure Rec2020 red and the closest P3 approximation of that is probably noticeable.
2024-09-26 01:04:13
I wonder if there is something somewhere that can render a chromaticity diagram close to correctly, say using many lasers that each produce a specific spectral color and together form a polygon that fits the curve well.
CrushedAsian255
2024-09-26 01:34:15
Even if you could you can’t take a picture of it
_wb_
2024-09-26 01:38:16
I would just like to see what is still missing from rec2020
afed
2024-09-26 01:44:08
https://youtu.be/0XTgz5Z1bhE
Traneptora
I mean, that's often just not correct.
2024-09-26 02:28:53
So you're saying for most purposes the thing in H.273 should be ignored
damian101
Traneptora So you're saying for most purposes the thing in H.273 should be ignored
2024-09-26 02:36:22
for encoding? yes. for decoding bt.709, h.273 recommends bt.1886 anyway
Traneptora
2024-09-26 02:37:24
for rec709 tagged content, yea
2024-09-26 02:37:27
hm
damian101
afed https://youtu.be/0XTgz5Z1bhE
2024-09-26 02:41:25
surprised he didn't mention The Matrix, which is well-known to be mastered in bt.2020 gamut, and has plenty of greens exceeding P3.
lonjil
2024-09-26 02:41:55
the greenest movie anyone has ever made
damian101
lonjil the greenest movie anyone has ever made
2024-09-26 02:42:30
I think that's mostly the excessively green-tinted Blu-ray grading
2024-09-26 02:42:36
the UHD blu-ray looks more natural
lonjil
2024-09-26 02:42:48
wait a sec and ill show you something crazy
2024-09-26 02:43:59
top: original theatrical bottom: blu-ray
damian101
for encoding? yes for decoding bt.709, h.273 recommends bt.1886 anyway
2024-09-26 02:44:00
scene-referred anything is just a terrible idea in the modern world imo
2024-09-26 02:44:21
checks out
2024-09-26 02:44:22
lol
2024-09-26 02:44:25
very green
lonjil
2024-09-26 02:44:49
for the original, they made it a bit greenish inside the matrix with set design and lighting
2024-09-26 02:45:14
for the sequels, they used digital color grading and were like "HOLY SHIT WE CAN MAKE IT SO GREEN"
2024-09-26 02:45:27
so when the first movie was re-released they changed it to match the sequels
damian101
2024-09-26 02:45:32
I see
2024-09-26 02:45:49
I found the sequels quite meh, story-wise
2024-09-26 02:45:58
but I watched them a very long time ago
2024-09-26 02:46:02
soundtrack is amazing, though
spider-mario
2024-09-26 02:58:30
I don’t actually remember whether I’ve seen The Matrix
Quackdoc
2024-09-26 03:59:42
> Friends don’t let friends say “gamma.”
https://hg2dc.com/2019/05/28/question-6/
man I hate the term gamma, we should let it die
AccessViolation_
lonjil top: original theatrical bottom: blu-ray
2024-09-26 04:04:24
My developed photos after 30 dreadful minutes of trying to figure out how Darktable works
Quackdoc
CrushedAsian255 Can you just apply the EOTF?
2024-09-26 04:11:47
when you linearize images you typically want to *undo* the encoding function, not render it out. This isn't always an important distinction, but if you use sRGB it is, because rendering it out actually causes a somewhat harsh degree of data loss
not even sure it is, considering how common it is to treat sRGB as gamma 2.2 on both the encode and decode side
2024-09-26 04:17:00
don't get me started on this <:PepeHands:808829977608323112>
2024-09-26 04:17:23
the spec is the spec, but no one cares about the spec, now we have a bunch of derivative specs with inaccuracies
2024-09-26 04:17:50
sRGB needs to hurry up and die. Its ecosystem is such a mess it needs to be taken out behind the shed and put down
damian101
2024-09-26 04:19:59
gamma 2.2 being the most common standard for displays has good reasons
CrushedAsian255
Quackdoc when you linearize images you typically want to *undo* the encoding function, not rendering it out. This isn't always an important distinction, but if you use sRGB it is, because rendering it out actually causes a somewhat harsh degree of dataloss
2024-09-26 04:23:25
So OETF^-1 ?
Quackdoc
gamma 2.2 being the most common standard for displays has good reasons
2024-09-26 04:23:44
sure, but we don't tag stuff as gamma 2.2, we tag stuff as sRGB, but half the time encode it as gamma 2.2, and then displays do really funky shit as well, why? I dunno, hell if I know, I don't think anyone knows
damian101
Quackdoc sure, but we don't tag stuff as gamma 2.2 we tag stuff as sRGB, but half the time encode it as gamma 2.2, and then displays do really funky shit shit as well, why? I dunno, hell if I know, I don't think anyone knows
2024-09-26 04:24:11
I tag my images as gamma 2.2 🙂
Quackdoc
2024-09-26 04:24:19
based as fuck
2024-09-26 04:24:30
why can't more people [pepehands](https://cdn.discordapp.com/emojis/1075509930502664302.webp?size=48&quality=lossless&name=pepehands)
2024-09-26 04:25:58
the issue is more so monitors
damian101
2024-09-26 04:25:59
But it really is a terrible mess. The future is proper color management, including transfer function, throughout the system. But for that you need properly tagged images in the first place...
Quackdoc the issue is more so monitors
2024-09-26 04:27:35
Well, that can be fixed by the consumer by changing one setting in the OS ideally...
Quackdoc
2024-09-26 04:28:26
it would be nice if most people could do that but 99% will just be "well this looks weird"
2024-09-26 04:28:40
imagine if displays could say what they did with edid
damian101
2024-09-26 04:28:52
yeah
2024-09-26 04:29:04
will happen over time
2024-09-26 04:29:22
but mistagged images will stay mistagged forever 💀
Quackdoc
2024-09-26 04:41:32
for me, the biggest thing is Jack Holm's quote in the massive ACES thread
> In most cases SDR video displays should be calibrated to the ITU-R BT.1886 EOTF.
2024-09-26 04:42:28
This seems pretty cut and dry that it's time to bounce lol
spider-mario
2024-09-26 05:28:32
are there that many mistagged images? my impression is that lack of monitor profiles is a greater problem
2024-09-26 05:28:40
(or of taking it into account)
Quackdoc
2024-09-26 05:39:42
In the past at least, I know a lot of programs, Photoshop included, would actually encode them using the 2.2 encoding instead of the sRGB two-part curve but tag it as sRGB
2024-09-26 05:41:17
The displays are for sure still an issue, but I'm still not 100% convinced that displays are supposed to use a pure 2.2 function. I have been convinced of it, but I'm not 100% sure. I do wish I could read the actual specification myself. The clips I've seen are pretty convincing, but without the full context I obviously can't say for sure.
spider-mario
2024-09-26 05:43:02
there’s an argument to be made that if the author created the image as an sRGB file and it looked fine to them, maybe, in some sense, it really is an sRGB image
2024-09-26 05:43:19
they applied a “g2.2 interpreted as sRGB” filter to the image and they liked the result
2024-09-26 05:44:52
(dare I actually make that argument? I’m not sure)
Quackdoc
2024-09-26 05:45:55
I agree with that assessment as far as the creative side goes. Does not really apply for conversion (i.e. a screenshot from a video rendered to gamma 2.2 but encoded as sRGB)
w
2024-09-26 05:47:23
but if it's rendered gamma 2.2 decoded as srgb then it's how the creator intended
2024-09-26 05:47:29
so it's fine
Quackdoc
w but if it's rendered gamma 2.2 decoded as srgb then it's how the creator intended
2024-09-26 05:49:17
I'm not sure I understand, can you reword that?
w
2024-09-26 05:51:27
it might be wrong (mistagged) but it's not incorrect
2024-09-26 05:53:11
what's incorrect is every display panel being very different and not profiled
2024-09-26 05:53:20
actually causing images to look different
Quackdoc
2024-09-26 05:56:15
to be sRGB, it needs to be encoded with the piecewise function. This is important when doing sRGB -> Linear, because you undo the piecewise function. if you do 2.2 -> Linear using the sRGB function it will cause data loss, which will then be compounded when converting back to 2.2 or sRGB
2024-09-26 05:56:42
ofc displays decode with a pure 2.2, but we don't do that when doing sRGB -> Linear because of the data loss
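For reference, a sketch of the sRGB piecewise ("two-part") curve being discussed, in the same style as the BT.709 code earlier in the thread. The class and method names are made up, and the constants are the usual IEC 61966-2-1 ones; double-check against the spec before relying on them:
```java
public final class SrgbTransfer {
    // encoding function (linear -> sRGB-encoded): linear toe plus a 2.4-power segment
    public static double fromLinear(double f) {
        if (f <= 0.0031308D) return 12.92D * f;
        return 1.055D * Math.pow(f, 1.0D / 2.4D) - 0.055D;
    }

    // inverse of the encoding function (sRGB-encoded -> linear); note this is *not*
    // the pure 2.2 power curve the spec describes for the reference display
    public static double toLinear(double f) {
        if (f <= 0.04045D) return f / 12.92D;
        return Math.pow((f + 0.055D) / 1.055D, 2.4D);
    }
}
```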
w
2024-09-26 05:56:43
hmm obviously if you have a tool that's supposed to capture and it's changing the image, it's wrong and incorrect
Quackdoc
2024-09-26 05:57:28
man I hate colour, they had to go and make everything so confusing T.T
w
2024-09-26 05:59:15
just calibrate and profile your display
spider-mario
2024-09-26 05:59:53
and tag your screenshots with the display profile
w
2024-09-26 06:01:13
i have auto color management on so desktop is srgb
Quackdoc
2024-09-26 06:05:21
but what is desktop sRGB [av1_dogelol](https://cdn.discordapp.com/emojis/867794291652558888.webp?size=48&quality=lossless&name=av1_dogelol)
spider-mario
2024-09-26 06:05:45
but don’t screenshots still capture the result of converting that to the display’s colorspace?
Quackdoc
2024-09-26 06:05:55
depends on the app
w
2024-09-26 06:06:00
windows ACM doesnt
spider-mario
2024-09-26 06:06:16
i.e. windows assumes non-tagged stuff is sRGB and converts it from sRGB to the display space to make it look correct, and then the screenshot tool captures that
w
2024-09-26 06:06:48
for non color managed apps, I don't have to do anything
2024-09-26 06:07:01
for color managed apps like firefox then it will be incorrect
spider-mario
2024-09-26 06:07:19
ah, the screenshot tool gets the pixels earlier in the chain?
w
2024-09-26 06:07:34
yeah, it's magic to me
Quackdoc
2024-09-26 06:07:51
many applications can handle screenshotting themselves
w
2024-09-26 06:08:04
windows photo app is the only one that can do cross-monitor, and screenshotting it with anything captures the original color
dogelition
spider-mario i.e. windows assumes non-tagged stuff is sRGB and converts it from sRGB to the display space to make it look correct, and then the screenshot tool captures that
2024-09-26 06:32:14
windows by itself doesn't do any color management (except with "advanced color", i.e. sdr auto color management or hdr enabled). color management on windows is done by the applications transforming the pixels before rendering them
w windows photo app the only one that can do cross-monitor and screenshotting it with anything captures original color
2024-09-26 06:32:42
though that sounds like it might be an exception to that? idk
2024-09-26 06:33:26
oh i missed that you did talk about having ACM enabled
2024-09-26 06:33:56
that happens in the DWM or via a color space transform in the GPU display pipeline
w
2024-09-26 06:34:18
yeah I imagine it's in gpu compositing
dogelition
2024-09-26 06:34:44
you can actually mess with that (the DWM case) by messing with the shaders, see <https://github.com/ledoge/dwm_eotf>
Quackdoc
2024-09-26 06:40:36
they default to a 2.4 curve? [Hmm](https://cdn.discordapp.com/emojis/1113499891314991275.webp?size=48&quality=lossless&name=Hmm)
2024-09-26 06:41:10
oh yeah, that makes sense, it's just removing the linear section
dogelition
Quackdoc the default to a 2.4 curve? [Hmm](https://cdn.discordapp.com/emojis/1113499891314991275.webp?size=48&quality=lossless&name=Hmm)
2024-09-26 06:42:03
just a personal preference because i don't really "believe" in 2.2 as a standard (whereas 2.4 is at least the standard for video)
w
2024-09-26 06:42:11
yeah gamma is subjective
2024-09-26 06:42:12
and relative
Quackdoc
dogelition just a personal preference because i don't really "believe" in 2.2 as a standard (whereas 2.4 is at least the standard for video)
2024-09-26 06:42:49
not quite how it works here, as 2.2, 2.4 etc. are actually hard-coded standards. In this case it's talking about how to transform sRGB -> Linear -> PQ/scRGB/HLG etc
w
2024-09-26 06:43:10
this is only for eotf
Quackdoc
2024-09-26 06:43:12
note this is talking about transfer, once again the curse of the term gamma is striking hard
2024-09-26 06:44:56
note they also don't really have the right terminology, as the EOTF for sRGB would technically be 2.2; what windows does when converting to HDR is use the inverse sRGB encoding function
2024-09-26 06:45:19
sRGB doesn't define a real eotf, but it does define the reference display as a pure 2.2
dogelition
2024-09-26 06:45:36
"they" is me btw
Quackdoc
2024-09-26 06:45:52
time to remove gamma from the readme [av1_pepegun](https://cdn.discordapp.com/emojis/659513552033415188.webp?size=48&quality=lossless&name=av1_pepegun)
Quackdoc > Friends don’t let friends say “gamma.” https://hg2dc.com/2019/05/28/question-6/ man I hate the term gamma we should let it die
2024-09-26 06:46:07
gamma is a horrible term, see
2024-09-26 06:46:23
it means too many things, and it's best to be explicit
dogelition
2024-09-26 06:46:32
idk i guess i put that there to make it easier to understand for people who don't know what an EOTF is
w
2024-09-26 06:46:43
and it is a gamma function
dogelition
2024-09-26 06:46:48
how about i rewrite it to EOTF ("gamma")
w
2024-09-26 06:47:01
not like you can make it anything but gamma
Quackdoc
2024-09-26 06:47:19
nuke gamma from oblivion, there is a reason why specs like sRGB use the term transfer function instead
2024-09-26 06:47:31
[av1_chad](https://cdn.discordapp.com/emojis/862625638238257183.webp?size=48&quality=lossless&name=av1_chad)
w
2024-09-26 06:47:33
because it's not using gamma?
dogelition
2024-09-26 06:47:52
i think "gamma EOTF" is a meaningful term though
Quackdoc
2024-09-26 06:47:53
"gamma" is a catch all term when talking about luminance... stuff
w
2024-09-26 06:48:21
not really
Quackdoc
2024-09-26 06:49:08
I mean, it would be better to say "changes linearization function from the inverse OETF to a pure power..." in this case
w
2024-09-26 06:49:22
or you can say gamma
2024-09-26 06:49:41
that's just yap
Quackdoc
2024-09-26 06:50:57
There is a reason why gamma isn't a very often used term professionally anymore: it's vague and has no proper definition. If it is used at all, it should only be used after the proper terms like "transfer function"
2024-09-26 06:51:45
but this is not so important as changing "the sRGB EOTF is effectively replaced with a "normal" gamma EOTF", because they don't use the "sRGB OETF", they use the inverse sRGB OETF
w
2024-09-26 06:53:00
semantics 🤷 waste of time
Quackdoc
2024-09-26 06:53:34
it's actually not because as I said it's outright wrong
w
2024-09-26 06:53:45
eotf should be done in the vcgt anyway 🤷
Quackdoc
2024-09-26 06:53:48
the "implied EOTF" in sRGB is a pure 2.2
w
2024-09-26 06:54:52
yeah but what it was doesnt matter because the tool replaces it
2024-09-26 06:55:03
<:clueless:1186118225046540309>
Quackdoc
2024-09-26 06:55:40
[av1_dogelol](https://cdn.discordapp.com/emojis/867794291652558888.webp?size=48&quality=lossless&name=av1_dogelol)
dogelition
Quackdoc the "implied EOTF" in sRGB is a pure 2.2
2024-09-26 06:55:48
why would the piecewise thing not be an EOTF when it's defined in the section of the standard called "Transformation from RGB values to CIE 1931 XYZ values"
Quackdoc
2024-09-26 06:57:03
you my friend have been hit with, "what does the spec say, but which spec"
dogelition
2024-09-26 06:57:21
IEC 61966-2-1-1999.pdf
Quackdoc
2024-09-26 06:58:23
do they even mention EOTF in the section? because keep in mind the EOTF is what the display does
2024-09-26 06:58:38
as troy sobotka elegantly said "Displays have EOTFs. sRGB is an OECF / OETF, built atop of the notion of implicit management chains."
w
2024-09-26 06:58:46
2024-09-26 06:58:51
it dont look like a pure 2.2
Quackdoc
2024-09-26 06:59:31
that's why when decoding sRGB to linear, we use the inverse OETF, because the "implied EOTF" as per the spec is gamma 2.2
dogelition
Quackdoc do they even mention EOTF in the section? because keep in mind the EOTF is what the display does
2024-09-26 06:59:41
they don't call it anything, they just say that you calculate R' G' B' from R G B via the piecewise function
Quackdoc
2024-09-26 06:59:56
dogelition
2024-09-26 07:00:26
right, but in the explicit steps to go from RGB to XYZ they most certainly use the piecewise function
2024-09-26 07:00:32
and that's exactly what happens in an ICC color managed workflow
2024-09-26 07:00:47
so while i can see how you can argue purely from the spec that a display should be 2.2, i don't think it makes any sense in practice
Quackdoc
dogelition right, but in the explicit steps to go from RGB to XYZ they most certainly use the piecewise function
2024-09-26 07:01:07
yes, as I said, when converting stuff around, we use the encoding function as a basis
dogelition
2024-09-26 07:01:20
it's not how ICC color management works and it's not how windows works either (as it uses the piecewise function in advanced color)
2024-09-26 07:01:44
and (can't verify this myself) apparently the "reference" mode on apple displays also uses the piecewise function
Quackdoc
2024-09-26 07:03:22
there is a reason why sRGB is such a debated topic among nearly all color groups, here is an absolutely phenomenal forum thread on the topic by some really great colourists, and even this is a thread of over 170 posts https://community.acescentral.com/t/srgb-piece-wise-eotf-vs-pure-gamma/
2024-09-26 07:06:33
but in the end, when undoing sRGB you "undo" the encoding function. when a display "renders" the image it uses pure 2.2
2024-09-26 07:07:40
"but wait, that sounds stupid, and this will cause a mismatch won't it" Yes. yes it does, and it's explicitly acknowledged.
dogelition
2024-09-26 07:09:32
again, i can see why you can make that argument based on the spec, but with how the practical implementations have worked for decades i don't think you can argue that it's correct specifically: if you have an sRGB image, and you view it through ICC color managed software, the result looks the same as if you took the "raw" pixels with no color management and sent them to a display that uses the piecewise EOTF
Quackdoc
2024-09-26 07:11:08
Properly managed displays use the pure 2.2 EOTF. Most displays out there OOB use the pure 2.2 EOTF, not all, it's a slim majority in my experience.
2024-09-26 07:11:47
if all displays just used one eotf, or the other eotf, then the issue of sRGB wouldn't be so massive as it is now, hence the talk about improperly tagged displays
w
2024-09-26 07:11:53
um no?
Quackdoc
2024-09-26 07:12:43
filmlight's daniel had a good talk about stuff like this https://youtu.be/NzhUzeNUBuM?t=985
w
2024-09-26 07:13:06
displaycal made my profile use srgb 🤷
Quackdoc
2024-09-26 07:13:21
particularly this
w displaycal made my profile use srgb 🤷
2024-09-26 07:13:39
ok, but according to the sRGB spec, it's a pure 2.2 function, so that would be accurate
2024-09-26 07:14:07
god I hate the term gamma [megareee](https://cdn.discordapp.com/emojis/1075610558805594162.webp?size=48&quality=lossless&name=megareee)
w
2024-09-26 07:14:08
but according to srgb spec it's piecewise
Quackdoc
w but according to srgb spec it's piecewise
2024-09-26 07:14:18
the encoding function, not decoding function
2024-09-26 07:14:25
as I said, there is a mismatch between them
Quackdoc "but wait, that sounds stupid, and this will cause a mismatch won't it" Yes. yes it does, and it's explicitly acknowledged.
2024-09-26 07:15:15
the sRGB specification *explicitly* defines separate encoding and decoding, and as I said, acknowledges the mismatch between them
w
2024-09-26 07:15:42
https://www.color.org/chardata/rgb/srgb.pdf
2024-09-26 07:15:44
🤷
Quackdoc
2024-09-26 07:16:53
note that the sRGB specification was designed to be compatible with video, which at the time was a 2.4 power function. as again daniele says
> If we agree that Video EOTF is 2.4 then sRGB EOTF can only be 2.2 Gamma, Annex B will describe this in greater detail:
https://community.acescentral.com/t/srgb-piece-wise-eotf-vs-pure-gamma/4024/169
w
2024-09-26 07:17:16
also my profile doesnt look like gamma 2.2
Quackdoc
w https://www.color.org/chardata/rgb/srgb.pdf
2024-09-26 07:17:18
and now we are back to 3rd party re-written specs
2024-09-26 07:18:25
ofc all of this would be different if the sRGB spec actually defined an EOTF, which it does not.
afed
2024-09-26 07:19:18
<:Poggers:805392625934663710> https://en.wikipedia.org/wiki/ScRGB
Quackdoc
afed <:Poggers:805392625934663710> https://en.wikipedia.org/wiki/ScRGB
2024-09-26 07:19:50
the most chad of colorspaces
dogelition
Quackdoc ofc all of this would be different if sRGB spec actually defined an EOTF which it does not.
2024-09-26 07:20:05
i don't understand how the function used in converting from RGB to XYZ could *not* be an EOTF
Quackdoc
2024-09-26 07:20:08
you don't need to worry about linearization if you are already linear [av1_chad](https://cdn.discordapp.com/emojis/862625638238257183.webp?size=48&quality=lossless&name=av1_chad)
dogelition i don't understand how the function used in converting from RGB to XYZ could *not* be an EOTF
2024-09-26 07:20:32
because most typically an EOTF is what the display does to display the video
2024-09-26 07:20:57
quite literally the electrical to optical transfer function
w
2024-09-26 07:21:19
no it's what you do to a signal for the display to create the color
Quackdoc
2024-09-26 07:21:41
in which case, as per the sRGB spec, the display uses a pure 2.2
w
2024-09-26 07:21:53
ok sure but we dont care about what the display physically does
2024-09-26 07:22:15
on its own
Quackdoc
2024-09-26 07:22:16
uh what? That's literally the vast majority of the issue here...
dogelition
Quackdoc in which case, as per the sRGB spec, the display uses a pure 2.2
2024-09-26 07:22:21
then what is the RGB -> XYZ conversion supposed to be used for?
Quackdoc
dogelition then what is the RGB -> XYZ conversion supposed to be used for?
2024-09-26 07:22:47
RGB to XYZ is just that. just like encoded RGB -> Linear RGB
2024-09-26 07:22:57
this is why we use the two-piece function for that
2024-09-26 07:23:53
well in the end, since the sRGB spec doesn't specify an EOTF, we can just, most accurately, say there is no EOTF with sRGB
2024-09-26 07:24:04
whether or not one is implied is irrelevant
2024-09-26 07:25:34
the point is, when converting sRGB to linear, you do not do it the same way displays do it, which is a point of confusion
damian101
Quackdoc I agree with that assesment as far as the creative side goes, Does not really apply for conversion (IE screenshot from a video rendered to gamma2.2 but encoded as sRGB)
2024-09-26 07:26:05
you mean rendered/encoded to gamma 2.2 and tagged as sRGB...
dogelition
Quackdoc well in the end, since the sRGB spec doesn't specify an EOTF, we can just, most accurately, say there is no EOTF with sRGB
2024-09-26 07:26:12
i can see the point you're making there (it not defining the function used by the display), but i don't think it makes sense to define "EOTF" as something narrower than it literally means. that just seems wrong
Quackdoc
you mean rendered/encoded to gamma 2.2 and tagged as sRGB...
2024-09-26 07:26:18
I believe so
damian101
w but if it's rendered gamma 2.2 decoded as srgb then it's how the creator intended
2024-09-26 07:27:31
No, creators do that because most displays are gamma 2.2, and gamma 2.2 decoded as sRGB looks better than the other way around.
Quackdoc
dogelition i can see the point you're making there (it not defining the function used by the display), but i don't think it makes sense to define "EOTF" as something narrower than it literally means. that just seems wrong
2024-09-26 07:27:33
EOTF is that, an EOTF. quite literally the function used to convert electrical signals into optical signals. This is the process that happens in the TV/Display
w
No, creators do that because most displays are gamma 2.2, and gamma 2.2 decoded as sRGB looks better than the other way around.
2024-09-26 07:28:00
I just re-iterated what spider-mario said, how if it's what the creator was seeing then it doesn't matter
damian101
2024-09-26 07:28:50
the creator was probably working on a gamma 2.2 display as well
Quackdoc
2024-09-26 07:29:17
I suppose there could be an argument for saying that any conversion to "linear" is a conversion to an optical signal. However I have only seen it used in reference to displays
damian101
the creator was probably working on a gamma 2.2 display as well
2024-09-26 07:29:36
but a full color management pipeline in the future will decode sRGB correctly, and the picture will look wrong
w
2024-09-26 07:29:54
but if they made it to not look wrong to them then it wont look wrong to others
Quackdoc
2024-09-26 07:30:05
often times where there is no mismatch, it *can* be used to convert to linear, but the intent is for it to be used via the display as far as I am aware, but then again, I haven't actually ever seen it as a hard definition in a proper specification...
dogelition
Quackdoc I suppose there could be an argument for saying that any conversion to "linear" is a conversion to an optical signal. However I have only seen it used in reference to displays
2024-09-26 07:30:14
that's what i was trying to say, yeah, but now i'm not entirely sure if that's correct
damian101
w but if they made it to not look wrong to them then it wont look wrong to others
2024-09-26 07:30:42
but the image is incorrectly tagged, which does not matter nowadays because that's how sRGB images are usually displayed, but this will probably change in the future
Quackdoc
2024-09-26 07:30:44
Imagine if color people would just properly define terms in publically availible documents
w
but the image is incorrectly tagged, which does not matter nowadays because sRGB images are usually displayed nowadays, but this will probably change in the future
2024-09-26 07:31:44
well they wont be changing how it is rendered
2024-09-26 07:31:47
we know they wont
damian101
2024-09-26 07:31:53
they will
w
2024-09-26 07:31:56
proof?
damian101
2024-09-26 07:32:10
new Android versions are already doing it
Quackdoc
2024-09-26 07:32:38
iirc android properly uses the inverse sRGB oetf for linearization unless tagged otherwise
damian101
2024-09-26 07:32:40
Firefox as well, although it just targets sRGB because it doesn't know what the target display uses
w
2024-09-26 07:33:19
or you can profile your display
2024-09-26 07:33:22
that was always a problem
damian101
2024-09-26 07:33:24
like, if I load one of my images correctly tagged as gamma 2.2 in Firefox, it will convert it to sRGB and it will look wrong
2024-09-26 07:33:50
The future is full color management from display to OS to applications
dogelition
Firefox as well, although it just targets sRGB because it doesn't know what the target display uses
2024-09-26 07:33:57
and (iirc) there's an option to treat video as being sRGB-encoded, which is on by default
2024-09-26 07:34:21
in practice that just means that if you don't have an icc profile, the decoded RGB values from the video are passed through without doing color management on them
Quackdoc
dogelition and (iirc) there's an option to treat video as being sRGB-encoded, which is on by default
2024-09-26 07:34:23
who encodes video as sRGB unless it's screen cap [av1_monkastop](https://cdn.discordapp.com/emojis/720662879367332001.webp?size=48&quality=lossless&name=av1_monkastop)
damian101
2024-09-26 07:34:29
I think it comes with the newest Windows as well, you can actually easily set your display to DCI P3 there for example, without loading a display ICC profile...
w
2024-09-26 07:34:30
firefox still doesnt do anything for video
dogelition
2024-09-26 07:34:45
and when you turn that setting off, it uses the inverse of the BT.709 OETF iirc
2024-09-26 07:34:47
which is terrible
Quackdoc
2024-09-26 07:34:55
man PQ can't come soon enough
w
2024-09-26 07:35:07
no, all of you need to profile your display
damian101
dogelition and (iirc) there's an option to treat video as being sRGB-encoded, which is on by default
2024-09-26 07:35:08
do you know if there is an option to set my display trc?
2024-09-26 07:35:15
well, I don't use Firefox anyway...
w
2024-09-26 07:35:25
pq wont do shit if your panel is completely wrong anyway which it is because of the nature of displays
dogelition
you know if there is an option to set my display trc?
2024-09-26 07:35:34
use an icc profile? not sure i understand the question
damian101
dogelition use an icc profile? not sure i understand the question
2024-09-26 07:35:46
in Firefox I mean
Quackdoc
w pq wont do shit if your panel is completely wrong anyway which it is because of the nature of displays
2024-09-26 07:35:55
there is a lot less shenaigans going on though [av1_yep](https://cdn.discordapp.com/emojis/721359241113370664.webp?size=48&quality=lossless&name=av1_yep)
damian101
2024-09-26 07:36:06
or can Firefox read my display ICC under Linux?
dogelition
2024-09-26 07:36:15
it should be able to? if not you can set it manually in about:config
w
2024-09-26 07:36:24
you should be able to manually set the icc profile
damian101
2024-09-26 07:36:31
hmm
2024-09-26 07:36:45
I have never done any system-wide color management...
2024-09-26 07:37:09
All I know is that I have to set target-trc=gamma2.2 in mpv config to make my videos look good
w
Quackdoc there is a lot less shenaigans going on though [av1_yep](https://cdn.discordapp.com/emojis/721359241113370664.webp?size=48&quality=lossless&name=av1_yep)
2024-09-26 07:37:52
the shenanigans dont matter at all when faced with the bigger problem
Quackdoc
2024-09-26 07:38:19
the shenanigans are pretty much the biggest problem ATM
2024-09-26 07:38:34
tonemapping do be a close second tho [av1_dogelol](https://cdn.discordapp.com/emojis/867794291652558888.webp?size=48&quality=lossless&name=av1_dogelol)
w
2024-09-26 07:38:36
mpv should be able to generate using display profile
damian101
2024-09-26 07:39:05
The YouTube app on Android 13 does gamma correction to my display.
2024-09-26 07:39:40
noticed some weeks ago
Quackdoc
2024-09-26 07:39:41
that's dope
2024-09-26 07:40:17
man monitors should at least use 1886 <:SadCheems:890866831047417898>
dogelition
2024-09-26 07:40:24
related question: with the new "SDR dimming" or whatever they call the improved HDR support on android, does e.g. starting an hdr video in youtube now not affect the rest of the display?
2024-09-26 07:40:42
i like (?) how i can tell that a video is in hdr on my phone because the appearance of the entire screen slightly shifts
Quackdoc
2024-09-26 07:40:46
it should
2024-09-26 07:41:03
last I checked it was pretty bad while multitasking
w
2024-09-26 07:41:22
you get bt1886 when you ~~profile~~calibrate your display
Quackdoc
2024-09-26 07:42:39
every display must be calibrated [av1_dogelol](https://cdn.discordapp.com/emojis/867794291652558888.webp?size=48&quality=lossless&name=av1_dogelol)
w
2024-09-26 07:42:56
yeah that's what i'm saying
2024-09-26 07:43:03
otherwise everything else you do doesnt matter
Quackdoc
2024-09-26 07:44:01
Here is to hoping that KDE unfucks their SDR on HDR when colormanagement protocol lands
2024-09-26 07:44:31
not that I use KDE [av1_dogelol](https://cdn.discordapp.com/emojis/867794291652558888.webp?size=48&quality=lossless&name=av1_dogelol)
spider-mario
w displaycal made my profile use srgb 🤷
2024-09-26 08:29:51
Florian Höch suggests 2.2 for general use https://sourceforge.net/p/dispcalgui/discussion/932493/thread/58431a55/#1d1b
w
2024-09-26 08:30:56
I use default video preset
spider-mario
2024-09-26 08:31:48
2024-09-26 08:32:28
(with all that said, I picked sRGB, but I often wonder whether that was the right choice)
w
2024-09-26 08:39:18
I think in practice it doesn't matter
damian101
w I think in practice doesn't matter
2024-09-26 08:43:12
oh, it matters a lot
spider-mario
2024-09-26 08:46:19
someone in this thread https://hub.displaycal.net/forums/topic/srgb-vs-2-2/ originally postulated that “For most users the effect will not be visible.”, but then posted another response saying:
> I am wrong 🙂
>
> I take back what I just said about discernabke differences between true sRGB and G2.2. I just went through the exercise I described in prev post and softproofing G2.2 against true sRGB creates a readily visibke diff!
>
> So I am wrong according to my own logic
2024-09-26 08:46:57
(it’s rare to see people so readily admit their errors on the internet and I admire that)
damian101
2024-09-26 08:49:03
sRGB and gamma 2.2 look almost the same in graphs, because those are mapped on linear light, making the curves look very different from how we actually perceive them.
2024-09-26 08:49:35
it's in the dark regions where the difference between gamma 2.2 and sRGB is huge
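A quick numeric check of that point in plain Java, just evaluating the two standard curves near black (the sample values and class name are arbitrary): decode a dark encoded value with the sRGB piecewise curve versus a pure 2.2 power, and the linear results differ by a large factor, while near mid-grey they nearly agree.
```java
public final class DarkRegionDemo {
    static double srgbToLinear(double v) {
        return v <= 0.04045 ? v / 12.92 : Math.pow((v + 0.055) / 1.055, 2.4);
    }

    static double gamma22ToLinear(double v) {
        return Math.pow(v, 2.2);
    }

    public static void main(String[] args) {
        for (double v : new double[] {0.02, 0.05, 0.10, 0.50}) {
            double piecewise = srgbToLinear(v);
            double pure = gamma22ToLinear(v);
            System.out.printf("encoded %.2f -> piecewise %.5f, pure 2.2 %.5f (ratio %.2f)%n",
                              v, piecewise, pure, piecewise / pure);
        }
    }
}
```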
w
2024-09-26 08:49:44
I was thinking along the lines of the latest post in the thread
2024-09-26 08:51:27
the display is also profiled so it doesn't matter
2024-09-26 08:51:50
like you open a video and it's 1886 with 2.4
Quackdoc
2024-09-26 08:52:52
I actually haven't looked at the code for argyll so I don't know what it means when it says gamma 2.2 vs sRGB. but I assume it is talking about whether your display uses the piecewise or the pure function
2024-09-26 08:53:52
iirc argyll is entirely software gpu based calibration
spider-mario someone in this thread https://hub.displaycal.net/forums/topic/srgb-vs-2-2/ originally postulated that “For most users the effect will not be visible.”, but then posted another response saying: > I am wrong 🙂 > > I take back what I just said about discernabke differences between true sRGB and G2.2. I just went through the exercise I described in prev post and softproofing G2.2 against true sRGB creates a readily visibke diff! > > So I am wrong according to my own logic
2024-09-26 08:59:49
sRGB vs 2.2 is extremely visible when looking at stuff like nighttime content, again back to this master class of a forum thread https://community.acescentral.com/t/srgb-piece-wise-eotf-vs-pure-gamma/4024/118 this scenario is an intentionally exaggerated issue due to my selective screenshotting, but you can see a massive discrepancy
damian101
Quackdoc I actually havent looked at the code for argyll so I don't know what it means when it says gamma 2.2 vs sRGB. but I assume it is talking about whether your display uses the peicewise or the pure function
2024-09-26 09:00:14
sRGB is gamma 2.4 with a linear part in the dark area, which together approximates gamma 2.2
Quackdoc
sRGB is gamma 2.4 with a linear part in the dark area, that together approximates gamma 2.2
2024-09-26 09:00:40
for the encode function, decode (Display side) is a pure 2.2 function as per spec
2024-09-26 09:00:49
but many displays use the piecewise function inverted
damian101
2024-09-26 09:01:49
sRGB EOTF basically severely boosts dark content below ~1 nit. sRGB OETF crushes dark areas when decoded as gamma 2.2
Quackdoc for the encode function, decode (Display side) is a pure 2.2 function as per spec
2024-09-26 09:02:05
wait, really??
Quackdoc
wait, really??
2024-09-26 09:02:09
yes
damian101
2024-09-26 09:02:12
wtf
Quackdoc
Quackdoc
2024-09-26 09:02:17
see here
wtf
2024-09-26 09:02:54
the encoding mismatch is because 8bit pipelines would kill a bunch of the data
2024-09-26 09:03:26
this is why the mismatch exists and in their own words "the advantages of optimising encoding outweigh the disadvantages of this mismatch"
2024-09-26 09:03:43
which would have been true were we still using CRTs
2024-09-26 09:10:16
*this is why* it's better to encode your image/video with a pure 2.2 function, when you can properly tag it, because it's a 1:1 ratio, no data loss in a 2.2 display pipeline. This is the best case scenario with a 2.2 monitor (as per the sRGB spec). When you use the inverse sRGB OETF on a pure 2.2 encoded image, it won't look great, but it's still usable. This won't be an issue in any managed pipeline however.
2024-09-26 09:15:40
note, with how it is, it's not like it really matters; make sure to calibrate when you can. it's around a slim majority of displays that are a pure 2.2 EOTF, so both kinds of users need to be accommodated
kkourin
2024-09-27 01:27:11
people use srgb mode?
Quackdoc
2024-09-27 01:29:47
~~not willingly~~
jonnyawsom3
AccessViolation_ My developed photos after 30 dreadful minutes of trying to figure out how Darktable works
2024-09-27 03:06:50
I spent 2 hours in RawTherapee the other day just trying to figure out the UI. Eventually I made an ACES 16 bit JXL file (From a PNG) with probably too much denoising and not enough saturation
Quackdoc
2024-09-27 03:28:39
I need to find a simple color grading tool that works with touch and mouse primarily [av1_dogelol](https://cdn.discordapp.com/emojis/867794291652558888.webp?size=48&quality=lossless&name=av1_dogelol)
Demiurge
Quackdoc sRGB vs 2.2 is extremely visible when looking at stuff like nighttime content, again back to this master class of a forumn thread https://community.acescentral.com/t/srgb-piece-wise-eotf-vs-pure-gamma/4024/118 this scenario is an intentionally exaggerated issue due to my selective screenshotting, but you can see a massive discrepancy
2024-09-30 06:38:03
So are ICC color profiles just unusable useless garbage then?
2024-09-30 06:40:27
Is it even possible for them to be used correctly? :(
Quackdoc
2024-09-30 04:20:40
if you have a properly calibrated display, properly colormanaged application/compositor, and finally a properly tagged image, then you are fine. if any one thing is not done properly, then you will get wrong colours
paperboyo
2024-09-30 09:53:29
FWIW, from my experience, depending on the situation, some of those three may have a different "weight" on the outcome. The more they stray from the sRGB standard, the more they matter. Again, from experience, the one that more often than not matters least is "properly calibrated display".
2024-09-30 09:57:53
Maybe the other way of saying it would be: just like one can perform colour correction “by the numbers” on a black and white monitor, one can produce correct colours on a wildly miscalibrated display. The lack of hardware should not stop one trying.
spider-mario
2024-09-30 10:41:27
lack of uniformity can be a greater problem for example
2024-09-30 10:42:03
if you profile the centre of the display, and therefore output correct colours there, but the edges of the screen have a cooler colour temperature, that’s not ideal
a goat
spider-mario if you profile the centre of the display, and therefore output correct colours there, but the edges of the screen have a cooler colour temperature, that’s not ideal
2024-10-01 06:21:39
This is especially a problem on VA panels. I've been looking into adding a screen filter that darkens the edges to a uniform level with my display
Quackdoc
2024-10-01 06:22:46
the only solution is oled and co [av1_chad](https://cdn.discordapp.com/emojis/862625638238257183.webp?size=48&quality=lossless&name=av1_chad)
spider-mario
2024-10-01 07:47:31
mini-LED / microLED ftw
a goat
spider-mario mini-LED / microLED ftw
2024-10-02 03:02:40
Wouldn't necessarily solve this issue as the LCD panel type producing off axis irregularities is totally independent of the backlight. The problem manifests itself the most on single blocks of color. The display manufacturer would have to purposefully dim/brighten the outer backlights on single color images and perform some basic level of signal analysis
spider-mario
2024-10-02 03:03:48
I just meant that as response to “the only solution is oled and co”
a goat
2024-10-02 03:04:05
But at the same time VA exists because it has much deeper blacks compared to IPS, which would be kind of pointless with microled
spider-mario
2024-10-02 03:04:10
I prefer them in principle for not having to worry about burn-in
a goat
2024-10-02 03:04:24
Yeah burn in scares me
w
2024-10-02 03:16:57
VA is a downgrade in every other aspect to IPS
Quackdoc
spider-mario I prefer them in principle for not having to worry about burn-in
2024-10-02 07:33:20
that's very fair. qled seemed promising, I think it was qled, I'm not sure anymore. too many terms.
spider-mario
2024-10-02 07:36:17
to be entirely fair, I’m probably overstating the issue
2024-10-02 07:36:25
still, it’s always a bit at the back of my mind
2024-10-02 07:36:55
although I do sometimes worry about uneven mini LED aging as well
2024-10-02 07:36:59
I wonder how much of a problem it is
Quackdoc
2024-10-02 07:38:53
oled burn-in is still an issue, unless you are talking about the Switch's panel, I don't know what black magic they do, but it's insane
jonnyawsom3
2024-10-03 07:36:01
My phone is 8 years old, still no burn in
CrushedAsian255
My phone is 8 years old, still no burn in
2024-10-03 07:37:07
do you have always on display?
jonnyawsom3
2024-10-03 07:38:04
Not enabled
Demiurge
2024-10-05 09:09:59
Color doesn't exist. Y'all are just hallucinating and crazy
Quackdoc
2024-10-05 09:24:55
based
CrushedAsian255
2024-10-05 09:35:03
images aren't real, it's just a bunch of numbers in a wire
Quackdoc
2024-10-05 09:37:23
no one in this group is real, it's all a psyop
CrushedAsian255
2024-10-05 09:44:49
the government is hiding the true next generation image format, jpeg xl is a coverup for a larger conspiracy
spider-mario
2024-10-05 10:23:27
I should be careful about engaging in sarcasm of the sort, lest it be interpreted as official statements
Quackdoc
2024-10-05 10:31:32
[av1_kekw](https://cdn.discordapp.com/emojis/758892021191934033.webp?size=48&quality=lossless&name=av1_kekw)
Demiurge
2024-10-06 12:09:43
Official statement from official JPXL dev: color is government hoax
yoochan
2024-10-06 10:50:32
Like, the XL files, I want to encode... With murder and skully
lonjil
2024-10-06 03:09:31
If I have RGB data, and I have R_alpha, G_alpha, and B_alpha, and transform the RGB data to YCoCg, would it be valid and useful to transform the alpha data in the same way to get Y_alpha, Co_alpha, and Cg_alpha? Obviously it's reversible so you could just go back to RGB if needed, but would the YCoCg data be sensible? Would compositing in YCoCg give a sensible result?
Tirr
2024-10-06 03:15:49
in the case when Ar = Ag = Ab = 1, transforming in the same way will yield Ay = 1 but Aco = Acg = 0 which isn't desirable; it should be fully opaque after the transform but it's not
spider-mario
lonjil If I have RGB data, and I have R_alpha, G_alpha, and B_alpha, and transform the RGB data to YCoCg, would it be valid and useful to transform the alpha data in the same way to get Y_alpha, Co_alpha, and Cg_alpha? Obviously it's reversible so you could just go back to RGB if needed, but would the YCoCg data be sensible? Would compositing in YCoCg give a sensible result?
2024-10-06 03:33:02
I think my strategy to answer this would be to implement blending with the two approaches in a CAS like Sage and see if the resulting expressions are equivalent i.e. “is `inv_ycocg(blend(ycocg(pixel), ycocg(alphas))) == blend(pixel, alphas)`”
2024-10-06 03:33:18
(or just seeing if the left-hand side looks sensible, independently of the right-hand side)
lonjil
2024-10-06 03:34:02
that's a sensible method
2024-10-06 03:37:08
Tirr provided a rather convincing argument that it wouldn't work for compositing. Could use a CAS to try to find a different transform that makes it work though, "find an `f` that makes `inv_ycocg(blend(ycocg(pixel), f(alphas))) == blend(pixel, alphas)` true"
2024-10-06 03:37:43
Or perhaps a different blend equation, though I presume it would be less efficient than the standard one.
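(A minimal sketch of the equivalence check spider-mario suggested, using SymPy instead of Sage; the `ycocg`/`blend` helpers and the straight per-channel "over" blend here are assumptions for illustration, not anything from a real library:)
```python
import sympy as sp

# foreground colour, background colour, and per-channel alphas as symbols
r, g, b = sp.symbols('r g b')
br, bgr, bb = sp.symbols('br bgr bb')
ar, ag, ab = sp.symbols('ar ag ab')

def ycocg(v):
    R, G, B = v
    return (R/4 + G/2 + B/4, R/2 - B/2, -R/4 + G/2 - B/4)

def inv_ycocg(v):
    Y, Co, Cg = v
    return (Y + Co - Cg, Y + Cg, Y - Co - Cg)

def blend(fg, bg, a):
    # straight (non-premultiplied) per-channel "over" compositing
    return tuple(f*x + h*(1 - x) for f, h, x in zip(fg, bg, a))

fg, bg, alphas = (r, g, b), (br, bgr, bb), (ar, ag, ab)
lhs = inv_ycocg(blend(ycocg(fg), ycocg(bg), ycocg(alphas)))
rhs = blend(fg, bg, alphas)

# Non-zero residuals mean the two pipelines are not equivalent, matching
# Tirr's counterexample (ar = ag = ab = 1 maps to alphas (1, 0, 0) in YCoCg).
print([sp.simplify(l - r_) for l, r_ in zip(lhs, rhs)])
```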
2024-10-06 03:39:52
Ay, Aco, Acg does have a sensible interpretation at least, I think. Ay presumably being the total proportion of photons that will make it through, and Aco and Acg weighting which specific photons are the ones to get through.
spider-mario
2024-10-06 03:40:25
Ay lmao 👽
2024-10-06 03:40:26
(sorry)
lonjil
2024-10-06 03:40:38
lol
_wb_
lonjil If I have RGB data, and I have R_alpha, G_alpha, and B_alpha, and transform the RGB data to YCoCg, would it be valid and useful to transform the alpha data in the same way to get Y_alpha, Co_alpha, and Cg_alpha? Obviously it's reversible so you could just go back to RGB if needed, but would the YCoCg data be sensible? Would compositing in YCoCg give a sensible result?
2024-10-06 08:43:45
Not sure about compositing in YCoCg with YCoCg-transformed alphas (should be easy to check if the math works but I am tired atm), but for just storing the data that would make sense, assuming the three alpha channels are generally correlated similarly to how RGB is correlated (i.e. the values are the same for the "unsaturated" case). So I think it should generally help with compression. It's for this reason that we allowed RCTs to apply to any three channels: it might be useful to decorrelate other channels as well.
lonjil
2024-10-06 08:44:53
ah, very nice :)
2024-10-06 08:46:47
And yeah, I would assume that even in 3-channel alpha images, most semi-transparent areas will be non-tinted
Quackdoc
2024-10-12 07:31:25
Gonna try to make a crate to handle common CICP stuff, methinks. I would do ICC, but ICC hurts my head, and a goal of this is to write a lot of the code with GPU interop in mind using rust-gpu. At the very least, I plan for the CICP stuff to be "good enough" for conversion to and from "major" display colorspaces
2024-10-12 07:32:11
gonna be neat since programming for both cpu and shaders at the same time is a little bit of a weird situation
VcSaJen
Demiurge Color doesn't exist. Y'all are just hallucinating and crazy
2024-10-16 04:39:51
Color is quite interesting. We humans think that other mammals are "color-blind", while in reality we can't see many colors ourselves. I'm not talking about colors in the infrared/ultraviolet range, nor about fancy things like polarization. I'm talking about good ol' colors. For example, monochromatic yellow and [red+green] are distinctly different colors, but we can't distinguish them. Many birds and reptiles can, in fact, distinguish them, and if they had a language they would call them by different names instead of both being "yellow". In a general sense, RGB is not enough to define "color"; you would need a full spectrogram for that. A simple experiment can prove it: two identical-looking yellow paints would look different if the light source is monochromatic yellow light. Monochromatic yellow paint would stay yellow, while [red+green] paint would darken.
CrushedAsian255
2024-10-16 04:42:10
Someone should make an image format that supports storing non human colours (ie for radio captures)
2024-10-16 04:42:30
Could be helpful as a sort of “RAW” format for light
VcSaJen
2024-10-16 04:51:47
There are RGBY monitors, but they're still trichromatic, not tetrachromatic. Kinda similar to CMYK in that regard.
Quackdoc
2024-10-16 04:59:35
I hate colour, the more you learn the less you know
_wb_
CrushedAsian255 Someone should make an image format that supports storing non human colours (ie for radio captures)
2024-10-16 05:18:52
You are describing multi spectral imaging, which is something satellites do all the time. In JPEG XL you can use extra channels for this.
CrushedAsian255
2024-10-16 05:20:41
I’m more thinking continuous spectral
2024-10-16 05:20:44
Like 3d data
2024-10-16 05:20:52
Each pixel contains an entire spectrogram
_wb_ You are describing multi spectral imaging, which is something satellites do all the time. In JPEG XL you can use extra channels for this.
2024-10-16 05:21:44
The only thing I still don’t get is why triplets of extra channels can’t use VarDCT
_wb_
CrushedAsian255 The only thing I still don’t get is why triplets of extra channels can’t use VarDCT
2024-10-16 05:23:52
You can't. We could have allowed that in principle but it adds some complication.
CrushedAsian255 Each pixel contains an entire spectrogram
2024-10-16 05:25:06
Ah, yes then you'll get limited by the 4099 channel limit
CrushedAsian255
2024-10-16 05:52:05
So if I want lossy then I have to use Squeeze?
_wb_
2024-10-16 06:02:42
Yes, squeeze+quantization or other lossy modular methods, like lossy delta palette, or just reducing bit depth
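(For what it's worth, a toy sketch of what "a spectrum per pixel" ends up looking like in practice: the continuous spectrum is sampled into a fixed set of bands, one extra channel per band, which is what the 4099-channel limit bounds. Image size, band count, and wavelength range below are arbitrary assumptions.)
```python
import numpy as np

h, w = 64, 64
measured_wl = np.linspace(380, 780, 1201)        # dense capture grid (toy data)
cube = np.random.rand(h, w, measured_wl.size)    # per-pixel spectra (toy data)

band_wl = np.linspace(380, 780, 64)              # 64 stored bands -> 64 extra channels
banded = np.stack(
    [np.interp(band_wl, measured_wl, cube[y, x]) for y in range(h) for x in range(w)]
).reshape(h, w, band_wl.size)

print(banded.shape)  # (64, 64, 64): one image plane per band / extra channel
```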
Quackdoc
2024-10-16 08:21:04
https://github.com/w3c/csswg-drafts/issues/10998
2024-10-16 08:21:33
[Hmm](https://cdn.discordapp.com/emojis/1113499891314991275.webp?size=48&quality=lossless&name=Hmm)
spider-mario
VcSaJen Color is quite interesting. We humans think that other mammals are "color-blind", while in reality we can't see many colors ourselves. I'm not talking about colors in infrared/ultraviolet range, nor about fancy things like polarization. I'm talking about good ol' colors. For example monochromatic yellow and [red+green] are distinctly different colors, but we can't distinguish them. Many birds and reptiles can, in fact, distinguish them, and if they had a language they would call them with different words, instead of both being "yellow". In general sense RGB is not enough to define "color", you would need a full spectrogram for that. A simple experiment can prove it: two identically looking yellow paints would look different if the light source is monochromatic yellow light. Monochromatic yellow paint would stay yellow, while [red+green] paint would darken.
2024-10-16 08:22:34
those are sort of different phenomena: the former is observer metameric failure, whereas the latter is illuminant metameric failure (https://blog.kasson.com/the-last-word/metameric-failure/) and could occur even if you captured the full reflected spectrum under one illuminant
2024-10-16 08:23:25
to avoid it, you would need to measure the paint’s reflectance spectrum, and then calculate the final emitted spectrum by multiplying that with the illuminant’s
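(A minimal numeric sketch of that pipeline; the reflectance, illuminant, and colour-matching-function arrays below are placeholders, not real tables:)
```python
import numpy as np

wavelengths = np.arange(380, 781, 5)                   # nm, 5 nm sampling
reflectance = np.ones_like(wavelengths, dtype=float)   # placeholder paint spectrum
illuminant  = np.ones_like(wavelengths, dtype=float)   # placeholder light source
cmf_xyz     = np.ones((wavelengths.size, 3))           # placeholder CIE x̄, ȳ, z̄ table

emitted = reflectance * illuminant                     # spectrum that reaches the eye
XYZ = (emitted[:, None] * cmf_xyz).sum(axis=0)         # numeric integration over wavelength
XYZ /= (illuminant * cmf_xyz[:, 1]).sum()              # normalise so the illuminant has Y = 1
print(XYZ)
```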
Demiurge
2024-10-16 09:52:01
spectral yellow in theory should still look more saturated...?
spider-mario
2024-10-17 08:59:07
Quackdoc
2024-10-17 09:03:40
I prefer gray the colour [av1_cheems](https://cdn.discordapp.com/emojis/720670067091570719.webp?size=48&quality=lossless&name=av1_cheems)
diskorduser
2024-10-19 01:33:53
Is there any android gallery app that supports displaying 12-bit images? (My phone has a 12-bit display)
Quackdoc
2024-10-21 09:28:30
<@401816384109150209> in this thread there is a big discussion about the sRGB shittery https://community.acescentral.com/t/srgb-piece-wise-eotf-vs-pure-gamma/
2024-10-21 09:29:21
bigger snippet from bottom of thread
2024-10-21 09:30:04
note, it's possibly quite important that they do explicitly state a CRT as the reference display
2024-10-21 09:32:28
daniele has a great point to really hammer home the fact that it's supposed to be displayed on a pure 2.2 display:
> The primary goal of sRGB is to be compatible with video.
> And a piecewise EOTF is not compatible with video.
Video in this context being rec709-encoded footage, which was often gamma 2.4 or gamma 2.2
2024-10-21 09:34:09
I feel dirty for saying gamma
dogelition
2024-10-21 09:34:56
and to repeat my point too: in an ICC color managed workflow, there is no ambiguity as images will always be decoded with the piecewise EOTF and not something like 2.2 gamma - for me, that's reason enough to argue that it's objectively the correct interpretation, but you can draw your own conclusions
Quackdoc
dogelition and to repeat my point too: in an ICC color managed workflow, there is no ambiguity as images will always be decoded with the piecewise EOTF and not something like 2.2 gamma - for me, that's reason enough to argue that it's objectively the correct interpretation, but you can draw your own conclusions
2024-10-21 09:35:44
well depending on what you define as the eotf [av1_dogelol](https://cdn.discordapp.com/emojis/867794291652558888.webp?size=48&quality=lossless&name=av1_dogelol)
2024-10-21 09:36:54
if eotf == inverse oetf then yes
if eotf == gamma2.2 then no
but this distinction is ofc made clear by the context you gave
2024-10-21 09:40:57
but yeah, as far as my understanding goes:
Monitors: decode sRGB with a pure 2.2
Any software converting formats: inverse OETF
2024-10-21 09:42:48
this has some leeway: some content mastered on an sRGB monitor, for an sRGB monitor, may want to be converted to linear using a pure 2.2 transfer, but generally I find it better to use the inverse OETF
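(A quick sketch of the two interpretations side by side; the sample code values are arbitrary. The divergence near black is exactly where "crushed shadows" complaints come from:)
```python
def srgb_piecewise_to_linear(v):
    # inverse of the sRGB OETF (the piecewise curve from IEC 61966-2-1)
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(v):
    # pure 2.2 power curve, i.e. how many actual displays decode it
    return v ** 2.2

for code in (0.01, 0.05, 0.1, 0.5, 1.0):
    print(code, srgb_piecewise_to_linear(code), gamma22_to_linear(code))
```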
spider-mario
2024-10-21 09:53:21
I’m not actually sure how I would go about “grading for ‘sRGB interpreted as 2.2’ ”
2024-10-21 09:53:26
I suppose I could grade it as usual, then add a step “convert to gamma 2.2 but don’t attach a profile with it”?
2024-10-21 09:53:45
(after two failed attempts at posting that in the correct channel, here we go…)
2024-10-21 09:54:18
I _have_ noticed that content I grade on my computer tends to have its shadows somewhat crushed when I look at it on my phone, so maybe I should in fact do just that
Quackdoc
2024-10-21 10:00:20
I can't wait until PQ becomes the standard, I don't care about PQ so much specifically, but I don't see anything else happening before it, and I just want *something* that isn't the mess that sRGB is
2024-10-21 10:03:39
I love how color.org's sRGB spec is wrong. I also love how the "web standard" spec for color is paywalled. Like, I can get paywalling other things, but perhaps, maybe, the most basic of things that is used for the groundwork of 90% of other things shouldn't be paywalled
RaveSteel
Quackdoc I can't wait until PQ becomes the standard, I don't care about PQ so much specifically, but I don't see anything else happening before it, and I just want *something* that isn't the mess that sRGB is
2024-10-21 10:14:57
Sadly all HDR implementations aside from MacOS are pretty bad
2024-10-21 10:15:10
And HDR specifications are also quite a mess
2024-10-21 10:15:25
IMO an HDR 400 certification shouldn't even be a thing because it's too dim
2024-10-21 10:15:36
HDR on Android is also bad
2024-10-21 10:15:50
Creating an HDR image is easy, but properly viewing that image is a struggle
Quackdoc
RaveSteel Sadly all HDR implementations aside from MacOS are pretty bad
2024-10-21 10:20:51
I find HDR fine on android, the issue is specifically when you mix in SDR, which A13+ fixed, at least from what I have seen
RaveSteel
2024-10-21 10:24:08
Depends on the application I guess, but some HDR sample images just did not display properly in a few gallery apps I've tried, including the Samsung gallery
2024-10-21 10:24:52
Only chrome tonemapped the image, but it still looked improper
Quackdoc
2024-10-21 10:26:17
a lot of apps just outright don't support HDR
RaveSteel
2024-10-21 10:27:30
yes, sadly
_wb_
2024-10-22 07:30:34
one thing that has confused me endlessly is how many versions of ImageMagick have interpreted gamma-2.2 PNG files as being sRGB, to the point that `identify -verbose` would pretend that it is sRGB while really it isn't.
spider-mario
2024-10-22 08:28:07
I think it just always prints sRGB
2024-10-22 08:28:42
the problem was in the other direction: it would turn sRGB PNGs into gamma-2.2 ones
2024-10-22 08:29:21
because it would write only the “fallback” gAMA chunk and not the actual sRGB chunk that it's meant to be a fallback for
_wb_
2024-10-22 11:00:36
right, but then it would take gamma-2.2 PNGs and write them out as jpeg or ppm or whatever without any color conversion, so they would be interpreted as sRGB again
Traneptora
2024-10-24 08:38:25
all this makes me glad I wrote umbrielpng
2024-10-24 08:38:35
png tag canonicalizer
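(Not umbrielpng itself, just a rough standalone sketch of how one might list a PNG's chunks to see whether a file carries the sRGB chunk or only the gAMA fallback:)
```python
import struct
import sys

def list_chunks(path):
    with open(path, 'rb') as f:
        assert f.read(8) == b'\x89PNG\r\n\x1a\n', 'not a PNG'
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            length, ctype = struct.unpack('>I4s', header)
            f.seek(length + 4, 1)  # skip chunk data + CRC
            yield ctype.decode('ascii')

chunks = list(list_chunks(sys.argv[1]))
print({name: (name in chunks) for name in ('sRGB', 'gAMA', 'iCCP', 'cICP')})
```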
Demiurge
_wb_ one thing that has confused me endlessly is how many versions of ImageMagick have interpreted gamma2.2 PNG files as being sRGB, to the point that `identify -verbose` would pretend that it is sRGB while really it isn't.
2024-10-25 05:02:40
does graphicsmagick also have that problem? I always thought gm was a cleaner, more precise version of im
damian101
RaveSteel IMO an HDR 400 certification shouldn't even be a thing because it's too dim
2024-10-27 08:07:53
HDR 400 is a clear improvement over SDR for well-mastered HDR sources that don't overdo it with the brightness.
2024-10-27 08:08:29
at least if the display uses decent tonemapping and doesn't just clip at 400 nits...
_wb_
2024-10-27 08:33:42
400 nits is plenty bright if you are in a sufficiently dim room
2024-10-27 08:37:42
In cinema theaters the max SDR brightness is around 50 nits, so if you match that, then 400 nits gives you 3 stops of HDR headroom
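(That headroom figure is just the luminance ratio expressed in stops: log2(400 / 50) = log2(8) = 3.)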
dogelition
2024-10-27 11:13:30
regular IPS panels with no local dimming qualify for DisplayHDR 400, and those just do not have a high dynamic range due to their terrible contrast ratio (+ IPS glow on top of that)
_wb_
2024-10-27 01:17:00
It is the lowest bar of the DisplayHDR labels, yes. https://displayhdr.org/
dogelition
2024-10-27 01:30:11
the CTS 1.1 requirements were even more of a joke, as they didn't even require any wide gamut coverage or a bit depth higher than 8 bits for the 400 tier
Quackdoc
at least if the display uses decent tonemapping and doesn't just clip at 400 nits...
2024-10-27 04:20:38
displays typically don't clip lel, they will track as far as they can and then use a really harsh tonemapping curve for PQ; for HLG they... scale it, tonemap isn't really the right word