|
username
|
2024-03-04 04:00:08
|
(I had someone else generate this for me since I'm not on the Steam beta branch)
|
|
|
HCrikki
|
2024-03-04 05:40:16
|
It's in Steam stable now
|
|
|
fab
|
2024-03-04 05:55:52
|
YouTube is encoding 88,600 videos a second
|
|
|
CrushedAsian255
|
|
I know something only had 4:2:0...
|
|
2024-03-04 08:25:12
|
WebP 🤮
|
|
|
_wb_
|
2024-03-04 09:42:03
|
So <@440391953545166848> if you want I can introduce you to the JXS spec writers if you don't know them already
|
|
|
RedNicStone
|
2024-03-04 09:44:48
|
Thanks for moving this here, the slow mode was killing me. I don't have any contacts with any of the people on the JXS committee, I'd love it if you could share those contacts
|
|
|
_wb_
|
2024-03-04 09:47:48
|
Mail me at jon@cloudinary.com, I'll introduce you to them
|
|
|
RedNicStone
|
2024-03-04 09:49:37
|
Awesome, I appreciate that
|
|
|
_wb_
|
2024-03-04 09:51:33
|
If you're interested in JXS, you might also be interested in HTJ2K. <@553324745240608773> here can tell you all about that.
|
|
|
RedNicStone
|
2024-03-04 09:54:21
|
I'm not very familiar with HTJ2K at all. Our target is implementing the encoding on smaller FPGAs, which is also why we decided to go with JPEG XS. Does HTJ2K offer similar size advantages for hardware implementations?
|
|
|
Pierre-Anthony Lemieux
|
2024-03-04 09:55:37
|
Yes.
|
|
2024-03-04 09:56:50
|
Happy to put you in touch with FPGA implementers if you DM me your email address and provide more information on the application you have in mind.
|
|
|
RedNicStone
|
2024-03-04 09:59:58
|
That does sound quite interesting. We're looking to use an open implementation if possible, however, which is why we are going to such lengths to implement it ourselves
|
|
|
yurume
|
2024-03-05 05:52:31
|
https://medium.com/@migel_95925/supercharging-jpeg-with-machine-learning-4471185d885d learning the best Q-table via ML does sound legit, but does it really work that well? (I've also found a similar approach which was publicly released, https://medium.com/@colbyn/modern-image-optimization-for-2020-issues-solutions-and-open-source-solutions-543af00e3e51)
|
|
2024-03-05 05:54:01
|
I kinda suspect a sort of metric overfitting; the best metric `compression.io` used was MS-SSIM, for example
|
|
|
_wb_
|
|
yurume
https://medium.com/@migel_95925/supercharging-jpeg-with-machine-learning-4471185d885d learning the best Q-table via ML does sound legit, but does it really work that well? (I've also found a similar approach which was publicly released, https://medium.com/@colbyn/modern-image-optimization-for-2020-issues-solutions-and-open-source-solutions-543af00e3e51)
|
|
2024-03-05 06:45:56
|
We are basically doing something like this in our next q_auto. It works as well as the amount of real training data you have — which is subjective data, not metrics. This was the main reason we did the CID22 experiment.
|
|
|
yurume
|
2024-03-05 06:47:01
|
would that outcome also make it into jpegli, just in case? that would be an even more efficient JPEG encoder on top of the current jpegli if that observation is generalizable enough.
|
|
|
_wb_
|
2024-03-05 06:52:34
|
We're not trying to make encoders better, just get better at using them to produce consistent quality.
|
|
2024-03-05 06:59:13
|
(and also to get better at using the best available codec for the image)
|
|
2024-03-05 06:59:57
|
Especially for AVIF, getting consistent quality is kind of hard.
|
|
|
yurume
|
2024-03-05 07:50:28
|
OTOH: https://opus-codec.org/demo/opus-1.5/
|
|
2024-03-05 07:50:38
|
xiph.org demos are always worth looking at and marveling over.
|
|
|
_wb_
|
2024-03-05 09:11:50
|
Tim Terriberry was talking about this when I saw him in Palo Alto two months ago. Pretty cool stuff!
|
|
|
damian101
|
|
I know something only had 4:2:0...
|
|
2024-03-05 09:19:46
|
some AV1 encoders do not support 4:4:4
WebP has no 4:4:4, but like AVIF it supports lossless RGB
|
|
|
fab
|
2024-03-05 09:26:41
|
I also helped improve Opus encoders until 10:00 AM today
|
|
2024-03-05 09:28:30
|
By better I mean more similar to the original recording
|
|
2024-03-05 09:29:22
|
Before that I spent about 27:40 minutes optimizing TikTok audio
|
|
2024-03-05 09:30:05
|
Because with the Alex Velea song "Monalisa" there was some muffling
|
|
|
|
afed
|
2024-03-05 09:30:08
|
svt-av1 is 4:2:0 only
libaom has lossless rgb support, other encoders do not
so it depends
although I don't see the point of lossless avif screenshots
even for high quality lossy, jpeg is enough and may be even better at higher qualities with much faster speeds
|
|
|
fab
|
2024-03-05 09:30:15
|
And I did manage to solve that
|
|
2024-03-05 09:40:22
|
Brave beta was installed on my PC
|
|
2024-03-05 10:00:25
|
Still, it was badly mixed, but the HE-AAC voice quality was OK
|
|
|
Demiurge
|
2024-03-05 11:18:06
|
It's amazing that you can understand what someone is saying at 90% packet loss
|
|
|
|
afed
|
2024-03-05 11:28:30
|
though, it doesn't work that well in actual use on unstable networks
|
|
|
jonnyawsom3
|
|
afed
svt-av1 is 4:2:0 only
libaom has lossless rgb support, other encoders do not
so it depends
although I don't see the point of lossless avif screenshots
even for high quality lossy, jpeg is enough and may be even better at higher qualities with much faster speeds
|
|
2024-03-05 03:22:40
|
Steam added it for HDR screenshots, but I'm sure there would've been something better
|
|
|
HCrikki
|
2024-03-05 04:39:12
|
should've at least improved JPG saves. Their encoder gives really poor quality at all resolutions
|
|
|
spider-mario
|
2024-03-09 01:05:02
|
inspired by the libjxl 0.10 post, and by the fact that wavpack 5.7 was just released with multithreading support, I decided to run a quick benchmark against flac
|
|
2024-03-09 01:05:12
|
for convenience, I ran it on only one file, but it lasts 45 minutes
|
|
2024-03-09 01:05:17
|
|
|
2024-03-09 01:05:32
|
|
|
2024-03-09 01:05:44
|
(green = multithreaded wavpack, orange = single-threaded wavpack, blue = flac)
|
|
2024-03-09 01:05:50
|
oops, forgot the labels
|
|
2024-03-09 01:05:59
|
x = compression ratio, y = compression time in seconds
|
|
2024-03-09 01:07:11
|
decoding speed seems to be pretty much a non-issue (even the slowest single-threaded wavpack is over 300× real time)
|
|
2024-03-09 01:08:25
|
(the test file was https://youtu.be/kp7VHwvRzoY )
|
|
2024-03-09 01:09:32
|
uugh, YouTube’s new, mandatory dynamic range compression completely removes the fade in at the start, I hate it
|
|
2024-03-09 01:11:37
|
of note: wavpack’s multithreaded decoding appears to work just as well on files that were compressed single-threaded
|
|
|
|
Posi832
|
|
yurume
OTOH: https://opus-codec.org/demo/opus-1.5/
|
|
2024-03-09 06:51:44
|
1.5 has released?!?!?!?! omg omg omg
|
|
|
DZgas Ж
|
|
Posi832
1.5 has released?!?!?!?! omg omg omg
|
|
2024-03-09 10:22:10
|
No difference between 1.4 and 1.5 .opus files after encoding.
|
|
|
CrushedAsian255
|
2024-03-09 10:35:47
|
Mainly ML based low bitrate / high packet loss improvements
|
|
|
HCrikki
|
2024-03-09 10:36:44
|
weren't the 1.5-specific changes disabled by default?
|
|
|
Quackdoc
|
2024-03-09 10:37:35
|
most of them; all the ML ones were, but there are other misc changes that made it in
|
|
|
HCrikki
|
2024-03-09 10:38:59
|
afaik packet loss resilience is in, but only noticeable in severely degraded streaming situations
|
|
|
sklwmp
|
2024-03-10 02:20:30
|
forgive me for my ignorance, but can someone enlighten me as to why you would *ever* choose baseline JPEG over progressive JPEG? with progressive, you get both:
- smaller file sizes
- progressive decoding
what do you have to lose by going progressive? even ECT or jpegtran can losslessly convert between the two, so you aren't even losing any data by doing so
|
|
|
|
afed
|
2024-03-10 02:26:11
|
faster encoding/decoding, if that is important at such speeds, and better compression on some images
|
|
|
fab
|
|
_wb_
|
2024-03-10 03:17:11
|
Mozjpeg and jpegli do progressive by default for a reason. The only disadvantage is enc/dec speed (and memory), but I don't think that really matters for JPEG in 2024 anymore. Even progressive JPEG is very fast to decode.
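As a quick illustration of the trade-off being discussed, here is a minimal sketch that writes the same pixels as baseline and as progressive JPEG. It assumes Pillow is installed; note Pillow re-encodes the image, whereas the jpegtran/ECT route mentioned above converts between the two losslessly.

```python
import io
from PIL import Image

# Build a simple gradient test image in memory.
img = Image.new("RGB", (256, 256))
img.putdata([(x, y, (x + y) // 2) for y in range(256) for x in range(256)])

base, prog = io.BytesIO(), io.BytesIO()
img.save(base, "JPEG", quality=90, progressive=False)  # baseline (sequential)
img.save(prog, "JPEG", quality=90, progressive=True)   # progressive scans

# Pillow reports the scan type when reading the file back.
print(len(base.getvalue()), len(prog.getvalue()))
print(Image.open(io.BytesIO(prog.getvalue())).info.get("progressive"))
```

Which of the two files is smaller depends on the image content; the point is only that the switch is a single encoder flag.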
|
|
2024-03-10 03:19:34
|
JXL does simple progressive in two passes by default, so you get the advantages but not really the disadvantages (which in the case of JPEG come from doing many passes and no tiling, which is bad for memory locality and so comes at a speed cost)
|
|
|
sklwmp
|
2024-03-10 05:08:54
|
Wonder why Android (at least Pixels) produce baseline JPEGs by default, then
|
|
2024-03-10 05:08:57
|
encode speed, probably?
|
|
|
jonnyawsom3
|
2024-03-10 05:12:03
|
Hardware encoding
|
|
|
_wb_
|
2024-03-10 05:27:48
|
That or just not bothering to change libjpeg-turbo default, I guess
|
|
|
fab
|
2024-03-10 08:08:54
|
I've got news for you: I've rewritten all the YT quantizers
|
|
|
Nova Aurora
|
2024-03-10 08:24:10
|
How many times have you done so?
|
|
|
fab
|
2024-03-10 09:27:02
|
Zero
|
|
2024-03-10 09:27:58
|
I have a video of my cousin singing with autotune, and a Romanian song
|
|
2024-03-10 09:28:26
|
But I don't know; many papers were released in February
|
|
2024-03-10 09:28:40
|
zero in March
|
|
|
fab
|
|
2024-03-11 02:09:37
|
i miss when psy wasn't utilized <@226977230121598977>
|
|
|
DZgas Ж
|
|
fab
i miss when psy wasn't utilized <@226977230121598977>
|
|
2024-03-11 02:12:22
|
i don't use psy
|
|
2024-03-11 02:13:38
|
The problem with PSY is the poor explanation of the principles of its operation; it is not written down anywhere at all, no details. From my tests, I can conclude that PSY forbids updating blocks with small changes and "twitches"
|
|
2024-03-11 02:16:35
|
I also can't know that, but it seems that such a function is tuned extremely aggressively in VP9, which makes it possible to observe not smooth updates of information in blocks, but sharp changes in them.
But it would be nice if some person who worked directly with PSY in x264 or libvpx would explain everything as it really is
|
|
|
fab
|
2024-03-11 02:19:29
|
|
|
2024-03-11 02:19:57
|
you were right on this, stop with this psy, keep grain
|
|
2024-03-11 02:38:24
|
even webp lossy without psy is acceptable
|
|
2024-03-11 02:40:25
|
|
|
2024-03-11 02:43:59
|
|
|
2024-03-11 02:44:16
|
My techniques are improving
|
|
2024-03-11 02:45:51
|
|
|
2024-03-11 02:46:18
|
Very low bpp, yet most people still recognize the object
|
|
2024-03-11 02:46:53
|
Likely not as good as JPEG XL
|
|
2024-03-11 02:47:57
|
It cannot be, as it would slow down the politicians' X account
|
|
2024-03-11 02:48:19
|
So I have to keep the WebP look
|
|
2024-03-11 02:50:31
|
|
|
2024-03-11 02:50:41
|
<@226977230121598977>
|
|
2024-03-11 02:50:47
|
King of compression
|
|
|
DZgas Ж
|
2024-03-11 02:52:35
|
https://media.discordapp.net/attachments/1145767720718172200/1205207845029552209/1088142012546564206.gif
|
|
2024-03-11 02:53:32
|
I would really like to look at WebP q100 with the use of DCT completely disabled
|
|
|
gb82
|
2024-03-11 05:18:14
|
Do you guys know if there is a compressed vector-based image format like SVG?
|
|
2024-03-11 05:18:59
|
I know there are SVG "optimizers" that try to losslessly strip SVGs of redundant/"useless" data, but Brotli does a much better job than these most of the time, even on compression level 1
|
|
2024-03-11 05:19:18
|
|
|
2024-03-11 05:20:53
|
I wonder if anyone has standardized such a format? like an `scvg` or something; `Scalable Compressed Vector Graphics`
|
|
2024-03-11 05:24:32
|
Ah, I guess there's `.svgz`
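For the record, an `.svgz` file is nothing more than the SVG bytes wrapped in a gzip stream, which the Python standard library can produce directly; a minimal sketch:

```python
import gzip

# A small, repetitive SVG so the compression is visible.
svg = (b'<svg xmlns="http://www.w3.org/2000/svg">'
       + b'<rect width="10" height="10" fill="#09f"/>' * 50
       + b'</svg>')

# .svgz == gzip-compressed SVG; browsers accept it directly
# (or the same bytes served with Content-Encoding: gzip).
svgz = gzip.compress(svg, compresslevel=9)
print(len(svg), "->", len(svgz))
```

Swapping gzip for Brotli (as suggested above) keeps the same idea, just with a better entropy coder behind it.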
|
|
2024-03-11 05:29:46
|
yeah, even brotli -2 will beat gzip's highest-effort setting
|
|
|
yoochan
|
2024-03-11 05:51:20
|
You can do both, optimize and gzip
|
|
|
fab
|
2024-03-11 06:11:49
|
|
|
2024-03-11 06:12:18
|
Someone shared this meme exploiting a bug
|
|
2024-03-11 06:12:46
|
|
|
2024-03-11 06:14:07
|
What I'm worried about is how SVT-AV1 PSY behaves with tune 0
|
|
|
gb82
|
|
yoochan
You can do both, optimize and gzip
|
|
2024-03-11 07:58:54
|
yeah, but even then, with this image Brotli wins
|
|
|
fab
What I'm worried about is how SVT-AV1 PSY behaves with tune 0
|
|
2024-03-11 08:00:05
|
sorry about that. I think Blue is spreading FUD, we haven't seen the problematic clip yet so we can't be sure exactly what's going on
|
|
|
DZgas Ж
|
2024-03-12 08:22:17
|
<:PepeOK:805388754545934396> <#805176455658733570>
|
|
|
fab
|
2024-03-13 11:52:26
|
|
|
2024-03-13 11:53:04
|
Works I did today: Blazemedia, HDblog
|
|
2024-03-13 02:52:46
|
|
|
2024-03-13 02:52:57
|
My Optimizations aren't lossless
|
|
2024-03-13 02:53:14
|
Here I applied JXL-inspired deblocking techniques
|
|
2024-03-13 02:53:46
|
The video was 243 kbps and the VP9 encoding was suffering because of the AV1 bitrate
|
|
2024-03-13 02:54:23
|
Obviously that could be skipped as not important unless you buffer on tv
|
|
2024-03-13 02:57:08
|
|
|
2024-03-13 02:57:18
|
Final results with jxl filter
|
|
2024-03-13 02:58:18
|
This, together with the boosting technique I implemented on YT -yrm, is enough for today
|
|
2024-03-13 02:58:48
|
And in the 1.8-3.1 MB VP9, everything improved from one hour ago
|
|
|
fab
|
|
2024-03-13 02:59:18
|
This is 360p vp9
|
|
2024-03-13 03:01:42
|
With h264 green tint perhaps
|
|
2024-03-13 03:02:08
|
Is problem of my optimization except the one in blogs
|
|
2024-03-13 03:02:20
|
On videos I'm poor
|
|
|
|
afed
|
2024-03-13 03:13:08
|
<@853026420792360980> <:Thonk:805904896879493180>
|
|
2024-03-13 03:13:13
|
|
|
2024-03-13 03:13:49
|
webp
|
|
2024-03-13 03:18:06
|
i guess it's something in bmp, but if not, something is broken in ffmpeg conversion
this strange block
|
|
2024-03-13 03:19:51
|
|
|
2024-03-13 04:49:56
|
yeah, it's just a weird source
|
|
|
Traneptora
|
|
afed
<@853026420792360980> <:Thonk:805904896879493180>
|
|
2024-03-13 05:28:48
|
if I had to guess it's an swscale bug
|
|
2024-03-13 05:29:02
|
but without more info I can't diagnose
|
|
|
|
afed
|
2024-03-13 05:31:28
|
|
|
|
Traneptora
if I had to guess it's an swscale bug
|
|
2024-03-13 05:34:19
|
i checked and it's the same for imagemagick, so maybe it's not ffmpeg
|
|
2024-03-13 05:36:32
|
just some weird data in the alpha channel when capturing, maybe it's a cursor or something like that
|
|
2024-03-13 05:39:22
|
yeah, so it's not a conversion bug it's weird ffmpeg gdigrab behavior
|
|
|
w
makes the unknowing user wonder why the fork that's supposed to "fix" it doesn't work when it's a problem with the core design of the encoder
|
|
2024-03-17 05:31:50
|
<:Hypers:808826266060193874>
https://gitlab.com/AOMediaCodec/SVT-AV1/-/issues/1920#note_1818055602
|
|
|
username
|
2024-03-19 04:30:21
|
I was running oxipng on a directory and it decided to do this around an hour or so in
|
|
|
jonnyawsom3
|
2024-03-19 04:36:21
|
I've had it mess up a few times, fragmenting output across lines, changing the color of the window, all sorts of mess that mean I have to restart
|
|
|
username
|
2024-03-19 04:39:58
|
I set `--preserve` so it's *almost* impossible to tell which files it has or hasn't already processed; luckily, however, Voidtools Everything has it logged
|
|
2024-03-19 04:41:11
|
idk if there's a better way to do this but I'm just going to move everything oxipng has already processed to another folder so it doesn't waste effort and time on the same files
|
|
|
I've had it mess up a few times, fragmenting output across lines, changing the color of the window, all sorts of mess that mean I have to restart
|
|
2024-03-19 04:45:31
|
you could do something like this `for /R . %I in (*.png) do ( oxipng.exe -v -o max --preserve "%I" )` so that it won't mess up in the middle of doing a full directory; the only downside is it will only do a single file at a time, and the process will be created and then destroyed each time it's done with a file
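A cross-platform sketch of the same per-file idea (it assumes an `oxipng` binary on PATH; the command tuple is swappable, and the one-process-per-file overhead noted above still applies):

```python
import subprocess
from pathlib import Path

def optimize_tree(root, cmd=("oxipng", "-o", "max", "--preserve")):
    """Run the optimizer once per PNG so an interrupted run keeps
    per-file progress instead of losing a whole-directory pass."""
    count = 0
    for png in sorted(Path(root).rglob("*.png")):
        subprocess.run([*cmd, str(png)], check=True)
        count += 1
    return count
```

`check=True` stops on the first failure, so you know exactly where the run got to.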
|
|
|
username
idk if there's a better way to do this but I'm just going to move everything oxipng has already processed to another folder so it doesn't waste effort and time on the same files
|
|
2024-03-19 04:57:55
|
oh. apparently it did finish as this is all that's left in the folder when I move all the stuff processed by oxipng out
|
|
|
jonnyawsom3
|
2024-03-19 05:04:19
|
Yeah, most times my garbled cmd is just visual, with it still working but the progress being unreadable
|
|
|
CrushedAsian255
|
2024-03-19 09:20:50
|
Haven’t had that occur on Mac or Linux
|
|
|
DZgas Ж
|
2024-03-20 06:17:55
|
I don't know how to work with anything other than the original JPEG in 2024; everyone is chasing crazy resolutions and quality, but apparently no one takes decoding speed into account
images are already being photographed at 8K resolution
what other format besides JPEG can be opened at such a resolution that quickly
by the way, I recently noticed that Instagram previews have switched to WebP, hmm
It's funny that at q95, JPG is superior to WebP in high quality. just like MP3 is superior to Opus
But ordinary normies don't need to know about it, they might go crazy
yesterday I thought for a long time about how to get around this damn Cloudflare so that I could download 420 thousand art pieces; in the end nothing worked: curl, wget, and other "low-level" programs. so I took all the links, wrote them into an HTML index, and opened that (it works, I downloaded 420k images)
what am I talking about: I can't even imagine what would happen to the processor and memory if there were something other than classic baseline YUV420 JPEG, because only it can be opened 70 thousand at a time on one page inside Chrome. JXL, AVIF: definitely not possible... not even PNG.
|
|
2024-03-20 06:25:42
|
quite recently (a couple of months ago) I abandoned PNG for large images, and now I use JPEG YUV420 q100 (not lossless) in order to give away some of my work. I'm only doing this because it decodes quickly and takes up less memory than all other formats (I was shocked by the speed of JPEG decoding in Chrome, and how little RAM it takes up: it uses 6 times less memory than, for example, XnView, probably because of some special way of storing the decoded image, not as pure pixels)
|
|
|
username
|
|
DZgas Ж
quite recently (a couple of months ago) I abandoned PNG for large images, and now I use JPEG YUV420 q100 (not lossless) in order to give away some of my work. I'm only doing this because it decodes quickly and takes up less memory than all other formats (I was shocked by the speed of JPEG decoding in Chrome, and how little RAM it takes up: it uses 6 times less memory than, for example, XnView, probably because of some special way of storing the decoded image, not as pure pixels)
|
|
2024-03-20 06:27:49
|
I assume baseline JPEGs? progressive ones would probably fill up ram, also iirc chrome uses libjpeg-turbo for decoding
|
|
|
DZgas Ж
|
|
username
I assume baseline JPEGs? progressive ones would probably fill up ram, also iirc chrome uses libjpeg-turbo for decoding
|
|
2024-03-20 06:29:01
|
yes, exclusively baseline. I always clarify (usually)
|
|
|
username
I assume baseline JPEGs? progressive ones would probably fill up ram, also iirc chrome uses libjpeg-turbo for decoding
|
|
2024-03-20 06:31:07
|
There is such a strong difference in memory usage and decoding speed that I involuntarily imagine it's a different image format
|
|
2024-03-20 06:32:29
|
while baseline JPEG is faster than any other existing compression format at all (!)
|
|
|
Kremzli
|
|
DZgas Ж
I don't know how to work with anything other than the original JPEG in 2024; everyone is chasing crazy resolutions and quality, but apparently no one takes decoding speed into account
images are already being photographed at 8K resolution
what other format besides JPEG can be opened at such a resolution that quickly
by the way, I recently noticed that Instagram previews have switched to WebP, hmm
It's funny that at q95, JPG is superior to WebP in high quality. just like MP3 is superior to Opus
But ordinary normies don't need to know about it, they might go crazy
yesterday I thought for a long time about how to get around this damn Cloudflare so that I could download 420 thousand art pieces; in the end nothing worked: curl, wget, and other "low-level" programs. so I took all the links, wrote them into an HTML index, and opened that (it works, I downloaded 420k images)
what am I talking about: I can't even imagine what would happen to the processor and memory if there were something other than classic baseline YUV420 JPEG, because only it can be opened 70 thousand at a time on one page inside Chrome. JXL, AVIF: definitely not possible... not even PNG.
|
|
2024-03-20 01:18:43
|
How is "mp3 superior to opus"?
|
|
|
HCrikki
|
|
DZgas Ж
I don't know how to work with anything other than the original JPEG in 2024; everyone is chasing crazy resolutions and quality, but apparently no one takes decoding speed into account
images are already being photographed at 8K resolution
what other format besides JPEG can be opened at such a resolution that quickly
by the way, I recently noticed that Instagram previews have switched to WebP, hmm
It's funny that at q95, JPG is superior to WebP in high quality. just like MP3 is superior to Opus
But ordinary normies don't need to know about it, they might go crazy
yesterday I thought for a long time about how to get around this damn Cloudflare so that I could download 420 thousand art pieces; in the end nothing worked: curl, wget, and other "low-level" programs. so I took all the links, wrote them into an HTML index, and opened that (it works, I downloaded 420k images)
what am I talking about: I can't even imagine what would happen to the processor and memory if there were something other than classic baseline YUV420 JPEG, because only it can be opened 70 thousand at a time on one page inside Chrome. JXL, AVIF: definitely not possible... not even PNG.
|
|
2024-03-20 01:32:50
|
browsers or CDNs can, or do by default, lazy-load the rest of the images far below the fold (mainly to save bandwidth). 'infinite scrolling' got popular as it hides the rest of the pages and only parses the next blocks
|
|
|
TheBigBadBoy - 𝙸𝚛
|
|
Kremzli
How is "mp3 superior to opus"?
|
|
2024-03-20 01:33:12
|
~~it is superior in size for the same quality~~ <:KekDog:805390049033191445>
|
|
|
HCrikki
|
2024-03-20 01:33:56
|
it's just a workaround, like how progressive decoding is simulated for images: load a very blurry version of a pic as a placeholder, then swap the blurry placeholder with the full image once it finishes loading
|
|
|
lonjil
|
2024-03-20 01:34:34
|
mp3 is so good at preserving cymbals
|
|
|
DZgas Ж
|
|
Kremzli
How is "mp3 superior to opus"?
|
|
2024-03-20 03:26:52
|
Don't open that pandora's box.
|
|
|
HCrikki
it's just a workaround, like how progressive decoding is simulated for images: load a very blurry version of a pic as a placeholder, then swap the blurry placeholder with the full image once it finishes loading
|
|
2024-03-20 03:28:47
|
there is no point in making a blur beforehand. it can be done in HTML/CSS
|
|
|
Kremzli
|
|
DZgas Ж
Don't open that pandora's box.
|
|
2024-03-20 03:30:16
|
📂
|
|
|
Fox Wizard
|
2024-03-20 03:30:50
|
<:KittyGasp:1126563391939547198>
|
|
|
DZgas Ж
|
2024-03-20 03:31:03
|
https://cdn.discordapp.com/attachments/662412664252923924/1079978957086150799/soyboy-opus.jpg
|
|
|
Kremzli
|
2024-03-20 03:31:44
|
https://tenor.com/view/yakuza-like-a-dragon-yakuza-gaiden-like-a-dragon-gaiden-gaiden-gif-6758678452832849548
|
|
|
jonnyawsom3
|
2024-03-20 04:01:36
|
My relief at that not being a video
|
|
|
|
Posi832
|
|
DZgas Ж
https://cdn.discordapp.com/attachments/662412664252923924/1079978957086150799/soyboy-opus.jpg
|
|
2024-03-21 08:31:43
|
Challenge accepted.
|
|
|
HCrikki
|
2024-03-21 09:22:29
|
https://bsky.app/profile/videah.net/post/3ko4lkdt2jv2v
|
|
2024-03-21 09:23:16
|
ok that's really bad, unexpected
|
|
|
w
|
2024-03-22 12:26:26
|
I encountered a handful on Google Images already. Copying failed, and so did copying a link to Discord
|
|
|
HCrikki
|
2024-03-22 12:30:04
|
i wonder if it's 2+-frame AVIFs pretending to be progressive
|
|
|
Bre
|
2024-03-22 07:45:30
|
new to libraw, which one is correct?
|
|
|
_wb_
|
2024-03-22 08:18:56
|
Define "correct" 🙂
|
|
|
Bre
|
|
_wb_
Define "correct" 🙂
|
|
2024-03-22 08:48:50
|
hmmm... idk, maybe as seen on the DSLR's viewfinder display?
|
|
|
damian101
|
2024-03-22 08:48:52
|
left looks like it's tonemapped to a lower target peak brightness compared to right
|
|
|
w
|
2024-03-22 09:49:44
|
you mean left looks brighter than right
|
|
|
Nyao-chan
|
2024-03-22 11:33:52
|
I've seen avif at least 2 years ago, on rule34 of all places
|
|
|
HCrikki
|
2024-03-22 02:00:25
|
as uploads? only ones i saw were served as delivery-only images by cdns
|
|
2024-03-22 02:00:43
|
even then only for small images. according to cloudflare large images are brutally intensive to encode so they serve webp in their place (stuck maintaining 2 workflows instead of settling on any one format and leaving better compressed og jpg as the fallback)
|
|
|
Nyao-chan
|
2024-03-22 04:15:30
|
it was from a cdn, indeed
They do use it for thumbnails, but also for normal posts, though that's rarer
|
|
|
damian101
|
|
w
you mean left looks brighter than right
|
|
2024-03-23 09:13:44
|
yeah, but also compressed dynamic range in the highlights
|
|
2024-03-23 09:14:16
|
but it doesn't really look different to me aside from brightness
|
|
|
w
|
2024-03-23 09:32:41
|
i was trying to point out that it's completely subjective because it's a photo from a camera
|
|
|
damian101
|
2024-03-23 12:01:59
|
realism is kind of an objective reference...
|
|
2024-03-23 12:02:27
|
on a brighter screen the right image might look more realistic, on a darker screen the left one...
|
|
|
jonnyawsom3
|
2024-03-23 12:48:56
|
I only realised yesterday that if I wanted to 'calibrate' image intensity for my monitor, then because I have the brightness set to half in the monitor's settings, I'd need to halve the value too (Or at least close to it)
|
|
|
w
|
|
on a brighter screen the right image might look more realistic, on a darker screen the left one...
|
|
2024-03-23 01:53:53
|
brightness is subjective in reality because our eyes and cameras adjust to the environment
|
|
2024-03-23 01:54:26
|
so the experience of something being "bright" can be the same whether something is 1 nit or 10000 nits
|
|
|
HCrikki
|
2024-03-23 02:01:41
|
imo the thumb should be closer perceptually (your eyes' perception changes according to your *own* current and previous exposure to lighting). the raw has much more information and will be much higher quality when it's processed to look similar to the jpg thumb. the hw jpg codec processes the raw input (badly, though) both for the thumb and for the final jpg image
|
|
|
w
|
|
Bre
hmmm....idk maybe as seen from DSLR's viewfinder display?
|
|
2024-03-23 02:03:48
|
i imagine it's just this, with the thumb being that
|
|
|
damian101
|
|
w
brightness is subjective in reality because our eyes and cameras adjust to the environment
|
|
2024-03-24 12:58:06
|
Our eyes also adjust to screen brightness. But yes, the surroundings outside both the screen and the camera frame are relevant...
|
|
|
w
|
2024-03-24 02:20:46
|
nobody is going to see anything at 50 nits when outside in the sun
|
|
2024-03-24 02:21:56
|
and photographers still don't care about hdr
|
|
2024-03-24 02:22:05
|
stop it. HDR is a curse
|
|
|
Our eyes also adjust to screen brightness. But yes, the surroundings outside both the screen and the camera frame are relevant...
|
|
2024-03-24 02:39:48
|
this is confusing. Because our eyes also adjust to a screen, it's even less relevant
|
|
|
spider-mario
|
|
w
and photographers still don't care about hdr
|
|
2024-03-24 08:45:29
|
they do
https://gregbenzphotography.com/hdr/
|
|
|
yoochan
|
2024-03-24 11:20:28
|
With naked eyes I can't, at the same time, read a book in the sun and find my keys in my bag... Storing HDR images is amazing, but for real, how many bits of range would I need to faithfully render an image on a screen to match the dynamic range of the retina?
|
|
|
damian101
|
|
w
nobody is going to see anything 50 nits when outside in the sun
|
|
2024-03-24 12:07:23
|
that's why some dynamic compression is good and HDR shouldn't be purely about maximum realism
|
|
|
w
|
2024-03-24 12:08:08
|
that's what subjective means
|
|
2024-03-24 12:08:30
|
<:trollface:834012731710505011>
|
|
|
damian101
|
|
yoochan
With naked eyes I can't, at the same time, read a book in the sun and find my keys in my bag... Storing HDR images is amazing, but for real, how many bits of range would I need to faithfully render an image on a screen to match the dynamic range of the retina?
|
|
2024-03-24 12:09:57
|
well, with video, a lot is about temporal brightness change
|
|
2024-03-24 12:11:17
|
wth is webp2
|
|
2024-03-24 12:12:21
|
nothing relevant it seems
|
|
|
lonjil
|
2024-03-24 12:15:34
|
they were going to make a real image format based on AV1, but it was canned in favor of "just chuck a plain AV1 stream inside a HEIF container" i.e. AVIF
|
|
|
spider-mario
|
|
yoochan
With naked eyes I can't, at the same time, read a book in the sun and find my keys in my bag... Storing HDR images is amazing, but for real, how many bits of range would I need to faithfully render an image on a screen to match the dynamic range of the retina?
|
|
2024-03-24 12:21:07
|
about 12 stops https://dl.acm.org/doi/10.1145/1836248.1836251
|
|
2024-03-24 12:21:27
|
(~5000:1)
|
|
2024-03-24 12:22:36
|
that’s “at the same time” (no adaptation), so the range that you might want to show on a screen “ever” is wider
|
|
|
yoochan
|
|
spider-mario
that’s “at the same time” (no adaptation), so the range that you might want to show on a screen “ever” is wider
|
|
2024-03-24 02:21:11
|
Thank you for the sourced answer!
|
|
|
VcSaJen
|
2024-03-26 06:49:36
|
At some point I heard that non-CRT screens can't even display 8-bit-per-channel colors, and that they use temporal dithering to compensate. Is this info outdated in 2024? How many bits can screens display without dithering?
|
|
|
190n
|
2024-03-26 06:50:57
|
i think there are some crappy TN panels that are natively 6 bpc and use temporal dithering to get to 8, but otherwise afaik every panel is honest about its bit depth
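The FRC trick mentioned here can be sketched in a few lines. This is a toy model under stated assumptions: real panels use spatio-temporal patterns, and the levels 128/132 (the nearest 6-bit-native levels around an 8-bit target of 130) are just illustrative.

```python
# Temporal dithering (FRC) sketch: a 6-bit panel approximates the 8-bit
# level 130 by alternating between its two nearest native levels.
lo, hi = 128, 132                  # adjacent 6-bit levels on the 8-bit scale
target = 130                       # 8-bit level the panel cannot show natively
frac = (target - lo) / (hi - lo)   # fraction of frames that must show `hi`

# Bresenham-style frame pattern over 240 frames (~4 s at 60 Hz).
frames = [hi if (i * frac) % 1.0 < frac else lo for i in range(240)]
avg = sum(frames) / len(frames)
print(avg)  # 130.0 -- the eye averages the flicker into the missing level
```

With `frac = 0.5` the pattern simply alternates every frame, which is why fast cameras can catch 6-bit panels flickering.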
|
|
|
_wb_
|
2024-03-26 07:19:29
|
CRT screens cannot even display more than one pixel at a time, they use temporal dithering to compensate 🙂
|
|
|
190n
|
2024-03-26 07:19:42
|
lmfao
|
|
2024-03-26 07:20:22
|
now i'm thinking about making a crt so big it has to use multiple electron guns
|
|
2024-03-26 07:20:44
|
but that sounds like hell cuz the tv signal is tied to how a single gun works, so you'd have to buffer the signal or something
|
|
|
VcSaJen
|
2024-03-26 08:28:35
|
The Slow Mo Guys did a slow-motion video of OLED and plasma screens. Very interesting
|
|
|
spider-mario
|
2024-03-26 08:36:34
|
if one takes into account random variations in CRT output (i.e. “noise”, but if I just said “noise”, one might think I was referring to the 16 kHz sound), how many bits of dithering is it equivalent to?
|
|
2024-03-26 08:37:04
|
if someone is claiming dithering to be a specific defect of non-CRT screens, it strikes me as the sort of argument/misconception that is sometimes used to argue that vinyl is superior to CD
|
|
2024-03-26 08:39:12
|
(reminder: with proper dithering, the only deviation between the true signal and the quantised signal is noise https://secure.aes.org/forum/pubs/conventions/?elib=11829)
|
|
2024-03-26 08:41:28
|
https://youtu.be/cIQ9IXSUzuM?t=12m15s
> Dither doesn’t “drown out” or “mask” quantisation noise, it actually _replaces_ it.
|
|
2024-03-26 08:44:45
|
(a bit more on this: https://www.strollswithmydog.com/sub-bit-signal/
https://blog.kasson.com/the-last-word/detectability-of-visual-signals-below-the-noise/)
|
|
|
yoochan
|
2024-03-26 08:59:21
|
<@604964375924834314> you who are resourceful, do you have something that explains how to do noise shaping? I can grasp the big principle, but I miss something when I try to write down an algorithm for it (in 1D)
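Not a reference, but the basic 1D version can be sketched directly: first-order error-feedback shaping, where each sample's quantization error is subtracted from the next input before rounding (higher orders replace the single `err` with a short filter). A sketch of the general principle, not any specific codec's shaper:

```python
import math

def noise_shape(samples, step=1.0):
    """First-order error-feedback noise shaping in 1D: pushing the
    quantization error spectrum toward high frequencies."""
    out, err = [], 0.0
    for s in samples:
        v = s - err                   # feed back the previous error
        q = round(v / step) * step    # uniform quantizer
        err = q - v                   # error carried into the next sample
        out.append(q)
    return out

x = [0.3 * math.sin(i / 10) + 0.1 for i in range(1000)]
y = noise_shape(x)

# The error telescopes: the running sum of (y - x) equals the current `err`,
# so the DC/low-frequency error never exceeds half a quantization step.
running, worst = 0.0, 0.0
for a, b in zip(x, y):
    running += b - a
    worst = max(worst, abs(running))
print(worst)  # <= 0.5
```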
|
|
|
VcSaJen
|
2024-03-26 09:06:56
|
Back when I heard that statement, most LCD monitors used VGA connector anyway
|
|
|
spider-mario
|
|
yoochan
<@604964375924834314> you who are ressourcefull, do you have something which explain how to do noise shaping ? I can grasp the big principle, but miss something when I try to go down an algorithm for it (in 1D)
|
|
2024-03-26 09:07:53
|
nothing off the top of my head, but I’ll make sure to get back to you if I come across anything
|
|
|
yoochan
|
|
spider-mario
nothing off the top of my head, but I’ll make sure to get back to you if I come across anything
|
|
2024-03-26 09:11:06
|
I'll post it here if I find something before you
|
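In the meantime, here is a hedged sketch of the simplest 1D case asked about above: a first-order error-feedback noise shaper (names are illustrative; real audio shapers use longer feedback filters weighted by an audibility curve):

```python
def noise_shape_quantize(samples, step=1.0):
    # First-order error feedback: subtract the previous quantization
    # error from the next input sample. This pushes the quantization
    # error energy toward high frequencies (it becomes the first
    # difference of the raw error signal).
    out, err = [], 0.0
    for s in samples:
        target = s - err                 # compensate for the last error
        q = step * round(target / step)  # plain quantization
        err = q - target                 # error made on this sample
        out.append(q)
    return out

# A constant 0.3 input comes out as a 0/1 stream whose average tracks 0.3.
print(noise_shape_quantize([0.3] * 10))
```

The feedback loop guarantees the accumulated error stays bounded, so the running average of the output follows the input even though each sample is quantized hard to 0.0 or 1.0.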
|
|
|
Posi832
|
|
DZgas Ж
https://cdn.discordapp.com/attachments/662412664252923924/1079978957086150799/soyboy-opus.jpg
|
|
2024-03-26 06:41:49
|
So I gave it a try: https://positron832.neocities.org/blog/2024/mp3-vs-opus-triangles/
|
|
|
DZgas Ж
|
2024-03-26 06:51:00
|
😳
|
|
|
Posi832
So I gave it a try: https://positron832.neocities.org/blog/2024/mp3-vs-opus-triangles/
|
|
2024-03-26 06:53:42
|
So yes, opus sucks here
|
|
|
Posi832
So I gave it a try: https://positron832.neocities.org/blog/2024/mp3-vs-opus-triangles/
|
|
2024-03-26 06:55:50
|
Low-bitrate MP3s are poorly encoded when the parameters are used without modification, which makes their sampling rate lower than Opus's, which is unacceptable. According to the documentation, MP3 up to 64 kbps can work in stereo at 48000 Hz
|
|
|
fab
|
2024-03-26 06:58:48
|
As the instrumental is literally same dark
|
|
2024-03-26 06:59:17
|
Just put headphones on with a Conor Maynard or Justin Bieber track
|
|
2024-03-26 06:59:22
|
Or Travis Scott
|
|
2024-03-26 06:59:33
|
They have lot of this night sound
|
|
2024-03-26 07:00:02
|
I optimized the audio but don't know if it sounds good to average people
|
|
2024-03-26 07:00:18
|
I just did abx on the file you can had followed
|
|
|
DZgas Ж
|
2024-03-26 07:00:55
|
<@416586441058025472>❌
|
|
|
|
Posi832
|
|
DZgas Ж
Low-bitrate MP3s are poorly encoded when the parameters are used without modification, which makes their sampling rate lower than Opus's, which is unacceptable. According to the documentation, MP3 up to 64 kbps can work in stereo at 48000 Hz
|
|
2024-03-26 07:01:09
|
Do you know of any encoders capable of that? I recall LAME using lower sample rates at low bitrates regardless of whether a high sample rate is specified
|
|
|
DZgas Ж
|
|
fab
|
2024-03-26 07:01:23
|
Ok argument
|
|
|
DZgas Ж
|
|
2024-03-26 07:01:42
|
What's wrong with how Justin Bieber and Matteo Milazzo sound in an instrumental
|
|
2024-03-26 07:02:00
|
Use paint with the painter yellow colour
|
|
|
DZgas Ж
|
|
Posi832
Do you know of any encoders capable of that? I recall LAME using lower sample rates at low bitrates regardless of whether a high sample rate is specified
|
|
2024-03-26 07:03:25
|
--resample 44.1
--resample 48
|
|
|
fab
|
2024-03-26 07:04:04
|
You're right, sounds like he has Ringo in his mouth
|
|
|
DZgas Ж
|
2024-03-26 07:04:11
|
yeah
|
|
2024-03-26 07:04:16
|
<:BlobYay:806132268186861619>
|
|
2024-03-26 07:05:10
|
stereo 32 kbps 44.1
|
|
|
|
Posi832
|
2024-03-26 07:06:53
|
I just tried it and wow you're right
|
|
|
DZgas Ж
|
2024-03-26 07:07:12
|
<:PepeOK:805388754545934396> Knowledge is power
|
|
2024-03-26 07:08:28
|
but here it's the same story as with JXL q0 - the developers decide for the user: if they really need to compress that much, at least make it possible to listen to.
|
|
|
fab
|
2024-03-26 07:10:22
|
Please tell me it sounds better
|
|
|
DZgas Ж
yeah
|
|
2024-03-26 07:10:24
|
Make a new screenshot, ogg file
|
|
2024-03-26 07:10:37
|
I'm not a Justin buwbeR fan
|
|
|
DZgas Ж
|
|
fab
|
2024-03-26 07:10:46
|
Is still same
|
|
2024-03-26 07:10:50
|
Or worse
|
|
2024-03-26 07:11:36
|
I wonder where that Ringo-in-his-mouth vocoder sound comes from
|
|
2024-03-26 07:11:53
|
And if engineers can fix it the damage I cared
|
|
|
DZgas Ж
|
|
Posi832
I just tried it and wow you're right
|
|
2024-03-26 07:12:44
|
LAME's worse-quality parameters retain higher frequencies more; experiment from -q 0 to -q 9
|
|
|
fab
|
2024-03-26 07:12:59
|
It still sounds bad, but I did it this way 2 years ago and YT fixed it after 2 months
|
|
2024-03-26 07:13:25
|
At least the bass sounds better
|
|
|
|
Posi832
|
2024-03-26 07:13:29
|
I'll have to play around with the encoders' settings. Going offline
|
|
|
fab
|
2024-03-26 07:16:02
|
Spamming with vp9 videos isn't the solution
|
|
2024-03-26 07:17:11
|
You have to trust the engineers that can like give requisite of adopting and adoperating the right and precision tools
|
|
|
DZgas Ж
So yes, opus sucks here
|
|
2024-03-26 07:17:55
|
No, I think with the edit I did, not anymore
|
|
2024-03-26 07:31:25
|
Ok now i'm comparing on a phone in my memory
|
|
|
jonnyawsom3
|
2024-03-29 05:55:34
|
This video hurt me... From it being rendered in 480p yet uploaded as 4K, to all the missing info and facts https://youtu.be/NxzMAYckaV4
|
|
|
damian101
|
2024-03-30 08:24:21
|
what I learned from this:
1. Lossy DWA compression exists in EXR, which is very cool.
2. Blender apparently handles different bit depths incorrectly, either when exporting, importing or for the pixel diff tool...
|
|
|
Quackdoc
|
|
what I learned from this:
1. Lossy DWA compression exists in EXR, which is very cool.
2. Blender apparently handles different bit depths incorrectly, either when exporting, importing or for the pixel diff tool...
|
|
2024-03-30 09:09:29
|
for DWAA yeah, it helps a decent chunk and it's still significantly better than PNG for exporting. Oh PNG, how you ruined countless exports.
|
|
2024-03-30 09:17:24
|
also note that when working with RGBA you can't compare EXR to PNG
|
|
2024-03-30 09:17:48
|
well unless the alpha is always opaque I guess
|
|
2024-03-30 09:21:03
|
also I dunno what blender does, EXR is a special format that is really flexible and allows data that "isn't seeable" so to speak. For instance with a png the max value will be 1.0, this isn't the case in EXR
|
|
2024-03-30 09:22:25
|
so comparing EXR to PNG is often not really possible as a true "apples to apples"; though the comparison can be considered accurate even if the reasons are wrong, it is true that EXR can retain significantly more information than PNG can
|
|
|
jonnyawsom3
|
2024-03-30 10:47:31
|
Comparing 32 bit to 16 and then saying "PNG IS AWFUL" does make you wonder how thorough his testing actually is...
|
|
|
lonjil
|
2024-03-30 10:50:00
|
more like 24 bits to 16 bits
|
|
2024-03-30 10:51:02
|
nice thing with float is that you basically can't get clipping, which isn't true of png
|
|
|
damian101
|
|
Comparing 32 bit to 16 and then saying "PNG IS AWFUL" does make you wonder how thorough his testing actually is...
|
|
2024-03-30 10:52:41
|
he compared 16 bits to 8 bits, and also the diff he was looking at was definitely incorrect, something was wrong there
|
|
|
lonjil
nice thing with float is that you basically can't get clipping, which isn't true of png
|
|
2024-03-30 10:53:43
|
how do you mean?
|
|
|
jonnyawsom3
|
|
he compared 16 bits to 8 bits, and also the diff he was looking at was definitely incorrect, something was wrong there
|
|
2024-03-30 11:02:25
|
I mean technically he was comparing 32 bit float to 8 bit ints ;P
|
|
|
lonjil
|
|
how do you mean?
|
|
2024-03-30 11:12:30
|
with ints, you get N bits of precision, and the range is intrinsically tied to the precision. With floats, these are decoupled, so it's easy to go to arbitrarily huge numbers without hitting a limit. With ints it's a lot easier to accidentally hit the maximum value and get clipping. E.g. in audio usually the range for ints is -2^15 to 2^15-1, and for floating point it's -1.0 to +1.0. You can't go over 2^15-1, but you can easily go above 1.0 in intermediate workflow steps with no issues.
|
|
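lonjil's point about range being decoupled from precision can be illustrated with a toy example (the helper and values below are illustrative, not any particular audio API):

```python
def to_int16(x):
    # Scale a [-1.0, 1.0] sample to int16, clipping (saturating) at the limits.
    v = int(round(x * 32767))
    return max(-32768, min(32767, v))

gain = 4.0
sample = 0.5
boosted = sample * gain      # 2.0: a float carries this intermediate value fine
restored = boosted / gain    # 0.5: no information lost on the way back down

clipped = to_int16(boosted)  # the int16 representation saturates instead
print(restored, clipped)     # 0.5 32767
```

The float pipeline round-trips through a value above full scale with no damage; the integer pipeline has already destroyed the headroom at the moment of conversion.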
|
damian101
|
|
lonjil
with ints, you get N bits of precision, and the range is intrinsically tied to the precision. With floats, these are decoupled, so it's easy to go to arbitrarily huge numbers without hitting a limit. With ints it's a lot easier to accidentally hit the maximum value and get clipping. E.g. in audio usually the range for ints is -2^15 to 2^15-1, and for floating point it's -1.0 to +1.0. You can't go over 2^15-1, but you can easily go above 1.0 in intermediate workflow steps with no issues.
|
|
2024-03-30 11:14:37
|
oh, that probably explains the significant diff between 8 bit PNG and float source then, because there was definitely something wrong there
|
|
|
jonnyawsom3
|
2024-03-30 04:58:24
|
I need a refresher... How does lossless WebP multithread again? Is it tiles that reduce compression, or something else?
|
|
2024-03-30 04:59:46
|
Found a typo that reduced config.thread_count to 1 instead of 6 in an application using libwebp, so trying to figure out if that's causing major slowdowns (without having to compile and then try to use it myself)
|
|
|
Vlad (Kuzmin) Erium
|
|
what I learned from this:
1. Lossy DWA compression exists in EXR, which is very cool.
2. Blender apparently handles different bit depths incorrectly, either when exporting, importing or for the pixel diff tool...
|
|
2024-03-30 05:11:30
|
DWAA/DWAB is always half float
|
|
2024-03-30 05:12:54
|
maybe this is a reason for "bit depth incorrectness"
|
|
|
damian101
|
2024-03-30 05:17:42
|
half float?
|
|
|
Vlad (Kuzmin) Erium
maybe this is a reason for "bit depth incorrectness"
|
|
2024-03-30 05:17:58
|
it's not rounding errors
|
|
2024-03-30 05:18:04
|
major clipping could maybe explain it, though
|
|
|
Vlad (Kuzmin) Erium
|
|
half float?
|
|
2024-03-30 05:21:49
|
16 bit float (half), yes.
Also, Blender usually loads float images as half float too. I don't remember if they changed this behaviour, but it's possible to switch to 32-bit float in the image properties
|
|
|
Quackdoc
|
|
he compared 16 bits to 8 bits, and also the diff he was looking at was definitely incorrect, something was wrong there
|
|
2024-03-30 05:52:57
|
even if you compared, say, a double-float PNG, half-float EXR would still be better since PNG clips at 1.0
|
|
|
damian101
|
2024-03-30 05:53:48
|
just scale your content into the 0-1 range 💀
|
|
2024-03-30 05:53:59
|
why would such software not do that...
|
|
|
Quackdoc
|
2024-03-30 05:56:20
|
that sounds horrid
|
|
|
Vlad (Kuzmin) Erium
|
2024-03-31 02:53:19
|
No, it will not.
PNG uses fixed 16-bit precision across the whole 0-1 range.
EXR DWAA/DWAB has float precision, where values higher than 0.5 slowly degrade to 11-12 bit precision.
|
|
2024-03-31 02:54:24
|
Good visualization:
https://www.shadertoy.com/view/4tVyDK
|
|
|
Quackdoc
|
2024-03-31 03:44:47
|
does DWAA lead to more fidelity loss? Yes. However, even with that, the range of data you can keep with EXR + DWAA is still significantly higher than PNG's. Hell, PNG can't even store alpha correctly; even completely disregarding any other benefits, PNG's lack of proper alpha handling makes it unsuitable for any high-quality render that uses alpha, unless it's for distribution
|
|
2024-03-31 03:50:37
|
and ofc you can always just use lossless compression
|
|
|
Vlad (Kuzmin) Erium
|
2024-03-31 04:24:17
|
Yep, PNG is doomed. In 3DCG/VFX no one in their right mind uses it
|
|
|
Quackdoc
|
2024-03-31 04:35:50
|
Watch image format creators ruin a great image format with one funny line
|
|
2024-03-31 04:35:52
|
[av1_dogelol](https://cdn.discordapp.com/emojis/867794291652558888.webp?size=48&quality=lossless&name=av1_dogelol)
|
|
|
lonjil
|
|
Quackdoc
Watch image format creators ruin a great image format with one funny line
|
|
2024-03-31 08:10:57
|
what's that?
|
|
|
Quackdoc
|
|
lonjil
what's that?
|
|
2024-03-31 04:22:47
|
"The colour values in a pixel are not premultiplied by the alpha value assigned to the pixel."
|
|
|
lonjil
|
|
spider-mario
|
2024-03-31 04:48:35
|
also, I don’t know how often it’s used in practice but EXR supports per-channel alpha
|
|
2024-03-31 04:48:45
|
so that you can have e.g. coloured glass and the like
|
|
|
Vlad (Kuzmin) Erium
|
2024-04-01 02:19:36
|
Yep, EXR as a container is more like PSD.
A more interesting feature is variable bit depth and color management per channel/layer
|
|
|
jonnyawsom3
|
2024-04-02 01:36:02
|
So a game called `Content Warning` released yesterday, and is free to keep until tomorrow afternoon. It features recording video clips on a camera, that are saved to your actual computer... I was naturally curious so I took a look myself...
|
|
2024-04-02 01:36:45
|
`Running FFmpeg encode... -y -thread_queue_size 512 -r 24 -i C:\Users\jonat\AppData\Local\Temp\rec\6d633436-44c4-4737-a1e5-b7d8f4fa46e1\eb14f1a8-a2b6-4d26-972d-3daf85a576cc/Test%04d.png -f f32le -ac 2 -ar 48000 -thread_queue_size 512 -i C:\Users\jonat\AppData\Local\Temp\rec\6d633436-44c4-4737-a1e5-b7d8f4fa46e1\eb14f1a8-a2b6-4d26-972d-3daf85a576cc/audio.raw -f f32le -ac 1 -ar 48000 -thread_queue_size 512 -i C:\Users\jonat\AppData\Local\Temp\rec\6d633436-44c4-4737-a1e5-b7d8f4fa46e1\eb14f1a8-a2b6-4d26-972d-3daf85a576cc/mic.raw -map 0:0 -map 1:0 -c:a libvorbis -ac 2 -ar 24000 -filter_complex amix=inputs=2 -c:v libvpx -cpu-used -5 -deadline realtime -pix_fmt yuv420p -speed 1 -preset ultrafast C:\Users\jonat\AppData\Local\Temp\rec\6d633436-44c4-4737-a1e5-b7d8f4fa46e1\eb14f1a8-a2b6-4d26-972d-3daf85a576cc/output.webm`
|
|
2024-04-02 01:41:03
|
It saves every frame as a 32 bit PNG then encodes a VP8 video at awful quality, which may be a stylistic choice for 'analogue horror', but to me it just looks like a blocky mess
|
|
2024-04-02 01:41:46
|
(Naturally it looks better in a smaller window/Discord's preview)
|
|
2024-04-02 01:42:50
|
What surprises me is the ffmpeg binary bundled in the game also has VP9 and Opus among many other encoders already, but I guess they wanted compatibility and speed above anything else
|
|
|
Quackdoc
|
2024-04-02 02:18:07
|
thats interesting
|
|
|
DZgas Ж
|
2024-04-04 08:52:41
|
jpegli 😵
|
|
|
fab
|
2024-04-04 12:53:36
|
https://m.youtube.com/watch?v=qugf6TBsq1o&themeRefresh=1
|
|
|
Demiurge
|
2024-04-06 12:54:06
|
Not bad
|
|
2024-04-06 01:10:51
|
Does premultiplied alpha just mean pixel values on a white background essentially in practice?
|
|
2024-04-06 01:11:08
|
What does it matter if it's a black or a white background?
|
|
2024-04-06 01:11:25
|
Premultiplied vs non premultiplied
|
|
2024-04-06 01:12:02
|
Wouldn't premultiplied be more complex to render?
|
|
|
Quackdoc
|
|
Demiurge
Does premultiplied alpha just mean pixel values on a white background essentially in practice?
|
|
2024-04-06 01:37:59
|
https://community.adobe.com/t5/photoshop-ecosystem-discussions/change-in-exr-open-from-cs2-to-cs3-can-this-be-fixed/m-p/1521042/page/4#M918
|
|
2024-04-06 01:38:09
|
an excellent post on alpha
|
|
2024-04-06 01:41:45
|
note that "premultiplied" is a bit of a misnomer; people are now more or less using the terms "associated" and "unassociated" due to it being a much better representation of what it is actually doing
|
|
|
Demiurge
|
2024-04-06 03:03:54
|
I don't fully understand yet. So far they seem like equivalent terms because in practice the RGB values are premultiplied by the alpha
|
|
|
Quackdoc
|
2024-04-06 03:29:41
|
Zap's post later on explains it in more detail https://community.adobe.com/t5/photoshop-ecosystem-discussions/change-in-exr-open-from-cs2-to-cs3-can-this-be-fixed/m-p/1521072/page/5#M948
but the TLDR of it is associated alpha better models how light actually works. particularly around how renderers work
another useful post on the matter is https://groups.google.com/g/ocio-dev/c/ZehKhUFqhjc
particularly this part
> a nice intuition for what would
> otherwise be corner cases. Consider a pixel where rgb > alpha, such as
> 2.0, 2.0, 2.0, 1.0. Nothing special about this - it just represents a
> 'specular' pixel where it's emitting 2.0 units of light, and is fully
> opaque. A pixel value of (2.0, 2.0, 2.0, 0.0)? Nothing special
> about this either, it represents a pixels that's contributing 2.0
> units of light energy, and happens to not occlude objects behind it.
> Both of these cases can cause trouble with unpremultiplied
> representations.
|
|
|
|
Posi832
|
2024-04-06 05:10:13
|
Captain Disillusion has a video about this : D https://youtube.com/watch?v=XobSAXZaKJ8
|
|
|
Quackdoc
|
2024-04-06 05:17:48
|
his video isn't wholly accurate; he gets across what makes it needed, but his technicals aren't quite right, he kinda falls into taking the "premultiplied" part literally
|
|
|
spider-mario
|
2024-04-06 10:50:43
|
> Chris,
>
> since you claim to know exactly what I mean, without actually
> discussing this with me outside this rather bizarre thread, I
> would like to clarify that I, as the original author of OpenEXR,
> completely agree with Zap Andersson's post earlier today.
😂
|
|
|
Quackdoc
|
2024-04-06 11:43:52
|
this thread is almost as good as a chromium thread I once read, I wish I could find it but it was lost to time I suppose. There were some extraordinary good and bad takes in it
|
|
|
Demiurge
|
2024-04-06 03:20:09
|
None of these pixel formats are designed to realistically imitate the behavior of light, which is why it's hard to intuitively reason on the subject
|
|
2024-04-06 03:23:36
|
If 0=transparent and color values are premultiplied, and linearly represent the amount of light, then that gets pretty close but wouldn't there still be black fringes?
|
|
|
Quackdoc
|
2024-04-06 03:25:01
|
Ignoring everything else about RGB pixels. Ignoring gamut, color space, so on and so forth. RGB, when you're talking about associated alpha, actually does do a fairly decent job at it.
With associated alpha, we can actually think of RGB as actual units of energy itself and alpha as the actual transparency of an object. Think of it like candlelight. With candlelight, you have the luminous glow of the flame, but it has no solid object except for the flame itself, which would be, let's say, 50% solid.
|
|
|
Demiurge
If 0=transparent and color values are premultiplied, and linearly represent the amount of light, then that gets pretty close but wouldn't there still be black fringes?
|
|
2024-04-06 03:29:20
|
You need to completely disregard the idea of premultiplication. It is not what it sounds like. It in no way represents how actual alpha association works.
|
|
2024-04-06 03:30:08
|
There is a reason why we are moving away from the term. it's because it makes no sense at all.
|
|
|
Demiurge
|
|
Quackdoc
Ignoring everything else about RGB pixels. Ignoring gamut, color space, so on and so forth. RGB, when you're talking about associated alpha, actually does do a fairly decent job at it.
With associated alpha, we can actually think of RGB as actual units of energy itself and alpha as the actual transparency of an object. Think of it like candlelight. With candlelight, you have the luminous glow of the flame, but it has no solid object except for the flame itself, which would be, let's say, 50% solid.
|
|
2024-04-06 03:39:26
|
But most of the time the digital values do not linearly represent energy
|
|
|
Quackdoc
|
2024-04-06 03:39:32
|
another good one from Zap which indirectly explains why it's a misnomer
https://twitter.com/aaronkennedy0/status/1583891205339086849
> Except it's not a hack. Say a raytracer shoots 10 rays into a pixel, 7 hit a red object, 3 hit nothing. The hits are treated as alpha=1 and misses 0. When you average the samples for the pixel you get 0.7 red and 0.7 alpha. Without "multiplying" anything, "pre" or not.
|
|
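The averaging described in that quote is easy to reproduce directly (a toy sketch; sample tuples are (r, g, b, a), names illustrative):

```python
def average_samples(samples):
    # Average per-pixel ray samples. Each hit carries alpha=1, each miss
    # alpha=0; the plain average is already associated-alpha data, with
    # no multiplication step anywhere.
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(4))

hits   = [(1.0, 0.0, 0.0, 1.0)] * 7   # 7 rays hit an opaque red object
misses = [(0.0, 0.0, 0.0, 0.0)] * 3   # 3 rays hit nothing
print(average_samples(hits + misses))  # (0.7, 0.0, 0.0, 0.7)
```

The 0.7 red and 0.7 alpha fall out of coverage averaging alone, which is why "premultiplied" misdescribes what renderers actually produce.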
|
Demiurge
But most of the time the digital values do not linearly represent energy
|
|
2024-04-06 03:39:42
|
that's irrelevant,
|
|
2024-04-06 03:40:12
|
how the light is shaped doesn't really matter, it's still a representation of energy
|
|
|
Demiurge
|
2024-04-06 03:41:02
|
When interpolating or resizing it matters
|
|
|
Quackdoc
|
2024-04-06 03:41:38
|
that doesn't seem immediately relevant to the topic at hand
|
|
|
Demiurge
|
2024-04-06 03:41:56
|
Because the difference in energy between two digital values is distorted by the gamma curve
|
|
|
Quackdoc
|
2024-04-06 03:42:37
|
This doesn't really seem relevant to the topic at hand, unless I'm missing something
|
|
|
Demiurge
|
2024-04-06 03:42:52
|
Isn't the topic about mixing/resizing/resampling and blending pixel data with alpha?
|
|
|
lonjil
|
|
Demiurge
Because the difference in energy between two digital values is distorted by the gamma curve
|
|
2024-04-06 03:43:10
|
and it's common for tools that do interpolation or resizing to convert to linear light before doing the operation, and then back. It doesn't really matter.
|
|
|
Demiurge
|
2024-04-06 03:43:28
|
It's pretty relevant
|
|
|
lonjil
|
2024-04-06 03:43:31
|
and when I say common, I mean you have to do that
|
|
|
Quackdoc
|
2024-04-06 03:43:31
|
blending, yes. If you don't linearize before compositing it's wrong regardless
|
|
|
lonjil
|
2024-04-06 03:43:37
|
^
|
|
2024-04-06 03:43:54
|
doing it in non-linear light is wrong regardless of any alpha stuff
|
|
|
Demiurge
|
|
lonjil
and it's common for tools that do interpolation or resizing to convert to linear light before doing the operation, and then back. It doesn't really matter.
|
|
2024-04-06 03:44:04
|
I thought most tools were more naive than that
|
|
|
Quackdoc
|
|
Demiurge
I thought most tools were more naive than that
|
|
2024-04-06 03:44:18
|
Bad tools are...
|
|
2024-04-06 03:44:21
|
krita...
|
|
2024-04-06 03:44:26
|
kdenlive...
|
|
|
lonjil
|
2024-04-06 03:44:29
|
bad tools are common
|
|
|
Quackdoc
|
2024-04-06 03:44:46
|
Gimp at least handles it correctly
|
|
|
Demiurge
|
2024-04-06 03:45:19
|
Okay, maybe it's not relevant if clever tools that linearize when mixing and resampling are also common
|
|
|
Quackdoc
|
2024-04-06 03:45:40
|
rather than being clever it's a matter of being right or wrong
|
|
2024-04-06 03:46:37
|
krita vs gimp is a good example of wrong vs right
https://cdn.discordapp.com/attachments/719811866959806514/1216585057112821851/image.png?ex=660a26a0&is=65f7b1a0&hm=29e2c0ee0ff4369a2f0c562d3c97c4721e2bfaa9d83a44fbed862b2ec270582f&
https://cdn.discordapp.com/attachments/719811866959806514/1216585057561739284/image.png?ex=660a26a0&is=65f7b1a0&hm=31b405536b45b2e83a3d692726e217c99a37c120a8aa2b90b84b3913d98f8e8c&
|
|
|
lonjil
|
2024-04-06 03:54:58
|
did Adobe ever get their shit together?
|
|
|
Quackdoc
|
2024-04-06 03:55:54
|
iirc they still use stupid hacks
|
|
2024-04-06 04:01:05
|
It's worth knowing that tooling like OCIO, if I remember correctly, will treat all alpha as associated, simply because it turns out that if you treat unassociated alpha as associated it's not really that bad in a lot of cases
|
|
2024-04-06 04:01:56
|
also here is a quote from Troy Sobotka, author of The Hitchhiker's Guide to Digital Colour (excellent resource btw)
> It is closer to the basis of how light transport works under an RGB model. This is easily verifiable when we think about something like a reflection on glass; there’s no degree of occlusion, and solely additive light.
> Only associated alpha handles that.
> (Gritz was one of the earlier people using the TIFF terminology, and it is damn wise as it helps people not get stuck thinking it is a simple math thing; it’s not!!! There is literally only one encoded state of the light, and it’s associated alpha. The other one isn’t encoded, and it can’t represent occlusion and emission properly.)
|
|
2024-04-06 04:02:07
|
ah crap stupid discord mobile
|
|
|
Demiurge
|
2024-04-06 04:04:23
|
What's wrong with calling it premultiplied? It has to be multiplied if it's unassociated
|
|
|
lonjil
|
2024-04-06 04:05:09
|
because you're not doing any pre-multiplication
|
|
|
Demiurge
|
2024-04-06 04:06:07
|
You're not doing it because it's already premultiplied :)
|
|
|
lonjil
|
2024-04-06 04:06:12
|
no
|
|
2024-04-06 04:06:36
|
if you happen to have unassociated alpha data, then doing the multiplication for the unassociated over operation gives you associated alpha data
|
|
|
Quackdoc
|
|
Demiurge
What's wrong with calling it premultiplied? It has to be multiplied if it's unassociated
|
|
2024-04-06 04:06:57
|
```
unfa
04/27/2021 9:33 AM
I keep thinking in terms of "pre-multiplied" alpha,
where I'd only want to affect the alpha component.
troy_s 04/27/2021 9:34 AM
You obviously don't want to scale occlusion beyond
the 0-100% range, but for light it is acceptable, and
required.
This is all associated alpha, which is why
premultiplied is a beyond shit term.
It makes people think about math, and the wrong
math.
unfa
04/27/2021 9:35 AM
Oh yes, sometimes this creates problems. I hope
we'll soon have a way to choose affected channels
soon, so I can avoid having alpha >1 when I wanted
to increase exposure.
troy_s 04/27/2021 9:35 AM
You never want to separate alpha and RGBA. Ever.
They are associated
The sole time this is required is for colour
operations.
If you think "Are these light emissions associated
with the degree of occlusion?" it is a helpful
mnemonic. (edited)
```
|
|
|
lonjil
|
|
lonjil
if you happen to have unassociated alpha data, then doing the multiplication for the unassociated over operation gives you associated alpha data
|
|
2024-04-06 04:07:07
|
but if you never had unassociated alpha data to begin with, that multiplication never happens!
|
|
|
Quackdoc
|
2024-04-06 04:07:08
|
hope OCR didn't bugger this too bad
|
|
|
Demiurge
You're not doing it because it's already premultiplied :)
|
|
2024-04-06 04:08:02
|
it literally isn't
|
|
2024-04-06 04:08:12
|
I mean in some cases it could be
|
|
2024-04-06 04:08:23
|
but it's completely irrelevant to how it works
|
|
|
Demiurge
|
2024-04-06 04:09:31
|
Ok, so if you want to increase emissions separately from occlusion...
|
|
|
Quackdoc
|
|
Demiurge
|
2024-04-06 04:10:37
|
Although occlusion will always still multiply emission...
|
|
2024-04-06 04:12:59
|
It makes sense for RGB values to be out of range but not occlusion/alpha
|
|
2024-04-06 04:13:33
|
But the emission is still a multiple of occlusion...
|
|
2024-04-06 04:14:31
|
So the term still makes sense to me...
|
|
|
lonjil
|
|
Quackdoc
|
|
Demiurge
But the emission is still a multiple of occlusion...
|
|
2024-04-06 04:15:01
|
it's not; what happens when alpha is 0?
|
|
|
Demiurge
|
2024-04-06 04:15:37
|
Fully occluded means that pixel has no emission either.
|
|
|
lonjil
|
2024-04-06 04:16:12
|
the pixel is what's doing the occlusion
|
|
|
Quackdoc
|
2024-04-06 04:16:13
|
if you have an alpha zero it means the pixel has no substance you can see, nothing solid
|
|
2024-04-06 04:16:28
|
does that mean the energy disappears? No!
|
|
2024-04-06 04:17:02
|
Take, for example, the glow of a candlelight. We can't see the glow unless it's bouncing off dust in the air.
Does that mean the energy disappears? No, it's still there.
|
|
|
Demiurge
|
2024-04-06 04:17:39
|
If you have an occluded pixel then yes, the energy from that pixel is reduced by a factor of its occlusion.
|
|
|
Quackdoc
|
2024-04-06 04:17:42
|
energy doesn't disappear just because it has nothing to reflect off of
|
|
|
Demiurge
|
2024-04-06 04:17:57
|
It's a multiple
|
|
|
Quackdoc
|
2024-04-06 04:18:02
|
it will keep going until it hits something that does
|
|
|
lonjil
|
|
Demiurge
If you have an occluded pixel then yes, the energy from that pixel is reduced by a factor of it's occlusion.
|
|
2024-04-06 04:18:18
|
but it isn't occluded??
|
|
2024-04-06 04:18:30
|
like let's say it's the front-most pixel
|
|
2024-04-06 04:18:43
|
and has alpha=0 but RGB>0
|
|
|
Demiurge
|
2024-04-06 04:18:55
|
Energy can escape from partially occluded pixels but only by a factor of its original energy times the occlusion factor
|
|
|
Quackdoc
|
2024-04-06 04:19:28
|
You need to find what you mean by occluded?
|
|
2024-04-06 04:19:39
|
can you define*
|
|
|
Demiurge
|
2024-04-06 04:19:40
|
Meaning that it was already multiplied, or pre-multiplied if you will
|
|
|
Quackdoc
|
2024-04-06 04:19:44
|
STT took a crap
|
|
|
Demiurge
|
2024-04-06 04:20:19
|
Well from what I'm reading, it means that those pixels are going to be covered up by a different texture.
|
|
2024-04-06 04:20:36
|
If you are blending two textures together using an alpha mask
|
|
2024-04-06 04:21:06
|
As the simplest example
|
|
|
lonjil
|
2024-04-06 04:21:21
|
let's have a concrete example, say 1,1,1,0 as the background, and 1,0,0,0 as the foreground
|
|
2024-04-06 04:21:29
|
what do you get?
|
|
|
Demiurge
|
2024-04-06 04:21:32
|
Occluded just means covered up
|
|
2024-04-06 04:22:20
|
A pixel that's partially occluded would be blended with another pixel's color using linear light blending
|
|
|
lonjil
|
2024-04-06 04:22:58
|
the result would be 2,1,1,0
|
|
2024-04-06 04:23:11
|
no occlusion, just the foreground pixel adding light
|
|
|
Demiurge
|
|
lonjil
let's have a concrete example, say 1,1,1,0 as the background, and 1,0,0,0 as the foreground
|
|
2024-04-06 04:23:22
|
If both have an alpha of 0 then the result would be transparent nothing
|
|
|
lonjil
|
2024-04-06 04:23:28
|
wrong
|
|
2024-04-06 04:23:36
|
that's only true of unassociated alpha
|
|
|
Demiurge
|
2024-04-06 04:24:11
|
No, that's just what you expect when you add two of nothing together. You get nothing
|
|
|
lonjil
|
2024-04-06 04:24:22
|
go read the links that were posted earlier
|
|
|
Quackdoc
|
2024-04-06 04:24:28
|
well assuming the background of the media player has no solidity, it would *look* like nothing
|
|
2024-04-06 04:24:50
|
despite the energy still being there
|
|
|
lonjil
|
2024-04-06 04:25:15
|
the energy would reach the camera
|
|
2024-04-06 04:25:36
|
a candle flame is transparent but emits light
|
|
2024-04-06 04:25:49
|
that light will reach the camera, even if there is no solid background
|
|
|
Demiurge
|
2024-04-06 04:26:32
|
Not if I put an object between the camera and the candle
|
|
|
Quackdoc
|
2024-04-06 04:26:34
|
it can depend on how you are modeling the light. if there is nothing to reflect off of, in some cases it will be as if there is nothing
|
|
|
Demiurge
|
2024-04-06 04:26:36
|
That's occlusion
|
|
|
lonjil
|
|
Demiurge
Not if I put an object between the camera and the candle
|
|
2024-04-06 04:26:56
|
but how is that relevant?
|
|
|
Quackdoc
|
2024-04-06 04:27:13
|
its not unless its partial occlusion
|
|
|
Demiurge
|
2024-04-06 04:27:19
|
https://en.m.wikipedia.org/wiki/Alpha_compositing
|
|
|
lonjil
|
2024-04-06 04:27:31
|
if you put something in front, you get occlusion regardless of whatever alpha the background might have had
|
|
|
Demiurge
|
2024-04-06 04:27:48
|
It's relevant because the alpha channel represents occlusion when it's associated alpha
|
|
|
lonjil
|
2024-04-06 04:27:52
|
and partial transparency is something that both associated and unassociated alpha can represent
|
|
|
Demiurge
It's relevant because the alpha channel represents occlusion when it's associated alpha
|
|
2024-04-06 04:28:22
|
yeah, which means alpha=0 is perfectly sensible, and obviously the color can't be a multiple of it
|
|
|
Quackdoc
|
2024-04-06 04:29:15
|
occlusion can be a bit of a hard term here. I prefer the term solidity; it means the same thing, but it's easier to visualize
|
|
|
lonjil
yeah, which means alpha=0 is perfectly sensible, and obviously the color can't be a multiple of it
|
|
2024-04-06 04:30:47
|
exactly this. if color was a multiple of alpha, this would break a lot of things
|
|
|
Demiurge
|
2024-04-06 04:31:03
|
Occluded meaning that it's covered up by something else
|
|
2024-04-06 04:31:15
|
Which is a good way of describing what is happening to the pixel values
|
|
2024-04-06 04:31:25
|
After blending
|
|
|
lonjil
|
2024-04-06 04:31:47
|
and when you have a pixel with a value like 1,0,0,0, that means it has color, which will show up on your screen, but it won't occlude anything
|
|
|
Demiurge
|
2024-04-06 04:31:54
|
If it's zero then it's fully covered up or invisible
|
|
|
lonjil
|
2024-04-06 04:32:12
|
no, if it's zero it isn't occluding anything behind it
|
|
2024-04-06 04:32:16
|
it isn't itself occluded
|
|
|
Quackdoc
|
|
lonjil
and when you have a pixel with a value like 1,0,0,0, that means it has color, which will show up on your screen, but it won't occlude anything
|
|
2024-04-06 04:32:21
|
only if you have a renderer that assumes the background is reflective or that the camera can see the energy directly
|
|
|
Demiurge
|
2024-04-06 04:32:37
|
0 means invisible right?
|
|
|
lonjil
|
|
Quackdoc
only if you have a renderer that assumes background is reflective or the camera can see the energy directly
|
|
2024-04-06 04:32:55
|
I tend to think that cameras can see photons
|
|
|
Demiurge
0 means invisible right?
|
|
2024-04-06 04:33:17
|
it means light passes right thru it
|
|
2024-04-06 04:33:29
|
but if it's emitting light itself, it will be visible
|
|
2024-04-06 04:33:42
|
it transforms the over operation into the add operation
|
|
|
Quackdoc
|
2024-04-06 04:33:58
|
if you set alpha 0, a lot of renderers will only show if the background is solid
|
|
|
Demiurge
|
|
lonjil
no, if it's zero it isn't occluding anything behind it
|
|
2024-04-06 04:34:32
|
Lonnie, I don't think that is how it works.
|
|
|
w
|
2024-04-06 04:34:38
|
why you should never use alpha
|
|
2024-04-06 04:34:48
|
i only go for beta
|
|
|
Quackdoc
|
|
w
i onyl go for beta
|
|
2024-04-06 04:34:58
|
based
|
|
|
lonjil
|
|
Quackdoc
if you set alpha 0, a lot of renderers will only show if the background is solid
|
|
2024-04-06 04:35:01
|
I find that a bit silly, but, I think either way it completely contradicts what Pashifox is saying
|
|
|
Demiurge
|
|
lonjil
but if it's emitting light itself, it will be visible
|
|
2024-04-06 04:35:15
|
Sorry, I meant to reply to this.
|
|
|
lonjil
|
|
Demiurge
Sorry, I meant to reply to this.
|
|
2024-04-06 04:35:36
|
a candle flame is nearly transparent, but is highly visible because it emits light
|
|
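[A sketch of the candle-flame example in code, my addition with made-up numbers: a nearly transparent but emissive pixel dominates a dim background, yet nearly vanishes against a bright one.]

```python
# Associated-alpha "over": C_out = C_fg + C_bg * (1 - A_fg)
def over_premul(c_fg, a_fg, c_bg):
    return c_fg + c_bg * (1 - a_fg)

flame = 0.9   # premultiplied emission: bright despite a tiny alpha
alpha = 0.05  # nearly transparent, so it barely occludes anything

# Over a dim room, the flame's emission dominates the result:
print(round(over_premul(flame, alpha, 0.1), 3))  # 0.995

# Over bright daylight, nearly all the background comes through too,
# so the flame adds little relative contrast:
print(round(over_premul(flame, alpha, 1.0), 3))  # 1.85
```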
|
Quackdoc
|
|
lonjil
I find that a bit silly, but, I think either way it completely contradicts what Pashifox is saying
|
|
2024-04-06 04:35:40
|
it better models how humans see light
|
|
|
Demiurge
|
2024-04-06 04:35:48
|
If alpha is zero it doesn't add anything
|
|
|
Quackdoc
|
2024-04-06 04:35:57
|
it does
|
|
|
lonjil
|
|
Demiurge
If alpha is zero it doesn't add anything
|
|
2024-04-06 04:36:15
|
do the math with the equation on the wiki page you linked
|
|
2024-04-06 04:37:33
|
with associated alpha, it's C_out = C_foreground + C_background*(1 - A_foreground)
|
|
2024-04-06 04:37:58
|
what happens when C_foreground is 1 and A_foreground is 0?
|
|
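[Working lonjil's question through the equation above, as an illustrative sketch (my addition): with C_foreground = 1 and A_foreground = 0, "over" degenerates into pure addition, so the pixel contributes its full color without occluding the background.]

```python
# Associated-alpha "over": C_out = C_fg + C_bg * (1 - A_fg)
def over_premul(c_fg, a_fg, c_bg):
    return c_fg + c_bg * (1 - a_fg)

# C_foreground = 1, A_foreground = 0: the foreground color is added
# in full and the background passes through untouched.
print(over_premul(1.0, 0.0, 0.25))  # 1.25, may need clamping for display
```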
|
Demiurge
|
2024-04-06 04:38:41
|
It's wrong to call it occlusion then
|
|
|
Quackdoc
|
2024-04-06 04:38:50
|
its not
|
|
2024-04-06 04:38:59
|
you just have a wrong interpretation of it
|
|
|
Demiurge
|
2024-04-06 04:39:07
|
Because it's not being occluded by the pixel underneath
|
|
2024-04-06 04:39:15
|
If it's being added
|
|
|
lonjil
|
2024-04-06 04:39:21
|
> occluded by the pixel underneath
|
|
2024-04-06 04:39:23
|
????
|
|
2024-04-06 04:39:45
|
since when do objects closer to the camera get covered by objects further away
|
|
|
Quackdoc
|
2024-04-06 04:40:23
|
Candlelight is really a perfect example of this, because you can have a unit of candlelight that you cannot see because it's not being reflected into your eyes or the camera, but you can see it impart energy onto things around it.
|
|
|
Demiurge
|
2024-04-06 04:40:43
|
Well, if the pixel underneath is covering up the texture on top... when blending two textures together based on an alpha mask, the distinction between "top" and "bottom" is a meaningless one
|
|
2024-04-06 04:41:11
|
And occlusion just means which one is covering up the other
|
|
|
Quackdoc
|
2024-04-06 04:41:12
|
Why would the pixel underneath be covering up the pixel on top? That doesn't make sense.
|
|
|
Demiurge
|
2024-04-06 04:42:45
|
The alpha mask or alpha value is a weight of how much each pixel covers up or gets covered up
|
|
|
Quackdoc
|
2024-04-06 04:43:21
|
"Layers" work top down: you can think of the topmost layer as the closest object to you and the bottommost layer as the object furthest away.
|
|
2024-04-06 04:43:41
|
For something to block something below it, it would have to be in the forefront
|
|
2024-04-06 04:44:16
|
Now you can do special stuff where you take a mask and bring it to the forefront; that is possible.
But something in the back will never block something in the front under normal circumstances.
|
|
2024-04-06 04:45:41
|
ofc directionality is based on light source
|
|
|
Demiurge
|
2024-04-06 04:45:43
|
I think the alpha channel represents how much each pixel will block or get blocked by the pixel it's blending with
|
|
|
lonjil
|
2024-04-06 04:46:16
|
no
|
|
2024-04-06 04:46:30
|
the alpha of the background pixel is actually irrelevant to the blend operation
|
|
|
Demiurge
|
2024-04-06 04:46:45
|
That's what the word "occlusion" makes me think and understand
|
|
2024-04-06 04:47:00
|
And then it makes a lot of sense to say "premultiplied"
|
|
|
Quackdoc
|
2024-04-06 04:47:22
|
If you have something with alpha one in the foreground, it will block out what's in the background; that's about it.
|
|
|
Demiurge
|
2024-04-06 04:47:23
|
Because the emissive values are indeed a multiple of how occluded they are
|
|
|
lonjil
|
2024-04-06 04:47:27
|
do you live in an alternate reality where a tree outside occludes your window curtains?
|
|
|
Demiurge
|
2024-04-06 04:49:17
|
My transparent glass window pane is not being occluded by a tree outside when the light shines through; it's just shining through. But when combining 2 pixels together, the word "occlusion" just means which pixel wins
|
|
|
lonjil
|
|
Quackdoc
|
|
Demiurge
My transparent glass window pane is not being occluded by a tree outside when the light shines through, it's just shining through. But when combining 2 pixels together the word "occlusion" just means which pixel wins
|
|
2024-04-06 04:49:53
|
what?
|
|
2024-04-06 04:50:17
|
All it means is how much does the foreground block the background.
|
|
|
Demiurge
|
2024-04-06 04:51:03
|
Foreground and background are kind of arbitrary depending on how the computer algorithm is implemented.
|
|
|
lonjil
|
|
Quackdoc
|
2024-04-06 04:51:15
|
no it's not lmao
|
|
|
Demiurge
|
2024-04-06 04:51:21
|
It's just blending 2 colors together
|
|
|
lonjil
|
2024-04-06 04:51:23
|
foreground and background is *fundamental* to alpha compositing
|
|
|
Demiurge
|
2024-04-06 04:51:30
|
It's literally a pixel
|
|
|
lonjil
|
2024-04-06 04:51:39
|
like please read the wiki article you linked
|
|
|
Quackdoc
|
|
lonjil
|
2024-04-06 04:51:49
|
every operation is defined in terms of foreground and background
|
|
|
Quackdoc
|
2024-04-06 04:51:53
|
monitors display pixels. They're not arbitrary at all.
|
|
|
lonjil
|
2024-04-06 04:51:56
|
and foreground and background are not treated the same
|
|
2024-04-06 04:52:18
|
if you want both inputs to be treated the same, you need to do something other than alpha compositing :)
|
|
|
Demiurge
|
2024-04-06 04:52:25
|
An alpha value is just a weight of which pixel gets more weight
|
|
|
lonjil
|
|
Demiurge
|
2024-04-06 04:52:49
|
Is the final color closer to this pixel? Or that pixel
|
|
|
lonjil
|
2024-04-06 04:53:05
|
well a candle flame has an alpha of nearly 0
|
|
|
Quackdoc
|
2024-04-06 04:53:10
|
With associated alpha, all the alpha does is say how much does it block the background? That's it. That's all it does.
|
|
|
lonjil
|
2024-04-06 04:53:17
|
but it will still show up much more than whatever is behind it
|
|
|
Quackdoc
|
2024-04-06 04:53:43
|
it's literally a modifier that determines how much background color to add to a pixel
|
|
2024-04-06 04:53:47
|
thats it
|
|
2024-04-06 04:53:51
|
nothing else
|
|
2024-04-06 04:54:16
|
if you have 1 alpha, you add none of the background
|
|
|
lonjil
|
2024-04-06 04:54:30
|
C_out = C_foreground + C_background*(1 - A_foreground)
|
|
|
Quackdoc
|
2024-04-06 04:54:32
|
if you have 0 alpha, you add all of the background
|
|
|
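[The two endpoints Quackdoc describes fall straight out of the same formula; a quick sketch with made-up values (my addition, not from the chat):]

```python
# Associated-alpha "over": C_out = C_fg + C_bg * (1 - A_fg)
def over_premul(c_fg, a_fg, c_bg):
    return c_fg + c_bg * (1 - a_fg)

bg = 0.8

# Alpha 1: none of the background gets through.
print(over_premul(0.3, 1.0, bg))  # 0.3

# Alpha 0: all of the background is added to the foreground color.
print(over_premul(0.3, 0.0, bg))  # 1.1
```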
Demiurge
|
|
lonjil
but it will still show up much more than whatever is behind it
|
|
2024-04-06 04:55:20
|
Unless you're outside in the sun in a brightly lit environment. Then the candle flame will look almost invisible because of how bright the surroundings are.
I'll re-read the article to make sure I'm understanding what it says
|
|
|
lonjil
|
|
Demiurge
Unless you're outside in the sun in a brightly lit environment. Then the candle flame will look almost invisible because of how bright the surroundings are.
I'll re-read the article to make sure I'm understanding what it says
|
|
2024-04-06 04:55:55
|
indeed, as long as what I'm saying is true in at least one situation, my point is made
|
|
|
Quackdoc
|
2024-04-06 04:56:04
|
Keep in mind, you cannot see light unless the light is being reflected into your eyes. But that does not mean the energy disappears.
|
|