|
Dexrn ZacAttack
|
2023-11-13 02:10:40
|
I can see |
|
|
username
|
2023-11-13 02:10:52
|
and by that I mean it doesn't work in non-nightly builds and what does work is incomplete |
|
|
Dexrn ZacAttack
|
2023-11-13 02:11:12
|
<@245794734788837387> does waterfox support firefox accounts |
|
|
username
|
2023-11-13 02:11:40
|
it should I think I have a friend who uses firefox accounts with it |
|
|
Dexrn ZacAttack
|
2023-11-13 02:12:11
|
Okay |
|
2023-11-13 02:12:20
|
also for some reason they still don't embed? |
|
2023-11-13 02:12:29
|
|
|
|
username
|
2023-11-13 02:12:52
|
oh I think I might know why |
|
2023-11-13 02:13:12
|
Safari has this weird thing where it refuses to embed anything that is too small in file size |
|
2023-11-13 02:13:24
|
the same thing happens with WebP files as well |
|
|
Quackdoc
|
2023-11-13 02:13:26
|
that's a thing? |
|
|
username
|
2023-11-13 02:13:43
|
there was talk of it a while ago in this server somewhere |
|
|
Dexrn ZacAttack
|
2023-11-13 02:13:48
|
Oh well that's why then |
|
2023-11-13 02:13:57
|
Can you upload a large one to http://blizzardfur.us.to |
|
2023-11-13 02:16:20
|
Cool |
|
2023-11-13 02:16:43
|
LMAO |
|
2023-11-13 02:16:56
|
<@245794734788837387> you and NeRd uploaded the same thing at the same time I think |
|
|
username
|
2023-11-13 02:17:04
|
oh wow |
|
2023-11-13 02:17:34
|
I was confused for a second as to why 2 of them appeared |
|
|
Dexrn ZacAttack
|
2023-11-13 02:18:19
|
Did you make a user |
|
2023-11-13 02:18:24
|
or are you UnknownUser |
|
|
username
|
2023-11-13 02:19:42
|
I didn't make an account/user |
|
|
Dexrn ZacAttack
|
2023-11-13 02:19:45
|
ah ok |
|
2023-11-13 02:20:48
|
Do you know how we can encode into JXL, and if it has animation support? |
|
|
username
|
2023-11-13 02:21:54
|
hm? jxl does have animation support and it's working on my end |
|
2023-11-13 02:22:58
|
maybe an issue with safari? I have heard safari's support for jxl differs between platforms/OSes |
|
|
Dexrn ZacAttack
|
2023-11-13 02:23:03
|
Probably |
|
2023-11-13 02:24:09
|
<@245794734788837387> How can we encode into it? <@404585766799278081>, a few others, and I have a [project](https://github.com/DexrnZacAttack/QmageDecoder) and I think it would be interesting to encode the boot animations into a JXL |
|
|
username
|
2023-11-13 02:25:25
|
the boot animation JXL you uploaded is animated for me |
|
|
Dexrn ZacAttack
|
2023-11-13 02:26:38
|
I didn't upload it |
|
2023-11-13 02:26:41
|
<@404585766799278081> did |
|
2023-11-13 02:26:45
|
I didn't know we had it working lmao |
|
|
username
|
2023-11-13 02:26:51
|
ooooh |
|
|
Dexrn ZacAttack
|
2023-11-13 02:28:37
|
Currently showing <@435464956910108672> |
|
2023-11-13 02:28:59
|
|
|
2023-11-13 02:29:05
|
embeds on waterfox |
|
|
username
|
2023-11-13 02:34:49
|
here's a list of browsers that support JXL off the top of my head in a non-specific somewhat random order:
- [Waterfox](https://www.waterfox.net/) [Firefox/Gecko]
- [Pale Moon](http://www.palemoon.org/) ["legacy Firefox"/Goanna]
- [Basilisk](https://www.basilisk-browser.org/) ["legacy Firefox"/Goanna]
- [Thorium](https://thorium.rocks/) [Chromium/blink]
- [Floorp](https://floorp.app/) [Firefox/Gecko]
- [Midori](https://astian.org/midori-browser) [Firefox/Gecko]
- [Cromite](https://www.cromite.org/) [Chromium/blink]
- [Safari](https://www.apple.com/safari/) [WebKit] [Apple devices only]
- [Epiphany](https://apps.gnome.org/Epiphany/) [WebKit] [Linux only] |
|
|
Quackdoc
|
2023-11-13 02:36:40
|
one day we will be able to add servo based browsers to that list :D |
|
|
username
|
2023-11-13 02:38:02
|
that I've been hoping for! although it needs to be added to image-rs first right? |
|
|
Quackdoc
|
2023-11-13 02:38:44
|
yeah, which I still have no idea what to do with since it's like, if they aren't gonna accept it, I'm not gonna bother |
|
|
Dexrn ZacAttack
|
2023-11-13 02:40:35
|
What is a servo based browser? |
|
|
Quackdoc
|
2023-11-13 02:41:05
|
so far the only technical limitation is that we need to be able to set a max memory usage for jxl-oxide for image-rs to accept it, once jxl-oxide can do that somehow (too advanced for me since it needs to deal with multiple OSes and wasm). then I can at least get it to a technically ready state, but whether or not they would include it... well we have no answer on that yet |
|
|
username
|
2023-11-13 02:41:20
|
there aren't any yet since servo is still under development https://servo.org/ |
|
|
Quackdoc
|
2023-11-13 02:41:55
|
servo is a Rust-based web engine, think of it like gecko or blink I guess, it actually shares a lot with gecko code-wise. it is pretty promising |
|
|
Dexrn ZacAttack
|
2023-11-13 02:42:59
|
Interesting |
|
2023-11-13 02:43:53
|
is JXR related? |
|
2023-11-13 02:44:02
|
I assume not |
|
|
username
|
2023-11-13 02:44:47
|
servo was originally being developed by Mozilla and some components of it are in use by Firefox today but at some point Mozilla decided they didn't care anymore and fired most of the developers working on servo which is why it's now under the Linux Foundation |
|
|
Quackdoc
|
2023-11-13 02:45:46
|
they also wanted it for embedded usage, they were actually investing in it and had some support by people like samsung who wanted to use it |
|
2023-11-13 02:46:02
|
currently most devs are hired by Igalia, who got the funding from... someone, to work on it |
|
2023-11-13 02:46:15
|
tauri also recently got some funding to integrate it iirc |
|
|
username
|
2023-11-13 02:47:00
|
not really, the only thing about it that is related is that it's one of the many "JPEGs" that exist https://jpeg.org/index.html |
|
|
Dexrn ZacAttack
|
2023-11-13 02:49:57
|
I hear so many people saying JPEG absolutely sucks because of the compression. Doesn't JPEG have a lossless mode? |
|
|
username
|
2023-11-13 02:51:44
|
JPEG XL has an amazing lossless mode but if you are talking about the original JPEG then uhhh it sorta does and sorta doesn't have a lossless mode |
|
|
w
|
2023-11-13 02:52:15
|
the format doesn't decide if it's lossless or not |
|
|
Dexrn ZacAttack
|
2023-11-13 02:55:09
|
Does the lossy mode have transparency |
|
|
username
|
2023-11-13 02:55:42
|
yes |
|
2023-11-13 02:56:19
|
there's a lot of stuff here if you wanna read more: https://jpegxl.info/ |
|
|
Dexrn ZacAttack
|
2023-11-13 02:56:32
|
To me JXL sounds like a better PNG |
|
|
username
|
2023-11-13 02:57:16
|
it does beat out PNG both with features and compression ratio |
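For anyone wanting to try the PNG comparison themselves, a minimal sketch (assumes the libjxl command-line tools are installed; file names are placeholders):

```shell
# Lossless encode: -d 0 requests distance 0, i.e. mathematically lossless
cjxl -d 0 input.png output.jxl
# Decode back to PNG to verify the round trip
djxl output.jxl roundtrip.png
```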
|
|
Dexrn ZacAttack
|
2023-11-13 02:59:33
|
How do the incredibly small images work |
|
2023-11-13 02:59:39
|
like the 53 byte images |
|
|
username
|
2023-11-13 03:04:26
|
there are people in this server who have explained it way better than I probably ever could but uhh here's some stuff for that and related to it:
https://jpegxl.info/art/
https://jxl-art.surma.technology/
https://jxl-art.surma.technology/wtf.html |
|
2023-11-13 03:06:53
|
the last link actually does some explaining for how it works |
|
|
fab
|
2023-11-13 04:30:50
|
Remove palemoon is bad |
|
2023-11-13 04:31:04
|
Midori is better |
|
2023-11-13 04:31:49
|
Also iceraven android good on par with yesterday Mozilla Firefox dev Android |
|
|
Traneptora
|
2023-11-13 05:45:28
|
Jxl supports pretty much everything |
|
2023-11-13 05:46:04
|
lossy and lossless, high bit depth, HDR, alpha, progressive loading, etc. |
|
2023-11-13 05:46:36
|
you can also do lossy RGB with lossless alpha which is cool |
|
2023-11-13 05:46:58
|
for, say, transparent bkg images on a shop website |
|
|
fab
|
2023-11-13 05:56:10
|
Which Android app support ultra hdr viewing |
|
|
Dexrn ZacAttack
|
2023-11-13 07:39:19
|
Will there be a video format <:tf:1170099139489570899> |
|
|
OkyDooky
|
2023-11-13 07:52:49
|
It's "effectively lossless" if you encode it at 100 quality. It still winds up being smaller than PNG, unless it's something that PNG is especially good at (like text on a solid color background).
(<@485504221781950465>) |
|
|
Traneptora
|
2023-11-13 08:15:10
|
there is a lossless jpeg extension but it's not always possible with baseline |
|
|
jonnyawsom3
|
2023-11-14 01:21:42
|
Technically, yes. We've crammed JXLs into an MKV and I wanted to try transcoding an MJPEG one day |
|
|
Quackdoc
|
2023-11-14 02:19:06
|
I don't have much of an interest in animated jxl outside of all intra, but I do have a couple jxl mkvs |
|
|
gb82
|
2023-11-14 03:25:14
|
you should send this exact message as a .md file |
|
|
Traneptora
|
2023-11-14 03:34:16
|
right click, copy text |
|
|
gb82
|
2023-11-14 03:36:56
|
so true. thanks |
|
|
Dexrn ZacAttack
|
2023-11-14 06:10:20
|
I don't see MJPEG being used anywhere except IP cameras |
|
2023-11-14 09:45:25
|
How do I create an animated JXL using FFMPEG? |
|
|
Traneptora
|
2023-11-14 08:18:55
|
at the moment, you can't. you have to make an apng then use cjxl |
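A minimal sketch of the APNG-then-cjxl route described above (assumes ffmpeg and the libjxl tools are installed; `clip.mp4` is a placeholder input):

```shell
# Step 1: turn the video into an animated PNG
# (-plays 0 makes the APNG loop forever)
ffmpeg -i clip.mp4 -plays 0 intermediate.apng
# Step 2: encode the APNG as an animated JXL
cjxl intermediate.apng animation.jxl
```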
|
|
Dexrn ZacAttack
|
2023-11-15 04:36:20
|
Yeah I found that out |
|
|
OkyDooky
|
2023-11-20 07:41:37
|
Is there an Android gallery app out there that supports decode? |
|
|
diskorduser
|
2023-11-20 11:39:13
|
Not yet |
|
|
OkyDooky
|
2023-11-21 07:16:16
|
There are two (Simple Gallery Pro and Aves) that I've posted in issues for JXL and both developers took the stance of "we'll include support in this app when there's more support for it in general." So, keep advocating for support in any other app or platform you can and it'll have a domino effect. I already successfully lobbied for JXL inclusion in a browser (Cromite), so it's definitely doable. |
|
2023-11-21 07:18:11
|
I'll post sometime later in those issue threads (for the gallery apps) about Cromite. Maybe that'll nudge the scales towards that tipping point a bit more. 🤞 |
|
|
fab
|
2023-11-21 08:45:32
|
Nobody can distinguish jxl |
|
2023-11-21 08:45:52
|
It appears like a better jpg |
|
2023-11-21 08:45:57
|
That's the problem |
|
|
Quackdoc
|
2023-11-21 09:28:04
|
Aves wouldn't be too hard to add basic support to; Aves now uses mpv for video and can already use it as a fallback for AVIF, so using the same pipeline for JXL probably wouldn't be too hard |
|
2023-11-21 09:29:25
|
I do have a media_kit Android library fork (the API it uses for mpv) that adds support for jxl, albeit a bit outdated, should just need a rebase https://github.com/Quackdoc/libmpv-android-video-build/tree/libjxl |
|
2023-11-21 09:30:09
|
build the libraries, patch Aves to support jxl on the fallback pipeline, and that should be enough, flutter is pretty easy so I wouldn't be surprised if it's only a couple-line patch on the Aves side |
|
|
qdwang
|
2023-11-21 04:07:10
|
Hi guys, I'm a bit confused about the HDR part of jxl. I've tried this experiment:
1. Generate a 16-bit PNG from RAW
2. Use `cjxl --intensity_target=1000 src.png tgt.jxl` to convert the PNG to JXL
3. Put the JXL file on an HDR-capable device (like an iPhone 15 Pro)
4. It still displays in SDR mode.
(p.s. Real HDR JXL files from https://people.csail.mit.edu/ericchan/hdr/hdr-jxl.php do display in HDR mode in the iPhone 15 Pro Photos app)
So how can I convert a 16-bit PNG to an HDR JXL manually? |
|
|
HCrikki
|
2023-11-21 04:13:21
|
imo a big mistake with jxl is that we're waiting for devs to supply official builds as political support for the format. if possible folks should produce patches and provide unofficial builds including that support in preparation for eventual upstreaming |
|
|
MSLP
|
2023-11-21 04:23:50
|
Well, avif was released earlier and with a chromium-team blessing anyway, so it may have just got removed from chromium earlier that way.
Changing factor may be if android team decides to include jxl support in standard media library.
Even jxl-recompressed-jpeg (aka legacy mode) support would be great, as it would give more space for storing the photos on your phone. |
|
2023-11-21 04:46:39
|
For this moment my personal wishlist regarding android is pretty simple:
- OpenCamera support for encoding JPEGs as JXL (only by using jpeg lossless transcode)
- Some sane gallery app that is able to view such files
I guess all of the above could be done only in java code (without including any binary libs in the package) |
|
|
spider-mario
|
2023-11-21 08:42:18
|
currently, this is mainly determined by whether the image is PQ- or HLG-encoded |
|
2023-11-21 08:42:41
|
a somewhat roundabout way to go about it could be to convert your PNG to EXR (making sure to produce linear data) and then convert the EXR to a PQ PNG using https://github.com/libjxl/libjxl/blob/main/tools/hdr/exr_to_pq.cc |
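That roundabout route might look roughly like this (a sketch only: it assumes ImageMagick for the PNG-to-EXR step and a libjxl checkout with the `tools/hdr` utilities built; check each tool's `--help` since the exact invocations here are my assumption):

```shell
# 1. Convert the 16-bit PNG to linear-light EXR
#    (-colorspace RGB asks ImageMagick for linear RGB on output)
magick input16.png -colorspace RGB linear.exr
# 2. Convert the linear EXR to a PQ-encoded PNG using the libjxl helper
exr_to_pq linear.exr pq.png
# 3. Encode the PQ PNG as JXL, declaring the peak luminance
cjxl pq.png out.jxl --intensity_target=1000
```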
|
|
qdwang
|
2023-11-21 09:01:17
|
Thanks for this info. I'll have a try |
|
|
OkyDooky
|
2023-11-21 09:31:44
|
If I'm reading that correctly, isn't that basically what Thorium and Cromite are doing?
Re: gallery apps. Are you suggesting that it'd be better to fork them and add initial JXL support, rather than asking the main devs to first support it using other patches (like Quackdoc's link above)?
(<@174118775753408512>) |
|
2023-11-21 09:31:56
|
I think the issue is less about it being hard and more about those devs just not being personally interested. They both conceded that an existing ecosystem for JXL usage would force their hands, though.
(<@184373105588699137>) |
|
|
HCrikki
|
2023-11-21 09:41:23
|
releasing 3rd-party builds with the support added and configurable defaults changed is not forking. it's often as simple as including one extra patch alongside the 50 you're already adding, or building the imaging library with jxl explicitly enabled |
|
2023-11-21 09:47:10
|
if working patches exist, unofficial builds should be produced so jxl can be consumed by at least testers. their merger upstream is only *ideal*, not really required if they oppose inclusion like with chromium. 3rd-party builds would serve as demonstrators that the patches work, have no adverse effect on reliability or performance, and accelerate resolution of any discovered issues |
|
|
Quackdoc
|
2023-11-21 09:59:55
|
the issue with aves specifically is they don't want to take PRs so I haven't bothered. also I more or less stopped working with flutter due to its absolutely comical state on linux right now, I'm actually getting better performance using waydroid to run flutter apps |
|
2023-11-21 10:01:17
|
However I generally think yes, adding support to an application instead of waiting is generally better, ofc if you have the knowledge on how to do so |
|
|
damian101
|
2023-11-21 10:05:31
|
Well, what's the transfer curve of your source file? There are different types of HDR. Setting intensity target to 1000 will do nothing for SDR transfer curves, as they usually are not tonemapped. |
|
|
Quackdoc
|
2023-11-21 10:08:01
|
tfw you map 1:1 sdr to hdr and blind yourself |
|
|
spider-mario
|
2023-11-21 11:15:20
|
illustrated: https://images.mixinglight.com/cb:sOVj.5c7e3/w:725/h:544/q:mauto/ig:avif/f:best/https://mixinglight.com/wp-content/uploads/2018/06/robbiexm310k.jpg |
|
2023-11-21 11:15:32
|
(from https://mixinglight.com/color-grading-tutorials/10-things-about-hdr-grading/ ) |
|
|
Quackdoc
|
2023-11-21 11:18:41
|
> But the idea that high NIT HDR monitors are somehow uncomfortable or going to be blinding is just simply not true.
I actually disagree with this, even with the apple ipad pro, I had the unpleasurable experience of being bombed by a 1000-nit white image in a dark room, and calling it unpleasant is an understatement, it did wind up causing a mild headache for me |
|
2023-11-21 11:19:09
|
I mean, it's not super bad, but for people who are more sensitive than I am, it could lead to mild headaches easily |
|
2023-11-21 11:19:51
|
I mean it's not super bad ofc |
|
|
spider-mario
|
2023-11-21 11:20:37
|
headaches, plausibly (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2818758/); eye damage, thankfully not (https://iovs.arvojournals.org/article.aspx?articleid=2393529) |
|
|
190n
|
2023-11-21 11:22:26
|
> The sun, at midday, can be as bright as 1.6 Billion NITs
surely this figure is meaningless since it's so far away? |
|
|
spider-mario
|
2023-11-21 11:25:16
|
it still subtends a relatively significant angle |
|
2023-11-21 11:25:33
|
and is clearly unsafe to look at directly at midday |
|
|
Quackdoc
|
2023-11-21 11:25:48
|
one of the best references for PQ is an office fluorescent tube light, since they often hit around the peak of what PQ can theoretically do and it's something you can stand right in front of to gauge it in conditions similar to how one would consume it on a TV |
|
|
spider-mario
|
2023-11-21 11:25:58
|
https://en.wikipedia.org/wiki/Orders_of_magnitude_(illuminance)#Luminance |
|
|
Traneptora
|
2023-11-22 09:08:42
|
one nit is one candela per square meter |
|
2023-11-22 09:09:24
|
so the 1.6 billion nit number isn't based on its absolute brightness but how much luminous intensity it shines in one square meter |
|
|
190n
|
2023-11-22 09:09:45
|
square meter on the surface of the earth or on the sun? |
|
|
Traneptora
|
2023-11-22 09:09:53
|
from the perspective of the viewer |
|
2023-11-22 09:09:59
|
so on earth |
|
|
190n
|
2023-11-22 09:10:37
|
wait i thought that isn't how nits work |
|
2023-11-22 09:10:54
|
like a display doesn't have fewer nits if you look at it from farther away right? |
|
2023-11-22 09:11:23
|
so it would be brightness per area of the display, not per area on the inside of a sphere centered on the display? |
|
|
Traneptora
|
2023-11-22 09:11:30
|
well, it's per square meter |
|
2023-11-22 09:11:39
|
so the farther away the viewer is the fewer the nits the viewer sees |
|
2023-11-22 09:11:49
|
but the area for measuring a display is right on the display |
|
2023-11-22 09:12:02
|
in the case of the sun being 1.6 billion nits, it's measured from the surface of the earth |
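A quick sanity check on that 1.6-billion-nit figure: for a small uniform source, luminance is approximately illuminance divided by the solid angle the source subtends. The 100,000 lux and 0.53° inputs below are ballpark assumptions (typical noon sunlight and the sun's angular diameter), not numbers from this chat:

```python
import math

# Ballpark inputs (assumptions): direct-sun illuminance at noon on a
# surface facing the sun, and the sun's angular diameter from Earth.
illuminance_lux = 100_000
angular_diameter_deg = 0.53

# Solid angle of a small disc: omega ≈ pi * (theta/2)^2, theta in radians
theta = math.radians(angular_diameter_deg)
omega_sr = math.pi * (theta / 2) ** 2

# Luminance in nits (cd/m^2 = lm/sr/m^2): L ≈ E / omega for a small source
luminance_nits = illuminance_lux / omega_sr
print(f"{luminance_nits:.2e}")  # on the order of 1.5e9 nits
```

This lands right around the 1.6 Gnit value quoted above, which is reassuring for a back-of-envelope estimate.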
|
|
190n
|
2023-11-22 09:12:48
|
that feels weird |
|
|
Traneptora
|
2023-11-22 09:13:17
|
if you were to measure a display's nits from far away you'd get a smaller number |
|
2023-11-22 09:13:26
|
but the numbers reported are typically on the display's rectangle itself |
|
|
190n
|
2023-11-22 09:13:34
|
like if you wanted to compare sunlight and a display on the same scale wouldn't you have to use brightness per angle or something, and decide a viewing distance for the display? |
|
|
Traneptora
|
2023-11-22 09:13:53
|
yes, you'd have to decide a viewing distance for the display |
|
|
190n
|
2023-11-22 09:14:20
|
hence me saying that comparing the 1.6B figure to nits doesn't seem very sensible |
|
2023-11-22 09:14:23
|
like sure it's very bright |
|
2023-11-22 09:14:38
|
ok this makes sense |
|
|
Quackdoc
|
2023-11-22 09:15:52
|
This is why I like the fluorescent tube analogy. Of course, the issue with it is not all fluorescent tubes are created equal. But for someone who's seen a really bright, good quality tube, then, you know, roughly what it would be like. |
|
|
qdwang
|
2023-11-22 09:21:16
|
I'm using linear PPM for the source now. Encoding with the `RGB_D65_DCI_Per_HLG` hint can give a nice HDR effect, but the `RGB_D65_202_Per_PeQ` hint makes the image too bright and too saturated on iPhone. Is that normal? I don't know colorspaces quite well. |
|
|
damian101
|
2023-11-22 09:22:13
|
if you encode from linear, you must specify linear, cjxl is an image encoder, not filtering software |
|
2023-11-22 09:22:51
|
if you want an HDR image format, you must first create an image in that format, with ffmpeg or vapoursynth or something else |
|
2023-11-22 09:23:25
|
what you're setting is merely metadata |
|
|
Quackdoc
|
2023-11-22 09:23:59
|
tonemapping time, If only converting from SDR to HDR didn't suck. |
|
|
damian101
|
2023-11-22 09:30:18
|
it really shouldn't... everything that can be shown on an SDR screen can be shown on an HDR screen |
|
|
qdwang
|
2023-11-22 09:30:47
|
But I ran a test in darktable. For a 14-bit raw file, with the input profile `linear Rec2020 RGB` and the output profile `HLG P3 RGB`, the exported JXL file looks really nice on HDR devices. Does darktable process the HDR part for me? |
|
|
damian101
|
2023-11-22 09:32:04
|
yes, it apparently exports it as a proper HDR standard... |
|
|
Quackdoc
|
2023-11-22 09:33:36
|
if your input is raw, you will find most good software can do an ok job at exporting to sdr or hdr. OCIO + aces makes a lot of this nice and easy |
|
2023-11-22 09:34:00
|
the issue is when working with already graded images and converting that to a different format |
|
|
qdwang
|
2023-11-22 09:34:05
|
So if I want to achieve the same effect manually, first I need to decode to `linear Rec2020 RGB` data, use ffmpeg to do an intermediate process, and finally encode the result to jxl? |
|
|
damian101
|
2023-11-22 09:35:12
|
first you need to tonemap in some way, most importantly you need to set the peak of the linear curve somewhere |
|
2023-11-22 09:35:50
|
`-x color_space=RGB_D65_DCI_Rel_HLG --intensity_target 1000` would be appropriate for the output image |
|
|
qdwang
|
2023-11-22 09:43:39
|
I thought darktable would do a color conversion before the export,
with the input profile set to `linear Rec2020 RGB` and the output profile set to `HLG P3 RGB`.
Is this process correct?
**before edit**
raw file -> darktable -> decode -> demosaicing -> white_balance adjust -> convert RGB values to linear Rec2020 RGB
**on screen**
edited linear Rec2020 RGB -> convert to HLG P3 RGB -> (convert to 8bit sRGB maybe?) -> screen
**after edit**
edited linear Rec2020 RGB -> convert to HLG P3 RGB -> encode by libjxl with color space metadata set to RGB_D65_DCI_Per_HLG
Am I right? |
|
|
spider-mario
|
2023-11-22 09:55:10
|
I am not fully sure how darktable does the “on screen” part but at least the other two seem correct (except white balance comes before demosaicing) |
|
|
qdwang
|
2023-11-22 10:03:28
|
Thank you for pointing out the mistake. |
|
|
Quackdoc
|
2023-11-22 10:06:06
|
I think DT automatically tries to set color management internally depending on system color management stuff, so if you have a dci-p3 monitor on windows, it should convert to that. how it does it, no idea |
|
|
spider-mario
|
2023-11-22 10:07:25
|
if you are further away, there are also correspondingly more m² of the sun in a given angle, which compensates exactly, AFAICT |
|
2023-11-22 10:07:59
|
if you show a fullscreen white image on your display and then move further away, it still looks as bright within the angle it still subtends, doesn’t it? |
|
|
|
Traneptora
|
2023-11-22 01:23:34
|
no, because it's candela per square meter, not candela per steradian |
|
|
spider-mario
|
2023-11-22 01:24:56
|
if we are talking about the definition of luminance itself then that’s kind of moot since, as you point out, it’s measured where it is measured and viewer distance doesn’t come into it |
|
2023-11-22 01:25:06
|
the monitor is 250 cd/m² regardless of where we are |
|
2023-11-22 01:25:17
|
but I was assuming that the question related to the resulting flux |
|
|
Traneptora
|
2023-11-22 01:25:43
|
well the sun is only 1.6 Gnit from the earth surface |
|
2023-11-22 01:25:54
|
if you go to Neptune it's lower |
|
2023-11-22 01:26:46
|
so in that sense it doesn't compensate exactly |
|
|
spider-mario
|
2023-11-22 01:30:29
|
is it? why would it be? |
|
2023-11-22 01:30:51
|
I can see why illuminance on Neptune would be lower, but the Sun’s luminance is what it is, isn’t it? https://en.wikipedia.org/wiki/File:Photometry_radiometry_units.svg |
|
|
Traneptora
|
2023-11-22 01:32:55
|
the sun's luminous intensity is what it is |
|
2023-11-22 01:33:31
|
if you go further away, you end up with a smaller angle |
|
2023-11-22 01:33:50
|
which doesn't totally compensate for the difference in square meters |
|
2023-11-22 01:34:18
|
one nit is one lumen/sr/m^2 |
|
2023-11-22 01:35:00
|
I guess it depends on the sensor? hm |
|
2023-11-22 01:35:07
|
don't do math when you just wake up |
|
|
w
|
2023-11-22 01:35:41
|
i thought nits don't change since it's what's emitted and lux is what's received |
|
|
lonjil
|
2023-11-22 01:36:08
|
Nits is just the amount of light passing through an area. |
|
|
Traneptora
|
2023-11-22 01:36:15
|
lumens is total quantity of light |
|
2023-11-22 01:36:23
|
lux is lumens per square meter |
|
2023-11-22 01:36:27
|
candelas is lumens per steradian |
|
|
lonjil
|
2023-11-22 01:36:41
|
This could be measured on the surface of a light emitter, or it could be measured further away, giving a different value. |
|
2023-11-22 01:46:32
|
bleh I'm probably wrong |
|
|
spider-mario
|
2023-11-22 02:15:46
|
you can use a camera as a rudimentary luminance meter |
|
2023-11-22 02:16:04
|
when we tried this on <@179701849576833024>’s TV, we got plausible values |
|
2023-11-22 02:16:53
|
and this doesn’t change with distance from the TV, as long as the TV still covers the area of the sensor you’re measuring from |
|
2023-11-22 02:17:27
|
(we were investigating what sort of HDR tone mapping the TV did) |
|
|
Fraetor
|
2023-11-22 02:45:12
|
The difference is the portion of your field of view. The Sun is only about half a degree in size on earth, while a monitor at typical viewing distance might be 30 degrees. Therefore you have a few thousand times the area of monitor that you do of Sun. The Sun must have a much greater luminance than a monitor, given that with only a small area it can light up a lot more than a monitor can. |
|
2023-11-22 02:49:24
|
You can also consider a telescope, which effectively increases the apparent size of whatever you are looking at. If you point a telescope at a monitor it doesn't get much brighter, as it was already filling a good portion of your vision, while if you do the same towards the Sun you will blind yourself. |
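The "few thousand times" ratio above is easy to check: for small angular sizes, apparent (solid) angle scales roughly with the square of the angular diameter. The 30° and 0.5° figures are the ones quoted in the message (the small-angle approximation is crude at 30°, but fine for an order-of-magnitude sketch):

```python
# Ratio of apparent areas ≈ ratio of angular diameters, squared
monitor_deg = 30.0  # monitor at typical viewing distance
sun_deg = 0.5       # the sun as seen from Earth
ratio = (monitor_deg / sun_deg) ** 2
print(ratio)  # 3600.0 — "a few thousand times"
```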
|
|
spider-mario
|
2023-11-22 02:55:24
|
that is relevant for the unfocused illumination of a wide area, but in the eye, similar math as above will apply to the part of the retina that the sun is mapped to by the eye’s lens, right? (but yeah, doing so on a wider area of the retina could compromise heat dissipation and so on) |
|
2023-11-22 02:56:42
|
with telescopes, my understanding is that at least part of the problem is also that a telescope’s entrance pupil is much larger than the eye’s (2-8mm depending on constriction https://www.ncbi.nlm.nih.gov/books/NBK381/#:~:text=The%20normal%20pupil%20size%20in%20adults%20varies%20from%202%20to%204%20mm%20in%20diameter%20in%20bright%20light%20to%204%20to%208%20mm%20in%20the%20dark%2E), so much more light can enter and exit it than into the naked eye (although, in terms of density, this will be partially undone by the enlargement) |
|
|
Fraetor
|
2023-11-22 02:58:25
|
That's true. |
|
|
spider-mario
|
2023-11-22 03:02:11
|
in any case: looking directly at the sun at midday: bad idea; doing so through a telescope, binoculars, etc.: worse idea |
|
2023-11-22 03:02:41
|
actually, part of why I switched from a DSLR to a mirrorless camera was to eliminate that risk |
|
2023-11-22 03:03:31
|
if I accidentally point the camera at the sun while composing, the sensor will take the hit instead of my eyes |
|
|
Demiurge
|
2023-11-22 09:03:59
|
candela per square meter, so division? So a small portion of a large screen has the same amount of nits as the entire screen, assuming the screen is uniformly lit? |
|
2023-11-22 09:04:38
|
Makes sense I guess |
|
2023-11-22 09:05:20
|
The perceived brightness obviously changes a lot depending on the distance it is viewed from, because of the inverse square law |
|
2023-11-22 09:06:11
|
Also because of how our eyes work, the perceived brightness will also change depending on the ambient brightness of the surrounding environment. |
|
|
spider-mario
|
2023-11-22 09:49:27
|
it doesn’t |
|
2023-11-22 09:50:12
|
from any single point, you do get less of the energy because you subtend a lesser angle relative to it, but there are also correspondingly more points in a given angle relative to you |
|
|
Demiurge
|
2023-11-22 10:03:07
|
That's a good point. |
|
2023-11-22 10:05:10
|
If you're a greater distance away, but you are using a larger screen to compensate for that, and the "luminance per area" of the screen is the same, then you'd be getting the same amount of light into your eyes, as long as the edges of the screen subtend the same angle relative to your eyes. |
|
|
qdwang
|
2023-11-23 06:59:39
|
Now I encounter another weird problem. I managed to export a 16-bit ppm in the `displayP3_HLG` colorspace. But after I encode the ppm to jxl with color_space `RGB_D65_DCI_Per_HLG`, it becomes a non-HDR image. Is this normal? If I export the ppm as `displayP3`, then the converted jxl has the HDR effect (but there is a color shift). |
|
2023-11-23 07:01:03
|
I thought there is no color conversion process in jxl encoding, so I should use `displayP3_HLG` on the source and `RGB_D65_DCI_Per_HLG` on export to get the correct HDR effect. |
|
|
Traneptora
|
2023-11-23 07:08:14
|
how is it a "non-HDR" image, exactly? |
|
2023-11-23 07:08:31
|
HLG is display-relative |
|
2023-11-23 07:08:38
|
unlike PQ, which is absolute |
|
2023-11-23 07:08:54
|
how are you viewing the JXL file? |
|
2023-11-23 07:09:06
|
what happens if you view it with plplay or mpv? |
|
2023-11-23 07:09:44
|
for lossy JXL there *is* a color conversion process on encoding. it encodes it into XYB, and you can request any space upon decode. |
|
|
qdwang
|
2023-11-23 07:10:15
|
I only view the JXL on my iPhone XS (which has an HDR display), comparing it to the ProRAW photo. |
|
2023-11-23 07:27:15
|
I cannot take a screenshot to show the effect, it's rendered in SDR mode on the screenshot |
|
2023-11-23 07:44:28
|
it looks like this: if the photo is HDR, the Photos app will use the complete screen luminance range, from 0-1000 nits I think, to display the photo. |
|
2023-11-23 07:45:19
|
So the flash light in this image will be much brighter than the UI white |
|
2023-11-23 07:46:28
|
But in my test, encoding a 16-bit ppm in `displayP3_HLG` with the hint `RGB_D65_DCI_Per_HLG` results in a non-HDR image |
|
2023-11-23 07:49:01
|
on the contrary, encoding a 16-bit ppm in `displayP3` with the hint `RGB_D65_DCI_Per_HLG` can result in an HDR image, but the color will shift. My goal is to take the RAW to an HDR JXL without a color shift. |
|
2023-11-23 07:50:25
|
🥹 I have tried a lot of combinations and none of them produces the right result. |
|
2023-11-23 07:52:44
|
BTW, I use swift to convert ProRAW to PPM with this code `context.createCGImage(ciImage, from: rect, format: .RGBA16, colorSpace: CGColorSpace(name: CGColorSpace.displayP3_HLG))` |
|
|
damian101
|
2023-11-23 07:55:44
|
Did you set intensity_target?
`-x color_space=RGB_D65_DCI_Rel_HLG --intensity_target 1000`
Note that the entire point of HLG images is that they look good when interpreted as SDR. Are you sure your image viewer supports tonemapping from HLG? |
|
|
qdwang
|
2023-11-23 07:56:23
|
no, but I checked the converted jxl with `jxlinfo`, and it shows an intensity_target of 1000 |
|
|
damian101
|
2023-11-23 07:56:54
|
well, there was some weird bug regarding this... lemme find it... |
|
2023-11-23 07:57:23
|
https://discord.com/channels/794206087879852103/848189884614705192/1174109216026415247 |
|
|
qdwang
|
2023-11-23 08:04:53
|
the jxlinfo for the image:
```
JPEG XL image, 3024x4032, lossy, 16-bit RGB
intensity_target: 1000.000000 nits
min_nits: 0.000000
relative_to_max_display: 0
linear_below: 0.000000
Color space: RGB, D65, P3 primaries, HLG transfer function, rendering intent: Perceptual
``` |
|
2023-11-23 08:06:38
|
Actually I also don't know why a `displayP3` RGB with `RGB_D65_DCI_Per_HLG` metadata can trigger the HDR effect. |
|
2023-11-23 08:10:29
|
My colorspace knowledge is a mess😅 |
|
|
spider-mario
|
2023-11-23 08:31:23
|
my current best (but still not very good) guess would be some oddity with the way that `createCGImage` behaves with `colorSpace=displayP3_HLG` |
|
2023-11-23 08:31:50
|
does `identify -verbose` on the corresponding PPM indicate that the whole range of values is used? |
|
|
qdwang
|
2023-11-23 08:38:53
|
Image statistics:
```
Overall:
  min: 1822 (0.0278019)
  max: 59486 (0.907698)
  mean: 10708.5 (0.163402)
  median: 11254 (0.171725)
  standard deviation: 4894.57 (0.0746863)
``` |
|
2023-11-23 08:39:11
|
|
|
|
spider-mario
|
2023-11-23 08:53:56
|
hm, so it should indeed be at least somewhat HDR |
|
|
damian101
|
2023-11-23 08:54:09
|
Display P3 is DCI P3 primaries (DCI) combined with the common D65 white point (D65) and an sRGB transfer curve, which you replaced with HLG. |
|
|
spider-mario
|
2023-11-23 08:54:14
|
that 0.907698 should be around 550 cd/m² on a 1000 cd/m² display if I didn’t mess up my calculations |
|
2023-11-23 08:54:23
|
(assuming it’s (0.907698, 0.907698, 0.907698) or close to that) |
|
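spider-mario's ≈550 cd/m² figure can be sanity-checked with a small HLG calculation. A sketch in Python, assuming the BT.2100 HLG constants and the γ=1.2 OOTF for a 1000-nit display (the function name is just illustrative):

```python
import math

# BT.2100 HLG inverse-OETF constants
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * math.log(4 * A)

def hlg_signal_to_nits(e_prime, peak_nits=1000.0, gamma=1.2):
    """Map an HLG-coded gray value (R=G=B) to display luminance in cd/m2."""
    # inverse OETF: signal -> normalized scene light
    if e_prime <= 0.5:
        scene = (e_prime ** 2) / 3.0
    else:
        scene = (math.exp((e_prime - C) / A) + B) / 12.0
    # OOTF for a gray pixel: Fd = Lw * Ys**gamma
    return peak_nits * scene ** gamma

print(hlg_signal_to_nits(0.907698))  # roughly 550 cd/m2 on a 1000-nit display
```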
|
qdwang
|
2023-11-23 08:59:29
|
I got to sleep. I'll shoot another ProRAW tomorrow and provide the raw file, ppm file and the generated jxl file. Then we can figure out the problem. |
|
2023-11-23 09:00:17
|
Thanks for the explanation. |
|
|
Traneptora
|
2023-11-23 09:33:21
|
this may be a bug in apple's implementation |
|
2023-11-23 09:34:06
|
what if you view it with mpv? |
|
2023-11-23 09:34:08
|
or plplay? |
|
|
spider-mario
|
2023-11-23 10:02:28
|
wait, are you opening the JXL in the Files app? |
|
2023-11-23 10:02:46
|
it doesn't show HDR JXLs as HDR at the moment |
|
2023-11-23 10:02:56
|
the Photos app does |
|
2023-11-23 10:03:49
|
oh, no, you said that the JXL created from the non-HLG PPM _is_ shown as HDR, right? |
|
|
Quackdoc
|
2023-11-24 12:07:12
|
"HDR" isn't technically well defined (or even defined at all, as far as I can tell), but generally it is a matter of transfer. Using any transfer that can represent a large nit range (e.g. HLG) will trigger "HDR" on most monitors |
|
|
qdwang
|
2023-11-24 09:26:17
|
I use Photos App to view the photos |
|
2023-11-24 09:38:50
|
<@604964375924834314> <@401816384109150209> https://drive.google.com/drive/folders/1WdYXb8lrhaMc6OiBjOgnxsERWrnB89FX?usp=sharing |
|
2023-11-24 09:42:39
|
Let me describe the issue again:
Encoding the `displayp3-hlg.ppm` to JXL with `RGB_D65_DCI_Per_HLG` dec hints does not trigger the HDR effect on iOS HDR devices.
The PPM is generated by this swift code: `context.createCGImage(ciImage, from: ciImage.extent, format: .RGBA16, colorSpace: CGColorSpace(name: CGColorSpace.displayP3_HLG))` from the ProRAW file. Other options are all default. |
|
|
spider-mario
|
2023-11-24 09:44:45
|
when I open it in Photos, it does a transition from dark to less dark, suggesting it is in fact HDR |
|
2023-11-24 09:44:46
|
just dark |
|
|
Quackdoc
|
2023-11-24 09:48:25
|
triggering for me on windows too, looks like HLG is sent perfectly fine |
|
|
qdwang
|
2023-11-24 09:49:16
|
One interesting thing is, when you tap the Edit button for the jxl file in the Photos App, it will become brighter. The flash is not as bright as in the raw one, but still brighter than before. |
|
2023-11-24 09:50:25
|
I don't have a windows device with me now. All my description is based on comparing the display effect of the raw and the jxl one. |
|
2023-11-24 09:53:34
|
I uploaded a video in the folder, that will show the difference. |
|
|
spider-mario
|
2023-11-24 09:54:08
|
it seems to me that it becomes dimmer, then brighter again but to the same level |
|
2023-11-24 09:54:13
|
as becomes apparent when you then “Cancel” the edit |
|
|
qdwang
|
2023-11-24 09:57:14
|
I uploaded a video to show the issue. Could you check if it's the same on your device? |
|
|
Quackdoc
|
2023-11-24 09:57:29
|
it's a dark image
```
Average: 57.5217348904
95th: 97.0525076651
99th: 143.320340372
peak: 205.721028847
5th: 11.3425015668
frame: 0
``` |
|
2023-11-24 09:57:40
|
those are nits |
|
2023-11-24 09:58:21
|
well, after using djxl to decode it to png, since my tools don't work with jxl atm |
|
|
qdwang
|
2023-11-24 10:01:06
|
yes, this photo is darker than my previous one.
```
Image statistics:
Overall:
min: 1142 (0.0174258)
max: 49915 (0.761654)
mean: 31857 (0.486106)
median: 33276 (0.507759)
```
But should it be this dark for a `display p3 hlg` photo?
I just want to figure out if the problem comes from the Apple part `(ProRaw -> p3-hlg)`, the JXL part `(p3-hlg -> jxl)`, or my method `(raw -> p3-hlg -> jxl to create an HDR jxl)` |
|
|
Quackdoc
|
2023-11-24 10:02:09
|
you need to grade your footage with the proper intended luminance, then you have to make sure that luminance comes across when applying your transfer |
|
2023-11-24 10:03:25
|
to be clear, your *dump-p3-hlg.ppm* is dim |
|
2023-11-24 10:03:40
|
the jxl is representing that perfectly fine |
|
|
qdwang
|
2023-11-24 10:07:39
|
well, it comes from the default settings for extracting the RGB data in the displayp3-hlg colorspace from ProRAW with Apple's Swift code. The same settings with the displayp3 colorspace export the image with the right luminance. Maybe it's just how the internal decoder in the iOS SDK works.
Glad to know the jxl part is fine, then I just need to tweak the Apple part.
Thank you~ |
|
|
Quackdoc
|
2023-11-24 10:11:01
|
it's worth noting that there really isn't a "right luminance". I can't say what Apple is doing, but typically it is up to the image author to set stuff like that. The tool typically can't tell how bright or dark an image "should" be, since luminance depends on a lot of factors |
|
|
qdwang
|
2023-11-24 11:25:02
|
🥲 I give up. I just cannot reproduce the same HDR effect as the ProRaw displayed in the Photos App without a color shift in JXL. |
|
|
Traneptora
|
2023-11-24 11:51:10
|
try using PQ instead of HLG |
|
|
Quackdoc
|
2023-11-24 12:05:23
|
PQ is for sure good since HLG does have its limitations, but making sure you and your display are set up right goes a long way too, since you might be grading for something entirely different without knowing it |
|
|
Traneptora
|
2023-11-24 12:05:48
|
well grading on a mobile device with the photos app is probably not ideal |
|
|
Quackdoc
|
2023-11-24 12:06:36
|
I don't know squat about Apple's modern ecosystem |
|
2023-11-24 12:06:58
|
also it's worth noting that display p3 and dcip3 have different white points, so that could account for *some* color shift |
|
2023-11-24 12:07:41
|
although I think setting D65 should handle that properly? But yeah, checking that your display setup and export setup are similar is necessary; if you are grading for HDR on an SDR display, for instance, you are in for a bad time |
|
|
Traneptora
|
2023-11-24 12:44:42
|
"DCI" is the libjxl term for P3 primaries |
|
2023-11-24 12:44:53
|
notice they used D65_DCI |
|
|
Quackdoc
|
2023-11-24 12:48:39
|
yeah, I think it should work properly, I just don't like making claims I can't back up with a degree of certainty, and I'm not sure how setting cjxl stuff works. At least one time I was doing some stuff with a proprietary image encoder and thought I was setting the import settings, but instead I was setting an override, and that bit me in the ass |
|
|
qdwang
|
2023-11-24 01:15:22
|
I tried PQ; if I export raw to ppm with `displayP3_PQ`, it outputs a dimmer image with these statistics:
Image statistics:
Overall:
min: 1937 (0.0295567)
max: 38056 (0.580697)
mean: 28455.5 (0.434202) |
|
|
Traneptora
|
2023-11-24 01:15:39
|
very well could be that the image is actually just dim |
|
|
qdwang
|
2023-11-24 01:16:13
|
This is exporting with `displayP3`:
Image statistics:
Overall:
min: 80 (0.00122072)
max: 65535 (1)
mean: 35012 (0.534249) |
|
|
Traneptora
|
2023-11-24 01:16:38
|
sounds like an issue with the exporter |
|
2023-11-24 01:16:50
|
it's not running any kind of peak detection when you export with PQ |
|
|
qdwang
|
2023-11-24 01:17:32
|
so apple's image rendering with PQ and HLG can make RGB data darker |
|
|
Traneptora
|
2023-11-24 01:17:41
|
no? |
|
2023-11-24 01:17:45
|
you're exporting a dark image |
|
2023-11-24 01:17:53
|
from whatever mobile app you're using |
|
2023-11-24 01:18:07
|
and apple's photo viewer is correctly rendering it as a dark image |
|
|
qdwang
|
2023-11-24 01:18:17
|
not an app, it's exported with swift code: `context.createCGImage(ciImage, from: ciImage.extent, format: .RGBA16, colorSpace: CGColorSpace(name: CGColorSpace.displayP3_PQ))` |
|
|
Traneptora
|
2023-11-24 01:18:50
|
well, whatever you are calling runs peak detection when you export as SDR but not as HDR |
|
2023-11-24 01:18:58
|
which is the correct behavior. it sounds like the image itself is just dark |
|
2023-11-24 01:19:21
|
in order to view HDR images on SDR screens you have to run peak detection to do it properly |
|
2023-11-24 01:19:32
|
that's why it *looks* brighter when you do that |
|
2023-11-24 01:19:38
|
but the image itself is probably just not that bright |
|
|
Quackdoc
|
2023-11-24 01:22:02
|
what exactly are these numbers supposed to represent? 16-bit RGB averaged? if so that's not really a good guideline |
|
|
qdwang
|
2023-11-24 01:22:09
|
I'm viewing HDR images on an HDR display (iPhone XS) |
|
|
Traneptora
|
2023-11-24 01:22:34
|
but if you export it as SDR then it's not an HDR image |
|
2023-11-24 01:22:49
|
and the export process will run peak detection |
|
|
Quackdoc
|
2023-11-24 01:23:19
|
I'm thinking it's just exporting the raw data as is, to PQ/HLG |
|
|
qdwang
|
2023-11-24 01:23:25
|
How can a PPM be HDR? I just use the RGB values in the PPM and encode it with the dec hint |
|
|
Traneptora
|
2023-11-24 01:23:43
|
same way a PNG can be HDR |
|
2023-11-24 01:23:55
|
you have pixel data with a color profile |
|
2023-11-24 01:24:06
|
in this case the color profile is an enum space |
|
2023-11-24 01:24:09
|
not an ICC profile |
|
|
w
|
2023-11-24 01:25:12
|
proof hdr is fail |
|
|
Traneptora
|
2023-11-24 01:25:33
|
If you export the raw camera data to an SDR image (e.g. Display P3) then it will run peak detection, and thus it will look bright on your screen
if you export the raw camera data to an HDR image, no peak detection will be run, and the original image (which is dark) will display |
|
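The peak detection Traneptora describes boils down to rescaling linear-light values so the image's own peak lands at SDR white. A minimal illustrative sketch in Python (not Apple's actual pipeline):

```python
def peak_normalize(linear_pixels):
    """Rescale linear-light samples so the brightest one maps to 1.0 (SDR white).

    This is why a dark HDR capture looks brighter after an SDR export:
    the whole image is scaled up until its peak fills the SDR range.
    """
    peak = max(linear_pixels)
    if peak <= 0:
        return list(linear_pixels)
    return [v / peak for v in linear_pixels]

# a dim capture peaking at ~0.58 of full range fills the range after export
print(peak_normalize([0.02, 0.43, 0.58]))
```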
|
qdwang
|
2023-11-24 01:26:21
|
Do you mean peak detection during the libjxl encoding process? |
|
|
Traneptora
|
2023-11-24 01:26:45
|
no, I mean peak detection, and libjxl does not perform peak detection |
|
2023-11-24 01:27:03
|
I'm saying that whatever API you are calling from swift is running peak detection in order to export HDR data as SDR |
|
|
qdwang
|
2023-11-24 01:27:24
|
ok, i get it |
|
|
Quackdoc
|
2023-11-24 01:29:58
|
are you sure it's not just a grading for SDR and doing a dumb tonemap? or grading for "graphics white" on HDR?
the PQ image has a max of 0.58, which roughly corresponds to the 203-nit area. |
|
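The 0.58 ↔ 203 nits correspondence drops straight out of the SMPTE ST 2084 (PQ) inverse EOTF. A quick Python check with the standard constants:

```python
def pq_encode(nits):
    """SMPTE ST 2084 inverse EOTF: absolute luminance in cd/m2 -> PQ signal."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    y = nits / 10000.0  # PQ covers 0..10,000 cd/m2
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

# SDR "graphics white" at 203 nits sits near a 0.58 code value
print(round(pq_encode(203), 2))  # ~0.58
```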
|
qdwang
|
2023-11-24 01:34:09
|
I don't know why Apple's export with the PQ transfer function can be so dim. If you look at the photo, there is a flashlight in it; it should reach the highest level. |
|
2023-11-24 01:34:36
|
You guys can check the RAW file here: https://drive.google.com/drive/folders/1WdYXb8lrhaMc6OiBjOgnxsERWrnB89FX?usp=sharing |
|
2023-11-24 01:37:48
|
"are you sure it's not just a grading for SDR and doing a dumb tonemap? or grading for "graphics white" on HDR?": I have only one goal, which is to create a 16-bit JXL file with the same HDR display effect as the ProRAW file in iPhone's Photos App. |
|
|
Quackdoc
|
2023-11-24 01:40:29
|
this is the difference between relative and absolute transfers. your editing is likely dealing in a colorspace that uses a relative transfer like sRGB, gamma 2.2, 2.4, etc., so when a pixel has a value of (1, 1, 1) it literally means "blast the light as bright as possible"
however, things like PQ use an absolute luminance range. we don't use 50% power, 25% power; we use 100 nits, 1000 nits, etc. so imagine an sRGB image: you typically see averages around 80% or greater, because sRGB is intended to be graded for 80-nit displays. if you were to dumbly display that as PQ, every image would be 2000+ nits. |
|
2023-11-24 01:41:01
|
because of this we map SDR white to a somewhat decent approximation in the absolute nit range, agreed upon as being around 203 nits. |
|
2023-11-24 01:42:43
|
ofc we don't live in a perfect world where that always makes sense, because people are complicated; because of this, when you do a dumb mapping like that, it often winds up looking dim |
|
2023-11-24 01:43:44
|
now, how do you get that working in iPhone Photos? no idea. There may be app settings that need to be changed, or maybe the Photos app doesn't support it |
|
|
qdwang
|
2023-11-24 01:44:24
|
I get your point. So you mean, if I export raw to ppm with a PQ transfer, it should be absolute, so the generated RGB values represent the photo's lightness. Right? |
|
|
Quackdoc
|
2023-11-24 01:46:30
|
you should likely export the picture with the transfer you want applied to it from the Photos app. there should hopefully be a setting for that |
|
|
qdwang
|
|
CrushedAsian255
|
2023-11-27 03:39:03
|
would there be a way to use JPEG XL as a intraframe video codec? like mjpeg / mj2? |
|
2023-11-27 03:39:14
|
as it has significantly better efficiency than JPEG |
|
|
Quackdoc
|
2023-11-27 03:42:35
|
I was using JXL as a mezzanine format; with a simple patch you can get ffmpeg to mux/demux JXL images into a container via RIFF tags. Right now it seems weirdly buggy since ffmpeg will butcher the timecodes for some reason, but one could dink around with it with this patch.
I have yet to look into actually fixing the issue since I haven't done any semi-serious video editing for a while
```diff
diff --git a/libavformat/riff.c b/libavformat/riff.c
index df7e9df31b..16e37fb557 100644
--- a/libavformat/riff.c
+++ b/libavformat/riff.c
@@ -34,6 +34,7 @@
* files use it as well.
*/
const AVCodecTag ff_codec_bmp_tags[] = {
+ { AV_CODEC_ID_JPEGXL, MKTAG('J', 'X', 'L', ' ') },
{ AV_CODEC_ID_H264, MKTAG('H', '2', '6', '4') },
{ AV_CODEC_ID_H264, MKTAG('h', '2', '6', '4') },
{ AV_CODEC_ID_H264, MKTAG('X', '2', '6', '4') },
``` |
|
2023-11-27 03:46:03
|
as for how fast it is, Lossless is pretty slow, but I was getting this with lossy, it was e5 d2 something like that https://cdn.discordapp.com/attachments/549650852839686153/1087020829969227926/image.png |
|
|
diskorduser
|
2023-11-27 06:43:25
|
Does it encode jxl video with this simple patch? |
|
|
Quackdoc
|
2023-11-27 06:44:01
|
yes, it treats JXL as a simple RIFF-compatible format, so it will encode image sequences into a container that supports them (MKV, NUT, MP4, etc.) |
|
|
diskorduser
|
2023-11-27 06:44:52
|
nice |
|
2023-11-27 06:45:25
|
do you have any windows ffmpeg build with this patch? |
|
|
Quackdoc
|
2023-11-27 06:48:48
|
not offhand, but it shouldn't be too hard to spin one up soonish |
|
|
qdwang
|
2023-11-27 02:08:41
|
`jxlinfo` cannot show whether a JXL is progressively encoded or not. Is there any method to check? |
|
|
jonnyawsom3
|
2023-11-27 08:45:27
|
Technically all JXL images are progressive, the progressive flag just splits it into more progressive layers |
|
|
CrushedAsian255
|
2023-11-28 02:02:58
|
is there a github repo that i can compile? |
|
|
Quackdoc
|
2023-11-28 02:33:47
|
no, it's just a simple patch; you could compile via mingw for ease. I have a PKGBUILD for it... somewhere |
|
|
CrushedAsian255
|
2023-11-28 02:40:53
|
how did you get mpv to read it? |
|
2023-11-28 02:40:57
|
or did it just work? |
|
2023-11-28 02:41:29
|
does this just give JPEG XL a FourCC code? |
|
|
Quackdoc
|
2023-11-28 02:57:50
|
pretty much, mpv will work when it's compiled against an ffmpeg that works |
|
|
Demiurge
|
2023-11-28 09:41:09
|
You sure? Sometimes it looks like it loads in entire blocks at a time, left to right, then top to bottom. |
|
|
qdwang
|
2023-11-28 09:41:14
|
Thanks. There are 3 progressive flags for VarDCT in libjxl; will `-p` enable all three flags or just some of them? |
|
2023-11-28 09:41:44
|
They are `JXL_ENC_FRAME_SETTING_PROGRESSIVE_AC`, `JXL_ENC_FRAME_SETTING_QPROGRESSIVE_AC`, `JXL_ENC_FRAME_SETTING_PROGRESSIVE_DC` |
|
|
Demiurge
|
2023-11-28 09:42:05
|
Like in lossless mode for example, that doesn't look progressive at all, without additional settings |
|
|
w
|
2023-11-28 09:42:32
|
blocks are progressive |
|
|
qdwang
|
2023-11-28 09:43:17
|
I checked the documentation; it says **This means there is always a basic progressive preview available in VarDCT mode.** |
|
2023-11-28 09:47:17
|
https://github.com/libjxl/libjxl/blob/main/doc/format_overview.md?plain=1#L191 |
|
|
Demiurge
|
2023-11-28 09:51:37
|
well, lossless doesn't use vardct. |
|
2023-11-28 09:51:56
|
So maybe that's why it doesn't look like it's loading progressively. |
|
2023-11-28 09:52:33
|
Try loading a lossless JXL here on this demo page: https://google.github.io/attention-center/ |
|
2023-11-28 09:53:10
|
It loads from top to bottom, left to right. |
|
|
qdwang
|
2023-11-28 09:53:14
|
But I've tried a test with libjxl, all default settings but with modular encoding, and progressive is enabled by default. If I set progressive to 0, it generates a larger file. (The test is lossy modular.) |
|
|
Demiurge
|
2023-11-28 09:53:42
|
But that's just the default order. The blocks can be re-ordered to load in any order, and there's even a flag to do middle-out ordering |
|
2023-11-28 09:54:45
|
This is just what I've heard, but: lossy modular uses some sort of DWT algorithm that lends itself well to progressive decoding. |
|
|
qdwang
|
2023-11-28 09:55:17
|
I didn't know that, thanks for mentioning this. |
|
|
Demiurge
|
2023-11-28 09:55:18
|
I could be wrong but I think it uses DWT instead of DCT |
|
|
Quackdoc
|
2023-11-28 09:55:20
|
progressive lossless hurts right now T.T |
|
|
Demiurge
|
2023-11-28 09:55:44
|
lossless modular uses neither DCT nor DWT |
|
2023-11-28 09:56:11
|
It's pretty cool that the image format has so many different schemes built into it |
|
2023-11-28 09:57:24
|
the developers seem to call it "squeeze" and I think it's a DWT algorithm |
|
2023-11-28 09:57:38
|
But it's a little bit outside my knowledge. |
|
|
Quackdoc
|
2023-11-28 09:57:53
|
I loaded the 35x windwaker image that was transcoded to jxl and it almost crashed my browser xD |
|
|
qdwang
|
2023-11-28 09:58:11
|
Great tool... if `djxl` could have this `X% bytes` parameter.. |
|
|
Demiurge
|
2023-11-28 09:59:56
|
cool! where can I find that image? |
|
2023-11-28 10:00:09
|
By the way, that demo page works with any image format your browser supports |
|
2023-11-28 10:00:19
|
It should work with PNG, GIF, JPEG, anything |
|
2023-11-28 10:00:28
|
so long as your browser can handle it |
|
|
qdwang
|
2023-11-28 10:02:51
|
It doesn't work in iOS 17. Apple's JXL support is just a half-finished integration with libjxl. |
|
|
Quackdoc
|
2023-11-28 10:02:56
|
wind waker image: 23164x18480
https://cdn.discordapp.com/attachments/673202643916816384/1081264925659365436/windwaker35xnative.jxl |
|
|
qdwang
|
2023-11-28 10:06:49
|
No matter VarDCT or Modular, Safari in iOS 17 needs a 100% loaded JXL to display. |
|
|
Demiurge
|
2023-11-28 10:07:35
|
That's unfortunate. What if you try other image formats? |
|
|
qdwang
|
2023-11-28 10:09:23
|
Baseline JPEG can display line by line without a problem. I don't have other formats on my device.. |
|
|
Demiurge
|
2023-11-28 10:13:28
|
Wow. It's a ridiculously massive image file that has JPEG reconstruction data |
|
|
Quackdoc
|
|
Demiurge
|
2023-11-28 10:18:37
|
And I just benchmarked and tested: djxl decodes it faster than djpeg |
|
2023-11-28 10:18:50
|
but uses more CPU time since it's decoding on 4 cores |
|
2023-11-28 10:19:22
|
```
$ time djpeg windwaker35xnative.jpg >/dev/null; time djxl --disable_output windwaker35xnative.jxl
real 0m4.266s
user 0m3.862s
sys 0m0.241s
JPEG XL decoder v0.8.2 954b4607 [SSE4,SSSE3,SSE2]
Read 14744762 compressed bytes.
Decoded to pixels.
23164 x 18480, 174.41 MP/s [174.41, 174.41], 1 reps, 4 threads.
real 0m2.713s
user 0m6.750s
sys 0m1.324s
``` |
|
|
Quackdoc
|
2023-11-28 10:23:40
|
it is quite the large file. It's my testing baby; the original image was a jpeg from the Dolphin emulator's showcase at an arbitrary render res |
|
|
Traneptora
|
2023-11-28 01:59:25
|
I'm considering writing a tool to report the structure of a JXL file and possibly tweak it |
|
2023-11-28 02:00:57
|
like, report frame boundaries, add or remove part2 container, etc. |
|
|
yurume
|
2023-11-29 05:28:49
|
deep into the bitstream level, right? |
|
2023-11-29 05:29:08
|
I once pondered about a textual representation of JXL for the purpose of testing |
|
|
Tirr
|
2023-11-29 05:31:45
|
jxl-oxide CLI has some reporting features (`jxl-info --all-frames --with-offset`) |
|
|
yurume
|
2023-11-29 05:34:53
|
though parsing back is a lot harder |
|
|
Tirr
|
2023-11-29 05:36:39
|
yep, modifying the bitstream is a whole different problem |
|
2023-11-29 05:37:09
|
jxl-oxide cannot entropy encode |
|
|
|
gbetter
|
2023-11-29 09:27:55
|
I'd appreciate that, I've been looking for the jxl equivalent of this: https://www.nayuki.io/page/png-file-chunk-inspector |
|
|
_wb_
|
2023-11-29 01:15:55
|
<@604964375924834314> I got a new macbook this week, one with the nice 1600 nits XDR screen. In Chrome I can see HDR avif images fine (and in Thorium also HDR jxl images), but in Safari it looks like all HDR images (both avif and jxl) are tone mapped and rendered in SDR? Is that expected? |
|
|
spider-mario
|
2023-11-29 01:16:36
|
depends on what you mean by “expected” |
|
2023-11-29 01:16:45
|
it matches the behaviour I've observed |
|
2023-11-29 01:17:11
|
in that sense, it's expected |
|
|
_wb_
|
2023-11-29 01:25:47
|
also HDR jxl images look HDR in the Photos app (and also in Adobe Camera Raw) but not in the Preview app |
|
2023-11-29 01:36:27
|
do you know if Safari is planning to make HDR images work? |
|
2023-11-29 01:38:00
|
because at the moment the only out-of-the-box way to see HDR images on the web seems to be using Chrome with AVIF, PNG, or JPEG+gainmap |
|
|
Traneptora
|
2023-11-29 04:46:55
|
hevc mp4 video with one frame with hdr in <video> works fwiw |
|
2023-11-29 04:47:13
|
on safari iirc |
|
2023-11-29 04:47:37
|
but in <img> that sounds correct |
|
2023-11-29 04:48:35
|
Thorium with JXL appears to do HDR correctly |
|
2023-11-29 04:49:04
|
Mercury with JXL reinterpret-casts the bt2020pq pixel data as sRGB |
|
2023-11-29 07:17:04
|
https://github.com/Alex313031/Mercury/issues/51 <@604964375924834314> |
|
2023-11-29 07:22:53
|
interesting, how does this compare to umbrielpng? |
|
2023-11-29 07:23:21
|
https://github.com/Traneptora/umbrielpng |
|
|
gb82
|
2023-11-30 05:00:51
|
that's what I'm experiencing as well |
|
|
Quackdoc
|
2023-11-30 09:22:39
|
interesting, seems like djxl is much better at converting + tonemapping sRGB to HLG than it is at converting to PQ. |
|
2023-11-30 09:31:23
|
seems to be some colorshift still tho |
|
|
spider-mario
|
2023-11-30 10:18:22
|
sRGB _to_ HLG / PQ? (I ask because I’m not sure that this direction is usually called tone mapping) |
|
|
Quackdoc
|
2023-11-30 10:23:08
|
indeed, it's more often called inverse tone mapping |
|
|
spider-mario
|
2023-11-30 11:18:26
|
if you have a 255 cd/m² sRGB image, djxl will map this to a PQ signal that reproduces the same 0-255 cd/m² as-is, but will apply the inverse OOTF (γ=1/0.97) when converting to HLG |
|
2023-11-30 11:19:53
|
so if that HLG image is then displayed at 1000 cd/m², with the γ=1.2 of the OOTF, the overall gamma when going from 0-255 to 0-1000 will be 1.2/0.97 ≈ 1.24 |
|
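The two exponents spider-mario describes compose directly. A tiny Python check (γ values taken from his numbers):

```python
inverse_ootf_gamma = 1 / 0.97  # applied by djxl when converting sRGB to HLG
display_ootf_gamma = 1.2       # HLG OOTF at a 1000 cd/m2 peak
# the overall gamma from 0-255 to 0-1000 is the product of the two
overall = display_ootf_gamma * inverse_ootf_gamma
print(round(overall, 2))  # 1.24
```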
|
Quackdoc
|
2023-11-30 11:46:48
|
ah interesting |
|
|
Traneptora
|
2023-11-30 02:13:33
|
PQ typically requires peak detection for sanely displaying it on a non-HDR monitor |
|
2023-11-30 02:13:39
|
libjxl leaves this up to the client |
|
2023-11-30 02:14:07
|
if you decode a PQ-tagged jpegxl file with jxlatte to sRGB, it will run this peak detection |
|
2023-11-30 02:14:17
|
if you decode it to PQ it just attaches a PQ icc profile and sends it downstream |
|
2023-11-30 02:14:34
|
but libjxl doesn't do any kind of peak detection, as that isn't really its job |
|
|
Quackdoc
|
2023-11-30 02:26:29
|
ah, what I'm doing is taking an sRGB image and encoding it into a JXL, then decoding it as PQ, since I wanted to test the inverse tonemapping. I do have an HDR monitor so that's not an issue.
also, it's worth noting that using peak detection for tone mapping is good with a large chunk of HDR content, but I find it will ruin really well-mastered content, as it doesn't properly preserve scene-by-scene variance. |
|
2023-11-30 02:28:06
|
But I'm mainly just investigating the viability of mastering with the intent of converting to an XYB JXL for distribution, so that you can get the gamut-mapping benefits that XYB JXL has. |
|
2023-11-30 02:28:34
|
if only one could do full lossless xyb encoding |
|
|
spider-mario
|
2023-11-30 02:33:52
|
by the way, libjxl has a tool (needs `cmake -DJPEGXL_ENABLE_DEVTOOLS=ON`) to convert HLG to PQ |
|
2023-11-30 02:33:57
|
(and vice versa with another tool) |
|
2023-11-30 02:34:24
|
usage looks like this: `tools/render_hlg --target_nits=1000 --pq hlg-input.png pq-output.png` |
|
2023-11-30 02:34:53
|
actually, two tools for the other direction |
|
2023-11-30 02:36:02
|
`tools/pq_to_hlg --max_nits=4000 pq-input.png hlg-output.png` first tone-maps from 4000 to 1000 cd/m², then applies the inverse HLG EOTF for a 1000-cd/m² peak (so γ=1/1.2 then OETF) |
|
2023-11-30 02:36:36
|
(that’s the method recommended in BT.2408) |
|
2023-11-30 02:37:37
|
whereas `tools/display_to_hlg --max_nits=4000 pq-input.png hlg-output.png` applies the inverse HLG EOTF directly (where the inverse OOTF for a 4000-cd/m² peak would be γ=1/1.49) |
|
2023-11-30 02:38:15
|
(which treats the PQ image as if it was the result of displaying an HLG image on a 4000-nit monitor) |
|
2023-11-30 02:39:09
|
`pq_to_hlg` should be the default choice for pq->hlg
(BT.2408:
> it is desirable that conversion between PQ and HLG should take place using the same reference peak displayed luminance for the signals used in the conversion. There is currently an industry consensus that this common peak luminance should be 1 000 cd/m².
) |
|
|
Quackdoc
|
2023-11-30 04:00:16
|
that could be pretty nifty to have actually |
|
|
spider-mario
|
2023-11-30 04:09:37
|
oh, and `display_to_hlg` on an sRGB image should do what you are currently observing with djxl |
|
2023-11-30 04:09:53
|
(without having to go through jxl) |
|
|
Orum
|
2023-12-01 09:12:59
|
is JXL's ISO noise decoding non-deterministic? |
|
2023-12-01 09:14:07
|
...or does the spec even specify? |
|
|
Tirr
|
2023-12-01 09:14:34
|
it's specified and deterministic |
|
|
Orum
|
2023-12-01 09:22:37
|
interesting... 🤔 |
|
|
monad
|
2023-12-01 10:00:14
|
which doesn't imply anything about some particular implementation https://github.com/libjxl/libjxl/pull/2825 |
|
|
Tirr
|
2023-12-01 10:04:33
|
there might be bugs, yeah |
|
2023-12-01 10:06:19
|
if the noise from the same jxl file is nondeterministic, it's a bug |
|
|
Orum
|
2023-12-01 10:55:01
|
yeah, mpv is showing some weirdness when opening a jxl with ISO noise |
|
|
Traneptora
|
2023-12-01 03:34:32
|
define "weirdness" |
|
|
Orum
|
2023-12-01 11:45:09
|
this: https://cdn.discordapp.com/attachments/673202643916816384/1180072683640205382/2023-12-01_14-06-42.mkv?ex=657c1754&is=6569a254&hm=fe73e335fae4381051be478cc36062d991b3c10d5f1be0131756f72886e5884f& |
|
|
Quackdoc
|
2023-12-02 01:36:51
|
Can JXL losslessly transcode an image as XYB and have the xyb encoded tag? I see that cjxl doesn't support this. Wondering if this is just a "very low priority" thing or if jxl prohibits this. |
|
|
Traneptora
|
2023-12-02 02:46:43
|
you mean like lossless jpeg transcoding of an xyb jpeg? |
|
|
Quackdoc
|
2023-12-02 02:51:41
|
ah, I mean losslessly encoding an XYB input frame. The general idea I'm thinking of would be for image authoring software to send an XYB-formatted frame directly to libjxl and losslessly encode it, with the goal of being able to author large-gamut images and rely on a jxl decoder for gamut mapping |
|
|
Traneptora
|
2023-12-02 02:53:34
|
atm cjxl doesn't support XYB input |
|
2023-12-02 02:54:03
|
but xyb isn't "lossless" as it appears in jxl tbh |
|
2023-12-02 02:54:32
|
authoring software is unlikely to work in XYB anyway |
|
|
Quackdoc
|
2023-12-02 03:00:39
|
I am aware that format conversions can incur a degree of loss, but the end goal of this would be to minimize loss as much as possible. Instead of exporting to something like rec2020 or p3 or whatever, and then encoding that to XYB, it could be nice to go right from the working space, say something really out there like ACEScg, directly to XYB, since XYB is just really nice to have for colorspace conversion features. |
|
2023-12-02 03:01:32
|
I am also aware that "lossless" for distribution doesn't always, or even often make sense, but it is a nice point to bring up when asking people to implement |
|
|
Traneptora
|
2023-12-02 03:08:31
|
right but XYB isn't something that authoring software works in |
|
2023-12-02 03:08:35
|
it works in RGB typically |
|
2023-12-02 03:09:04
|
the authoring software would first need to run an XYB conversion and feed that, might as well let cjxl handle it |
|
|
Quackdoc
|
2023-12-02 03:15:31
|
Indeed, I myself don't care where the transform happens, but it would be nice to be able to manage it in existing color-managed pipelines. |
|
|
Traneptora
|
2023-12-02 06:39:12
|
but no, jxl codestream format does not prohibit this |
|
2023-12-02 06:39:36
|
In fact it doesn't specify encoding at all |
|
2023-12-02 06:39:58
|
it only specifies how to decode legal codestreams |
|
2023-12-02 06:40:52
|
encoding every file to a black rectangle, strictly speaking, is a compliant way to encode. just very unhelpful |
|
|
Quackdoc
|
2023-12-02 08:01:37
|
I see, good to know |
|
|
Pieter
|
2023-12-02 09:14:30
|
It has a pretty amazing compression ratio. |
|
|
Traneptora
|
2023-12-02 11:21:06
|
that it would |
|
2023-12-02 11:22:00
|
Speaking of compression ratio, I need to make the libhydrium entropy encoder more intelligent |
|
|
|
veluca
|
2023-12-02 11:28:00
|
what specifically about it? |
|
|
Traneptora
|
2023-12-03 01:04:32
|
better lz77 detection |
|
2023-12-03 01:04:35
|
is one |
|
|
|
veluca
|
2023-12-03 01:04:53
|
Ahhhh yeah that one is always fun |
|
2023-12-03 01:04:57
|
What do you do there? |
|
|
Traneptora
|
|
|
veluca
|
2023-12-03 01:05:17
|
I found it surprisingly hard to do something performance-competitive in that context |
|
2023-12-03 01:05:27
|
But then again it was 3 years ago, who knows now |
|
2023-12-03 01:05:56
|
It's easy to do fast RLE though 😉 |
|
|
Traneptora
|
2023-12-03 01:06:15
|
it does RLE only, and only if it's explicitly enabled on stream init |
|
2023-12-03 01:06:38
|
I'm thinking of making the entropy encoder autodetect it |
|
|
|
veluca
|
2023-12-03 01:06:47
|
Rle should be fast enough that it should be fine |
|
|
Traneptora
|
2023-12-03 01:11:47
|
how does fpnge do it wrt zlib ? |
|
|
|
veluca
|
2023-12-03 01:12:31
|
It does rle only 😛 |
|
|
Traneptora
|
2023-12-03 01:12:37
|
ah |
|
2023-12-03 01:12:52
|
with a ton of zeroes it gets better tho |
|
|
|
veluca
|
2023-12-03 01:13:03
|
If I wanted to do something really fast, I'd probably take a fast hash of 4-8 bytes |
|
|
Traneptora
|
2023-12-03 01:13:28
|
with a LUT? |
|
|
|
veluca
|
2023-12-03 01:13:40
|
Something like that yeah |
|
|
Traneptora
|
2023-12-03 01:13:44
|
how would you make that work |
|
|
|
veluca
|
2023-12-03 01:14:05
|
Probably, there's very little gain in checking hashes at anything but the start of a pixel |
|
|
Traneptora
|
2023-12-03 01:14:52
|
for a fast hash of 8 bytes, why not just cast to uint64_t |
|
|
|
veluca
|
2023-12-03 01:15:03
|
Also, I could cheat and assume that if there's an interesting copy to be done, it's probably at a row a little above (+- few pixels) |
|
2023-12-03 01:15:34
|
Well, because then you'd probably put it in a hash table by doing % table size, and that would be bad |
|
|
Traneptora
|
2023-12-03 01:16:20
|
oh, make the table size 256 and do `& 0xFF` or something |
|
|
|
veluca
|
2023-12-03 01:16:31
|
But cast to u64 + a few steps of something like boost's hash combine should do it |
|
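veluca's suggestion (cast 8 bytes to a u64, mix with a multiply, and index a power-of-two table with the high bits instead of `%`) might look like this Python sketch; the names are illustrative, not libhydrium or fpnge code:

```python
def match_hash(window: bytes, table_bits: int = 15) -> int:
    """Hash the next 8 bytes into a 2**table_bits-entry LZ77 match table.

    A 64-bit multiply by a large odd constant mixes every input byte into
    the high bits, so taking the top bits (rather than % table_size) gives
    a well-distributed index and avoids the permutation-invariance of
    OR/XOR-folding the bytes together.
    """
    x = int.from_bytes(window[:8].ljust(8, b"\0"), "little")
    h = (x * 0x9E3779B97F4A7C15) & 0xFFFFFFFFFFFFFFFF  # Fibonacci hashing
    return h >> (64 - table_bits)

# deterministic, and always a valid table index
print(match_hash(b"ABCDEFGH"))
```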
|
Traneptora
|
2023-12-03 01:16:37
|
oh but then it's not a hash, hm. |
|
|
|
veluca
|
2023-12-03 01:17:00
|
Or one of the fast PRNGs |
|
2023-12-03 01:17:05
|
Say, xoshiro |
|
|
Traneptora
|
2023-12-03 01:17:12
|
seems unnecessary |
|
|
|
veluca
|
2023-12-03 01:17:15
|
Use that as a seed and produce a random number |
|
2023-12-03 01:17:25
|
Eh, it's just a few cycles |
|
2023-12-03 01:17:46
|
You can also multiply by some constant |
|
2023-12-03 01:18:21
|
But I think the biggest gains come from not checking at every byte, but just every pixel (or even every few pixels; if your match is long enough you'll find it anyway) |
|
2023-12-03 01:18:31
|
(for PNG) |
|
|
Traneptora
|
2023-12-03 01:19:03
|
`(x | (x >> 8) | (x >> 16) | (x >> 24)) & 0xFF` |
|
|
|
veluca
|
2023-12-03 01:19:12
|
Eh, kinda sucks |
|
|
Traneptora
|
|
|
veluca
|
2023-12-03 01:19:28
|
It's permutation invariant, for one |
|
|
Traneptora
|
2023-12-03 01:19:56
|
well the goal is speed, not prng |
|
|
|
veluca
|
2023-12-03 01:20:00
|
Also, a whole lot of things will result in 0xFF |
|
|
Traneptora
|
|
|
veluca
|
2023-12-03 01:20:14
|
Do ^ instead of |, at least |
|
|
Traneptora
|
2023-12-03 01:20:23
|
oh, that's what I meant |
|
2023-12-03 01:20:28
|
xor |
|