x264 10-Bit: encode 10 Bit, output 8 Bit?
Hello!
I have recently been experimenting with the x264 and x265 10-Bit libraries.
My standard encoding setting used to be:
x264 (regular 8-Bit)
CRF16
slow
tune film
I now tested the same settings with the x264 10-Bit library.
Result:
- smaller filesize
- smoother gradations (less banding)
On paper, this is a win/win.
However, there is basically no hardware out there that supports x264 10-Bit decoding.
Since my Windows setup only outputs the 10-Bit file as 8-Bit and it still looks better, I was wondering if there is a way to encode in 10-Bit but basically have the color depth pre-converted to 8-Bit for output?
I am also asking because I stumbled over this older thread in another forum:
https://forum.doom9.org/showthread.php?t=170236
I don't really understand what's going on there but it looks to me as if someone encoded x265 at 16-Bit but was able to set the output to 8-Bit or 10-Bit?
https://forum.doom9.org/showthread.php? ... ost1679814
Is there a way to do what I am thinking of?
I tried to find an x265 equivalent (because x265 10-Bit can be hardware-decoded by many devices) but if I want x264 CRF16 film quality I need to use the grain tune in x265, which makes encoding even slower and eliminates any filesize savings...
I appreciate any help!
P.S.:
I know that people usually request a log (and I can provide one later if needed).
However, I thought it does not really make sense to post logs due to the generic nature of my question...
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
The log contains some important stuff, like whether your source is 10 bit.
Please don't presume ANYTHING when asking for help ...
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
And the log should also indicate whether or not you've followed the instructions to implement 10- and 12-bit encoding found here: viewtopic.php?f=11&t=34165
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
Right, sorry.
I should have mentioned that.
Source is regular Blu-ray AVC stuff.
So 1080p @ 8-Bit @ BT.709.
I followed the instructions for the libraries.
I started 10-Bit encoding on a nightly build, but the libraries seem to work in 1.0.7 as well (although it says otherwise)...
Logs are coming soon...
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
OK, here are the encode logs - I created fresh ones from a short sample, to make things simpler:
1. "Reference" encode (x264-8 / CRF16 / slow / film):
https://pastebin.com/raw/niH1p0LF
2. x264 10-Bit encode (x264-10 / CRF16 / slow / film):
https://pastebin.com/raw/8mCTb0en
3. x265 10-Bit encode (x265-10 / CRF16 / medium / grain):
https://pastebin.com/raw/iFnTbSMD
I hope you guys can help me out.
I would really love 10-Bit encoding gradation in a compatible 8-Bit package.
Last edited by jd17 on Mon May 22, 2017 5:45 pm, edited 1 time in total.
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
So if you put 8 bit source in a ten bit box, you've added air, that's all.
The question has been put to rest some years back, but still gets blogged.
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
Well, if that were the case, why do I get smaller filesizes and smoother gradations?
I made screenshots (tif) of a frame that is prone to banding, cut out the area in question and increased brightness + contrast to point out the difference, please have a look:
Source:
x264-8 / CRF16 / slow / film:
x265-8 / CRF13 / slow / no tune:
x264-10 / CRF16 / slow / film:
x265-10 / CRF16 / medium / no tune:
I can also see the advantage of 10-Bit encoding outside of that exaggerated enhancement.
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
@jd17 I wouldn't take it personally.
@mduell @musicvid Let's be more careful with tone, please.
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
All else being equal, a true 10 bit upsample would be quite a bit larger, if such a thing were actually taking place. We've run the tests. Politely.
musicvid wrote: ↑Mon May 22, 2017 10:19 am
So if you put 8 bit source in a ten bit box, you've added air, that's all.
jd17 wrote:
Well, if that were the case, why do I get smaller filesizes and smoother gradations?
Your question would be a good one if you have ten bit source.
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
OK, true or untrue - whatever happens during the encode - the results still look better to my eyes.
In both x264 10-Bit and x265 10-Bit + grain.
So is it possible to do what I asked?
Can this untrue 10-Bit encode be packed in an 8-Bit package?
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
No. 8-bit video encoded as 10-bit can actually compress more due to less quantization error. Counter-intuitive since 10 is more than 8, but true nonetheless.
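Not HandBrake-specific, but here is a toy Python sketch (my own illustration, nothing from the encoder source) of why the finer grid helps: padding 8-bit values into 10-bit loses nothing, while every intermediate rounding the encoder performs lands on a grid four times as fine.

```python
# 8-bit -> 10-bit is a lossless left shift: 0..255 maps onto 0..1020 in steps of 4.
def to_10bit(v8):
    return v8 << 2

assert all((to_10bit(v) >> 2) == v for v in range(256))  # round-trips exactly

# Intermediate results (motion compensation, filtering, ...) have to be rounded
# back to the working grid; the finer 10-bit grid rounds with ~1/4 the error.
x = 100.3                              # a fractional intermediate, 8-bit scale
err_8bit = abs(round(x) - x)           # rounding on the 8-bit grid
err_10bit = abs(round(x * 4) / 4 - x)  # rounding on the 10-bit grid, rescaled
print(err_8bit, err_10bit)
```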
You may try one of the denoisers to smooth flat areas a bit.
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
10-bit output depth allows the encoder to use a greater internal precision before/during quantization regardless of input depth, resulting in slight improvements in some areas at the expense of encoding time. It's mostly useful for animation, but some people will like the results even of live action material…
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
Is that an answer to my question?
"You may try one of the denoisers to smooth flat areas a bit."
I have tried that. I want to be as close to the source as possible and those 10-Bit encodes are the best thing I have seen yet.
Denoising always comes at a price in my experience...
I actually prefer some noise, so as not to lose any detail.
The beauty of those 10-Bit encodings is that the smoothing is just visible where it helps: critical gradations.
I see no drawbacks in any detail/focus area with x264 10-Bit film or x265 10-Bit grain...
Last edited by jd17 on Mon May 22, 2017 8:33 pm, edited 1 time in total.
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
My full response was the answer to your question.
There is too much quantization error in 8-bit encoding and too few values to reproduce such gradations smoothly. For any color channel you have only 256 values, or differences between shades. Your source isn't varying colors much in the background there, and it isn't varying lightness much either. So you're trying to express a smooth gradation with only 10-20 values. It simply isn't possible without going to a higher bit depth. Denoising can sometimes rearrange pixels to be slightly more pleasing, but cannot fully alleviate this issue in 8-bit space.
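Just to put numbers on that (a toy calculation using a hypothetical luma span, not data from any encoder): a near-flat region spanning 8-bit luma 110 to 125 has only 16 codes available, while the same physical span on the 10-bit scale has four times the resolution.

```python
# Distinct code values available across a narrow luma span, 8-bit vs 10-bit.
lo, hi = 110, 125                        # a flat-ish gradient region (8-bit codes)
steps_8bit = hi - lo + 1                 # codes available in 8-bit
steps_10bit = (hi << 2) - (lo << 2) + 1  # same physical span in 10-bit codes
print(steps_8bit, steps_10bit)  # 16 vs 61
```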
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
Correction, 220 values for luma and 225 for chroma, given TV range.
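In other words, TV ("limited") range reserves the codes below 16 and above 235 (luma) / 240 (chroma), so the arithmetic is:

```python
# Usable code values per channel in 8-bit TV ("limited") range:
luma_levels = 235 - 16 + 1    # codes 16..235 inclusive
chroma_levels = 240 - 16 + 1  # codes 16..240 inclusive
print(luma_levels, chroma_levels)  # 220 225
```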
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
Yes, thank you - I get that!
But understand where I am coming from...
The 10-Bit video is still just being displayed/output in 8-Bit when I play it in Kodi (both in Windows and AMLogic box).
So a conversion to 8-Bit is happening anyhow.
Is there really no way to define this 8-Bit output beforehand?
What was that guy I linked to (doom9 forum) doing?
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
Well, the upsampling conversion to 10-bit is rearranging the pixels so the dithering pattern changes. That's what you're noticing most, and probably what you're noticing about the other person's encodes.
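To illustrate what "rearranging the dithering pattern" means, here is a toy Python sketch of dithered bit-depth reduction in general (my own example, not x264's actual rounding code): truncating a smooth 10-bit ramp to 8-bit leaves a handful of hard band edges, while adding a little noise before rounding scatters those edges into a fine pattern whose local average still follows the ramp.

```python
import random
random.seed(0)

# A smooth 10-bit ramp (440..456) that spans only five distinct 8-bit codes.
ramp10 = [440 + round(i * 16 / 399) for i in range(400)]

trunc8 = [v >> 2 for v in ramp10]                           # plain truncation
dither8 = [(v + random.randrange(4)) >> 2 for v in ramp10]  # noise before rounding

def transitions(pixels):
    # Count neighbouring pixels whose value changes; few changes = hard bands.
    return sum(1 for a, b in zip(pixels, pixels[1:]) if a != b)

print(transitions(trunc8))   # 4 hard band edges
print(transitions(dither8))  # many small flips instead of bands

# The dithered signal still averages out to the original ramp:
mean_err = abs(sum(dither8) / len(dither8) * 4 - sum(ramp10) / len(ramp10))
print(mean_err)  # close to zero
```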
What most pros do in post is use an NLE like Resolve or Premiere, denoise to remove the dither patterns (which is what we're comparing here), then render grain to simulate any perceptual loss of detail. There are film grain overlays available for free around the web: https://www.google.com/search?q=free+fi ... 8&oe=utf-8
Essentially, this is yet another means for rearranging the micro patterns to look less digital and more natural. When done properly, it should look fine encoded 8-bit. But it requires some work in a video editor and a high quality intermediate render (ProRes or similar) before feeding to HandBrake, so expect to spend some time, CPU, and storage space on it.
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
You may be interested in Stu Maschwitz's pipeline demo, essentially the same as what I mentioned: https://www.youtube.com/watch?v=kWaUow8HM3Q
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
I have not actually looked at the other person's encodings - I was just wondering what he meant in the linked .pdf:
x265 16 bit, 8 bit output
x265 16 bit, 10 bit output
This is how I got the idea that maybe the encoding can be done at a higher bit depth, but be packed (down-converted) into an 8-Bit container eventually, while still offering superior visual quality.
As far as I understood you though, this is not possible?
I guess the least complicated way to get a bit of filesize saving and smoother gradation is to go for x265 10-Bit + grain.
I am actually quite happy with:
x265 10-Bit
CRF17
medium
tune grain
This gives me smoother gradations and essentially no loss of detail compared to x264 8-Bit CRF16 slow, film.
The files are about 80% of the x264 size.
However, encoding time is more than doubled and I sacrifice device compatibility...
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
jd17 wrote:
As far as I understood you though, this is not possible?
Correct. What you are seeing is the upsampling algorithm in the higher bit-depth encoders permanently changing the dithering pattern evident in the gradation. Try encoding to 10-bit as you did before, but use RF 0 (x264's lossless mode), and then re-encode that as 8-bit at a normal RF like 17. I bet some or all of that newly introduced dithering pattern survives, and you've just simulated the "N-bit, 8-bit output" you mention.
Of course, the 8-bit file will be larger than the 10-bit file due to quantization error.
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
I fear you might have misunderstood my intentions a bit, judging from the linked YouTube video
(although I learned something from that too)...
My goal is basically to be as close to the source as possible, let's say I want "visually lossless", while still saving some gigs.
Again in my old standard preset, that used to be x264 8-Bit CRF16 slow, film.
I am just looking for a worthy successor for that preset.
Maybe you also remember my other discussion about audio -> lossless TrueHD / DTS-HD MA to FLAC 16-Bit.
I have the feeling that I am getting quite close to a very good package here, I just wanted to make sure that I use the right parameters to get there.
The help I got in the forum has been amazing! Thank you so much!
So if there are no "objections" from your side I think I'm going with:
Video: x265 10-Bit, CRF17, medium, tune grain
Audio: FLAC 16-Bit (using Hybrid until DTS-HD decoding is implemented in HandBrake)
Last edited by jd17 on Mon May 22, 2017 8:23 pm, edited 1 time in total.
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
BradleyS wrote: ↑Mon May 22, 2017 8:01 pm
Correct. What you are seeing is the upsampling algorithm in the higher bit-depth encoders permanently changing the dithering pattern evident in the gradation. Try encoding to 10-bit as you did before, but use RF 0 (x264's lossless mode), and then re-encode that as 8-bit at a normal RF like 17. I bet some or all of that newly introduced dithering pattern survives, and you've just simulated the "N-bit, 8-bit output" you mention.
Of course, the 8-bit file will be larger than the 10-bit file due to quantization error.
Thanks! I think I'm actually going to try this for fun, just to see where it gets me.
That is obviously neither practical, nor sensible for regular encodings, but I'm curious.
Re: x264 10-Bit: encode 10 Bit, output 8 Bit?
I'd really like to see some tests of those quantization/dithering theories.
The only place I was able to objectify results was with 10 bit and RGB 4:4:4 source.