CPU vs Nvidia GPU for pre-encoding at 8Mbps
CPU vs Nvidia GPU for pre-encoding at 8Mbps
I am pre-encoding (correct term?) movies for streaming with Emby from my NAS, to prevent the NAS from having to transcode on the fly. With my current internet speeds, I am encoding them to stay below 8Mbps with AAC audio. I played around a little with using the GPU instead of the CPU to do the encoding, and the speed difference is incredible. Is there going to be a large quality difference if I switch to using the GPU? I am using an Intel 9900K and a GeForce 2080 Ti on Windows 10. I am looking at a list of about 500 more movies and wondering if saving a couple of hours of encoding per movie might be worthwhile, but if it ends up being a poor finished product I will just keep things as they are. I'm not super concerned with file size as long as it isn't a huge difference.
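For reference, the budget math behind a cap like this is simple: subtract the audio bitrate from the total and leave a little headroom for container overhead. The 160 kbps audio figure and 2% overhead below are assumed example values, not settings from this thread:

```python
# Rough video bitrate budget under an 8 Mbps streaming cap.
# 160 kbps AAC and ~2% mux overhead are assumptions; substitute your own.
TOTAL_KBPS = 8000
AUDIO_KBPS = 160
OVERHEAD = 0.02  # container/mux overhead, a rough guess

video_kbps = int((TOTAL_KBPS - AUDIO_KBPS) * (1 - OVERHEAD))
print(video_kbps)  # bitrate to target in the video encoder
```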
Re: CPU vs Nvidia GPU for pre-encoding at 8Mbps
The only way you can tell for sure is to try encoding a few. And not just one - the same settings will give different results depending upon the source.
Is the quality acceptable to YOU? Do YOU notice enough difference to be a factor in your decision?
Certainly, using dedicated encoder hardware is faster than software, but it comes at a price - lower quality or larger file size, and often both.
Re: CPU vs Nvidia GPU for pre-encoding at 8Mbps
Thank you for the input. I thought maybe there might be a general consensus as to how big of a deal the trade off is. I will try a couple and see how it changes.
Re: CPU vs Nvidia GPU for pre-encoding at 8Mbps
Consensus is a difficult thing to obtain on things like this. You have people who count the blocks where the black areas of a scene are not QUITE black, and curmudgeons like me that think if you're looking for compression artifacts, the show is too boring to be worth watching.
When I last experimented with hardware encoding, though, I could tolerate the visual differences and loved the 20x increase in frame rate, but didn't like the 5% increase in file size. So I stuck with CPU encoding.
But you're not me. My opinions on quality are just that, and should not be an influence on your opinion. I'm not the one who needs to be pleased by the results.
That's one of the big things about free tools - it doesn't cost anything but your time (and electricity cost, I guess) to figure out what works for you. You do not have to encode whole movies to make choices; pick a worst-case scene (lots of detail and/or movement), encode that with both encoders and different settings, and decide what works for you.
I consider it an adventure game - you are given unlimited lives, a goal (make something useful), and tools (the programs).
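The clip-first approach above can be sketched with ffmpeg (one common tool for this; HandBrake's CLI works similarly): cut a worst-case scene once, then encode it with both the CPU and NVENC encoders at the same bitrate so only the encoder differs. Filenames, the clip position, and the bitrate below are placeholders, not values from this thread:

```python
# Sketch: build ffmpeg commands to cut a worst-case test clip and encode it
# twice -- CPU (libx264) vs NVENC (h264_nvenc) -- at the same bitrate.
# Filenames, clip offset, and bitrate are example placeholders.
import shutil
import subprocess

def clip_cmd(src, dst, start="00:42:00", length="60"):
    # Cut a short test clip without re-encoding (-c copy).
    return ["ffmpeg", "-ss", start, "-i", src, "-t", length, "-c", "copy", dst]

def encode_cmd(src, dst, encoder, kbps=7800):
    # Same bitrate cap for both encoders so only the encoder differs.
    return ["ffmpeg", "-i", src, "-c:v", encoder,
            "-b:v", f"{kbps}k", "-maxrate", f"{kbps}k", "-bufsize", f"{2 * kbps}k",
            "-c:a", "aac", "-b:a", "160k", dst]

def run_comparison(src="movie.mkv"):
    if shutil.which("ffmpeg") is None:
        raise RuntimeError("ffmpeg not found on PATH")
    subprocess.run(clip_cmd(src, "clip.mkv"), check=True)
    subprocess.run(encode_cmd("clip.mkv", "clip_x264.mkv", "libx264"), check=True)
    subprocess.run(encode_cmd("clip.mkv", "clip_nvenc.mkv", "h264_nvenc"), check=True)
```

Then watch the two outputs back to back and decide which trade-off you can live with.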
Re: CPU vs Nvidia GPU for pre-encoding at 8Mbps
It is generally acknowledged that especially at lower bitrates such as yours, software (CPU) encoding has better results. To these eyes, the difference is sometimes dramatic.
Re: CPU vs Nvidia GPU for pre-encoding at 8Mbps
It's going to depend a lot on the content too, in addition to personal preferences.
8Mbps low-efficiency hardware encode can be fine for a DVD of The Social Network (clean, low motion, etc), and wildly insufficient for 4K Blu-ray of Saving Private Ryan (grain, high motion, etc).
If you could do 100 or 200 Mbps, yeah, sure - there's reasonable consensus that the low-efficiency hardware encoders are fine.
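One way to make "depends on the content" concrete is bits per pixel: the same 8 Mbps spread over a 4K frame leaves roughly 1/24th of the data per pixel that a DVD frame gets, before grain and motion make things worse. A quick back-of-the-envelope calculation (standard resolutions and 24 fps film assumed):

```python
# Bits per pixel at a fixed 8 Mbps for two very different sources.
# Rule of thumb: more detail, grain, and motion demand more bits per pixel.
def bits_per_pixel(kbps, width, height, fps):
    return (kbps * 1000) / (width * height * fps)

dvd = bits_per_pixel(8000, 720, 480, 24)    # NTSC DVD, film frame rate
uhd = bits_per_pixel(8000, 3840, 2160, 24)  # 4K Blu-ray
print(dvd, uhd)  # the DVD gets 24x the bits per pixel
```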