Okay, so I find it strange that I cannot "stress" the PC I have by utilizing all aspects of the hardware. The attached image shows that nothing, or hardly anything, is being used. What is the deal here internally: is it software or drivers?
One would think, "hey, this PC and the processes I am running should utilize everything available," but what do you know, it does not. So what do I have to do, and who do I have to pay, to get 95% utilization across my hardware so I do not sit here for 7-8 hours hoping nothing goes wrong?
What gives? It feels artificially halted/slowed/hindered. Where are the people who have realized this and are using "alternative means" to stop the BS? PM if necessary, thanks.
Edit: to explain,
How can an engine like the one in 2077 be modded to use real-time rendering and make objects look near-realistic in seconds, while another engine takes 5-8 hours to render a video that is only 10 seconds long?
Where is my misunderstanding in this situation?
Replies
if you're pulling data over a network, everything else is going to be waiting.
if you want to pin every core for shits and giggles, build unreal from source
193 samples = 2 hours at 1280x720 for a 10-second turntable.
Disk went from 41.7 GB free to 39.7 GB = a 2.05 GB lossless AVI.
Guesstimates:
"3K" should be about 3x that: 3h 33m+ render, 6.0 GB+ file.
And a 12 GB to 15 GB file for 6K res (it was actually 18 GB); 12 hours+ (actually 11 hours, measured, not a guess).
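for reference, a quick arithmetic sanity check on those numbers (the exact resolutions behind "3K" and "6K" are my assumptions here, the thread never pins them down):

```python
# hours per megapixel for each run, using the figures above
# NOTE: the "3K"/"6K" resolutions are assumed, not stated in the thread
runs = [
    ("720p (measured)", 1280, 720, 2.0),
    ("3K (guess)",      2880, 1620, 3.55),  # "3h 33m+"
    ("6K (measured)",   5760, 3240, 11.0),
]
for label, w, h, hours in runs:
    mpx = w * h / 1e6  # megapixels per frame
    print(f"{label}: {mpx:.1f} MP in {hours:.2f} h -> {hours / mpx:.2f} h/MP")
```

if render time scaled purely with pixel count, the h/MP figure would stay flat; it drops at the higher resolutions instead, so something else (sample count, scene complexity, fixed per-frame overhead) is dominating.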
win 10?
assuming it's not running at 105 degrees ...
is the encoder running on the efficiency cores? (they'll be the last 4/8/whatever listed in the CPU graph) - those are incredibly slow and deeply unsuited to tasks like video encoding
afaik Win10 still doesn't have proper scheduling for efficiency cores (tasks marked as low priority / background get forced onto them, and that often includes things like C compilers, encoders, etc.) - you could test this theory by disabling them in the BIOS, or by pinning the encoder to the P-cores as in the sketch below
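if you'd rather not touch the BIOS, a rough way to test the same theory is a CPU-affinity tweak. a minimal sketch with psutil, assuming the encoder process is named "ffmpeg" (swap in yours) and that the first 16 logical CPUs are the P-cores (both are placeholders - check Task Manager for your actual layout):

```python
# pin any running encoder process to the performance cores only,
# so the scheduler can't shove it onto the E-cores.
# ASSUMPTIONS: encoder is named "ffmpeg" and logical CPUs 0-15 are
# the P-cores - verify both against your own setup.
import psutil

P_CORES = list(range(16))  # assumed P-core logical CPU indices

for proc in psutil.process_iter(["name"]):
    name = proc.info["name"] or ""
    if "ffmpeg" in name.lower():
        proc.cpu_affinity(P_CORES)  # restrict scheduling to these CPUs
        print(f"pinned {name} (PID {proc.pid}) to P-cores")
```

same idea as Task Manager's "Set affinity", just scriptable - if the encode speeds up noticeably after this, the E-core scheduling theory holds.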