geekhack Community => Off Topic => Topic started by: tp4tissue on Tue, 10 October 2017, 19:33:45
-
Borrowed a buddy's 1050 Ti.. because I got these new UHD Blu-rays I wanted to watch..
So.... basically, the 1050 Ti is on the -edge- of being able to play back the latest HEVC using madVR scaling.. It's teetering on not being able to scale the video at all..
So, for example, if you've got the movie on one monitor and another screen that needs GPU acceleration, it's going to choke up.
The only way around it is to use the more standard player outputs, and they're fine, but drastically reduced in quality vs. madVR output.
As it stands, madVR consumes 40-80% of the 1050 Ti's processing power depending on the variety of HEVC UHD files I've tested.
This is an upper-end 1050 Ti, @ 2050 MHz core, 4507 MHz VRAM. That 80% figure is with the core fully loaded, not at a half multiplier.
I know the 1050 Ti is going to be quite popular this Christmas, so keep this information in mind..
The 1070, and especially the 1080, is the best place to be for the coming 2 years.
-
Personally, I almost always feel that the xx70 model in each Nvidia generation is the sweet spot for price/performance.
At least that's what I always end up getting.
-
Personally, I almost always feel that the xx70 model in each Nvidia generation is the sweet spot for price/performance.
At least that's what I always end up getting.
Mmmm..... depends on what you're doing, sure..
But overall.. the xx80 is more often just a higher bin.. with fewer problems down the road..
-
Borrowed a buddy's 1050 Ti.. because I got these new UHD Blu-rays I wanted to watch..
So.... basically, the 1050 Ti is on the -edge- of being able to play back the latest HEVC using madVR scaling.. It's teetering on not being able to scale the video at all..
So, for example, if you've got the movie on one monitor and another screen that needs GPU acceleration, it's going to choke up.
The only way around it is to use the more standard player outputs, and they're fine, but drastically reduced in quality vs. madVR output.
As it stands, madVR consumes 40-80% of the 1050 Ti's processing power depending on the variety of HEVC files I've tested.
This is an upper-end 1050 Ti, @ 2050 MHz core, 4507 MHz VRAM. That 80% figure is with the core fully loaded, not at a half multiplier.
I know the 1050 Ti is going to be quite popular this Christmas, so keep this information in mind..
The 1070, and especially the 1080, is the best place to be for the coming 2 years.
Dahm, one of my IRL friends literally just ordered one. rip
-
Dahm, one of my IRL friends literally just ordered one. rip
It's probably fine for the layman, for example, if they've never heard of madVR...
Or if their main portal is Netflix 4K, which is at a very modest bitrate.
But if you want higher-quality HEVC 4K playback and scaling, the 1050 Ti is cutting it close.
-
I'm saving for when the price of the 1080 Ti OC comes down, and then I'm throwing an ASUS PG348Q in there as well.
And then I'm poor AF. Well... how bad can eating only cooked potatoes for a month be when I join the over-9000-FPS master race?
-
I'm saving for when the price of the 1080 Ti OC comes down, and then I'm throwing an ASUS PG348Q in there as well.
And then I'm poor AF. Well... how bad can eating only cooked potatoes for a month be when I join the over-9000-FPS master race?
I don't think the PG348Q has HDR, which is necessary for all the new UHD content.
I've been watching UHD with highlights downsampled to my 8-bit panel.. the problem is the mastering is meant for an HDR color space.. so certain details get blended out.
ULMB and HDR are both requisites for any monitor purchase going forward..
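For anyone curious, here's a rough Python sketch of the PQ (SMPTE ST 2084) curve that HDR10 masters are encoded with, just to show how far the signal can go past a typical SDR panel. The 400-nit panel peak below is an assumed example, not a spec number.

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF used for HDR10 mastering.
# The constants are the published PQ constants; the 400-nit panel peak
# is just an assumed example value for a typical SDR monitor.

M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_to_nits(code_value: int, bit_depth: int = 10) -> float:
    """Convert a PQ-encoded code value to absolute luminance in nits."""
    e = code_value / (2 ** bit_depth - 1)     # normalize to 0..1
    ep = e ** (1 / M2)
    y = max(ep - C1, 0.0) / (C2 - C3 * ep)
    return 10000.0 * y ** (1 / M1)

panel_peak = 400.0   # assumed SDR panel peak, in nits
for cv in (512, 640, 768, 1023):
    nits = pq_to_nits(cv)
    note = "clips on the panel" if nits > panel_peak else "fits"
    print(f"10-bit code {cv:4d} -> {nits:8.1f} nits ({note})")
```

Anything that decodes to more nits than the panel can show has to be clipped or remapped, which is where the blended-out highlight detail goes.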
-
I'm saving for when the price of the 1080 Ti OC comes down, and then I'm throwing an ASUS PG348Q in there as well.
And then I'm poor AF. Well... how bad can eating only cooked potatoes for a month be when I join the over-9000-FPS master race?
I don't think the PG348Q has HDR, which is necessary for all the new UHD content.
I've been watching UHD with highlights downsampled to my 8-bit panel.. the problem is the mastering is meant for an HDR color space.. so certain details get blended out.
ULMB and HDR are both requisites for any monitor purchase going forward..
Fair point. I think the only displays that are actually on par with regard to HDR content are the LG TVs. But so far, the reviews keep complaining about motion blur. Like it costs too much processing to also deliver crisp and instant motion.
-
Fair point. I think the only displays that are actually on par with regard to HDR content are the LG TVs. But so far, the reviews keep complaining about motion blur. Like it costs too much processing to also deliver crisp and instant motion.
HDR is a whole new ball game..
Takes a lot more processing power.. especially when you go up to 4K 120 Hz.
Here's hoping AMD catches up, or else we're going to be stuck in the bog with Nvidia..
Sandbagging us with a crappy 10% refresh every year.
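Just as a back-of-envelope illustration of why 4K 120 Hz HDR is demanding, here's the raw pixel-data math (ignoring blanking and compression; the comparison to DisplayPort 1.4's roughly 25.9 Gbit/s of effective payload is from memory, so treat it as approximate):

```python
# Rough, uncompressed video bandwidth estimate: 4K 60 Hz 8-bit SDR
# vs 4K 120 Hz 10-bit HDR.  Blanking intervals and link overhead ignored.

def raw_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

print(f"4K  60 Hz  8-bit: {raw_gbps(3840, 2160, 60, 8):5.1f} Gbit/s")    # ~11.9
print(f"4K 120 Hz 10-bit: {raw_gbps(3840, 2160, 120, 10):5.1f} Gbit/s")  # ~29.9
```

That second number is already past what a single DisplayPort 1.4 link carries uncompressed, before the GPU has rendered a single extra frame.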
-
#Boycott1050s
-
Here's hoping for a 1080 on Black Friday/Cyber Monday!
-
Here's hoping for a 1080 on Black Friday/Cyber Monday!
Without competition, it'll be $50 off tops..
Probably $430-450 for the 1080.
Man, remember those 980 Ti for $200.. wow, that was a deal.. hahahahaha
You wouldn't buy it for UHD 4K movies, but for everything else, what a steal.
-
Here's hoping for a 1080 on Black Friday/Cyber Monday!
Without competition, it'll be $50 off tops..
Probably $430-450 for the 1080.
Man, remember those 980 Ti for $200.. wow, that was a deal.. hahahahaha
You wouldn't buy it for UHD 4K movies, but for everything else, what a steal.
Hey, $50 pays for a novelty kit in a new keycap set, so I'll take it ;)
-
Here's hoping for a 1080 on Black Friday/Cyber Monday!
Without competition, it'll be $50 off tops..
Probably $430-450 for the 1080.
Man, remember those 980 Ti for $200.. wow, that was a deal.. hahahahaha
You wouldn't buy it for UHD 4K movies, but for everything else, what a steal.
Even better, there was this thing with, I believe, the GeForce 4600? It only differed from the Ti version by a small connection on the board, so if you drew a line with the proper material (a pencil?) BOOM, all of a sudden you had a Ti. Crazy.
-
HDR is a whole new ball game..
Takes a lot more processing power.. especially when you go up to 4K 120 Hz.
Here's hoping AMD catches up, or else we're going to be stuck in the bog with Nvidia..
Meh. HDR is no biggie for 3D rendering. 3D video cards already do a colour space conversion from linear gamma in the frame buffer to the approx. 2.2 gamma of sRGB before they send it over the cable.
Then every LCD monitor does another colour space conversion from sRGB to whatever colour space/gamma the panel really has.
I think NVidia should swallow their pride and get on the Freesync 2 train. Preferably the features of Freesync 2 should become part of the DisplayPort standard like Freesync (1) already is ...
The big feature of AMD's Freesync 2 is not really refresh rate, but that there is only one colour space transform, at the video card's side, directly to the colour space of the panel. That's HDR support without any loss of precision for going over the cable.
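For the curious, here's a minimal Python sketch of that per-channel linear-to-sRGB transfer (and its inverse), just to make concrete what gets applied once by the GPU and then effectively undone again on the monitor's side:

```python
# Piecewise sRGB transfer function (approximates an overall gamma of ~2.2).
# Each encode/decode round trip is a chance to lose precision, which is
# what a single direct transform to the panel's colour space avoids.

def linear_to_srgb(l: float) -> float:
    """Encode one linear-light channel value (0..1) to sRGB (0..1)."""
    if l <= 0.0031308:
        return 12.92 * l
    return 1.055 * l ** (1 / 2.4) - 0.055

def srgb_to_linear(s: float) -> float:
    """Decode an sRGB channel value (0..1) back to linear light."""
    if s <= 0.04045:
        return s / 12.92
    return ((s + 0.055) / 1.055) ** 2.4

for l in (0.0, 0.05, 0.18, 0.5, 1.0):
    print(f"linear {l:.2f} -> sRGB {linear_to_srgb(l):.3f}")
```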
BTW, I've got a Kaby Lake CPU which should do HEVC decoding at 2160p/60 independently of the GPU. But I dunno if my software stack really supports it ...
-
HDR is a whole new ball game..
Takes alot more processing power.. especially when you go up to 4k 120hz
Here's hoping amd catches up , or else we're going to be stuck in the bog with nvidia..
Meh. HDR is no biggie for 3D rendering. 3D video cards already do a colour space conversion from linear gamma in the frame buffer to the approx. 2.2 gamma of sRGB before they send it over the cable.
Then every LCD monitor does another colour space conversion from sRGB to whatever colour space/gamma the panel really has.
I think NVidia should swallow their pride and get on the Freesync 2 train. Preferably the features of Freesync 2 should become part of the DisplayPort standard like Freesync (1) already is ...
The big feature of AMD's Freesync 2 is not really refresh rate, but that there is only one colour space transform, at the video card's side, directly to the colour space of the panel. That's HDR support without any loss of precision for going over the cable.
BTW, I've got a Kaby Lake CPU which should do HEVC decoding at 2160p/60 independently of the GPU. But I dunno if my software stack really supports it ...
So basically Freesync, spec-wise, is ahead of NVIDIA's G-Sync? Is color space transformation THAT taxing on the GPU? Given the arrival of different color spaces with HDR, you would expect GPU manufacturers just to build in a dedicated chip to do just that, like a "color coprocessor".
-
So basically Freesync, spec-wise, is ahead of NVIDIA's G-Sync? Is color space transformation THAT taxing on the GPU? Given the arrival of different color spaces with HDR, you would expect GPU manufacturers just to build in a dedicated chip to do just that, like a "color coprocessor".
Right now, that color transformation can't even be done properly..
Here's what happens.. UHD Blu-ray data says: show this pixel at a 750-nit value.. ALL of our 8-bit monitors were built under the assumption that nothing goes over 400 nits.
So what happens when they encode a large patch of clouds that's all over 400 nits.. it all gets clipped to the same highest value.
This is the inherent incompatibility.
Right now, MPC-HC through madVR can compress the highlight values above 400 nits and bring them back into visible range on 8-bit panels.. However, this process is dynamic and ends up dropping detail, because it's not smart enough to know what the person mastering the video wanted to show in that frame.
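To give a feel for the kind of highlight compression being described, here's a toy Python sketch: values below a knee pass through, and everything from the knee up to the mastering peak gets rolled off into the panel's range instead of clipping. The knee/peak numbers are assumptions for illustration, not what madVR actually uses internally.

```python
# Toy highlight roll-off: map scene luminance (nits) into a panel's range.
# Knee, panel peak and source peak are assumed example values.

def compress_highlights(nits: float,
                        panel_peak: float = 400.0,
                        knee: float = 300.0,
                        source_peak: float = 1000.0) -> float:
    if nits <= knee:
        return nits                          # shadows/midtones untouched
    # Squeeze knee..source_peak into knee..panel_peak with a soft curve
    t = min(max((nits - knee) / (source_peak - knee), 0.0), 1.0)
    rolled = t / (1.0 + t) * 2.0             # 0..1 over the compressed span
    return knee + rolled * (panel_peak - knee)

# Naive clipping makes 500, 750 and 1000 nits the same white;
# the roll-off keeps them distinguishable (at the cost of accuracy).
for nits in (200, 500, 750, 1000):
    print(f"{nits:4d} nits -> clip {min(nits, 400):3d}   "
          f"roll-off {compress_highlights(nits):6.1f}")
```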