Another vote here for LED backlit - specifically the Samsung 8000 series, I think. The 240Hz technology whatsisit (supposedly - according to the salesman, who seemed pretty well informed - it is the thing responsible for my staring at it agog with wonder and awe. I think it's to do with the motion issue) is absolutely GORGEOUS. Stunning, really. That is what I decided would be my next TV, but that's a ways off still.
Ah, good ole' salesman, trying to "help". Most "refresh rate" technologies are lies to the consumers, using motion tricks. There are two main ones: black frame insertion, and the motion predictor (interpolation), which often gives the soap opera effect.
Black frame insertion adds a black frame in place of a real refresh. At 120Hz, 1000/120 = 8.33ms per refresh; a black frame is slotted in between real frames, which gives an illusion of smoother motion by tricking your eyes: when the next real frame hits your eyes it looks sharp, clear, and new. Ironically, this trick is preferred by many people because it mimics the CRT refresh in reverse. See, a CRT image decays, so the image is constantly refreshed and redrawn; it pretty much stays fresh in your eyes, so it looks good.
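To make that timing concrete, here's a rough Python sketch of the arithmetic above, assuming a 120Hz panel doing BFI on a 60FPS source (the numbers are just the ones from this post, nothing measured):

```python
# Sketch: black frame insertion on a 120Hz panel with a 60FPS source.
# Every other refresh slot shows black instead of repeating the image.
REFRESH_HZ = 120
SOURCE_FPS = 60
slot_ms = 1000 / REFRESH_HZ  # 1000/120 = 8.33ms per refresh slot

for slot in range(6):  # the first 6 refresh slots
    source_frame = slot // (REFRESH_HZ // SOURCE_FPS)  # real frame for this slot
    shown = f"frame {source_frame}" if slot % 2 == 0 else "black"
    print(f"t = {slot * slot_ms:5.2f}ms -> {shown}")
```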
The next one is probably closer to true interpolation than the previous. It literally cuts out parts of a frame and redraws them ahead in a newly made-up image. For example, with a soccer player kicking a ball, there would literally be a fake leg drawn in between the real frames. People call it the soap opera effect because they are seeing frames being created individually, and those frames show up as anomalies people can see. And it's ironic, because most people will tell you the human eye can't see more than 30 frames per second, or whatever that bull**** myth is that keeps popping up. What's more, this motion interpolation adds a ****load of input lag.
Only the 3D TVs with shutter glasses are real 120Hz, accepting a 120Hz signal from a source, but not even all 3D TVs are real; some use tricks for the 3D too.
Most TVs are still actually 60Hz (a 60Hz signal / 30FPS) with "240Hz" motion interpolation. See, 60Hz means 1000/60 = 16.67ms (rounded): the image is redrawn 60 times a second, and that millisecond figure is the pixel update interval. This is different from response time, which is the amount of time it takes a pixel to change state; a good-quality CRT, for example, usually operates in the low microseconds, even nanoseconds.
See, most broadcast frame rates are 30 frames per second. So in essence each frame is drawn twice: 30FPS means 1000/30 = 33.33ms per unique frame, so at 60Hz there is one additional refresh per unique frame.
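Quick sanity check on those numbers (a throwaway Python sketch, nothing more):

```python
# 60Hz refresh with a 30FPS broadcast source: each frame is drawn twice.
refresh_hz = 60
source_fps = 30

refresh_interval_ms = 1000 / refresh_hz       # 16.67ms between redraws
frame_duration_ms = 1000 / source_fps         # 33.33ms each unique frame persists
redraws_per_frame = refresh_hz // source_fps  # 2: the draw plus one extra refresh

print(f"{refresh_interval_ms:.2f}ms refresh, {frame_duration_ms:.2f}ms frame, "
      f"{redraws_per_frame} redraws per unique frame")
```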
The reason 120, 240, and 480 are used is that the common lower frame rates divide evenly into them. When the refresh rate is real, there is a big difference compared to motion interpolation. The sad thing is most people have never experienced a refresh rate higher than 90, so many people are simply used to 60Hz.
Let's take 240Hz for example. Say you had a television with real 240Hz; go ahead and say it's an OLED television with an ultra-low response time in the microseconds.
Let's clear something up:
1. Response time is simply the amount of time it takes the pixels to change state. E.g. on a high-end CRT, its electron gun operates in 10-20 nanoseconds, but the phosphor takes, say, 150-200 microseconds to change state (ignoring phosphor decay, FYI).
2. Refresh rate is the number of times an image is drawn or redrawn, and it also dictates when the pixels change state for a new frame. E.g. 90FPS (1000/90 = 11.1ms) at 90Hz means every 11.1 milliseconds the pixels change state, or rather every 11.1ms an image is drawn or refreshed (at 180Hz, twice 90, it would refresh every 5.56ms), while the pixels themselves may be able to change state as fast as, say, 120 microseconds. A quick sketch of this follows below.
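Putting the two side by side, using the example figures from above (the 120-microsecond response time is just the illustrative value from point 2, not a spec):

```python
# Refresh interval vs. pixel response time, using the example numbers above.
refresh_hz = 90
response_time_us = 120  # illustrative fast-panel response, in microseconds

refresh_interval_us = 1_000_000 / refresh_hz  # ~11111us = 11.1ms between refreshes
print(f"refresh interval: {refresh_interval_us / 1000:.1f}ms")
print(f"pixel response:   {response_time_us / 1000:.2f}ms")
# The pixels can change state ~90x faster than the panel ever asks them to,
# so here the refresh rate, not the response time, is the bottleneck.
print(f"ratio: {refresh_interval_us / response_time_us:.0f}:1")
```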
First, 1000/240 = 4.17ms (rounded), which means a frame is drawn or refreshed roughly every 4 milliseconds.
Let's say you're watching a Blu-ray film at 24p (24 frames per second, progressive scan): 240/24 = 10. The first draw is the unique frame; the next 9 are refreshes of the same frame. 10 draws x 24 frames = 240. If we use the common 30FPS/60Hz case, it's 2 draws x 30 = 60.
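Spelled out as a sketch, assuming a real panel whose refresh rate is a clean multiple of the source rate:

```python
# How many times each unique source frame gets drawn at a given refresh rate.
def draws_per_frame(refresh_hz: int, source_fps: int) -> int:
    # Only meaningful when the source rate divides the refresh rate evenly.
    assert refresh_hz % source_fps == 0, "refresh rate must be a multiple of fps"
    return refresh_hz // source_fps

print(draws_per_frame(240, 24))  # 10: one unique draw + 9 refreshes per film frame
print(draws_per_frame(60, 30))   # 2: the common broadcast case
print(draws_per_frame(120, 24))  # 5: why real 120Hz also suits 24p film
```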
The whole divisor thing also works the other way. Say you had 480 frames per second in a game but your monitor is 120Hz: 480/120 = 4, so in essence you're seeing 120 frames per second, each stitched from quarter-tears of the 480. So even though you're not seeing the 480 frames per second individually or complete, you're still seeing 1/4th of 480 frames per second clustered into 120 composite frames. Which is why serious/hardcore/pro gamers don't use v-sync: the feel of both the game and the mouse still changes, despite not having complete frames.
So in essence the refresh rate is not just there to show a single image; it also acts as a divider for fewer or more frames. With fewer or equal frames, say 30FPS on 120Hz, there is enough refresh rate to show every complete frame, so the images are simply refreshed. But if the GPU draws more frames than the refresh rate, the complete frames are simply cut up, and you see tearing in game because you're only ever seeing partial frames.
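The same division, going the other way (hypothetical numbers; real tear positions depend on timing, this only shows the ratio):

```python
# GPU frames per panel refresh. Above 1.0, each displayed frame is a
# composite of partial GPU frames, which is what you see as tearing.
def gpu_frames_per_refresh(gpu_fps: float, refresh_hz: float) -> float:
    return gpu_fps / refresh_hz

print(gpu_frames_per_refresh(480, 120))  # 4.0: each refresh stitches 4 quarter-frames
print(gpu_frames_per_refresh(120, 120))  # 1.0: one complete frame per refresh
print(gpu_frames_per_refresh(30, 120))   # 0.25: frames only repeated, no tearing
```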
But now, if you add more frames you're opening up a whole new can of worms. It makes things so much better, but that applies more to gaming than to films, since films are shot at fixed rates and don't take well to higher frame rates.
If you're wondering about redrawing the image with, say, a 30FPS source on 120Hz: the image is updated more often despite the lower frame rate, so there is a smoother transition between frames. The transition is smoother still with actual individual frames, 120FPS on 120Hz, but even with fewer or equal frames the extra refreshes still benefit the motion.
Won't be going in for the whole 3D TV thing, though. I think there is very little that needs to be 3D on TV, if anything. Anybody think it is gonna take off, or what?
3D television is a good thing, but not for the obvious reasons.
First, my view on 3D is that it is a gimmick, not fake. It's just not my thing, nor is it gonna be popular.
But it's a good thing because it pisses the living **** out of manufacturers: people complain that LCD response times are too slow and cause issues, so it forces manufacturers to move to OLED or risk losing money. In fact, most people don't realize just how much response time LCD monitors/televisions really have. They see 2-5ms GtG, WtW, and BtB figures, but what about the other color transitions? Those numbers are anywhere between 15-100+ms depending on the colors, and that is on top of the standard 60Hz refresh: 16.67ms + response time.
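To see how that stacks up, a quick sketch adding the refresh wait to the transition time (the 5/15/100ms figures are just the rough ranges mentioned above, not measurements):

```python
# Worst case for a change to become visible on a 60Hz LCD: wait for the
# next refresh, then wait for the pixel to finish its transition.
refresh_interval_ms = 1000 / 60  # 16.67ms

for transition_ms in (5, 15, 100):  # advertised GtG vs. slower color transitions
    total_ms = refresh_interval_ms + transition_ms
    print(f"{transition_ms:>3}ms transition -> up to {total_ms:.2f}ms to show up")
```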
It also forces manufacturers to introduce higher refresh rates; refresh rates have been ignored for far too long. Even a few Hz higher makes a difference: just going from 60 to 75, or 75 to 85, is noticeable. Though I'd say 90Hz should have been the standard, because it's an even multiple of the 30FPS of movies and clips online and whatnot.
It was only last year, 2009, that the first real 120Hz LCD monitors came out, and later in 2010 more LCD monitors and a few televisions with real refresh rates followed. Like, really, it took that goddamn long? CRTs have had very high refresh rates for ages, particularly high-end consumer CRTs, even at respectable resolutions.
Do you know how many people would have killed just to have, say, 90Hz on a 2560x1600 or 1920x1080/1200 monitor? Just that little extra bit makes a difference.
A great example is the 1280x1024 Hanns-G LCD monitor I bought back in '07 or so. Unfortunately it was a HORRIBLE TN panel, but it did 85Hz over VGA, and it felt a lot smoother than the LCDs I had used previously. So just that little extra helped. It may have been an utterly horrible panel, with black levels so crushed that you could run right in front of me in a dark game and I wouldn't see you, but the refresh rate more than made up for it.
So the whole 3D thing is a last hurrah for the manufacturers to gouge people with LCDs before moving to OLED. LCDs have been manufactured for so long that moving to OLED is gonna cost them so much money, so they'd rather eke out the last few dollars before people realize just how bad LCDs are.
Also, I noticed a lot of talk about LEDs. I want to point something out to those who think LED TVs are different: they aren't. Some people believe LED is something new and don't realize it's just a backlight. Not just that, but I've even read of salesmen lying to consumers, claiming LED = OLED, which it absolutely is not.
LED is simply a backlight, like CCFL. LEDs do not improve anything except power consumption and thinness. The most common arrangement is edge-lit, which is the same **** as CCFL lighting.
LEDs don't improve the image. In fact they can actually harm it, because the white point of the LEDs is usually around 9300K, which makes the whites bluer than the more standard 6500K white point used when calibrating monitors.
The other method is local dimming. That one does make a difference and certainly helps a lot, but you're not gonna see it on LED monitors, and it's usually priced at a premium. And it's still not a complete solution.
But again LEDs aren't gonna help with image quality.
Remember, they are still LCDs; they are still indirectly lit. No matter what you do to improve the light, it's still gonna bleed through. So no real blacks: you get grey and crushed blacks, due to the inherent flaws of LCD panels.