A fairly complete answer:
"I assume you know what polarizers and polarized light are? If not, google it. LCDs work with a polarizing crystal layer over a polarizing sheet. The direction of polarity of the crystal layer is controlled electronically (LCD = liquid crystal display). When its polarity is oriented in the same direction as the polarizing sheet's, all of the backlight is let through and the pixel is white (well, 50% since the polarizing sheet only lets 50% of the light through). When its polarity is oriented perpendicular to the sheet's, it blocks all the light and the pixel is black. When it's oriented somewhere in between 0 and 90 degrees, some of the backlight is let through and you get greys.
Unfortunately the level of control of the polarizing crystals isn't quite enough to give 1024 orientations between 0 and 90 degrees, so panels use FRC to create in-between shades. (Older panels couldn't even do 256 orientations, leading to 6-bit + FRC.) Using 8-bit + FRC as an example: you want to produce 1024 shades, but the panel can only orient the crystals to produce 256 of them, i.e. shade 0, shade 4, shade 8, etc. (counting on the 10-bit scale). If a pixel is supposed to display shade 401, but the panel can only produce 400 and 404, what you do is rapidly alternate the pixel between 400 and 404.
You show 400 75% of the time and 404 25% of the time (hence FRC, frame rate control: the shade shown changes from frame to frame). This produces the illusion of the pixel showing shade 401. If it showed 400 and 404 each 50% of the time, it'd produce the illusion of shade 402. If it showed 400 25% of the time and 404 75% of the time, that would be shade 403.
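A minimal sketch of that duty-cycle logic in Python (the function name and the 4-frame cycle are illustrative assumptions; a real panel's timing controller also varies the pattern spatially across neighbouring pixels so the flicker averages out):

```python
def frc_frames(target_10bit: int, cycle: int = 4) -> list[int]:
    """Approximate a 10-bit shade on a panel whose crystals can only
    reach every 4th shade (native 8-bit), by alternating between the
    two nearest reachable shades over a short cycle of frames."""
    low = (target_10bit // 4) * 4      # nearest reachable shade below
    high = min(low + 4, 1020)          # nearest reachable shade above
    n_high = target_10bit - low        # frames per cycle spent on 'high'
    # (targets 1021-1023 clamp to 1020 in this toy model)
    return [high] * n_high + [low] * (cycle - n_high)

print(frc_frames(401))  # [404, 400, 400, 400] -> time-average 401
print(frc_frames(402))  # [404, 404, 400, 400] -> time-average 402
print(frc_frames(403))  # [404, 404, 404, 400] -> time-average 403
```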
The 6-bit + FRC panels were a bit problematic. Your eye can distinguish roughly 256 shades of red and blue (more for green), so this flickering between 6-bit colors was visible, especially if you're sensitive to the flicker of fluorescent lights. The pixels would appear to swim a little, especially in your peripheral vision (which is more sensitive to changes in brightness) or if you were moving your eyes around the screen (so the different brightnesses fell onto different photoreceptors, and your brain could tell the image was changing).
But I would expect it to be less of a problem with 10-bit panels. As I said, green is the only channel where you can discern adjacent shades at 8-bit color, and even that is just barely. The main reason to use 10-bit panels is that most modern camera equipment can record 10-bit or more; when you crudely convert that down to 8-bit for display, it can create rounding biases which show up as slight banding.
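A toy illustration of that down-conversion, assuming the simplest possible method (dropping the two low bits; real converters usually dither to hide this): a smooth 10-bit ramp collapses into flat 8-bit bands:

```python
# A smooth 10-bit ramp: 16 distinct shades from 600 to 615.
ramp_10bit = list(range(600, 616))

# Naive conversion to 8-bit by dropping the two low bits.
ramp_8bit = [v >> 2 for v in ramp_10bit]

print(ramp_10bit)  # 16 distinct values
print(ramp_8bit)   # [150]*4 + [151]*4 + [152]*4 + [153]*4 -> four flat bands
```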
The other reason for 10-bit is if the monitor is displaying a larger color space than sRGB. For example, if you're displaying in the Adobe RGB color space (which is about 40% bigger than sRGB), you're effectively stretching the difference between adjacent shades by about 40%. Now 8-bit color isn't enough and banding is easily visible, so you need 10-bit. However, it looks like both of the monitors you've listed are limited to sRGB, so this shouldn't be a factor.
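Rough back-of-the-envelope arithmetic behind that claim, treating the gamut as one-dimensional for simplicity:

```python
# The same 256 levels have to span a ~40% wider range in Adobe RGB,
# so each step between adjacent shades is ~40% coarser.
srgb_8bit_step   = 1.0 / 255   # normalized step between adjacent 8-bit sRGB shades
adobe_8bit_step  = 1.4 / 255   # same 256 levels stretched over a ~40% wider gamut
adobe_10bit_step = 1.4 / 1023  # 1024 levels over the wider gamut

print(f"{adobe_8bit_step / srgb_8bit_step:.2f}")   # 1.40 -> each step 40% coarser
print(f"{adobe_10bit_step / srgb_8bit_step:.2f}")  # 0.35 -> finer than 8-bit sRGB again
```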
Since the vast majority of images and movies available are encoded with 8-bit color, they would display the same on both monitors. Unless you're working with photos or video shot with 10-bit or higher color depth, there really isn't any reason to prefer true 10-bit over 8-bit + FRC. And even then most people would be hard-pressed to see the difference."
Source: http://www.tomshardware.co.uk/foru [...] t-frc.html