Did you know that that CPI is achieved through interpolation? It's much the same concept as upscaling pictures/videos, which means it's not really as accurate as a native DPI would be.
<snip>
The max DPI of the older version of the AVAGO 9500, the SteelSeries Sensei's sensor, is 5700. Anything above that will introduce more jitter, smoothing, and maybe even input lag, and is just not recommended.
Fair call, this makes sense if it's achieving it through interpolation like you say. Perhaps the Sensei is using some of that fancy gimmicky onboard processor to compensate? I've got other mice that run in that 5500-6500 DPI range (a Saitek and a Razer), and compared to them I don't notice any jitter or lag whatsoever with the Sensei.
Thanks for the info though, I didn't really think to investigate the difference between CPI and DPI.
Basically, the dictionary meaning of each one, which doesn't make much sense at first, is:
CPI = Counts per Inch
DPI = Dots per Inch (dots as in the number of pixels on your screen!)
Your mouse sensor always works in Counts per Inch, but those counts are later translated into Dots per Inch so that your computer can understand them and move the cursor accordingly.
So, with an 800 CPI mouse, if you move the mouse one inch to the left, the sensor will register 800 counts; those are then sent to your computer, which moves the cursor 800 dots (or pixels, if you prefer) to the left on the screen.
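Just to make that 800 CPI example concrete, here's a tiny sketch of the mapping. The function names are made up for illustration (this isn't any real driver API), and it assumes a raw 1:1 mapping with no OS pointer acceleration or sensitivity scaling applied:

```python
# Hypothetical sketch of the CPI -> cursor movement described above.
# Assumes a raw 1:1 count-to-pixel mapping (no OS acceleration).

def counts_for_distance(inches_moved: float, cpi: int) -> int:
    """Counts the sensor reports for a given physical movement."""
    return round(inches_moved * cpi)

def pixels_moved(counts: int, scale: float = 1.0) -> int:
    """Pixels the cursor moves; 'scale' stands in for OS sensitivity."""
    return round(counts * scale)

counts = counts_for_distance(1.0, 800)  # one inch at 800 CPI -> 800 counts
print(pixels_moved(counts))             # 800 pixels at raw 1:1 scale
```

In practice the OS sensitivity slider and pointer acceleration sit between the counts and the pixels, which is what the `scale` parameter is hand-waving at.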
In the end, as many say, it's almost the same thing, because they both have the same value and one always gets converted into the other; but Counts are related to the sensor, while Dots are related to the screen.
Also, the distinction has no relationship whatsoever with accuracy. A good video that explains some of this, and even more, is this one from a Logitech Senior Engineer: