@NKRO
He isn't talking about Windows Pointer Speed but about in-game sensitivity, which works differently in a 3D engine like an FPS.
There the sensitivity setting determines how much your avatar turns for each count from the mouse.
In that environment there is no pixel limitation; the only limitation is the smallest angle your avatar can turn, but this angle is usually negligibly small, for example ~0.0055° (360/65536) in the Q3 engine.
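To make the numbers concrete, here's a quick sketch of the turn-per-count relation and the Q3 angle granularity. The sensitivity value 2.5 is just an example; m_yaw 0.022 is the Quake-family default:

```python
# In a Quake-style engine, each mouse count turns the view by:
#   degrees_per_count = sensitivity * m_yaw
sensitivity = 2.5   # example in-game value
m_yaw = 0.022       # Quake-family default, degrees per count

degrees_per_count = sensitivity * m_yaw
print(degrees_per_count)  # 0.055 degrees per count

# Smallest representable yaw step if angles are stored in 16 bits,
# which is where the ~0.0055 degree figure comes from:
smallest_angle = 360 / 65536
print(round(smallest_angle, 4))  # 0.0055
```

Since one count at these settings turns you ten times further than the engine's smallest step, the quantization really is negligible in practice.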
Exactly, I never wrote anything about WPO. I assumed that since almost every gamer out there uses 6/11, that is a given.
I'm going to say that DPI is something that will be debated ad nauseam. Many people have put forward their own theories or facts on DPI, as have I if you look at my posts around here.
This is a post I made a while back, condensed with other links. What I wrote may not be entirely correct and it's not set in stone, though I have run into various forums and posters providing a very similar formula. So while I don't necessarily have every detail correct and it's open to debate, the debates seem to center around similar understandings.
So basically the DPI discussion breaks down into a number of different approaches.
1. DPI correlates mathematically with accuracy. In other words, like the link above, more dots per inch is more accurate, similar to how a higher resolution shows a better image; e.g. at 16:10, 1920x1200 looks significantly better than 640x400. Of course sensitivity plays a role too, like using a lower sensitivity, but everyone uses a different sensitivity.
2. Finding the minimum DPI needed for a given sensitivity, resolution, FoV, and m_yaw/pitch. Because the game treats pixels around the crosshair differently than those at the edge of your vision, you can find the minimum DPI before you start skipping pixels, as in the phoon calculator. E.g. you might be able to extend 400 DPI to be useful at a resolution like 1920x1200 simply because, if your sensitivity is low enough, your minimum recommended DPI is below the DPI of the mouse. In a way, treat DPI like CPI (counts per inch): instead of treating DPI as a pure speed modifier, you also need a certain number of counts to cover the pixels at your sensitivity. Think of DPI as a scanner, not purely a speed modifier.
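Here's a rough sketch of the idea behind a phoon-style minimum-DPI check (this is my own approximation, not the actual calculator's code). It assumes a simple perspective projection, a horizontal FoV measured edge to edge, and that "skipping" starts once one count turns the view by more than the angle one pixel covers at the centre of the screen:

```python
import math

def degrees_per_center_pixel(width_px, hfov_deg):
    # Angle covered by one pixel at the centre of the projection plane.
    half = math.radians(hfov_deg / 2.0)
    pixel_width = 2.0 * math.tan(half) / width_px
    return math.degrees(math.atan(pixel_width))

def minimum_dpi(cm_per_360, width_px, hfov_deg):
    # One count must turn no more than one centre pixel:
    #   360 / (cm_per_360 / 2.54 * dpi) <= deg_per_pixel
    # Solving for dpi gives the minimum below which you skip pixels.
    deg_per_pixel = degrees_per_center_pixel(width_px, hfov_deg)
    return 360.0 * 2.54 / (cm_per_360 * deg_per_pixel)

# Example: 41 cm/360 at 1920x1200 with a 90 degree horizontal FoV
print(round(minimum_dpi(41.0, 1920, 90.0)))
```

Under these assumptions a 41cm/360 player at 1920 wide needs somewhere under 400 DPI, which matches the point above: a low enough sensitivity can keep 400 DPI from skipping pixels even at high resolutions.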
3. Setting the sensitivity to 1.0 and raising the DPI until you reach the sensitivity you use. In other words, instead of 2.5 x 400 DPI = 41cm/360, use 1.0 x 1000 DPI = 41cm/360, since 2.5 * 400 = 1000, i.e. 1K DPI. This one doesn't make much sense: even though some games have a sensitivity range where getting closer to 1.0 supposedly means closer to 1:1 movement, and anything below 1.0 is supposedly wasted counts that are thrown away, there is also m_yaw/m_pitch modifying the sensitivity. It's quite possible, though, that some of those games were coded with some sort of additional layer.
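The equivalence above is easy to verify: cm/360 only depends on the product of DPI, sensitivity, and m_yaw, so 2.5 x 400 and 1.0 x 1000 land on the same turn distance (again assuming the Quake-family m_yaw default of 0.022):

```python
# cm/360 depends only on the product dpi * sensitivity * m_yaw:
#   counts_per_360 = 360 / (sensitivity * m_yaw)
#   cm_per_360     = counts_per_360 / dpi * 2.54
def cm_per_360(dpi, sensitivity, m_yaw=0.022):
    return 360.0 / (dpi * sensitivity * m_yaw) * 2.54

print(round(cm_per_360(400, 2.5), 1))   # -> 41.6
print(round(cm_per_360(1000, 1.0), 1))  # -> 41.6, identical turn distance
```

Whether the engine actually handles sub-1.0 sensitivities differently is a separate question, but the pure distance-per-360 math is the same either way.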
4. This is the one that probably requires the most intelligent mathematician, and most people would have absolutely no clue: calculate all the settings together, using mathematical formulas to find all the data necessary. While people have come up with different calculations that show basic information, this would literally calculate everything using advanced mathematics (angular, Euclidean, pi, etc.).