Author Topic: DPI vs CPI: The real deal?  (Read 149762 times)

0 Members and 1 Guest are viewing this topic.

Offline timofonic

  • Thread Starter
  • Posts: 59
DPI vs CPI: The real deal?
« on: Wed, 02 June 2010, 23:33:35 »
Hello.

Could anyone properly explain CPI, DPI and PPI? Making it a wiki article could be very interesting for all of us too.


CPI = Counts Per Inch
DPI = Dots per Inch


(from Mouse speed section of wikipedia article)
Quote
The computer industry often measures mouse sensitivity in terms of counts per inch (CPI), commonly expressed less correctly as dots per inch (DPI) – the number of steps the mouse will report when it moves one inch.

In early mice, this specification was called pulses per inch (ppi).[13] If the default mouse-tracking condition involves moving the cursor by one screen-pixel or dot on-screen per reported step, then the CPI does equate to DPI: dots of cursor motion per inch of mouse motion. The CPI or DPI as reported by manufacturers depends on how they make the mouse; the higher the CPI, the faster the cursor moves with mouse movement. However, software can adjust the mouse sensitivity, making the cursor move faster or slower than its CPI. Current software can change the speed of the cursor dynamically, taking into account the mouse's absolute speed and the movement from the last stop-point. In most software[specify] this setting is named "speed", referring to "cursor precision".

However, some software[specify] names this setting "acceleration", but this term is in fact incorrect. The mouse acceleration, in the majority of mouse software, refers to the setting allowing the user to modify the cursor acceleration: the change in speed of the cursor over time while the mouse movement is constant.


For simple software, when the mouse starts to move, the software will count the number of "counts" received from the mouse and will move the cursor across the screen by that number of pixels (or multiplied by a rate factor, typically less than 1). The cursor will move slowly on the screen, having a good precision. When the movement of the mouse passes the value set for "threshold", the software will start to move the cursor more quickly, with a greater rate factor. Usually, the user can set the value of the second rate factor by changing the "acceleration" setting.
Operating systems sometimes apply acceleration, referred to as "ballistics", to the motion reported by the mouse. For example, versions of Windows prior to Windows XP doubled reported values above a configurable threshold, and then optionally doubled them again above a second configurable threshold. These doublings applied separately in the X and Y directions, resulting in very nonlinear response.

Starting with Windows XP and for many OS versions for Apple Macintosh, computers use a ballistics calculation that compensates for screen-resolution in a slightly different way, which affects the way the mouse feels.[citation needed] Ballistics are further affected by the choice of driver software.
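The two-threshold doubling the quote describes can be sketched roughly like this (the threshold values and function names here are illustrative, not Windows' actual ones):

```python
# Rough sketch of the legacy (pre-XP) Windows "ballistics" described above:
# reported counts are doubled above a first threshold and doubled again above
# a second, applied independently per axis. Thresholds are made-up values.
THRESHOLD_1 = 6   # counts per report (illustrative)
THRESHOLD_2 = 10  # counts per report (illustrative)

def accelerate_axis(counts: int) -> int:
    """Apply the two-stage doubling to one axis of a single mouse report."""
    magnitude = abs(counts)
    if magnitude > THRESHOLD_2:
        magnitude *= 4        # doubled twice
    elif magnitude > THRESHOLD_1:
        magnitude *= 2        # doubled once
    return magnitude if counts >= 0 else -magnitude

def accelerate(dx: int, dy: int) -> tuple:
    # Applied separately in X and Y, hence the very nonlinear
    # (and direction-dependent) response the quote mentions.
    return accelerate_axis(dx), accelerate_axis(dy)
```

A diagonal movement gets scaled differently on each axis, which is exactly why the response was so nonlinear.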



(From a SteelSeries FAQ section)
Quote
DPI is an expression from the printing world and has nothing to do with mouse movement. DPI is meant to describe that for one inch of distance you move your mouse on any surface, the equivalent number of counts are sent to the PC - resulting in movement on your screen. CPI is the correct term for this as it actually is consistent with what you seek to describe with the abbreviation.



(Some Chinglish document I found here:)
Quote
CPI refers to the positioning accuracy of the mouse. The unit is DPI or CPI: the maximum number of position reports the mouse can send for each inch it moves.

DPI is the number of pixels per inch; CPI is the sampling rate per inch. Both DPI and CPI can be used to express the resolution of a mouse, but DPI is a static measure, better suited to a printer or scanner. Since mouse movement is a dynamic process, CPI is the more appropriate term for mouse resolution.

The CPI/DPI button adjusts the mouse's speed and range; this mouse's resolution has two settings.

An optical mouse is 500-1000 cpi/dpi. A laser mouse is 500-1000 cpi/dpi.


Here's an older document as reference too: http://www.epanorama.net/documents/pc/mouse.html
« Last Edit: Wed, 02 June 2010, 23:39:21 by timofonic »

Offline Bullveyr

  • Posts: 386
  • Location: Austria
DPI vs CPI: The real deal?
« Reply #1 on: Fri, 04 June 2010, 03:19:23 »
atm I'm too lazy for that, and I guess my "article" wouldn't really be "neutral" :D
« Last Edit: Fri, 04 June 2010, 03:22:04 by Bullveyr »
Quote from: ripster;185750
Mechanical switches are mechanical.

Offline Arc'xer

  • Posts: 482
DPI vs CPI: The real deal?
« Reply #2 on: Fri, 04 June 2010, 13:05:50 »
SteelSeries doesn't know jack ****. They dismiss DPI despite selling mice with high DPI, which means they're hiding behind a guise of knowledge they don't have.

The sheer fact is, after doing extensive research on it, I've come up with my own ideas on the entire subject.

Quote
I apologize for this long, long post.

Something else is going on, because at your resolution 800 should be slow and 1800 closer to normal, but not enough. Did you turn off mouse acceleration? Maybe that is the problem. If that's not the case, then you need to get used to it.

DPI is a hardware-based sensitivity. It's better to raise DPI and compensate by lowering sensitivity, to avoid software (or, more precisely, in-game) sensitivity scaling. The calculation, though crude, is old DPI × old sensitivity / new DPI. It depends on the game, though; some of the older games get negative acceleration from higher DPI.

Say you used 800 DPI at 1 sensitivity and you bump up to 1800: 800 × 1 / 1800 ≈ 0.444.
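That crude conversion is a one-liner; a quick sketch (the function name is mine):

```python
def new_sensitivity(old_dpi: float, old_sens: float, new_dpi: float) -> float:
    """Keep the same effective speed when changing DPI (crude, per the post)."""
    return old_dpi * old_sens / new_dpi

# 800 DPI at sensitivity 1, bumped to 1800 DPI:
print(new_sensitivity(800, 1, 1800))  # ≈ 0.444
```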

Also, you speak of cursor precision. Despite what some people will say - and I've spent tons of hours researching DPI - from what I've gathered, higher DPI means more precision. First, to attain 1:1 movement at a higher resolution, i.e. pixel precision matching today's higher desktop and in-game resolutions.

Second, if we do some math and convert an inch into millimetres, we get 25.4 mm (2.54 cm). 25.4/800 = 0.03175 mm; in other words, at 800 DPI your mouse reports a count for every 0.03175 mm of travel. That's half the interval of 400 DPI, which is 0.0635 mm. 25.4/1800 ≈ 0.0141 mm, 25.4/3500 ≈ 0.0073 mm, and with 5700 it's 25.4/5700 ≈ 0.0045 mm. You might be asking yourself what this all means. Aside from becoming faster, both in pixel movement per inch and in how often that movement is read, it's more accurate, because the mouse is reading at smaller and smaller intervals of millimetres.
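Those per-count distances are all the same division; a small sketch (names are mine):

```python
MM_PER_INCH = 25.4

def mm_per_count(dpi: float) -> float:
    """Physical distance the mouse travels per reported count."""
    return MM_PER_INCH / dpi

for dpi in (400, 800, 1800, 3500, 5700):
    print(f"{dpi:>4} DPI -> {mm_per_count(dpi):.5f} mm per count")
```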

A simpler approach: http://hardforum.com/showpost.php?p=1035470202&postcount=20 and http://hardforum.com/showpost.php?p=1034229686&postcount=39

Now, your resolution is 1920x1200 (I'm guessing you left out the 20). To have 1:1 movement, i.e. DPI matched to the inch, you need 1920 on the X axis and 1200 on the Y axis.

I know some people will say that in an RTS precision doesn't matter, just speed, but remember that you don't want to exert so much effort moving around that you suffer fatigue. Plus, if you're good at noticing pixel skipping: to achieve the speed that high DPI gives, you usually set the in-game speed higher, and the end result is that 450-800 DPI is going to require pixel skipping. It doesn't matter whether you set Windows to 6/11; the cursor is going to reach the edge somehow. So it'll do so either by skipping pixels, or by reading all 1920x1200 across 4-6 inches of movement instead of the usual 1 inch with, say, a lower resolution and a mouse matching that resolution - for example 800x600 with an 800 DPI mouse.

In other words, would you rather have low DPI + higher sensitivity, or high DPI + lower sensitivity? Let's say 400 at 2 vs 800 at 1; both feel exactly the same. The difference is that at, say, 800x600, the 400 DPI mouse needs to move 2 inches to cover 800 pixels when set at 1, while at 2 it covers 800 pixels but does so by skipping.

Unfortunately, neither mouse is all that suited to FPS gaming, but for RTS I've read they're quite popular with Asian gamers and those with smaller hands.

Sorry if I've gotten out of hand; I could probably go on and on with what I've read. I know I mostly speak from an FPS gamer's perspective, but it doesn't hurt to translate some of the information to other games. I've done tons of research on DPI and mice. I know it's an RTS, but it still doesn't hurt to know.
FYI, this post was a response to an RTS question, but since SC2 has such bad mouse control, ignore the RTS section and focus on the FPS aspect.

I wish I were a more eloquent typist, but the sheer fact is most people have a poor understanding of DPI. In simplest terms, they don't know jack ****; it reminds me of OCN.

Ripster said go to OCN. Don't - they don't know anything about DPI. Same with their monitor section: they don't know what a refresh rate is, or a response time.

If you want me to, I can expand on their subject as well. What I posted isn't the end-all be-all.

I wish I had the capability to test this on a variety of monitors, resolutions, games, and mice.

I think the worst aspect of FPS gaming is Counter-Strike. That game has instilled so much ignorance in gamers it's not even funny. It seems that every single post disliking high DPI comes from that ****ty game. I honestly don't even know why I play it myself.

Offline Arc'xer

  • Posts: 482
DPI vs CPI: The real deal?
« Reply #3 on: Fri, 04 June 2010, 13:19:42 »
DPI and CPI are the same ****. In fact both are correct, but they describe different things. Despite the printer-like wording, they make some sense, considering the mouse is a sensor reading the surface.

DPI is dots per inch and CPI is counts per inch. Here's the thing: as I've mentioned, there are sub-inch measurements behind each readout.

(To keep things simple) For example, 400 DPI is, as I mentioned, 25.4/400 = 0.0635 mm.

What does this mean? DPI-wise, there are 400 pixels being counted per inch on the monitor, either by reading directly (1:1) or by skipping pixels. Measurement-wise, each of those counts falls within 0.0635 mm.

0.0635 mm × 400 = 25.4 mm.

Edit: I reread this and I might be causing confusion. I'm not saying the pixels read are based on the monitor's dot pitch. I'm saying that within the inch, to produce 400 counts, the pixels are read per 0.0635 mm. Hence 400 × 0.0635 equalling 25.4.

In other words, the smaller the interval (I presume it's in the ultra-low millimetres, maybe as far down as the micrometre, maybe the nanometre), the faster the mouse becomes, as it's reporting counts more often to cover one inch - but it's also reading movements at smaller and smaller intervals of measurement, despite the speed increase.

For example, say you use 400 DPI and 10 sensitivity to get 1 inch/360. If you get a 4000 DPI mouse and set it to 1, it's the same feel, but the 4000 DPI mouse is more accurate, since it's reading per 0.00635 mm rather than 0.0635 mm. Notice how 4000 DPI is an order of magnitude finer in measurement.

AND you're also eliminating a lot of software interpolation, a.k.a. in-game sensitivity, which is another plus.
« Last Edit: Fri, 04 June 2010, 17:57:58 by Arc'xer »

Offline Arc'xer

  • Posts: 482
DPI vs CPI: The real deal?
« Reply #4 on: Sat, 05 June 2010, 05:21:06 »
I should point something out that I left out, sensitivity itself.

I'm not sure whether you're high/medium/low sensitivity, but it doesn't matter: you can still achieve the desired distance (cm or inches per 360) with high DPI. BTW, everything I say here is based on 6/11 Windows settings, i.e. 1:1 pixel movement.

I also want to elaborate on a little dilemma. Some of the older games don't have DirectInput or any method of handling higher DPI, so negative acceleration appears.

With DirectInput it's reduced or eliminated, along with a higher resolution. Some older games relying on an X/Y overlay of the desktop will actually resist negative acceleration more at higher resolutions, along with dinput.

That's not to say dinput is a bulletproof method. I can cause negative acceleration at 640x480, 1800 DPI, 0.5 sensitivity in, say, Wolf:ET, even though it's DirectInput. Implementing DirectInput also requires proper coding to let modern mice take advantage of DPI, more so with higher DPI, and maybe even requires higher resolutions, especially with older DirectInput code.

I'm aware of 3rd-party tools like RInput, but I worry about those programs modifying the game and something flagging it as a hack or cheating.

A small guide I found:

Quote
0-15 cm for a 360 degree rotation in game = Ultra/High sens
15-20 cm for a 360 degree rotation in game = Medium-high sens
20-25 cm for a 360 degree rotation in game = Medium sens
25-35 cm for a 360 degree rotation in game = Low sens
35+ cm for a 360 degree rotation in game = Very-low sens
75-100+ for a 360 degree rotation in game = Ultra-low sens

There is a notion going around that high DPI is only for high sensitivity. It's not, though it does benefit high-sensitivity players greatly by reducing software (in-game) interpolation. (Reminds me a bit of the people who refer to pixel response time as the refresh rate of their LCD monitors, not realizing refresh rate is completely different from response time.)

You can have high DPI and a lower sensitivity. Most people on gaming and FPS forums are so goddamn oblivious to this fact that some of them aren't worth a grain of salt as gamers or gaming enthusiasts. That's not to say you HAVE to use it, but generally they either say something ignorant/stupid or spout some bull**** about how the pro gamers know best. When really, most pro gamers don't know jack **** about computers, nor care, and play out of habit, i.e. same resolution, same mouse, etc.

The little OCN mouse guide calculator, albeit crude and simplistic, does a bang-up job of translating sensitivities:

Quote
[(Current dpi) x (In-game sensitivity)] / (Maximum dpi) = (New Sensitivity for max dpi)

The reason I call it crude and simplistic is that it doesn't account for other factors like m_yaw/m_pitch etc., though you can multiply by, say, the 0.022 yaw/pitch and find what the sensitivity will actually be.

The sheer fact is, mouse sensitivity is something that someone with a very high level of mathematics and a grasp of in-depth calculation needs to spend time researching, to put together a proper calculator.

But anyway, many like to follow the pro gamers and use 400 DPI with 2-5 sensitivity. Let's use 2, since it's simpler: 400 × 2 = 800 effective, which I'd say is around the 35-40 cm mark, if I'm not mistaken.

i.e. the mouse feels like an 800 DPI mouse, but it gets there by skipping pixels. The simplest solution is to bump the DPI itself to 800.

Here comes the problem: it's now twice as fast. Most people, I've noticed, don't modify their sensitivity; they just get used to it, which is wrong unless that's what you want.

So now your 400 and 2 is actually 800 and 2, or 1600 effective. That means each inch corresponds to 1600 pixels' worth of movement. If that person lowered their sensitivity to 1 with 800 DPI, it feels EXACTLY like 400 and 2, but more accurate, due to both reading movements at smaller intervals and having more DPI to match the resolution, while reducing in-game sensitivity (interpolation). Say, for example, 800x600. Let's bump it up to 1600, a.k.a. 800 × 2 = 1600: (400 × 2)/1600 or (800 × 1)/1600 = 0.5, so now your mouse feels exactly like 400 at 2 sensitivity or 800 at 1, but your DPI is reading within pixels. Double it again to 3200: 1600 × 0.5 / 3200 = 0.25, which feels exactly the same as 400/2, 800/1 or 1600/0.5, but you've further reduced in-game sensitivity and you're reading pixels at much smaller intervals - 25.4/3200 ≈ 0.0079 mm, compared to 25.4/400 = 0.0635 mm. Let's max it out with the Xai's 5001.

That extra one is SteelSeries shooting themselves in the foot while telling everyone DPI is gimmicky. 800/5001 ≈ 0.16, and that sensitivity feels exactly like 400/2, 800/1, etc., but lets you use the max DPI with the same feel of movement, 35-40 cm/360. At the full current max of 5700: 800/5700 ≈ 0.14.
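All of the doubling above just holds DPI × sensitivity constant; here's a quick sketch of the whole chain (the helper is mine, not from the OCN guide):

```python
def rescale(dpi, sens, new_dpi):
    """Raise the DPI, lower the sensitivity, keep the same cm/360 feel."""
    return new_dpi, dpi * sens / new_dpi

setting = (400, 2)  # the "pro gamer" baseline: 400 DPI, sensitivity 2
for new_dpi in (800, 1600, 3200, 5700):
    setting = rescale(*setting, new_dpi)
    print(setting)  # (800, 1.0), (1600, 0.5), (3200, 0.25), (5700, ~0.14)

# Every step keeps DPI * sensitivity = 800, so the distance per 360 never
# changes; only the in-game multiplier (the interpolation) shrinks.
```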

http://hardforum.com/showpost.php?p=1035485485&postcount=28 and http://hardforum.com/showpost.php?p=1035487698&postcount=40

This post is interesting; this is quite the high-sensitivity gamer: 2 inches/360, 1 inch/180.

He uses an 800 DPI mouse; for that kind of sensitivity he's probably hitting some high numbers, in the 8-15 area.

That post of his is from 2009, and if he is still using this monitor - an AOC 2217V 22" 1680x1050 - then he is skipping pixels. He mentioned that he bought a Lachesis, the only mouse released with an exact 4000 DPI max. The Lachesis is a poor-quality mouse with known issues, so yeah, he did waste his money on it.

Let's guesstimate he's using 800 DPI and 8 sensitivity for 2"/360. I don't know what mouse he's using, but I suspect it's some cheapo 800 DPI optical you find all over the place.

Since he's a claw gripper, let's say he bought a G9x, as it's a claw mouse. He would benefit from higher DPI: because he's skipping pixels, despite hitting his targets, he could make better shots.

In fact from that thread:
Quote
If I were to look really carefully, the crosshair really would skip a pixel, but that's within the gun's margin of error. These days, bullets don't go straight anymore; even in single fire, the bullet will always skew to the side. Maybe that's why I don't notice the difference in my game.

If that's his gameplay in those YouTube videos, he is skipping pixels like a mofo on those scopes. You can actually see him firing to the point where he lets the game control the impacts - in other words, missing quite a bit, with the skipping visible in the scope. It doesn't just affect his aim; he's basically luck-shooting his targets to death.

If he had a G9x and wanted the precise 1:1 of his sensitivity (X axis 1680, Y axis 1050 on the desktop), in game he could either raise the DPI higher, or to the max, or let it ride at his resolution.

Math-wise, 800 × 8 = 6400, and 6400/1680 ≈ 3.81. That feels exactly the same as 800 at 8 - the same feel and movement - but with less interpolation.

Unfortunately, the Y axis won't come out the same - even calculating it shows how simplistic the OCN calculator is. That only matters if you're using separate X/Y DPI; if you prefer the same DPI on both axes, it doesn't. I'm not sure how it would work without modifying m_yaw/m_pitch, since most games don't have separate X/Y sensitivity.

So that's something he'd have to work out, but for simple reasoning, using 1680, that 3.8 is his new sensitivity, feeling exactly like 800 at 8. Say he uses 5700: the sensitivity comes to about 1.12. Notice how it removes a lot of interpolation (pixel skipping), but still retains a higher number for his 2"/360, unlike the low-sensitivity example.

Another interesting factor is this post: http://hardforum.com/showpost.php?p=1035486887&postcount=36, and a similar one on the Quake Live forums: http://www.quakelive.com/forum/showthread.php?t=38409

These two deal not with pixel accuracy based on the resolution of what you see, but with literally finding the perfect resolution within a 360-degree sphere, a.k.a. the entire field of view of a character.

But I think those sensitivities are for people with medium/high/ultra-high sensitivity, from the looks of it.

So what they do is calculate pixel-perfect accuracy of the field of view against the entire pixel coverage of your monitor, i.e. the same as the DPI calculator, but extended across the entire 180/360/720 etc. factor.

Really, I think DPI/CPI is not as simple as some make it out to be. I really think we need some mathematical prodigies and geniuses - maybe even the best and brightest of their field - to study this, because it's not as simple as "this is my sensitivity/DPI, let's roll".

Some people spend a long time trying to find the perfect or best sensitivity; low/medium/high sens doesn't matter. There are people out there who refuse to stay blissfully ignorant. So yeah, in a way, knowledge is depression, but there is truth in what you find.
« Last Edit: Sat, 05 June 2010, 05:26:45 by Arc'xer »

Offline Rajagra

  • Posts: 1930
DPI vs CPI: The real deal?
« Reply #5 on: Sat, 05 June 2010, 05:54:57 »
Quote from: Arc'xer;189871
Really I think that DPI/CPI is not as simple as some make it out to believe.


But it should be. Any given mouse is capable of differentiating a certain number of different positions per inch. That is its native resolution. Everything else is just multipliers to achieve the desired response. Smoke and mirrors.

Even when you run a mouse at less than its maximum dpi, it probably achieves this internally using a divider before sending the info to the computer. (Losing information in the process.)

Many of the issues you mention are caused by the numerous stages of getting info from the mouse to the final app (interface protocol, driver software, Direct Input, Windows settings, game settings.) Each link introduces the possibility of data getting clipped/quantized/scaled. The lack of simplicity is not caused by the DPI question itself, it is caused by how the data is handled.

Offline obsolete

  • Posts: 19
DPI vs CPI: The real deal?
« Reply #6 on: Sun, 06 June 2010, 13:13:26 »
i think i might be able to help out
im not a mathematician and dont know much about optics

but i have played fps' at an invite level (could say pro, but its not a job for $)

so ya heres how i see it:

DPI/CPI/PPI are the same. PPI was the original term, not really used anymore. counts per inch (CPI) is more descriptive/correct than dots per inch (DPI), and thats all SteelSeries was saying.

so DPI is technically an incorrect term for CPI and means nothing on its own, but everyone calls CPI DPI because of marketing. so ill call it DPI too

DPI is how many positions are recorded for every inch traveled by the mouse; the increase of on-screen speed with increased DPI is a side effect.

So like Arc said, if you halve your sensitivity and double your DPI, it will be more accurate. I dont think all those decimal numbers were necessary to prove something so simple.
inches/counts
1/2 = 0.5
1/4 = 0.25
2 DPI means you move the mouse half an inch before the cursor moves
4 DPI means you move a quarter of an inch, which is more accurate

okay so more DPI is definitely better?
no not really.

because your screen is made of pixels, theres a certain point where more DPI becomes redundant. the screen only has so many dots, you could say.

you want 1 count to convert to 1 pixel when everything is done, to be as accurate as possible, so you never skip a pixel

calculating how much DPI is needed can be hard.
for a video game you could do it like this:

360 / fov × res / "per360 = DPI

fov = in-game field of view (degrees)
res = horizontal screen resolution
"per360 = inches per 360° turn in game
DPI = dots per inch for perfect pixel accuracy

so lets say:
90° fov
800x600
10" to do a 360

360/90 × 800/10 = 320 DPI needed in this example.
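As a sanity check, the rule of thumb in code (a sketch; the function name is mine):

```python
def min_dpi(fov_deg, res_x, inches_per_360):
    """360 / fov * horizontal resolution / inches per 360."""
    return 360 / fov_deg * res_x / inches_per_360

# 90 deg fov, 800 px wide, 10" per 360:
print(min_dpi(90, 800, 10))  # -> 320.0
```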

now in counter-strike 1.6 most people use low sensitivity and low resolution.
thats why they say high DPI is worthless, and its true under those conditions.
actually its worse than useless, its just bad. the way the source engine handles mouse movement is that while youre playing, a cursor is moving in the background. every time it hits the edge of the screen it is re-centered, and this causes slight negative acceleration in game. the higher your DPI, the more this happens.

its pretty obvious why they dislike high DPI and whos ignorant, just sayin
« Last Edit: Mon, 01 August 2011, 06:07:30 by obsolete »

Offline Rajagra

  • Posts: 1930
DPI vs CPI: The real deal?
« Reply #7 on: Sun, 06 June 2010, 14:02:02 »
Quote from: obsolete;190259
the way the source engine handles mouse movement is while youre playing the cursor is moving in the background. everytime it hits the edge of the screen it is re-centered and this causes slight negative acceleration in game. the higher your DPI the more this happens.


This is why I only use 64-bit mice: they have a much bigger 'virtual screen' area, so they suffer from that problem less.

Just kidding. I do think 'negative acceleration' is an unfortunate way to describe the problem. It could imply other kinds of failure (e.g. simply exceeding the ability of the sensor, so it slows down as you move faster). But I guess the phrase is here to stay.

Offline Arc'xer

  • Posts: 482
DPI vs CPI: The real deal?
« Reply #8 on: Sun, 06 June 2010, 14:17:50 »
Counter-Strike 1.6 and Source are non-DirectInput games. They use an X/Y overlay based on your desktop, which recenters itself in the middle of the desktop every time the cursor hits the edge. Higher resolutions can mitigate this, but since Counter-Strike sucks, its recoil is affected by higher resolutions - at least in 1.6, where lower resolutions reduce recoil.

The IMO 1.1 is the worst piece of **** mouse I've ever laid my hands on, along with the other Microsoft basic mice, since they all use the same engine. In fact, they've actually reduced the material quality over time; there are whole threads of people saying the Microsoft basic mice (WMO, IMO, IME) had better quality years ago than now, with even year-by-year changes.

I don't even know why people use that mouse; I can hit the max tracking speed so easily it's not even funny. It can hardly handle movements faster than 25 cm; past 28 it's dead in the water. I liked the 9000 FPS - it feels smoother at a 125 Hz polling rate than the DeathAdder at 125 Hz - but it just stops tracking like nobody's business. I believe with my sensitivity it's hitting 1.8-2.5 m tracking speed just before it reaches the end of my mouse pad, at 48 cm/360.

Your calculations aren't showing the sensitivity you use; DPI is the minimum in your calculations. 10" is around 3.5-4 sensitivity, so what you're basically saying is that you're using a 400 DPI mouse at 800x600, a resolution that at 6/11 basically means 2" between end points.

So, assuming that: 2 inches of your sensitivity is based on interpolation directly, either by reading 400 counts from beginning to end or by skipping pixels from beginning to end, i.e. 400 DPI with 1:2 interpolation.

But the same can be said of your in-game sensitivity of 3.5-4, assuming that's correct. So first, your resolution is higher than your DPI - by twice as much on the X axis and 1.5× on the Y axis. And second, every time you move in game, you're skipping 3.5-4 pixels to aim at your target.

I will say I'm ignoring the m_yaw/m_pitch of 0.022, even though I know it should be factored in.

Say you get an 800 DPI mouse: you can cut your in-game sensitivity in half, to 800 DPI at 1.75-2 sensitivity. If you get a 1600 DPI mouse, well, unfortunately you can't set the sensitivity below 1 in 1.6, but you can set it to 1 (though it'll feel like 4); in other games you can go lower. It'll all feel the same, but you're matching the pixels of your resolution and adding accuracy by reducing software interpolation, with the same feel as your 10"/360. And you're matching your screen's pixel count, at least on the X axis, returning you to 1:1. It doesn't hurt to have higher DPI, as you can just lower your sensitivity further and still maintain the same feel, while with higher software (in-game) sensitivity you are forced to skip pixels to achieve that distance.

The calculations I've posted, from all I've read and researched, are there to show the accuracy of higher DPI. Instead of saying 1/400, or a 400th of an inch, I'd say 0.0635 mm; at 800 it's 0.03175 mm - that is, 25.4 (one inch) divided by the DPI.

What I'm trying to show is that with higher DPI you're gaining accuracy by reading at far smaller intervals, along with lowering sensitivity so that it feels exactly like your old setup. You're not only more accurate, especially at higher resolutions, but you're reducing interpolation from sensitivities and reading at sub-pixel resolution; even if the game has its own sensitivity, your resolution still affects it.

As for negative acceleration: from what I've gathered, higher frame rates, resolution, and refresh rate help reduce or eliminate it. Even DirectInput games get negative acceleration with high DPI, though I believe that's my own problem.

Wolf:ET is a great example; it uses in_mouse 1 stock. And for some reason, a game that's almost 8 years old plays like **** on my computer, so no constant 125/250 FPS (if you know anything about the Quake engines, you know the benefit of higher frame rates). So even at 640x480, 1800 DPI, 0.4-0.6 sensitivity, it still gets massive negative acceleration.

But it's my fault: fluctuating frame rate and an ultra-low resolution. DirectInput is not a miracle, but it helps. I would upgrade my computer but don't have the money right now. Could also be Vista 64 itself; I recall people on other forums with XP and very good computers running Wolf:ET at a constant 125/250/333, but with Vista it would hardly reach 80-ish FPS - in other words, better frame rates in CoD4 than Wolf. So either it's an OS problem or something else is going on.

When my G500 was working, I tested 5700 DPI in CoD4, but a problem came up, either frame-rate-wise or a cap on how much dinput handles. I got negative acceleration - not really negative, but at a certain point, with the game's sensitivity so low, the sensitivity suddenly feels like someone turned up the gravity. So it could be my own fault or the game's fault; I think it's mine, really. That sensitivity was somewhere in the 0.08-0.05 area, if I remember.
« Last Edit: Sun, 06 June 2010, 14:25:49 by Arc'xer »

Offline kishy

  • Posts: 1576
  • Location: Windsor, ON Canada
  • Eye Bee M
    • http://kishy.ca/
DPI vs CPI: The real deal?
« Reply #9 on: Sun, 06 June 2010, 14:25:58 »
Quote from: ripster;190284
Wonder what the DPI is on Unicomp Endurapro and IBM M13 trackpoints?

Probably around 12.  Those suckers are unusable with modern monitors.


My M13 is usable on my 20" 1680x1050 LCD.

Wouldn't wanna play FPS games with it...but it's usable, and tolerable.
Enthusiast of springs which buckle noisily: my keyboards
Want to learn about the Kishsaver?
kishy.ca

Offline obsolete

  • Posts: 19
DPI vs CPI: The real deal?
« Reply #10 on: Mon, 07 June 2010, 03:28:18 »
We understand that higher DPI is more accurate. but like i said, you cant be more accurate than the number of pixels on your screen.
theres a certain DPI above which it becomes useless; the number depends on your setup

When you use the number of inches it takes to do a 360°, that already takes sensitivity into account. assuming 6/11 windows or direct input.
m_yaw/pitch @ the default 0.022 shouldnt affect anything; i think it keeps your mouse the same speed as in windows.

My logic was that if your fov is 90° then its 4 screens to do a 360, so you multiply your horizontal resolution by 4 (we take the horizontal distance because we're calculating based on the distance to do a horizontal 360). that gives the number of pixels, which you then divide by the number of inches to do a 360, and that gives you pixels/dots per inch, aka DPI

for your desktop you need the same amount of DPI as pixels. however, that also has the side effect of high DPI: the mouse speed will be too fast, making precision difficult. turning down the windows sensitivity is known to cause many problems.
so i guess you have to choose between skipping pixels, having a super fast cursor, or messing with windows sens (possibly causing the mouse to move faster in one direction than the other, or skipping pixels once again)
« Last Edit: Mon, 07 June 2010, 05:17:49 by obsolete »

Offline obsolete

  • Posts: 19
DPI vs CPI: The real deal?
« Reply #11 on: Mon, 07 June 2010, 03:32:59 »
now as for the imo 1.1 being a 'piece of ****' as you eloquently put it, sure, thats true in many ways. the normal version is slippery, the scroll wheel is buggy, it polls at 125 hz and has negative accel at that rate, it has rubber mouse feet, and it only costs $20.

I own 10 or 15 gaming mice, and use the cheapest, worst, oldest optical junk.
Why?
It's the lightest, the sensor is reliable and mounted in the center, it has a low lift-off distance and no angle snapping.

You can fix all of its problems: overclocking the USB port to 500Hz gives it a faster max tracking speed and gets rid of neg. accel. Gloss paint or a special version fixes the coating, add Teflon mouse feet, tighten the scroll wheel and you're done.

And one more thing: despite the IMO 1.1 having the highest frames per second of any sensor (9000), that number means nothing, since it's limited by the USB polling rate, which maxes out at 1000Hz overclocked.
« Last Edit: Mon, 07 June 2010, 03:36:27 by obsolete »

Offline Bullveyr

  • Posts: 386
  • Location: Austria
DPI vs CPI: The real deal?
« Reply #12 on: Mon, 07 June 2010, 04:57:50 »
@obsolete

Your formula is too simple, or to put it in other words: not correct.

R = ( pi * W ) / ( I * tan[ F / 2 ] )

where
W = screen resolution width
I = real sensitivity (distance per 360 turn)
F = horizontal fov

R = mouse resolution required

That would be the correct formula to calculate the min. CPI for "pixel perfect aiming", but since it gives you a lower number, you can use the simplified one (doesn't need a calculator :D) and still be on the safe side.
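A quick sketch of both formulas in Python, for comparison (the function names and example numbers are mine, not from the post):

```python
import math

def min_cpi_exact(width_px, inches_per_360, hfov_deg):
    """The formula above: R = (pi * W) / (I * tan(F / 2)), F in degrees."""
    return (math.pi * width_px) / (
        inches_per_360 * math.tan(math.radians(hfov_deg / 2)))

def min_cpi_simple(width_px, inches_per_360):
    """The simplified version (4 screen widths per 360 at FOV 90)."""
    return 4 * width_px / inches_per_360

# At FOV 90 the exact result is pi/4 (~79%) of the simplified one,
# so the simplified formula overestimates, i.e. stays on the safe side:
print(round(min_cpi_exact(1680, 10, 90), 1))  # 527.8
print(min_cpi_simple(1680, 10))               # 672.0
```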

Quote
and one more thing, despite the imo 1.1 having the highest frames per second of any sensor (9000) that number means nothing since its limited by the hz rate of the USB which maxes out overclocked at 1000.

No, the A9500 has higher FPS, and so does the Cypress OvationONS sensor (though the Cypress isn't comparable), but that's not the real point.
The FPS of the sensor and the polling rate have nothing to do with each other, because FPS matters for the tracking of the mouse, not for smoothness or responsiveness.

@Arc'xer

There is no interpolation from the ingame sensitivity.

The thing is, all you need is enough CPI; you don't really benefit from more, since the smallest thing you can do is move your crosshair by 1 pixel. The possibility to "aim between pixels" means nothing in practical gaming, especially at higher resolutions.

You can also look at what one count from the mouse really means in a 3D shooter environment, for example with my settings:

450 CPI (--> no ExactSense interpolation on my Xai) and 35cm/360° (+ 1680*1050 but that doesn't matter)

At a distance of 40m, 1 count makes a difference of around 4cm; sounds accurate enough to me.
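If anyone wants to check that figure, here's roughly how it falls out (a sketch; the helper name is mine):

```python
import math

def cm_per_count(cpi, cm_per_360, target_distance_m):
    """Sideways distance the crosshair covers on a target per mouse count."""
    counts_per_360 = cpi * cm_per_360 / 2.54   # counts emitted in a full turn
    angle_deg = 360.0 / counts_per_360         # angle turned by one count
    return 100 * target_distance_m * math.tan(math.radians(angle_deg))

# 450 CPI, 35 cm per 360, target at 40 m:
print(round(cm_per_count(450, 35, 40), 1))  # 4.1
```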

In a perfect world a certain amount of movement would always represent your crosshair moving 1 pixel but that isn't really achievable for different reasons.

You should also consider one thing: mouse sensors have a tendency to become less accurate at higher CPI; that's what a former R&D manager from SteelSeries told me.
So they send more movement data, but also more wrong data; of course that gets compensated to some degree by the lower ingame sensitivity.

Did you use your IMO 1.1 with a higher polling rate or with the stock 125Hz?

Quote
As for negative acceleration: from what I've gathered, higher frame rates, resolution, and refresh rate help reduce or eliminate it.


Yep, higher resolution means that the cursor (on the desktop in the background) takes longer to hit the edge, and FPS (among many other things) affects the intervals at which the cursor gets repositioned to the center.

W:ET doesn't use DirectInput but you can force it with mforce for example.

I never heard of someone getting busted by PB for using rinput or mforce.
You could also just lower your CPI, as I have just explained :D, although that might not be the best choice if you use a DeathAdder (does it still have a much lower max. ips at lower CPI settings?).
Many people use the DA with 3/11 Win pointer speed.

Final word: it doesn't really matter if you call it CPI, DPI or whatever, but CPI is the correct term, and it's the one used by all the sensor manufacturers.


Quote from: ripster;190284
Wonder what the DPI is on Unicomp Endurapro and IBM M13 trackpoints?

Probably around 12.  Those suckers are unusable with modern monitors.

You could try Enotus Mouse Test to find it out. Mouse Movement Recorder should also work.
Quote from: ripster;185750
Mechanical switches are mechanical.

Offline obsolete

  • Posts: 19
DPI vs CPI: The real deal?
« Reply #13 on: Mon, 07 June 2010, 06:13:16 »
Gotta remember I'm looking at things more practically (for shooting games), which is why I'd say the simple formula is the right one to use.
It's not perfect, but it's a rough calculation anyway. You can't measure 360° distance exactly, and if you calculate it, it's very likely to be wrong because the mouse won't have the exact DPI listed. Because of that, erring on the safe side is a good thing.

I think FPS would be important for responsiveness; it's just that every sensor already has over 1000. Good to know it affects tracking.
Too bad every mouse with the A9500 has tracking problems.
I'm wrong on the highest FPS; my excuse is I don't really consider lasers as sensors :)

but ya i agree with everything Bullveyr said
« Last Edit: Mon, 07 June 2010, 07:18:31 by obsolete »

Offline Bullveyr

  • Posts: 386
  • Location: Austria
DPI vs CPI: The real deal?
« Reply #14 on: Mon, 07 June 2010, 07:55:52 »
Quote from: obsolete;190506

i think FPS would be important for responsiveness, its just every sensor already has over 1000

They did right from the start; FPS way under 1000 would mean crap performance (low CPI and max. ips).

Quote
im wrong on the highest FPS, my excuse is i dont really consider lasers as sensors :)

The BlueTrack sensor in the SideWinder X8 goes up to 13,000 FPS. :D
Quote from: ripster;185750
Mechanical switches are mechanical.

Offline Arc'xer

  • Posts: 482
DPI vs CPI: The real deal?
« Reply #15 on: Mon, 07 June 2010, 11:56:58 »
Quote from: Bullveyr;190499
There is no interpolation from the ingame sensitivity.

The thing is, all you need is enough CPI; you don't really benefit from more, since the smallest thing you can do is move your crosshair by 1 pixel. The possibility to "aim between pixels" means nothing in practical gaming, especially at higher resolutions.


How does having more hurt? I remember reading about minimum DPI, that you only need a certain minimum, but at the same time people mentioned that higher doesn't hurt while lower does.

So the question then becomes whether DPI is a linear hardware modifier of sensitivity, while software is a non-linear modifier (due to the fact you can choose many individual numbers: double-digit, decimal). Why would it hurt having more, if you compensate by adjusting the software sensitivity to make it feel like your prior setup?

Ignoring the negatives of a few games, I don't get how more with compensation hurts, if the game is programmed well enough.

All this bandwidth clipping and whatnot makes it sound like it's 1990; it's 2010.

Quote from: Bullveyr;190499
You can also look at what one count from the mouse really means in a 3D shooter environment, for example with my settings:

450 CPI (--> no ExactSense interpolation on my Xai) and 35cm/360° (+ 1680*1050 but that doesn't matter)


Doesn't matter?

Why bother playing with the monitor on if it doesn't matter? Unless you're playing some older game, I just see you throwing away a 90 dollar mouse.

Quote from: Bullveyr;190499
At a distance of 40m, 1 count makes a difference of around 4cm; sounds accurate enough to me.


Wait a minute, let me get this straight. At 40 meters, which is practically nothing in real life (it's almost point-blank if you think about it), you're telling me you're deviating by 10mm, or 1cm, per 10 meters.

So if you ever play a game with hundreds of meters of distance, at 400 meters you're trying to aim with a deviation of 40cm, or 400mm.

So in other words, if you aim at a target, there's a chance he's inside one of your blind spots, where the mouse has difficulty providing the accuracy you desire. That doesn't sound accurate enough.

Offline Bullveyr

  • Posts: 386
  • Location: Austria
DPI vs CPI: The real deal?
« Reply #16 on: Mon, 07 June 2010, 17:23:04 »
Quote from: Arc'xer;190537
How does having more hurt? I remember reading about minimum DPI, that you only need a certain minimum, but at the same time people mentioned that higher doesn't hurt while lower does.

I never said that higher CPI hurts in general, apart from neg. accel. in older games (like you experience in W:ET) and the "technical" inaccuracy at high CPI, but that you don't benefit in real-world gaming from more than a certain amount of CPI.

It's all about the fact that we don't need more CPI in most cases, and the sensor manufacturers should concentrate on other things.
Also, people shouldn't care that much about CPI when looking for a new mouse.

Quote
So the question then becomes whether DPI is a linear hardware modifier of sensitivity, while software is a non-linear modifier (due to the fact you can choose many individual numbers: double-digit, decimal). Why would it hurt having more, if you compensate by adjusting the software sensitivity to make it feel like your prior setup?

Ignoring the negatives of a few games, I don't get how more with compensation hurts, if the game is programmed well enough.

Ingame sensitivity is still linear, just with way smaller steps.

I'm not saying that people should decrease their CPI because it's better, although it most likely is in your W:ET case :wink:, but that they don't really benefit from raising it, or from buying a mouse with more CPI.

Quote
Doesn't matter?

Why bother playing with the monitor on if it doesn't matter? Unless you're playing some older game, I just see you throwing away a 90 dollar mouse.

It doesn't matter for the calculation provided, because the game doesn't care about my resolution or display.

I'm not throwing away anything, and I wouldn't even if it were a free mouse. :D

Quote
Wait a minute, let me get this straight. At 40 meters, which is practically nothing in real life (it's almost point-blank if you think about it), you're telling me you're deviating by 10mm, or 1cm, per 10 meters.

So if you ever play a game with hundreds of meters of distance, at 400 meters you're trying to aim with a deviation of 40cm, or 400mm.

So in other words, if you aim at a target, there's a chance he's inside one of your blind spots, where the mouse has difficulty providing the accuracy you desire. That doesn't sound accurate enough.

Just so we're on the same page (dunno if you ever played RtCW, but I couldn't find one from W:ET and I'm no mapper, so I can't measure one myself).

That distance is supposed to be around 25m, although I dunno if it's a guess or if they actually measured it.

[screenshot of an RtCW map illustrating the distance]

Source

In RtCW or W:ET a lot of the fighting is done under 40m.

This doesn't really apply to all the fancy modern shooters with their crappy iron sights. In those games you can't hit **** from the hip at 40m, and with iron sights the calculation would be different (zoom and changed sensitivity).

As mentioned before, with my settings I can aim at every pixel, and pixels are actually pretty small on a 20"; in other words, I can hit everything I see.

With my settings (1680*1050 and FOV=120), the center pixel on the screen represents a square of around 6*6cm at a distance of 40m.
At 400m that's a square of 60*60cm, so you might not even see a person (thinner than 60cm) at that distance, depending on how the game handles such a situation.
In the best-case scenario the person would be 1*3 pixels, and I would be able to set my crosshair on him.
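For reference, that kind of pixel-footprint estimate can be reproduced roughly like this (a sketch assuming a standard perspective projection; the exact figure depends on how the projection is modeled, so it lands in the same ballpark rather than exactly on the numbers above):

```python
import math

def pixel_footprint_cm(width_px, hfov_deg, distance_m):
    """Approximate real-world width covered by one center-screen pixel."""
    # Small-angle estimate of the angle one pixel subtends at screen center.
    pixel_angle_rad = 2 * math.tan(math.radians(hfov_deg / 2)) / width_px
    return 100 * distance_m * pixel_angle_rad

# 1680 px wide, FOV 120:
print(round(pixel_footprint_cm(1680, 120, 40), 1))   # one pixel at 40 m
print(round(pixel_footprint_cm(1680, 120, 400), 1))  # one pixel at 400 m
```

Whatever the exact projection, the footprint scales linearly with distance, which is why the 400m figure is ten times the 40m one.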

Nevertheless, in a game with such distances you would have a zoom option, and from the hip your weapon wouldn't be accurate enough.

It doesn't help me that I could theoretically (with higher CPI) hit him in the left eye, right eye, belly button, a certain finger, etc., if I can only aim my crosshair at one of those 3 pixels.
I hope you get what I want to say. ;)
Quote from: ripster;185750
Mechanical switches are mechanical.

Offline obsolete

  • Posts: 19
DPI vs CPI: The real deal?
« Reply #17 on: Mon, 07 June 2010, 22:09:01 »
Arc,

there are some reasons for lower DPI:
-mainly the negative accel that you can't just 'ignore', considering at least three of the most popular games today have it (CS, CSS, TF2)
-moving around your desktop or game menus is annoying with high DPI
-sensors tend to send a higher % of incorrect reports at higher DPI (does this matter enough to make a difference? maybe not)

Now if I use the optimal setup for most games:
-CRT monitor with high rates
-low sensitivity

what reason do I have to use high DPI?
« Last Edit: Mon, 07 June 2010, 22:13:36 by obsolete »

Offline Arc'xer

  • Posts: 482
DPI vs CPI: The real deal?
« Reply #18 on: Tue, 08 June 2010, 00:09:02 »
Quote from: obsolete;190770
Arc,

there are some reasons for lower DPI:
-mainly the negative accel that you can't just 'ignore', considering at least three of the most popular games today have it (CS, CSS, TF2)
-moving around your desktop or game menus is annoying with high DPI
-sensors tend to send a higher % of incorrect reports at higher DPI (does this matter enough to make a difference? maybe not)

Now if I use the optimal setup for most games:
-CRT monitor with high rates
-low sensitivity

what reason do I have to use high DPI?


Does it look like I don't already know that? Why do you keep saying I don't know that?

I honestly don't care about those games; it's amazing that those piece of **** games are even popular. I mean, for ****'s sake, Counter-Strike is the worst FPS ever created. How in the hell does a person come up with a game where a gun can miss an entire magazine of rounds with the barrel literally touching the person? It's a piece of **** over-relying on headshot stupidity; a headshot is a given, not a must. If I ran around trying to shoot people in the head in the military, I'd probably get demoted to the soup kitchen because of all the wasted rounds I fired.

I honestly hate Valve games; their engines feel so nasty and sloppy. It's an embarrassment that they twisted the Quake engine into such a disgusting mutant. I don't know what the hell is wrong with the Half-Life engines, but there is something seriously wrong with that series of engines.

I honestly don't know why I ever wasted my money on that game. The list of bull**** that goes on with that game is so beyond belief. And then when they make a second version of it, you expect them to fix the problems, but no, it's the same thing all over again: same problems, same effects.

And I don't play W:ET; it's dead, and has been dead and dying for years.

What reason? What reason? After all that I wrote, you have to ask what reason.

I don't care anymore. I posted my own examinations from extended research, and I see the same bull**** that occurs in other forums with the whining about DPI and this and that. I'm gonna end it here; I'm not gonna drag this **** out any longer.

Offline cndrmn

  • Posts: 1
DPI vs CPI: The real deal?
« Reply #19 on: Mon, 23 August 2010, 04:56:14 »
Quote from: Bullveyr;190499
You could try Enotus Mouse Test to find it out. Mouse Movement Recorder should also work.


I just wanted to add that the first link produces a virus warning for me.

Also, I was very sad to read that the WMO is apparently a "piece of ****" :( it's my favourite mouse

Offline Bullveyr

  • Posts: 386
  • Location: Austria
DPI vs CPI: The real deal?
« Reply #20 on: Tue, 24 August 2010, 08:36:24 »
False positive; at least I never had a problem despite the warning.

Who said the WMO is a piece of ****?
Quote from: ripster;185750
Mechanical switches are mechanical.

Offline EverythingIBM

  • Posts: 1269
DPI vs CPI: The real deal?
« Reply #21 on: Tue, 24 August 2010, 13:16:37 »
Ch_123's and my responses seem to be missing.

Well, I wanted to say I thought the ScrollPoint Pro WAS 800 DPI, but the ones listed at Lenovo's site say they were discontinued in 2001 -- I bought mine in 2008. Plus they never had a picture of the later model with the glowing stick.

So, mine very well could be 800 DPI, and older ones 400 DPI?
Keyboards: '86 M, M5-2, M13, SSK, F AT, F XT

Offline Glymbol

  • Posts: 4
DPI vs CPI: The real deal?
« Reply #22 on: Sun, 12 September 2010, 08:33:40 »
Quote from: Arc'xer;190288
Counter-Strike 1.6 and Source are non-DirectInput games. They use an X/Y overlay based on your desktop, which recenters itself in the middle of the desktop every time it hits the edge.

Actually it recenters every frame, so FPS also affects negative acceleration. Here's the formula for maximum horizontal speed (on the mousepad) in CS 1.6; it should hold for other X/Y-overlay based games:
Code: [Select]
max_horizontal_speed = (resolution_H * FPS * 2.54) / (2 * DPI * 100)  [m/s]
Just set 640x480, lower your FPS (fps_max 50), use 1600 CPI and you will notice massive negative acceleration ingame. max_horizontal_speed for these settings will be 0.25 m/s.
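The formula is easy to play with in a few lines of Python (a sketch; the function name is mine):

```python
def max_horizontal_speed(resolution_h, fps, dpi):
    """Fastest mousepad speed in m/s before the hidden cursor overlay
    hits the screen edge between per-frame recenters."""
    return (resolution_h * fps * 2.54) / (2 * dpi * 100)

# The worst-case example from the post: 640 px wide, fps_max 50, 1600 CPI:
print(max_horizontal_speed(640, 50, 1600))  # 0.254
```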

Quote from: obsolete;190487
We understand that higher DPI is more accurate, but like I said, you can't be more accurate than the number of pixels there are on your screen.

Yes you can. DPI has nothing to do with precision in CS 1.6 (only with speed). There's no "pixel skipping", because that is only possible in a 2D environment. In 3D you rotate around two axes; the 3D scene is generated every frame and presented on a 2D screen. I know it sounds funny, but you can actually turn by an angle smaller than "1 pixel" and still see something change, even at 640x480. Using the "simplified CPI formula": 4 * 640 = 2560 pixels.

For example, in CS 1.6 the smallest angle of horizontal turn is m_yaw * sensitivity. You can lower m_yaw under 0.022 but not m_pitch, so let's stay with the default 0.022 value, otherwise you will end up with different X/Y speeds. The smaller the angle, the better the precision, so sensitivity 1.0 would be the best setting.
You have 360° / (0.022° * 1.0) = 16363 possible positions to aim at in a full turn. As you see, that's much more than 2560, yet turning by 0.022° still changes the view.
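That count works out like this (a small sketch; the function name is mine):

```python
def aim_positions(m_yaw=0.022, sensitivity=1.0):
    """Distinct horizontal aiming positions in a full 360 turn."""
    return 360.0 / (m_yaw * sensitivity)

print(int(aim_positions(sensitivity=1.0)))  # 16363
print(int(aim_positions(sensitivity=2.0)))  # 8181
```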

Of course with a 400 CPI mouse and the above settings, turning will be extremely slow (~104 cm/360°). You set sensitivity to 2.0 to get a sensible speed, but now the smallest angle is twice as large (0.044°) and you have "only" 8181 possible aiming positions. However, if you use an 800 CPI mouse you can stay at sensitivity 1.0 and the speed will remain the same.
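The ~104 cm figure, and why 800 CPI at sensitivity 1.0 gives the same speed as 400 CPI at 2.0, can be checked like this (a sketch, names mine):

```python
def cm_per_360(cpi, m_yaw=0.022, sensitivity=1.0):
    """Mouse travel needed for a full turn, from CPI and per-count angle."""
    counts_per_360 = 360.0 / (m_yaw * sensitivity)  # counts in a full turn
    return counts_per_360 / cpi * 2.54              # inches -> cm

print(round(cm_per_360(400, sensitivity=1.0)))  # 104
print(round(cm_per_360(400, sensitivity=2.0)))  # 52
print(round(cm_per_360(800, sensitivity=1.0)))  # 52
```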

The question is: what angle is small enough to be able to aim at a target at maximum distance? I believe in CS 1.6 everything up to sensitivity 3.0 is fine. If this is too slow, you should use a higher CPI mouse and lower the sensitivity accordingly.

The IMO 1.1 has negative acceleration on its own, and even at a 500Hz polling rate it's noticeable for me. Of course you can live with that if you really don't like angle snapping. Otherwise a mouse with the Avago ADNS-3060 would be a better choice.
« Last Edit: Sun, 12 September 2010, 10:31:43 by Glymbol »

Offline vicariouscheese

  • Posts: 56
DPI vs CPI: The real deal?
« Reply #23 on: Wed, 22 September 2010, 21:58:14 »
There is no perfect mouse; everything has issues. MS typically has negative accel, the MX518 has prediction, the DeathAdder skips ridiculously on certain surfaces, etc etc. And these are the ones pro gamers use :P

Whoever said the WMO is crap... how come it's probably the most widely used mouse among professional gamers? (Not counting those that have to use hardware from their sponsors; obviously they use what they're required to.) High DPI and the newest laser tracking may be better on the spec sheet, but there are barriers that keep the top players from using them (plus once you're used to certain hardware, people don't change it unless they have to, for the most part).

To the dude who posted the TF2 jump vid: are you wonderland? Props if you are ;) I'm a TF2 player, 400 DPI, 3.0 in game. Nowhere near as good a rocket jumper as that guy though~

Offline Glymbol

  • Posts: 4
DPI vs CPI: The real deal?
« Reply #24 on: Wed, 29 September 2010, 17:37:13 »