Is there any way to retrieve the precise DPI on Android devices (not the predefined ldpi, mdpi, hdpi, xhdpi, xxhdpi, xxxhdpi buckets), the way you can on iOS?
what problem are you trying to solve?
I’m working on an application for schoolbooks and trying to make some nice tools like a millimeter ruler, a protractor… so users could measure real physical things with them on their tablets.
I know I could add a calibration option, but it would be a nice feature to avoid that and have it exact without calibration.
OK, well … bad news
there is no exact DPI under Android
this post explains it well: Android Screen Density Inaccuracies
by quoting Dianne Hackborn (Android framework engineer)
so here are the relevant quotes
The density and densityDpi is an abstract density bucket the device manufacturer has decided makes sense for their UI to run in. This is what is used to evaluate things like “dp” units and select and scale bitmaps from resources.
The xdpi and ydpi are supposed to be the real DPI of the screen… though as you’ve seen, many devices don’t set it correctly. This is our fault, it isn’t actually used anywhere in the platform, so people don’t realize they have a bad value, and we haven’t had a CTS test to try to make sure it is sane (it’s not clear how that test should work). Worse, we shipping the original Droid with a graphics driver that reports the wrong value here… in fact that reported that same damnable 96.
Unfortunately, I don’t have a good solution if you want to get the real exactly screen dots per inch. One thing you could do is compare xdpi/ydpi with densityDpi and if they are significantly far apart, assume the values are bad and just fall back on densityDpi as an approximation. Be careful on this, because a correctly working device may have densityDpi fairly different than the real dpi — for example the Samsung TAB uses high density even though its screen’s really density is a fair amount lower than 240.
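Hackborn’s fallback heuristic above can be sketched as a small helper. On Android you’d feed it `DisplayMetrics.xdpi` and `DisplayMetrics.densityDpi`; the 20–25% tolerance here is my own arbitrary choice for illustration, not anything the platform defines:

```java
// Sketch of the "compare xdpi with densityDpi" heuristic quoted above.
// On Android the inputs would come from:
//   DisplayMetrics dm = getResources().getDisplayMetrics();
//   double dpi = DpiHeuristic.plausibleDpi(dm.xdpi, dm.densityDpi);
public class DpiHeuristic {
    static double plausibleDpi(double reportedXdpi, int densityDpi) {
        double ratio = reportedXdpi / densityDpi;
        // If the reported value is wildly off the density bucket,
        // assume it is bogus and fall back on densityDpi as an
        // approximation (tolerance chosen arbitrarily).
        if (ratio < 0.8 || ratio > 1.25) {
            return densityDpi;
        }
        return reportedXdpi;
    }
}
```

Note the caveat from the quote still applies: even a correct device (like the Samsung Tab) can have a densityDpi far from its real DPI, so this is only a guess, not a guarantee.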
Android framework engineer
Well a device is more likely to have a density that is higher than its real dpi rather than lower; making it higher makes things UI larger and thus more readable and usable, while making it smaller quickly makes the UI so small that it doesn’t work. I wouldn’t want to go any lower than what you’d see on say the G1, which is 180dpi and uses the mdpi (160) density. At any rate, the whole point of density is that it is not directly tied to the real screen dpi. It is quantized, so that there are a limited number of densities applications need to deal with. Manufacturers have some flexibility in picking it based on the feel they want for their device. In the case of Samsung, they wanted a larger more easily touched UI. Others may want the same thing, for example to make it easier for people who have poor eyesight to use their device or whatever other reason.
I’ve [another poster] also just looked at my AC100 (Tegra 250), and it reports xdpi = 160, yet the correct value should be about 120.

As you say, this seems to be completely broken, and the Tegra example shows that 96 cannot be used as a sentinel for a wrong value.
Well fortunately that one isn’t a compatible device so won’t have Market. :} Out of curiosity, what are you trying to do? I know you said you are trying to display a ruler, but what exactly do you want it to be used for? Just for people to hold stuff up to it to measure?
One of the reasons the devices haven’t been good in this regard is because after introducing these APIs, we have never actually found a single place in the standard UI where they should be used, so nobody realizes they are shipping with bad values. sigh
Android framework engineer
Thanks for this info. I’ll stick to calibration option then.
see this archived blog post
Supporting the multiple screen sizes of multiple devices in Adobe AIR
that’s pretty solid advice
now for displaying rulers and such, calibration is the way to go, because phone vendors may deliberately pick a “smaller” or “bigger” density than the screen’s real DPI