When handling multi-device resolutions, the most important factor is the average viewing distance of the end user, as that determines how best to scale the user interface.
The reason is that the effective DPI is directly proportional to how far you are from the screen.
Viewing distance will range from 0.15m (6in) to 3.00m (10ft); on average, 72 DPI is comfortable for the widest range of eyesight.
Let's say for a moment we have a 24" Full HD (1080p) display.
That means our display dimensions are 53.1cm (20.9in) x 29.9cm (11.8in), which works out to a native 92 DPI.
This essentially means the ideal viewing distance is going to be about 2m (6ft 6in)... which, of course, isn't the distance most people will actually be from their display.
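That display-dimension and DPI arithmetic can be sketched as follows (the function name and rounding choices are my own, not from the text):

```python
import math

def display_dpi(diagonal_in, res_w, res_h):
    """Native DPI plus physical width/height (inches) of a display,
    derived from its diagonal size and pixel resolution."""
    aspect = res_w / res_h
    # Pythagoras: diagonal^2 = width^2 + height^2, with width = height * aspect
    height_in = diagonal_in / math.sqrt(aspect ** 2 + 1)
    width_in = height_in * aspect
    return res_w / width_in, width_in, height_in

dpi, w, h = display_dpi(24, 1920, 1080)
print(round(dpi), round(w, 1), round(h, 1))  # -> 92 20.9 11.8
```

The same function works for any 16:9 (or other aspect) panel, which is handy for checking the later numbers in this piece.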
Mobile: 0.15 - 0.60m
Desktop / Laptop: 0.90 - 1.00m
Console: ~3m
As a note, this is why you might've heard talk of "designing for the 10ft experience".
So why is this important? Because the effective DPI scales against the native DPI depending on distance.
That is to say, you ideally want to be ~2m from the display above... but what if you're at 3m instead?
In that case the effective DPI of the display increases from 92 to 138 (92 × 3/2), and so an image will appear approximately 66% of its intended size; meaning we want to increase it to 150% of the original for it to be represented exactly the same.
And this works in reverse as well.
Let's say that we're viewing at 1m instead of the ideal 2m... in that case the effective DPI will drop to 46, and we'll want to reduce the image to 50% of its size to again retain the same apparent size.
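A minimal sketch of this distance model (function names are my own; the linear distance-to-DPI relationship is taken from the text above):

```python
def effective_dpi(native_dpi, distance_m, ideal_m=2.0):
    """The document's model: perceived pixel density grows
    linearly with viewing distance past the ideal distance."""
    return native_dpi * (distance_m / ideal_m)

def ui_scale(native_dpi, distance_m, ideal_m=2.0):
    """Factor to apply to UI sizes so they appear the same at any distance."""
    return effective_dpi(native_dpi, distance_m, ideal_m) / native_dpi

print(effective_dpi(92, 3.0))  # -> 138.0 (the 3m case)
print(ui_scale(92, 3.0))       # -> 1.5   (scale up to 150%)
print(ui_scale(92, 1.0))       # -> 0.5   (scale down to 50%)
```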
Now, strictly speaking, let's set the viewing distance aside for the moment and just determine how this changes statically with the resolution output.
That is to say, we want to design and develop our application at a "default" resolution... again, let's use 1080p (Full HD), as this is pretty much an industry standard.
How does changing our resolution affect our DPI, and how do we then correct for this programmatically so that our user interface is ALWAYS the same size regardless of the resolution the end user wants to use?
For this we'll use a more standard 32" display... and we'll design against a 1080p resolution (as this will be the most common).
This means our display DPI is going to be ~69, and that's what we want to keep as the effective UI DPI.
If we check the various "standard" resolutions (720p / 900p / 1080p / 1440p / 1620p / 1800p / 2160p) on that display, we can work out the native DPI for each:
46 - 57 - 69 - 92 - 103 - 115 - 138
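Those values can be reproduced with a few lines (the helper name is my own; the geometry assumes a 16:9 panel):

```python
# Native DPI of each common 16:9 resolution on a 32" panel.
# Width of a 16:9 display: diagonal * 16 / sqrt(16^2 + 9^2)
WIDTH_IN = 32 * 16 / (16 ** 2 + 9 ** 2) ** 0.5  # ~27.9 inches

def native_dpi(height_px):
    width_px = height_px * 16 // 9
    return round(width_px / WIDTH_IN)

dpis = [native_dpi(h) for h in (720, 900, 1080, 1440, 1620, 1800, 2160)]
print(dpis)  # -> [46, 57, 69, 92, 103, 115, 138]
```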
Remember, '69' here is our 100% scaling value, i.e. the reference DPI. In essence we handle this (in pseudocode) as:
Scale = [Native] / [Reference]
So a native DPI of 138 (2160p) gives Scale = 2.0, meaning we draw the UI at twice its base pixel size, while a native DPI of 46 (720p) gives Scale ≈ 0.67.
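A runnable sketch of that resolution scale factor, the ratio of native to reference DPI (the constant and function names are my own):

```python
REFERENCE_DPI = 69  # 32" 1080p: our 100% scaling baseline

def resolution_scale(native_dpi, reference_dpi=REFERENCE_DPI):
    """Scale factor so UI elements keep the same physical size
    as the output resolution (and thus native DPI) changes."""
    return native_dpi / reference_dpi

print(resolution_scale(138))  # -> 2.0  (2160p: draw the UI at 200%)
print(resolution_scale(46))   # 720p: roughly 0.67, i.e. ~67%
```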
This then ends up as our scaling factor for resolution. Now, if we want to take the device type into account (i.e. mobile / desktop / console), that is to say "Are we holding the device?", "Are we sitting at a desk, or with it on our lap?", or "Are we sitting on a couch on the other side of the room?", then we'd have a second differential calculation, for viewing distance, which we then multiply against our resolution differential.
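Combining the two differentials might look like this; note the per-device distances and the desktop baseline below are illustrative assumptions on my part, not values from the text:

```python
REFERENCE_DPI = 69        # 32" 1080p baseline from the resolution section
REFERENCE_DISTANCE = 1.0  # assumption: treat desktop viewing distance as 100%
DEVICE_DISTANCE_M = {"mobile": 0.4, "desktop": 1.0, "console": 3.0}  # assumed

def combined_scale(native_dpi, device):
    """Final UI scale = resolution differential x distance differential."""
    res_scale = native_dpi / REFERENCE_DPI
    dist_scale = DEVICE_DISTANCE_M[device] / REFERENCE_DISTANCE
    return res_scale * dist_scale

print(combined_scale(69, "console"))  # -> 3.0 (same DPI, but 3x the distance)
```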
All of this results in a close approximation of scaling that retains the same UI sizes across all devices and viewing distances.
As a note, you can also use this for resolution scaling (i.e. having different virtual and physical resolutions), so you could even have the game output itself remain a consistent size. That's more difficult and problematic, though, as the virtual resolution is going to determine performance, and really you're best off aiming to keep a similar performance level across devices and graphics hardware.