Most of these voltage/current meters can only go up to about 60 V, because that's the maximum voltage the measurement IC (or the linear voltage regulator powering it) can handle.
By powering the measurement IC (microcontroller, multimeter IC) from a separate power source (a coin cell battery or something), the meters could in theory support more.
However, most of these cheap meters use a simple 10-bit ADC, which means they use a voltage divider to bring the voltage down to something the microcontroller can tolerate (let's say 5 V), and then an integrated ADC converts that 0-5 V into a 0-1023 interval.
So each step is in theory about 0.0049 V at the ADC ... for a 50 V input range that means a 0.048 V step at the input ... then keep in mind that in reality the ADC is not perfect and there's a +/- 3-5 step error (some are even worse), so really I'd consider the minimum usable step about 0.1 V if the meter is designed to work with up to 50-60 V.
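The arithmetic above can be sketched like this (the 5 V reference, 50 V full scale and +/-4 LSB error figure are the assumptions from the paragraph, not specs of any particular meter):

```python
# Rough resolution/error estimate for a cheap 10-bit ADC meter,
# assuming a 5 V ADC reference and a divider scaled for 50 V full scale.
ADC_COUNTS = 2 ** 10          # 10-bit ADC -> 1024 steps
V_REF = 5.0                   # ADC reference voltage (assumed)
FULL_SCALE = 50.0             # input range the divider maps onto 0-5 V (assumed)

step_at_adc = V_REF / ADC_COUNTS          # volts per count at the ADC pin
step_at_input = FULL_SCALE / ADC_COUNTS   # volts per count referred to the input

# Total error of a cheap integrated ADC: a few counts (assumed +/-4 LSB here)
worst_case_error = 4 * step_at_input

print(f"{step_at_adc:.4f} V/count at the ADC")      # 0.0049
print(f"{step_at_input:.4f} V/count at the input")  # 0.0488
print(f"+/-{worst_case_error:.3f} V worst case")    # ~0.2 V, hence the ~0.1 V claim
```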
Then you have to keep in mind that the resistors used in the voltage divider will gradually warm up from the voltage regulator, the microcontroller and the current shunt near them, and as they warm up they drift a bit (every resistor has a temperature coefficient), so the accuracy goes down a bit more there... these cheap meters don't use the more expensive resistors (as in 20-40 cents a piece, 0.5-1% tolerance) which drift much less as they warm up.
As for current...
These cheap meters also use kinda average current shunts with a crappy temperature coefficient: as they warm up from the current going through them, their resistance changes a bit, and the microcontroller, which uses the voltage drop across the resistor to measure the current, won't adjust for it, because there's no temperature sensor to keep track of that.
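A quick sketch of how much that costs, with assumed numbers (a 10 mΩ shunt, a 300 ppm/°C tempco and a 40 °C rise at load are illustrative guesses, not values from any specific meter):

```python
# Error from shunt self-heating when the firmware assumes a fixed resistance.
R_SHUNT_COLD = 0.010     # ohms, the value the firmware calibrates against
TEMPCO = 300e-6          # per °C (assumed, plausible for a cheap shunt)
DT = 40.0                # °C of self-heating at load (assumed)

i_true = 5.0                              # actual current, amps
r_hot = R_SHUNT_COLD * (1 + TEMPCO * DT)  # shunt resistance once warm
v_drop = i_true * r_hot                   # the voltage the ADC actually sees
i_reported = v_drop / R_SHUNT_COLD        # firmware still divides by cold value

print(f"reports {i_reported:.2f} A for a true {i_true} A")  # reads ~1.2% high
```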
If price was not an issue, I'd design a current meter using a hall effect sensor IC like
http://www.digikey.com/product-detail/en/ACS710KLATR-12CB-T/620-1335-1-ND/2179483
http://www.digikey.com/product-detail/en/ACS711ELCTR-12AB-T/620-1370-1-ND/2470594
for up to, let's say, 10 A of current... and maybe something with better accuracy and higher voltage at the output like this
http://www.digikey.com/product-detail/en/ACS714ELCTR-05B-T/620-1258-1-ND/1955900
for 0-1 A, for better resolution/accuracy... and/or maybe use a relay to switch in a current shunt and measure the voltage drop on it only when the current is under 1 A.
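For reference, reading one of these bidirectional hall sensors from a microcontroller boils down to a scale-and-offset on the ADC value; the sensitivity and supply numbers below are placeholders (they vary by part and scale with VCC, so take the real figures from the specific sensor's datasheet):

```python
# Converting a ratiometric hall sensor's ADC reading into amps.
# SENS and the VCC/2 offset are PLACEHOLDERS; use the datasheet values
# for the actual part, since both are ratiometric to the supply.
VCC = 3.3                 # sensor and ADC supply, volts (assumed)
SENS = 0.110              # volts per amp (assumed/placeholder)
ADC_COUNTS = 1024         # 10-bit ADC referenced to VCC

def adc_to_amps(raw):
    v_out = raw * VCC / ADC_COUNTS    # ADC counts -> volts at the sensor output
    return (v_out - VCC / 2) / SENS   # bidirectional parts idle at VCC/2

print(adc_to_amps(512))   # mid-scale reading -> 0 A
print(adc_to_amps(700))   # some positive current
```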
The hall effect sensors don't put a resistor in the current path, so there's essentially no voltage drop, which would also be a good thing when you power a circuit from a low voltage source with limited current output, like a coin cell battery or something... basically any time the voltage drop on a current shunt resistor could be enough to matter.
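To see the difference, compare the burden voltage of a typical shunt against the tiny conductor resistance of a hall sensor IC (both resistance values below are assumed, round-number figures for illustration):

```python
# Burden voltage: shunt resistor vs hall sensor current path (illustrative).
R_SHUNT = 0.1            # ohms, a common shunt value (assumed)
R_HALL_PATH = 0.0012     # ohms, ballpark conductor resistance of a hall IC (assumed)

i = 0.5                  # amps drawn by the load
print(f"shunt drop: {i * R_SHUNT * 1000:.0f} mV")          # 50 mV
print(f"hall path drop: {i * R_HALL_PATH * 1000:.2f} mV")  # 0.60 mV
```

On a 3 V coin cell, losing 50 mV across the shunt is already a noticeable chunk of the supply, while the hall sensor's drop is negligible.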