Actually, NO values of ch are greater than -10 and less than 10. When you read a '0' from the console, its value as an integer is 48 -- the Unicode code point of the character '0'. Likewise, '1' is 49, '2' is 50, and so on. The odd thing here is that you seem to realize this already, since you subtract '0' from ch before using the numeric value.
The idiomatic Java way to do this is to use the static method Character.isDigit() to test whether a character is a digit, and Character.digit() to convert a character to a number. That way your program works internationally, not just in ASCII-using locales like the US.
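A minimal sketch of those two calls (the sample characters are my own; '٤' is U+0664, the Arabic-Indic digit four, included to show the non-ASCII case):

```java
public class DigitDemo {
    public static void main(String[] args) {
        // '7' is an ASCII digit; '\u0664' is the Arabic-Indic digit four.
        char[] chars = {'7', '\u0664', 'x'};
        for (char ch : chars) {
            if (Character.isDigit(ch)) {
                // Character.digit handles any Unicode decimal digit,
                // not just '0'..'9', so no '- 0' arithmetic is needed.
                int value = Character.digit(ch, 10);
                System.out.println(ch + " -> " + value);
            } else {
                System.out.println(ch + " is not a digit");
            }
        }
    }
}
```

This prints `7 -> 7`, `٤ -> 4`, and `x is not a digit` -- the subtraction trick would only have worked for the first character.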