discarding numeric primitives and using BigDecimal

 
Mohit Sinha
Ranch Hand
Posts: 125
Hi All,

I'd like your thoughts on this. For a new engagement we plan to discontinue the use of numeric primitives as a programming practice and strictly use BigDecimal instead. I just want to know whether there are any perils we may encounter along the way. I believe normal and advanced mathematical calculations can all be handled with BigDecimal.

The reason we are resorting to this approach is a problem raised quite frequently in this and other Java forums: primitives don't support null checks and assume a default value instead. That default gets transparently set on the Hibernate domain objects when they are persisted, so instead of a NULL indicator in the database we end up with the primitive's default value.
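To make the concern concrete, here's a minimal sketch of the field-default behaviour (the class and field names are purely illustrative, not from our actual domain model):

```java
public class DefaultsDemo {
    // Illustrative domain object, similar to a Hibernate entity.
    static class Account {
        int balanceCents;     // primitive field: defaults to 0, can never hold null
        Integer creditLimit;  // wrapper field: defaults to null, maps to a nullable column
    }

    public static void main(String[] args) {
        Account a = new Account();
        System.out.println(a.balanceCents); // 0 -- an ORM would persist 0, not NULL
        System.out.println(a.creditLimit);  // null -- an ORM can persist this as SQL NULL
    }
}
```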

Let me know your thoughts.

 
Campbell Ritchie
Marshal
Posts: 79239
Primitives only assume default values when they are fields.

Horses for courses. The average engineer uses floating-point arithmetic because the imprecision is tolerable. The average banker should avoid floating-point arithmetic because the imprecision is not tolerable. There will be a performance overhead from not using primitives, but I expect that will be tolerable.

And remember that null behaves differently in SQL than it does in Java. And always remember to specify a rounding mode when dividing.
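For instance, a quick sketch of why the rounding mode matters: one divided by three has no terminating decimal expansion, so BigDecimal refuses to guess rather than silently truncate:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DivideDemo {
    public static void main(String[] args) {
        BigDecimal one = BigDecimal.ONE;
        BigDecimal three = new BigDecimal("3");

        try {
            one.divide(three); // no rounding mode: non-terminating expansion
        } catch (ArithmeticException e) {
            System.out.println("divide() threw: " + e.getMessage());
        }

        // With an explicit scale and rounding mode the result is well-defined.
        System.out.println(one.divide(three, 10, RoundingMode.HALF_UP)); // 0.3333333333
    }
}
```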
 
Mohit Sinha
Ranch Hand
Posts: 125
Thanks Campbell.

You mention that nulls behave differently in SQL than in Java. What's your point here? My main concern is how to avoid setting a value on the Java persistent object when it is persisted to the database, when I know beforehand that its corresponding value in the Java space is null.
 
Campbell Ritchie
Marshal
Posts: 79239
If you try a numeric comparison with null in Java - for example, unboxing a null Integer - you get a NullPointerException.
If you try null comparisons in SQL you get NULL as the result. SQL uses three-valued logic - TRUE, FALSE, and NULL, where NULL is rather like "don't know" - while Java uses two-valued logic: true and false.
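A short sketch of the difference: in Java a reference comparison against null is safe, but a numeric comparison forces auto-unboxing and throws, whereas the equivalent SQL comparison would just evaluate to NULL:

```java
public class NullCompareDemo {
    public static void main(String[] args) {
        Integer boxed = null;

        System.out.println(boxed == null); // true -- reference comparison is safe

        try {
            System.out.println(boxed > 0); // auto-unboxing a null Integer
        } catch (NullPointerException e) {
            System.out.println("unboxing null threw NullPointerException");
        }

        // In SQL, by contrast, "NULL > 0" evaluates to NULL (unknown),
        // so a WHERE clause silently filters the row out instead of failing.
    }
}
```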
 
Master Rancher
Posts: 4830
I think there are two issues here: primitives vs. reference types, and floating-point vs. BigDecimal. The issues overlap in some ways, but I think they should be considered separately nonetheless.

1. Primitives vs. reference types: If you need to represent a null that is distinct from a default value, then yes, you probably need to use reference types for the fields of persistent objects - at least for the fields that map to nullable columns. However, this does not necessarily imply BigDecimal. A common alternative is to use the wrapper classes: Integer, Long, Double, Float, etc. If nullability is your only concern, there's no need to switch to BigDecimal.

Note also that if a column is declared NOT NULL, then there's no reason to allow nulls in the model field either. In this case I usually prefer to use primitives, specifically because they don't allow nulls to slip in where they shouldn't. Disallowing nulls then makes other coding easier, as you don't have to insert null checks in your code later on. You get fewer NullPointerException bugs if nulls aren't possible in the first place.

So yes, if a column must be nullable, use a reference type to represent that column in the model class. But if the column isn't nullable, I encourage you to use a primitive type. Unless you have some other reason to avoid a primitive. Such as...

2. Floating-point vs. BigDecimal: If you're dealing with money, you probably want to use BigDecimal. Well, mathematically literate programmers could just as easily use int or long and transpose the decimal point (e.g. record $1.23 as 123 cents), but that sort of thinking seems to be out of vogue these days; oh well. So sure, use BigDecimal for monetary quantities. However, for many other kinds of values primitives are just fine, even superior; I discussed this recently here. Also, many quantities are inherently integers, with no decimal point needed - using int, long, or even BigInteger communicates that much more clearly than BigDecimal does.
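A sketch of the cents-as-integer idea, alongside the floating-point drift it avoids:

```java
public class CentsDemo {
    public static void main(String[] args) {
        // $1.23 stored as 123 cents: exact integer arithmetic, no BigDecimal needed.
        long priceCents = 123;
        long totalCents = priceCents * 3; // 369 cents, exactly

        System.out.printf("$%d.%02d%n", totalCents / 100, totalCents % 100); // $3.69

        // The classic double drift, for contrast:
        System.out.println(0.1 + 0.2); // 0.30000000000000004
    }
}
```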

So, I suggest you choose your data types first by ignoring nullability concerns. Just consider whether you need precisely-rounded decimals, or whether readability, speed, and/or storage space are more important to you. Then consider nullability - if you were going to use an int, but need it to be nullable, use an Integer instead.
 