For example, variable "a" might be true, false, or null. Would something like the following,
whose intent is to return either true or false (by "converting" null to false), cause conniptions for you or your team?
(3) Does your answer depend on which falsy value(s) you're converting from? For example, null -> false is OK, but converting from other things like an empty string, zero, undefined, or NaN is not?
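To make the question concrete, here's a minimal sketch of the kind of code I mean (the names are just illustrative):

```javascript
// Illustrative sketch: collapse a three-state value (true / false / null)
// down to a strict boolean. !!null === false, so null "converts" to false.
function toBoolean(a) {
  return !!a;
}
```

With this, toBoolean(null) and toBoolean(false) both yield false, while toBoolean(true) yields true.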
Bear Bibeault wrote:(P.S. I gave you a cow not only for an interesting topic, but for your clever subject)
Bear Bibeault wrote:3) No, depends on context (does it make sense where it is being used)
I've used !! for such purposes twice in the past week, and in both cases I think it makes the code clearer. I just needed to step back for a minute and make sure I wasn't in toddler-with-a-hammer mode.
I'll mention my code at our next dev team meeting and will be prepared to fight off charges of "excessive cleverness" or the like.
While I was thinking about this kind of stuff yesterday, I went poking around jQuery source and found the following:
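(The snippet didn't survive pasting here, so this is a condensed sketch of the pattern, not the verbatim jQuery source; the shape follows my description below:)

```javascript
// Condensed sketch (not verbatim jQuery source): a closure-scoped flag
// that is only ever undefined or true, with a getter that uses !! so
// callers always see a strict boolean.
function makeCallbacks() {
  var fired; // undefined until the callbacks fire

  return {
    fire: function () {
      if (!fired) {   // true whether fired is undefined or false
        fired = true;
        // ...invoke the registered callbacks here...
      }
    },
    fired: function () {
      return !!fired; // undefined -> false, true -> true
    }
  };
}
```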
So at first glance I thought this was pretty clever stuff (and possibly an argument in my favor regarding the use of !!): the "getter" function fired always returns true or false, even though the underlying value is only ever undefined or true (in the current code base).
But then I started to wonder: in this case, why not just initialize fired to false when declaring it and avoid the !! entirely? The only code I see testing the value of fired tests it as "!fired" -- which evaluates to the same value (true) whether fired is undefined or false. Is there something going on here that I'm not understanding, or should I just chalk it all up to a style preference?
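For what it's worth, the test really can't tell the two apart -- ! treats undefined and false identically:

```javascript
// !x coerces its operand to a boolean and negates it, so undefined
// and false both negate to true; the !fired check can't distinguish
// the two possible initializations.
var firedUninitialized;      // undefined
var firedFalse = false;

console.log(!firedUninitialized); // true
console.log(!firedFalse);         // true
```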