Jesper de Jong wrote:Did you see the whole video - you could write this using tau as
which looks at least as elegant as the version with pi.
There are only two hard things in computer science: cache invalidation, naming things, and off-by-one errors
Pat Farrell wrote:A majority of my students got it wrong. Don't know if that reflects on them or on me.
Mike Simmons wrote:Eh, I'd blame Java for mimicking C, which probably got it from Algol or somewhere.
Stephan van Hulst wrote:If # were to be used for integer division, most of my programs would end up using # over /.
Stephan van Hulst wrote:I think / is more clear, especially when you look from a discrete math point of view.
Mike Simmons wrote:Hmmm, really? For me it seems very much the other way around. Perhaps it's our respective backgrounds: I studied physics and engineering in school, and most of my early programming was to perform calculations for those disciplines. Whereas I see you're studying computer science. Perhaps the problems I was given were biased towards real numbers, while yours were biased towards integers?
Does discrete mathematics not include rational numbers? I don't remember ever being taught that 1/3 = 0 was an acceptable answer. Until computer programming, when I was taught that's simply what you get, regardless of whether it makes sense. But to the rest of the mathematically-inclined world (as far as I know), 1/3 is a ratio, very much NOT equal to 0. And 0.3333333333 is not exactly equal to 1/3, but it's at least a fairly close approximation, much better than 0 or 1. How does 1/3 = 0 make sense in general? I would certainly agree that it's what you want, sometimes. But I don't see how it makes more sense than 0.3333333333, as a default. And if the result must be an int, in general I am just as likely to want Math.floor() as Math.ceil() or Math.round(). (And yes, I know none of these exactly maps to integer conversion when both positive and negative values are considered, but I'm ignoring that extra messiness as beside the point.)
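For anyone following along, the behaviors being debated above are easy to demonstrate; here's a minimal sketch (the class name is mine) showing truncating division, the negative-operand wrinkle, and the rounding alternatives mentioned:

```java
public class IntegerDivisionDemo {
    public static void main(String[] args) {
        System.out.println(1 / 3);                    // 0  -- int / int truncates toward zero
        System.out.println(1 / 3.0);                  // 0.3333333333333333 -- one double operand promotes the division
        System.out.println(-7 / 2);                   // -3 -- truncation toward zero, not floor
        System.out.println(Math.floorDiv(-7, 2));     // -4 -- floor division differs for negative operands
        System.out.println((int) Math.ceil(1 / 3.0)); // 1  -- round up
        System.out.println(Math.round(1 / 3.0));      // 0  -- round to nearest
    }
}
```

Note that `/` truncates toward zero while `Math.floorDiv` rounds toward negative infinity, which is exactly the "extra messiness" with negative values alluded to above.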
(And I feel much the same about numeric overflow being silently ignored when using ints. That's another discussion, but most of my arguments are the same in spirit, if not in detail.)
Pat Farrell wrote:I say Phooey to Tau.
Don't get me started about those stupid light bulbs. |