Hi Paull, I'm sorry, I thought I answered this one earlier, but somehow I didn't post my answer. Perhaps I hit preview instead of submit.
Anyway, number one with a bullet is lambdas and method refs. They let you do things that would be just too cumbersome using anonymous classes. I hope you'll permit me the liberty of reaching back to Java 7 for another feature, which is the try-with-resources statement. This (relatively) small feature made it really easy to ensure that resources are properly terminated, even when you're using more than one at the same time. It was such a pain to do this correctly before this feature was added that virtually no one did it. Another feature that had a really big impact is streams. When used for appropriate tasks, they can really improve the expressiveness of the language. Unfortunately they're also prone to overuse and abuse.
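To make the try-with-resources point concrete, here's a sketch of a file-copy method (the class name, method name, and buffer size are mine, chosen for illustration). Both streams are closed automatically, in reverse order of declaration, even if the copy loop throws; before Java 7 this required nested try-finally blocks that almost nobody wrote correctly:

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class CopyDemo {
    // Two resources in one try-with-resources statement: both are
    // closed automatically, in reverse declaration order, even if
    // read() or write() throws partway through the loop.
    static void copy(String src, String dst) throws IOException {
        try (InputStream in = new FileInputStream(src);
             OutputStream out = new FileOutputStream(dst)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) > 0)
                out.write(buf, 0, n);
        }
    }

    public static void main(String[] args) throws IOException {
        Path src = Files.createTempFile("twr-src", ".txt");
        Path dst = Files.createTempFile("twr-dst", ".txt");
        Files.write(src, "hello, try-with-resources".getBytes());
        copy(src.toString(), dst.toString());
        System.out.println(new String(Files.readAllBytes(dst)));
    }
}
```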
Maybe you have asked the wrong question. Maybe it should be, “What sort of factors have necessitated new features?” In the case of lambdas, it was probably the increasing popularity of functional languages and the difficulty of writing anonymous classes that drove their adoption. Maybe streams were introduced because they make it easier to exploit multi-core processors. Maybe by the turn of the decade, Java was looking like old-fashioned twentieth-century imperative programming. Maybe it was necessary to keep up with modern programming practice in order to maintain the language's market share.
Author and "Sun God"
posted 2 years ago
Hello Ritchie. I agree with you about lambdas, though I would claim that it's a dangerous precedent: in this case it was a fine addition to the language, but as a rule "everyone else is doing it" is a bad reason to add a feature to a language (or, for that matter, to do pretty much anything else). Languages have a "feel" and a community. If a new feature harms the feel, or fails to address the needs of the community, it can make a language worse. At one time in the distant past, Sun was going to add XML literals to Java, because XML was all the rage. Thankfully, it took a while to do it, and by the time it seemed feasible, XML literals had fallen out of fashion. I think pretty much everyone is glad that they did not become part of the language.

As for streams, Brian Goetz told me that they were not added primarily to make it easier to use multicore machines, which is a good thing: as discussed in Item 48 ("Use caution when making streams parallel"), the great majority of stream pipelines will see no benefit from parallelization, and some will degrade catastrophically, or even fail to compute the correct results. This doesn't mean that there are no good uses for parallel streams: when the conditions are right, they can result in linear speedup in the number of cores, with little effort on the part of the programmer. What it does mean is that you should not parallelize streams indiscriminately. You should only do so when you can prove that parallelism will preserve correctness, and you have a strong indication that it is likely to make your stream pipeline run faster. The requisite conditions are discussed in some detail in Item 48. Also, you have to remember that parallelization is merely an optimization, and you must test the performance of your code before and after parallelizing to find out if the optimization was justified.
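As one hedged illustration of a pipeline where those conditions do tend to hold (the class and method names here are mine, not from the book): the source, `LongStream.rangeClosed`, splits cheaply into evenly sized chunks, and `sum()` is an associative, side-effect-free reduction, so `.parallel()` is at least safe with respect to correctness. Whether it's actually faster still has to be measured:

```java
import java.util.stream.LongStream;

public class ParallelDemo {
    // A pipeline that is safe to parallelize: the range source
    // splits cheaply and evenly, and sum() is an associative
    // reduction with no shared mutable state. Correctness is
    // guaranteed; any speedup must still be verified by measurement.
    static long sumTo(long n) {
        return LongStream.rangeClosed(1, n)
                         .parallel()
                         .sum();
    }

    public static void main(String[] args) {
        System.out.println(sumTo(10_000_000L)); // 50000005000000
    }
}
```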
posted 2 years ago
Thank you. This book looks to be as good as the previous editions.
Josh's answer to another question in this forum also applies here:
Joshua Bloch wrote: I do think that Java should continue mining functional programming for good ideas, but I don't think Java should adopt all the FP constructs present in Scala. If people want to program in Scala, they can! Java is simpler than Scala, and more of a "blue collar language." That's the soul of Java. Of course it's been moving further from that ideal at least since generics (especially wildcards) were added in Java 5. It's a tough balancing act. You want to provide useful facilities, but at the same time a language only has space for a certain amount of complexity until it becomes unwieldy. Java is already becoming unwieldy. So my answer is, yes Java should consider more functional programming features, but it should only include them if they're really compelling, and easily understandable to Java's target audience.
It isn't always clear what's too complex. As an illustrative example (from Item 45), it's unclear which of these two methods of computing the Cartesian product is superior:
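The two versions look roughly like this (a sketch after Item 45's newDeck example; I've renamed the methods so both can live in one class, and the enum/class names are my reconstruction):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class Decks {
    enum Suit { SPADE, HEART, DIAMOND, CLUB }
    enum Rank { ACE, DEUCE, THREE, FOUR, FIVE, SIX, SEVEN,
                EIGHT, NINE, TEN, JACK, QUEEN, KING }

    static final class Card {
        final Suit suit;
        final Rank rank;
        Card(Suit suit, Rank rank) { this.suit = suit; this.rank = rank; }
    }

    // Iterative Cartesian product: two nested for-each loops.
    static List<Card> newDeckIterative() {
        List<Card> result = new ArrayList<>();
        for (Suit suit : Suit.values())
            for (Rank rank : Rank.values())
                result.add(new Card(suit, rank));
        return result;
    }

    // Stream-based Cartesian product: flatMap maps each suit to a
    // stream of its cards, then flattens the per-suit streams.
    static List<Card> newDeckStream() {
        return Stream.of(Suit.values())
            .flatMap(suit ->
                Stream.of(Rank.values())
                    .map(rank -> new Card(suit, rank)))
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(newDeckIterative().size()); // 52
        System.out.println(newDeckStream().size());    // 52
    }
}
```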
Here's what I say in Item 45:
Which of the two versions of newDeck is better? It boils down to personal preference and the environment in which you’re programming. The first version is simpler and perhaps feels more natural. A larger fraction of Java programmers will be able to understand and maintain it, but some programmers will feel more comfortable with the second (stream-based) version. It’s a bit more concise and not too difficult to understand if you’re reasonably well-versed in streams and functional programming. If you’re not sure which version you prefer, the iterative version is probably the safer choice. If you prefer the stream version and you believe that other programmers who will work with the code will share your preference, then you should use it.
posted 2 years ago
Thanks for your response, you've had a lot to wade through.
I completely agree that the "everyone else is doing it" argument is a weak one.
Your response to the original question made me think of a follow-up question I've had for a while regarding streams.
When used for appropriate tasks, they can really improve the expressiveness of the language. Unfortunately they're also prone to overuse and abuse.
What are the appropriate tasks for streams?
I don't really use them. Part of that is that I'm currently working with a lot of legacy code.
Sundar pointed to one use that you mentioned, which seems to be simplifying loop constructs.