How come are transformation methods of Spark, like map, reduce, and filter, also in Scala (without Spark)?

 
Monica Shiralkar
Ranch Hand
Posts: 2954
In Spark, there are transformation methods like map, reduce, and filter. These methods are specific to Spark and are called Spark transformations. How come these methods are also present in the Scala programming language (without Spark)?

Thanks
 
Bartender
Posts: 15735
Mapping, filtering and reducing are some of the most essential operations in any functional language. Scala needs them regardless of whether you use Spark or not.
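For example, on a plain Scala collection these methods run locally and eagerly, each one returning a new in-memory collection (a minimal sketch; the values are made up for illustration):

val numbers = List(1, 2, 3, 4, 5)
val doubled = numbers.map(_ * 2)           // List(2, 4, 6, 8, 10)
val evens   = doubled.filter(_ % 2 == 0)   // List(2, 4, 6, 8, 10)
val sum     = evens.reduce(_ + _)          // 30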

A better question is: why does Spark for Scala feel that it needs separate versions of these functions?
 
Monica Shiralkar
Ranch Hand
Posts: 2954
I think the answer to what you called the better question is that Spark needs these methods to do their intended work across a cluster of machines, so it requires its own versions of them. That is what I think; I may be wrong.
 
Ranch Hand
Posts: 376
Spark's API was inspired by Scala's collections API, which has those methods (filter, map, flatMap, reduce, etc.). When invoked on a Scala collection, they run locally and return a new collection. With Spark, the same-named methods are invoked on the RDD API and return a transformed RDD whose computation runs on the Spark cluster.
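To illustrate the contrast, here is the same pipeline on Spark's RDD API (a minimal sketch; the local[*] master and the app name are just placeholders for demonstration):

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("rdd-example").setMaster("local[*]")
val sc   = new SparkContext(conf)

val rdd     = sc.parallelize(Seq(1, 2, 3, 4, 5))
val doubled = rdd.map(_ * 2)              // lazy transformation: nothing runs yet
val evens   = doubled.filter(_ % 2 == 0)  // lazy transformation: extends the lineage
val sum     = evens.reduce(_ + _)         // action: triggers distributed execution, returns 30

sc.stop()

The key difference: the collection version computes each step immediately in local memory, while the RDD version only records the transformations until an action like reduce forces them to run, potentially across many machines.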
 