Is R designed to meet the performance requirements of large-scale enterprise applications, in terms of high-speed and high-frequency processing, concurrent computing, and scalability, whether used on its own or integrated with other languages such as Java or C++? Does the book R in Action cover these topics?
With the massive stores of data now being collected, this is an increasingly important question. R was originally designed to handle moderate to large amounts of data (in the megabyte to gigabyte range). It keeps data in memory, which makes for a snappy interactive experience but imposes limits on very large datasets. Most users keep their data in external databases or data warehouses and access portions of it through R's extensive DBMS access routines.
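As a minimal sketch of that pattern, the example below keeps the full dataset in an external database and pulls only the needed slice into R's memory. It assumes the DBI and RSQLite packages are installed; the table and column names are hypothetical, and a real deployment would connect to a production DBMS rather than an in-memory SQLite database.

```r
# Hypothetical sketch: query a database from R, loading only a subset into memory.
library(DBI)

# In-memory SQLite database stands in for an external data warehouse
con <- dbConnect(RSQLite::SQLite(), ":memory:")
dbWriteTable(con, "sales", data.frame(region = c("east", "west"),
                                      amount = c(100, 250)))

# Only the rows matching the query are brought into R's memory
east_sales <- dbGetQuery(con, "SELECT * FROM sales WHERE region = 'east'")
print(east_sales)

dbDisconnect(con)
```

The same `DBI` interface works with backends for PostgreSQL, MySQL, and other databases, so the R code stays the same while the heavy storage lives outside R.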
Appendix G in "R in Action" describes working with large datasets.
Thanks for your reply, Robert.
I was thinking that R should offer some built-in scalability support to every package, so that package developers and R users could focus on their scientific computing or application integration without having to worry about performance, particularly scalability. Maybe that is asking too much, since it is hard for any computing language to deliver.