Can anyone suggest a framework matched to these requirements?
Read data from MS Excel and insert it into a database.
Read data from MS Excel, combine it with the data already stored in the DB, and do some manipulations.
Create output in MS Excel.
Each Excel file will be approximately 1~2 GB in size.
The process runs as scheduled jobs, where files are read from a Windows directory or an FTP location.
I am thinking of Spring Batch; are there any recommended frameworks that work well for reading MS Excel?
Apache POI is the primary Java library for working with Excel sheets.
The DB part would be handled using JDBC, possibly with an ORM layer (like JPA) on top of it.
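For the JDBC half of that pipeline, batching the inserts matters once the row counts get large. Here is a minimal sketch; the `measurements` table, its columns, and the batch size of 500 are all assumptions for illustration, not anything from your setup:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class BatchInsertSketch {

    // Split rows into fixed-size chunks so each JDBC batch stays bounded.
    static <T> List<List<T>> chunk(List<T> rows, int size) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += size) {
            batches.add(rows.subList(i, Math.min(i + size, rows.size())));
        }
        return batches;
    }

    // Illustrative only: table and column names are made up for the example.
    static void insertBatch(Connection conn, List<String[]> rows) throws SQLException {
        String sql = "INSERT INTO measurements (name, value) VALUES (?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            for (String[] row : rows) {
                ps.setString(1, row[0]);
                ps.setDouble(2, Double.parseDouble(row[1]));
                ps.addBatch();      // queue the row locally
            }
            ps.executeBatch();      // one round trip per batch, not per row
        }
    }

    public static void main(String[] args) {
        List<String[]> rows = new ArrayList<>();
        for (int i = 0; i < 1050; i++) {
            rows.add(new String[] { "row" + i, String.valueOf(i) });
        }
        // With a batch size of 500, 1050 rows yield 3 batches.
        System.out.println("batches=" + chunk(rows, 500).size());
    }
}
```

Spring Batch's chunk-oriented step does essentially the same thing for you (read N items, write them in one transaction), so if you go that route you get this batching behavior out of the box.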
Thanks for your reply.
Sorry, I missed some details in my query.
Actually it is for a scheduled batch program, which reads CSV files with more than 1 million records, then needs to perform aggregation and some calculations (for data forecasting), and then write the results to Excel sheets.
Nothing in that additional information changes the recommendation.
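For the CSV-reading side in particular, Java 8 streams let you process the file lazily, line by line, instead of loading a million records into memory at once. A rough sketch, assuming plain comma-separated rows with a header and no quoting (the column layout here is invented for the example):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.stream.Stream;

public class CsvStreamSketch {

    // Sum the numeric second column; the stream is consumed lazily,
    // so memory use stays flat regardless of file size.
    static double sumSecondColumn(Stream<String> lines) {
        return lines
                .skip(1)                                  // header row
                .map(line -> line.split(","))
                .mapToDouble(cols -> Double.parseDouble(cols[1]))
                .sum();
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("demo", ".csv");
        Files.write(tmp, Arrays.asList("name,value", "a,1.5", "b,2.5", "c,4.0"));
        // Files.lines reads the file one line at a time.
        try (Stream<String> lines = Files.lines(tmp)) {
            System.out.println("sum=" + sumSecondColumn(lines));
        }
        Files.deleteIfExists(tmp);
    }
}
```

For the Excel output at the end, POI's streaming SXSSF workbook is designed for exactly this kind of large write, since it flushes rows to disk instead of keeping the whole sheet in memory.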
It seems the subject line is misleading.
What I need is a suitable framework or better APIs with which I can do some statistical operations.
For example, the program has to read a CSV file of 1 million records and aggregate it with a similar file containing another 1 million records.
Any suggestions on what to choose between Java 8 streams, Jython, or any stream-based ORMs?
Two problems: 1) APIs for statistical operations, to reduce the amount of Java code.
2) Better performance while handling this much data in memory.
Considering these factors, are there any recommendations, please?
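Plain Java 8 streams can already cover the "aggregate two files by a common key" case with `Stream.concat` and `Collectors.groupingBy`. A minimal sketch, assuming each row is a simple `key,value` pair (the keys and values below are made up):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class MergeAggregateSketch {

    // Group "key,value" rows from both sources by key and sum the values.
    static Map<String, Double> aggregate(Stream<String> a, Stream<String> b) {
        return Stream.concat(a, b)
                .map(line -> line.split(","))
                .collect(Collectors.groupingBy(
                        cols -> cols[0],
                        LinkedHashMap::new,              // keep first-seen key order
                        Collectors.summingDouble(cols -> Double.parseDouble(cols[1]))));
    }

    public static void main(String[] args) {
        // In the real job these would be Files.lines(...) over the two CSVs.
        Stream<String> file1 = Stream.of("east,10", "west,5");
        Stream<String> file2 = Stream.of("east,3", "north,7");
        System.out.println(aggregate(file1, file2));
        // prints {east=13.0, west=5.0, north=7.0}
    }
}
```

Note the memory caveat: the source streams are lazy, but the resulting map holds one entry per distinct key, so this works well when the number of distinct keys is much smaller than the number of rows. For richer statistics (variance, percentiles, regression for the forecasting part), a library such as Apache Commons Math would reduce the hand-written Java code.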