
Download Large files using Servlet / Tomcat

 
Preetha Varma
Greenhorn
Posts: 5
Hi,

We have a website used for downloading large files, as large as 6 GB and sometimes larger. All of the files are stored on an FTP server.

The website is a JSP/Servlet/Tomcat combination. A code sample is below.

I need a solution to increase the download speed. I understand it is bound by network bandwidth, but what steps should we take when serving such large files?

I have read about multipart downloads and gzip streaming. Can you please suggest the best solution, with some samples?


Thanks
Preetha
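As an illustration of the kind of copy loop such a download servlet typically contains (a sketch, not the actual code from this thread; class and method names are made up for the example), the core is reading from the FTP input stream and writing to the servlet response output stream:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class DownloadCopy {

    // Copies the source stream (e.g. from the FTP server) to the sink
    // (e.g. the servlet response output stream). A larger buffer, say
    // 64 KB instead of the common 4 KB, reduces the number of read/write
    // calls for multi-gigabyte files.
    public static long copy(InputStream in, OutputStream out, int bufferSize)
            throws IOException {
        byte[] buffer = new byte[bufferSize];
        long total = 0;
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
            total += read;
        }
        out.flush();
        return total;
    }
}
```

Note that a loop like this streams at whatever rate the slowest link allows; no change to the loop itself can push data faster than the network carries it.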
 
William Brogden
Author and all-around good cowpoke
Rancher
Posts: 13078
6
Short answer: NO

All of the servlet / JSP tools are designed for cycles of request/response completed in a short time.

FTP and servlets each handle only a single connection stream; if FTP is too slow, there is no servlet solution.

Look into protocols like BitTorrent which can support multiple streams.

Bill

 
Preetha Varma
Greenhorn
Posts: 5
Thank You Bill.

So you are saying that using a BitTorrent API like http://mpetazzoni.github.io/ttorrent/ should help me speed up the download process?

Thanks
Preetha
 
Ulf Dittmer
Rancher
Posts: 42972
73
Start by defining exactly how fast it needs to be. A requirement such as "it needs to be faster" will get you nowhere. I suspect that the most efficient way will be to use a fast server with fast network adapters and a fast network. But 6 GB is sizable, so I also suspect that downloading will always "feel" too slow. Maybe a better approach is to reconsider whether all that data really needs to be downloaded, or whether an incremental approach would be possible. (I'm not mentioning compression because I'm assuming that you have already investigated various algorithms and found them lacking.)

But in any case, start by defining precise requirements.
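One practical refinement along these lines (not something the posters above spelled out, just a common technique): honoring the HTTP Range header lets clients resume an interrupted 6 GB download instead of starting over. A sketch of parsing a single-range header, with illustrative names:

```java
// Sketch: parse a single-range HTTP Range header ("bytes=start-end")
// so a servlet can resume an interrupted download with a 206 response.
public class RangeParser {

    // Returns {start, endInclusive} or null if the header is
    // absent, malformed, or unsatisfiable for the given file length.
    public static long[] parse(String rangeHeader, long fileLength) {
        if (rangeHeader == null || !rangeHeader.startsWith("bytes=")) {
            return null;
        }
        String spec = rangeHeader.substring("bytes=".length());
        int dash = spec.indexOf('-');
        if (dash < 0) {
            return null;
        }
        String startPart = spec.substring(0, dash);
        String endPart = spec.substring(dash + 1);
        long start;
        long end;
        if (startPart.isEmpty()) {
            // Suffix range: "bytes=-500" means the last 500 bytes.
            long suffix = Long.parseLong(endPart);
            start = Math.max(0, fileLength - suffix);
            end = fileLength - 1;
        } else {
            start = Long.parseLong(startPart);
            end = endPart.isEmpty() ? fileLength - 1 : Long.parseLong(endPart);
        }
        if (start > end || start >= fileLength) {
            return null;
        }
        return new long[] { start, Math.min(end, fileLength - 1) };
    }
}
```

The servlet would then seek to `start` in the source stream, set `Content-Range`, and return status 206 (Partial Content).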
 
Preetha Varma
Greenhorn
Posts: 5
Thanks again for the reply.

In this case, I am trying to improve the speed. 6 GB files take close to 2 hours to download on a normal home network. What would be your suggestion for improving that performance, based on the code attached in the first post?

The files could be anything: Word documents, PDFs, executables, etc.

Is there anything we can try? Which algorithms do you assume I would have tried? Can you please shed some light?

regards
Preetha
 
Ulf Dittmer
Rancher
Posts: 42972
73
The class libraries have zip and gzip compression in java.util.zip.

Do I understand correctly that this is a single file? That'd be pretty long; I don't know that a human being would want to handle such a file :-) nor that standard desktop apps would be happy to do so. Maybe it can be broken down into several files? That would also make it easier to implement the incremental approach I mentioned (by not downloading the parts that have not changed).

Lastly, since you mention a home network, make sure it's not the download speed limit of that connection you're running into.
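For reference, a minimal round trip with the java.util.zip classes mentioned above looks like this (a sketch; helper names are made up):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipDemo {

    // Compress a byte array with GZIP.
    public static byte[] gzip(byte[] data) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(data);
        }
        return bos.toByteArray();
    }

    // Decompress a GZIP byte array back to the original bytes.
    public static byte[] gunzip(byte[] compressed) throws IOException {
        try (GZIPInputStream gz =
                new GZIPInputStream(new ByteArrayInputStream(compressed))) {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            byte[] buffer = new byte[8192];
            int read;
            while ((read = gz.read(buffer)) != -1) {
                bos.write(buffer, 0, read);
            }
            return bos.toByteArray();
        }
    }
}
```

Keep in mind that text and XML compress very well, while already-compressed formats (most PDFs, media files, zipped executables) barely shrink at all, which is why measuring the actual gain on your real files comes first.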
 
Dave Tolls
Ranch Hand
Posts: 2809
30
The only code-level change you could make would be to see whether that buffer needs to be bigger.
But that is likely to be small potatoes compared to the other possible causes of slowness mentioned by others here.
 
Preetha Varma
Greenhorn
Posts: 5
These are single individual files. This is a website where we share files with clients, and we do not want to give FTP access for security reasons.

I mentioned the home network because the client could be accessing from any network. The same happens from my office network too. It's slow.

So what is a good practice for sharing large files? How do other teams handle this?
 
Ulf Dittmer
Rancher
Posts: 42972
73
1) figure out how much of a difference compression makes

2) determine hard requirements for download times. Without that you're wandering in the dark.

3) what "security reasons" prevent use of FTP? There is no difficulty in making FTP secure.
 
Preetha Varma
Greenhorn
Posts: 5
Thanks for suggestions.

Do you think I should also give some thought to Apache Flume and Hadoop? I have never worked in this area, but they talk about large data transfers.
 
Ulf Dittmer
Rancher
Posts: 42972
73
That has nothing to do with your problem. Answer the questions I asked, and we might be getting somewhere.
 
William Brogden
Author and all-around good cowpoke
Rancher
Posts: 13078
6
This question occurs to me: how much does this data change between downloads? Also, how frequently do people update?

1. Each time completely different - you are stuck with moving the entire thing

2. Small changes but frequent - consider an approach like the big online games use - only download the updates. If Guild Wars 2 had to completely DL 19 GB every update, I would never get a chance to play

This will require some serious programming effort on your part to define a way to map content so that it can be marked as changed or unchanged. Sounds like fun to me.

Bill
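The content-mapping effort described above can be sketched as a per-chunk checksum manifest (an illustrative design, not something from the thread): split the file into fixed-size chunks, hash each, and have the client fetch only the chunks whose hashes differ from its local copy.

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ChunkManifest {

    // Hash each fixed-size chunk of the content with SHA-256.
    public static List<byte[]> build(byte[] content, int chunkSize)
            throws NoSuchAlgorithmException {
        List<byte[]> hashes = new ArrayList<>();
        for (int off = 0; off < content.length; off += chunkSize) {
            int len = Math.min(chunkSize, content.length - off);
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            md.update(content, off, len);
            hashes.add(md.digest());
        }
        return hashes;
    }

    // Indices of chunks whose hashes differ between the client's old
    // manifest and the server's new one: only these need downloading.
    public static List<Integer> changedChunks(List<byte[]> oldManifest,
                                              List<byte[]> newManifest) {
        List<Integer> changed = new ArrayList<>();
        for (int i = 0; i < newManifest.size(); i++) {
            if (i >= oldManifest.size()
                    || !Arrays.equals(oldManifest.get(i), newManifest.get(i))) {
                changed.add(i);
            }
        }
        return changed;
    }
}
```

Combined with Range requests to fetch individual chunks, this turns a 6 GB re-download into transferring only what actually changed; this is essentially what rsync and game patchers do.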
 