
network binary file truncation

 
yoram kanfi
Greenhorn
Posts: 3
Hello,
Thank you in advance...
I'm trying to write a function that gets a file using java.net.URL.
I'm able to download ASCII (.html) files with no problems so far, but when I try to download a .jpg the file gets truncated.
The actual file size is 52,177 bytes, but I only get 2,489 bytes.
Here's the function:
public void getFile(String pUrl)
{
    try
    {
        URL lUrl = new URL(pUrl);
        URLConnection lUrlConnection = lUrl.openConnection();
        InputStream lInputStream = lUrl.openStream();
        DataInputStream lDataInputStream = new DataInputStream(lInputStream);
        FileOutputStream lFileOutputStream = null;

        String lPrefix = "./dfiles/";
        String lLocalFile = null;

        lFileOutputStream = new FileOutputStream(lPrefix + "MyFile.jpg");
        for (;;)
        {
            byte lData = lDataInputStream.readByte();

            try
            {
                lFileOutputStream.write(lData);
            }
            catch (EOFException e)
            {
                break;
            }
        }
        lDataInputStream.close();
        lFileOutputStream.close();
    }
    catch (MalformedURLException e)
    {
        System.err.println(e.toString());
    }
    catch (IOException e)
    {
        System.err.println(e.toString());
    }
}
Any help or insights would be greatly appreciated.
sincerely,
me.
 
Larry LeFever
Greenhorn
Posts: 18
Why read one byte at a time? Better to read a buffer's worth at a time, and then write out exactly as many bytes as were read during each iteration, making sure you call "flush()" on the output stream as you go. Otherwise you might lose data; for example, you might be inadvertently overwriting the one byte you've just read from the URL. "flush()"-ing your out-buffer before re-filling it might be worth a try. That's how I'd do it.
Also, I forget whether or not reading from a URL-stream includes the HTTP headers in what you read. Of course, you don't want to store those in the binary file.
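A minimal sketch of that buffered-copy loop, kept generic over any pair of streams. The class and method names (StreamCopy, copyStream) and the 4096-byte buffer size are my own choices, not anything from the posts above; the key point is that read(byte[]) returns the count of bytes actually read (or -1 at end of stream), and you must write only that many bytes back out:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamCopy
{
    // Copy everything from in to out, one buffer at a time.
    // read(byte[]) returns the number of bytes actually read,
    // or -1 at end of stream -- no EOFException is involved.
    public static long copyStream(InputStream in, OutputStream out) throws IOException
    {
        byte[] buffer = new byte[4096];
        long total = 0;
        int count;
        while ((count = in.read(buffer)) != -1)
        {
            out.write(buffer, 0, count); // write only the bytes read, not the whole buffer
            total += count;
        }
        out.flush();
        return total;
    }

    public static void main(String[] args) throws IOException
    {
        // Binary data containing bytes 10 (LF), 12 (FF), and 0xFF --
        // all of which must pass through untouched for a .jpg to survive.
        byte[] data = new byte[] { 10, 12, (byte) 0xFF, 0, 65, 66 };
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        long copied = copyStream(new ByteArrayInputStream(data), out);
        System.out.println(copied);     // 6
        System.out.println(out.size()); // 6
    }
}
```

The same loop works unchanged whether `in` is a file, a socket, or a URL stream, which is why writing it against InputStream/OutputStream rather than concrete types is the usual idiom.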
 
yoram kanfi
Greenhorn
Posts: 3
Excuse me while I bumble around with this...
Is this what you mean:
try
{
    lFileOutputStream.write(lData);
    lFileOutputStream.flush();
}
catch (EOFException e)
{
    break;
}
...if so, then I still did not get the full file.
Thank you once again.
 
Carl Trusiak
Sheriff
Posts: 3341
I think you are really close! The problem is, you've chosen the wrong InputStream. DataInputStream is meant for reading back Java primitive types that were written by a DataOutputStream, not for copying raw bytes. Also look at where your EOFException can actually occur: readByte() is what throws it when the stream ends, but your try/catch wraps the write() call instead, so the exception escapes to your outer IOException handler and the transfer halts without the streams being closed. I think you may have better luck with a BufferedInputStream, whose read() method strictly reads in bytes as they are available on the InputStream and simply returns -1 at end of stream.

Let me know if this helps any!
Good Luck
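For what it's worth, here is one hedged sketch of the whole download rewritten along those lines, using BufferedInputStream and the read()-returns-minus-one convention instead of EOFException. The class name and the localPath parameter are placeholders of mine, not from the original post:

```java
import java.io.BufferedInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;

public class BinaryDownload
{
    public static void getFile(String pUrl, String localPath)
    {
        InputStream in = null;
        FileOutputStream out = null;
        try
        {
            URL url = new URL(pUrl);
            in = new BufferedInputStream(url.openStream());
            out = new FileOutputStream(localPath);
            int b;
            // read() returns an int in 0..255, or -1 at end of stream,
            // so every byte value (including 10 and 12) passes through intact.
            while ((b = in.read()) != -1)
            {
                out.write(b);
            }
        }
        catch (IOException e)
        {
            System.err.println(e.toString());
        }
        finally
        {
            // Close in finally so the streams are released even if the copy fails.
            try { if (in != null) in.close(); } catch (IOException ignored) {}
            try { if (out != null) out.close(); } catch (IOException ignored) {}
        }
    }
}
```

Note that MalformedURLException is a subclass of IOException, so one catch clause covers both; and because a "file:" URL works the same way as "http:", you can exercise this locally without a network connection.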
 
yoram kanfi
Greenhorn
Posts: 3
Well, the function works...
What I did notice, however, was that when the URL is in upper case the file gets truncated.
All I did was use .toLowerCase() and I got the full file.
Any ideas?
sincerely,
yoram.
 