
Convert result set into input stream

 
rajesh babu Y
Ranch Hand
Posts: 32
Hi,

I have a result set with 93 columns and about 100,000 (1 lakh) rows, and I want to download it as an xls file.

Because the data set is large, I'm not able to hold it all in a Java object. So I'm trying to convert the result set to an input stream and feed that input stream to the servlet output stream, specifying the contentType as text/xls.

I'm not able to convert the result set into an input stream.

Could you suggest a solution here?

Thanks,
Rajesh
 
Tony Docherty
Bartender
Posts: 3209
Because the data set is large, I'm not able to hold it all in a Java object. So I'm trying to convert the result set to an input stream and feed that input stream to the servlet output stream, specifying the contentType as text/xls.

You can't just specify the content type and expect the data to be magically transformed into that format. If you want to output the data as an xls file you need to create an xls file and then send that. There may be a way to do this on the fly, but I've never done it, so I'm not sure whether it's possible or how to do it. You could look at the libraries available for creating xls files, such as POI, and see what options they offer.
 
rajesh babu Y
Tony,

We have a large data set (approx. 2 GB) with 93 columns. Using POI we're facing a big performance hit.

So I'm thinking of streaming the result all the way to the end user, because the application JVM won't have much heap space.

Could you tell me how I should convert the result set to a stream?

resultSet.getAsciiStream(column) is not working for me; the result set has 93 columns in it.

Thanks,
Rajesh
 
Tony Docherty
The getXXXStream methods are useful for retrieving large amounts of data from a single column of a single row, not for streaming data from multiple columns and rows.

First of all you need to decide in what format you want to send the data; for instance you could:
1. Read each column value for each row and write each value as text to an output stream.
2. Read each column value for each row and write each value as its data type to a DataOutputStream.
3. Read all the data for a row, package it up in an object and write that object to an output stream.
4. Read all the data for n rows, compress the data and write the compressed data to an output stream.

And no doubt there are many other ways of doing it. However, given the amount of data you need to transmit, and assuming the data will compress well, using some form of compression may well be the best approach.
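Option 2 above can be sketched with the standard DataOutputStream/DataInputStream pair. This is only an illustration: the three-column row layout (an int id, a UTF string, a double), the class name, and the in-memory row list are invented; a real version would pull the values from the ResultSet getters instead.

```java
import java.io.*;
import java.util.*;

public class TypedRowStreamDemo {

    /** Writes a row count header, then each column with its native data type (option 2). */
    static byte[] writeRows(List<Object[]> rows) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (DataOutputStream out = new DataOutputStream(buf)) {
            out.writeInt(rows.size());              // row count header
            for (Object[] row : rows) {
                out.writeInt((Integer) row[0]);     // id column
                out.writeUTF((String) row[1]);      // name column
                out.writeDouble((Double) row[2]);   // amount column
            }
        }
        return buf.toByteArray();
    }

    /** Reads the rows back in the same order, proving the stream round-trips. */
    static List<Object[]> readRows(byte[] data) throws IOException {
        List<Object[]> rows = new ArrayList<>();
        try (DataInputStream in = new DataInputStream(new ByteArrayInputStream(data))) {
            int n = in.readInt();
            for (int i = 0; i < n; i++) {
                rows.add(new Object[] { in.readInt(), in.readUTF(), in.readDouble() });
            }
        }
        return rows;
    }

    public static void main(String[] args) throws IOException {
        List<Object[]> rows = Arrays.asList(
            new Object[] { 1, "alpha", 9.5 },
            new Object[] { 2, "beta", -3.25 });
        List<Object[]> back = readRows(writeRows(rows));
        System.out.println(back.size() + " rows, first name=" + back.get(0)[1]);
    }
}
```

In a servlet you would hand the DataOutputStream the response's output stream directly instead of a ByteArrayOutputStream, so nothing larger than one row needs to sit in memory.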
 
rajesh babu Y
Tony,

I got you. But performance is a consideration here, so I can go with your 4th option.

Could you elaborate on how I can achieve that?
 
Tony Docherty
Some types of data compress better than others, so whether it is worthwhile depends on what sort of data it is. If it is mainly text and/or primitive data types it should compress well.
You still need to decide on the format you want to use before compression, because you will need to be able to access each data field on the receiving end after decompressing the data stream. I would initially look at using a CSV format (there are libraries available for reading and writing CSV files) and see if there is a library that lets you do this on the fly. If not, it may be a case of breaking the data into manageable chunks.
You will also need to look at the java.util.zip package for an appropriate compression/decompression stream.
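A minimal sketch of the CSV-plus-compression idea using only java.util.zip. The rows here are hard-coded stand-ins for the ResultSet data, and the class and method names are invented; in a servlet you would wrap response.getOutputStream() rather than a byte array.

```java
import java.io.*;
import java.nio.charset.StandardCharsets;
import java.util.*;
import java.util.zip.*;

public class GzipCsvDemo {

    /** Streams rows as CSV lines through a GZIPOutputStream. */
    static byte[] toGzippedCsv(List<String[]> rows) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (Writer w = new OutputStreamWriter(
                new GZIPOutputStream(buf), StandardCharsets.UTF_8)) {
            for (String[] row : rows) {
                w.write(String.join(",", row));   // naive CSV: no quoting/escaping
                w.write("\n");
            }
        } // closing the Writer finishes the gzip stream
        return buf.toByteArray();
    }

    /** Decompresses and splits the CSV back into lines, for the receiving end. */
    static List<String> fromGzippedCsv(byte[] gz) throws IOException {
        List<String> lines = new ArrayList<>();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(
                new GZIPInputStream(new ByteArrayInputStream(gz)), StandardCharsets.UTF_8))) {
            String line;
            while ((line = r.readLine()) != null) lines.add(line);
        }
        return lines;
    }

    public static void main(String[] args) throws IOException {
        List<String[]> rows = Arrays.asList(
            new String[] { "id", "name" },
            new String[] { "1", "alpha" },
            new String[] { "2", "beta" });
        byte[] gz = toGzippedCsv(rows);
        System.out.println(fromGzippedCsv(gz));
    }
}
```

A real CSV library would handle quoting and embedded commas; this sketch deliberately skips that.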
 
rajesh babu Y
Tony,

Right now I'm iterating over each row, column by column, and appending the row to a StringBuilder.

Once a chunk of 50 rows has been appended to the StringBuilder, I flush it to the output stream.

Here is my code snippet...:



Could you provide inputs on how to read n rows at a time from the result set?

I'm displaying the result data mapped to the column headers.
 
Tony Docherty
Please use code tags when posting code. I would normally add that one reason to do so is that they preserve the formatting of the code, but as you haven't used any indentation they wouldn't really have helped in this case. In future please indent your code in a consistent manner, as it makes the code much easier to read.

Could you provide inputs on how to read n rows at a time from the result set?

I don't know what you mean by this, as you are already reading 50 rows at a time and then writing the block of data out.

Some questions:
Why have a for-each loop over the headers and then use an index rather than the header name - why not just use a for loop from 0 to i < header.length?
Why are you writing the data as bytes?
Why are you declaring the variable screenPix and assigning an array to it which will then be discarded on the next line?
What happens to the last n rows of data if the result set isn't exactly divisible by 50?

You may need to do some performance testing with a compression output stream to find if there are any efficiency advantages with writing larger blocks of data, if not you could just write a line at a time.
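Since the original snippet was lost in posting, here is a hedged sketch of the chunked StringBuilder approach described above, including a final flush for the partial chunk the last question is about. The row source is simulated with an iterator, and names like writeChunked and CHUNK_ROWS are invented for illustration.

```java
import java.io.*;
import java.util.*;

public class ChunkedWriteDemo {
    static final int CHUNK_ROWS = 50;

    /** Buffers CHUNK_ROWS rows at a time in a StringBuilder, flushing each full chunk. */
    static void writeChunked(Iterator<String[]> rows, Writer out) throws IOException {
        StringBuilder chunk = new StringBuilder();
        int buffered = 0;
        while (rows.hasNext()) {
            chunk.append(String.join(",", rows.next())).append('\n');
            if (++buffered == CHUNK_ROWS) {
                out.write(chunk.toString());
                chunk.setLength(0);          // reuse the builder instead of reallocating
                buffered = 0;
            }
        }
        if (buffered > 0) out.write(chunk.toString()); // final partial chunk (rows % 50 != 0)
        out.flush();
    }

    public static void main(String[] args) throws IOException {
        // Simulate 103 rows so the last chunk is partial.
        List<String[]> rows = new ArrayList<>();
        for (int i = 1; i <= 103; i++) rows.add(new String[] { String.valueOf(i), "row" + i });
        StringWriter sw = new StringWriter();
        writeChunked(rows.iterator(), sw);
        System.out.println(sw.toString().split("\n").length + " lines written");
    }
}
```

With a real ResultSet, the iterator would be replaced by a while (rs.next()) loop, and the Writer would wrap the (possibly compressed) servlet output stream.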
 