
Reading in db file

 
James Clinton
Ranch Hand
Posts: 190
I'm currently playing around with the RandomAccessFile (RAF) class and API (which I had little experience with until now) and am getting some interesting *unexpected* results I could use some help with.

I've successfully managed to read in the header details of the file: the magic cookie [257], the length of each record [159], and the number of fields [7].
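(The original code block was lost; a minimal sketch of reading such a header, assuming a 4-byte cookie, 4-byte record length, and 2-byte field count — check the widths against your spec:)

```java
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;

public class HeaderDemo {
    // Field widths are an assumption (4-byte cookie, 4-byte record length,
    // 2-byte field count) -- check them against your spec.
    static int[] readHeader(RandomAccessFile raf) throws IOException {
        raf.seek(0);
        int cookie = raf.readInt();
        int recordLength = raf.readInt();
        int numFields = raf.readShort();
        return new int[] { cookie, recordLength, numFields };
    }

    public static void main(String[] args) throws IOException {
        // Build a tiny demo file so the sketch is self-contained.
        File f = File.createTempFile("db", ".db");
        f.deleteOnExit();
        try (RandomAccessFile raf = new RandomAccessFile(f, "rw")) {
            raf.writeInt(257);   // magic cookie
            raf.writeInt(159);   // length of each record
            raf.writeShort(7);   // number of fields
            int[] h = readHeader(raf);
            System.out.println("cookie=" + h[0] + " recLen=" + h[1]
                    + " numFields=" + h[2]); // prints "cookie=257 recLen=159 numFields=7"
        }
    }
}
```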



However, when I read the next 159 bytes to get the field names, I seem to get too much information back, as it spills over into the first record.



OUTPUT
------
♦name location @ ♦size ♦ smoking ☺ ♦rate ♦date
♣owner Palace smallville

Please help.
 
Anton Golovin
Ranch Hand
Posts: 527
Hi, James! You may wish to use the following code:
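(The original code block was lost; a sketch of the idea, assuming the bytes are US-ASCII as the spec would typically state:)

```java
import java.nio.charset.StandardCharsets;

public class CharsetDemo {
    // Name the charset explicitly instead of relying on the platform default.
    // (On pre-Java-7 JDKs, use new String(data, "US-ASCII") instead.)
    static String bytesToString(byte[] data) {
        return new String(data, StandardCharsets.US_ASCII);
    }

    public static void main(String[] args) {
        byte[] raw = { 'n', 'a', 'm', 'e' };
        System.out.println(bytesToString(raw)); // prints "name"
    }
}
```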



to remove those extra characters. The extra characters appear because, in some situations, the new String(byte[]) constructor fails to convert the bytes into the right characters. I suspect this is the cause of some automatic failures, because the wrong conversion seems to depend on each computer's language configuration: it works on one machine and not on another. On mine it does not work, and I think that is because the OS language (US-EN) and the preferred program language (RU) together make the constructor fail. Specifying the character set in the constructor removes this problem.
 
James Clinton
Ranch Hand
Posts: 190
Hi Anton,

Unfortunately this didn't make any difference; the output was exactly the same. Any more ideas?
 
Anton Golovin
Ranch Hand
Posts: 527


This is my method. If it does not work, another approach may be to convert each individual byte into the corresponding character and put the characters one by one into a StringBuffer... very cumbersome, though.
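(A rough sketch of that byte-by-byte fallback, purely illustrative:)

```java
public class ByteByByte {
    // The cumbersome fallback: append each byte as a character, masking to
    // avoid sign extension on bytes above 0x7F.
    static String convert(byte[] data) {
        StringBuffer sb = new StringBuffer(data.length);
        for (int i = 0; i < data.length; i++) {
            sb.append((char) (data[i] & 0xFF));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        byte[] raw = { 'r', 'a', 't', 'e' };
        System.out.println(convert(raw)); // prints "rate"
    }
}
```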
 
James Clinton
Ranch Hand
Posts: 190
Technically, there is nothing different between our code... so frustrating.

Regarding your final comment, are you suggesting that I read in each byte one by one and do some conversion until 159 bytes (i.e. a record) have been read? If so, I don't see how this would make a difference to the result.

thanks.
 
Anton Golovin
Ranch Hand
Posts: 527
Well, here's what it is:

Before reading a record, the file pointer must be positioned at its start. Before that can be done, though, one should read in the data file header - the field names, etc. After the header is read, you will know the offset of the first record, and you will be able to read it. Here's the code:
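(The original code block was lost; a sketch of the seek-then-read step it described. Both length constants are placeholders for illustration — take the real values from your spec's header and schema sections:)

```java
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;

public class RecordReader {
    // Both lengths are assumptions for illustration -- take the real values
    // from your spec's header and schema sections.
    static final int HEADER_LENGTH = 70;   // magic cookie + lengths + field schema
    static final int RECORD_LENGTH = 159;

    // Position the file pointer at the start of record recNo, then read
    // exactly one record.
    static String readRecord(RandomAccessFile raf, int recNo) throws IOException {
        raf.seek(HEADER_LENGTH + (long) recNo * RECORD_LENGTH);
        byte[] record = new byte[RECORD_LENGTH];
        raf.readFully(record);             // throws EOFException on a short read
        return new String(record, StandardCharsets.US_ASCII);
    }

    public static void main(String[] args) throws IOException {
        // Build a tiny demo file so the sketch is self-contained.
        File f = File.createTempFile("db", ".db");
        f.deleteOnExit();
        try (RandomAccessFile raf = new RandomAccessFile(f, "rw")) {
            raf.write(new byte[HEADER_LENGTH]);   // dummy header/schema section
            byte[] rec = new byte[RECORD_LENGTH];
            byte[] name = "Palace".getBytes(StandardCharsets.US_ASCII);
            System.arraycopy(name, 0, rec, 0, name.length);
            raf.write(rec);                       // one padded record
            System.out.println(readRecord(raf, 0).trim()); // prints "Palace"
        }
    }
}
```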



[ October 13, 2004: Message edited by: Anton Golovin ]
 
James Clinton
Ranch Hand
Posts: 190
Ah, I see my mistake - I was treating the field headers as if they were in the data section, but they are of course in the schema section (as per the spec, doh!).

Thanks for your help, and good luck with your result.

 
Anton Golovin
Ranch Hand
Posts: 527
Originally posted by James Clinton:
Ah, I see my mistake - I was treating the field headers as if they were in the data section, but they are of course in the schema section (as per the spec, doh!).

Thanks for your help, and good luck with your result.



You are most welcome, James. Let me likewise wish you success in your undertaking. May I ask you a question? Are you reading your records into a cache?
 
James Clinton
Ranch Hand
Posts: 190
Are you reading your records into a cache?


I haven't made a decision about caching yet, but I have considered it. My thinking was:

Build a cached model of the flat file and refer to it for all reads until a write is performed. Then I would either update the cache and the file at the same time to keep them in sync, OR flush the cache and rebuild it.

What do you think?
 
Nit Kad
Greenhorn
Posts: 3
I am just about to start my assignment, and I was wondering whether to cache the db file or not. There is a definite advantage if the data is small, but do we have to consider that it might grow beyond the point where caching it in an object is practical?
 
James Clinton
Ranch Hand
Posts: 190
I would think so, yes; otherwise it will go down as a limitation in your design.

How have people implemented this in the past - is there a pattern to follow, or recommendations and no-go areas we should be aware of?
 
Anton Golovin
Ranch Hand
Posts: 527
I cache, using an ArrayList to read in the records sequentially at startup; records are stored as String[] arrays, and deleted records are entered as nulls.

I write about caching in the choices.txt file. It makes searches very quick. The only requirement, in my view, that caching imposes is that the data file must be updated before the cache is.
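(A minimal sketch of the cache Anton describes — ArrayList of String[] records, nulls for deleted entries, file updated before cache. The persistence hooks are hypothetical placeholders:)

```java
import java.util.ArrayList;
import java.util.List;

public class RecordCache {
    // Deleted records stay in the list as nulls, so a record's index in the
    // cache keeps matching its position in the data file.
    private final List<String[]> cache = new ArrayList<String[]>();

    public synchronized void add(String[] record) {
        cache.add(record.clone());
    }

    public synchronized String[] read(int recNo) {
        String[] record = cache.get(recNo);
        if (record == null) {
            throw new IllegalStateException("record " + recNo + " is deleted");
        }
        return record.clone(); // defensive copy: callers can't mutate the cache
    }

    public synchronized void update(int recNo, String[] record) {
        // Write-through: the data file must be updated before the cache,
        // so the cache never reflects a write that never reached disk.
        // persistToFile(recNo, record);  // hypothetical persistence hook
        cache.set(recNo, record.clone());
    }

    public synchronized void delete(int recNo) {
        // persistDeleteToFile(recNo);    // hypothetical persistence hook
        cache.set(recNo, null);
    }
}
```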
 
Clivant Yeo
Ranch Hand
Posts: 124
I implemented my project using memory caching. I was about to finish when, to my dismay, my server program crashed at about 70,000 records after the second client had locked a record on the server (I tested it when I had nothing better to do). I am reimplementing my project in non-caching mode: the GUI and the network connections (sockets in my case) need memory more than my records do (they are stored in the database file, after all), and since the memory the network connections consume is quite arbitrary, it's better to keep as much for them as possible. Just my two cents.

Regards,
Clivant
 
James Clinton
Ranch Hand
Posts: 190
Thanks for that Clivant.

I think I'm going to do the same, as it probably won't earn any extra marks.
 