
URLyBird (1.3.2) Primary Key & DuplicateKeyException

 
vijay selvaganapathy
Greenhorn
Posts: 14
Hi all,
In my assignment I am caching the records by recording the record count (rec_count, which is actually recNo) together with its corresponding file offset:
[CODE]
public void cacheRecordOffset() throws IOException {
    long file_curr_position = getFilePointer();
    long file_size = file_access.length();
    file_access.seek(file_curr_position);
    while (file_curr_position < file_size) {
        // rec_count is the primary key, i.e. recNo
        map2.put(new Integer(rec_count++), new Long(file_curr_position));
        file_access.readShort();                      // skip the 2 byte flag
        Iterator iterator = map1.keySet().iterator(); // map1 is a LinkedHashMap
        String data_record[] = readRecord(iterator);  // reads all 7 fields
        for (int i = 0; i < data_record.length; i++)
            System.out.println(data_record[i]);
        setFilePointer(file_access.getFilePointer()); // store the file pointer in a class variable
        System.out.println("\n" + filePointer);
        file_curr_position = getFilePointer();
        file_access.seek(file_curr_position);
    }
    System.out.println("map2 = " + map2);
}
[/CODE]

My questions are:

1) Is this a proper way of "caching"? In my read method I just set the file pointer to the offset cached for the record and start reading (a rough sketch of what I mean follows the questions).

2) In my create method I insert the new record at the end of the file and return the new recNo (e.g. if there are 10 records, the method returns 11), since the String array passed as the parameter contains only the data fields (excluding the byte flag). Is this OK, given that create will then never throw a DuplicateKeyException? (Again, a sketch follows the questions.)

3) The instructions say:

// Deletes a record, making the record number and associated disk
// storage available for reuse.
public void delete(int recNo) throws RecordNotFoundException;

and:

// 2 byte flag. 00 implies valid record, 0x8000 implies deleted record

In the delete method, do we just have to change the byte flag from 00 to 0x8000, or do we have to delete the whole record?
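Here is roughly what I mean for question 1 (just a sketch; readFields() stands in for a helper that reads the 7 data fields, and map2 is the offset cache from above):

[CODE]
public String[] read(int recNo) throws RecordNotFoundException, IOException {
    Long offset = (Long) map2.get(new Integer(recNo));
    if (offset == null) {
        throw new RecordNotFoundException("no record with recNo " + recNo);
    }
    file_access.seek(offset.longValue());  // jump straight to the cached offset
    short flag = file_access.readShort();  // 2 byte flag, 00 = valid
    if (flag != 0) {                       // 0x8000 marks a deleted record
        throw new RecordNotFoundException("record " + recNo + " is deleted");
    }
    return readFields();                   // made-up helper: reads the 7 data fields
}
[/CODE]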
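And for question 2, the create I describe looks roughly like this (again only a sketch; writeFields() is a made-up helper that writes the 7 data fields):

[CODE]
public int create(String[] data) throws DuplicateKeyException, IOException {
    long end = file_access.length();
    file_access.seek(end);                        // always append at the end of the file
    file_access.writeShort(0);                    // 00 = valid record
    writeFields(data);                            // made-up helper: writes the 7 data fields
    int recNo = rec_count++;                      // e.g. returns 11 when 10 records exist
    map2.put(new Integer(recNo), new Long(end));  // cache the new record's offset
    return recNo;
}
[/CODE]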


All your valuable suggestions will be greatly appreciated.

Thanks in advance,

VJ
 
Darya Akbari
Ranch Hand
Posts: 1855
Hi Vijay,

Couldn't you write it a little bit nicer (always use the code tags)?

Look at the parts marked in bold; I don't know what they do. But one thing you should know: you only ever update the data file, you never physically remove a record from it. Update either the whole record or just the flag field.
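In code, delete then comes down to something like this (just a sketch, reusing your file_access and map2 names):

[CODE]
public void delete(int recNo) throws RecordNotFoundException {
    try {
        Long offset = (Long) map2.get(new Integer(recNo));
        if (offset == null) {
            throw new RecordNotFoundException("no record with recNo " + recNo);
        }
        file_access.seek(offset.longValue());
        file_access.writeShort(0x8000);  // flip the flag; the record's bytes stay on disk
    } catch (IOException e) {
        throw new RecordNotFoundException(e.getMessage());
    }
}
[/CODE]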

I can't tell whether your caching works as expected, but you can best answer that question yourself: produce some test output and check whether all the records from your data file really end up in your cache object, in your case the map.

Concerning the DuplicateKeyException, I threw it whenever create was asked to create a record that already exists in the data file.
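For example (a sketch; which fields form the logical key is your design decision, here I compare the first two):

[CODE]
private void checkDuplicate(String[] data) throws DuplicateKeyException, IOException {
    Iterator it = map2.keySet().iterator();
    while (it.hasNext()) {
        int recNo = ((Integer) it.next()).intValue();
        try {
            String[] existing = read(recNo);  // read() skips deleted records via the exception
            if (existing[0].equals(data[0]) && existing[1].equals(data[1])) {
                throw new DuplicateKeyException("record already exists as recNo " + recNo);
            }
        } catch (RecordNotFoundException deleted) {
            // a deleted record is not a duplicate
        }
    }
}
[/CODE]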

Regards,
Darya
 
vijay selvaganapathy
Greenhorn
Posts: 14
Hi Darya,
Thanks for the prompt reply. I forgot to close the code tag, that's why it became messy. The program works fine when I run it. I maintain the record number and offset in a LinkedHashMap as key-value pairs. The problem with posting here is that you cannot preview your post.

Regards,

VJ
 
Darya Akbari
Ranch Hand
Posts: 1855
Hi Vijay,

I also miss the preview feature. However, you can edit your post after you post it.

Good Luck,
Darya
 