NIO: mapping big file

 
Ben Wood
Ranch Hand
Posts: 342
Hi,

I am currently using RandomAccessFile to read portions of a 35 GB data file. I'm wondering whether using NIO to map this large file (MappedByteBuffer?) would improve performance. However, I have been reading about problems with mapping files larger than 2 GB on a 32-bit architecture; does anyone have experience with this? I don't really want to write any paging logic, because I think it would be too involved for the task in hand.
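
For context, this is roughly how I'm reading portions at the moment (the file name, offset, and chunk size below are just placeholders, not my real values):

import java.io.IOException;
import java.io.RandomAccessFile;

public class ChunkReader {
    // Read one portion of a big file starting at 'offset' (names and values are placeholders)
    static byte[] readChunk(String path, long offset, int size) throws IOException {
        RandomAccessFile raf = new RandomAccessFile(path, "r");
        try {
            byte[] chunk = new byte[size];
            raf.seek(offset);       // offset is a long, so positions past 2 GB are fine here
            raf.readFully(chunk);   // read exactly 'size' bytes from that position
            return chunk;
        } finally {
            raf.close();
        }
    }
}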

any comments?

cheers,
Ben
 
Jim Yingst
Wanderer
Posts: 18671
When you use FileChannel's map() method, you get a MappedByteBuffer. The get() and put() methods of this class use an int to specify the position, which means it's not possible to access any position greater than Integer.MAX_VALUE. So you can never map a region larger than 2147483647 bytes (2 GB) in one buffer; you'd need to write some paging logic to use multiple buffers. That assumes your system could otherwise support mapping that much memory; I've never tried it.
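
For what it's worth, mapping one sub-2 GB window at a time would look something like this (the method name and parameters are just illustrative; I haven't tried it on a file that size):

import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class WindowMapper {
    // Map one window of a large file; a single MappedByteBuffer can't exceed Integer.MAX_VALUE bytes
    static MappedByteBuffer mapWindow(String path, long offset, long length) throws IOException {
        RandomAccessFile raf = new RandomAccessFile(path, "r");
        try {
            FileChannel channel = raf.getChannel();
            // map() takes a long position and size, but the size is still limited to 2 GB per buffer
            return channel.map(FileChannel.MapMode.READ_ONLY, offset, length);
        } finally {
            raf.close();   // the mapping itself stays valid after the file is closed
        }
    }
}

Your paging logic would then have to decide which window covers a given absolute offset and re-map as needed, which is exactly the bookkeeping you said you'd rather avoid.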
[ November 29, 2004: Message edited by: Jim Yingst ]
 
Ben Wood
Ranch Hand
Posts: 342
Thanks for the input, Jim, very useful. I'll stick with plain old RandomAccessFile for now, then. I don't fancy trying to implement the paging logic for the data, and splitting the file into several separate ones is just too messy when it comes to getting what I want back out again.
 