OutOfMemory with File.listFiles()

 
Ranch Hand
Posts: 126
I have around 9 lakh (900,000) files in a folder that we need to process. When the count reaches around 9 lakh, the listFiles() method in the File class throws an OutOfMemoryError. Is there a better mechanism to handle this?

I can obviously apply a filter to reduce the list to some extent, but that condition could also blow up some day. Is there any class that can iterate over such a huge collection of files without running into OutOfMemoryError?

Cheers.

- Nitin
 
Author and all-around good cowpoke
Posts: 13078
If you have 900,000 files in a single folder, there is no way to avoid having a plain listFiles() create 900,000 objects.

Is there some regularity in file naming or date that you could use to work with smaller subsets?

For example, all files with names starting with "a", as selected by a FilenameFilter; a sketch of that follows below.
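
A minimal sketch of that idea. The directory path and the prefix are placeholders, not anything from your setup:

import java.io.File;
import java.io.FilenameFilter;

public class PrefixListing {
    public static void main(String[] args) {
        File dir = new File("/data/incoming"); // placeholder path, substitute your folder
        final String prefix = "a";             // work through one prefix at a time

        // listFiles(FilenameFilter) still reads every name, but only builds
        // File objects for the matches, so each pass keeps a much smaller
        // array in memory than a plain listFiles().
        File[] subset = dir.listFiles(new FilenameFilter() {
            public boolean accept(File parent, String name) {
                return name.startsWith(prefix);
            }
        });

        if (subset != null) {                    // null means the directory could not be read
            for (File f : subset) {
                System.out.println(f.getName()); // process each file here
            }
        }
    }
}

You would then repeat the pass for each prefix ("b", "c", and so on) so that no single call has to materialize all 900,000 entries at once.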

Alternatively, depending on your operating system, you could run a system command that generates a text list of all the file names and work through that list one file at a time; a sketch of that approach follows below.
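
A rough sketch of that second approach, assuming a Unix-like system where ls is available (on Windows you would swap in something like cmd /c dir /b); the folder path is again just a placeholder:

import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;

public class StreamedListing {
    public static void main(String[] args) throws IOException, InterruptedException {
        // "ls -1" prints one file name per line; adjust the command for your OS.
        ProcessBuilder pb = new ProcessBuilder("ls", "-1");
        pb.directory(new File("/data/incoming")); // placeholder path
        pb.redirectErrorStream(true);
        Process proc = pb.start();

        BufferedReader reader = new BufferedReader(
                new InputStreamReader(proc.getInputStream()));
        String name;
        while ((name = reader.readLine()) != null) {
            // Only one file name is held in memory at a time; process it here.
            System.out.println(name);
        }
        reader.close();
        proc.waitFor();
    }
}

Because the names are consumed as a stream, memory use stays flat no matter how many files are in the folder.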

Bill