
OutOfMemory with File.listFiles()

 
Nitin Dubey
Ranch Hand
Posts: 126
Oracle
I have around 9 lakh (900,000) files in a folder that we need to process. When the count reaches around 9 lakh, the listFiles() method in the File class throws an OutOfMemoryError. Is there a better mechanism to handle this?

I can obviously apply a filter to reduce the list to some extent, but that condition also has a possibility of blowing up some day. Is there any class that iterates over a huge collection without risking an OutOfMemoryError?

Cheers.

- Nitin
 
William Brogden
Author and all-around good cowpoke
Rancher
Posts: 13078
6
If you have 900,000 files in a single folder, there is no way to avoid having a plain listFiles() create 900,000 objects.

Is there some regularity in file naming or date that you could use to work with smaller subsets?

For example, all files with names starting with "a", selected with a FilenameFilter.
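A minimal sketch of that idea, assuming a hypothetical folder path and the prefix "a" (substitute your own directory and naming rule):

```java
import java.io.File;
import java.io.FilenameFilter;

public class SubsetList {

    // List only the files whose names start with the given prefix,
    // so each listFiles() call builds a much smaller array than
    // a plain listFiles() over the whole folder.
    static File[] listSubset(File dir, final String prefix) {
        return dir.listFiles(new FilenameFilter() {
            public boolean accept(File d, String name) {
                return name.startsWith(prefix);
            }
        });
    }

    public static void main(String[] args) {
        // Hypothetical path -- replace with the real folder.
        File[] subset = listSubset(new File("/data/incoming"), "a");
        if (subset != null) {          // null if the path is not a directory
            for (File f : subset) {
                System.out.println(f.getName()); // process each file here
            }
        }
    }
}
```

You would run this once per prefix ("a", "b", ...), so no single call has to materialize all 900,000 File objects at once.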

Alternatively, depending on your operating system, run a system command that generates a text list of all the file names, then work through that list one file at a time.
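For instance, a sketch that reads such a listing line by line, so only one file name String is in memory at a time (the listing file name and the command that produced it are assumptions; e.g. "ls > filelist.txt" on Unix or "dir /b > filelist.txt" on Windows):

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;

public class ListingReader {

    // Walk a text listing one line (one file name) at a time,
    // instead of building a 900,000-element File array.
    static int processListing(File listing) throws IOException {
        int count = 0;
        BufferedReader in = new BufferedReader(new FileReader(listing));
        try {
            String name;
            while ((name = in.readLine()) != null) {
                // Handle one file here, e.g. new File(dir, name)
                count++;
            }
        } finally {
            in.close();
        }
        return count;
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical listing file -- replace with the real path.
        int n = processListing(new File("filelist.txt"));
        System.out.println(n + " files processed");
    }
}
```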

Bill
 