I am trying to load a file and put its data into an array so that I can use it more easily; that part isn't really the problem, as I can get that working just fine. What I'm encountering is that when I try to extract a specific set of characters from each element of that array using a StringTokenizer, I get some funky results.
This makes the array, but I can't use it as it is: I need to make it one element longer. Each time I do, though, I get a NullPointerException. (that's if I have ).
Here is where I'm using the StringTokenizer:
The last value of the array returns null (the default value), which isn't much of a surprise, since I'm making it one element longer than my libraryFile array.
If I change the length of my libraryFile array to the proper size, I get a NullPointerException from the StringTokenizer.
So my issue, basically, is that I can't get my original array sized properly. Does anybody know why, or have any tips?
From the API documentation of StringTokenizer:
StringTokenizer is a legacy class that is retained for compatibility reasons although its use is discouraged in new code. It is recommended that anyone seeking this functionality use the split method of String or the java.util.regex package instead.
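Following that advice, the same tokenizing can be done with String.split, which returns a correctly sized array in one call, so there is no need to count tokens or resize anything. A minimal sketch, assuming your records are comma-delimited (the sample line and the comma are assumptions; use whatever delimiter your file actually contains):

```java
public class SplitDemo {
    public static void main(String[] args) {
        // Hypothetical library record; substitute your real line format.
        String line = "Title,Author,1999";

        // split sizes the result array for you.
        String[] fields = line.split(",");

        System.out.println(fields.length); // number of fields found
        System.out.println(fields[0]);     // first field
    }
}
```

Because split hands back exactly as many elements as there are fields, the "one element too short/too long" problem disappears.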
Declare line inside the try block, and do your reading from the file with a while loop:
. . . while((line = fin.readLine()) != null) . . .
If you know how to use a List (an ArrayList is probably the best kind of List for this purpose), simply add each line to the List.
Then you can get them back as an array with the List.toArray method, or just keep using the List directly.
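Putting those pieces together, here is a minimal sketch of the whole approach; the file name "library.txt" is a placeholder for your actual file:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class LoadLibrary {
    public static void main(String[] args) {
        List<String> lines = new ArrayList<>();

        // "library.txt" is a placeholder; use your real file name.
        try (BufferedReader fin = new BufferedReader(new FileReader("library.txt"))) {
            String line; // declared inside the try block, as suggested
            while ((line = fin.readLine()) != null) {
                lines.add(line); // the List grows as needed -- no sizing problems
            }
        } catch (IOException e) {
            e.printStackTrace();
        }

        // Convert to an array only if you still need one:
        String[] libraryFile = lines.toArray(new String[0]);
        System.out.println(libraryFile.length + " lines loaded");
    }
}
```

Since the ArrayList resizes itself as lines are added, you never have to guess the file's length up front, and toArray produces an array of exactly the right size.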