
Stream tokenizer not functioning as expected

 
tony navaratnam
Greenhorn
Posts: 14
I am trying to save a text file to disk and then load and use its contents again.
The text file is created from the following code:


Each line takes the form of a letter used for identification followed by four ints:

"r " + a + " " + b + " " + c + " " + d + "\n"

or in the text file

o 123 456 687 565
p 123 354 575 444
...etc
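
The writing code itself isn't quoted above; a minimal sketch that would produce lines of that shape (file name and variable values are hypothetical) might look like this:

import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class WriteRecords {
    public static void main(String[] args) throws IOException {
        // Hypothetical values; the real program supplies its own letter and ints.
        int a = 123, b = 456, c = 687, d = 565;
        PrintWriter out = new PrintWriter(new FileWriter("records.txt"));
        // One record per line: identification letter followed by the four ints.
        out.print("r " + a + " " + b + " " + c + " " + d + "\n");
        out.close();
    }
}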

I am trying to read the file with a StreamTokenizer:



I don't intend this to be the final code; it is just to test the stream and the logic required further on to place and use the integers. All the research I have done leads me to believe that the code should return:
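
The test code isn't quoted above either; a minimal sketch of the kind of loop being described (hypothetical file name, default tokenizer settings) might be:

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.io.StreamTokenizer;

public class ReadRecords {
    public static void main(String[] args) throws IOException {
        BufferedReader in = new BufferedReader(new FileReader("records.txt"));
        StreamTokenizer st = new StreamTokenizer(in);
        // With the default syntax table, letters come back as TT_WORD tokens
        // and runs of digits come back as TT_NUMBER tokens.
        while (st.nextToken() != StreamTokenizer.TT_EOF) {
            if (st.ttype == StreamTokenizer.TT_WORD) {
                System.out.println("letter " + st.sval);
            } else if (st.ttype == StreamTokenizer.TT_NUMBER) {
                System.out.println("integer " + (int) st.nval);
            }
        }
        in.close();
    }
}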

letter r
integer X1Y1Z1
integer X2Y2Z2
integer X3Y3Z3
integer X4Y4Z4
letter r
.
.
.
EOF and termination

Each integer may be one, two, or three digits long, but never larger. When I run the program I get:

letter r
integer 0
integer X1
integer Y2
integer Z3
.
.
.
.
letter r
integer Z4 //the previous digit
integer X1
integer Y1
.
.
.
EOF and termination

I can't figure out why my program is acting this way.
Any ideas will be appreciated.
[ February 21, 2007: Message edited by: tony navaratnam ]
 
Stan James
(instanceof Sidekick)
Ranch Hand
Posts: 8791
I'm inclined to think StreamTokenizer is a poor fit for your problem. Scanner, new in Java 5, would be a nifty tool, though. If you're not up to Java 5, I'd read lines and parse each line with StringTokenizer. Take a look at those alternatives and see if they make sense. If not, we'll dig into how StreamTokenizer works ... and I think it's not what you think.
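
A rough sketch of the Scanner approach, assuming the line layout described above (the file name is hypothetical):

import java.io.File;
import java.io.FileNotFoundException;
import java.util.Scanner;

public class ScanRecords {
    public static void main(String[] args) throws FileNotFoundException {
        Scanner sc = new Scanner(new File("records.txt"));
        while (sc.hasNext()) {
            String letter = sc.next(); // identification letter
            int a = sc.nextInt();      // the four ints that follow it
            int b = sc.nextInt();
            int c = sc.nextInt();
            int d = sc.nextInt();
            System.out.println(letter + " " + a + " " + b + " " + c + " " + d);
        }
        sc.close();
    }
}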
 
tony navaratnam
Greenhorn
Posts: 14
It's all fine now. You were right about the StreamTokenizer; it is fine, and I was not writing the file properly. I was using writeChars, which leaves a gap between each letter. A quick change to writeBytes solved my problem.
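
For reference, DataOutputStream.writeChars writes each character as two bytes (high byte first), so a plain-text file ends up with a zero byte in front of every character, while writeBytes writes only the low-order byte of each character, giving ordinary one-byte text. A small sketch of the difference (file names are hypothetical):

import java.io.DataOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class WriteComparison {
    public static void main(String[] args) throws IOException {
        String line = "r 123 456 687 565\n";

        // writeChars: two bytes per character, so the file contains
        // a zero byte before every character and tokenizes badly.
        DataOutputStream chars = new DataOutputStream(new FileOutputStream("chars.txt"));
        chars.writeChars(line);
        chars.close();

        // writeBytes: one byte per character, plain ASCII text that
        // StreamTokenizer reads as intended.
        DataOutputStream bytes = new DataOutputStream(new FileOutputStream("bytes.txt"));
        bytes.writeBytes(line);
        bytes.close();
    }
}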

Thanks
 