I'm writing something of an IDE for a little-known language, and I use a syntax tree to parse the code into logical representations: essentially breaking it down into a variables section, functions, statements, and finally the arguments to each statement. This approach works for most of the code files I've tested it against, but unfortunately some of my files are extremely large - as in, 100,000+ lines long. When it attempts to read these, it can run out of memory, especially because I made the tree immutable, so any code transformation needs to copy the tree (looking back on it, maybe a poor design decision).
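One thing worth noting: an immutable tree doesn't have to mean copying the whole tree on every transformation. With structural sharing, replacing one node only rebuilds the path from that node up to the root, and every untouched subtree is reused by reference. Below is a minimal sketch of the idea with a hypothetical `Node` class (not your actual tree types, just an illustration):

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical immutable tree node. "Modifying" a child produces a new
// node, but all untouched sibling subtrees are shared by reference,
// not copied - so a single edit costs O(depth), not O(tree size).
final class Node {
    final String label;
    final List<Node> children;

    Node(String label, List<Node> children) {
        this.label = label;
        this.children = List.copyOf(children);
    }

    // Returns a new node with the child at `index` replaced.
    // Siblings are carried over as-is (same object references).
    Node withChild(int index, Node replacement) {
        Node[] next = children.toArray(new Node[0]);
        next[index] = replacement;
        return new Node(label, Arrays.asList(next));
    }
}
```

For a 100,000-line file, a transformation that touches one statement then allocates only a handful of new nodes instead of duplicating the entire tree, which should take a lot of pressure off the heap.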
Has anyone run into Java memory issues like this? Are there best practices to increase efficiency or make more memory available?
When I run my jar, I add the following flag, since I read it allows more memory use: -Xmx1g
Yeah, that flag caps the JVM's heap at 1 gigabyte. But if you're using a 64-bit version of Java (not too hard to arrange if you aren't), you can use a much larger heap size. I just tried running a little test program with a 32-gigabyte heap, with no complaint from the JVM. Even larger heaps might be possible; I haven't tried.
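If you want to confirm what heap limit the JVM actually picked up from -Xmx (or from its defaults), `Runtime.maxMemory()` reports it. A small check program, purely illustrative:

```java
public class HeapCheck {
    public static void main(String[] args) {
        // maxMemory() returns the largest heap the JVM will attempt
        // to use, in bytes - this reflects the -Xmx setting, or the
        // platform default if no flag was given.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.printf("Max heap: %d MB%n", maxBytes / (1024 * 1024));
    }
}
```

Run it with, say, `java -Xmx4g HeapCheck` and the printed number should be close to 4096 MB (the JVM reserves a little for itself, so it may report slightly less).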