Thanks, and sorry it was a bit vague; my post was written on a phone while my brain was fried from the heat.
//embarrassed
My goal is to create a map program (hex style, for fantasy RPGs; incidentally, because I wasn't entirely happy with anything I could find) with ~1-mile hexes as its base, designed to easily handle maps at least 500 miles wide. Unlike a lot of the ones I've seen, however, I plan to store a lot of different information on each hex, most of it numeric in nature, with each 'layer' representing a specific type of information (elevation or rainfall, for example).
The amount of repetition will vary greatly by layer, and a large amount of the planned functionality will involve doing read-compute-write across several layers at the same time. Many of the planned algorithms, however, use small incremental changes to iteratively generate larger patterns, so most layers will be full of data that doesn't repeat very often, and I may have to use larger data types than planned to make some of them work as intended. On the upside, it does mean that the number of data entries can be constant hex-to-hex.
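Since every hex carries the same fixed number of entries, one option (just a sketch, and all the names here are made up) is to interleave the layers hex-by-hex in a single flat array, so reading every layer for one hex touches contiguous memory instead of one array per layer:

```java
// Sketch: hex-major ("interleaved") storage. Types and names are
// hypothetical; pick real element types per layer later.
public class HexGrid {
    private final int width, height, layerCount;
    private final short[] data; // one short per (hex, layer) entry

    public HexGrid(int width, int height, int layerCount) {
        this.width = width;
        this.height = height;
        this.layerCount = layerCount;
        this.data = new short[width * height * layerCount];
    }

    // Index of the first entry belonging to hex (col, row).
    private int base(int col, int row) {
        return (row * width + col) * layerCount;
    }

    public short get(int col, int row, int layer) {
        return data[base(col, row) + layer];
    }

    public void set(int col, int row, int layer, short value) {
        data[base(col, row) + layer] = value;
    }

    // All of one hex's layer values in a single contiguous read.
    public short[] getHex(int col, int row) {
        int b = base(col, row);
        return java.util.Arrays.copyOfRange(data, b, b + layerCount);
    }
}
```

For scale: a 500x500 grid with 16 short-valued layers is 500 * 500 * 16 * 2 bytes, about 8 MB, so a single fully-loaded map fits comfortably inside a 1 GB budget; it's multiple large maps or much bigger element types that would push you toward partial loading.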
I'm not trying to make it completely optimized from the start. However, since all functions will be built around this central data structure (or structures), I would like to get the kind(s) I should be using correct from the start to minimize rewriting.
I plan to save this data, along with other data, into a single file for ease of transport. In time, though, the data set may grow larger than available RAM (area grows geometrically with map width, plus possible future features and multiple data types), so I was trying to structure it from the start to be able to pull portions of the data up without loading the whole set. I had considered making a 2D array for each layer, but quickly calling up *all* the data on multiple hexes would then require pulling up, searching through, and closing every layer in turn, which I'm trying to avoid for latency reasons.
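If the on-disk records are fixed-size, you can seek straight to any hex and read just its entries, without loading the rest of the file. Here's a rough sketch (the header layout and byte sizes are assumptions, not a real format):

```java
import java.io.IOException;
import java.io.RandomAccessFile;

// Sketch: fixed-size records allow random access to any single hex.
// Header and record layout here are invented for illustration.
public class HexFile implements AutoCloseable {
    private static final int HEADER_BYTES = 12;   // width, height, layerCount as ints
    private static final int BYTES_PER_ENTRY = 2; // one short per layer value

    private final RandomAccessFile file;
    private final int width, layerCount;

    public HexFile(String path) throws IOException {
        file = new RandomAccessFile(path, "rw");
        file.seek(0);
        width = file.readInt();
        int height = file.readInt(); // part of the header; unused for offsets
        layerCount = file.readInt();
    }

    // Byte offset of the first entry belonging to hex (col, row).
    private long offset(int col, int row) {
        long hexIndex = (long) row * width + col;
        return HEADER_BYTES + hexIndex * layerCount * BYTES_PER_ENTRY;
    }

    // Read every layer value for one hex with a single seek.
    public short[] readHex(int col, int row) throws IOException {
        file.seek(offset(col, row));
        short[] values = new short[layerCount];
        for (int i = 0; i < layerCount; i++) values[i] = file.readShort();
        return values;
    }

    @Override public void close() throws IOException { file.close(); }
}
```

In practice you'd probably read a whole rectangular chunk of hexes at a time rather than one hex per seek, but the offset arithmetic is the same idea; `java.nio` memory-mapped files are another route if single-hex seeks ever become a bottleneck.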
To directly answer another question: I shouldn't need to add/remove items, but if I choose to at some point, it will be infrequent enough that simply creating a larger data set and moving the data over would be feasible.
In development terms, however, being able to easily add new layers of data for future functionality would be a big plus. Unlike a number of projects like this, AI/pathfinding concerns with regard to terrain will be very minimal even in the worst case, so structuring with that in mind is thankfully not a concern.

I may, however, end up having a number of data sets of this size, such as if I include even a reasonably abstracted economic component, which I'd like to do once the main part of the program is sufficiently fleshed out. So I'm trying to conserve resources, ideally keeping the program to 1 GB of RAM or less to increase the number of systems it can run on; I'd rather have parts of the program run a bit slow than make it unusable on older machines.

Since this is intended as a single-user, offline application with no personal data, data security thankfully isn't really a concern. (In fact, being able to manually pass data files between computers easily is intended.)
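On making new layers cheap to add: one common pattern (again just a sketch, with example layer names from the post) is to name the layers in a single enum and use its ordinal as the slot index into each hex's record, so adding a layer is a one-line change:

```java
// Sketch: all layers declared in one place; ordinal() is the slot index.
// Layer names are examples, not a final list.
public enum Layer {
    ELEVATION,
    RAINFALL,
    TEMPERATURE; // adding a new layer is just another line here

    public static int count() { return values().length; }
}
```

Call sites then read like `grid.get(col, row, Layer.RAINFALL.ordinal())` against whatever grid class you end up with. One caveat: adding a layer changes the record size, so write the layer count into your file header and check it on load, otherwise old save files will silently misalign.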
Hope that makes more sense and that I didn't miss any of your questions. Thank you for your help as I try not to code myself into a proverbial corner.
PS: I have some, albeit rusty, education in Java; however, this is my first time coding anything where I had to be concerned with memory management, so I might just be overly cautious, as it's my first time dealing with these kinds of challenges.