A memory leak would start like you describe, with more and more memory being allocated.
But that doesn't have to mean you have a leak. Strictly speaking, it's only a leak if the memory is never released and there is no longer any way to release it.
In C that would mean you allocate a block of memory to a pointer, then lose the pointer without first deallocating the memory.
For example (a minimal sketch of the pattern):
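```c
#include <stdlib.h>

void leak(void)
{
    char *p = malloc(10240);  /* allocate roughly 10KB */
    /* ... use the buffer ... */
    p = NULL;                 /* lose the only pointer without calling free(p):
                                 the block can now never be released */
}
```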
Now that roughly 10KB of memory is gone forever: the OS thinks the application is still using it, but the application no longer has any way to reach or free it.
Do that a million times and you've allocated almost 10GB of RAM that no one can use until you exit the application (if then).
In Java such situations are mostly impossible because of the garbage collector.
When a reference goes out of scope, any memory it references that isn't referenced from anywhere else becomes eligible for garbage collection, to be freed and returned to the operating system at some point.
There is no guarantee when that will happen, but it is guaranteed to happen before the JVM runs out of memory.
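For instance (a minimal sketch; the method and variable names are just illustrative):

```java
class ScopeDemo {
    static void process() {
        byte[] buffer = new byte[10_240]; // ~10KB, reachable only through 'buffer'
        // ... use buffer ...
    }   // 'buffer' goes out of scope here; the array is now unreachable
        // and eligible for garbage collection
}
```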
That's not to say you can't still run into memory problems, but those are caused by creating large numbers of objects and holding onto them unnecessarily (in other words, design flaws rather than programming errors).
Say you are trying to read a 1GB file.
If you read it all into RAM, store a reference to that data in a static variable, and then never use it again, you're now holding onto a GB of RAM you shouldn't.
Technically that's not a leak, because you could still release it, but it has pretty much the same effect (you're holding onto RAM you don't need).
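A minimal sketch of that pattern (the class and field names here are just illustrative):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class FileCache {
    // The static field keeps the data reachable for as long as the class is
    // loaded, so the garbage collector can never reclaim it.
    private static byte[] cachedData;

    public static void loadOnce(Path file) throws IOException {
        cachedData = Files.readAllBytes(file); // e.g. the whole 1GB file
        // ... use cachedData once here, then never touch it again ...
    }
}
```

The array stays reachable through the static field, so even though the program never needs it again, the garbage collector has no way of knowing that and can never free it.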