I don't know of any special tools for this, but you could use the java.net.URL class to open each page of your website (via its openConnection method), download the content, and parse out all the URLs yourself.
That's probably not what you wanted to hear, but it is possible that way.
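A minimal sketch of that approach using only the JDK. The class name LinkExtractor is made up for illustration, and the regex is a crude approximation; a real crawler should use a proper HTML parser:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LinkExtractor {

    // Rough pattern for href="..." attributes; does not handle
    // single quotes, unquoted values, or malformed markup
    private static final Pattern HREF =
            Pattern.compile("href\\s*=\\s*\"([^\"]+)\"", Pattern.CASE_INSENSITIVE);

    // Download a page with java.net.URL.openConnection() and return its content
    public static String fetch(String pageUrl) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(pageUrl).openConnection();
        StringBuilder sb = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                sb.append(line).append('\n');
            }
        }
        return sb.toString();
    }

    // Pull every href value out of a chunk of HTML
    public static List<String> extractLinks(String html) {
        List<String> links = new ArrayList<>();
        Matcher m = HREF.matcher(html);
        while (m.find()) {
            links.add(m.group(1));
        }
        return links;
    }

    public static void main(String[] args) {
        // For a live site you would do: extractLinks(fetch("http://yoursite/page.html"))
        String sample = "<a href=\"index.html\">home</a> <a href=\"faq.html\">faq</a>";
        System.out.println(extractLinks(sample));
    }
}
```

From there, a simple crawler is a loop: fetch a page, extract its links, and queue any link on the same host that you haven't visited yet.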
Thanks for that. But I'd like to know whether there are any easy-to-use APIs for this. Also, I need to be able to collect the links before the site is hosted, and to keep track of new URLs that may be added later. Please respond as early as possible.
I like the Quiotix HTML Parser, which comes with a handy Visitor design that is easier to use than walking the DOM. I made a cross-referencer to show how all the pages in a site link to each other. Here's a fragment of the visitor ... really all you need to write.
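The Quiotix fragment isn't shown above, and that jar isn't part of the JDK, so here's a rough sketch of the same visitor/callback idea using the JDK's built-in HTMLEditorKit.ParserCallback instead. The class name LinkVisitor is my own invention; the callback fires once per start tag, so you never walk a DOM tree yourself:

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.swing.text.MutableAttributeSet;
import javax.swing.text.html.HTML;
import javax.swing.text.html.HTMLEditorKit;
import javax.swing.text.html.parser.ParserDelegator;

public class LinkVisitor extends HTMLEditorKit.ParserCallback {

    final List<String> links = new ArrayList<>();

    // Called once for every start tag; we only care about <a href="...">
    @Override
    public void handleStartTag(HTML.Tag tag, MutableAttributeSet attrs, int pos) {
        if (tag == HTML.Tag.A) {
            Object href = attrs.getAttribute(HTML.Attribute.HREF);
            if (href != null) {
                links.add(href.toString());
            }
        }
    }

    // Parse a chunk of HTML and collect all the anchor targets
    public static List<String> linksIn(String html) throws Exception {
        LinkVisitor visitor = new LinkVisitor();
        new ParserDelegator().parse(new StringReader(html), visitor, true);
        return visitor.links;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(linksIn("<a href=\"index.html\">home</a>"));
    }
}
```

A cross-referencer like the one described would record, for each page parsed, which pages its collected hrefs point at.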
A good question is never answered. It is not a bolt to be tightened into place but a seed to be planted and to bear more seed toward the hope of greening the landscape of the idea. -- John Ciardi