
Extracting List of URL/URI

 
Ranch Hand
Posts: 40
Hello all,
I want to get a list of all the URLs/URIs that can be reached from my site.
I want both hyperlinks and form action targets.
How can I get these URLs?
Thanks in advance.
Prabhakar
 
Ranch Hand
Posts: 229
I don't know of any special tools for this, but you could use the java.net.URL class to open each page of your website (via its openConnection method), download the content, and parse out the URLs manually.

This is probably not what you wanted to hear, but it is possible that way.
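A minimal sketch of that manual approach: fetch a page over java.net.URL and scan the HTML with a regular expression for href and action attributes. The regex is a rough illustration, not a robust HTML parser, and the fetch host in the comment is a placeholder.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LinkExtractor {
    // Matches href="..." (hyperlinks) and action="..." (form targets).
    private static final Pattern LINK =
        Pattern.compile("(?:href|action)\\s*=\\s*\"([^\"]+)\"", Pattern.CASE_INSENSITIVE);

    static List<String> extractLinks(String html) {
        List<String> urls = new ArrayList<>();
        Matcher m = LINK.matcher(html);
        while (m.find()) {
            urls.add(m.group(1));
        }
        return urls;
    }

    // Download one page's content, as suggested above.
    static String fetch(String address) throws Exception {
        URLConnection conn = new URL(address).openConnection();
        StringBuilder sb = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                sb.append(line).append('\n');
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Offline demo; for a live page you would call
        // extractLinks(fetch("http://www.example.com/")) instead.
        String page = "<a href=\"faq.html\">FAQ</a> <form action=\"login.do\">";
        System.out.println(extractLinks(page)); // [faq.html, login.do]
    }
}
```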
 
Prabhakar Rao
Ranch Hand
Posts: 40
Thanks for that.
But I want to know whether there are any easy-to-use APIs for this.
Also, I should be able to get the links before the site is hosted.
I also want to keep track of new URLs that may be added later.
Please respond as early as possible.

Prabhakar
 
Author and all-around good cowpoke
Rancher
Posts: 13078
I suspect you could use the JTidy toolkit for this, since it creates a searchable DOM for each web page. It would also have the advantage of catching any XHTML format errors.
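Since JTidy hands back a standard org.w3c.dom.Document, link extraction is then plain DOM traversal. A sketch using the JDK's own DOM parser on well-formed XHTML; a Document produced by JTidy from messier real-world HTML would be searched the same way.

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class DomLinks {
    // Collect the href attribute of every <a> element in the DOM tree.
    static List<String> extractLinks(Document doc) {
        List<String> links = new ArrayList<>();
        NodeList anchors = doc.getElementsByTagName("a");
        for (int i = 0; i < anchors.getLength(); i++) {
            Element a = (Element) anchors.item(i);
            if (a.hasAttribute("href")) {
                links.add(a.getAttribute("href"));
            }
        }
        return links;
    }

    // Parse well-formed XHTML with the JDK parser; with JTidy you
    // would obtain the Document from its parser instead.
    static Document parse(String xhtml) throws Exception {
        DocumentBuilder builder =
            DocumentBuilderFactory.newInstance().newDocumentBuilder();
        return builder.parse(new InputSource(new StringReader(xhtml)));
    }

    public static void main(String[] args) throws Exception {
        String page = "<html><body><a href=\"index.html\">home</a>"
                    + "<a href=\"faq.html\">faq</a></body></html>";
        System.out.println(extractLinks(parse(page))); // [index.html, faq.html]
    }
}
```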

Bill
 
(instanceof Sidekick)
Ranch Hand
Posts: 8791
I like the Quiotix Parser, with a handy Visitor design that is easier to use than walking the DOM. I made a cross-referencer to show how all the pages in a site link to each other. Here's a fragment of the visitor ... really all you need to write.
 
It is sorta covered in the JavaRanch Style Guide.
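The original code fragment didn't survive, so here is a generic sketch of the Visitor idea it describes. The Node and Tag types below are hypothetical stand-ins, not the real Quiotix API: the parser walks its parse tree and calls the visitor once per tag, and the visitor only has to decide what to collect.

```java
import java.util.ArrayList;
import java.util.List;

public class LinkVisitor {
    // Hypothetical parse-tree node; a real parser would supply
    // its own node classes and drive the traversal itself.
    interface Node { void accept(LinkVisitor v); }

    static class Tag implements Node {
        final String name, urlAttribute;
        Tag(String name, String urlAttribute) {
            this.name = name;
            this.urlAttribute = urlAttribute;
        }
        public void accept(LinkVisitor v) { v.visitTag(this); }
    }

    private final List<String> links = new ArrayList<>();

    // Collect the URL from <a href> and <form action> tags;
    // everything else is ignored.
    void visitTag(Tag t) {
        if (t.name.equals("a") || t.name.equals("form")) {
            links.add(t.urlAttribute);
        }
    }

    List<String> links() { return links; }

    public static void main(String[] args) {
        LinkVisitor v = new LinkVisitor();
        Node[] tree = { new Tag("a", "index.html"), new Tag("form", "login.do") };
        for (Node n : tree) {
            n.accept(v);
        }
        System.out.println(v.links()); // [index.html, login.do]
    }
}
```

The appeal over walking the DOM by hand is that the traversal logic lives in the parser; the cross-referencer only supplies the one visit method.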