Hello Java friends,
I am struggling with an issue that I can't figure out how to handle. Basically, I am sending queries to Google, collecting the result links, and then opening a stream to each of those URLs.
Here is my UrlReader class:
The problem I am having is that some URLs don't respond, or return a 404, and when that happens my application stops running and quits with a message like this one:
There is a problem downloading from: http://www.washingtonpost.com/wp-dyn/content/article/2006/03/23/AR2006032300817.html
java.io.FileNotFoundException: http://www.washingtonpost.com/wp-dyn/content/article/2006/03/23/AR2006032300817.html
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:798)
What I want to do is build in some kind of timeout or flag, so that if the connection fails, or the server simply doesn't respond, my program silently keeps on crawling through the rest of the list of URLs.
What approach should I follow here? I'd be grateful if you could point me in the right direction.
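To make this more concrete, here is roughly the kind of thing I have in mind (just a sketch, not code from my project: the class name, method name, and timeout values below are made up, and I'm assuming the download goes through HttpURLConnection, as the stack trace suggests). The idea is to give each fetch its own try/catch plus connect and read timeouts, so a dead or 404 URL gets skipped instead of killing the whole run:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.List;

public class UrlCrawler {

    // Hypothetical timeouts; tune to taste.
    private static final int CONNECT_TIMEOUT_MS = 5000;
    private static final int READ_TIMEOUT_MS = 10000;

    public static void crawl(List<String> urls) {
        for (String spec : urls) {
            try {
                URL url = new URL(spec);
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                // Fail fast if the server never answers.
                conn.setConnectTimeout(CONNECT_TIMEOUT_MS);
                conn.setReadTimeout(READ_TIMEOUT_MS);

                int status = conn.getResponseCode();
                if (status != HttpURLConnection.HTTP_OK) {
                    // e.g. a 404: log it and move on to the next URL.
                    System.err.println("Skipping " + spec + " (HTTP " + status + ")");
                    continue;
                }

                BufferedReader in = new BufferedReader(
                        new InputStreamReader(conn.getInputStream()));
                try {
                    String line;
                    while ((line = in.readLine()) != null) {
                        // ... process the page content here ...
                    }
                } finally {
                    in.close();
                }
            } catch (IOException e) {
                // Catches FileNotFoundException, SocketTimeoutException, etc.,
                // so one bad URL doesn't stop the crawl.
                System.err.println("Skipping " + spec + ": " + e);
            }
        }
    }
}

Does that sound like a reasonable way to do it, or is there a more idiomatic approach?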
my best
ilteris