
inputStream openStream

 
ilteris kaplan
Ranch Hand
Posts: 38
Hello Java friends,

I am struggling with an issue that I can't figure out how to handle. Basically, I am sending queries to Google, collecting the result links, and then opening a stream to each of those URLs.

Here is my UrlReader class:




The problem I am having is that some URLs don't respond, or return a 404, and when that happens my application stops running and quits with a message like, for example:


There is a problem downloading from: http://www.washingtonpost.com/wp-dyn/content/article/2006/03/23/AR2006032300817.htmljava.io.FileNotFoundException: http://www.washingtonpost.com/wp-dyn/content/article/2006/03/23/AR2006032300817.html
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:798)



What I want to do is set up some kind of timer or flag so that if the connection fails, or the server simply doesn't respond, my program silently keeps on crawling through the rest of the list of URLs.

What approach should I take here? I'd be grateful if you could point me in the right direction.
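The "timer" idea above doesn't need a separate thread: `URLConnection` has built-in connect and read timeouts (`setConnectTimeout`/`setReadTimeout`, available since Java 5). The original `UrlReader` code isn't shown in the thread, so the helper below is a hypothetical sketch, not the poster's actual class:

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;

public class UrlFetcher {
    /**
     * Hypothetical helper (not the original UrlReader code): opens a
     * stream with explicit timeouts so an unresponsive server cannot
     * stall the crawl, and returns null instead of throwing so the
     * caller can silently move on to the next URL.
     */
    public static InputStream openQuietly(String address, int timeoutMillis) {
        try {
            URLConnection conn = new URL(address).openConnection();
            conn.setConnectTimeout(timeoutMillis); // fail fast if the server never answers
            conn.setReadTimeout(timeoutMillis);    // fail fast if a read stalls mid-download
            return conn.getInputStream();          // throws FileNotFoundException on a 404
        } catch (IOException e) {                  // covers 404s, timeouts, DNS failures, bad URLs
            System.err.println("Skipping " + address + ": " + e.getMessage());
            return null;
        }
    }
}
```

The crawl loop can then simply skip any URL for which this returns null.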


my best
ilteris
 
ilteris kaplan
Ranch Hand
Posts: 38
Is this a bad question to ask in this section?

thanks
 
author and iconoclast
Posts: 24203
There's no way to know if a URL is bad until you try to open it. The immediate problem here is that if you find a bad URL and catch an exception, you return a null InputStream; then the "readContent" method tries to open a BufferedReader on that null stream, you get a NullPointerException, and the program aborts. If you catch an exception while opening a URL, don't continue trying to read it -- move on to the next one. The readContent() method could simply check for those nulls.
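Since the thread doesn't include the original `readContent` method, here is a minimal sketch of the null check described above (the method name is the poster's; the body is assumed): if the stream is null, the URL failed to open, so return early and let the crawl loop continue with the next URL instead of hitting a NullPointerException.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class SafeReader {
    /**
     * Reads a stream to a String, guarding against a null stream so
     * that one bad URL cannot abort the whole crawl.
     */
    public static String readContent(InputStream in) {
        if (in == null) {
            return null; // opening the URL failed: skip this document
        }
        StringBuilder content = new StringBuilder();
        try (BufferedReader reader =
                new BufferedReader(new InputStreamReader(in))) {
            String line;
            while ((line = reader.readLine()) != null) {
                content.append(line).append('\n');
            }
        } catch (IOException e) {
            return null; // the download broke midway: skip it as well
        }
        return content.toString();
    }
}
```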
 
ilteris kaplan
Ranch Hand
Posts: 38
Thanks for the heads up! Adding an if clause did the trick.
 