inputStream openStream

 
ilteris kaplan
Ranch Hand
Posts: 38
Hello Java friends,

I am struggling with an issue that I can't figure out how to handle. Basically I am sending queries to Google, getting back links, and then opening a stream to each of these URLs.

Here is my UrlReader class:
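The relevant parts are openStream() and readContent(); stripped down, they look roughly like this (a simplified sketch rather than the exact code):

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.InputStreamReader;
    import java.net.URL;

    public class UrlReader {

        // Open a stream to the given URL; on any failure, print a message and return null
        public InputStream openStream(String address) {
            try {
                URL url = new URL(address);
                return url.openStream();
            } catch (IOException e) {
                System.out.println("There is a problem downloading from: " + address + e);
                return null;
            }
        }

        // Read the whole page from the stream -- note there is no null check here yet
        public String readContent(InputStream in) throws IOException {
            BufferedReader reader = new BufferedReader(new InputStreamReader(in));
            StringBuffer content = new StringBuffer();
            String line;
            while ((line = reader.readLine()) != null) {
                content.append(line).append('\n');
            }
            reader.close();
            return content.toString();
        }
    }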




The problem I am having is that some URLs don't respond, or simply return a 404, and when that happens my application stops running and quits with a message like this one, for example:


There is a problem downloading from: http://www.washingtonpost.com/wp-dyn/content/article/2006/03/23/AR2006032300817.htmljava.io.FileNotFoundException: http://www.washingtonpost.com/wp-dyn/content/article/2006/03/23/AR2006032300817.html
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:798)



What I want to do is build some kind of timer or flag so that if the connection fails, or the server simply doesn't respond, my program silently keeps on crawling through the rest of the list of URLs.

What approach should I take here? I'd be grateful if you guys could point me in the right direction.


my best
ilteris
 
ilteris kaplan
Ranch Hand
Posts: 38
Is this a bad question to ask in this section?

thanks
 
author and iconoclast
Posts: 24207
There's no way to know if a URL is bad until you try to open it. The immediate problem here is that if you find a bad URL and catch an exception, you return a null InputStream; then the "readContent" method tries to open a BufferedReader on that null stream, you get a NullPointerException, and the program aborts. If you catch an exception while opening a URL, don't continue trying to read it -- move on to the next one. The readContent() method could simply check for those nulls.
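For example, a crawl loop along these lines would just log a failure and move on (names here are illustrative); the connect/read timeouts, available on URLConnection since Java 5, also cover the case where a server never responds at all:

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.InputStreamReader;
    import java.net.URL;
    import java.net.URLConnection;
    import java.util.List;

    public class Crawler {

        // Open a stream, or return null if the URL is bad or the server won't answer
        private InputStream openStream(String address) {
            try {
                URLConnection connection = new URL(address).openConnection();
                connection.setConnectTimeout(5000);   // give up if we can't connect in 5 s
                connection.setReadTimeout(10000);     // give up if the server stops sending data
                return connection.getInputStream();   // throws FileNotFoundException on a 404
            } catch (IOException e) {
                System.out.println("Problem downloading from " + address + ": " + e);
                return null;
            }
        }

        public void crawl(List<String> addresses) {
            for (String address : addresses) {
                InputStream in = openStream(address);
                if (in == null) {
                    continue;                         // bad URL -- skip it and keep crawling
                }
                try {
                    BufferedReader reader = new BufferedReader(new InputStreamReader(in));
                    String line;
                    while ((line = reader.readLine()) != null) {
                        // process the page content here
                    }
                    reader.close();
                } catch (IOException e) {
                    System.out.println("Problem reading from " + address + ": " + e);
                }
            }
        }
    }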
 
ilteris kaplan
Ranch Hand
Posts: 38
Thanks for the heads up! Adding an if-clause did the trick.
 