What I want to do is build some kind of timer or flag so that if the connection fails, or the server simply doesn't respond, my program silently keeps crawling through the rest of the list of URLs.
What approach should I take here? I'd be grateful if you could point me in the right direction.
There's no way to know whether a URL is bad until you try to open it. The immediate problem is that when you hit a bad URL and catch the exception, you return a null InputStream; the readContent() method then tries to open a BufferedReader on that null stream, you get a NullPointerException, and the program aborts. If you catch an exception while opening a URL, don't continue trying to read it -- move on to the next one. Alternatively, readContent() could simply check for those nulls before reading.
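A minimal sketch of that pattern: catch IOException per URL, return null (or skip) on failure, and set connect/read timeouts so an unresponsive server fails fast instead of hanging the crawl. The helper name fetchOrNull and the timeout value are my own assumptions, not from the original code.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class Crawler {

    // Hypothetical helper: returns the page body, or null if the URL is
    // malformed, the connection fails, or the server doesn't respond in time.
    static String fetchOrNull(String urlString, int timeoutMillis) {
        try {
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(urlString).openConnection();
            conn.setConnectTimeout(timeoutMillis); // fail fast on dead hosts
            conn.setReadTimeout(timeoutMillis);    // fail fast on stalled reads

            StringBuilder body = new StringBuilder();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(),
                                          StandardCharsets.UTF_8))) {
                String line;
                while ((line = in.readLine()) != null) {
                    body.append(line).append('\n');
                }
            }
            return body.toString();
        } catch (IOException e) {
            // MalformedURLException, UnknownHostException, SocketTimeoutException,
            // etc. all land here: log it and let the caller move on.
            System.err.println("Skipping " + urlString + ": " + e);
            return null;
        }
    }

    public static void main(String[] args) {
        String[] urls = { "not a real url", "http://example.com/" };
        for (String url : urls) {
            String body = fetchOrNull(url, 5000);
            if (body == null) {
                continue; // bad URL or timeout: keep crawling the rest
            }
            // ... process body ...
        }
    }
}
```

The key point is that the reading code never sees a null stream: the failure is handled entirely inside the loop iteration that caused it.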