need of help or projectmate or guide

 
Greenhorn
Posts: 23
Dear friends,
I am an M.Tech student doing the project described below. If anybody is interested in actively participating in the project, please mail me at dineshjayaraj@yahoo.com. Also, if anybody has done a project related to the ones I have mentioned below, please send me the code, which would help me to progress.
Thanks in advance,
dinesh jayaraj
1. To validate links (URLs) in HTML files or SQL tables to verify the currency of the links (Link-validator)
2. To identify new files in specified FTP sites and download these (ftp-download)
3. To monitor a set of specified web sites (URLs) for verifying content change and download the page, if there is a change in the content (site-monitor)
3. Salient features of the project:
Broad functional requirements for the three tools are given below. The user specifies I/O requirements via the GUI.
3.1 Link Validator:
• Input: i) an ASCII or HTML file (on the local hard disc or network) containing URLs, ii) a set of URLs typed into a text box on the GUI, iii) URLs in a specified field and table in a SQL table on the Windows NT server. The URLs in these files have to be extracted by the tool.
• Output: Validation report files, created in the directory from which the tool is executed. The content of the diagnostic reports is to be worked out. Some statistical output may also be required. Output may be displayed on screen or written to a file, at the user's discretion.
• Validation criteria: Can be based on HTTP errors returned by web sites, time-outs, DNS errors, etc. Details to be worked out.
• Should work with proxy servers.
• The user should be able to stop the tool at any time and restart it later. When started, the tool should remember the previous state and check with the user whether it should continue validation from that point or start from the beginning.
• Should support scheduling.
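Since this is a Java forum, here is a minimal sketch of the link-validation core, assuming plain java.net with no GUI, SQL input, or scheduling. The class and method names (LinkValidator, extractUrls, checkUrl) are my own, not from the spec, and a real tool would also need to parse href attributes out of HTML rather than relying on a simple regex:

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LinkValidator {

    // Very simple URL extractor; a real tool would also parse
    // <a href="..."> attributes in HTML input files.
    private static final Pattern URL_PATTERN =
            Pattern.compile("https?://[^\\s\"'<>]+");

    public static List<String> extractUrls(String text) {
        List<String> urls = new ArrayList<>();
        Matcher m = URL_PATTERN.matcher(text);
        while (m.find()) {
            urls.add(m.group());
        }
        return urls;
    }

    // Validates one URL with an HTTP HEAD request; returns the HTTP
    // status code, or -1 on timeout / DNS failure / refused connection.
    // Proxy support comes for free via the standard http.proxyHost /
    // http.proxyPort system properties.
    public static int checkUrl(String url) {
        try {
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(url).openConnection();
            conn.setRequestMethod("HEAD");
            conn.setConnectTimeout(5000);
            conn.setReadTimeout(5000);
            return conn.getResponseCode();
        } catch (IOException e) {
            return -1; // timeout, DNS error, connection refused, ...
        }
    }

    public static void main(String[] args) {
        String sample = "see http://example.com and https://coderanch.com/forums";
        for (String url : extractUrls(sample)) {
            System.out.println(url + " -> " + checkUrl(url));
        }
    }
}
```

The stop/restart requirement could be layered on top by writing the index of the last checked URL to a small state file after each check.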
3.2 FTP-Download:
• Input: A file containing a set of FTP/HTTP sites (URLs) from which the files are to be downloaded. The file may be on the local hard disc or on the LAN.
• Each FTP site may have an associated user id and password to be used by the tool to gain authorized access.
• The tool should remember which files were already downloaded and identify new files that need to be downloaded.
• Output: New files, if any, downloaded from each FTP site and stored in appropriate directories on the local hard disc.
• Should work with proxy servers.
• The user should be able to stop the tool at any time and restart it later. When started, the tool should remember the previous state and check with the user whether it should continue downloading from that point or start from the beginning.
• Should support scheduling.
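The "remember what was already downloaded" requirement is mostly bookkeeping, which can be sketched independently of the actual transfer code. The sketch below (class and method names are my own) keeps one history file per FTP site; for the transfer itself, the standard java.net.URL class understands ftp:// URLs, or a library such as Apache Commons Net could be used:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Tracks which remote files were already fetched, so that only new
// files are downloaded on the next run. One history file per site.
public class DownloadHistory {

    private final Path historyFile;
    private final Set<String> known = new HashSet<>();

    public DownloadHistory(Path historyFile) throws IOException {
        this.historyFile = historyFile;
        if (Files.exists(historyFile)) {
            known.addAll(Files.readAllLines(historyFile));
        }
    }

    // Given a fresh directory listing from the server, return only
    // the files we have not seen before.
    public List<String> newFiles(List<String> remoteListing) {
        List<String> fresh = new ArrayList<>();
        for (String name : remoteListing) {
            if (!known.contains(name)) {
                fresh.add(name);
            }
        }
        return fresh;
    }

    // Call after a successful download so the file is skipped next time.
    // Appending immediately (rather than on exit) means an interrupted
    // run still remembers its completed downloads, which covers the
    // stop/restart requirement.
    public void markDownloaded(String name) throws IOException {
        if (known.add(name)) {
            Files.write(historyFile,
                    (name + System.lineSeparator()).getBytes(),
                    StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        }
    }
}
```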
3.3 Site Monitor:
• Input: An ASCII or HTML file containing a list of URLs that need to be monitored. The file may be on the local hard disc or on the LAN. The tool also accepts an option specifying whether the pages associated with the URLs should only be checked for content change, or also downloaded when the content has changed.
• Output: An indication of whether the content has changed (and the date of change), and the downloaded page stored in an appropriate directory, if the download option was selected.
• The tool needs to keep track of when a URL was last visited and the status of that page (size, date, number of lines, etc., and perhaps also a copy of the page), and use this information for comparison with the information gathered in the current visit.
• Should work with proxy servers.
• The user should be able to stop the tool at any time and restart it later. When started, the tool should remember the previous state and check with the user whether it should continue downloading from that point or start from the beginning.
• Should support scheduling.
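For the change-detection part, one common approach (my suggestion, not something the spec mandates) is to store a digest of each page from the previous visit and compare it with a digest of the current fetch; a digest is far cheaper to store than a full copy, although keeping the copy as well, as the spec suggests, would let the tool show what changed. A minimal sketch, with hypothetical class and method names:

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Detects content change by comparing a digest of the current page
// with the digest recorded on the previous visit.
public class PageFingerprint {

    // Hex-encoded MD5 digest of the given content.
    public static String digest(byte[] content) {
        try {
            MessageDigest md = MessageDigest.getInstance("MD5");
            StringBuilder sb = new StringBuilder();
            for (byte b : md.digest(content)) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException("MD5 not available", e);
        }
    }

    // Fetches the page and returns its digest; works through proxies
    // configured via the http.proxyHost / http.proxyPort properties.
    public static String fetchDigest(String url) throws IOException {
        try (InputStream in = new URL(url).openStream()) {
            return digest(in.readAllBytes());
        }
    }

    // True if the page differs from the previously recorded digest
    // (previous == null means the URL was never visited before).
    public static boolean hasChanged(String previous, String current) {
        return previous == null || !previous.equals(current);
    }
}
```

The per-URL digests, last-visit dates, and page sizes could be persisted in a simple properties file or a small table, which also gives the tool the state it needs to resume after a stop/restart.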

 