I need to review a raft of URLs to see whether they are still in use and, if so, what more I can find out about each site.
The tool will take URLs and break down the critical elements of what it finds.
Single URLs or a batch (CSV) can be provided to the tool, which will trawl the internet and return the following details:
1. Registration details
a. Registrant name & company
b. Admin contact
c. Registered on
d. Date of expiry
e. Registered through
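The registration details in section 1 would normally come from WHOIS. As a rough sketch (assuming a raw WHOIS response has already been fetched, e.g. via a port-43 query or a library such as python-whois), the fields above could be pulled out like this. Note that field labels vary by registry and registrar, so the mapping below is illustrative, not exhaustive:

```python
# Field labels differ between registries; these are common ones
# (an assumption -- a production tool would need per-registry handling).
WHOIS_FIELDS = {
    "Registrant Name": "registrant_name",
    "Registrant Organization": "registrant_company",
    "Admin Email": "admin_contact",
    "Creation Date": "registered_on",
    "Registry Expiry Date": "expiry_date",
    "Registrar": "registered_through",
}

def parse_whois(raw: str) -> dict:
    """Extract the section-1 registration fields from a raw WHOIS response."""
    result = {}
    for line in raw.splitlines():
        if ":" not in line:
            continue
        key, _, value = line.partition(":")
        key, value = key.strip(), value.strip()
        if key in WHOIS_FIELDS and value:
            result[WHOIS_FIELDS[key]] = value
    return result
```

One caveat worth flagging up front: many registries now redact registrant names behind privacy services, so these fields will often come back as "REDACTED" or similar.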
2. Host server
a. Hosting provider
b. Any other domains on the same IP
3. Site content
a. Live content (or error status or 301 redirect returned)
b. Sitemap? (provide link)
c. Last update date?
d. Copyright date?
e. Author tag?
f. Number of pages (spider?) - nice to have!
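The live-content and sitemap checks in section 3 need HTTP requests (plus a look at /robots.txt or /sitemap.xml), but the on-page signals — the author tag and a copyright date — can be parsed from fetched HTML with the standard library alone. A minimal sketch, assuming the HTML has already been downloaded:

```python
import re
from html.parser import HTMLParser

class ContentProbe(HTMLParser):
    """Scan a fetched page for two of the section-3 signals:
    an author meta tag and a copyright year in the visible text."""

    def __init__(self):
        super().__init__()
        self.author = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "author":
                self.author = a.get("content")

    def handle_data(self, data):
        self._text.append(data)

    def copyright_year(self):
        # Look for "(c) 2021" / "© 2021" style notices in the page text.
        m = re.search(r"(?:©|\(c\))\s*(\d{4})", " ".join(self._text), re.I)
        return m.group(1) if m else None
```

The "last update date" is less reliable: the Last-Modified HTTP header, where present, is the usual source, but many servers send the current time for dynamically generated pages.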
4. Technology ([url removed, login to view])
a. Content Management System
b. Analytics packages
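CMS and analytics detection is usually signature-based: known markers (meta generator tags, tracker script URLs) are matched against the page source. A toy sketch — the two signatures per category below are illustrative assumptions, not a complete ruleset; dedicated fingerprinting services maintain thousands of signatures:

```python
import re

# Illustrative signatures only (my assumption), not a complete ruleset.
CMS_SIGNATURES = {
    "WordPress": re.compile(r'meta name="generator" content="WordPress', re.I),
    "Drupal": re.compile(r'meta name="generator" content="Drupal', re.I),
}
ANALYTICS_SIGNATURES = {
    "Google Analytics": re.compile(
        r"googletagmanager\.com/gtag|google-analytics\.com", re.I),
    "Matomo": re.compile(r"matomo\.js", re.I),
}

def fingerprint(html: str) -> dict:
    """Return the CMS and analytics packages whose signatures match the page."""
    return {
        "cms": [n for n, p in CMS_SIGNATURES.items() if p.search(html)],
        "analytics": [n for n, p in ANALYTICS_SIGNATURES.items() if p.search(html)],
    }
```

Bear in mind many sites strip the generator tag deliberately, so absence of a match is not proof that no CMS is in use.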
The tool will also flag whether the IP is the same with or without the 'www' subdomain and return the results accordingly.
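The www/non-www flag can be a straight DNS comparison. A sketch with an injectable resolver, so the check can be exercised without network access (the IPs in the usage example are documentation addresses, not real lookups):

```python
import socket

def www_ip_check(domain: str, resolve=socket.gethostbyname) -> dict:
    """Resolve the bare domain and its www subdomain and flag whether
    they point at the same IP. The resolver is a parameter so the
    function can be tested with a stub instead of live DNS."""
    def safe(host):
        try:
            return resolve(host)
        except OSError:
            return None  # host does not resolve

    bare = safe(domain)
    www = safe("www." + domain)
    return {"bare_ip": bare, "www_ip": www,
            "same_ip": bare is not None and bare == www}
```

For example, stubbing the resolver with `{"example.com": "203.0.113.7", "www.example.com": "203.0.113.7"}.__getitem__` yields `same_ip: True`. One refinement to consider: sites behind CDNs often return different IPs per query, so comparing CNAME targets may be more robust than comparing raw A records.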
The results should be returned as a CSV file for examination in Excel. The application can be web-based or desktop-based, but should ideally run without installing anything on the client machine.
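The CSV export itself is straightforward with Python's csv module. The column names below are my assumption of how the fields above might be flattened — one row per URL, one column per extracted field:

```python
import csv
import io

# Column set mirrors the numbered requirements above; the exact names
# and ordering are my assumption, to be agreed with the client.
FIELDNAMES = ["url", "registrant_name", "expiry_date", "hosting_provider",
              "status", "cms", "analytics", "www_same_ip"]

def to_csv(rows: list[dict]) -> str:
    """Serialise per-URL result dicts into CSV text ready for Excel."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDNAMES, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(rows)  # missing fields are written as empty cells
    return buf.getvalue()
```

Using DictWriter means each check can contribute whatever fields it found, and absent values simply come out as blank cells in Excel.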
Happy to refine these requirements based on comments and feedback about what is possible. Please, no stock responses saying the project can be completed to my satisfaction; I'd like to know what can and can't be done.