Author: Vordrak, Posted: Fri May 14, 2010 12:47 pm Post subject: Website Analysis Software Suggestions ---- Hi,
I am looking for software to crawl and analyse a website and build as complete a map of it as possible. I would use this for basic testing of my own websites and for research.
Basically, I would like a program where you can put in a domain name and it will determine what pages and applications are hosted on that domain; in other words, it would 'scan' for web pages.
Ideally it would reveal sub-folders that are not linked to from the index page. For example, if http://mysite.com had a subfolder at http://mysite.com/forum containing a phpBB forum, I would like to know about it even if that part of the site was not linked to from the index page at http://mysite.com.
Does that make sense? In general, I am looking for a lawful research tool that can be pointed at a domain and scan all the public web content at that address, even content that is not obvious. It would be a pure web scanner limited to port 80, though ideally it would support proxies.
Suggestions?
Author: operat0r2, Posted: Fri May 14, 2010 3:05 pm Post subject: ---- wget -r http://google.com
For more advanced sites (Java, JavaScript, etc.): HTTrack
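What wget -r and HTTrack do can be sketched in a few lines: fetch a page, collect its links, and follow the ones that stay on the same host. This is only an illustrative sketch (the page limit, timeout, and error handling are placeholder choices), not a replacement for either tool:

```python
# Minimal sketch of a recursive same-host crawler, i.e. the core of
# what `wget -r` does. Placeholder limits: 50 pages, 10 s timeout.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkParser(HTMLParser):
    """Collects the href targets of all <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(base_url, html):
    """Return absolute URLs for every link found in the HTML."""
    parser = LinkParser()
    parser.feed(html)
    return [urljoin(base_url, link) for link in parser.links]


def crawl(start_url, max_pages=50):
    """Breadth-first crawl restricted to the starting host;
    returns the list of URLs reached, i.e. a basic site map."""
    host = urlparse(start_url).netloc
    seen, queue = {start_url}, [start_url]
    site_map = []
    while queue and len(site_map) < max_pages:
        url = queue.pop(0)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue  # unreachable or unreadable page; skip it
        site_map.append(url)
        for link in extract_links(url, html):
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)
    return site_map
```

Note that urllib honours the http_proxy/https_proxy environment variables by default, which covers the proxy requirement from the original question.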
For security scanning:
w3af and Nikto (or Wikto, its Windows counterpart)
HP WebInspect (formerly SPI Dynamics)
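One caveat for the unlinked /forum case in the original question: a link-following crawler only sees what is linked, so hidden paths have to be guessed. Scanners like Nikto do this by probing a large wordlist of common paths; a minimal sketch of that idea, using a made-up five-entry wordlist as a placeholder, might look like:

```python
# Sketch of wordlist-based discovery of unlinked folders. The wordlist
# here is an illustrative placeholder; real scanners use thousands of
# entries. Only run this against sites you are authorised to test.
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin
from urllib.request import urlopen

COMMON_PATHS = ["forum", "admin", "blog", "wiki", "backup"]  # placeholder


def candidates(base_url, paths=COMMON_PATHS):
    """Build the candidate URLs to probe under a base URL."""
    return [urljoin(base_url, p + "/") for p in paths]


def probe(base_url):
    """Request each candidate; any response other than 404 suggests
    the path exists (401/403 still prove there is something there)."""
    found = []
    for url in candidates(base_url):
        try:
            status = urlopen(url, timeout=10).status
        except HTTPError as err:
            status = err.code
        except URLError:
            continue  # host unreachable; skip
        if status != 404:
            found.append((url, status))
    return found


if __name__ == "__main__":
    for url, status in probe("http://mysite.com/"):
        print(status, url)
```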