4) metasploit (msfcrawler)
Metasploit ships with a crawler auxiliary module:
use auxiliary/crawler/msfcrawler
msf auxiliary(msfcrawler) > set RHOSTS www.example.com
msf auxiliary(msfcrawler) > run
You might want to check my other Metasploit posts for more detail.
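If you would rather not type these commands interactively each time, the same steps can be saved as a Metasploit resource script, a plain text file of console commands. A minimal sketch (the filename and target host below are placeholders, not from the original post):

```
# crawl.rc -- load with: msfconsole -q -r crawl.rc
use auxiliary/crawler/msfcrawler
set RHOSTS www.example.com
run
exit
```

msfconsole replays each line in order, so this is handy for repeating the same crawl against several targets.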
5) httrack
httrack http://192.168.x.x -O ~/Desktop/file
HTTrack mirrors the site for you, visiting and downloading every page it can find. Sometimes this is a very useful option, since you end up with a full offline copy to examine.
6) Burp Suite
Burp Suite has a built-in spider: right-click on a request and choose 'Send to Spider'.
7) wget -r
wget -r http://192.168.x.x
wget can recursively download a site (similar to httrack)
https://www.gnu.org/software/wget/manual/wget.html#Recursive-Retrieval-Options
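To see what recursive retrieval is actually doing under the hood, here is a minimal Python sketch of its core step: extract the links from a fetched page, resolve them against the page's URL, and keep only those on the same host (the default scope wget stays within). The sample HTML and IP address are made up for illustration; a real crawler would fetch each kept link and repeat.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags, like the link scan
    a recursive downloader performs on each page."""
    def __init__(self, base_url):
        super().__init__()
        self.base = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page URL.
                    self.links.add(urljoin(self.base, value))

def same_host_links(html, base_url):
    """Return absolute links that stay on base_url's host --
    the scope filter applied during recursive retrieval."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    host = urlparse(base_url).netloc
    return sorted(l for l in parser.links if urlparse(l).netloc == host)

# Hypothetical page: one internal link, one external link.
page = '<a href="/about.html">About</a> <a href="http://other.example/x">Ext</a>'
print(same_host_links(page, "http://192.168.1.10/index.html"))
# → ['http://192.168.1.10/about.html']
```

The external link is dropped by the host filter; the internal one would be queued for the next round of fetching.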