Packages and Binaries:


Fast Go web crawler for gathering URLs and JavaScript file locations. It is basically a simple implementation of the awesome Gocolly library.

Installed size: 9.37 MB
How to install: sudo apt install hakrawler

Dependencies:
  • libc6
root@kali:~# hakrawler --help
Usage of hakrawler:
  -d int
    	Depth to crawl. (default 2)
  -h string
    	Custom headers separated by two semi-colons. E.g. -h "Cookie: foo=bar;;Referer:"
  -insecure
    	Disable TLS verification.
  -json
    	Output as JSON.
  -proxy string
    	Proxy URL. E.g. -proxy
  -s	Show the source of URL based on where it was found. E.g. href, form, script, etc.
  -size int
    	Page size limit, in KB. (default -1)
  -subs
    	Include subdomains for crawling.
  -t int
    	Number of threads to utilise. (default 8)
  -timeout int
    	Maximum time to crawl each URL from stdin, in seconds. (default -1)
  -u	Show only unique URLs.
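hakrawler takes its seed URLs on standard input, so it composes naturally with other tools in a shell pipeline. A few illustrative invocations follow; `example.com`, `targets.txt`, and the proxy address are placeholders, not values from this page:

```shell
# Crawl a single target to depth 3, printing only unique URLs.
echo "https://example.com" | hakrawler -d 3 -u

# Crawl every host listed in a file and keep only JavaScript file locations.
cat targets.txt | hakrawler -u | grep '\.js$'

# Route all requests through a local intercepting proxy (address is a placeholder).
echo "https://example.com" | hakrawler -proxy http://127.0.0.1:8080
```

Because output is plain newline-separated URLs, filtering with `grep` (as in the second example) is a common way to isolate the JavaScript files mentioned in the description above.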

Updated on: 2024-Feb-16