Hakrawler is a simple and fast tool for scanning and crawling web pages in a few minutes. Bug bounty hunters use it to crawl all the URLs of a target and easily store them in a text file. Hakrawler was created by Luke Stephens, known as “Hakluke” in the cybersecurity community, who also runs a YouTube channel under the same name. The tool is written in the Go language and gathers all URLs and JavaScript file locations of a website, discovering every endpoint and asset within a web application or website.
How does Hakrawler work?
When the user runs the hakrawler tool and supplies a target URL, the tool extracts URLs for the target's web pages from the Wayback Machine, the robots.txt file, and sitemap.xml files. The discovered URLs can be pages of the target itself, subdomains connected to the target, JavaScript files, forms, and external website URLs, and all of them appear in the output.
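For example, a target URL can be piped to hakrawler on standard input and the discovered URLs saved to a text file (example.com is just a placeholder domain):
echo https://example.com | hakrawler
echo https://example.com | hakrawler > urls.txt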
Features
- Simple to connect to other tools: it accepts hostnames from stdin and dumps plain URLs to stdout using the -plain flag (see the piping example after this list).
- Collects URLs from the Wayback Machine, robots.txt, and sitemap.xml files.
- Fast, because it is written in the Go language.
- Discovers new domains and subdomains as they are found throughout the crawling process.
- Output can be quickly filtered to narrow down the scope.
- Results can be exported in raw HTTP request format to files, which helps when feeding them to SQL injection tools such as SQLMap.
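As a small sketch of the stdin/stdout and filtering features mentioned above, hostnames can be piped in from a file and the output narrowed with standard shell tools (hosts.txt is a hypothetical file with one hostname per line, and the grep pattern is only an example that keeps JavaScript file URLs):
cat hosts.txt | hakrawler -u | grep "\.js$" | tee js-urls.txt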
How to install Hakrawler in Kali Linux?
These steps are the same for installing hakrawler on any Linux distribution, such as Parrot Security OS, Ubuntu, etc.
The only prerequisite is that the Go programming language should be installed first.
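You can confirm that Go is installed before continuing; the exact version printed will depend on your system:
go version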
Go to the GitHub page of the tool and copy the install command shown below:
go install github.com/hakluke/hakrawler@latest
Now the tool is installed, as shown in the image below.
To check that the tool is installed correctly and working, we will look at its help output.
The location where tools written in the Go language are stored:
cd ~/go/bin
hakrawler --help
But we have to make this tool globally accessible so that we can run it from any directory.
To make it available globally, follow these steps:
- Go to the location of the tool.
- Move the tool to the bin folder.
sudo mv hakrawler /usr/local/bin
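Once the binary has been moved, you can verify that it is reachable from any directory:
which hakrawler
hakrawler --help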
To see practically how to use the hakrawler tool, refer to “How To Get Subdomains And Juicy URLs with Hakrawler Tool?”.
Options
-d : Depth to crawl. The default is 2.
-dr: Disable following HTTP redirects.
-h string: Custom headers separated by two semicolons. For example, -h “Cookie: foo=bar;;Referer: http://example.com/”
-i: Only crawl inside the given path.
-insecure: It will disable TLS verification.
-json: It will give output in JSON format.
-proxy string: Proxy URL through which requests are routed, for example an intercepting proxy. For example, -proxy http://127.0.0.1:8080
-s: Show the source where each URL was found, for example href, form, script, etc.
-size int: Page size limit. The default is -1.
-subs: It will include subdomains for crawling.
-t : Number of threads. By default, it’s 8.
-timeout : Maximum time in seconds to spend crawling each URL from stdin. The default is -1.
-u: It will show only unique URLs.
-w: Show at which link the URL was found.
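Putting a few of these options together, a typical invocation might look like the sketch below; example.com is a placeholder target, and only flags documented above are used:
echo https://example.com | hakrawler -d 3 -subs -u -t 20
echo https://example.com | hakrawler -json -insecure -proxy http://127.0.0.1:8080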
Conclusion
Hakrawler is an excellent tool for extracting subdomains and URLs very quickly. It was created by the YouTuber and bug bounty hunter Luke Stephens (Hakluke), and it delivers results fast compared to other tools.