@0x0SojalSec tweeted out a pure genius one-liner for automated SQL injection pentesting, and while it was mind-blowing, it is also useful to dissect into its various elements. Along the way we can learn some great tools for command line penetration testing!
Check out the original tweet or the image below:
This is a great example of how automated toolkits can do a lot of work without costing a lot of time. So, let's dissect the command and learn five great command line tools from @0x0SojalSec's sorcery that will certainly prove useful on a pen-testing engagement.
Subfinder is a command line tool from ProjectDiscovery.io that accepts a root domain and returns a set of subdomains from historical DNS records. Whenever you rely on historical DNS records, the output is only as good as the service's repository of historical data, but ProjectDiscovery's service is top notch. For every domain I gave it, I got a full list of subdomains going back almost 10 years. So to start off, this command converts a list of root domains into a list of all known subdomains, expanding the potential attack surface.
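As a rough sketch of this first stage (example.com and the output filename are placeholders, not part of the original one-liner):

```shell
# Passively enumerate subdomains for one root domain.
# -silent suppresses the banner so the output pipes cleanly
# into the next tool in the chain.
subfinder -d example.com -silent > subdomains.txt
```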
ProjectDiscovery does it again with dnsx. This tool is a simple and handy utility for querying DNS records and can be compared to the dig command. It is a great complement to subfinder: while subfinder queries historical DNS records to provide the most complete list of attack surface possible, dnsx effectively narrows that list down to only the hosts that currently resolve.
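Chaining the two looks something like the following (domain and filenames are illustrative):

```shell
# Feed subfinder's historical results into dnsx so that only
# subdomains that still resolve today survive the filter.
subfinder -d example.com -silent | dnsx -silent > live-subdomains.txt
```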
The waybackurls command line tool acts as an API to The Wayback Machine internet archive. The benefit of this tool is that it gives similar results to a web-crawling or spidering tool such as wget, but without interacting with the attack surface itself, and thus provides a higher degree of stealth. That being said, if you don't have to worry about triggering an alert on the blue team for smashing the attack surface, you can use another spider tool to gain a result that is more current and potentially more valuable. Plus points for stealth here! 🥷
You can install waybackurls using the Go command line:
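Assuming a working Go toolchain (1.17 or later) and tomnomnom's repository, the install and a basic usage sketch look like this (the input filename is a placeholder for your list of live hosts):

```shell
# Install waybackurls into $GOPATH/bin (or $HOME/go/bin).
go install github.com/tomnomnom/waybackurls@latest

# Pull archived URLs for each live subdomain from the Wayback
# Machine -- no requests ever touch the target's own servers.
cat live-subdomains.txt | waybackurls > archived-urls.txt
```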
What are you going to do with all that lØØt? Filtering it again would be a good idea: eliminate any duplicates and remove the boring stuff, such as repetitive page formats (blog posts, media files, and the like). The uro tool removes:
- human written content e.g. blog posts
- urls with same path but parameter value difference
- incremental urls, e.g. numbered pages
- image, js, css and other static files
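A sketch of this filtering stage (uro reads URLs on stdin and prints the de-cluttered list; filenames are placeholders carried over from the earlier steps):

```shell
# Strip duplicates, static files, and incremental/near-identical
# URLs, leaving only distinct parameterized endpoints worth testing.
cat archived-urls.txt | uro > interesting-urls.txt
```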
Uro is a great tool for turning a chaotic list of URLs into one that is suitable for SQL injection testing. You can't SQL inject a JS or media file, and there is no sense attacking the same parameters over and over again. The OP also correctly states that while one-liners may get a lot of work done efficiently, they are no substitute for manual inspection and attacks. This filter may eliminate some potentially valuable links, but in an automated attack it effectively reduces the low-probability clutter.
Httpx is another tool from ProjectDiscovery and a good command line tool for filtering valid URLs. Httpx is a fast and multi-purpose HTTP toolkit that allows running multiple probes using the retryablehttp library. It is designed to maintain result reliability with an increased number of threads. Here is a quick installation and usage guide. The tool has many uses, including as a recon tool for identifying the web stack a particular website is using, but it is also an efficient way to filter a list of URLs and eliminate those that do not return a 200 response code.
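For the filtering use case, a minimal sketch might be (filenames are placeholders; -mc is httpx's match-status-code flag):

```shell
# Probe each candidate URL and keep only endpoints that answer
# with HTTP 200. -silent suppresses the banner output.
cat interesting-urls.txt | httpx -silent -mc 200 > live-urls.txt
```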
In this case, httpx provides a good secondary filter because it picks out only the currently valid URLs. Since the one-liner fetched the URLs from the Wayback Machine, some of them may be dead, and it's better not to probe too many dead URLs. You might seem like a bot. More points for stealth! 🥷
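Putting the pieces together, an approximation of the whole pipeline might look like the following. To be clear, this is my reconstruction, not the OP's exact command (which lives in the tweet image); the domain and flags are illustrative:

```shell
# Passive subdomain discovery -> live-host filter -> archived URLs
# -> de-clutter -> keep only live 200s. The result is a short list
# of candidate injection points for manual testing or a SQLi tool.
echo example.com \
  | subfinder -silent \
  | dnsx -silent \
  | waybackurls \
  | uro \
  | httpx -silent -mc 200 > targets.txt
```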