Waybackurls – Fetch All The URLs That The Wayback Machine Knows About For A Domain

Accept line-delimited domains on stdin, fetch known URLs from the Wayback Machine for *.domain, and output them on stdout.
Usage example:
▶ cat domains.txt | waybackurls > urls
Install:
▶ go get github.com/tomnomnom/waybackurls
Credit: This tool was inspired by @mhmdiaa's waybackurls.py script. Thanks to them for the great idea!
Download Waybackurls
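As a rough sketch of how waybackurls can slot into a wider recon pipeline (the subdomains.txt input and the extension filter are illustrative assumptions, not part of the tool itself):

$ cat subdomains.txt | waybackurls | sort -u > wayback-urls.txt
$ grep -iE '\.(js|json|php|bak)(\?|$)' wayback-urls.txt

The first command de-duplicates the archived URLs; the second is just one way to surface file types that are often worth a closer look.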

ClearURLs – An Add-On Based On The New WebExtensions Technology That Automatically Removes Tracking Elements From URLs To Help Protect Your Privacy

ClearURLs is an add-on based on the new WebExtensions technology and is optimized for Firefox and Chrome based browsers. The extension automatically removes tracking elements from URLs to help protect your privacy while browsing the Internet; the list of tracking elements it strips is regularly updated by the developers and can be found here. Application: Many websites use tracking elements… Read More
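To make "tracking elements" concrete, here is a minimal shell sketch that strips a few common tracking parameters (utm_*, fbclid, gclid) from a URL; these parameter names are generic examples, not ClearURLs' actual rule set:

$ echo 'https://example.com/page?id=42&utm_source=nl&fbclid=abc' | sed -E 's/[?&](utm_[^&=]*|fbclid|gclid)=[^&]*//g'
https://example.com/page?id=42

ClearURLs does this transparently in the browser, against a much larger and regularly updated set of rules.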

Galer – A Fast Tool To Fetch URLs From HTML Attributes By Crawling

A fast tool to fetch URLs from HTML attributes by crawling. Inspired by a Tweet from @omespino showing that it is possible to extract src, href, url and action values by evaluating JavaScript through the Chrome DevTools Protocol. Installation from Binary: Installation is easy. You can download a prebuilt binary from the releases page, unpack and run! Or with… Read More
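Typical invocations look roughly like the following; the -u flag and the stdin pipeline are assumptions based on common Go recon-tool conventions, so check galer -h for the actual interface:

$ galer -u https://example.com
$ cat urls.txt | galer > attribute-urls.txt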

UDdup – Urls De-Duplication Tool For Better Recon

The tool gets a list of URLs and removes "duplicate" pages, in the sense of URL patterns that are probably repetitive and point to the same web template. For example:
https://www.example.com/product/123
https://www.example.com/product/456
https://www.example.com/product/123?is_prod=false
https://www.example.com/product/222?is_debug=true
All of the above probably point to the same product "template". Therefore it should be enough to scan only some of these URLs by… Read More
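A minimal usage sketch, assuming the project exposes a uddup.py entry point with -u for the input list and -o for the output file (the flag names are assumptions; see the project's README for the real options):

$ python3 uddup.py -u urls.txt -o unique-urls.txt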

Sigurls – A Reconnaissance Tool, It Fetches URLs From AlienVault’s OTX, Common Crawl, URLScan, Github And The Wayback Machine

sigurls is a reconnaissance tool; it fetches URLs from AlienVault's OTX, Common Crawl, URLScan, GitHub and the Wayback Machine. Usage: To display the help message for sigurls, use the -h flag:
$ sigurls -h
(the help output opens with an ASCII-art banner, omitted here)… Read More
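A typical run might look like the following; the -d flag for the target domain is an assumption, so verify against the sigurls -h output:

$ sigurls -d example.com > urls.txt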

Urlhunter – A Recon Tool That Allows Searching On URLs That Are Exposed Via Shortener Services

urlhunter is a recon tool that allows searching on URLs that are exposed via shortener services such as bit.ly and goo.gl. The project is written in Go. How? A group named URLTeam (kudos to them) brute-forces the URL shortener services and publishes matched results on a daily basis. urlhunter downloads their collections and… Read More
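A sketch of a typical search, assuming flags along the lines of -keywords, -date and -o (the exact flag names are assumptions; run urlhunter -h for the authoritative list):

$ echo "example.com" > keywords.txt
$ urlhunter -keywords keywords.txt -date latest -o results.txt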

OnionSearch – A Script That Scrapes URLs On Different .Onion Search Engines

OnionSearch is a Python3 script that scrapes URLs on different ".onion" search engines.
Prerequisite: Python 3
Currently supported search engines: ahmia, darksearchio, onionland, notevil, darksearchenginer, phobos, onionsearchserver, torgle, onionsearchengine, tordex, tor66, tormax, haystack, multivac, evosearch, deeplink
Installation with PyPI:
pip3 install onionsearch
Installation with GitHub:
git clone https://github.com/megadose/OnionSearch.git
cd OnionSearch/
python3 setup.py install
Usage: Help… Read More
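Once installed, a basic search can be run roughly like this; the --engines option for limiting which engines are queried is an assumption, so check onionsearch -h for the real options:

$ onionsearch "data breach"
$ onionsearch "data breach" --engines ahmia phobos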

dorkScanner – A Typical Search Engine Dork Scanner Scrapes Search Engines With Dorks That You Provide In Order To Find Vulnerable URLs

A typical search engine dork scanner that scrapes search engines with queries that you provide in order to find vulnerable URLs. Introduction: Dorking is a technique used by newsrooms, investigative organisations, security auditors, as well as tech-savvy criminals, to query various search engines for information hidden on public websites and vulnerabilities exposed by public servers. Dorking… Read More
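For context, the queries fed to such a scanner are ordinary search-engine dorks; the examples below are generic illustrations, not queries shipped with dorkScanner:

site:example.com inurl:admin
intitle:"index of" backup
filetype:sql "INSERT INTO"

Each one narrows results with an operator (site:, inurl:, intitle:, filetype:) so that exposed panels, directory listings or files surface in the results.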

Extended-XSS-Search – Scans For Different Types Of XSS On A List Of URLs

This is the extended version based on the initial idea already published as "xssfinder". This private version allows an attacker to perform not only GET but also POST requests. Additionally, it is possible to proxy every request through Burp or another tunnel. First steps: Rename the example.app-settings.conf to app-settings.conf and adjust the settings. It should work out of… Read More
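The first-steps instructions translate into something like the following; the entry-point script name is an assumption, only the config file names come from the description above:

$ cp example.app-settings.conf app-settings.conf
$ nano app-settings.conf
$ python3 extended-xss-search.py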
