Scans web applications for second-order subdomain takeover by crawling the app and collecting URLs (and other data) that match certain rules or respond in a certain way.
## Installation
### From binary
Download a prebuilt binary from the releases page and unzip it.
### From source
Go version 1.17 is recommended.
### Docker

```
docker pull mhmdiaa/second-order
```
## Command line options
```
  -target string
        Target URL
  -config string
        Configuration file (default "config.json")
  -depth int
        Depth to crawl (default 1)
  -header value
        Header name and value separated by a colon 'Name: Value' (can be used more than once)
  -insecure
        Accept untrusted SSL/TLS certificates
  -output string
        Directory to save results in (default "output")
  -threads int
        Number of threads (default 10)
```
## Configuration

- `LogQueries`: A map of tag-attribute queries that will be searched for in crawled pages. For example, `"a": "href"` means log every `href` attribute of every `a` tag.
- `LogNon200Queries`: A map of tag-attribute queries that will be searched for in crawled pages, and logged only if they contain a valid URL that doesn't return a 200 status code.
- `LogInline`: A list of tags whose inline content (between the opening and closing tags) will be logged, like `title` and `script`.
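As an illustrative sketch using only the three keys described above (the actual schema may support additional keys; treat this as an example, not the authoritative format), a minimal `config.json` might look like:

```json
{
  "LogQueries": {
    "a": "href",
    "img": "src",
    "script": "src"
  },
  "LogNon200Queries": {
    "a": "href"
  },
  "LogInline": ["title", "script"]
}
```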
## Output
All results are saved in JSON files that specify what data was found and where it was found.
The results of `LogQueries` are saved in `attributes.json`.
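To make the `LogQueries` semantics concrete, here is a minimal Python sketch (not the tool's actual implementation, which is a Go crawler) of how a tag-attribute query like `"a": "href"` matches against a page. The `QueryLogger` class and the sample HTML are hypothetical, for illustration only:

```python
from html.parser import HTMLParser

# Illustrative tag -> attribute queries, mirroring the LogQueries example above.
QUERIES = {"a": "href", "img": "src"}

class QueryLogger(HTMLParser):
    """Collects the value of the queried attribute for every matching tag."""

    def __init__(self, queries):
        super().__init__()
        self.queries = queries
        self.matches = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the opening tag
        wanted = self.queries.get(tag)
        if wanted is None:
            return
        for name, value in attrs:
            if name == wanted and value:
                self.matches.append(value)

page = '<a href="https://cdn.example.net/lib.js">x</a><img src="/logo.png"><a>no href</a>'
logger = QueryLogger(QUERIES)
logger.feed(page)
print(logger.matches)  # ['https://cdn.example.net/lib.js', '/logo.png']
```

A real scan would apply such queries to every crawled page and group the matches by the page they were found on.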