Cariddi - Take a list of domains, crawl URLs and scan for endpoints, secrets, API keys, file extensions, tokens and more...




Take a list of domains, crawl URLs and scan for endpoints, secrets, API keys, file extensions, tokens and more...


Installation

You need Go.

Linux

git clone https://github.com/edoardottt/cariddi.git
cd cariddi
go get
make linux (to install)
make unlinux (to uninstall)

Or in one line: git clone https://github.com/edoardottt/cariddi.git; cd cariddi; go get; make linux

Windows (the executable works only inside the cariddi folder)

git clone https://github.com/edoardottt/cariddi.git
cd cariddi
go get
.\make.bat windows (to install)
.\make.bat unwindows (to uninstall)

Get Started

cariddi -h prints the help in the command line.

Usage of cariddi:
-c int
Concurrency level. (default 20)
-cache
Use the .cariddi_cache folder as cache.
-d int
	Delay, in seconds, between one crawled page and the next.
-e Hunt for juicy endpoints.
-ef string
	Use an external file (txt, one per line) of custom parameters for endpoints hunting.
-examples
Print the examples.
-ext int
Hunt for juicy file extensions. Integer from 1(juicy) to 7(not juicy).
-h Print the help.
-i string
	Ignore URLs containing at least one of the comma-separated elements of this list.
-it string
	Ignore URLs containing at least one of the lines of this file.
-oh string
Write the output into an HTML file.
-ot string
Write the output into a TXT file.
-plain
Print only the results.
-s Hunt for secrets.
-sf string
	Use an external file (txt, one regex per line) with custom regexes for secrets hunting.
-version
Print the version.
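As a sketch of how the -ef and -sf input files are shaped (txt, one entry per line, as described above), the entries below are hypothetical examples, not defaults shipped with cariddi:

```shell
# Hypothetical wordlist for -ef: one endpoint parameter per line.
printf 'debug\nadmin\nredirect_url\n' > endpoints_file

# Hypothetical regex list for -sf: one pattern per line
# (this one loosely matches AWS access key IDs).
printf 'AKIA[0-9A-Z]{16}\n' > secrets_file

# Feed them to cariddi, if it is on your PATH and a urls file exists:
if command -v cariddi >/dev/null && [ -f urls ]; then
    cat urls | cariddi -e -ef endpoints_file
    cat urls | cariddi -s -sf secrets_file
fi
```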

Examples

cariddi -version (Print the version)

cariddi -h (Print the help)

cariddi -examples (Print the examples)

cat urls | cariddi -s (Hunt for secrets)

cat urls | cariddi -d 2 (Wait 2 seconds between one crawled page and the next)

cat urls | cariddi -c 200 (Set the concurrency level to 200)

cat urls | cariddi -e (Hunt for juicy endpoints)

cat urls | cariddi -plain (Print only the results)

cat urls | cariddi -ot target_name (Write the results to a TXT file)

cat urls | cariddi -oh target_name (Write the results to an HTML file)

cat urls | cariddi -ext 2 (Hunt for juicy (level 2 of 7) files)

cat urls | cariddi -e -ef endpoints_file (Hunt for custom endpoints)

cat urls | cariddi -s -sf secrets_file (Hunt for custom secrets)

cat urls | cariddi -i forum,blog,community,open (Ignore URLs containing these words)

cat urls | cariddi -it ignore_file (Ignore URLs containing at least one line of the input file)

cat urls | cariddi -cache (Use the .cariddi_cache folder as cache.)

On Windows, use powershell.exe -Command "cat urls | .\cariddi.exe"
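The flags above can also be combined in a single pass; a sketch, assuming the flags stack as in most CLI tools (the urls file and target_name below are placeholders):

```shell
# Build a throwaway target list (placeholder URL).
printf 'https://example.com\n' > urls

# One crawl that hunts secrets and endpoints, flags level-2 extensions,
# keeps the output plain, and writes a TXT report (if cariddi is installed):
if command -v cariddi >/dev/null; then
    cat urls | cariddi -s -e -ext 2 -plain -ot target_name
fi
```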

Contributing

Just open an issue or a pull request. See also CONTRIBUTING.md and CODE_OF_CONDUCT.md.

Help me build this!

A special thanks to:

zricethezav

To do:

Tests

Tor support

Proxy support

Ignore specific types of urls

Plain output (print only results)

HTML output

Build an Input Struct and use it as parameter

Output color

Endpoints (parameters) scan

Secrets scan

Extensions scan

TXT output


Reviewed by Zion3R on 8:30 AM. Rating: 5
