TL;DR

  • Recon is a loop, not a one-off: collect → triage → investigate → repeat.
  • Keep outputs versioned per run so you can spot what changed.
  • Separate discovery (wide) from validation (deep).
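
The "spot what changed" point can be as simple as diffing sorted per-run output. A minimal sketch using `comm` (the run-folder names and sample hostnames are made up for illustration):

```shell
# Two versioned run folders with sorted subdomain lists
# (sample data standing in for real tool output).
mkdir -p runs/old runs/new
printf 'a.example.com\nb.example.com\n' > runs/old/subs.txt
printf 'a.example.com\nb.example.com\nc.example.com\n' > runs/new/subs.txt

# comm needs sorted input; -13 keeps only lines unique to the
# second file, i.e. hosts that are new since the last run.
comm -13 runs/old/subs.txt runs/new/subs.txt
```

New hosts are usually the highest-signal part of a recon run, so this diff is worth checking before anything else.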

1) Scope → target list

Create a clean targets.txt (one root domain per line) and a folder per program.
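
Keeping that file clean is worth automating. A sketch that normalizes a raw scope dump (lowercase, comments and blank lines stripped, deduped); `scope_raw.txt` and the program folder name are illustrative:

```shell
# Raw scope dump standing in for a copy-pasted scope page.
printf 'Example.com\n# staging\nexample.com\n\napi.example.org\n' > scope_raw.txt

# lowercase -> drop comments -> drop blanks -> dedupe
tr 'A-Z' 'a-z' < scope_raw.txt | grep -v '^#' | grep -v '^$' | sort -u > targets.txt

mkdir -p programs/acme   # one folder per program (name is illustrative)
cat targets.txt
```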

2) Wide pass (daily/weekly)

  • subdomains
  • live HTTP endpoints
  • basic tech fingerprint + page title
  • screenshots (optional)
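
One way to wire those steps together, as a sketch: it assumes ProjectDiscovery's subfinder and httpx (swap in your own tooling), and defaults to DRY_RUN=1, printing each command instead of running it, so you can review the plan first.

```shell
# Wide-pass runner sketch. subfinder/httpx are assumptions, not
# requirements -- any enumerator + HTTP prober fits this shape.
DRY_RUN="${DRY_RUN:-1}"
RUN_DIR="runs/$(date +%Y-%m-%d)"   # one versioned folder per run
mkdir -p "$RUN_DIR"

run() {
  # DRY_RUN=1: print the command; otherwise execute it.
  if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi
}

# subdomains
run subfinder -dL targets.txt -silent -o "$RUN_DIR/subs.txt"
# live HTTP endpoints + basic tech/title
run httpx -l "$RUN_DIR/subs.txt" -silent -status-code -title -tech-detect -o "$RUN_DIR/live.txt"
```

The dated RUN_DIR is what makes the per-run diffing from the TL;DR possible.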

3) Deep pass (per interesting host)

  • crawling + JS extraction
  • parameter discovery
  • auth flow mapping
  • targeted fuzzing/scanning (not “spray and pray”)
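
For the JS-extraction and parameter-discovery steps, even plain grep on a downloaded bundle gets you surprisingly far. A minimal sketch (`app.js` and its contents are made up for illustration):

```shell
# Fake JS bundle standing in for a downloaded file.
cat > app.js <<'EOF'
fetch("/api/v1/users?role=admin");
const u = "/api/v1/orders";
axios.get("/internal/debug?token=x");
EOF

# Path-like string literals -> candidate endpoints
grep -oE '"/[A-Za-z0-9_/.?=&-]+"' app.js | tr -d '"' | sort -u

# Query-parameter names -> candidates for targeted fuzzing
grep -oE '[?&][A-Za-z0-9_]+=' app.js | tr -d '?&=' | sort -u
```

A dedicated crawler will find more, but the grep pass is fast enough to run on every interesting host.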

4) Notes that make you faster

  • “Where did I get this URL from?”
  • “Why do I think this host is special?”
  • “What did I already try?”

If you can’t answer those in 10 seconds, you’ll redo work.
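
Those three questions map cleanly onto one log line per host: source, hypothesis, attempts. A sketch of an append-only TSV log (the `note` helper, `notes.tsv`, and the sample entry are all made up):

```shell
# note HOST SOURCE WHY TRIED -- appends one dated, tab-separated line.
note() {
  printf '%s\t%s\t%s\t%s\t%s\n' "$(date +%F)" "$1" "$2" "$3" "$4" >> notes.tsv
}

note admin.example.com "wide pass subs.txt" "old Jira banner" "default creds: no"

cut -f2 notes.tsv   # quick scan of which hosts you already have notes on
```

Grep-able plain text beats a polished note app here: answering "what did I already try?" is one `grep hostname notes.tsv` away.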