Welcome to the N.Y.A.W.C documentation¶
A very useful web crawler for vulnerability scanning. Not Your Average Web Crawler (N.Y.A.W.C) is a Python application that enables you to crawl web applications for requests instead of URLs. It crawls every GET and POST request on the specified domain and keeps track of the request and response data. Its main purpose is to be used in web application vulnerability scanners.
How it works¶
1. Add your start request to the queue.
2. The crawler starts the first request in the queue (repeated until the ``max threads`` option is reached).
3. The crawler adds all requests found in the response to the queue (except duplicates).
4. The crawler returns to step #2 to spawn new requests.
Please note that if the queue is empty and all crawler threads are finished, the crawler will stop.
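The steps above can be sketched as a simple queue-based loop. This is an illustrative, single-threaded sketch rather than N.Y.A.W.C's actual implementation (which runs multiple crawler threads); the ``LINKS`` mapping is a hypothetical stand-in for the requests found in real HTTP responses.

```python
from collections import deque

# Hypothetical link graph standing in for real HTTP traffic:
# each "request" (here just a URL string) maps to the requests
# found in its response body.
LINKS = {
    "https://example.com/":  ["https://example.com/a", "https://example.com/b"],
    "https://example.com/a": ["https://example.com/b", "https://example.com/c"],
    "https://example.com/b": ["https://example.com/"],
    "https://example.com/c": [],
}

def crawl(start):
    """Crawl breadth-first: queue requests, skip duplicates, stop when the queue is empty."""
    queue = deque([start])          # step 1: add the start request to the queue
    seen = {start}                  # duplicate filter
    crawled = []
    while queue:                    # steps 2-4: repeat until the queue is empty
        request = queue.popleft()   # start the next request in the queue
        crawled.append(request)
        for found in LINKS.get(request, []):  # requests found in the response
            if found not in seen:             # except duplicates
                seen.add(found)
                queue.append(found)
    return crawled                  # queue empty: the crawler stops

print(crawl("https://example.com/"))
```

Because duplicates never re-enter the queue, the loop terminates once every reachable request has been crawled exactly once, mirroring the stop condition described above.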