
Crawler

A collection of crawler demos I wrote between October and December 2021, including a URL-collection script for SQL injection targets in penetration testing (it scrapes URLs from Bing and Baidu search results), a subdomain brute-forcing demo, crawlers that gather vulnerability information from major universities, and some basic code I wrote while getting started with web crawling.
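
To illustrate the kind of demo collected here, below is a minimal sketch of a search-engine URL collector in Python. It queries Bing's public search page with requests and parses the results with BeautifulSoup; the endpoint, query parameters, and CSS selectors are assumptions about Bing's current markup, and the actual scripts in this repository may be structured differently.

# Minimal sketch of a Bing search-result URL collector (assumed markup/selectors).
import requests
from bs4 import BeautifulSoup

HEADERS = {"User-Agent": "Mozilla/5.0"}  # browser-like UA to avoid trivial blocking

def collect_bing_urls(keyword, pages=3):
    """Collect result URLs for a keyword from the first few Bing result pages."""
    urls = []
    for page in range(pages):
        params = {"q": keyword, "first": page * 10 + 1}  # "first" paginates Bing results
        resp = requests.get("https://www.bing.com/search", params=params,
                            headers=HEADERS, timeout=10)
        soup = BeautifulSoup(resp.text, "html.parser")
        # Organic results are assumed to sit in <li class="b_algo"> blocks,
        # with the result URL on the title link inside <h2>.
        for link in soup.select("li.b_algo h2 a"):
            href = link.get("href")
            if href and href.startswith("http") and href not in urls:
                urls.append(href)
    return urls

if __name__ == "__main__":
    # Example dork-style query of the kind used when hunting SQL injection targets.
    for url in collect_bing_urls("inurl:asp?id="):
        print(url)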

How to download and set up Crawler

Open a terminal and run:
git clone https://github.com/ggfgh/Crawler.git
git clone creates a local copy of the Crawler repository. You pass git clone a repository URL; it supports several network protocols and their corresponding URL formats.

You can also download Crawler as a zip archive: https://github.com/ggfgh/Crawler/archive/master.zip

Or clone Crawler over SSH:
git clone git@github.com:ggfgh/Crawler.git
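
For reference, a typical clone-and-run sequence might look like the commands below. The dependencies and the script name are assumptions for illustration; adjust them to the files actually present in the repository.

git clone https://github.com/ggfgh/Crawler.git
cd Crawler
pip install requests beautifulsoup4   # assumed dependencies for the crawler demos
python bing_url_collector.py          # hypothetical script name, use the actual demo files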

If you run into problems with Crawler

You can open an issue on the Crawler issue tracker: https://github.com/ggfgh/Crawler/issues

Repositories similar to Crawler

Here are some Crawler alternatives and analogs:

scrapy, Sasila, Price-monitor, webmagic, colly, headless-chrome-crawler, Lulu, newcrawler, scrapple, goose-parser, arachnid, gopa, scrapy-zyte-smartproxy, node-crawler, arachni, newspaper, webster, spidy, N2H4, easy-scraping-tutorial, antch, pomp, talospider, podcastcrawler, FileMasta, lux, scrapy-redis, haipproxy, DotnetSpider, TumblThree