Libraries tagged by 64Robots
dr4g0nsr/sitemap-crawler
20 Downloads
Crawler for any type of site, using robots.txt and sitemap.xml as the source of URLs. Useful for cache regeneration.
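Reading the crawl targets from sitemap.xml (rather than discovering links by spidering pages) is the core idea behind a crawler like this. A minimal illustrative sketch in Python — not dr4g0nsr/sitemap-crawler's actual code — extracting the `<loc>` URLs from a sitemap document:

```python
import xml.etree.ElementTree as ET

# Sitemaps declare this XML namespace; element lookups must include it.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_from_sitemap(xml_text):
    """Return the <loc> URLs listed in a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

print(urls_from_sitemap(sample))
```

A crawler would then fetch each URL in turn, which is what makes this useful for cache regeneration: every public page is revisited without guessing paths.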
dawid-z/phpcrawl
86 Downloads
PHPCrawl is a webcrawler/webspider-library written in PHP. It supports filters, limiters, cookie-handling, robots.txt-handling, multiprocessing and much more.
critiq/laravel-head-manifest
116 Downloads
Define your HTML head attributes in a manifest shared between Laravel and an SPA project, so that robots crawling the page have access to up-to-date path-specific metadata without requiring server-side rendering on the SPA side.
beeg99/phpcrawl
3 Downloads
PHPCrawl is a webcrawler/webspider-library written in PHP. It supports filters, limiters, cookie-handling, robots.txt-handling, multiprocessing and much more.
su-sws/stanford_metatag_nobots
38 Downloads
stanford_metatag_nobots for blocking robots
su-sws/nobots
43957 Downloads
This module blocks (well-behaved) search engine robots from crawling, indexing, or archiving your site by setting a "X-Robots-Tag: noindex,nofollow,noarchive" HTTP header.
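The header value quoted above is a comma-separated list of directives that a well-behaved crawler must honor. A small illustrative Python sketch (not part of su-sws/nobots, which is a PHP/Drupal module) of splitting such a header into directives on the consuming side:

```python
def parse_x_robots_tag(header_value):
    """Split an X-Robots-Tag header value into normalized directives."""
    return [d.strip().lower() for d in header_value.split(",") if d.strip()]

# The value this module sets, per its description.
directives = parse_x_robots_tag("noindex,nofollow,noarchive")
print(directives)
```

A crawler seeing `noindex` would drop the page from its index, `nofollow` would stop it following the page's links, and `noarchive` would prevent it serving a cached copy.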
mll-lab/liquid-handling-robotics
11259 Downloads
PHP package to handle data from liquid-handling robots