Libraries tagged by 64Robots

dr4g0nsr/sitemap-crawler

3 Favers
20 Downloads

Crawler for any type of site that uses robots.txt and sitemap.xml as the source of URLs. Useful for cache regeneration.

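The core idea behind a sitemap-driven crawler like the one above is simple: fetch sitemap.xml and treat its `<loc>` entries as the URL list to visit. A minimal sketch of that parsing step (an illustration, not the package's own API):

```python
import xml.etree.ElementTree as ET

# Standard namespace used by sitemap.xml files per the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text: str) -> list[str]:
    """Extract the <loc> URLs from a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]
```

A cache-regeneration job would then simply request each returned URL to warm the cache.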
dawid-z/phpcrawl

1 Favers
86 Downloads

PHPCrawl is a webcrawler/webspider-library written in PHP. It supports filters, limiters, cookie-handling, robots.txt-handling, multiprocessing and much more.

critiq/laravel-head-manifest

1 Favers
116 Downloads

Define your HTML head attributes in a manifest shared between Laravel and an SPA project, so that robots crawling the page have access to path-specific metadata without requiring server-side rendering on the SPA side.

beeg99/phpcrawl

0 Favers
3 Downloads

PHPCrawl is a webcrawler/webspider-library written in PHP. It supports filters, limiters, cookie-handling, robots.txt-handling, multiprocessing and much more.

su-sws/stanford_metatag_nobots

2 Favers
38 Downloads

stanford_metatag_nobots for blocking robots

su-sws/nobots

1 Favers
43957 Downloads

This module blocks (well-behaved) search engine robots from crawling, indexing, or archiving your site by setting an "X-Robots-Tag: noindex,nofollow,noarchive" HTTP header.

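The mechanism the module describes is just an HTTP response header. As a standalone illustration (Python here rather than the module's Drupal/PHP code), this handler attaches the same header to every response:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class NoBotsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        # The header that tells well-behaved crawlers not to index,
        # follow links from, or archive this page.
        self.send_header("X-Robots-Tag", "noindex,nofollow,noarchive")
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>Not for search engines</body></html>")
```

Note that, as the description says, this only deters well-behaved robots; it is not an access control.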
mll-lab/liquid-handling-robotics

1 Favers
11259 Downloads

PHP package to handle data from liquid handling robots
