Package laravel-autocrawler
Short Description A tool to crawl your own laravel installation checking your HTTP status codes
License MIT
Homepage https://github.com/lobotomised/laravel-autocrawler
Information about the package laravel-autocrawler
Check your application for broken links
Using this package you can check whether your application has broken links.
Installation
This package can be installed via Composer:
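A typical Composer installation (the `--dev` flag is an assumption here; drop it if you want the crawler available in production):

```shell
composer require --dev lobotomised/laravel-autocrawler
```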
When crawling your site, the crawler automatically detects the URL your application is using. If it scans http://localhost instead, check that the APP_URL variable is properly configured in your .env file.
Usage
Crawl a specific url
By default, the crawler will crawl the URL of your current Laravel installation. You can force a different URL with the --url option:
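For example, assuming the package registers a `crawl` artisan command (the command name is an assumption, not confirmed by this page):

```shell
php artisan crawl --url=https://staging.example.com
```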
Concurrent connection
The crawler runs with 10 concurrent connections to speed up the crawling process. You can change that by passing the --concurrency option:
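For instance, to lower the load on the target server (again assuming a `crawl` artisan command):

```shell
php artisan crawl --concurrency=3
```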
Timeout
The request timeout is 30 seconds by default. Use the --timeout option to change this value:
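For example, to allow slow endpoints up to 60 seconds (command name assumed):

```shell
php artisan crawl --timeout=60
```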
Ignore robots.txt
By default, the crawler respects the robots.txt rules. These rules can be ignored with the --ignore-robots option:
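For example (command name assumed):

```shell
php artisan crawl --ignore-robots
```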
External links
When the crawler finds an external link, it will check that link as well. This can be deactivated with the --ignore-external-links option:
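For example, to restrict the crawl to your own application (command name assumed):

```shell
php artisan crawl --ignore-external-links
```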
Log non-2xx or non-3xx status code
By default, the crawler only reports results in your console. You can log every non-2xx or non-3xx status code to a file with the --output option. Results will be stored in storage/autocrawler/output.txt.
The output.txt file will look like this:
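The exact format is not shown on this page; a hypothetical sample, assuming one status code and URL per line:

```
404: https://example.com/missing-page
500: https://example.com/api/broken
```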
Fail when non-2xx or non-3xx responses are found
By default, the command exits with code 0 even when broken links are found. You can change it to exit with code 1, indicating failure, with the --fail-on-error option:
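This is mainly useful in CI pipelines, where a non-zero exit code fails the build (command name assumed):

```shell
php artisan crawl --fail-on-error
```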
Launch the robot interactively
Alternatively, you can configure the crawler interactively by using the --interactive option:
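For example (command name assumed):

```shell
php artisan crawl --interactive
```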
Working with GitHub actions
To execute the crawler you first need to start a web server. You can install Apache or nginx, but here is an example using the PHP built-in web server:
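A hypothetical workflow sketch; the `crawl` command name, PHP version, and artifact path are assumptions and should be adjusted to your project:

```yaml
name: crawl
on: [push]
jobs:
  crawl:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: shivammathur/setup-php@v2
        with:
          php-version: '8.2'
      - run: composer install --no-interaction --prefer-dist
      - run: cp .env.example .env && php artisan key:generate
      # Start the PHP built-in web server in the background
      - run: php artisan serve &
      - run: php artisan crawl --fail-on-error --output
      # Keep the crawl report even when the job fails
      - uses: actions/upload-artifact@v4
        if: failure()
        with:
          name: autocrawler-output
          path: storage/autocrawler/output.txt
```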
If the crawl finds any non-2xx or non-3xx response, the action will fail, and the result will be stored as an artifact of the Action.
Documentation
All commands and options are documented and available with the help command:
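For example, assuming the package registers a `crawl` artisan command:

```shell
php artisan help crawl
```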
Alternatives
This package is heavily inspired by spatie/http-status-check, but instead of being a project dependency, it is a global installation.
Testing
First, start the included Node HTTP server in a separate terminal.
Then to run the tests:
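A plausible sequence; the server path and test runner are assumptions, as the page does not show them:

```shell
# Terminal 1: start the bundled Node HTTP test server (path assumed)
node tests/server/server.js

# Terminal 2: run the test suite (runner assumed)
vendor/bin/phpunit
```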
Changelog
Please see CHANGELOG for more information on what has changed recently.
License
The MIT License (MIT). Please see License File for more information.
All versions of laravel-autocrawler with dependencies
illuminate/console Version ^10.0|^11.0
illuminate/filesystem Version ^10.0|^11.0
spatie/crawler Version ^8.2