Download the PHP package gverschuur/laravel-robotstxt without Composer


Information about the package laravel-robotstxt


Dynamic robots.txt ServiceProvider for Laravel 🤖

Installation

Composer

Manual

Add the following to your composer.json and then run composer install.
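A minimal require entry is sketched below; the wildcard constraint is just a placeholder, so pin it to whatever release matches your Laravel version.

```json
{
    "require": {
        "gverschuur/laravel-robotstxt": "*"
    }
}
```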

Service provider registration

This package supports Laravel's service provider autodiscovery, so no further setup is needed. If you wish to register the package manually, add the ServiceProvider to the providers array in config/app.php.
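For manual registration, the entry would look roughly like this. The class name below is an assumption; the authoritative name is listed under extra.laravel.providers in the package's composer.json.

```php
// config/app.php
'providers' => [
    // ...
    // Hypothetical class name; verify against the package source
    Gverschuur\RobotsTxt\RobotsTxtServiceProvider::class,
],
```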

Usage

Basic usage

This package adds a /robots.txt route to your application. Remember to remove the physical robots.txt file from your /public dir or else it will take precedence over Laravel's route and this package will not work.

By default, the production environment will show
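Presumably the conventional allow-all output (assumed from standard robots.txt practice):

```
User-agent: *
Disallow:
```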

while every other environment will show
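Presumably the conventional disallow-all output:

```
User-agent: *
Disallow: /
```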

This means a default install allows all robots on the production environment, while disallowing all robots on every other environment.

Custom settings

If you need custom sitemap entries, publish the configuration file
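The publish command follows Laravel's usual vendor:publish pattern; the provider class below is an assumption, so check the package documentation for the exact flag:

```shell
php artisan vendor:publish --provider="Gverschuur\RobotsTxt\RobotsTxtServiceProvider"
```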

This will copy the robots-txt.php config file to your app's config folder. In this file you will find the following array structure
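The published file presumably looks something like the sketch below. The key names here are assumptions inferred from the examples that follow; always compare against the file that vendor:publish actually copies into config/.

```php
// config/robots-txt.php (sketch; verify key names against the published file)
return [
    // Wrapper key holding one entry per environment (name is an assumption)
    'environments' => [
        'production' => [
            // One entry per robot (user-agent) name
            'paths' => [
                '*' => [
                    // An empty string disallows nothing, i.e. allows everything
                    'disallow' => [
                        '',
                    ],
                    'allow' => [],
                ],
            ],
            // Entries are relative to the webroot (see the Sitemaps section)
            'sitemaps' => [
                'sitemap.xml',
            ],
        ],
    ],
];
```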

In this array, the top-level keys name the environments, each entry under paths is a robot (user-agent) name, and the disallow and allow arrays hold the path entries for that robot.

By default, the environment name is set to production with a robot name of * and a disallow entry consisting of an empty string. This allows all bots to access all paths on the production environment.

Note: If you do not define any environments in this configuration file (i.e. an empty configuration), the default will always be to disallow all bots for all paths.

Examples

For brevity, the environment array key is omitted from these examples.

Allow all paths for all robots on production, and disallow all paths for every robot in staging.
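A sketch of this configuration, with the environment wrapper key omitted (key names are assumptions):

```php
// Allow everything on production, disallow everything on staging
'production' => [
    'paths' => [
        '*' => [
            'disallow' => [''],   // empty string = disallow nothing
        ],
    ],
],
'staging' => [
    'paths' => [
        '*' => [
            'disallow' => ['/'],  // '/' = disallow everything
        ],
    ],
],
```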

Allow all paths for robot bender on production, but disallow /admin and /images on production for robot flexo.
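A sketch with one entry per robot (key names are assumptions):

```php
'production' => [
    'paths' => [
        'bender' => [
            'disallow' => [''],   // bender may crawl everything
        ],
        'flexo' => [
            'disallow' => [
                '/admin',
                '/images',
            ],
        ],
    ],
],
```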

Allow directive

Besides the more standard disallow directive, the allow directive is also supported.

Allow a path, but disallow sub paths:
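A sketch of combining both directives (the paths are purely illustrative):

```php
'production' => [
    'paths' => [
        '*' => [
            'allow' => ['/images'],
            'disallow' => ['/images/private'],
        ],
    ],
],
```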

When the file is rendered, the disallow directives will always be placed before the allow directives.

If you don't need one of the directives and wish to keep the configuration file clean, you can simply remove that key from the array.

Sitemaps

This package also allows you to add sitemaps to the robots file. By default, the production environment adds a sitemap.xml entry to the file. You can remove this default entry from the sitemaps array if you don't need it.

Because sitemaps always need to be absolute URLs, they are automatically wrapped using Laravel's url() helper function. The sitemap entries in the config file should therefore be relative to the webroot.

The standard production configuration
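Presumably something like the following, with key names assumed as above:

```php
'production' => [
    'paths' => [
        '*' => [
            'disallow' => [''],
        ],
    ],
    'sitemaps' => [
        'sitemap.xml',
    ],
],
```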

Adding multiple sitemaps
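A sketch of a multi-sitemap entry; the extra file names are purely illustrative:

```php
'production' => [
    'sitemaps' => [
        'sitemap.xml',
        'sitemap-articles.xml',
        'sitemap-images.xml',
    ],
],
```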

Compatibility

This package is compatible with Laravel 9, 10 and 11. For a complete overview of supported Laravel and PHP versions, please refer to the 'Run test' workflow.

Testing

PHPUnit test cases are provided in /tests. Run the tests through composer run test or vendor/bin/phpunit --configuration phpunit.xml.

robots.txt reference

The following reference was used while creating this package:

https://developers.google.com/search/reference/robots_txt


All versions of laravel-robotstxt with dependencies

Requires php ^8.0 and laravel/framework ^9.0|^10.0|^11.0.
