Download the PHP package infusionweb/laravel-robots-route without Composer

On this page you can find all versions of the PHP package infusionweb/laravel-robots-route. You can download/install these versions without Composer; any dependencies are resolved automatically.

FAQ

After the download, you only need to add a single include, require_once('vendor/autoload.php');. After that, import the classes with use statements.
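As a sketch, a script using the downloaded vendor folder might look like this. It imports a class from the package's illuminate/support dependency; the file layout assumes the vendor folder sits next to the script:

```php
<?php

// Load the autoloader from the downloaded vendor folder.
require_once('vendor/autoload.php');

// Import a class from the bundled dependencies with a use statement.
use Illuminate\Support\Collection;

// Any class in the downloaded packages is now available.
$items = Collection::make([3, 1, 2])->sort()->values();
```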

Example:
If you use only one package, a project is not needed. But if you use more than one package, you cannot import the classes with use statements without a project.

In general, it is recommended to always use a project to download your libraries, since an application normally needs more than one library.
Some PHP packages are not free to download and are therefore hosted in private repositories. In this case, credentials are needed to access such packages. If a package comes from a private repository, please use the auth.json textarea to insert the credentials. You can look here for more information.

  • Some hosting environments are not accessible via a terminal or SSH, so Composer cannot be used there.
  • Using Composer can be complicated, especially for beginners.
  • Composer needs considerable resources, which are sometimes not available on a simple webspace.
  • If you are using private repositories, you don't need to share your credentials. You can set up everything on our site and then provide a simple download link to your team members.
  • Simplify your Composer build process: use our command line tool to download the vendor folder as a binary. This makes your build process faster, and you don't need to expose credentials for private repositories.

Information about the package laravel-robots-route

Laravel 5 robots.txt Route

A route for serving a basic robots.txt in Laravel 5.1+, based on configuration settings.

This is a fork of ellisthedev/laravel-5-robots, which was a fork of jayhealey/Robots, which was based on earlier work.

The purpose of this fork is to create a set-it-and-forget-it package that can be installed without much effort. It is therefore highly opinionated and not built for configuration.

When enabled, it allows access to all clients and serves up the sitemap.xml. Otherwise, it operates almost identically to Laravel's default configuration, denying access to all clients.
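For illustration, the two states might serve output like the following (the sitemap URL is a placeholder; the actual output depends on your application):

```
# When enabled (sitemap URL is illustrative):
User-agent: *
Allow: /
Sitemap: http://example.com/sitemap.xml

# Otherwise (matching Laravel's default robots.txt):
User-agent: *
Disallow: /
```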

Installation

Step 1: Composer

Via Composer command line:
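For example (package name as listed on this page):

```shell
composer require infusionweb/laravel-robots-route
```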

Or add the package to your composer.json:
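A sketch of the composer.json entry; the "^1.0" version constraint is an assumption, so check the package's released versions for the one you need:

```json
{
    "require": {
        "infusionweb/laravel-robots-route": "^1.0"
    }
}
```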

Step 2: Remove the existing robots.txt

Laravel ships with a default robots.txt which disallows all clients. It needs to be removed for the configured route to work.
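From the project root, assuming the stock public/ location:

```shell
rm public/robots.txt
```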

Step 3: Enable the route

Add the service provider to your config/app.php:
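The provider class name below is an assumption based on the package name; check the package's src/ folder for the actual namespace before using it:

```php
'providers' => [
    // ...
    // Assumed provider class name; verify against the package source.
    InfusionWeb\Laravel\Robots\RobotsServiceProvider::class,
],
```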

Publish the package config file:
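This is typically done with artisan's vendor:publish command; the provider class name here is an assumption based on the package name:

```shell
php artisan vendor:publish --provider="InfusionWeb\Laravel\Robots\RobotsServiceProvider"
```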

You may now allow clients via robots.txt by editing the config/robots.php file, opening up the site to search engines:
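A sketch of what the published config might contain; the 'allow' key name is an assumption, though the ROBOTS_ALLOW variable is documented below, so consult the published config/robots.php:

```php
<?php

return [
    // Allow all clients when true; deny all when false (assumed key name).
    'allow' => env('ROBOTS_ALLOW', false),
];
```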

Or simply set the ROBOTS_ALLOW environment variable to true, via the Laravel .env file or the hosting environment.
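In the .env file:

```
ROBOTS_ALLOW=true
```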

Credits

License

The MIT License (MIT). Please see License File for more information.


All versions of laravel-robots-route with dependencies

Requires php version >=5.5.9 and illuminate/support version ^5.1.
Composer command for our command line client (download client): this client runs in every environment, so you don't need a specific PHP version. The first 20 API calls are free.
