Download the PHP package frosh/robots-txt without Composer
On this page you can find all versions of the PHP package
frosh/robots-txt. You can download/install these versions
without Composer; any dependencies are resolved automatically.
Vendor: frosh
Package: robots-txt
Short description: Generate robots.txt
License: MIT
FAQ
After the download, you have to add a single include: require_once('vendor/autoload.php');. After that, you have to import the classes with use statements.
Example:
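A minimal sketch, assuming the package was downloaded into the current directory; the imported class name is illustrative and should be checked against the actual namespaces in the downloaded sources:

```php
<?php
// Load the autoloader that ships with the downloaded vendor folder.
require_once('vendor/autoload.php');

// Import a class from the package with a use statement.
// The class below is a plausible name for the plugin's base class,
// not a confirmed one; verify it in the package sources.
use Frosh\RobotsTxt\FroshRobotsTxt;
```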
If you use only one package, a project is not needed. But if you use more than one package, it is not possible to import the classes with use statements without a project.
In general, it is recommended to always use a project to download your libraries, since an application normally needs more than one library.
Some PHP packages are not free to download and are therefore hosted in private repositories.
In this case, credentials are needed to access such packages.
Please use the auth.json textarea to insert credentials if a package comes from a private repository.
You can look here for more information.
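For Composer's standard HTTP basic authentication, an auth.json looks like the sketch below; the hostname and credentials are placeholders:

```json
{
    "http-basic": {
        "repo.example.com": {
            "username": "your-username",
            "password": "your-password-or-token"
        }
    }
}
```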
Some hosting environments are not accessible via a terminal or SSH, so Composer cannot be used there.
Using Composer is sometimes complicated, especially for beginners.
Composer also needs considerable resources, which are not always available on a simple webspace.
If you are using private repositories, you don't need to share your credentials: you can set everything up on our site and then provide a simple download link to your team members.
Simplify your Composer build process: use our command line tool to download the vendor folder as a binary. This makes your build process faster, and you don't need to expose your credentials for private repositories.
Information about the package robots-txt
This plugin provides a robots.txt for a Shopware 6 shop. Currently it is not possible to distinguish between different user-agents. Note that in general only the robots.txt at the root of a domain is considered.
Allow and Disallow rules
Currently the following default rules exist:
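The authoritative list is defined in the plugin's template; the lines below are only a representative sketch of the kind of defaults shipped, not the exact set:

```
Disallow: /account/
Disallow: /checkout/
Allow: /widgets/cms/
```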
If you need to modify them, this should be done via a template modification. If there are other general rules that would be useful to others, consider creating a pull request.
It is possible to configure the Disallow and Allow rules in the plugin configuration. Each line needs to start with Allow: or Disallow:, followed by the URI path. The generated robots.txt will contain each path prefixed with the absolute base path.
For example, suppose you have two "domains" configured for a sales channel, example.com and example.com/en, together with a plugin configuration such as the following.
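The two rules below are hypothetical and purely illustrative:

```
Disallow: /private/
Allow: /private/public/
```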
The robots.txt at example.com/robots.txt then contains:
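With the hypothetical configuration above, each rule is emitted once per domain, prefixed with that domain's base path (/ for example.com and /en for example.com/en):

```
Disallow: /private/
Allow: /private/public/
Disallow: /en/private/
Allow: /en/private/public/
```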
Sitemaps
In addition to the rules, the sitemaps for each configured domain are linked. Again, suppose the domains example.com and example.com/en are configured; the robots.txt will then contain:
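Assuming Shopware's default sitemap location, the generated entries would look like this:

```
Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/en/sitemap.xml
```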
Composer command for our command line client (download client)
This client runs in every environment; you don't need a specific PHP version, etc. The first 20 API calls are free.
Standard composer command
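For comparison, installing the package with Composer directly:

```
composer require frosh/robots-txt
```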
Before you can download the PHP files, the dependencies have to be resolved. This can take a few minutes; please be patient.