Download the PHP package su-sws/nobots without Composer

This page lists all versions of the PHP package su-sws/nobots. You can download or install these versions without Composer; any dependencies are resolved automatically.

FAQ

After the download, add a single include, require_once('vendor/autoload.php');, and then import the classes you need with use statements.
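As a minimal sketch of this setup (Acme\Widget is a hypothetical class name, standing in for whatever classes your downloaded package actually provides):

```php
<?php
// Load the Composer-style autoloader that ships with the download.
require_once('vendor/autoload.php');

// Import a class from the downloaded package with a use statement.
// Acme\Widget is a placeholder; substitute a class from your package.
use Acme\Widget;

$widget = new Widget();
```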

Example:
If you use only one package, a project is not needed. But if you use more than one package, you cannot import the classes with use statements without a project.

In general, it is recommended to always use a project to download your libraries, since an application normally needs more than one library.
Some PHP packages are not free to download and are therefore hosted in private repositories. In that case, credentials are needed to access them. If a package comes from a private repository, enter the credentials in the auth.json textarea. You can look here for more information.
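Such credentials follow Composer's standard auth.json format. A sketch, with repo.example.com as a placeholder host and HTTP basic authentication assumed (other mechanisms such as github-oauth tokens use the same file):

```json
{
    "http-basic": {
        "repo.example.com": {
            "username": "your-username",
            "password": "your-password-or-token"
        }
    }
}
```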

  • Some hosting environments are not accessible via a terminal or SSH, so Composer cannot be used there.
  • Using Composer can be complicated, especially for beginners.
  • Composer needs considerable resources, which are sometimes not available on a simple webspace.
  • If you are using private repositories, you don't need to share your credentials. You can set everything up on our site and then provide a simple download link to your team members.
  • Simplify your Composer build process: use our command line tool to download the vendor folder as a binary. This makes your build process faster, and you don't need to expose your credentials for private repositories.

Information about the package nobots

No Bots

Version: 8.x

Maintainers: jbickar, sherakama, pookmish

Changelog.txt

This module blocks (well-behaved) search engine robots from crawling, indexing, or archiving your site by setting a "X-Robots-Tag: noindex,nofollow,noarchive" HTTP header.
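You can check whether the header is being set with curl (https://example.com is a placeholder for your site's URL):

```shell
# Inspect the response headers; with nobots enabled, the output
# should include: X-Robots-Tag: noindex,nofollow,noarchive
curl -sI https://example.com | grep -i x-robots-tag
```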

Enable the module to block search engine robots. Disable the module to let them crawl your site.

Installation

Install this module like any other module; see the Drupal documentation.

Configuration

Enable the module to block search engine robots. Disable the module to let them crawl your site.
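Assuming you manage the site with Drush, enabling and disabling could look like this:

```shell
# Block robots: enable the module.
drush pm:enable nobots -y

# Allow robots again: uninstall the module.
drush pm:uninstall nobots -y
```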

Advanced Configuration

You can set the nobots variable to FALSE to leave the module enabled, yet still allow robots to crawl your site.

An example use case would be to set $settings['nobots'] = FALSE; in settings.php for the production environment, and TRUE for all other environments.
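As a settings.php sketch of that setup:

```php
<?php
// settings.php
// Production: module stays enabled, but robots may crawl the site.
$settings['nobots'] = FALSE;

// All other environments (e.g. dev, staging): block robots.
// $settings['nobots'] = TRUE;
```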

You can also control this via the nobots state: drush @[alias] state:set nobots 1 sets the state to TRUE and blocks robots, while drush @[alias] state:set nobots 0 sets it to FALSE.
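Put together (here @prod is a hypothetical Drush site alias; substitute your own):

```shell
# Block robots on the aliased site.
drush @prod state:set nobots 1

# Allow robots again (the module stays enabled).
drush @prod state:set nobots 0

# Check the current value of the state.
drush @prod state:get nobots
```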

Troubleshooting

If you are experiencing issues with this module try reverting the feature first. If you are still experiencing issues try posting an issue on the GitHub issues page.

Contribution / Collaboration

You are welcome to contribute functionality, bug fixes, or documentation to this module. To suggest a fix or new functionality, you may add a new issue to the GitHub issue queue, or you may fork this repository and submit a pull request. For more help, see GitHub's article on forks, branches, and pull requests.

Releases

Steps to build a new release:

Deprecated

This repository and the su-sws/nobots package have been replaced by drupal/nobots.


All versions of nobots with dependencies

No information available.
Composer command for our command line client (download client): this client runs in any environment, and no specific PHP version is needed. The first 20 API calls are free.
