Laravel Robots.txt Package
Laravel Robots.txt is a robust, easy-to-use solution for automatically generating and serving dynamic robots.txt files in your Laravel application. The package provides intelligent caching, environment-based rules, and seamless integration with your application's routing system. The source code is hosted at https://github.com/fuelviews/laravel-robots-txt.
Requirements
- PHP ^8.3
- Laravel ^10.0 || ^11.0 || ^12.0
Installation
Install the package via Composer:
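Using the package name given above:

```shell
composer require fuelviews/laravel-robots-txt
```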
Publish the configuration file:
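The publish tag below follows the common spatie/laravel-package-tools naming convention and is an assumption; check the published package if it differs:

```shell
php artisan vendor:publish --tag="robots-txt-config"
```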
Basic Usage
Automatic Route Registration
The package automatically registers a route at /robots.txt that serves your dynamic robots.txt file.
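You can verify the route from the command line once the package is installed (the host below is a placeholder for your application's URL):

```shell
curl -i https://your-app.test/robots.txt
```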
Configuration
Configure your robots.txt rules in config/robots-txt.php.
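The published file might look like the following sketch. The exact key names (disk, user_agents, sitemaps) are assumptions based on the options described later in this document, so check your published copy:

```php
<?php

// config/robots-txt.php — illustrative sketch; verify key names against the published file
return [
    // Filesystem disk used to store the generated file (see "Disk Configuration")
    'disk' => 'public',

    // Rules grouped per user agent (see "User Agent Rules")
    'user_agents' => [
        '*' => [
            'allow' => ['/'],
            'disallow' => ['/admin'],
        ],
    ],

    // Sitemap URLs appended to the generated file (see "Sitemap Integration")
    'sitemaps' => [
        'sitemap.xml',
    ],
];
```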
Environment Behavior
Development/Staging Environments
In non-production environments (app.env !== 'production'), the package automatically generates a restrictive robots.txt.
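A restrictive robots.txt that blocks all crawlers typically looks like:

```txt
User-agent: *
Disallow: /
```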
This prevents search engines from indexing your development or staging sites.
Production Environment
In production, the package uses your configured rules to generate the robots.txt file.
Advanced Usage
Using the Facade
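A minimal sketch of facade usage. The facade class path and method name are assumptions, not confirmed by this document; consult the package source for the actual API:

```php
<?php

use Fuelviews\RobotsTxt\Facades\RobotsTxt; // facade class path is an assumption

// Generate (or fetch the cached) robots.txt content
$content = RobotsTxt::getContent(); // method name is an assumption
```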
Direct Class Usage
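Alternatively, resolve the underlying service from the container so its dependencies (config, filesystem) are injected. The class and method names below are assumptions:

```php
<?php

use Fuelviews\RobotsTxt\RobotsTxt; // class name is an assumption

// Resolve from the container rather than instantiating directly
$robotsTxt = app(RobotsTxt::class);
$content = $robotsTxt->getContent(); // method name is an assumption
```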
Named Routes
The package registers a named route that you can reference:
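For example, assuming the route is named robots (an assumption — run `php artisan route:list` to confirm the actual name):

```php
<?php

// Generate a URL to the registered robots.txt route
$url = route('robots'); // route name is an assumption
```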
Configuration Options
Disk Configuration
Specify which Laravel filesystem disk to use for storing the robots.txt file:
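A minimal sketch, assuming the option is named disk:

```php
// config/robots-txt.php (fragment; key name is an assumption)
'disk' => 'public', // any disk defined in config/filesystems.php
```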
User Agent Rules
Define rules for different user agents:
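An illustrative fragment; the key names are assumptions based on the standard robots.txt directives:

```php
// config/robots-txt.php (fragment; key names are assumptions)
'user_agents' => [
    '*' => [
        'disallow' => ['/admin', '/dashboard'],
    ],
    'Googlebot' => [
        'allow' => ['/'],
    ],
],
```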
Sitemap Integration
Include sitemap URLs in your robots.txt:
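An illustrative fragment; the key name is an assumption:

```php
// config/robots-txt.php (fragment; key name is an assumption)
'sitemaps' => [
    'sitemap.xml',
],
```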
This generates:
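Assuming a sitemap entry of sitemap.xml and your-app.test as the application URL (placeholders), the relevant output line would be:

```txt
Sitemap: https://your-app.test/sitemap.xml
```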
Caching System
The package uses an intelligent caching system that regenerates the robots.txt file only when:
- The configuration changes
- The application environment changes
- The application URL changes
- The cached file doesn't exist
Cache Management
Cache is automatically managed, but you can clear it manually:
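Clearing Laravel's application cache also clears the package's cached file:

```shell
php artisan cache:clear
```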
File Storage
Automatic Storage
The package automatically stores the generated robots.txt file to your configured disk at robots-txt/robots.txt.
Custom Storage
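To store the generated file elsewhere, point the package at a different filesystem disk. A sketch, assuming the option is named disk and an s3 disk is defined in config/filesystems.php:

```php
// config/robots-txt.php (fragment; key name is an assumption)
'disk' => 's3', // store the generated robots.txt on S3 instead of the default disk
```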
Example Generated Output
Production Environment
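A hypothetical production output, assuming a single disallow rule and one sitemap entry were configured (both are placeholders — the actual output reflects your configured rules):

```txt
User-agent: *
Disallow: /admin

Sitemap: https://your-app.test/sitemap.xml
```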
Non-Production Environment
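In non-production environments, the restrictive output blocks all crawlers:

```txt
User-agent: *
Disallow: /
```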
Testing
Run the package tests:
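Assuming the package follows the common `composer test` script convention (an assumption — check composer.json for the actual script name):

```shell
composer test
```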
Troubleshooting
Robots.txt Not Updating
If your robots.txt isn't reflecting configuration changes:
- Clear the application cache: php artisan cache:clear
- Ensure your configuration is valid
- Check file permissions for the storage disk
Route Conflicts
If you have an existing /robots.txt route or static file:
- Remove any static public/robots.txt file (the package removes it automatically)
- Ensure no other routes conflict with /robots.txt
Changelog
Please see CHANGELOG for more information on what has changed recently.
Contributing
Please see CONTRIBUTING for details.
Security Vulnerabilities
Please review our security policy on how to report security vulnerabilities.
Credits
- Joshua Mitchener
- Daniel Clark
- Fuelviews
- All Contributors
📜 License
The MIT License (MIT). Please see License File for more information.
Built with ❤️ by the Fuelviews team