
Bisteinoff SEO Robots.txt

Description

Have you run into obstacles while creating or editing the robots.txt file on your website?

Bisteinoff SEO Robots.txt is an easy-to-use plugin that helps you generate and configure a correct robots.txt file, which is essential for search engine optimization (SEO). This file defines crawling rules for search engine bots such as Google, Bing, Yahoo!, Yandex, and others.

The plugin works whether the robots.txt file has never been created or already exists. Once installed, the plugin generates an optimized robots.txt file that includes rules commonly recommended for WordPress websites. After that, you can apply further customization specific to your own website if needed.

If the plugin detects one or more XML sitemap files, it includes them in the robots.txt file.

No FTP access, manual coding, or file editing is required, which makes managing the settings easy and convenient!
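For illustration only, a generated file might look roughly like the following; the exact rules depend on your site and settings, and the sitemap URL is a placeholder:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml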

Key Features

  • Automatic generation of optimized robots.txt with WordPress-specific rules
  • Special rules for Google and Yandex search engines
  • Custom rules support for any search engine bot
  • Automatic sitemap detection and inclusion
  • WooCommerce compatibility with specific rules
  • Multisite support
  • Easy-to-use admin interface
  • Modern PHP architecture with namespaces for conflict-free operation

Screenshots

  • The Settings Page.
  • Error message shown when a physical (non-virtual) robots.txt file exists, and the functionality to fix it.
  • The message when the problem is fixed.

Installation

  1. Upload the db-robotstxt folder to the /wp-content/plugins/ directory
  2. Activate the plugin through the 'Plugins' menu in WordPress
  3. The plugin will automatically create a virtual robots.txt file
  4. Go to Settings > SEO Robots.txt to customize rules
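If you prefer the command line, a WP-CLI alternative looks like this; it assumes the directory slug of the plugin matches the db-robotstxt folder name:

wp plugin install db-robotstxt --activate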

FAQ

Will it conflict with any existing robots.txt file?

No, it will not. If a robots.txt file already exists in the root folder, it will not be overridden. Instead, the Settings page will show a notification with two options: rename or delete the existing robots.txt file. The plugin provides this functionality directly in the admin interface.

Could I accidentally block all search robots?

Once the plugin is installed, it works fine for all search engine robots. If you are not familiar with the rules for fine-tuning a robots.txt file, it is better to leave the file as is, or to first read a suitable manual to learn more about the directives used in robots.txt.

Note: the following directives would block the corresponding search robot(s) from crawling the entire site:

Disallow: /
Disallow: *
Disallow: /*
Disallow: */

(An empty Disallow: directive, by contrast, disallows nothing.) Use any of these site-wide directives only if you do not want any page of your website to be accessible for crawling.
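For example, the first group below blocks only a single (hypothetical) directory for all bots, while the commented-out directive would block the entire site:

User-agent: *
Disallow: /private/

# By contrast, this would block the whole site:
# Disallow: /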

Where can I read the up-to-date guide on robots.txt?

The official robots.txt documentation published by Google and Yandex is the best place to learn the current directives and how each search engine interprets them.

What happens when I update to version 4.0?

For regular users: Nothing changes! The plugin will automatically migrate all your settings. Everything continues to work exactly as before.

For developers: Version 4.0 introduces a complete code refactoring with modern PHP classes and namespaces. If you have custom code that references this plugin's functions, constants, or options, please review the migration information below.

Migration to v4.0 – Information for Developers

If you have custom code that integrates with this plugin, please note these changes:

Checking for deprecation notices: All deprecated elements will trigger _doing_it_wrong() notices when WP_DEBUG is enabled. Enable debug mode to identify any issues:
define('WP_DEBUG', true);
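A minimal wp-config.php sketch, assuming you also want the notices written to wp-content/debug.log rather than printed on the page (WP_DEBUG_LOG and WP_DEBUG_DISPLAY are standard WordPress constants, not specific to this plugin):

define( 'WP_DEBUG', true );          // enable debug mode so _doing_it_wrong() notices fire
define( 'WP_DEBUG_LOG', true );      // write notices to wp-content/debug.log
define( 'WP_DEBUG_DISPLAY', false ); // keep notices off the rendered pages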

Changed option names:
* db_robots_custom → bisteinoff_plugin_robots_custom
* db_robots_custom_google → bisteinoff_plugin_robots_custom_google
* db_robots_if_yandex → bisteinoff_plugin_robots_enable_yandex
* db_robots_custom_yandex → bisteinoff_plugin_robots_custom_yandex
* db_robots_custom_other → bisteinoff_plugin_robots_custom_other

Note: Options are migrated automatically. Old option names are removed from the database after successful migration.
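If your custom code reads these options directly, a minimal sketch of updating a call site could look like this; get_option() is the standard WordPress API, and the variable name is illustrative:

// Before (old option name, removed after migration):
$custom_rules = get_option( 'db_robots_custom' );

// After (new option name used since v4.0):
$custom_rules = get_option( 'bisteinoff_plugin_robots_custom' );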

Changed constants:
* DB_PLUGIN_ROBOTSTXT_VERSION → BISTEINOFF_PLUGIN_ROBOTS_VERSION
* DB_PLUGIN_ROBOTSTXT_DIR → BISTEINOFF_PLUGIN_ROBOTS_DIR

Note: Old constants remain defined for backward compatibility.
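Because the old constants remain defined, custom code can switch over defensively; this is only a sketch:

// Prefer the new constant, falling back to the old one on older plugin versions:
$plugin_dir = defined( 'BISTEINOFF_PLUGIN_ROBOTS_DIR' )
    ? BISTEINOFF_PLUGIN_ROBOTS_DIR
    : DB_PLUGIN_ROBOTSTXT_DIR;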

Changed functions (now deprecated):
* publish_robots_txt() → \Bisteinoff\Plugin\RobotsTXT\Generator::generate()
* db_robots_admin() → \Bisteinoff\Plugin\RobotsTXT\Admin::add_menu_page()
* db_robotstxt_admin_settings() → \Bisteinoff\Plugin\RobotsTXT\Admin::render_settings_page()
* db_settings_link() → \Bisteinoff\Plugin\RobotsTXT\Loader::add_settings_link()

Note: Deprecated functions continue to work for backward compatibility.
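A hedged sketch of migrating a call site, assuming the replacement methods are static and take no arguments as the list above suggests (check the plugin source for the exact signatures):

// Before (deprecated, still works but triggers _doing_it_wrong() under WP_DEBUG):
db_robots_admin();

// After (v4.0 namespaced class):
\Bisteinoff\Plugin\RobotsTXT\Admin::add_menu_page();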

Action required:
Update your custom code to use the new naming conventions. All deprecated elements will be removed after Feb 16th 2027.

Reviews

September 17, 2023
As an SEO specialist, I can say that this plugin is indispensable in everyday work. With it, I can quickly generate the desired file without unnecessary labor. It is convenient that there are separate blocks for entering rules for different search engines, since different indexing rules are often relevant for different search engines.
Read all 3 reviews

Contributors & Developers

"Bisteinoff SEO Robots.txt" is open source software. The following people have contributed to this plugin.

Contributors

"Bisteinoff SEO Robots.txt" has been translated into 3 locales. Thank you to the translators for their contributions.

Translate "Bisteinoff SEO Robots.txt" into your own language.

Interested in development?

Browse the code, check out the SVN repository, or subscribe to the development log via