
Posted on August 19, 2025 by Admin

Block Bad Bots with .htaccess (Copy/Paste Rules)

If your WordPress site is suffering from unwanted traffic caused by bad bots—scraping content, spamming forms, or consuming server resources—you can block them efficiently using .htaccess rules. This tutorial provides quick, copy-paste-ready .htaccess snippets to block common bad bots and protect your site.

Quick Fix: Block Bad Bots in .htaccess

  1. Access your WordPress site’s root directory via FTP or hosting file manager.
  2. Open or create the .htaccess file in the root folder.
  3. Copy and paste the provided bad bot blocking rules into the .htaccess file.
  4. Save the file and test your site to ensure it works correctly.

Why This Happens

Bad bots are automated scripts that crawl websites for malicious purposes such as scraping content, spamming, brute-forcing login pages, or overloading servers. Unlike good bots (such as Googlebot), bad bots ignore robots.txt rules and can cause:

  • High server load and slow site performance
  • Security vulnerabilities through brute force or injection attacks
  • Content theft and SEO penalties

Blocking these bots at the server level using .htaccess is an effective way to reduce unwanted traffic before it reaches WordPress.

Requirements

  • Apache web server with mod_rewrite enabled (a quick check is shown after this list)
  • Access to your site’s root .htaccess file
  • Basic knowledge of FTP or hosting file manager
  • Backup of your current .htaccess file before editing
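
To confirm that mod_rewrite is loaded, you can usually list Apache's modules from a shell (the exact command name varies by distribution; on Debian/Ubuntu it is typically apache2ctl). On shared hosting without shell access, ask your host instead.

# Look for "rewrite_module" in the list of loaded Apache modules
apachectl -M 2>/dev/null | grep rewrite
apache2ctl -M 2>/dev/null | grep rewrite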

Step-by-step: Block Bad Bots with .htaccess

  1. Backup your current .htaccess file. Always keep a copy before making changes.
  2. Access your site’s root directory. Use FTP or your hosting control panel’s file manager.
  3. Open the .htaccess file. If it doesn’t exist, create a new plain text file named .htaccess.
  4. Paste the following code at the top of the file, before WordPress rules:
# BEGIN Block Bad Bots
<IfModule mod_rewrite.c>
RewriteEngine On

# Block bad bots by User-Agent (note: the space in "Screaming Frog" must be escaped)
RewriteCond %{HTTP_USER_AGENT} (AhrefsBot|SemrushBot|MJ12bot|BLEXBot|DotBot|Screaming\ Frog|YandexBot|Exabot|Ezooms|Sogou) [NC]
RewriteRule .* - [F,L]

# Block bad bots by Referer (optional)
# RewriteCond %{HTTP_REFERER} ^.*(spamdomain1.com|spamdomain2.net).* [NC]
# RewriteRule .* - [F,L]

</IfModule>
# END Block Bad Bots
  5. Save the file. Upload it back if using FTP.
  6. Test your site. Visit your site to ensure it loads correctly.
  7. Verify blocking. Use online tools or curl commands to simulate bad-bot User-Agents and confirm they receive a 403 Forbidden response, as shown below.
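
For example, you can simulate a blocked bot with curl from any machine (example.com is only a placeholder; replace it with your own domain):

# A request sent with a blocked User-Agent should return 403 Forbidden
curl -I -A "AhrefsBot" https://example.com/

# A normal request should load as usual (200 OK or a redirect)
curl -I https://example.com/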

Common Pitfalls

  • Placing rules after WordPress block: Always add bad bot rules before the WordPress section in .htaccess to ensure they take effect.
  • Typos in User-Agent names: User-Agent strings are case-insensitive but must be spelled correctly to match.
  • Blocking legitimate bots: Avoid blocking well-known good bots like Googlebot or Bingbot to prevent SEO issues.
  • Server without mod_rewrite: These rules require Apache’s mod_rewrite module; they won’t work on Nginx or if mod_rewrite is disabled (an alternative is sketched after this list).
  • Overblocking: Be cautious with broad patterns to avoid blocking real users or services.
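
If mod_rewrite is unavailable but you run Apache 2.4, a rough alternative is a sketch based on mod_setenvif and mod_authz_core (it is not part of the rules above and requires AllowOverride to permit authorization directives in .htaccess):

# Flag bad bots by User-Agent with mod_setenvif, then deny them
<IfModule mod_setenvif.c>
BrowserMatchNoCase "AhrefsBot|SemrushBot|MJ12bot" bad_bot
<RequireAll>
Require all granted
Require not env bad_bot
</RequireAll>
</IfModule>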

Works on

  • Apache: Fully compatible (with mod_rewrite enabled)
  • Nginx: Not compatible (requires a different configuration)
  • LiteSpeed: Compatible (supports Apache .htaccess rules)
  • cPanel / Plesk: Compatible (edit .htaccess via the file manager or FTP)

FAQ

Q1: How do I find out which bots are bad?
A: You can check your server logs or use analytics tools to identify suspicious User-Agent strings. Common bad bots include AhrefsBot, SemrushBot, MJ12bot, and others known for scraping or spamming.
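For example, assuming the default combined log format, this shell one-liner lists the most frequent User-Agent strings in an Apache access log (the log path is an example and varies by host):

# Count the 20 most common User-Agent strings in the access log
awk -F'"' '{print $6}' /var/log/apache2/access.log | sort | uniq -c | sort -rn | head -20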
Q2: Can I block bad bots using plugins instead of .htaccess?
A: Yes, there are WordPress plugins that block bad bots, but .htaccess blocking is faster and reduces server load by stopping requests early.
Q3: What if I accidentally block a good bot?
A: Remove or adjust the User-Agent pattern from the .htaccess file and test again. Always verify User-Agent strings before blocking.
Q4: Will blocking bad bots improve my SEO?
A: Indirectly yes. Blocking bad bots reduces server load and prevents content scraping, which can protect your SEO rankings.
Q5: Can I block bots by IP instead of User-Agent?
A: Yes, but IP addresses can change frequently. Blocking by User-Agent is more flexible for bad bots that identify themselves.
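If you do want to block specific IPs as well, a minimal Apache 2.4 sketch looks like this (the addresses below are documentation placeholders; substitute the IPs you see abusing your site):

# Block individual IPs or ranges (Apache 2.4 syntax)
<RequireAll>
Require all granted
Require not ip 203.0.113.45
Require not ip 198.51.100.0/24
</RequireAll>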
Category: Speed & Security | Tags: .htaccess, Bots, Rate Limit, Security
