
Stop Paying for Bots: Smarter Traffic Filtering for Optimizely DXP

By Wezz Balk Reading time: 1 minute

As a senior solution architect working with Optimizely DXP, I've noticed a growing trend: bot traffic is steadily increasing across many client sites. These automated requests often target irrelevant or non-existent paths - like WordPress folders or .php files - and while they may seem harmless, they can trigger full 404 pages, consume unnecessary server resources, and in some cases, even count as page views.

The Hidden Cost of 404s

Most Optimizely DXP sites are well-structured and optimized for performance. But bots don't care about that. They'll crawl your site looking for vulnerabilities, outdated plugins, or admin panels that don’t exist - especially if your domain has been around for a while.
These bots often request things like:

  • /wp-admin/
  • /wp-login.php
  • /xmlrpc.php
  • Any number of .php files or WordPress-related folders
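Detecting these probes is cheap because the paths follow predictable patterns. As a rough illustration (these regexes are my own assumption, not the actual package's rule set), a matcher for the paths above could look like this:

```python
import re

# Hypothetical patterns covering the probe paths listed above; a real
# deployment would maintain a larger, configurable list.
BOT_PATH_PATTERNS = [
    re.compile(r"^/wp-admin(/|$)", re.IGNORECASE),
    re.compile(r"^/wp-login\.php$", re.IGNORECASE),
    re.compile(r"^/xmlrpc\.php$", re.IGNORECASE),
    re.compile(r"\.php$", re.IGNORECASE),  # no .php files exist on this stack
]

def is_probe(path: str) -> bool:
    """Return True if the request path matches a known bot probe pattern."""
    return any(p.search(path) for p in BOT_PATH_PATTERNS)
```

A handful of anchored regexes like this runs in microseconds, which is exactly why rejecting a probe early is so much cheaper than routing it through the CMS.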

Since these paths don’t exist in a typical Optimizely setup, the site responds with a friendly 404 page. That’s good UX for real users - but for bots, it's unnecessary overhead. And here’s the kicker: those 404s could still count as page views, depending on how your analytics and billing are set up.

Even worse, rendering a full 404 page takes more server resources than simply rejecting the request. In one case, we measured a 404 response taking around 28.0 milliseconds, while a properly handled 400 response took just 215.4 microseconds - roughly 130 times less CPU time per request.

A Smarter Way to Handle Irrelevant Traffic

To address this, we’ve developed a proprietary request guard package. It’s designed to sit in front of your application and filter out irrelevant or malicious traffic before it ever reaches your CMS.

Here's how it works:

  • It uses a blacklist of known bad patterns (like .php files or WordPress folders).
  • It applies base rules out of the box, but can be customized per implementation.
  • It classifies traffic as either valid or invalid and responds accordingly - often with a 400 Bad Request instead of a 404.
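The package itself is proprietary, but the classification step described above can be sketched in a few lines. Everything here is illustrative: the blacklist entries are examples, and `guard` and `handle_request` are hypothetical stand-ins for the guard middleware and the downstream CMS pipeline.

```python
from http import HTTPStatus

# Example blacklist entries (assumed for illustration, not the package's list).
BLOCKED_SUFFIXES = (".php", ".asp")
BLOCKED_PREFIXES = ("/wp-admin", "/wp-content", "/wp-includes")

def guard(path: str, handle_request):
    """Classify a request before it reaches the CMS.

    Invalid traffic is rejected cheaply with 400 Bad Request; everything
    else falls through to normal routing (which may still return a 404).
    """
    lowered = path.lower()
    if lowered.endswith(BLOCKED_SUFFIXES) or lowered.startswith(BLOCKED_PREFIXES):
        return HTTPStatus.BAD_REQUEST, ""  # no page rendering, no page view
    return handle_request(path)
```

The key design choice is that the 400 branch returns immediately with an empty body: no template rendering, no content lookup, and nothing for analytics or billing to count as a page view.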

While the package was originally built with Optimizely DXP in mind, it's fully compatible with any .NET-based website.

Real-World Results

One of our clients was seeing a significant amount of bot traffic - enough to skew their analytics and potentially inflate their page view billing. After implementing our request guard, they saw a drastic shift:

  • Over 50% of what used to be 404 traffic is now correctly classified as 400.
  • In the last 30 days, more than 65% of failed requests were 400s, meaning they're no longer counted as page views.

While we can't directly quantify the cost savings (those numbers are usually calculated annually and kept internal), the performance gains and cleaner traffic data speak for themselves.

Why This Matters to You

If you’re running a site on Optimizely DXP - or any .NET platform - you should be asking yourself:

  • How much of my traffic is actually relevant?
  • Am I paying for bot traffic that could be filtered out?
  • Are my analytics skewed by automated requests?

This isn’t just a technical issue - it’s a business one. Cleaner traffic means better insights, faster performance, and potentially lower costs.

We'd love to hear what you think about this blog post.

Wezz Balk

Developer | Tech Lead
