
As of 2024, more website visits now come from automated software ‘bots’ than from real people. For businesses, that shift matters for two reasons: firstly, it raises your security risk, and secondly, it can waste energy if you don’t manage it.

In this article, we look at the types of bot business websites might encounter, what you can do to protect yourself from the nasty ones, and why sustainable web design can also help reduce the environmental impact of bot traffic.

First, what is a bot?

A bot is a piece of software that visits websites automatically.

Not all bots are created equal. Some can be helpful – for example search engine spiders that index your website, accessibility tools and uptime checkers that make sure your site is live – while others are unhelpful or even actively harmful (such as tools that try lots of passwords to attempt to break into your site, or bots that copy your content).

There is also a growing number of bots that crawl pages to train AI systems such as Large Language Models, along with AI agents: software that accesses and uses a website on behalf of a human (for example, to make a purchase or a booking).

How does bot traffic impact your website?

Bots bring issues that human visitors (mostly) don’t. These issues fit broadly into three categories:

  • Cost and energy. Every unnecessary visit uses bandwidth and server power. That’s extra energy being consumed by your site and, at scale, extra carbon being released into the atmosphere.
  • Security. More bot traffic means more attempts to log in, copy content, or poke at weak spots in your website’s defences (see below).
  • User experience. When bots hog server resources, real customers get a slower site.

Bad bots and cybersecurity

Research by cybersecurity company Imperva says bots now account for 51% of all internet traffic, with 37% being malicious. There’s also been a sharp rise in API-targeted attacks, fuelled largely by easy-to-use AI.

Many of these ‘bad bots’ are aimed at business websites, and can cause all sorts of obvious, as well as unseen, issues.

The types of bad bot business websites see most often include:

  • Login bots. These try thousands of email and password combinations to break into the admin area of a website. The ultimate goal is to take over the account and either steal data or trash the website.
  • Form and comment spammers. These auto-submit your contact form or blog comments with junk links, leading to reputational damage, time lost sifting through the junk, and flooded inboxes – all of which also carries a digital sustainability cost.
  • Content scrapers/copycats. These bots automatically lift copy and images from your website to distribute elsewhere. This can lead to your intellectual property appearing elsewhere (impacting your search engine ranking) and competitors undermining your USP.
  • Fake shoppers. These bots will add items to basket or check out items in order to mess with your stock levels. This can lead to distorted analytics, unhappy real customers, and lost sales.
  • Coupon and gift-card testers. This type of bot can automatically try thousands of discount codes to find ones that work, leading to lost revenue and payment processing headaches.
  • Click and ad fraud bots. This type of bot imitates visits or ad clicks to drain budgets and inflate traffic figures, leading to wasted ad spend and misleading reports.
  • Vulnerability scanners. These nasty little bots crawl your website looking for weak spots and out-of-date plugins. Their main goal is usually either to leave malware or steal data.
  • Over-eager AI crawlers. These bots use large parts of your website to train or power AI tools, often without much respect for your bandwidth. This can lead to higher hosting load and costs, as well as slower pages for real visitors.

Learn more about our experience passing our IASME Cyber Essentials certification.

Protecting your website from bad bots

Since not all bots are bad, the goal is to allow the helpful ones through while blocking or mitigating the impact of harmful ones. It’s a tricky balance to strike, but an important one nonetheless.

A few things you can do to tip the balance in favour of the goodies include:

  • Use bot detection tools like Cloudflare, hCAPTCHA or Akismet, which use behavioural analysis to detect and filter out bad bots. They can help reduce the risk of fraud, spam and unauthorised access, among other things.
  • Apply rate limiting to login forms, which restricts the number of requests a single user (or bad bot) can make within a certain time. This prevents attacks like password-guessing login attempts, credential stuffing and excessive scraping.
  • Multi-factor authentication (MFA) will further protect your login forms. Give each admin a unique account to help stop password-guessing bots from gaining access.
  • Implement a Web Application Firewall (WAF) or a bot management tool to block known malicious bots. These not only distinguish between good and bad bots and block the nefarious ones, they adapt to evolving bot tactics to keep protections up-to-date.
  • Create custom security rules to block traffic from places you don’t expect to get visitors from. You can also create rules to block suspicious IP addresses and visits from outdated browsers (all suggestive of malicious activity).
  • Set a honeypot trap, such as an invisible form field or a decoy login page. Only bots will find these, and they can be used to blacklist bot traffic in the background without affecting the user experience for the rest of your site.
  • Protect shops and bookings – don’t reserve stock or appointments until payment is confirmed. You can also add limits to ‘add to basket’ and price-check requests. This will reduce the business impact of fake shopper bots and scalpers.
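The rate-limiting idea above can be sketched in a few lines. This is a minimal, illustrative sketch only (the `createRateLimiter` helper and its in-memory store are assumptions, not from this article) – in practice most sites would use their web server’s or CDN’s built-in rate limiting instead:

```javascript
// Minimal sliding-window rate limiter: allow at most `limit` requests
// per client (keyed by IP) within the last `windowMs` milliseconds.
// Illustrative only – an in-memory Map won't survive restarts or scale
// across servers; real deployments use server/CDN-level rate limiting.
function createRateLimiter(limit, windowMs) {
  const hits = new Map(); // ip -> timestamps of recent requests

  return function isAllowed(ip, now = Date.now()) {
    // Keep only requests that still fall inside the window.
    const recent = (hits.get(ip) || []).filter((t) => now - t < windowMs);
    if (recent.length >= limit) {
      hits.set(ip, recent);
      return false; // over the limit: reject (or slow down) this request
    }
    recent.push(now);
    hits.set(ip, recent);
    return true;
  };
}
```

Applied to a login form with, say, five attempts per minute, this blunts password-guessing and credential-stuffing bots without real visitors ever noticing.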
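The honeypot trap can be sketched as a simple server-side check. This is an illustrative sketch, assuming a form with a hidden field – named `website` here purely as an example – that real visitors never see or fill in:

```javascript
// Honeypot check: the form includes a field hidden from humans via CSS,
// so any submission that fills it in almost certainly came from a bot
// auto-completing every field it finds. ("website" is an arbitrary
// illustrative field name, not a requirement.)
function isLikelyBot(formData) {
  return typeof formData.website === "string" && formData.website.trim() !== "";
}

// Silently discard flagged submissions so the bot gets no feedback.
function handleSubmission(formData) {
  if (isLikelyBot(formData)) {
    return { accepted: false };
  }
  return { accepted: true };
}
```

Because the check runs invisibly in the background, legitimate visitors experience no extra friction – unlike a CAPTCHA.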

Need help protecting your website from bad bots? Get in touch to see how we can help.

The environmental impact of bots

Beyond the security implications, every visit to your website – whether human or bot – uses energy. When wasteful bots keep hitting your site, your hosting has to work harder for no extra benefit. That’s higher electricity use and, at scale, extra carbon in the atmosphere.

Employing low-carbon web design techniques can help cut the waste (without hurting SEO):

  • Make pages lighter. Compress and lazy-load images, use modern formats (such as WebP or AVIF), and remove scripts you don’t really need.
  • Cache more. Ask your host to cache pages at the edge so your server isn’t rebuilding them for every request.
  • Click-to-load embeds. If you have to use maps, videos or social feeds, don’t have them load in automatically – let people click to fetch them.
  • Guide the good bots. Keep your XML sitemap accurate so useful bots like search engines know where to go, and use robots.txt to steer crawlers towards your higher-value pages.
  • Trim old content. Fewer, better pages mean fewer pointless bot visits.
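The click-to-load idea above can be sketched as a small “facade” helper: the page shows a lightweight placeholder (such as a thumbnail with a play button), and the heavy third-party embed is only fetched when the visitor actually clicks. This is an illustrative sketch – the function names and the YouTube embed URL pattern are assumptions, not from this article:

```javascript
// Build the iframe markup for a video embed, but only when asked for.
// Until then, no third-party scripts, cookies or media are downloaded.
function buildEmbedHtml(videoId) {
  // Illustrative: YouTube's standard /embed/ URL pattern.
  return `<iframe src="https://www.youtube.com/embed/${videoId}" loading="lazy" allowfullscreen></iframe>`;
}

// Wire a placeholder element so the embed loads only on first click.
function enableClickToLoad(placeholder, videoId) {
  placeholder.addEventListener(
    "click",
    () => {
      placeholder.innerHTML = buildEmbedHtml(videoId);
    },
    { once: true }
  );
}
```

The same pattern works for maps and social feeds: swap the iframe source, keep the click-to-fetch behaviour.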

Learn more about making your website more energy efficient in our Website Optimisation Guides.

Can you block bots entirely? And should you?

The short answer is: no, you can’t. And you wouldn’t want to either. That’s because the benefits of good bots – like search engines, accessibility tools and uptime monitors – outweigh (for the most part) the impact of the bad ones. The aim is to balance the two and:

  • Let the helpful bots in. Search engines and essential services should run smoothly.
  • Discourage the wasteful ones. Steer this annoying majority with robots rules and lean pages so they don’t guzzle up bandwidth.
  • Block the harmful few. Use your firewall and rate limits to stop obvious attacks in their tracks.

In short, don’t try to shut the web out. Welcome what helps your customers find you, turn away what adds no value, and block the handful that are there to cause harm. This approach will help keep your website findable, faster for real people, and lighter on energy use.

PrivacyJournal provides some guidance on how to block ChatGPT from scraping your website content.
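As an example of the robots.txt approach, OpenAI’s documented crawler identifies itself as `GPTBot`, so a site can ask it to stay away with a standard rule (other AI crawlers publish their own user-agent strings, which you would list in the same way; the `/search/` path and sitemap URL below are illustrative):

```
# Ask OpenAI's GPTBot not to crawl the site at all
User-agent: GPTBot
Disallow: /

# Everyone else may crawl, but skip low-value pages
User-agent: *
Disallow: /search/

Sitemap: https://www.example.com/sitemap.xml
```

Bear in mind robots.txt is a polite request, not an enforcement mechanism – well-behaved crawlers honour it, but genuinely bad bots ignore it, which is why the firewall and rate-limiting measures above still matter.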

Conclusion

As bots continue to overtake humans online, protecting your business doesn’t mean locking your website down – it simply means being thoughtful about how it’s built, managed and maintained.

By combining good cybersecurity practice with sustainable web design, you can protect your business, safeguard your customers and keep your site running efficiently for the visitors who matter the most: the real people.
