
Hello and welcome to issue 9 of Beneath the Surface where we dig into the world of sustainable web design and come together to untangle the green web.

Rethinking website traffic in the age of automation

For the first time ever, bots – automated software – account for more than half of all web traffic. Let that sink in for a sec. More than half.

Of course, not all bots are equal. Some – like search engines, accessibility tools and uptime monitors – are genuinely useful. Others are anything but.

From login-guessing scripts and comment spammers to AI crawlers vacuuming up your painstakingly produced content, these automated website visitors can create all manner of headaches for businesses.

And it’s not just about security (although bots do increase the risk of cyberattacks). They’re a waste of energy too, guzzling server bandwidth and generating emissions every time they hit your website – often without any human benefit at all. For businesses, that can mean higher hosting costs, more wasted electricity, and yet one more hurdle in the quest for digital sustainability.
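To get a feel for the scale of that waste, here’s a back-of-envelope sketch of the emissions from a single page view. The energy-intensity and grid figures are illustrative averages we’ve assumed for this example, not numbers from this newsletter – real models (and real grids) vary.

```python
# Rough back-of-envelope: grams of CO2e emitted per page view, based on
# the data transferred. Both constants below are assumed illustrative
# averages, not measured figures.

KWH_PER_GB = 0.81         # assumed network + data-centre energy per GB
GRID_G_CO2_PER_KWH = 442  # assumed average grid carbon intensity

def grams_co2_per_view(page_mb: float) -> float:
    """Estimate grams of CO2e for one view of a page weighing `page_mb` MB."""
    gb = page_mb / 1024
    return gb * KWH_PER_GB * GRID_G_CO2_PER_KWH

# A 2 MB page served to a bot delivers roughly this much CO2e for
# zero human benefit – and scrapers can request thousands of pages.
print(round(grams_co2_per_view(2.0), 3))
```

Multiply that per-view figure by the bot share of your traffic and the sustainability case for blocking wasteful crawlers makes itself.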

Keeping the bad bots out

The good news is that a few simple measures can make a big difference. At Root, we prioritise keeping websites lightweight and efficient – which helps from a carbon-generation perspective, but also gives bots less to attack. Other things you can do include:

  • Using multi-factor authentication (MFA) for all admin users.
  • Turning on firewalls and anti-spam protection for forms.
  • Minimising third-party scripts to reduce both risk and page weight.
  • Keeping any third-party extensions updated.

When it comes to bots, the aim isn’t to block everything. It’s to welcome the helpful, discourage the wasteful, and block the harmful few.
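One simple way to express that welcome/discourage split is your site’s robots.txt file. The AI-crawler user agents below (GPTBot, CCBot) are real, commonly cited examples, but note that compliance is voluntary – well-behaved bots honour these rules, while the harmful ones ignore them, which is why firewalls and the other measures above still matter.

```
# robots.txt – welcome the helpful, discourage the wasteful
User-agent: GPTBot        # OpenAI's training crawler
Disallow: /

User-agent: CCBot         # Common Crawl's scraper
Disallow: /

User-agent: *             # everyone else, e.g. search engines
Allow: /
Crawl-delay: 10           # a politeness hint; not universally honoured
```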

Every leaner page, every verified login, and every small performance tweak makes your website a little safer, and a little lighter on the planet.

For more on bots, head over to the Root blog: How do bots affect the security and environmental cost of your website?


Searching sustainably: SEO for Ecosia (and Qwant)

Ecosia – the tree-planting search engine – has a new partner: the French privacy-focussed search engine Qwant. Together, they’ve launched a European-based search index designed to make web search greener, fairer, and less reliant on Big Tech.

It’s enough to give you the happy tingles. But what does that mean for your business?

In our updated guide on how to show up on Ecosia, we explore how the change affects SEO, what Staan (the new shared index) is all about, and why optimising for Ecosia still means optimising for Bing – for now.

Some top tips include:

  • Using Bing Webmaster Tools to update sitemaps and view analytics
  • Installing the IndexNow plugin
  • Optimising for location-based searches and registering a pin on Bing maps
  • Using specific key phrases
  • Getting inbound links from respected sources
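On the IndexNow point: rather than waiting for Bing (and therefore Ecosia) to recrawl you, the IndexNow protocol lets you ping search engines the moment a page changes. A minimal sketch of building that ping is below – the endpoint and parameters follow the public IndexNow protocol, while the page URL and key are placeholders.

```python
# Sketch: notify IndexNow-compatible engines (Bing, and by extension
# Ecosia) that a page has changed. The site URL and key below are
# placeholders – you'd use your own key, hosted at your site root.
from urllib.parse import urlencode

def build_indexnow_ping(page_url: str, key: str) -> str:
    """Build the GET URL that submits a single changed page to IndexNow."""
    params = urlencode({"url": page_url, "key": key})
    return f"https://api.indexnow.org/indexnow?{params}"

ping = build_indexnow_ping("https://example.com/blog/new-post", "abc123")
print(ping)
# In production you'd then issue the request, e.g. with
# urllib.request.urlopen(ping)
```

If you’re on WordPress, the IndexNow plugin mentioned above does this for you automatically on publish.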

Read the updated blog for more: How to do SEO for Ecosia


🍄 Unearthed

~ Digital sustainability news, insights and tips from around the web.

#EcoWeb Report 2025

The latest #EcoWeb report takes a deep dive into how the climate sector is performing on digital sustainability. It benchmarks hundreds of organisations, highlighting both the progress made and the persistent gaps when it comes to low-carbon design and responsible data use. It found a huge disparity in emissions across the top 10% of best-performing websites, ranging from 0.01g of CO2e per visit to a massive 21g.

Making WordPress websites grid aware

In issue 8, we got very excited about grid-aware websites, which adapt based on the real-time intensity of the user’s local power grid and the provenance of the power. Since then, Nora Ferreirós and Nahuai Badiola have developed the Grid Aware WP plugin, which makes it easy for designers and users to see and reduce energy demand when the grid is running on fossil fuels.
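The core idea can be sketched in a few lines. Everything in this example – the threshold, the mode names, the function – is hypothetical and illustrative; it is not the Grid Aware WP plugin’s actual API.

```python
# Hypothetical sketch of grid-aware behaviour: pick a page variant based
# on the carbon intensity of the visitor's local grid. The 300 g/kWh
# threshold and the mode names are illustrative assumptions.

HIGH_INTENSITY_THRESHOLD = 300  # gCO2e per kWh – assumed cut-off

def choose_page_mode(grid_intensity: float) -> str:
    """Return 'low-power' when the grid is carbon-heavy, else 'full'."""
    if grid_intensity >= HIGH_INTENSITY_THRESHOLD:
        return "low-power"  # e.g. swap video for stills, defer scripts
    return "full"

print(choose_page_mode(450))  # fossil-heavy grid
print(choose_page_mode(120))  # renewables-heavy grid
```

In practice the intensity figure would come from a live grid-data source for the visitor’s region; the plugin handles that lookup for you.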

Just in time for Christmas, here’s Sprout*

A new no-code, low-carbon website builder called Sprout* has opened early access, claiming to help businesses build a website within minutes that emits up to 99% less CO2 than a standard site. Developed by a small team of purpose-driven developers and designers, Sprout* has the vision and potential to offer a genuinely sustainable alternative to mainstream platforms like Wix and Squarespace.

Chitch Software Development Kit

Chitch is a new development framework built specifically for low-carbon websites. It aims to combine the efficiency of static-site generation with the usability of a dynamic CMS – lowering both energy use and developer maintenance effort. And, as if that weren’t enough to love, it’s got a super-cute capybara mascot.

The climate cost of AI

Big Tech’s appetite for AI continues to grow – and so does its carbon footprint. This investigative article from the MIT Technology Review explores the staggering energy demands of large-scale AI systems, and asks whether ‘efficiency gains’ really offset the environmental cost. With AI increasingly seeping into the apps many of us use on a daily basis, it’s a question we all need to be asking ourselves.


☀️ Studio News

  • We passed our Cyber Essentials certification over the summer. We’ve already begun passing on what we learnt to our clients, and have made security improvements across loads of sites.
  • We’ve added a talented UX designer to our roster of trusted collaborators – May Viratikul has been helping us with wireframing, design and development.
  • Chris recently attended a Sustainability & AI talk at the Stockport Climate and Nature business forum and shared some of the key takeaways here.
  • We now offer Website Care Plans for new clients, even if we didn’t build your website originally. These give you access to ongoing maintenance and support to help you grow sustainably.

💚 Thank you for reading

This issue of Beneath the Surface was written by Paul Jardine and Becky Thorn. We’ll see you for issue 10 in the new year! ✌️

beautiful websites,
rooted in good ethics
