Not being seen? Are the robots beating you?
A well-thought-out website can be a highly efficient sales funnel for your business, but only if the right pages from your site appear in relevant searches. Something as minor as an incorrectly installed plug-in on your domain can mean pages aren't being indexed, leaving you effectively invisible to potential clients.
Visualise are a graphic design studio headed up by Simon Hutchings. After launching their revamped website last year at www.wearevisualise.co.uk, they came to us for help because they were having issues with their web search results: only legacy pages from their old site and staging server were appearing in Google, and none of the fresh content was being picked up. Everything looked good on the surface, so Simon granted us access under the hood to see how Google was viewing the website.
When Google starts to crawl and index a website, it first looks for a file called robots.txt. This tells search engines which parts of the site they are allowed to index and which they are not. In this case, however, the robots.txt file seemed to be corrupt – it was there, but as an empty zero-byte file. To Google this was acting like a No-Entry sign. A faulty plugin was corrupting the file, so we reset the file permissions, and within a few minutes Google could see a freshly generated robots.txt allowing access to all the pages that had previously been blocked. It was like opening the floodgates.
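To illustrate how crawlers interpret a robots.txt file, here is a minimal sketch using Python's standard-library `urllib.robotparser`. The rules and URLs below are hypothetical examples, not Visualise's actual file – they simply show a healthy robots.txt that permits everything except a staging area.

```python
# Illustrative sketch: how a crawler reads robots.txt rules.
# The rules and URLs are hypothetical, for demonstration only.
from urllib.robotparser import RobotFileParser

# A healthy robots.txt: allow everything, but keep crawlers out of /staging/
rules = """\
User-agent: *
Disallow: /staging/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A live portfolio page is crawlable...
print(parser.can_fetch("Googlebot", "https://example.com/portfolio/"))       # True
# ...but anything under the staging path is blocked.
print(parser.can_fetch("Googlebot", "https://example.com/staging/old-page")) # False
```

This is also why a damaged or unreadable robots.txt is so dangerous: if a crawler cannot be sure what it is allowed to fetch, whole sections of a site can end up unindexed.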
The results were staggering. Within 90 days the Visualise website saw a 900% increase in organic reach compared with the previous year, and the relevance of the pages coming up was radically improved too, reflected in a 50% reduction in bounce rate.
“Absolutely bowled over by the massive difference the Super Web Guys have made to our site’s visibility – it was like clicking an ‘on’ switch. And because of the much lower bounce rate, we also know the relevance of the pages coming up has really improved.”
Simon Hutchings, Visualise