Poorly Designed Websites Impact Rankings: Here’s Why

Website Design Impacts Rankings

There are numerous things that can work against you in your Google search rankings. You might encounter content scrapers, bad incoming links, negative SEO or social media backlash. Through it all, however, nothing harms you more than a poorly designed website.

A bad website just demonstrates to your users and Google that you don’t care enough to put together something of value. It’s an unforgivable sin particularly because of how simple it is to use a drag-and-drop website builder to create something nice.

Related topic: UX & UI in Digital Marketing

How, specifically, can you go wrong creating a bad website?

No Attention to On-Site SEO

A poorly designed site ignores on-page SEO. That means meta tags are missing or generated automatically from a template, leaving pages with duplicate titles and descriptions and inviting duplicate content penalties. It means default settings in website builders that make your site look the same as dozens of others, with very little to make it stand out. It also means lost potential in external and internal linking, content strategy and keyword usage.
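As an illustration, hand-written head tags give each page something unique for search engines to work with, where a template would stamp the same values everywhere. A minimal sketch; the page title and description below are hypothetical, not from this article:

```html
<head>
  <!-- A unique, descriptive title for this specific page -->
  <title>Handmade Oak Desks | Example Furniture Co.</title>
  <!-- A unique meta description, rather than one copied from a template -->
  <meta name="description" content="Browse solid oak desks, handmade to order, with free delivery.">
</head>
```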

All of this combines to leave you exposed and vulnerable to attacks from your competitors, intended to drown you out and limit your search visibility.

Poor Configuration Causing Duplicate Content

Duplicate content can arise in a number of ways. One is just by setting your title tags on every page to be your brand name.

It may seem like a good idea to put an overarching message on every page, but your title tags aren’t the place for it. Another issue is the default settings in many older eCommerce platforms, which create a handful of URLs for every product, all with the same details. eCommerce configuration in particular, with its lack of combined pages and canonicalization, is especially damaging.
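Where a platform does generate several URLs for one product, a canonical tag is the usual fix: it tells search engines which version to treat as the original. A minimal sketch, with a hypothetical product URL:

```html
<!-- Placed in the <head> of every URL variant of the same product page -->
<link rel="canonical" href="https://www.example.com/products/oak-desk">
```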

Paginated Posts Earning Thin Content Penalties

One common strategy in the past, which has persisted on some large networked websites in recent years, is dividing articles into many smaller pages. Wrote a top 10 list with 35-word entries? Split it into a dozen pages: an intro, one page per item, and a conclusion with links to other articles! It maximized pageviews, but it made sites suffer in load times, usability and thin content. Google dislikes these sorts of thin schemes to inflate pageviews, the pages-per-visit metric and affiliate impressions.

Poor Scripting and Coding Slowing Load Times

Scripts and plugins are great for expanding the features and shaping the user experience of a website. Poorly executed scripts, on the other hand, cause a lot of problems. For one thing, old, broken scripts can be an easy entry point for cybercriminals. For another, a broken script can slow down page loads, disrupt rendering so a page never displays properly, and throw browser errors that make users bounce.

In general, scripts – particularly broken scripts – increase load times, affect Core Web Vitals, damage the UX, and can disrupt Google’s web crawlers.
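One mitigation worth noting: non-critical scripts can be loaded without blocking rendering at all. A minimal sketch, with hypothetical file names:

```html
<!-- defer: download in parallel, run in order after the HTML is parsed -->
<script src="/js/site-enhancements.js" defer></script>
<!-- async: download in parallel, run as soon as it arrives -->
<script src="/js/analytics.js" async></script>
```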

Poor Planning Blocks the Googlebot

Oftentimes, when you’re developing a site, you want to host it live but keep it out of the index until you’re really ready to launch. One common way to do this is to set a noindex robots directive on each individual page; others block crawling site-wide with a Disallow rule in robots.txt. Note that these are not equivalent: noindex keeps pages out of the index, while robots.txt only stops the Googlebot from crawling, and Google no longer supports a noindex directive inside robots.txt. Either way, the Googlebot stays away during development. Then the site launches, and the Googlebot still stays away. Failing to remove those directives will greatly impede site indexing, or block it altogether.
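For reference, the two approaches look like this; both must be removed at launch:

```html
<!-- Per-page robots directive, placed in the <head> -->
<meta name="robots" content="noindex, nofollow">
```

```
# robots.txt at the site root: blocks crawling, not indexing
User-agent: *
Disallow: /
```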

Inconsistent User Experience Hides Value

A user visits your homepage.

  • How many clicks do they need to make to find something they want to see?
  • Does each of your navigation buttons require a click to open a drop-down just to see if what they want might be inside?
  • Do they need to click to a category and browse through pages of unsorted content to find something of interest?
  • To search your site, would they need to back out and visit Google?

There are tons of issues in the typical user experience that hide content or make your site otherwise difficult to navigate.

Color Choices Hide Text and Drive Users Away

That bright neon text on a black background looks cool in an early 2000s MySpace profile kind of way. The bright yellow bold Comic Sans text fits right in; it’s neon, it stands out and it catches the eye. Your sidebar navigation is nicely tucked away in a frame, so it’s always available. All of it combines to make your site incredibly hard to read or browse for any length of time. Worse, a background color too similar to your text color (gray on gray, gray on black, yellow on white) is a sign of hidden text and triggers Google’s black hat flags. You’ve created a recipe for a site penalty.
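As a rough sketch (the class names are hypothetical), here is the difference between contrast that reads as hidden text and contrast that is comfortable to read:

```css
/* Text barely distinguishable from its background: hard to read,
   and close enough to "hidden text" to invite scrutiny */
.risky { color: #888888; background-color: #777777; }

/* Dark text on a light background: WCAG recommends a contrast
   ratio of at least 4.5:1 for body text */
.readable { color: #222222; background-color: #ffffff; }
```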

Ad and Image Placement Make Value Hard to Find

We’ve all been to a site at some point where the layout just doesn’t seem to work. The images, the in-text banner ads, the sidebar ads, all flashing and drawing attention to themselves, intrude on the text. One moment you’re reading a right-side column, and the next you’re wrapping mid-word into the left column, squeezed into a sliver where six or seven characters can stack at most. This is typically an issue with image scaling or with the design on smaller screens. In any case, it’s distracting, and it pulls attention away from the text, which is where the primary value of your site lies.
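A common fix for images crowding the text at smaller widths is to let them scale with their container. A minimal CSS sketch:

```css
/* Images shrink with their container instead of forcing
   the text into a narrow column beside them */
img {
  max-width: 100%;
  height: auto;
}
```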

Poor Coding and Old Technology Open Security Holes

Bad code, no matter where it is or who does it, is a massive vulnerability.

On your end, you have the potential for hackers to compromise your site. They might completely replace your page with a propaganda page of their own. They might insert a few subtle links and a backdoor admin account. They might just copy your sensitive data and leave, with you none the wiser. Your customers may find their personal information stolen, their account information phished or their traffic redirected to a third party site full of viruses.

Out-of-date plugins, poorly implemented scripts, even a stock website platform can all contribute to the risk of hacking.

Branislav Nikolic
