
High Google rankings can turbocharge the profits of any online business, and optimizing for them is a more competitive game than ever.
Getting to the top takes a difficult combination of textbook SEO, creative inspiration, and more than a little luck. But all your efforts could be in vain if you make these underlying mistakes that prevent even the best sites from ranking.
1) Duplicate Content
Google wants to return unique sites that its users find useful. If your site contains the same content that’s been duplicated elsewhere, why would it choose to rank you above the alternatives?
Duplicate content comes in two main forms. First, simply cutting and pasting content from another source is a surefire way to destroy your rankings.
The second duplication issue is a little more subtle. Even if all the content on your site is custom written and 100 percent unique, if it shows up on more than one page it’s considered duplicated.
This can happen accidentally when the same content is reachable through several different URLs, a situation that can crop up when pages are generated from a database or when a URL rewriting system goes awry.
No matter which way duplicate content might occur, aim to keep each of your pages as unique as possible, in terms of both your own site and the wider web.
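If you suspect the second, URL-level kind of duplication, a quick script can confirm it. The sketch below (Python, using the requests library) fetches a few hypothetical example.com addresses — swap in URLs from your own site — and flags any that return identical content.

```python
import hashlib
import requests

# Hypothetical URLs to compare -- replace with pages on your own site
# that you suspect are reachable under more than one address.
urls = [
    "https://www.example.com/products/widget",
    "https://www.example.com/products/widget/",
    "https://www.example.com/products/widget?ref=home",
]

fingerprints = {}
for url in urls:
    html = requests.get(url, timeout=10).text
    # Hash the response body so identical pages are easy to spot.
    digest = hashlib.sha256(html.encode("utf-8")).hexdigest()
    fingerprints.setdefault(digest, []).append(url)

for digest, matching in fingerprints.items():
    if len(matching) > 1:
        print("Same content served at:", ", ".join(matching))
```

Exact hashing only catches byte-for-byte copies, so near-duplicates still need a manual look or a proper crawler, but it’s a quick way to spot trailing-slash and query-string variants of the same page.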
2) Broken Images and Missing Alt Attributes
Broken images send strong signals of carelessness and neglect, while missing alt attributes squander an opportunity to add extra relevance to the page. Both may be minor issues in the scheme of things, but even these small errors could kill off your chances if you’re jostling for a top spot in a competitive search.
What’s more, accurate and descriptive alt attributes are a vital part of accessibility for people with vision problems. Adding them isn’t only good for SEO, it’s also the right thing to do for your visitors.
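Auditing a page by hand is tedious, so here’s a rough sketch of how you might automate it. It assumes the requests and BeautifulSoup libraries and a placeholder example.com URL; swap in a page of your own.

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

# Hypothetical page to audit -- replace with a URL from your own site.
page_url = "https://www.example.com/"
soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")

for img in soup.find_all("img"):
    src = img.get("src")
    alt = img.get("alt")
    if not alt or not alt.strip():
        print(f"Missing or empty alt attribute: {src}")
    if src:
        # A HEAD request is enough to confirm the image actually loads.
        status = requests.head(urljoin(page_url, src), timeout=10).status_code
        if status >= 400:
            print(f"Broken image ({status}): {src}")
```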
3) Broken Internal and External Links
Similarly, links that lead nowhere don’t send strong signals of a trustworthy site or a diligent webmaster. They’re also confusing and frustrating for users.
Using a broken-link checker like Dr. Link Check keeps you ahead of potential problems. It helps you find and fix broken links before they have a chance to hurt user experience or search performance. There’s only carelessness to blame if you don’t use one frequently.
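A dedicated checker handles this across a whole site, but the underlying idea is simple. The sketch below — again assuming requests and BeautifulSoup, with a placeholder URL — checks the links on a single page.

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

# Hypothetical starting page -- a real audit would crawl the whole site.
page_url = "https://www.example.com/"
soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")

for link in soup.find_all("a", href=True):
    target = urljoin(page_url, link["href"])
    if not target.startswith("http"):
        continue  # skip mailto:, tel:, javascript: and similar links
    try:
        status = requests.head(target, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print(f"Possible broken link ({status}): {target}")
```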
4) Title Tag Issues
Among all the on-page SEO factors, the title tag remains one of the most powerful. Each page needs a unique and descriptive title, and a few relevant keywords won’t hurt so long as they’re worked in naturally.
Few serious websites today are missing title tags entirely. However, tags that are largely the same on every page are nearly as bad. This easily happens when a large portion of each title is used for branding rather than describing the page itself.
Lastly, don’t forget that the title tag will often show up in search results, and so needs to catch the reader’s interest. A page that’s rarely clicked on won’t hold on to rankings for long.
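To spot missing or copy-pasted titles across your site, a small audit script goes a long way. This sketch assumes requests and BeautifulSoup and a hypothetical list of URLs; in practice you’d feed in your sitemap.

```python
import requests
from bs4 import BeautifulSoup
from collections import defaultdict

# Hypothetical list of pages -- in practice, pull these from your sitemap.
urls = [
    "https://www.example.com/",
    "https://www.example.com/about",
    "https://www.example.com/contact",
]

titles = defaultdict(list)
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    if not title:
        print(f"Missing title tag: {url}")
    titles[title].append(url)

for title, pages in titles.items():
    if title and len(pages) > 1:
        print(f'Title "{title}" reused on {len(pages)} pages')
```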
5) H1 Tag Issues
Next up in the on-page importance stakes is the H1 tag. This should be acutely relevant to your page content, but ideally not identical to the title tag. It should also draw the reader in and convince them to stay on the page. Combining these elements isn’t always easy, but it’s essential to put the effort into getting it right.
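Writing a compelling H1 is a human job, but a quick automated check can at least catch the structural problems: pages with no H1, with several, or with an H1 that simply repeats the title tag. The sketch below uses the same assumed requests and BeautifulSoup setup and a placeholder URL.

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical page -- replace with your own URL.
url = "https://www.example.com/products/widget"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

h1_tags = [h.get_text(strip=True) for h in soup.find_all("h1")]
title = soup.title.get_text(strip=True) if soup.title else ""

if len(h1_tags) != 1:
    print(f"Expected one H1, found {len(h1_tags)}: {url}")
elif h1_tags[0] == title:
    print(f"H1 merely repeats the title tag: {url}")
```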
6) Meta Descriptions
An unoptimized site’s meta descriptions often suffer from the same problems as title tags. If they’re there at all, they’re often generic and unspecific rather than closely describing the page content. This muddies the waters for the algorithm trying to determine a page’s relevance and value.
Descriptions might not be the on-page force they once were, but they’re not difficult to add. And since doing them badly can be actively harmful, they’re worth the small amount of effort they require.
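The same audit approach works here. This sketch — placeholder URLs, requests and BeautifulSoup assumed — flags pages whose descriptions are missing or reused word for word.

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical pages to audit -- replace with your own URLs.
urls = ["https://www.example.com/", "https://www.example.com/about"]
seen = {}

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    description = tag.get("content", "").strip() if tag else ""
    if not description:
        print(f"Missing meta description: {url}")
    elif description in seen:
        print(f"Same description on {url} and {seen[description]}")
    else:
        seen[description] = url
```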
7) Low Text-to-HTML Ratio
Pages which are packed full of behind-the-scenes coding yet have little visible text can be difficult for an algorithm to decipher. At best, they lack enough content to base a relevance judgment on. At worst, a code-heavy page can represent too much of an effort for a spider to plow through, and the page is deindexed or heavily demoted.
Unfortunately, this situation can arise in innocent circumstances, for example with an image gallery or an e-commerce product page. But innocent or not, if your page has more code than text, you need to do something about it.
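You don’t need a special tool to measure the ratio; a few lines will do. The sketch below strips scripts and styles from a placeholder page and compares the visible text against the full HTML source.

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical page -- replace with one of your code-heavy pages.
url = "https://www.example.com/gallery"
html = requests.get(url, timeout=10).text

soup = BeautifulSoup(html, "html.parser")
# Strip scripts and styles so only reader-visible text is counted.
for tag in soup(["script", "style", "noscript"]):
    tag.decompose()

visible_text = soup.get_text(separator=" ", strip=True)
ratio = len(visible_text) / len(html) if html else 0
print(f"Text-to-HTML ratio: {ratio:.1%}")
```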
8) Redirect Chains
Lastly, pages that lie at the end of a long chain of redirects are known to suffer dampened rankings. While redirects are useful for sending visitors to the right place when a page has moved, if they get out of hand they add more work for the spiders for little reward. Google has indicated in the past that, under normal circumstances, its crawlers will only follow a handful of redirect hops before giving up.
If you use redirects, carry out regular audits to stop them piling up as the years go by. And if for some technical reason you can’t avoid these chains, then block the spiders from following them, and offer a more direct route for page discovery instead.
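Checking a chain is straightforward, since most HTTP clients record every hop. This sketch uses the Python requests library and a placeholder URL to count the redirects between an old address and its final destination.

```python
import requests

# Hypothetical URL suspected of sitting behind several redirects.
url = "https://example.com/old-page"
response = requests.get(url, allow_redirects=True, timeout=10)

# response.history holds every intermediate redirect in order.
hops = len(response.history)
print(f"{url} reached {response.url} after {hops} redirect(s)")
for step in response.history:
    print(f"  {step.status_code} -> {step.headers.get('Location')}")

if hops > 3:
    print("Redirect chain may be longer than search engines will reliably follow")
```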
Bringing It All Together
Some of these issues can tank a site on their own if they cause severe enough problems. Others might represent missed opportunities, or only slightly suppress rankings. But nonetheless, if you’re making these mistakes, the rest of your SEO and promotion efforts won’t bear the fruit they should.
Digital Web Services (DWS) is a leading IT company specializing in Software Development, Web Application Development, Website Designing, and Digital Marketing. It provides all kinds of services and solutions for the digital transformation of any business and website.



