Have you invested in a lovely new website that looks wonderful?
The design’s a delight?
It’s easy to use?
The copy and content are cracking?
And it’s the perfect shop window for your world-beating services and products?
Well done you!
And despite all this, is it languishing in the depths of the search engine results and harder for customers to find than hen’s teeth?
Did you nod your head to the last question as well as at least some of the others?
You’re in the right place, because there’s a good chance that your website is suffering from one or more common problems which our crack team of SEO experts come across every day of the week.
We don’t have feathered earrings, gold chains, big cigars, film star looks, quirky personalities, or flying phobias.
We don’t have a black van with a cool red stripe and we’re not on the run in the LA underground.
But apart from all that stuff, we’re the A-Team of SEO.
Why?
Because our ability to transform underperforming websites into big-ranking beasts is comparable to the way BA Baracus, Hannibal et al could make a rocket launcher from a spent toilet roll, rubber band, bits of old turnip and a box of matches.
Ready for your SEO plan to come together?
Read up on our resolutions to five common SEO problems!
1. Not optimising meta-tags

Optimised meta-tags, meta-descriptions, H tags and alt tags make it easy for search engines like Google to recognise that the subjects on your page match the keywords and phrases people are actually using.
So if meta-tags aren’t optimised, it’s bad news for your rankings.
For instance, although meta-titles and meta-descriptions aren't seen by users on the page itself, they're important to Googlebot when it crawls your site, and they're also shown in search engine results. Yet they're often missing entirely, missing partially or present in a haphazard manner. Fix this with a well-written, unique title and description that contains the most relevant keywords (satisfying Google) and summarises the page content in a clickable manner (satisfying human users), and you're on the way to having a hit on your hands.
We also commonly come across missing or duplicate H1 tags and missing alt tags (which describe the content of an image), all of which are problematic.
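To make that concrete, here's a minimal sketch of what a well-tagged page might look like – the business, page and keywords are invented purely for illustration:

```html
<head>
  <!-- Unique, keyword-led title – shown in the browser tab and in search results -->
  <title>Handmade Oak Dining Tables | Example Furniture Co.</title>
  <!-- Unique description summarising the page in a clickable way -->
  <meta name="description" content="Browse our handmade oak dining tables, crafted in the UK with free delivery.">
</head>
<body>
  <!-- One H1 per page, matching what searchers actually type -->
  <h1>Handmade Oak Dining Tables</h1>
  <!-- Alt text describes the image for search engines and screen readers -->
  <img src="oak-dining-table.jpg" alt="Six-seater handmade oak dining table">
</body>
```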
If your tags leave a lot to be desired, a good SEO team like ours can have them fixed quickly so that everything’s ship-shape!
2. Duplicate content

Nobody likes duplicate content. Nobody likes duplicate content. Not at all, at all. Not Google, not humans nor anyone else. Nobody.
But you would be surprised how many people cut and paste their website content from various competitors, creating a mangled, half-assed mish-mash that's far less than the sum of its parts and pays customers little regard.
Few companies knowingly pay for this type of content (unless they've got money to burn and a business to sink), but what happens is that unscrupulous digital marketers happily take their money in exchange for devilish copycat tactics.
It's common to come across sites with duplicate meta-titles and meta-descriptions, duplicate body copy and duplicate H1 tags. The good news is that they're all easily rewritten by a pro and once they're reuploaded, job's a good 'un!
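If you fancy a quick self-diagnosis first, a few lines of Python can flag duplicate meta-titles. This sketch assumes you've exported a crawl of your site to a CSV called crawl.csv with "url" and "title" columns – adjust the filename and column names to match whatever your own crawler produces:

```python
import csv
from collections import defaultdict

# Group URLs by meta-title so duplicates stand out.
pages_by_title = defaultdict(list)
with open("crawl.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        pages_by_title[row["title"].strip().lower()].append(row["url"])

# Any title shared by more than one URL needs a rewrite.
for title, urls in pages_by_title.items():
    if len(urls) > 1:
        print(f"Duplicate title ({len(urls)} pages): {title!r}")
        for url in urls:
            print(f"  {url}")
```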
3. Not optimising internal and external links

Not having the right frequency and quality of internal and external links really impairs your UX (user experience) and, in turn, damages your SEO, because Google doesn't like anything that turns off human users.
Common problems our SEO eggheads come across when they conduct audits are broken internal links, links to old HTTP pages on an HTTPS site (considered unsafe), broken external links, pages that have only one internal link, and pages with a crawl depth of more than three clicks – meaning a customer potentially has to click through several pages to find the information they need.
If your site's suffering from any or all of these problems, all is not lost – links can be repaired or replaced smoothly by a decent SEO pro, and afterwards every customer journey will be swift, simple and guaranteed to put a smile on their face. Who wouldn't want a bit of that?
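Want a head start before calling in the pros? Here's a rough Python sketch that fetches a single page and flags broken or insecure links on it. The start URL is a placeholder, and you'll need the requests and beautifulsoup4 packages installed:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

START_URL = "https://www.example.com/"  # placeholder – swap in your own page

# Fetch one page and collect every link on it (relative links resolved to absolute).
resp = requests.get(START_URL, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")
links = {urljoin(START_URL, a["href"]) for a in soup.find_all("a", href=True)}

for link in sorted(links):
    if link.startswith("http://"):
        # Old HTTP link on an HTTPS site – browsers flag these as unsafe.
        print(f"Insecure HTTP link: {link}")
    elif link.startswith("https://"):
        try:
            status = requests.head(link, timeout=10, allow_redirects=True).status_code
            if status >= 400:  # 4xx/5xx responses mean the link is broken
                print(f"Broken link ({status}): {link}")
        except requests.RequestException as exc:
            print(f"Unreachable link: {link} ({exc})")
```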
4. Crawl issues

When Googlebot, or any other search spider, crawls your website, it wants the experience to be as logical, smooth, straightforward and sensible as possible.
Because if it finds crawling your website simple, a human should find it easy to navigate too.
So when there are elements on your site that stall a crawl or make it uncomfortable for the spider, it’s not a good SEO health signal and will ultimately impair your rankings.
Some common problems our SEO super-sleuths come across are broken internal images, URLs containing underscores, temporary redirects, outgoing external links with nofollow attributes, sitemap.xml not specified in robots.txt (which makes your site structure tough to understand) and sitemap.xml not found at all, which makes crawling and indexing your site a total pain in the posterior for web spiders.
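The sitemap part of the fix, happily, is usually a one-liner: reference your sitemap in robots.txt so spiders know exactly where to look (example.com here is a placeholder for your own domain):

```
# robots.txt – tell crawlers where to find your sitemap
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```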
These issues can be fiddly, but irredeemable they are not. Find yourself a technical SEO manager (where might you find one of those?…) and once they’re resolved you’ll be in Google’s good books again.
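And on those temporary redirects: a page that's moved for good should send a permanent 301 rather than a temporary 302. If your site happens to run on Apache, for instance, a one-line rule like this (with made-up paths) does the job:

```apache
# .htaccess – permanently redirect an old page to its new home
Redirect 301 /old-page https://www.example.com/new-page
```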
5. Site/page speed issues

Do you like waiting interminably while a website page loads, or do you give up pretty quickly and look for a competitor whose page load speed isn't so sluggish?
Yup – no one likes slow websites, and Google is no different. A sluggish site is detrimental to your rankings and ruins your UX and conversion rates.
As well as slow page load speed, we also come across unminified JavaScript and CSS files – these are easily slimmed down by stripping out white space, comments and surplus lines.
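To illustrate with a made-up style rule, a minifier takes readable source like this:

```css
/* Main call-to-action button */
.cta-button {
    background-color: #e63946;
    padding: 12px 24px;

    border-radius: 4px;
}
```

and strips the comments, line breaks and spare whitespace to produce the functionally identical:

```css
.cta-button{background-color:#e63946;padding:12px 24px;border-radius:4px}
```

In practice you'd let a build tool handle this rather than minifying by hand.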
Site and page speed are major ranking signals in the SERPs (search engine results pages), and substandard speed is a total turnoff for humans too.
Again, a decent SEO can get your website moving as swift and slick as it needs to be, making it shine for Google and users alike.
So there you have 'em – five common website SEO issues, solved!