What do we mean by “on-page SEO”?
I’ve noticed that many people do not understand what the phrase “on-page SEO” means. Simply put, on-page SEO refers to the creation, maintenance and optimisation of one or more web pages’ content so that it performs well in search engine results, improving visibility and driving more traffic. There are multiple elements that factor into on-page SEO and whether the page(s) in question will perform well in search.
Content plays a major role, since search engines now inspect the quality and quantity of content rather heavily. Code and structure – the HTML, PHP and/or other code used to build the page – are another important consideration. Link building (both on-page and off-site) and keywords also remain relevant to on-page SEO. On-page SEO has evolved dramatically over the years and much has changed, but these four elements have always been at its heart.
Why is it important?
For me, on-page SEO is important because of the dominance that search engines still enjoy in the process of obtaining information. While social media continues to grow in scope and influence, people in pursuit of specific information rely overwhelmingly on search engines. On-page SEO ensures that user experience is held to a high standard, since search engines largely model their ranking criteria on it. Without a variety of SEO elements in place, search engines will not be able to make sense of the content on your pages.
Because of this, most serious websites and blogs spend considerable amounts of time optimising technical elements and content. As link building has become more difficult to do properly, many are turning to on-page SEO directly as a way of making up the difference.
What are the top things website owners get wrong?
Perhaps the biggest mistake I’ve seen those familiar with SEO make with their websites is focusing too much on technical considerations. The technical elements of SEO absolutely matter, but content quality and regularity are crucial to overall performance.
However, there are technical concerns. Those using WordPress often fail to optimise the speed of their websites, which is frequently caused by poor code or too many activated plug-ins. A lack of metadata and tags, as well as duplicated elements (titles, for instance), raises red flags.
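As a rough illustration of how some of these checks can be automated, here is a minimal Python sketch – using the third-party requests and BeautifulSoup libraries, with purely hypothetical URLs – that flags pages with missing meta descriptions or duplicated titles:

```python
# A minimal sketch: flag missing meta descriptions and duplicated <title> tags.
# Assumes the third-party "requests" and "beautifulsoup4" packages are installed;
# the URLs below are placeholders, not real pages.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

PAGES = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/blog",
]

titles = defaultdict(list)  # title text -> pages that use it

for url in PAGES:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Flag a missing or empty meta description.
    description = soup.find("meta", attrs={"name": "description"})
    if description is None or not description.get("content", "").strip():
        print(f"Missing meta description: {url}")

    # Collect titles so duplicates can be reported afterwards.
    title = soup.title.get_text(strip=True) if soup.title else ""
    if not title:
        print(f"Missing <title>: {url}")
    else:
        titles[title].append(url)

for title, urls in titles.items():
    if len(urls) > 1:
        print(f"Duplicated title '{title}' used on: {', '.join(urls)}")
```

This is only a sketch of the idea; the audit tools mentioned later run far more thorough versions of these checks.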
Low-quality content is another sure-fire way to get spotted and flagged by search engines. Content is king for a great user experience, and search engines can tell when you’re skimping.
How can you know if you need to improve your website?
There are many different elements of a website that require constant examination and improvement, but it can be fairly easy to determine whether certain elements of a site need revision. One great way I’ve found to determine where improvements can be made is through analytics. Utilities such as SEMRush, Searchmetrics, DeepCrawl and OnPage.org give you a ton of information.
Any set of tools that constantly monitors metrics such as traffic, conversions and clicks can help determine whether performance is improving. This can give you insight into whether new steps need to be taken to boost performance.
Visitor feedback is another excellent way to learn more about your site’s performance. Whether it be through blog comments, social media replies, email feedback or surveys, you can use a variety of methods to find out more about how visitors view your offerings. There is also the video-capture platform usertesting.com.
What diagnostic tools are there and do any of them work (particularly free ones)?
Google and other search engines inspect every element of your website, rank its quality, and ultimately decide where and whether to display it in various searches. Being able to find any issues with a website has been critical to my long-term success.
Google Webmaster Tools is a great utility and will show you essentially how Google sees your website. It’s a great resource for troubleshooting unknown problems with traffic and indexing, and is free to use.
There are many different free website audit utilities available on the web – like quickseoaudit.com and seoptimer.com – that can be used to find problematic code, un-optimised elements on various pages and broken links, among other things. (A simple broken-link check is also easy to script yourself, as the sketch below shows.)
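As a minimal example of the broken-link side of such an audit, the following Python sketch – again using requests and BeautifulSoup, with a placeholder start URL – reports links on a single page that return an error status:

```python
# A minimal sketch of a broken-link check on one page.
# Assumes "requests" and "beautifulsoup4" are installed; the start URL is a placeholder.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

START_URL = "https://example.com/"

html = requests.get(START_URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for anchor in soup.find_all("a", href=True):
    link = urljoin(START_URL, anchor["href"])
    if not link.startswith("http"):
        continue  # skip mailto:, javascript: and similar links
    try:
        response = requests.head(link, allow_redirects=True, timeout=10)
        if response.status_code >= 400:
            print(f"Broken link ({response.status_code}): {link}")
    except requests.RequestException as exc:
        print(f"Could not reach {link}: {exc}")
```

A dedicated audit tool will crawl the whole site rather than one page, but the principle is the same.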
There are also several premium diagnostic tools that delve into greater detail and provide more comprehensive information on which elements are harming SEO efforts and how to fix them. SEMRush.com, Searchmetrics.com, DeepCrawl and OnPage.org are a few great examples.
Can a poorly built website reduce site traffic even if it is popular?
This is an interesting and rare scenario. In my experience, websites have a difficult time becoming popular in the first place if they are built poorly. If the design is not optimised for mobile, the code is filled with errors and general SEO concepts are not incorporated into the build, then gaining a large amount of traffic will be difficult.
However, there may be instances in which a site has become popular because of a viral concept or a massive marketing campaign. In that case, then yes: a poorly built website will still under-perform even with massive popularity. The reason is that its visibility in search will be limited to some degree, even if Google and other search engines can tell that people are finding it useful. User experience matters enormously to search engines these days, and a poorly built website usually hurts that experience, so Google and others take it into account.
In what ways does Google penalise bad websites?
Google has a sliding scale of sorts when it comes to penalties and how websites can be affected by them. Penalties are not handed out intentionally or with malice; they’re usually passive in nature. Google Webmaster Tools has been useful for me in determining whether or not a site is being penalised.
Link spam is probably the biggest cause of penalties; a lack of optimisation is not necessarily one. High bounce rates, long load times and duplicated content all add up, and a website may not rank well as a result, but that is far from what we understand as a penalty, especially since a penalty is frequently a manual action taken by Google.