On-Page Optimization Tips to Improve Your Organic Rankings

There was a time when on-page techniques meant keyword stuffing. With Google’s Panda update, the websites that relied on this tactic were thrown out from the first page! 

In SEO, strategies that work today may not work a few years from now. This is why I always recommend that budding SEO marketers keep tabs on what’s happening in the industry.

In the post-Panda era, the Search Engine Results Page (SERP) listings are dominated by pages that provide value to users.

If you’re looking to rank a page on Google, your priority must be to add value beyond the existing content ranking in the #1 position. 

As you already know, the two critical factors that Google considers while ranking websites are on-page and off-page factors (link building).

In this blog, you’ll get a glimpse of both the basic and advanced on-page optimization techniques that you can use for your page. 

What you’re about to read is not another guide that boasts about never-tried-before techniques.

I’ve used the same on-page optimization techniques that you’re about to learn to rank my blog in position #0 (the featured snippet) for the highly competitive SEO keyword “Google algorithm update.” 


So to begin with, let’s delve into the basics of on-page SEO activities. If you’re looking for advanced strategies, please use the table of contents section to navigate and start reading. 

What is On-page Optimization?

On-Page SEO is a website optimization technique to rank pages within the site on search engines.

Proper on-page optimization of a page ensures that it ranks higher on search engines for relevant keywords and drives targeted traffic. 

On-page SEO techniques include optimization of web content, URL, metadata, images, and page speed.

As the name itself suggests, on-page SEO is about optimizing the internal elements within your website to achieve search engine friendliness. 

Many times, websites that fail to do on-page optimization don’t show up on the first page of search results.

The reason for this is that the Google Algorithm considers a handful of on-page factors while ranking pages on the first page. 

One of the most critical factors that determine the position of a page on SERP is relevance. If the content on your page is not relevant to the query entered, you won’t rank on Google 99% of the time. 

During each stage of on-page optimization, you have to ensure that the page stays relevant to the target audience.

If your page is not relevant to the users, it will end up with a high bounce rate and eventually, your page will lose its position on the Google results.

Is On-Page SEO a Ranking Factor?

The success of search engines relies on the relevance of the results provided to the users.

Since Google provides the best results to users, thanks to its advanced algorithms, it currently dominates the search landscape. 

On-page optimization becomes all the more important due to this reason.

The whole purpose of doing on-page SEO is to ensure that both users and search engines understand what is offered on a page. 

This also creates an opportunity: proper on-page SEO paves the way for you to reach your target audience and get your message across effectively. 

All the factors that you’re about to learn through this post are critical as they give important signals to the search engine about your website.

Skipping any on-page SEO step may end up with your competitor taking your position on the SERP, which is exactly what you don’t want. 

On-page SEO Techniques


On-page SEO activities, as you may already be aware, are among the biggest SEO ranking factors, and they have the ability to make or break your website.

Here are a few basic on-page SEO factors that can impact the search engine rankings of a page.

Optimize Content

You may have come across the SEO proverb, Content is King. As I told you in the beginning, the SEO industry is so volatile that things change over time.

Content is still a deciding factor, but defining the quality of the content depends on varying factors. 

You may have come across scenarios wherein your high-quality content fails to make it to the first page of the Google Search list.

The major reason for this is that the content has been written without much audience research. 

If you create long-form content on a topic that is not in the interest of the target audience, chances are you won’t get organic traffic despite the best optimization efforts.

What you as an SEO must do is identify the most searched queries of your target audience and try to answer them through your content. 

Proper keyword research will help you find the most searched query string, and you can use them within the content to rank high on search engines.

Check out our in-depth article on how to do keyword research to learn the various ways of finding keywords that get you to the No.1 position on Google Search. 

In addition to this, the best strategy would be to identify the kind of content that is working for your competitors and emulate it on your website.

However, emulating doesn’t mean copying or scraping the content. 

Google hates websites that do this, and things can go awry if you indulge in such practices. The best way is to identify the content that works for your competitors and rewrite it from your own perspective, adding more input, data, and research. 

Optimize URL Structure

The URL is something SEOs give the least importance to. However, considering that the URL is the basic building block of a website, you cannot ignore it.

In addition to that, all the important discussions about internal links, site architecture, and link juice, which we’ll get to later in this blog, have the URL at their core. 

You may have come across URL structures like this (a hypothetical example):

https://www.example.com/index.php?id_page=360&sid=9a8b7c&ref=x1y2z3

By the sheer look of it, you will understand that it’s a real mess.

The problem with these kinds of URLs is that neither the users nor the search engines can understand what is within the link.

A URL is supposed to provide a gist of what the page is about to the users and search engines.

This is one reason why I have always recommended making URLs as short as possible while including the target keyword. 

An ideal search engine and user-friendly URL would be: 

https://www.example.com/on-page-optimization/

Do a simple Google search for “iPhone 11 Pro.”

You can see that the top 10 results showcase clean URLs.

If you’re using WordPress, Joomla, or any other CMS for that matter, they automatically create SEO-friendly URLs from the title. 

However, if the title is lengthy, the URLs tend to be longer. The best practice would be to use the target keyword at the beginning of the URL and remove the rest.

If you find the target keyword already used for another page, take the secondary or LSI keywords to generate the URL. 
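As a rough illustration, here is how a CMS-style slug might be derived from a page title — a minimal sketch, with `slugify` being a hypothetical helper rather than any particular CMS function:

```python
import re

def slugify(title, max_words=5):
    """Build a short, hyphen-separated URL slug from a page title.

    Lowercases the title, strips punctuation, and keeps only the
    first few words, so a target keyword placed at the start of
    the title ends up leading the URL.
    """
    # Keep letters, digits, existing hyphens, and spaces only
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    return "-".join(words[:max_words])
```

Trimming to the first few words mirrors the advice above: keep the keyword at the front of the URL and drop the rest of a lengthy title.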

Optimize Meta Tags

Optimizing the meta title and description of a page is critical for improving the search engine rankings and the click-through rate of the website.

How Important is a Good Meta Description?

Google has confirmed that the meta description is not an SEO ranking factor, but ignoring it can cost you valuable click-throughs.

The best way to optimize the meta description of a website is by providing the users with reasons why they should visit the page. 

The text that goes into the meta description is most likely what Google will display on SERP (unless it decides not to and picks up some random description from the content).

Since you’re competing with at least 10 other competitors, it’s important to make the description as click-worthy as possible.

Making the description click-worthy doesn’t imply that you have to stuff it with keywords. Use the keywords and LSI terms as naturally as possible, and only if they add value.

However, this may not help in improving the SERP position on Google.

If you want to learn further on how to optimize the meta description so that it doesn’t truncate, read our blog on Meta Description best practices. 

How to Optimize Meta Title Text?

Even after all these years, the meta title still remains an important on-page ranking factor.

The best meta title optimization trick is to ensure that your target keyword is placed at the beginning of the title. 

Example: Here is how I optimized the meta title of my top-ranking post, Google Algorithm Update.


My target keyword is “Google Algorithm Update,” and I have strategically placed it at the very beginning of the title, followed by a semicolon. 

As and when a new update happens, I update the title, description, and the content. However, the first part of the meta title remains the same.

This strategy really works for long-form content and also for content that you plan to keep evergreen. 

One of the most common mistakes that SEOs make while optimizing the meta title is to place the target keyword towards the end of the title.

This strategy can backfire as Google might truncate the keyword when displaying it on SERP. It reduces the possibility of the page appearing and being clicked. 

If you want to learn more about optimizing the meta title without getting it truncated, read our in-depth blog on meta title optimization. 
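Putting the title and description advice together, a page head might look like this — the title and description text are illustrative, not taken from a live page:

```html
<head>
  <!-- Target keyword "Google Algorithm Update" leads the title -->
  <title>Google Algorithm Update: All Major Updates Explained</title>
  <!-- Not a ranking factor, but it drives click-through rate on the SERP -->
  <meta name="description"
        content="A regularly updated log of every major Google algorithm update and how to respond to each one.">
</head>
```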

Optimize Heading Tags

The heading tags within a page give both search engines and users a fair idea of the topic they are reading about.

When it comes to crawlers, especially Google’s, the H1 tag is an important ranking element. 

Placing the target keyword within the H1, which usually is the page title, carries the same ranking weight as optimizing the Meta Title.

In the majority of cases, Google will consider the H1 tag of a page if there is no predefined Meta title.  

There is a lot of misconception about using multiple H1 tags. However, Google’s John Mueller has categorically stated that multiple H1s won’t affect the search engine rankings of a page.

He also added that Google’s algorithms are fine with multiple H1s if the users are happy with the way the content has been structured. 

That said, using the different variations of the heading tag will definitely provide Google with ample information about the main topics and the following subtopics.

This can add more value since featured snippets are picked up based on the sub-topics listed under each article. 
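As an illustration, a heading outline for a post like this one might look as follows (the heading text is made up for the example):

```html
<h1>On-Page SEO Techniques</h1>          <!-- main topic; one is typical, more is allowed -->
<h2>Optimize Meta Tags</h2>              <!-- subtopic -->
<h3>How to Optimize Meta Title Text?</h3><!-- sub-subtopic, a featured-snippet candidate -->
<h2>Optimize Images for SEO</h2>         <!-- next subtopic at the same level -->
```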

Optimize Images for SEO

Nowadays, web users are extremely distracted while reading the content that you have written and published. 

The attention span of people has reduced considerably, and monotonous text is one of the factors why they tend to shift their attention. 

Images are one of the most powerful tools in digital marketing and communication, as they’re consumed on the go, unlike other resources such as video or audio.

In addition to this, there is a high chance that you can garner more organic traffic through Google Image Search. 

For SEO professionals, optimizing images for search engines is one of the core organic SEO techniques that will help boost the organic presence of websites.

Importance of Optimizing Images for Search Engines

One of the biggest mistakes webmasters make is not optimizing images for search engines.

While most SEOs optimize the other aspects of on-page SEO, including the title and description, image optimization is still considered technical on-page SEO, and they leave it to the designer or developer to implement. 

Images should not be ignored if you’re aiming to top the Google SERP. Just think of a website that has high-quality content and good quality backlinks. 

The pages within the site have every reason to rank, but if they contain images that don’t follow the image optimization guidelines, the overall ranking of the website will suffer and the optimization effort will go to waste.

If you’re someone who used to think that fixing the Alt text of the images alone can help in ranking on the Google SERP, then this guide is going to be a revelation for you. 

After reading through the tips that we are about to provide, you will be amazed to know the different ways of optimizing images and finding solutions to some of the pressing issues that are pulling your website down in rankings.

About the <img> Tag

The <img> tag is a self-closing tag in HTML. Unlike paired tags such as the <p> tag, it doesn’t have to be closed separately, as it closes itself. 

Example of an open and close tag: <p>This is a paragraph.</p>

Example of a self-closing tag: <img src="image.jpg" alt="description"> 

How to Select Images?

Depending on the content that you’re providing on the website, try to use images that are closely related to the theme. For example, real people, objects, products, etc. are some of the most commonly used images. 

What is the Recommended Image Size?

Image resolution size is mostly based on the platform/CMS that you’re using for your website. For example, if you’re using WordPress, 1024×580 is the most commonly suggested dimension for an image. If you’re following Bootstrap rules, the default width is 1024 px. However, the height may vary depending on various other requirements. 

Coming to the image file size, it’s strongly recommended to use images that are less than 50 KB. The Google Lighthouse tool suggests 30 KB as the ideal size of an image. One of the reasons JPG images are recommended is their smaller file size; other image types, such as ClipArt (PNG), Vector (SVG), and GIF, usually produce bigger files. 

Usually, a website has predefined image sizes. Nonetheless, users may add higher-resolution images, which hurts load speed again. Ideally, images should be created after consulting the developer about the actual size required. 

For example: if the website requires only a 90×90 pixel image, supply exactly that size rather than 90×120 or any other size. Otherwise, the developer is forced to use additional CSS to adjust the image, which can, in turn, slow down the webpage. 

How to Optimize Your Alt Text?

The image alt text is an important on-page SEO factor that website owners often leave unattended. The alt text is a description of an image added to a website.

This comes in handy when the image fails to load on a page: the text in the alt attribute appears in the image’s place and gives users context.

Apart from this, alt text plays an important role in on-page optimization, as search engines value it for helping the visually impaired understand the content. However, SEOs nowadays sprinkle keywords into the alt text, which defeats the purpose. 

The best practice here would be to provide a descriptive alt text that features the LSI keywords so that the user gets an idea of what the image conveys. 


How to Optimize Image File Name for SEO?

Christening an image is an important aspect of optimization. The image name should be in relation to the overall theme conveyed on the page.

This is one of the easiest ways to give Google an indication that the image is highly relevant to the content.

Additionally, this can also be used as an opportunity to feature your secondary or LSI keywords. 

While optimizing the image file name, ensure that the space between each word is replaced with a hyphen.

This prevents WordPress or other CMS platforms from creating file names with %20 in place of spaces. 


Wrong: https://www.example.com/img/on%20page%20optimization.jpg

Correct: https://www.example.com/img/on-page-optimization.jpg

Some webmasters fail to change the file name after taking a screenshot or downloading the image from sources such as ShutterStock or Pixabay.

This is not the best practice as the file name doesn’t match the content within the page. 

It’s also advised not to use special characters and symbols within the file name as it becomes hard for Google to decipher the idea and rank your image. 

How to Optimize Image Title for SEO?

Optimizing the title tags of images is something SEOs and website owners often skip. However, title tags play an important part in helping visitors identify the image.

Usually, title tags are displayed when users hover over an image. 

Most webmasters skip giving title attributes to images as the tooltip hover effect may affect usability. 

Title tags are not supposed to be stuffed with keywords. They are the representation of the image and have to be in a format that Search Engine Bots understand. 

How to Optimize the Image Caption for SEO?

The caption attribute is the third and most ignored image attribute. However, the caption plays an important role in giving users a brief idea of the image it accompanies.

News websites use captions most appropriately.  

If you have an image that represents an action, event, or emotion, captions help in giving further information to the users.

Usually, website owners use the caption also to feature the copyright information or to attribute the image to the original source. 

Adding a descriptive caption is a highly-recommended on-page SEO strategy as the search engine bots read captions as content within the page. 
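Pulling the image guidelines together — file name, alt text, title attribute, and caption — a well-optimized image might be marked up like this (the file name and text are illustrative):

```html
<!-- Hyphenated, keyword-led file name; descriptive alt text rather than
     keyword stuffing; the title attribute appears as a tooltip on hover. -->
<figure>
  <img src="/img/on-page-optimization-checklist.jpg"
       alt="Checklist of on-page SEO steps for a blog post"
       title="On-page optimization checklist"
       width="1024" height="580">
  <figcaption>An on-page SEO checklist covering content, meta tags, and images.</figcaption>
</figure>
```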

Proper Internal Linking

As a website owner, you should assign a hierarchy for each section of your website. This will provide users and search engines with the option to navigate and fetch relevant information easily. 

Internal links are hyperlinks from one page to another page within the website. The link can be placed using resources such as text, images, videos, or documents.

A proper internal linking structure will determine the importance of pages within a website. 

Understanding the importance of each page is vital because Google passes link juice between them. The concept of link juice applies to internal links as well: only a properly interlinked website can pass the link juice generated on one page to another. 

Here are some of the advanced internal linking factors to consider:

1. Crawl Depth

Crawl Depth is an important internal linking factor to consider when setting up a website.

Crawl depth refers to the internal linking architecture of a website that determines how easily a search engine can find and index pages.

Generally, a crawl depth of three is the maximum as any deeper pages may fail to get the initial crawler attention.

Important money pages (service and product pages) must be strategically placed within a crawl depth of 0-2 for better crawlability.  
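Crawl depth can be measured with a simple breadth-first search — a minimal sketch, assuming the site’s internal links are available as an adjacency map (the URLs below are hypothetical):

```python
from collections import deque

def crawl_depths(links, home="/"):
    """Return the minimum click depth of each page from the homepage.

    `links` maps a page URL to the pages it links to internally.
    Pages unreachable from the homepage are absent from the result,
    which usually signals an orphan-page (internal linking) problem.
    """
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first (shortest) path wins
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths
```

Any money page whose computed depth exceeds 2 is a candidate for an extra internal link from a shallower page.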

2. Page Hierarchy

Internal linking is one way to establish the hierarchy of pages within a website. The more internal link value you give to a page, the more important Google considers it. 

3. Link Relevancy

Though the links are within your own website, it doesn’t mean any two pages can be linked to each other.

Ensure that only relevant pages are interlinked because Google dislikes websites that try to trick its algorithm.

Try to provide internal links to contextually relevant pages using highly relevant anchor text. 

4. Contextual Links

Adding too many links within a page is considered a bad SEO practice.

Providing 100 internal links from a 1000-word content will make the page look spammy, and Google may never show it on the first page.

Even though there is no definite number for internal links, ensuring that it remains natural is important. 

5. Anchor Text

Anchor text is the clickable text used to hyperlink from one page to another.

It’s the anchor text that gives contextual signals to Google Crawlers about the relationship between the pages.

It’s recommended to use long-tail anchors for internal linking, as they provide more context to users and Google’s crawlers. 

If you want to learn more about internal links and how to use them effectively, read our in-depth article on everything that you need to know about internal linking. 
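For example, compare a vague anchor with a descriptive, long-tail one (the URLs are illustrative):

```html
<!-- Vague anchor text: gives Google no context about the target page -->
<a href="/blog/internal-linking">click here</a>

<!-- Descriptive, long-tail anchor text: signals what the linked page covers -->
<a href="/blog/internal-linking">everything you need to know about internal linking</a>
```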

Remove Intrusive Interstitial Properties

Think of a website that opens up to a full-screen video ad, and when closed, redirects you to another page with multiple pop-ups.

I would close the whole tab instead of closing the pop-up and go to some other less complicated page to fulfill my search intent. 

This is a common mistake that websites make, which is pulling down their organic reach.

Giving users a seamless web experience is critical to ensure you maintain top positions on the Google search. 

Google has been cracking down on websites that indulge in placing too many interstitial properties within the page.

In 2016, Google announced that any website that tries to force intrusive interstitial ads will be penalized. 

Check for Keyword Density

Keyword density is considered one of the most basic on-page SEO factors. However, the days of stuffing keywords to reach the first position are over.

Google’s Algorithms are now trained to find websites that stuff keywords and penalize them. 

In these changed circumstances, keyword density has evolved, and it’s now more to do with advanced on-page SEO techniques such as LSI and TF-IDF.

Repeating the same keywords multiple times will only hurt your SEO strategy.

The future lies in making Google understand that the words used within your content are contextual and relevant to the overall topic. 

In 2020, even if you use the target keyword only three to four times in the content, it can still rank, provided you have used the LSI and TF-IDF techniques. 
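As a rough sanity check rather than an optimization target, keyword density can be computed like this — a minimal sketch:

```python
import re

def keyword_density(text, keyword):
    """Percentage of the words in `text` taken up by `keyword` occurrences.

    A rough sanity check only: modern rankings depend far more on
    context (LSI, TF-IDF) than on hitting any density number.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    k_words = keyword.lower().split()
    # Count occurrences of the (possibly multi-word) keyword phrase
    hits = sum(
        words[i:i + len(k_words)] == k_words
        for i in range(len(words) - len(k_words) + 1)
    )
    return 100.0 * hits * len(k_words) / len(words) if words else 0.0
```

A result in the low single digits is usually natural; anything much higher is a sign of stuffing.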

What are LSI keywords? How to use LSI keywords?

LSI (Latent Semantic Indexing) keywords relate to a technique Google uses to understand how the words used on a page relate to the topic discussed.  

LSI Keywords are contextually relevant words that appear within a topic. Google uses its algorithm to find and understand the common terms that appear on different websites for the same topic to analyze the quality of the content.

With LSI keywords in place, Google is now able to determine the quality of the content despite the keyword appearing fewer times.

The best way to find LSI keywords is by checking the “Related Search” section on Google search and also by using certain free LSI tools.

TF-IDF: Can It Really Help Your SEO?

TF-IDF is short for Term Frequency-Inverse Document Frequency.

Google’s John Mueller was the first to confirm that the search engine giant uses the TF-IDF technique to retrieve information from the web. 

TF-IDF is an information retrieval method that weighs the combination of words appearing on a page against the overall index of all the content on the web.

Google has many other techniques for information retrieval, and TF-IDF is just one of the metrics it uses. 

In addition, it’s difficult to optimize a web page for the TF-IDF metric because it’s based on an aggregate of all the content currently indexed by Google.

However, you can use tools such as the SEMrush Writing Assistant or a TF-IDF tool to check whether your content qualifies under the basic TF-IDF metric. 
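For intuition, the classic TF-IDF formula can be sketched in a few lines — a toy version over tokenized documents, not what Google actually runs:

```python
import math

def tf_idf(term, doc, corpus):
    """Classic TF-IDF score of `term` for one tokenized document.

    TF  = how often the term appears in this document (normalized);
    IDF = how rare the term is across the whole corpus, so filler
    words that appear everywhere score near zero even when frequent.
    """
    tf = doc.count(term) / len(doc)
    containing = sum(term in d for d in corpus)
    idf = math.log(len(corpus) / (1 + containing))  # +1 avoids divide-by-zero
    return tf * idf
```

A term that appears in every document gets an IDF near zero, which is why repeating a common keyword does not raise relevance.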

Schema Markup/Structured Data

Google SERP Features are now turning out to be important Click Through Rate drivers. Most of these SERP features are the direct result of websites implementing Structured Data or Schema Markup. 

Structured Data is additional information that websites provide to the search engine to better understand the content.

Structured Data ensures that search engines provide valuable information/clues even before a user enters a web page. 

The best example of Structured Data helping users is the reviews that you see on the search results for Movies, Events, and Products. 

Google has categorically stated that Structured Data is not an SEO ranking factor. However, missing out on Structured Data may cause you to lose out on click-through rates.

Since the additional information is missing, your target audience may choose your competitors instead.

Going back to the history of Structured Data, it was an initiative started in 2011 by the giants of the search engine market — Google, Bing, and Yahoo — to make it easier to understand the intent of each page. 

The World Wide Web is loaded with information that has not been categorized or organized.

The search giants wanted to streamline the web content, so they introduced a coding standard to help their algorithms pick up information easily and in an organized manner. 

Enabling the structured data on a website or on individual pages ensures that the search engines crawl websites and display them with rich information or rich snippets. 

What are the Structured Data Formats?


1. JSON-LD

JSON-LD is the Google-recommended Structured Data format. It uses JavaScript notation within a page, typically in a script tag, to help search engines understand the page type. 

Example: Local Address JSON-LD

{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Midtown Manhattan",
    "addressRegion": "NY",
    "streetAddress": "721 Fifth Avenue"
  },
  "description": "Trump Tower is a 58-floor, 664-foot-tall (202 m) mixed-use skyscraper at 721–725 Fifth Avenue, between 56th and 57th Streets.",
  "name": "Trump Tower",
  "telephone": "000-000-0000"
}


2. Microdata

Microdata is another format used to specify structured data. Even though it’s a Google-approved Structured Data format, it’s prone to messing up the code, as it’s injected directly into the HTML.

It’s a time-consuming process as it styles the actual elements within the page using inline codes, resulting in reduced page speed.

Example: Local Address Microdata

<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Trump Tower</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">721 Fifth Avenue</span>
    <span itemprop="addressLocality">Midtown Manhattan</span>,
    <span itemprop="addressRegion">NY</span>
    <span itemprop="postalCode">20500</span>
    <span itemprop="addressCountry">United States</span>
  </div>
</div>


3. RDFa

RDFa is another Structured Data format used by websites. Even though it’s Google-approved, fewer websites use this format compared to the other two.

RDFa (or Resource Description Framework in Attributes) adds a set of attribute-level extensions to HTML for embedding structured data.

Example: Local Address RDFa

<div vocab="http://schema.org/" typeof="LocalBusiness">
  <span property="name">Trump Tower</span>

  <span property="description">Trump Tower is a 58-floor, 664-foot-tall (202 m)
    mixed-use skyscraper at 721–725 Fifth Avenue, between 56th and 57th Streets.</span>

  <div property="address" typeof="PostalAddress">
    <span property="streetAddress">721 Fifth Avenue</span>
    <span property="addressLocality">Midtown Manhattan</span>,
    <span property="addressRegion">NY</span>
  </div>

  Phone: <span property="telephone">000-000-0000</span>
</div>

Sitemap.xml File

Search engine crawlers are busy indexing millions of pages out there on the web. The time they spend on each website depends on many factors, such as the number of pages, site load speed, and the HTTP status codes.

All that said, you can help them crawl the pages within your site faster by enabling a sitemap. A sitemap is an XML file placed within a site to help search engines navigate through its pages. 
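A minimal sitemap for a small site looks something like this (the URLs and dates are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-06-01</lastmod>  <!-- freshness signal for re-indexing -->
    <priority>1.0</priority>       <!-- relative importance hint -->
  </url>
  <url>
    <loc>https://www.example.com/blog/on-page-seo/</loc>
    <lastmod>2020-05-15</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```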

Sitemaps are easy to generate, and there are a handful of tools like the free Sitemap Generator Tool that can help you with the process.

However, if you’re managing a CMS-based website, the process becomes much easier as most of the time, sitemaps come as an innate feature. 

To ensure Google indexes pages within the sitemap, you have to add it within the Google Search Console. 

One of the key advantages of a sitemap is that it provides Google crawlers with a clue about the importance of each page.

Since a sitemap is generated based on page hierarchy, the crawler gets to know which page is more important.

A sitemap also provides information regarding the freshness of the content, which also helps the crawler to reindex pages.

Robots.txt File

The robots.txt file is as important as a sitemap. Usually, websites have a predefined robots.txt file to keep certain pages out of search engine indexes.

Not having a robots.txt file will not directly hurt SEO efforts. However, it can waste the crawl budget assigned to your site, as search engine bots spend time crawling and indexing pages that are not relevant to your users.  
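A typical robots.txt for a WordPress-style site might look like this (the disallowed paths are illustrative):

```text
# Applies to all crawlers
User-agent: *
# Keep admin and cart pages out of the index to save crawl budget
Disallow: /wp-admin/
Disallow: /cart/

# Point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```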

To check whether your site has a robots.txt file, visit https://www.yoursite.com/robots.txt in your browser. 

Optimize Page Speed

Ever since Google announced that the load speed of a website is one of the determinants of organic ranking, there has been a lot of discussion about the page speed of websites. Google PageSpeed Insights is a handy tool for SEOs to check the load speed of a page. 

There are a lot of factors that determine the site speed on the Page Speed Insights tool, and this begins with the web hosting provider that you have chosen. 

Modern-day users are less patient, and with the plethora of options available to them, they prefer to browse sites that open in a blink.

So, what is the most preferred page load time? We have covered this in a comprehensive blog post titled Google Recommended Page Load Time. 

The best way to reduce the page load time is by reducing the size of a few on-page elements, such as JavaScript, CSS, and image.

A chunk of the page load time is eaten up by these elements, and proper optimization of them can bring down the page load time of your website. 

In addition to this, adopting modern frameworks such as AngularJS and React can boost the speed of your website.

Google always boosts sites that provide users with a seamless user experience. Ensuring that your website is free of technical on-page SEO issues will help in improving the organic rankings.

Increase E-A-T (Expertise, Authoritativeness, and Trustworthiness)

Google has been quite vocal about pulling down the rankings of websites that lack Expertise, Authoritativeness, and Trustworthiness; E-A-T. The term E-A-T has its origin in the Google Quality Rater Guidelines. 

Even though the initial versions of the quality rater guidelines talked very little about E-A-T, the newer ones have a whole section dedicated to the concept.

There are a lot of factors that Google collates before deciding the fate of a page on its results page, and E-A-T is now one of the critical factors, especially if you’re dealing with a YMYL Site. 

YMYL sites are the ones that can impact the life and security of the end-users. Google has been careful lately about how website content can affect the livelihood of its users.

Thus, it introduced E-A-T, which determines whether the information present on the site is authentic and valid. 

This is especially true when it comes to Health, Banking, Finance, and Wellness websites, which all come under the ambit of YMYL. 

There are a lot of factors that determine the E-A-T score of a website, and this includes factors such as the Niche (alternate medicine sites have a hard time ranking on Google), Author Bio, About Us, Security Features, Policies, etc. 

I’ve written an in-depth guide explaining how to optimize each of the above-mentioned E-A-T factors for your website to rank higher than your competitors. 

What are Some On-Page SEO Mistakes?

Are you writing great content that still isn’t ranking? Below are eight onsite SEO mistakes you should avoid.

If you want to rank your web pages in the top results of the Google search engine, it’s essential for you to have basic knowledge of on-page SEO techniques.

Many people don’t have enough knowledge of SEO, and because of this, they often struggle a lot with technical on-page SEO.

The landscape of digital marketing has evolved significantly over the past two decades. Google changes its algorithm regularly, and websites can lose rankings when their SEO practices and content marketing strategies fail to keep up.

As digital marketers, you put your effort into deciding the strategies and practices that will work best for your business.

However, some mistakes may be unknowingly making your website’s rank fall. Let’s look at what these mistakes are.

1. Duplicate Content

Having duplicate content on your pages is one of the most common onsite SEO mistakes. Since there are a number of businesses similar to yours, each of them is trying to produce unique content to build authority in a particular niche.

If you have to stand out from the crowd, you too need to create unique and high-quality content regularly.

One common duplicate-content mistake happens when filters are applied to category or product listings, generating multiple URLs with the same content. You can avoid this by using the canonical tag, which tells Google which version of the page is the original and prevents it from being flagged as a duplicate.
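For example, if filtered listing URLs (a hypothetical `/shoes/?color=red`) duplicate the main category page, a canonical tag in the page’s `<head>` points search engines to the preferred version (the domain here is illustrative):

```html
<!-- Placed in the <head> of every filtered variant of the category page -->
<link rel="canonical" href="https://example.com/shoes/">
```

Google then consolidates ranking signals from the variants onto the canonical URL instead of treating each filter combination as a separate, duplicate page.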

2. Forgetting the Importance of Keyword Research in On-Page SEO

Without proper keyword research, you can’t learn about your target audience. Keywords work as the bridge between the user’s intent and your content, so make sure your content is optimized with the right keywords.

If your content covers a broad topic but isn’t optimized for the keywords your target audience actually searches for, it will never rank on search engines.

This is because optimizing pages with highly relevant keywords helps them rank higher on search engines. In addition, it enables your blog to generate leads.

When it comes to blogs, you should focus mainly on long-tail keywords, or keyword phrases that provide relevant information, usually known as informational keywords.

Some bloggers don’t optimize their headers, meta tags, and content with the keyword phrases users are searching for, while others over-optimize their content. Neither is a good on-page SEO practice.

Over-optimizing content through keyword stuffing can make it look spammy to Google. For this reason, make sure you add targeted keywords naturally within your content to help it rank higher on Google SERPs.

3. No Sitemap

What exactly is a sitemap? A sitemap is an XML file. XML stands for eXtensible Markup Language, a format designed to store and transport data.

A sitemap feeds search engines important data about the most important pages of a website, including the date each page was last updated.

Karl Kangur, founder of Business Media adds: “The emphasis here is on most important pages. If you need a sitemap just to get Google to crawl your site fully, you’ve got major structural issues.

You’ll want to ensure that your sitemap (and the Google index) only contains pages that are adding value to your site. Do a “site:mydomain.com” search for your website in Google, go through all of the results, and ask yourself – is this something someone would want Google to land on? If the answer is no, you should be noindexing these pages and removing them from your sitemap. Leave the authority for your best pages.”

This allows the spider to crawl the site intelligently. It’s true that creating a sitemap can’t guarantee search engine success, but it makes crawling easier for the bots, and a higher crawl rate can indirectly help rankings.
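A minimal sitemap is just a plain XML file, conventionally served at `/sitemap.xml`. The sketch below uses the standard sitemap protocol namespace; the URL and date are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/on-page-seo-guide/</loc>
    <lastmod>2021-06-15</lastmod>
  </url>
  <!-- one <url> entry per important page -->
</urlset>
```

Once it’s live, you can submit the sitemap URL to Google through Search Console so the crawler discovers it directly.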

4. No Header Tags

Header tags give structure to your content. They also help search engines understand which parts of the page are most important.

Header tags are used to prioritize the content of the page. However, when you misuse them, they can become confusing.

To avoid this on-site SEO mistake, make sure the main header tag is unique and contains the page’s primary keyword. You can also use relevant keywords in suitable subheadings.
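As a sketch, a well-structured page uses a single `<h1>` and nests subtopics under `<h2>` and `<h3>` tags (the headings below are illustrative):

```html
<h1>On-Page SEO Guide</h1>        <!-- one unique H1 carrying the primary keyword -->
  <h2>Keyword Research</h2>       <!-- major section -->
    <h3>Long-Tail Keywords</h3>   <!-- sub-section under it -->
  <h2>Meta Tags</h2>              <!-- next major section -->
```

The indentation is only for readability; what matters is that heading levels descend in order without skipping, so the hierarchy mirrors the content’s outline.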

5. No Image Description and Alt Tags

The fact is that search engines can’t understand images. For this reason, it’s essential to attach relevant alt text as a description of the image.

Such text makes it easy for search engines to understand the image. Adding a vague description to your images is another on-site SEO mistake you might be making; avoid it to get better results.
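As an illustration (the filenames are hypothetical), compare a descriptive alt attribute with a vague one:

```html
<!-- Descriptive: tells the search engine what the image shows -->
<img src="blue-running-shoes.jpg" alt="Blue mesh running shoes on a wooden floor">

<!-- Vague: the mistake to avoid -->
<img src="img01.jpg" alt="image">
```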

6. Poor Meta Tags

Meta descriptions display the gist of what’s on your page to users even before they visit. A good meta description is important for improving organic click-through rate (CTR).

It’s true that the Meta description doesn’t work as an on-page SEO ranking factor.

Past research has found that around 30 percent of websites use duplicate meta descriptions, while approximately 25 percent don’t add a meta description at all.

Ensure that you are adding unique meta descriptions to the pages within your site.
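A meta description lives in the page’s `<head>`. Here is a sketch with illustrative copy; aim for a unique description per page, roughly within the length Google displays in snippets:

```html
<head>
  <title>On-Page SEO: 8 Mistakes to Avoid</title>
  <meta name="description" content="Learn the eight most common on-page SEO mistakes, from duplicate content to slow load times, and how to fix each one.">
</head>
```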

7. Broken Links

If your site has broken links, it could be one of the more significant on-site SEO mistakes!

As your site grows, you need to update its resources. Having one or two broken links isn’t a big issue; you can quickly solve it by setting up a proper 404 page or by redirecting users to a relevant page within your website using 301 redirects.

However, if there are many broken links, it can be damaging. Visitors see 404 pages instead of the information they need, which leads to a drop in organic traffic, and your site may come to be seen as low quality.

Now, you must be wondering how you can identify broken links. To find them, you can use site audit tools such as SEMrush, or add plugins that check the links in your content.
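If you’d rather script it yourself, the first step of any link audit is collecting every link on a page. Here is a minimal sketch in Python using only the standard library; the sample markup is illustrative, and a real audit would fetch each page’s HTML first, then request each extracted URL and flag the ones that error out:

```python
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag encountered in the HTML."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html: str) -> list:
    """Return all anchor hrefs found in an HTML string."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links


# Illustrative markup standing in for a fetched page.
sample = '<p><a href="/about">About</a> and <a href="https://example.com/old-post">an old post</a></p>'
print(extract_links(sample))  # ['/about', 'https://example.com/old-post']
```

Each URL in the resulting list can then be requested (for example with `urllib.request`); any that return a 404 or fail to connect are candidates for a fix or a 301 redirect.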

8. Slow Load Times

Google added website loading times to its ranking algorithm in an update announced in 2018. If your website takes too long to load on desktop or mobile, it can drag down your rankings.

Using Google’s PageSpeed Insights tool, you can easily analyze your website’s loading speed. It also tells you the reasons for slow load times and how to resolve them.

Some of the standard solutions include eliminating render-blocking JavaScript, enabling compression, and minifying CSS and HTML. Minification removes unnecessary spaces and characters, which reduces file size and speeds up loading.
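In practice you’d use a build tool for this, but to make the idea concrete, here is a naive CSS minifier in Python; it is only a sketch of what minification does, not a production-grade tool:

```python
import re


def minify_css(css: str) -> str:
    """Naively minify CSS: strip comments, collapse whitespace, tighten punctuation."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # remove /* ... */ comments
    css = re.sub(r"\s+", " ", css)                    # collapse runs of whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)      # drop spaces around punctuation
    return css.strip()


print(minify_css("body {\n  color: red;\n}"))  # body{color:red;}
```

The same principle applies to HTML and JavaScript minifiers: the output renders identically, but fewer bytes travel over the wire.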

Final Thoughts

I hope I have provided enough arsenal to take down those who propagate the idea that on-page SEO is just about adding a few more keywords on pages.

This post will be updated as and when I figure out new on-page SEO techniques to rank websites better on search engines. If you think I missed a critical factor in this on-page SEO checklist, please feel free to let me know in the comments section. 
