What is On-Page Optimization?
On-page optimization is also known as on-site optimization. In on-page optimization, you optimize your website according to search engine guidelines. On-page optimization affects your website's listing in natural (organic) results. According to the latest Google Panda 4.2 algorithm update, on-page optimization plays about a 40% role in white-hat SEO, and SMO (Social Media Optimization) also plays an important role in on-page optimization.
The main work in on-page optimization is:
1. Page Title
2. Keywords
3. Content
4. Title Tags
5. Meta Tags
What are Keywords?
When a user types some words into a search engine to find something, those words are known as keywords. Keywords are the most important factor for both on-page and off-page SEO techniques. Keywords must be used in the title, meta keywords, meta description, heading tags, bold tags, and content, as shown in the sketch below. In SEO, you must analyze the keywords for a particular website. In this video, WsCube Tech explains how to analyze and select the best keywords for your business.
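As a minimal sketch, here is how a hypothetical target keyword (here, "online shoes"; the keyword and all copy are illustrative, not from this post) could be placed in each of those elements:

<head>
  <!-- Hypothetical keyword "online shoes" placed in the title and meta tags -->
  <title>Buy Online Shoes at the Best Price</title>
  <meta name="keywords" content="online shoes, buy online shoes">
  <meta name="description" content="Shop online shoes with fast delivery and easy returns.">
</head>
<body>
  <!-- The same keyword repeated in the heading, bold tag, and content -->
  <h1>Online Shoes for Men and Women</h1>
  <p>Our <b>online shoes</b> collection covers every style and budget.</p>
</body>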
What are HTML Tags?
Tags are crawled by search engines but are not displayed to the user as part of the web page content. If you want your web page indexed, consider using HTML tags properly. HTML stands for HyperText Markup Language, which is used to create a website's structure. In SEO, we optimize an HTML page but do not develop it.
Tag Limit:-
1. Title: 70 characters
2. Meta Keywords: 140 characters
3. Meta Description: 140 characters
4. Headings:
H1 (1 time)
H2 (2 times)
H3
H4
H5
H6
5. Bold: at least twice
6. Hyperlink: 70 characters
7. Alt: each and every image should have a filled alt attribute.
(A markup sketch that stays within these limits follows.)
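Putting those limits together, a minimal page sketch might look like this (all names, text, and URLs are hypothetical; the comments note which limit applies):

<head>
  <!-- Title: keep within 70 characters -->
  <title>SEO Training Course - WsCube Tech</title>
  <!-- Meta keywords: keep within 140 characters -->
  <meta name="keywords" content="seo training, seo course">
  <!-- Meta description: keep within 140 characters -->
  <meta name="description" content="Practical SEO training covering on-page and off-page techniques.">
</head>
<body>
  <!-- H1: 1 time per page; H2: up to 2 times -->
  <h1>SEO Training</h1>
  <h2>On-Page SEO</h2>
  <h2>Off-Page SEO</h2>
  <!-- Bold: at least twice in the content -->
  <p>Our <b>SEO training</b> covers <b>keyword research</b> and HTML tags.</p>
  <!-- Hyperlink anchor text: keep within 70 characters -->
  <a href="http://www.domain-name.com/seo-course">SEO course details</a>
  <!-- Alt: every image should have a filled alt attribute -->
  <img src="seo-classroom.jpg" alt="Students in an SEO training class">
</body>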
What is a Sitemap?
A sitemap is a list of the pages of a website accessible to crawlers or users. It can be either a document, in any form, used as a planning tool for web design, or a web page that lists the pages on a website, typically organized in hierarchical fashion. A sitemap submitted to search engines should be in XML format; search engine tools do not read sitemaps in .txt, .html, or other formats. For bigger sites, it is recommended to use an automatic online sitemap generator.
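As a minimal sketch, an XML sitemap looks like this (the URLs and date reuse the placeholder domain from the robots.txt link below and are purely illustrative):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <loc> is required, the other tags are optional -->
  <url>
    <loc>http://www.domain-name.com/</loc>
    <lastmod>2015-08-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.domain-name.com/seo-course</loc>
  </url>
</urlset>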
What is Robots.txt?
We should use a robots.txt file to restrict any search engine from crawling parts of a site. In this video, WsCube Tech explains how to create a robots.txt file; a complete sketch follows the list below. The Robots Exclusion Standard = Robots Exclusion Protocol = robots.txt protocol.
1. You need a robots.txt file only if your site includes content that you don't want search engines to index, such as your site's admin panel.
2. User-agent: *
   Disallow: /Admin Panel/
3. <meta name="robots" content="noindex,nofollow">
4. <meta name="googlebot" content="noindex">
5. X-Robots-Tag: noindex (sent as an HTTP response header)
Link – www.domain-name.com/robots.txt
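Putting the directives together, a minimal robots.txt sketch for the placeholder domain above could look like this (the /Admin Panel/ path is illustrative; real paths are case-sensitive and usually avoid spaces):

# Block all crawlers from the admin area, allow everything else
User-agent: *
Disallow: /Admin Panel/

# Point crawlers at the XML sitemap
Sitemap: http://www.domain-name.com/sitemap.xml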