Hello Friends, have you ever wondered why your website is not ranking in search engines such as Google? If you don't know why, then you should read this article to learn how to optimize your website with on-site optimization. Today I will teach you all about on-site SEO optimization.
On Page Optimization
On-page optimization means improving the design of the website according to search engine requirements so that it can be promoted easily. This is done through the source page of the site, where the site's code is written.
Meta tags are of several types. The most important are:
Title: 65 characters maximum.
Description and keywords.
Through meta tags we can improve how the site is presented to search engines, and this helps improve our ranking. Whenever a search engine comes to crawl (check) the site, it looks at the meta tags, and good meta tags help the search engine rank the site better. An SEO person creates the meta tags: writing the title, description, and keywords for every page. He then gives them to the web designer, who adds them to the site's code through FTP.
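As a sketch, here is what those meta tags might look like inside the page's head section (the store name, keywords, and description below are made-up examples, not taken from any real site):

```html
<head>
  <!-- Title: keep it under about 65 characters -->
  <title>Handmade Leather Bags | Example Store</title>
  <!-- Description: around 150 characters, written as sentences that use the keywords -->
  <meta name="description" content="Shop handmade leather bags at Example Store. Our leather handbags and travel bags are crafted by hand and ship worldwide.">
  <!-- Keywords meta tag: historically part of on-page SEO -->
  <meta name="keywords" content="leather bags, handmade bags, leather handbags">
</head>
```

The designer would paste this block into the page's source code and upload it through FTP.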
Every search engine has a crawling (checking) schedule, for example every 20 days or every month. At that time it looks at the site's code and meta tags, and depending on whether the code meets the search engine's requirements, our ranking rises or drops.
The description should be about 150 characters, and the keywords should be worked into it as natural sentences. Don't just list keywords separated by commas in the description.
The title and description of every page are different, according to that page's content, and are later added through FTP (File Transfer Protocol).
There should be around 15-20 keywords on a single page. Use a keyword tool to find keywords relevant to your site. The keywords of every page will be different.
Robots.txt File
Create the robots file (written in Notepad, saved with the name robots.txt, and uploaded through an FTP client such as FileZilla).
The robots file is added to our site to block (hide) some pages so that search engines are not able to crawl them. It is used for hiding weak or low-quality pages from search engine crawlers, because such pages may drop our rankings.
User-agent: * (applies to all search engine crawlers)
User-agent: Googlebot (applies only to Google's crawler)
User-agent: Slurp (applies only to Yahoo's crawler)
User-agent: msnbot (applies only to Bing's crawler)
Note that a User-agent line on its own does not block anything; it only names which crawler the rule applies to. To actually block pages, it must be followed by one or more Disallow lines.
Read More Here : http://en.wikipedia.org/wiki/Robots_exclusion_standard
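Putting those directives together, a robots.txt file that hides two folders from all crawlers might look like this (the folder names are made-up examples):

```
# Block all crawlers from these folders
User-agent: *
Disallow: /private/
Disallow: /old-pages/
```

Save this as robots.txt in the root folder of the site and upload it through FTP.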
Header and footer
Use the header and footer for adding your important keywords. Use the h1 tag to give keywords prominence as headings, and hyperlink them as well.
Visit http://w3schools.com for learning HTML.
Type in Google:
google sitemap generator
Add your site URL and you will get a sitemap to add to your site through FTP.
A sitemap is added to the site to make navigation comfortable for users. Search engines also crawl these sitemaps, which helps improve the ranking of the site.
The generator will produce four files. Download these files first, then upload them to your site through FTP to put the sitemap in place. Sitemaps are very important because Google crawls them, and users also feel comfortable on the site since they can find all the pages in the sitemap.
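For reference, the XML sitemap file follows the standard sitemaps.org format. A minimal sketch looks like this (the example.com URLs and dates are placeholders, not real pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page of the site -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```

A generator builds this file for you automatically, so you normally don't need to write it by hand.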
Optimizing SEO content
Optimizing SEO content means turning your important keywords in the content into hyperlinks and linking them to other pages of your site.
Read this for more: http://www.beanstalk-inc.com/articles/seo/optimization.htm
Image tag optimization
It is also called the alt tag. We must add alt text to every image to explain the image to users. Sometimes images fail to load, but the alt text will describe the image, so the user will not get confused. Search engines also read alt text, since they cannot see the images themselves.
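A simple sketch of an image with alt text (the filename and description are made-up examples):

```html
<!-- The alt attribute describes the image for users and search engines -->
<img src="red-leather-bag.jpg" alt="Red handmade leather bag">
```

If the image fails to load, the browser shows the alt text in its place.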
Read this for more: http://www.spunkyjones.com/image-alt-tag/
Canonical / 404 Implementation
A 404 error means "page not found". A canonical issue is slightly different: it means the same page is reachable at more than one URL (for example with and without www), which search engines may treat as duplicate content. In both cases we must redirect (using a 301 permanent redirect) the broken or duplicate URLs to the correct page of the site, or to the home page.
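On an Apache server, both fixes are commonly done in the .htaccess file. This is a sketch under that assumption; the domain and page names are made-up examples:

```apache
# 301 redirect: permanently send an old/broken URL to a live page
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Canonical fix: send non-www traffic to the www version of the site
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

If your site runs on a different server, the same redirects are configured there instead.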
Keyword Density Analysis
Keyword density means what percentage of the words in a web page's content are keywords.
There should be around 15-20 keywords in one web page content.
Formula to calculate keyword density:
keyword density (%) = (total no. of keyword occurrences / total no. of words) × 100
Keyword proximity: the closeness between two keywords; keeping related keywords near each other is good.
Keyword frequency: how often a keyword is repeated; it should not be more than 4-5 times.
Keyword prominence: giving importance to your keywords by placing them in prominent places on your page. Use the h1 tag for this.
A density of 3-10% is good.
Density checker: http://www.webconfs.com/keyword-density-checker.php
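The density formula above is easy to check yourself. Here is a minimal Python sketch for a single-word keyword (multi-word phrases would need extra handling; the sample sentence is a made-up example):

```python
def keyword_density(text, keyword):
    """Percentage of words in `text` that match `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    count = words.count(keyword.lower())
    return count / len(words) * 100

sample = "leather bags are great and leather bags last long"
# "leather" appears 2 times out of 9 words, so the density is about 22.2%
print(round(keyword_density(sample, "leather"), 1))
```

A real page has far more words, so a 3-10% density target means a keyword repeated only a handful of times per few hundred words.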
Anchor Text
Anchor text means the clickable words of a hyperlink. We must hyperlink our keywords to turn them into anchor text.
Read More Here : http://en.wikipedia.org/wiki/Anchor_text
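A small sketch of what this looks like in HTML (the page path and keyword are made-up examples):

```html
<!-- The keyword "leather bags" becomes anchor text linking to another page of the site -->
<p>We also sell <a href="/leather-bags.html">leather bags</a> in many colours.</p>
```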
URL Rewriting
It means rewriting URLs that contain foreign characters like #, $, %, or @. Google doesn't crawl these kinds of URLs well, and those pages don't get indexed.
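As an illustration of the idea, here is a small Python sketch that turns a raw title with foreign characters into a clean, crawlable slug (the function name and sample string are made-up examples; real sites usually do this in their CMS or server rewrite rules):

```python
import re

def rewrite_url(raw):
    """Replace runs of non-alphanumeric characters (#, $, %, @, spaces...) with hyphens."""
    slug = re.sub(r"[^a-zA-Z0-9]+", "-", raw.lower()).strip("-")
    return slug

print(rewrite_url("Products #5 @ 20% Off!"))  # products-5-20-off
```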
Type in Google: w3c validator, and use it to check your site's HTML code for errors.
Google webmaster tools
Google Webmaster Tools gives us a complete report of our site. For this we need to add a verification code to our site's homepage, before the body tag.
Type in Google: google webmaster tools
Log in with your Gmail ID.
Add your site by clicking on "Add a site".
You will get a verification code to add to your site's home page; add that code through FTP (FileZilla).
Now you can see your site's complete report within a few hours.
Webmaster Tools video: http://www.youtube.com/watch?v=FxyinmAe0i8 or http://www.youtube.com/watch?v=hf0sMDKoWwE&feature=related