Basic SEO Tips

Search Engine Optimization (SEO) is a complicated and delicate discipline. While few people fully understand how search engines like Google rank websites, there are some basics anyone can concentrate on to improve the quality and quantity of a website's traffic.

1) Valid XHTML/CSS
Make sure your site's source code is valid and uses proper semantics. Valid source code helps spiders index your site more easily, and your site may get a slight boost in rankings even before content is taken into account.

You can validate your site to see if it follows W3C standards at http://validator.w3.org
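For reference, a minimal page that passes validation looks something like this (the title and body text are placeholders):

```html
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
  <head>
    <title>Example Page</title>
  </head>
  <body>
    <p>Hello, world.</p>
  </body>
</html>
```

Every tag is closed and the doctype is declared up front, which is exactly what the validator checks for.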

2) Site Title Tags
Make sure every page on your site has a title tag, and that each title tag is unique. Many people overlook title tags, yet search engines weigh them heavily.
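For example, two pages on the same site might carry titles like these (the site and page names are made up):

```html
<!-- index.html -->
<title>Acme Widgets | Affordable Widgets Online</title>

<!-- contact.html -->
<title>Contact Us | Acme Widgets</title>
```

Each title describes its own page, so search results show something meaningful instead of the same text for every page.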

3) Fix Broken Links
Check for broken links and fix any you find. Broken links waste crawlers' time on your site, and they are a real turn-off for users and crawlers alike.
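You don't need a commercial tool for a first pass; a small script can pull the links out of a page so you can test each one. This is a minimal sketch using only the Python standard library (the sample HTML and URLs are illustrative):

```python
import urllib.request
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html):
    """Return all <a href="..."> targets found in an HTML string."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links


def check_link(url, timeout=10):
    """Return the HTTP status code for a URL, or None if the request fails."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except Exception:
        return None


page = '<a href="/about.html">About</a> <a href="/missing.html">Old page</a>'
print(extract_links(page))  # ['/about.html', '/missing.html']
```

Run `check_link` on each extracted URL and anything that returns `None` or a 404 is a candidate for fixing.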

4) Site Navigation
Keep your site navigation simple and interlink your pages. Do not use frames, Flash, or image-only links, especially on your home page; they make it difficult for crawlers to crawl your site.
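Crawler-friendly navigation is just plain text links, for example (file names are illustrative):

```html
<ul>
  <li><a href="index.html">Home</a></li>
  <li><a href="products.html">Products</a></li>
  <li><a href="contact.html">Contact</a></li>
</ul>
```

Unlike a Flash menu or an image map, every destination here is visible to a crawler as an ordinary link.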

5) Avoid Duplicate Content
Content is considered duplicate when the same content is accessible from two different URLs. Concentrate on unique, fresh content; crawlers love fresh content!
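If the same article really does need to live at two URLs, one common remedy (not covered above, but widely supported by search engines) is a canonical link tag telling crawlers which URL is the preferred one:

```html
<!-- Placed in the <head> of every duplicate URL; the address is an example -->
<link rel="canonical" href="http://www.example.com/article.html" />
```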

6) Header Tags
Make use of header tags, i.e. H1, H2, and H3; they are nearly as important as title tags.
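Headers work best as a hierarchy, for example (headings are illustrative):

```html
<h1>Basic SEO Tips</h1>          <!-- one main heading per page -->
<h2>Valid XHTML/CSS</h2>         <!-- section heading -->
<h3>Why validation matters</h3>  <!-- sub-section heading -->
```

A single H1 describing the page, with H2 and H3 below it, tells crawlers how your content is organized.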

7) Search Engine Submission
While many people invest in search engine submission software or services, it's advisable to submit your site to search engines yourself. If you order thousands of auto-submitted directory submissions, Google or any other search engine might penalize you. Submit only to relevant search engines and directories.

8) Sitemap
Build a sitemap for your site and submit it to all major search engines. A sitemap helps spiders crawl your pages more quickly.
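A sitemap in the standard sitemaps.org XML format looks like this (the URL and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-01-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```

Save it as sitemap.xml in your site root, then submit its URL through each search engine's webmaster tools.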

9) Use and Check robots.txt
Use robots.txt to block certain pages from being indexed, but also double-check that relevant pages are not accidentally blocked from crawlers.
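A typical robots.txt placed in the site root might look like this (the blocked paths and sitemap URL are examples):

```
# Apply to all crawlers
User-agent: *

# Keep these areas out of the index
Disallow: /admin/
Disallow: /tmp/

# Point crawlers at the sitemap
Sitemap: http://www.example.com/sitemap.xml
```

Anything not listed under Disallow remains crawlable, so review this file whenever you restructure the site.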

Always keep these simple tips in mind whenever you launch a new site or blog, or optimize an old one for search engine traffic.
