Covers insights on Search Engine Optimization (SEO), Google Search algorithm updates, paid search, and social media, to name a few. Jag SEO Blog's help articles, tips, and tricks are free, to educate newbies, webmasters, online business owners, and seasoned professionals!

Thursday, June 23, 2011

How to mention multiple XML sitemaps in robots.txt?

A website's robots.txt file tells search engine bots which files or directories are allowed or disallowed for the search index.

A typical robots.txt file will look like this:

User-agent: *
Disallow: /includes/
Disallow: /scripts/
User-agent: * - means this section applies to all robots.
Disallow: - means that robots should not visit any pages in the listed folder.

If you have one or more XML sitemaps, do not miss the opportunity to add them to the robots.txt file. Below is an example of how multiple XML sitemaps can be added to robots.txt:

User-agent: *
Sitemap: http://www.yoursite.com/sitemap1.xml
Sitemap: http://www.yoursite.com/sitemap2.xml
Sitemap: http://www.yoursite.com/sitemap3.xml
Disallow: /includes/
Disallow: /scripts/
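To sanity-check a robots.txt like the one above, you can use Python's standard-library parser. This is a minimal sketch (the URLs are the placeholder yoursite.com ones from the example, not a real site) showing that the Disallow rules block the right paths and that all three Sitemap lines are picked up:

```python
from urllib.robotparser import RobotFileParser

# The example robots.txt from this post, as a string.
robots_txt = """\
User-agent: *
Sitemap: http://www.yoursite.com/sitemap1.xml
Sitemap: http://www.yoursite.com/sitemap2.xml
Sitemap: http://www.yoursite.com/sitemap3.xml
Disallow: /includes/
Disallow: /scripts/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Paths under /includes/ are blocked; everything else is crawlable.
print(parser.can_fetch("*", "http://www.yoursite.com/includes/menu.js"))  # False
print(parser.can_fetch("*", "http://www.yoursite.com/blog/post.html"))    # True

# site_maps() (Python 3.8+) returns every Sitemap URL listed in the file.
print(parser.site_maps())
```

Note that Sitemap lines are independent of the User-agent sections, which is why search engines can read them no matter where they appear in the file.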

Related Post: Everything about Meta Robots and robots.txt

Sunday, May 29, 2011

How do SEO mistakes turn into black hat SEO?

Black hat SEO is not only about hiding keywords or getting links from link farms and link-distribution software. It also covers any SEO method that does not follow the guidelines of Google, Bing, or other top search engines.

Many SEOs still think keyword stuffing works, or believe they are doing things the right way without ever checking the site's keyword density, and that this can bring organic results. This is another form of black hat SEO. Here is the Google guideline on "Keyword stuffing".

Inbound links bring popularity to your website, and link juice lifts your listing to the top. Natural links are good, and you can earn them by attracting readers with quality content. But many go for paid links instead, which violates the guidelines. Here are the guidelines on "Paid links".

SEO is a process that should be carried out by following the guidelines. If you don't, you have a high chance of getting penalized. Likewise, keep your content unique and of high quality. Do not use copied content, modified or rephrased content, or content scrapers; these can lead your site to be penalized.