
Robots.txt: A Beginner's Guide

Robots.txt is:

A simple text file with directives that specify which pages on a website must not be crawled (or, in some cases, must be crawled) by search engine bots. The file should be placed in the root directory of your site. The standard for this file was developed in 1994 and is known as the Robots Exclusion Standard or Robots Exclusion Protocol.

Some common misconceptions about robots.txt:

  • It stops content from being indexed and shown in search results.

If you list a certain page or file in a robots.txt file but the URL is linked from external resources, search engine bots may still discover and index it and show the page in search results. Also, not all robots follow the instructions given in robots.txt files, so some bots may crawl and index pages listed in a robots.txt file anyway. If you want an extra indexing block, a robots meta tag with a 'noindex' value in the content attribute will serve as one when added to those specific web pages, as shown below:

<meta name="robots" content="noindex">


  • It protects private content.

If you have private or confidential content on a site that you would like to hide from bots, do not depend on robots.txt alone. It is advisable to password-protect such files, or not to publish them online at all.

  • It guarantees no duplicate content indexing.

As robots.txt does not guarantee that a page will not be indexed, it is unsafe to use it as your only way of blocking duplicate content on your site. If you do use robots.txt for duplicate content, make sure you also adopt more reliable methods, such as the rel="canonical" tag.

  • It guarantees the blocking of all robots.

Unlike Google's bots, not all bots are legitimate, and illegitimate bots may simply ignore robots.txt instructions to keep a particular file from being indexed. The only way to block these unwanted or malicious bots is to deny them access to your web server through server configuration or a network firewall, assuming the bot operates from a single IP address.


Uses for Robots.txt:

In some cases the use of robots.txt may seem ineffective, as pointed out in the section above. The file exists for a reason, however, and that reason is its importance for on-page SEO.

The following are some of the practical ways to use robots.txt:

  • To discourage crawlers from visiting private folders.
  • To keep the robots from crawling less noteworthy content on a website. This gives them more time to crawl the important content that is intended to be shown in search results.
  • To allow only specific bots access to crawl your site. This saves bandwidth.
  • Search bots request robots.txt files by default. If they do not find one, the request is logged as a 404 error in your log files. To avoid this, serve at least a default robots.txt, i.e. a blank robots.txt file.
  • To provide bots with the location of your Sitemap. To do this, add a directive to your robots.txt that points to the Sitemap:
      Sitemap: http://yoursite.com/sitemap-location.xml

You can add this directive anywhere in the robots.txt file because it is independent of the user-agent line. All you have to do is replace the sitemap-location.xml part of the URL with the location of your Sitemap. If you have multiple Sitemaps, you can specify the location of your Sitemap index file instead, as shown below. Learn more about sitemaps in our blog on XML Sitemaps.
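With multiple Sitemaps you can either point to the index file or list each Sitemap on its own line; the URLs below are hypothetical placeholders:

Sitemap: http://yoursite.com/sitemap-index.xml

or:

Sitemap: http://yoursite.com/sitemap-pages.xml
Sitemap: http://yoursite.com/sitemap-posts.xml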

Examples of Robots.txt Files:

There are two major elements in a robots.txt file: User-agent and Disallow.

User-agent: The user-agent is most often represented with a wildcard (*), an asterisk signifying that the instructions apply to all bots. If you want certain bots to be blocked or allowed on certain pages, you can specify the bot's name in the user-agent line.

Disallow: When nothing is specified after Disallow, bots may crawl all the pages on a site. To block a certain page, use exactly one URL prefix per Disallow line; you cannot list multiple folders or URL prefixes in a single Disallow entry.

The following are some common uses of robots.txt files.

To allow all bots to access the whole site (the default robots.txt) the following is used:

User-agent: *
Disallow:

To block the entire server from the bots, this robots.txt is used:

User-agent: *
Disallow: /

To allow a single robot and disallow other robots:

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /

To block the site from a single robot:

User-agent: XYZbot
Disallow: /

To block some parts of the site:

User-agent: *
Disallow: /tmp/
Disallow: /junk/

Use this robots.txt to block all content of a specific file type. In this example we are excluding all PowerPoint files. (NOTE: The dollar sign ($) matches the end of the URL; this kind of pattern matching is supported by major crawlers such as Googlebot but is not part of the original standard):

User-agent: *
Disallow: /*.ppt$

To block bots from a specific file:

User-agent: *
Disallow: /directory/file.html

To let bots crawl certain HTML documents inside a directory that is otherwise blocked, you can use an Allow directive, which some major crawlers support. An example is shown below:

User-agent: *
Disallow: /folder/
Allow: /folder/myfile.html

To block URLs containing specific query strings that may result in duplicate content, the robots.txt below is used. In this case, any URL containing a question mark (?) is blocked:

User-agent: *
Disallow: /*?
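Before deploying rules like these, it can help to test how a parser reads them. The sketch below uses Python's standard-library robots.txt parser against hypothetical URLs; note that this parser implements the original exclusion standard, so wildcard rules like the /*.ppt$ example above may not be matched the way major crawlers match them:

from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://yoursite.com/robots.txt")
rp.read()  # fetch and parse the live file

# can_fetch(user_agent, url) reports whether that agent may crawl the URL
print(rp.can_fetch("*", "http://yoursite.com/folder/myfile.html"))
print(rp.can_fetch("Googlebot", "http://yoursite.com/tmp/index.html"))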

For the page not to be indexed: Sometimes a page will get indexed even if it is included in the robots.txt file, for reasons such as being linked to externally. To completely block such a page from being shown in search results, you can add a robots noindex meta tag to the page itself. You can also add a nofollow value to instruct the bots not to follow the page's outbound links. The tags are shown below:

<meta name="robots" content="noindex">

For the page not to be indexed and links not to be followed:

<meta name="robots" content="noindex,nofollow">

NOTE: If you list these pages in robots.txt and also add the above meta tags, the pages will not be crawled, so the bots are blocked from ever reading the meta tags within the page, and the pages may still appear in URL-only listings in search results.

Another important thing to note is that you must not include any URL that is blocked in your robots.txt file in your XML sitemap. This can happen, especially when separate tools generate the robots.txt file and the XML sitemap; in such cases you may have to check manually whether blocked URLs are included in the sitemap. You can test this in your Google Webmaster Tools account if you have your site submitted and verified on the tool and have submitted your sitemap.

Go to Webmaster Tools > Optimization > Sitemaps; if the tool shows a crawl error on any submitted sitemap, double-check whether the affected page is one included in robots.txt.
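If you would rather script this check than click through the tool, the sketch below (standard-library Python, hypothetical URLs, and a single sitemap file assumed) parses the sitemap and flags any entry that robots.txt blocks:

from urllib import robotparser
from urllib.request import urlopen
import xml.etree.ElementTree as ET

SITE = "http://yoursite.com"

rp = robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()

# Sitemap <loc> elements live in the standard sitemap namespace
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree = ET.parse(urlopen(SITE + "/sitemap.xml"))

for loc in tree.findall(".//sm:loc", ns):
    url = loc.text.strip()
    if not rp.can_fetch("*", url):
        print("Listed in sitemap but blocked by robots.txt:", url)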


What Small Business Owners Must Do For Successful SEO

SEO has changed considerably in recent years, and the huge algorithmic changes Google ships regularly can make life difficult for anyone interested in SEO.

People are getting tired of SEO, especially small business owners. Many are turning to marketing strategies that are simpler to understand and implement than the risky field of SEO. Others refuse to believe that times have changed and that the old tricks no longer work; as they continue to work with the old methods, the outcome is often catastrophic.

SEO alone can’t work

Although SEO services are still in demand, integrating them with your other marketing strategies might lead to better results. Eighty-three percent of companies that succeed at SEO have integrated SEO and social media. This is easier said than done, and it takes a lot more than the simple link building that passed for SEO over the last few years.

The SEO strategy of 2014 means concentrating on brand and audience. This shouldn't be interpreted as discarding ranking as a factor, but it shows how the new SEO works in 2014.

Create Your Small Business Brand Using SEO

In the past it was possible to do SEO without ever mentioning the brand, using easy tricks like blog comments, widgets, infographics, article posting and on-page tactics. Some people will disagree that these tricks ever worked, while plenty of others will certify that they did; many low-budget SEO companies found themselves on the first page using them. However, these tricks don't work any more.

Nowadays, it's user experience optimisation that floats the SEO boat: social shares, careful backlinking from selected places with attention to what users are looking for, and of course the use of citations. Content marketing is how you keep your audience interested in you, which is why ninety-three percent of marketers are planning to use content marketing in 2014.

There is a funny thing to observe when small businesses put their names in the content they produce: Google can easily sniff out these branding efforts, even when they are not hyperlinked, and treats them as a signal of legitimacy. This is why people still believe that press releases can really help rankings, even though there is some evidence against it.

Responding to Your SEO Provider

Although the importance of social sharing and brand building can't be stressed enough, half of the companies investing in SEO still prefer to outsource the work. The interaction between SEO agency and client is going to play an important role in 2014: to be successful, SEO companies will need much more knowledge about your target customers, their demographics and their needs.

When the SEO company prepares a press release, the client company not only needs to give them genuinely newsworthy material but also needs to approve the final copy in a timely manner, so that press releases can flow out at regular intervals. Mentioning the brand without hyperlinking it is still a worthwhile job. While this might sound unbelievable, some client companies don't even answer phone calls from their SEO companies.

Conclusions

Small business owners might miss the simple days of link building and directory submission, but the trend in SEO suggests those days are gone. SEO is still very much alive as a profession, but press releases, branding and social integration are becoming impossible to ignore.


5 Trade Secrets of an SEO Professional

It's no secret that Google, Bing and Yahoo! determine the winners and losers online. Rank well, and you're likely to edge out your competitors for the web-searching prospective customer. As such, if your business depends on Internet traffic to retain market share, let alone grow, it's highly advisable to partner with a quality SEO professional.

Disclaimer: Not all SEO professionals are created equal. Choose wisely!

Just because you are a small business doesn't mean you can't afford a quality SEO professional. BUT, certainly be on the lookout for spammy services that promise big results for a low monthly service fee. A good SEO professional will sit down with you to discuss your goals, budget, products and services, and target customer, and can then assemble a comprehensive plan of attack.

Without further ado, the list:

1. Fix the errors

Seems simple enough, yet so frequently overlooked. Site stability and speed are factored into Google's ranking algorithm. Sometimes a site will appear functional: users can navigate from page to page and, despite a bit of lag, everything seems to work reasonably well. Running the site through a series of tools and tests tells a different story. Here's a list of the more common issues and how to fix them.

Crawl errors.

If there are crawl errors, search engines might not even be able to index your site, leaving you dead in the water. These can be quick and easy to fix, given a little know-how. Often the issue is sitemap links not matching their link destinations, caused mostly by temporary redirects. If the move is short-term, keep it temporary (302); if the page has permanently moved, give it a 301 permanent redirect and be sure to update your sitemap to reflect the new destination.
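One quick way to audit a redirect is to request the URL without following the redirect and inspect the status code. Here is a minimal sketch using Python's standard library, with a hypothetical URL:

import urllib.request
import urllib.error

class NoFollow(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None makes urllib raise instead of following

opener = urllib.request.build_opener(NoFollow)
try:
    resp = opener.open("http://yoursite.com/old-page/")
    print(resp.getcode(), "no redirect")
except urllib.error.HTTPError as e:
    # 301 = permanent, 302 = temporary; Location shows the destination
    print(e.code, "->", e.headers.get("Location"))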

Duplicate content.

More often than not, pages with light "original" content but a boilerplate group of paragraphs in the main content region confuse the crawler as to the intent and topic of the page. Move this boilerplate into a lower-priority position on the page and, if at all possible, cut it down to less than a paragraph (2-3 sentences) and link off to a page dedicated to that topic.

Blank or duplicate meta description.

This one's more for the users who might see your page come up in a search listing. That collection of sentences below the link? That's the description. If you use the same general description across all pages, its relevance to the user's search drops significantly, and with a lower click-through rate your ranking will more than likely take a hit.
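The fix is a unique, page-specific description in each page's head; the copy below is a hypothetical example:

<meta name="description" content="Durable rubber and rope chew toys for dogs of all sizes, with free in-store pickup at our downtown pet shop.">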

2. Watch the keywords

Let's say you're a pet shop and you have a page dedicated to chew toys for dogs. This page isn't an opportunity to discuss your business' history, cat adoption days, food choices, oh, and chew toys for dogs. Try to use the exact keyword phrase you're targeting at least 2-3 times in contextual, common language within the first 150 words, so both the reader and the search bot know what the page is about. Getting the exact keyword phrase into the URL and page title earns major bonus points, as in the example below.
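For a hypothetical page targeting the phrase "chew toys for dogs", the URL and title tag might look like this:

http://yourpetshop.com/chew-toys-for-dogs/
<title>Chew Toys for Dogs | Your Pet Shop</title>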

3. Content is king

One sentence on a page won't get you ranked (unless you're HuffPost or Wikipedia), so don't expect any results until you have original, meaningful content on the page that includes the targeted keyword phrase(s) mentioned above. Strive for north of 300 words, up to 2,000 words for long articles/blog entries. I tend to stay in the 600-1,000 word range, as I find it gets you the most bang for your buck when hiring contract writers.

An oft-forgotten type of content is the images you're already posting on your pages to make them more visually stimulating. Search engines pick up these images for their image search catalogs, along with some metadata:

  • file name
  • title tag
  • description of the page where the image was found
  • link to the page hosting the file

Make sure the file name describes the image, and if at all possible, ties back to the page content as well.
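For example, a hypothetical product photo could be named and tagged like this, so the file name, title and alt text all tie back to the page's topic:

<img src="/images/red-rubber-dog-chew-toy.jpg" title="Red rubber dog chew toy" alt="Red rubber chew toy for dogs">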

4. Google+ is a plus

Raise your hand if the first thing you do when you wake up in the morning is roll over, grab your phone, and see what you missed on Google+. Didn't think so. But does that mean posting links and content on Google+ and managing a page there is pointless? In fact, the opposite is true. Google treats its own social network as a key place to pick up new pages to index, and the bots like to see pages stay active and fresh. Of course, this is most relevant for blog content, not a company's "about" or "services" page.

5. Submit the site

Have your site and its respective sitemaps been submitted to Google, Bing, and Yahoo!? Once you're sure you're free of crawl errors, get the site listed and sign up for Webmaster Tools on each platform. You'll receive notices when opportunities and issues arise, plus there are great tips on improving your site's presence and free tools to test your work.

Conclusion

I hope you find these tips helpful. While an SEO professional will certainly kick it up a notch, these are some good "first pass" steps a tech-savvy site owner can take on their own, and they hopefully demystify some of the basics of how an SEO professional operates.


SEO Basics to Prepare for Google's Future Algorithms in 2015

There are millions and millions of personal computers online right now, and the number is growing hugely by the minute. But there are already far more mobile devices with Internet search capability in use than there are PCs operating online. Google knows this and will take steps to give users a much better online search experience.

One way the biggest search engine on the planet can do this is by adding icons next to search results to tell users what to expect. This could be a warning icon, for instance, when a site generates errors for users searching on mobile. If Google goes this way, webmasters would be wise to make sure their sites render correctly and quickly on mobile devices, without any problems.


2015 Starts With These 5 SEO Tips

Search engine optimization results can come from a couple of straightforward steps that should be followed to improve a webpage's position in the Google and Bing rankings. It is important to remember that SEO is one of the most vital parts of advertising for any business or individual looking to drive business through their website and get as much traffic as possible to see their services and products on a regular basis.
