In this article I want to cover some best practices that anyone developing a website should follow when optimizing their web pages for search engines. Optimizing a web page means ensuring there is a high probability that it ranks on the first page of results when your targeted keywords are entered into the search bar.
So, let’s now look at some key elements of web page design that should be taken into account:
Many of us make the mistake of assuming that humans and search engines interpret web pages the same way. In reality, that is not the case: there are things that humans can read and interpret that search engines cannot. Search engines are currently designed to read HTML text, so if your website has loads of images with text embedded in those pictures, you can rest assured that the search engine spiders will not be able to read them. With that in mind, you should do the following to your visually stunning web pages to ensure that they are also readable by the search engine bots:
(a.) When you use images in your web pages, assign “alt text” to each image containing a description of what it shows, so that the crawlers can read and understand it.
(b.) Java plug-ins should be accompanied by text on the page that describes what they do.
(c.) Flash plug-ins should likewise have descriptive text assigned to them.
(d.) If you plan on using embedded audio and video in your web pages, also append text descriptors in the form of a transcript: a written record of what is said in the audio and what happens in the video.
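The practices in (a.) through (d.) can be sketched in plain HTML. This is only a minimal illustration; the file names and description text below are placeholders you would replace with your own:

```html
<!-- (a.) Descriptive alt text that crawlers can read in place of the image -->
<img src="mountain-sunrise.jpg" alt="Sunrise over the Rocky Mountains in winter">

<!-- (b., c.) Fallback text inside a plug-in's <object> tag is readable by crawlers -->
<object data="product-tour.swf" type="application/x-shockwave-flash">
  <p>Interactive tour of the product dashboard and its reporting features.</p>
</object>

<!-- (d.) A transcript placed near the media element -->
<video src="welcome.mp4" controls></video>
<p>Transcript: "Welcome to our site. In this short video we walk through
the main features of the service..."</p>
```

Placing the descriptive text directly in the page's HTML, rather than inside the image or media file, is what makes it visible to the spiders.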
There are also lots of free tools on the Internet that you can use to confirm what search engines can read. It is good practice to run these tools after you have built and optimized your web pages. Google’s Cache and WebConfs.com are two great examples. These tools are especially useful for pages with lots of images and links, because they highlight whether the search engines understand what your images, audio and video are portraying, and whether the links you have included are readable as well.
Another question you have to ask yourself, especially if you own a website with many pages of content, is whether the search engine spiders and crawlers can find all of your pages so that they can be indexed. A lot of us make the mistake of creating compelling content but failing to link to those pages from our home page or from another page within our website. When that happens, we end up with clusters of orphan pages that have great content but cannot be found and indexed by the search engines. The end result is that your target audience never gets to see your content! To fix this, build crawlable link structures by ensuring that every page on your website is linked to from somewhere within the site. One good practice for a blog, for example, is to link from your home page to your first blog post and then link between every new blog post you create, so that when a crawler finds your home page it can simply traverse all of your blog posts and index them in its database.
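The blog-roll linking pattern described above amounts to a simple chain of anchor tags. This is a rough sketch with made-up page names; the point is only that a crawler starting at the home page can reach every post by following links:

```html
<!-- index.html: the home page links into the chain of posts -->
<a href="/blog/post-1.html">Read my first post</a>

<!-- post-1.html: each post links to its neighbours (and back home) -->
<a href="/">Home</a>
<a href="/blog/post-2.html">Next post</a>

<!-- post-2.html -->
<a href="/blog/post-1.html">Previous post</a>
<a href="/blog/post-3.html">Next post</a>
```

With this structure there are no orphan pages: every post is reachable from the home page by at least one path of plain HTML links.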