What is Search Engine Optimization?
Why does referral traffic to your website from Google, Yahoo and Bing increase the likelihood that your website will achieve web success? How do you get more visitors to your website, which can, in turn, increase profits?
If you’re just starting out in the world of e-commerce, welcome to the shark tank. The World Wide Web (W3) is the fastest-growing, most competitive and most exciting marketplace in the history of buying and selling.
The web is also the place where entrepreneurs have the opportunity to build a successful business for just a few bucks. The W3 has changed everything, which is why you’re thinking of diving in.
Optimization is probably NOT at the top of your “to-do” list. But it sure should be. SEO often makes the difference between web success and a “down in flames” scenario you don’t even want to think about.
Introducing the Search Engine
Web search engines first appeared in the early 1990s – Yahoo launched its now-famous directory in 1994 – and as soon as these now-essential tools arrived, search engine optimization began. Search engines, like Google (the 800-pound gorilla of search engines), provide the “address” of the web site that contains the information or products you’re looking for. Before search engines, we were all wandering around the web without a road map or an address book.
Today, simply type a search query, using keywords, into Google, Yahoo, Bing or any of the thousands of other search engines crawling the web, and you’ll see dozens of Search Engine Results Pages, or SERPs, offering suggestions for web sites that would interest you based on the keywords you entered into the search box.
The War Between Google and SEOs
Search engine optimizers (SEOs) somewhat understand how search engines evaluate a web site and their job is to manipulate the machine – to get search engines to rank their clients’ web sites higher in the SERPs.
The chances of a search engine user clicking on a SERP link are much higher if that link is on page 1 of Google’s SERPs rather than on page 152. Think about your own use of search engines. When you make a search query, you look at the results found on the first few pages. When was the last time you drilled down to page 152 of Google’s SERPs? Never!
The reason is simple. The prime objective of a search engine is to deliver SERPs links that are relevant to the keywords entered into the search box. The more relevant the web site based on the query, the higher the web site ranks in SERPs.
Search engines employ top-secret algorithms – complex mathematical formulas – to assess what a web site is all about, to classify a web site within the search engine index and to deliver relevant SERPs to users.
If you typed “snow tires” into the Google search box and a bunch of results listing knitting stores, sports memorabilia and health food stores appeared on Google’s SERPs, pretty soon you’d stop using Google because it would be a waste of time.
On the other hand, if you type in “snow tires” and find local retail tire stores, you can do some comparison shopping on prices, look for special sales and even print out a map to get to the tire store. When it comes to search engines, it’s all about relevance – how relevant the search results are to what the search engine user types into the search box.
Positive and Negative Ranking Factors
Search engines use small snippets of code, called bots or spiders, that “crawl” a web site to determine its topicality and how relevant that web site is to the keywords entered by the search engine user.
Bots move through a web site looking for letter strings – the repetition of the same sequence of letters. So, if a bot finds numerous references to “snow tires,” the bot assumes the web site is about snow tires.
Now, it could be a tire store, research on snow tire performance, a national manufacturer snow tire web site, a PhD thesis on snow tires, but search engine spiders look for repeating letter strings to assess into which category a particular web site should be indexed or classified.
The natural inclination, therefore, is to use the words “snow tires” a lot. The more the better, right? Well, search engine algorithms are created to recognize what’s called keyword stuffing – the overuse of certain keywords and terms designed to attract the attention of search engine bots.
If you sell bananas online, and every other word of your web site text is “banana,” search bots are programmed to recognize keyword stuffing, and when they do, the web site ranks lower in relevance than a web site that provides helpful information on the purchase of bananas.
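The “letter string” idea above can be sketched in a few lines of Python. This is a toy illustration only – real search engine algorithms are secret and vastly more sophisticated – and the sample page text and candidate topics are invented for the example:

```python
from collections import Counter
import re

def guess_topic(text, phrases):
    """Toy stand-in for a crawler: count how often each candidate phrase
    (letter string) repeats in a page, and pick the most frequent one as
    the page's topic. Real ranking algorithms are far more elaborate."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter()
    for phrase in phrases:
        target = phrase.split()
        n = len(target)
        counts[phrase] = sum(
            1 for i in range(len(words) - n + 1) if words[i:i + n] == target
        )
    return counts.most_common(1)[0][0]

page = ("We stock snow tires for every vehicle. Our snow tires are "
        "studded, and we install snow tires while you wait.")
print(guess_topic(page, ["snow tires", "health food", "knitting"]))
# prints "snow tires" - the phrase that repeats most often
```

A page stuffed with one phrase would dominate a tally like this one, which is exactly the pattern real algorithms are built to detect and penalize.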
Linkage – the number of web sites that link to your web site – is another ranking factor. The more web sites linking to your web site, the higher your web site is ranked. The thinking here? If other site owners link to a web site, they’re recommending it to their own site visitors – an indication of a really good web site.
There are dozens and dozens of ranking factors, and even SEOs – men and women who earn their salaries optimizing web sites – don’t agree on which factors matter most, which have a positive effect on SERPs rank and which don’t. The debate is endless.
Search Engine Bots Never Bought Anything
Okay, so now you’re thinking about search engine optimization and how to improve your site’s rank on Google’s or Yahoo’s SERPs, right? Well, you should.
The better optimized your site is, the higher it will appear on SERPs, and SERPs drive a lot of traffic to a web site. For many web sites, Google alone accounts for 60% of all visitors, so ranking highly in Google’s SERPs sends a steady stream of visitors to your web site.
However, it’s important to remember that search engine bots never bought a thing. These computer code snippets crawl your web site, gobble up letter strings and index the pages of your web site.
They don’t buy anything, so even if your web site ranks highly on Yahoo’s search engine results pages, that doesn’t necessarily mean web success is within your grasp. In fact, you may rank highly only for obscure, little-used keywords. If you own a hearing aid repair web site, you’ll see organic traffic only when search engine users type in “hearing aid repair” – and that’s not going to happen a million times a day, that’s for sure.
The Web Screen
Your web site consists of two key components – the HTML (or XHTML) code that actually creates the web site, and the presentation layer, or web site skin – the web site that humans actually see.
Think of a web site as a screen. Behind the screen are lines of code that search engine spiders see. Googlebots never see the other side of the screen – the presentation layer. That’s for humans only. However, humans never see the underpinnings of the web site – the source code – the lines of HTML code that create the look, feel and the structure of your web site.
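The two sides of the screen can be demonstrated with a short Python sketch built on the standard library’s `HTMLParser`: feed it raw source code (the bot’s side of the screen) and it pulls out only the text strings a crawler can actually read – no colors, no fonts, no images. The sample page, store name and file name are invented for illustration, and real crawlers are far more elaborate:

```python
from html.parser import HTMLParser

class TextBot(HTMLParser):
    """A minimal sketch of what a crawler 'sees': the text inside the
    HTML source, not the rendered colors, fonts, or graphics."""
    def __init__(self):
        super().__init__()
        self.text = []
    def handle_data(self, data):
        # Collect only non-empty text strings between the tags.
        if data.strip():
            self.text.append(data.strip())

source = """<html><head><title>Ed's Snow Tires</title></head>
<body style="background:#fff"><h1>Snow Tires on Sale</h1>
<img src="logo.jpg"><p>Free mounting with every set.</p></body></html>"""

bot = TextBot()
bot.feed(source)
print(bot.text)
# prints ["Ed's Snow Tires", 'Snow Tires on Sale', 'Free mounting with every set.']
```

Notice that the `style` attribute and the `logo.jpg` graphic contribute nothing to what the bot collects – the presentation layer is invisible from this side of the screen.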
This actually simplifies the process of optimizing your web site for search engine spiders. The code, crawled by spiders, should be optimized for spiders so these search engine tools accurately identify all of the content your web site contains.
On the other side of the screen – the presentation layer – optimize for humans. The color scheme, the type font, the navigation architecture and navigation bar – everything on the presentation layer should be designed to appeal to humans. Bots never even see your pretty choice of colors or the cool graphics you had created. In fact, bots can’t read any web site element saved with a graphic extension like jpg, gif, bmp or another non-text extension.
Balance Creates Web Success
The key to long-term web success is balance. That’s what search engines want so give it to them.
A good SEO copywriter can effectively work in the words “snow tires” up to about 5% of the total site text – roughly the maximum you want before search engines slam you for keyword stuffing.
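That 5% figure is a rule of thumb, not a threshold any search engine publishes, but it is easy to check your own copy against it. Here is a small Python density checker; the sample copy is invented for the example:

```python
def keyword_density(text, phrase):
    """Percentage of the copy's words taken up by repetitions of a phrase.
    A toy metric against the article's 5% rule of thumb; search engines
    do not publish an actual stuffing threshold."""
    words = text.lower().split()
    target = phrase.lower().split()
    n = len(target)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return 100.0 * hits * n / len(words)

copy = "snow tires " + "built for grip on icy winter roads " * 6
d = keyword_density(copy, "snow tires")
print(f"{d:.1f}%", "within the 5% ceiling" if d <= 5.0 else "risks a stuffing penalty")
# prints "4.5% within the 5% ceiling"
```

One occurrence of a two-word phrase in 44 words of copy works out to about 4.5% – comfortably under the ceiling – while the banana-every-other-word page from earlier would score near 50%.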
Also, play by the rules. The rules are laid out clearly by search engine developers and are available for any new web site owner to review. For example, some site designers might employ invisible text. How?
Place white text against a white background and it’s invisible to the human eye, though bots can still read it without a hitch. So, you might be tempted to place the search term “snow tires” in invisible type all over your web site.
Don’t. Bots, and the mathematical formulas that direct them, have grown more and more sophisticated over the years, and if you get “caught” using any of these prohibited tactics, you lose page rank, and your site may even be banned. You’re out of business.
Optimize your HTML code for spiders. Optimize the presentation layer for humans with credit cards and telephones looking for the closest place to get a new pair of “snows.”
Then, wait for the telephone to start ringing. It will – if you do it right.