Chapter 9: What’s the Truth about How Search Engines Operate?

Many years have passed since the first search engines appeared on the internet, yet a surprising number of incorrect notions about how they work continue to circulate. For anyone just beginning their journey into the labyrinth of SEO, there is real confusion about what it takes to perform well on the different engines.

Submitting to Search Engines

When SEO first emerged as a field in the late 1990s, things were much simpler. A webmaster who wanted to be included in a search engine would submit the site through a form, tag the pages with keyword information, and wait for a bot to crawl them. Voilà! The site was ranked.

The problem with this system was that it could not scale. It was easy for a webmaster to spam keywords, and so the submission process was done away with around the start of the new millennium. All the major engines now state publicly that the best way for a site to rank highly is to earn a good number of links from other sites; that way, the crawling bots will find the content organically.

While some submission pages can still be found, they are no longer necessary in modern SEO. If anyone offering SEO services promises to submit your site to all the major search engines, you can be fairly sure you are being scammed out of your money. In modern SEO it is all about the links, and no amount of submissions will raise your ranking in the engines.

The Function of Meta Tags

In the past, meta tags were crucial to the SEO process. You would draw up long lists of keywords you wanted to rank for, and when end users typed those words into the engines, your page would pop up and you would get traffic. This, of course, was quickly gamed, and keyword spamming became the order of the day.

As a result, all the major search engines stopped using the keywords meta tag as a ranking signal and changed their algorithms accordingly. Certain tags, such as the title tag and the meta description tag, remain important tools in the SEO expert’s arsenal. So while some meta tags are still useful, they are no longer the all-important part of SEO that they were in the not-so-distant past.
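
To make the surviving tags concrete, here is a minimal sketch (using Python’s standard-library html.parser, with a made-up example page) that pulls out the two tags that still matter most, the title and the meta description:

```python
from html.parser import HTMLParser

class TitleDescriptionParser(HTMLParser):
    """Collects the <title> text and the content of <meta name="description">."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# A made-up example page; in practice you would feed in your own HTML.
html = """<html><head>
<title>Acme Widgets - Handmade Widgets Since 1999</title>
<meta name="description" content="Browse our catalogue of handmade widgets.">
</head><body>...</body></html>"""

parser = TitleDescriptionParser()
parser.feed(html)
print("Title:", parser.title)              # shown as the clickable headline in results
print("Description:", parser.description)  # often shown as the snippet below it
```

Write these two tags by hand for every important page: a unique, descriptive title and an accurate description still influence both rankings and click-through.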

Want Some Stuffing with That?

Why do some so-called SEO experts still insist on stuffing keywords into their pages?

There is a persistent myth that keyword density, the proportion of a page’s words made up by a target keyword, is one of the major components of the relevancy calculations the engines use to rank websites.

This has been shown to be a fallacy over the years, and if you hear advice to increase your keyword density, you would be well advised to ignore it.

Your best strategy is to use keywords intelligently, in a way that enhances the utility of your site, and to concentrate on earning those all-important editorial links that the search engines actually rank with.

Does Paying for Search Yield Greater Organic Results?

One of the most prominent debates currently bandied about in the SEO community is whether spending on search engine advertising actually boosts your organic rankings.

In our experience, there is no evidence that paying a search engine for advertising affects the organic results in any meaningful way. All of the major engines, such as Google and Yahoo, have put barriers in place between paid and organic search that effectively block the two from affecting each other.

Search Engine Spam

Search engines will always be fighting spam. In fact, search engine spam has increased dramatically since the 1990s. Search engine spam is, at its core, a web page designed to deceive search engine algorithms and users.

It has, however, become very difficult to deceive the search engines, and many practitioners are losing interest in trying. Search engines have become smarter, relying on statistical and historical data to help rank web pages.

Many people agree that over the last decade Google’s greatest advantage over its competitors has been its ability to keep spam out of its results. Google’s Panda update, for instance, introduced machine-learning algorithms that demote pages holding no significant value for users.

Given how good search engines have become at identifying quality content and valuable web pages, it makes no sense to invest time and energy in a page that will only be removed or ranked poorly.

Sometimes spam pages do get through, but they work only for so long. Investing time and energy in web pages with great content and valuable information has far greater payoffs in the long term; creating spam pages has no long-term benefit at all.

It is important for you to understand how the search engines identify spam. You can view Google’s Webmaster Guidelines and Bing’s Webmaster Guidelines online, and it will benefit you to know the reasons a search engine may penalise a web page. It is far better to use methods that don’t violate the guidelines.

The main thing to remember is that manipulative approaches all deceive the user in some way. Such methods will not earn your web page any points with the search engines, which will almost certainly penalise the page for using them. Create valuable, sincere content and the engines will eventually take notice. Remember that these engines rely on historical data, so it can take some time to rank well.

Page-Level Spam Analysis

Search engines now excel at analysing for spam both at the level of the individual page and across entire websites. Let us examine how they grade the different spamming practices.

Stuff it with Keywords!

The first and most glaring tool in the spammer’s toolbox is keyword stuffing, a technique that is obvious from its name. Since search engines look for certain words to determine how relevant a page is to a query, a spammer will litter a page with repetitions of the same word or phrase to attract the attention of the crawling bot and rank higher in a search.

The current algorithms are all wise to this simplistic approach, so stuffing a page with keywords is no longer effective to any degree. The engines have no trouble determining when a page is chock-full of spam.
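
As a toy illustration of why stuffing is so easy to spot, the sketch below computes the naive keyword-density signal that stuffers try to inflate. (Real engines rely on far more sophisticated statistical models; this is purely for intuition.)

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the page's words made up by one keyword --
    the naive signal that keyword stuffers try to inflate."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

stuffed = ("cheap widgets cheap widgets buy cheap widgets "
           "best cheap widgets online cheap widgets")
natural = ("We sell handmade widgets. Browse the catalogue "
           "or read our guide to choosing the right widget.")

print(f"{keyword_density(stuffed, 'widgets'):.0%}")  # ~38%: an obvious outlier
print(f"{keyword_density(natural, 'widgets'):.0%}")  # ~6%: ordinary prose
```

A page whose density sits far outside the range of ordinary prose flags itself, which is exactly why the tactic stopped working.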

What about Links then?

Another form of spamming is manipulative link acquisition. Here the spammer tries to outsmart the search engine by using links to inflate a site’s rating. This is harder for the engines to detect because of the sheer variety of forms it can take, including:

  • Exchange programs – where sites trade reciprocal links to fool the algorithm into thinking each site is more relevant than it really is (a naive detection sketch follows this list).
  • Link networks – where multiple bogus websites are created merely to serve as sources of links that boost rankings.
  • Links for pay – where a webmaster pays other webmasters to place links. This has been difficult for search engines to stop, as the arrangement provides value to all parties involved and collusion between the sites is hard to prove.
  • Cheap directory links – where paid web directories are used to chase higher rankings. Many of these directories have run afoul of Google, which has deleted or reduced their PageRank scores, yet plenty continue to thrive across the internet.
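
To show how mechanical the simplest case is to find, here is a toy sketch (with made-up domains) that flags reciprocal link pairs in a small link graph. Real engines work over billions of edges and far subtler signals, so treat this as intuition only.

```python
# Map each site to the set of sites it links out to (hypothetical data).
links = {
    "site-a.example": {"site-b.example", "news.example"},
    "site-b.example": {"site-a.example"},
    "news.example": {"wiki.example"},
}

# A pair is reciprocal when each site links to the other.
reciprocal = {
    tuple(sorted((src, dst)))
    for src, targets in links.items()
    for dst in targets
    if src in links.get(dst, set())
}
print(reciprocal)  # {('site-a.example', 'site-b.example')}
```

Reciprocity alone is not proof of spam, since plenty of legitimate sites link to each other, which is why the engines weigh it alongside many other signals.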

The ways in which links can be manipulated are many, and the search engines have identified most of them. Their algorithms uncover the connections behind these tactics and mitigate their impact on rankings. But, as in any arms race, as soon as one method is stamped out a new one emerges, and the engineers must apply a whole slew of tactics, including algorithms, human analysis, and spam reports, to battle those who would use nefarious methods to outsmart the engines.

Cloaking your Content

One of the most basic rules for webmasters is to show the search engines’ crawlers the same thing that a visitor to the page would see. This means no text hidden in the code of the page that the human who visits and reads it cannot see.

Those who violate this rule are now caught in the act by most engines and receive a ranking penalty as a result. “Cloaking” is generally considered ineffective today, though it has some legitimate uses depending on how it is applied, and the engines may let certain instances pass because of the positive user experience they create.
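
If you want a rough do-it-yourself check that your own site serves crawlers and visitors the same page, you can fetch it under two User-Agent strings and compare the responses. A minimal sketch (the URL is a placeholder, and a byte-for-byte comparison is only a crude first pass, since dynamic pages differ between any two fetches):

```python
import urllib.request

URL = "https://www.example.com/"  # placeholder: substitute your own page
AGENTS = {
    "browser": "Mozilla/5.0",
    "crawler": "Googlebot/2.1 (+http://www.google.com/bot.html)",
}

bodies = {}
for name, agent in AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": agent})
    with urllib.request.urlopen(req) as resp:
        bodies[name] = resp.read()

if bodies["browser"] == bodies["crawler"]:
    print("Same content served to both: no naive cloaking detected.")
else:
    print("Responses differ: investigate before a search engine does.")
```

Real cloakers often key off the crawler’s IP address rather than its User-Agent, so a simple check like this can miss them.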

Pages of Low Value

While pages in this category are not technically spam, all the major engines apply checks to determine whether a page offers unique content and actually provides some sort of value to the searcher. Common examples include thin affiliate content, duplicated content, and dynamically generated pages that provide no original text and are merely copies of other people’s content on the web.

Domain-Level Spam Analysis

Domain-level spam analysis is very relevant to your own site. It is worth using a tool to scan your pages and identify anything on the page or across the domain that might be flagged as spam, because a spam flag will only hurt your traffic numbers.

Search engines are also able to monitor the links associated with a website and to judge whether those links are of high quality. You don’t want a site associated with yours to be flagged for the practices mentioned above, as this could hurt your search traffic or even get your site banned from the index.

You want to focus on the trustworthiness of your site, but there is a wrinkle here: SEOs have shown a double standard in how trustworthiness is judged. Older sites are more likely to earn high trust because search engines look at the links a domain has accumulated over time. If you published a lot of duplicate nonsense on your site and then bought some spammy directory links, your rankings would suffer; yet if the same content were posted in exactly the same manner to a well-known site such as Wikipedia, it would most likely still rank well. That is how the power of trust and domain authority works. Keep in mind that you can establish trust through your inbound links: duplicate content and a few suspicious links will most likely be overlooked if you have earned plenty of links from high-quality sources.

Content value is another major focus. Drawing on everything we have learned, your page will be rated on its uniqueness and on the experience it gives your visitors. Even a site that performs regular SEO may fail to rank if it contains duplicate, non-unique content; search engines don’t want an index full of identical pages, so they evaluate sites continually to make sure this doesn’t happen. A high ranking is great when you can get one, but keep in mind that you will have to prove your value again and again to maintain it.
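
One classic textbook way to spot near-duplicate pages (not necessarily what any particular engine runs in production, where scalable variants such as MinHash fingerprints are used) is to compare word shingles with Jaccard similarity:

```python
import re

def shingles(text: str, n: int = 3) -> set:
    """All overlapping n-word sequences in the text."""
    words = re.findall(r"\w+", text.lower())
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: set, b: set) -> float:
    """Overlap between two shingle sets, from 0 (disjoint) to 1 (identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

page_a = "Our handmade widgets are crafted from the finest materials."
page_b = "Our handmade widgets are crafted from only the finest materials."
print(f"{jaccard(shingles(page_a), shingles(page_b)):.2f}")  # 0.50: near-duplicate territory
```

Unrelated pages score near zero on this measure, so even a one-word edit to copied text leaves an unmistakable fingerprint.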

Why Is My Site Underperforming on Search Engines?

There are four things you should check in order to work out why your site is underperforming.

  • First, look at your site for errors that could be inhibiting crawling. A good (and free) place to start is Google’s Search Console. Pay particular attention to any recent changes to your site, which may have inadvertently changed the way search engines view your content (a quick do-it-yourself crawlability check follows this list).
  • Pay close attention to sites with backlink profiles similar to yours and see how they are ranking. If they have dropped too, the cause may lie in the links rather than in your content.
  • Remember that when ranking algorithms are updated, the valuation and importance of links can change, which in turn can move your rank.
  • Keep an eye on duplicate content. Many modern sites have real problems with duplicate content, especially as they grow larger.
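
For the first item, one basic crawling error you can rule out yourself is a misconfigured robots.txt blocking the crawler. A minimal sketch using Python’s standard library (the URLs are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain: substitute your own site.
rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()

for page in ("https://www.example.com/", "https://www.example.com/private/"):
    ok = rp.can_fetch("Googlebot", page)
    print(page, "->", "crawlable" if ok else "blocked by robots.txt")
```

Search Console reports this and much more, but a sanity check like this takes seconds after any site change.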

How Do I Get Un-Penalised?

It is a pain to get your site re-included or reconsidered by the search engines, and success is not guaranteed. Frustratingly, you may not be able to find out what the issue is, as little feedback is offered, so it is critical to know how to handle a search engine ban if it happens to you.

Start by registering with Webmaster Tools for Google or Bing. This process establishes a verified link between the search engine and your website. Then use the available facilities to check your site for any broken links or pages that don’t function correctly, as well as for any alerts against your site, such as spam warnings (which are frequently recorded incorrectly if the site is not sufficiently accessible).

Repair any issues you find, especially bad links or anything that might appear designed to artificially inflate your SEO ranking, such as overuse of keywords or excessive linking within the site itself.
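
If you keep a list of your site’s key URLs, a quick pass like the sketch below (all URLs are placeholders) will surface broken links before you file for reconsideration; your webmaster tools account reports the same thing automatically for the whole site.

```python
import urllib.request
from urllib.error import HTTPError, URLError

urls = [
    "https://www.example.com/",
    "https://www.example.com/no-such-page",
]

for url in urls:
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(url, "->", resp.status)      # 200 means the page is fine
    except HTTPError as err:                   # 4xx/5xx responses
        print(url, "-> broken:", err.code)
    except URLError as err:                    # DNS failures, timeouts, etc.
        print(url, "-> unreachable:", err.reason)
```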

Once this is all sorted, use the search engine’s internal messaging system (rather than an open forum) to request that your site be reviewed for re-inclusion. The internal system is more likely to get you a response.

To maximise your chances of being relisted, be completely honest and declare the details of anything prohibited that you have done, such as spamming. Provide details of any links you obtained unfairly, and how, as the engines want to use this intelligence to improve their monitoring systems. If you aren’t straight with them, you are unlikely to be viewed as a legitimate candidate for re-inclusion, and you may not even hear back.

Then sit tight and wait for a response, which could take some time as there is a huge waiting list.

For major web brands, there is an alternative option that may speed this up: approaching an engineer or representative at a search industry event or conference and explaining the issue. The cost of the ticket is far outweighed by the value of getting the problem resolved quickly.

Remember, the engines don’t have to include you in their results; it is their choice. Set up your website carefully and apply only legitimate SEO methods, and you will reduce your risk of being banned in the first place.

Need Some Advice?