SEO (Search Engine Optimization) means just one thing, despite the many definitions offered that only add confusion. It means getting the search engines (Google, Yahoo, Bing, etc.) to show the content you care about when someone searching for that information online enters keywords related to your site or page.
- 1 What is the purpose of SEO?
- 2 Why should you care about SEO?
- 3 What is the most used search engine?
- 4 How does Google work?
- 5 Old SEO vs New SEO
- 6 What Matters To Google?
- 7 How does SEO work?
- 8 What Are the Most Important SEO Ranking Factors?
- 8.1 1-Crawlability, indexability and Security
What is the purpose of SEO?
When people search for information online, they type what they’re looking for into a search box. If you want to attract those searchers to your site, you need search engine optimization.
For your site to appear when people search for information, the search engines first need to determine that it is valuable, so that it surfaces in the results people actually see.
Why should you care about SEO?
- Organic Search is most often the primary source of website traffic
- SEO is a primary source of leads
- SEO results in higher conversion rates
- SEO promotes better cost management
- SEO builds brand credibility
- SEO helps establish brand awareness
- SEO can be a long-term marketing strategy
- SEO helps you gain market share
- SEO creates a synergy of all marketing activities online
- SEO increases your followers on social media
- SEO takes you ahead of the competition
What is the most used search engine?
With over 70% of the search market share, Google is undoubtedly the most popular search engine. Additionally, Google captures almost 85% of mobile traffic.
Bing, Google’s biggest contender, rakes in 33% of U.S. searches and also powers Yahoo, the U.S.’s third biggest search engine.
Baidu is China’s largest search engine, capturing over 75% of China’s search market.
Admittedly not the sleekest search engine interface, Yahoo still manages to capture fourth place in our list, with a little over 3% of the worldwide market share.
If you’re aiming to capture Russian traffic, Yandex is your best bet, with 65% of total Russian search traffic. Yandex is also popular in Ukraine, Kazakhstan, Turkey, and Belarus.
With about 0.35% of search traffic, Ask.com is certainly a more modest option compared to the likes of Bing and Yahoo!.
If you’re uneasy about targeted ads or don’t want your search data stored, you might want to try DuckDuckGo, which touts itself as “The search engine that doesn’t track you”.
How does Google work?
You can feel like a dog chasing its own tail trying to figure out how Google works. There are thousands of bloggers and journalists spreading volumes of information that simply isn’t true. The majority of blog posts about SEO are rarely written by experts or professionals with the day-to-day responsibility of growing site traffic and achieving top rankings in search engines.
Old SEO vs New SEO
Old School Methods
In its early days, over 15 years ago, Google promised a smarter search engine and a better experience for navigating the World Wide Web. It delivered on that promise by returning relevant search results.
Internet users discovered they could simply type what they were looking for into Google—and BINGO—users would find what they needed in the top results, instead of having to dig through hundreds of pages.
Google’s user base grew fast. It didn’t take long for smart, entrepreneurially minded webmasters to catch on to sneaky little hacks for ranking high in Google. They discovered that by cramming many keywords into a page, they could get their site ranking high for almost any word or phrase. It quickly spiraled into a competition of who could jam the most keywords into a page.
The page with the most repeated keywords won, and rose swiftly to the top of the search results. Naturally, more and more spammers caught on and Google’s promise as the “most relevant search engine” was challenged.
Webmasters and spammers became more sophisticated and found tricky ways of repeating hundreds of keywords on the page and completely hiding them from human eyes.
All of a sudden, the unsuspecting Internet user looking for “holidays in Florida” would find themselves arriving at a website about Viagra Viagra Viagra!
How could Google keep its status as the most relevant search engine if people kept on spamming the results with gazillions of spammy pages, burying the relevant results at the bottom?
Enter the first Google update. Google released a widespread update in November 2003 codenamed Florida, effectively stopping spammers in their tracks. This update leveled the playing field by rendering keyword stuffing completely useless and restored balance to the force. And so began the long history of Google updates—making it hard for spammers to game the system and making ranking in Google a little more complicated for everyone.
Old-school methods no longer work. Today stuffing a keyword into your content too many times can actually knock the stuffing out of your search rankings.
Fast forward 15 years and ranking in Google has become extremely competitive and considerably more complex. Simply put, everybody wants to be on Google.
Google is fighting to keep its search engine relevant and must constantly evolve to continue delivering relevant results to users. This hasn’t been without its challenges.
Just like keyword stuffing, webmasters eventually clued onto another way of gaming the system by having the most anchor text pointing to the page.
If you are not familiar with this term, anchor text is the text contained in external links pointing to a page. This created another loophole exploited by spammers. In many cases, well-meaning marketers and business owners used this tactic to achieve high rankings in the search results.
Along came a new Google update in 2012, this time called Penguin. Google’s Penguin update punished sites with suspicious amounts of links with the same anchor text pointing to a page, by completely delisting sites from the search results.
Many businesses that relied on search engine traffic lost all of their sales literally overnight, just because Google believed sites with hundreds of links containing just one phrase didn’t acquire those links naturally.
Google believed this was a solid indicator the site owner could be gaming the system. If you find these changes alarming, don’t. How to recover from these changes, or to prevent being penalized by new updates, is covered in later chapters.
In the short history of Google’s major updates, we can discover two powerful lessons for achieving top rankings in Google.
1. If you want to stay at the top of Google, never rely on one tactic.
2. Always ensure your search engine strategies rely on SEO best practices.
What Matters To Google?
Authority, trust, relevance and user behavior: four powerful SEO principles. Google has evolved considerably since its founding in 1998. Eric Schmidt, former CEO of Google, once reported that Google considered over 200 factors to determine which sites rank higher in the results.
Today, Google has well over 200 factors. Google assesses:
- how users are behaving on your site
- how many links are pointing to your site
- how trustworthy these linking sites are
- how many social mentions your brand has
- how relevant your page is
- how old your site is
- how fast your site loads… the list goes on.
Does this mean it’s impossible or difficult to get top rankings in Google? Nope. In fact, you can have the advantage. Google’s algorithm is complex, but you don’t have to be a rocket scientist to understand how it works.
In fact, it can be ridiculously simple if you remember just four principles. The four areas of focus are: Trust, Authority, Relevance and User Behavior.
1. Trust
Trust is at the very core of Google’s major changes and updates over the past several years. Google wants to keep poor-quality, untrustworthy sites out of the search results, and keep high-quality, legit sites at the top. If your site has high-quality content and backlinks from reputable sources, it is more likely to be considered a trustworthy source, and more likely to rank higher in the search results.
2. Authority
Previously the most popular SEO strategy, authority is still powerful, but now best used in tandem with the other three principles. Authority is your site’s overall strength in your market. It is almost a numbers game: for example, if your site has one thousand social media followers and backlinks and your competitors have only fifty, you’re probably going to rank higher.
3. Relevance
Google looks at the contextual relevance of a site and rewards relevant sites with higher rankings. This levels the playing field a bit, and might explain why a niche site or local business can often rank higher than a Wikipedia article.
You can use this to your advantage by bulking out your site with relevant content and applying on-page SEO techniques.
4. User Behavior
Are users sticking to your content like glue, or visiting and leaving your site faster than Usain Bolt? How users behave on your site is now among the stronger signals in Google’s algorithm, and you can take advantage of it by improving the user experience.
How does SEO work?
SEO works by optimizing your site for the search engine you want to rank in, whether it’s Google, Bing or Yahoo!.
Your job is to make sure that a search engine sees your site as the overall best result for a user’s search.
What Are the Most Important SEO Ranking Factors?
1-Crawlability, indexability and Security
In other words, Google has to be able to visit the URL and look at the page content to start to understand what that page is about.
Google’s search process has three core stages: crawling, indexing and ranking.
Your website must be discovered and followed by search engine spiders. Spiders, or bots, are programs that search engines send out to find and re-visit content (web pages, images, video, PDF files, etc.).
If a search engine spider cannot follow a link, the destination page will either not be included at all, or will exist in the search engine’s database but not be included in the universe of web pages available to search results.
How do you make sure your website content is indexable?
1-Robots.txt
Check your robots.txt file and see whether important directories are excluded from crawling.
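As a quick sanity check, you can test robots.txt rules against the URLs you care about using Python’s standard library. A minimal sketch — the rules and URLs below are made-up examples, not real site data:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents -- substitute your own file's rules
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot may crawl the blog, but /admin/ is blocked by the rules above
print(parser.can_fetch("Googlebot", "https://example.com/blog/post-1"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/admin/login"))  # False
```

In practice you would load the live file (e.g. with `RobotFileParser("https://yoursite.com/robots.txt")` plus `read()`) and check the directories that matter to your rankings.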
2-Sitemap.xml
Check whether your sitemap contains all the URLs that should be indexed, and check the status codes of those URLs.
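One way to run this check is to parse the sitemap XML and then request each URL, flagging anything that doesn’t return a 200 status. A minimal sketch, using a made-up sitemap string in place of a fetched file:

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap content; normally you would fetch
# https://yoursite.com/sitemap.xml instead of using a literal string
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/post-1</loc></url>
</urlset>"""

# Sitemap files use the sitemaps.org namespace, so findall needs it too
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall(".//sm:loc", ns)]
print(urls)

# Status-code check (requires network access, so left as a comment):
# from urllib.request import urlopen
# for url in urls:
#     status = urlopen(url).status  # anything other than 200 deserves a look
```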
3-Meta robots tags
Check your URLs for the noindex tag. Unless this tag is genuinely necessary, change it to “index, follow”.
4-Redirects
Check your redirect configuration (for example, the .htaccess file on Apache servers) for incorrect redirects or syntax mistakes.
5-DNS or Connectivity issues
If Google’s spiders simply can’t reach your server, you may encounter a crawler error. This could happen for a variety of reasons, such as your host being down for maintenance or having a glitch of its own.
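A first-pass connectivity check can be done from Python’s standard library by resolving the hostname and opening a TCP connection, roughly what a crawler must do before it can fetch anything. The hostname below is a placeholder for your own domain:

```python
import socket

def reachable(host, port=443, timeout=5):
    """Return True if the host resolves in DNS and accepts a TCP connection."""
    try:
        ip = socket.gethostbyname(host)          # DNS resolution step
        with socket.create_connection((ip, port), timeout=timeout):
            return True
    except OSError:                              # DNS failure, refusal, or timeout
        return False

# Replace with your own domain; if this fails, crawlers can't reach you either
print(reachable("example.com"))
```

This only proves the server answers; it says nothing about what status code or content a crawler would receive once connected.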
6-Internal nofollow Links
Search your website for nofollow attributes on internal links and remove them; if you need to keep a page out of the index, canonical or noindex tags are better alternatives.
If you use canonical tags, check that they point to the correct canonical URL.
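Both checks can be automated with a small scan of each page’s HTML: collect any nofollow links and the canonical URL, then review them by hand. A sketch using Python’s standard-library parser on a hypothetical page source:

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collects nofollow links and the page's canonical URL, if any."""
    def __init__(self):
        super().__init__()
        self.nofollow_links = []
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        rel = attrs.get("rel", "").lower()
        if tag == "a" and "nofollow" in rel:
            self.nofollow_links.append(attrs.get("href"))
        if tag == "link" and rel == "canonical":
            self.canonical = attrs.get("href")

# Hypothetical page source for illustration
html = (
    '<head><link rel="canonical" href="https://example.com/page"></head>'
    '<body><a href="/about" rel="nofollow">About</a></body>'
)
auditor = LinkAuditor()
auditor.feed(html)
print(auditor.nofollow_links)  # ['/about']
print(auditor.canonical)       # https://example.com/page
```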
How to make sure your website is secure?
HTTPS isn’t a factor in deciding whether or not to index a page. HTTPS, or “secure HTTP”, was developed to allow authorization and secured transactions. Exchanging confidential information needs to be secured to prevent unauthorized access, and HTTPS makes this happen. Google’s John Mueller has tweeted that HTTPS is a “light-weight ranking factor” and that “having HTTPS is great for users”.
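As a basic hygiene check, you can scan a list of your internal URLs for any that still use plain http://, since those can cause mixed-content warnings once the rest of the site is secure. A minimal sketch; the URLs are made-up examples:

```python
from urllib.parse import urlparse

def insecure_urls(urls):
    """Return the URLs that do not use the https scheme."""
    return [u for u in urls if urlparse(u).scheme != "https"]

# Hypothetical internal URLs; the http:// one is the kind you want to fix
pages = [
    "https://example.com/",
    "http://example.com/old-page",   # still plain HTTP
    "https://example.com/contact",
]
print(insecure_urls(pages))  # ['http://example.com/old-page']
```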