SEO for beginners. The layman’s guide to learning SEO in 2019
I was asked over dinner what SEO is and whether I could explain the basic concepts behind it. This post is my interpretation of how search engines (Google, mostly) work, and my attempt at explaining SEO to the layperson.
Google is essentially a giant searchable database of sites on the internet. Imagine you had a library of every word ever written or published, all in one place; having all that data means very little if you do not know where to look.
I believe that Google wants to make sense of large amounts of data and figure out which content best addresses a specific query. They’re continually asking what content would give the greatest utility and value for each query. Their core focus hasn’t changed, but their end product has morphed over the years as the raw materials (the content they index) have evolved.
First you have to get the site indexed, and getting links from social profiles can be a pretty easy way to do that. Since social platforms are crawled by search engine spiders (bots), a site linked to from multiple networks is an indicator to Google that “hey, this is a new site with some buzz, maybe I should add it to my index.”
I think there are many smaller algorithms at work.
There’s likely one to analyze on-site factors (content, layout, ease of use). Another for social signals (are people talking about the brand and site? Are they linking to and sharing posts?). Another for user experience (does the user have to click through a whole bunch of popups, or wait 30 seconds for a particular page element to load?). Another for utility: are people getting the answers they need from the query? Do they go on to purchase something or fill out a form to get more information, or do they abandon the site and look for answers elsewhere? And then there are off-site links: how many authoritative sites are linking back to you?
In each case, if another site with similar content performs better on a given factor, it gets the leg up in rankings.
For on-page factors, ceteris paribus, if site A has content in a structure Google understands and prefers, such as schema.org structured data, it would likely rank higher than a site which didn’t have any schema entries.
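To make “structured schema” concrete: schema markup is commonly embedded in a page as a JSON-LD snippet that spells out, in a machine-readable way, what the page is about. Here’s a minimal sketch in Python that builds such a snippet; the headline, author, and date are made-up placeholders, not a real site’s data.

```python
import json

# Hypothetical article details -- placeholders for illustration only.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "SEO for beginners",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2019-01-15",
}

# Wrap the data in the <script> tag that would sit in the page's <head>,
# where crawlers look for structured data.
json_ld = '<script type="application/ld+json">\n{}\n</script>'.format(
    json.dumps(article, indent=2)
)
print(json_ld)
```

A crawler parsing this tag can tell immediately that the page is an Article with a known author and publish date, rather than having to infer that from the raw HTML.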
For social signals, ceteris paribus, if site A had more people talking about and sharing links to an article than a similar piece on site B, Google would most likely recommend site A over site B.
For user experience: if site A presented everything in a mobile-friendly format, and site B required mobile users to pinch to zoom in and out to read content, people on site B would likely get frustrated and leave, giving Google an indication that site A would better serve them.
Then there are off-page links: established, high-traffic sites linking to you. E.g. you get featured on reddit.com, the post starts trending, and 10% of the people reading it click through to find out more about your site. Or you get a feature in Forbes or Entrepreneur magazine. That’s a pretty reliable indicator that your site has relevance and should be showing up in the search engines. It’s akin to the Bourdain effect for food: having an authority lend credibility and rave about you can, and usually will, slingshot you into the limelight.
I believe that all these algorithms churn out different values and work independently; a master algorithm then factors in these scores and ranks the sites accordingly. Google could change the weight given to any particular value and see how that impacts overall utility for website visitors. If they upped the weight of user experience, does that lead to greater time spent on the site, or greater frustration?
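The “master algorithm” idea above can be sketched as a simple weighted sum. To be clear, the signal names, scores, and weights below are illustrative guesses for the sake of the example, not Google’s actual signals or values.

```python
# Hypothetical sub-algorithm scores for one site, each normalized to 0-1.
scores = {
    "on_page": 0.8,
    "social": 0.5,
    "user_experience": 0.9,
    "utility": 0.7,
    "backlinks": 0.6,
}

# Illustrative weights for the master algorithm (they sum to 1.0).
weights = {
    "on_page": 0.25,
    "social": 0.10,
    "user_experience": 0.20,
    "utility": 0.25,
    "backlinks": 0.20,
}

def overall_rank_score(scores, weights):
    """Combine independent sub-scores into one ranking score."""
    return sum(scores[k] * weights[k] for k in weights)

baseline = overall_rank_score(scores, weights)

# "Upping the value of user experience": shift weight toward UX,
# away from social signals, and re-score the same site.
tweaked = dict(weights, user_experience=0.30, social=0.00)
adjusted = overall_rank_score(scores, tweaked)
```

Because this site’s user-experience score (0.9) is higher than its social score (0.5), shifting weight toward UX raises its overall score; a site with the opposite profile would drop. That is the kind of experiment the paragraph above describes.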