SEO terms in simple language

You keep hearing SEO jargon from other bloggers and SEO experts. These terms can sound profound and confusing.

In this glossary, let me simplify the SEO jargon for you.

404 not found: When a linked webpage does not exist, the browser shows a '404 not found' status to the visitor. If a website has many webpages that have gone dead or whose URLs have changed, that is a bad SEO signal because visitors do not get what they are looking for.

algorithm: The set of formulas that search engines use to calculate how important a webpage is from the search point of view. The algorithm used by Google is so well known that we simply call it the Google algorithm. Search companies keep their algorithms secret so that SEO experts and hacks cannot use them to artificially jack up the search ranking of their webpages. (Algorithm is a general computing term for a set of formulas used to solve any problem.)

alt attribute: A way to give a text description to images, written in HTML. This description or alt attribute is not directly visible but is read aloud by screen readers used by visually impaired people. It is also used by search engines to understand what the image is about. (An attribute is a property of an HTML tag, so the alt attribute is part of the img tag. Calling it an 'alt tag' is incorrect.)
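
To illustrate (the image file name and description below are made up for the example), the alt attribute sits inside the img tag like this:

<img src="taj-mahal.jpg" alt="The Taj Mahal seen from the main gate at sunrise">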

anchor text: The text that carries a hyperlink to a web entity. For example, in the sentence 'You can visit this article to know, in simple terms, what is a search engine', the words 'what is a search engine' are the anchor text. When someone clicks on this anchor text, they are taken to a webpage on the ITB website.
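
In HTML, the anchor text is simply the visible text placed inside the a tag (the URL below is only a placeholder):

<a href="https://example.com/what-is-a-search-engine">what is a search engine</a>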

backlink: A link received from another webpage. In the above example, the webpage on search engines has received a backlink from the webpage you are reading. 

If A is a webpage of a very authoritative website and it sends a backlink to webpage B, search engines will get a sense that B must also be authoritative. Thus, backlinks are a sort of endorsement, and so are important for search engine optimization. Also called inbound link.

BERT: A 'natural language processing' technique, i.e. a way of helping machines understand what humans say. Developed as a machine learning tool by Google, BERT tries to understand the context of a word by looking at the words to its left and right (so the B in BERT stands for bi-directional). It is used by search engines to understand the real intent behind search queries.

blackhat: When people use wrong ways of search engine optimization, such SEO is called blackhat SEO. Blackhat techniques try to fool search engines to get a webpage to the top of search results. Such techniques include hiding keywords, buying backlinks, link farming, etc. 

bounce rate: Used for telling whether people leave a website after landing on a webpage, i.e. without visiting other webpages of that website. In number terms, bounce rate is the percentage of all sessions on a website in which users viewed only one webpage. A high bounce rate tells search engines that the website does not have many webpages relevant to the search query.

broken links: When a link does not lead to the intended webpage. This happens when the linked webpage/ image is removed or its URL changes.

canonicalization: When you have more than one URL pointing to the same website (e.g. with or without www at the beginning), search engines treat them as different websites, and that hurts SEO. A 'canonical' tag is put in the HTML of the website to tell search engines which one is the primary URL out of the many. This is called canonicalization.
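
For example, a canonical tag placed in the head section of a page could look like this (the URL is illustrative):

<link rel="canonical" href="https://www.example.com/seo-glossary/">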

click through rate: The number of times a webpage is clicked divided by the number of times it appears in search results, expressed as a percentage. A webpage with a CTR of 2% means it was clicked 2 times out of the 100 times it showed up in search. A higher CTR signals to search engines that the webpage is relevant for the keywords for which it was searched. (CTR is also used for estimating how many times an advertisement was clicked out of the total number of times it was seen.)

core web vitals: A set of measurements of how good a webpage's loading and display experience is. They cover how fast the main content loads in the browser, how quickly the page responds when the visitor interacts with it, and how visually stable it is (e.g. whether the content shifts around while other elements such as advertisements load). Poor performance in terms of core web vitals can lead to downgrading of a webpage in search ranking.

crawling: Search engines have programmes called crawlers (also called search bots/ spiders) that visit websites and their webpages, using links, to collect data relating to search. This data helps search engines to put webpages in their index, remove webpages that have gone bad, find keywords in the content, etc.

crawl budget: The number of webpages of your website that Google might crawl in a given session. A large website may have thousands of webpages while search engines may crawl only a few hundred, and this may result in many webpages not being included in the search index. So, webmasters use robots.txt (see below) to tell search engines which webpages they may skip.

domain authority: A ranking score developed by Moz that tells how good a website is from the search point of view. Related is page authority, the equivalent score for individual webpages.

duplicate content: When the same content is found on more than one webpage, it is called duplicate content. It can hurt SEO because search engines may get confused about which copy is the original. When it results from undesirable copy-pasting of content, search engines may even penalise websites for it.

external link: A link to another website. Opposite of internal link, which is a link to another webpage on the same website.

Google Analytics: A Google website that gives you a lot of search-related data about your website if you put the Google Analytics code on the website. This data is very useful in making an SEO strategy for the website. (The link given above might take you to a Google login page before it opens the Analytics website.)

Googlebot: The main crawler or search software of Google. It keeps visiting billions of webpages to get data that is used for ranking websites for search purposes.

Google Keyword Planner: A free tool provided by Google for keyword research.

Google penalties: Google and other search engines can remove a webpage or a complete website from their index if they find blackhat SEO practices being carried out on it. There can also be other reasons for such penalties, e.g. carrying copy-pasted content or inappropriate content.

heading tags: HTML tags that specify headings and sub-headings within a webpage. There are six levels of heading tags, H1 to H6. These are important for SEO because they tell search engines that the words inside the heading tags are of special importance.
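
A typical arrangement of heading tags on a webpage might look like this (the headings themselves are made up):

<h1>SEO terms in simple language</h1>
<h2>alt attribute</h2>
<h3>How to write a good alt description</h3>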

Hummingbird: A 2013 Google algorithm update that tries to find the real intent of search queries. Semantics, or finding the true meaning of a word in the context in which it is used, is the basis of this algorithm update.

internal link: Link to another webpage within the same website.

keyword: One or more words in the search query that tell what the query is. When used for webpages, the keyword is one or more words that are likely to be used by searchers and that should exist in that webpage so that it comes up in search results. 

keyword difficulty score: Some SEO tools give a number that tells how difficult it is to optimize for a keyword. It is generally in the range of 1-100. Long tail keywords are usually easy for optimization and have low keyword difficulty score - and therefore bloggers should focus on them for SEO. 

keyword stuffing: Putting too many keywords or using the same keyword many times on a webpage so that search engines believe that the webpage is a great resource for those keywords. It is not a good SEO practice to stuff keywords on webpages. 

landing page: A webpage to which visitors are directed so that they take a desired action. Such actions on the part of visitors could be to register, give feedback, sign up for emails or a newsletter, download a document, place a purchase order, apply for affiliation or something else.

link farms: When people join hands to artificially promote each other's websites by linking to one another, such arrangements are called link exchanges and link farms.

link juice: The authority that is passed on to another webpage when it is backlinked from an authoritative webpage. So, when a high-authority website or webpage gives a backlink to your blog, you get a lot of link juice and the chances of your blog coming high on search pages go up.

local SEO: Search engine optimization done in a way that a webpage comes high on search results when someone searches for local items. Websites of local businesses need more local SEO than websites giving pure information.

long-tail keyword: Keyword that refers to a narrow topic. For example, "geography" is a very wide subject while "geographic formation of Himalayan ridges in Bhutan" is a very narrow subject - this is a long-tail keyword within "geography".

meta tags: HTML expressions that give information about the website or webpage. The website's name, its description, and the description of an individual page are important meta tags from the SEO point of view, because they help search engines know the subject of the website/webpage.
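
As an illustration, the title and description of a page sit in the head section of its HTML roughly like this (the wording is made up):

<head>
  <title>SEO terms in simple language</title>
  <meta name="description" content="A plain-language glossary of common SEO jargon for bloggers.">
</head>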

nofollow: If you mark a link on your blog as nofollow, you are telling search engines that your blog does not intend to pass on authority or link juice to the linked webpage. When you link to a bad webpage to tell people not to use it, you should make the link nofollow. For this, a nofollow attribute has to be added to the link's HTML.
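
A nofollow link looks like an ordinary link with a rel attribute added (the URL is only a placeholder):

<a href="https://example.com/dubious-page" rel="nofollow">this misleading article</a>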

off-page optimization: SEO actions that are not taken directly on the webpage, e.g. getting backlinks from authoritative websites.

on-page optimization: SEO actions taken on the webpage itself, e.g. putting alt attributes on images and using heading tags.

organic traffic: Traffic that comes to a website/ webpage without payment. On the other hand, when you pay (or issue advertisements) to get traffic, that is called paid traffic.

page experience: Google says it now gives high importance to the experience that searchers have on the webpage. Thus, if a webpage shows an intrusive popup, that is a bad page experience. Webpages with poor page experience are likely to suffer in terms of SEO.

PageRank: A formula used by Google to give a numerical score to webpages, based mainly on the number and quality of backlinks they receive. The name of this ranking score comes from its developer, Larry Page. The score is not made public but is part of the overall Google algorithm.

Panda: The name given by Google to a major update to its search algorithm. Released in 2011, its main aim was to reduce the ranking of webpages with poor quality content.

PBN: Private Blog Network. A network of websites used for giving backlinks to particular webpages to improve their search ranking. Google says it disapproves of such undesirable practices of artificially boosting a website's or webpage's ranking.

Penguin: Name given by Google to a major algorithm update in 2012. It is specifically targeted against link manipulation for SEO. 

RankBrain: It is a part of Google's search algorithm since 2015, and refers to the use of machine learning for finding the relevance of webpages for search queries.

ranking factors: The factors that help a webpage to rank high on search engines. These include the age of the website, pages linking to the webpage, bounce rate, length of the webpage, load speed and so on.

relevant traffic: Traffic that is relevant to the subject or advertisements on a webpage. 

Irrelevant traffic is useless, even harmful. People who land on a webpage because they were wrongly redirected, misled by an advertisement or followed a wrong link will not read the content or click on advertisements, and may even flag the webpage as bad.

rich cards: Visually enhanced and engaging search results. These are a form of rich snippets, and are created with structured data.

rich snippets: Snippets or brief descriptions of a webpage that come up in search results, with enhancements (e.g. images of products, previews of related webpages).


[Image: Rich snippets and cards in search results. (Courtesy: Google)]

robots.txt: A file (part of a website) that tells search engines which webpages to crawl and which they should not crawl for indexing. Webmasters usually do not want webpages with little value or duplicate content to be crawled by search engines.
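
A minimal robots.txt file might look like the sketch below; the folder names are purely illustrative:

User-agent: *
Disallow: /drafts/
Disallow: /search/
Sitemap: https://www.example.com/sitemap.xml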

search console: A Google website that gives a lot of information (including search-related problems) and advice about your website if you link the website with it. You need a Google account to make use of its tools.

search engine: A program that searches databases based on user queries. For searching information on the web, there are big web search engines such as Google and Bing. 

search intent: The intent (=purpose) of a search. When you make a search using Google search box, your intent could be to get information about something, buy or sell a product, locate a place, compare products, look for a person, or something else. 

semantic search: Search techniques that try to work out the real meaning of a search query out of many possible interpretations. As a simple example, the word 'where' in a search query tells the search engine that the searcher wants to know a location, even if the query does not say so explicitly. Search engines try to understand human language using machine learning and other modern tools.

SEO: Search engine optimization. All techniques that help a webpage show up high on the search pages of Google and other search engines. 'SEO' is also used by some people to mean an SEO expert.

SERP: Search Engine Result Page. The page of search results (usually 10 or more of them) that appears on the screen of the device when someone makes a web search.

schema.org: A set of information (called schema) provided on the website that helps search engines give visually rich search results. Rich cards (see above) are created using schema.org markup.

sitemap: A structured list of the webpages in a website. Though Google and other major search engines crawl all websites at regular intervals to find their content, submitting a sitemap to search engines helps them know which webpages should be crawled.
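
The XML sitemap format accepted by major search engines simply lists the URLs of the website; a stripped-down example with placeholder URLs and dates looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2023-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/seo-glossary/</loc>
  </url>
</urlset>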

snippet: The brief description that comes below the search result. (See rich snippets and rich cards above for visually rich types of snippets.)

structured data: Any data that is properly organized, such as Excel tables. However, in terms of search, it refers to data about a website given in a format recognized by search engines, so that they understand the content better and then display it in a better way in search results. Schema.org is the way structured data is formatted on a website.
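
Structured data is commonly added to a page as a JSON-LD block inside a script tag, using the schema.org vocabulary; a minimal sketch with made-up values looks like this:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "SEO terms in simple language",
  "author": { "@type": "Person", "name": "Example Author" }
}
</script>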

UX: Short form of user experience. An often-used piece of web jargon.

webspam: Techniques that try to confuse search engines to believe that a webpage is of high value though it is not. These are blackhat SEO tricks.

whitehat: The type of SEO techniques that are desirable. (Its opposite is blackhat SEO.)
