Last year Google introduced a new and exciting feature that shows you relevant content, even when you are not searching for it.
The feature has been gaining popularity day by day, and over 800 million people now use the feed each month to enjoy the latest content relevant to their interests.
Google has refreshed the feature, now called “Discover,” with a fresh look and a brand-new set of features.
With Discover, we can surf and explore more appealing and fresh visual content.
Although Google Web Stories looks much the same as the stories features on other platforms, it has some differences.
Research conducted by Valentin Pletzer found that 61% of the stories lasted between 1 and 24 hours, whereas 6% lasted only about an hour.
The data was collected from 6,000 URLs from more than 1,300 different publishers, and 66% of those URLs were Accelerated Mobile Pages (AMP).
Going further, 17% of those stories lasted for one to two days, while 7% lasted for two to three days.
The remaining 9% of those stories lasted over three days on Google Discover.
The minimum time a story lasted on Google Discover was 70 seconds, and the maximum was 710 days, almost two years.
Take a look at these two pictures to understand the data more clearly –
You must have noticed the major ups and downs your website experiences when there is an update from Google.
Google updates are the most annoying as well as the most exciting thing for your SEO team, depending on how your website performs.
“Google Dance” is a short period, typically three to five days, during which Google rebuilds its rankings and results fluctuate heavily.
In other words, it also refers to the high amount of uncertainty that a new asset like a website or webpage faces on Search Engine Result Pages (SERPs). Through this process, Google tries to determine what position to give the website in the index rankings.
In the video uploaded by Google Search Central on 18th of December 2020, John Mueller was asked the question that “During the core update rollout, is it like the quality of the website is calculated from the overall site signals, and then this site quality score is propagated to every page gradually, page by page? Is it possible that some pages drop and some pages surge, and the overall traffic to the domain remains the same?”
In answer to that question, John Mueller says you can see parts or portions of your website go up or down after an algorithm update. The reason is that some of Google’s algorithms look at the “bigger picture of the website,” while others “look at smaller parts of a website.”
Going further, he says that it depends on the particular algorithm: what scores it can generate at a granular level, and what it has to assume or guess for the pages where it does not have enough information to score at that granular level.
Explaining further, John says that “And it’s also that there are always a lot of different things that come out with regards to search, and some are a little bit more focused on the domain or on a bigger picture of the website. Some are focused more on smaller parts of a website.”
Each time there is a Google update, it tests and verifies the quality and performance of a webpage and a website, and then assigns the ranking to that particular website accordingly.
Google Dance happens based on several factors:
Age – The newer the website, the higher the chances of fluctuation.
Competitiveness – The more competition there is on the keyword you are targeting, or in the industry or locality you are in, the greater the fluctuation.
Link building – The volume of links being built also affects how much the website fluctuates.
Strength of links – The more powerful the links, the more fluctuation.
Modification – Major changes such as launching a new webpage or website, or redesigning the site, increase the chances of fluctuation.
The severity of the Google Dance depends on these same factors: the more dominant any of them is, the higher the chance of bouncing ranks. And if, after a lot of fluctuation, your website still doesn’t end up where you want it to be, and you start building more links or making other changes to push it to the first page, the fluctuation will return.
Even after you rank in the first position on the first page, there is no guarantee that your site will stay there forever. You need to work constantly on ranking factors to keep competing and winning the race.
There is no way to prevent your site from being affected by the Google algorithm dance. It is totally okay for a website to fluctuate every now and then; you don’t need to panic. Just keep following SEO best practices to minimize the effect of Google algorithm updates.
2020 has really been a dramatic year. Not just in the real world but also “online.”
Google announced the “Core Update” of 2020 on December 3rd at 1 pm ET. Generally, Google updates the system every three months, but this time the gap was stretched, as the previous update was in May 2020.
Later today, we are releasing a broad core algorithm update, as we do several times per year. It is called the December 2020 Core Update. Our guidance about such updates remains as we’ve covered before. Please see this blog post for more about that:https://t.co/e5ZQUAlt0G
— Google SearchLiaison (@searchliaison) December 3, 2020
Google further announced that the update might take up to a couple of weeks to fully roll out, if not less.
If we look at the data, this update is really big, even bigger than the May 2020 core update.
The data from companies that track Google’s search results clearly shows that this was a substantial update whose changes were felt across multiple verticals.
Like all other Google core updates, this one was global, not targeting any particular continent, country, region, language group, or category of websites. Such a classic “broad core update” occurs roughly every three months.
According to Rank Rangers, “December update caused even more changes compared to the May 2020 update and especially in the top three results.”
Here is how the December update compared to the May update:
While if we look at the top twenty positions, the fluctuations are almost doubled compared to the May update.
Looking at the data across various niches, fluctuations can be seen everywhere, with the travel niche showing a lower degree of fluctuation in the top 3 and top 5 results.
If we look at the data shared by SEMrush, desktop search changes were felt most in the health, real estate, travel, finance, law, and government verticals, while mobile searches were most affected in health, law and government, jobs and education, pets & animals, and real estate.
Searchmetrics, meanwhile, has shown results for some of the most popular websites in the world.
Have a look at them –
Moreover, if we look at the data provided by Sistrix, dictionaries and encyclopedias saw large losses, specifically because of the new quality rater guidelines.
According to various data sources, the update’s fluctuations seem to be slowing down; its effects started rolling out on Friday, December 4th.
Some groups of SEOs debated whether this was an appropriate time to release a new update. December is the festive season, with lots of sales and offers running, and this update could hurt many site owners badly.
As per Google’s announcement, the update was rolled out after the Thanksgiving season, Black Friday, and Cyber Monday but prior to the Christmas holidays. For many businesses, this update can be devastating.
According to a poll by Barry Schwartz, 41% of people said it was not a good time to release an update, 31% were in favor of the core update, and the remaining 26% were not sure.
Did Google release the December 2020 Core update at a good or bad time? After Black Friday/Cyber Monday but before the still holidays.
— Barry Schwartz (@rustybrick) December 6, 2020
Google has published a list of questions that you should consider if your site has been impacted negatively by a core update. Nonetheless, Google said that one could expect a bit of recovery between two updates, but the major change will be seen after another core update.
Additionally, if your website has been impacted negatively, it does not necessarily mean that something is wrong with it. You can refer to the questions published by Google’s public Search Liaison to understand the reasons behind the drop. They help in assessing whether the content of a page is still relevant, and they offer guidance on deficiencies to resolve and factors to improve.
There are no particular actions that website owners can take to recover from the damage done by a core update, and it is normal for a website to perform noticeably better or worse after such updates. Google makes algorithm changes frequently to understand what works better and to provide the best possible experience to its audience.
However, you can always check your search intent and content quality to make sure that your website is affected as little as possible by such core updates. In addition, you can examine general shifts in search interest in the field you serve, and validate the relevance of your content.
Once you understand and accept that the big shifts in rankings are caused by the algorithm change, it becomes easier to adopt the patterns and standards of the latest version and optimize your website accordingly.
When it comes to recovery time, it varies from case to case; there is no specific answer. You may need to wait until the next update, or recovery could happen in between updates. It all depends on the factor that led to the decline: a site with major quality issues may need to resolve them all before the next core update, while a website with smaller issues may start seeing positive results as soon as it fixes them.
On 24th November 2020, Google announced that it had launched new and improved crawl stats within Google Search Console.
This update is quite useful for web developers, SEOs, and business owners who want to monitor their site’s crawl activity.
According to Google, this new update will enable you to get a chart showing your crawl over time, displaying the total crawl requests over a period or by day, the total amount of downloaded data, and average response time.
Moreover, in this new feature of Google Console, the latest version of crawl stats will include the total number of requests grouped by response code, crawled file type, crawl purpose, and Googlebot type along with detailed information on host status. Not only that, but it also shows specific URL examples to display where a website’s requests happened and an overview of properties with various hosts and support for domain properties.
Sometimes it is really tough to understand why your website is not performing well, but the latest crawl stats update makes it easier for web developers, SEOs, and businesses to understand the true reasons behind their site’s performance, be it good or bad. Once they understand the reasons, it becomes much easier to act on them and improve the website.
By collecting the data from the Googlebot, website holders can gain insightful information followed by actionable plans.
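To make the kind of grouping the new report offers concrete, here is a minimal Python sketch of how a site owner might derive a similar per-response-code breakdown from their own server access logs. The log format and the simple "Googlebot" substring check are simplifying assumptions for the example; this is not how Search Console itself computes its stats.

```python
import re
from collections import Counter

# Matches a common-log-format request line and captures the HTTP status,
# but only for lines whose user-agent mentions Googlebot.
LOG_LINE = re.compile(
    r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .*Googlebot'
)

def crawl_stats_by_status(log_lines):
    """Count Googlebot requests grouped by HTTP response code."""
    counts = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m:
            counts[m.group("status")] += 1
    return counts

sample = [
    '1.2.3.4 - - [24/Nov/2020] "GET /page HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '1.2.3.4 - - [24/Nov/2020] "GET /gone HTTP/1.1" 404 0 "-" "Googlebot/2.1"',
    '5.6.7.8 - - [24/Nov/2020] "GET /page HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(crawl_stats_by_status(sample))  # Counter({'200': 1, '404': 1})
```

The same loop could just as easily group by file extension or crawl date, mirroring the other breakdowns the report provides.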
On the 28th of May 2020, Google first announced that it would include page experience as an additional ranking factor in the coming days. Furthermore, it also declared that Google would provide enough time and give prior notification before implementing new updates.
After that, on the 10th of November 2020, Google announced that it would finally make page experience an additional ranking factor in May 2021. Simply put, if your page is user-oriented and appeals to users in an engaging way, nothing can stop you from ranking higher on Google.
As businesses make more user-centric and easy to navigate web pages, they will be on the top of SERPs, resulting from a delightful customer experience. Moreover, people will engage and interact more with websites that they find the most attractive. It is essential to have great visual appeal along with amazing usability to make users stick to your website for a longer period of time.
In order to assess the usability and friendliness of a website, Google has set some standards that measure a page’s influence on the user’s experience. Some of the factors that decide page experience are mobile-friendliness, an HTTPS connection, the absence of intrusive interstitials, and page loading time. According to Google, these factors combine with the Core Web Vitals to form the page experience signal.
The Core Web Vitals measure your page with real-world, user-centric metrics. The three Core Web Vitals metrics are LCP, FID, and CLS.
LCP, or Largest Contentful Paint, measures loading performance. To provide a good user experience, your website should have LCP occur within 2.5 seconds of when the page first starts loading.
Going further, FID, or First Input Delay, measures the interactivity of the website. An ideal FID is less than 100 milliseconds.
Cumulative Layout Shift (CLS), meanwhile, measures the visual stability of the website. To offer a good user experience, pages should keep a CLS of less than 0.1.
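These three thresholds are easy to encode. Below is a small illustrative Python helper, not an official Google tool, that checks a page’s measured values against the “good” bars cited above (LCP within 2.5 s, FID under 100 ms, CLS under 0.1); the function and threshold names are invented for the example.

```python
# "Good" thresholds as published for the Core Web Vitals metrics.
THRESHOLDS = {"lcp_s": 2.5, "fid_ms": 100, "cls": 0.1}

def vitals_report(lcp_s, fid_ms, cls):
    """Map each Core Web Vital to True if the measurement meets the 'good' bar."""
    return {
        "lcp": lcp_s <= THRESHOLDS["lcp_s"],   # loading performance
        "fid": fid_ms < THRESHOLDS["fid_ms"],  # interactivity
        "cls": cls < THRESHOLDS["cls"],        # visual stability
    }

# A page that loads and responds quickly but shifts its layout too much:
print(vitals_report(lcp_s=2.1, fid_ms=80, cls=0.25))
# {'lcp': True, 'fid': True, 'cls': False}
```

In practice the raw measurements would come from field data such as the Chrome UX Report or an in-page performance API, not from hand-typed numbers.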
Google will use image previews from your page in a snippet to give the user an idea of the content on your webpage. Likewise, visual indicators are highly useful in conveying the quality of a web page’s experience, and they help users decide which page they want to visit. Google is working to identify pages that meet all the page experience criteria it has set. Once the testing is done successfully, Google will launch the indicators in May 2021, sharing details on the progress in the coming months.
We're announcing that page experience ranking signals for Google Search will launch in May 2021. This will combine Core Web Vitals and previous UX-related signals.
Learn more: https://t.co/OrrR8LDl1a
— Google Search Central (@googlesearchc) November 10, 2020
As for how long these user experience indicators will last, Google will decide based on the data whether they are worth displaying in the search results.
There is also much confusion regarding AMP pages. Until now, an AMP version was compulsory in order to be eligible for Google’s Top Stories carousel, but the new page experience criteria will replace the AMP requirement, a significant change in the overall calculus. Google will continue to support AMP versions of your content.
These new metrics will undoubtedly impact the search rankings results on a significant level.
There have been many updates since Google was introduced to the world of the internet, and there will be many more in the future, but what remains crucial is the content. If you have good content to offer, no one can stop you from ranking higher. Nonetheless, you should never underestimate the power of the additional ranking factors that keep you consistently high in Google rankings.
On the 4th of November, 2020, a team at Google published a podcast about the dupe detection and canonicalization process at Google, hosted by John Mueller, Martin Splitt, Gary Illyes, and Lizzi Harvey. They talked about things we should all know: how Google processes the huge amount of content available online, and how it keeps search results relevant by serving the audience top-quality, original content.
The podcast opened in a refreshingly casual mood, and then Gary Illyes did a great job of explaining the significant difference between dupe detection and canonicalization.
What is dupe detection?
To begin the process, Google creates a checksum for each page: a unique fingerprint based on the words of that particular page. By comparing checksums across pages, Google can identify pages with similar content. A checksum is a small piece of data derived from a larger block of digital data, originally intended to detect errors introduced during transmission or storage. Checksums verify the integrity of data, though they cannot verify its authenticity.
Going further, Gary mentioned that dupe detection and canonicalization are two different things. Dupe detection comes first: Google clusters similar-looking content together and then chooses one page from the cluster as the “leader”; that second step is canonicalization. Duplication handling as a whole thus includes both cluster building and canonicalization. Dupe detection mainly relies on hashes or checksums made by reducing the content, followed by a thorough comparison; converting content into checksums makes dupe detection much cheaper. Gary explains that scanning the full texts would take more resources but would give almost the same results that Google gets from checksums.
In the dupe detection process, checksums catch both exact and near duplicates. Google has many algorithms that find and exclude boilerplate from pages; in other words, Google strips navigation and footer content from the checksum calculation and examines only the central content of the page.
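The strip-hash-cluster flow Gary describes can be sketched in a few lines of Python. This is only a toy illustration: the boilerplate list is invented, and MD5 stands in for Google’s undisclosed checksum function, which also handles near-duplicates rather than only exact matches.

```python
import hashlib
from collections import defaultdict

# Hypothetical boilerplate lines that appear on every page of a site.
BOILERPLATE = {"Home | About | Contact", "© 2020 Example Inc."}

def main_content(page_text):
    """Drop known navigation/footer lines before hashing, as described above."""
    return "\n".join(
        line for line in page_text.splitlines() if line.strip() not in BOILERPLATE
    )

def checksum(page_text):
    """Fingerprint the central content of a page (MD5 is a stand-in here)."""
    return hashlib.md5(main_content(page_text).encode("utf-8")).hexdigest()

def cluster_duplicates(pages):
    """Group URLs whose main content produces the same checksum."""
    clusters = defaultdict(list)
    for url, text in pages.items():
        clusters[checksum(text)].append(url)
    return [urls for urls in clusters.values() if len(urls) > 1]

pages = {
    "https://example.com/a":       "Home | About | Contact\nGreat article body.\n© 2020 Example Inc.",
    "https://example.com/a?ref=x": "Home | About | Contact\nGreat article body.\n© 2020 Example Inc.",
    "https://example.com/b":       "Home | About | Contact\nA different article.\n© 2020 Example Inc.",
}
print(cluster_duplicates(pages))
# [['https://example.com/a', 'https://example.com/a?ref=x']]
```

Note how the two URLs with identical article bodies cluster together even though their full HTML differs only by tracking parameters, while the page with different central content stays out of the cluster.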
After clustering the duplicates, how does Google pick the canonical? Canonicalization signals include the content itself, PageRank, HTTPS, presence in the sitemap file, server redirects, and the canonical tag. Machine-learning algorithms decide the weight of each signal, generally putting higher weight on redirects and the canonical tag. Gary further explains that although ML puts more emphasis on some factors, this has no consequences for rankings: the page that Google chooses as canonical is the one that will rank, but its ranking is not based on these canonicalization factors.
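Weighted canonical selection can be illustrated with a rough sketch. The signal names and weights below are invented for demonstration, since Google’s actual weights are learned by machine learning and not public; the point is only that redirects and the canonical tag count for more than weaker signals like HTTPS or sitemap presence.

```python
# Hypothetical weights: redirects and rel=canonical dominate weaker signals.
WEIGHTS = {"https": 1.0, "in_sitemap": 1.0, "redirect_target": 3.0, "rel_canonical": 4.0}

def canonical_score(signals):
    """Sum the weights of every signal present for a URL."""
    return sum(WEIGHTS[name] for name, present in signals.items() if present)

def pick_canonical(cluster):
    """From a cluster of duplicate URLs, pick the highest-scoring one."""
    return max(cluster, key=lambda url: canonical_score(cluster[url]))

cluster = {
    "http://example.com/page":  {"https": False, "in_sitemap": True,
                                 "redirect_target": False, "rel_canonical": False},
    "https://example.com/page": {"https": True, "in_sitemap": True,
                                 "redirect_target": True, "rel_canonical": True},
}
print(pick_canonical(cluster))  # https://example.com/page
```

Here the HTTPS variant wins because it accumulates the redirect and canonical-tag signals, matching the preference Gary describes.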
On 20th October 2020, Google announced that it would soon identify individual passages on a webpage, and that this would improve 7% of search queries across all languages once rolled out globally. With technologies like Artificial Intelligence (AI) and Machine Learning (ML), Google will now be able to judge the relevance of each paragraph of a page. But there has been a lot of confusion around this update.
Many people thought it would be “passage indexing” instead of “page indexing,” but Google will still index whole web pages, not individual parts of a page. AI and ML will help Google evaluate every paragraph of a page and surface more relevant information when someone searches for a specific, tricky question.
In current practice, Google assesses the whole web page to find an appropriate and relevant answer for a given query; after this change, Google will be able to find answers buried deep within a page. Some answers are easy to find, but for a very specific question you may otherwise need to go through multiple pages to get the answer you are looking for.
With the latest update, Google will evaluate pages in a more detailed manner; in other words, passages will rank somewhat independently of the rest of the page’s content. Many people confuse this with featured snippets, but the significant difference lies in how Google interprets it: Google will look at the relevance of a paragraph to a particular query instead of the whole page’s relevance.
For example, if a webpage is mainly about topic “X” but also covers parts of topics “Y” and “Z,” we will be able to get information about topic “Z” from that page. A web page can thus hold a vast pool of knowledge under a single topic with multiple subheadings, without compromising its ranking. To be more specific, if a page mainly talks about ways to improve skin health but also has sections on hair health and teeth health, that page can still rank for queries about teeth health and hair health.
With our new technology, we’ll be able to better identify and understand key passages on a web page. This will help us surface content that might otherwise not be seen as relevant when considering a page only as a whole….
— Google SearchLiaison (@searchliaison) October 20, 2020
There’s nothing special creators need to do here. Continue to focus on great content, with all the advice we offer: https://t.co/gcM0rvYaFb
It just means in some cases, we may now do a better job of surfacing content, no work required on the part of creators.
— Google SearchLiaison (@searchliaison) October 20, 2020
Moreover, we need to remember that Google will not “index” passages; it will still index whole web pages, but passages are now considered an additional ranking signal.
To conclude, with modern technologies like AI and ML, Google will go through each paragraph of a web page to determine the relevance of individual passages, enhancing the user experience by surfacing relevant information for complex and specific queries. Additionally, content creators will enjoy the freedom of covering multiple subjects under a single title.