How Google’s Roadmap for MUM Will Impact SEO in 2022 and Beyond

One of the biggest SEO headlines in 2021 was Google’s announcement of its new Multitask Unified Model (MUM) technology. In fact, our SEO experts recently said that keeping tabs on MUM should be a priority for SEOs in 2022 as the technology matures.

Back in July, Google’s Pandu Nayak spoke to Search Engine Land to discuss the roadmap for MUM and what it could mean for the future of search. So now seems like a good time to review Nayak’s comments as we prepare for the year ahead.

What does Google’s roadmap for MUM look like?

Based on Nayak’s comments, Google’s roadmap for MUM involves three stages: short-term, mid-term and long-term plans. The short-term phase has already begun, and we’ll continue to see it develop throughout 2022.

We may start to see some of Google’s mid-term plans come to fruition this year too, but it’s clear from Nayak’s interview that even Google isn’t 100% sure what the full impact of MUM will be in the long term.

Here’s a quick summary of the key points from his interview:

  1. Short-term development: Primarily knowledge transfer across languages.
  2. Mid-term development: Multimodal search incorporating text, images and video.
  3. Long-term development: Understanding complex questions and delivering a greater depth of relevant information.
  4. Content engagement: Nayak insists MUM won’t turn Google into a question-answering system.
  5. The ethics of MUM: Google’s plans for limiting bias and addressing the ecological impact of its energy-intensive technology.

Before we explore these in more detail, let’s quickly review the ins and outs of the Multitask Unified Model (MUM) technology.

What is Multitask Unified Model (MUM)?

Google announced MUM in May 2021 at the Google I/O virtual event. This was followed by a blog post published on The Keyword by Vice President of Search, Pandu Nayak.

Here’s a quick summary of the key points from both announcements:

  • 1,000x more powerful than BERT
  • Understand and create content across 75+ languages
  • Source and translate information from other languages
  • Understand information in text, images and video
  • Capable of analysing and answering complex queries
  • Identify items in images (products, patterns, people, etc.)

Before MUM, search was largely limited to the original language of the user: someone who typed a query in English would receive answers and content in English. The problem is, the best information available isn’t always in the same language as the query.

Imagine someone planning a trip to Japan who is looking for information about temples in Kyoto. The best information available may well be published in Japanese, and MUM allows Google to understand the original query in English, identify the best information and translate it into English for the user.

The technology also enables something called multimodal search, which allows Google to understand information in different media types. For example, users will eventually be able to use images in their searches and Google will identify items within the frame.

So users could take a picture of their hiking boots and ask Google whether they’re suitable for hiking Mt. Fuji. Or they might take a picture of a bike part and find out where to buy a replacement, relying on Google to identify the part in the image and match it with suitable replacements.

For a more in-depth look at the capabilities of MUM, take a look at our complete summary of the new technology.

Google’s roadmap for developing MUM

When Google first announced MUM, it was clear that many of the features being teased wouldn’t make their way into search for quite some time. Like BERT before it, MUM isn’t going to transform search overnight but it will develop over time into one of the most powerful aspects of Google’s algorithm.

Pandu Nayak reiterates this sentiment in his interview with Search Engine Land by laying out Google’s short-, mid- and long-term goals for MUM.

Short-term: Knowledge transfer across languages

For now, Google’s priority is developing MUM’s capabilities for transferring information across languages. Earlier, we touched on one example of how Google could source information published in Japanese and translate it into English for a user planning a trip to Japan.

This is the priority for Google in the short-term phase of MUM and we’ve already seen examples of it in the wild. The first public application of the technology was identifying over 800 variations of vaccine names across more than 50 languages as the world started searching for information about COVID-19 vaccinations.

“With MUM, we were able to identify over 800 variations of vaccine names in more than 50 languages in a matter of seconds. After validating MUM’s findings, we applied them to Google Search so that people could find timely, high-quality information about COVID-19 vaccines worldwide.”

Google says MUM was able to perform a task that would otherwise have taken weeks in a matter of seconds by transferring information across multiple languages. This was quite a test for the technology, given the limited amount of verified information available at the time compared to the surge in search queries across dozens of major languages.

[Image: Testing COVID-19 searches using Google’s MUM technology]

Crucially, this transfer of information means Google doesn’t need to learn from zero in each language. It can transfer skills and information to perform tasks in 75+ languages and scale improvements globally without having to relearn and retrain in each language.

As a result, Google requires significantly less input data than before and can now deliver information even in languages where little or no training data exists. The COVID-19 pandemic produced a real-world test far more challenging than anything Google could have manufactured, and MUM passed with flying colours.

This suggests the short-term phase of MUM development is progressing nicely.
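To make the cross-lingual idea more tangible, here’s a minimal sketch of how a query in one language can be matched against content in another by embedding both into a shared multilingual vector space. It uses the open-source sentence-transformers library and its paraphrase-multilingual-MiniLM-L12-v2 model purely as an illustration of the general technique; Google hasn’t published MUM’s implementation, and this is not it.

```python
# Illustrative sketch only: cross-lingual matching with a multilingual
# embedding model, not Google's MUM system.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

# An English query and candidate passages in different languages.
query = "best temples to visit in Kyoto"
passages = [
    "京都でおすすめの寺院を紹介します。",          # Japanese: recommended temples in Kyoto
    "How to rent a bicycle in Amsterdam.",         # English, off-topic
    "Les meilleurs restaurants de ramen à Tokyo.", # French, related city but different topic
]

# Query and passages land in the same multilingual vector space, so
# relevance can be scored without translating anything first.
query_emb = model.encode(query, convert_to_tensor=True)
passage_embs = model.encode(passages, convert_to_tensor=True)
scores = util.cos_sim(query_emb, passage_embs)[0]

for passage, score in sorted(zip(passages, scores.tolist()), key=lambda x: -x[1]):
    print(f"{score:.3f}  {passage}")
```

In this toy setup, the Japanese passage should score highest for the English query, which is the essence of what “knowledge transfer across languages” means in practice: relevance is judged across languages first, and translation happens afterwards.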

Mid-term: Multimodal search with different media types

Google’s mid-term plans centre around the implementation of multimodal search. This will incorporate features that allow users to search with different media types, including text, images and video.

Earlier, we looked at one potential example where users might take a picture of their hiking boots and ask Google whether they’re suitable for scaling Mt. Fuji.

To make this work, Google first needs to match the image with the exact same model of hiking boots. Then, it needs to match information from product pages, specifications, product reviews, Q&As, forums and other locations to compile its answer and list of relevant results.

Another hypothetical example Google has offered up is someone needing a replacement part for their bike. Without knowing the name of the specific part they’re looking for, users can’t type the relevant keywords into search.

Google hopes MUM can change this by allowing users to include a picture of such parts in their query.

[Image: Taking a photo of a bike part to upload to Google to find out what it is or where to buy it using MUM technology]

By uploading this image and the query “how to fix,” Google expects MUM will be able to identify the part users are looking for and find content that will help them fix or replace it.
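As a rough illustration of the underlying idea – matching an image against textual candidates in a shared embedding space – here’s a hedged sketch using OpenAI’s publicly released CLIP model via the Hugging Face transformers library. The file name and candidate labels are hypothetical, and this shows the general multimodal technique rather than anything Google has disclosed about MUM.

```python
# Illustrative sketch only: zero-shot image-to-text matching with CLIP,
# not Google's MUM system. "bike_part.jpg" is a hypothetical photo.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("bike_part.jpg")
candidates = [
    "a bicycle rear derailleur",
    "a bicycle brake lever",
    "a bicycle crankset",
    "a bicycle wheel hub",
]

# Score the photo against each candidate description in a shared
# image-text embedding space, then normalise the scores.
inputs = processor(text=candidates, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)
probs = outputs.logits_per_image.softmax(dim=1)[0]

for label, prob in sorted(zip(candidates, probs.tolist()), key=lambda x: -x[1]):
    print(f"{prob:.2f}  {label}")
```

Once the part has been identified in this way, the text query (“how to fix”) could be combined with the predicted label to retrieve repair guides – roughly the two-step matching described above, albeit vastly simplified.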

Don’t expect to see these changes start rolling out in 2022, though.

While Google’s short-term development plans are already taking shape, the mid-term phase remains more conceptual. Nayak says Google has tested several multimodal features using MUM and insists the results have been positive, but he clarifies that the exact implementation, features and timelines for the mid-term stage of MUM development remain uncertain.

Long-term: Advanced language understanding & query handling

Pandu Nayak says Google’s long-term goal with MUM is to maximise its ability to understand complex queries and provide more sophisticated answers. This echoes some of the comments made in his original blog post and the example offered at the time of someone planning their next trip to Japan – more specifically, someone who wants to know how hiking Mt. Fuji compares to scaling Mt. Adams.

“Today, Google could help you with this, but it would take many thoughtfully considered searches — you’d have to search for the elevation of each mountain, the average temperature in the fall, difficulty of the hiking trails, the right gear to use, and more. After a number of searches, you’d eventually be able to get the answer you need.”

As Nayak pointed out, though, if you were to ask a hiking expert, you could get the answer you’re looking for by asking a single question. Not only that, but you’d get “a thoughtful answer that takes into account the nuances of your task at hand and guides you through the many things to consider”.

This is the kind of answer Google wants to provide for complex queries so users can get the depth of information they need from a single query.

[Image: Example of how Google’s MUM technology would work when researching climbing a mountain]

Nayak admits that search engines still struggle to pick out the key pieces of information and deliver relevant results for all of the criteria included in more complex queries. Google hopes MUM will help the search giant to understand these queries, pick out all of the key points and find content that answers every aspect of the question – as you can see from the illustration above.
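To see why this is hard, here’s a toy sketch of the kind of decomposition Nayak describes: a single comparison question implicitly contains many sub-queries, each of which would traditionally be a separate search. The aspects and the helper name are illustrative assumptions, not Google’s method.

```python
# Toy sketch: expanding one complex comparison query into the separate
# searches a user would otherwise have to run themselves.
def build_subqueries(place_a: str, place_b: str, season: str) -> list[str]:
    aspects = [
        "elevation",
        f"average temperature in the {season}",
        "hiking trail difficulty",
        "recommended hiking gear",
    ]
    return [f"{place} {aspect}" for place in (place_a, place_b) for aspect in aspects]

# "I've hiked Mt. Adams; how should I prepare to hike Mt. Fuji in the fall?"
for q in build_subqueries("Mt. Fuji", "Mt. Adams", "fall"):
    print(q)
```

The hard part, of course, isn’t generating the sub-queries but recognising which aspects matter for a given question and then synthesising the results into one coherent answer – exactly the depth of understanding Google is aiming for in this long-term phase.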

Nayak addresses concerns about question-answer sessions

With all this talk about MUM helping Google to answer more complex queries, some SEOs have raised the concern that the technology could turn the search engine into a question-answer platform.

With two-thirds of searches ending without a click in 2020, according to data from SparkToro, SEOs and businesses are understandably concerned about Google directing less traffic to websites.

However, Pandu Nayak insists that MUM isn’t designed to turn Google into a question-answer platform. In the Search Engine Land interview, he reinforces this point, specifying that Google has no plans to create question-answer experiences because they simply wouldn’t be useful for search users.

He argues that, while it makes sense for Google to provide direct answers for simple queries (as is the case with existing zero-click searches), users require a greater depth of information for complex queries and, in many cases, the opportunity to explore information in more detail.

He uses the example of the query “what is the speed of light?”, where a direct answer is the best experience for the user. MUM, however, is designed to deal with more complex queries, which Nayak points out can’t be satisfied by single answers.

The ethics of MUM & AI algorithms

The final point Nayak raises in his interview with SEL is the ethical implications of MUM and of running AI models in general. He raises three points that Google is specifically working on:

  1. Limiting bias in the training data to minimise the risk of biased output. Nayak says Google trains on high-quality data to filter out the majority of bias, but he acknowledges that even high-quality data can contain biases and adds that Google takes further measures to remove them.
  2. Internal evaluations to identify any concerning patterns that could develop through training.
  3. Addressing the environmental cost of running large AI models, which consume vast amounts of energy. Google says its choice of model technology reduces its carbon footprint by up to 1,000x, and Nayak notes that Google has been carbon neutral since 2007.

While MUM is a new technology for Google, the search giant has worked on systems to mitigate these potential issues for many years, as Nayak revealed in his original blog post announcement:

“Just as we’ve carefully tested the many applications of BERT launched since 2019, MUM will undergo the same process as we apply these models in Search. Specifically, we’ll look for patterns that may indicate bias in machine learning to avoid introducing bias into our systems. We’ll also apply learnings from our latest research on how to reduce the carbon footprint of training systems like MUM, to make sure Search keeps running as efficiently as possible.”

Keep tabs on MUM and its impact in 2022

As our own Gabe Kegan says in our list of top SEO advice for 2022, search marketers need to keep a keen eye on MUM and its potential impact as Google continues to develop the technology.

We’ve already seen some impact in the shape of passage ranking and Google has teased plenty of other changes that could come into effect this year, some of which could significantly change the way users interact with search.

SEOs will have to keep an even closer eye on their search data this year because these developments will take place in the background. We aren’t going to get announcements and Google isn’t going to provide much (if any) clarification about the impact they have.

