Whereas the BERT algorithm was designed to answer simple questions by processing many small samples of natural language text content, the Multitask Unified Model (MUM)—which Google claims is 1000 times more powerful—was designed to answer complex questions through a technique known as transfer learning. This means that MUM works at a higher level, relying on language models that have already been built and refined through BERT and other similar algorithms that take different approaches to language processing. With access to so many different language models for both text and images, Google hopes that MUM will reduce the number of searches required to compare and contrast two similar subjects.
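To make "transfer learning" a little more concrete, here's a minimal sketch using the open-source Hugging Face Transformers library. This is not what Google runs internally (MUM's internals aren't public); it just illustrates the principle the term describes: a model pretrained on huge amounts of text is reused as a starting point, and only a small task-specific layer is trained on new data. The model name, labels, and example sentences below are purely illustrative.

```python
# Minimal transfer-learning sketch: reuse a pretrained BERT and fine-tune
# a new classification head on a tiny, made-up task. Illustrative only.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=2,  # new task with 2 labels; this head starts untrained
)

# Hypothetical training examples (label 0 = care question, 1 = compatibility question).
texts = [
    "What size cage does a cockatoo need?",
    "Will a cockatoo get along with a budgie?",
]
labels = torch.tensor([0, 1])
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

# Fine-tuning only nudges the pretrained weights, which is why this needs
# far less data and compute than training a language model from scratch.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
optimizer.zero_grad()
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
```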
For example, let’s say you have a pet budgie (a parakeet), and one day you get a call from your great-aunt, asking if you could adopt her pet cockatoo (a large parrot) when she moves into an assisted living community next month. Before you can give her an answer, you need to know whether your parakeet knowledge and experience will carry over to a bigger bird. Will one of your current cages work? Do you need a special perch? Can the cockatoo eat the same food as the budgie? Will they fight with each other? The one question that underlies everything is: given my current expertise and resources, what else do I need to consider?
Unfortunately, that’s not a very good search query. You might get lucky and quickly find a blog post, article, or video that covers this exact topic in exactly the level of detail you need. More likely, though, the answers you’re looking for will have to be pieced together from several imperfect sources found through several different queries. You’ll waste a lot of time skimming redundant or irrelevant content, and despite all that effort you still might not get the kind of perspective you need.
Ideally (and realistically) you’d instead use Google to find a credible expert who has dealt with both of those birds and can offer specific guidance on this scenario. In the near future, Google hopes to use MUM to provide this kind of specific information through fewer, easier searches. With a natural language query like "I already have a budgie, what do I need to know before adopting a cockatoo?", Google would understand which birds you are referring to, recognize that they are or will be pets in the same house (and where that house is in the world), and then build a comprehensive context around all of those subjects for you. You might not even need to think of more queries: a MUM-enhanced Google SERP would be able to anticipate your questions in order of importance, starting with several ways to compare the two birds in terms of pet care, plus advice from people who keep both. Then you’d see OneBox results for cockatoo-specific questions that you haven’t asked (or even thought of) yet: behavior toward other pets (especially budgies), cage and perch sizes, preferred toys, engagement activities, annual avian veterinary costs, average lifespan in captivity, toilet training techniques, how to deal with jealousy and tantrums, and highly rated wing clipping and nail trimming services near where you live.
That’s only the beginning. The next step for MUM is to cross-reference a query across text-based and image-based models. You’d be able to take a photo of your budgie and ask Google: will this bird get along with a cockatoo? Or you could take a picture of a cage and ask: is this a good cage for a cockatoo? You might even be able to put all of your bird food on a table, take a photo of it, and ask: will a cockatoo eat this?
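For a rough sense of how a question in words can be checked against a photo, here's a short sketch using OpenAI's publicly released CLIP model (via the Hugging Face Transformers library), which maps images and captions into a shared space so they can be compared. To be clear, this is not MUM, whose internals Google hasn't published; the file name and candidate captions are made up for the example.

```python
# Compare one photo against a few candidate captions with CLIP.
# This only illustrates the general text+image idea, not MUM itself.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Hypothetical photo of the cage you already own.
image = Image.open("my_bird_cage.jpg")

# Candidate answers to "is this a good cage for a cockatoo?"
captions = [
    "a large cage suitable for a cockatoo",
    "a small cage suitable for a budgie",
]

inputs = processor(text=captions, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)

# logits_per_image holds image-to-text similarity scores; softmax turns
# them into probabilities over the candidate captions.
probs = outputs.logits_per_image.softmax(dim=1)
for caption, p in zip(captions, probs[0]):
    print(f"{p:.2f}  {caption}")
```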
Even that advanced scenario still wouldn’t be a proper replacement for a qualified human expert, but compared to a pre-MUM Google search effort, it would give you a much more accurate impression of how much extra work and expense adopting your great-aunt’s parrot will involve. That level of information is probably not enough for you to agree to adopt the cockatoo (you’d still want to talk to an expert first), but it may be enough to convince you that you and your parakeet would be much happier without it.
It isn’t clear what the timeline for MUM implementation is. Google announced that it was working on MUM in mid-2021, but hasn’t yet given examples of how it affects search results.
I find this sort of thing fascinating, which is why I wrote a whole book about Google’s search technology and how to use it: Google Power Search. If you’ve read this far into this blog post, I can practically guarantee that Google Power Search will be the best technical book you’ve read in a long time.