Although Google has been known for years as a leader in generative AI and language models, at the end of last year Microsoft successfully attacked it with ChatGPT, a chatbot based on OpenAI’s GPT model, integrated into the Bing search engine. Google, of course, had also been working on chat AI for some time: its LaMDA model was at the center of a strange (somewhat foreboding) interlude exactly a year ago, when a company employee claimed that the model had become self-aware.
The artificial intelligence race has intensified over the past few months. ChatGPT’s direct competitor, Google Bard, has so far been powered by LaMDA; at Google’s 2023 developer conference the company announced that a new language model, PaLM-2, is coming.
The name PaLM stands for Pathways Language Model, a reference to the Pathways architecture that underlies the model. One of its most important features is that it can be used for multiple purposes: it does not only generate text, but is intended to develop later toward general artificial intelligence.
Pathways enables multi-faceted modeling that combines visual, auditory and language comprehension. Whether the model processes the word leopard, the sound of someone saying the word leopard, or a video of a leopard running, the input is internally mapped to the same thing: the concept of a leopard. This results in a model that is more insightful, less biased, and less error-prone,
says the company’s blog post.
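The idea the blog post describes can be illustrated with a toy sketch (this is not Google’s code, and the `embed` function and its keyword-matching logic are invented for illustration): inputs from different modalities are collapsed onto one shared internal concept, so text, audio and video of a leopard all land in the same place.

```python
# Toy illustration of a shared concept space across modalities.
# A real multimodal model learns a joint vector space; here we simply
# normalize every described input down to the concept it refers to.

KNOWN_CONCEPTS = {"leopard", "tiger"}

def embed(modality: str, content: str) -> str:
    """Map an input from any modality to a single concept label."""
    for concept in KNOWN_CONCEPTS:
        if concept in content.lower():
            return concept
    return "unknown"

# Three different modalities, one internal concept:
print(embed("text", "the word leopard"))            # leopard
print(embed("audio", "someone saying 'leopard'"))   # leopard
print(embed("video", "a leopard running"))          # leopard
```

The point of the sketch is only the invariance: the modality argument never changes the resulting concept, which is what lets a single model reuse the same knowledge across text, sound and video.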
Magical animals
A 92-page technical report on PaLM-2 was also published, which revealed that while LaMDA used 137 billion parameters and the first version of PaLM 540 billion, PaLM-2 ranges from 14.7 billion to a more modest 100 billion parameters. This is strange for two reasons: on the one hand the numbers are smaller, and on the other hand there are several of them. That is because PaLM-2 comes in different sizes, named Gecko, Otter, Bison and Unicorn.
The architecture thus works with fewer parameters, but with great efficiency. Although exact figures for GPT-4 have not been announced, it is estimated to use around a trillion parameters.
In performance testing, PaLM-2 proved competitive with the larger PaLM and GPT-4 models, reportedly even outperforming GPT-4 on the GSM8K mathematical reasoning benchmark. The latter is just the cherry on the cake, because the new model also embodies Google’s artificial intelligence strategy, in which the models try to gain an edge in computational and power appetite. Among the size variants mentioned above, the Gecko version of PaLM-2, for example, can run offline on mobile devices.
In addition, the model can be adapted not only to the device but also to the application environment: besides the PaLM-2 that knows one hundred languages, there is Med-PaLM-2, specialized for the healthcare domain, and Sec-PaLM-2, specialized for cybersecurity. Even if not everything is going smoothly, it is clear that Google, having consolidated Alphabet’s various scientist-filled research units into a single task force, is simultaneously trying to protect the search business at the center of its business model and to launch attacks in areas like mobile devices.
(Geeky Tools and information)