When OpenAI's GPT-3 model made its debut in May 2020, its performance was widely regarded as state of the art. Capable of generating text indistinguishable from human prose, GPT-3 set a new standard in deep learning. But what a difference a year makes. Researchers at the Beijing Academy of Artificial Intelligence (BAAI) announced Tuesday the release of their own generative deep learning model, Wu Dao, a mammoth AI apparently capable of doing everything GPT-3 can do, and more.
First off, Wu Dao is flat-out enormous. It was trained on 1.75 trillion parameters (essentially, the model's self-selected coefficients), a full ten times larger than the 175 billion GPT-3 was trained on and 150 billion parameters more than Google's Switch Transformer.
To train a model on this many parameters, and to do it quickly (Wu Dao 2.0 arrived just three months after version 1.0's release in March), the BAAI researchers first developed an open-source learning system akin to Google's Mixture of Experts, dubbed FastMoE. This system, which runs on top of PyTorch, allowed the model to be trained both on clusters of supercomputers and on conventional GPUs. That gives FastMoE more flexibility than Google's system: since FastMoE doesn't require proprietary hardware like Google's TPUs, it can run on off-the-shelf hardware, supercomputing clusters notwithstanding.
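The core trick behind mixture-of-experts systems like FastMoE is that a small gating network routes each input to only a few "expert" sub-networks, so the model's total parameter count can grow enormously while the compute per input stays modest. The sketch below is not FastMoE's actual API, just a minimal toy illustration of that top-k routing idea; the class name, sizes, and linear "experts" are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

class TinyMoE:
    """Toy mixture-of-experts layer (hypothetical, not FastMoE's API).

    A gate scores every expert for a given input, but only the top-k
    experts actually run -- the source of MoE's compute savings when
    the expert count (and thus the parameter count) is huge.
    """

    def __init__(self, dim, n_experts, top_k=2):
        self.top_k = top_k
        # Each "expert" here is just a linear map, for simplicity.
        self.experts = [rng.normal(size=(dim, dim)) / np.sqrt(dim)
                        for _ in range(n_experts)]
        self.gate = rng.normal(size=(dim, n_experts)) / np.sqrt(dim)

    def forward(self, x):
        logits = x @ self.gate                   # one score per expert
        top = np.argsort(logits)[-self.top_k:]   # indices of top-k experts
        weights = np.exp(logits[top])
        weights /= weights.sum()                 # softmax over the selected few
        # Only the selected experts compute anything; the rest are skipped.
        return sum(w * (x @ self.experts[i]) for w, i in zip(weights, top))

moe = TinyMoE(dim=8, n_experts=4, top_k=2)
y = moe.forward(rng.normal(size=8))
print(y.shape)  # the layer preserves the input dimension
```

In a real MoE system the experts live on different devices and the router also has to balance load across them; this sketch omits all of that to show only the routing itself.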
With all that computing power comes a whole lot of capabilities. Unlike most deep learning models, which perform a single task (write copy, generate deepfakes, recognize faces, win at Go), Wu Dao is multi-modal, similar in theory to Facebook's anti-hatespeech AI or Google's recently released MUM. BAAI researchers demonstrated Wu Dao's abilities to perform natural language processing, text generation, image recognition, and image generation tasks during the lab's annual conference on Tuesday. The model can not only write essays, poems and couplets in traditional Chinese, it can both generate alt text based on a static image and generate quasi-photorealistic images based on natural language descriptions. Wu Dao also showed off its ability to power virtual idols (with a little help from Microsoft spin-off XiaoIce) and predict the 3D structures of proteins like AlphaFold does.
“The way to general artificial intelligence is big models and big computers,” said Dr. Zhang Hongjiang, chairman of BAAI, during the conference Tuesday. “What we are building is a power plant for the future of AI. With mega data, mega computing power, and mega models, we can transform data to fuel the AI applications of the future.”
All products recommended by Engadget are selected by our editorial team, independent of our parent company. Some of our stories include affiliate links. If you buy something through one of these links, we may earn an affiliate commission.