What's the transformer machine learning model? And why should you care?
Posted: May 5, 2022 at 1:44 am
This article is part of Demystifying AI, a series of posts that (try to) disambiguate the jargon and myths surrounding AI. (In partnership with Paperspace)
In recent years, the transformer model has become one of the main highlights of advances in deep learning and deep neural networks. It is mainly used for advanced applications in natural language processing. Google is using it to enhance its search engine results. OpenAI has used transformers to create its famous GPT-2 and GPT-3 models.
Since its debut in 2017, the transformer architecture has evolved and branched out into many different variants, expanding beyond language tasks into other areas. Transformers have been used for time series forecasting. They are the key innovation behind AlphaFold, DeepMind's protein structure prediction model. Codex, OpenAI's source code generation model, is based on transformers. More recently, transformers have found their way into computer vision, where they are slowly replacing convolutional neural networks (CNNs) in many complicated tasks.
Researchers are still exploring ways to improve transformers and use them in new applications. Here is a brief explainer about what makes transformers exciting and how they work.
The classic feed-forward neural network maps each input to an output independently and is not designed to keep track of sequential data. This works for tasks such as classifying images but fails on sequential data such as text. A machine learning model that processes text must not only process every word but also take into consideration how words come in sequences and relate to each other. The meaning of words can change depending on the words that come before and after them in a sentence.
Before transformers, recurrent neural networks (RNN) were the go-to solution for natural language processing. When provided with a sequence of words, an RNN processes the first word and feeds back the result into the layer that processes the next word. This enables it to keep track of the entire sentence instead of processing each word separately.
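To make the recurrence concrete, here is a minimal sketch in NumPy (the weights and embeddings below are random stand-ins; a real RNN learns its weights during training):

```python
import numpy as np

d_in, d_hidden = 300, 128                 # toy sizes: embedding dim, hidden dim
W_x = np.random.randn(d_hidden, d_in) * 0.01
W_h = np.random.randn(d_hidden, d_hidden) * 0.01
sentence_embeddings = np.random.randn(5, d_in)  # stand-in for 5 word vectors

h = np.zeros(d_hidden)                    # hidden state: the sentence so far
for x in sentence_embeddings:             # one word at a time, strictly in order
    h = np.tanh(W_x @ x + W_h @ h)        # mix the current word with the history
# h now summarizes the sequence, but every step had to wait for the previous one.
```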
Recurrent neural nets had disadvantages that limited their usefulness. First, they were very slow. Since they had to process data sequentially, they could not take advantage of parallel computing hardware and graphics processing units (GPUs) in training and inference. Second, they could not handle long sequences of text. As the RNN got deeper into a text excerpt, the effects of the first words of the sentence gradually faded. This problem, known as the vanishing gradient problem, became especially severe when two linked words were very far apart in the text. And third, they only captured the relations between a word and the words that came before it. In reality, the meaning of words depends on the words that come both before and after them.
Long short-term memory (LSTM) networks, the successor to RNNs, solved the vanishing gradient problem to some degree and could handle longer sequences of text. But LSTMs were even slower to train than RNNs and still couldn't take full advantage of parallel computing. They still relied on the serial processing of text sequences.
Transformers, introduced in the 2017 paper "Attention Is All You Need," made two key contributions. First, they made it possible to process entire sequences in parallel, scaling the speed and capacity of sequential deep learning models to unprecedented rates. And second, they introduced attention mechanisms that made it possible to track the relations between words across very long text sequences in both forward and reverse directions.
Before we discuss how the transformer model works, it is worth discussing the types of problems that sequential neural networks solve.
A vector-to-sequence model takes a single input, such as an image, and produces a sequence of data, such as a description.
A sequence-to-vector model takes a sequence as input, such as a product review or a social media post, and outputs a single value, such as a sentiment score.
A sequence-to-sequence model takes a sequence as input, such as an English sentence, and outputs another sequence, such as the French translation of the sentence.
Despite their differences, all these types of models have one thing in common. They learn representations. The job of a neural network is to transform one type of data into another. During training, the hidden layers of the neural network (the layers that stand between the input and output) tune their parameters in a way that best represents the features of the input data type and maps it to the output.
The original transformer was designed as a sequence-to-sequence (seq2seq) model for machine translation (of course, seq2seq models are not limited to translation tasks). It is composed of an encoder module that compresses an input string from the source language into a vector that represents the words and their relations to each other. The decoder module transforms the encoded vector into a string of text in the destination language.
The input text must be processed and transformed into a unified format before being fed to the transformer. First, the text goes through a tokenizer, which breaks it down into chunks of characters that can be processed separately. The tokenization algorithm can depend on the application. In most cases, every word and punctuation mark roughly counts as one token. Some suffixes and prefixes count as separate tokens (e.g., "ize," "ly," and "pre"). The tokenizer produces a list of numbers that represent the token IDs of the input text.
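As a concrete illustration, here is a minimal sketch using the Hugging Face transformers library; the library and model name are illustrative choices, not something the article prescribes:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoded = tokenizer("Transformers process text in parallel.")

print(encoded["input_ids"])  # a list of token IDs, e.g. [101, ..., 102]
print(tokenizer.convert_ids_to_tokens(encoded["input_ids"]))
# Roughly: ['[CLS]', 'transformers', 'process', 'text', 'in', 'parallel', '.', '[SEP]']
```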
The tokens are then converted into word embeddings. A word embedding is a vector that tries to capture the value of words in a multi-dimensional space. For example, the words "cat" and "dog" can have similar values across some dimensions because they are both used in sentences that are about animals and house pets. However, "cat" is closer to "lion" than "wolf" across some other dimension that separates felines from canids. Similarly, "Paris" and "London" might be close to each other because they are both cities. However, "London" is closer to "England" and "Paris" to "France" on a dimension that separates countries. Word embeddings usually have hundreds of dimensions.
Word embeddings are created by embedding models, which are trained separately from the transformer. There are several pre-trained embedding models that are used for language tasks.
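As a toy illustration of this geometry, the sketch below measures how similar two embeddings are with cosine similarity; the vectors are invented for illustration, and real embeddings have hundreds of dimensions:

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors; closer to 1.0 means more similar.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Made-up 4-dimensional "embeddings" for illustration only.
cat   = np.array([0.9, 0.8, 0.1, 0.2])
dog   = np.array([0.8, 0.9, 0.2, 0.1])
paris = np.array([0.1, 0.2, 0.9, 0.8])

print(cosine_similarity(cat, dog))    # high: both are house pets
print(cosine_similarity(cat, paris))  # low: little in common
```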
Once the sentence is transformed into a list of word embeddings, it is fed into the transformer's encoder module. Unlike RNN and LSTM models, the transformer does not receive one input at a time. It can receive an entire sentence's worth of embedding values and process them in parallel. This makes transformers more compute-efficient than their predecessors and also enables them to examine the context of the text in both the forward and backward directions.
To preserve the sequential nature of the words in the sentence, the transformer applies positional encoding, which basically means that it modifies the values of each embedding vector to represent its location in the text.
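One common choice, used in the original transformer paper, is sinusoidal positional encoding, which adds a position-dependent pattern of sines and cosines to each embedding. A minimal NumPy sketch:

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # Sinusoidal encoding from "Attention Is All You Need":
    # PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    # PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    positions = np.arange(seq_len)[:, None]        # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]       # (1, d_model / 2)
    angles = positions / np.power(10000, dims / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

embeddings = np.random.randn(10, 512)              # 10 tokens, 512-dim embeddings
embeddings = embeddings + positional_encoding(10, 512)  # inject position info
```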
Next, the input is passed to the first encoder block, which processes it through an attention layer. The attention layer tries to capture the relations between the words in the sentence. For example, consider the sentence "The big black cat crossed the road after it dropped a bottle on its side." Here, the model must associate "it" with "cat" and "its" with "bottle." It should also establish other associations, such as "big" and "cat" or "crossed" and "cat." In other words, the attention layer receives a list of word embeddings that represent the values of individual words and produces a list of vectors that represent both individual words and their relations to each other. The attention layer contains multiple attention heads, each of which can capture different kinds of relations between words.
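At the core of each attention head is scaled dot-product attention. The sketch below shows the computation in NumPy, with random stand-ins for the queries, keys, and values; the learned projection matrices that produce them are omitted for brevity:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # how strongly each word attends to each other word
    weights = softmax(scores)         # each row sums to 1
    return weights @ V                # weighted mix of value vectors

# 5 tokens with 64-dim queries, keys, and values (toy sizes)
Q = np.random.randn(5, 64)
K = np.random.randn(5, 64)
V = np.random.randn(5, 64)
out = scaled_dot_product_attention(Q, K, V)  # (5, 64): context-aware vectors
```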
The output of the attention layer is fed to a feed-forward neural network that transforms it into a vector representation and sends it to the next attention layer. Transformers contain several blocks of attention and feed-forward layers to gradually capture more complicated relationships.
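Putting attention and the feed-forward network together, one encoder block can be sketched in PyTorch roughly as follows; this is a simplified sketch, and real implementations add details such as dropout:

```python
import torch
import torch.nn as nn

class EncoderBlock(nn.Module):
    # One transformer encoder block: multi-head self-attention plus a
    # feed-forward network, each wrapped in a residual connection and
    # layer normalization.
    def __init__(self, d_model=512, n_heads=8, d_ff=2048):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model)
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):
        attn_out, _ = self.attn(x, x, x)   # self-attention: Q = K = V = x
        x = self.norm1(x + attn_out)       # residual + normalization
        x = self.norm2(x + self.ff(x))
        return x

x = torch.randn(1, 10, 512)                # batch of 1 sentence, 10 tokens
print(EncoderBlock()(x).shape)             # torch.Size([1, 10, 512])
```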
The task of the decoder module is to translate the encoder's attention vector into the output data (e.g., the translated version of the input text). During the training phase, the decoder has access to both the attention vector produced by the encoder and the expected outcome (e.g., the translated string).
The decoder uses the same tokenization, word embedding, and attention mechanism to process the expected outcome and create attention vectors. It then passes this attention vector to an attention layer that also receives the encoder's output, establishing relations between the input and output values. In the translation application, this is the part where the words from the source and destination languages are mapped to each other. As in the encoder module, the decoder's attention vector is passed through a feed-forward layer. Its result is then mapped to a very large vector the size of the target vocabulary (in the case of language translation, this can span tens of thousands of words).
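This encoder-decoder attention step can be sketched with the same machinery, except that the queries come from the decoder while the keys and values come from the encoder's output; a hedged sketch, not the exact layout of any particular implementation:

```python
import torch
import torch.nn as nn

cross_attn = nn.MultiheadAttention(embed_dim=512, num_heads=8, batch_first=True)

enc_out = torch.randn(1, 10, 512)   # encoder output: 10 source tokens
dec_x   = torch.randn(1, 7, 512)    # decoder state: 7 target tokens so far

# Queries come from the decoder; keys and values come from the encoder,
# which is what lets the decoder look back at the source sentence.
out, weights = cross_attn(query=dec_x, key=enc_out, value=enc_out)
print(out.shape)       # torch.Size([1, 7, 512])
print(weights.shape)   # torch.Size([1, 7, 10]): target-to-source attention
```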
During training, the transformer is provided with a very large corpus of paired examples (e.g., English sentences and their corresponding French translations). The encoder module receives and processes the full input string. The decoder, however, receives a masked version of the output string, one word at a time, and tries to establish the mappings between the encoded attention vector and the expected outcome. The decoder tries to predict the next word and makes corrections based on the difference between its output and the expected outcome. This feedback enables the transformer to modify the parameters of the encoder and decoder and gradually create the right mappings between the input and output languages.
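The masking mentioned above is typically implemented as a causal (upper-triangular) mask that hides future positions from the decoder; a minimal sketch:

```python
import torch

def causal_mask(n):
    # True marks positions the decoder is NOT allowed to attend to:
    # everything above the diagonal, i.e., future tokens.
    return torch.triu(torch.ones(n, n), diagonal=1).bool()

print(causal_mask(4))
# tensor([[False,  True,  True,  True],
#         [False, False,  True,  True],
#         [False, False, False,  True],
#         [False, False, False, False]])
```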
The more training data and parameters the transformer has, the more capacity it gains to maintain coherence and consistency across long sequences of text.
In the machine translation example that we examined above, the encoder module of the transformer learns the relations between English words and sentences, and the decoder learns the mappings between English and French.
But not all transformer applications require both the encoder and decoder module. For example, the GPT family of large language models uses stacks of decoder modules to generate text. BERT, another variation of the transformer model developed by researchers at Google, only uses encoder modules.
The advantage of some of these architectures is that they can be trained through self-supervised or unsupervised methods. BERT, for example, does much of its training by taking large corpora of unlabeled text, masking parts of it, and trying to predict the missing parts. It then tunes its parameters based on how close its predictions were to the actual data. By continuously going through this process, BERT captures the statistical relations between different words in different contexts. After this pretraining phase, BERT can be fine-tuned for a downstream task such as question answering, text summarization, or sentiment analysis by training it on a small number of labeled examples.
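To get a feel for what this masked-word pretraining learns, here is a minimal sketch using the Hugging Face pipeline API; the model name is an illustrative choice, and the exact predictions and scores will vary:

```python
from transformers import pipeline

# Ask a pretrained BERT to fill in a masked word.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for prediction in fill_mask("The big black [MASK] crossed the road."):
    print(prediction["token_str"], round(prediction["score"], 3))
# Plausible completions include words like "cat" or "dog".
```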
Using unsupervised and self-supervised pretraining reduces the manual effort required to annotate training data.
A lot more can be said about transformers and the new applications they are unlocking, but that is beyond the scope of this article. Researchers are still finding ways to squeeze more out of transformers.
Transformers have also created discussions about language understanding and artificial general intelligence. What is clear is that transformers, like other neural networks, are statistical models that capture regularities in data in clever and complicated ways. They do not understand language in the way that humans do. But they are exciting and useful nonetheless and have a lot to offer.
This article was originally written by Ben Dickson and published on TechTalks, a publication that examines trends in technology, how they affect the way we live and do business, and the problems they solve. But we also discuss the evil side of technology, the darker implications of new tech, and what we need to look out for. You can read the original article here.