00:12 — This episode is brought to you by the Cloud Wars Expo. This in-person event will be held June 28th to 30th at the Moscone Center in San Francisco, California.
00:37 — When people hear the word ‘transformer,’ they tend to think of a component of the electricity grid, or of the Transformers cartoon show, which was later turned into hit movies.
01:05 — There’s a new transformer, which Aaron previously discussed with Aleksandra Przegalinska in a podcast episode on collaborative AI.
01:24 — These AI transformers are made up of a network of nodes that can learn how to do tasks by training on existing data.
01:40 — The term first appeared in a 2017 paper by Google researchers, posted on arXiv, titled “Attention Is All You Need.”
01:52 — AI transformers function like building blocks, with each block aware of the blocks around it. As the model processes text, every block can assess the representations learned by its neighbors and pick up on what they have found.
02:34 — Researchers refer to this mechanism as ‘self-attention.’ When testing AI transformers, researchers measured accuracy above 90 percent, exceeding their expectations.
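To make the self-attention idea mentioned above concrete, here is a minimal sketch of scaled dot-product self-attention in NumPy. This is illustrative only: real transformers use learned query, key, and value projections and multiple attention heads, while this toy version uses the raw input for all three roles.

```python
import numpy as np

def self_attention(x):
    """Toy scaled dot-product self-attention over a sequence x of shape (seq_len, d).

    Illustrative sketch only: the input serves as query, key, and value;
    production transformers learn separate projections for each.
    """
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)  # pairwise similarity between every position
    # Softmax over each row so every position's weights sum to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output position is a weighted mix of all positions' representations
    return weights @ x

# A tiny three-token sequence with two features per token (hypothetical values)
seq = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
out = self_attention(seq)
```

Because every position attends to every other position, each output row blends information from the whole sequence, which is what lets a transformer "pick up on" its neighbors.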
02:51 — AI transformers require significant upfront computational power to train. Although this is the next evolution of AI, we certainly have not reached its full potential just yet.
Looking for real-world insights into AI? Subscribe to the Enterprise AI Impact channel: