
Four important capabilities of today’s AI

Four capabilities that enable today’s AI to perform: foundation models, transformer architecture, emergent capabilities, and generative AI.

AI can do a lot of things: suggesting the next movie to watch, analyzing data, writing intelligent text from prompts, and much more. Whatever the application, there are four important capabilities that enable today's AI to perform.


Foundation models

A foundation model is any model on which other models are built. Because foundation models have typically learned from extremely large amounts of data available on the internet, they acquire broad, encyclopedic knowledge. Since foundation models encode knowledge of the world, they can provide a valuable starting point for new models and give today's solutions a leg up. There is no reason to reinvent the wheel, and besides, you might not have the computing power to do so anyway.

For many problems, a foundation model's knowledge of the world is good enough to deliver the solution. Yet for other, more complex problems, you need to build on top of this base model, a process called "fine-tuning." In fact, most models today are built on a foundation model, whether a text, audio, or image foundation model, depending on what you are doing. For example, a model designed to identify cars can be fine-tuned to identify trucks, or a language model can be fine-tuned to address particular regulations.
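
As a lightweight illustration, the sketch below fine-tunes a general-purpose pretrained text model for a new classification task, assuming the Hugging Face transformers and datasets libraries; the checkpoint and dataset names are only illustrative stand-ins for whatever foundation model and task-specific data you would actually use.

```python
# Minimal fine-tuning sketch (illustrative names; assumes transformers + datasets are installed)
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Start from a pretrained foundation checkpoint instead of training from scratch.
checkpoint = "distilbert-base-uncased"  # illustrative; any suitable pretrained model
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# A small labeled dataset for the new task (illustrative public dataset).
train_data = load_dataset("imdb", split="train[:2000]")
train_data = train_data.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length"),
    batched=True,
)

# Fine-tuning nudges the pretrained weights toward the task-specific data.
args = TrainingArguments(output_dir="finetuned-model", num_train_epochs=1, per_device_train_batch_size=8)
Trainer(model=model, args=args, train_dataset=train_data).train()
```

The point is that only a small labeled dataset and a short training run are needed, because most of the knowledge already lives in the pretrained weights.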

Transformer architecture

From 2012 onwards, neural networks became very popular. Neural networks are inspired by the human brain as a way to process data and can consist of millions of simple processing nodes that are densely interconnected and work together. Organizations built proprietary neural networks from scratch to solve their problems. But in 2017, a new type of neural network architecture was introduced by researchers at Google to solve translation problems. This network was called the transformer, and it transformed the world of AI, becoming the common architecture not only for natural language but also for image, audio, and video. The transformer architecture defines the structure through which an AI model learns its parameters, or weights. Most of today's AI is built on the transformer architecture.
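
The core building block of the transformer is attention: every token's representation is updated as a weighted mix of all the other tokens, with the weights computed from how well learned query and key vectors match. The NumPy sketch below shows scaled dot-product attention in its simplest form; real models add learned projections, multiple heads, and many stacked layers.

```python
# Scaled dot-product attention, the core of the transformer (simplified sketch)
import numpy as np

def attention(Q, K, V):
    """Mix each token's value vector with every other token's,
    weighted by how strongly their queries and keys match."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # token-to-token similarity
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over tokens
    return weights @ V                             # attention-weighted blend

# Toy example: a "sentence" of 4 tokens, each an 8-dimensional vector
rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 8))
print(attention(tokens, tokens, tokens).shape)     # (4, 8)
```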

Emergent capabilities

When language models are trained with a very large number of weights, they are called large language models (LLMs). When these models become very large, they can exhibit emergent capabilities that were not present in smaller models, such as in-context learning, reasoning, and self-reflection. These emergent capabilities enable more complex problems to be solved and are expanding the art of the possible.
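
In-context learning is the easiest of these to see in action: instead of retraining the model, you show it a few worked examples directly in the prompt and it picks up the pattern. The snippet below simply builds such a few-shot prompt; the example reviews are made up, and the actual call to an LLM API is left out.

```python
# Few-shot (in-context) learning: the "teaching" happens inside the prompt itself.
few_shot_prompt = """Classify the sentiment of each review as Positive or Negative.

Review: "The battery lasts all day and the screen is gorgeous."
Sentiment: Positive

Review: "It stopped working after a week and support never replied."
Sentiment: Negative

Review: "Setup took five minutes and it has worked flawlessly since."
Sentiment:"""

# Sent to a sufficiently large model, this typically completes with "Positive",
# even though the model's weights were never updated for this labeling task.
print(few_shot_prompt)
```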

Generative AI

As the name suggests, this type of AI generates something new based on a prompt. It enables computers to create new content, such as text and images, based on the intent of the user. For example, LLMs can generate articles that most would think a human wrote. LLMs have the potential to profoundly change the field of AI. Generative AI can augment human processes, helping make us more efficient and productive.
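
For a small-scale taste of text generation, the sketch below prompts a pretrained language model through the Hugging Face pipeline API; the model name is illustrative, and production systems would typically rely on far larger models behind an API.

```python
# Generative AI in miniature: continue a prompt with newly generated text.
# (Assumes the transformers library; "gpt2" is an illustrative small model.)
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
prompt = "The three biggest benefits of AI for everyday work are"
result = generator(prompt, max_new_tokens=60, do_sample=True)
print(result[0]["generated_text"])
```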

These are the capabilities that make today's AI so impressive and performant; but as we all know, the AI ecosystem is evolving at a rapid pace. As these techniques are put to use, their limitations will inspire the creation of the next generation of technologies.


Sara Walker

Content Marketing Associate
Sara has a background in numerous word-related fields, including nonprofit communications, literary blogging, community media, English tutoring, and now content marketing. She holds a BA in English from Arizona State University.
