Foundation models are large-parameter models trained on a broad range of data in a self-supervised manner. Most are based on transformer or diffusion deep neural network architectures, and they may become multimodal in the near future. They are called foundation models because of their critical importance and applicability to a wide variety of downstream use cases; this broad applicability stems from their pretraining and versatility.
Read: The 2024 Hype Cycle for Artificial Intelligence
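To make the idea of downstream reuse concrete, here is a minimal sketch in Python. It assumes the open-source Hugging Face transformers library and two illustrative pretrained model names (distilbert-base-uncased and its SST-2 fine-tuned variant), none of which come from this article; the point is only that a single self-supervised pretrained model can serve multiple downstream tasks with little or no task-specific code.

```python
# Minimal sketch: reusing a self-supervised pretrained model for two
# downstream tasks. Assumes the Hugging Face `transformers` library;
# the model names below are illustrative choices, not part of the article.
from transformers import pipeline

# Downstream use 1: the pretrained encoder as a general-purpose
# text feature extractor (returns token-level embeddings).
extractor = pipeline("feature-extraction", model="distilbert-base-uncased")
embeddings = extractor("Foundation models are pretrained on broad data.")

# Downstream use 2: a variant of the same base model, fine-tuned for
# sentiment classification, applied with no further training code here.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("Foundation models adapt well to new downstream tasks."))
```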