Blog
Can AI Learn Physics from Sensor Data?
We are excited to share a milestone in our journey toward developing a physical AI foundation model. In a recent paper by the Archetype AI team, "A Phenomenological AI Foundation Model for Physical Signals," we demonstrate how an AI foundation model can effectively encode and predict physical behaviors and processes it has never encountered before, without being explicitly taught underlying physical principles. Read on to explore our key findings.
Last week, we made waves at TEDAI San Francisco. On Day 1, we premiered our feature video, “A Phenomenological AI Foundation Model for Physical Signals,” and on Day 2, our co-founder, CEO, and CTO, Ivan Poupyrev, took the stage for a panel discussion on the future of embodied AI.
Currently, digital companions are the dominant metaphor for understanding AI systems. However, as the field of generative AI continues to evolve, it's crucial to examine how we frame and comprehend these technologies, because those frames influence how we develop, interact with, and regulate AI. In this blog post, we'll explore different metaphors used in AI products and discuss how they shape our mental models of AI systems.
On October 17, 2024, Infineon and Archetype AI introduced the first-ever foundation model capable of understanding real-time sensor data. They presented a demo at OktoberTech™ Silicon Valley, Infineon's annual technology forum. Read on to learn more about the event and our approach to building a large behavior model that can reason about events in the physical world.
Imagine a world where your car responds not just to what you see but to what every vehicle, traffic light, and smartphone detects. Soarchain is making this a reality by combining their decentralized platform with Archetype AI’s developer API, Newton. Together, we will give drivers and autonomous vehicles a rich real-time understanding of what’s happening on the roads around them.
Implementing AI in industrial settings comes with significant challenges like ensuring employee safety, estimating productivity, and monitoring hazards—all requiring real-time processing. However, sending sensor data to the cloud for analysis introduces latency and security concerns and drives up costs. The solution? Eliminate the cloud. With Archetype AI’s Newton foundation model, AI can run on local machines using a single off-the-shelf GPU, delivering low latency, high security, and reduced costs in environments like manufacturing, logistics, transportation, and construction.
We’re building the first AI foundation model that learns about the physical world directly from sensor data, with the goal of helping humanity understand the complex behavior patterns of the world around us all.
At Archetype, we want to use AI to solve real-world problems by empowering organizations to build for their own use cases. We aren’t building verticalized solutions; instead, we want to give engineers, developers, and companies the AI tools and platform they need to create their own solutions in the physical world.
The renowned tech journalist Steven Levy featured Archetype AI in an exclusive WIRED article. In his piece, Levy explores how our advanced AI models serve as a crucial translation layer between humans and complex sensors, enabling seamless interactions with houses, cars, factories, and more.
In a world where devices constantly collect and transmit data, many organizations struggle to harness its potential. At Archetype AI, we believe AI is the key to transforming this raw data into actionable insights. Join us as we explore how the convergence of sensors and AI can help us better understand the world around us.
Imagine a world where technology could help us make sense of the world's hidden patterns, understand the root causes of problems, and identify solutions.
At Archetype AI, we believe such a world is possible. We’re unveiling a new form of artificial intelligence that takes us a step closer to this reality.