Omi is an AI wearable that listens, summarizes, and attempts to understand your thoughts. It uses a brain-computer interface and has an app store for expanded functionality. Is this the future of AI companions?
In an era where AI assistants are becoming increasingly prevalent, a new wearable device called Omi is attempting to push the boundaries of how we interact with technology. Developed by Nik Shevchenko, Omi is not just another AI assistant; it's designed to listen, summarize, and even attempt to understand your thoughts through a brain-computer interface. This $89 device, which can be worn on a lanyard or attached to the temple, is creating a buzz with its capabilities and ambitions.
Omi's primary function is to be an always-listening device. Unlike traditional assistants that require a wake word, Omi is constantly active, ready to capture and process conversations. This allows it to summarize meetings, provide action items, and offer real-time information. For example, during a conversation, Shevchenko casually wondered about the price of Bitcoin and received a notification from the Omi companion app with the answer moments later. The device also has a feature called “Personas,” which lets users create AI bots based on social media personas. Shevchenko has been using an AI Elon Musk to help him with his daily tasks and provide feedback. According to Shevchenko, “It helps me to understand what I should be working on tomorrow. Or when I’m talking to someone and I don’t know an answer to the question, it will give me a small nudge — it sometimes tells me I’m wrong!”
What sets Omi apart is its attempt to incorporate a brain-computer interface. The device uses a single electrode to detect when the user is addressing it directly. As Shevchenko demonstrated, by focusing his attention while wearing the device on his temple, Omi was able to answer his question about the reputation of The Verge as a news source.
While this technology is still in its early stages and somewhat fragile, it suggests a future where AI devices can understand our intentions without explicit verbal commands. Shevchenko envisions a future where Omi can save and understand our thoughts, a concept he acknowledges is still science fiction but believes might be possible in a couple of years.
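The article doesn't describe how Omi's single electrode decides the user is addressing the device, so here is only a toy sketch of the general idea of signal gating: compare the short-term power of a one-channel signal against a resting baseline and "wake" when it spikes. The function names, threshold, and sample values are all illustrative assumptions, not Omi's actual algorithm.

```python
# Toy illustration (NOT Omi's actual method): gate an assistant on a
# single-channel signal by comparing window power to a resting baseline.

def band_power(samples):
    """Mean squared amplitude of a window of samples."""
    return sum(x * x for x in samples) / len(samples)

def is_addressed(window, baseline_power, ratio=2.0):
    """Treat the device as 'addressed' when the window's power exceeds
    the resting baseline by a fixed ratio (an arbitrary choice here)."""
    return band_power(window) > ratio * baseline_power

resting = [0.1, -0.2, 0.15, -0.1, 0.05]   # made-up idle signal
focused = [0.8, -0.9, 0.7, -0.85, 0.75]   # made-up attention burst
baseline = band_power(resting)
print(is_addressed(resting, baseline))  # False
print(is_addressed(focused, baseline))  # True
```

A real system would need far more than a power threshold (filtering, artifact rejection, per-user calibration), which is consistent with the article's note that the feature is still fragile.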
Omi’s technology is primarily based on a microphone and powered by AI models from OpenAI and Meta. The device's code is open source, and it is being positioned as a broad platform: an app store with 250 apps lets developers pipe Omi's audio input into services like Zapier and Google Drive, expanding its functionality. As Shevchenko put it, “All of Omi’s code is open source, and there are already 250 apps in the store. Omi’s plan is to be a big, broad platform, rather than a specific device or app.”
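The article doesn't document Omi's app API, but the platform model it describes (apps that consume transcribed audio and forward results elsewhere) can be sketched in a few lines. Everything below, including the `Segment` shape and the keyword list, is a hypothetical illustration, not Omi's real interface.

```python
# Hypothetical sketch of an Omi-style app: scan transcript segments
# (data shape assumed, not from any real Omi API) for action items
# that could then be forwarded to a service like Zapier.

from dataclasses import dataclass

@dataclass
class Segment:
    speaker: str
    text: str

# Naive trigger phrases; a real app would use an LLM for this.
ACTION_KEYWORDS = ("todo", "action item", "follow up", "remind me")

def extract_action_items(segments):
    """Return transcript lines that look like action items."""
    return [
        f"{s.speaker}: {s.text}"
        for s in segments
        if any(k in s.text.lower() for k in ACTION_KEYWORDS)
    ]

segments = [
    Segment("Nik", "Remind me to check the Bitcoin price tomorrow."),
    Segment("Guest", "The weather is nice today."),
    Segment("Nik", "Action item: publish the meeting notes."),
]
print(extract_action_items(segments))
```

The design point is that the wearable itself stays thin: it streams audio and transcripts, and third-party apps like this one add the behavior.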
Omi is not without its competitors. It shares similarities with devices like the Limitless Pendant and Friend. In fact, Omi was originally called Friend, but Shevchenko changed the name due to a dispute with Friend's CEO, Avi Schiffmann. Despite its early-stage rough edges, Shevchenko is confident that Omi can improve upon these other devices, aiming to provide a more comprehensive and versatile AI companion experience.
To try out Omi, you can visit the official Omi website at omi.me. While the device is not yet widely available, you can explore the platform and potentially join the waitlist for early access. The Omi app is available on both iOS and Android, allowing you to interact with your AI companion and explore its capabilities. The company is also actively engaging with its community through social media channels, where you can stay updated on the latest developments and share your experience.
As AI companions become more integrated into our lives, Omi represents a step toward a more intuitive and personalized interaction with technology. Whether it’s through an always-on microphone or the potential of brain-computer interfaces, Omi is exploring how AI can truly understand and assist us. The device's ability to learn, adapt, and potentially even anticipate our needs could significantly change our daily routines. As the technology develops, it will be interesting to see how Omi and similar devices shape the future of human-AI interaction.
One electrode on the temple? Really? It's like trying to understand a symphony by listening to a single violin string. I appreciate the ambition, but let's not pretend we're deciphering the secrets of the universe just yet. On the other hand, this approach, while rudimentary, could be the start of something truly transformative. Imagine a world where we don't have to translate our thoughts into words for our devices to understand.
Omi, or devices like it, could completely change how we learn. Imagine a student struggling with a complex math problem. Instead of trying to articulate their confusion, the device could sense the specific area of difficulty in their brain waves and provide targeted assistance in real-time. It could also be used to enhance communication for people with disabilities, allowing them to express thoughts and ideas without the need for verbal or physical input. This could also extend to mental health. Omi could potentially pick up subtle changes in brain activity that indicate stress or anxiety, providing timely interventions and support. It's like having a personal mental health assistant, always on standby. And let's not forget about the workplace. Imagine a surgeon being guided by an AI that understands their intentions and anticipates their needs during a complex operation, or an engineer designing a new structure with the AI providing real-time feedback based on their thought process. The possibilities are pretty wild.