After a long wait, Meta AI features are finally available on Ray-Ban Meta smart glasses for users in the United States and Canada; for now, the rollout is limited to those two regions. And while the Meta AI tools are no longer locked behind an exclusive beta program, Meta cautions in its official blog post that they remain in beta, so users may still run into reliability and accuracy issues.

Even if it isn't as polished as hoped, this update is a significant step forward for Meta's smart glasses, finally delivering the AI-driven capabilities Meta CEO Mark Zuckerberg promised when he unveiled them at Meta Connect 2023 last September.

The 'Look and Ask' feature is the key Meta AI function to try. To use it, start a phrase with 'Hey Meta, look and …' and ask the glasses a question about something in your field of view. For example: '… tell me about this animal,' '… tell me about this building,' or '… tell me what I can make for dinner with these ingredients.' The glasses combine your command with an image captured by the built-in camera to search for an answer, drawing on the data Meta AI was trained on as well as information gathered from Google and Bing.

As with any AI-generated response, it's wise to treat Meta AI's answers with a degree of skepticism. AI assistants are prone to hallucinations, which in the context of AI means confidently providing inaccurate information, and Meta's model is no exception. It can be accurate at times, but don't treat its suggestions as infallible.
Meta AI Features Unleashed on Ray-Ban Smart Glasses