Meta Launches Standalone AI App With Voice Chat, Personalisation, and Discovery Feed

Meta has taken a major step forward in its AI push with the launch of a standalone Meta AI app, giving users a personalised, conversational digital assistant. Built on Meta’s latest Llama 4 model, the app introduces advanced voice chat, currently available in the US, Canada, Australia, and New Zealand, enabling real-time voice interactions and deep integration across Meta’s platforms.

While the app has also been rolled out in India and other countries, the full suite of advanced features remains exclusive, for now, to the four markets mentioned above.

A Personalised AI Experience

At the core of the Meta AI app is personalisation. It tailors responses to individual user behaviour and preferences, adapting over time. Whether you’re a frequent traveller or have specific interests, the app can remember your context and refine its suggestions accordingly. Integration with the Meta Accounts Center also allows it to draw on your activity across Facebook and Instagram for more relevant responses.

Voice Chat That Feels Natural

A standout feature of the app is its full-duplex voice interaction, enabling fluid, real-time conversations that feel more human. Users can simply tap the microphone icon to speak with Meta AI. This conversational ability is powered by improvements in the Llama 4 model, making responses more contextual, natural, and relevant.

The app also supports image generation and editing, which can be done via text or voice inputs. While the full-duplex speech demo is still being tested, it offers a preview of Meta’s future direction in natural AI interaction.

Explore with the Discover Feed

Another unique feature is the Discover feed, where users can see how others are interacting with Meta AI. Prompts can be shared, remixed, and explored—though nothing is made public without explicit user permission.

AI on the Go: Ray-Ban Smart Glasses Integration

The Meta AI app is also the companion platform for Ray-Ban Meta smart glasses. Users can start conversations via the glasses and continue them later on the app or desktop, promoting a seamless experience across devices.

Web Enhancements and Productivity Tools

The web version of Meta AI has also received upgrades, including voice controls, improved image generation, and new tools for document creation and analysis. Users can now create, import, and export files, although some features are still in development.


From helping with daily tasks to enabling natural conversations, Meta’s new AI app is shaping up to be a comprehensive digital assistant. As Meta puts it, “Voice is the most intuitive way to interact with Meta AI,” and the new app marks a big leap toward that vision.
