Apple Intelligence debuts new features that transform the user experience
New Apple Intelligence features arrive on iPhone and across the rest of the Apple ecosystem
Apple has just released an update that brings a series of innovative features to the table under the Apple Intelligence umbrella, available today on devices with iOS 26, iPadOS 26, macOS Tahoe 26, watchOS 26 and visionOS 26.
These new features are designed to improve how users interact with their devices, covering everything from multilingual communication to custom image creation to smart integrations that expand automation and health capabilities. The best part: many of these features run directly on the device, ensuring privacy and speed.
Communication without barriers: Live Translation built into Apple devices
One of the stars of this launch is Live Translation, which allows users to overcome language barriers in real time. Natively integrated into Messages, FaceTime, and the Phone app, it uses Apple Intelligence to translate conversations instantly, whether the user is typing or speaking on a call.
In addition, with AirPods Pro 3 and AirPods 4 equipped with active noise cancellation, translation can be activated by simply pressing both earbuds or using Siri with a command as simple as “Siri, start Live Translation.” This allows you to hear the translation in your preferred language and reduces the speaker's volume to make it easier to concentrate.
This feature initially supports several languages, including English, French, German, Brazilian Portuguese, Spanish, Italian, and Japanese, with plans to expand to more languages, including Mandarin Chinese, by the end of the year. It's important to note that all conversation processing happens on the device, ensuring that privacy is maintained, a sensitive issue when we talk about artificial intelligence and real-time translation.
Vision and creativity powered by Apple Intelligence
Another striking feature is Visual Intelligence, which allows users to interact more intelligently with what appears on the screen. For example, after pressing the screenshot button combination, you can select an object and search for similar items on the web or in apps like eBay, Poshmark, or Etsy.
This makes it easier to shop and find information without switching apps. Plus, with ChatGPT integration, users can naturally ask questions about what they see on screen and get quick answers or translate text with a simple tap.
Creativity also has its place with new features in Genmoji and Image Playground. You can now combine emoji and instructions to create cool new Genmojis, personalizing them with updated facial expressions or hairstyles to better reflect family and friends. In Image Playground, using ChatGPT helps generate images in a variety of styles like watercolor or oil painting, and an “Any Style” option gives complete artistic freedom based on a user’s description or photograph.
Health and Productivity: Apple Intelligence Powers Motivational Fitness and Smart Shortcuts
Apple Intelligence also extends to exercise with Workout Buddy, a feature that offers personalized, spoken motivation during workouts. It analyzes data like heart rate, distance, and achievements in real time to generate motivational feedback, delivered in a generative voice built from Fitness+ trainers. It's available for a variety of workout types, from outdoor running to strength training, and works with Apple Watch and AirPods paired with an iPhone, ensuring an immersive and private experience.
Finally, automations in the Shortcuts app just got smarter thanks to integration with Apple Intelligence. Users can create workflows that include text summaries, image generation, or even comparing audio transcripts with written notes, all while maintaining privacy, with processing handled on device or in Apple's Private Cloud Compute.
With Apple also opening its on-device models to developers, third parties are already incorporating them to enrich their apps with intelligent offline features, such as automatic categorization in to-do apps or script generation for video editors.