Stepping Up the Intelligence to Rethink the Way You Interact with Your Device

Apple® has officially announced the launch of new Apple Intelligence™ features, each one designed to elevate the user experience across iPhone®, iPad®, Mac®, Apple Watch®, and Apple Vision Pro™.

This new version of Apple Intelligence arrives on the scene bearing new pathways for users to interact with features like Live Translation, while also letting them express themselves better through enhancements to Image Playground™ and Genmoji™.

On top of that, users can also leverage Shortcuts to tap into Apple Intelligence directly. Complementing this is a new facility for developers to access the on-device large language model at the core of Apple Intelligence.
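For developers wondering what that access might look like in practice, here is a minimal Swift sketch built around Apple's announced Foundation Models framework. The LanguageModelSession and SystemLanguageModel names come from the framework as presented; the prompt, function name, and fallback behavior are purely illustrative assumptions.

```swift
import FoundationModels

// Minimal sketch: prompting the on-device model behind Apple Intelligence.
// Hypothetical helper; error handling is simplified for illustration.
func suggestWorkoutName(for activity: String) async throws -> String {
    // Confirm the on-device model is available before prompting.
    guard case .available = SystemLanguageModel.default.availability else {
        return activity // Fall back to the plain activity name.
    }

    // A session keeps context across turns; instructions steer tone and length.
    let session = LanguageModelSession(
        instructions: "You name workouts in five words or fewer."
    )

    // The prompt is processed entirely on device; nothing leaves the phone.
    let response = try await session.respond(
        to: "Suggest an upbeat name for this workout: \(activity)"
    )
    return response.content
}
```

Because the model runs locally, calls like this can work offline and avoid per-request costs, which is precisely the design choice Apple is emphasizing here.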

Talking about the given innovations on a deeper level, we begin with the promise of Live Translation, which is designed to help users communicate across languages when messaging or speaking. The stated experience is integrated into Messages, FaceTime®, and Phone, enabled by Apple-built models that run entirely on device to keep users’ personal conversations secure.

In Messages, in particular, Live Translation can automatically translate messages. To give you an example, if a user is making plans with new friends while traveling abroad, their message can be translated as they type and delivered in the recipient’s preferred language.

Turning our attention to FaceTime calls, users can now follow along with translated live captions, all while still hearing the speaker’s voice. Not just that, on a phone call, the translation is spoken aloud throughout the conversation.
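While Live Translation itself is a system feature rather than an API, Apple’s public Translation framework gives developers the same kind of on-device translation inside their own apps. Below is a minimal SwiftUI sketch, assuming the translationTask modifier and TranslationSession API; the fixed English-to-Spanish pairing and the view name are illustrative, and this is not the Messages feature itself.

```swift
import SwiftUI
import Translation

// Hypothetical message row that shows an on-device translation of its text.
struct MessageRow: View {
    let text: String
    @State private var translated: String?

    var body: some View {
        Text(translated ?? text)
            // Starts a translation session for the given language pair;
            // language assets download once, then translation runs on device.
            .translationTask(
                TranslationSession.Configuration(
                    source: Locale.Language(identifier: "en"),
                    target: Locale.Language(identifier: "es")
                )
            ) { session in
                translated = try? await session.translate(text).targetText
            }
    }
}
```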

Next up, we must dig into the enhanced visual intelligence experience, which is now available right on a user’s iPhone screen, helping them search and take action on anything they’re viewing across their apps.

You see, visual intelligence, in its current form, already helps users learn about objects and places around them using just their iPhone camera. Building on that, users can now ask ChatGPT questions about what they’re looking at on their screen to learn more, as well as search Google, Etsy, or other supported apps to find similar images and products. If there’s an object a user is especially interested in, they can highlight it to search for that specific item or similar objects online.

Apple’s visual intelligence can be accessed by simply pressing the same buttons used to take a screenshot.

The expansion under focus here also marks Apple Intelligence’s foray into fitness. Named Workout Buddy, this feature draws on a user’s workout data and fitness history to generate personalized, motivational insights during their session.

In essence, Workout Buddy can analyze data from a user’s current workout alongside their fitness history, based on metrics like heart rate, pace, distance, Activity rings, personal fitness milestones, and more. To make the proposition even better, Apple is also bringing a new text-to-speech model that translates these insights into a dynamic generative voice.

At launch, Workout Buddy will be available on Apple Watch with Bluetooth headphones, requiring an Apple Intelligence-supported iPhone nearby. It will also be compatible with some of the most popular workout types, including Outdoor and Indoor Run, Outdoor and Indoor Walk, Outdoor Cycle, HIIT, as well as Functional and Traditional Strength Training.

Among other things, we ought to mention how, from here onwards, Apple’s Genmoji and Image Playground will provide users with even more ways to express themselves. Apart from turning a text description into a Genmoji, users can now also mix emojis together and combine them with descriptions to create something entirely new.
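On the developer side, apps can already present Image Playground through the ImagePlayground framework. A minimal sketch, assuming the imagePlaygroundSheet SwiftUI modifier; the concept string and view name are illustrative:

```swift
import SwiftUI
import ImagePlayground

// Hypothetical view that lets a user generate an image from a text concept.
struct StickerMaker: View {
    @State private var showPlayground = false
    @State private var imageURL: URL?

    var body: some View {
        Button("Create Image") { showPlayground = true }
            // Presents the system Image Playground sheet seeded with a
            // text concept; generation happens on device.
            .imagePlaygroundSheet(
                isPresented: $showPlayground,
                concept: "a sloth wearing a party hat"
            ) { url in
                imageURL = url // File URL of the generated image.
            }
    }
}
```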

“Last year, we took the first steps on a journey to bring users intelligence that’s helpful, relevant, easy to use, and right where users need it, all while protecting their privacy. Now, the models that power Apple Intelligence are becoming more capable and efficient, and we’re integrating features in even more places across each of our operating systems,” said Craig Federighi, Apple’s senior vice president, Software Engineering. “We’re also taking the huge step of giving developers direct access to the on-device foundation model powering Apple Intelligence, allowing them to tap into intelligence that is powerful, fast, built with privacy.”
