Apps on iOS 16 will be able to perform real-world actions without you having to use your hands.

Thanks to a new capability in iOS 16, apps will be able to trigger real-world actions hands-free. Users could, for example, start playing music simply by walking into a room, or turn on an e-bike for a workout just by getting on it. These hands-free actions could even happen when the user isn't actively using the app at the time, Apple told developers today at its Worldwide Developers Conference (WWDC).

If developers and accessory makers choose to adopt Apple's Nearby Interaction framework, the update could lead to some fascinating use cases in which the iPhone becomes a way to interact with objects in the real world.

During the session, Apple showed how apps can already connect to and exchange data with Bluetooth LE accessories even while running in the background. With iOS 16, however, apps will also be able to start a Nearby Interaction session in the background with a Bluetooth LE accessory that supports Ultra Wideband.
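
The developer-facing piece of this is a new variant of the Nearby Interaction accessory configuration that takes the Bluetooth peripheral's identifier. Below is a minimal Swift sketch of how that flow might look, assuming the accessory delivers its UWB configuration data over a vendor-defined Bluetooth LE characteristic; the manager class is hypothetical, and the initializer labels reflect the iOS 16 additions shown at WWDC, so they should be checked against the shipping SDK.

```swift
import CoreBluetooth
import NearbyInteraction

// Sketch of the iOS 16 background flow for a UWB accessory discovered over Bluetooth LE.
final class AccessoryInteractionManager: NSObject, NISessionDelegate {
    private var session: NISession?

    // `accessoryConfigurationData` is the UWB configuration blob the accessory sends
    // over a vendor-defined Bluetooth LE characteristic (assumed to be read already).
    func startSession(accessoryConfigurationData: Data, peripheral: CBPeripheral) {
        do {
            // New in iOS 16: tying the configuration to the Bluetooth peer identifier
            // is what allows the session to keep running while the app is in the background.
            let configuration = try NINearbyAccessoryConfiguration(
                accessoryData: accessoryConfigurationData,
                bluetoothPeerIdentifier: peripheral.identifier)

            let session = NISession()
            session.delegate = self
            session.run(configuration)
            self.session = session
        } catch {
            print("Invalid accessory configuration data: \(error)")
        }
    }

    // The framework hands back shareable configuration data that must be sent to the
    // accessory (again over Bluetooth LE) so the accessory can start ranging too.
    func session(_ session: NISession,
                 didGenerateShareableConfigurationData shareableConfigurationData: Data,
                 for object: NINearbyObject) {
        // Writing this to the accessory's characteristic is accessory-specific and omitted here.
        print("Send \(shareableConfigurationData.count) bytes back to the accessory")
    }

    // Distance updates keep arriving once the session is configured, so an app could
    // trigger a hands-free action (hypothetical hook) when the accessory is close enough.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let distance = nearbyObjects.first?.distance else { return }
        if distance < 1.0 {
            print("Accessory is within 1 m, trigger the hands-free action")
        }
    }
}
```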

To accommodate these new background sessions, Apple has also updated its specification for accessory makers.

This sets the stage for a future in which the line between apps and the physical world blurs, though whether third-party app and device makers choose to take advantage of the capability remains to be seen.

The new functionality is part of a larger upgrade to Apple's Nearby Interaction framework, which was the topic of discussion during the developer session.

This framework, introduced at WWDC 2020 alongside iOS 14, gives third-party app developers access to the U1, the Ultra Wideband (UWB) chip in the iPhone 11 and later models, as well as in the Apple Watch and in third-party accessories. It's what powers the Precision Finding feature of Apple's AirTag, which lets iPhone users open the Find My app and be guided to their AirTag's precise location with on-screen directional arrows and other cues that indicate how far away the AirTag is, or whether it's on a different floor.
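
For context, here is a rough Swift sketch of the baseline peer-to-peer Nearby Interaction session (available since iOS 14) that this kind of guidance is built on. It assumes the peer's discovery token has already been exchanged over some other channel, such as a local network connection, and the class and angle math are illustrative rather than Apple's own Find My implementation.

```swift
import Foundation
import NearbyInteraction
import simd

// Sketch of a baseline peer-to-peer session: run against another device's discovery
// token and read distance/direction from the updates to drive on-screen guidance.
final class FindingSession: NSObject, NISessionDelegate {
    private let session = NISession()

    // `peerToken` is the other device's NIDiscoveryToken, assumed to have been
    // exchanged already (for example, over a local network connection).
    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let object = nearbyObjects.first else { return }

        if let distance = object.distance {
            print(String(format: "Peer is %.2f m away", distance))
        }

        // `direction` is a unit vector in the device's coordinate space (nil when the
        // peer is outside the U1 antennas' field of view). A horizontal angle derived
        // from it can rotate an on-screen arrow toward the peer.
        if let direction = object.direction {
            let horizontalAngle = atan2(direction.x, -direction.z)
            print("Rotate the arrow \(horizontalAngle) radians from straight ahead")
        }
    }
}
```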

With iOS 16, third-party developers will be able to build apps that do much the same thing, thanks to a new capability that lets them integrate ARKit, Apple's augmented reality developer toolkit, with the Nearby Interaction framework.

Depending on what the app does, developers will be able to use the device's trajectory, as computed by ARKit, to intelligently guide a user toward a misplaced item or some other object they want to interact with. By tapping into ARKit, developers also get more consistent distance and direction information than Nearby Interaction provides on its own.
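
In practice, this pairing is exposed as a camera-assistance option on the session configuration, along with a way to hand Nearby Interaction the app's existing ARKit session. The Swift sketch below shows roughly how that could be wired up; the helper function is hypothetical, and the property and method names reflect the iOS 16 additions shown at WWDC, so they should be verified against the shipping SDK.

```swift
import ARKit
import NearbyInteraction

// Sketch: enable iOS 16's camera assistance so ARKit's pose estimation backs the
// UWB measurements, and share the app's own ARSession so both frameworks agree
// on one world coordinate space.
func runCameraAssistedSession(peerToken: NIDiscoveryToken,
                              delegate: NISessionDelegate,
                              arSession: ARSession) -> NISession {
    let configuration = NINearbyPeerConfiguration(peerToken: peerToken)
    configuration.isCameraAssistanceEnabled = true   // new in iOS 16

    let session = NISession()
    session.delegate = delegate

    // Optional: without this call the framework manages its own ARSession internally;
    // sharing the app's session is needed when the app also renders AR content.
    session.setARSession(arSession)

    session.run(configuration)
    return session
}
```

Sharing the app's own ARSession matters mainly when the app is also rendering AR content, since it keeps Nearby Interaction's position estimates and the app's AR scene in the same world coordinate space.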

The functionality doesn't have to be limited to AirTag-like accessories built by third parties, however. Another scenario Apple demonstrated was a museum using Ultra Wideband accessories to guide visitors through its exhibits.

The functionality can also be used to overlay directional arrows or other AR objects on top of the camera's view of the real world to help guide users to the Ultra Wideband object or accessory. Continuing the demo, Apple showed how red AR bubbles could appear on the app's screen on top of the camera view to point users in the right direction.
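
As a rough illustration of how an app might draw that kind of overlay: once the camera-assisted session can resolve the accessory's position in ARKit's world coordinate space, the app can drop an anchor there and attach a simple RealityKit marker. The function below is a hypothetical sketch and assumes the ARView owns the ARSession that was shared with the Nearby Interaction session.

```swift
import ARKit
import NearbyInteraction
import RealityKit

// Sketch: place a red marker in the AR scene at the accessory's estimated position.
// Assumes camera assistance is enabled and `arView.session` is the ARSession that
// was passed to the NISession via setARSession(_:).
func placeGuidanceMarker(for object: NINearbyObject,
                         niSession: NISession,
                         arView: ARView) {
    // The world transform is only available once the camera-assisted algorithm
    // has converged on the object's position; until then it returns nil.
    guard let transform = niSession.worldTransform(for: object) else { return }

    // Anchor ARKit's world map at that position...
    let anchor = ARAnchor(name: "accessory-marker", transform: transform)
    arView.session.add(anchor: anchor)

    // ...and attach a simple red sphere, standing in for the directional AR bubbles
    // Apple showed in its demo.
    let marker = ModelEntity(mesh: .generateSphere(radius: 0.05),
                             materials: [SimpleMaterial(color: .red, isMetallic: false)])
    let anchorEntity = AnchorEntity(anchor: anchor)
    anchorEntity.addChild(marker)
    arView.scene.addAnchor(anchorEntity)
}
```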

Longer-term, this feature paves the way for Apple's anticipated mixed reality smart glasses, which would likely have AR-powered applications at their heart.

The new capability is being rolled out to iOS 16 beta testers ahead of its release to the general public later this year.
