Apple released several new augmented reality tools and technologies for software makers during its annual WWDC conference this week, a sign of commitment to a field its CEO Tim Cook has called the “next big thing.”
Apple has never confirmed plans to release augmented reality hardware, but it could announce a headset as soon as this year. Facebook, Microsoft, and Snap are among the companies working on devices that can understand the world around them and display information in front of the user’s eyes.
The several AR updates Apple announced during the conference’s more technical sessions show that the technology remains an important long-term initiative for the company. Getting developers on board to build augmented reality software now increases the chance of one or more “killer apps” being available at launch.
During the week-long conference, Apple briefed its developers on rapidly improving tools for creating 3D models, using a device’s camera to understand hand gestures and body language, and adding quick AR experiences on the web. It also promoted a heavily Apple-backed standard for 3D content and an intriguing new sound technology that works like surround sound for music and other audio.
Apple has introduced application programming interfaces, or software tools, that will enable apps to create 3D models. 3D models are essential for AR because they’re what the software places in the real world. If an app doesn’t have an accurately detailed file for a shoe, then it can’t use Apple’s machine vision software to place that shoe on a table.
Named Object Capture, it isn’t an app. Instead, it’s a technology that allows a camera, like the iPhone’s camera, to take several photographs of an object, then stitch them together into a 3D model that can be used inside software in minutes. Previously, precise and pricey camera setups were required for detailed object scanning.
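For developers, Object Capture surfaces through RealityKit’s `PhotogrammetrySession` on macOS. The sketch below shows the general shape of the API: you point a session at a folder of photos and request a 3D model file. The folder and output paths here are placeholder assumptions for illustration, not values from Apple’s documentation.

```swift
import RealityKit  // Object Capture ships as part of RealityKit on macOS

// Placeholder paths: a folder of photos of the object, and where to
// write the finished 3D model (USDZ is Apple's preferred AR format).
let inputFolder = URL(fileURLWithPath: "/tmp/ShoePhotos", isDirectory: true)
let outputModel = URL(fileURLWithPath: "/tmp/shoe.usdz")

let session = try PhotogrammetrySession(input: inputFolder)

// Listen for results asynchronously as reconstruction proceeds.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestComplete(_, .modelFile(let url)):
            print("Model written to \(url)")
        case .requestError(_, let error):
            print("Reconstruction failed: \(error)")
        default:
            break  // progress updates and other events
        }
    }
}

// Kick off processing: ask for a model file at reduced detail.
try session.process(requests: [
    .modelFile(url: outputModel, detail: .reduced)
])
```

The session does the heavy lifting of aligning the photos and stitching the geometry and textures; the app only chooses the input images and the desired level of detail.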
Apple also improved its AI for understanding faces, hands, and other movements, abilities that will be important for a computer interface that works in 3D space. Apps can call Apple’s Vision framework software to detect people, faces, and poses through the iPhone’s camera.
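In code, a pose detection call through the Vision framework looks roughly like this. The image path and the confidence threshold are illustrative assumptions; the request and observation types are the framework’s own.

```swift
import Vision

// Placeholder input image; in a real app this would come from the camera.
let imageURL = URL(fileURLWithPath: "photo.jpg")

// Ask Vision to find human body poses in the image.
let request = VNDetectHumanBodyPoseRequest()
let handler = VNImageRequestHandler(url: imageURL)
try handler.perform([request])

// Each observation is one detected person; joints come back as named
// landmarks with normalized image coordinates and a confidence score.
for observation in request.results ?? [] {
    let joints = try observation.recognizedPoints(.all)
    if let wrist = joints[.leftWrist], wrist.confidence > 0.3 {
        print("Left wrist at \(wrist.location)")
    }
}
```

The same handler-and-request pattern covers face and hand detection, which is why the framework is a natural building block for gesture-driven 3D interfaces.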
Apple’s computer vision software can now identify objects inside images, including text on signs, and it lets users search for things inside photos, like a pet or a friend.
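The text-on-signs capability is exposed to developers through Vision’s text recognition request. A minimal sketch, assuming a local image file named `sign.jpg`:

```swift
import Vision

// Placeholder image of a sign; the path is an assumption for illustration.
let handler = VNImageRequestHandler(url: URL(fileURLWithPath: "sign.jpg"))

// Request on-device text recognition at the slower, more accurate level.
let request = VNRecognizeTextRequest()
request.recognitionLevel = .accurate

try handler.perform([request])

// Each observation is one detected text region; take its best reading.
for observation in request.results ?? [] {
    if let candidate = observation.topCandidates(1).first {
        print(candidate.string)
    }
}
```

Because recognition runs on-device, the same machinery can power photo search without sending images to a server.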