Apple WWDC Keynote Hints at Future of Computing

Apple Inc.’s annual developer lovefest, the Worldwide Developers Conference, shook off some of the awkwardness of recent events on Monday to tout the company’s latest software updates and additions across iPhones, Apple Watches, iPads, Macs and more.

Undaunted by the company’s court battle over App Store policies and employee backlash against a proposed part-time return to the office, chief executive officer Tim Cook kicked off the virtual keynote address in front of a pandemic-friendly audience of animated Memoji.

“Last year’s WWDC was our most inclusive and most-watched developer conference ever, with nearly 25 million viewers,” Cook said. “It was exciting to have so many people join us and to see the impact it had on new Apple developers as we broaden our audience, welcoming more people from more places around the world.”

Those developers, according to Apple, grew App Store billings and sales 24 percent last year.

Other notable changes include a flashy new feature that lets iPads, MacBooks and iMacs share a single mouse and keyboard, and even pass files between them by dragging across the adjacent screens, as well as new tricks for Photos, including letting Apple Watch wearers share pics and set portraits of loved ones as custom watch faces. After making transit cards available in Wallet, Apple is also adding support for ID cards, which will be accepted by the Transportation Security Administration. Keys are coming to Wallet as well, allowing users to unlock BMWs and other vehicles, or hotel rooms at participating hospitality providers such as Hyatt.

Apple Watch users will soon be able to set custom watch faces featuring portraits of loved ones. Courtesy image

But that’s just the tip of the iceberg. One feature introduced during the earlier Maps section hints at one of Apple’s priorities: augmented reality. In this case, it comes via a new navigation feature for drivers.

“When driving on highways, Maps now renders overlapping complex interchanges in three-dimensional space, making it much easier to see upcoming traffic conditions, or to know which lane you need to be in,” said Meg Frost, director of product design for Apple Maps.

The effort shows how handy 3D visuals can be for real-world uses. Later in the presentation, Susan Prescott, vice president of worldwide developer relations, doubled down on the theme with the announcement of Object Capture, an intriguing tool for the upcoming macOS Monterey that makes it “easy for developers to create holistic 3D objects” using 2D images.

According to Prescott, the tech relies on photogrammetry, or the science of drawing measurements from still images.

“Rather than manually creating 3D models, which can take weeks, Object Capture uses photogrammetry to turn a series of 2D images into photorealistic 3D objects in just minutes,” she explained.

Thanks to Object Capture, developers can take still photos of objects like sneakers and create realistic 3D models. Courtesy image

Game developers are already using the tool to create new immersive experiences.

Developers simply take photos of an object using an iPhone or iPad — the latest models of which have Apple’s more accurate, laser-based LIDAR sensor built in — and import the images into the software program Cinema 4D to create the realistic 3D model. Creators, game developers, brands and retailers can also generate AR previews on location, to make sure they’re capturing what they need during shoots.
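For developers curious what that pipeline looks like in code, the macOS Monterey API underpinning Object Capture is RealityKit’s PhotogrammetrySession. What follows is a minimal sketch of turning a folder of still photos into a USDZ model; the file paths, detail level and progress handling are illustrative assumptions, not Apple’s own sample code.

```swift
import Foundation
import RealityKit

// Folder of iPhone/iPad photos and the desired output file.
// Both paths are illustrative placeholders.
let imagesFolder = URL(fileURLWithPath: "/Users/me/Captures/Sneaker")
let outputModel = URL(fileURLWithPath: "/Users/me/Models/Sneaker.usdz")

do {
    // Create a photogrammetry session over the folder of still images.
    let session = try PhotogrammetrySession(input: imagesFolder)

    // Ask for a medium-detail USDZ model; other levels (.preview, .reduced,
    // .full, .raw) trade speed for fidelity.
    let request = PhotogrammetrySession.Request.modelFile(url: outputModel,
                                                          detail: .medium)
    try session.process(requests: [request])

    // Watch the session's async output stream for progress and completion.
    Task {
        for try await output in session.outputs {
            switch output {
            case let .requestProgress(_, fraction):
                print("Progress: \(Int(fraction * 100))%")
            case let .requestComplete(_, result):
                print("Finished: \(result)")
            case let .requestError(_, error):
                print("Failed: \(error)")
            case .processingComplete:
                exit(0)
            default:
                break
            }
        }
    }
    RunLoop.main.run()
} catch {
    print("Could not start capture session: \(error)")
}
```

A coarser .preview request could stand in for the kind of on-location AR spot checks described above, trading fidelity for speed before a full-quality model is generated back at the studio.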

Wayfair is using Object Capture “to develop tools for their manufacturers, so they can use an iPhone and Mac to easily create a virtual representation of the merchandise,” Prescott continued. “This will allow Wayfair users to preview more products in their homes with AR, to make sure they choose the right product for their space.

“This is a massive step forward for 3D content creation. What used to be the most difficult and expensive part of building AR experiences and 3D scenes is now available to all developers with Object Capture in macOS Monterey,” she added.

Another subject that loomed large over the keynote is privacy. App-tracking requests will be logged in a new dedicated dashboard. In many ways, it’s a continuation of Apple’s recent App Tracking Transparency update, as the iOS 15 update corrals information on which apps have been granted permission to access location, the microphone, contacts, images or other personal data.

And in an era when suspicions run high about voice assistants sharing information with marketers, Apple also revealed that Siri will process speech “on device,” instead of transmitting it to Apple servers.

What marketers will want to note, however, is the company’s bid to banish the tracking pixel, the tiny bit of code typically embedded in messaging like promotional emails. A pixel can reveal if and when an email was opened, and it can also collect information such as the IP address of the device that opened it, which is often used to infer the recipient’s location.

Craig Federighi, senior vice president of software engineering, kicks off the section on privacy — a key focus for Apple. Courtesy photo

Google has already waged war on website cookies, and now Apple is doing the same for Mail messages. The choice could send the advertising and marketing community into a lather. But regardless, Apple is pressing forward to banish these ad trackers from email.

Apple’s consumer-facing changes will bring some new experiences to the public, but it’s nothing compared to what’s going on under the hood.

It’s clear now that the company’s moves to take control of its own chips, boosting the processing power of its devices, and to build in new hardware components like LIDAR have been foundational to a broader Apple vision, one that has only just begun to unfold.
