SAN JOSE - In 2011, Apple became the first company to place artificial intelligence in the pockets of millions of consumers when it launched the voice assistant Siri on the iPhone.
Six years later, the technology giant is struggling to find its voice in AI.
Analysts say the question of whether Apple can succeed in building great artificial-intelligence products is as fundamental to the company’s next decade as the iPhone was to its previous one. But the tech giant faces a formidable dilemma, because the nature of artificial intelligence pushes Apple far out of its comfort zone of sleekly designed hardware and services.
AI programming demands a level of data collection and mining that is at odds with Apple’s rigorous approach to privacy, as well as its positioning as a company that doesn’t profile consumers. Moreover, Apple’s long-standing penchant for secrecy has made the company less desirable in the eyes of potential star recruits, who hail from the country’s top computer science departments and are attracted to companies that publish research.
“Artificial intelligence is not in Apple’s DNA,” said venture capitalist and Apple analyst Gene Munster. “They understand that in the future, every company is going to become an AI company, and they are in a particularly tough spot.”
At Apple’s annual developers conference Monday, the same event where Siri was introduced, the company’s efforts to become an AI powerhouse were on display as executives launched a new stand-alone smart speaker and touted features meant to boost Siri’s chops and to power AI applications on Apple products.
“Machine learning,” an AI buzzword that describes a form of ultra-fast, complex computer data analysis and statistical modeling, was repeated throughout the 2½-hour presentation, delivered to an audience of roughly 6,000 developers here. Siri will now use machine learning to predict the times of a morning commute, or scan the travel news as you are reading it on the company’s Safari browser and then suggest related activities, such as booking a reservation.
She will use machine learning to talk with you and help you sort through music on a new $349 home automation device, the HomePod. She will automatically organize your photos into albums, such as “2nd Anniversary,” without you giving her any context about the pictures. There was even a new software tool kit, Core ML, that will allow for faster processing of large amounts of data collected during machine learning applications. (It’s six times as fast as Google’s rival AI processor, an executive quipped.)
But Monday’s announcements come as other technology companies have released similar innovations and have already spent billions on the burgeoning AI arms race. Many are placing their bets on artificial intelligence: software that one day may be smart enough to chat back and forth like a human, or computer vision that identifies real-world objects so well it can power the first fully functioning self-driving car.
That has put Apple in the disadvantaged position of trying to lead in an area where it has fallen behind, and where the effort cuts against core aspects of the company’s secretive culture.
“This is a substantial shift for Apple,” said Daniel Gross, a former Apple executive who focused on artificial intelligence. “The internal focus is on building great products, not publishing papers.”
Ten years after launching the iPhone, Apple is on the hunt for another blockbuster product that can take its place. Sales of the iPhone propelled Apple to become the most valuable company in the world and still account for more than half of the company’s revenue, which was $215.6 billion in 2016. But last year, purchases of the company’s smartphone dropped for the first time, suggesting that the market for high-end smartphones may finally be saturated.
“The difference between last year’s iPhone and this year’s iPhone is going to be much smaller than the difference between the first and second iterations of the iPhone. . . . So that S-curve is starting to flatten out,” said Benedict Evans, a mobile expert and partner at the Silicon Valley venture capital firm Andreessen Horowitz, using an industry term for the way a technology improves slowly at first, then rapidly, then levels off as it matures. “Then you have the next transformative technology that isn’t here yet.”
Apple did not respond to repeated requests for interviews and comment.
Apple’s launch of the HomePod, which will go on sale later this year, comes several years after Amazon.com and Google released their own automation devices, the Echo and the Google Home, which send consumers’ spoken queries back to their servers in California for analysis. Google, Tesla, Uber and others have been testing self-driving vehicles on public roadways for several years; Apple received its first permit to begin testing just two months ago. Before Monday’s updates, the Siri assistant was still mostly a glorified Web search that could tell the occasional preprogrammed joke.
Now Apple is racing to catch up. Last October, the company hired Russ Salakhutdinov, a Carnegie Mellon professor whose expertise is in an area of artificial intelligence known as “deep” or “unsupervised” learning, a complex branch of machine learning in which computers are trained to replicate the way the brain’s neurons fire when they recognize objects or speech. Salakhutdinov is a protégé of Geoffrey Hinton, perhaps the world’s top researcher in this area. Salakhutdinov divides his time between Carnegie Mellon and Apple; Hinton divides his time between Google and the University of Toronto.
Building ties to academic superstars not only helps to improve products but also becomes a key recruiting tool, said Richard Zemel, director of the Vector Institute for Artificial Intelligence and a professor specializing in machine learning at the University of Toronto.
“You used to not see people with Apple name badges at conferences, and now you do,” Zemel said.
In December, Apple presented and published its first academic paper on artificial intelligence at an industry conference. Another paper has been accepted to a computer vision conference and will be published in July, Salakhutdinov said in an interview. Salakhutdinov said he was not authorized to discuss his work at Apple in any detail.
Zemel, who told Bloomberg two years ago that Apple was “off the scale” in terms of secrecy and that such secrecy was keeping it out of the loop on major developments in the field, now said that the Cupertino, Calif., giant was “making some changes.”
“But it’s going to take some work,” he added.
Researchers at elite universities said in interviews that Apple was still not the top choice for their computer science graduates (Google, Facebook and Amazon were by far the top picks) but that the company was moving up in the rankings. (Amazon chief executive Jeffrey P. Bezos owns The Washington Post.)
Apple’s forays into AI have also been slower than its peers’ because it has been reluctant to embrace the data-mining practices of rivals Google and Facebook, experts said. The company has spent considerable resources building additional layers of privacy. Unlike Google and Facebook, which are primarily advertising companies that collect massive amounts of intimate data to profile their users, Apple believes in limiting the amount of user data it collects. At a previous developers conference, executives bragged that the company did not build user profiles. Chief executive Tim Cook has positioned the company as the anti-Google.
But that stance against data collection becomes a problem if you are building artificial intelligence, researchers say. A home device must collect and analyze people’s speech to improve the way the device can speak to humans, for example. For Siri to be smart, she needs to collect and interpret data from other applications, such as your calendar, your restaurant reservations and, now, your browsing.
Last year, as Apple began to embrace artificial intelligence on the iPhone, the company undertook a large privacy protection project. The project took an academic concept called differential privacy and applied it to AI applications on the iPhone. Differential privacy works by inserting noise, or bad information, into good data to confuse outsiders who might try to home in on an individual’s records. For example, in order for Apple software to group the photos of your dog into a single album, it needs to collect many photos of your dog.
Apple collects those images, but not before encrypting the data in them and then scrambling that data with other data, so that if anyone tried to recover the original data set they wouldn’t know what was tied to a single user, the company claims. This technique is considered a stronger privacy protection than other methods, such as using mathematical formulas to render user profiles anonymous.
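Apple has not published the details of its system, but the core idea of differential privacy can be illustrated with the classic randomized-response technique: each user’s answer is randomly falsified some of the time, making any individual record deniable, while the aggregate statistic can still be recovered by correcting for the known noise rate. The sketch below is purely illustrative; the function names, the 75 percent truth rate and the 30 percent population rate are hypothetical, not Apple’s actual parameters.

```python
import random

def randomized_response(true_bit, p_truth=0.75):
    """Report the true bit with probability p_truth; otherwise report a coin flip.

    No single report reveals the user's true answer, because any report
    could have been random noise.
    """
    if random.random() < p_truth:
        return true_bit
    return random.choice([0, 1])

def estimate_true_rate(reports, p_truth=0.75):
    """Invert the noise: observed = p_truth * true_rate + (1 - p_truth) * 0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

# Simulate 100,000 users, 30 percent of whom truly have some attribute.
random.seed(0)
truth = [1 if random.random() < 0.30 else 0 for _ in range(100_000)]
reports = [randomized_response(bit) for bit in truth]
print(round(estimate_true_rate(reports), 2))  # close to 0.30
```

With enough users, the estimate converges on the true population rate even though every individual report is noisy, which is the trade-off differential privacy formalizes: useful statistics in aggregate, plausible deniability for each person.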
Apple’s focus on privacy may have slowed the company down in terms of building some products, Gross said, but the trade-off would be consumer trust. “Apple is dousing itself with an extra piece of really hard science and doing so to try and preserve your privacy,” he said. “I think Google and Facebook will have to answer to a world where a similar product that is offered is more privacy-preserving.”
Munster pointed out that no tech company has a huge lead on artificial intelligence yet. “The bad news is that Apple is behind,” he said. “The good news is that if we look at how AI is going to impact the world, it’s still early days; there is plenty of time to catch up.”
Correction: An earlier version of this article incorrectly said that Apple co-founder Steve Jobs introduced Siri eight years ago. It also misstated the first name of venture capitalist and Apple analyst Gene Munster. The article has been updated.