Apple CEO Tim Cook has been talking up augmented reality for the past year, but don’t take that to mean that Apple will launch a dedicated AR product anytime soon. In an interview with The Independent, Cook said that currently “the technology itself doesn’t exist” to make augmented reality glasses “in a quality way.” And Apple, he said, won’t ship an AR product unless it can deliver “a great experience.”
Cook identified two problems with current AR devices. Their field of view and the quality of their displays, he said, aren’t there yet. “Anything you would see on the market any time soon would not be something any of us would be satisfied with,” Cook told The Independent. “Nor do I think the vast majority of people would be satisfied.”
He’s not wrong. Current augmented reality headsets all leave something to be desired. Microsoft’s HoloLens works, but it has a limited field of view and requires a large headset. Meta’s is less expensive but similarly huge. And Google Glass (which doesn’t even totally count as augmented reality) flopped badly immediately upon release.
But even if Apple doesn’t plan on diving into dedicated AR hardware, it already made an enormous play for the augmented reality market this year — perhaps doing more than any company to date. With the release of iOS 11 last month, recent iPhones were granted the ability to perform all kinds of AR tricks using something Apple calls ARKit. It lets developers make augmented reality games and makes it easy for camera apps to implement augmented reality stickers.
That means Apple is in an early position to be at the center of a possible boom in augmented reality experiences. Cook seems to believe as much. He compared the introduction of AR features to the introduction of the App Store. “Now you couldn’t imagine your life without apps,” he said. “AR is like that. It will be that dramatic.”
Even if it won’t happen right away, there are already signs that Apple is exploring dedicated AR hardware. The company has a patent application that envisions augmented reality glasses, and Apple reportedly has a team of over 1,000 people working on AR. In typical Apple fashion, Cook told The Independent that Apple has no interest in rushing into the market just to get a head start. “We don’t give a rat’s about being first,” he said. “We want to be the best.”
Snapchat is partnering with artist Jeff Koons to bring some of his more iconic sculptures to its app.
From today, Snapchat users will be able to explore his work in augmented reality at select locations around the world. His balloon dog sculptures will be digitally placed in Hyde Park in London and Central Park in New York, and others will be placed at other popular public spaces around the world.
When users are near one of the sculptures, the locations of which they can check on Snapchat’s site, the app will point them towards its exact location. The sculpture will appear on the user’s phone as they approach, allowing them to explore it up close, almost as if they were actually standing next to a real sculpture.
Snapchat has a form on its website inviting other artists to bring their work to the messaging platform, but it’s unclear whether it’s currently working with any other artists yet. Right now, it’s essentially like Pokémon Go, but for three-story sculptures of inflated dog balloons.
Snap appears to be betting big on augmented reality as something that will keep users coming back to its app over much larger competitors like Instagram. Today’s partnership comes less than a week after Snap launched augmented-reality world lenses—interactive models similar to the Koons sculptures that users can share in their snaps—for advertisers. Right now, users can add a model of the flying car from the forthcoming Blade Runner sequel, or add a man selling Bud Light beer to their snaps. And a few weeks earlier, Snapchat introduced three-dimensional versions of Bitmoji, the user-created cartoon emojis, which users can add to their snaps.
To experience augmented reality, you look through a device screen or put on a headset, and a virtual image is laid over the room you’re in. You can see what’s around you, but part of the view is overlaid by whatever the device is rendering.
The Basic Setup
▶ A camera and screen equipped with computer vision, a technology that identifies objects and surfaces. Adding depth and motion sensors lets a device map the room around you and track your motion through it. Your app can then overlay anything from a first-person-shooter zombie attack to the steps to replace a fan belt.
▶ For now it’s pretty simple: catching Pokémon (Pokémon GO), mapping constellations (Sky Map), inking a tattoo (InkHunter), turning you into a half-dog (Snapchat).
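The overlay step in the setup above boils down to projecting a point in 3D world space through the camera and onto the screen, every frame, as the device tracks its own pose. Here is a minimal Python sketch using a standard pinhole camera model; the function name, pose matrix, and intrinsics are illustrative, not taken from any particular AR SDK:

```python
import numpy as np

def project_point(world_point, camera_pose, fx, fy, cx, cy):
    """Project a 3D world point into 2D pixel coordinates.

    camera_pose: 4x4 world-to-camera transform (rotation + translation),
                 the kind of matrix an AR framework's motion tracking supplies.
    fx, fy:      focal lengths in pixels; cx, cy: principal point.
    """
    p = np.append(world_point, 1.0)   # homogeneous coordinates
    x, y, z, _ = camera_pose @ p      # transform into camera space
    u = fx * x / z + cx               # pinhole projection onto the image plane
    v = fy * y / z + cy
    return u, v

# A virtual object anchored 2 m straight ahead of an identity-pose camera
u, v = project_point(np.array([0.0, 0.0, 2.0]), np.eye(4),
                     fx=1000, fy=1000, cx=640, cy=360)
print(u, v)  # → 640.0 360.0 (dead center of a 1280x720 frame)
```

A real AR framework repeats this continuously, re-reading the camera pose from the motion and depth sensors so the zombie or the fan-belt diagram stays pinned to the same spot in the room as you move.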
AR can’t scan a room and identify every object. But you can teach its computer vision to identify individual objects, like a motorcycle, when prompted, says Mike Campbell, executive vice president of the ThingWorx AR platform. “There’s not enough computing power to analyze everything it sees.”
▶ Hands-on skill training, interior design, wearable computing.
AR can lead a factory worker on a tutorial, but right now the technology won’t change your life unless you own a factory, says Amber Case, a fellow at Harvard’s Berkman Klein Center for Internet & Society. A Microsoft HoloLens can overlay hidden parts such as a tucked-away air filter and demonstrate its removal. Similar programs are in development for phones and tablets and could soon offer life-changing relief for tasks like Ikea furniture assembly.
▶ Motion sickness sets in when your perceived motion—what you see—doesn’t match what your inner ear feels. That’s not the case with augmented reality, says Robert Scoble, coauthor of The Fourth Transformation: How Augmented Reality & Artificial Intelligence Will Change Everything. You’re still looking out on the real world and the same horizon.
▶ AR on mobile devices really is mobile. Unlike high-end VR, which can’t leave a room, AR can enhance a city tour or museum. Last winter, the Detroit Institute of Arts lent visitors Android phones to view the skeleton inside a 2,000-year-old sarcophagus and to see the original colors on a now-beige Assyrian sculpture.
However, AR is difficult to wear on your face. Everybody thinks we’ll be walking around with the next Google Glass, but social constraints prevent that, says Case, adding, “Sunshine makes headset AR difficult to see, and voice and hand controls are still unreliable.”
How Apple Will Own It
▶ AR will explode in the next year. Today, relatively few devices offer a rich AR experience, leading to a lack of demand for new AR apps—phones with Google’s Tango AR number fewer than a million. Expect that to change after Apple’s June release of ARKit, part of iOS 11, to developers, says Scoble. ARKit is a bundled suite of AR tools that can reach a quarter billion Apple devices. Additionally, this fall’s new iPhone adds 3D sensors and room mapping that can play hologram-like counter-op games or virtually measure and then furnish a room without draining the battery.
From 50 ways to leave your lover, as the song goes, to 750 types of shampoos, we live in an endless sea of choices. And although I haven’t been in the market for hair products in a while, I understand the appeal of picking a product that’s just right for you, even if the decision-making is often agonizing. This quandary (the “Goldilocks syndrome” of finding the option that is “just right”) has now made its way to the travel industry, as the race is on to deliver highly personalized and contextual offers for your next flight, hotel room or car rental.
Technology, of course, is both a key driver and enabler of this brave new world of merchandising in the travel business. But this is not your garden variety relational-databases-and-object-oriented-systems tech. What is allowing airlines, hotels and other travel companies to behave more like modern-day retailers is the clever use of self-learning systems, heuristics trained by massive data sets and haptic-enabled video hardware. Machine learning (ML), artificial intelligence (AI), augmented reality (AR) and virtual reality (VR) are starting to dramatically shape the way we will seek and select our travel experiences.
AI is already starting to change how we search for and book travel. Recent innovation and investment has poured into front-end technologies that leverage machine learning to fine tune search results based on your explicit and implicit preferences. These range from algorithms that are constantly refining how options are ranked on your favorite travel website, to apps on your mobile phone that consider past trips, expressed sentiment (think thumbs up, likes/dislikes, reviews) and volunteered information like frequent traveler numbers.
Business travel, as well, is positioned for the application of AI techniques, even if not all advances are visible to the naked eye. You can take photos of a stack of receipts on your smartphone; optical character recognition software codifies expense amounts and currencies, while machine learning algorithms pick out nuances like categories and spending patterns.
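As a rough illustration of the pipeline described above: OCR turns the receipt photo into plain text, and then pattern matching (or, in a production system, a trained model) pulls out amounts and currencies. A toy Python sketch, with hypothetical receipt text standing in for real OCR output:

```python
import re

# Hypothetical post-OCR text from a photographed receipt
ocr_text = "CAFE LUMIERE\nTOTAL EUR 23.50\nVAT included"

# Pull out an ISO-style currency code followed by an amount
match = re.search(r"\b([A-Z]{3})\s+(\d+(?:\.\d{2})?)\b", ocr_text)
currency, amount = match.group(1), float(match.group(2))
print(currency, amount)  # → EUR 23.5
```

The machine-learning layer the article mentions sits on top of extraction like this, classifying each line item into expense categories and flagging unusual spending patterns across many receipts.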
AI is also improving efficiencies in many operational systems that form the backbone of travel. Machine learning is already starting to replace a lot of rule-based probabilistic models in airport systems to optimize flight landing paths to meet noise abatement guidelines, or change gate/ramp sequencing patterns to maximize fuel efficiency.
VR and AR are still changing and evolving rapidly. With many consumer technology giants publicly announcing products this year, we can expect to see rapid early adoption and mainstreaming of these technologies. Just as music, photos, videos and messaging became ubiquitous thanks to embedded capabilities in our phones, future AR and VR applications are likely to become commonplace.
VR offers a rich, immersive experience for travel inspiration, and it is easy to imagine destination content being developed for a VR environment. But VR can also be applied to travel search and shopping. My company, Amadeus, recently demonstrated a seamless flight booking experience that includes seat selection and payment. Virtually “walking” onto an airplane and looking at a specific seat you are about to purchase makes it easier for consumers to make informed decisions, while allowing airlines to clearly differentiate their premium offerings.
AR will probably have a more immediate impact than VR, however, in part due to the presence of advanced camera, location and sensor technology already available today on higher-end smartphones. Airports are experimenting with beacon technology where an AR overlay would be able to easily and quickly guide you to your tight connection for an onward flight, or a tailored shopping or dining experience if you have a longer layover.
“Any sufficiently advanced technology is indistinguishable from magic,” goes Arthur C. Clarke’s famously quoted third law. But as we come to expect more authentic experiences (precise search results, an informed booking, an immersive travel adventure), we can count on increasingly magical technology from systems that learn to deliver us our “perfect bowl of porridge.”
Merge is announcing that the Merge Cube is debuting exclusively this week at Walmart stores across the U.S.
The Merge Cube is a holographic toy that allows users to physically hold and interact with 3D objects using augmented reality (AR) technology. The Merge Cube costs only $15, and it is compatible with iOS and Android devices. It features dozens of games and experiences built for it.
The launch of the Merge Cube in Walmart stores follows the earlier launch of the company’s Merge VR/AR Goggles, which are $60 devices that are available in 5,000 stores worldwide. While the goggles are aimed at those ages 10 and up, the Merge Cube is targeted at kids. The Merge Cube will expand into other major retailers soon.
“We’re excited to bring the Merge Cube to Walmart stores and physically put this technology into people’s hands. With this first-of-its-kind product, people can experience the wonder and amazement of interacting with holographic, 3D content in a natural and intuitive way,” said Merge founder Franklin Lyons, in a statement. “Our Merge Cube and Goggles allow users to interact with more than just a screen — now, they can build worlds, explore the human brain, visit foreign lands, and more through the power of VR/AR.”
Also launching today is Merge Miniverse, a portal for virtual and augmented worlds. Merge makes both physical products and apps, and it also curates a library of family-friendly experiences like 360-degree videos, virtual and augmented reality apps, and games.
The Merge Miniverse allows AR and VR explorers to choose from hundreds of apps and experiences to use with the Merge VR/AR Goggles and Merge Cube, as well as with other AR/VR devices.
Several Merge Cube apps are already available on the Miniverse, with more coming soon.
Merge is inviting developers from around the world to join it in shaping the future of play. In June, the company announced its Merge AR/VR Developer Fund, a $1 million fund to support the developer community building apps for Merge products.
While companies like Magic Leap and Oculus are spending millions of dollars to develop their new virtual reality products and VR startups are raising enormous amounts of money from venture capitalists and angel investors, many people may wonder, “How can I get in the game? Are there any small-business VR ideas with no or very little startup cost?”
Yes, there are a few ways you can start a new business in virtual reality with very little investment. Here are a few ideas for starting a virtual reality business now.
Virtual reality headsets are pretty pricey right now. Oculus Rift units cost $600, and the HTC Vive goes as high as $800, plus the powerful computer you’ll need to use either one. That means that virtual reality headsets are more of a luxury item than an everyday device.
Virtual reality theaters and pop-ups are opening worldwide right now, attracting both attention and customers. This option will require a bit more investment — you will need to find a venue as well as buy equipment. But once you find a space, you can create a pop-up movie theater where people sit with VR headsets and enjoy 360-degree videos, virtual reality photo galleries with 360-degree pictures from great photographers, or try-out studios where they can come to experience the Oculus Rift or HTC Vive. With new VR products coming online all the time, charging for them as entertainment can be profitable.
Google Cardboard is one of the most popular VR viewers because it’s very cheap compared to other, more advanced headsets. It has plenty of drawbacks, however. It’s not very convenient to wear, requires an additional strap, and if you use it often enough, at some point you will need to buy a new one. After all, it’s cardboard. A lot of companies, especially in China, have started producing their own versions of popular viewers. With a little startup capital, you can buy those headsets wholesale and sell them retail via Amazon or at a retail location.
Create and host local meet-up events, conferences, lectures, fairs and other social events related to VR. Once you have a brand and a decent audience, you can sell access to advertisers and companies who make VR products — there are already a ton of them, and there will be more.
In VR, people are craving interesting content. As a startup, you can create a professional YouTube channel or blog reviewing the latest available technology, observing conferences, exhibitions, games and movies. Winning a sizeable audience can allow you to monetize it — and your influence — later.
Apple has often been accused of acting like it invented things that others have been doing for years. That complaint is not without merit, however Apple can lay claim to transforming existing things into mainstream successes, which takes no small amount of invention in its own right. Fingerprint authentication and contactless payments are just two recent examples, having both existed in Japan and on niche devices for over a decade before Apple raised them to global prominence with the iPhone.
Next up on Apple’s agenda is augmented reality, the act of superimposing digital data and visuals atop a live video feed of your surroundings — something that Google, Microsoft, and many others have been experimenting with for a long time. Apple is far from being able to claim it invented AR, but its new ARKit in iOS 11 is already showing signs to suggest that Apple will help bring AR into the mainstream faster and better than anyone else.
The chronic problem with augmented reality has always been one of practicality. You could have the most basic forms of AR on your regular phone, as provided by apps like Layar, which has been around since 2009, but those have never been particularly compelling. Or you could have more sophisticated and appealing augmentations, as presented by Google’s Tango project, but you’d need a big fat phablet to lug around to make them happen. Apple’s difference is to combine the convenience of your daily phone with the appeal of advanced AR.
Measure distances with your iPhone. Just because you can. Clever little #ARKit app by @BalestraPatrick https://t.co/b2mXe2FS84 pic.twitter.com/pyoHp99Yts
— Made With ARKit (@madewithARKit) June 25, 2017
Looking at this distance-measuring app, it seems so simple and obvious. Of course your super-powered, multi-core phone should be smart enough to measure out basic distances, and there have indeed been many wonky apps trying to do that in the past. But measuring with AR, as already shown off by Google Tango phones, allows you a much more intuitive method for doing it. Having the phone actually aware of the three-dimensional space in its view allows for precise measurements, which can be represented with a neat hologram of a measuring tape. Apple’s advantage in the contest for doing this best is simple: while Google Tango demands special hardware, ARKit requires only that you have a recent iOS device. At WWDC earlier this month, Craig Federighi described ARKit as “the largest AR platform in the world,” and he was right.
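The core of such a measuring app is almost trivially small: once the phone’s hit-testing has mapped two screen taps to points in real-world space (ARKit reports world coordinates in meters), the measurement itself is plain Euclidean distance. A minimal Python sketch of that final step, with made-up point values for illustration:

```python
import math

def distance_between(p1, p2):
    """Straight-line distance between two 3D hit-test points, in meters."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

# Two points the user tapped on a detected tabletop
corner_a = (0.0, 0.0, -0.5)
corner_b = (0.3, 0.0, -0.9)
print(f"{distance_between(corner_a, corner_b):.2f} m")  # → 0.50 m
```

The hard part, building the three-dimensional understanding of the scene that makes those hit-test points accurate in the first place, is exactly what ARKit and Tango supply.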
Apple’s AR will immediately reach millions of people who already have the requisite hardware. And while it looks to be functionally as flexible and capable as Google’s Tango (check out some early examples of fanciful experiments with ARKit), its broader audience makes it much more enticing for serious developers to invest their time and money into. Google’s Tango is about the future whereas Apple’s ARKit is about the present.
BOOM And just like that we have #ARKit measurement app number 2 https://t.co/cjfQMpHmx0 → by @laanlabs pic.twitter.com/U8QKFjiMXs
— Made With ARKit (@madewithARKit) June 25, 2017
Considering how little time it took to develop two convincingly accurate AR measuring apps with the iOS 11 beta, and reading the comments from their makers, Apple also appears to have an advantage in the ease of development with ARKit. It’s exciting to think that there are still three months before the release of the next iPhone and the accompanying finalization of iOS 11, by which time Apple’s big-budget app developer partners are likely to have a deluge of AR-enabled apps for people to play with. That’s how stuff goes mainstream: as a big wave of change that touches everyone from casual Pokémon Go players to serious home DIY geeks figuring out how to arrange their living room furniture.
For the people who don’t care about incremental changes in phone specs or design, the differentiator between devices has always been in the unique things that each one can do — or, failing that, the convenience and ease of use of common features. Apple’s iPhone is more convenient than Google’s Project Tango devices and with iOS 11 it’ll have much better AR capabilities than its nearest premium Android rivals. So if we’re looking for the AR innovator that will take the technology into the mainstream, Apple once again looks like the likeliest suspect.
Apple may leverage augmented reality on the iPhone to help pave the way for a future smart glasses product, UBS said in a note to investors Tuesday.
Apple recently launched its ARKit developer tools, which will allow its partners to build new augmented reality applications for millions of iPhones already in the hands of consumers. It will give Apple an overnight leg up on companies like Google that are participating in the space on a much smaller scale.
Apple hasn’t participated in the smart glasses space yet, but the idea is that a user will be able to wear a special pair of glasses that overlay computer images on the real world. You might learn more about a restaurant, perhaps view its menu, by standing in front of it, for example.
Right now, companies like Apple and Google would be forced to create bulky glasses that wouldn’t be feasible or comfortable to wear. UBS believes Apple could use AR-ready iPhones to power the experience.
“Advanced sensors and camera capabilities will enhance the iPhone; eventually there could be independent hardware offerings, perhaps iGlass,” UBS analyst Steven Milunovich said. “We can imagine a pair of glasses with quintessential Apple design (iGlass), which enable a HoloLens-type experience,” the firm wrote, referring to Microsoft’s bulky alternative.
“However, the amount of compute power and sensors required likely pose a serious design challenge. If Apple could find a way to send massive amounts of data from the eyeglasses to the iPhone where the bulk of the compute would occur, the eyewear could have a more attractive design. The issue then becomes how to transfer massive amounts of complex data between devices quickly.”
Milunovich laid out 10 AR use cases ranging from games and entertainment to home improvement and health care/medical diagnostics. He said AR will help Apple retain iPhone users.
As many times as Apple executives mentioned “machine learning” at the company’s developers conference Monday, they put just as much emphasis on an older theme that could be more important: unified software.
Apple’s mobile operating system, or iOS, works the same and runs the same on every iPhone. That’s why app developers often prefer building for Apple before they build for Google’s Android, an operating system that’s splintered across different kinds of hardware. While the latest versions of Android consistently run on higher-end phones like the Google Pixel, cheaper and older phones often only run previous versions.
Apple’s control over both its operating system and the hardware on which that software runs came up throughout the keynote. That highlights what Google really needs to worry about when competing with Apple: The fragmentation of Android. This matters even when Google has a technical advantage.
Apple’s latest announcements in augmented reality are where the company’s advantage with a unified platform showed the most.
The ability to quickly deploy software to devices immediately differentiates Apple’s AR from Tango, Google’s AR computing platform, which only runs on select newer-model phones.
“When you bring the software together with these devices, we actually have hundreds of millions of iPhones and iPads that are going to be capable of AR,” Apple’s head of software, Craig Federighi, said. “That’s going to make ARKit overnight the largest AR platform in the world.”
In general, only a portion of Android phones tend to run the latest version of the mobile software at any given time. While Google builds its own hardware now — such as the Pixel phone — it still relies largely on third-party manufacturers and carriers to deliver Android updates to hardware that runs it.
Update: Google said it has made efforts to lessen the fragmentation of software across Android devices, including rolling out a program to make it easier for device makers to upgrade to newer versions of Android, and ensuring software in Google Play Services is updated frequently across devices. Google reports 93 percent of users have the latest version of Google Play Services.
Federighi brought up hardware and platform advantages again when he announced a new set of machine learning software tools for developers.
Developers using Apple’s new machine learning tools would be able to run their models with “tremendous performance on-device,” he said, and have access to “all the data privacy benefits and all of the carefully tuned compatibility with all of our platforms.”
What he didn’t seem to emphasize was what made the tools themselves interesting. That’s probably because Apple is known to lag behind Google, Microsoft and others in machine learning.
Apple didn’t publish its first artificial intelligence research paper until December. By comparison, Google had 44 papers accepted this year to the International Conference on Machine Learning while Microsoft had 33, according to a Medium post by a researcher at AI think tank OpenAI.
And Apple’s new machine learning offerings are also not all drawn from Apple technology. “Some of the pre-trained machine learning models that Apple offers are open-sourced Google code, primarily for image recognition,” noted Quartz reporter Dave Gershgorn. Google confirmed this.
The Washington Post is launching an augmented-reality series today, the start of a push into AR-enhanced storytelling this year.
The first series uses AR to let people explore innovative buildings around the world, starting with the Elbphilharmonie concert hall in Hamburg, Germany, whose structure lets visitors hear and see the same thing no matter where they sit. Readers can access the story on the Post’s app on iOS devices, then point their smartphone’s camera at the ceiling of any room they’re in and tap play. The real ceiling is transformed into the concert hall ceiling while an audio narration by Post art and architecture critic Philip Kennicott plays. Users can also tap a prompt to read an accompanying article by Kennicott.
With AR’s obvious application to visual stories, Kennicott said there’s a question of whether AR will replace the need for critics like him. To him, the answer is that AR can enhance, rather than replace the experience, and hence make criticism more interesting and relevant to readers. “It’s a great way to get people a lot more than what they’re getting from a photographer or video,” Kennicott said.
The series will continue with at least two more installments through the end of the summer. The Post hopes to do around six AR series total this year and plans to expand the AR stories to Android and its Rainbow app.
The Post deliberately started small, with the first video in the series only running about 10 seconds, said Joey Marburger, the Post’s head of product. “With that quick experience, you get more out of the story,” he said. “But we didn’t want it to be the only way you can experience the story. We didn’t want to overdo it.”
Audi is sponsoring the series. Its first ad will appear as a visual, and future ads will take the form of AR branded stories in upcoming installments.
AR is still a new experience for most people and requires prompts to get people to try it. It also doesn’t make sense for every story. But the Post made it a priority this year because unlike virtual reality, it’s less expensive, doesn’t require a headset and advertiser demand is there, Marburger said. The series took six people in editorial and engineering to produce, which is comparable to the size of teams it puts on other projects.
Facebook’s Snapchat-style augmented reality face filters are coming to Instagram. Eight different filters will be available starting today, including a few different crowns, ones that make a person look like a koala or a rabbit, and another that sends math equations spinning around your head.
Instagram’s face filters will work whether you’re using the front or the back camera on your phone. You can find them by opening up the camera interface in the app and tapping the new icon in the bottom right corner. The filters can be used in any of Instagram’s shooting modes — photo, video, or even Boomerang. You can access them by downloading the new 10.21 update on the App Store or Google Play Store.
The idea of using augmented reality technology to map and apply animations to a user’s face was popularized by Snapchat, which bought Looksery — a company that pioneered the tech — back in 2015. Facebook responded by snatching up Belarusian startup MSQRD in early 2016, and the tech made its way into Facebook Stories earlier this year.
This is far from the first idea Facebook has lifted from Snap — adding Snapchat’s 24-hour Stories feature to Instagram is the real molten core of this entire drama — but augmented reality face filters were one of the last blockbuster Snapchat features that Instagram was missing. They are also just one small part of the much larger vision Facebook has for augmented reality, which the company laid out in detail at last month’s F8 conference. (Snap, of course, shares a similar vision.)
Instagram is also adding a few other features to the app today. Users will now be able to add hashtag “stickers” to a photo or video when posting it to their Story. Viewers will be able to tap these stickers to explore other media that’s been shared with the same hashtag, the same way you can already tag other users or apply geostickers. A new “rewind” video feature (also “inspired” by Snapchat) and an eraser have been added to the app as well.
Self-driving vehicles have yet to hit the road in a major way, but Amazon already is exploring the technology’s potential to change how your packages are delivered.
Amazon is the nation’s largest online retailer, and its decisions not only turn heads but influence the entire retail and shipping industries, analysts say. That means any foray into the self-driving arena – whether as a developer or customer – could have a significant effect on the technology’s adoption.
Amazon has assigned a dozen employees to determine how it can use the technology as part of its business, the Wall Street Journal reported Monday. It’s unclear what shape Amazon’s efforts will take or how far along they might be, although the company has no plans to create its own vehicles, according to the report.
Nevertheless, the Amazon group offers an early indication that big companies are preparing for the technology’s impact.
Transportation experts anticipate that self-driving cars will fundamentally alter the way people get around and the way companies ship goods, changes that stand to disrupt entire industries and leave millions of professional drivers without jobs. The forthcoming shift has attracted the money and attention of the biggest names in the technology and automotive industries, including Apple, Uber, Google, Ford, General Motors and Tesla, among others.
In particular, the technology could make long-haul shipping cheaper and faster because, unlike human drivers, machines do not command a salary or require down time. That would be important to Amazon, whose shipping costs continue to climb as the company sells more products and ships them faster, according to its annual report. Amazon even invested in its own fleet of trucks in December 2015 to give the company greater control over distribution.
If Amazon adopts self-driving technology, it may push others to do the same.
“When Amazon sneezes, everyone wakes up,” said Satish Jindel, president of SJ Consulting Group, a transportation and logistics advisory firm.
The company said it shipped more than 1 billion items during the 2016 holiday season.
An Amazon spokeswoman declined a request for an interview, citing a “long-standing practice of not commenting on rumors and speculation.” The company’s chief executive, Jeffrey P. Bezos, owns The Washington Post.
Amazon has become something of a pioneer in home delivery, in part by setting the standard for how quickly purchases arrive on your doorstep. The company has begun using aerial drones in an effort to deliver goods more quickly, completing its first successful flight to a customer in the United Kingdom in December. Like self-driving vehicles, drones will need to overcome regulatory hurdles before they’re widely deployed.
In its warehouses, Amazon has used thousands of robots that pull items from shelves and pack them. Last summer, Deutsche Bank analysts found the robots reduced the time to fulfill an order from more than an hour to 15 minutes, according to business news site Quartz. They also saved Amazon about $22 million per warehouse. Amazon acquired Kiva, the company that makes the robots, in 2012 for $775 million.
At this week’s Facebook F8 conference in San Jose, Mark Zuckerberg doubled down on his crazy ambitious 10-year plan for the company, first revealed in April 2016.
Basically, Zuckerberg uses this roadmap to demonstrate Facebook’s three-stage game plan in action: First, you take the time to develop a neat cutting-edge technology. Then you build a product based on it. Then you turn it into an ecosystem where developers and outside companies can use that technology to build their own businesses.
When Zuckerberg first announced this plan last year, it was big on vision, but short on specifics.
On Facebook’s planet of 2026, the entire world has internet access — with many people likely getting it through Internet.org, Facebook’s connectivity arm. Zuckerberg reiterated this week that the company is working on smart glasses that look like your normal everyday Warby Parkers. And underpinning all of this, Facebook is promising artificial intelligence good enough that we can talk to computers as easily as chatting with humans.
For science-fiction lovers, the world Facebook is starting to build is very cool and insanely ambitious. Instead of smartphones, tablets, TVs, or anything else with a screen, all our computing is projected straight into our eyes as we type with our brains.
A mixed-reality world is exciting for society and for Facebook shareholders. But it also opens the door to some crazy future scenarios, where Facebook, or some other tech company, intermediates everything you see, hear, and maybe even think. And as we ponder the implications of that kind of future, consider how fast we’ve already progressed on Zuckerberg’s timeline.
We’re now one year closer to Facebook’s vision for 2026. And things are slowly, but surely, starting to come together, as the social network’s plans for virtual and augmented reality, universal internet connectivity, and artificial intelligence start to slowly move from fantasy into reality.
In fact, Michael Abrash, the chief scientist of Facebook-owned Oculus Research, said this week that we could be just 5 years away from a point where augmented reality glasses become good enough to go mainstream. And Facebook is now developing technology that lets you “type” with your brain, meaning you’d type, point, and click by literally thinking at your smart glasses. Facebook is giving us a glimpse of this with the Camera Effects platform, making your phone into an AR device.
The potential here is tremendous. Remember that Facebook’s mission is all about sharing, and this kind of virtual, ubiquitous “teleportation” and interaction is an immensely powerful means to that end.
This week, Oculus unveiled “Facebook Spaces,” a “social VR” app that lets denizens of virtual reality hang out with each other, even if some people are in the real world and some people have a headset strapped on. It’s slightly creepy, but it’s a sign of the way that Facebook sees you and your friends spending time together in the future.
And if you’re wearing those glasses, there’s no guarantee that the person who’s taking your McDonald’s order is a human, after all. Imagine a virtual avatar sitting at the cash register, projected straight into your eyeballs, and taking your order. With Facebook announcing its plans to revamp its Messenger platform with AI features that also make it more business-friendly, the virtual fast-food cashier is not such a far-fetched scenario.
Sure, Facebook Messenger chatbots have struggled to gain widespread acceptance since they were introduced a year ago. But as demonstrated with Microsoft’s Xiaoice and even the Tay disaster, we’re inching towards more human-like systems that you can just talk to. And if Facebook’s crazy plan to let you “hear” with your skin plays out, they can talk to you while you’re wearing those glasses. And again, you’ll be able to reply with just a thought.
Apple’s iPhone 8 will reportedly include an iPad Pro-like smart connector that may be the link up for augmented reality and virtual reality headsets. The report is tenuous, but the idea that Apple is ready to introduce its augmented reality platform this fall is interesting.
Word of Apple’s plan comes courtesy of the Israeli website The Verifier, which says the smart connector will also be used for charging, sort of like MagSafe for the iPhone. Assuming they’re right, Apple will immediately use the iPhone’s smart connector for more than it has done with the iPad Pro’s. Currently, the only accessory taking advantage of the iPad’s smart connector is the Smart Keyboard cover.
It’s no secret Apple is exploring augmented reality, which overlays data, graphics, and other content onto whatever users are looking at. Google’s first public attempt at grabbing the augmented reality market was Google Glass—high-tech eyeglasses that projected information only the wearer could see.
Google Glass never amounted to more than a public exploration of what’s possible with augmented reality technology in part because convincing people to wear glasses who don’t need them is a hard sell. Apple will likely use the iPhone as its augmented reality platform, just as Facebook just announced it’s doing.
Using smartphones with augmented reality makes sense because users won’t have to buy more equipment to carry around, and the built-in cameras can handle the image and video capture necessary while the phone processors handle the real time overlay of data, all of which displays on the built-in screen.
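The loop described above—camera captures a frame, the processor composites virtual content over it in real time, the screen displays the result—can be sketched in a few lines. This is an illustrative mock, not any vendor’s API: the “camera frame” is a plain pixel array and the “virtual object” is just a colored rectangle composited at a tracked position.

```python
import numpy as np

def overlay_label(frame, top, left, h, w, color=(255, 0, 0)):
    """Composite an opaque colored rectangle (a stand-in for a
    virtual object) onto an RGB camera frame."""
    out = frame.copy()
    out[top:top + h, left:left + w] = color
    return out

# Simulate one 480x640 RGB camera frame (all black).
frame = np.zeros((480, 640, 3), dtype=np.uint8)

# "Augment" it: draw a 40x120 red label where a tracked object sits.
augmented = overlay_label(frame, top=100, left=200, h=40, w=120)

print(augmented[110, 250].tolist())  # pixel inside the overlay
```

A real AR pipeline would repeat this per frame at 30-60 fps, with the overlay position driven by computer-vision tracking rather than fixed coordinates.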
Relying on smartphones means users need their phone in hand to experience augmented reality, which they currently do when playing Pokémon GO—a popular augmented reality game that sends players on hunts in the real world to capture virtual characters.
If Apple wants to make augmented reality feel more immersive, adding in some sort of glasses is the most logical path to take. Glasses as an accessory instead of a requirement means more iPhone owners can try augmented reality without spending extra money, and those who want a deeper experience can buy Apple’s special glasses.
Connecting the glasses to a smart port, however, seems clunky and awkward since there’ll be an extra cable running from the glasses to your iPhone. Instead, Apple could use the smart connector to charge its glasses and go with Bluetooth when they’re in use.
That said, there isn’t much right now to back up the idea of augmented reality glasses for the iPhone 8 yet. The Verifier doesn’t have a history with insider sources, and there aren’t any independent reports echoing what they’re saying.
Bloomberg’s Mark Gurman, for example, has a well-documented track record with Apple product leaks, and his report from earlier this week makes no mention of the smart connector or augmented reality glasses. Until more sources back up this report, we’re remaining skeptical.
LONDON — What will the store of the future look like? Will we be served by fleets of gleaming robots, using built-in facial recognition technology to adjust each sales pitch to a person’s current mood or past spending preferences? Will there be voice-activated personal assistants, downloading the availability, color and fit of any and every garment to your smartphone? Three-D printing stations? No checkout counters when you leave? Could there even be floating, holographic product displays on the shop floor that change when a customer walks by?
Perhaps shoppers will make all their purchases from their own home, using virtual fitting rooms via virtual reality headsets. Drones will then drop deliveries in the backyard or on the front steps.
As fanciful as these innovations may sound, none are hypothetical. All exist, are being tested and could be rolled out in as little as a decade. But is this the sort of shopping experience that customers really want?
Scores of leading retailers and fashion brands increasingly say no. And in an ever-more-volatile and unpredictable shopping environment, where long-term survival is dictated by anticipating and catering to consumers’ desires (often before they themselves even know what they want), the race to find out how and where people will do their spending has started to heat up.
On Wednesday, for example, Farfetch — the global online marketplace for independent luxury boutiques — held a daylong event at the Design Museum in London. There, in front of 200 fashion industry insiders and partners, José Neves, the founder of Farfetch, unveiled “The Store of the Future,” a suite of new technologies developed by his company to help brands and boutiques bridge the worlds of online and offline.
Nevertheless, in a telephone call last week, Mr. Neves said: “I am a huge believer in physical stores. They are not going to vanish and will stay at the center of the seismic retail revolution that is only just getting started.”
A corresponding report released by Bain & Company this week suggests that he might be right; although 70 percent of high-end purchases are influenced by online interactions, the consultancy maintains that stores will continue to play a critical role, with 75 percent of sales still occurring in a physical location by 2025.
What may change, however, is a store’s primary purpose. Forget e-commerce, or bricks and mortar, or even omnichannel sales; according to Mr. Neves, the new retail era is one anchored in “augmented retail,” a blend of the digital and physical allowing a shopper to shift seamlessly between the two realms.
“Customers don’t wake up and think, I will be online this morning or offline later; we are rarely purely one or the other anymore and tend to jump constantly between two worlds without noticing,” Mr. Neves said. “Harnessing this behavior is a major challenge for retailers and brands and why we are doing this event. It is in our interests to give our partners firsthand access to information about changing behaviors and new technology, so everyone is ‘future-proofed’ as to what might come next.”
Holition is an augmented-reality consultancy and software provider based in London that has worked with some well-known retail brands. Last fall it worked with the British cosmetics company Charlotte Tilbury on a “magic mirror” concept, a virtual makeup selling tool that allows users to try on different looks, digitally superimposed onto their faces in 40 seconds. They can then send the selection of photos to their email address, ready to be referred to later or shared socially. And they can then buy the products, available from glamorous makeup artists milling around nearby.
“Technology is still often a barrier in the retail place, with smartphones, iPads and screens getting in the way of what the consumer wants to see, touch and feel 80 percent of the time,” said Jonathan Chippindale, Holition’s chief executive.
“The holy grail now for retailers is creating digital empathy. No one can really guess what the future will look like. But those who are using technology and data to create bespoke shopping experiences that recognize every person is different, and with different needs, are more likely to come out on top.”
Tom Chapman, a founder of MatchesFashion.com, agreed. It was originally a bricks and mortar boutique; now 95 percent of the British fashion retailer’s sales — which hit 204 million pounds (about $253 million) in 2016 — are online. But Mr. Chapman said boutiques and physical events remained vital “marketing opportunities,” with a more specialized inventory selection and the opportunity for customers to do more than buy merchandise; for example, the MatchesFashion.com “In Residence” series offers talks, film screenings and designer meet-and-greets, along with social media lessons, exercise classes and floristry sessions.
“You need to be accessible to your customer wherever she wants to find you,” Mr. Chapman said, “and we have seen that a sizable proportion want human interaction and access that goes far beyond a credit card transaction.”
At the moment augmented reality is still a clunky, phone-in-front-of-your-face experience, but today Facebook promised to transform that experience when it announced a new AR platform. And you can be sure that quite a few folks at Apple were watching with great interest the live stream of the F8 developers conference in San Jose, California, where the new platform was introduced with plenty of striking visual aids.
Onstage at the conference, Facebook founder and CEO Mark Zuckerberg and some of his top aides energetically kicked off what is sure to be a new platform war—one that will soon be joined by Apple and plenty of other competitors.
“This space will get crowded though, and I expect both Apple and Google to also bring these functionalities to their camera apps and offer developers the chance to build AR experiences on their platforms as well,” Creative Strategies analyst Ben Bajarin told Fast Company in an email.
Facebook made a strategic decision to be the first tech giant to launch an AR platform when the hardware and technology is still in its nascent stage. In Facebook’s version of AR, a user will hold their phone in front of their face and watch as all kinds of moving imagery and information is superimposed over the picture of the real world seen by the camera. Yep, kinda like Pokémon Go.
Facebook said it will provide developers with precise location, object recognition, and 3D effects tools they need to start building their own custom AR experiences. That could mean anything from AR games to a retailer placing a data card over a product in front of the camera.
“Facebook is smart to give developers the tools to build AR experiences and give those tools to a wider developer community,” Bajarin says. “They are approaching this as a true platform play, which is smart.”
It seems likely that Apple will have to weigh in with its own AR platform sooner or later—I’m guessing sooner. Two media outlets have already reported that Apple is working on prototypes of some kind of AR glasses. This is unsurprising; the company has likely been working on glasses for a couple of years. The press reports say the company already has “hundreds” of engineers working on its own AR effort.
Apple isn’t known for being first in emerging technologies, preferring to hang back and drop in when it knows it can deliver a product or service better than everybody else. Entering the market with a cool-looking and high-functioning set of AR glasses would be one way to outshine the competing platforms. But Apple may not be able to wait until the core technologies needed for such glasses become available at mass market levels.
In the near term, Apple might begin adding new camera technology to the iPhone that can support AR apps, both its own and those from third-party developers. Fast Company previously reported that Apple will source 3D camera technology from Lumentum for its 10th anniversary iPhone 8, which will likely be announced this fall.
Facebook’s AR announcement today may end up serving as the ideal opening act for Apple, says Technalysis Research president Bob O’Donnell. Facebook’s experience in its current form is built to work with the camera on the user’s phone, he says. “So Facebook announces a 2D Snapchat-like experience with photo filters and calls it AR.” Facebook will have introduced the AR concept to a mass audience with an experience that Apple can best by leveraging a true 3D camera in the next iPhone (or, possibly, with the dual cameras in the iPhone 7 Plus), says O’Donnell.
Zuckerberg may also have lowered expectations around AR—which, after all the overly hyped Magic Leap coverage, has led many people to anticipate a headset-wearable experience. “So when Apple announces an AR experience that works on the phone instead of an AR headset, it won’t be as much of a disappointment,” O’Donnell says.
Creative Strategies’ Bajarin believes that Facebook and Google are more capable than Apple at delivering augmented reality content from the cloud, which could be a big advantage. He also believes Facebook and Google have the edge in the machine learning needed to help identify objects and people in the real world, a key function in AR apps. And let’s not forget Microsoft, which arguably has more experience in the space than anyone, after having developed the first major AR headset with the HoloLens.
Apple’s advantage lies in its ability to control far more of the hardware and software stack underpinning the AR experience—the apps, the phone, and the OS. Apple may begin building AR features deep in iOS, so that it could begin to “augment” lots of different aspects of the phone experience, like Siri does. By building AR into the OS, Apple might enable a richer experience that’s optimized for the iPhone.
It’s even possible that Apple could begin talking about its AR strategy as soon as its upcoming developer conference, WWDC, in June. Since Apple likely wants to speak to developers en masse and in person about what could become a major new platform, it may choose to at least start the conversation, as opposed to waiting until next year’s conference.
If it does, it will likely lay out a wade-into-the-water approach that will look something like what Facebook announced today. That is, the AR experience will be seen through the display of an iPhone, not through a face wearable. But that might be just the thing to get developers and consumers used to the concept, without getting too exotic with the hardware.
Whatever the shape of the product, Apple will try to make a grand entrance into the AR market. It won’t be first in, so it will have to infuse the technology with simplicity, style, and a wow factor that turns consumers on and makes the competing platforms seem inferior.
Facebook today announced a platform for developers to build new experiences into its in-app cameras, saying it would bring augmented reality into the mainstream and position Facebook to reap the majority of the benefits. Speaking on stage at Facebook’s F8 developer conference, CEO Mark Zuckerberg said that AR would be the next major platform for computing. A closed beta that opens today will let developers begin experimenting with photo and video filters, games, art projects, and more.
In a demonstration, Zuckerberg showed a variety of dazzling camera effects. Swiping to the stories camera that Facebook introduced last month, users will soon find thousands of augmented reality effects, he said. These go beyond the art frames and face filters of today to include three-dimensional text and images. In one demo, giant puffy words reading “It’s feeding time” rose out of a breakfast table, where a series of sharks swam around a cereal bowl.
In another demo, Facebook’s camera turned a two-dimensional photo into a 3D scene: a mundane picture of an office with chairs transformed in several ways, appearing to fill up with water, or bouncy balls, and even Skittles. “Because the future is delicious,” Zuckerberg said. (Hello advertisers!) The camera platform will launch with just six partners, Zuckerberg told Recode.
Facebook’s camera will use object recognition to suggest effects based on the object. Tap on a coffee cup, for example, and you’ll be able to add steam. Or tap a wine bottle and add a card showing the vintage, and presumably, a link to buy it yourself. “Some of these effects are going to be fun,” Zuckerberg said. “Others are going to be useful.”
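Conceptually, the tap-to-suggest behavior Zuckerberg describes reduces to a lookup from a recognized object label to candidate effects. A real system would run a vision model to produce the label first; the sketch below fakes that step with hard-coded labels, and all the label and effect names are invented for illustration.

```python
# Hypothetical mapping from recognized object labels to AR effects.
EFFECT_SUGGESTIONS = {
    "coffee_cup": ["steam"],
    "wine_bottle": ["vintage_card", "buy_link"],
}

def suggest_effects(label):
    """Return candidate AR effects for a recognized object,
    falling back to a generic frame when nothing matches."""
    return EFFECT_SUGGESTIONS.get(label, ["generic_frame"])

print(suggest_effects("coffee_cup"))   # ['steam']
print(suggest_effects("sneaker"))      # ['generic_frame']
```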
The addition of face filters and 3D effects into camera apps was pioneered by Snapchat, which Facebook has spent the past year disassembling and integrating into its suite of apps. Just today, Snapchat introduced world lenses, which project similar 3D images into the world around you.
Zuckerberg acknowledged Facebook had been somewhat late to the AR party. But he said the company’s object recognition and machine learning technology would give it an advantage over its rivals. “Even though we were a little slow to add cameras to all our apps, I’m confident that now we’re going to push this augmented reality camera forward,” he said.
Virtual and augmented reality are about much more than just gaming. The trend is expected to grow rapidly and reach into almost every part of our physical lives, including fashion.
From notebooks to calendars, alarm clocks and cameras, objects that were once integral have all but disappeared into the digital landscape. Could our wardrobes be about to follow suit?
It has even been suggested that technology could become so advanced and bizarre that we end up renting virtual clothes and jewelry for our augmented reality worlds.
“All those things we thought essential materially, disappeared into the virtual environment,” Jody Medich, director of design at Singularity University, told WIRED Retail.
“They have all gone into the screen – but in the future, we are going to be looking through that screen.”
Medich believes that augmented reality will be ubiquitous in just five years, delivered through our phones, headsets, or AR contact lenses. While that might sound a little hasty to some of you, it’s more realistic than you might think.
As early as 2014, VR was making huge waves in the fashion world when London Fashion Week offered users a front-row view of the catwalk with a 360 degree stream. More recently, Balenciaga’s Autumn Winter 2016 show was broadcast in virtual reality, while Hussein Chalayan released a panoramic video of his show; heck even Dior has its own VR headset.
“This technology will be ubiquitous – it won’t affect little bits of our lives, it’s going to affect every aspect of our lives. All those activities we do on our second screens are going to change in a radical way,” says Medich.
“Instead of looking into a screen, we will be looking through the screen. When we do that, magical things will happen.”
People love the convenience of online shopping but it’s difficult to know exactly what you’re getting or how well it’s going to fit. Could AR be the solution?
Medich certainly seems to think so, referencing furniture retailer Wayfair, which used Google’s Tango technology to build an app that can measure your environment.
“Tango knows the dimensions and can see the decor of my space, then suggest what types of end tables I will want. Then I can just put it there and see what it looks like.”
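Once a device can measure a room, the suggestion step Medich describes is essentially a filter over a catalog: keep only items whose footprint fits the measured space. A toy version, with an invented catalog and dimensions:

```python
# Invented catalog of end tables: (name, width_cm, depth_cm).
CATALOG = [
    ("narrow end table", 40, 40),
    ("wide console table", 120, 45),
    ("corner table", 55, 55),
]

def tables_that_fit(space_w, space_d):
    """Return the names of catalog items whose footprint fits
    within the measured space."""
    return [name for name, w, d in CATALOG
            if w <= space_w and d <= space_d]

print(tables_that_fit(60, 60))  # ['narrow end table', 'corner table']
```

A production system would add style matching and 3D placement, but the core of “suggest what fits my space” is this kind of constraint filter over measured dimensions.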
Imagine this but in a virtual retail world where you can be matched with products specific to your shape and personal style or where your bedroom becomes a fitting room and builds outfits based on exactly what’s in your wardrobe.
Why stop there? What about personal AI stylists or the ability to rent high-end clothing and jewellery that can only be viewed through AR lenses? If Medich is right, the accessibility of luxury fashion could soon be available to every consumer.
Of course, how and to what extent the medium will impact the fashion industry is yet to be determined but nonetheless it comes with endless possibilities.
Sephora continues to be one of the most aggressive companies among beauty brands (and retailers in general) in its readiness to add new features to its mobile app. The retailer has added to its list of Virtual Artist offerings multiple times within the last year, and has been busily extending these capabilities to its chatbot on Facebook Messenger as well. It seems like the company is intent on enabling in its mobile app just about everything you might do at the beauty counter in a Sephora store.
Augmented reality technology is what makes much of that possible. It allows Sephora customers to “try on” makeup and cosmetics the same way they might in a store, by matching products with their facial type or features, or even overlaying them on images of their own faces. It’s a mobile app tool that is not just useful for beauty brands, either — Gap said at CES 2017 in January that it’s testing a DressingRoom app that would use the same sort of augmented reality technology to allow customers to virtually try on clothes.
Retailers that thrive on try-on traffic — people visiting their brick-and-mortar stores because they want to try something on in person before buying — have some interesting decisions to make about how far they want to go with enabling these realistic try-on capabilities via mobile and social media. Could the richly featured apps keep people away from stores, and from additional selling opportunities for the retailers? Or could the apps drive customers to buy more from retailers they feel are serving them well?
Sephora seems to be pretty invested in the latter outcome, and is doing everything it can to make it happen. It’s in a very competitive space, as fellow beauty retailer Ulta Beauty is succeeding on the e-commerce front while also continuing to open new stores. Macy’s also appears to be working harder to leverage Bluemercury, the beauty chain it acquired in 2015. With beauty being such a battleground, Sephora will need to keep its Virtual Artist well-armed.
Apple is ramping up its plans to pursue augmented reality, according to a new Bloomberg report, which provides some detail on how the company may incorporate the technology into the iPhone.
The Cupertino, Calif.-based company is said to be exploring features that would allow iPhone owners to change the depth of photographs taken with the phone’s camera after they’ve been captured. Another potential implementation would let users isolate a specific object in an image, such as a person’s head, and tilt it 180 degrees, the report says. Apple is also developing a feature that would make it possible to overlay virtual effects and objects onto a photo, similar to Snapchat.
The report cautions that Apple may not roll out these features, but that the company has iPhone camera engineers working on them. Apple’s iPhone 7 Plus has two cameras, one specifically tailored for better zooming, which allows the phone to shoot images that sharpen a subject against a slightly blurred background.
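Changing depth of field after capture is possible whenever a per-pixel depth map is stored alongside the image: pixels near the chosen focal plane stay sharp, and the rest are blurred. A minimal illustration of that idea, using a flat average as a crude stand-in for the graduated, lens-accurate blur a real implementation would apply:

```python
import numpy as np

def refocus(image, depth, focal_depth, tolerance=0.5):
    """Keep pixels whose depth is within `tolerance` of
    `focal_depth` sharp; replace the rest with the image mean
    as a crude stand-in for background blur."""
    image = image.astype(float)
    blurred = np.full_like(image, image.mean())   # crude "blur"
    in_focus = np.abs(depth - focal_depth) <= tolerance
    return np.where(in_focus, image, blurred)

img = np.array([[10.0, 200.0], [10.0, 200.0]])    # tiny grayscale image
dep = np.array([[1.0, 3.0], [1.0, 3.0]])          # per-pixel depth (m)

# Refocus on the near plane: the far column gets "blurred".
out = refocus(img, dep, focal_depth=1.0)
print(out.tolist())
```

The key point is that `focal_depth` is a free parameter chosen after the shot, which is exactly what post-capture refocusing means.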
Apple has reportedly assembled a team of experts specializing in augmented reality technology, camera sensors, 3D video production, and wearable tech hardware. This group reportedly includes Mike Rockwell, who previously ran the hardware and new technologies team at Dolby, as well as Cody White, who was formerly a lead engineer for Amazon’s Lumberyard VR platform. Last year, The Financial Times also reported that Apple was building a secret team dedicated to developing virtual and augmented reality products.
The iPhone maker is also said to be working on a pair of augmented reality glasses, as Bloomberg previously reported. This new headset combined with the upcoming iPhone camera features could put Apple in direct competition with Snap Inc., which released its first pair of glasses in September.
Apple CEO Tim Cook has previously said that augmented reality is an area of interest for the company, although he hasn’t discussed plans for future products. “I think AR is extremely interesting and sort of a core technology,” Cook told The Washington Post in August. “So, yes, it’s something we’re doing a lot of things on behind that curtain we talked about.”