Tag Archives: augmented reality

Apple’s AR is closer to reality than Google’s

Apple has often been accused of acting like it invented things that others have been doing for years. That complaint is not without merit, but Apple can lay claim to transforming existing technologies into mainstream successes, which takes no small amount of invention in its own right. Fingerprint authentication and contactless payments are just two recent examples; both existed in Japan and on niche devices for over a decade before Apple raised them to global prominence with the iPhone.

Next up on Apple’s agenda is augmented reality, the act of superimposing digital data and visuals atop a live video feed of your surroundings — something that Google, Microsoft, and many others have been experimenting with for a long time. Apple is far from being able to claim it invented AR, but its new ARKit in iOS 11 is already showing signs to suggest that Apple will help bring AR into the mainstream faster and better than anyone else.

The chronic problem with augmented reality has always been one of practicality. You could have the most basic forms of AR on your regular phone, as provided by apps like Layar, which has been around since 2009, but those have never been particularly compelling. Or you could have more sophisticated and appealing augmentations, as presented by Google’s Tango project, but you’d need a big fat phablet to lug around to make them happen. Apple’s difference is to combine the convenience of your daily phone with the appeal of advanced AR.

Looking at this distance-measuring app, it seems so simple and obvious. Of course your super-powered, multi-core phone should be smart enough to measure out basic distances, and there have indeed been many wonky apps trying to do that in the past. But measuring with AR, as already shown off by Google Tango phones, allows you a much more intuitive method for doing it. Having the phone actually aware of the three-dimensional space in its view allows for precise measurements, which can be represented with a neat hologram of a measuring tape. Apple’s advantage in the contest for doing this best is simple: while Google Tango demands special hardware, ARKit requires only that you have a recent iOS device. At WWDC earlier this month, Craig Federighi described ARKit as “the largest AR platform in the world,” and he was right.
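The core arithmetic behind such a measuring app is simple: once the framework resolves two screen taps into world-space coordinates (by hit-testing against a detected plane), the measurement is just the straight-line distance between those two 3-D points. A minimal sketch of that calculation, with hypothetical point values standing in for real hit-test results:

```python
import math

def distance_between(p1, p2):
    """Euclidean distance between two 3-D points, in whatever
    real-world units the AR session reports (ARKit uses meters)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

# Hypothetical world-space points returned by tapping the screen twice
# and hit-testing against a detected horizontal plane (meters).
corner_a = (0.00, 0.00, -0.50)
corner_b = (0.80, 0.00, -0.50)

print(f"Measured length: {distance_between(corner_a, corner_b):.2f} m")
```

The hard part, of course, is everything the framework does first: tracking the device's pose and reconstructing the plane so those coordinates exist at all.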

Apple’s AR will immediately reach millions of people who already have the requisite hardware. And while it looks to be functionally as flexible and capable as Google’s Tango (check out some early examples of fanciful experiments with ARKit), its broader audience makes it much more enticing for serious developers to invest their time and money into. Google’s Tango is about the future whereas Apple’s ARKit is about the present.

Considering how little time it took to develop two convincingly accurate AR measuring apps with the iOS 11 beta, and reading the comments from their makers, Apple also appears to have an advantage in the ease of development with ARKit. It’s exciting to think that there are still three months before the release of the next iPhone and the accompanying finalization of iOS 11, by which time Apple’s big-budget app developer partners are likely to have a deluge of AR-enabled apps for people to play with. That’s how stuff goes mainstream: as a big wave of change that touches everyone from casual Pokémon Go players to serious home DIY geeks figuring out how to arrange their living room furniture.

For the people who don’t care about incremental changes in phone specs or design, the differentiator between devices has always been in the unique things that each one can do — or, failing that, the convenience and ease of use of common features. Apple’s iPhone is more convenient than Google’s Project Tango devices and with iOS 11 it’ll have much better AR capabilities than its nearest premium Android rivals. So if we’re looking for the AR innovator that will take the technology into the mainstream, Apple once again looks like the likeliest suspect.

Source:

https://www.theverge.com/2017/6/26/15872332/apple-arkit-ios-11-augmented-reality-developer-excitement

Apple may eventually launch ‘iGlass’ smart glasses for augmented reality

Apple may leverage augmented reality on the iPhone to help pave the way for a future smart glasses product, UBS said in a note to investors Tuesday.

Apple recently launched its ARKit developer tools, which will allow its partners to build new augmented reality applications for millions of iPhones already in the hands of consumers. It will give Apple an overnight leg up on companies like Google that are participating in the space on a much smaller scale.

Apple hasn’t participated in the smart glasses space yet, but the idea is that a user will be able to wear a special pair of glasses that overlay computer images on the real world. You might learn more about a restaurant, perhaps view its menu, by standing in front of it, for example.

Right now, companies like Apple and Google would be forced to create bulky glasses that wouldn’t be feasible or comfortable to wear. UBS believes Apple could use AR-ready iPhones to power the experience.

“Advanced sensors and camera capabilities will enhance the iPhone; eventually there could be independent hardware offerings, perhaps iGlass,” UBS analyst Steven Milunovich said. “We can imagine a pair of glasses with quintessential Apple design (iGlass), which enable a HoloLens-type experience,” the firm added, referring to Microsoft’s bulky alternative.

“However, the amount of compute power and sensors required likely pose a serious design challenge. If Apple could find a way to send massive amounts of data from the eyeglasses to the iPhone where the bulk of the compute would occur, the eyewear could have a more attractive design. The issue then becomes how to transfer massive amounts of complex data between devices quickly.”
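A back-of-envelope calculation shows why that data-transfer problem is serious. The figures below are illustrative assumptions (resolution, frame rate, and camera count are not from the UBS note), but they convey the order of magnitude:

```python
# Rough estimate of the raw sensor bandwidth a pair of AR glasses
# might need to stream to a phone. All specs are illustrative
# assumptions, not figures from the UBS note.
width, height = 1920, 1080      # per-camera resolution (pixels)
fps = 60                        # frames per second
cameras = 2                     # stereo pair, needed for depth
bytes_per_pixel = 3             # 24-bit RGB, uncompressed

bytes_per_second = width * height * fps * cameras * bytes_per_pixel
gbits_per_second = bytes_per_second * 8 / 1e9

print(f"Uncompressed stereo video: {gbits_per_second:.1f} Gbit/s")
```

Roughly 6 Gbit/s of raw video is thousands of times what Bluetooth can carry, so either the glasses compress heavily on-board (which costs power and adds bulk) or a much faster wireless link is needed, which is exactly the design tension Milunovich describes.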

Milunovich laid out 10 AR use cases ranging from games and entertainment to home improvement and health care/medical diagnostics. He said AR will help Apple retain iPhone users.

Source:

http://www.cnbc.com/2017/06/20/apple-smart-glasses-for-augmented-reality-could-leverage-iphone-power-ubs-says.html

Apple’s advances in augmented reality highlight its real advantage over Google

For all the times Apple executives mentioned “machine learning” at the company’s developers conference Monday, they also emphasized an older theme that could be more important: unified software.

Apple’s mobile operating system, or iOS, works the same and runs the same on every iPhone. That’s why app developers often prefer building for Apple before they build for Google’s Android, an operating system that’s splintered across different kinds of hardware. While the latest versions of Android consistently run on higher-end phones like the Google Pixel, cheaper and older phones often only run previous versions.

Apple’s control over both its operating system and the hardware on which that software runs came up throughout the keynote. It highlights what Google really needs to worry about when competing with Apple: the fragmentation of Android. This matters even when Google has a technical advantage.

Apple’s latest announcements in augmented reality are where the company’s advantage with a unified platform showed the most.

The ability to quickly deploy software to devices immediately differentiates Apple’s AR from Google’s AR computing platform, Tango, which runs only on select newer-model phones.

“When you bring the software together with these devices, we actually have hundreds of millions of iPhones and iPads that are going to be capable of AR,” Apple’s head of software, Craig Federighi, said. “That’s going to make ARKit, overnight, the largest AR platform in the world.”

In general, only a portion of Android phones tend to run the latest version of the mobile software at any given time. While Google builds its own hardware now — such as the Pixel phone — it still relies largely on third-party manufacturers and carriers to deliver Android updates to hardware that runs it.

Update: Google said it has made efforts to lessen the fragmentation of software across Android devices, including rolling out a program to make it easier for device makers to upgrade to newer versions of Android, and ensuring software in Google Play Services is updated frequently across devices. Google reports 93 percent of users have the latest version of Google Play Services.

Federighi brought up hardware and platform advantages again when he announced a new set of machine learning software tools for developers.

Developers using Apple’s new machine learning tools will be able to run their models with “tremendous performance on-device,” he said, and have access to “all the data privacy benefits and all of the carefully tuned compatibility with all of our platforms.”

What he didn’t seem to emphasize was what made the tools themselves interesting. That’s probably because Apple is known to lag behind Google, Microsoft and others in machine learning.

Apple didn’t publish its first artificial intelligence research paper until December. By comparison, Google had 44 papers accepted this year to the International Conference on Machine Learning, while Microsoft had 33, according to a Medium post by a researcher at the AI think tank OpenAI.

And Apple’s new machine learning offerings are also not all drawn from Apple technology. “Some of the pre-trained machine learning models that Apple offers are open-sourced Google code, primarily for image recognition,” noted Quartz reporter Dave Gershgorn. Google confirmed this.

Source:

https://www.recode.net/2017/6/7/15749860/apple-ar-toolkit-highlights-ios-advantages-unified-software

The Washington Post is diving into augmented reality

The Washington Post is launching an augmented-reality series today, the start of a push into AR-enhanced storytelling this year.

The first series uses AR to let people explore innovative buildings around the world, starting with the Elbphilharmonie concert hall in Hamburg, Germany, whose structure lets visitors hear and see the same thing no matter where they sit. Readers can access the story on the Post’s app on iOS devices, then point their smartphone’s camera at the ceiling of any room they’re in and tap play. The real ceiling is transformed into the concert hall ceiling while an audio narration by Post art and architecture critic Philip Kennicott plays. Users can also tap a prompt to read an accompanying article by Kennicott.

With AR’s obvious application to visual stories, Kennicott said there’s a question of whether AR will replace the need for critics like him. To him, the answer is that AR can enhance, rather than replace, the experience, and hence make criticism more interesting and relevant to readers. “It’s a great way to get people a lot more than what they’re getting from a photographer or video,” Kennicott said.

The series will continue with at least two more installments through the end of the summer. The Post hopes to do around six AR series total this year and plans to expand the AR stories to Android and its Rainbow app.

The Post deliberately started small, with the first video in the series only running about 10 seconds, said Joey Marburger, the Post’s head of product. “With that quick experience, you get more out of the story,” he said. “But we didn’t want it to be the only way you can experience the story. We didn’t want to overdo it.”

Audi is sponsoring the series. Its first ad will appear as a visual, and future ads will take the form of AR branded stories in upcoming installments.

AR is still a new experience for most people and requires prompts to get people to try it. It also doesn’t make sense for every story. But the Post made it a priority this year because, unlike virtual reality, it’s less expensive, doesn’t require a headset, and advertiser demand is there, Marburger said. The series took six people in editorial and engineering to produce, comparable to the size of teams the Post puts on other projects.

Source:

https://digiday.com/media/washington-post-diving-augmented-reality/

Instagram adds augmented reality face filters

Facebook’s Snapchat-style augmented reality face filters are coming to Instagram. Eight different filters will be available starting today, including a few different crowns, ones that make a person look like a koala or a rabbit, and another that sends math equations spinning around your head.

Instagram’s face filters will work whether you’re using the front or the back camera on your phone. You can find them by opening up the camera interface in the app and tapping the new icon in the bottom right corner. The filters can be used in any of Instagram’s shooting modes — photo, video, or even Boomerang. You can access them by downloading the new 10.21 update on the App Store or Google Play Store.

The idea of using augmented reality technology to map and apply animations to a user’s face was popularized by Snapchat, which bought Looksery — a company that pioneered the tech — back in 2015. Facebook responded by snatching up Belarusian startup MSQRD in early 2016, and the tech made its way into Facebook Stories earlier this year.

This is far from the first idea Facebook has lifted from Snap — adding Snapchat’s 24-hour Stories feature to Instagram is the real molten core of this entire drama — but augmented reality face filters were one of the last blockbuster Snapchat features that Instagram was missing. They are also just one small part of the much larger vision Facebook has for augmented reality, which the company laid out in detail at last month’s F8 conference. (Snap, of course, shares a similar vision.)

Instagram is also adding a few other features to the app today. Users will now be able to add hashtag “stickers” to a photo or video when posting it to their Story. Viewers will be able to tap these stickers to explore other media that’s been shared with the same hashtag, the same way you can already tag other users or apply geostickers. A new “rewind” video feature (also “inspired” by Snapchat) and an eraser have been added to the app as well.

Source:

https://www.theverge.com/2017/5/16/15643062/instagram-face-filters-snapchat-facebook-features

Why Amazon’s use of self-driving technology would be a game changer

Self-driving vehicles have yet to hit the road in a major way, but Amazon already is exploring the technology’s potential to change how your packages are delivered.

Amazon is the nation’s largest online retailer, and its decisions not only turn heads but influence the entire retail and shipping industries, analysts say. That means any foray into the self-driving arena – whether as a developer or customer – could have a significant effect on the technology’s adoption.

Amazon has assigned a dozen employees to determine how it can use the technology as part of its business, the Wall Street Journal reported Monday. It’s unclear what shape Amazon’s efforts will take or how far along they might be, although the company has no plans to create its own vehicles, according to the report.

Nevertheless, the Amazon group offers an early indication that big companies are preparing for the technology’s impact.

Transportation experts anticipate that self-driving cars will fundamentally alter the way people get around and the way companies ship goods, changes that stand to disrupt entire industries and leave millions of professional drivers without jobs. The forthcoming shift has attracted the money and attention of the biggest names in the technology and automotive industries, including Apple, Uber, Google, Ford, General Motors and Tesla, among others.

In particular, the technology could make long-haul shipping cheaper and faster because, unlike human drivers, machines do not command a salary or require downtime. That would be important to Amazon, whose shipping costs continue to climb as the company sells more products and ships them faster, according to its annual report. Amazon even invested in its own fleet of trucks in December 2015 to give the company greater control over distribution.

If Amazon adopts self-driving technology, it may push others to do the same.

“When Amazon sneezes, everyone wakes up,” said Satish Jindel, president of SJ Consulting Group, a transportation and logistics advisory firm.

The company said it shipped more than 1 billion items during the 2016 holiday season.

An Amazon spokeswoman declined a request for an interview, citing a “long-standing practice of not commenting on rumors and speculation.” The company’s chief executive, Jeffrey P. Bezos, owns The Washington Post.

Amazon has become something of a pioneer in home delivery, in part by setting the standard for how quickly purchases arrive on your doorstep. The company has begun using aerial drones in an effort to deliver goods more quickly, completing its first successful flight to a customer in the United Kingdom in December. Like self-driving vehicles, drones will need to overcome regulatory hurdles before they’re widely deployed.

In its warehouses, Amazon has used thousands of robots that pull items from shelves and pack them. Last summer, Deutsche Bank analysts found the robots reduced the time to fulfill an order from more than an hour to 15 minutes, according to business news site Quartz. They also saved Amazon about $22 million per warehouse. Amazon acquired Kiva, the company that makes the robots, in 2012 for $775 million.

Source:

http://www.denverpost.com/2017/04/26/amazon-self-driving-delivery/

The smartphone is eventually going to die — this is Mark Zuckerberg’s crazy vision for what comes next

At this week’s Facebook F8 conference in San Jose, Mark Zuckerberg doubled down on his crazy ambitious 10-year plan for the company, first revealed in April 2016.

Basically, Zuckerberg uses this roadmap to demonstrate Facebook’s three-stage game plan in action: First, you take the time to develop a neat cutting-edge technology. Then you build a product based on it. Then you turn it into an ecosystem where developers and outside companies can use that technology to build their own businesses.

When Zuckerberg first announced this plan last year, it was big on vision, but short on specifics.

On Facebook’s planet of 2026, the entire world has internet access — with many people likely getting it through Internet.org, Facebook’s connectivity arm. Zuckerberg reiterated this week that the company is working on smart glasses that look like your normal everyday Warby Parkers. And underpinning all of this, Facebook is promising artificial intelligence good enough that we can talk to computers as easily as chatting with humans.

A world without screens

For science-fiction lovers, the world Facebook is starting to build is very cool and insanely ambitious. Instead of smartphones, tablets, TVs, or anything else with a screen, all our computing is projected straight into our eyes as we type with our brains.

A mixed-reality world is exciting for society and for Facebook shareholders. But it also opens the door to some crazy future scenarios, where Facebook, or some other tech company, intermediates everything you see, hear, and maybe even think. And as we ponder the implications of that kind of future, consider how fast we’ve already progressed on Zuckerberg’s timeline.

We’re now one year closer to Facebook’s vision for 2026. And things are slowly, but surely, starting to come together, as the social network’s plans for virtual and augmented reality, universal internet connectivity, and artificial intelligence start to slowly move from fantasy into reality.

In fact, Michael Abrash, the chief scientist of Facebook-owned Oculus Research, said this week that we could be just five years from the point where augmented reality glasses become good enough to go mainstream. And Facebook is now developing technology that lets you “type” with your brain, meaning you’d type, point, and click by literally thinking at your smart glasses. Facebook is giving us a glimpse of this with the Camera Effects platform, which turns your phone into an AR device.

Fries with that?

The potential here is tremendous. Remember that Facebook’s mission is all about sharing, and this kind of virtual, ubiquitous “teleportation” and interaction is an immensely powerful means to that end.

This week, Oculus unveiled “Facebook Spaces,” a “social VR” app that lets denizens of virtual reality hang out with each other, even if some people are in the real world and some people have a headset strapped on. It’s slightly creepy, but it’s a sign of the way that Facebook sees you and your friends spending time together in the future. 

And if you’re wearing those glasses, there’s no guarantee that the person who’s taking your McDonald’s order is a human, after all. Imagine a virtual avatar sitting at the cash register, projected straight into your eyeballs, and taking your order. With Facebook announcing its plans to revamp its Messenger platform with AI features that also make it more business-friendly, the virtual fast-food cashier is not such a far-fetched scenario.

Sure, Facebook Messenger chatbots have struggled to gain widespread acceptance since they were introduced a year ago. But as demonstrated with Microsoft’s Xiaoice and even the Tay disaster, we’re inching towards more human-like systems that you can just talk to. And if Facebook’s crazy plan to let you “hear” with your skin plays out, they can talk to you while you’re wearing those glasses. And again, you’ll be able to reply with just a thought.

Source:

http://www.businessinsider.com/facebook-f8-mark-zuckerberg-augmented-reality-2026-2017-4

Apple’s Augmented Reality Plans May Include iPhone 8 Smart Connector for Special Glasses

Apple’s iPhone 8 will reportedly include an iPad Pro-like smart connector that may be the link up for augmented reality and virtual reality headsets. The report is tenuous, but the idea that Apple is ready to introduce its augmented reality platform this fall is interesting.

Word of Apple’s plan comes courtesy of the Israeli website The Verifier, which says the smart connector will also be used for charging, somewhat like a MagSafe for the iPhone. Assuming the report is right, Apple will immediately use the iPhone’s smart connector for more than it has done with the iPad Pro’s. Currently, the only accessory taking advantage of the iPad’s smart connector is the Smart Keyboard cover.

It’s no secret Apple is exploring augmented reality, which overlays data, graphics, and other content onto whatever users are looking at. Google’s first public attempt at grabbing the augmented reality market was Google Glass—high-tech eyeglasses that projected information only the wearer could see.

Google Glass never amounted to more than a public exploration of what’s possible with augmented reality technology, in part because convincing people who don’t need glasses to wear them is a hard sell. Apple will likely use the iPhone as its augmented reality platform, much as Facebook recently announced it is doing.

Using smartphones for augmented reality makes sense because users won’t have to buy more equipment to carry around: the built-in cameras can handle the necessary image and video capture, the phone’s processors handle the real-time overlay of data, and everything displays on the built-in screen.

Relying on smartphones means users need their phone in hand to experience augmented reality, which they currently do when playing Pokémon GO—a popular augmented reality game that sends players on hunts in the real world to capture virtual characters.

Smartphones and Augmented Reality

If Apple wants to make augmented reality feel more immersive, adding in some sort of glasses is the most logical path to take. Glasses as an accessory instead of a requirement means more iPhone owners can try augmented reality without spending extra money, and those who want a deeper experience can buy Apple’s special glasses.

Connecting the glasses to a smart port, however, seems clunky and awkward since there’ll be an extra cable running from the glasses to your iPhone. Instead, Apple could use the smart connector to charge its glasses and go with Bluetooth when they’re in use.

That said, there isn’t much to back up the idea of augmented reality glasses for the iPhone 8 yet. The Verifier doesn’t have a history with insider sources, and there aren’t any independent reports echoing its claims.

Bloomberg’s Mark Gurman, for example, has a well-documented track record with Apple product leaks, and his report from earlier this week makes no mention of the smart connector or augmented reality glasses. Until more sources back up this report, we’re remaining skeptical.

Source:

https://www.macobserver.com/news/iphone-8-augmented-reality-smart-connector/

Imagining the Retail Store of the Future

LONDON — What will the store of the future look like? Will we be served by fleets of gleaming robots, using built-in facial recognition technology to adjust each sales pitch to a person’s current mood or past spending preferences? Will there be voice-activated personal assistants, downloading the availability, color and fit of any and every garment to your smartphone? Three-D printing stations? No checkout counters when you leave? Could there even be floating, holographic product displays on the shop floor that change when a customer walks by?

Perhaps shoppers will make all their purchases from their own home, using virtual fitting rooms via virtual reality headsets. Drones will then drop deliveries in the backyard or on the front steps.

As fanciful as these innovations may sound, none are hypothetical. All exist, are being tested and could be rolled out in as little as a decade. But is this the sort of shopping experience that customers really want?

Scores of leading retailers and fashion brands increasingly say no. And in an ever-more-volatile and unpredictable shopping environment, where long-term survival is dictated by anticipating and catering to consumers’ desires (often before they themselves even know what they want), the race to find out how and where people will do their spending has started to heat up.

On Wednesday, for example, Farfetch — the global online marketplace for independent luxury boutiques — held a daylong event at the Design Museum in London. There, in front of 200 fashion industry insiders and partners, José Neves, the founder of Farfetch, unveiled “The Store of the Future,” a suite of new technologies developed by his company to help brands and boutiques bridge the worlds of online and offline.

Nevertheless, in a telephone call last week, Mr. Neves said: “I am a huge believer in physical stores. They are not going to vanish and will stay at the center of the seismic retail revolution that is only just getting started.”

A corresponding report released by Bain & Company this week suggests that he might be right; although 70 percent of high-end purchases are influenced by online interactions, the consultancy maintains that stores will continue to play a critical role, with 75 percent of sales still occurring in a physical location by 2025.

What may change, however, is a store’s primary purpose. Forget e-commerce, or bricks and mortar, or even omnichannel sales; according to Mr. Neves, the new retail era is one anchored in “augmented retail,” a blend of the digital and physical allowing a shopper to shift seamlessly between the two realms.

“Customers don’t wake up and think, I will be online this morning or offline later; we are rarely purely one or the other anymore and tend to jump constantly between two worlds without noticing,” Mr. Neves said. “Harnessing this behavior is a major challenge for retailers and brands and why we are doing this event. It is in our interests to give our partners firsthand access to information about changing behaviors and new technology, so everyone is ‘future-proofed’ as to what might come next.”

Holition is an augmented-reality consultancy and software provider based in London that has worked with some well-known retail brands. Last fall it worked with the British cosmetics company Charlotte Tilbury on a “magic mirror” concept, a virtual makeup selling tool that allows users to try on different looks that are digitally superimposed onto their faces in 40 seconds. They can then send the selection of photos to their email address, ready to be referred to later or shared socially. And they can then buy products, available from glamorous makeup artists milling around nearby.

“Technology is still often a barrier in the retail place, with smartphones, iPads and screens getting in the way of what the consumer wants to see, touch and feel 80 percent of the time,” said Jonathan Chippindale, Holition’s chief executive.

“The holy grail now for retailers is creating digital empathy. No one can really guess what the future will look like. But those who are using technology and data to create bespoke shopping experiences that recognize every person is different, and with different needs, are more likely to come out on top.”

Tom Chapman, a founder of MatchesFashion.com, agreed. It was originally a bricks and mortar boutique; now 95 percent of the British fashion retailer’s sales — which hit 204 million pounds (about $253 million) in 2016 — are online. But Mr. Chapman said boutiques and physical events remained vital “marketing opportunities,” with a more specialized inventory selection and the opportunity for customers to do more than buy merchandise; for example, the MatchesFashion.com “In Residence” series offers talks, film screenings and designer meet-and-greets, along with social media lessons, exercise classes and floristry sessions.

“You need to be accessible to your customer wherever she wants to find you,” Mr. Chapman said, “and we have seen that a sizable proportion want human interaction and access that goes far beyond a credit card transaction.”

Source:

Apple Will Soon Join Facebook in the AR Online Platform Video War

At the moment, augmented reality is still a clunky, phone-in-front-of-your-face experience, but today Facebook promised to transform it by announcing a new AR platform. And you can be sure that quite a few folks at Apple were watching with great interest the live stream of the F8 developers conference in San Jose, California, where the new platform was introduced with plenty of striking visual aids.

Onstage at the conference, Facebook founder and CEO Mark Zuckerberg and some of his top aides energetically kicked off what is sure to be a new platform war—one that will soon be joined by Apple and plenty of other competitors.

“This space will get crowded though, and I expect both Apple and Google to also bring these functionalities to their camera apps and offer developers the chance to build AR experiences on their platforms as well,” Creative Strategies analyst Ben Bajarin told Fast Company in an email.

Facebook made a strategic decision to be the first tech giant to launch an AR platform while the hardware and technology are still in their nascent stage. In Facebook’s version of AR, a user will hold their phone in front of their face and watch as all kinds of moving imagery and information is superimposed over the picture of the real world seen by the camera. Yep, kinda like Pokémon Go.

Facebook said it will provide developers with the precise-location, object-recognition, and 3D-effects tools they need to start building their own custom AR experiences. That could mean anything from AR games to a retailer placing a data card over a product in front of the camera.

“Facebook is smart to give developers the tools to build AR experiences and give those tools to a wider developer community,” Bajarin says. “They are approaching this as a true platform play, which is smart.”

It seems likely that Apple will have to weigh in with its own AR platform sooner or later—I’m guessing sooner. Two media outlets have already reported that Apple is working on prototypes of some kind of AR glasses. This is unsurprising; the company has likely been working on glasses for a couple of years. The press reports say the company already has “hundreds” of engineers working on its own AR effort.

Apple isn’t known for being first in emerging technologies, preferring to hang back and drop in when it knows it can deliver a product or service better than everybody else. Entering the market with a cool-looking and high-functioning set of AR glasses would be one way to outshine the competing platforms. But Apple may not be able to wait until the core technologies needed for such glasses become available at mass market levels.

 

In the near term, Apple might begin adding new camera technology to the iPhone that can support AR apps, both its own and those from third-party developers. Fast Company previously reported that Apple will source 3D camera technology from Lumentum for its 10th anniversary iPhone 8, which will likely be announced this fall.

Facebook’s AR announcement today may end up serving as the ideal opening act for Apple, says Technalysis Research president Bob O’Donnell. Facebook’s experience in its current form is built to work with the camera on the user’s phone, he says. “So Facebook announces a 2D Snapchat-like experience with photo filters and calls it AR.” Facebook will have introduced the AR concept to a mass audience with an experience that Apple can best by leveraging a true 3D camera in the next iPhone (or, possibly, with the dual cameras in the iPhone 7 Plus), says O’Donnell.
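What a true 3D camera adds, concretely, is a measured depth for each pixel. With depth, two taps on the screen become two points in real space, and a physical measurement falls out of simple geometry. A sketch under assumed camera parameters (the function names and values are hypothetical, not any shipping API):

```python
import math

def unproject(u, v, depth_m, focal_px=1000.0, cx=640.0, cy=360.0):
    """Inverse of projection: a pixel (u, v) plus a depth reading from a
    hypothetical 3D camera back to a camera-space point in meters."""
    return ((u - cx) * depth_m / focal_px,
            (v - cy) * depth_m / focal_px,
            depth_m)

# Two taps on the image, each with a depth reading from the sensor,
# yield a real-world measurement between the tapped points:
a = unproject(400, 360, 1.5)   # left edge of a table, 1.5 m away
b = unproject(900, 360, 1.5)   # right edge, at the same depth
print(round(math.dist(a, b), 3))  # 0.75 (meters)
```

A phone limited to 2D filters has no depth to feed into `unproject`, which is why a flat photo-filter experience and depth-aware AR are different in kind, not just degree.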

Zuckerberg may also have lowered expectations around AR—which, after all the overly hyped Magic Leap coverage, has led many people to anticipate a headset-wearable experience. “So when Apple announces an AR experience that works on the phone instead of an AR headset, it won’t be as much of a disappointment,” O’Donnell says.

Creative Strategies’ Bajarin believes that Facebook and Google are more capable than Apple of delivering augmented reality content from the cloud, which could be a big advantage. He also believes Facebook and Google have the edge in the machine learning needed to help identify objects and people in the real world, a key function in AR apps. And let’s not forget Microsoft, which arguably has more experience in the space than anyone, after having developed the first major AR headset with the HoloLens.

Apple’s advantage lies in its ability to control far more of the hardware and software stack underpinning the AR experience—the apps, the phone, and the OS. Apple may begin building AR features deep in iOS, so that it could begin to “augment” lots of different aspects of the phone experience, like Siri does. By building AR into the OS, Apple might enable a richer experience that’s optimized for the iPhone.

It’s even possible that Apple could begin talking about its AR strategy as soon as its upcoming developer conference, WWDC, in June. Since Apple likely wants to speak to developers en masse and in person about what could become a major new platform, it may choose to at least start the conversation, as opposed to waiting until next year’s conference.

If it does, it will likely lay out a wade-into-the-water approach that will look something like what Facebook announced today. That is, the AR experience will be seen through the display of an iPhone, not through a face wearable. But that might be just the thing to get developers and consumers used to the concept, without getting too exotic with the hardware.

 

Whatever the shape of the product, Apple will try to make a grand entrance into the AR market. It won’t be first in, so it will have to infuse the technology with simplicity, style, and a wow factor that turns consumers on and makes the competing platforms seem inferior.


Source:

https://www.fastcompany.com/40409497/apple-will-soon-join-the-ar-platform-war-that-facebook-just-started