Augmented Reality’s True Purpose: Serving the Appetite for Big Data

Futurist and VR innovator Mark Pesce warns that we are being lured into a gilded cage of surveillance and influence

Doug Bierend
Vantage

--

Screen capture from a test of Skeltrack, an open source machine vision application — Joaquim Rocha via Flickr

Augmented Reality — overlaying convincing, interactive digital content upon one’s physical surroundings — is becoming commonplace. For consumers, it’s presented as a world of endless experiential possibilities; just gaze through your phone or a set of special eyewear, and anyplace you go is suddenly enhanced by immersive scenes, games, interfaces, and other people.

Snapchat and artist Jeff Koons teamed up for an AR ‘intervention’ in Central Park

The popularity of a relatively primitive AR game like Pokémon Go suggests just how eager people are to frolic about in a digitally enhanced world. Indeed, for the companies pursuing and developing it, augmented reality represents the next big digital platform. Amply funded by the likes of Apple, Google, Facebook and Microsoft, AR is being bet on as the next transformative technology, as ubiquitous and profitable as cell phones proved to be.

Another point of view, however, sees the rise of AR as tantamount to immersive surveillance: a step closer to the day when mechanisms of data collection and social influence merge with the physical world.

“Augmented reality is a technology of surveillance, full stop,” says futurist and VR innovator Mark Pesce. “It has to be, that’s the way it works. Augmented reality has to be intensely aware of the space that you’re in, and the things that are in that space.”

Pesce speaks from experience, having been integral to the development of early VR efforts like Sega’s ahead-of-its-time Virtua VR, and VRML, an early language for web-based virtual reality (another species of augmented reality).

“To be inside of a VR environment,” says Pesce, “is to be continuously under surveillance by that environment, so that the environment can respond to you. There’s no particular malevolence around that, but we also know that data is being collected and added to your profile, and it is effectively being weaponized against you; that we actually have evidence for.”

In the latest edition of the Australian literary magazine Meanjin, under the headline ‘The Last Days of Reality’, Pesce unpacks his concerns about the growing power of data-driven social media companies not only to measure and exploit people’s thoughts and desires, but to influence them. The essay describes a dystopia — or ‘surveillance utopia’ — that reads more like Huxley than Orwell.

Pesce’s article begins with reference to a story from last year about Facebook’s experiments in assessing and influencing Australian teenagers’ emotional states by altering the composition of their newsfeeds. Facebook has tried similar experiments here in the States, too, and they were met with broad criticism and discomfort from users.

People don’t seem to like the idea of having their emotions manipulated or exploited, but that’s part of the deal when signing up with Facebook. It’s all in the terms and conditions.

Per the New York Times in 2014:

“Researchers found that moods were contagious. The people who saw more positive posts responded by writing more positive posts. Similarly, seeing more negative content prompted the viewers to be more negative in their own posts.”

In short: the manipulations have an effect. These media operate not as windows but as mirrors, feeding into our view the content that their algorithms have determined we’re most likely to engage with. And engage we do, often despite our better judgement, precisely because these systems are so effective at capturing our attention.

“What it means is that certain topics in civic discourse simply can’t be raised as they won’t propagate,” says Pesce. “This pervasive system of surveillance and profiling presents a face that is entirely conformant to an individual’s desires, and it’s designed to satisfy those desires. Well, how much of those desires are being co-created by that device; by the relationships in that device; by the autonomous forces that are powering that device?”

Each of us uses our chameleonic devices in ways that are particular to us, yet we all feed into the same data collection machine. What we see through our screens is mediated and modulated, not by a principle of accurately reflecting the world as it is, nor by a goal of equipping us to interact with it effectively and productively. What drives the machine and makes the entire operation profitable isn’t even the quality or veracity of what’s being conveyed. It’s how much and how often people participate in ways that generate monetizable streams of data. Sweet, sweet data.

This speaks to phenomena like fake news, filter bubbles, and the systematic warping of civic discourse by bots or trolls. Few things drive engagement like feeding preconceptions. Many of us are caught in feedback loops that will only grow more difficult to escape as these algorithms and the profiles they build around us become more refined, and the interfaces more compelling and immersive.

What happens when a system of engagement über alles finds a medium that can take over our impression of the world around us in the most fundamental, sensorial way? Given the profound effects and depth of intrusion and influence already represented by digital interactions, what does it mean when the screen melts away into the world itself? The implications go well beyond games and shopping.

IKEA offers an augmented reality app for selecting furniture.

According to Pesce, we may find ourselves in the precious final interval before the mechanisms of influence behind social media and personal technology curate themselves out of view.

“What power does is it makes itself invisible. It removes any gaps in its functioning, it just looks like this smooth surface,” Pesce says. “That’s Discipline and Punish. There’s nothing new in that idea. And so as these systems become more totalizing, if you’re talking about a system that can literally edit reality, it edits its own visibility out.”

Billboards, storefronts and the like have traditionally been the primary means for commerce to intrude upon our surroundings. With immersive AR that knows no boundaries, backed by machine learning and individual data collection, it’s almost impossible to describe precisely what the next stage of data collection and influence will look like. But our current situation offers clues.

Few of us would have imagined how much time we’d spend on our phones, or the impact they’d have on our state of mind and sense of the world. Fifteen years ago, one might have suggested that we could avoid our engagement addiction by simply putting down the phone. But of course, we’ve only grown more attached to our devices. As the form factor becomes less and less intrusive — Pesce reckons we’ll see affordable, spectacles-style AR devices enter the market in about five years — the same trend may begin anew.

In an augmented world, that system of influence is all-surrounding and always present. Being able to take off the spectacles will depend on how much we get out of keeping them on.

“If you’re talking about something that’s responding to — and perhaps, I would say, playing off of your emotions — then establishing a sort of critical distance is easy. If you’re talking about something that is actively working to keep you engaged, to keep you enthralled in a way, then, while possible, the psychic bar you have to cross is higher.”

This all may sound a bit overheated, but the concerns are rooted in the realities of how both AR and data collection work. Today, any shiny whizbang new device or service ultimately exists to hoover up, analyze, and sell the myriad forms of data it gathers from users. That’s key to the business model, and AR is not unique in this regard.

What is unique about AR is its reliance on detailed spatial and visual awareness of a user’s environment and their activities within it. Apple’s ARKit, for example, is built on what the company calls World Tracking. Per its developer documentation:

To create a correspondence between real and virtual spaces, ARKit uses a technique called visual-inertial odometry. This process combines information from the iOS device’s motion sensing hardware with computer vision analysis of the scene visible to the device’s camera. ARKit recognizes notable features in the scene image, tracks differences in the positions of those features across video frames, and compares that information with motion sensing data. The result is a high-precision model of the device’s position and motion. World tracking also analyzes and understands the contents of a scene.
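
To make that concrete, here is a minimal sketch of an ARKit world-tracking session in Swift. The class name and logging are illustrative rather than Apple’s own sample code, but the calls are the standard ones: the session continuously estimates the device’s position and hands the app an anchor for every real-world surface it recognizes.

    import ARKit
    import UIKit

    // A bare-bones view controller that starts ARKit's world tracking.
    // (Illustrative sketch; class name and logging are not Apple sample code.)
    final class ARDemoViewController: UIViewController, ARSessionDelegate {
        private let sceneView = ARSCNView()

        override func viewDidLoad() {
            super.viewDidLoad()
            sceneView.frame = view.bounds
            view.addSubview(sceneView)
            sceneView.session.delegate = self

            // World tracking fuses camera imagery with motion-sensor data
            // (visual-inertial odometry) and maps the surfaces around the device.
            let configuration = ARWorldTrackingConfiguration()
            configuration.planeDetection = [.horizontal, .vertical]
            sceneView.session.run(configuration)
        }

        // Called whenever ARKit detects a new real-world surface: a running
        // inventory of the user's physical surroundings.
        func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
            for case let plane as ARPlaneAnchor in anchors {
                print("Detected surface at \(plane.center), extent \(plane.extent)")
            }
        }
    }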

Constant awareness of the environment and user is key, and other technologies underline this fact.

For example, in parallel with empowering personal devices to track and make sense of their users’ environment, leading technology companies are also pushing hard for the widespread use of facial recognition technology.

Tim Cook demonstrates iPhone X’s facial recognition technology. GIF via BuzzFeed

Facial recognition is increasingly capable of registering emotions, eye movements, and other rich data streams generated by our faces, and the capability exists largely because it can feed into the sprawling infrastructure of commercial data collection. Sharing funny videos with goofy digital masks is secondary to the true value of facial tracking. So are virtual reality, augmented reality, and the various games, utilities, and social apps they enable.
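
How accessible that face data already is to an ordinary app is easy to underestimate. As a rough illustration (the article doesn’t name a specific API, so take Apple’s Vision framework as an assumed example), a few lines of Swift are enough to pull face landmarks, including the eye regions that gaze and attention tracking build on, out of a single camera frame.

    import Vision
    import CoreGraphics

    // Detects face landmarks (eyes, brows, lips, etc.) in a single image.
    // An app with camera access could run this on every frame it captures.
    // (Illustrative sketch, assuming Apple's Vision framework.)
    func detectFaceLandmarks(in image: CGImage) throws {
        let request = VNDetectFaceLandmarksRequest()
        let handler = VNImageRequestHandler(cgImage: image, options: [:])
        try handler.perform([request])

        for face in request.results ?? [] {
            // Normalized bounding box of the detected face.
            print("Face at \(face.boundingBox)")
            // Eye regions: the raw material for gaze and attention tracking.
            if let leftEye = face.landmarks?.leftEye {
                print("Left eye points: \(leftEye.normalizedPoints)")
            }
            if let rightEye = face.landmarks?.rightEye {
                print("Right eye points: \(rightEye.normalizedPoints)")
            }
        }
    }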

“This is going to be the next big battle after the smartphone, and part of it is going to be the battle for the devices, but more of it is going to be about who gets the data feed that’s coming off of all these devices,” Pesce says. “Is that going to be Facebook’s game? Or is it going to be Sony’s game, or Google’s game? Or is it going to be Tencent’s game? Because China is absolutely going to do their own thing … In All the President’s Men the line was ‘follow the money’, and now the line is ‘follow the data’.”

None of this is to suggest that these companies maliciously seek to undermine civic or social life. They’re caught in their own feedback loops, after all, of satisfying shareholders and capturing market share. But despite such familiar motives, these devices and platforms are increasingly capable of shaping social sentiments, with effects that are largely unpredictable but already tangible in our discourse and politics. And the step from corporations selling data for profit to governments weaponizing the same data to monitor and manage people is unnervingly short.

‘Weaponizing’ may imply an unduly nefarious motive. But when it comes to AR, like so many modern technologies, weapons are where it began.

Test of a helmet-mounted heads-up display during a refueling maneuver

The earliest AR systems were developed decades ago to help fighter pilots achieve higher situational awareness via heads-up displays: an advanced technology meant to serve the delivery of another class of advanced technology (e.g. bombs and missiles). In a consumer tech context, weaponization means that something we buy or use for one purpose — work, play, communicating with family or friends — is masking the deeper purpose of measuring and monetizing our activities and attitudes. That can take the form of ads, or malicious propaganda. The only things standing in the way of these uses are yet-to-be-written laws and regulations, or the caprice of the CEOs who seek to portray their companies in the most virtuous light. In the end, though, Mark Zuckerberg doesn’t serve his investors by successfully ‘connecting the world’ for its own sake. He does it by finding new ways of enhancing their return on investment.

Immersive, networked technologies are inevitably going to grow more powerful, widespread, and impactful. It’s unlikely that anything can be done at the level of one device or another to prevent the collection and sale of data — again, AR in particular requires this to work — and the distorting gyres of engagement that result. But neither augmented reality, nor cell phones, nor the internet, nor any other medium is really the problem. Indeed, they can be and are the basis of compelling, positive new experiences and capabilities.

At issue is the apparently secondary (but really primary) industry of data collection, and its shaping of behavior and sentiment according to interests that are not our own. That problem is only accelerated by more powerful and immersive delivery systems.

The latest technology therefore raises old concerns. One of the most important things people can do, and perhaps the simplest, is to become more aware of the many ways they are being observed, and to adjust their spending, usage habits and expectations according to that reality. These technologies serve a desire for convenience and experience; we just have to learn to value our privacy and agency even more.

“So much data is being gathered from so many different quadrants, most of which we are completely unaware of, that in some sense what we need to think about is doing an inventory of that,” says Pesce. “You have to balance your own desire to be able to connect in an interesting and meaningful way through this new medium with other people, with the risks you’re taking with the data you’re generating by your presence there, and how it’s being used.”

Mark Pesce is a futurist, author, entrepreneur and innovator in areas of virtual reality, education, finance, manufacturing, transportation and communication. Read his longform piece in Meanjin, and follow him on Twitter.
