Millions of years of evolution have shaped us to best deal with the challenges we face in nature.
Over the course of those years, the human brain has tripled in size, we have transitioned to standing and moving on two legs, and we have become the dominant species in the animal world. Yet none of this has prepared us to cope with a digital environment that constantly challenges our natural tendencies. In recent years, that environment has been changing at such a rapid pace that the forces of evolution have had no chance to adjust our natural tendencies to it.
Evolutionary psychology aims to explain human behavior in evolutionary terms. Some human characteristics evolved because they increased the likelihood of survival in the environment. Studies in the field draw inspiration from observation of animals — for example, how a giraffe’s neck allows it to reach the tall tree leaves that shorter animals cannot reach, or how the chameleon’s ability to change colors enables it to hide from predators by blending into the surrounding environment.
Charles Darwin developed the concept of natural selection, which holds that if a particular variant of a trait leads to greater adaptation to the environment, that trait will be preserved and passed on to future generations.
For example, about 10,000 years ago, no one past infancy could digest milk sugar, called lactose. Human production of lactase, the enzyme that breaks lactose down, stopped after weaning. Sometime in the past 10,000 years, certain populations in Northern Europe and the Middle East began to raise livestock. These populations also developed gene variants that enabled digestion of milk beyond infancy. This ability provided a significant caloric advantage, and therefore the trait spread.
We Are Still Cave Dwellers When It Comes to Loss of Control
Another dominant feature that has developed is an obsessive need to be in control. One of the most crucial things for our survival in the world is our ability to predict what is happening around us. Therefore, our systems respond strongly to a sense of loss of control. This feeling is accompanied by automatic physiological responses, such as rapid pulse rate and accelerated blood flow, designed to prepare our systems for coping. Whether the situation that precipitates the feeling of loss of control is an unexpected breakup, a job interview or a flooded kitchen, the physiological response remains the same.
Common to all these examples is our inability to anticipate the situation. In terms of our systems, this is the worst-case scenario, because our survival in the world depends on our ability to predict what is happening in our environment.
It turns out the digital environment also puts us in many situations that trigger this feeling of loss of control.
I recently helped a client, a global news organization, analyze the behavior of visitors to its website. The organization had tried to promote a video by having it play automatically when visitors landed on the page. After analyzing the behavior of the visitors, we found that in 90 percent of cases, visitors clicked to stop the video immediately. It wasn’t that there was something wrong with the video — the content was designed to be interesting and relevant — but there was something the organization had not taken into account: one of the primary goals of a human being’s system is to control what is happening in the environment. We are highly sensitive to the smallest deviation from our expectations. So if we expect to log on and quietly read an article, and suddenly a video begins to play, we have an uncontrollable urge to restore control of the situation — and clicking the stop button accomplishes that. Although the video poses no threat to our survival, our brains have not yet learned to make this mental distinction, and the immediate response is to restore control. We recommended the organization cancel the automatic playback — and when it did, the video-viewing rate increased by 60 percent!
An automatically playing video isn’t the only situation that generates a stress response in the digital world. We have found visitors to sites where the pages are particularly long or have endless scrolling (sites where you scroll and scroll but never reach the bottom of the page) feel a loss of control as well. I have observed situations where people scroll for a while and suddenly lose track of where they are. That type of situation evokes the same fear reactions we feel when we lose our way in unfamiliar physical environments.
In the physical world, we erect road signs and milestones to help people retrace their steps. Similarly, in the digital world, sites that understand user psychology provide navigation bars that allow people to click out of a page at any time. The interesting thing is the very presence of a navigation bar leads to higher scrolling percentages, even if it is not used. What matters isn’t whether users actually have to take steps to control the situation. What matters is the presence of the navigation bar gives them the feeling of having control. In the physical world, close-door buttons in elevators and buttons on pedestrian walk signals at street corners serve a similar purpose: They make us feel in control, even though some of them don’t actually work.
What About Traits We No Longer Need?
Just as evolution works to preserve traits that give us an advantage, it also works to erase characteristics that no longer constitute an advantage in the environment.
For example, in the distant past (about 63 million years ago), our ancestors’ bodies produced an enzyme that generated vitamin C on its own. At some point, we began to consume vitamin C from citrus fruit, so we no longer needed to produce it ourselves, and the ability to produce vitamin C has since been lost.
Similarly, studies show that today, as a result of relying on GPS navigation applications, the regions of our brain responsible for navigation and spatial orientation are becoming less active. In addition, since we began storing phone numbers on our mobile devices, we rely less on long-term memory.
An even more disturbing phenomenon is that digitization has made face-to-face interpersonal interaction less and less common, since our primary communication is now through screens. As a result, the brain region responsible for interpreting signals from other people becomes less efficient. This is especially true among those who have grown up in a technological environment.
The pace of technological development, and the growing place technology occupies in our lives, mean that within a few short years our neural circuits will be rewired for completely different brain functions. It is difficult to assess how these technological changes will shape our minds, but what we can know with certainty is that future generations will make us the object of extensive research, much as we study ancient humans.
Liraz Margalit, Ph.D., analyzes online consumer behavior, incorporating theory and academic research into a conceptual framework.
Also known by various other terms like “digital detox,” “digital Sabbath,” and “unplugging,” the idea behind a digital fast is to voluntarily and deliberately stop using all connected devices – smartphones, computers, tablets, and so on – that plug you into the internet, for a pre-specified amount of time. The abstention could be for as little as a few hours (say, from 7 pm until the next morning). However, most digital fasts are at least a day long, and many span an entire weekend or even longer. Just like food fasts, longer digital fasts are thought to be more effective (up to a point) in weaning oneself away from digital connectivity and in regaining self-control.
Some versions of the digital fast recommend off-line activities you should perform instead (with all the spare time freed up from not texting and Facebooking): going on a walk or a hike, or organizing a pot-luck meal with friends who are also digitally fasting.
The idea of a digital fast has generated a lot of interest, with organizations such as Adbusters and the Huffington Post each promoting their respective versions of digital fasts. It is especially popular among employees of Silicon Valley tech companies like Facebook and Google.
But in practice, digital fasting hasn’t really moved beyond the fringes of our culture. One reason is that there is a raging debate about whether a digital fast is even desirable. Advocates take the stance that digital technology has addictive properties that cause harm in various ways, and that we need to regain control over its use. Opponents counter with the argument that digital technology satisfies our fundamental needs today and is irreplaceable. So, they say, there is no need for anyone to fast or unplug.
In this blog post, I want to provide brief synopses of both positions, so that you can make a better decision for yourself about whether a digital fast is something worth doing.
The Addiction Perspective: I Am Addicted to Digital Technology and This Is a Growing & Serious Problem
Over the past decade, a growing number of psychologists have started to see addiction to digital technologies as a form of behavioral addiction, similar to pathological gambling and even to substance dependence. They point to the fact that when using smartphones, playing online games, or using social media, many people exhibit features very similar to those displayed by drug addicts. These features include excessive, indiscriminate use of the technology; withdrawal symptoms such as anger, tension, or depression when not using it; and negative repercussions from use, such as a lowered ability to focus and insomnia. One study from 2014, led by psychologist Julia Hormes, concluded:
“The use of online social networking sites is potentially addictive…Disordered online social networking use seems to arise as part of a cluster of symptoms of poor emotion regulation skills and heightened susceptibility to both substance and non-substance addiction.”
Under this perspective, unchecked use of digital technology is a potential problem that could have serious consequences. One avenue to gain control over the problem is to disconnect from the source of the addiction in a deliberate and systematic way by means of a digital fast.
The Unalloyed Empowerment Perspective: Digital Technologies Satisfy My Fundamental Needs and Allow Me to Do Important Things That I Cannot Otherwise Do
Technology enthusiasts take an entirely different view. They see nothing problematic or dangerous about our incessant digital connectivity. They believe there is no need for anyone to unplug at all, even momentarily. Rather, they argue that digital technologies now satisfy many of our most basic needs, and that we no longer have other, old-fashioned ways of meeting them. For instance (at least in the United States), people no longer stroll to the town square every evening, or call each other to talk for hours on a landline phone. Our online and offline lives have now blended together so completely that one is not possible without the other. Thus, fasting digitally will lead to nothing more than depriving ourselves of basic need fulfillment, and constitutes a case of unnecessary technological asceticism. This view is expressed nicely by business consultant Alexandra Samuel:
“When we’re online — not just online, but participating in social media — we’re meeting some of our most basic human needs. Needs like creative expression. The need to connect with other people. The need to be part of a community. Most of all, the need to be seen: not in a surface, aren’t-you-cute way, but in a deep, so-that’s-what’s-going-on-inside-your-head way.”
Under this perspective of technology and digital connectivity, it is clear that a digital fast is anathema. This view sees each of us as an electric appliance: unless it is plugged into a power source, it is useless. In the same way, we function only when we are digitally connected.
Which perspective makes sense – Addiction or Unalloyed Empowerment?
It is worth thinking deeply about which perspective on your relationship with technology – addiction or empowerment – resonates with you. When I thought about this question, it was clear that even though using digital technology and being connected is empowering, I am also addicted to its use. Some experts suggest that people like me should approach digital connectivity like a diet instead of a fast – in other words, use technology in a restrained way rather than not at all. Nevertheless, I choose to unplug from all digital devices from time to time, usually for a few hours, sometimes for a weekend. (I have never gone on a digital fast longer than that.) If nothing else, digital fasting is a way to prove to myself that I have the strength to act willfully in my relationship with technology.
Utpal M. Dholakia, Ph.D., is the George R. Brown Professor of Marketing at Rice University.
The development of email and texting has enhanced our ability to communicate productively, efficiently, and quickly. But, based on new research into how human communication works, it’s easy to see a downside to our over-reliance on emails and texts. In fact, some of our online habits may be undermining our efforts to communicate successfully.
For example, have you ever made a joke in an email that didn’t go over well because the recipient couldn’t discern your sarcasm (even with the addition of an emoji)? Research by UCLA psychology professor emeritus Albert Mehrabian found that, when people communicate feelings and attitudes, only 7 percent of the message is derived from the words, 38 percent from intonation, and 55 percent from facial expression or body language. In other words, the vast majority of communication is not carried by our words alone.
Not surprisingly, research shows we communicate most effectively in real-life, real-time conversation. New neurological evidence shows that effective communication physically resounds in the brain of the receiver, echoing the thoughts and sentiments of the communicator by inducing and shaping neurological responses. A remarkable study led by Princeton University’s Greg Stephens determined through fMRI brain scans that similar regions of the brain fired in both the communicator and the listener during unrehearsed, real-life storytelling, leading the team to conclude that our brain cells actually synchronize during successful communication. As the study says:
“The findings shown here indicate that during successful communication, speakers’ and listeners’ brains exhibit joint, temporally coupled response patterns. Such neural coupling substantially diminishes in the absence of communication. Moreover, more extensive speaker-listener neural couplings result in more successful communication.”
The deeper the conversation, the more deeply our minds meld. In some instances, the listener’s brain patterns actually anticipate where the story is going, in deep rapport with the speaker.
These findings support studies that link “mirror” neurons to empathy. The neuroscientist Giacomo Rizzolatti and his team discovered that empathy is mediated by neurons in the brain’s motor system. These “mirror neurons,” as Rizzolatti named them, give humans the capacity for shared experiences by enabling us to project ourselves into the minds, emotions, and actions of others through the direct simulation of feeling, not thinking. This happens best live and in person rather than through the shadowy substitutes of digital communication.
As it happens, online communication may have given rise to completely different standards of trustworthiness. Judy Olson, a professor of information and computer sciences who has researched the essentials of building trust in digital communication, found that in the absence of traditional trust indicators like voice intonation, emotional expression, and body language, participants in online, text-based exchanges default to speed of response as a key marker of trustworthiness.
The mind is a prediction machine and pattern recognizer that hates an open loop or unresolved pattern. On the web, this trigger is often exploited through headlines that beg for closure like: “What happened next will blow your mind.” We are compelled to click on the link to resolve the uncertainty. Similarly, not getting a response to email can cause significant if unintended psychological unrest. But in an email-default communication environment, the non-response has become the norm for messages that appear to lack urgency. In some ways, it may be better to give someone bad news than no news at all.
Given what we’ve learned, here are a few suggestions on how to enhance your own text-based communication:
Play it straight. We don’t process communication at face value: Our minds work mostly through implicit inference, not direct suggestion. We look for the hidden meaning, often to avoid deception or unmask others’ agendas. As evolutionary biologist Richard Dawkins puts it, “We are evolved to second-guess the behaviors of others by becoming brilliant intuitive psychologists.” The best bet, then, is to be clear rather than clever.
Close the loop. Have you ever reached out to someone to congratulate or compliment them and not heard back? That sense of injustice and anger you felt is not healthy, for either party. Failing to respond violates our most deeply ingrained social norm of reciprocal altruism: repaying in kind what others have done for or to us. Do you really need people hating on you for such a simple omission? Acknowledge what you’ve received.
Respond quickly. As the digital age obviates the need for live interactions, gaining trust becomes more of a challenge. Person-to-person interactions carry benefits (such as facial expressions and gestures) that facilitate the manner in which humans typically generate trust. Trust is the glue that binds people and the means by which we succeed as social beings who rely on the resources of others. In the absence of these cues, research would indicate that, as a rule of thumb, if you are quick to reply, others will respect you more (even if your message is not what they want to hear).
Move the conversation offline. For an important message, try a phone call, a video conference, or an in-person talk. A phone call has the benefit of real-time conversation and the intonation of the speaker’s voice to convey the real meaning of the words—as do Skype or Google Hangouts, which add the further contextual cues of body language and help complete the picture. (In general, video conference services are underrated and underused.) But by far the best way is to sit down in person, a rarity these days as we increasingly hide behind emails—and sometimes pay a price for it.
Deep under the Disneyland Resort Hotel in California, far from the throngs of happy tourists, laundry workers clean thousands of sheets, blankets, towels and comforters every day. Workers feed the heavy linens into hot, automated presses to iron out wrinkles, and load dirty laundry into washers and dryers large enough to sit in. It’s loud, difficult work, but bearable. The workers were protected by union contracts that guaranteed a living wage and affordable healthcare, and many had worked decades at the company. They were mostly happy to work for Disney.
This changed in 2008. The union contracts were up, and Disney wouldn’t renew without adjustments. One of the changes involved how management tracked worker productivity. Previously, workers would record how many sheets, towels or comforters they had washed, dried or folded on paper notes turned in at the end of the day. But Disney was replacing that system with an electronic tracking system that monitored their progress in real time.
Electronic monitoring wasn’t unusual in the hotel business. But Disney took the highly unusual step of displaying the productivity of their workers on scoreboards all over the laundry facilities, says Austin Lynch, director of organizing for Unite Here Local 11. According to Lynch, every worker’s name was compared with the names of coworkers, each one colour-coded like traffic signals. If you were keeping up with the goals of management, your name was displayed in green. If you slowed down, your name was in yellow. If you were behind, your name was in red. Managers could see the monitors from their office, and change production targets from their computers. Each laundry machine would also monitor the rate of worker input, and flash red and yellow lights at the workers directly if they slowed down.
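The logic described here amounts to a simple threshold rule. Below is a minimal sketch, in Python, of how such a traffic-light scoreboard might work – the pace calculation and the cut-off values are assumptions for illustration, since Disney’s actual system was never made public:

```python
def scoreboard_colour(units_completed: int, target: int) -> str:
    """Traffic-light logic of the 'electronic whip' as the union describes it:
    green for keeping pace with management's target, yellow for slipping,
    red for falling behind. The 80 per cent cut-off is an assumed value."""
    pace = units_completed / target
    if pace >= 1.0:
        return "green"
    elif pace >= 0.8:
        return "yellow"
    return "red"

# A worker who has pressed 72 sheets against a management target of 100:
print(scoreboard_colour(72, 100))  # -> 'red'
```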
‘They had a hard time ignoring it,’ said Beatriz Topete, a union organiser for Unite Here Local 11 at the time. ‘It pushes you mentally to keep working. It doesn’t give you breathing space.’ Topete recalled an incident where she was speaking to workers on the night shift, feeding hand-towels into a laundry machine. Every time the workers slowed down, the machine would flash at them. They told her they felt like they couldn’t stop.
The workers called this ‘the electronic whip’.
While this whip was cracking, the workers sped up. ‘We saw a higher incidence of injuries,’ Topete said. ‘Several people were injured on the job.’ The formerly collegial environment degenerated into a race. The laundry workers competed with each other, and got upset when coworkers couldn’t keep up. People started skipping bathroom breaks. Pregnant workers fell behind. ‘The scoreboard incentivises competition,’ said Topete. ‘Our human competitiveness, whatever makes us like games, whatever keeps us wanting to win, it’s a similar thing that was happening. Even if you didn’t want to.’
The electronic whip is an example of gamification gone awry.
Gamification is the application of game elements into nongame spaces. It is the permeation of ideas and values from the sphere of play and leisure to other social spaces. It’s premised on a seductive idea: if you layer elements of games, such as rules, feedback systems, rewards and videogame-like user interfaces over reality, it will make any activity motivating, fair and (potentially) fun. ‘We are starving and games are feeding us,’ writes Jane McGonigal in Reality Is Broken (2011). ‘What if we decided to use everything we know about game design to fix what’s wrong with reality?’
Consequently, gamification is everywhere. It’s in coupon-dispensing loyalty programmes at supermarkets. Big Y, my local supermarket chain in Boston, employs digital slot machines at the checkout for its members. Winning dispenses ‘coins’ that can be redeemed for deals. Gamification is in the driver interfaces of Lyft and Uber, which give badges for miles driven. Gamification is the premise of fitness games such as Zombies, Run!, where users push themselves to exercise by outrunning digital zombies, and of language-learning apps such as Duolingo, where scoring prompts one to master more. The playground offices of Silicon Valley, complete with slides and ball pits, have been gamified. Your credit score is one big game, too.
But gamification’s trappings of fun mask the fact that we have very little control over the games we are made to play – and that these games are not games at all. Gamified systems are tools, not toys. They can teach complex topics and engage us with otherwise difficult problems. Or they can function as subtle systems of social control.
Games are probably as old as the human species itself. Archaeologists have unearthed mancala-like boards made of stone in Jordan, dated to 6,000 BC. The application of games to serious matters has probably been with us almost as long. The Egyptian board game senet represented the passage of the ka (or vital spark) to the afterlife; its name is commonly translated as ‘the game of passing’. The Roman senatorial class played latrunculi, an abstract game of military strategy, to train the mind and pass the time. Dice-based games of chance are thought to have originated with ancient divination practices involving thrown knucklebones. Native American ball games served as proxies of war and were probably crucial to keeping the Iroquois Confederation together. As many as 1,000 players would converge to play what the Mohawk called baaga’adowe (the little brother of war).
The conflation of game and ritual is likely by design. The Dutch cultural historian Johan Huizinga observed in Homo Ludens (1938) that both invoke a magic circle, a time and place outside of the norms of reality. During play, as during ritual, new rules supersede the old. Players are not tried as thieves for ‘stealing’ a base in baseball. The Eucharist doesn’t literally become flesh during Catholic transubstantiation rituals. Through play and games, Egyptians could metaphorically engage with the afterlife without the inconvenience of dying.
An important aspect of early games was that they were still limited in size and scope. One-thousand-player stickball games between whole villages were a rarity. We don’t see the emergence of anything analogous to modern gamification until the 18th century when Europe underwent a renaissance of games and game design. In 18th-century Paris, Rome, Vienna and London, an international leisure class emerged that communicated across national and linguistic divides through the medium of games. For example, one of the earliest four-person card games in Europe was ombre – from el hombre (the man) – which originated in 16th-century Spain. The game didn’t become known outside Spain until almost the end of the 17th century, with the marriage of Maria Theresa of Spain to Louis XIV of France. Within a few years, the game spread across the continent and was playable in the courts and salons of every capital in Europe.
The spread of ombre coincided with a boom in games and game culture in Europe. Abraham and David Roentgen became a father-and-son pair of rockstars for building foldable game-tables that could be rearranged to suit everything from backgammon to ombre. Play rooms appeared in the homes of the aristocracy and the emergent bourgeoisie. Books of rules such as Pleasant Pastime with Enchanting and Joyful Games to Be Played in Society (1757) were translated into multiple languages. The Catholic Church got in on the act with the liberalisation of lottery laws by popes Clement XII and Pius VI. In the 1750s, the Swiss mathematician and physicist Daniel Bernoulli even declared: ‘The century that we live in could be subsumed in the history books as … the Century of Play.’
In the mid-18th century, Gerhard Tersteegen, an enterprising priest, developed the ‘Pious Lottery’, a deck of 365 cards with various tasks of faith. ‘You’d read a prayer straight from the card,’ explains the historian Mathias Fuchs of Leuphana University in Germany. It is reminiscent of modern mindfulness or religious apps that attempt to algorithmically generate spiritual fulfilment.
Soon, 18th-century musicians were incorporating the logic of game design into their music through randomised card- or dice-based systems for musical composition. Johann Sebastian Bach’s student Johann Philipp Kirnberger and his second son, Carl Philipp Emanuel Bach, both wrote musical composition games – respectively, ‘The Ever-Ready Minuet and Polonaise Composer’ and ‘A Method for Making Six Bars of Double Counterpoint at the Octave Without Knowing the Rules’ (a Musikalisches Würfelspiel, a form also attributed to Mozart). These games asked would-be composers to roll a pair of dice to randomly select pre-written measures for minuets. According to one estimate, Mozart’s game features 1.3 × 10²⁹ possible combinations. Players would stitch the measures together in the order rolled to compose a final product, in essence enacting an algorithm. In a way, these resemble modern musical rhythm games such as Guitar Hero, which provide the illusion of musical mastery for the sake of entertainment.
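Read as an algorithm, such a dice game is nothing more than a random number generator driving a lookup table. Here is a minimal sketch in Python: the bar names are invented stand-ins for the published pages of pre-written measures, and the layout assumes the common format of a 16-measure minuet chosen with two dice plus a 16-measure trio chosen with one die, which is where the 1.3 × 10²⁹ figure comes from.

```python
import random

MINUET_MEASURES = 16  # each chosen with two dice: totals 2-12, 11 options
TRIO_MEASURES = 16    # each chosen with one die: 6 options

# Hypothetical stand-in for the published lookup tables, which mapped each
# (position, dice total) pair to a numbered, pre-written bar of music.
minuet_table = {
    pos: {roll: f"bar_{pos}_{roll}" for roll in range(2, 13)}
    for pos in range(1, MINUET_MEASURES + 1)
}

def compose_minuet(rng: random.Random) -> list:
    """Enact the algorithm: roll two dice per position, look up the bar,
    and stitch the bars together in the order rolled."""
    score = []
    for pos in range(1, MINUET_MEASURES + 1):
        total = rng.randint(1, 6) + rng.randint(1, 6)  # 2..12
        score.append(minuet_table[pos][total])
    return score

print(compose_minuet(random.Random(1756)))  # one possible minuet

# The arithmetic behind the estimate quoted above: 11 options for each of
# the 16 minuet measures times 6 options for each of the 16 trio measures.
print(11**MINUET_MEASURES * 6**TRIO_MEASURES)  # ~1.3e29
```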
It’s not clear what ended the century of play. Perhaps the rococo play culture of the 18th century ended with the wars and nationalistic fervour of the 19th. Fuchs suggests the French Revolution of 1789 as the likely cause. What’s clear is that the centrality of games as a cultural force wouldn’t reach 18th-century levels of saturation until the development of computers.
By the end of the 20th century, video-game consoles and then computers became more ubiquitous and user-friendly, and digital games rose in scale and scope. To make computers more accessible, human-computer interface designers borrowed elements from early video games. Graphical user interfaces replaced code. Games and gamers became distinct subsets of the computer software and computer hobbyist landscapes. Because the first computer games were experiments in software design, computer and hobby magazines regularly printed and distributed lines of code. Programs, including games, were freely available to remix and experiment on. Importantly, this hobbyist culture, while not a utopia of gender equality, was not strictly male-coded initially.
As software development became more corporate, and the user experience more centralised, the discourse shifted away from the quality of the software to gameplay and user experience. Game development corporations seized on a booming market, cultivating gamers as a distinct category of consumer, and focusing on white, adolescent and teenage boys. Jennifer deWinter, a video-game scholar at Worcester Polytechnic Institute in Massachusetts, refers to this as the construction of technomasculinity. ‘It takes over the ideology of what it takes to be a successful man … the gamer identity was constructed for them to consume as white, male and tech-savvy,’ she explains. The workers of the future would be gamers.
By 2008, the gamification of work felt absolutely natural to a generation of people raised on ubiquitous digital technology and computer games. Tech startups were faced with the challenge of attracting and retaining users. Game designers and marketers including Jane McGonigal and Ethan Zuckerman promoted the use of immersive game mechanics as a way of ‘hacking happiness’ and building user engagement at summits, speeches and TED talks. By 2010, interest in gamification intensified with the success of the social network game FarmVille, which seemed to have solved the problem of user retention and engagement. Marketers and consultants were quick to seize on gamification as a tool to create customer loyalty and manage human desire. They sought to capitalise on the ‘addictive fun’ of gambling and games by introducing ‘pseudo-goals’ unrelated to the primary goals of either the consumer or the business in question. Game design elements such as badges, points, scoreboards and progress-tracking proliferated across different platforms, apps and workspaces. In doing so, they unknowingly borrowed from the Pious Lottery. Saying a Hail Mary or going to church because of a game isn’t necessarily aligned with the goal of eternal salvation, in much the same way as buying blood oranges for loyalty points isn’t really the goal of grocery shopping.
This brings us back to the electronic whip; Disney was hardly alone. The US retail giant Target implemented the Checkout Game, which tracked and scored the speed of minimum-wage checkout clerks. The clerks could see themselves scored in real time on their point-of-sale computers. The US ice-cream parlour chain Cold Stone Creamery marshalled the power of games to teach workers how to be expert ice-cream mixers with the game Stone City, which uses motion controls to teach people how to ‘feel’ out the correct scoops. The game calculates how large the scoops are in relation to the optimal sizes, and then tells the players how much their over-scoops cost the store. Workers were asked to download the game and play it in their off-hours.
Amazon has also bought big into gamifying work. Warehouse workers are subject to scoreboards that display the silhouettes of workers who were caught stealing, what they were caught stealing, and how they were caught. Their productivity is monitored by handheld devices that scan and locate products. If their productivity drops, workers are disciplined with points on a scorecard. As in golf, more points is bad. Accrue enough points, and the worker is fired. White-collar workers, too, are scored and ranked by digital metrics, and by their peers and bosses. Until 2016, the bottom scorers were fired in what employees called ‘rank and yank’.
Through gamified technology, corporations such as Amazon and Disney now have an unprecedented level of control over the individual bodies of their employees. Steve Sims, a vice-president at the California-based gamification firm Badgeville (now CallidusCloud), said: ‘We like to think of it as behaviour management.’ In other words, how to get other people to do more stuff, more often.
This kind of micromanagement resembles Taylorism, a system developed by the American engineer Frederick Winslow Taylor during the 1890s to codify the movements and habits of mind that led to productivity. To eliminate inefficiency and waste, Taylor followed around the ‘most productive’ factory workers, recording the timing of all their movements with a stopwatch. He set managers, similarly armed with stopwatches, to micromanage every detail of a job. Taylor was also famous for fudging his numbers in favour of speed-driving workers to exhaustion and, in some cases, to strike.
But the modern gamified workplace enables control beyond Taylor’s wildest dreams. Games are sets of rules prescribing both actions and outcomes. A gamified workplace sets not just goals for workers but precisely how those goals can be achieved. Managers don’t need to follow workers with stopwatches. They can use smartphones or apps. It’s micromanagement with unprecedented granularity. ‘This is Taylorism 2.0,’ according to the media expert Steven Conway of Swinburne University of Technology in Australia. ‘Activities are more rigidly defined and processed than ever.’ The gamified workplace is not a game in the original sense, nor does it cultivate playful ends.
The problem of the gamified workplace goes beyond micromanagement. The business ethicist Tae Wan Kim at Carnegie Mellon University in Pittsburgh warns that gamified systems have the potential to complicate and subvert ethical reasoning. He cites the example of a drowning child. If you save the child, motivated by empathy, sympathy or goodwill – that’s a morally good act. But say you gamify the situation. Say you earn points for saving drowning children. ‘Your gamified act is ethically unworthy,’ he explained to me in an email. Providing extrinsic gamified motivators, even if they work as intended, deprives us of the option to live worthy lives, Kim argues. ‘The workplace is a sacred space where we develop ourselves and help others,’ he notes. ‘Gamified workers have difficulty seeing what contributions they really make.’
The problem isn’t limited to work. Social platforms all employ some form of gamification in their stats, figures, points, likes and badges. Dating apps gamify our romantic life; Facebook gamifies friendship.
Even war has been gamified: drone pilots operate in a highly gamified environment. Foeke Postma, a researcher and programme officer at the Dutch peace organization PAX, says that drone warfare often takes the shape of a game, right down to the joysticks or PlayStation-like controllers that the pilots use. ‘The US Airforce and the Royal Air Force have specifically targeted gamers to recruit as drone operators,’ he explains. The US drone program also employs game-like terminology when discussing targets. High-value assassination targets are called ‘jackpots’. Anyone caught near a jackpot during an airstrike is called ‘bugsplatter’. When drone pilots retire or transfer, they’re given a scorecard of kills. Postma says that this framework risks the total dehumanisation of the targets of drone warfare. In an interview with The Guardian, a drone pilot said: ‘Ever step on ants and never give it another thought?’
The expansion of game-like elements into nongame spaces is a global phenomenon. We are all living in expanding, overlapping magic circles, with some places moving faster than others. China is introducing a national, gamified social credit score through public-private partnerships. Eight credit-scoring systems have been granted charters, and each has a share of the national credit system. One social credit system ranks you based on how well you repay loans, the scores of your friends, where you shop and what you post to social media. This ranking determines whether you can receive loans or obtain a visa. In the US, the more limited FICO score can determine whether you get an apartment, a car, or a job.
The 20th-century French philosopher Michel Foucault would have said that these are technologies of power. Today, the interface designer and game scholar Sebastian Deterding says that this kind of gamification expresses a modernist view of a world with top-down managerial control. But the concept is flawed. Gamification promises easy, centralised overviews and control. ‘It’s a comforting illusion because de facto reality is not as predictable as a simulation,’ Deterding says. You can make a model of a city in SimCity that bears little resemblance to a real city. Mistaking games for reality is ultimately mistaking map for territory. No matter how well-designed, a simulation cannot account for the unforeseen.
A prime example of gamification gone awry is Go365, a health app introduced in 2017 by the Public Employees Insurance Agency (PEIA) in West Virginia and the Humana health insurance company. The app was presented as a motivating tool and game, not unlike smartphone fitness apps. Go365’s advertisements featured white, upper-middle-class joggers and attractively dishevelled soccer moms buying carrots. The app tracked physical activity, steps and location. It also allowed users to give more sensitive information to Humana, such as blood glucose levels, sleep cycles, diet and the results of doctor’s visits. Users were asked how often they drank and whether they smoked. Family medical histories were probed. The app awarded points, set milestones and gave rewards for participation in the form of ‘Bucks’ that could be redeemed for gift cards. The agency claimed that the app was voluntary, but failure to accrue enough points (and to increase points annually) meant an extra $500 in premiums and an additional $1,000 on top of existing deductibles. That might not sound like a lot, but most teachers and support staff in West Virginia make less than $40,000 a year. Many have second jobs. Many more are elderly or have chronic illnesses.
The legislature gave teachers no option but to play Go365 – but how they were supposed to play was another matter. ‘It was the cherry on top of a shit sundae,’ said Michael Mochaidean, a teacher and organiser in West Virginia. The teachers didn’t want to give up sensitive medical data. They didn’t want their locations tracked. After years of funding cuts to the PEIA, they saw the app as a way to kick teachers off their healthcare altogether.
Enraged, the teachers of West Virginia took to Facebook. They complained, they organised, and in March of 2018 thousands of them descended on the capitol in Charleston in a wildcat strike. After years of low pay and slashed benefits, their dissatisfaction had finally crystallised around the imposition of Go365. They would not participate in the game. By the end of the strike, the teachers had won a pay raise, and forced West Virginia to end its contract with Humana. Go365 was phased out. The teachers had sent a message to their bosses. Neither their work nor their health was a game.
This article was republished under a Creative Commons license from Aeon. Read the original here.