The psychology behind why we value physical objects over digital

Stated simply, it’s easier to develop meaningful feelings of ownership over a physical entity than a digital one.

By Christian Jarrett


When technological advances paved the way for digital books, films and music, many commentators predicted the demise of their physical equivalents. It hasn’t happened, so far at least. For instance, while there is a huge market in e-books, print books remain dominant. A large part of the reason comes down to psychology – we value things that we own, or anticipate owning, in large part because we see them as an extension of ourselves. And, stated simply, it’s easier to develop meaningful feelings of ownership over a physical entity than a digital one. A new paper in the Journal of Consumer Research presents a series of studies that demonstrate this difference. “Our findings illustrate how psychological ownership engenders a difference in the perceived value of physical and digital goods, yielding new insights into the relationship between consumers and their possessions,” the researchers said.

In an initial study at a tourist destination, Ozgun Atasoy and Carey Morewedge arranged for 86 visitors to have their photograph taken with an actor dressed as a historical character. Half the visitors were given a digital photo (emailed to them straight away), the others were handed a physical copy. Then they were asked how much they were willing to pay, if anything, for their photo, with the proceeds going to charity. The recipients of a physical photo were willing to pay more, on average, and not because they thought the production costs were higher.

It was a similar story when Atasoy and Morewedge asked hundreds of American volunteers on the Amazon Mechanical Turk survey website to say what they would be willing to pay for either physical or digital versions of the book Harry Potter and the Sorcerer’s Stone, and physical or digital versions of the movie The Dark Knight. The participants placed higher monetary value on the physical versions, and this seemed to be because they expected to have a stronger sense of ownership of them (for the physical versions, they agreed more strongly with statements like “I will feel like I own it” and “feel like it is mine”). In contrast, participants’ anticipated enjoyment was the same for the different versions and so can’t explain the higher value placed on the physical ones.

In further studies, the researchers showed that participants no longer placed higher value on physical objects over digital when they would be renting rather than buying – presumably because the greater appeal of owning something physical is irrelevant in this case. Likewise, the researchers found that participants who identified strongly with a particular movie (The Empire Strikes Back) placed higher value on owning a physical copy versus digital, but participants who had no personal connection with the film did not. This fits the researchers’ theorising because the greater sense of ownership afforded by a physical product is only an enticing prospect when there’s a motivation to experience a strong sense of connection with it.

If it is a greater psychological sense of ownership that makes physical objects so appealing, then, the researchers reasoned, people with a greater “need for control” will be particularly attracted to them – after all, to own something is to control it. Atasoy and Morewedge found some support for this in their final study. The higher that participants scored on a “need for control” scale (they agreed with items like “I prefer doing my own planning”), the more they tended to say that physical books would engender a greater sense of ownership, and, in turn, this was associated with their being willing to pay more for them than for digital versions.

The findings have some intriguing implications for companies seeking to boost the appeal of digital products, the researchers said. Any intervention that engenders a greater psychological sense of ownership over digital entities – such as allowing personalisation, or the ability to interact with them in some way – will likely boost their value. Similarly, the results may help explain the ubiquity of digital piracy: because people generally place a lower value on digital products (even when they see the production costs as the same as for physical ones), it follows that many of us consider the theft of digital products less serious than physical theft.

Digital Goods are Valued Less than Physical Goods

Christian Jarrett (@Psych_Writer) is Editor of BPS Research Digest and author of the TED-Ed Lesson “Why are we so attached to our things?”

Facebook’s founding president admitted how it exploits human psychology

By Hanna Kozlowska
Sean Parker told Axios: “God only knows what it’s doing to our children’s brains.”

Most people don’t need to be told they’re addicted to technology and social media. If reaching for your cell phone first thing in the morning doesn’t tell you as much, multiple scientific studies and books will. Now the people responsible for this modern-day addiction have admitted that was their plan all along.

Silicon Valley bad boy Sean Parker, Facebook’s first president, told Axios in an interview that the service “literally changes your relationship with society,” and “probably interferes with productivity in weird ways.” And, he added, “God only knows what it’s doing to our children’s brains.”

Facebook’s main goal is to get and keep people’s attention, Parker said. “The thought process that went into building these applications, Facebook being the first of them, … was all about: ‘How do we consume as much of your time and conscious attention as possible?’”

Attention, he said, was fueled by “a little dopamine hit every once in a while,” in the form of a like or a comment, which would generate more content, in the forms of more likes and comments.

“It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology.”

Parker said that the inventors of social media platforms, including himself, Facebook’s Mark Zuckerberg and Instagram’s Kevin Systrom, “understood consciously” what they were doing. “And we did it anyway.”

How Information Overload Robs Us of Our Creativity: What the Scientific Research Shows

August 5th, 2017

Source: http://www.openculture.com/2017/08/how-information-overload-robs-us-of-our-creativity.html

Everyone used to read Samuel Johnson. Now it seems hardly anyone does. That’s a shame. Johnson understood the human mind, its sadly amusing frailties and its double-blind alleys. He understood the nature of that mysterious act we casually refer to as “creativity.” It is not the kind of thing one lucks into or masters after a seminar or lecture series. It requires discipline and a mind free of distraction. “My dear friend,” said Johnson in 1783, according to his biographer and secretary Boswell, “clear your mind of cant.”

There’s no missing apostrophe in his advice. Inspiring as it may sound, Johnson did not mean to say “you can do it!” He meant “cant,” an old word for cheap deception, bias, hypocrisy, insincere expression. “It is a mode of talking in Society,” he conceded, “but don’t think foolishly.” Johnson’s injunction resonated through a couple centuries, became garbled into a banal affirmation, and was lost in a graveyard of image macros. Let us endeavor to retrieve it, and ruminate on its wisdom.

We may even do so with our favorite modern brief in hand, the scientific study. There are many we could turn to. For example, notes Derek Beres, in a 2014 book neuroscientist Daniel Levitin brought his research to bear in arguing that “information overload keeps us mired in noise…. This saps us of not only willpower (of which we have a limited store) but creativity as well.” “We sure think we’re accomplishing a lot,” Levitin told Susan Page on The Diane Rehm Show in 2015, “but that’s an illusion… as a neuroscientist, I can tell you one thing the brain is very good at is self-delusion.”

Johnson’s age had its own version of information overload, as did that of another curmudgeonly voice from the past, T.S. Eliot, who wondered, “Where is the wisdom we have lost in knowledge? Where is the knowledge we have lost in information?” The question leaves Eliot’s readers asking whether what we take for knowledge or information really is such. Maybe they’re just as often forms of needless busyness, distraction, and overthinking. Stanford researcher Emma Seppälä suggests as much in her work on “the science of happiness.” At Quartz, she writes,

We need to find ways to give our brains a break…. At work, we’re intensely analyzing problems, organizing data, writing—all activities that require focus. During downtime, we immerse ourselves in our phones while standing in line at the store or lose ourselves in Netflix after hours.

Seppälä exhorts us to relax and let go of the constant need for stimulation, to take long walks without the phone, get out of our comfort zones, make time for fun and games, and generally build in time for leisure. How does this work? Let’s look at some additional research. Bar-Ilan University’s Moshe Bar and Shira Baror undertook a study to measure the effects of distraction, or what they call “mental load,” the “stray thoughts” and “obsessive ruminations” that clutter the mind with information and loose ends. Our “capacity for original and creative thinking,” Bar writes at The New York Times, “is markedly stymied” by a busy mind. “The cluttered mind,” writes Jessica Stillman, “is a creativity killer.”

In a paper published in Psychological Science, Bar and Baror describe how “conditions of high load” foster unoriginal thinking. Participants in their experiment were asked to remember strings of arbitrary numbers, then to play word association games. “Participants with seven digits to recall resorted to the most statistically common responses (e.g., white/black),” writes Bar, “whereas participants with two digits gave less typical, more varied pairings (e.g., white/cloud).” Our brains have limited resources. When constrained and overwhelmed with thoughts, they pursue well-trod paths of least resistance, trying to efficiently bring order to chaos.

“Imagination,” on the other hand, wrote Dr. Johnson elsewhere, “a licentious and vagrant faculty, unsusceptible of limitations and impatient of restraint, has always endeavored to baffle the logician, to perplex the confines of distinction, and burst the enclosures of regularity.” Bar describes the contrast between the imaginative mind and the information processing mind as “a tension in our brains between exploration and exploitation.” Gorging on information makes our brains “’exploit’ what we already know,” or think we know, “leaning on our expectation, trusting the comfort of a predictable environment.” When our minds are “unloaded,” on the other hand, such as can occur during a hike or a long, relaxing shower, we can shed fixed patterns of thinking, and explore creative insights that might otherwise get buried or discarded.

As Drake Baer succinctly puts it at New York Magazine’s Science of Us, “When you have nothing to think about, you can do your best thinking.” Getting to that state in a climate of perpetual, unsleeping distraction, opinion, and alarm requires another kind of discipline: the discipline to unplug, wander off, and clear your mind.

For another angle on this, you might want to check out Cal Newport’s 2016 book, Deep Work: Rules for Focused Success in a Distracted World.

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

How Smartphones Hijack Our Minds


Research suggests that as the brain grows dependent on phone technology, the intellect weakens.

Illustration: Serge Bloch

So you bought that new iPhone. If you are like the typical owner, you’ll be pulling your phone out and using it some 80 times a day, according to data Apple collects. That means you’ll be consulting the glossy little rectangle nearly 30,000 times over the coming year. Your new phone, like your old one, will become your constant companion and trusty factotum—your teacher, secretary, confessor, guru. The two of you will be inseparable.

The smartphone is unique in the annals of personal technology. We keep the gadget within reach more or less around the clock, and we use it in countless ways, consulting its apps and checking its messages and heeding its alerts scores of times a day. The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.

We love our phones for good reasons. It’s hard to imagine another product that has provided so many useful functions in such a handy form. But while our phones offer convenience and diversion, they also breed anxiety. Their extraordinary usefulness gives them an unprecedented hold on our attention and vast influence over our thinking and behavior. So what happens to our minds when we allow a single tool such dominion over our perception and cognition?

Scientists have begun exploring that question—and what they’re discovering is both fascinating and troubling. Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.

The division of attention impedes reasoning and performance.

Adrian Ward, a cognitive psychologist and marketing professor at the University of Texas at Austin, has been studying the way smartphones and the internet affect our thoughts and judgments for a decade. In his own work, as well as that of others, he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.

A 2015 Journal of Experimental Psychology study, involving 166 subjects, found that when people’s phones beep or buzz while they’re in the middle of a challenging task, their focus wavers, and their work gets sloppier—whether they check the phone or not. Another 2015 study, which involved 41 iPhone users and appeared in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.


The earlier research didn’t explain whether and how smartphones differ from the many other sources of distraction that crowd our lives. Dr. Ward suspected that our attachment to our phones has grown so intense that their mere presence might diminish our intelligence. Two years ago, he and three colleagues—Kristen Duke and Ayelet Gneezy from the University of California, San Diego, and Disney Research behavioral scientist Maarten Bos—began an ingenious experiment to test his hunch.

The researchers recruited 520 undergraduate students at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available cognitive capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.

As the phone’s proximity increased, brainpower decreased.

The results were striking. In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.

In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.

A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.

In an April article in the Journal of the Association for Consumer Research, Dr. Ward and his colleagues wrote that the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.” Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking. The fact that most of us now habitually keep our phones “nearby and in sight,” the researchers noted, only magnifies the mental toll.

Dr. Ward’s findings are consistent with other recently published research. In a similar but smaller 2014 study (involving 47 subjects) in the journal Social Psychology, psychologists at the University of Southern Maine found that people who had their phones in view, albeit turned off, during two demanding tests of attention and cognition made significantly more errors than did a control group whose phones remained out of sight. (The two groups performed about the same on a set of easier tests.)

In another study, published in Applied Cognitive Psychology in April, researchers examined how smartphones affected learning in a lecture class with 160 students at the University of Arkansas at Monticello. They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly. A study of 91 secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.

It isn’t just our reasoning that takes a hit when phones are around. Social skills and relationships seem to suffer as well. Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.


In a study conducted at the University of Essex in the U.K., 142 participants were divided into pairs and asked to converse in private for 10 minutes. Half talked with a phone in the room, while half had no phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in 2013 in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.” The downsides were strongest when “a personally meaningful topic” was being discussed. The experiment’s results were validated in a subsequent study by Virginia Tech researchers, published in 2016 in the journal Environment and Behavior.

The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.

Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking—that has, in the psychological jargon, “salience.” Media and communications devices, from telephones to TV sets, have always tapped into this instinct. Whether turned on or switched off, they promise an unending supply of information and experiences. By design, they grab and hold our attention in ways natural objects never could.

But even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings—which it always is. Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.

The irony of the smartphone is that the qualities we find most appealing—its constant connection to the net, its multiplicity of apps, its responsiveness, its portability—are the very ones that give it such sway over our minds. Phone makers like Apple and Samsung and app writers like Facebook and Google design their products to consume as much of our attention as possible during every one of our waking hours, and we thank them by buying millions of the gadgets and downloading billions of the apps every year.

A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it isn’t that simple. The way a media device is designed and used exerts at least as much influence over our minds as does the information that the device unlocks.

People’s knowledge may dwindle as gadgets grant them easier access to online data.

As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores. In a seminal 2011 study published in Science, a team of researchers—led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner—had a group of volunteers read 40 brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be immediately erased.

Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it. The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”

Now that our phones have made it so easy to gather information online, our brains are likely offloading even more of the work of remembering to technology. If the only thing at stake were memories of trivial facts, that might not matter. But, as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.” Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.

We aren’t very good at distinguishing the knowledge we keep in our heads from the information we find on our phones.

This story has a twist. It turns out that we aren’t very good at distinguishing the knowledge we keep in our heads from the information we find on our phones or computers. As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”

That insight sheds light on our society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media by Russian agents and other bad actors. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.

Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains. When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning. Upgrading our gadgets won’t solve the problem. We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.

Mr. Carr is the author of “The Shallows” and “Utopia Is Creepy,” among other books.

Tech Giants, Once Seen as Saviors, Are Now Viewed as Threats

Facebook, Google and others positioned themselves as bettering the world. But their systems and tools have also been used to undermine democracy.

Credit: Ali Asaei for The New York Times

SAN FRANCISCO — At the start of this decade, the Arab Spring blossomed with the help of social media. That is the sort of story the tech industry loves to tell about itself: It is bringing freedom, enlightenment and a better future for all mankind.

Mark Zuckerberg, the Facebook founder, proclaimed that this was exactly why his social network existed. In a 2012 manifesto for investors, he said Facebook was a tool to create “a more honest and transparent dialogue around government.” The result, he said, would be “better solutions to some of the biggest problems of our time.”

Now tech companies are under fire for creating problems instead of solving them. At the top of the list is Russian interference in last year’s presidential election. Social media might have originally promised liberation, but it proved an even more useful tool for stoking anger. The manipulation was so efficient and so lacking in transparency that the companies themselves barely noticed it was happening.

The election is far from the only area of concern. Tech companies have accrued a tremendous amount of power and influence. Amazon determines how people shop, Google how they acquire knowledge, Facebook how they communicate. All of them are making decisions about who gets a digital megaphone and who should be unplugged from the web.

Their amount of concentrated authority resembles the divine right of kings, and is sparking a backlash that is still gathering force.

“For 10 years, the arguments in tech were about which chief executive was more like Jesus. Which one was going to run for president. Who did the best job convincing the work force to lean in,” said Scott Galloway, a professor at New York University’s Stern School of Business. “Now sentiments are shifting. The worm has turned.”

News is dripping out of Facebook, Twitter and now Google about how their ad and publishing systems were harnessed by the Russians. On Nov. 1, the Senate Intelligence Committee will hold a hearing on the matter. It is unlikely to enhance the companies’ reputations.

Under growing pressure, the companies are mounting a public relations blitz. Sheryl Sandberg, Facebook’s chief operating officer, was in Washington this week, meeting with lawmakers and making public mea culpas about how things happened during the election “that should not have happened.” Sundar Pichai, Google’s chief executive, was in Pittsburgh on Thursday talking about the “large gaps in opportunity across the U.S.” and announcing a $1 billion grant program to promote jobs.

Underlying the meet-and-greets is the reality that the internet long ago became a business, which means the companies’ first imperative is to do right by their stockholders.

Ross Baird, president of the venture capital firm Village Capital, noted that when ProPublica tried last month to buy targeted ads for “Jew haters” on Facebook, the platform did not question whether this was a bad idea — it asked the buyers how they would like to pay.

“For all the lip service that Silicon Valley has given to changing the world, its ultimate focus has been on what it can monetize,” Mr. Baird said.

Criticism of tech is nothing new, of course. In a Newsweek jeremiad in 1995 titled “Why the Web Won’t Be Nirvana,” the astronomer Clifford Stoll pointed out that “every voice can be heard cheaply and instantly” on the Usenet bulletin boards, that era’s Twitter and Facebook.

“The result?” he wrote. “Every voice is heard. The cacophony more closely resembles citizens band radio, complete with handles, harassment and anonymous threats. When most everyone shouts, few listen.”

Such complaints, repeated at regular intervals, did not stop the tech world from seizing the moment. Millions and then billions of people flocked to its services. The chief executives were regarded as sages. Disruption was the highest good.

What is different today are the warnings from the technologists themselves. “The monetization and manipulation of information is swiftly tearing us apart,” Pierre Omidyar, the founder of eBay, wrote this week.

Justin Rosenstein, a former Facebook engineer, was portrayed in a recent Guardian story as an apostate: Noting that sometimes inventors have regrets, he said he had programmed his new phone to not let him use the social network.

Mr. Rosenstein, a co-founder of Asana, an office productivity start-up, said in an email that he had banned not just Facebook but also the Safari and Chrome browsers, Gmail and other applications.

“I realized that I spend a lot of time mindlessly interacting with my phone in ways that aren’t serving me,” he wrote. “Facebook is a very powerful tool that I continue to use every day, just with more mindfulness.”

Justin Rosenstein, a former Facebook engineer, recently said he had programmed his phone to prevent him from using the social network on it. Credit: Stephen McCarthy/Sportsfile for Web Summit

If social media is on the defensive, Mr. Zuckerberg is particularly on the spot — a rare event in a golden career that has made him, at 33, one of the richest and most influential people on the planet.

“We have a saying: ‘Move fast and break things,’” he wrote in his 2012 manifesto. “The idea is that if you never break anything, you’re probably not moving fast enough.”

Facebook dropped that motto two years later, but critics say too much of the implicit arrogance has lingered. Mr. Galloway, whose new book, “The Four,” analyzes the power of Facebook, Amazon, Google and Apple, said the social media network was still fumbling its response.

“Zuckerberg and Facebook are violating the No. 1 rule of crisis management: Overcorrect for the problem,” he said. “Their attitude is that anything that damages their profits is impossible for them to do.”

Joel Kaplan, Facebook’s vice president of global public policy, said the network was doing its best.

“Facebook is an important part of many people’s lives,” he said. “That’s an enormous responsibility — and one that we take incredibly seriously.”

Some social media entrepreneurs acknowledge that they are confronting issues they never imagined as employees of start-ups struggling to survive.

“There wasn’t time to think through the repercussions of everything we did,” Biz Stone, a Twitter co-founder, said in an interview shortly before he rejoined the service last spring.

He maintained that Twitter was getting an unfair rap: “For every bad thing, there are a thousand good things.” He acknowledged, however, that sometimes “it gets a little messy.”

Despite the swell of criticism, the vast majority of investors, consumers and regulators seem not to have changed their behavior. People still eagerly await the new iPhone. Facebook has more than two billion users. President Trump likes to criticize Amazon on Twitter, but his administration ignored pleas for a rigorous examination of Amazon’s purchase of Whole Foods.

In Europe, however, the ground is already shifting. Google’s share of the search engine market there is 92 percent, according to StatCounter. But that did not stop the European Union from fining it $2.7 billion in June for putting its products above those of its rivals.

A new German law that fines social networks huge sums for not taking down hate speech went into effect this month. On Tuesday, a spokesman for Prime Minister Theresa May of Britain said the government was looking “carefully at the roles, responsibility and legal status” of Google and Facebook, with an eye to regulating them as news publishers rather than platforms.

“This war, like so many wars, is going to start in Europe,” said Mr. Galloway, the New York University professor.

For some tech companies, the new power is a heavy weight. Cloudflare, which provides many sites with essential protection from hacking, made its first editorial decision in August: It lifted its protection from The Daily Stormer, basically expunging the neo-Nazi site from the visible web.

“Increasingly tech companies are going to be put into the position of making these sorts of judgments,” said Matthew Prince, Cloudflare’s chief executive.

The picture is likely to get even more complicated. Mr. Prince foresees several possible dystopian futures. One is where every search engine has a political point of view, and users gravitate toward the one they feel most comfortable with. That would further balkanize the internet.

Another possibility is the opposite extreme: Under the pressure of regulation, all hate speech — and eventually all dissent — is filtered out.

“People are realizing that technology isn’t neutral,” Mr. Prince said. “I used to travel to Europe to hear these fears. Now I just have to go to Sacramento.”

Las Vegas Shooting News Coverage – A Perspective


Last night I received a text from my mom wondering if we should attend the Bruno Mars concert coming up in November. I bought tickets for her birthday this year and we have been excited about attending. What brought on this sudden second guessing? The news coverage of the mass shooting in Las Vegas, of course. What happened in Vegas was truly horrible, and many are now second guessing how safe it is to attend concerts and other events. While I scrolled through my news feed and perused Facebook, my friends wondered in their posts how such a horrific event could happen. As expected, proponents of tighter gun laws have been in the news, which has started a lively debate in my Facebook feed. This post is not about my political views on gun laws, nor is it intended to downplay what has happened. My heart truly goes out to everyone affected. My aim is to offer some food for thought as we all absorb the events and the news coverage.

The likelihood of being killed in a firearm homicide is relatively low compared with other potential causes of death. In 2014 there were 11,008 firearm homicide deaths in the U.S. This translates to 3.5 people out of 100,000, or a 0.0035% chance (CDC, 2017). Firearm homicides are dwarfed by the top 10 causes of death in 2016, which were as follows:

  • Heart disease: 633,842
  • Cancer: 595,930
  • Chronic lower respiratory diseases: 155,041
  • Accidents (unintentional injuries): 146,571
  • Stroke (cerebrovascular diseases): 140,323
  • Alzheimer’s disease: 110,561
  • Diabetes: 79,535
  • Influenza and pneumonia: 57,062
  • Nephritis, nephrotic syndrome, and nephrosis: 49,959
  • Intentional self-harm (suicide): 44,193 (CDC, 2017)
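The rate arithmetic above can be checked in a few lines. This is only a rough sanity check: the 2014 U.S. population figure of roughly 318.9 million is my assumption, not a number stated in the post.

```python
# Rough sanity check of the firearm-homicide rate cited above.
# The population figure (~318.9 million, U.S. 2014) is an assumption.
deaths = 11_008
population = 318_900_000

per_100k = deaths / population * 100_000   # deaths per 100,000 people
percent = deaths / population * 100        # chance expressed as a percentage

print(round(per_100k, 1))   # ≈ 3.5 per 100,000
print(round(percent, 4))    # ≈ 0.0035 percent
```

Both figures line up with the CDC-derived numbers quoted in the paragraph.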

Looking at the numbers, we should all be more concerned about the lifestyles and choices that contribute directly to heart disease and cancer. So why don't stories about the leading causes of death receive the same amount of media coverage? Because the media's number-one job is to create audiences, and anything sensational or out of the ordinary does the best job of attracting attention (much like trying to pass a car crash on the freeway without looking).

Creating audiences, however, is far more hyper-targeted than it used to be. News media companies collect personally identifiable information about our viewing and reading habits through cookies, device IDs and set-top box data, to name a few sources. That data is then used to sell advertisers the best target audiences across their platforms. For example, Apple's algorithms know I have recently been following hurricanes, since I was in Florida right before Irma. On October 3rd, the "For You" section showed an article from the Miami Herald about a tropical depression moving toward the Caribbean. Right below that article, an advertisement from Wells Fargo (my bank) was strategically placed. Wells Fargo has my personal information and so does Apple, so they can use an intermediary to anonymize and match my data between the two companies while remaining privacy compliant. My anonymized information then lets Wells Fargo target its advertisement in my Apple news feed. Because the targeting is more precise, Wells Fargo in theory sees a lift in ROI, and Apple commands higher advertising rates.
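The anonymized matching described above is often implemented by hashing a shared identifier (such as an email address) on both sides, so the intermediary only ever sees hashes, never raw personal data. Here is a minimal sketch of that idea; the customer lists, email addresses and function names are all hypothetical, not details from any real Apple or Wells Fargo system.

```python
import hashlib

def hashed_id(email: str) -> str:
    """Normalize an email address and hash it, so raw PII never leaves either company."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Hypothetical customer lists held separately by an advertiser and a publisher
advertiser_customers = {hashed_id(e) for e in ["reader@example.com", "alice@example.com"]}
publisher_audience = {hashed_id(e) for e in ["reader@example.com", "bob@example.com"]}

# The intermediary compares only hashes and returns the overlap for ad targeting
matched = advertiser_customers & publisher_audience
print(len(matched))  # one user appears on both lists, identified only by hash
```

Real matching services add salting, stricter normalization and contractual privacy controls on top of this, but the core mechanism is the same set intersection over hashed identifiers.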

While the media uses sensational headlines and stories to capture more of our attention, bad news in the media also affects our stress levels. A study found that news coverage of the 2007 Virginia Tech shootings increased "acute stress" in students at other universities who followed the events in the news media; moreover, the more coverage of the subject students consumed, the more likely they were to show stronger stress symptoms (Fallahi & Lesik, 2009). Constant negativity in the news can exacerbate our own feelings of sadness and anxiety, as well as how severely we perceive our own situation (Davey, 2012). A big daily dose of negative news can certainly send me into a spin of constantly checking my mobile device for updates and an overall pessimistic outlook that day.

Does this mean we should all turn off the news and stop paying attention to what is going on in the world? Of course not; the news media plays a positive role in society as well. We just need to remember that the news media's first priority is to create audiences, and react accordingly.

References:

CDC. (2017, March 17). Centers for Disease Control and Prevention Assault or Homicide. Retrieved October 6, 2017, from National Center for Health Statistics: https://www.cdc.gov/nchs/fastats/homicide.htm

CDC. (2017, March 17). National Center for Health Statistics Leading Causes of Death. Retrieved October 2017, from Centers for Disease Control and Prevention: https://www.cdc.gov/nchs/fastats/leading-causes-of-death.htm

Davey, G. (2012). Psychology Today. Retrieved from https://www.psychologytoday.com/blog/why-we-worry/201206/the-psychological-effects-tv-news

Fallahi, C. R., & Lesik, S. A. (2009). The effects of vicarious exposure to the recent massacre at Virginia Tech. Psychological Trauma: Theory, Research, Practice and Policy, 1(3), 220-230. Retrieved from http://dx.doi.org/10.1037/a0015052

Advertising vs Data – Media Advertising

For over 30 years now I’ve been hearing what seems to be a never-ending debate between “suits” of the research orientation and creatives about how advertising should be done. Creatives always complain that research has a tendency to kill great advertising. Research people point to case after case of bad creative that would never have […]

via Creativity in Advertising vs. Data — mediainmind

Media Psychology

I like socks. Strange socks. People who don’t know what to get me at Christmas usually land on the weirdest socks they can find at EA Games. So when I found out there was a socks company that donated a pair of socks for every pair purchased, it had my attention. But that’s not even […]

via Ear Hustle —