New brain research by USC scientists shows that reading stories is a universal experience that may result in people feeling greater empathy for each other, regardless of cultural origins and differences.
And in what appears to be a first for neuroscience, USC researchers have found patterns of brain activation when people find meaning in stories, regardless of their language. Using functional MRI, the scientists mapped brain responses to narratives in three different languages — English, Farsi and Mandarin Chinese.
The USC study opens up the possibility that exposure to narrative storytelling can have a widespread effect on triggering better self-awareness and empathy for others, regardless of the language or origin of the person being exposed to it.
“Even given these fundamental differences in language, which can be read in a different direction or contain a completely different alphabet altogether, there is something universal about what occurs in the brain at the point when we are processing narratives,” said Morteza Dehghani, the study’s lead author and a researcher at the Brain and Creativity Institute at USC.
Dehghani is also an assistant professor of psychology at the USC Dornsife College of Letters, Arts and Sciences, and an assistant professor of computer science at the USC Viterbi School of Engineering.
The study was published on Sept. 20 in the journal Human Brain Mapping.
Making sense of 20 million personal anecdotes
The researchers sorted through more than 20 million blog posts of personal stories using software developed at the USC Institute for Creative Technologies. The posts were narrowed down to 40 stories about personal topics such as divorce or telling a lie.
They were then translated into Mandarin Chinese and Farsi, and read by a total of 90 American, Chinese and Iranian participants in their native language while their brains were scanned by MRI. The participants also answered general questions about the stories while being scanned.
Using state-of-the-art machine learning and text-analysis techniques, and an analysis involving over 44 billion classifications, the researchers were able to “reverse engineer” the data from these brain scans to determine the story the reader was processing in each of the three languages. In effect, the neuroscientists were able to read the participants’ minds as they were reading.
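The article doesn't detail the decoding pipeline, but the general idea of "reverse engineering" which story a reader was processing can be sketched with a toy nearest-centroid classifier on synthetic activation vectors. Everything below (feature counts, noise levels, the simulated scans) is illustrative and not taken from the study:

```python
import random

random.seed(0)

N_STORIES, N_FEATURES = 3, 50

# Each story gets a distinct, fixed activation "signature" pattern.
signatures = [[random.gauss(0, 1) for _ in range(N_FEATURES)]
              for _ in range(N_STORIES)]

def simulate_scan(story_id, noise=0.5):
    """A noisy simulated brain response to reading one story."""
    return [v + random.gauss(0, noise) for v in signatures[story_id]]

def distance(a, b):
    """Euclidean distance between two activation vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# "Training": average several noisy scans per story into a centroid.
centroids = []
for s in range(N_STORIES):
    scans = [simulate_scan(s) for _ in range(10)]
    centroids.append([sum(col) / len(col) for col in zip(*scans)])

def decode(scan):
    """Guess which story produced this scan (nearest centroid)."""
    return min(range(N_STORIES), key=lambda s: distance(scan, centroids[s]))

# Decode a fresh, unseen scan of story 1.
predicted = decode(simulate_scan(1))
```

The real study worked with far richer fMRI features, cross-validated machine-learning classifiers, and analyses across three languages; this sketch only conveys the core logic of matching a new brain response to learned story-specific patterns.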
The brain is not resting
In the case of each language, reading each story resulted in unique patterns of activations in the “default mode network” of the brain. This network engages interconnected brain regions such as the medial prefrontal cortex, the posterior cingulate cortex, the inferior parietal lobe, the lateral temporal cortex and hippocampal formation.
The default mode network was originally thought to be a sort of autopilot for the brain at rest, active only when someone is not engaged in externally directed thinking. Continued studies, including this one, suggest that the network is actually working behind the scenes while the mind is ostensibly at rest, continually finding meaning in narrative and serving an autobiographical memory-retrieval function that shapes how we think about the past, the future, ourselves, and our relationships with others.
“One of the biggest mysteries of neuroscience is how we create meaning out of the world. Stories are deep-rooted in the core of our nature and help us create this meaning,” said Jonas Kaplan, corresponding author at the Brain and Creativity Institute and an assistant professor of psychology at USC Dornsife.
Morteza Dehghani, Reihane Boghrati, Kingson Man, Joe Hoover, Sarah I. Gimbel, Ashish Vaswani, Jason D. Zevin, Mary Helen Immordino-Yang, Andrew S. Gordon, Antonio Damasio, Jonas T. Kaplan. Decoding the neural representation of story meanings across languages. Human Brain Mapping, 2017; DOI: 10.1002/hbm.23814
Phubbing is the practice of snubbing others in favor of our mobile phones. We’ve all been there, as either victim or perpetrator. We may no longer even notice when we’ve been phubbed (or are phubbing), it has become such a normal part of life. However, research studies are revealing the profound impact phubbing can have on our relationships and well-being.
There’s an irony in phubbing. When we’re staring at our phones, we’re often connecting with someone on social media or through texting. Sometimes, we’re flipping through our pictures the way we once turned the pages of photo albums, remembering moments with people we love. Unfortunately, however, this can severely disrupt our actual, present-moment, in-person relationships, which also tend to be our most important ones.
The research shows that phubbing isn’t harmless—but the studies to date also point the way to a healthier relationship with our phones and with each other.
What phubbing does to us
According to a study of 145 adults, phubbing decreases marital satisfaction, in part because it leads to conflict over phone use. The scientists found that phubbing, by lowering marital satisfaction, affected a partner’s depression and satisfaction with life. A follow-up study by Chinese scientists assessed 243 married adults with similar results: Partner phubbing, because it was associated with lower marital satisfaction, contributed to greater feelings of depression. In a study poignantly titled “My life has become a major distraction from my cell phone,” Meredith David and James Roberts suggest that phubbing can lead to a decline in one of the most important relationships we can have as an adult: the one with our life partner.
Phubbing also shapes our casual friendships. Not surprisingly to anyone who has been phubbed, phone users are generally seen as less polite and attentive. Let’s not forget that we are extremely attuned to people. When someone’s eyes wander, we intuitively know what brain studies also show: The mind is wandering. We feel unheard, disrespected, disregarded.
A set of studies actually showed that just having a phone out and present during a conversation (say, on the table between you) interferes with your sense of connection to the other person, the feelings of closeness experienced, and the quality of the conversation. This is especially true during meaningful conversations—you lose the opportunity for true and authentic connection to another person, the foundation of any friendship or relationship.
In fact, many of the problems with mobile interaction relate to distraction from the physical presence of other people. According to these studies, conversations with no smartphones present are rated as significantly higher-quality than those with smartphones around, regardless of people’s age, ethnicity, gender, or mood. We feel more empathy when smartphones are put away.
This makes sense. When we are on our phones, we are not looking at other people and not reading their facial expressions (tears in their eyes, frowns, smiles). We don’t hear the nuances in their tone of voice (was it shaky with anxiety?), or notice their body posture (slumped and sad? or excited and enthusiastic?).
No wonder phubbing harms relationships.
The way of the phubbed
What do “phubbed” people tend to do?
According to a study published in March of this year, they themselves start to turn to social media. Presumably, they do so to seek inclusion. They may turn to their cell phone to distract themselves from the very painful feelings of being socially neglected. We know from brain-imaging research that being excluded registers as actual physical pain in the brain. Phubbed people in turn become more likely to attach themselves to their phones in unhealthy ways, thereby increasing their own feelings of stress and depression.
A Facebook study shows that how we interact on Facebook affects whether it makes us feel good or bad. When we use social media just to passively view others’ posts, our happiness decreases. Another study showed that social media actually makes us more lonely.
“It is ironic that cell phones, originally designed as a communication tool, may actually hinder rather than foster interpersonal connectedness,” write David and Roberts in their study “Phubbed and Alone.” Their results suggest the creation of a vicious circle: A phubbed individual turns to social media and their compulsive behavior presumably leads them to phub others—perpetuating and normalizing the practice and problem of “phubbing.”
Why do people get into the phubbing habit in the first place? Not surprisingly, fear of missing out and lack of self-control predict phubbing. However, the most important predictor is addiction—to social media, to the cell phone, and to the Internet. Internet addiction shows brain correlates similar to those of substance addictions, such as addiction to heroin and other recreational drugs. The impact of this addiction is particularly worrisome for children, whose brains and social skills are still developing.
Nicholas Kardaras, former Stony Brook Medicine clinical professor and author of Glow Kids, goes so far as to liken screen time to digital cocaine. Consider this: The urge to check social media is stronger than the urge for sex, according to research by the University of Chicago’s Wilhelm Hofmann.
These findings come as no surprise—decades of research have shown that our greatest need after food and shelter is for positive social connection with other people. We are profoundly social beings for whom connection and a sense of belonging are crucial for health and happiness. (In fact, a lack of social connection is worse for your health than smoking, high blood pressure, or obesity.) So we sometimes err, looking for connection on social media at the cost of face-to-face opportunities for true intimacy.
How to stop phubbing people
To prevent phubbing, awareness is the only solution. Know that what drives you and others is the desire to connect and to belong. While you may not be able to control the behavior of others, you yourself have the opportunity to model something different.
Research by Barbara Fredrickson, beautifully described in her book Love 2.0, suggests that intimacy happens in micro-moments: talking over breakfast, the exchange with the UPS guy, the smile of a child. The key is to be present and mindful. A revealing study showed that we are happiest when we are present, no matter what we are doing. Can we be present with the person in front of us right now, no matter who it is?
Studies by Paula Niedenthal reveal that the most essential and intimate form of connection is eye contact. Yet social media is primarily verbal. Research conducted by scientists like the GGSC’s Dacher Keltner and others has shown that posture and the most minute facial expressions (the tightening of our lips, the crow’s feet of smiling eyes, upturned eyebrows in sympathy or apology) communicate more than our words.
Most importantly, they are at the root of empathy—the ability to sense what another person is feeling—which is so critical to authentic human connection. Research shows that altruism and compassion also make us happier and healthier, and can even lengthen our lives. True connection thrives on presence, openness, observation, compassion, and, as Brené Brown has so beautifully shared in her TED talk and her bestselling book Daring Greatly, vulnerability. It takes courage to connect with another person authentically, yet it is also the key to fulfillment.
What to do if you are phubbed
What if you are phubbed? Patience and compassion are key here. Understand that the phubber is probably not acting with malicious intent, but rather following an impulse (sometimes irresistible) to connect. Just like you or me, they are not aiming to exclude anyone. On the contrary, they are looking for a feeling of inclusion. After all, a telling sociological study shows that loneliness is rising at an alarming rate in our society.
What’s more, age and gender play a role in people’s reactions to phubbing. According to studies, older participants and women advocate for more restricted phone use in most social situations. Men differ from women in that they view phone calls as appropriate in virtually all environments, including—and this is quite shocking—intimate settings. Similarly, in classrooms, male students find phubbing far less disturbing than their female counterparts do.
Perhaps even worse than disconnecting from others, however, Internet addiction and phubbing disconnect us from ourselves. Plunged into a virtual world, we hunch over a screen, strain our eyes unnecessarily, and tune out completely from our own needs—for sleep, exercise, even food. A disturbing study indicates that for every minute we spend online for leisure, we’re not just compromising our relationships, we are also losing precious self-care time (e.g., sleep, household activities) and productivity.
So, the next time you’re with another human and you feel tempted to pull out your phone—stop. Put it away. Look them in the eyes, and listen to what they have to say. Do it for them, do it for yourself, do it to make the world a better place.
Isn’t it scandalous that Barack Obama, whose health-care reform law established death panels, is a Muslim who was born in Kenya? And isn’t it scary that all those scientific studies have shown that childhood vaccines can cause autism?
You might not believe these falsehoods, but if so, you’re a minority. In a 2015 study, political scientist Adam Berinsky of MIT asked thousands of US voters to rate the truth or falsity of seven myths, such as that Obama is a Muslim or that vote fraud in Ohio swung the 2004 presidential election to George W. Bush. On average, people believed about two of them, he found. “Some people believe a lot of crazy things,” Berinsky said, “but mostly it’s a lot of people believing a few crazy things.”
Such credulity is bad enough in terms of personal decision-making, as when it causes parents to opt out of childhood vaccines. The notion that a democracy’s electoral decisions are partly shaped by outright lies and slanted rumors must have George Orwell chortling smugly in his grave. Even worse is that misinformation can be “sticky,” or impervious to correction. But the reasons we believe misinformation and resist efforts to debunk it shed some not-very-flattering light on the workings of the human mind.
Start at the beginning, when we first hear a claim or rumor. People “proceed on the assumption that speakers try to be truthful,” psychologist Stephan Lewandowsky of England’s University of Bristol and colleagues explained in Psychological Science in the Public Interest. “Some research has even suggested that to comprehend a statement, people must at least temporarily accept it as true.”
That’s because compared to assuming the truth of a claim, assessing its plausibility is cognitively more demanding. It requires paying careful attention, marshaling remembered facts, and comparing what we just heard to what we (think we) know and remember. With the exceptions of assertions from a messenger we reflexively mistrust (as in, “I won’t believe anything Fox News says”) or involving something we know like our own name, our cognitive reflex is that what we’re hearing is likely true. The mental deck is stacked in favor of belief, not skepticism.
In addition, people are generally more likely to accept claims that are consistent with what they already believe. In what’s called “motivated reasoning,” we process new information through the filter of our preexisting worldview. Think of the process as akin to filing papers. If a new document arrives and fits the contents of an existing folder, it’s much easier to file—remember—than if it doesn’t. Similarly, if many Americans had not already been primed with the idea that Obama is an outsider and a threat to “people like them,” the birther and death-panel assertions would not have gained the traction they did.
So now we have widely believed falsehoods. Let’s debunk them.
MIT’s Berinsky tried. In a 2015 study, he asked nearly 2,000 US voters whether the 2010 Affordable Care Act (“Obamacare”) established death panels that would decide whether treatment should be withdrawn from elderly patients. Among voters who said they follow political news, 57% said the death-panel claim was untrue, Berinsky reported in the British Journal of Political Science.
Fifty-seven percent might seem like cause to despair (“only 57% knew the truth?!”). But wait, it got worse. When Berinsky showed people information from nonpartisan sources such as the American Medical Association correcting the death-panel claim, it made little difference in the ranks of believers. “Rumors acquire their power through familiarity,” he said. “Merely repeating a rumor”—including to debunk it—“increases its strength” because our fallible brains conflate familiarity (“I’ve heard that before”) with veracity (“…so it must be true”). As a result, “confronting citizens with the truth can sometimes backfire and reinforce existing misperceptions.”
His findings reinforced something scientists had seen before: the “fluency effect.” The term refers to the fact that people judge the accuracy of information by how easy it is to recall or process. The more we hear something, the more familiar we are with it, so the more likely we are to accept it as true. That’s why a “myths vs. facts” approach to correcting beliefs about, say, vaccinations often fails. Right after reading such correctives, many people accept that something they believed to be true (that the flu vaccine can cause the flu, to take an example from one recent study) isn’t. But the effect fades.
Just hours later, people believe the myth as strongly as ever, studies find. Repeating false information, even in a context of “this is wrong,” makes it more familiar. Familiarity = fluency, and fluency = veracity. The Internet, of course, has exponentially increased the amount of misinformation available to us all, which means that we are “fluent” in evermore fallacious rumors and claims.
Debunking faces another hurdle: If misinformation fits our worldview, then the debunking clashes with that view. Earlier studies have shown that when self-described political conservatives were shown information that Iraq did not possess weapons of mass destruction (WMDs) at the time of the 2003 invasion, they became even more likely to believe Iraq had those weapons. Challenging a core conservative belief—that the invasion was justified on those grounds, that the George W. Bush administration was correct in claiming those weapons existed—caused them to double down. It is harder to accept that the reports of WMDs in Iraq were false if one supported the 2003 invasion and the president who ordered it. The WMD debunking worked, correcting erroneous beliefs, only among opponents of the invasion and others whose political beliefs meshed with the retraction, a 2010 study found.
Now, to switch presidents, relinquishing belief in Obamacare’s death panels challenges the mental model of the president as a nefarious schemer who hates People Like Me. If that’s my cognitive model, then removing the fact (sic) of death panels weakens it. Challenging my mental model makes me have to pause and think, wait, which negative rumors about Obama are correct and which are myths? Easier to believe they’re all true.
Misinformation is sticky because evicting it from our belief system requires cognitive effort. Remember the situation: Our mind holds an assertion that likely went down easy, cognitively speaking; we assumed the veracity of the source and fluently slotted it into our mental worldview. Now here comes contrary information. It makes us feel cognitively uneasy and requires more mental processing power to absorb. That’s the very definition of non-fluent: the information does not flow easily into our consciousness or memory.
All is not lost, however. In his death-panels study, Berinsky followed the AMA debunking with something quite different: quotes from a Republican senator slamming the rumors as a pack of lies. Now 69% agreed the claim was a fabrication—a significant uptick—with more disbelievers among both Democrats and Republicans. When an “unlikely source” refutes a rumor, Berinsky explained, and the debunking runs contrary to that source’s interests (a Republican defending Obamacare?!), “it can increase citizens’ willingness to reject rumors.”
If the most effective way to debunk false rumors is to get a politician to speak against his or her own interests…well, I leave it to you, reader, to decide if, in our hyperpartisan world, this is more likely to happen than pigs flying.
Sharon Begley is a senior science writer with The Boston Globe Media Group, author of Train Your Mind, Change Your Brain, and coauthor with Richard Davidson of The Emotional Life of Your Brain. She writes a regular column for Mindful magazine called Brain Science.
You probably know the Google Effect: the first rigorous finding in the booming research into how digital technology affects cognition. It’s also known as digital amnesia, and it works like this: When we know where to find a piece of information, and when it takes little effort to do so, we are less likely to remember that information. First discovered by psychologist Betsy Sparrow of Columbia University and her colleagues, the Google Effect causes our brains to take a pass on retaining or recalling facts such as “an ostrich’s eye is bigger than its brain” (an example Sparrow used) when we know they are only a few keystrokes away.
“Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally,” Sparrow explained in her 2011 paper. “When we need it, we will look it up.” Storing information requires mental effort—that’s why we study before exams and cram for presentations—so unless we feel the need to encode something into a memory, we don’t try. Result: Our recollection of ostrich anatomy, and much else, dissipates like foam on a cappuccino.
It’s tempting to leap from the Google Effect to dystopian visions of empty-headed dolts who can’t remember even the route home (thanks a lot, GPS), let alone key events of history (cue Santayana’s hypothesis that those who can’t remember history are doomed to repeat it). But while the short-term effects of digital tech on what we remember and how we think are real, the long-term consequences are unknown; the technology is simply too new for scientists to have figured it out.
Before we hit the panic button, it’s worth reminding ourselves that we have been here before. Plato, for instance, bemoaned the spread of writing, warning that it would decimate people’s ability to remember (why make the effort to encode information in your cortex when you can just consult your handy papyrus?). On the other hand, while writing did not trigger a cognitive apocalypse, scientists are finding more and more evidence that smartphones and internet use are affecting cognition already.
The Google Effect? We’ve probably all experienced it. “Sometimes I spend a few minutes trying hard to remember some fact”—like whether a famous person is alive or dead, or what actor was in a particular movie—“and if I can retrieve it from my memory, it’s there when I try to remember it two, five, seven days later,” said psychologist Larry Rosen, professor emeritus at California State University, Dominguez Hills, who researches the cognitive effects of digital technology. “But if I look it up, I forget it very quickly. If you can ask your device any question, you do ask your device any question” rather than trying to remember the answer or doing the mental gymnastics to, say, convert Celsius into Fahrenheit.
“Doing that is profoundly impactful,” Rosen said. “It affects your memory as well as your strategy for retrieving memories.” That’s because a memory’s physical embodiment in the brain is essentially a long daisy chain of neurons, adding up to something like architect I.M. Pei is alive or swirling water is called an eddy. Whenever we mentally march down that chain we strengthen the synapses connecting one neuron to the next. The very act of retrieving a memory therefore makes it easier to recall next time around. If we succumb to the LMGTFY (let me Google that for you) bait, which has become ridiculously easy with smartphones, that doesn’t happen.
To which the digital native might say, so what? I can still Google whatever I need, whenever I need it. Unfortunately, when facts are no longer accessible to our conscious mind, but only look-up-able, creativity suffers. New ideas come from novel combinations of disparate, seemingly unrelated elements. Just as having many kinds of Legos lets you build more imaginative structures, the more elements—facts—knocking around in your brain the more possible combinations there are, and the more chances for a creative idea or invention. Off-loading more and more knowledge to the internet therefore threatens the very foundations of creativity.
Besides letting us outsource memory, smartphones let us avoid activities that many people find difficult, boring, or even painful: daydreaming, introspecting, thinking through problems. Those are all so aversive, it seems, that nearly half of the participants in a 2014 experiment, whose smartphones were briefly taken away, preferred receiving electric shocks to being alone with their thoughts. Yet surely our mental lives are the poorer every time we check Facebook or play Candy Crush instead of daydreaming.
But why shouldn’t we open the app? The appeal is undeniable. We each have downloaded an average of nearly 30 mobile apps, and spend 87 hours per month browsing the internet on our smartphones, according to digital marketing company Smart Insights. As a result, distractions are just a click away—and we’re really, really bad at resisting distractions. Our brains evolved to love novelty (maybe human ancestors who were attracted to new environments won the “survival of the fittest” battle), so we flit among different apps and websites.
As a result, people spend an average of just three to five minutes at their computer working on the task at hand before switching to Facebook or another enticing website or, with phone beside them, a mobile app. The most pernicious effect of the frenetic, compulsive task switching that smartphones facilitate is to impede the achievement of goals, even small everyday ones. “You can’t reach any complex goal in three minutes,” Rosen said. “There have always been distractions, but while giving in used to require effort, like getting up and making a sandwich, now the distraction is right there on your screen.”
The mere existence of distractions is harmful because resisting distractions that we see out of the corner of our eye (that Twitter app sitting right there on our iPhone screen) takes effort. Using fMRI to measure brain activity, neuroscientist Adam Gazzaley of the University of California, San Francisco, found that when people try to ignore distractions it requires significant mental resources. Signals from the prefrontal cortex race down to the visual cortex, suppressing neuronal activity and thereby filtering out what the brain’s higher-order cognitive regions have deemed irrelevant. So far, so good.
The problem is that the same prefrontal regions are also required for judgment, attention, problem solving, weighing options, and working memory, all of which are required to accomplish a goal. Our brains have limited capacity to do all that. If the prefrontal cortex is mightily resisting distractions, it isn’t hunkering down to finish the term paper, monthly progress report, sales projections, or other goal it’s supposed to be working toward. “We are all cruising along on a superhighway of interference” produced by the ubiquity of digital technology, Gazzaley and Rosen wrote in their 2016 book The Distracted Mind. That impedes our ability to accomplish everyday goals, to say nothing of the grander ones that are built on the smaller ones.
The constant competition for our attention from all the goodies on our phone and other screens means that we engage in what a Microsoft scientist called “continuous partial attention.” We just don’t get our minds deeply into any one task or topic. Will that have consequences for how intelligent, creative, clever, and thoughtful we are? “It’s too soon to know,” Rosen said, “but there is a big experiment going on, and we are the lab rats.”
Tech Invasion LMGTFY
“Let me Google that for you” may be some of the most damaging words for our brain. Psychologists have theorized that the “Google Effect” causes our memories to weaken due merely to the fact that we know we can look something up, which means we don’t keep pounding away at the pathways that strengthen memory. Meanwhile, research suggests that relying on GPS weakens our age-old ability to navigate our surroundings. And to top it all off, the access to novel info popping up on our phone means that, according to Deloitte, people in the US check their phones an average of 46 times per day—which is more than a little disruptive.
Social media is undoubtedly a large part of most people’s lives these days, with the average person spending about 135 minutes daily on social media. That is more than two hours a day! Statistics reveal that teens spend up to nine hours daily on social media. With this in mind, we cannot help but ask what draws people to spend so much of their time on social media. Well, a recent study tackled certain aspects of this issue.
Motivations for Social Media Use
The study, done by Ozimek and colleagues, came up with three motives for social media use. They included:
Self-presentation, or the need to present yourself and your life as positively as possible (to both yourself and others)
Social interaction and the need to belong (staying in touch with friends and family members)
Social comparison, or the tendency to measure yourself against others
Although there might be more reasons than the ones outlined above, most of them stem from one of the three. For instance, if you tend to scroll back through your feed to remind yourself of some of the things you have posted earlier, self-presentation does matter to you. Regardless of your motivation for social media use, it is important to be aware of both the positive and negative effects it has on your wellbeing.
The researchers found that many people use social media in pursuit of materialistic goals and wondered whether materialism could be yet another motivation for social media use.
Materialism and Social Media
A recent study suggested that people with loads of Facebook friends are more materialistic than those with fewer Facebook friends. “Materialistic people use Facebook more frequently because they tend to objectify their Facebook friends – they acquire Facebook friends to increase their possessions,” concluded the study’s researcher, Phillip Ozimek. The study used a questionnaire to measure how much people compare themselves to others, as well as their materialistic goals.
This falls in line with a previous study on materialism, which found that materialists collect things they can display publicly; it is not simply about owning them. Facebook is the ideal place for a materialist to show off their possessions. In addition, there is another aspect of materialism – objectification – in which materialists view other people as objects. This is clearly visible on social media, where users tend to place a high value on the number of friends they have.
“More generally, we suggest that materialists have a tendency to view and treat non-material events (like friendships) as a possession or as a means to attain their materialistic goals,” the Ozimek study states. This can be seen on job networking sites such as LinkedIn.
The Effects of Materialism
First and foremost, materialists often neglect the emotions of those they objectify, which can in turn damage interpersonal relationships. When people feel devalued, they tend to withdraw from the relationship.
Secondly, materialism can lead to emotional health problems too. When material items are the measure of self-worth, losing them can cause a crisis or other forms of stress. If one person unfriends you, you can replace them with another. But if many people do the same, you are likely to question your own value, aren’t you?
It can be a full-time hobby to keep up with technology as it evolves. Every year, I find myself donating or selling my favorite gadgets as they become obsolete. However, there’s one ancient technology that I’ve been buying more than selling, and that’s vinyl. And I’m not alone. Vinyl record sales hit a 28-year high in 2016, according to Fortune Magazine.
Raoul Benavides, owner of Flashlight Vinyl, explains why he was able to open a record store in 2016, and why we miss listening to vinyl records.
People like to think of themselves as savvy shoppers, but they are still vulnerable to common psychological tricks. How do consumers decide what to buy? The truth is that stores know you better than you know yourself – both online and offline. (Source: “How Stores Trick You Into Buying More Things,” a video by The Atlantic, Oct. 11, 2017.)
The new International Classification of Diseases (ICD-11), the WHO’s official diagnostic manual, will be published in 2018, having last been updated in 1990. That makes the addition of gaming disorder to the manual quite significant.
“Health professionals need to recognise that gaming disorder may have serious health consequences,” Vladimir Poznyak at the WHO’s Department of Mental Health and Substance Abuse told New Scientist.
Of course, most people who indulge in a spot of Super Mario Odyssey or Zelda aren’t addicted, so the criteria for diagnosing the disorder have been carefully considered.
According to a current draft, the criteria include making gaming a priority “to the extent that gaming takes precedence over other life interests”, and continuing this despite the risk of it being detrimental to your health – such as lack of sleep and sustenance. However, this behavior must be observed for at least a year before diagnosis can be confirmed.
According to Poznyak, the WHO has been considering this inclusion for the best part of a decade, and now, after consultations with mental health experts, the organization is satisfied it meets the criteria of a disorder. When asked why other technology-based addictions were not being included Poznyak said: “There is simply a lack of evidence that these are real disorders.”
Of course, there are plenty of arguments against this new inclusion, including the fear of unnecessarily attaching a stigma to people and trivializing what people consider “real” conditions.
Psychiatrist Allen Frances, former chair of the task force behind the Diagnostic and Statistical Manual of Mental Disorders, has previously said that the DSM, compiled by experts to help define and classify mental disorders, refused to include Internet addiction as a condition for fear of mislabelling and overtreating millions of people who just really, really like their smartphones.
As he points out, “billions of people around the world are hooked on caffeine for fun or better functioning, but only rarely does this cause more trouble than it’s worth.”
However, it was also the DSM’s reclassification of gambling disorder from a compulsion to an addiction in 2013 that legitimized non-substance addiction as a diagnostic category – one that is very hard to define as it is based mostly on symptoms – opening up the possibility that almost anything could be considered pathological.
Indeed, multiple studies have been carried out asking whether or not a wide variety of subjects from shopping to sugar to suntanning to love can be officially described as addictive. Whether they too will one day be recognized as official conditions remains to be seen.
Just about every article I write sets the stage by giving recent estimates of the number of hours children are spending in front of screens. The numbers vary by survey or research study but the fact that they are high and getting higher does not. It’s easy to look at some stats:
Parents estimate their 5-to-18-year-old kids spend 4.9 hours per day on a digital device.
Broken out by age in different studies, those numbers look like this:
Parents estimate children up to age eight spend 2 hours and 19 minutes with screens.
Parents estimate children aged 8-to-12 years old spend 4 hours and 36 minutes using screens.
Teens spend an average of 6 hours and 40 minutes engaged with screen entertainment (excluding school work).
The conclusion of examining these shocking statistics is usually: “This can’t be good,” with the same foreboding feeling as when a movie character goes out exploring in a horror flick.
Popular media and parents have been talking about addiction in relation to screen-based media since the advent of the iPhone in 2007. The American Psychiatric Association is very conservative about behavioral addiction diagnoses: it took decades of research and consensus to add Gambling Disorder as a behavioral addiction diagnosis to the DSM-5 (the volume of diagnosable mental health issues) in 2013. Long before its addition to the DSM-5, many families struggled with Gambling Disorder and many treatments were available. The association has strict guidelines regarding the research needed to validate a diagnosis and to provide information on prevalence rates, comorbid conditions and course.
In 2013, the American Psychiatric Association also added Internet Gaming Disorder to its list of “Conditions for Further Study.” Once there is a sufficient research basis, it could become a diagnosable disorder. However, it is restricted to online gaming, not screen-time in general. While people may feel addicted to screen-time, research has not yet shown the same issues with tolerance and unsuccessful attempts to cut back.
Yet parents and children alike are using the term “addiction” to describe their relationship with screen-based technology. A recent survey of teenagers suggests that over 50 percent of them “feel addicted” to their mobile devices. The survey was conducted by Common Sense Media, whose founder and CEO, James Steyer, stated, “What we’ve discovered is that kids and parents feel addicted to their mobile devices, that it is causing daily conflict in homes, and that families are concerned about the consequences. We also know that problematic media use can negatively affect children’s development and that multitasking can harm learning and performance. As a society we all have a responsibility to take media use and addiction seriously and make sure parents have the information to help them make smart choices for their families.”
Another recent survey study asked about addiction and digital devices: 67% of the 394 U.S. parents of children aged 5-to-18-years-old surveyed described their children as addicted. A virtually identical percentage of parents say they themselves are addicted to digital devices.
Basically, we know that screen-time addiction is not a diagnosable mental health disorder and yet, we also know that a large percentage of parents and children are reporting that they feel they are “addicted” to screen-based media or digital devices. The next step is to clarify and quantify what kids and parents mean when they say they are addicted to screens.
How do I know if my kid’s screen-time is problematic?
Most parents have an idea of when their child’s screen-time has become problematic. However, new research has given us a standardized way to determine whether those hours of screen-time are problematic. A group of researchers has created the Problematic Media Use Measure specifically for parents of children aged 4-to-11-years. The scale items were created based on the 9 criteria for Internet Gaming Disorder in the DSM-5 and then validated in a series of studies. Importantly, the researchers found that the scale was able to predict problems in functioning over and above children’s total number of hours of screen-time. This indication of incremental validity demonstrates that the measure adds something to our understanding of problematic screen-time beyond “she spends how many hours on that thing?”
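The idea of incremental validity can be made concrete with a small simulation: fit one regression predicting functioning problems from screen hours alone, then a second adding the scale score, and compare the variance explained. Everything below is a hypothetical sketch with simulated data — the variable names and effect sizes are assumptions for illustration, not the study’s actual data or analysis.

```python
import numpy as np

# Simulated data (hypothetical): daily screen hours, a problematic-use
# scale score, and a measure of problems in functioning.
rng = np.random.default_rng(0)
n = 500
hours = rng.normal(4.0, 1.5, n)                    # daily screen hours
scale = 0.3 * hours + rng.normal(0.0, 1.0, n)      # problematic-use score
problems = 0.2 * hours + 0.6 * scale + rng.normal(0.0, 1.0, n)

def r_squared(predictors, y):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_hours = r_squared([hours], problems)            # hours alone
r2_both = r_squared([hours, scale], problems)      # hours + scale score
print(f"R^2 hours only: {r2_hours:.2f}; with scale: {r2_both:.2f}")
# The gain in R^2 when the scale is added is its incremental validity:
# the scale predicts problems beyond what raw hours already explain.
```

In this toy setup the scale score carries information about problems that hours alone do not, so the second model explains noticeably more variance — which is the pattern the researchers report for their measure.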
The development of this scale is big news to parents (who can have a standardized method of examining their children’s screen habits) and to researchers (who have a more sensitive measure than total hours to examine screen-time problems). The scale items ask about those things that concern parents about screen-media.
Here are some areas to think about if you are worried about your child’s screen-time:
The study and associated measure were just published this year, and more studies will need to be conducted to develop clinical cut-off scores. For now, though, this measure can help parents, clinicians and researchers parse out media use from problematic media use.
To learn more about the study, see the abstract here. The full citation for the study is:
Domoff, S. E., Harrison, K., Gearhardt, A. N., Gentile, D. A., Lumeng, J. C., & Miller, A. L. (2017). Development and Validation of the Problematic Media Use Measure: A Parent Report Measure of Screen Media “Addiction” in Children. Psychology of Popular Media Culture. Advance online publication. http://dx.doi.org/10.1037/ppm0000163