Ever walk into a networking event or cocktail party and hear nothing but superficial chit-chat? The small talk is deafening and never evolves into anything substantial. You can hardly keep from rolling your eyes between sips of your mojito.
Questions like "What do you do?" and "Where do you live?" are predictable and exhausting; commentary about the weather or last night's game fills the awkward moments as people size each other up to determine: is this someone I want to talk to?
As it turns out, the types of conversations you’re engaging in truly matter for your personal wellbeing. In 2010, scientists from the University of Arizona and Washington University in St. Louis investigated whether happy and unhappy people differ in the types of conversations they have.
Seventy-nine participants wore a recording device over four days and were periodically recorded as they went about their lives. From more than 20,000 recordings, researchers classified each conversation as either trivial small talk or substantive discussion.
As published in Psychological Science, the happiest participants had twice as many genuine conversations and one third as much small talk as the unhappiest participants.
These findings suggest that the happy life is social and conversationally deep rather than isolated and superficial. The research also confirmed what most people know but don't practice: surface-level small talk does not build relationships.
The new trend: Ban the small talk
Inspired by these findings, behavioral scientists Kristen Berman and Dan Ariely, co-founders of Irrational Labs, a non-profit behavioral consulting company, raised the bar by hosting a dinner party where small talk was banned and only meaningful conversations were allowed.
As documented in a Wired article, Berman and Ariely's guests were given index cards featuring examples of meaningful (and odd) conversation starters, like the theory of suicide prevention or, um … "the art of the dominatrix."
The party was a hit. The authors report that “everyone was happier” without the obligation of trivial small talk.
Seizing the opportunity as any innovative entrepreneur would, Carolina Gawroński, founder of No Small Talk dinners, launched her business last month in Hong Kong, and the concept is quickly spreading to cities around the world.
“Growing up I was surrounded by, on the one side, [my father’s] interesting friends. But on the other side, there was this whole element of being social and being at bullshit social events,” Gawroński tells Hong Kong Free Press. “Since a young age, I’ve always questioned it: ‘Why do people talk like this? What’s the point?'”
The rules at a No Small Talk dinner event are simple: no phones and no small talk. Guests also receive cards with meaningful-conversation prompts.
Then, there’s Sean Bisceglia, a partner at Sterling Partners, a private equity firm. Bisceglia has hosted Jefferson-style dinners at his home for the past eight years.
The concept is basically the same, but with a purpose-driven, whole-table conversation: one person speaks at a time to the whole table, there are no side conversations, and small talk is banned.
“I do it because the shallowness of cocktail chitchat kind of drove me crazy,” Bisceglia tells Crain’s Chicago Business. “There was never any conversation deeper than two minutes. I really felt that if we could bring together a group of people, you could get into the issues and hear different people’s perspectives.”
13 questions to start great conversations
If you’ve bought into the idea of banning small talk from your conversations, here are thirteen no-fail conversation starters cherry-picked from a few credible sources:
What’s your story?
What’s the most expensive thing you’ve ever stolen?
What is your present state of mind?
What absolutely excites you right now?
What book has influenced you the most?
If you could do anything you wanted tonight (anywhere, for any amount of money), what would you do and why?
If you had the opportunity to meet one person you haven’t met who would it be, why and what would you talk about?
What’s the most important thing I should know about you?
What do you value more, intelligence or common sense?
What movie is your favorite guilty pleasure, and why?
You are stuck on a deserted island, and you can only take three things. What would they be?
When and where were you happiest in your life?
What do you think is the driving force in your life?
Surreal books and films could make you smarter, research finds.
Stories by Franz Kafka or films by master of the absurd David Lynch could boost learning.
Even an unsettling feeling, like the absurdity of life, can engender the desired state.
The reason is that surreal or nonsensical things put our mind into overdrive looking for meaning.
When people are more motivated to search for meaning, they learn better, the psychologists found.
Dr Travis Proulx, the study’s first author, explained:
“The idea is that when you’re exposed to a meaning threat –– something that fundamentally does not make sense –– your brain is going to respond by looking for some other kind of structure within your environment.
And, it turns out, that structure can be completely unrelated to the meaning threat.”
For the study, people read a Franz Kafka short story called ‘The Country Doctor’ — which involves a nonsensical series of events.
A control group read a version of the story that had been rewritten to make sense.
Afterwards, both groups were given an unconscious learning task that involved spotting patterns in strings of letters.
Dr Proulx said:
“People who read the nonsensical story checked off more letter strings –– clearly they were motivated to find structure.
But what’s more important is that they were actually more accurate than those who read the more normal version of the story.
They really did learn the pattern better than the other participants did.”
In a second study, people were made to feel their own lives didn’t make sense.
This was done by pointing out the contradictory decisions they had made.
Dr Proulx said:
“You get the same pattern of effects whether you’re reading Kafka or experiencing a breakdown in your sense of identity.
People feel uncomfortable when their expected associations are violated, and that creates an unconscious desire to make sense of their surroundings.
That feeling of discomfort may come from a surreal story, or from contemplating their own contradictory behaviors, but either way, people want to get rid of it.
So they’re motivated to learn new patterns.”
The study only tested unconscious learning, so it doesn’t tell us whether you would be able to use this trick intentionally.
Dr Proulx said:
“It’s important to note that sitting down with a Kafka story before exam time probably wouldn’t boost your performance on a test.
What is critical here is that our participants were not expecting to encounter this bizarre story.
If you expect that you’ll encounter something strange or out of the ordinary, you won’t experience the same sense of alienation.
You may be disturbed by it, but you won’t show the same learning ability.
The key to our study is that our participants were surprised by the series of unexpected events, and they had no way to make sense of them.
Hence, they strived to make sense of something else.”
How Language Shapes the Way We Think
There are about 7,000 languages spoken around the world — and they all have different sounds, vocabularies and structures. But do they shape the way we think? Cognitive scientist Lera Boroditsky shares examples of language — from an Aboriginal community in Australia that uses cardinal directions instead of left and right to the multiple words for blue in Russian — that suggest the answer is a resounding yes. “The beauty of linguistic diversity is that it reveals to us just how ingenious and how flexible the human mind is,” Boroditsky says. “Human minds have invented not one cognitive universe, but 7,000.”
Lera Boroditsky · Cognitive scientist
Lera Boroditsky is trying to figure out how humans get so smart.
Popular image of man ogling another woman deemed degrading and discriminatory
The popular Distracted Boyfriend meme, based on a photo of a man turning away from his outraged girlfriend to stare admiringly at another woman, has been ruled sexist by Sweden’s advertising ombudsman.
The stock image, also known as Man Looking at Other Woman, by Antonio Guillem, a photographer from Barcelona, was named meme of the year in April and was one of the most widely shared memes in 2017, providing comment on anything from music to politics to hit TV shows.
“The advertisement objectifies women,” the ombudsman, RO, said. “It presents women as interchangeable items and suggests only their appearance is interesting … It also shows degrading stereotypical gender roles of both men and women and gives the impression men can change female partners as they change jobs.”
The ombudsman said the image objectified the two women by presenting them as workplaces, but the man as an individual, and added that the “other woman” was clearly a “sex object … unrelated to the advertisement, which is for recruiting salespeople, operating engineers and a web designer”.
The Swedish advertising industry is self-regulating, meaning that the ombudsman can criticise ads but it does not have the power to impose sanctions.
The ad, posted in April, drew nearly 1,000 comments, many from women who complained it was sexist. “1. You really don’t want to attract women to your company,” one commenter, Susanne Lahti Hagbard, said. “2. You really don’t want to attract sensible guys either.”
Another, Sofie Sundåker, said: “It doesn’t matter if it’s a popular meme. If you do not see how this picture is sexist whatever words are on the people, you are clearly not a workplace for any woman who wants to be taken seriously in her work.”
The company said on its Facebook page that its aim had been “to illustrate a situation that shows Bahnhof is an attractive employer, and that people who have a slightly duller workplace might be interested in us. This was the situation illustrated in this meme.
“Anyone familiar with the internet and meme culture knows how this meme is used and interpreted. Gender is usually irrelevant in the context. We explained meme culture to the ombudsman, but it chose to interpret the post differently”.
If the company should be punished for anything, it concluded, “it should be for using a tired old meme”.
With every Facebook post you like, tweet you send, or question you type into Google, you’re giving the internet strength. Feeding the algorithms. Paying the advertisers. You’re also helping to fill server farms that will ultimately be replaced by bigger server farms, effectively anchoring the internet in the real world. This is all sweet and rosy, if the internet-human relationship is mutually beneficial. But it’s not clear that it is.
In some ways, our nonstop online lives are bringing us closer. But at least as often, the relentless pace of social media, email, and constant pings and beeps only serves to pull us further apart. And all this tech may well be bad for our health and happiness: research links social media to depression and high-speed internet to poor sleep. Simply having a phone visible during meals has been shown to make conversation among friends less enjoyable.
That said, these effects aren’t inevitable. Not yet, anyway. It’s probably hard to imagine life without a high-powered computer in your pocket or purse at all times, but it’s worth remembering that you’re still an autonomous being. You can decide how often and in what way you interact with the internet. And if you talk to the researchers, authors, and entrepreneurs who understand digital technology best, you discover that many of them already have.
We reached out to eight digital experts to find out how they maintain a (reasonably) healthy relationship with technology. All agreed that push notifications are evil, so you should go ahead and turn those off right now. Some of the experts even said they keep their ringers and text notifications off, at least some of the time. Beyond that, they all had unique strategies for defending themselves against the intrusive, obnoxious, and possibly destructive effects of technology.
Professor of psychology and behavioral economics at Duke University, author of Predictably Irrational: The Hidden Forces That Shape Our Decisions
Much of Dan Ariely’s work — including Timeful, the A.I.-powered calendar app he built and sold to Google — focuses on making the most of limited time. One way he does this is by starting each morning in a distraction-free environment. “I think very carefully about the first hour of the day,” he says. “I used to have two computers, and one had no email or browser on it.” That’s the one he used for writing in the mornings.
“The thing is to realize that our time to work is actually quite precious.”
Ariely’s travel schedule forced him to abandon the dual-computer setup, but the experiment was fruitful enough that he now relies on a self-imposed internet ban to get work done. “The last thing I do each day is turn my computer off,” he says. “The next day, when I turn it back on, my browser and email are still off.” And Ariely keeps it that way until he’s powered through that first hour. “The thing is to realize that our time to work is actually quite precious,” he says. “We need to protect it.”
Stanford professor, retired entrepreneur, and founder of the Lean Startup movement
Over the two-plus decades that Steve Blank helped shape Silicon Valley, he ushered eight technology startups into the world. But it was during his tenure at Rocket Science Games, a company he founded in the mid-1990s, that Blank began getting high on his own supply. “I found myself drug addicted,” he says. “I’d be up playing games until four in the morning.”
“The devices started as tools and ended up as drugs for most people.”
Video games are hardly a Schedule I narcotic, but Blank was losing sleep and, he felt, setting a bad example for his children. Emerging research confirms his idea that games and social media can exert drug-like forces over users. A study published in the journal PLOS One even found that digital addictions can shrink the amount of white matter at certain brain sites, creating changes similar to those seen in alcohol, cocaine, and methamphetamine addictions. “The devices started as tools and ended up as drugs for most people,” Blank says. “App manufacturers are incentivized to make us addicted. I’ll contend that a ton of social media is actually a lot like OxyContin.”
When Blank realized that his gaming habit was robbing him of happiness by way of lost sleep and family time, he snapped his CD-ROMs in half (this was the ’90s, remember). Then he threw the pieces into the trash. “I literally went cold turkey,” he says. “And I haven’t played a video game since.”
Professor of psychology and director of the Emotion and Self-Control Laboratory at the University of Michigan
After studying Facebook — and, more important, after finding that the biggest users were the least satisfied with life — Ethan Kross decided to refrain from any social media use. But he still checks his email more often than he’d like. “It’s a self-control failure from a self-control expert,” he says.
To be fair, the professor is probably selling himself short. The truth is he relies on three solid rules to prevent compulsive emailing.
“So I just try to change my digital environment. We know from research that can be a powerful tool for enhancing self-control.”
First, Kross pushes all fast-moving work conversations to Slack. “That way I can get information from my lab collaborators quickly, and my email becomes less urgent.”
Second, he uses the snooze function, which is available on Gmail and services like Boomerang for Outlook, for any email that isn’t urgent. “If there are 50 things in my inbox, that can be disruptive to my immediate goals,” Kross says. So he snoozes them for a few hours or a few days, depending on the urgency.
Finally, Kross relies on an email-free iPad for reading, so he can’t check his incoming mail even if he wants to. “I don’t like checking my email when I’m in bed, because once every month I’ll receive something that makes me not sleep well,” he says. “So I just try to change my digital environment. We know from research that can be a powerful tool for enhancing self-control.”
Researcher and professor of psychology at San Diego State University and the author of iGen, a book about how the internet is changing young adults
In April of last year, Jean Twenge signed up for Twitter. It’s her first and only social media account, and almost immediately she found herself clashing with people who disagreed with her research. “It’s a public forum, and I felt a compulsion to defend my arguments,” Twenge says. “But is that the right response? I don’t know. For my own mental health, I know it’s not.”
“It’s a public forum, and I felt a compulsion to defend my arguments.”
It’s not that she wanted to be on Twitter, but as an academic with a book to promote, Twenge felt like she had to. After six months with the service, though, Twenge noticed that she was increasingly giving in to a compulsion to check up on conversations that were making her miserable. “It completely confirmed why I don’t have social media,” she says. And so she scaled back. Twenge kept the account for promotional reasons and still has periods of time when she’s active, but when she needs a refresh, she consciously steps away for days or weeks.
When asked if she’s tempted to open an Instagram or Facebook account — even if just for research purposes — she replies quickly, “Nope.”
Professor at San Francisco State University and president of the Biofeedback Federation of Europe
As a researcher who explores the impact of excessive phone use (it makes us feel lonely) and the bad posture brought on by constantly staring at a screen, Erik Peper makes a point of keeping his phone at a distance. When he leaves home in the morning, he packs it into his backpack instead of his pocket. And when he returns in the evening, he docks it at the charging station by his front door.
What’s the point? There are two, actually.
“There are very few things that are truly urgent.”
First, the microwaves coming off mobile devices could present a small risk to their owners, Peper says. In a paper he wrote for the journal Biofeedback, Peper cites epidemiological research showing that people who use cellphones for more than 10 years are more likely than nonusers to have tumors on their salivary glands and inside their ear canals. They’re also three times as likely to have certain brain and spinal-cord tumors on the side of their head where they hold their phone. “The data is weak and controversial,” Peper admits. “But I believe in the precautionary principle, which says that you have to first prove something is totally safe before you can use it.”
The second reason is that, simply put, it’s a distraction. “The phone hijacks our evolutionary patterns,” Peper says. “We don’t do good with multitasking, so if you’re writing an article, and every five minutes you pop back to answer a message, you’re much less productive in the long term.” The same logic applies to socializing, he says, which is why his phone is stored out of sight when he’s with friends and family.
Does it matter that he’s a little slow to reply to messages? Or that he occasionally misses a call? “There are very few things that are truly urgent,” Peper says. “It’s different if you’re a firefighter, but beyond that, whether I answer the email this minute, later today, or even this evening — it really makes no difference.”
CEO of IFTTT, a service that lets you program your apps and smart devices to carry out rote tasks
Years ago, Linden Tibbets decided he didn’t want to be a slave to his email. Which meant, in short, that he would read and send messages only while sitting at his desk.
“The only time I send email on my phone is if I’m running late to a meeting and there’s no other way to communicate,” Tibbets says. “That’s literally the only time.”
“You can be endlessly entertained with what’s happening in the world around you. You don’t need your phone.”
The upshot, he says, is that he’s able to address his correspondence with better focus. “I would much rather spend an extra hour in the evening responding to email than be distracted by it off and on throughout the day,” Tibbets says. If it takes a while to reply to people, no big deal. “I just say, ‘Thanks for your patience. I apologize for being slow to get back to you.’”
And if he finds himself with a moment of downtime — standing in line for groceries, for instance — Tibbets considers it a rare opportunity for mind-wandering. “I play a game with myself where I try not to look at my phone,” he says. “I look at people. I read food labels. I observe things in the environment. You can be endlessly entertained with what’s happening in the world around you. You don’t need your phone.”
Professor of marketing at New York University and author of Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked
In Irresistible, Adam Alter argues that in some ways, tech addiction may actually be worse than cigarette addiction. Because the web is built on social connections, each new addict makes it harder for the rest of us to abstain. “Addictive tech is part of the mainstream in a way that addictive substances never will be,” Alter writes. “Abstinence isn’t an option.”
“I try to put my phone on airplane mode on weekends.”
So what does the tech critic do to protect his own mental autonomy? He disconnects when the workweek’s done. “I try to put my phone on airplane mode on weekends so I can take photos of my two young kids without interruptions from emails and other needy platforms.”
Entrepreneurial consultant, host of Glambition, a podcast for women in business
Last year, Ali Brown had a social media reckoning. “It was after the election, when everything was getting toxic and weird,” she says. “I was getting all my news from Facebook, and I felt this sense of unease all the time.”
So Brown did an entirely logical thing that most of us haven’t done: She drained the swamp on her phone. In one heroic moment of full-steam bravado, Brown deleted Facebook, Twitter, and Instagram and replaced them with apps from the Wall Street Journal and the New York Times. “I decided to pay for some really good journalism,” she says. “I’ll use my time to read those instead.”
“Responding to social media all day is going to get you nowhere.”
Once her healthier new phone routine was established, Brown added back one social media app — but just one! “I like Instagram because it’s generally happy and fun,” she says. “I post about my kids.”
Brown is lucky enough to have a team to run her Twitter and Facebook accounts, but she knows there are better uses for investing her personal time. “If you’re here in this life to do great, powerful work, then you need to create some space in your day to be a freethinker,” she says. “Responding to social media all day is going to get you nowhere.”
To her clients — mostly women running seven- and eight-figure companies — Brown generally offers this advice: “Try deleting social media for a week. You won’t miss anything, you won’t cease to exist, and you’ll thank me later.”
WRITTEN BY Clint Carter
Writer for publications such as Entrepreneur, Men’s Health, Men’s Journal, New York magazine, and Wall Street Journal.
I have observed an increasing number of articles coming across my news feeds and social media about how inaccurate perceptions of aging women impact them in the workplace. A recent WSJ article about women over 50 looking for work caught my attention, as the trends in the workplace and the media have some similarities. The article cited a 2015 study at the University of California, Irvine, in which researchers submitted 40,000 fake job applications from male and female “candidates” across three age ranges. Unfortunately, they found significant evidence of age discrimination against older women. The author also noted that women often take jobs below their capacity, skill level and pay grade, and are judged more harshly than their male counterparts for their appearance (Weber, 2017). Being someone who believes the “data doesn’t lie,” I looked at Census and labor statistics.
Women over 40 make up 48% of the U.S. population, while men over 40 are roughly 44% (United States Census Bureau, 2017). When it comes to unemployment, however, women tend to fare worse than men as they age: unemployment in the 45-54 age range is higher for women (3.6% of women vs 3% of men), equal in the 55-64 range (2.8% for both genders), and higher again in the over-65 segment (4% of women vs 3.1% of men) (Labor Force Statistics from the Current Population Survey, 2018).
In the age of awareness and press coverage around unconscious bias, you would think the problem of discrimination and false perceptions associated with age and gender would give way to a more enlightened public. So the question is: why? Why are women in higher age groups subjected to tougher hurdles and unfair perceptions by other groups? One variable to look at is the media we have consumed. If you think about the media consumed by multiple generations, older women have rarely been portrayed in a positive light. For example, Snow White had an evil older stepmother, The Little Mermaid had Ursula, the old gray-haired villain, and 101 Dalmatians’ villain was Cruella de Vil. The list goes on and on. See a theme here? If you think these portrayals haven’t impacted our perceptions, please read on…
Cultivation theory in psychology posits that media shapes the public’s worldview, especially in children. Media-created worldviews, especially those with high exposure, can influence schemas about what is perceived as normal, particularly for individuals who have little exposure to other groups except through media (Signorielli, 2004). Portrayals of age groups in television and film can influence our perceptions of the size of a demographic group, as well as its competencies. Negative portrayals of older age groups can and will create perceptions, particularly among younger demographics, because they are less likely to critically examine media portrayals. Perception formation does not only affect younger generations, however; members of the aging group tend to hold negative stereotypes and perceptions about their own group as well (Lauzen & Dozier, 2005b).
A double standard exists in how television, film and advertising portray aging men and women. In many films, women are portrayed as younger than male characters, and female characters are described as elderly at an earlier age than males. Women are often considered old in the film and television industry by age 35, whereas this threshold is higher for men (Bazzini, McIntosh, Smith, Cook, & Harris, 1997). Women’s value in film emphasizes looks and youth, whereas men have additional attributes that define their worth. In an analysis of the top 100 grossing films of 2002, Lauzen and Dozier (2005a) found that male characters over the age of 50 were depicted as active in all aspects of life, whereas female characters were not. Men are portrayed as if they still have things to accomplish as they age, while women are portrayed with less purposeful lives and fewer aspirations such as careers (Lauzen & Dozier, 2005a).
Television isn’t any better than film: over time it has portrayed aging women as becoming old earlier in life and as less visible than males. Furthermore, aging female characters are portrayed as less useful and with diminished capacity, particularly around prestige and other markers of importance and vitality, compared to men of the same age (Bazzini et al., 1997; Signorielli, 2004). A 2005 study of primetime television characters found that representation, recognition and respect are not the same for men and women as they age. Specifically:
Aging female characters had less representation than their male counterparts starting in their 40s.
Portrayals of leadership increased with age; however, men were much more likely to play leadership roles in their 40s and 50s compared to women.
Occupational power portrayals had a positive linear relationship with age for both genders; however, men in their 50s were more likely to have occupational power than women of the same age.
Male characters of all ages were likely to have goals, whereas among women, only those in their 40s were likely to have goals.
Lauzen and Dozier’s research concluded that there is a double standard of respect afforded to aging characters based on gender. Male characters were more likely to have leadership roles, occupational power and goals compared to women, which could reinforce a stereotype bias against older women in the workplace (Lauzen & Dozier, 2005b).
Some of you reading this article may look at the age of the research I am referencing and say, “This research is 10 to 20 years old, and so much has changed.” With women’s issues receiving more attention in the media, it wouldn’t be farfetched to cite, as proof points of the changing times, actors such as Lily Tomlin and Jane Fonda in Grace and Frankie, or Judi Dench and Helen Mirren in powerful roles in recent years. However, this is a false assumption, because cultivation theory posits that what we see in the media creates our worldviews regardless of its veracity. A 2016 study analyzed over 2,000 movie screenplays and the gender associated with dialogue. As women aged, their share of dialogue quickly diminished, while men’s increased with age. For example, women between 22 and 31 received 38% of screenplay words (men received 20%), while between ages 42 and 65 women received 20% and men received 39%. The numbers for over 65 were abysmal for both genders, though women fared worse at 3% compared to 5% for males (Anderson & Daniels, 2016).
The Center for the Study of Women in Television and Film’s analysis of the top 100 grossing films of 2017 did not paint an encouraging picture. Women held only 34% of all speaking roles, which is sad considering they represent half the population. And when that already unfair share of speaking roles was broken down by age, the story continued to favor the younger woman: men over 40 accounted for 46% of all male characters, whereas women over 40 were only 29% of female characters (Lauzen, 2018). While it is wonderful to see some older women taking on powerful lead roles, the attention they receive is certainly not the norm.
There you have it: as women age, if they appear in media and entertainment at all, they are often portrayed as old, ugly, evil, less competent and less powerful, with little to accomplish, and they receive less respect than their male counterparts. American culture associates beauty with goodness, and therefore a woman’s value tends to be tied to her looks, favoring the young (Bazzini, McIntosh, Smith, Cook, & Harris, 1997). The time has come for all supervisors, recruiters and human resource departments to rethink their assumptions and check their unconscious bias about aging women as well. Women over 40 are a sizeable portion of the population; we are not invisible, and dammit, we are just as smart, capable and appealing as our male counterparts. America’s unemployment is low, skilled talent is a growing issue, and women over 40 represent an opportunity to fill the gap. Is your perception of that woman’s qualifications based on data, or is Cinderella’s evil stepmother influencing your opinion?
Bazzini, D. G., McIntosh, W. D., Smith, S. M., Cook, M., & Harris, C. (1997). The aging woman in popular film: Underrepresented, unattractive, unfriendly, and unintelligent. Sex Roles: A Journal of Research, 36(7-8), 531-543. doi:10.1007/BF0276689
Video game addiction is a term that has been used for years by parents and mental health professionals who believe it’s a real disorder. Now there’s more weight behind their argument: the World Health Organization (WHO) has included “gaming disorder” as a new mental health condition in the 11th edition of its International Classification of Diseases.
According to WHO, there are three major criteria for a diagnosis of gaming disorder: gaming takes precedence over other activities to the point that a person often stops doing other things; a person continues gaming even when it causes problems in their life, or feels unable to stop; and gaming causes significant distress and impairment in a person’s relationships with others, as well as their work or school life. If your child gets sucked into a game for a few days but goes back to normal afterward, they wouldn’t qualify: the behavior must persist for at least 12 months, WHO says.
It’s worth noting that WHO’s stance on gaming addiction is different from that of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), the handbook used by health professionals in the U.S. and other countries to help diagnose mental health disorders. The DSM-5 calls out “Internet Gaming Disorder” but says it’s a condition that warrants more clinical research and experience before it can be classified in the book as a formal disorder.
WHO says on its website that everyone who plays video games should be aware that gaming disorder is a real condition, and that it’s important to be mindful of how often they play. However, it also points out that gaming disorder affects only a small proportion of people who game.
It’s only natural that the news would make you give your child’s gaming system the side-eye.
In general, parents should limit the amount of screen time their children have daily, and gaming is included in that, along with TV, computers, phones and tablet use, Gina Posner, MD, a pediatrician at MemorialCare Orange Coast Medical Center in Fountain Valley, Calif., tells Yahoo Lifestyle.
Screen time isn’t recommended at all for kids who are 18 months or younger, and for children older than that up to age five, the general recommendation is no more than one hour a day, she says. For those who are six and up, it’s more at the parents’ discretion. “The maximum amount of screen time should be two hours a day, but less is always better,” Posner says.
Posner says that it’s important to set clear limits for your child when it comes to screen time and gaming. For example, say that your child has to do their homework first and/or get out and play for an hour before they’re allowed to game. And even then, make it clear that they’re only allowed to do so for a set period of time.
If your child starts fussing when they’re not allowed to be gaming all day, it’s a clear sign that you need to cut back, Posner says.
Treatment for gaming disorder is generally based in cognitive behavioral therapy, which would generally be done in two phases, Simon Rego, PsyD, chief psychologist at Montefiore Medical Center/Albert Einstein College of Medicine, tells Yahoo Lifestyle. The first is raising awareness for your child that their gaming is a problem, and looking for triggers and cues that could make the gaming habit better or worse. A mental health professional would also address problematic thoughts associated with either stopping playing or the thoughts that keep them gaming, he says.
The goal then is to step down the behavior from something that’s pathological to problematic, and then being able to manage it in a “reasonable way,” Rego says. People don’t necessarily have to quit gaming altogether, but they do need to learn to better manage it with parameters, like only gaming with friends during select times during the day vs. doing it at night alone in their room.
If you suspect that your child has a gaming disorder, it’s important to seek help for it.
Just know that this is still a new diagnosis and you may need to do some sleuthing to find someone who specializes in this kind of behavior.
Smartphones have by now been implicated in so many crummy outcomes—car fatalities, sleep disturbances, empathy loss, relationship problems, failure to notice a clown on a unicycle—that it almost seems easier to list the things they don’t mess up than the things they do. Our society may be reaching peak criticism of digital devices.
Even so, emerging research suggests that a key problem remains underappreciated. It involves kids’ development, but it’s probably not what you think. More than screen-obsessed young children, we should be concerned about tuned-out parents.
Yes, parents now have more face time with their children than did almost any parents in history. Despite a dramatic increase in the percentage of women in the workforce, mothers today astoundingly spend more time caring for their children than mothers did in the 1960s. But the engagement between parent and child is increasingly low-quality, even ersatz. Parents are constantly present in their children’s lives physically, but they are less emotionally attuned. To be clear, I’m not unsympathetic to parents in this predicament. My own adult children like to joke that they wouldn’t have survived infancy if I’d had a smartphone in my clutches 25 years ago.
To argue that parents’ use of screens is an underappreciated problem isn’t to discount the direct risks screens pose to children: Substantial evidence suggests that many types of screen time (especially those involving fast-paced or violent imagery) are damaging to young brains. Today’s preschoolers spend more than four hours a day facing a screen. And, since 1970, the average age of onset of “regular” screen use has gone from four years to just four months.
Some of the newer interactive games kids play on phones or tablets may be more benign than watching TV (or YouTube), in that they better mimic children’s natural play behaviors. And, of course, many well-functioning adults survived a mind-numbing childhood spent watching a lot of cognitive garbage. (My mother—unusually for her time—prohibited Speed Racer and Gilligan’s Island on the grounds of insipidness. That I somehow managed to watch every single episode of each show scores of times has never been explained.) Still, no one really disputes the tremendous opportunity costs to young children who are plugged in to a screen: Time spent on devices is time not spent actively exploring the world and relating to other human beings.
Yet for all the talk about children’s screen time, surprisingly little attention is paid to screen use by parents themselves, who now suffer from what the technology expert Linda Stone more than 20 years ago called “continuous partial attention.” This condition is harming not just us, as Stone has argued; it is harming our children. The new parental-interaction style can interrupt an ancient emotional cueing system, whose hallmark is responsive communication, the basis of most human learning. We’re in uncharted territory.
Child-development experts have different names for the dyadic signaling system between adult and child, which builds the basic architecture of the brain. Jack P. Shonkoff, a pediatrician and the director of Harvard’s Center on the Developing Child, calls it the “serve and return” style of communication; the psychologists Kathy Hirsh-Pasek and Roberta Michnick Golinkoff describe a “conversational duet.” The vocal patterns parents everywhere tend to adopt during exchanges with infants and toddlers are marked by a higher-pitched tone, simplified grammar, and engaged, exaggerated enthusiasm. Though this talk is cloying to adult observers, babies can’t get enough of it. Not only that: One study showed that infants exposed to this interactive, emotionally responsive speech style at 11 months and 14 months knew twice as many words at age 2 as ones who weren’t exposed to it.
Child development is relational, which is why, in one experiment, nine-month-old babies who received a few hours of Mandarin instruction from a live human could isolate specific phonetic elements in the language while another group of babies who received the exact same instruction via video could not. According to Hirsh-Pasek, a professor at Temple University and a senior fellow at the Brookings Institution, more and more studies are confirming the importance of conversation. “Language is the single best predictor of school achievement,” she told me, “and the key to strong language skills are those back-and-forth fluent conversations between young children and adults.”
A problem therefore arises when the emotionally resonant adult–child cueing system so essential to early learning is interrupted—by a text, for example, or a quick check-in on Instagram. Anyone who’s been mowed down by a smartphone-impaired stroller operator can attest to the ubiquity of the phenomenon. One consequence of such scenarios has been noted by an economist who tracked a rise in children’s injuries as smartphones became prevalent. (AT&T rolled out smartphone service at different times in different places, thereby creating an intriguing natural experiment. Area by area, as smartphone adoption rose, childhood ER visits increased.) These findings attracted a decent bit of media attention to the physical dangers posed by distracted parenting, but we have been slower to reckon with its impact on children’s cognitive development. “Toddlers cannot learn when we break the flow of conversations by picking up our cellphones or looking at the text that whizzes by our screens,” Hirsh-Pasek said.
In the early 2010s, researchers in Boston surreptitiously observed 55 caregivers eating with one or more children in fast-food restaurants. Forty of the adults were absorbed with their phones to varying degrees, some almost entirely ignoring the children (the researchers found that typing and swiping were bigger culprits in this regard than taking a call). Unsurprisingly, many of the children began to make bids for attention, which were frequently ignored. A follow-up study brought 225 mothers and their approximately 6-year-old children into a familiar setting and videotaped their interactions as each parent and child were given foods to try. During the observation period, a quarter of the mothers spontaneously used their phone, and those who did initiated substantially fewer verbal and nonverbal interactions with their child.
Yet another rigorously designed experiment, this one conducted in the Philadelphia area by Hirsh-Pasek, Golinkoff, and Temple’s Jessa Reed, tested the impact of parental cellphone use on children’s language learning. Thirty-eight mothers and their 2-year-olds were brought into a room. The mothers were then told that they would need to teach their children two new words (blicking, which was to mean “bouncing,” and frepping, which was to mean “shaking”) and were given a phone so that investigators could contact them from another room. When the mothers were interrupted by a call, the children did not learn the word, but otherwise they did. In an ironic coda to this study, the researchers had to exclude seven mothers from the analysis, because they didn’t answer the phone, “failing to follow protocol.” Good for them!
It has never been easy to balance adults’ and children’s needs, much less their desires, and it’s naive to imagine that children could ever be the unwavering center of parental attention. Parents have always left kids to entertain themselves at times—“messing about in boats,” in a memorable phrase from The Wind in the Willows, or just lounging aimlessly in playpens. In some respects, 21st-century children’s screen time is not very different from the mother’s helpers every generation of adults has relied on to keep children occupied. When parents lack playpens, real or proverbial, mayhem is rarely far behind. Caroline Fraser’s recent biography of Laura Ingalls Wilder, the author of Little House on the Prairie, describes the exceptionally ad hoc parenting style of 19th-century frontier parents, who stashed babies on the open doors of ovens for warmth and otherwise left them vulnerable to “all manner of accidents as their mothers tried to cope with competing responsibilities.” Wilder herself recounted a variety of near-calamities with her young daughter, Rose; at one point she looked up from her chores to see a pair of riding ponies leaping over the toddler’s head.
Occasional parental inattention is not catastrophic (and may even build resilience), but chronic distraction is another story. Smartphone use has been associated with a familiar sign of addiction: Distracted adults grow irritable when their phone use is interrupted; they not only miss emotional cues but actually misread them. A tuned-out parent may be quicker to anger than an engaged one, assuming that a child is trying to be manipulative when, in reality, she just wants attention. Short, deliberate separations can of course be harmless, even healthy, for parent and child alike (especially as children get older and require more independence). But that sort of separation is different from the inattention that occurs when a parent is with a child but communicating through his or her nonengagement that the child is less valuable than an email. A mother telling kids to go out and play, a father saying he needs to concentrate on a chore for the next half hour—these are entirely reasonable responses to the competing demands of adult life. What’s going on today, however, is the rise of unpredictable care, governed by the beeps and enticements of smartphones. We seem to have stumbled into the worst model of parenting imaginable—always present physically, thereby blocking children’s autonomy, yet only fitfully present emotionally.
Fixing the problem won’t be easy, especially given that it is compounded by dramatic changes in education. More young children than ever (about two-thirds of 4-year-olds) are in some form of institutional care, and recent trends in early-childhood education have filled many of their classrooms with highly scripted lessons and dull, one-sided “teacher talk.” In such environments, children have few opportunities for spontaneous conversation.
One piece of good news is that young children are prewired to get what they need from adults, as most of us discover the first time our diverted gaze is jerked back by a pair of pudgy, reproaching hands. Young children will do a lot to get a distracted adult’s attention, and if we don’t change our behavior, they will attempt to do it for us; we can expect to see a lot more tantrums as today’s toddlers age into school. But eventually, children may give up. It takes two to tango, and studies from Romanian orphanages showed the world that there are limits to what a baby brain can do without a willing dance partner. The truth is, we don’t really know how much our kids will suffer when we fail to engage.
Of course, adults are also suffering from the current arrangement. Many have built their daily life around the miserable premise that they can always be on—always working, always parenting, always available to their spouse and their own parents and anyone else who might need them, while also staying on top of the news, while also remembering, on the walk to the car, to order more toilet paper from Amazon. They are stuck in the digital equivalent of the spin cycle.
Under the circumstances, it’s easier to focus our anxieties on our children’s screen time than to pack up our own devices. I understand this tendency all too well. In addition to my roles as a mother and a foster parent, I am the maternal guardian of a middle-aged, overweight dachshund. Being middle-aged and overweight myself, I’d much rather obsess over my dog’s caloric intake, restricting him to a grim diet of fibrous kibble, than address my own food regimen and relinquish (heaven forbid) my morning cinnamon bun. Psychologically speaking, this is a classic case of projection—the defensive displacement of one’s failings onto relatively blameless others. Where screen time is concerned, most of us need to do a lot less projecting.
If we can get a grip on our “technoference,” as some psychologists have called it, we are likely to find that we can do much more for our children simply by doing less—regardless of the quality of their schooling and quite apart from the number of hours we devote to them. Parents should give themselves permission to back off from the suffocating pressure to be all things to all people. Put your kid in a playpen, already! Ditch that soccer-game appearance if you feel like it. Your kid will be fine. But when you are with your child, put down your damned phone.
When I was a kid, we often went out for ice cream and a game of mini-golf. Most of the set-ups were fun and relatively easy to negotiate. But there was always that one hole. That one where you gotta time it just right to get the ball through the series of 3 tunnels, making sure the rotating blades of the windmills don’t get in the way. UGH! I hated that one.
Certainly by the time you were on your 12th attempt, the game started to lose its carefree feel and performance anxiety set in.
In our family, we devised a rule to deal with this and keep the game fun. If, after 5 tries you could not get that ball where you wanted it to be, you got a Do-Over. You got to wipe the slate clean and start over again. Usually this worked.
Sometimes you just have to step back, take a deep breath and start back at the beginning with a fresh attitude.
In “real life” we rarely get do-overs. Most of the time you can’t un-ring a bell.
Enter . . . regret.
Psych Pstuff’s Summary
Regrets: everyone has them to some extent. Harsh words, career mistakes, missed opportunities — these are all common experiences. Sometimes we regret the way we acted or failed to act. Other times, we think we wouldn’t do anything differently but regret that the outcome was not as intended.
Regret is generally considered a negative emotion, in the classic way we default to automatically judging something as either “good” or “bad.” While it surely doesn’t feel good, regret can be good for us in several ways, helping to clarify and focus the confusing aspects of a situation.
After all, most of us are doing the best we can, given the circumstances. We often make bad choices because we don’t have all the information about just how bad that choice is. Regret gives us the gift of hindsight to tuck away for “next time.”
Regret gives you perspective nothing else can.
Psychologist Carl Jung once said, “Even a happy life cannot be without a measure of darkness, and the word ‘happy’ would lose its meaning if it were not balanced by sadness.” Knowing what you don’t want, and how you don’t want to be, from first-hand experience helps you truly understand what you do want to do or be.
Regret can also keep us humble. And at least a little bit of humility is a good thing. Its opposite is not. Regret reminds us that we are not perfect and puts us in touch with our humanity.
Run amok, of course, regret, like just about anything let run amok, can be negative and harmful. Contemplating is good. Reflecting is good. Ruminating? Not so much. Obsessing over what you could have and should have done better can lead to feelings of worthlessness and depression that paralyze us instead of inspiring us to do better.
Research has indicated a cultural component to the experience of regret. Collectivistic cultures that emphasize the group over the individual tend to report experiencing less regret. Individualistic societies place an emphasis on individual choice, independence, and performance, setting the stage for self-doubt and blame.
Other research, conducted by Neal Roese of the Kellogg School of Management at Northwestern University, has indicated regret is considered the most effective negative emotion, specifically with respect to: (1) making sense of the world, (2) avoiding future negative behaviors, (3) gaining insight, (4) achieving social harmony, and (5) improving ability to approach desired opportunities.
While wallowing in missed opportunities or less-than-stellar behavior has been shown to have negative effects on physical and emotional health, regret, when used sparingly and in an introspective and constructive manner, can clearly be a tool to facilitate decision-making and increase satisfaction.
Regrets can be big or small based on the severity of the negative outcomes for ourselves and others. But either way, they can provide a guiding light and the wisdom that can only come from experience.
Perhaps a healthy way to consider some of our less-than-perfect life decisions was best expressed by the classic crooner Frank Sinatra: “Regrets, I’ve had a few, but then again, too few to mention.”
And with a little luck, life even gives you a do-over.