When you think about a sentence, you usually think about words — not lines. But sentence diagramming brings geometry into grammar.
If you weren’t taught to diagram a sentence, this might sound a little zany. But the practice has a long — and controversial — history in U.S. schools.
And while it was once commonplace, many people today don’t even know what it is.
So let’s start with the basics.
“It’s a fairly simple idea,” says Kitty Burns Florey, the author of Sister Bernadette’s Barking Dog: The Quirky History and Lost Art of Diagramming Sentences. “I like to call it a picture of language. It really does draw a picture of what language looks like.”
I asked her to show me, and for an example she used the first sentence she recalls diagramming: “The dog barked.”
“By drawing a line and writing ‘dog’ on the left side of the line and ‘barked’ on the right side of the line and separating them with a little vertical line, we could see that ‘dog’ was the subject of the sentence and ‘barked’ was the predicate or the verb,” she explains. “When you diagram a sentence, those things are always in that relation to each other. It always makes the same kind of picture. And supposedly, it makes it easier for kids who are learning to write, learning to use correct English.”
An Education ‘Phenomenon’
Burns Florey and other experts trace the origin of diagramming sentences back to 1877 and two professors at Brooklyn Polytechnic Institute. In their book, Higher Lessons in English, Alonzo Reed and Brainerd Kellogg made the case that students would learn sentence structure better if they could see sentences drawn as graphic structures.
After Reed and Kellogg published their book, the practice of diagramming sentences had something of a Golden Age in American schools.
“It was a purely American phenomenon,” Burns Florey says. “It was invented in Brooklyn, it swept across this country like crazy and became really popular for 50 or 60 years and then began to die away.”
By the 1960s, new research had heaped criticism on the practice.
“Diagramming sentences … teaches nothing beyond the ability to diagram,” declared the 1960 Encyclopedia of Educational Research.
In 1985, the National Council of Teachers of English declared that “repetitive grammar drills and exercises” — like diagramming sentences — are “a deterrent to the improvement of students’ speaking and writing.”
Nevertheless, diagramming sentences is still taught — you can find it in textbooks and see it in lesson plans. My question is, why?
Burns Florey says it might still be a good tool for some students. “When you’re learning to write well, it helps to understand what the sentence is doing and why it’s doing it and how you can improve it.”
But does it deserve a place in English class today? (The Common Core doesn’t mention it.)
“There are two kinds of people in this world — the ones who loved diagramming, and the ones who hated it,” Burns Florey says.
She’s in the first camp. But she understands why, for some students, it never clicks.
“It’s like a middle man. You’ve got a sentence that you’re trying to write, so you have to learn to structure that, but also you have to learn to put it on these lines and angles and master that, on top of everything else.”
So many students ended up frustrated, viewing the technique “as an intrusion or as an absolutely confusing, crazy thing that they couldn’t understand.”
Ever walk into a networking event or cocktail party where all you hear is superficial chit-chat? The small talk is deafening and never evolves into anything substantial. You can hardly suppress an eye-roll between sips of your mojito.
Questions like what do you do? and where do you live? are predictable and exhausting; commentary about the weather or last night’s game fills awkward moments as people size each other up to determine: is this someone I want to talk to?
As it turns out, the types of conversations you’re engaging in truly matter for your personal wellbeing. In 2010, scientists from the University of Arizona and Washington University in St. Louis investigated whether happy and unhappy people differ in the types of conversations they have.
Seventy-nine participants wore a recording device over four days and were periodically recorded as they went about their lives. From more than 20,000 recordings, researchers classified each conversation as either trivial small talk or substantive discussion.
As published in Psychological Science, the happiest participants had twice as many genuine conversations and one third as much small talk as the unhappiest participants.
These findings suggest that the happy life is social and conversationally deep rather than isolated and superficial. The research also confirms what most people know but don’t practice: surface-level small talk does not build relationships.
The new trend: Ban the small talk
Obviously inspired, behavioral scientists Kristen Berman and Dan Ariely, co-founders of Irrational Labs, a non-profit behavioral consulting company, raised the bar by hosting a dinner party where small talk was literally banned and only meaningful conversations were allowed.
As documented in a Wired article, Berman and Ariely’s invited guests were provided with index cards featuring examples of meaningful (and odd) conversation starters, such as the theory of suicide prevention or, um … “the art of the dominatrix.”
The party was a hit. The authors report that “everyone was happier” without the obligation of trivial small talk.
Seizing the opportunity as any innovative entrepreneur would, Carolina Gawroński, founder of No Small Talk dinners, launched her business in Hong Kong last month; it is quickly spreading to cities around the world.
“Growing up I was surrounded by, on the one side, [my father’s] interesting friends. But on the other side, there was this whole element of being social and being at bullshit social events,” Gawroński tells Hong Kong Free Press. “Since a young age, I’ve always questioned it: ‘Why do people talk like this? What’s the point?'”
The rules at a No Small Talk dinner event are simple: no phones and no small talk. Guests also receive cards with meaningful-conversation prompts.
Then, there’s Sean Bisceglia, a partner at Sterling Partners, a private equity firm. Bisceglia has hosted Jefferson-style dinners at his home for the past eight years.
The concept is much the same, but played out as a whole-table conversation with a purpose: one person speaks at a time to the entire table, there are no side conversations, and small talk is completely banned.
“I do it because the shallowness of cocktail chitchat kind of drove me crazy,” Bisceglia tells Crain’s Chicago Business. “There was never any conversation deeper than two minutes. I really felt that if we could bring together a group of people, you could get into the issues and hear different people’s perspectives.”
13 questions to start great conversations
If you’ve bought into the idea of banning small talk from your conversations, here are thirteen no-fail conversation starters cherry-picked from a few credible sources:
What’s your story?
What’s the most expensive thing you’ve ever stolen?
What is your present state of mind?
What absolutely excites you right now?
What book has influenced you the most?
If you could do anything you wanted tonight (anywhere, for any amount of money), what would you do and why?
If you had the opportunity to meet one person you haven’t met who would it be, why and what would you talk about?
What’s the most important thing I should know about you?
What do you value more, intelligence or common sense?
What movie is your favorite guilty pleasure, and why?
You are stuck on a deserted island, and you can only take three things. What would they be?
When and where were you happiest in your life?
What do you think is the driving force in your life?
Surreal books and films could make you smarter, research finds.
Stories by Franz Kafka or films by master of the absurd David Lynch could boost learning.
Even an unsettling feeling, like the absurdity of life, can engender the desired state.
The reason is that surreal or nonsensical things send our minds into overdrive looking for meaning.
When people are more motivated to search for meaning, they learn better, the psychologists found.
Dr Travis Proulx, the study’s first author, explained:
“The idea is that when you’re exposed to a meaning threat — something that fundamentally does not make sense — your brain is going to respond by looking for some other kind of structure within your environment.
And, it turns out, that structure can be completely unrelated to the meaning threat.”
For the study, people read a short story by Franz Kafka called ‘The Country Doctor’, which involves a nonsensical series of events.
A version of the story was rewritten to make more sense and read by a control group.
Afterwards, both groups were given an unconscious learning task that involved spotting strings of letters.
Dr Proulx said:
“People who read the nonsensical story checked off more letter strings — clearly they were motivated to find structure.
But what’s more important is that they were actually more accurate than those who read the more normal version of the story.
They really did learn the pattern better than the other participants did.”
In a second study, people were made to feel their own lives didn’t make sense.
This was done by pointing out the contradictory decisions they had made.
Dr Proulx said:
“You get the same pattern of effects whether you’re reading Kafka or experiencing a breakdown in your sense of identity.
People feel uncomfortable when their expected associations are violated, and that creates an unconscious desire to make sense of their surroundings.
That feeling of discomfort may come from a surreal story, or from contemplating their own contradictory behaviors, but either way, people want to get rid of it.
So they’re motivated to learn new patterns.”
The study tested only unconscious learning, so it doesn’t tell us whether you would be able to use this trick intentionally.
Dr Proulx said:
“It’s important to note that sitting down with a Kafka story before exam time probably wouldn’t boost your performance on a test.
What is critical here is that our participants were not expecting to encounter this bizarre story.
If you expect that you’ll encounter something strange or out of the ordinary, you won’t experience the same sense of alienation.
You may be disturbed by it, but you won’t show the same learning ability.
The key to our study is that our participants were surprised by the series of unexpected events, and they had no way to make sense of them.
Hence, they strove to make sense of something else.”
How Language Shapes the Way We Think
There are about 7,000 languages spoken around the world — and they all have different sounds, vocabularies and structures. But do they shape the way we think? Cognitive scientist Lera Boroditsky shares examples of language — from an Aboriginal community in Australia that uses cardinal directions instead of left and right to the multiple words for blue in Russian — that suggest the answer is a resounding yes. “The beauty of linguistic diversity is that it reveals to us just how ingenious and how flexible the human mind is,” Boroditsky says. “Human minds have invented not one cognitive universe, but 7,000.”
Lera Boroditsky · Cognitive scientist
Lera Boroditsky is trying to figure out how humans get so smart.
With every Facebook post you like, tweet you send, or question you type into Google, you’re giving the internet strength. Feeding the algorithms. Paying the advertisers. You’re also helping to fill server farms that will ultimately be replaced by bigger server farms, effectively anchoring the internet in the real world. This is all sweet and rosy, if the internet-human relationship is mutually beneficial. But it’s not clear that it is.
In some ways, our nonstop online lives are bringing us closer. But at least as often, the relentless pace of social media, email, and constant pings and beeps only serve to pull us further apart. And all this tech is certainly bad for our health and happiness: Research links social media to depression and high-speed internet to poor sleep. Simply having a phone visible during meals has been shown to make conversation among friends less enjoyable.
It’s probably hard to imagine life without a high-powered computer in your pocket or purse at all times, but it’s worth remembering that you’re still an autonomous being.
That said, these effects aren’t inevitable. Not yet, anyway. It’s probably hard to imagine life without a high-powered computer in your pocket or purse at all times, but it’s worth remembering that you’re still an autonomous being. You can decide how often and in what way you interact with the internet. And if you talk to the researchers, authors, and entrepreneurs who understand digital technology best, you discover that many of them already have.
We reached out to eight digital experts to find out how they maintain a (reasonably) healthy relationship with technology. All agreed that push notifications are evil, so you should go ahead and turn those off right now. Some of the experts even said they keep their ringers and text notifications off, at least some of the time. Beyond that, they all had unique strategies for defending themselves against the intrusive, obnoxious, and possibly destructive effects of technology.
Dan Ariely, professor of psychology and behavioral economics at Duke University and author of Predictably Irrational: The Hidden Forces That Shape Our Decisions
Much of Dan Ariely’s work — including Timeful, the A.I.-powered calendar app he built and sold to Google — focuses on making the most of limited time. One way he does this is by starting each morning in a distraction-free environment. “I think very carefully about the first hour of the day,” he says. “I used to have two computers, and one had no email or browser on it.” That’s the one he used for writing in the mornings.
“The thing is to realize that our time to work is actually quite precious.”
Ariely’s travel schedule forced him to abandon the dual-computer setup, but the experiment was fruitful enough that he now relies on a self-imposed internet ban to get work done. “The last thing I do each day is turn my computer off,” he says. “The next day, when I turn it back on, my browser and email are still off.” And Ariely keeps it that way until he’s powered through that first hour. “The thing is to realize that our time to work is actually quite precious,” he says. “We need to protect it.”
Steve Blank, Stanford professor, retired entrepreneur, and founder of the Lean Startup movement
Over the two-plus decades that Steve Blank helped shape Silicon Valley, he ushered eight technology startups into the world. But it was during his tenure at Rocket Science Games, a company he founded in the mid-1990s, that Blank began getting high on his own supply. “I found myself drug addicted,” he says. “I’d be up playing games until four in the morning.”
“The devices started as tools and ended up as drugs for most people.”
Video games are hardly a Schedule 1 narcotic, but Blank was losing sleep and, he felt, setting a bad example for his children. Emerging research confirms his idea that games and social media can exert drug-like forces over users. A study published in the journal PLOS One even found that digital addictions can shrink the amount of white matter at certain brain sites, creating changes similar to those seen in alcohol, cocaine, and methamphetamine addictions. “The devices started as tools and ended up as drugs for most people,” Blank says. “App manufacturers are incentivized to make us addicted. I’ll contend that a ton of social media is actually a lot like oxycontin.”
When Blank realized that his gaming habit was robbing him of happiness by way of lost sleep and family time, he snapped his CD-ROMs in half (this was the ’90s, remember). Then he threw the pieces into the trash. “I literally went cold turkey,” he says. “And I haven’t played a video game since.”
Ethan Kross, professor of psychology and director of the Emotion and Self-Control Laboratory at the University of Michigan
After studying Facebook — and, more important, after finding that the biggest users were the least satisfied with life — Ethan Kross decided to refrain from any social media use. But he still checks his email more often than he’d like. “It’s a self-control failure from a self-control expert,” he says.
To be fair, the professor is probably selling himself short. The truth is he relies on three solid rules to prevent compulsive emailing.
“So I just try to change my digital environment. We know from research that can be a powerful tool for enhancing self-control.”
First, Kross pushes all fast-moving work conversations to Slack. “That way I can get information from my lab collaborators quickly, and my email becomes less urgent.”
Second, he uses the snooze function, which is available on Gmail and services like Boomerang for Outlook, for any email that isn’t urgent. “If there are 50 things in my inbox, that can be disruptive to my immediate goals,” Kross says. So he snoozes them for a few hours or a few days, depending on the urgency.
Finally, Kross relies on an email-free iPad for reading, so he can’t check his incoming mail even if he wants to. “I don’t like checking my email when I’m in bed, because once every month I’ll receive something that makes me not sleep well,” he says. “So I just try to change my digital environment. We know from research that can be a powerful tool for enhancing self-control.”
Jean Twenge, researcher and professor of psychology at San Diego State University and the author of iGen, a book about how the internet is changing young adults
In April of last year, Jean Twenge signed up for Twitter. It’s her first and only social media account, and almost immediately she found herself clashing with people who disagreed with her research. “It’s a public forum, and I felt a compulsion to defend my arguments,” Twenge says. “But is that the right response? I don’t know. For my own mental health, I know it’s not.”
“It’s a public forum, and I felt a compulsion to defend my arguments.”
It’s not that she wanted to be on Twitter, but as an academic with a book to promote, Twenge felt like she had to. After six months with the service, though, Twenge noticed that she was increasingly giving in to a compulsion to check up on conversations that were making her miserable. “It completely confirmed why I don’t have social media,” she says. And so she scaled back. Twenge kept the account for promotional reasons and still has periods of time when she’s active, but when she needs a refresh, she consciously steps away for days or weeks.
When asked if she’s tempted to open an Instagram or Facebook account — even if just for research purposes — she replies quickly, “Nope.”
Erik Peper, professor at San Francisco State University and president of the Biofeedback Federation of Europe
As a researcher who explores the impact of excessive phone use (it makes us feel lonely) and the bad posture brought on by constantly staring at a screen, Erik Peper makes a point of keeping his phone at a distance. When he leaves home in the morning, he packs it into his backpack instead of his pocket. And when he returns in the evening, he docks it at the charging station by his front door.
What’s the point? There are two, actually.
“There are very few things that are truly urgent.”
First, the microwaves coming off mobile devices could present a small risk to their owners, Peper says. In a paper he wrote for the journal Biofeedback, Peper cites epidemiological research showing that people who use cellphones for more than 10 years are more likely than nonusers to have tumors on their salivary glands and inside their ear canals. They’re also three times as likely to have certain brain and spinal-cord tumors on the side of their head where they hold their phone. “The data is weak and controversial,” Peper admits. “But I believe in the precautionary principle, which says that you have to first prove something is totally safe before you can use it.”
The second reason is that, simply put, it’s a distraction. “The phone hijacks our evolutionary patterns,” Peper says. “We don’t do good with multitasking, so if you’re writing an article, and every five minutes you pop back to answer a message, you’re much less productive in the long term.” The same logic applies to socializing, he says, which is why his phone is stored out of sight when he’s with friends and family.
Does it matter that he’s a little slow to reply to messages? Or that he occasionally misses a call? “There are very few things that are truly urgent,” Peper says. “It’s different if you’re a firefighter, but beyond that, whether I answer the email this minute, later today, or even this evening — it really makes no difference.”
Linden Tibbets, CEO of IFTTT, a service that lets you program your apps and smart devices to carry out rote tasks
Years ago, Linden Tibbets decided he didn’t want to be a slave to his email. Which meant, in short, that he would read and send messages only while sitting at his desk.
“The only time I send email on my phone is if I’m running late to a meeting and there’s no other way to communicate,” Tibbets says. “That’s literally the only time.”
“You can be endlessly entertained with what’s happening in the world around you. You don’t need your phone.”
The upshot, he says, is that he’s able to address his correspondence with better focus. “I would much rather spend an extra hour in the evening responding to email than to be distracted by it off and on throughout the day,” Tibbets says. If it takes a while to reply to people, no big deal. “I just say, ‘Thanks for your patience. I apologize for being slow to get back to you.’”
And if he finds himself with a moment of downtime — standing in line for groceries, for instance — Tibbets considers it a rare opportunity for mind-wandering. “I play a game with myself where I try not to look at my phone,” he says. “I look at people. I read food labels. I observe things in the environment. You can be endlessly entertained with what’s happening in the world around you. You don’t need your phone.”
Adam Alter, professor of marketing at New York University and author of Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked
In Irresistible, Adam Alter argues that in some ways, tech addiction may actually be worse than cigarette addiction. Because the web is built on social connections, each new addict makes it harder for the rest of us to abstain. “Addictive tech is part of the mainstream in a way that addictive substances never will be,” Alter writes. “Abstinence isn’t an option.”
“I try to put my phone on airplane mode on weekends.”
So what does the tech critic do to protect his own mental autonomy? He disconnects when the workweek’s done. “I try to put my phone on airplane mode on weekends so I can take photos of my two young kids without interruptions from emails and other needy platforms.”
Ali Brown, entrepreneurial consultant and host of Glambition, a podcast for women in business
Last year, Ali Brown had a social media reckoning. “It was after the election, when everything was getting toxic and weird,” she says. “I was getting all my news from Facebook, and I felt this sense of unease all the time.”
So Brown did an entirely logical thing that most of us haven’t done: She drained the swamp on her phone. In one heroic moment of full-steam bravado, Brown deleted Facebook, Twitter, and Instagram and replaced them with apps from the Wall Street Journal and the New York Times. “I decided to pay for some really good journalism,” she says. “I’ll use my time to read those instead.”
“Responding to social media all day is going to get you nowhere.”
Once her healthier new phone routine was established, Brown added back one social media app — but just one! “I like Instagram because it’s generally happy and fun,” she says. “I post about my kids.”
Brown is lucky enough to have a team to run her Twitter and Facebook accounts, but she knows there are better uses for investing her personal time. “If you’re here in this life to do great, powerful work, then you need to create some space in your day to be a freethinker,” she says. “Responding to social media all day is going to get you nowhere.”
To her clients — mostly women running seven- and eight-figure companies — Brown generally offers this advice: “Try deleting social media for a week. You won’t miss anything, you won’t cease to exist, and you’ll thank me later.”
Written by Clint Carter, a writer for publications such as Entrepreneur, Men’s Health, Men’s Journal, New York magazine, and the Wall Street Journal.
Video game addiction is a term that has been used for years by parents and mental health professionals who believe that it’s a real disorder. Now, there’s more weight behind their argument: The World Health Organization (WHO) has included “gaming disorder” as a new mental health condition in the 11th edition of its International Classification of Diseases.
According to WHO, there are three major criteria for a diagnosis of gaming disorder: gaming takes precedence over other activities so much that a person often stops doing other things; the person continues gaming even when it causes issues in their life or they feel that they can’t stop; and gaming causes significant distress and impairments in the person’s relationships with others, as well as their work or school life. If your child gets sucked into a game for a few days but goes back to normal after that, they wouldn’t qualify: The behavior must persist for at least 12 months, WHO says.
It’s worth noting that WHO’s stance on gaming addiction is different from that of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), the handbook used by health professionals in the U.S. and other countries to help diagnose mental health disorders. The DSM-5 calls out “Internet Gaming Disorder” but says it’s a condition that warrants more clinical research and experience before it can be classified in the book as a formal disorder.
WHO says on its website that all people who participate in gaming should be aware that gaming disorder is a real condition, and that it’s important to be mindful of how often they play video games. However, the organization also points out that gaming disorder affects only a small proportion of people who game.
It’s only natural that the news would make you give your child’s gaming system the side-eye.
In general, parents should limit the amount of screen time their children have daily, and gaming is included in that, along with TV, computers, phones and tablet use, Gina Posner, MD, a pediatrician at MemorialCare Orange Coast Medical Center in Fountain Valley, Calif., tells Yahoo Lifestyle.
Screen time isn’t recommended at all for kids who are 18 months or younger, and for children between that age and five years, it’s generally recommended that they have no more than one hour of screen time a day, she says. For those who are six and up, it’s more at the parents’ discretion. “The maximum amount of screen time should be two hours a day, but less is always better,” Posner says.
Posner says that it’s important to set clear limits for your child when it comes to screen time and gaming. For example, say that your child has to do their homework first and/or get out and play for an hour before they’re allowed to game. And even then, make it clear that they’re only allowed to do so for a set period of time.
If your child starts fussing when they’re not allowed to be gaming all day, it’s a clear sign that you need to cut back, Posner says.
Treatment for gaming disorder is generally based in cognitive behavioral therapy, which would typically be done in two phases, Simon Rego, PsyD, chief psychologist at Montefiore Medical Center/Albert Einstein College of Medicine, tells Yahoo Lifestyle. The first is raising your child’s awareness that their gaming is a problem, and looking for triggers and cues that could make the gaming habit better or worse. A mental health professional would also address the problematic thoughts associated with stopping play, or the thoughts that keep the child gaming, he says.
The goal is then to step the behavior down from pathological to merely problematic, and then to manage it in a “reasonable way,” Rego says. People don’t necessarily have to quit gaming altogether, but they do need to learn to manage it better with parameters, like gaming only with friends at select times of day rather than at night alone in their room.
If you suspect that your child has a gaming disorder, it’s important to seek help for it.
Just know that this is still a new diagnosis and you may need to do some sleuthing to find someone who specializes in this kind of behavior.
Smartphones have by now been implicated in so many crummy outcomes—car fatalities, sleep disturbances, empathy loss, relationship problems, failure to notice a clown on a unicycle—that it almost seems easier to list the things they don’t mess up than the things they do. Our society may be reaching peak criticism of digital devices.
Even so, emerging research suggests that a key problem remains underappreciated. It involves kids’ development, but it’s probably not what you think. More than screen-obsessed young children, we should be concerned about tuned-out parents.
Yes, parents now have more face time with their children than did almost any parents in history. Despite a dramatic increase in the percentage of women in the workforce, mothers today astoundingly spend more time caring for their children than mothers did in the 1960s. But the engagement between parent and child is increasingly low-quality, even ersatz. Parents are constantly present in their children’s lives physically, but they are less emotionally attuned. To be clear, I’m not unsympathetic to parents in this predicament. My own adult children like to joke that they wouldn’t have survived infancy if I’d had a smartphone in my clutches 25 years ago.
To argue that parents’ use of screens is an underappreciated problem isn’t to discount the direct risks screens pose to children: Substantial evidence suggests that many types of screen time (especially those involving fast-paced or violent imagery) are damaging to young brains. Today’s preschoolers spend more than four hours a day facing a screen. And, since 1970, the average age of onset of “regular” screen use has gone from four years to just four months.
Some of the newer interactive games kids play on phones or tablets may be more benign than watching TV (or YouTube), in that they better mimic children’s natural play behaviors. And, of course, many well-functioning adults survived a mind-numbing childhood spent watching a lot of cognitive garbage. (My mother—unusually for her time—prohibited Speed Racer and Gilligan’s Island on the grounds of insipidness. That I somehow managed to watch every single episode of each show scores of times has never been explained.) Still, no one really disputes the tremendous opportunity costs to young children who are plugged in to a screen: Time spent on devices is time not spent actively exploring the world and relating to other human beings.
Yet for all the talk about children’s screen time, surprisingly little attention is paid to screen use by parents themselves, who now suffer from what the technology expert Linda Stone more than 20 years ago called “continuous partial attention.” This condition is harming not just us, as Stone has argued; it is harming our children. The new parental-interaction style can interrupt an ancient emotional cueing system, whose hallmark is responsive communication, the basis of most human learning. We’re in uncharted territory.
Child-development experts have different names for the dyadic signaling system between adult and child, which builds the basic architecture of the brain. Jack P. Shonkoff, a pediatrician and the director of Harvard’s Center on the Developing Child, calls it the “serve and return” style of communication; the psychologists Kathy Hirsh-Pasek and Roberta Michnick Golinkoff describe a “conversational duet.” The vocal patterns parents everywhere tend to adopt during exchanges with infants and toddlers are marked by a higher-pitched tone, simplified grammar, and engaged, exaggerated enthusiasm. Though this talk is cloying to adult observers, babies can’t get enough of it. Not only that: One study showed that infants exposed to this interactive, emotionally responsive speech style at 11 months and 14 months knew twice as many words at age 2 as ones who weren’t exposed to it.
Child development is relational, which is why, in one experiment, nine-month-old babies who received a few hours of Mandarin instruction from a live human could isolate specific phonetic elements in the language while another group of babies who received the exact same instruction via video could not. According to Hirsh-Pasek, a professor at Temple University and a senior fellow at the Brookings Institution, more and more studies are confirming the importance of conversation. “Language is the single best predictor of school achievement,” she told me, “and the key to strong language skills are those back-and-forth fluent conversations between young children and adults.”
A problem therefore arises when the emotionally resonant adult–child cueing system so essential to early learning is interrupted—by a text, for example, or a quick check-in on Instagram. Anyone who’s been mowed down by a smartphone-impaired stroller operator can attest to the ubiquity of the phenomenon. One consequence of such scenarios has been noted by an economist who tracked a rise in children’s injuries as smartphones became prevalent. (AT&T rolled out smartphone service at different times in different places, thereby creating an intriguing natural experiment. Area by area, as smartphone adoption rose, childhood ER visits increased.) These findings attracted a decent bit of media attention to the physical dangers posed by distracted parenting, but we have been slower to reckon with its impact on children’s cognitive development. “Toddlers cannot learn when we break the flow of conversations by picking up our cellphones or looking at the text that whizzes by our screens,” Hirsh-Pasek said.
In the early 2010s, researchers in Boston surreptitiously observed 55 caregivers eating with one or more children in fast-food restaurants. Forty of the adults were absorbed with their phones to varying degrees, some almost entirely ignoring the children (the researchers found that typing and swiping were bigger culprits in this regard than taking a call). Unsurprisingly, many of the children began to make bids for attention, which were frequently ignored. A follow-up study brought 225 mothers and their approximately 6-year-old children into a familiar setting and videotaped their interactions as each parent and child were given foods to try. During the observation period, a quarter of the mothers spontaneously used their phone, and those who did initiated substantially fewer verbal and nonverbal interactions with their child.
Yet another rigorously designed experiment, this one conducted in the Philadelphia area by Hirsh-Pasek, Golinkoff, and Temple’s Jessa Reed, tested the impact of parental cellphone use on children’s language learning. Thirty-eight mothers and their 2-year-olds were brought into a room. The mothers were then told that they would need to teach their children two new words (blicking, which was to mean “bouncing,” and frepping, which was to mean “shaking”) and were given a phone so that investigators could contact them from another room. When the mothers were interrupted by a call, the children did not learn the word, but otherwise they did. In an ironic coda to this study, the researchers had to exclude seven mothers from the analysis, because they didn’t answer the phone, “failing to follow protocol.” Good for them!
It has never been easy to balance adults’ and children’s needs, much less their desires, and it’s naive to imagine that children could ever be the unwavering center of parental attention. Parents have always left kids to entertain themselves at times—“messing about in boats,” in a memorable phrase from The Wind in the Willows, or just lounging aimlessly in playpens. In some respects, 21st-century children’s screen time is not very different from the mother’s helpers every generation of adults has relied on to keep children occupied. When parents lack playpens, real or proverbial, mayhem is rarely far behind. Caroline Fraser’s recent biography of Laura Ingalls Wilder, the author of Little House on the Prairie, describes the exceptionally ad hoc parenting style of 19th-century frontier parents, who stashed babies on the open doors of ovens for warmth and otherwise left them vulnerable to “all manner of accidents as their mothers tried to cope with competing responsibilities.” Wilder herself recounted a variety of near-calamities with her young daughter, Rose; at one point she looked up from her chores to see a pair of riding ponies leaping over the toddler’s head.
Occasional parental inattention is not catastrophic (and may even build resilience), but chronic distraction is another story. Smartphone use has been associated with a familiar sign of addiction: Distracted adults grow irritable when their phone use is interrupted; they not only miss emotional cues but actually misread them. A tuned-out parent may be quicker to anger than an engaged one, assuming that a child is trying to be manipulative when, in reality, she just wants attention. Short, deliberate separations can of course be harmless, even healthy, for parent and child alike (especially as children get older and require more independence). But that sort of separation is different from the inattention that occurs when a parent is with a child but communicating through his or her nonengagement that the child is less valuable than an email. A mother telling kids to go out and play, a father saying he needs to concentrate on a chore for the next half hour—these are entirely reasonable responses to the competing demands of adult life. What’s going on today, however, is the rise of unpredictable care, governed by the beeps and enticements of smartphones. We seem to have stumbled into the worst model of parenting imaginable—always present physically, thereby blocking children’s autonomy, yet only fitfully present emotionally.
Fixing the problem won’t be easy, especially given that it is compounded by dramatic changes in education. More young children than ever (about two-thirds of 4-year-olds) are in some form of institutional care, and recent trends in early-childhood education have filled many of their classrooms with highly scripted lessons and dull, one-sided “teacher talk.” In such environments, children have few opportunities for spontaneous conversation.
One piece of good news is that young children are prewired to get what they need from adults, as most of us discover the first time our diverted gaze is jerked back by a pair of pudgy, reproaching hands. Young children will do a lot to get a distracted adult’s attention, and if we don’t change our behavior, they will attempt to do it for us; we can expect to see a lot more tantrums as today’s toddlers age into school. But eventually, children may give up. It takes two to tango, and studies from Romanian orphanages showed the world that there are limits to what a baby brain can do without a willing dance partner. The truth is, we don’t really know how much our kids will suffer when we fail to engage.
Of course, adults are also suffering from the current arrangement. Many have built their daily life around the miserable premise that they can always be on—always working, always parenting, always available to their spouse and their own parents and anyone else who might need them, while also staying on top of the news, while also remembering, on the walk to the car, to order more toilet paper from Amazon. They are stuck in the digital equivalent of the spin cycle.
Under the circumstances, it’s easier to focus our anxieties on our children’s screen time than to pack up our own devices. I understand this tendency all too well. In addition to my roles as a mother and a foster parent, I am the maternal guardian of a middle-aged, overweight dachshund. Being middle-aged and overweight myself, I’d much rather obsess over my dog’s caloric intake, restricting him to a grim diet of fibrous kibble, than address my own food regimen and relinquish (heaven forbid) my morning cinnamon bun. Psychologically speaking, this is a classic case of projection—the defensive displacement of one’s failings onto relatively blameless others. Where screen time is concerned, most of us need to do a lot less projecting.
If we can get a grip on our “technoference,” as some psychologists have called it, we are likely to find that we can do much more for our children simply by doing less—regardless of the quality of their schooling and quite apart from the number of hours we devote to them. Parents should give themselves permission to back off from the suffocating pressure to be all things to all people. Put your kid in a playpen, already! Ditch that soccer-game appearance if you feel like it. Your kid will be fine. But when you are with your child, put down your damned phone.
When I was a kid, we often went out for ice cream and a game of mini-golf. Most of the set-ups were fun and relatively easy to negotiate. But there was always that one hole. That one where you gotta time it just right to get the ball through the series of 3 tunnels, making sure the rotating blades of the windmills don’t get in the way. UGH! I hated that one.
Certainly by the time you were on your 12th attempt, the game started to lose its carefree feel and performance anxiety set in.
In our family, we devised a rule to deal with this and keep the game fun. If, after 5 tries you could not get that ball where you wanted it to be, you got a Do-Over. You got to wipe the slate clean and start over again. Usually this worked.
Sometimes you just have to step back, take a deep breath and start back at the beginning with a fresh attitude.
In “real life” we rarely get do-overs. Most of the time you can’t un-ring a bell.
Enter . . . regret.
Psych Pstuff’s Summary
Regrets: everyone has them to some extent. Harsh words, career mistakes, missed opportunities — these are all common experiences. Sometimes we regret the way we acted or failed to act. Other times, we think we wouldn’t do anything differently but regret that the outcome was not as intended.
Regret is generally considered a negative emotion, in keeping with our habit of automatically judging experiences as either “good” or “bad.” While it surely doesn’t feel good, regret can be good for us in several ways, helping to clarify and focus the confusing aspects of a situation.
After all, most of us are doing the best we can, given the circumstances. We make bad choices because we don’t have all the information about just how bad a choice is. Regret gives us the gift of hindsight to tuck away for “next time.”
Regret gives you perspective nothing else can.
Psychologist Carl Jung once said, “Even a happy life cannot be without a measure of darkness, and the word ‘happy’ would lose its meaning if it were not balanced by sadness.” Knowing what you don’t want, how you don’t want to be, from first-hand experience, helps you truly understand what you do want to do or be.
Regret can also keep us humble. And at least a little bit of humility is a good thing. Its opposite is not. Regret reminds us that we are not perfect and puts us in touch with our humanity.
Run amok, of course, regret, like just about anything else, can become negative and harmful. Contemplating is good. Reflecting is good. Ruminating? Not so much. Obsessing over what you could have and should have done better can lead to feelings of worthlessness and depression that paralyze us instead of inspiring us to do better.
Research has indicated a cultural component to the experience of regret. Collectivistic cultures that emphasize the group over the individual tend to report experiencing less regret. Individualistic societies place an emphasis on individual choice, independence, and performance, setting the stage for self-doubt and blame.
Other research, conducted by Neal Roese of the Kellogg School of Management at Northwestern University, has indicated that regret is considered the most beneficial of the negative emotions, specifically with respect to: (1) making sense of the world, (2) avoiding future negative behaviors, (3) gaining insight, (4) achieving social harmony, and (5) improving our ability to pursue desired opportunities.
While wallowing in missed opportunities or less-than-stellar behavior has been shown to harm physical and emotional health, regret, used sparingly and in an introspective, constructive manner, can clearly be a tool to facilitate decision-making and increase satisfaction.
Regrets can be big or small based on the severity of the negative outcomes for ourselves and others. But either way, they can provide a guiding light and the wisdom that can only come from experience.
Perhaps a healthy way to consider some of our less-than-perfect life decisions was best expressed by the classic crooner Frank Sinatra: “Regrets, I’ve had a few, but then again, too few to mention.”
And with a little luck, life even gives you a do-over.
I was 40-something. I walked across the threshold of the house I grew up in. It was . . .
From the Greek philosopher Heraclitus, who said, “No man ever steps in the same river twice, for it’s not the same river and he’s not the same man,” to Bon Jovi, who asked, “Who says you can’t go home?” humanity has ruminated on returning to our childhood homes.
The theme is one we revisit repeatedly.
We see it in movies, too, from comedies (The Royal Tenenbaums, Home for the Holidays, This is Where I Leave You) to dramas (On Golden Pond, Young Adult, The Judge) that have poignantly depicted heading home for a visit or a re-nesting. Homecoming is equally well represented in classic and contemporary literature (You Can’t Go Home Again by Thomas Wolfe, Gilead and its sequel Home by Marilynne Robinson, and An American Childhood by Annie Dillard).
Regardless of the circumstances of return — joyous or tragic — the experience is … well … it’s complicated.
For some, home is the ultimate safety net as one walks the tightrope of life — always there, always solid, always ready to catch you should you stumble. For others the concept of home dissipates like a morning dream and there is not much left to go back to, except in one’s head. Some childhood rooms are preserved like a time capsule. Others are transformed into the sewing room Mom always wanted, just days after one’s departure. Millennials are notorious for going back home — provided they left in the first place. This privilege has, supposedly, given them the freedom to pursue their dreams, to fail and fail again, without dire consequences.
So, is it paradise or inferno? As with so much in life, it depends.
To answer Bon Jovi, it was in fact Thomas Wolfe who insisted one cannot go back home again, and yet we have Dorothy returned from Oz proclaiming, “There’s no place like home!”
In the end, good or bad, love it or hate it, curse it or miss it, perhaps Nickelback says it best:
I miss that town
I miss the faces
You can’t erase
You can’t replace it
I miss it now
I can’t believe it
So hard to stay
Too hard to leave it
It’s hard to say it, time to say it
Goodbye … goodbye
Psych Pstuff’s Summary
Psychologist Carl Gustav Jung, in his autobiography Memories, Dreams, Reflections, spoke of the home that he built as a “self-realization of the unconscious … a concretization of the individuation process … a symbol of psychic wholeness.” Building on Jung’s work, Clare Cooper Marcus, architect, psychologist and author of House As a Mirror of the Self: Exploring the Deep Meaning of Home, asserts that “as we change and grow throughout our lives, our psychological development is punctuated not only by meaningful emotional relationships with people, but also by close, affective ties with a number of significant physical environments, beginning in childhood.” She insists that, regardless of the external nature of our dwellings — mansions or shacks — we all have a strong emotional relationship, positive or negative, with our homes.
Our census tells us that fewer than 10% of the population remain in the same house they lived in 30 years prior. We are mobile in this 21st century, very mobile. In fact, it is estimated that the average American will move 11.7 times in his or her lifetime. We don’t stay permanently attached to our childhood homes and neighborhoods — at least not physically. But we do, psychologically.
The lure of nostalgia is strong. Millions of adults revisit their childhood homes long after they, or their family members, have left that space. Some are content to merely drive by and observe from outside. Others write letters to the current owners or even knock on the door and ask to have a look at their old bedroom.
We remember, and feel compelled to re-visit, the view from our bedroom window, the schoolyard playground, our dinner table, the front porch and backyard.
Propelled by his own experience of revisiting his childhood haunts, psychologist Jerry Burger surveyed other adults about their personal pilgrimages “home.” This is some of what he discovered:
There are three primary reasons for making a trip back to one’s childhood home or neighborhood:
• To reconnect with childhood — 42% of the people Burger interviewed visited their childhood homes in hopes of jogging their memory and getting back in touch with who they were as a child.
• To help resolve a current crisis or problem by reflecting on their past — 15% of those studied expressed the need to reevaluate how they developed their values and what led them to make the decisions that they made.
• To bring closure to unfinished business from childhood — 12% reported abuse or trauma and hoped that returning to the home where they experienced that pain would be therapeutic and cathartic.
Regardless of the underlying motivations for the return, Burger discovered that in almost all of the cases, people reported being glad they made the journey to their childhood home, even though it was often a deeply emotional and unpredictable experience.
He found three exceptions, where the experience was not a positive one and people reported wishing they had not made the trip back to their past:
• When the house in which they grew up had significantly changed or was no longer there — this usually proved unexpected and very upsetting.
• For those who returned anticipating an escape from problems and hoping to relive the romanticized memory of their childhood — in these cases, reality did not match their expectations and they were deeply disappointed and disillusioned.
• For those who returned to work through childhood trauma — often the painful memories seemed more intense while visiting the childhood home and they did not experience the anticipated relief or closure. (Burger recommends people revisiting the past to confront a traumatic period in their lives do so with the help of a professional counselor.)
In any event, there are few experiences in one’s life that can move a person as deeply and unpredictably as returning “home.”