The Mystery of People Who Speak Dozens of Languages

What can hyperpolyglots teach the rest of us?

One researcher of language acquisition describes her basic question as “How do I get a thought from my mind into yours?”


Last May, Luis Miguel Rojas-Berscia, a doctoral candidate at the Max Planck Institute for Psycholinguistics, in the Dutch city of Nijmegen, flew to Malta for a week to learn Maltese. He had a hefty grammar book in his backpack, but he didn’t plan to open it unless he had to. “We’ll do this as I would in the Amazon,” he told me, referring to his fieldwork as a linguist. Our plan was for me to observe how he went about learning a new language, starting with “hello” and “thank you.”

Rojas-Berscia is a twenty-seven-year-old Peruvian with a baby face and spiky dark hair. A friend had given him a new pair of earrings, which he wore on Malta with funky tank tops and a chain necklace. He looked like any other laid-back young tourist, except for the intense focus—all senses cocked—with which he takes in a new environment. Linguistics is a formidably cerebral discipline. At a conference in Nijmegen that had preceded our trip to Malta, there were papers on “the anatomical similarities in the phonatory apparati of humans and harbor seals” and “hippocampal-dependent declarative memory,” along with a neuropsychological analysis of speech and sound processing in the brains of beatboxers. Rojas-Berscia’s Ph.D. research, with the Shawi people of the Peruvian rain forest, doesn’t involve fMRI data or computer modelling, but it is still arcane to a layperson. “I’m developing a theory of language change called the Flux Approach,” he explained one evening, at a country inn outside the city, over the delicious pannenkoeken (pancakes) that are a local specialty. “A flux is a dynamism that involves a social fact and an impact, either functionally or formally, in linguistic competence.”

Linguistic competence, as it happens, was the subject of my own interest in Rojas-Berscia. He is a hyperpolyglot, with a command of twenty-two living languages (Spanish, Italian, Piedmontese, English, Mandarin, French, Esperanto, Portuguese, Romanian, Quechua, Shawi, Aymara, German, Dutch, Catalan, Russian, Hakka Chinese, Japanese, Korean, Guarani, Farsi, and Serbian), thirteen of which he speaks fluently. He also knows six classical or endangered languages: Latin, Ancient Greek, Biblical Hebrew, Shiwilu, Muniche, and Selk’nam, an indigenous tongue of Tierra del Fuego, which was the subject of his master’s thesis. We first made contact three years ago, when I was writing about a Chilean youth who called himself the last surviving speaker of Selk’nam. How could such a claim be verified? Pretty much only, it turned out, by Rojas-Berscia.

Superlative feats have always thrilled average mortals, in part, perhaps, because they register as a victory for Team Homo Sapiens: they redefine the humanly possible. If the ultra-marathoner Dean Karnazes can run three hundred and fifty miles without sleep, he may inspire you to jog around the block. If Rojas-Berscia can speak twenty-two languages, perhaps you can crank up your high-school Spanish or bat-mitzvah Hebrew, or learn enough of your grandma’s Korean to understand her stories. Such is the promise of online language-learning programs like Pimsleur, Babbel, Rosetta Stone, and Duolingo: in the brain of every monolingual, there’s a dormant polyglot—a genie—who, with some brisk mental friction, can be woken up. I tested that presumption at the start of my research, signing up on Duolingo to learn Vietnamese. (The app is free, and I was curious about the challenges of a tonal language.) It turns out that I’m good at hello—chào—but thank you, cảm ơn, is harder.

The word “hyperpolyglot” was coined two decades ago, by a British linguist, Richard Hudson, who was launching an Internet search for the world’s greatest language learner. But the phenomenon and its mystique are ancient. In Acts 2 of the New Testament, Christ’s disciples receive the Holy Spirit and can suddenly “speak in tongues” (glōssais lalein, in Greek), preaching in the languages of “every nation under heaven.” According to Pliny the Elder, the Greco-Persian king Mithridates VI, who ruled twenty-two nations in the first century B.C., “administered their laws in as many languages, and could harangue in each of them.” Plutarch claimed that Cleopatra “very seldom had need of an interpreter,” and was the only monarch of her Greek dynasty fluent in Egyptian. Elizabeth I also allegedly mastered the tongues of her realm—Welsh, Cornish, Scottish, and Irish, plus six others.

With a mere ten languages, Shakespeare's Queen does not qualify as a hyperpolyglot; the accepted threshold is eleven. The prowess of Giuseppe Mezzofanti (1774–1849) is more astounding and better documented. Mezzofanti, an Italian cardinal, was fluent in at least thirty languages and studied another forty-two, including, he claimed, Algonquin. In the decades that he lived in Rome, as the chief custodian of the Vatican Library, notables from around the world dropped by to interrogate him in their mother tongues, and he flitted as nimbly among them as a bee in a rose garden. Lord Byron, who is said to have spoken Greek, French, Italian, German, Latin, and some Armenian, in addition to his immortal English, lost a cursing contest with the Cardinal and afterward, with admiration, called him a "monster." Other witnesses were less enchanted, comparing him to a parrot. But his gifts were certified by an Irish scholar and a British philologist, Charles William Russell and Thomas Watts, who set a standard for fluency that is still useful in vetting the claims of modern Mezzofantis: Can they speak with an unstilted freedom that transcends rote mimicry?

Mezzofanti, the son of a carpenter, picked up Latin by standing outside a seminary, listening to the boys recite their conjugations. Rojas-Berscia, by contrast, grew up in an educated trilingual household. His father is a Peruvian businessman, and the family lives comfortably in Lima. His mother is a shop manager of Italian origin, and his maternal grandmother, who cared for him as a boy, taught him Piedmontese. He learned English in preschool and speaks it impeccably, with the same slight Latin inflection—a trill of otherness, rather than an accent—that he has in every language I can vouch for. Maltese had been on his wish list for a while, along with Uighur and Sanskrit. “What happens is this,” he said, over dinner at a Chinese restaurant in Nijmegen, where he was chatting in Mandarin with the owner and in Dutch with a server, while alternating between French and Spanish with a fellow-student at the institute. “I’m an amoureux de langues. And, when I fall in love with a language, I have to learn it. There’s no practical motive—it’s a form of play.” An amoureux, one might note, covets his beloved, body and soul.

My own modest competence in foreign languages (I speak three) is nothing to boast of in most parts of the world, where multilingualism is the norm. People who live at a crossroads of cultures—Melanesians, South Asians, Latin-Americans, Central Europeans, sub-Saharan Africans, plus millions of others, including the Maltese and the Shawi—acquire languages without considering it a noteworthy achievement. Leaving New York, on the way to the Netherlands, I overheard a Ghanaian taxi-driver chatting on his cell phone in a tonal language that I didn’t recognize. “It’s Hausa,” he told me. “I speak it with my father, whose family comes from Nigeria. But I speak Twi with my mom, Ga with my friends, some Ewe, and English is our lingua franca. If people in Chelsea spoke one thing and people in SoHo another, New Yorkers would be multilingual, too.”

Linguistically speaking, that taxi-driver is a more typical citizen of the globe than the average American is. Consider Adul Sam-on, one of the teen-age soccer players rescued last July from the cave in Mae Sai, Thailand. Adul grew up in dire poverty on the porous Thai border with Myanmar and Laos, where diverse populations intersect. His family belongs to an ethnic minority, the Wa, who speak an Austroasiatic language that is also widespread in parts of China. In addition to Wa, according to the Times, Adul is “proficient” in Thai, Burmese, Mandarin, and English—which enabled him to interpret for the two British divers who discovered the trapped team.

Nearly two billion people study English as a foreign language—about four times the number of native speakers. And apps like Google Translate make it possible to communicate, almost anywhere, by typing conversations into a smartphone (presuming your interlocutor can read). Ironically, however, as the hegemony of English decreases the need to speak other languages for work or for travel, the cachet attached to acquiring them seems to be growing. There is a thriving online community of ardent linguaphiles who are, or who aspire to become, polyglots; for inspiration, they look to Facebook groups, YouTube videos, chat rooms, and language gurus like Richard Simcott, a charismatic British hyperpolyglot who orchestrates the annual Polyglot Conference. This gathering has been held, on various continents, since 2009, and it attracts hundreds of aficionados. The talks are mostly in English, though participants wear nametags listing the languages they’re prepared to converse in. Simcott’s winkingly says “Try Me.”

No one becomes a hyperpolyglot by osmosis, or without sacrifice—it’s a rare, herculean feat. Rojas-Berscia, who gave up a promising tennis career that interfered with his language studies, reckons that there are “about twenty of us in Europe, and we all know, or know of, one another.” He put me in touch with a few of his peers, including Corentin Bourdeau, a young French linguist whose eleven languages include Wolof, Farsi, and Finnish; and Emanuele Marini, a shy Italian in his forties, who runs an export-import business and speaks almost every Slavic and Romance language, plus Arabic, Turkish, and Greek, for a total of nearly thirty. Neither willingly uses English, resenting its status as a global bully language—its prepotenza, as Marini put it to me, in Italian. Ellen Jovin, a dynamic New Yorker who has been described as the “den mother” of the polyglot community, explained that her own avid study of languages—twenty-five, to date—“is almost an apology for the dominance of English. Polyglottery is an antithesis to linguistic chauvinism.”

Much of the data on hyperpolyglots is still sketchy. But, from a small sample of prodigies who have been tested by neurolinguists, responded to online surveys, or shared their experience in forums, a partial profile has emerged. An extreme language learner has a more-than-random chance of being a gay, left-handed male on the autism spectrum, with an autoimmune disorder, such as asthma or allergies. (Endocrine research, still inconclusive, has investigated the hypothesis that these traits may be linked to a spike in testosterone during gestation.) “It’s true that L.G.B.T. people are well represented in our community,” Simcott told me, when we spoke in July. “And a lot identify as being on the spectrum, some mildly, others more so. It was a subject we explored at the conference last year.”

Simcott himself is an ambidextrous, heterosexual, and notably outgoing forty-one-year-old. He lives in Macedonia with his wife and daughter, a budding polyglot of eleven, who was, he told me, trilingual at sixteen months. His own parents were monolingual, though he was fascinated, as a boy, “by the different ways people spoke English.” (Like Henry Higgins, Simcott can nail an accent to a precise point on the map, not only in the British Isles but all over Europe.) “I’m mistaken for a native in about six languages,” he told me, even though he started slow, learning French in grade school and Spanish as a teen-ager. At university, he added Italian, Portuguese, Swedish, and Old Icelandic. His flawless German, acquired post-college, as an au pair, made Dutch a cinch.

As Simcott entered late adolescence, he said, “the Internet was starting up,” so he could practice his languages in chat rooms. He also found a sense of identity that had eluded him. There was, in particular, a mysterious polyglot who haunted the same rooms. “He was the first person who really encouraged me,” Simcott said. “Everyone else either warned me that my brain would burst or saw me as a talking horse. Eventually, I made a video using bits and bobs of sixteen languages, so I wouldn’t have to keep performing.” But the stranger gave Simcott a validation that he still recalls with emotion. He founded the conference partly to pay that debt forward, by creating a clubhouse for the kind of geeky kid he had been, to whom no tongue was foreign but no place was home.

A number of hyperpolyglots are reclusive savants who bank their languages rather than using them to communicate. The more extroverted may work as translators or interpreters. Helen Abadzi, a Greek educator who speaks nineteen languages "at least at an intermediate level," spent decades at the World Bank. Kató Lomb, a Hungarian autodidact, learned seventeen tongues—the last, Hebrew, in her late eighties—and in middle age became one of the world's first simultaneous interpreters. Simcott joined the British Foreign Service. On tours of duty in Yemen, Bosnia, and Moldova, he picked up some of the lingo. Every summer, he set himself the challenge of learning a new tongue more purposefully, either by taking a university course—as he did in Mandarin, Japanese, Czech, Arabic, Finnish, and Georgian—or with a grammar book and a tutor.

However they differ, the hyperpolyglots whom I met all winced at the question “How many languages do you speak?” As Rojas-Berscia explained it, the issue is partly semantic: What does the verb “to speak” mean? It is also political. Standard accents and grammar are usually those of a ruling class. And the question is further clouded by the “chauvinism” that Ellen Jovin feels obliged to resist. The test of a spy, in thrillers, is to “pass for a native,” even though the English-speaking natives of Glasgow, Trinidad, Delhi, Lagos, New Orleans, and Melbourne (not to mention Eliza Doolittle’s East End) all sound foreign to one another. “No one masters all the nuances of a language,” Simcott said. “It’s a false standard, and one that gets raised, ironically, mostly by monoglots—Americans in particular. So let’s just say that I have studied more than fifty, and I use about half of them.”

Richard Hudson’s casual search for the ultimate hyperpolyglot was inconclusive, but it led him to an American journalist, Michael Erard, who had embarked on the same quest more methodically. Erard, who has a doctorate in English, spent six years reading the scientific literature and debriefing its authors, visiting archives (including Mezzofanti’s, in Bologna), and tracking down every living language prodigy he had heard from or about. It was his online survey, conducted in 2009, that generated the first systematic overview of linguistic virtuosity. Some four hundred respondents provided information about their gender and their orientation, among other personal details, including their I.Q.s (which were above average). Nearly half spoke at least seven languages, and seventeen qualified as hyperpolyglots. The distillation of this research, “Babel No More,” published in 2012, is an essential reference book—in its way, an ethnography of what Erard calls a “neural tribe.”

The awe that tribe members command has always attracted opportunists. There are, for example, “bizglots” and “broglots,” as Erard calls them. The former hawk tutorials with the dubious promise that anyone can become a prodigy, while the latter engage in online bragfests, like “postmodern frat boys.” And then there are the fauxglots. My favorite is “George Psalmanazar” (his real name is unknown), a vagabond of mysterious provenance and endearing chutzpah who wandered through Europe in the late seventeenth century, claiming, by turns, to be Irish, Japanese, and, ultimately, Formosan. Samuel Johnson befriended him in London, where Psalmanazar published a travelogue about his “native” island which included translations from its language—an ingenious pastiche of his invention. Erard pursued another much hyped character, Ziad Fazah, a Guinness-record holder until 1997, who claimed to speak fifty-eight languages fluently. Fazah flamed out spectacularly on a Chilean television show, failing to answer even simple questions posed to him by native speakers.

Rojas-Berscia derides such theatrics as “monkey business,” and dismisses prodigies who monetize their gifts. “Where do they get the time for it?” he wonders. Erard, in his survey for “Babel No More,” queried his subjects on their learning protocols, and, while some were vague (“I accept mistakes and uncertainty; I listen and read a lot”), others gave elaborate accounts of drawing “mind maps” and of building “memory anchors,” or of creating an architectural model for each new language, to be furnished with vocabulary as they progressed. When I asked Simcott if he had any secrets, he paused to think about it. “Well, I don’t have an amazing memory,” he said. “At many tasks, I’m just average. A neurolinguist at the City University of New York, Loraine Obler, ran some tests on me, and I performed highly on recalling lists of nonsense words.” (That ability, Obler’s research suggests, strongly correlates with a gift for languages.) “I was also a standout at reproducing sounds,” he continued. “But, the more languages you learn, in the more families, the easier it gets. Each one bangs more storage hooks into the wall.”

Alexander Argüelles, a legendary figure in the community, warned Erard that immodesty is the hallmark of a charlatan. When Erard met him, ten years ago, Argüelles, an American who lives in Singapore, started his day at three in the morning with a “scriptorium” exercise: “writing two pages apiece in Arabic, Sanskrit, and Chinese, the languages he calls the ‘etymological source rivers.’ ” He continued with other languages, from different families, until he had filled twenty-four notebook pages. As dawn broke, he went for a long run, listening to audiobooks and practicing what he calls “shadowing”: as the foreign sounds flowed into his headphones, he shouted them out at the top of his lungs. Back at home, he turned to drills in grammar and phonetics, logging the time he had devoted to each language on an Excel spreadsheet. Erard studied logs going back sixteen months, and calculated that Argüelles had spent forty per cent of his waking life studying fifty-two languages, in increments that varied from four hundred and fifty-six hours (Arabic) to four hours (Vietnamese). “The way I see it, there are three types of polyglots,” he told Erard. There were the “ultimate geniuses . . . who excel at anything they do”; the Mezzofantis, “who are only good at languages”; and the “people like me.” He refused to consider himself a special case—he was simply a Stakhanovite.

Erard is a pensive man of fifty, still boyish-looking, with a gift for listening that he prizes in others. We met in Nijmegen, at the Max Planck Institute, where he was finishing a yearlong stint as the writer-in-residence, and looking forward to moving back to Maine with his family. “I saw only when the book was finished that many of the stories had a common thread,” he told me. We had been walking through the woods that surround the institute, listening to the vibrant May birdsong, a Babel of voices. His subjects, he reflected, had been cut from the herd of average mortals by their wiring or by their obsession. They had embraced their otherness, and they had cultivated it. Yet, if speech defines us as human, a related faculty had eluded them: the ability to connect. Each new language was a potential conduit—an escape route from solitude. “I hadn’t realized that was my story, too,” he said.

Rojas-Berscia and I took a budget flight from Brussels to Malta, arriving at midnight. The air smelled like summer. Our taxi-driver presumed we were mother and son. “How do you say ‘mother’ in Maltese?” Rojas-Berscia asked him, in English. By the time we had reached the hotel, he knew the whole Maltese family. Two local newlyweds, still in their wedding clothes, were just checking in. “How do you say ‘congratulations’?” Rojas-Berscia asked. The answer was nifrah.

We were both starving, so we dropped our bags and went to a local bar. It was Saturday night, and the narrow streets of the quarter were packed with revellers grooving to deafening music. I had pictured something a bit different—a quaint inn on a quiet square, perhaps, where a bronze Knight of Malta tilted at the bougainvillea. But Rojas-Berscia is not easily distracted. He took out his notebook and jotted down the kinship terms he had just learned. Then he checked his phone. “I texted the language guide I lined up for us,” he explained. “He’s a personal trainer I found online, and I’ll start working out with him tomorrow morning. A gym is a good place to get the prepositions for direction.” The trainer arrived and had a beer with us. He was overdressed, with a lacquered mullet, and there was something shifty about him. Indeed, Rojas-Berscia prepaid him for the session, but he never turned up the next day. He had, it transpired, a subsidiary line of work.

I didn’t expect Rojas-Berscia to master Maltese in a week, but I was surprised at his impromptu approach. He spent several days raptly eavesdropping on native speakers in markets and cafés and on long bus rides, bathing in the warm sea of their voices. If we took a taxi to some church or ruin, he would ride shotgun and ask the driver to teach him a few common Maltese phrases, or to tell him a joke. He didn’t record these encounters, but in the next taxi or shop he would use the new phrases to start a conversation. Hyperpolyglots, Erard writes, exhibit an imperative “will to plasticity,” by which he means plasticity of the brain. But I was seeing plasticity of a different sort, which I myself had once possessed. In my early twenties, I had learned two languages simultaneously, the first by “sleeping with my dictionary,” as the French put it, and the other by drinking a lot of wine and being willing to make a fool of myself jabbering at strangers. With age, I had lost my gift for abandon. That had been my problem with Vietnamese. You have to inhabit a language, not only speak it, and fluency requires some dramatic flair. I should have been hanging out in New York’s Little Saigon, rather than staring at a screen.

The Maltese were flattered by Rojas-Berscia’s interest in their language, but dumbfounded that he would bother to learn it—what use was it to him? Their own history suggests an answer. Malta, an archipelago, is an almost literal stepping stone from Africa to Europe. (While we were there, the government turned away a boatload of asylum seekers.) Its earliest known inhabitants were Neolithic farmers, who were succeeded by the builders of a temple complex on Gozo. (Their mysterious megaliths are still standing.) Around 750 B.C., Phoenician traders established a colony, which was conquered by the Romans, who were routed by the Byzantines, who were kicked out by the Aghlabids. A community of Arabs from the Muslim Emirate of Sicily landed in the eleventh century and dug in so deep that waves of Christian conquest—Norman, Swabian, Aragonese, Spanish, Sicilian, French, and British—couldn’t efface them. Their language is the source of Maltese grammar and a third of the lexicon, making Malti the only Semitic language in the European Union. Rojas-Berscia’s Hebrew helped him with plurals, conjugations, and some roots. As for the rest of the vocabulary, about half comes from Italian, with English and French loanwords. “We should have done Uighur,” I teased him. “This is too easy for you.”

Linguistics gave Rojas-Berscia tools that civilians lack. But he was drawn to linguistics in part because of his aptitude for systematizing. “I can’t remember names,” he told me, yet his recall for the spoken word is preternatural. “It will take me a day to learn the essentials,” he had reckoned, as we planned the trip. The essentials included “predicate formation, how to quantify, negation, pronouns, numbers, qualification—‘good,’ ‘bad,’ and such. Some clausal operators—‘but,’ ‘because,’ ‘therefore.’ Copular verbs like ‘to be’ and ‘to seem.’ Basic survival verbs like ‘need,’ ‘eat,’ ‘see,’ ‘drink,’ ‘want,’ ‘walk,’ ‘buy,’ and ‘get sick.’ Plus a nice little shopping basket of nouns. Then I’ll get our guide to give me a paradigm—‘I eat an apple, you eat an apple’—and voilà.” I had, I realized, covered the same ground in Vietnamese—tôi ăn một quả táo—but it had cost me six months.

It wasn’t easy, though, to find the right guide. I suggested we try the university. “Only if we have to,” Rojas-Berscia said. “I prefer to avoid intellectuals. You want the street talk, not book Maltese.” How would he do this in the Amazon? “Monolingual fieldwork on indigenous tongues, without the reference point of a lingua franca, is harder, but it’s beautiful,” he said. “You start by making bonds with people, learning to greet them appropriately, and observing their gestures. The rules of behavior are at least as important in cultural linguistics as the rules of grammar. It’s not just a matter of finding the algorithm. The goal is to become part of a society.”

After the debacle with the “trainer,” we went looking for volunteers willing to spend an hour or so over a drink or a coffee. We auditioned a tattoo artist with blond dreadlocks, a physiology student from Valletta, a waiter on Gozo, and a tiny old lady who sold tickets to the catacombs outside Mdina (a location for King’s Landing in “Game of Thrones”). Like nearly all Maltese, they spoke good English, though Rojas-Berscia valued their mistakes. “When someone says, ‘He is angry for me,’ you learn something about his language—it represents a convention in Maltese. The richness of a language’s conventions is the highest barrier to sounding like a native in it.”

On our third day, Rojas-Berscia contacted a Maltese Facebook friend, who invited us to dinner in Birgu, a medieval city fortified by the Knights of Malta in the sixteenth century. The sheltered port is now a marina for super-yachts, although a wizened ferryman shuttles humbler travellers from the Birgu quays to those of Senglea, directly across from them. The waterfront is lined with old palazzos of coralline limestone, whose façades were glowing in the dusk. We ordered some Maltese wine and took in the scene. But the minute Rojas-Berscia opened his notebook his attention lasered in on his task. “Please don’t tell me if a verb is regular or not,” he chided his friend, who was being too helpful. “I want my brain to do the work of classifying.”

Rojas-Berscia’s brain is of great interest to Simon Fisher, his senior colleague at the institute and a neurogeneticist of international renown. In 2001, Fisher, then at Oxford, was part of a team that discovered the FOXP2 gene and identified a single, heritable mutation of it that is responsible for verbal dyspraxia, a severe language disorder. In the popular press, FOXP2 has been mistakenly touted as “the language gene,” and as the long-sought evidence for Noam Chomsky’s famous theory, which posits that a spontaneous mutation gave Homo sapiens the ability to acquire speech and that syntax is hard-wired. Other animals, however, including songbirds, also bear a version of the gene, and most of the researchers I met believe that language is probably, as Fisher put it, a “bio-cultural hybrid”—one whose genesis is more complicated than Chomsky would allow. The question inspires bitter controversy.

Fisher’s lab at Nijmegen focusses on pathologies that disrupt speech, but he has started to search for DNA variants that may correlate with linguistic virtuosity. One such quirk has already been discovered, by the neuroscientist Sophie Scott: an extra loop of gray matter, present from birth, in the auditory cortex of some phoneticians. “The genetics of talent is unexplored territory,” Fisher said. “It’s a hard concept to frame for an experiment. It’s also a sensitive topic. But you can’t deny the fact that your genome predisposes you in certain ways.”

The genetics of talent may thwart average linguaphiles who aspire to become Mezzofantis. Transgenerational studies are the next stage of research, and they will seek to establish the degree to which a genius for language runs in the family. Argüelles is the child of a polyglot. Kató Lomb was, too. Simcott’s daughter might contribute to a science still in its infancy. In the meantime, Fisher is recruiting outliers like Rojas-Berscia and collecting their saliva; when the sample is broad enough, he hopes, it will generate some conclusions. “We need to establish the right cutoff point,” he said. “We tend to think it should be twenty languages, rather than the conventional eleven. But there’s a trade-off: with a lower number, we have a bigger cohort.”

I asked Fisher about another cutoff point: the critical period for acquiring a language without an accent. The common wisdom is that one loses the chance to become a spy after puberty. Fisher explained why that is true for most people. A brain, he said, sacrifices suppleness to gain stability as it matures; once you master your mother tongue, you don’t need the phonetic plasticity of childhood, and a typical brain puts that circuitry to another use. But Simcott learned three of the languages in which he is mistaken for a native when he was in his twenties. Corentin Bourdeau, who grew up in the South of France, passes for a local as seamlessly in Lima as he does in Tehran. Experiments in extending or restoring plasticity, in the hope of treating sensory disabilities, may also lead to opportunities for greater acuity. Takao Hensch, at Harvard, has discovered that Valproate, a drug used to treat epilepsy, migraines, and bipolar disorder, can reopen the critical period for visual development in mice. “Might it work for speech?” Fisher said. “We don’t know yet.”

Rojas-Berscia and I parted on the train from Brussels to Nijmegen, where he got off and I continued to the Amsterdam airport. He had to finish his thesis on the Flux Approach before leaving for a research job in Australia, where he planned to study aboriginal languages. I asked him to assess our little experiment. “The grammar was easy,” he said. “The orthography is a little difficult, and the verbs seemed chaotic.” His prowess had dazzled our consultants, but he wasn’t as impressed with himself. He could read bits of a newspaper; he could make small talk; he had learned probably a thousand words. When a taxi-driver asked if he’d been living on Malta for a year, he’d laughed with embarrassment. “I was flattered, of course,” he added. “And his excitement for my progress excited him to help us.” “Excitement about your progress,” I clucked. It was a rare lapse.

A week later, I was on a different train, from New York to Boston. Fisher had referred me to his collaborator Evelina Fedorenko. Fedorenko is a cognitive neuroscientist at Massachusetts General Hospital who also runs what her postdocs call the EvLab, at M.I.T. My first e-mail to her had bounced back—she was on maternity leave. But then she wrote to say that she would be delighted to meet me. “Are you claustrophobic?” she added. If not, she said, I could take a spin in her fMRI machine, to see what she does with her hyperpolyglots.

Fedorenko is small and fair, with delicate features. She was born in Volgograd in 1980. “When the Soviet Union fell apart, we were starving, and it wasn’t fun,” she said. Her father was an alcoholic, but her parents were determined to help her fulfill her exceptional promise in math and science, which meant escaping abroad. At fifteen, she won a place in an exchange program, sponsored by Senator Bill Bradley, and spent a year in Alabama. Harvard gave her a full scholarship in 1998, and she went on to graduate school at M.I.T., in linguistics and psychology. There, she met the cognitive scientist Ted Gibson. They married, and they now have a one-year-old daughter.

One afternoon, I visited Fedorenko at her home, in Belmont. (She spends as much time as she can with her baby, who was babbling like a songbird.) “Here is my basic question,” she said. “How do I get a thought from my mind into yours? We begin by asking how language fits into the broader architecture of the mind. It’s a late invention, evolutionarily, and a lot of the brain’s machinery was already in place.”

She wondered: Does language share a mechanism with other cognitive functions? Or is it autonomous? To seek an answer, she developed a set of “localizer tasks,” administered in an fMRI machine. Her first goal was to identify the “language-responsive cortex,” and the tasks involved reading or listening to a sequence of sentences, some of them garbled or composed of nonsense words.

The responsive cortex proved to be separate from regions involved in other forms of complex thought. We don’t, for example, use the same parts of our brains for music and for speech, which seems counterintuitive, especially in the case of a tonal language. But pitch, Fedorenko explained, has its own neural turf. And life experience alters the picture. “Literate people use one region of their cortex in recognizing letters,” she said. “Illiterate people don’t have that region, though it develops if they learn to read.”

In order to draw general conclusions, Fedorenko needed to study the way that language skills vary among individuals. They turned out to vary greatly. The intensity of activity in response to the localizer tests was idiosyncratic; some brains worked harder than others. But that raised another question: Did heightened activity correspond to a greater aptitude for language? Or was the opposite true—that the cortex of a language prodigy would show less activity, because it was more efficient?

I asked Fedorenko if she had reason to believe that gay, left-handed males on the spectrum had some cerebral advantage in learning languages. “I’m not prepared to accept that reporting as anything more than anecdotal,” she said. “Males, for one thing, get greater encouragement for intellectual achievement.”

Fedorenko’s initial subjects had been English-speaking monolinguals, or bilinguals who also spoke Spanish or Mandarin. But, in 2013, she tested her first prodigy. “We heard about a local kid who spoke thirty languages, and we recruited him,” she said. He introduced her to other whizzes, and as the study grew Fedorenko needed material in a range of tongues. Initially, she used Bible excerpts, but “Alice’s Adventures in Wonderland” came to seem more congenial. The EvLab has acquired more than forty “Alice” translations, and Fedorenko plans to add tasks in sign language.

Twelve years on, Fedorenko is confident of certain findings. All her subjects show less brain activity when working in their mother tongue; they don’t have to sweat it. As the language in the tests grows more challenging, it elicits more neural activity, until it becomes gibberish, at which point it elicits less—the brain seems to give up, quite sensibly, when a task is futile. Hyperpolyglots, too, work harder in an unfamiliar tongue. But their “harder” is relaxed compared with the efforts of average people. Their advantage seems to be not capacity but efficiency. No matter how difficult the task, they use a smaller area of their brain in processing language—less tissue, less energy.

All Fedorenko’s guinea pigs, including me, also took a daunting nonverbal memory test: squares on a grid flash on and off as you frantically try to recall their location. This trial engages a neural network separate from the language cortex—the executive-function system. “Its role is to support general fluid intelligence,” Fedorenko said. What kind of boost might it give to, say, a language prodigy? “People claim that language learning makes you smarter,” she replied. “Sadly, we don’t have evidence for it. But, if you play an unfamiliar language to ‘normal’ people, their executive-function systems don’t show much response. Those of polyglots do. Perhaps they’re striving to grasp a linguistic signal.” Or perhaps that’s where their genie resides.

Barring an infusion of Valproate, most of us will never acquire Rojas-Berscia’s twenty-eight languages. As for my own brain, I reckoned that the scan would detect a lumpen mass of mac and cheese embedded with low-wattage Christmas lights. After the memory test, I was sure that it had. “Don’t worry,” Matt Siegelman, Fedorenko’s technician, reassured me. “Everyone fails it—well, almost.”

Siegelman’s tactful letdown woke me from my adventures in language land. But as I was leaving I noticed a copy of “Alice” in Vietnamese. I report to you with pride that I could make out “white rabbit” (thỏ trắng), “tea party” (tiệc trà), and ăn tôi, which—you knew it!—means “eat me.” ♦

This article appears in the print edition of the September 3, 2018, issue, with the headline “Maltese for Beginners.”

A Picture Of Language: The Fading Art Of Diagramming Sentences

Once a popular way to teach grammar, the practice of diagramming sentences has fallen out of favor.

The design firm Pop Chart Lab has taken the first lines of famous novels and diagrammed those sentences. This one shows the opening of Franz Kafka’s Metamorphosis. (Image: Pop Chart Lab)

Source: A Picture Of Language: The Fading Art Of Diagramming Sentences

When you think about a sentence, you usually think about words — not lines. But sentence diagramming brings geometry into grammar.

If you weren’t taught to diagram a sentence, this might sound a little zany. But the practice has a long — and controversial — history in U.S. schools.

And while it was once commonplace, many people today don’t even know what it is.

So let’s start with the basics.

“It’s a fairly simple idea,” says Kitty Burns Florey, the author of Sister Bernadette’s Barking Dog: The Quirky History and Lost Art of Diagramming Sentences. “I like to call it a picture of language. It really does draw a picture of what language looks like.”

I asked her to show me, and for an example she used the first sentence she recalls diagramming: “The dog barked.”

“By drawing a line and writing ‘dog’ on the left side of the line and ‘barked’ on the right side of the line and separating them with a little vertical line, we could see that ‘dog’ was the subject of the sentence and ‘barked’ was the predicate or the verb,” she explains. “When you diagram a sentence, those things are always in that relation to each other. It always makes the same kind of picture. And supposedly, it makes it easier for kids who are learning to write, learning to use correct English.”
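Burns Florey’s description translates almost directly into a picture you can draw by hand — or, as a toy illustration (the function and layout here are my own, not anything from the article), with a few lines of Python that print the same subject-predicate picture:

```python
def diagram(subject, verb, modifier=None):
    """Render a minimal Reed-Kellogg-style diagram as plain text:
    subject and predicate sit on a baseline, separated by a vertical
    bar; an optional modifier hangs on a slanted line below."""
    baseline = f" {subject} | {verb}"
    lines = [baseline, "-" * len(baseline)]
    if modifier:
        lines.append("  \\")          # the slanted line under the subject
        lines.append(f"   {modifier}")
    return "\n".join(lines)

print(diagram("dog", "barked", modifier="The"))
```

A real Reed-Kellogg diagram draws modifiers on true diagonal lines beneath the baseline; this sketch only approximates that with a backslash, but it captures the core idea Burns Florey describes: the subject and predicate always sit in the same spatial relation to each other.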

An Education ‘Phenomenon’

Burns Florey and other experts trace the origin of diagramming sentences back to 1877 and two professors at Brooklyn Polytechnic Institute. In their book, Higher Lessons in English, Alonzo Reed and Brainerd Kellogg argued that students would learn to structure sentences better if they could see them drawn as graphic structures.

After Reed and Kellogg published their book, the practice of diagramming sentences had something of a Golden Age in American schools.

“It was a purely American phenomenon,” Burns Florey says. “It was invented in Brooklyn, it swept across this country like crazy and became really popular for 50 or 60 years and then began to die away.”

By the 1960s, new research had begun to heap criticism on the practice.

“Diagramming sentences … teaches nothing beyond the ability to diagram,” declared the 1960 Encyclopedia of Educational Research.

In 1985, the National Council of Teachers of English declared that “repetitive grammar drills and exercises” — like diagramming sentences — are “a deterrent to the improvement of students’ speaking and writing.”

Nevertheless, diagramming sentences is still taught — you can find it in textbooks and see it in lesson plans. My question is, why?

Burns Florey says it might still be a good tool for some students. “When you’re learning to write well, it helps to understand what the sentence is doing and why it’s doing it and how you can improve it.”

But does it deserve a place in English class today? (The Common Core doesn’t mention it.)

“There are two kinds of people in this world — the ones who loved diagramming, and the ones who hated it,” Burns Florey says.

She’s in the first camp. But she understands why, for some students, it never clicks.

“It’s like a middle man. You’ve got a sentence that you’re trying to write, so you have to learn to structure that, but also you have to learn to put it on these lines and angles and master that, on top of everything else.”

So many students ended up frustrated, viewing the technique “as an intrusion or as an absolutely confusing, crazy thing that they couldn’t understand.”

Science Discovered That Banning Small Talk from Your Conversations Makes You Happier (Try Asking These 13 Questions Instead)

It’s time to delete questions like ‘what do you do?’ and ‘where do you live?’ from your vocabulary forever.

Source: Science Discovered That Banning Small Talk from Your Conversations Makes You Happier (Try Asking These 13 Questions Instead)

By Marcel Schwantes, principal and founder, Leadership From the Core

Ever walk into a networking event or cocktail party where all you hear is superficial chit-chat? The small talk is deafening and doesn’t evolve into anything substantial. You can hardly keep from rolling your eyes between sips of your mojito.

Questions like what do you do? and where do you live? are predictable and exhausting; commentary about the weather or last night’s game fills up awkward moments as people size each other up to determine — is this someone I want to talk to?

As it turns out, the types of conversations you’re engaging in truly matter for your personal wellbeing. In 2010, scientists from the University of Arizona and Washington University in St. Louis investigated whether happy and unhappy people differ in the types of conversations they have.

The findings

Seventy-nine participants wore a recording device for four days and were periodically recorded as they went about their lives. From more than 20,000 recordings, researchers classified the conversations as either trivial small talk or substantive discussion.

As published in Psychological Science, the happiest participants had twice as many genuine conversations and one third as much small talk as the unhappiest participants.

These findings suggest that the happy life is social and conversationally deep rather than isolated and superficial. The research also confirmed what most people know but don’t practice: surface-level small talk does not build relationships.

The new trend: Ban the small talk

Obviously inspired, behavioral scientists Kristen Berman and Dan Ariely, co-founders of Irrational Labs, a non-profit behavioral consulting company, raised the bar by hosting a dinner party where small talk was literally banned and only meaningful conversations were allowed.

As documented in a Wired article, invited guests of Berman and Ariely were provided with index cards featuring examples of meaningful (and odd) conversation starters, such as the theory of suicide prevention or, um … “the art of the dominatrix.”

The party was a hit. The authors report that “everyone was happier” without the obligation of trivial small talk.

Seizing the opportunity as any innovative entrepreneur would, Carolina Gawroński, founder of No Small Talk dinners, launched her business last month in Hong Kong; it is quickly spreading to cities around the world.

“Growing up I was surrounded by, on the one side, [my father’s] interesting friends. But on the other side, there was this whole element of being social and being at bullshit social events,” Gawroński tells Hong Kong Free Press. “Since a young age, I’ve always questioned it: ‘Why do people talk like this? What’s the point?'”

The rules at a No Small Talk dinner event are simple: no phones and no small talk. Guests also receive cards with meaningful-conversation prompts.

Then, there’s Sean Bisceglia, a partner at Sterling Partners, a private equity firm. Bisceglia has hosted Jefferson-style dinners at his home for the past eight years.

The concept is basically the same, but it plays out as a whole-table conversation with a purpose: one person speaks at a time to the whole table, there are no side conversations, and small talk is completely banned.

“I do it because the shallowness of cocktail chitchat kind of drove me crazy,” Bisceglia tells Crain’s Chicago Business. “There was never any conversation deeper than two minutes. I really felt that if we could bring together a group of people, you could get into the issues and hear different people’s perspectives.”

13 questions to start great conversations

If you’ve bought into this idea of banning small talk from your conversations, here are thirteen no-fail conversation starters, cherry-picked from a few credible sources:

  1. What’s your story?
  2. What’s the most expensive thing you’ve ever stolen?
  3. What is your present state of mind?
  4. What absolutely excites you right now?
  5. What book has influenced you the most?
  6. If you could do anything you wanted tonight (anywhere, for any amount of money), what would you do and why?
  7. If you had the opportunity to meet one person you haven’t met who would it be, why and what would you talk about?
  8. What’s the most important thing I should know about you?
  9. What do you value more, intelligence or common sense?
  10. What movie is your favorite guilty pleasure, and why?
  11. You are stuck on a deserted island, and you can only take three things. What would they be?
  12. When and where were you happiest in your life?
  13. What do you think is the driving force in your life?

This Style of Entertainment Makes You Smarter – PsyBlog

An unsettling feeling, like the absurdity of life, can engender the desired state.

Source: This Style of Entertainment Makes You Smarter – PsyBlog

Surreal books and films could make you smarter, research finds.

Stories by Franz Kafka or films by master of the absurd David Lynch could boost learning.

Even an unsettling feeling, like the absurdity of life, can engender the desired state.

The reason is that surreal or nonsensical things put our mind into overdrive looking for meaning.

When people are more motivated to search for meaning, they learn better, the psychologists found.

Dr Travis Proulx, the study’s first author, explained:

“The idea is that when you’re exposed to a meaning threat — something that fundamentally does not make sense — your brain is going to respond by looking for some other kind of structure within your environment.

And, it turns out, that structure can be completely unrelated to the meaning threat.”

For the study, people read a Franz Kafka short story called ‘The Country Doctor’ — which involves a nonsensical series of events.

A version of the story was rewritten to make more sense and read by a control group.

Afterwards, both groups were given an unconscious learning task that involved spotting patterns in strings of letters.

Dr Proulx said:

“People who read the nonsensical story checked off more letter strings — clearly they were motivated to find structure.

But what’s more important is that they were actually more accurate than those who read the more normal version of the story.

They really did learn the pattern better than the other participants did.”

In a second study, people were made to feel their own lives didn’t make sense.

This was done by pointing out the contradictory decisions they had made.

Dr Proulx said:

“You get the same pattern of effects whether you’re reading Kafka or experiencing a breakdown in your sense of identity.

People feel uncomfortable when their expected associations are violated, and that creates an unconscious desire to make sense of their surroundings.

That feeling of discomfort may come from a surreal story, or from contemplating their own contradictory behaviors, but either way, people want to get rid of it.

So they’re motivated to learn new patterns.”

The study only tested unconscious learning, so it doesn’t tell us whether you would be able to use this trick intentionally.

Dr Proulx said:

“It’s important to note that sitting down with a Kafka story before exam time probably wouldn’t boost your performance on a test.

What is critical here is that our participants were not expecting to encounter this bizarre story.

If you expect that you’ll encounter something strange or out of the ordinary, you won’t experience the same sense of alienation.

You may be disturbed by it, but you won’t show the same learning ability.

The key to our study is that our participants were surprised by the series of unexpected events, and they had no way to make sense of them.

Hence, they strived to make sense of something else.”

The study was published in the journal Psychological Science (Proulx & Heine, 2009).

How Language Shapes the Way We Think

There are about 7,000 languages spoken around the world — and they all have different sounds, vocabularies and structures. But do they shape the way we think? Cognitive scientist Lera Boroditsky shares examples of language — from an Aboriginal community in Australia that uses cardinal directions instead of left and right to the multiple words for blue in Russian — that suggest the answer is a resounding yes. “The beauty of linguistic diversity is that it reveals to us just how ingenious and how flexible the human mind is,” Boroditsky says. “Human minds have invented not one cognitive universe, but 7,000.”



Lera Boroditsky · Cognitive scientist

Lera Boroditsky is trying to figure out how humans get so smart.


Distracted Boyfriend meme is sexist, rules Swedish ad watchdog



From The Guardian

Popular image of man ogling another woman deemed degrading and discriminatory

The popular Distracted Boyfriend meme, based on a photo of a man turning away from his outraged girlfriend to stare admiringly at another woman, has been ruled sexist by Sweden’s advertising ombudsman.

The stock image, also known as Man Looking at Other Woman, by Antonio Guillem, a photographer from Barcelona, was named meme of the year in April and was one of the most widely shared memes in 2017, providing comment on anything from music to politics to hit TV shows.

The ombudsman said recruitment advertisements posted on Facebook by the internet services provider Bahnhof, which labelled the boyfriend “You”, the girlfriend “Your current workplace”, and the second woman “Bahnhof”, were gender-discriminatory, the Local reported.

“The advertisement objectifies women,” the ombudsman, RO, said. “It presents women as interchangeable items and suggests only their appearance is interesting … It also shows degrading stereotypical gender roles of both men and women and gives the impression men can change female partners as they change jobs.”

The ombudsman said the image objectified the two women by presenting them as workplaces, but the man as an individual, and added that the “other woman” was clearly a “sex object … unrelated to the advertisement, which is for recruiting salespeople, operating engineers and a web designer”.


The Swedish advertising industry is self-regulating, meaning that the ombudsman can criticise ads but it does not have the power to impose sanctions.

The ad, posted in April, drew nearly 1,000 comments, many from women who complained it was sexist. “1. You really don’t want to attract women to your company,” one commenter, Susanne Lahti Hagbard, said. “2. You really don’t want to attract sensible guys either.”

Another, Sofie Sundåker, said: “It doesn’t matter if it’s a popular meme. If you do not see how this picture is sexist whatever words are on the people, you are clearly not a workplace for any woman who wants to be taken seriously in her work.”

The company said on its Facebook page that its aim had been “to illustrate a situation that shows Bahnhof is an attractive employer, and that people who have a slightly duller workplace might be interested in us. This was the situation illustrated in this meme.

“Anyone familiar with the internet and meme culture knows how this meme is used and interpreted. Gender is usually irrelevant in the context. We explained meme culture to the ombudsman, but it chose to interpret the post differently”.

If the company should be punished for anything, it concluded, “it should be for using a tired old meme”.

While Sweden frequently features near the top of world gender-equality rankings, a 2016 study found it was the worst of the Nordic countries at combating sexist advertising. This year, Stockholm council voted to bar ads deemed sexist or degrading from the city’s public billboards.

Tech Titans Dish Advice About Phone Addiction – Great Escape – Medium




Your phone is training you to be its servant. Here’s how to fight back.

by Clint Carter

Source: Tech Titans Dish Advice About Phone Addiction – Great Escape – Medium

With every Facebook post you like, tweet you send, or question you type into Google, you’re giving the internet strength. Feeding the algorithms. Paying the advertisers. You’re also helping to fill server farms that will ultimately be replaced by bigger server farms, effectively anchoring the internet in the real world. This is all sweet and rosy, if the internet-human relationship is mutually beneficial. But it’s not clear that it is.

In some ways, our nonstop online lives are bringing us closer. But at least as often, the relentless pace of social media, email, and constant pings and beeps only serve to pull us further apart. And all this tech is certainly bad for our health and happiness: Research links social media to depression and high-speed internet to poor sleep. Simply having a phone visible during meals has been shown to make conversation among friends less enjoyable.

It’s probably hard to imagine life without a high-powered computer in your pocket or purse at all times, but it’s worth remembering that you’re still an autonomous being.

That said, these effects aren’t inevitable. Not yet, anyway. It’s probably hard to imagine life without a high-powered computer in your pocket or purse at all times, but it’s worth remembering that you’re still an autonomous being. You can decide how often and in what way you interact with the internet. And if you talk to the researchers, authors, and entrepreneurs who understand digital technology best, you discover that many of them already have.

We reached out to eight digital experts to find out how they maintain a (reasonably) healthy relationship with technology. All agreed that push notifications are evil, so you should go ahead and turn those off right now. Some of the experts even said they keep their ringers and text notifications off, at least some of the time. Beyond that, they all had unique strategies for defending themselves against the intrusive, obnoxious, and possibly destructive effects of technology.

Give Yourself One Honest Hour of Work Each Day

Dan Ariely, PhD

Professor of psychology and behavioral economics at Duke University, author of Predictably Irrational: The Hidden Forces That Shape Our Decisions

Much of Dan Ariely’s work — including Timeful, the A.I.-powered calendar app he built and sold to Google — focuses on making the most of limited time. One way he does this is by starting each morning in a distraction-free environment. “I think very carefully about the first hour of the day,” he says. “I used to have two computers, and one had no email or browser on it.” That’s the one he used for writing in the mornings.

“The thing is to realize that our time to work is actually quite precious.”

Ariely’s travel schedule forced him to abandon the dual-computer setup, but the experiment was fruitful enough that he now relies on a self-imposed internet ban to get work done. “The last thing I do each day is turn my computer off,” he says. “The next day, when I turn it back on, my browser and email are still off.” And Ariely keeps it that way until he’s powered through that first hour. “The thing is to realize that our time to work is actually quite precious,” he says. “We need to protect it.”

Quit Cold Turkey

Steve Blank

Stanford professor, retired entrepreneur, and founder of the Lean Startup movement

Over the two-plus decades that Steve Blank helped shape Silicon Valley, he ushered eight technology startups into the world. But it was during his tenure at Rocket Science Games, a company he founded in the mid-1990s, that Blank began getting high on his own supply. “I found myself drug addicted,” he says. “I’d be up playing games until four in the morning.”

“The devices started as tools and ended up as drugs for most people.”

Video games are hardly a Schedule 1 narcotic, but Blank was losing sleep and, he felt, setting a bad example for his children. Emerging research confirms his idea that games and social media can exert drug-like forces over users. A study published in the journal PLOS One even found that digital addictions can shrink the amount of white matter at certain brain sites, creating changes similar to those seen in alcohol, cocaine, and methamphetamine addictions. “The devices started as tools and ended up as drugs for most people,” Blank says. “App manufacturers are incentivized to make us addicted. I’ll contend that a ton of social media is actually a lot like oxycontin.”

When Blank realized that his gaming habit was robbing him of happiness by way of lost sleep and family time, he snapped his CD-ROMs in half (this was the ’90s, remember). Then he threw the pieces into the trash. “I literally went cold turkey,” he says. “And I haven’t played a video game since.”

Create an Email System and Stick to It

Ethan Kross, PhD

Professor of psychology and director of the Emotion and Self-Control Laboratory at the University of Michigan

After studying Facebook — and, more important, after finding that the biggest users were the least satisfied with life — Ethan Kross decided to refrain from any social media use. But he still checks his email more often than he’d like. “It’s a self-control failure from a self-control expert,” he says.

To be fair, the professor is probably selling himself short. The truth is he relies on three solid rules to prevent compulsive emailing.

“So I just try to change my digital environment. We know from research that can be a powerful tool for enhancing self-control.”

First, Kross pushes all fast-moving work conversations to Slack. “That way I can get information from my lab collaborators quickly, and my email becomes less urgent.”

Second, he uses the snooze function, which is available on Gmail and services like Boomerang for Outlook, for any email that isn’t urgent. “If there are 50 things in my inbox, that can be disruptive to my immediate goals,” Kross says. So he snoozes them for a few hours or a few days, depending on the urgency.

Finally, Kross relies on an email-free iPad for reading, so he can’t check his incoming mail even if he wants to. “I don’t like checking my email when I’m in bed, because once every month I’ll receive something that makes me not sleep well,” he says. “So I just try to change my digital environment. We know from research that can be a powerful tool for enhancing self-control.”

Take Weeklong Breaks as Necessary

Jean Twenge, PhD

Researcher and professor of psychology at San Diego State University and the author of iGen, a book about how the internet is changing young adults

In April of last year, Jean Twenge signed up for Twitter. It’s her first and only social media account, and almost immediately she found herself clashing with people who disagreed with her research. “It’s a public forum, and I felt a compulsion to defend my arguments,” Twenge says. “But is that the right response? I don’t know. For my own mental health, I know it’s not.”

“It’s a public forum, and I felt a compulsion to defend my arguments.”

It’s not that she wanted to be on Twitter, but as an academic with a book to promote, Twenge felt like she had to. After six months with the service, though, Twenge noticed that she was increasingly giving in to a compulsion to check up on conversations that were making her miserable. “It completely confirmed why I don’t have social media,” she says. And so she scaled back. Twenge kept the account for promotional reasons and still has periods of time when she’s active, but when she needs a refresh, she consciously steps away for days or weeks.

When asked if she’s tempted to open an Instagram or Facebook account — even if just for research purposes — she replies quickly, “Nope.”

Dock Your Gadget and Walk Away

Erik Peper, PhD

Professor at San Francisco State University and president of the Biofeedback Federation of Europe

As a researcher who explores the impact of excessive phone use (it makes us feel lonely) and the bad posture brought on by constantly staring at a screen, Erik Peper makes a point of keeping his phone at a distance. When he leaves home in the morning, he packs it into his backpack instead of his pocket. And when he returns in the evening, he docks it at the charging station by his front door.

What’s the point? There are two, actually.

“There are very few things that are truly urgent.”

First, the microwaves coming off mobile devices could present a small risk to their owners, Peper says. In a paper he wrote for the journal Biofeedback, Peper cites epidemiological research showing that people who use cellphones for more than 10 years are more likely than nonusers to have tumors on their salivary glands and inside their ear canals. They’re also three times as likely to have certain brain and spinal-cord tumors on the side of their head where they hold their phone. “The data is weak and controversial,” Peper admits. “But I believe in the precautionary principle, which says that you have to first prove something is totally safe before you can use it.”

The second reason is that, simply put, it’s a distraction. “The phone hijacks our evolutionary patterns,” Peper says. “We don’t do good with multitasking, so if you’re writing an article, and every five minutes you pop back to answer a message, you’re much less productive in the long term.” The same logic applies to socializing, he says, which is why his phone is stored out of sight when he’s with friends and family.

Does it matter that he’s a little slow to reply to messages? Or that he occasionally misses a call? “There are very few things that are truly urgent,” Peper says. “It’s different if you’re a firefighter, but beyond that, whether I answer the email this minute, later today, or even this evening — it really makes no difference.”

Eliminate Email on Your Phone

Linden Tibbets

CEO of IFTTT, a service that lets you program your apps and smart devices to carry out rote tasks

Years ago, Linden Tibbets decided he didn’t want to be a slave to his email. Which meant, in short, that he would read and send messages only while sitting at his desk.

“The only time I send email on my phone is if I’m running late to a meeting and there’s no other way to communicate,” Tibbets says. “That’s literally the only time.”

“You can be endlessly entertained with what’s happening in the world around you. You don’t need your phone.”

The upshot, he says, is that he’s able to address his correspondence with better focus. “I would much rather spend an extra hour in the evening responding to email than to be distracted by it off and on throughout the day,” Tibbets says. If it takes a while to reply to people, no big deal. “I just say, ‘Thanks for your patience. I apologize for being slow to get back to you.’”

And if he finds himself with a moment of downtime — standing in line for groceries, for instance — Tibbets considers it a rare opportunity for mind wandering. “I play a game with myself where I try not to look at my phone,” he says. “I look at people. I read food labels. I observe things in the environment. You can be endlessly entertained with what’s happening in the world around you. You don’t need your phone.”

Schedule Moments of Disconnection

Adam Alter, PhD

Professor of marketing at New York University and author of Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked

In Irresistible, Adam Alter argues that in some ways, tech addiction may actually be worse than cigarette addiction. Because the web is built on social connections, each new addict makes it harder for the rest of us to abstain. “Addictive tech is part of the mainstream in a way that addictive substances never will be,” Alter writes. “Abstinence isn’t an option.”

“I try to put my phone on airplane mode on weekends.”

So what does the tech critic do to protect his own mental autonomy? He disconnects when the workweek’s done. “I try to put my phone on airplane mode on weekends so I can take photos of my two young kids without interruptions from emails and other needy platforms.”

Swap Out the Brain-Rot Apps for Ones That Enrich

Ali Brown

Entrepreneurial consultant, host of Glambition, a podcast for women in business

Last year, Ali Brown had a social media reckoning. “It was after the election, when everything was getting toxic and weird,” she says. “I was getting all my news from Facebook, and I felt this sense of unease all the time.”

So Brown did an entirely logical thing that most of us haven’t done: She drained the swamp on her phone. In one heroic moment of full-steam bravado, Brown deleted Facebook, Twitter, and Instagram and replaced them with apps from the Wall Street Journal and the New York Times. “I decided to pay for some really good journalism,” she says. “I’ll use my time to read those instead.”

“Responding to social media all day is going to get you nowhere.”

Once her healthier new phone routine was established, Brown added back one social media app — but just one! “I like Instagram because it’s generally happy and fun,” she says. “I post about my kids.”

Brown is lucky enough to have a team to run her Twitter and Facebook accounts, but she knows there are better ways to invest her personal time. “If you’re here in this life to do great, powerful work, then you need to create some space in your day to be a freethinker,” she says. “Responding to social media all day is going to get you nowhere.”

To her clients — mostly women running seven- and eight-figure companies — Brown generally offers this advice: “Try deleting social media for a week. You won’t miss anything, you won’t cease to exist, and you’ll thank me later.”


WRITTEN BY Clint Carter

Writer for publications such as Entrepreneur, Men’s Health, Men’s Journal, New York magazine, and Wall Street Journal.

Wicked Witch or Job Candidate?



I have observed an increasing number of articles coming across my news feeds and social media about how inaccurate perceptions of aging women impact them in the workplace. A recent WSJ article about women over 50 looking for work caught my attention, as the trends in the workplace and media have some similarities. The article cited a study conducted in 2015 at the University of California, Irvine, in which researchers submitted 40,000 fake job applications from both male and female “candidates” across three age ranges. Unfortunately, significant evidence was found of age discrimination against older women. The author also noted that women often take jobs that are below their capacity, skill level and pay grade, and are judged more harshly than their male counterparts for their appearance (Weber, 2017). Being someone who believes the “data doesn’t lie,” I looked at Census and labor statistics.

Women over 40 make up 48% of the U.S. population, and men over 40 roughly 44% (United States Census Bureau, 2017). However, when it comes to unemployment, women tend to fare worse than men as they age. Unemployment in the 45-54 age range is higher for women (3.6% of women vs. 3% of men), equal in the 55-64 range (2.8% for both genders), and higher again in the over-65 segment (4% of women vs. 3.1% of men) (Labor Force Statistics from the Current Population Survey, 2018).

In the age of awareness and press coverage around unconscious bias, you would think the problem of discrimination and false perceptions associated with age and gender would lead to a more enlightened public. So, the question is, why? Why are women in higher age groups subjected to tougher hurdles and unfair perceptions by other groups? One variable to look at is the media we have consumed. If you think logically about the media consumed by multiple generations, older women have not usually been portrayed in a positive light. For example, Snow White had an evil older stepmother, The Little Mermaid had Ursula the old gray-haired villain, and 101 Dalmatians’ villain was Cruella de Vil. The list goes on and on. See a theme here? If you think these portrayals haven’t impacted our perceptions, please read on…

Cultivation theory in psychology posits that media shapes the public’s worldview, especially in children. Media-created worldviews, especially those with high exposure, can influence schemas as to what is perceived as normal, particularly for individuals in groups that have little exposure to other groups except through media (Signorielli, 2004). Portrayals of age groups in television and film can influence our perceptions of the size of a demographic group, as well as its competencies. Negative portrayals of older age groups can and will create perceptions, particularly among younger demographics, because they are not as likely to critically examine media portrayals. However, perception formation does not only impact younger generations; those in the aging group tend to hold negative stereotypes and perceptions about their own group as well (Lauzen & Dozier, 2005b).

A double standard associated with aging men and women exists in television, film and advertising messages about older women. In many films, women are portrayed as younger than male characters, and female characters are described as elderly at an earlier age than males. Women are often considered older in the film and television industry by age 35, whereas this threshold is higher for men (Bazzini, McIntosh, Smith, Cook, & Harris, 1997). Women’s value in film emphasizes looks and youth, whereas men have additional attributes that define their worth. In an analysis of the top 100 grossing films of 2002, Lauzen and Dozier (2005a) found that male characters over the age of 50 were depicted as active in all aspects of life, whereas females were not. Men are portrayed as if they still have things to accomplish as they age, while women are portrayed with less purposeful lives, lacking, for example, career aspirations (Lauzen & Dozier, 2005a).

Television isn’t any better than film: over time it has portrayed aging women as becoming old earlier in life and as less visible than males. Furthermore, aging female characters are portrayed as less useful and with diminished capacity, particularly around prestige and elements that would represent importance and vitality, compared to men of the same age (Bazzini et al., 1997; Signorielli, 2004). A study conducted in 2005 on primetime television characters found that representation, recognition and respect are not the same for men and women as they age. Specifically:

  • Aging female characters had less representation than their male counterparts starting in their 40s.
  • Portrayals of leadership increased with age; however, men were much more likely than women to play leadership roles in their 40s and 50s.
  • Occupational power portrayals had a positive linear relationship with age for both genders; however, men in their 50s were more likely to have occupational power than women of the same age.
  • Male characters of all ages were likely to have goals, whereas women were likely to have goals only in their 40s.

Lauzen and Dozier’s research concluded that there is a double standard of respect afforded to aging characters based on gender. Male characters were more likely to have leadership roles, occupational power and goals compared to women, which could have potential effects on older women, such as reinforcing a stereotype bias against them in the workplace (Lauzen & Dozier, 2005b).

Some of you reading this article may look at the age of the research I am referencing and say, “This research is between 10 and 20 years old, and so much has changed.” With women’s issues receiving more attention in the media, it wouldn’t be farfetched to provide proof points of the changing times by referencing actors such as Lily Tomlin and Jane Fonda in Grace and Frankie, or Judi Dench and Helen Mirren in powerful roles in recent years. However, this is a false assumption, because cultivation theory posits that what we see in the media creates our worldviews regardless of its veracity. A study conducted in 2016 analyzed over 2,000 movie screenplays and the gender associated with their dialogue. As women aged, their percentage of dialogue quickly diminished, while men’s dialogue increased with age. For example, women between 22 and 31 received 38% of screenplay words (men received 20%), while between ages 42 and 65 women received 20% and men received 39%. The numbers for over 65 were abysmal for both genders; however, women fared worse, with 3% compared to males at 5% (Anderson & Daniels, 2016).

The Center for the Study of Women in Television and Film’s analysis of the top 100 grossing films of 2017 did not provide an encouraging picture. Women’s speaking roles totaled 34% of all characters, which is sad considering they represent half the population. When that unfair portion of speaking roles was broken down by age, the story continued to favor the younger woman: men over 40 accounted for 46% of all male characters, whereas women over 40 were only 29% (Lauzen, 2018). While it is wonderful to see some older women taking on powerful lead roles, the attention they receive is certainly not the norm.

There you have it: as women age in media and entertainment, if they appear at all, they are often portrayed as old, ugly, evil, less competent and less powerful, with little to accomplish, and they receive less respect than their male counterparts. American culture associates beauty with goodness, and therefore a woman’s value tends to be associated with her looks, favoring the young (Bazzini, McIntosh, Smith, Cook, & Harris, 1997). The time has come for all supervisors, recruiters and human resource departments to rethink their assumptions and check their unconscious bias about aging women as well. Women over 40 are a sizeable portion of the population; we are not invisible, and dammit, we are just as smart, capable and appealing as our male counterparts. America’s unemployment is low, skilled talent is a growing issue, and women over 40 represent an opportunity to fill the gap. Is your perception of that woman’s qualifications based on data, or is Cinderella’s evil stepmother influencing your opinion?


Anderson, H., & Daniels, M. (2016, April). Film dialogue from 2000 screenplays, broken down by gender and age. Retrieved from The Pudding:

Bazzini, D. G., McIntosh, W. D., Smith, S. M., Cook, M., & Harris, C. (1997). The aging woman in popular film: Underrepresented, unattractive, unfriendly, and unintelligent. Sex Roles: A Journal of Research, 36(7-8), 531-543. doi:10.1007/BF0276689

Labor force statistics from the current population survey. (2018, July 6). Retrieved from Bureau of Labor Statistics:

Lauzen, M. M. (2018). It’s a man’s (celluloid) world 2017. Retrieved September 1, 2018, from Center for the Study of Women in Television and Film:

Lauzen, M., & Dozier, D. (2005a). Maintaining the double standard: Portrayals of age and gender in popular films. Sex Roles, 52(7/8), 437-446. doi:10.1007/s11199-005-3710-1

Lauzen, M., & Dozier, D. (2005b). Recognition and respect revisited: Portrayals of age and gender in prime-time television. Mass Communication, 8(3), 241-256.

Signorielli, N. (2004). Aging on television: Messages relating to gender, race and occupation in prime time. Journal of Broadcasting & Electronic Media, 48(2), 279-301.

United States Census Bureau. (2017, June). Annual estimates of the resident population by sex, age, race, and Hispanic origin for the United States and states: April 1, 2010 to July 1, 2016. Retrieved from American Fact Finder:

Weber, L. (2017, October 10). After 50, women struggle to find a foothold at work. Retrieved from Wall Street Journal:




What Every Parent Needs To Know About ‘Gaming Disorder’


If you suspect that your child has a gaming disorder, it’s important to seek help for it.


by Korin Miller

Video game addiction is a term that has been used for years by parents and mental health professionals who believe that it’s a real disorder. Now, there’s more weight behind their argument: The World Health Organization (WHO) has included “gaming disorder” as a new mental health condition in the 11th edition of its International Classification of Diseases.

According to WHO, there are three major criteria for the diagnosis of gaming disorder: Gaming takes precedence over other activities so much that a person often stops doing other things, a person continues gaming even when it causes issues in their life or they feel that they can’t stop, and gaming causes significant distress and impairments in a person’s relationships with others, as well as their work or school life. If your child gets sucked into a game for a few days, but goes back to normal after that, they wouldn’t qualify: Instead, people must engage in this behavior for at least 12 months, WHO says.

It’s worth noting that WHO’s stance on gaming addiction is different from that of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), the handbook used by health professionals in the U.S. and other countries to help diagnose mental health disorders. The DSM-5 calls out “Internet Gaming Disorder” but says it’s a condition that warrants more clinical research and experience before it can be classified in the book as a formal disorder.

WHO says on its website that all people who participate in gaming should be aware that gaming disorder is a real condition, and that it’s important to be mindful of how often they play video games. However, it also points out that gaming disorder affects only a small proportion of people who game.

It’s only natural that the news would make you give your child’s gaming system the side-eye.

In general, parents should limit the amount of screen time their children have daily, and gaming is included in that, along with TV, computers, phones and tablet use, Gina Posner, MD, a pediatrician at MemorialCare Orange Coast Medical Center in Fountain Valley, Calif., tells Yahoo Lifestyle.

Screen time isn’t recommended at all for kids who are 18 months or younger, but for children older than that, up to age five, it’s generally recommended that they have no more than one hour of screen time, she says. For those who are six and up, it’s more at the parents’ discretion. “The maximum amount of screen time should be two hours a day, but less is always better,” Posner says.

Posner says that it’s important to set clear limits for your child when it comes to screen time and gaming. For example, say that your child has to do their homework first and/or get out and play for an hour before they’re allowed to game. And even then, make it clear that they’re only allowed to do so for a set period of time.

If your child starts fussing when they’re not allowed to be gaming all day, it’s a clear sign that you need to cut back, Posner says.

Treatment for gaming disorder is generally based in cognitive behavioral therapy, which would generally be done in two phases, Simon Rego, PsyD, chief psychologist at Montefiore Medical Center/Albert Einstein College of Medicine, tells Yahoo Lifestyle. The first is raising awareness for your child that their gaming is a problem, and looking for triggers and cues that could make the gaming habit better or worse. A mental health professional would also address problematic thoughts associated with either stopping playing or the thoughts that keep them gaming, he says.

The goal then is to step down the behavior from something that’s pathological to problematic, and then being able to manage it in a “reasonable way,” Rego says. People don’t necessarily have to quit gaming altogether, but they do need to learn to better manage it with parameters, like only gaming with friends during select times during the day vs. doing it at night alone in their room.

If you suspect that your child has a gaming disorder, it’s important to seek help for it.

Just know that this is still a new diagnosis and you may need to do some sleuthing to find someone who specializes in this kind of behavior.



The Dangers of Distracted Parenting

When it comes to children’s development, parents should worry less about kids’ screen time—and more about their own.

Source: The Dangers of Distracted Parenting


Smartphones have by now been implicated in so many crummy outcomes—car fatalities, sleep disturbances, empathy loss, relationship problems, failure to notice a clown on a unicycle—that it almost seems easier to list the things they don’t mess up than the things they do. Our society may be reaching peak criticism of digital devices.

Even so, emerging research suggests that a key problem remains underappreciated. It involves kids’ development, but it’s probably not what you think. More than screen-obsessed young children, we should be concerned about tuned-out parents.

Yes, parents now have more face time with their children than did almost any parents in history. Despite a dramatic increase in the percentage of women in the workforce, mothers today astoundingly spend more time caring for their children than mothers did in the 1960s. But the engagement between parent and child is increasingly low-quality, even ersatz. Parents are constantly present in their children’s lives physically, but they are less emotionally attuned. To be clear, I’m not unsympathetic to parents in this predicament. My own adult children like to joke that they wouldn’t have survived infancy if I’d had a smartphone in my clutches 25 years ago.

To argue that parents’ use of screens is an underappreciated problem isn’t to discount the direct risks screens pose to children: Substantial evidence suggests that many types of screen time (especially those involving fast-paced or violent imagery) are damaging to young brains. Today’s preschoolers spend more than four hours a day facing a screen. And, since 1970, the average age of onset of “regular” screen use has gone from 4 years to just four months.

Some of the newer interactive games kids play on phones or tablets may be more benign than watching TV (or YouTube), in that they better mimic children’s natural play behaviors. And, of course, many well-functioning adults survived a mind-numbing childhood spent watching a lot of cognitive garbage. (My mother—unusually for her time—prohibited Speed Racer and Gilligan’s Island on the grounds of insipidness. That I somehow managed to watch every single episode of each show scores of times has never been explained.) Still, no one really disputes the tremendous opportunity costs to young children who are plugged in to a screen: Time spent on devices is time not spent actively exploring the world and relating to other human beings.

Yet for all the talk about children’s screen time, surprisingly little attention is paid to screen use by parents themselves, who now suffer from what the technology expert Linda Stone more than 20 years ago called “continuous partial attention.” This condition is harming not just us, as Stone has argued; it is harming our children. The new parental-interaction style can interrupt an ancient emotional cueing system, whose hallmark is responsive communication, the basis of most human learning. We’re in uncharted territory.

Child-development experts have different names for the dyadic signaling system between adult and child, which builds the basic architecture of the brain. Jack P. Shonkoff, a pediatrician and the director of Harvard’s Center on the Developing Child, calls it the “serve and return” style of communication; the psychologists Kathy Hirsh-Pasek and Roberta Michnick Golinkoff describe a “conversational duet.” The vocal patterns parents everywhere tend to adopt during exchanges with infants and toddlers are marked by a higher-pitched tone, simplified grammar, and engaged, exaggerated enthusiasm. Though this talk is cloying to adult observers, babies can’t get enough of it. Not only that: One study showed that infants exposed to this interactive, emotionally responsive speech style at 11 months and 14 months knew twice as many words at age 2 as ones who weren’t exposed to it.

Child development is relational, which is why, in one experiment, nine-month-old babies who received a few hours of Mandarin instruction from a live human could isolate specific phonetic elements in the language while another group of babies who received the exact same instruction via video could not. According to Hirsh-Pasek, a professor at Temple University and a senior fellow at the Brookings Institution, more and more studies are confirming the importance of conversation. “Language is the single best predictor of school achievement,” she told me, “and the key to strong language skills are those back-and-forth fluent conversations between young children and adults.”

A problem therefore arises when the emotionally resonant adult–child cueing system so essential to early learning is interrupted—by a text, for example, or a quick check-in on Instagram. Anyone who’s been mowed down by a smartphone-impaired stroller operator can attest to the ubiquity of the phenomenon. One consequence of such scenarios has been noted by an economist who tracked a rise in children’s injuries as smartphones became prevalent. (AT&T rolled out smartphone service at different times in different places, thereby creating an intriguing natural experiment. Area by area, as smartphone adoption rose, childhood ER visits increased.) These findings attracted a decent bit of media attention to the physical dangers posed by distracted parenting, but we have been slower to reckon with its impact on children’s cognitive development. “Toddlers cannot learn when we break the flow of conversations by picking up our cellphones or looking at the text that whizzes by our screens,” Hirsh-Pasek said.

In the early 2010s, researchers in Boston surreptitiously observed 55 caregivers eating with one or more children in fast-food restaurants. Forty of the adults were absorbed with their phones to varying degrees, some almost entirely ignoring the children (the researchers found that typing and swiping were bigger culprits in this regard than taking a call). Unsurprisingly, many of the children began to make bids for attention, which were frequently ignored. A follow-up study brought 225 mothers and their approximately 6-year-old children into a familiar setting and videotaped their interactions as each parent and child were given foods to try. During the observation period, a quarter of the mothers spontaneously used their phone, and those who did initiated substantially fewer verbal and nonverbal interactions with their child.

Yet another rigorously designed experiment, this one conducted in the Philadelphia area by Hirsh-Pasek, Golinkoff, and Temple’s Jessa Reed, tested the impact of parental cellphone use on children’s language learning. Thirty-eight mothers and their 2-year-olds were brought into a room. The mothers were then told that they would need to teach their children two new words (blicking, which was to mean “bouncing,” and frepping, which was to mean “shaking”) and were given a phone so that investigators could contact them from another room. When the mothers were interrupted by a call, the children did not learn the word, but otherwise they did. In an ironic coda to this study, the researchers had to exclude seven mothers from the analysis, because they didn’t answer the phone, “failing to follow protocol.” Good for them!

It has never been easy to balance adults’ and children’s needs, much less their desires, and it’s naive to imagine that children could ever be the unwavering center of parental attention. Parents have always left kids to entertain themselves at times—“messing about in boats,” in a memorable phrase from The Wind in the Willows, or just lounging aimlessly in playpens. In some respects, 21st-century children’s screen time is not very different from the mother’s helpers every generation of adults has relied on to keep children occupied. When parents lack playpens, real or proverbial, mayhem is rarely far behind. Caroline Fraser’s recent biography of Laura Ingalls Wilder, the author of Little House on the Prairie, describes the exceptionally ad hoc parenting style of 19th-century frontier parents, who stashed babies on the open doors of ovens for warmth and otherwise left them vulnerable to “all manner of accidents as their mothers tried to cope with competing responsibilities.” Wilder herself recounted a variety of near-calamities with her young daughter, Rose; at one point she looked up from her chores to see a pair of riding ponies leaping over the toddler’s head.

Occasional parental inattention is not catastrophic (and may even build resilience), but chronic distraction is another story. Smartphone use has been associated with a familiar sign of addiction: Distracted adults grow irritable when their phone use is interrupted; they not only miss emotional cues but actually misread them. A tuned-out parent may be quicker to anger than an engaged one, assuming that a child is trying to be manipulative when, in reality, she just wants attention. Short, deliberate separations can of course be harmless, even healthy, for parent and child alike (especially as children get older and require more independence). But that sort of separation is different from the inattention that occurs when a parent is with a child but communicating through his or her nonengagement that the child is less valuable than an email. A mother telling kids to go out and play, a father saying he needs to concentrate on a chore for the next half hour—these are entirely reasonable responses to the competing demands of adult life. What’s going on today, however, is the rise of unpredictable care, governed by the beeps and enticements of smartphones. We seem to have stumbled into the worst model of parenting imaginable—always present physically, thereby blocking children’s autonomy, yet only fitfully present emotionally.

Fixing the problem won’t be easy, especially given that it is compounded by dramatic changes in education. More young children than ever (about two-thirds of 4-year-olds) are in some form of institutional care, and recent trends in early-childhood education have filled many of their classrooms with highly scripted lessons and dull, one-sided “teacher talk.” In such environments, children have few opportunities for spontaneous conversation.

One piece of good news is that young children are prewired to get what they need from adults, as most of us discover the first time our diverted gaze is jerked back by a pair of pudgy, reproaching hands. Young children will do a lot to get a distracted adult’s attention, and if we don’t change our behavior, they will attempt to do it for us; we can expect to see a lot more tantrums as today’s toddlers age into school. But eventually, children may give up. It takes two to tango, and studies from Romanian orphanages showed the world that there are limits to what a baby brain can do without a willing dance partner. The truth is, we don’t really know how much our kids will suffer when we fail to engage.

Of course, adults are also suffering from the current arrangement. Many have built their daily life around the miserable premise that they can always be on—always working, always parenting, always available to their spouse and their own parents and anyone else who might need them, while also staying on top of the news, while also remembering, on the walk to the car, to order more toilet paper from Amazon. They are stuck in the digital equivalent of the spin cycle.

Under the circumstances, it’s easier to focus our anxieties on our children’s screen time than to pack up our own devices. I understand this tendency all too well. In addition to my roles as a mother and a foster parent, I am the maternal guardian of a middle-aged, overweight dachshund. Being middle-aged and overweight myself, I’d much rather obsess over my dog’s caloric intake, restricting him to a grim diet of fibrous kibble, than address my own food regimen and relinquish (heaven forbid) my morning cinnamon bun. Psychologically speaking, this is a classic case of projection—the defensive displacement of one’s failings onto relatively blameless others. Where screen time is concerned, most of us need to do a lot less projecting.

If we can get a grip on our “technoference,” as some psychologists have called it, we are likely to find that we can do much more for our children simply by doing less—regardless of the quality of their schooling and quite apart from the number of hours we devote to them. Parents should give themselves permission to back off from the suffocating pressure to be all things to all people. Put your kid in a playpen, already! Ditch that soccer-game appearance if you feel like it. Your kid will be fine. But when you are with your child, put down your damned phone.


ERIKA CHRISTAKIS is the author of The Importance of Being Little: What Young Children Really Need From Grownups.