15 brands millennials loved that ‘kids these days’ avoid

consumer psychology research

Gen Zs have ditched Lacoste for Nike  Sophia Grace/YouTube

Millennials loved Abercrombie and Facebook, but Gen Zs tend to wear fast fashion and athleisure.

Source: 15 brands millennials loved that ‘kids these days’ avoid

By Rachel Premack, Business Insider

  • Millennials loved preppy clothes and Facebook. 
  • But Gen Zs tend to wear fast fashion and athleisure. They’ve also dumped Facebook for Snapchat and Instagram.
  • Read on to see the 15 brands millennials loved as teens that haven’t captured today’s youth.  

When hitting the mall, millennials leaned towards preppy brands like Ralph Lauren.

But today’s teens are all about streetwear, athleisure, and fast fashion — Nike, Adidas, and Forever 21 dominate the Gen Z shopping cart.

Using insights from asset management firm Piper Jaffray’s semi-annual Taking Stock With Teens survey and Bobby Calise, VP of brand tracking at the youth insights firm Ypulse, Business Insider curated a list of 15…


Why ceasing to be creative is a mistake

Photo by gustavo centurion on Unsplash

Even drawing stick figures has its benefits.

Source: Why ceasing to be creative is a mistake

By

  • Many of us stop making art at a young age, convinced, perhaps, that we just don’t have the talent for it.
  • This belief, however, may be wrong, and the benefits that producing art can bring aren’t contingent on talent.
  • Is creating art an activity that all of us should pursue? Can artistic skill be taught?

When we think of life skills, we usually think of things like learning to cook, becoming financially literate, learning to de-escalate conflict, or cultivating our emotional intelligence. We don’t typically think of becoming better artists as a life skill. Indeed, artistic talent is seen as something innate — some people are artists, and some people are not.

However, for those of us who profess to have no artistic talent whatsoever, it may be that cultivating this skill is even more important than for those who have, allegedly, “innate” artistic talent. So, is creating art a life skill? What kind of benefits can it bring? And, crucially, can it be taught, or is the act of creating something limited only to the lucky few?

Our innate love of art

In a cave in Indonesia, there are outlines of human hands traced in paint. To date, these tracings are the oldest known examples of art, dating back nearly 40,000 years. Human beings don’t consistently perform an activity for 40,000 years unless it’s hardwired into us, and making art is something that is as human as communicating, laughing, or breathing air.

In an April interview with the Harvard Gazette, Dr. Ellen Winner, a psychologist who has studied art, said:

“My best guess is that art itself is not a direct product of natural selection, but is a byproduct of our bigger brains — which themselves evolved for survival reasons. Art is just something we cannot help but do. While we may not need art to survive, our lives would be entirely different without it. The arts are a way of making sense of and understanding ourselves and others, a form of meaning-making just as important as are the sciences.”

A sense of aesthetic appreciation is so innate in humans that we readily distinguish abstract art created by a master (those paintings with, say, a few splotches of color that look like anybody could do them) from artificially generated copies or abstract works created by children and animals, and we prefer the master’s work.

So, one big argument for pursuing your artistic capability is simply because it’s a natural, human thing to do. The odds are good you’re going to make something creative at some point, so why not develop that ability further? This in and of itself doesn’t serve as a particularly compelling reason, but there are plenty of benefits that producing art can bring.

The physical and mental benefits of making art


Research has shown that producing art has a positive impact on human psychology. One study compared two groups that spent 10 weeks doing an art-related activity. The first group produced visual art in a class, while the second spent time cognitively evaluating artwork at a museum. After the 10-week intervention, the researchers compared the groups using an MRI.

They found that the art production group had significantly more connections in a critical part of the brain called the default mode network. The default mode network is associated with a variety of functions, such as reflecting on one’s emotional state, empathy, and imagining the future. Not only was this important part of the brain strengthened by producing art, but the participants in the art-production group also became better able to cope with stress.

Other research has shown that producing visual art diminishes negative emotions, increases positive ones, and reduces depression, stress, and anxiety. There appears to be a significant connection between producing visual art and physical health as well: visual art production has been linked with reduced levels of cortisol, the hormone associated with stress.

In older adults, participating in art classes improved their perception of their health and made them more active. They also visited their doctors less often and required less medication.

Can art be taught?

It’s clear that producing art can improve cognitive function and physical health, but for those who don’t believe they have artistic talent, these findings may just represent a missed opportunity. Some believe that art can’t be taught. First, it’s important to remember that the studies referenced previously randomly assigned people to produce artwork; none of those individuals were selected for any innate artistic talent, and so the benefits found by those studies can be acquired by anybody.

Many artists believe that while anybody can be taught art to some extent, artistic geniuses are born rather than made. “There is no question in my mind that artists are born,” says Nancy Locke, a professor of art history at Penn State. But, she argues, it’s crucial to cultivate this innate talent.

Research backs this up to some extent. In the Big Five personality theory, the trait of “openness to experience” — or the trait that predicts whether an individual enjoys getting out of their comfort zone and seeking out unfamiliar experiences — has been shown to be associated with preferences for artistic activities. Psychologists believe that personality traits such as openness to experience are a combination of both genetics and the environment, so it’s fair to say that artistic talent is indeed innate to some extent.

What does this mean for the aspiring artist? The scientific literature referenced above suggests that the many benefits of art production can be gained simply by practicing art, regardless of talent. And, since even those with innate talents can’t go very far in art without practice, it may be the case that you possess such talent but have never cultivated it.

The cognitive benefits of creating art aren’t even contingent on skill. The next time you have to attend a lecture or study something, allow yourself to doodle in the margins: Studies have shown that you’ll be 29 percent more likely to recall information and less likely to daydream.

Increasingly, the idea that producing art is some mysterious, unknowable process is diminishing. Instead, creating art is more akin to the visual analog of writing; everybody needs to write a little in the course of their day, not just great writers. Similarly, we should acknowledge that everybody needs to create a bit of art every day, whether for greater recall, improved cognition, reduced stress, or simply for the natural pleasure of creating something.

Americans Don’t Read… and That’s Affecting Our Elections

“How can any Man judge, unless his Mind has been opened and enlarged by Reading?” – John Adams

Source: Americans Don’t Read… and That’s Affecting Our Elections

By  Annie Holmquist

In 2013, the Nation’s Report Card showed that only 38% of high school seniors were proficient in reading. With scores like that, the U.S. isn’t likely to earn the “most literate country” award any time soon.

So what is America’s international literacy ranking? According to The Washington Post, the U.S. places seventh behind Nordic countries such as Finland, Norway, and Sweden. Such a score is obtained by looking at newspaper circulation and readership, library availability, education access, reading scores, and computer usage in each nation.

The Washington Post bemoans the fact that the leading nation of the free world ranks so low in such an important area. And well it should, particularly as the following U.S. literacy statistics are even more alarming:

  • 14% of adults can’t read.
  • Only 13% of adults can read at a proficient level.
  • 28% of adults didn’t read a book in the last year.
  • 50% of adults can’t read a book written at an 8th grade level.

But so what, right? In our enlightened digital age, what harm does it really bring if American literacy is tanking?

A lot of harm, according to John Adams, particularly when it comes to elections. In 1761, he noted:

“The very Ground of our Liberties, is the freedom of Elections. Every Man has in Politicks as well as Religion, a Right to think and speak and Act for himself. No man either King or Subject, Clergyman or Layman has any Right to dictate to me the Person I shall choose for my Legislator and Ruler. I must judge for myself, but how can I judge, how can any Man judge, unless his Mind has been opened and enlarged by Reading. A Man who can read, will find in his Bible, in the common sermon Books that common People have by them and even in the Almanack and News Papers, Rules and observations, that will enlarge his Range of Thought, and enable him the better to judge who has and who has not that Integrity of Heart, and that Compass of Knowledge and Understanding, which form the Statesman.”

Considering the state of the modern election circus, would you say it’s high time for Americans to step up their literacy game?


Andrew Sullivan: My Distraction Sickness — and Yours

Based on: Wanderer Above the Sea of Fog, by Caspar David Friedrich (1818). Illustrations by Kim Dong-kyu

An endless bombardment of news and gossip and images has rendered us manic information addicts. It broke me. It might break you, too.

Source: Andrew Sullivan: My Distraction Sickness — and Yours

By

I was sitting in a large meditation hall in a converted novitiate in central Massachusetts when I reached into my pocket for my iPhone. A woman in the front of the room gamely held a basket in front of her, beaming beneficently, like a priest with a collection plate. I duly surrendered my little device, only to feel a sudden pang of panic on my way back to my seat. If it hadn’t been for everyone staring at me, I might have turned around immediately and asked for it back. But I didn’t. I knew why I’d come here.

A year before, like many addicts, I had sensed a personal crash coming. For a decade and a half, I’d been a web obsessive, publishing blog posts multiple times a day, seven days a week, and ultimately corralling a team that curated the web every 20 minutes during peak hours. Each morning began with a full immersion in the stream of internet consciousness and news, jumping from site to site, tweet to tweet, breaking news story to hottest take, scanning countless images and videos, catching up with multiple memes. Throughout the day, I’d cough up an insight or an argument or a joke about what had just occurred or what was happening right now. And at times, as events took over, I’d spend weeks manically grabbing every tiny scrap of a developing story in order to fuse them into a narrative in real time. I was in an unending dialogue with readers who were caviling, praising, booing, correcting. My brain had never been so occupied so insistently by so many different subjects and in so public a way for so long.

I was, in other words, a very early adopter of what we might now call living-in-the-web. And as the years went by, I realized I was no longer alone. Facebook soon gave everyone the equivalent of their own blog and their own audience. More and more people got a smartphone — connecting them instantly to a deluge of febrile content, forcing them to cull and absorb and assimilate the online torrent as relentlessly as I had once. Twitter emerged as a form of instant blogging of microthoughts. Users were as addicted to the feedback as I had long been — and even more prolific. Then the apps descended, like the rain, to inundate what was left of our free time. It was ubiquitous now, this virtual living, this never-stopping, this always-updating. I remember when I decided to raise the ante on my blog in 2007 and update every half-hour or so, and my editor looked at me as if I were insane. But the insanity was now banality; the once-unimaginable pace of the professional blogger was now the default for everyone.

If the internet killed you, I used to joke, then I would be the first to find out. Years later, the joke was running thin. In the last year of my blogging life, my health began to give out. Four bronchial infections in 12 months had become progressively harder to kick. Vacations, such as they were, had become mere opportunities for sleep. My dreams were filled with the snippets of code I used each day to update the site. My friendships had atrophied as my time away from the web dwindled. My doctor, dispensing one more course of antibiotics, finally laid it on the line: “Did you really survive HIV to die of the web?”

But the rewards were many: an audience of up to 100,000 people a day; a new-media business that was actually profitable; a constant stream of things to annoy, enlighten, or infuriate me; a niche in the nerve center of the exploding global conversation; and a way to measure success — in big and beautiful data — that was a constant dopamine bath for the writerly ego. If you had to reinvent yourself as a writer in the internet age, I reassured myself, then I was ahead of the curve. The problem was that I hadn’t been able to reinvent myself as a human being.

I tried reading books, but that skill now began to elude me. After a couple of pages, my fingers twitched for a keyboard. I tried meditation, but my mind bucked and bridled as I tried to still it. I got a steady workout routine, and it gave me the only relief I could measure for an hour or so a day. But over time in this pervasive virtual world, the online clamor grew louder and louder. Although I spent hours each day, alone and silent, attached to a laptop, it felt as if I were in a constant cacophonous crowd of words and images, sounds and ideas, emotions and tirades — a wind tunnel of deafening, deadening noise. So much of it was irresistible, as I fully understood. So much of the technology was irreversible, as I also knew. But I’d begun to fear that this new way of living was actually becoming a way of not-living.

By the last few months, I realized I had been engaging — like most addicts — in a form of denial. I’d long treated my online life as a supplement to my real life, an add-on, as it were. Yes, I spent many hours communicating with others as a disembodied voice, but my real life and body were still here. But then I began to realize, as my health and happiness deteriorated, that this was not a both-and kind of situation. It was either-or. Every hour I spent online was not spent in the physical world. Every minute I was engrossed in a virtual interaction I was not involved in a human encounter. Every second absorbed in some trivia was a second less for any form of reflection, or calm, or spirituality. “Multitasking” was a mirage. This was a zero-sum question. I either lived as a voice online or I lived as a human being in the world that humans had lived in since the beginning of time.

And so I decided, after 15 years, to live in reality.

Based on: Le déjeuner sur l’herbe, by Edouard Manet (1863). Illustration: Kim Dong-kyu

Since the invention of the printing press, every new revolution in information technology has prompted apocalyptic fears. From the panic that easy access to the vernacular English Bible would destroy Christian orthodoxy all the way to the revulsion, in the 1950s, at the barbaric young medium of television, cultural critics have moaned and wailed at every turn. Each shift represented a further fracturing of attention — continuing up to the previously unimaginable kaleidoscope of cable TV in the late-20th century and the now infinite, infinitely multiplying spaces of the web. And yet society has always managed to adapt and adjust, without obvious damage, and with some more-than-obvious progress. So it’s perhaps too easy to view this new era of mass distraction as something newly dystopian.

But it sure does represent a huge leap from even the very recent past. The data bewilder. Every single minute on the planet, YouTube users upload 400 hours of video and Tinder users swipe profiles over a million times. Each day, there are literally billions of Facebook “likes.” Online outlets now publish exponentially more material than they once did, churning out articles at a rapid-fire pace, adding new details to the news every few minutes. Blogs, Facebook feeds, Tumblr accounts, tweets, and propaganda outlets repurpose, borrow, and add topspin to the same output.

We absorb this “content” (as writing or video or photography is now called) no longer primarily by buying a magazine or paper, by bookmarking our favorite website, or by actively choosing to read or watch. We are instead guided to these info-nuggets by myriad little interruptions on social media, all cascading at us with individually tailored relevance and accuracy. Do not flatter yourself in thinking that you have much control over which temptations you click on. Silicon Valley’s technologists and their ever-perfecting algorithms have discovered the form of bait that will have you jumping like a witless minnow. No information technology ever had this depth of knowledge of its consumers — or greater capacity to tweak their synapses to keep them engaged.

And the engagement never ends. Not long ago, surfing the web, however addictive, was a stationary activity. At your desk at work, or at home on your laptop, you disappeared down a rabbit hole of links and resurfaced minutes (or hours) later to reencounter the world. But the smartphone then went and made the rabbit hole portable, inviting us to get lost in it anywhere, at any time, whatever else we might be doing. Information soon penetrated every waking moment of our lives.

And it did so with staggering swiftness. We almost forget that ten years ago, there were no smartphones, and as recently as 2011, only a third of Americans owned one. Now nearly two-thirds do. That figure reaches 85 percent when you’re only counting young adults. And 46 percent of Americans told Pew surveyors last year a simple but remarkable thing: They could not live without one. The device went from unknown to indispensable in less than a decade. The handful of spaces where it was once impossible to be connected — the airplane, the subway, the wilderness — are dwindling fast. Even hiker backpacks now come fitted with battery power for smartphones. Perhaps the only “safe space” that still exists is the shower.

Am I exaggerating? A small but detailed 2015 study of young adults found that participants were using their phones five hours a day, at 85 separate times. Most of these interactions were for less than 30 seconds, but they add up. Just as revealing: The users weren’t fully aware of how addicted they were. They thought they picked up their phones half as much as they actually did. But whether they were aware of it or not, a new technology had seized control of around one-third of these young adults’ waking hours.

The interruptions often feel pleasant, of course, because they are usually the work of your friends. Distractions arrive in your brain connected to people you know (or think you know), which is the genius of social, peer-to-peer media. Since our earliest evolution, humans have been unusually passionate about gossip, which some attribute to the need to stay abreast of news among friends and family as our social networks expanded. We were hooked on information as eagerly as sugar. And give us access to gossip the way modernity has given us access to sugar and we have an uncontrollable impulse to binge. A regular teen Snapchat user, as the Atlantic recently noted, can have exchanged anywhere from 10,000 to as many as 400,000 snaps with friends. As the snaps accumulate, they generate publicly displayed scores that bestow the allure of popularity and social status. This, evolutionary psychologists will attest, is fatal. When provided a constant source of information and news and gossip about each other — routed through our social networks — we are close to helpless.

Just look around you — at the people crouched over their phones as they walk the streets, or drive their cars, or walk their dogs, or play with their children. Observe yourself in line for coffee, or in a quick work break, or driving, or even just going to the bathroom. Visit an airport and see the sea of craned necks and dead eyes. We have gone from looking up and around to constantly looking down.

If an alien had visited America just five years ago, then returned today, wouldn’t this be its immediate observation? That this species has developed an extraordinary new habit — and, everywhere you look, lives constantly in its thrall?

I arrived at the meditation retreat center a few months after I’d quit the web, throwing my life and career up in the air. I figured it would be the ultimate detox. And I wasn’t wrong. After a few hours of silence, you tend to expect some kind of disturbance, some flurry to catch your interest. And then it never comes. The quiet deepens into an enveloping default. No one spoke; no one even looked another person in the eye — what some Buddhists call “noble silence.” The day was scheduled down to the minute, so that almost all our time was spent in silent meditation with our eyes closed, or in slow-walking meditation on the marked trails of the forest, or in communal, unspeaking meals. The only words I heard or read for ten days were in three counseling sessions, two guided meditations, and nightly talks on mindfulness.

I’d spent the previous nine months honing my meditation practice, but, in this crowd, I was a novice and a tourist. (Everyone around me was attending six-week or three-month sessions.) The silence, it became apparent, was an integral part of these people’s lives — and their simple manner of movement, the way they glided rather than walked, the open expressions on their faces, all fascinated me. What were they experiencing, if not insane levels of boredom?

And how did their calm somehow magnify itself when I was surrounded by them every day? Usually, when you add people to a room, the noise grows; here, it was the silence that seemed to compound itself. Attached to my phone, I had been accompanied for so long by verbal and visual noise, by an endless bombardment of words and images, and yet I felt curiously isolated. Among these meditators, I was alone in silence and darkness, yet I felt almost at one with them. My breathing slowed. My brain settled. My body became much more available to me. I could feel it digesting and sniffing, itching and pulsating. It was as if my brain were moving away from the abstract and the distant toward the tangible and the near.

Things that usually escaped me began to intrigue me. On a meditative walk through the forest on my second day, I began to notice not just the quality of the autumnal light through the leaves but the splotchy multicolors of the newly fallen leaves, the texture of the lichen on the bark, the way in which tree roots had come to entangle and overcome old stone walls. The immediate impulse — to grab my phone and photograph it — was foiled by an empty pocket. So I simply looked. At one point, I got lost and had to rely on my sense of direction to find my way back. I heard birdsong for the first time in years. Well, of course, I had always heard it, but it had been so long since I listened.

My goal was to keep thought in its place. “Remember,” my friend Sam Harris, an atheist meditator, had told me before I left, “if you’re suffering, you’re thinking.” The task was not to silence everything within my addled brain, but to introduce it to quiet, to perspective, to the fallow spaces I had once known where the mind and soul replenish.

Soon enough, the world of “the news,” and the raging primary campaign, disappeared from my consciousness. My mind drifted to a trancelike documentary I had watched years before, Philip Gröning’s Into Great Silence, on an ancient Carthusian monastery and silent monastic order in the Alps. In one scene, a novice monk is tending his plot of garden. As he moves deliberately from one task to the next, he seems almost in another dimension. He is walking from one trench to another, but never appears focused on actually getting anywhere. He seems to float, or mindfully glide, from one place to the next.

He had escaped, it seemed to me, what we moderns understand by time. There was no race against it; no fear of wasting it; no avoidance of the tedium that most of us would recoil from. And as I watched my fellow meditators walk around, eyes open yet unavailable to me, I felt the slowing of the ticking clock, the unwinding of the pace that has all of us in modernity on a treadmill till death. I felt a trace of a freedom all humans used to know and that our culture seems intent, pell-mell, on forgetting.

Based on: Hotel Room, by Edward Hopper (1931). Illustration: Kim Dong-kyu

We all understand the joys of our always-wired world — the connections, the validations, the laughs, the porn, the info. I don’t want to deny any of them here. But we are only beginning to get our minds around the costs, if we are even prepared to accept that there are costs. For the subtle snare of this new technology is that it lulls us into the belief that there are no downsides. It’s all just more of everything. Online life is simply layered on top of offline life. We can meet in person and text beforehand. We can eat together while checking our feeds. We can transform life into what the writer Sherry Turkle refers to as “life-mix.”

But of course, as I had discovered in my blogging years, the family that is eating together while simultaneously on their phones is not actually together. They are, in Turkle’s formulation, “alone together.” You are where your attention is. If you’re watching a football game with your son while also texting a friend, you’re not fully with your child — and he knows it. Truly being with another person means being experientially with them, picking up countless tiny signals from the eyes and voice and body language and context, and reacting, often unconsciously, to every nuance. These are our deepest social skills, which have been honed through the aeons. They are what make us distinctively human.

By rapidly substituting virtual reality for reality, we are diminishing the scope of this interaction even as we multiply the number of people with whom we interact. We remove or drastically filter all the information we might get by being with another person. We reduce them to some outlines — a Facebook “friend,” an Instagram photo, a text message — in a controlled and sequestered world that exists largely free of the sudden eruptions or encumbrances of actual human interaction. We become each other’s “contacts,” efficient shadows of ourselves.

Think of how rarely you now use the phone to speak to someone. A text is far easier, quicker, less burdensome. A phone call could take longer; it could force you to encounter that person’s idiosyncrasies or digressions or unexpected emotional needs. Remember when you left voice-mail messages — or actually listened to one? Emojis now suffice. Or take the difference between trying to seduce someone at a bar and flipping through Tinder profiles to find a better match. One is deeply inefficient and requires spending (possibly wasting) considerable time; the other turns dozens and dozens of humans into clothes on an endlessly extending rack.

No wonder we prefer the apps. An entire universe of intimate responses is flattened to a single, distant swipe. We hide our vulnerabilities, airbrushing our flaws and quirks; we project our fantasies onto the images before us. Rejection still stings — but less when a new virtual match beckons on the horizon. We have made sex even safer yet, having sapped it of serendipity and risk and often of physical beings altogether. The amount of time we spend cruising vastly outweighs the time we may ever get to spend with the objects of our desire.

Our oldest human skills atrophy. GPS, for example, is a godsend for finding our way around places we don’t know. But, as Nicholas Carr has noted, it has led to our not even seeing, let alone remembering, the details of our environment, to our not developing the accumulated memories that give us a sense of place and control over what we once called ordinary life. The writer Matthew Crawford has examined how automation and online living have sharply eroded the number of people physically making things, using their own hands and eyes and bodies to craft, say, a wooden chair or a piece of clothing or, in one of Crawford’s more engrossing case studies, a pipe organ. We became who we are as a species by mastering tools, making them a living, evolving extension of our whole bodies and minds. What first seems tedious and repetitive develops into a skill — and a skill is what gives us humans self-esteem and mutual respect.

Yes, online and automated life is more efficient, it makes more economic sense, it ends monotony and “wasted” time in the achievement of practical goals. But it denies us the deep satisfaction and pride of workmanship that comes with accomplishing daily tasks well, a denial perhaps felt most acutely by those for whom such tasks are also a livelihood — and an identity.

Indeed, the modest mastery of our practical lives is what fulfilled us for tens of thousands of years — until technology and capitalism decided it was entirely dispensable. If we are to figure out why despair has spread so rapidly in so many left-behind communities, the atrophying of the practical vocations of the past — and the meaning they gave to people’s lives — seems as useful a place to explore as economic indices.

So are the bonds we used to form in our everyday interactions — the nods and pleasantries of neighbors, the daily facial recognition in the mall or the street. Here too the allure of virtual interaction has helped decimate the space for actual community. When we enter a coffee shop in which everyone is engrossed in their private online worlds, we respond by creating one of our own. When someone next to you answers the phone and starts talking loudly as if you didn’t exist, you realize that, in her private zone, you don’t. And slowly, the whole concept of a public space — where we meet and engage and learn from our fellow citizens — evaporates. Turkle describes one of the many small consequences in an American city: “Kara, in her 50s, feels that life in her hometown of Portland, Maine, has emptied out: ‘Sometimes I walk down the street, and I’m the only person not plugged in … No one is where they are. They’re talking to someone miles away. I miss them.’ ”

Has our enslavement to dopamine — to the instant hits of validation that come with a well-crafted tweet or Snapchat streak — made us happier? I suspect it has simply made us less unhappy, or rather less aware of our unhappiness, and that our phones are merely new and powerful antidepressants of a non-pharmaceutical variety. In an essay on contemplation, the Christian writer Alan Jacobs recently commended the comedian Louis C.K. for withholding smartphones from his children. On the Conan O’Brien show, C.K. explained why: “You need to build an ability to just be yourself and not be doing something. That’s what the phones are taking away,” he said. “Underneath in your life there’s that thing … that forever empty … that knowledge that it’s all for nothing and you’re alone … That’s why we text and drive … because we don’t want to be alone for a second.”

He recalled a moment driving his car when a Bruce Springsteen song came on the radio. It triggered a sudden, unexpected surge of sadness. He instinctively went to pick up his phone and text as many friends as possible. Then he changed his mind, left his phone where it was, and pulled over to the side of the road to weep. He allowed himself for once to be alone with his feelings, to be overwhelmed by them, to experience them with no instant distraction, no digital assist. And then he was able to discover, in a manner now remote from most of us, the relief of crawling out of the hole of misery by himself. For if there is no dark night of the soul anymore that isn’t lit with the flicker of the screen, then there is no morning of hopefulness either. As he said of the distracted modern world we now live in: “You never feel completely sad or completely happy, you just feel … kinda satisfied with your products. And then you die. So that’s why I don’t want to get a phone for my kids.”

The early days of the retreat passed by, the novelty slowly ceding to a reckoning that my meditation skills were now being tested more aggressively. Thoughts began to bubble up; memories clouded the present; the silent sessions began to be edged by a little anxiety.

And then, unexpectedly, on the third day, as I was walking through the forest, I became overwhelmed. I’m still not sure what triggered it, but my best guess is that the shady, quiet woodlands, with brooks trickling their way down hillsides and birds flitting through the moist air, summoned memories of my childhood. I was a lonely boy who spent many hours outside in the copses and woodlands of my native Sussex, in England. I had explored this landscape with friends, but also alone — playing imaginary scenarios in my head, creating little nooks where I could hang and sometimes read, learning every little pathway through the woods and marking each flower or weed or fungus that I stumbled on. But I was also escaping a home where my mother had collapsed with bipolar disorder after the birth of my younger brother and had never really recovered. She was in and out of hospitals for much of my youth and adolescence, and her condition made it hard for her to hide her pain and suffering from her sensitive oldest son.

I absorbed a lot of her agony, I came to realize later, hearing her screams of frustration and misery in constant, terrifying fights with my father, and never knowing how to stop it or to help. I remember watching her dissolve in tears in the car picking me up from elementary school at the thought of returning to a home she clearly dreaded, or holding her as she poured her heart out to me, through sobs and whispers, about her dead-end life in a small town where she was utterly dependent on a spouse. She was taken away from me several times in my childhood, starting when I was 4, and even now I can recall the corridors and rooms of the institutions she was treated in when we went to visit.

I knew the scar tissue from this formative trauma was still in my soul. I had spent two decades in therapy, untangling and exploring it, learning how it had made intimacy with others so frightening, how it had made my own spasms of adolescent depression even more acute, how living with that kind of pain from the most powerful source of love in my life had made me the profoundly broken vessel I am. But I had never felt it so vividly since the very years it had first engulfed and defined me. It was as if, having slowly and progressively removed every distraction from my life, I was suddenly faced with what I had been distracting myself from. Resting for a moment against the trunk of a tree, I stopped, and suddenly found myself bent over, convulsed with the newly present pain, sobbing.

And this time, even as I eventually made it back to the meditation hall, there was no relief. I couldn’t call my husband or a friend and talk it over. I couldn’t check my email or refresh my Instagram or text someone who might share the pain. I couldn’t ask one of my fellows if they had experienced something similar. I waited for the mood to lift, but it deepened. Hours went by in silence as my heart beat anxiously and my mind reeled.

I decided I would get some distance by trying to describe what I was feeling. The two words “extreme suffering” won the naming contest in my head. And when I had my 15-minute counseling session with my assigned counselor a day later, the words just kept tumbling out. After my panicked, anguished confession, he looked at me, one eyebrow raised, with a beatific half-smile. “Oh, that’s perfectly normal,” he deadpanned warmly. “Don’t worry. Be patient. It will resolve itself.” And in time, it did. Over the next day, the feelings began to ebb, my meditation improved, the sadness shifted into a kind of calm and rest. I felt other things from my childhood — the beauty of the forests, the joy of friends, the support of my sister, the love of my maternal grandmother. Yes, I prayed, and prayed for relief. But this lifting did not feel like divine intervention, let alone a result of effort, but more like a natural process of revisiting and healing and recovering. It felt like an ancient, long-buried gift.

In his survey of how the modern West lost widespread religious practice, A Secular Age, the philosopher Charles Taylor used a term to describe the way we think of our societies. He called it a “social imaginary” — a set of interlocking beliefs and practices that can undermine or subtly marginalize other kinds of belief. We didn’t go from faith to secularism in one fell swoop, he argues. Certain ideas and practices made others not so much false as less vibrant or relevant. And so modernity slowly weakened spirituality, by design and accident, in favor of commerce; it downplayed silence and mere being in favor of noise and constant action. The reason we live in a culture increasingly without faith is not because science has somehow disproved the unprovable, but because the white noise of secularism has removed the very stillness in which it might endure or be reborn.

The English Reformation began, one recalls, with an assault on the monasteries, and what silence the Protestants didn’t banish the philosophers of the Enlightenment mocked. Gibbon and Voltaire defined the Enlightenment’s posture toward the monkish: from condescension to outright contempt. The roar and disruption of the Industrial Revolution violated what quiet still remained until modern capitalism made business central to our culture and the ever-more efficient meeting of needs and wants our primary collective goal. We became a civilization of getting things done — with the development of America, in some ways, as its crowning achievement. Silence in modernity became, over the centuries, an anachronism, even a symbol of the useless superstitions we had left behind. The smartphone revolution of the past decade can be seen in some ways simply as the final twist of this ratchet, in which those few remaining redoubts of quiet — the tiny cracks of inactivity in our lives — are being methodically filled with more stimulus and noise.

And yet our need for quiet has never fully gone away, because our practical achievements, however spectacular, never quite fulfill us. They are always giving way to new wants and needs, always requiring updating or repairing, always falling short. The mania of our online lives reveals this: We keep swiping and swiping because we are never fully satisfied. The late British philosopher Michael Oakeshott starkly called this truth “the deadliness of doing.” There seems no end to this paradox of practical life, and no way out, just an infinite succession of efforts, all doomed ultimately to fail.

Except, of course, there is the option of a spiritual reconciliation to this futility, an attempt to transcend the unending cycle of impermanent human achievement. There is a recognition that beyond mere doing, there is also being; that at the end of life, there is also the great silence of death with which we must eventually make our peace. From the moment I entered a church in my childhood, I understood that this place was different because it was so quiet. The Mass itself was full of silences — those liturgical pauses that would never do in a theater, those minutes of quiet after communion when we were encouraged to get lost in prayer, those liturgical spaces that seemed to insist that we are in no hurry here. And this silence demarcated what we once understood as the sacred, marking a space beyond the secular world of noise and business and shopping.

The only place like it was the library, and the silence there also pointed to something beyond it — to the learning that required time and patience, to the pursuit of truth that left practical life behind. Like the moment of silence we sometimes honor in the wake of a tragedy, the act of not speaking signals that we are responding to something deeper than the quotidian, something more profound than words can fully express. I vividly recall when the AIDS Memorial Quilt was first laid out on the Mall in Washington in 1987. A huge crowd had gathered, drifts of hundreds of chattering, animated people walking in waves onto the scene. But the closer they got, and the more they absorbed the landscape of unimaginably raw grief, their voices petered out, and a great emptiness filled the air. This is different, the silence seemed to say. This is not our ordinary life.

Most civilizations, including our own, have understood this in the past. Millennia ago, as the historian Diarmaid MacCulloch has argued, the unnameable, often inscrutably silent God of the Jewish Scriptures intersected with Plato’s concept of a divinity so beyond human understanding and imperfection that no words could accurately describe it. The hidden God of the Jewish and Christian Scriptures spoke often by not speaking. And Jesus, like the Buddha, revealed as much by his silences as by his words. He was a preacher who yet wandered for 40 days in the desert; a prisoner who refused to defend himself at his trial. At the converted novitiate at the retreat, they had left two stained-glass windows depicting Jesus. In one, he is in the Garden of Gethsemane, sweating blood in terror, alone before his execution. In the other, he is seated at the Last Supper, with the disciple John the Beloved resting his head on Jesus’s chest. He is speaking in neither.

That Judeo-Christian tradition recognized a critical distinction — and tension — between noise and silence, between getting through the day and getting a grip on one’s whole life. The Sabbath — the Jewish institution co-opted by Christianity — was a collective imposition of relative silence, a moment of calm to reflect on our lives under the light of eternity. It helped define much of Western public life once a week for centuries — only to dissipate, with scarcely a passing regret, into the commercial cacophony of the past couple of decades. It reflected a now-battered belief that a sustained spiritual life is simply unfeasible for most mortals without these refuges from noise and work to buffer us and remind us who we really are. But just as modern street lighting has slowly blotted the stars from the visible skies, so too have cars and planes and factories and flickering digital screens combined to rob us of a silence that was previously regarded as integral to the health of the human imagination.

This changes us. It slowly removes — without our even noticing it — the very spaces where we can gain a footing in our minds and souls that is not captive to constant pressures or desires or duties. And the smartphone has all but banished them. Thoreau issued his jeremiad against those pressures more than a century ago: “I went to the woods because I wished to live deliberately, to front only the essential facts of life, and see if I could not learn what it had to teach, and not, when I came to die, discover that I had not lived. I did not wish to live what was not life, living is so dear.”

When you enter the temporary Temple at Burning Man, the annual Labor Day retreat for the tech elite in the Nevada desert, there is hardly any speaking. Some hover at the edges; others hold hands and weep; a few pin notes to a wall of remembrances; the rest are kneeling or meditating or simply sitting. The usually ornate and vast wooden structure is rivaled only by the massive tower of a man that will be burned, like the Temple itself, as the festival reaches its climax, and tens of thousands of people watch an inferno.

They come here, these architects of our internet world, to escape the thing they unleashed on the rest of us. They come to a wilderness where no cellular signals penetrate. You leave your phone in your tent, deemed useless for a few, ecstatically authentic days. There is a spirit of radical self-reliance (you survive for seven days or so only on what you can bring into the vast temporary city) and an ethic of social equality. You are forced to interact only as a physical human being with other physical human beings — without hierarchy. You dance, and you experiment; you build community in various camps. And for many, this is the high point of their year — a separate world for fantasy and friendship, enhanced by drugs that elevate your sense of compassion or wonder or awe.

Like a medieval carnival, this new form of religion upends the conventions that otherwise rule our lives. Like a safety valve, it releases the pent-up pressures of our wired cacophony. Though easily mockable, it is trying to achieve what our culture once routinely provided, and it reveals, perhaps, that we are not completely helpless in this newly distracted era. We can, one senses, begin to balance it out, to relearn what we have so witlessly discarded, to manage our neuroses so they do not completely overwhelm us.

There are burgeoning signs of this more human correction. In 2012, there were, for example, around 20 million yoga practitioners in the U.S., according to a survey conducted by Ipsos Public Affairs. By 2016, the number had almost doubled. Mindfulness, at the same time, has become a corporate catchword for many and a new form of sanity for others. It’s also hard to explain, it seems to me, the sudden explosion of interest in and tolerance of cannabis in the past 15 years without factoring in the intensifying digital climate. Weed is a form of self-medication for an era of mass distraction, providing a quick and easy path to mellowed contemplation in a world where the ample space and time necessary for it are under siege.

If the churches came to understand that the greatest threat to faith today is not hedonism but distraction, perhaps they might begin to appeal anew to a frazzled digital generation. Christian leaders seem to think that they need more distraction to counter the distraction. Their services have degenerated into emotional spasms, their spaces drowned with light and noise and locked shut throughout the day, when their darkness and silence might actually draw those whose minds and souls have grown web-weary. But the mysticism of Catholic meditation — of the Rosary, of Benediction, or simple contemplative prayer — is a tradition in search of rediscovery. The monasteries — opened up to more lay visitors — could try to answer the same needs that the booming yoga movement has increasingly met.

And imagine if more secular places responded in kind: restaurants where smartphones must be surrendered upon entering, or coffee shops that marketed their non-Wi-Fi safe space? Or, more practically: more meals where we agree to put our gadgets in a box while we talk to one another? Or lunch where the first person to use their phone pays the whole bill? We can, if we want, re-create a digital Sabbath each week — just one day in which we live for 24 hours without checking our phones. Or we can simply turn off our notifications. Humans are self-preserving in the long run. For every innovation there is a reaction, and even the starkest of analysts of our new culture, like Sherry Turkle, sees a potential for eventually rebalancing our lives.

And yet I wonder. The ubiquitous temptations of virtual living create a mental climate that is still maddeningly hard to manage. In the days, then weeks, then months after my retreat, my daily meditation sessions began to falter a little. There was an election campaign of such brooding menace it demanded attention, headlined by a walking human Snapchat app of incoherence. For a while, I had limited my news exposure to the New York Times’ daily briefings; then, slowly, I found myself scanning the click-bait headlines from countless sources that crowded the screen; after a while, I was back in my old rut, absorbing every nugget of campaign news, even as I understood each to be as ephemeral as the last, and even though I no longer needed to absorb them all for work.

Then there were the other snares: the allure of online porn, now blasting through the defenses of every teenager; the ease of replacing every conversation with a texting stream; the escape of living for a while in an online game where all the hazards of real human interaction are banished; the new video features on Instagram, and new friends to follow. It all slowly chipped away at my meditative composure. I cut my daily silences from one hour to 25 minutes; and then, almost a year later, to every other day. I knew this was fatal — that the key to gaining sustainable composure from meditation was rigorous discipline and practice, every day, whether you felt like it or not, whether it felt as if it were working or not. Like weekly Mass, it is the routine that gradually creates a space that lets your life breathe. But the world I rejoined seemed to conspire to take that space away from me. “I do what I hate,” as the oldest son says in Terrence Malick’s haunting Tree of Life.

I haven’t given up, even as, each day, at various moments, I find myself giving in. There are books to be read; landscapes to be walked; friends to be with; life to be fully lived. And I realize that this is, in some ways, just another tale in the vast book of human frailty. But this new epidemic of distraction is our civilization’s specific weakness. And its threat is not so much to our minds, even as they shape-shift under the pressure. The threat is to our souls. At this rate, if the noise does not relent, we might even forget we have any.

*This article appears in the September 19, 2016, issue of New York Magazine.

American kids want to be famous on YouTube, and kids in China want to go to space: survey

Children ages 8 to 12 in the US, the UK, and China were recently polled in honor of the 50th anniversary of the first manned moon landing.

Source: American kids want to be famous on YouTube, and kids in China want to go to space: survey

Paige Leskin, Business Insider

 

  • A recent survey of 3,000 kids found that being a YouTube star was a more sought-after profession than being an astronaut among kids in the US and the United Kingdom.
  • Children ages 8 to 12 in the US, the UK, and China were recently polled in honor of the 50th anniversary of the Apollo 11 mission, which put the first person on the moon.
  • Kids in the US and the UK were three times as likely to want to be YouTubers or vloggers as astronauts, while kids in China were more likely to want to be astronauts.

Neil Armstrong became a role model in the eyes of kids everywhere 50 years ago when he became the first person to walk on the moon on July 20, 1969.

Kids in a recent survey, however, were much more likely to aspire to be the next YouTube star rather than the next person in space. The survey, conducted by Harris Poll on behalf of Lego, found that children in the US and the United Kingdom were three times as likely to want to be YouTubers or vloggers as astronauts when they grow up.

The survey asked 3,000 kids ages 8 to 12 to choose from five professions to answer which they wanted to be when they grew up: astronaut, musician, professional athlete, teacher, or vlogger/YouTuber. Though the top choice among kids in the US and the UK was vlogger/YouTuber, 56% of kids in China said they wanted to be an astronaut.

[Chart: Results of the Harris Poll survey conducted on behalf of Lego. Harris Poll]

The nonprobability online survey was conducted in honor of the 50th anniversary of the first manned moon landing. The poll surveyed 3,000 kids, ages 8 to 12, divided evenly among the US, the UK, and China.

Though the survey’s results cannot necessarily be generalized to all kids, they reflect a broader trend among Generation Z: at this year’s VidCon, a three-day conference about online video, an estimated 75,000 teens and their parents showed up to hear from their favorite YouTubers.

“Every time I go to schools, the most said thing from 90% of kids is, ‘I want to be a YouTuber,'” the YouTuber DeStorm Power told Business Insider. “They want to be social-media stars.”

 

Think FaceApp Is Scary? Wait Till You Hear About Facebook


The idea that FaceApp is somehow exceptionally dangerous threatens to obscure the real point: All apps deserve this level of scrutiny.

Source: Think FaceApp Is Scary? Wait Till You Hear About Facebook

FaceApp is a viral lark that takes a convincing guess at what you’ll look like when you’re old. FaceApp is also the product of a Russian company that sends photos from your device to its servers, retains rights to use them in perpetuity, and performs artificial intelligence black magic on them. And so the FaceApp backlash has kicked into gear, with anxious stories and tweets warning you off of its charms. Which, fine! Just make sure you save some of that ire for bigger targets.

The response to FaceApp is predictable, if only because this cycle has happened before. FaceApp went viral when it launched in 2017, and prompted a similar—if far more muted—privacy kerfuffle. But compared to Meitu, that year’s other viral face manipulator, which is quite a phrase to type, FaceApp was downright saintly in its data collection. At least FaceApp didn’t access your GPS and SIM card information. More energy was directed at bigger problems, like FaceApp’s blackface filter. (Yep!)

“This is definitely not a unique FaceApp problem. FaceApp is part of a larger privacy problem.”

CHRISTINE BANNAN, EPIC

The latest frenzy appears to have been kicked off by a since-deleted tweet that claimed FaceApp uploads all of your photos to the cloud. That certainly would be alarming. But FaceApp has denied the claim, and multiple security researchers have confirmed that it’s not so. FaceApp takes only the photo you ask it to manipulate. The company also says it deletes “most images” from its servers within 48 hours of uploading, although admittedly there’s no way to confirm that it does so in practice. If you want FaceApp to remove all of your data from its servers, you can send a request within the app, by going to Settings > Support > Report a bug and putting “Privacy” in the subject line. “Our support team is currently overloaded, but these requests have our priority,” FaceApp founder Yaroslav Goncharov said in a statement. “We are working on the better UI for that.”

Those measures don’t make FaceApp some paragon of data privacy. While the way it manages photos is kosher under Apple rules, FaceApp doesn’t make it clear enough to users that it’s sending them to a server. “I cannot think of any situation where an app should not be very painfully clear about a photo being uploaded to a remote server,” says Will Strafach, security researcher and developer of Guardian, an iOS firewall app. “Users always have the right to know this.”

Still, it’s important to note that while FaceApp calls St. Petersburg home, its servers are based in the US. The company said in a statement that “the user data is not transferred to Russia.” Like almost everyone else, FaceApp uses Amazon’s cloud. And it has at least a plausible reason for doing so: The processing power required to apply a Methuselahn filter on your face is more manageable there than on your device. More recent iPhones and Android devices have machine learning capabilities baked into their hardware, but it’s safe to assume that plenty of FaceApp’s reported 80 million users are on older models.

So what’s changed since 2017? On the FaceApp side, not much. But the world around it looks markedly different. Russia has become synonymous with nefarious online meddling, to the point that any company—even a silly filter app—becomes a bogeyman. Awareness of facial recognition’s perils has reached something close to critical mass. And the idea that one’s personal data might be worth protecting has gained real, immutable traction.

All for the better, or at least on those last two points. You should ask questions about FaceApp. You should be extremely cautious about what data you choose to share with it, especially something as personal as a photo of your face. But the idea that FaceApp is somehow exceptionally dangerous threatens to obscure the real point: All apps deserve this level of scrutiny—including, and especially, the ones you use the most.

“People give photos to lots of different apps. I think this is probably getting attention because it’s Russian developers,” says Christine Bannan, consumer protection counsel at the nonprofit Electronic Privacy Information Center. “But this is definitely not a unique FaceApp problem. FaceApp is part of a larger privacy problem.”

Take the most obvious example, and not only for its similar name. Facebook has nearly 2.5 billion monthly active users to FaceApp’s 80 million. It, too, applies facial recognition to photos that those users upload to its servers. It also actively pushed a VPN that allowed it to track the activity of anyone who installed it not just within the Facebook app but anywhere on their phone. When Apple finally banned that app, Facebook snuck it in again through the backdoor. And that’s before you get to the privacy violations that have led to a reported $5 billion fine from the FTC, a record by orders of magnitude.

People have expressed concern that FaceApp’s terms of service includes “a perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable sub-licensable license to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your User Content and any name, username or likeness provided in connection with your User Content in all media formats and channels now known or later developed, without compensation to you.” Rightly so. But see how closely it mirrors Facebook’s terms of service, which also says that “when you share, post, or upload content that is covered by intellectual property rights (like photos or videos) on or in connection with our Products, you grant us a non-exclusive, transferable, sub-licensable, royalty-free, and worldwide license to host, use, distribute, modify, run, copy, publicly perform or display, translate, and create derivative works of your content (consistent with your privacy and application settings).” (Which is as good a reminder as any to lock down your Facebook privacy settings.)

And it’s obviously not just Facebook. Look at Life360, a family-tracking app that turns user data into revenue through advertising and partnerships. TikTok is based in China, a country with a damning history of facial recognition abuses. For years, US carriers sold detailed location data of their customers without explicit consent. As noted by Adweek reporter Shoshana Wodinsky, FaceApp itself sends data to DoubleClick, the Google-owned ad company, and to Facebook. And so do countless others.

Should you be worried about FaceApp? Sure. But not necessarily more than any other app you let into your photo library. Or any other part of your phone.

“I wish people would think before they try out any app, but that just isn’t realistic. People want to use cool-looking services and they’ll never read a boring privacy policy before doing so,” says Joseph Jerome, privacy counsel at the nonprofit Center for Democracy & Technology. “There’s a real tension between individuals wanting to have fun with their photos and their images being used for a host of different facial recognition and image analytics products. This is why we’ve been calling for regulations around biometric data.”

Instead of these panics, which fade in and out in step with the virality of their targets, maybe a healthier focus is on broader awareness. Your data has value. Think twice about who you give it to, regardless of what country they’re in or how silly they make you look.

Viral App FaceApp Now Owns Access To More Than 150 Million People’s Faces And Names

“Your face will most likely end up training some AI facial-recognition algorithm”

Source: Viral App FaceApp Now Owns Access To More Than 150 Million People’s Faces And Names

John Koetsier is a journalist, analyst, author, and speaker.

Everyone’s seen them: friends posting pictures of themselves now, and years in the future.

Viral app FaceApp has been giving people the power to change their facial expressions, looks, and now age for several years. But at the same time, people have been giving FaceApp the power to use their pictures — and names — for any purpose it wishes, for as long as it desires.

And we thought we learned a lesson from Cambridge Analytica.

More than 100 million people have downloaded the app from Google Play. And FaceApp is now the top-ranked app on the iOS App Store in 121 countries, according to App Annie.

While according to FaceApp’s terms of service people still own their own “user content” (read: face), the company owns a never-ending and irrevocable royalty-free license to do anything they want with it … in front of whoever they wish:

You grant FaceApp a perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable sub-licensable license to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your User Content and any name, username or likeness provided in connection with your User Content in all media formats and channels now known or later developed, without compensation to you. When you post or otherwise share User Content on or through our Services, you understand that your User Content and any associated information (such as your [username], location or profile photo) will be visible to the public.

FaceApp terms of use

That may not be dangerous and your likeness may stay on Amazon servers in America, as Forbes has determined, but they still own a license to do whatever they want with it. That doesn’t mean the app’s Russian parent company, Wireless Labs, will offer your face to the FSB, but it does have consequences, as PhoneArena’s Peter Kostadinov says:

You might end up on a billboard somewhere in Moscow, but your face will most likely end up training some AI facial-recognition algorithm.

Peter Kostadinov 

Whether that matters to you or not is your decision.

But what we have learned in the past few years about viral Facebook apps is that the data they collect is not always used for the purposes we might assume, and that it is not always stored securely, safely, or privately.

Once something is uploaded to the cloud, you’ve lost control of it, whether or not you’ve given away a legal license to your content. That’s one reason why privacy-sensitive Apple is doing most of its AI work on-device.

And it’s a good reason to be wary when any app wants access and a license to your digital content and/or identity.

As former Rackspace manager Rob La Gesse mentioned today:

To make FaceApp actually work, you have to give it permissions to access your photos – ALL of them. But it also gains access to Siri and Search …. Oh, and it has access to refreshing in the background – so even when you are not using it, it is using you.

Rob La Gesse

The app doesn’t have to be doing anything nefarious today for you to be cautious about giving it that much access to your most personal computing device.

Follow John Koetsier on Twitter or LinkedIn. Check out his website or some of his other work here.

Labeling Theory: How do the labels we use change our reality?

Labeling Theory: the labels we apply, or that others apply to us, shape our identity, our behaviour, and even our reality.

Source: Labeling Theory: How do the labels we use change our reality?

“Be curious, not judgmental” runs a line often attributed to Walt Whitman. Life is neither good nor bad. Where some see a problem, others may find an opportunity. Every time we label events, we turn them into good or bad. Every time we judge what happens to us, we start a battle against reality, a battle we will almost always lose.

Labels, that rudimentary mechanism of reaction with which we limit reality

Labels can become so useful that we find it difficult to escape them. In some situations they make life easier, serving as cardinal points: a rapid orientation system that activates the response mechanisms we have learned without our having to think too much. They are like a simplified trigger that connects a complex reality to a simple answer.

Our deep attachment to labels comes, in large part, from our need to feel safe and to control our environment. A label is a quick response that makes us feel we are in control, even if that control is only an illusory perception.

If we label a person as “toxic”, we need nothing more: we will simply try to stay away from them. If we label a situation as “undesirable”, we will do everything possible to escape it.

The problem is that the world is not so simple. Every time we apply a label, we reduce the richness of what we are labeling. When we classify events as “good” or “bad”, we stop perceiving the complete picture. As the saying attributed to Søren Kierkegaard goes, “When you label me, you deny me”: every time we label someone, we deny their richness and complexity.

The Labeling Theory: How do the labels we use shape our reality?

Psychologists began to study labels in the 1930s, when linguist Benjamin Whorf proposed the hypothesis of linguistic relativity. He believed that the words we use to describe what we see are not mere labels, but end up determining what we see.

Decades later, cognitive scientist Lera Boroditsky and colleagues demonstrated it with an experiment. They asked native English and Russian speakers to distinguish between two very similar but subtly different shades of blue. English has a single basic word for blue, but Russian obligatorily divides the spectrum into lighter blues (goluboy) and darker blues (siniy). Interestingly, the Russian speakers distinguished the two shades faster, while the English speakers needed much more time.

Labels not only shape our perception of color; they also change the way we perceive more complex situations. A classic study conducted at Princeton University by John Darley and Paget Gross showed just how far labels reach.

The researchers showed one group of people a video of a girl playing in a low-income neighborhood, and showed another group the same girl, playing in the same way, but in an upper-middle-class neighborhood. In the video the girl was also asked a series of questions; some she answered correctly, others she got wrong.

Darley and Gross discovered that people used the socioeconomic-status label as an index of academic ability. When the girl was presented as “middle class”, people judged her cognitive performance to be better. This shows that a simple label, apparently innocuous and objective, activates a series of prejudices and preconceived ideas that end up shaping our image of people and of reality.

The problem goes much further: the implications of labeling are immense, as Robert Rosenthal and Lenore Jacobson demonstrated. These educational psychologists found that if teachers believe a child has less intellectual capacity, even when it is not true, they will treat the child accordingly, and that child will end up getting worse grades: not because they lack the necessary skills, but simply because they received less attention during lessons. It’s a self-fulfilling prophecy: when we believe something is real, we can make it real with our attitudes and behaviors.

Nobody is immune to the influence of labels. The labeling theory indicates that our identity and behaviors are determined or influenced by the terms that we or others use to describe us.

Labels say more about who is labeling than about who is labeled

Toni Morrison, the American writer and winner of the Pulitzer Prize and the Nobel Prize for Literature, wrote: “The definitions belong to the definers, not the defined”. Each label we place with the aim of confining others actually restricts our own world. Each label is an expression of our inability to deal with complexity and uncertainty, with the unexpected and the ambivalent.

In fact, we usually resort to labels when reality is so complex that it overwhelms us psychologically, or when we lack the cognitive tools to assess fairly what is happening.

From this perspective, each label is like a tunnel that closes off our view of a vaster, wider, and more complex reality. And if we don’t have a global perspective on what is happening, we cannot respond adaptively. At that point we stop responding to reality and begin responding to the biased image of reality we have built in our minds.

Flexible labels reduce our stress

Using fixed terms to describe other people, or ourselves, is not only limiting but also stressful. Thinking about identity more flexibly, by contrast, decreases our stress levels, as psychologists at the University of Texas have shown.

The study, carried out with students, revealed that those who believed personality could change, both their own and that of the classmates they labeled, were less stressed by situations of social exclusion and, by the end of the year, fell ill less often than those who applied fixed labels.

Having a more flexible view of the world allows us to adapt more easily to change, so we experience much less stress. Furthermore, understanding that everything can change, ourselves and other people alike, keeps us from falling into fatalism and helps us develop a more optimistic vision of life.

How to escape from labels?

We need to remember that “good” and “bad” are two sides of the same coin. If we don’t understand this, we will remain trapped in dichotomous thinking, victims of the labels we ourselves apply.

We also need to understand that if someone does something wrong from our point of view, it doesn’t mean they are a bad person, simply that they did something that doesn’t fit our value system.

Remember the line attributed to Alan Turing: “Sometimes it is the people no one can imagine anything of who do the things no one can imagine”. Because sometimes we just have to open up to experiences, without pre-established ideas, and let them surprise us.

Sources:

Yeager, D.S. et al. (2014) The far-reaching effects of believing people can change: implicit theories of personality shape stress, health, and achievement during adolescence. J Pers Soc Psychol; 106(6): 867-884.

Boroditsky, L. et al. (2007) Russian blues reveal effects of language on color discrimination. Proc Natl Acad Sci USA; 104(19): 7780-7785.

Darley, J.M. & Gross, P.H. (1983) A hypothesis-confirming bias in labeling effects. Journal of Personality and Social Psychology; 44(1): 20-33.

Rosenthal, R. & Jacobson, L. (1980) Pygmalion en la escuela [Pygmalion in the classroom: teacher expectations and pupils’ intellectual development]. Madrid: Marova.