How Technology Messes With Your Memory

When you were younger, how many phone numbers could you remember? Probably several: your home phone, your parents’ work phones, your friends’ houses…the list likely included your whole teenage social circle.

But right now, without peeking, how many phone numbers do you know? Your own, possibly your boyfriend’s?

Well, you’re not alone. Technology is not only changing our lives, it’s also changing how we memorize and recall information. Past research has shown that when people can easily look up information online, they’re less likely to remember the actual facts and more likely to remember how to access that information (dubbed the ‘Google Effect’).

A new study published by Kaspersky Lab, an internet security company, found that the Google Effect extends beyond online facts and extraneous information to include important personal information, such as contact information, photos, directions, and, yes, phone numbers. In other words, not only can you no longer remember state capitals, you also can’t call your brother without looking up his number on your smartphone.

Drawing a blank on information that you trust a digital device to store and remember for you is what Kaspersky’s report calls ‘digital amnesia.’

While this might seem like a bad thing, the act of forgetting is not inherently negative, according to the study. After all, it allows you to lose irrelevant memories (such as old bank account details) and free up space for important memories. Forgetting only becomes unhelpful when it involves losing information you need to remember, says neuroscientist Kathryn Mills, Ph.D.

But simply ‘forgetting’ is not the issue, because it appears we’ve now crossed over into outsourcing information that’s personally relevant to our own lives, such as turning to pictures and tweets to store memories of our vacations and milestones. And letting things slip from memory is actually a symptom of a bigger problem, according to Los Angeles-based clinical psychologist Jenny Yip. “We’re not just losing our ability to memorize,” she says. “We’re creating an imbalance in our brains, as over-dependence on technology creates a more dominant left brain at the expense of the right brain, and the right side of your brain is necessary for memory and concentration.”

Even if memorization skills aren’t necessary in our constantly connected world, it’s important to pay attention to your right brain, Yip explains. Luckily, a recent study showed that weight-lifting improves memory. To help strengthen your memory powerhouse further, though, Yip suggests detoxing from technology and screens at least one day a week to interact with your environment: read a book (research shows that reading a paperback gives you better recall than reading on a Kindle), learn or practice a new language or sport, or have a deep, meaningful face-to-face conversation with another person.

As long as you have the Internet at your fingertips, it doesn’t really matter that your memorization skills are deteriorating. But when it comes to keeping our brains healthy, disconnecting for the sake of our actual memory will go a long way.

  • By Sarah Jacobsson Purewal

The Internet is no doubt changing modern society. It has profoundly altered how we gather information, consume news, carry out war, and create and foster social bonds. But is it altering our brains? A growing number of scientists think so, and studies are providing data to show it.

What remains to be seen is whether the changes are good or bad, and whether the brain is, as one neuroscientist believes, undergoing unprecedented evolution.

Texting and instant messaging, social networking sites and the Internet in general can certainly be said to distract people from other tasks. But what researchers are worrying more about are the plastic brains of teens and young adults who are now growing up with all this, the “digital natives” as they’re being called.

“My fear is that these technologies are infantilising the brain into the state of small children who are attracted by buzzing noises and bright lights, who have a small attention span and who live for the moment,” said Baroness Greenfield, an Oxford University neuroscientist and director of the Royal Institution, in The Daily Mail today. “I often wonder whether real conversation in real time may eventually give way to these sanitised and easier screen dialogues, in much the same way as killing, skinning and butchering an animal to eat has been replaced by the convenience of packages of meat on the supermarket shelf.”

Odd analogy, but one worth pondering.

Inevitable brain change

Every generation adapts to change, and the brain gets used for different purposes. For ancient man there was the spear, the mammoth, and the rock to hide behind. Agriculture changed the world, as did writing. Then came gunpowder, the Industrial Revolution, radio, and TV dinners. Man would never be the same. Adapt or die, hiding behind a rock with no friends, no family.

The pace picked up. Cell phones changed everything. Smart phones made them seem quaint. Our brains adapted. I used to have dozens of phone numbers committed to memory. Now that they’re all in my Blackberry (and before that the Palm, going back a decade now), I can remember only those I’d memorized when I was a child. I don’t even know my wife’s cell phone or work number. I’m not sure what all that brain capacity is being used for now, other than struggling to focus on writing columns like this while checking email several times, surfing from valid research sites to unrelated pages detailing the latest condition of Jade Goody (whom I’d never heard of until recently), and reaching for my hip when my stomach gurgles but I think my phone is vibrating (a modern condition called phantom vibration syndrome).

But I digress. And I’m touching on the “Google is making us stupid” notion, written about last summer in the Atlantic by Nicholas Carr, who notes how he used to “spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text.”

Carr blames the lack of concentration on a decade of being online.

But forget us old folks. What about the kids, whose online use we, er, monitor?

The Daily Mail article today points out that students tend no longer to plan essays before starting to write: Thanks to computers and MS Word, they can edit as they go along. I grew up learning to do an outline on paper before writing any essay or story, a habit that was reinforced in journalism school. I rarely do so anymore (though when the writing doesn’t go well, it’s still a great tactic). Good or bad? I’m not sure. Change, yes. Nowadays I think with my fingers, and my brain bounces around a lot more when I write, outlining on the fly.

Yet I worry about my children and what skills they’ll develop spending hours a day on a computer, using a cell phone to talk or text or surf (while driving?!), or watching TV, and whether all that activity will enhance their well-being, help them make lifelong friendships, find a mate, get a job. Teens have always hidden out (in the woods, under the grandstands, or in their rooms), but now, thanks to their various electronic social networks, a cell phone, and perhaps a laptop tuned to Hulu, they can truly become hermits, harder than ever to coax out. The dinner bell, long ago replaced with a shout down the hallway, has now given way to an evening SMS.

Learning experience

On the assumption that technological progress can’t be stopped, the flip-side to the inevitable digitalization of life is the simple argument that kids need to learn new digital skills to survive and thrive in our fast-changing society.

Researchers at the University of Minnesota last year asked 16- to 18-year-olds what they learn from using social networking sites. The students listed technology skills as the top lesson, followed by creativity, then being open to new or diverse views and communication skills.

“What we found was that students using social networking sites are actually practicing the kinds of 21st century skills we want them to develop to be successful today,” said Christine Greenhow, a learning technologies researcher at the university and leader of the study.

One example Greenhow gave: A student might take up video production after seeing a cool video on MySpace. “Students are developing a positive attitude towards using technology systems, editing and customizing content and thinking about online design and layout,” she explained. “They’re also sharing creative original work like poetry and film and practicing safe and responsible use of information and technology. The Web sites offer tremendous educational potential.”

It’s up to educators, Greenhow believes, to figure out how to leverage all this.

Evolution of a new human brain?

Meanwhile, much more research needs to be done to determine if social networking sites, and the Internet in general, are good or bad for children and teens, or neither. Studies going back to the late 1990s have flip-flopped on this as often as new social networking sites pop up.

For now, there are only hints and indications that all this change may indeed lead to young brains that work differently than those of previous generations. But evidence is indeed mounting.

“We are seeing children’s brain development damaged because they don’t engage in the activity they have engaged in for millennia,” says Sue Palmer, author of “Toxic Childhood” (Orion, 2007). “I’m not against technology and computers. But before they start social networking, they need to learn to make real relationships with people.”

Others think a profound evolutionary change is underway.

UCLA neuroscientist Gary Small thinks the dramatic shift in how we gather information and communicate has touched off a rapid evolution of the brain.

“Perhaps not since early man first discovered how to use a tool has the human brain been affected so quickly and so dramatically,” Small contends. “As the brain evolves and shifts its focus towards new technological skills, it drifts away from fundamental social skills.”

(Can you keep up? That may depend in part on how your brain is wired. People who welcome new experiences have stronger connections between their brain centers associated with memory and reward than people who tend to avoid anything new, scientists recently reported in the journal Nature Neuroscience.)

Small, author of “iBrain: Surviving the Technological Alteration of the Modern Mind” (Collins Living, 2008), puts people into two categories: digital natives (your kids) and digital immigrants (the rest of us who cope with varying degrees of success with all this). The former are better at snap decisions and juggling lots of sensory input; the latter are great at reading facial expressions.

“The typical immigrant’s brain was trained in completely different ways of socializing and learning, taking things step-by-step and addressing one task at a time,” Small says.

Interestingly, while Internet use causes changes in brain activity and wiring among people of any age, as a brain-scan study showed, the changes are most pronounced among digital natives. As Small puts it, just searching the Internet “appears to engage a greater extent of neural circuitry that is not activated during reading — but only in those with prior Internet experience.”

For the sake of balance, perhaps we should require all children to learn how to skin and butcher an animal.

Robert Roy Britt is the Editorial Director of Imaginova. In this column, The Water Cooler, he takes a daily look at what people are talking about in the world of science and beyond.

Memory Impairment: An Epidemic We Cannot Forget

Do you remember watching commercials and movies from the ’90s and early 2000s that claimed we only use 10 percent of our brains? That claim is false and misleading, but a leading neurologist in Abu Dhabi says excessive reliance on electronic devices and social media really could negatively affect your child’s memory and cognition.

Why Technology Is Warping Your Memory

Take a moment to think about the last time you memorized someone’s phone number. That must have been well over 10 years ago, right?

Now think about the last time you looked up something on your phone because you were trying to remember the name of something or someone. It wouldn’t be surprising if that were earlier today or yesterday.

Our access to the Internet (and to technology in general) has grown easier and more constant. As a result, we are bombarded with new information by the minute, and we are no longer forced to remember the things we had to remember 10 to 15 years ago.

It turns out, though, that our reliance on technology has detrimental effects on how our brains process information. It affects the way we learn, and it directly impacts our attention spans outside of the virtual world.

What the Experts Say about How Technology Affects Memory Loss

A growing body of research suggests that technology may have profound effects on the memories of younger people, particularly short-term memory (also known as working memory). In some cases, it may even alter or impair how that memory functions.

Dr. Taoufik Al Saadi, chief medical officer and chairman of neurology at the American Centre for Psychiatry and Neurology in Abu Dhabi, says more and more youth today report impaired memory that hampers their daily functioning.

“In the past, impaired memory was typically a concern among people aged 60 years and older,” Dr. Al Saadi said. “But lately, I have seen quite a few individuals in their thirties and forties complaining of memory deficits.”

Information Overload: When the Glass Is More Than Full

Tony Schwartz, a productivity expert and author of The Way We’re Working Isn’t Working, compares the information overload in our brains to a glass of water that’s overflowing:

“It’s like having water poured into a glass continuously all day long, so whatever was there at the top has to spill out as the new water comes down. We’re constantly losing the information that’s just come in — we’re constantly replacing it, and there’s no place to hold what you’ve already gotten. It makes for a very superficial experience; you’ve only got whatever’s in your mind at the moment. And it’s hard for people to metabolize and make sense of the information because there’s so much coming at them and they’re so drawn to it. You end up feeling overwhelmed because what you have is an endless amount of facts without a way of connecting them into a meaningful story.”

Let’s take a look at another study that illustrates his point.

How Taking Pictures/Videos on Your Phone Can Hinder Memory

We all know how much society loves taking photos, particularly young people. We all love capturing memories on our smartphones.

A 2013 study split people into two groups: one was asked simply to observe 15 different artifacts, while the other was asked to photograph the same 15 artifacts.

The researchers concluded that taking photos may actually hinder people’s ability to remember what they are capturing on camera.

Our Brains Were Once the Only External Hard Drives

Before the Internet, what did we do when we didn’t know something? (It’s probably so far back, very few of us can remember.)

We all used to delegate mental tasks to others. When presented with new information, we automatically distributed responsibility for remembering certain facts and concepts among the members of our social group. We would remember the things we were best at remembering and trust the others around us to remember the rest. When we couldn’t recall the name of the actor in that movie from five years ago, or how to fix a broken machine, we would simply turn to someone we knew and trusted.

That has all vanished now that we can get access to the Internet anytime and anywhere we want.

We need to bring the positive effects of the pre-Internet era into this era. There are loads of social networks out there that make it easier for us to have interactions with people, but it is certainly not the same as having a conversation with someone in the same room as you.

A Few Things We Need to Remember

In an age filled with multitasking and an excessive reliance on technology, we are finding high rates of forgetfulness.

Ultimately, these are the three points you need to remember after reading this:

1. Our reliance on technology has deepened over the years.

2. Several studies on information overload suggest that the epidemic of short-term memory loss is real.

3. There is a social aspect to remembering… and we might be able to solve the memory-loss problem if we stopped ignoring people and started having more conversations with people again.

Sources:

  • Gulf News
  • Huffington Post
  • Slate
  • Huffington Post

The answer is yes. Technology affects humans every day in multiple ways. It shapes our memory, our social life, our perception of things, our intelligence, and our mood.

Even though our brains can hold a vast amount of information, technology can still affect how much information we hold and retain. Researchers compare this loss of memory to a full glass of water. Tony Schwartz, CEO of The Energy Project, explains the analogy: the brain is like a glass that is already full, so when you pour in more water, or add new information, the new water fills the glass while the water that was previously there spills out. The information we take in through technology replaces our old information, and that old information simply leaves our brain. Carolyn Gregoire makes another argument in her article about how technology affects memory: because the internet makes so much information easily accessible, we tend not to remember as much of it ourselves. We feel no need to remember anything that can be found on Google, which reduces how much information the brain holds on to. Gregoire also argues that being distracted makes it harder to form new memories. Zaldy S. Tan, director of the Memory Disorders Clinic at Beth Israel Deaconess Medical Center, says that not paying attention makes the memories we form “messy,” which makes them hard to recall later.

Even though technology may not affect the actual information we receive, it does affect how we perceive it and store it in our memory. Technology weakens the memory itself and how well we can later retrieve that information from our brain. So yes, technology strongly affects our memory.

Is our constant use of digital technologies affecting our brain health? We asked 11 experts.

With so many of us now constantly tethered to digital technology via our smartphones, computers, tablets, and even watches, there is a huge experiment underway that we didn’t exactly sign up for.

Companies like Google, Facebook, Twitter, Apple, even Vox (if we’re being completely honest) are competing for our attention, and they’re doing so savvily, knowing the psychological buttons to push to keep us coming back for more. It’s now common for American kids to get a smartphone by age 10. That’s a distraction device they carry in their pockets all the time.

The more adapted to the attention economy we become, the more we fear it could be hurting us. In Silicon Valley, we’re told more parents are limiting their kids’ screen time and even writing no-screen clauses into their contracts with nannies. Which makes us wonder: Do they know something we don’t?

If it’s true that constant digital distractions are changing our cognitive functions for the worse — leaving many of us more scatterbrained, more prone to lapses in memory, and more anxious — it means we’re living through a profound transformation of human cognition. Or could it be that we’re overreacting, like people in the past who panicked about new technologies such as the printing press or the radio?

To find out, we decided to ask experts: How is our constant use of digital technologies affecting our brain health?

The answers, you’ll see, are far from certain or even consistent. There’s a lot not yet known about the connection between media use and brain health in adults and kids. The evidence that does exist on multitasking and memory, for instance, suggests a negative correlation, but a causal link is still elusive. Even so, many of the researchers and human behavior experts we spoke with feel an unease about where the constant use of digital technology is taking us.

“We’re all pawns in a grand experiment to be manipulated by digital stimuli to which no one has given explicit consent,” Richard Davidson, a neuroscientist at the University of Wisconsin, told us. But what are the results of the experiment?

Our conversations were edited for length and clarity.

Tech companies have powerful, pervasive tools to influence, and prey on, our psychology

Richard Davidson, neuroscientist at the University of Wisconsin Madison and founder and director of the Center for Healthy Minds

I am most worried about the increase in distractibility, the national attention deficit we all suffer from, and the consequences that arise from this.

Our attention is being captured by devices rather than being voluntarily regulated. We are like a sailor without a rudder on the ocean — pushed and pulled by the digital stimuli to which we are exposed rather than by the intentional direction of our own mind.

The ability to voluntarily regulate attention is more developed in humans than other species. As William James, the great psychologist, wrote in 1890, “The faculty of voluntarily bringing back a wandering attention, over and over again, is the very root of judgment, character, and will.”

But we are becoming impaired in that capacity, globally. We’re all pawns in a grand experiment to be manipulated by digital stimuli to which no one has given explicit consent. This is happening insidiously under the radar.

This, to me, underscores the urgency of training our minds with meditation so we don’t have to check our phone 80 times a day.

Christopher Burr, philosopher of cognitive science and postdoctoral researcher at the Oxford Internet Institute

Our constant use of digital technologies is allowing intelligent systems to learn more and more about our psychological traits, with varying degrees of validity or accuracy. For instance, our smartphone’s accelerometer might be used to infer our stress levels at work, or an automated analysis of our vocal patterns could determine that we’re depressed.

But what’s concerning to me is that users are rarely fully informed that their data could be used in this way. Furthermore, the companies that develop the growing variety of “health and well-being” technologies often give insufficient consideration to the risks of intervening. For instance, a company may nudge a user to change sleep patterns, mood, or dietary preferences and cause unintended harm.

In a health care setting, a doctor will try to avoid interventions that do not involve the patient in the decision-making process. Instead, doctors try to respect and promote the patient’s self-understanding and self-determination. We need to find ways of upholding this relationship in the domain of health and well-being technologies as well.

Any inference or subsequent intervention that aims at changing the behavior of a user should be fully transparent, and ideally scrutinized by an ethical review committee. This would help to minimize the chance of unintended consequences (e.g., increased stress, anxiety, or even the risk of behavioral addiction).

The research so far shows a correlation between digital media bombardment and problems with thinking. But it’s far from conclusive.

Anthony Wagner, chair of the department of psychology at Stanford

The science tells us that there is a negative relationship between using more media simultaneously and working memory capacity. And we know working memory capacity correlates with language comprehension, academic performance, and a whole host of outcome variables that we care about.

The science tells us that the negative relationship exists, but the science doesn’t tell us whether the media behavior is causing the change. It’s too early to really conclude. The answer is we have no idea.

But if there’s a causal relationship, and we are transforming the underlying cognitive functional capabilities, that could have a consequence for academic performance or achievement. One would want to know that.

The field needs to go to big science; we need to go to really large samples. I’d take the early studies as suggestions of relationships, but now let’s actually do the science, using designs and statistical power that would lead us to believe the result that everyone finds might be more trustworthy.

Paul Murphy, Alzheimer’s researcher in the department of molecular and cellular biochemistry at the University of Kentucky

Neurodegenerative diseases take decades to develop, and widespread use of electronic devices like smartphones is still a relatively recent thing. So the scary way to look at this is that we are conducting a risky experiment with some potentially serious public health consequences, and we won’t know for another decade or so if we’ve made some terrible mistakes.

In a way, this is analogous to the problems that we have on studying the long-term effects of screen time on children. We can suspect that this may be bad, but we are still many years away from knowing, and we are nowhere near knowing what sort of exposure is safe or how much might be dangerous.

There’s particular concern, and research focus, on what technology does to young, developing minds

Gary Small, author of the book iBrain and director of UCLA’s Memory and Aging Research Center at the Semel Institute for Neuroscience and Human Behavior

My biggest concern is with young people, whose brains are still developing from birth through adolescence. There’s a process called pruning. This could be affected by all the time spent using tech. We don’t have data on that — but it certainly can raise a concern.

Technology does affect our brain health. It has an upside and a downside. The downside is that when people are using it all the time, it interferes with their memory because they are not paying attention to what’s going on. They are distracted.

As far as I know, there are not systematic studies looking at that. You can only look indirectly at this. So we have studied the frequency of memory complaints according to age. You find about 15 percent of young adults complain about their memory, which suggests there might be things going on such as distraction.

On the positive side, certain mental tasks involved in using these technologies exercise our brains. Some studies have shown that some video games and apps can improve working memory, fluid intelligence, and multitasking skills.

Susanne Baumgartner, Center for Research on Children, Adolescents, and the Media, University of Amsterdam

I am researching the potential impacts of social media and smartphone use on adolescents’ attention and sleep. I am particularly interested in the effects of media multitasking — that is, using media while engaging in other media activities or doing homework, or being in a conversation. Most teenagers nowadays have their own smartphones and therefore access to all kinds of media content whenever they want.

We find in our studies that adolescents who engage in media multitasking more frequently report more sleep problems and more attention problems. They also show lower academic performance. However, this does not necessarily mean that media use caused these problems.

When looking at sleep problems, we found that stress related to social media use was a better indicator of sleep problems than the amount of social media use. This seems to indicate that it is not social media use per se that is related to sleep problems, but rather whether adolescents feel stressed by their usage.

So overall, I am still a bit hesitant about the conclusion that digital media use is detrimental to adolescents’ cognitive development. At this point, we need more studies that truly investigate these impacts in long-term studies and with better measurements (e.g., tracking smartphone behavior instead of just asking teenagers about their media use).

And we should also not forget to look at potential beneficial effects. For example, studies conducted by other researchers found that specific types of media use, such as playing action video games, can be beneficial for cognitive abilities.

Elizabeth Englander, director and founder of the Massachusetts Aggression Reduction Center

One of the most striking things we’ve been looking at in the lab is that teens often tell us almost all characteristics of social media can make them feel more anxious.

If they see what their friends are doing, that can make them feel anxious about not being a part of it. If they don’t see what friends are doing, that also makes them anxious — they worry about being left out. The times they don’t feel anxious is when they are using social media and actively engaging with their friends in a positive way. But at other times, it does seem to increase anxiety.

That’s striking. It’s a model of an interaction with a strong reward system — one that seems to keep kids on an emotional tether. One girl described it as a leash.

In terms of direct evidence, it’s limited. But think about it: How do people connect with each other? They do it through social skills. And how do you build social skills? There’s only one way we are aware of — through face-to-face interactions with other peers your age.

When you have a society where other things are displacing face-to-face social interactions, it’s reasonable to assume those are going to impact the development of social skills. It does seem to be what we are seeing now.

We need to find a way to balance the risks of ever-present digital technology with its rewards

Heather Kirkorian, associate professor in the school of human ecology at the University of Wisconsin Madison

One thing is clear: The impact of digital media depends partly on how we use them.

In the case of infants and young children, researchers often refer to content and context. That is, the impact of digital media on young children depends on what children are doing and how those activities are structured by the adults who are — or are not — in the room.

For instance, we might compare video-chatting with a grandparent versus watching an educational TV show versus playing a violent video game versus using a finger-painting app. Young children are the most likely to benefit from digital media when the content is engaging, educational, and relevant to their own lives; when they use it together with others — when parents help children understand what they see onscreen and connect it to what they experience offscreen. And when digital media activities are balanced with offscreen activities like playing outside, playing with toys, reading books with caregivers, and getting the recommended amount of sleep.

So the research with teens and adults isn’t much different. For instance, the effects of social media depend on whether we use them to connect with loved ones throughout the day and get social support versus compare our lives to the often highly filtered lives of others and expose ourselves to bullying or other negative content.

Similarly, the impact of video games on attention depends on the type of game that is played and the type of attention that is being measured.

Adam Gazzaley, professor of neurology at University of California San Francisco and author of The Distracted Mind

I’ve written a lot about the direct impact of digital technology on emotional regulation, attention, and stress, as driven by overexposure to information, rapid reward cycles, and simultaneous engagement in multiple tasks. These are certainly reasons to be concerned.

But personally, I find one of the most challenging aspects of our digital preoccupation to be the displacement it induces from nature, face-to-face communication, physical activity, and quiet, internally focused moments.

I’m currently deep into a trip to New Zealand with limited technology exposure so that I can focus on connecting with friends, nature, and my own mind. I realize now more than ever before how important these experiences are for my brain health.

That being said, I do believe that technology can offer us an incredible opportunity to enhance our cognition and enrich our lives. Figuring this out is our next great technological and human challenge.

The case for companies making products that are less addictive

Ethan Zuckerman, director of the Center for Civic Media at MIT

With any new technology, there is always a pattern of people saying, “This is addictive, and it’s destroying society as we know it.” There’s often something real to those concerns. There’s also often something which is moral panic.

One of the ways you sense moral panic is that it tends to be focused on our kids or sexuality. So when you see someone saying we are going to have a lost generation, or that Bluetooth is leading youth to have sex at unprecedented rates, these are always indications of moral panic rather than concern about real things.

From what I can tell, parenting culture in Silicon Valley is this performative craziness: I’m going to virtue-signal harder than anyone else. I am a better parent than you are because I put crazier restrictions on my family than you do. The stories about strict screen-time limits feel very consistent with that.

The reason those stories are satisfying is you come out of it thinking, “What assholes. If they think this stuff isn’t good, why do they continue to do it?” Then you have folks like Jaron Lanier who say, “Quit your social media now; it’s bad for you.” That feels irresponsible in another way — there are clearly billions of people who aren’t going to quit social media in part because it’s become a critical communications tech. It’s core to how they interact with the world. For a lot of work and play, it’s essential these days.

So what I want to say to Lanier is make it better. We’re not putting this genie back in the bottle. There’s a lot of stuff from it that’s turned out to be good. There’s no one seriously proposing we’re going to turn all of this off.

The interesting question is: What are the real problems, and how do we address them and make them better? How would you mitigate the harmful effects? What are the positive effects we want out of it?

Nir Eyal, author of Hooked: How to Build Habit-Forming Products

Technology is like smoking cannabis.

Ninety percent of people who smoke cannabis do not get addicted. But the point is that you’re going to get some people who misuse a product; if it’s sufficiently good and engaging, that’s bound to happen. The solution to that is we should fix the harm — not the technology itself, but the harm it does. I want companies to look for the addicts and help them.

Lots of companies make addictive products — I guarantee somebody is addicted to Vox. The good news is that these companies know how much you’re using their product. So if they wanted to, they could simply look at their log and say, “Look, if you use the product 30 hours a week, 40 hours a week, we’re gonna reach out and say, ‘Hey, can we help you moderate your behavior? You’re showing a behavioral pattern consistent with someone who may be struggling with an addiction. How can we help?’”

And you know what, the fact is it would actually make the platform better. It is in their interest to do this. I know that some of them are working on it.
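
Purely as an illustration of the kind of usage-log check Eyal describes (a sketch only; the threshold, data shape, and names below are assumptions, not any real platform’s system), flagging heavy weekly usage could look something like this:

```python
# Illustrative sketch of the usage-threshold check Eyal describes.
# The threshold, data shape, and names are assumptions, not a real platform's API.

WEEKLY_HOURS_THRESHOLD = 30.0  # Eyal's example figure of 30-40 hours per week


def flag_heavy_users(weekly_usage_hours: dict, threshold: float = WEEKLY_HOURS_THRESHOLD) -> list:
    """Return the user IDs whose logged weekly usage meets or exceeds the threshold."""
    return [user for user, hours in weekly_usage_hours.items() if hours >= threshold]


if __name__ == "__main__":
    usage_log = {"user_a": 12.5, "user_b": 41.0, "user_c": 30.0}  # hypothetical weekly hours
    for user in flag_heavy_users(usage_log):
        # In Eyal's proposal, the platform would reach out here and offer help moderating use.
        print(f"Offer moderation support to {user}")
```

The point is not the code itself; as Eyal notes, the usage data already sits in companies’ logs, and whether to act on it is a product decision.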

The impact of technology on our memory

WHEN I give talks, I am often approached by people who are worried about their memory. Maybe they are studying for an exam and don’t feel that they learn as well as their peers. Maybe they keep forgetting to close the window when they leave the house. Or maybe they struggle to remember an event that happened a few weeks ago but which everyone else can describe in vivid detail.

To feel that your memory may not be up to scratch can be unsettling or even downright frightening. And that’s hardly surprising — memory makes us who we are. Being able to reflect on and share the past is fundamental to our sense of identity, our relationships, and our capacity to imagine the future.

To lose any part of this ability not only causes problems in our daily routine, it threatens the very notion of who we are. By far the biggest health fear in people over the age of 50 is Alzheimer’s disease and the catastrophic loss of personal memory it entails.

Memory disorders in young people

Are concerns about memory the preserve of the post-retirement generation? It seems not. In fact, if modern trends are anything to go by, younger people are just as nervous of losing access to their past. Go to any big concert these days, and your view of the performer will frequently be obscured by a sea of smartphones, each committing the sights and sounds to a safe permanent digital record.

As far back as cave dwellers, humans have found ways to preserve knowledge and experiences, but has the modern lifestyle taken it a step too far? Could an over-reliance on technology make our memory systems lazier and less efficient?

Some studies have found that using an internet search engine can lead to poorer recall of information, although another study recently published failed to replicate this effect. And most researchers agree that in these situations it is not that memory becomes less effective, just that we use it differently.

How about recording events on a smartphone? A recent study showed that a group pausing to take photos at regular intervals had poorer recall of the event than those who were immersed in the experience. And an earlier piece of research suggested that photos helped people remember what they saw, but reduced their memory of what was said.

It seems that the key factor in these situations is attention – actively taking photos may distract and distance someone from aspects of an experience, meaning that less is remembered.

However, there are novel ways around this problem if you insist on taking pictures. Our own work has shown that distraction can be countered if photos are taken automatically using a wearable camera.

Technology and memory

While it may be true that technology is changing the way we use our memory at times, there is no scientific reason to believe that it reduces the inherent capacity of our brains to learn.

Nevertheless, in today’s fast-paced and demanding society, there are other factors that may have a negative impact, for example poor quality sleep, stress, distractions, depression and alcohol consumption. The good news is that these effects are generally regarded as temporary unless they continue over very long periods of time.

There are a small number of people who may experience memory problems over and above everyday forgetfulness.

Head injuries, strokes, epilepsy, brain infections such as encephalitis, or congenital conditions such as hydrocephalus (a build-up of fluid in the brain) can all lead to a significant loss in our ability to retain and recall information.

And recently, a new condition has been identified — severely deficient autobiographical memory — which describes a small percentage of the population who report a specific but marked impairment in the ability to recall their past.

These people are the exception though, and most people who worry about their memory have no real cause for concern. When it comes to remembering, we all have our own strengths and weaknesses.

The friend who gets top marks in every pub quiz may be the same one who always forgets where he left his wallet. And the partner who can describe last year’s holiday in incredible detail may take forever to learn a new language. In fact, even world memory champions report everyday forgetfulness, like losing their keys.

By and large, when our memory fails us, it is because we are tired, not paying attention, or trying to do too much at once. Using lists, diaries and smartphone reminders does not make memory less efficient – rather, it frees the brain up to do other things. And instead of making us lazy, looking something up on the internet can help to reinforce or enrich our knowledge base.

But there may be occasions when technology gets in the way – by distracting us from a potentially special moment, or luring us into surfing the web instead of getting much-needed sleep. Most everyday memory lapses can be fixed simply by being more mindful and less busy. So, if you want to remember times with friends, my advice is to enjoy the moment, chat about it afterwards and enjoy a good night’s sleep. THE CONVERSATION

  • The writer is a neuropsychologist at the University of Westminster

New technologies that could make your memory sharp again

This article is reprinted by permission from NextAvenue.org.

Are you constantly forgetting where you left your keys or noticing it takes you longer to remember names than it did a few years ago? You’re not alone.

More than half of adults begin to struggle with age-related memory issues by age 60, according to Harvard Medical School.

The good news is memory lapses are a normal part of the aging process — yet, that doesn’t make them any less frustrating. That’s why many research teams at renowned universities around the country are exploring how new technologies can play a role in slowing or reversing age-related memory loss.

Types of memory

Before understanding how these technologies might work, you must first know that there are two basic types of memory: short-term and long-term.

“Most people will say, ‘I can’t remember what I had for breakfast — there goes my short-term,’ or, ‘I can’t remember 10 years ago — there goes my long-term,’” says Dr. Harry D. Schneider, neurolinguistics consultant for the Brain Function Laboratory at Yale University School of Medicine. However, short-term memory, which makes up 50% to 75% of what we use every day, covers anything held for 30 seconds or less, Schneider says. Everything else is considered long-term memory.

“Memory begins with learning and sensing something — feeling, touching, hearing, smelling,” Schneider notes. That’s why short-term memory is also known as working memory. An example of this: recalling a seven-digit phone number that you read or heard.

On the other hand, long-term memory is anything longer than 30 seconds. Part of this is regular long-term recall, such as remembering names; and part of it is unconscious memory, or all the things you do automatically without thinking about them, like running or riding a bike. “If we had to consciously remember everything, we’d overload and stop functioning,” Schneider says.

Transcranial Magnetic Stimulation

In April 2019, Northwestern University released the results of a new study showing how one technology could help restore regular long-term recall memory in older adults to the level found in young adults. Led by Joel Voss, associate professor at Northwestern’s Feinberg School of Medicine, the study used a technique called Transcranial Magnetic Stimulation, or TMS.

A form of noninvasive brain stimulation, TMS works by sending electromagnetic pulses into specific areas of the brain. Voss’ team targeted a region of the brain called the hippocampus, which helps you recall two unrelated things at the same time — for example, where you left your cellphone and your neighbor’s name.

Because the hippocampus is located too deep within the brain to stimulate non-invasively with TMS, the researchers focused on an area of the parietal lobe, which communicates with the hippocampus. Think of it like “a highway to the hippocampus,” says Schneider.

As part of Voss’ study, 16 participants aged 64 to 80 who were experiencing age-related memory loss took a memory test that mimicked real-world problems. After undergoing five days of TMS treatments, they retook the test. The results: Participants’ long-term recall memory improved to the level of a young-adult cohort of the same size, showing an average test improvement of 30 percent post-TMS.

“[The results] certainly are promising, but we’re in the early stages of this kind of technology,” says Aneesha Nilakantan, a cognitive neuroscientist and data scientist on the Northwestern team. “A lot more research needs to be done, especially around what time is best to start TMS, and how long it will last.”

Alternative uses for TMS

Voss’ study was far from the first time TMS technology has been used to stimulate the brain, yet it is among the first studies focused on memory loss. Psychiatrists around the country have been using TMS to treat depression for several years.

Dr. Ryan Wakim, a board-certified psychiatrist and president and CEO of Transformations TMS centers in Pittsburgh and West Virginia, treats patients with depression using TMS. He does this using NeuroStar Advanced Therapy’s TMS Therapy System.

The Food and Drug Administration (FDA) approved TMS for treating depression in 2008, and for obsessive compulsive disorder in 2018. While those are the only current FDA on-label uses for the machine, Wakim sees great potential for treatments to expand in the future.

“For a while I said [TMS] is the future of psychiatry,” he says, “but I realized in the last 12 months that it’s really the future of the brain.”

Another technology for reversing memory loss

There is a slight risk — less than 0.1% — of seizures with TMS treatment. And certain factors can increase that risk, such as sleep deprivation, family history of seizures, alcohol use and a previous neurological condition, according to a 2015 article published in the journal Neuropsychiatric Disease and Treatment.

Schneider specializes in another type of noninvasive brain stimulation, called Transcranial Direct Current Stimulation, or tDCS, that has an even lower risk for seizures.

It differs from TMS in a few ways. One, it’s much more portable. The machine itself is about the size of a deck of cards, and it’s wired to two small sponge electrodes that you put on your scalp and cover up with a headband. Rather than being confined to a chair, as with TMS treatments, patients undergoing tDCS can put the gadget into a fanny pack and walk around.

Second, it works with a smaller dose of magnetism and electricity. “Imagine if you wanted to turn on a light switch, and the switch was the size of a grain of rice,” says Schneider. “You won’t feel anything, but it will begin to change some of the chemicals floating around in your brain.”

The biggest effect of tDCS is it begins to change the way your brain is wired, affecting its neuroplasticity. Think of it like a bypass, Schneider says. If one part of your brain is blocked and not working (i.e., having memory problems), tDCS can slowly build up a new neural pathway that bypasses that section.

Again, more research is needed, but there have been studies showing that strategically applied tDCS can temporarily improve thinking skills in healthy older adults, says Tracy Vannorsdall, a neuroscientist at Johns Hopkins Medicine in Baltimore.

However, “We are still working to determine what brain regions to target to optimize cognition in older adults, how frequently to apply tDCS and what cognitive training activities should accompany the stimulation,” Vannorsdall adds.

Non-tech ways to combat memory loss

If you’re experiencing some age-related memory loss but not ready to seek technology-based treatment (or it isn’t available in your area), there are other actions you can take. Here are a few tips from the experts for strengthening your memory:

  • Don’t stress. “When you get anxious and think your memory is going down the drain, it does,” Schneider says. It turns into a self-fulfilling, yet temporary cycle. So, chill out and realize this is normal.
  • Evaluate your diet. A healthy diet is important for maintaining good memory function. Also, a few studies have shown that restricting calories improves memory in older adults.
  • Exercise. Vigorous physical exercise, if you are able to do it safely, is the best thing you can do for your memory. Find a way to get 30 minutes of vigorous exercise a few days a week.
  • Try something new. Engaging in new activities that keep you socially and physically active, such as ballroom dancing or gardening clubs, is a great way to keep your brain sharp as you age.
  • Limit alcohol. Keep consumption to, say, half a glass of wine a day or less.
  • Learn another language. This is the single best thing for keeping your mind in tune, Schneider says, because it requires your conscious memory to work hard.

Kelsey Ogletree is a freelance writer based in Chicago covering travel, food, health and wellness for magazines like Shape, AARP, Architectural Digest and more.

By Helen Thomson

IS AN ostrich’s eye bigger than its brain? This kind of trivia question was once a cognitive workout, but when was the last time you really pondered a question, rather than simply turning to the internet for help? Then there are phone numbers and friends’ birthdays: information we once stored in our brain is now held in the smartphone in the palm of our hand.

Outsourcing memories, for instance to pad and paper, is nothing new, but it has become easier than ever to do so using external devices, leading some to wonder whether our memories are suffering as a result.

Probably the largest data dump is of snapshots of events, whether it is thousands of photos posted on social media or status updates documenting our lives. You might think that taking pictures and sharing stories helps you to preserve memories of events, but the opposite is true. When Diana Tamir at Princeton University and her colleagues sent people out on tours, those encouraged to take pictures actually had a poorer memory of the tour at a later date. “Creating a hard copy of an experience through media leaves only a diminished copy in our own heads,” she says.

People who rely on a satellite navigation system to get around are also worse at working out where they have been than those who use maps.

The mere expectation of information being at our fingertips seems to have an effect. When we think something can …

