CEO CORNER: THE PROLIFERATION OF COGNITIVE OUTSOURCING

By Jon Karafin

September 22, 2021

Reading Time:
15 Minutes

If you’ve been keeping up with this series, you’ve read a lot about how the human brain works (or, in so many cases, how it doesn’t work) as well as how the latest technology, far from enriching our lives and experiences, serves to distract us from what’s really important.

What have we learned?

  • The human brain lacks the capacity to adequately process all incoming sensory information.
  • Visual perception is severely limited, and human memories are unreliable.
  • The history of photography is littered with undistinguished snapshots of babies and pets—not to mention ill-advised selfies.
  • Believe it or not, cell-phone subscriptions now outnumber the world’s population.

It all boils down to this: we think that we are adequately interpreting and fully understanding the world around us without realizing how flawed our perception really is. Frankly, it’s amazing humans are able to survive at all.

In 2009, researchers at the University of California, San Diego calculated that American households consumed 3.6 zettabytes (not a made-up word! 1 zettabyte is 1,000,000,000 trillion bytes) of information every year. Source: UC San Diego [3]

If there’s any silver lining, it’s the fact that taking so many photographs is a great way to preserve our fading memories and false perception. Right? This one, unfortunately, may be the final nail in the coffin for us all.

As we’ve learned, humans today generate an unprecedented amount of data, but our brains lack the bandwidth to process it all.[1] Is it any wonder researchers have found people need new cognitive shortcuts to keep up with the quantity of information available?[2]

The desire to expand the brain’s capabilities beyond its current means is not new. It is human nature to seek out technology that may supplement cognitive function and memory. That is the impetus for technological invention: building new tools that reach beyond established limitations to achieve greater capacity. The process is rooted in the history of visual communication:[4]

  • Cave paintings allowed us to record events and share stories
  • The alphabet allowed us to share complex ideas with language
  • Manuscripts and books compiled the knowledge of a lifetime in volumes to be accessed by future generations
  • The printing press distributed the printed word to larger groups of readers
  • Broadcast television provided simultaneous access to visual information to mass audiences
  • The Internet makes the world’s information resources accessible from any device

So strong is our desire for help with massive amounts of sensory input that we place an unearned amount of trust in technology. The journal Science published a study demonstrating that when people are told a computer will store a given piece of information, they are significantly less likely to remember it for themselves.[5] Researchers call this cognitive offloading or cognitive outsourcing.

Marketers take advantage of cognitive offloading to help consumers make purchase decisions more quickly. Source: GreenBook Blog[6]

When a computer saves a piece of information,
you are significantly less likely to remember it.

Don’t believe it? How many people’s phone numbers have you memorized?

The average person today is likely to know their spouse’s number (that’s what the grocery club card is listed under, after all), the phone number of the home where they grew up, and … that’s it.

A decade ago, the answer was different. Before we could offload every number into a centralized address book, some of us memorized dozens of numbers—even the numbers of the pizza delivery services that we contact today over third-party smartphone apps.

The average person today doesn’t have
more than a few phone numbers memorized
due to “cognitive outsourcing” to their smartphones.

Studies demonstrate that the massive increase in the number of photos captured with our smart devices diminishes our ability to recall experiences, distracts us, and takes us out of the moment. As a science reporter for Vox warns, “constantly sharing photos may even be changing how we recall events in our own lives.”[7]



The increased quantity of photos we take
diminishes our memory and ability
to recall experiences.

This all goes back to attentional filters and the mechanics of memory formation. (Refer to “The Evolution of Visual Perception.”) In order to form any lasting memory, you need to pay attention, especially since the brain seems to have a hard processing limit of around 120 bits per second.[8]

Neurons are living cells that require oxygen and glucose to survive. Just like our bodies, our minds experience fatigue after working hard. Every trivial bit of information you consume—Facebook status updates, tweets and text messages, Instagram pics and TikTok clips—is competing for those same resources along with more important issues related to financial decisions, personal relationships, or the simple logistics of getting through everyday life.[9]

Scientists have shown the very act of photographing something focuses our limited attention onto the operation of the device, shifting it from the experience itself. You remember the act of taking the photograph, but if you’re not paying sufficient attention, you won’t recall the details of what was happening in the world outside of your device.

Researchers have called this the “photo-taking impairment effect.”[10] You treat your phone’s camera as a kind of Dropbox for your brain: when you take a photo of something, you’re counting on the camera to remember for you. You don’t engage in any of the elaborative or emotional kinds of processing that would really help you remember those experiences, because, as far as your brain is concerned, you’ve outsourced the job to your camera.[11]



Photographs today make the camera function like a Dropbox for your brain.

Researchers have further demonstrated that the intent to share an image on social media additionally alters our memories in a subtle but profound way. We are apparently more likely to remember the moment from a third-person perspective, which significantly decreases the intensity of emotion derived from an event.[12]

Photographs shared on social media alter memories so that moments are recalled from a third-person perspective, decreasing the intensity of emotion derived from events.

Although one may be attempting to expand one’s cognitive capabilities by documenting a moment with photographic permanence, researchers conclude that, because so little attention is paid to the experience and emotional memory of the captured events is impaired, we become less likely even to remember that the photos exist, meaning that most of the photographs are never looked at again.[13]



Most photographs on your smartphone are never looked at again.

Research does suggest contrary results in certain situations. When the camera is used as a tool for a task—for instance, extending visibility of an object beyond what the eye can naturally see (e.g., zooming in on something)—memory of that object can improve.[14] However, the improvement is limited to the examined object and does not extend to memory of the experience or surrounding events.

Self-proclaimed “foodies” (people who love to experience food) capture images of food and constantly share them to various social media websites. Interestingly, researchers determined that Instagramming your food can make it taste better, but if individuals exclusively photograph the food and not the people or the atmosphere of the restaurant, it may demonstrate a predisposition toward unhealthy behavior around food.[15]



Posting images of food on social media repeatedly may be a sign of behavioral issues.

When people interact with or think about their devices, their attention for given tasks is divided and cognitive performance may suffer. Studies have shown that use of these devices can create a “gravitational pull on the orientation of attention”:

Results from two experiments indicate that even when people are successful at maintaining sustained attention—as when avoiding the temptation to check their phones—the mere presence of these devices reduces available cognitive capacity. Moreover, these cognitive costs are highest for those highest in smartphone dependence.[16]

Researchers say these cognitive impairments are similar to those caused by other diversions of attention unrelated to photos and devices. However, because personal devices divert us so much more frequently, they create an unusually potent pull on our attention.

Some of the research really is shocking. Take, for example, the conclusion that merely having a device or smartphone out on a desk, without touching or receiving any communications from it, may lead to statistically significant impairment of cognitive capacity, as though you haven’t been getting enough sleep.

Researchers found that the mere presence of a smartphone reduces available cognitive capacity.[17]



It turns out that the impact on cognitive performance scales with how close the device is to an individual. When smartphones are placed in a separate room, regardless of whether they are on or off, cognitive performance suffers less than when the phone is nearby.[18]

The mere presence of your smartphone leads to decreased cognitive capacity on par with the effects of sleep deprivation.

Cognitive psychologists say it’s natural to pay attention to things that are habitually relevant. You are likely to turn your head if you hear someone call your name, or if your child starts crying, whether or not you are otherwise preoccupied. But the pull on our attention created by devices is so strong that it has created a new kind of false sensory response called a “phantom buzz,” where we imagine that our phones have vibrated. That’s right: Even if we successfully ignore our devices, our anxiety over missing some crucial new bit of information continues to impair our cognitive abilities.[19]

Let’s set aside our addiction to the steady drip-drip-drip of texts, dystopian news headlines and social media morsels. Even if you’re reading the collected works of Shakespeare on your phone, studies show that screen time itself directly impacts our mental and physical health. Increased screen time has been linked to decreased sleep duration and efficiency. Studies suggest the blue light emitted by displays can disrupt the body’s natural production of melatonin, a hormone that controls circadian rhythms and makes it easier to fall asleep.[20]



The blue light from displays can disrupt the body’s natural circadian rhythms and sleeping patterns.

Further, too much screen time has been linked to an increased likelihood of obesity, mainly due to the sedentary behavior associated with electronic activity. Research has shown that reducing screen time may be an effective strategy for reducing the incidence of obesity.[21] Other studies suggest one link between TV and obesity may be the amount of advertising related to unhealthy foods. In one study, children watching cartoons with food commercials ate 45% more unhealthy snacks than a control group that did not see those commercials.[22]



Increased screen time is linked to increased obesity rates.

The amount of screen time has also been linked to anxiety and depression. Spending six hours or more per day in front of displays is statistically correlated with increased chances of moderate to severe depression in adults. One study found that children who watch three hours of TV per day are three times as likely to experience a language delay and score lower on school readiness tests.[23] Another study examined the connection between language, visual and cognitive control regions of the brain, concluding that children’s brain connectivity is increased by time spent reading books and decreased by time spent with screen-based media.[24]

Adults watching television for more than six hours daily are more likely to suffer from moderate to severe depression.

Who’s ready to watch some TV???

In all seriousness, if you’ve been following this entire series, you may be wondering by now exactly what all of this has to do with holograms in general, and light fields in particular.

Despite our current condition of information overload, our emotions, experiences and memories remain intimately tied to our sense of vision.

All of human knowledge is informed by what we see. Human vision has evolved over millions of years to better inform our perceptions of and interactions with the universe around us. It is also the main way we interface with our devices.

The world we see is largely based on sense perception, and sight is our primary input. Everything around us is visible as a collection of light energy that provides sensory data through our eyes to be processed by the visual cortex of the brain. The light field defines how photons travel through space and interact with materials. What we ultimately see and understand as the world around us is bundles of light that focus at the back of our eyes.
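For readers who want a slightly more formal picture, the light-field literature (not this article) commonly formalizes this idea as the plenoptic function, which gives the radiance carried by light at every position in space and in every direction:

\[
L = L(x,\ y,\ z,\ \theta,\ \phi)
\]

Here (x, y, z) is a point in space and (θ, φ) is the direction of travel. A conventional photograph collapses this five-dimensional function into a flat, two-dimensional slice, which is precisely the loss of directional and depth information discussed throughout this series.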

As a result, we share a deep psychological need to communicate visually the same way we interact in the real world. From simple, monochromatic cave drawings we have developed still photography and motion pictures that we view on displays with billions of colors and constantly increasing resolution. And, in nearly every decade of the past century, we have seen the rise and fall of immersive display technologies that attempt to replicate the realistic holographic visions portrayed in science fiction.

Figure 1: A brief history of VR technology.[25]

Surprisingly few studies have attempted to compare memory performance between real objects and 2D representations. However, the research that does exist clearly demonstrates that memory performance improves in the presence of real objects as compared to 2D images, including both color photographs and black-and-white line drawings, even with stimuli closely matched across conditions for size, orientation, and illumination. “Our data suggest that real objects are more memorable than pictorial stimuli,” the researchers concluded.[26]

Real objects, and realistic representations of them such as holograms, demonstrate increased cognitive and memory performance vs. all forms of 2D displays.

Let’s revisit what we have learned on this journey that brings us to the above conclusion:

  1. Humans value the concept of immortality. The realistic path to immortality is persistence in the memories of the living.
  2. Human memory is imperfect. Our memories can fail because of inadequacies in our cognitive capabilities or be influenced by the settings in which they are created, things that happen to us subsequent to the event being remembered, and by the various cognitive processes that store and access human memories. “Although our cognition allows us to attend to, rehearse, and organize information, cognition may also lead to distortions and errors in our judgments and our behaviors.”[27]
  3. The human brain evolved over millions of years to focus on only the most crucial information contained within the firehose of data—an estimated 11 million bits per second—that human sensory receptors are sending for processing.[28] Our brains have evolved to prioritize an amount of information from the visual system that is an order of magnitude greater than that delivered by the other four traditionally defined senses combined.
  4. Although neurobiologists do not yet agree how many systems are involved in analyzing visual information, it appears that motion, direction, borders, lines, contours, depth and brightness are processed in a separate visual system from color, form and detail.[29] How many additional subdivisions there may be is still a subject for debate, with some researchers concluding there are three or four visual systems (or possibly more).
  5. These visual systems need to work together to provide an accurate visual representation of the world. However, visual conflicts between the systems may occur and, if not all of the visual pathways are leveraged, the visual perception of a particular object will not be experienced the same way as when all visual pathways are leveraged.
  6. According to the UN, sometime around 2015 we reached an inflection point where the number of active cellphone subscriptions worldwide grew larger than the number of people on this planet![30]
  7. Given this massive growth, nearly 9 trillion images were stored by the end of 2020. And nearly 20% of all photos ever taken, throughout all of time, were captured in 2020.
  8. Unlike any other time in history, we spend most of our waking hours looking at flat 2D displays and devices.[31]
  9. Numerous studies demonstrate that the massive increase in photos captured with our smart devices “actually diminishes our ability to recall experiences, diverts our attention, and takes us out of the moment. Constantly sharing photos may even be changing how we recall events in our own lives.”[32]
  10. What’s most disturbing about these cognitive impairments is their frequency; the omnipresence of digital devices helps them constitute “a particularly potent draw on the orientation of attention.”[33]
  11. Too much screen time leads to significant developmental, mental and physical health issues.
  12. Most importantly, looking at pictures of infants and baby animals releases hormones in the pleasure center, similar to the brain’s dopamine response to sex, music, love, and drugs… (OK, ignore this one. Sorta.)

Are you seeing it yet?

Our brains evolved over millions of years to form multiple visual neural pathways. We have come to understand the world around us by interpreting these visual impulses, and the human visuomotor system has evolved largely to perceive and interact with real objects and environments, not images.

The human visuomotor system has evolved
to perceive and interact with real objects, not images.

Only in recent history have we been overwhelmed by forms of visual data that are flat and two-dimensional. Over the other millions of years of our brains’ evolution, our perception of the world, and our stored memories, have been created out of visual input that included cues regarding depth, motion parallax, binocular disparity, shape, surface texture, specular highlights, perspective, and many others, each engaging separate neural pathways in the visual cortex. Conversely, when we view a monoscopic representation of an object, we experience the stimulus as flat, which has been shown to disrupt object recognition, exciting only a subset of neurons in an already bandwidth-limited attentional system.[34]

Flat objects disrupt object recognition
and only excite a subset of neural pathways.

Unlike flat displays, real objects uniquely affect neural responses pertaining to potential action, such as grasping and manipulation. For example, the superior parieto-occipital cortex responds differently if a graspable object is within reach, regardless of whether the grasp is planned or executed. It’s thought that the brain’s perception of potential interaction with a real object may strengthen memories at the time they’re encoded.[35]

Researchers found recall and recognition of real objects was significantly better than of matched color photos of the same items.[36]



Real objects have actual size, distance and position relative to an observer, while images only have an expected size based on our experience with similar visual data.

“Knowing the size, distance and location of a stimulus has consequences for the way in which it is perceived, and this shapes future neural processing for cognition, action and memory.”[37]

The potential for real-object motor interaction strengthens memories by enhancing the depth of processing when memories are encoded.

In other words, real objects, and realistic representations of them, are remembered in a way flat images are not: our brains are wired from millions of years of evolution to leverage as many neural pathways as possible for real objects, while flat images engage only a subset of the neural pathways available to encode a lasting memory.

Additional studies indicated that depth information from binocular disparity cues—such as those provided by autostereoscopic displays, VR and AR HMDs, and glasses-dependent 3D screens—may have a modest effect on the amount of time it takes to recognize an object, but they did not demonstrate any cognitive memory performance increase over 2D images. Some studies found that cognitive performance actually declines when observers move from 2D to 3D viewing conditions.

Stereoscopic display evaluations did not demonstrate any cognitive memory performance increase over 2D counterpart display studies.

“Real objects may be more memorable because they more strongly activate dorsal stream regions at encoding, perhaps promoting deeper processing and superior memory. In other words, real objects may have a memory benefit due to embodied cognition.”[38]

What does this mean in a commercial sense? There are clear implications: Controlled studies found that consumers are willing to pay as much as 60% more for goods that were displayed to them as real objects, compared to flat, two-dimensional representations of the same objects.[39]

Studies of rats placed into virtual-reality mazes confirmed that although the rats were successful at navigating the environments, their neural activity was highly abnormal. UCLA researchers found that more than half of all neurons in the hippocampus shut down in virtual reality. The remaining neurons fired in a disordered pattern UCLA neurophysicist Mayank Mehta described as “alarmingly different” from the activity rats display while navigating a real maze, demonstrating that the brain requires multiple types of sensory input to construct a functioning spatial map.

As it happens, rodents and humans process space in similar ways. A VR headset can make it look like you are moving in space, but if other sensory input doesn’t match, neural activity seems to become strange. “VR breaks the laws of physics,” Mehta said. Because VR is essentially a visual illusion that violates the otherwise consistent relationships between different sensory stimuli, it creates abnormal patterns of brain activity.[40]

The neural activity in VR test subjects was “alarmingly different.”

Well then. If smartphone cameras actually impair our ability to remember things, screen time robs us of sleep and fattens our bodies, and VR tricks our brain into freaking out completely, does the last century-plus of advances in photographic imaging portend anything but a true dystopia?

I’m glad you asked.

The Holographic Future

Our 2D devices are a cognitive crutch. Our brain outsources information that we believe will be accessible in the future to these devices, thereby limiting the neural connections necessary to create new memories.

Even more frightening, the very act of recalling the memory can corrupt its accuracy.

We can document every moment with a photograph, but that may be even worse: the act of viewing the image essentially overwrites what remains of the existing memory, slightly altering the encoded information every time.

At the same time, we have become dependent to the point of distraction on screens and smart devices. The mere presence of our smartphones can adversely affect our ability to think and problem-solve—even when we aren’t using them. Even when we aren’t looking at them. And even when they are powered off altogether.

Displays are now an extension of our bodies, providing capabilities previously thought limited to science fiction. There is no closer tie to emotion, experience and memory than our sense of vision.

But our brains know the difference between flat photographic visual representations and the real thing. In fact, both holograms and real objects increase your cognitive capacity, unlike 2D displays and devices.

Representations of real objects further strengthen the neural encoding of memories, encouraging deeper processing and superior recollection.

These findings have commercial implications, as well. Controlled studies demonstrate that consumers are willing to pay as much as 60% more for visuals displayed as real objects vs. flat 2D representations of the same object.[41]

Our smartphones are flat displays that are damaging our cognitive capabilities as our addiction to them grows stronger.

But it won’t always be this way. There is an exciting journey ahead of us.

This is the future of perception, cognition, and sensory engagement—where dynamic light objects become a new reality and our visual memories are immortalized.

Light is our medium. Space is our canvas.

We are building the holographic future together.

[1] https://www.fastcompany.com/3051417/why-its-so-hard-to-pay-attention-explained-by-science

[2] https://www.vox.com/science-and-health/2018/3/28/17054848/smartphones-photos-memory-research-psychology-attention

[3] https://qi.ucsd.edu/news-article.php?id=1630

[4] https://www.historyofvisualcommunication.com/

[5] https://www.science.org/doi/full/10.1126/science.1207745

[6] https://www.greenbook.org/mr/market-research-technology/offloading-cognition-the-new-behavioral-science-approach-to-winning-customers/

[7] https://www.vox.com/science-and-health/2018/3/28/17054848/smartphones-photos-memory-research-psychology-attention

[8] https://www.vox.com/science-and-health/2018/3/28/17054848/smartphones-photos-memory-research-psychology-attention

[9] https://www.fastcompany.com/3051417/why-its-so-hard-to-pay-attention-explained-by-science

[10] https://journals.sagepub.com/doi/abs/10.1177/0956797613504438

[11] https://medium.com/thrive-global/does-taking-pictures-on-our-smartphones-have-any-effect-on-our-memory-1a9ee03adbf8

[12] https://www.vox.com/science-and-health/2018/3/28/17054848/smartphones-photos-memory-research-psychology-attention

[13] https://blogs.scientificamerican.com/mind-guest-blog/how-well-can-we-remember-someone-s-life-after-they-die/

[14] https://www.vox.com/science-and-health/2018/3/28/17054848/smartphones-photos-memory-research-psychology-attention

[15] https://www.cheatsheet.com/gear-style/reasons-psychologists-warn-not-to-take-so-many-photos.html/

[16] https://www.journals.uchicago.edu/doi/full/10.1086/691462

[17] https://www.hendrix.edu/uploadedFiles/Academics/Faculty_Resources/Teaching_and_Learning/Presence-Smartphone-reduces-cognitive-capacity.pdf

[18] https://hbr.org/2018/03/having-your-smartphone-nearby-takes-a-toll-on-your-thinking

[19] https://hbr.org/2018/03/having-your-smartphone-nearby-takes-a-toll-on-your-thinking

[20] https://en.wikipedia.org/wiki/Screen_time

[21] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1906831/

[22] https://www.hsph.harvard.edu/obesity-prevention-source/obesity-causes/television-and-sedentary-behavior-and-obesity/

[23] https://en.wikipedia.org/wiki/Screen_time

[24] https://onlinelibrary.wiley.com/doi/abs/10.1111/apa.14176

[25] https://www.behance.net/gallery/23128577/information-Visualization

[26] https://doi.org/10.3389/fnhum.2014.00837

[27] https://2012books.lardbucket.org/books/beginning-psychology/s12-03-accuracy-and-inaccuracy-in-mem.html

[28] https://www.britannica.com/science/information-theory/Physiology

[29] https://www.discovermagazine.com/mind/the-vision-thing-mainly-in-the-brain

[30] https://qz.com/1608103/there-are-now-more-cellphones-than-people-in-the-world/, https://datareportal.com/reports/digital-2019-global-digital-overview, https://venturebeat.com/2018/09/11/newzoo-smartphone-users-will-top-3-billion-in-2018-hit-3-8-billion-by-2021/, https://www2.deloitte.com/content/dam/Deloitte/us/Documents/technology-media-telecommunications/us-global-mobile-consumer-survey-second-edition.pdf, https://www.itu.int/en/ITU-D/Statistics/Pages/stat/default.aspx

[31] https://www.marketwatch.com/story/people-are-spending-most-of-their-waking-hours-staring-at-screens-2018-08-01, https://www.forbes.com/sites/nicolefisher/2019/01/24/how-much-time-americans-spend-in-front-of-screens-will-terrify-you/#5d9a52f01c67

[32] https://www.vox.com/science-and-health/2018/3/28/17054848/smartphones-photos-memory-research-psychology-attention

[33] https://www.journals.uchicago.edu/doi/full/10.1086/691462

[34] https://doi.org/10.3389/fnhum.2014.00837

[35] https://doi.org/10.3389/fnhum.2014.00837

[36] https://doi.org/10.3389/fnhum.2014.00837

[37] https://doi.org/10.3389/fnhum.2014.00837

[38] https://doi.org/10.3389/fnhum.2014.00837

[39] https://www.aeaweb.org/articles?id=10.1257/aer.100.4.1556

[40] http://newsroom.ucla.edu/stories/ucla-researcher-uses-virtual-reality-to-understand-how-animals-perceive-space

[41] https://doi.org/10.3389/fnhum.2014.00837