Finding ‘The Big Idea’: It’s Still Possible

by Caitlin Gilbert | Georgetown University

Posted in: Sports and Culture, Voices | Posted on: July 20, 2012

When I read Neal Gabler’s New York Times essay, “The Elusive Big Idea,” last year, I was terrified. In the essay, Gabler argues that the climate of our society is not conducive to generating “big ideas”: Einstein’s relativity, Henry Ford’s assembly line, and Betty Friedan’s The Feminine Mystique, to name a few. He claims that the advent of social media and the Internet has made it virtually impossible to analyze and think critically about the surplus of information that is instantly and constantly available to us, thereby preventing the kind of insight that “big ideas” require:

“The collection itself is exhausting: what each of our friends is doing at that particular moment and then the next moment and the next one; who Jennifer Aniston is dating right now; which video is going viral on YouTube this hour; what Princess Letizia or Kate Middleton is wearing that day. In effect, we are living within the nimbus of an informational Gresham’s law in which trivial information pushes out significant information, but it is also an ideational Gresham’s law in which information, trivial or not, pushes out ideas,” Gabler wrote.

The passage felt particularly ominous as I looked around the college library floor where I sat and found every computer screen in sight bearing a newsfeed or a stream of tweets. Gabler even suggests that we have entered a “post-idea” world, in which thinking itself ceases to exist. In the year since reading his essay, I have tenaciously held on to the hope that he is wrong.

Specifically, Gabler cited The Atlantic’s “14 Biggest Ideas of the Year” as suggestive of this ostensibly apocalyptic trend. The 2011 “ideas” were undoubtedly more observational than inspirational, yet, in this year’s list, published last month, the “ideas” have graduated from facts to opinions, the most striking of which include “Lotteries for College Admissions,” “Bankers Should Be Boring,” and the emphatic finale, “Let’s Cool It With The Big Ideas.” Either Gabler has been proven wrong, or The Atlantic was just as threatened by his piece as I was.

The annual Aspen Ideas Festival, held a few weeks ago and co-sponsored by The Atlantic and the Aspen Institute, provided another glimmer of hope. The festival is centered on discussion, which — if the last two thousand years of human history have taught us anything — generates great ideas. However, there was an obvious distinction between the high-brow panels bearing profound titles like “Will They Trust Us Again?,” “Can Character Be Taught?” and “What Is Life Worth?” and the presentations in the “Designs for the Future” series: whereas the former involved some of the leading experts in policy, science, education, and the arts analyzing issues central to the Western world, the latter embodied pure and beautiful innovation.

Small teams of students from Stanford University’s design school (better known as the “d.school”) pitched their exciting, high-impact projects, and the three winning teams showcased their work at the festival. One group updated the WWII-era method of delivering intravenous fluids to soldiers in the field with a simpler, faster, and more accurate model they dubbed “Combat IV”; the new model can be applied to any setting where IV fluids must be administered efficiently, be it a war zone, a city after a natural disaster, or even an ambulance on a college campus. Another group harnessed the power of social media to develop a nonpartisan online platform, Social Teeth, which makes policy advocacy accessible to the layperson: imagine a working mother of three spearheading a national advertising campaign for healthy eating habits.

So why weren’t the leading experts at the festival creating these ideas? Logic would suggest that years of expertise would leave an individual with the knowledge to do so. The human brain, however, does not follow simple logic.

Neuroscience and psychology research from the past few years tells us that creativity is driven by a few key factors. First, diversity of perspective gives the brain raw material with which to be creative. In 2009, researchers at the Kellogg School of Management found that adding just one new member to a group improved the group’s performance on a problem-solving task.

Whereas the Aspen experts were specialists in a particular field (most panels at the festival involved experts from within the same discipline), the student teams were interdisciplinary: the Combat IV team, for instance, included medical students, business school students, and Iraq War veterans. While insight is an internal process, implementation (of the “big idea” variety) requires external perspectives. The more perspectives there are, the better and more nuanced the idea is likely to be.

Second, creativity requires risk. Although most people are quick to endorse “creativity” as an abstract notion, few are actually creative. The irony of a panel on creativity coexisting with teams of students who actually presented creative work was all too noticeable. Researchers from the University of Pennsylvania, Cornell University, and the University of North Carolina at Chapel Hill recently published findings that people subconsciously associate creative ideas with negative words like “vomit,” “poison,” and “agony,” while ascribing a more positive connotation to practical ideas. The uncertainty inherent in creative ideas drives these biases, even when there is evidence supporting the ideas’ plausibility. Metacognitively, knowing that we harbor these biases allows us to work against them more effectively; taking risks is a cognitive effort like any other.

Especially in the information-inundated world Gabler describes, creativity’s third ingredient, space, can be difficult to attain. To be creative, the brain needs the processing capacity to synthesize this surplus of information, and switching from absorb mode to analyze mode can only happen if we allow it to. Such a mental break can last just a few seconds (pausing to think about a paragraph in an online article, for example) or stretch to hours, as in meditation.

Researchers at Leiden University in the Netherlands compared the convergent thinking (finding common links) and divergent thinking (developing possibilities) of different types of meditators: open-monitoring meditators, who did not fixate on one particular thing, were better at divergent thinking than focused-attention meditators, who mentally fixated on a single object. While meditation might not always be practical, deliberately stepping back from the addictive habit of constant information intake gives creativity the room it needs.

College students, like the young members of Stanford’s d.school, have the ability to be just as creative, if not more so. Arguably, we have even more creative potential, since we tend to have a better grasp of the tools available for creativity (like social media) than older generations do. The missing link, then, is the conscious effort to combine the requisite perspective, risk, and space in order to act on that potential.

Gabler argues that our generation’s characteristic “information overload” prevents us from generating “big ideas.”

The term “information overload” was actually coined by Alvin Toffler in his book Future Shock,* to describe what he saw as an overwhelming flood of information, which was, in his opinion, a major contributor to “shattering stress and disorientation.” One might assume this was simply an apt description of today’s 24/7 media platforms and the cumulative effect of YouTube, Facebook, Twitter and the countless news channels with their split screen formats blaring above the ubiquitous crawl at the bottom of the screen. In fact, Toffler coined this phrase over 40 years ago. That’s 20 years before the explosion of cable, the rise of the Internet and the advent of the iPhone and BlackBerry.

Each generation has its own version of “information overload” (more entertainingly known as “infobesity”). Even in the 1st century AD, Seneca bemoaned that “the abundance of books is distraction.” The modern equivalent of “books” is the Internet, and the next technological boom will bring a new medium to complain about. Gabler, like the Senecas before him, can only be proven wrong by the progressive thinking and innovation that has shaped, and will continue to shape, human history. The onus of this challenge falls squarely on our shoulders: it’s time to get creative.


*As a fun addendum, check out Orson Welles narrating this obscure documentary based on Toffler’s book — the paranoia reaches humorously hyperbolic levels.

Caitlin is a senior Neurobiology major at Georgetown University, where she also minors in Linguistics. In addition to being a “Voices” contributor for NextGen Journal, she is a research assistant in a developmental neurobiology lab at Georgetown. She thoroughly enjoys working towards a dual career in research and medicine, especially when every teacher, peer, and friend suggests automobile insurance as a better occupation.
