When I come across pieces like this I try to break them down into their component parts.
Part 1. The author of the editorial, Neal Gabler, doesn't like the Atlantic's most recent cover story. A solid premise upon which to build an editorial. I dislike cover stories all the time. Insert lame joke about a skin mag I've never read here.
Apparently what happened was this. The Atlantic filed a writ of intent with the International Magazine Consortium, which regulates the content of all print media, informing them that their cover story for July would be about ideas. However, when it actually came time to write the story, the good folks at the Atlantic were dismayed to discover that there were no ideas to put in their story. Obviously, they couldn't change the title; the writ had already been filed, and everyone knows the IMC is brutal to those who don't fulfill writs of intent. You could get your status as a magazine revoked, meaning we'd have to henceforth refer to it as a zine called Atlantik! Or you could get a visit from the Jeeps. And we all know what that means. So there was no choice but for the beleaguered editors of the Atlantic to fill their article with idea-like food product and hope no one noticed.
Thank God for Gabler and his eagle eye.
Part 2. In the past, people had ideas, but they don't anymore. For proof, Gabler matches public intellectuals of the past to their soundbitiest ideas - Betty Friedan to the Feminine Mystique, Carl Sagan to...the Big Bang Theory. I'd say Stephen Hawking is turning over in his grave, but he'd need help. Sagan also died in 1996, so we're clearly not talking about the distant past. Gabler also points out that public intellectuals aren't invited on late-night TV anymore, which must be quite a blow to Stephen Colbert, as he must only have invited cancer biologist Ronald DePinho, philosopher Bernard-Henri Lévy, and computer-age sociologist Sherry Turkle on his show for their looks.
REALLY disappointed to find that Julie Taymor and Julie Newmar aren't the same person.
Look, I get it. There is a LOT of nonsense in the world. I just don't buy - have NEVER bought - the argument that there's more now than there was before.
Partly it's because I was a history major in college, which meant spending an unfortunate amount of time mucking around in the idiocies of the past. Example: medicine. In the last two hundred years we have made amazing strides in epidemiology, diagnosis, prophylactic care, hygiene, public health, and the treatment of disease. But that only came after around eighteen hundred years of bleeding sick people to let the humors out and packing wounds with poultices made of herbs, goose fat, and little replicas of dead saints.
Again, I'm not saying people were stupider then or are smarter now. There's plenty of evidence-based medicine among certain natural healing communities, and there are plenty of conclusions drawn by supposedly modern medicine that are spurious. Look at psychiatry. Fifty years ago homosexuality was a disease, and in the next edition of the DSM it's likely that Asperger's will be folded into a broader autism spectrum diagnosis because its criteria couldn't be reliably distinguished.
It's the nature of rational inquiry to always be shooting from a moving train, but that means you can usually only tell if you hit something by looking behind you. That's the real problem with assuming that the age of ideas is over: you can only recognize a great idea in hindsight. The same scientific culture that spawned Marie Curie spawned Franz Joseph Gall and his phrenology; the same political age that birthed Woodrow Wilson birthed William Jennings Bryan.
Looking at a handful of great, epoch-making, world-changing ideas as evidence for the existence of an age of titanic intellect is an example of the cognitive bias known as the availability heuristic: the belief that, because you know of a few powerful thoughts from a given time period, that time period must have been characterized by a preponderance of powerful thoughts (and, it goes without saying, a popular contempt for weak ones). The truth is we tend to remember the most sensational aspects of the past - the great ahead-of-their-time ideas and the hilariously dumb ones - and very little of the vast, vast middle. So the vast middle we live with TODAY looks like a new development, one which is naturally to be regretted.
I'm certain that, if you stopped an average person on the street in the 1920s, they'd have been as likely to be talking about who Clara Bow was dating and what Alice Roosevelt wore to a state dinner as we are to discuss the conjugal exploits of any one of the Kardashians. The difference, of course, is that we have the Internet, so rather than disappearing into the ether for contemplation only by aliens that want to dress up as our dad, now we have to live with these inane thoughts all the time. Which brings me to...
Part 3. We are apparently living in a 'post-idea' world. Now we're at the meat and potatoes. The post-idea world means that we have too much information to think about what we know. This is mostly the fault of the Internet, as well as the fact that, apparently, people have started making small talk, which they didn't do before.
As Gabler himself puts it: "We prefer knowing to thinking because knowing has more immediate value. It keeps us in the loop, keeps us connected to our friends and our cohort. Ideas are too airy, too impractical, too much work for too little reward. Few talk ideas. Everyone talks information, usually personal information. Where are you going? What are you doing? Whom are you seeing? These are today's big questions."
This distinction between soft, interpersonal data collection and hard-hitting, serious thinking will be familiar to anyone who has carried a working vagina into a scholastic environment at some point within the last 50 years. If we weren't so distracted by feelings, the narrative goes, we would have more ideas. And who's polluting us with these feelings? People who weren't part of the discussion, by and large, when the great ideas Gabler misses were being dreamed up.
Of course, the other side of this narrative is just as familiar - when you feel like your specialization is under siege from arrivistes, you dismiss their contributions and batten down your hatches - define the ideas whose passing you're lamenting as narrowly as possible. In the rest of the essay, Gabler goes on to discount the ideological contributions of a) scientists, b) people who make money, c) people who use Twitter, and d) people he, Gabler, has never heard of. This is convenient, as it means that anyone with an ounce of curiosity who follows ideas today, and can therefore come up with twenty names of great thinkers at the drop of a hat to disprove his supposition, can be dismissed outright.
I'm not sure what kind of thinker WOULD count for Gabler today - they'd have to come from a kibbutz on the dark side of the Moon where they hadn't been polluted with extraneous information from the last thirty years or so. I do know that the other benefit of this kind of battening is that it enforces artificial purity standards for ideas - because there are HUGE ideas that people talk about all the time.
Here's one - the Singularity. Basically, the idea that machine intelligence will eventually become so indistinguishable from (and then superior to) organic consciousness that the distinction between man and machine becomes meaningless. Do I think this is likely? No, but perhaps not less likely than proletarian revolts that lead to the creation of communist utopias. The point is, by any measure the Singularity meets the criteria of an Idea - it's a philosophical construct that allows for a radical re-understanding of the world. You can engage passionately with it, debate its merits, expend scholarly energy proving or disproving its tenets. And it simply could not have come into existence in a world without advanced computing and the Internet.
Who discusses the Singularity, though? Mostly, college students and conspiracy theorists - because the real benefit of artificial purity standards for ideas is that you can discount them based not on what they are but who they come from. If there's any commonality to the type of thinking that Gabler says is non-idea thinking, it's GENRE: Ideas are supposed to come from university-trained humanities overlords and are disseminated down a carefully ordained print hierarchy to a grateful public. If they come from elsewhere...you might as well show up to a semiotics conference with the Compleat Works of Anne McCaffrey.
If Gabler's arguing that THAT world of ideas - the one where the Idea producers never had to sully themselves with contact with the Idea consumers (sometimes even, Heaven forbid, in 140-character increments) - is gone for good, then I agree with him. But the world of ideas we have now is better. Madder, certainly - there are more people who can read, and who can write, than ever before, and the bandwidth for them to work out their cognitive processes is, in nearly every medium, cheap and practically infinite - and less hierarchical. But better. I can start off doing a Google search for Ric Ocasek's birthday and end up reading a monograph on string theory three hours later - and I, and everyone I know, and everyone you know, ARE thinking about the things we read, passionately. These are exciting times.
There was a great NPR article a few months ago that said there are two ways to cope with the fact that, no matter how hard you try, you can't engage with every idea you want to in your lifetime. Some people cull - meaning, decide sight unseen that some stuff is just not going to be worth their time - and some people surrender to the fact that they won't get to everything and just enjoy the ride.
I agree with that, but I'd replace cull and surrender with climb and surf. For climbers, knowledge exists in stacks, with some stuff at the bottom and other things higher up; the cream, it's assumed, rises to the top. Becoming well-educated should be a climbing process, starting at the bottom with vulgar, simple things and ascending higher and higher until you get to the really meaningful stuff. So when confronted with the idea that nothing, no matter how beautiful and sacred, will ever stay still, will ever be above reinterpretation or reevaluation as the needs of people and culture evolve, some people throw up their hands and say: this system is broken. Once upon a time it might have worked right, but not anymore. I can't climb it. I give up.
Or knowledge can be an ocean. Oceans are cooler than mountains. There are more animals, more secrets, more places to play. There's more motion, more danger, and no stability. You can drown in them, or you can float lazily. And if you cultivate the discipline, you can surf on them, and do the next best thing to flying.
Hard to sell, though, whereas half-baked lamentations for The Good Old Days will always grab an editor by the pocketbook.