Although I rarely have time to participate in the conversations (I have what is quaintly called a “day job”), I do read most of the comments posted to this blog. A few days ago, one commenter, in an aside to the point being made, suggested that the US should stop “wasting” money on space exploration.
I disagree, because I think the evidence is overwhelming that money spent on exploration and research is invested, not wasted. And the return on that investment has been impressive, as articles from Investopedia and elsewhere have documented.
Leaving aside the benefits that cannot be monetized–satisfaction of our human urge to explore, to understand, to seek out new life and new civilizations (okay, I’m a Star Trek fan)–here are just some of the very concrete returns on America’s investment in NASA:
Aircraft collision-avoidance systems
Cordless power tools
Corrosion resistant coatings for bridges
Digital imaging
Ear thermometers
GPS (Global Positioning System) enhancements
Household water filters
Hydroponic plant-growing systems
Implantable pacemakers
Infrared handheld cameras
Kidney dialysis machines
LASIK corrective eye surgery
Memory foam mattresses
Safety grooving on pavement
Scratch-resistant sunglasses
Shoe insoles
Virtual reality
Weather forecasting
Space exploration has also expanded human knowledge and contributed to research in education, healthcare, pollution control, rain forest protection and transportation. These and many other NASA-inspired advancements have a profound effect on life on Earth by improving health, safety, comfort and convenience. Entire industries have been built on space technology, including personal computers and natural resource mapping. As one of the nation’s strongest industries and an employer of nearly one million Americans, the aeronautics industry uses NASA-developed technology on nearly all aircraft.
These benefits have been produced by an agency with the smallest budget of any of the major agencies in the federal government. NASA’s share of total U.S. federal outlays has consistently remained below 1%, and during the past five years, closer to 0.5%. I think we get our money’s worth. We surely get more value per dollar than we get from our extravagant defense spending.
And unlike money spent on weapons, money invested in exploration enhances rather than degrades our humanity.
I teach an undergraduate course in Media and Public Affairs. It’s a challenging course to teach, because every year, the definition of “media” changes, and the erosion of the part of the profession called “journalism” becomes more pronounced.
In a recent New York Times column, written in the aftermath of the uprising at the University of Missouri (and the indefensible conduct of a journalism school adjunct professor during that uprising), Timothy Egan addressed the current environment:
I’d like to believe that this video snippet was just another absurdity of campus life, where the politics are so vicious, as they say, because the stakes are so small. But it goes to a more troubling trend — the diminishment of a healthy, professionally trained free press.
For some time now, it’s been open season on this beaten-down trade, from the left and the right. Into that vacuum have emerged powerful partisan voices, injecting rumors and outright lies into the public arena, with no consequence. At the same time, it’s become extremely difficult for reporters who adhere to higher standards to make a living. Poverty-level wages have become the norm at many a town’s lone nonpartisan media outlet.
More than 20,000 newsroom jobs have been lost in this country since 2001 — a work force drop of about 42 percent. The mean salary of reporters in 2013 was $44,360; journalists now earn less than the national average for all United States workers, according to the Bureau of Labor Statistics.
With the loss of the traditional business model, new media have emerged–providing celebrity gossip and “infotainment,” pandering to partisan loyalties and pre-existing prejudices, and–rather than competing to tell us what we need to know about our government and society–vying to see which words and phrases will trigger the most “clicks.”
As I told my students at the outset of the current semester, it is no longer possible to teach this course in the conventional way–a professor introducing students to a body of agreed-upon scholarship. Instead, the class has become a joint expedition into a wild and woolly information environment that evolves weekly–and a joint exploration of the ways in which the loss of that quaint thing we used to call “journalism” is affecting our ability to engage with each other in a democratic system.
How long can this continue before we no longer share a common vocabulary–or reality?
Just as there is a difference between job training and education, there’s a difference between intelligence and skill.
A recent DailyKos post by a neurologist disputed the notion that Ben Carson’s career as a neurosurgeon is, by itself, evidence that he is smart. The author distinguished between genuine intellect and technical skill.
“Smart” is a multifaceted cognitive feature composed of excellent analytical skills, possession of an extensive knowledge base that is easily and frequently augmented, possession of a good memory, and being readily curious about the world and willing, even eager, to reject previously accepted notions in the face of new data. Being smart includes having the ability to analyze new data for validity and, thinking creatively, draw new insights from existing common knowledge….
My point is that neurosurgeons are not automatically smart because they are a neurosurgeon. To get through training and have any sort of practice they must be disciplined, have immense ego strength, a reasonably good memory, and have mental and physical stamina. However, like many other doctors, they are not always smart. Neurosurgeons, like other surgeons, can be outstanding technicians but that is different than being intellectually brilliant. A truly brilliant internal medicine specialist once told me that “you can train anyone to perform a procedure”. I’ve seen surgical assistants, not doctors but physician’s assistants that specialize in surgery, perform technically difficult procedures with stunning alacrity. It’s the old rule: do something enough times and you will get damn good at it.
I thought about the difference between skill and intellect–both important, but not the same thing–when I heard Marco Rubio’s astonishing statement in the recent GOP debate that “Welders make more than philosophers. America needs more welders and less [sic] philosophers.”
Not only was Rubio wrong on the facts (philosophers actually earn more than welders), but think about what this sneering dismissal of the worth of intellectual pursuits tells us about his worldview. Clearly, Rubio (and apparently everyone else on that debate stage) evaluates the worth of any profession solely on the basis of what it pays. If welders did make more than philosophers, then welders would obviously be superior.
I’m a big fan of market economics, but the fact that the market rewards pornographers more than it rewards nurses doesn’t mean we need more pornographers and fewer nurses.
Let’s be clear: the skilled trades are important and honorable. But scholarship, research, scientific inquiry and yes, philosophy and theology, are essential to human progress. They also give our lives meaning and purpose.
Socrates–a philosopher–said the unexamined life is not worth living. No one on that debate stage appeared to understand that sentiment, let alone agree with it–and that is terrifying.
I’ve run across several columns and posts recently focused on a distinction–one that is gaining in importance–between ignorance and anti-knowledge, or what we might call intentional or stubborn ignorance. In the aftermath of yet another presidential debate, the distinction merits consideration.
As Lee McIntyre put it in last Sunday’s New York Times,
We’ve all heard the phrase “you’re entitled to your own opinion, but not your own facts.” Opinions are the sorts of things about which we can take a poll. They are sometimes well-informed, but rarely expected to be anything other than subjective. Facts, on the other hand, are “out there” in the world, separate from us, so it makes little sense to ask people what they think of them. As the comedian John Oliver so aptly put it… “You don’t need people’s opinion on a fact. You might as well have a poll asking: ‘Which number is bigger, 15 or 5?’ Or ‘Do owls exist’ or ‘Are there hats?’”
McIntyre distinguishes skepticism–withholding belief because the evidence does not live up to the standards of science–from denialism, the refusal to believe something even in the face of what most reasonable people would take to be compelling evidence.
At Dispatches from the Culture Wars, Ed Brayton has a similar rumination on the phenomenon he calls “virulent ignorance,” and quotes from an article by former congressional staffer Mike Lofgren:
Fifty years ago, if a person did not know who the prime minister of Great Britain was, what the conflict in Vietnam was about, or the barest rudiments of how a nuclear reaction worked, he would shrug his shoulders and move on. And if he didn’t bother to know those things, he was in all likelihood politically apathetic and confined his passionate arguing to topics like sports or the attributes of the opposite sex.
There were exceptions, like the Birchers’ theory that fluoridation was a monstrous communist conspiracy, but they were mostly confined to the fringes. Certainly, political candidates with national aspirations steered clear of such balderdash.
At present, however, a person can be blissfully ignorant of how to locate Kenya on a map, but know to a metaphysical certitude that Barack Obama was born there, because he learned it from Fox News. Likewise, he can be unable to differentiate a species from a phylum but be confident from viewing the 700 Club that evolution is “politically correct” hooey and that the earth is 6,000 years old….
Anti-knowledge is a subset of anti-intellectualism, and as Richard Hofstadter has pointed out, anti-intellectualism has been a recurrent feature in American life, generally rising and receding in synchronism with fundamentalist revivalism…
To a far greater degree than previous outbreaks, fundamentalism has merged its personnel, its policies, its tactics and its fate with a major American political party, the Republicans.
Buttressing this merger is a vast support structure of media, foundations, pressure groups and even a thriving cottage industry of fake historians and phony scientists. From Fox News to the Discovery Institute (which exists solely to “disprove” evolution), and from the Heritage Foundation (which propagandizes that tax cuts increase revenue despite massive empirical evidence to the contrary) to bogus “historians” like David Barton (who confected a fraudulent biography of a piously devout Thomas Jefferson that had to be withdrawn by the publisher), the anti-knowledge crowd has created an immense ecosystem of political disinformation.
I think it is this support structure that is most worrisome, because it enables what political psychologists call “confirmation bias,” the tendency we all share to look for evidence that confirms our pre-existing opinions.
Thanks to modern technologies, any crank or ideologue can create the “evidence” we desire–at least, if we aren’t too fussy about what constitutes evidence.
There’s nothing wrong with genuine ignorance; it can be corrected with credible information. Intentional, stubborn, “faith-based” ignorance, on the other hand, will destroy us.
Nothing causes Americans to clutch their pearls like arguments about religion. Let Starbucks omit snowflakes from their seasonal cups, and the fundamentalists are up in arms–they just know that those plain red cups are an attack on Jesus!
Is a failure to specifically endorse a religion (a la the offense of plain red cups and “Happy Holidays”) really equivalent to an attack? (And not so incidentally, don’t you people screaming about these assaults have lives to live and other things to do?)
Americans don’t agree on the definition of religion, let alone what constitutes an insult. What is the difference between a religion and a cult? Between religion and ideology? Are some religious beliefs better for society than others, and if so, which ones and why? We may not be able to answer these questions, but most of us seem firmly convinced that whatever it is, religion is good for us.
In the aftermath of the shooting at Umpqua Community College, for example, Fox host Bill O’Reilly cited weakening religion as the culprit. “As the world becomes more secular,” he declared, “civilized restraints to bad behavior drop.” Former Arkansas Gov. Mike Huckabee offered similar sentiments after the 2012 school shooting in Newtown, Conn., blaming such wanton violence on the fact that “we have systematically removed God from our schools.”
The theory is simple: If people become less religious, then society will decay. Crime will skyrocket, violence will rise, and once-civilized life will degenerate into immorality and depravity. It’s an old, widespread notion. And it’s demonstrably false.
If it were true that when belief in God weakens, societal well-being diminishes, then we should see abundant evidence for this. But we don’t. In fact, we find just the opposite: Those societies today that are the most religious — where faith in God is strong and religious participation is high — tend to have the highest violent crime rates, while those societies in which faith and church attendance are the weakest — the most secular societies — tend to have the lowest.
Sociologist Phil Zuckerman notes–quite properly–that correlation is not the same thing as causation. But the correlations are certainly striking:
According to the latest study from the Pew Research Center, the 10 states that report the highest levels of belief in God are Louisiana, Arkansas, Alabama, Mississippi, Georgia, South Carolina, North Carolina, Kentucky, Tennessee and Oklahoma (tied with Utah). The 10 states with the lowest levels of belief in God are Maine, Vermont, Connecticut, Rhode Island, New Hampshire, Massachusetts, New York, Alaska, Oregon and California. And as is the case in the rest of the world, when it comes to nearly all standard measures of societal health, including homicide rates, the least theistic states generally fare much better than the most theistic. Consider child-abuse fatality rates: Highly religious Mississippi’s is twice that of highly secular New Hampshire’s, and highly religious Kentucky’s is four times higher than highly secular Oregon’s.
Given the proclivity of self-proclaimed “Christians” to wax hysterical over the loss of snowflakes on a Starbucks cup, I think we might infer some measure of causation…