Just The Facts, Ma’am…

Shades of Joe Friday!

There really are incredible resources on the Internet. Granted, it can be hard to locate them in that ever-burgeoning sea of spin, propaganda and conspiracy theories, but they exist. Last week, I blogged about “ProCon,” a site that presents the arguments made by contending sides on so-called “hot button” issues.

Today, I want to highlight USAFacts, a site devoted to presenting a data-driven portrait of the American population, our government’s finances, and government’s impact on society.

We are a non-partisan, not-for-profit civic initiative and have no political agenda or commercial motive. We provide this information as a free public service and are committed to maintaining and expanding it in the future.

We rely exclusively on publicly available government data sources. We don’t make judgments or prescribe specific policies. Whether government money is spent wisely or not, whether our quality of life is improving or getting worse – that’s for you to decide. We hope to spur serious, reasoned, and informed debate on the purpose and functions of government. Such debate is vital to our democracy. We hope that USAFacts will make a modest contribution toward building consensus and finding solutions.

The site offers a brief description of its genesis:

USAFacts was inspired by a conversation Steve Ballmer had with his wife, Connie. She wanted him to get more involved in philanthropic work. He thought it made sense to first find out what government does with the money it raises. Where does the money come from and where is it spent? Whom does it serve? And most importantly, what are the outcomes?

With his business background, Steve searched for solid, reliable, impartial numbers to tell the story… but eventually realized he wasn’t going to find them. He put together a small team of people – economists, writers, researchers – and got to work.

Ultimately, Ballmer partnered in this effort with Stanford’s Institute for Economic Policy Research (SIEPR), the Penn Wharton Budget Model, and Lynchburg College.

The site does something I have long advocated–it collects numerical data about America’s state and federal governments that has previously been available only through a patchwork of reports and agency sites, and assembles it in a usable, comprehensible (and comprehensive) format. Want to know how much government collected in taxes last year and the year before? Where those dollars came from? What was spent? What a “trust fund” is and what its assets are? How many people work for government? What governments owe?

It’s all there, along with population demographics.

The charts are simple, the text understandable.

The next time one of the talking heads on cable makes an assertion about job creation under President X, or deficits amassed under President Y, the numbers can be checked in real time. (Like the t-shirt says, “Trust Data, Not Lore.”) (Star Trek fans will get that…)

These days, it sometimes seems as if partisans are uninterested in those pesky things we call facts; indeed, they seem to resent those of us who prefer to deal with accurate data. This site isn’t for them–but it is definitely a great resource for the rest of us!


Not a Pretty Picture

Vox recently reported on a research project that led to publication of a book titled “Everybody Lies,” by Seth Stephens-Davidowitz. If you are one of those people who suspect that humanity is filled with people who are anything but noble, it’s apparently the book for you.

Stephens-Davidowitz was working on a PhD in economics at Harvard when he became obsessed with Google Trends, a tool that tracks how frequently a given term is searched in a given area over a given time period.

He spent five years combing through this data. The idea was that you could get far better real-time information about what people are thinking by looking at Google Trends data than you could through polls or some other survey device.

It turns out he was right.
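For readers who want to poke at the same kind of data themselves, here is a minimal sketch using pytrends, an unofficial Python wrapper for the public Google Trends site. To be clear, this is not the tooling Stephens-Davidowitz used for the book; the search term and date range below are purely illustrative, and the wrapper can break whenever Google changes its endpoints.

```python
# Minimal sketch: pull Google Trends interest for one search term, over time
# and by U.S. state. Uses pytrends, an unofficial wrapper around the public
# Google Trends site; the term and timeframe below are illustrative only.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(
    ["self induced abortion"],          # a term discussed later in this post
    timeframe="2011-01-01 2016-12-31",  # example window
    geo="US",
)

# Search interest over time, scaled 0-100 relative to the period's peak.
over_time = pytrends.interest_over_time()
print(over_time.tail())

# Relative interest by state, handy for spotting regional concentration.
by_state = pytrends.interest_by_region(resolution="REGION")
print(by_state.sort_values("self induced abortion", ascending=False).head(10))
```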

Whatever face people assume when they are interacting with other humans, they are clearly far more candid with Google; during the 2016 Presidential campaign, Stephens-Davidowitz tallied numerous searches with racist epithets and “jokes,” finding that those spiked across the country during Trump’s primary run, and not merely in the South. The data painted a picture of a racially polarized electorate that responded to what he termed Trump’s “ethno-nationalist” rhetoric.

There were earlier signs, too. On Obama’s 2008 election night, Stephens-Davidowitz found that “one in every hundred Google searches that included the word ‘Obama’ also included ‘KKK’” or the n-word. Searches for racist websites like Stormfront also spiked.

“There was a darkness and hatred that was hidden from traditional sources,” Stephens-Davidowitz says. “Those searches are hard to reconcile with a society in which racism is a small factor.”

The Google search data didn’t just confirm the suspicions of many of us about the extent of racism (and the extent to which irrational hatred and opposition to Obama was based upon the color of his skin). One of the most startling findings was that America is returning to the era of “back alley” abortions–experiencing a crisis of self-induced abortions in places where draconian state laws have cut off most access to abortion clinics.

I’m pretty convinced that the United States has a self-induced abortion crisis right now based on the volume of search inquiries. I was blown away by how frequently people are searching for ways to do abortions themselves now. These searches are concentrated in parts of the country where it’s hard to get an abortion and they rose substantially when it became harder to get an abortion.

As the author notes, people share things with Google that they don’t tell anyone else, not even family members, close friends, anonymous surveys, or their doctors.

People feel very comfortable confessing things to Google. In general, Google tells us that people are different than they present themselves. One way they’re different, I have to say, is that they’re nastier and meaner than they often present themselves.

I’ve done a lot of research on racism, for example, and I was shocked by how frequently people make racist searches, particularly for jokes mocking African Americans. This concealed ugliness can predict a lot of behaviors, particularly in the political realm.

The data also sheds light on anti-Muslim attitudes.

One of the studies I talk about in the book is a study of Islamophobia. It’s not really Islamophobia, it’s like Islamo-rage, or something like that. It’s essentially people with horrifically violent thoughts toward Muslim Americans. People search things like “kill Muslims” or “I hate Muslims” or “Muslims are evil.” These people are basically maniacs and you can actually see minute-by-minute when these searches rise and when these searches fall.

What’s interesting about this is that we’re talking about a relatively small group of maniacs. The average American does not search “kill Muslims” or “I hate Muslims”; it’s a small group but it’s also an important group because these types of people can create a lot of problems. They are the ones who tend to commit hate crimes or even murder Muslims.

Clearly, Google and other emerging technologies can teach us a lot about ourselves. Of course, as the old joke goes, “This book taught me much that I did not wish to know.” The question is whether we can use these hitherto unavailable insights in ways that improve us. Given our irrational responses to data we already possess (responding to episodes of gun violence by advocating for more guns and less regulation, as we’ve just seen again, for example), it’s hard to be optimistic.


About That Minimum Wage Debate….

Who was it who coined the immortal observation that “It ain’t what we don’t know that hurts us–it’s what we know that just ain’t so”?

I thought about that when I read a recent report about job creation in states that had recently raised their minimum wage.

Economists at Goldman Sachs conducted a simple evaluation of the impact of these state minimum-wage increases. The researchers compared employment changes between December and January in the 13 states where the minimum wage increased with the changes in the remainder of the states, and found that the states where the minimum wage went up had faster employment growth than the states where the minimum wage remained at its 2013 level.

When we updated the GS analysis using additional employment data from the BLS, we saw the same pattern: employment growth was higher in states where the minimum wage went up. While this kind of simple exercise can’t establish causality, it does provide evidence against theoretical negative employment effects of minimum-wage increases.
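As a toy illustration of the comparison the report describes (average December-to-January employment growth in states that raised their minimum wage versus states that did not), here is a short Python sketch. Every number and column name in it is invented for the example; this is not the Goldman Sachs or BLS dataset.

```python
import pandas as pd

# Hypothetical data: one row per state, with December and January payroll
# employment (in thousands) and a flag for whether the minimum wage rose.
states = pd.DataFrame({
    "state":           ["A",    "B",    "C",    "D"],
    "emp_dec":         [2500.0, 1800.0, 3200.0, 950.0],
    "emp_jan":         [2520.0, 1812.0, 3214.0, 951.0],
    "raised_min_wage": [True,   True,   False,  False],
})

# Percent employment growth from December to January in each state.
states["growth_pct"] = (states["emp_jan"] / states["emp_dec"] - 1) * 100

# Average growth in each group of states. A simple difference in means like
# this says nothing about causality; it only checks whether the raw pattern
# fits the prediction that minimum-wage increases depress job growth.
print(states.groupby("raised_min_wage")["growth_pct"].mean())
```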

It has always seemed reasonable to assume that higher wages would depress job creation. What that simple logic missed, however, were the many factors other than wage rates that influence the decision whether to add employees. The cited study joins an overwhelming body of evidence that the simple equation is wrong.

It’s another one of those things we know that just ain’t so.


Criminal Justice by the Numbers: The Moneyball Approach

“The approach is simple. First, government needs to figure out what works. Second, government should fund what works. Then, it should stop funding what doesn’t work.”

That, in a nutshell, is Peter Orszag’s summary of a recent, detailed set of recommendations issued by the Brennan Center for Justice at NYU’s law school. He calls it “the Moneyball approach”–going by statistical evidence rather than gut impressions.

The Brennan Center’s proposal, Reforming Funding to Reduce Mass Incarceration, is a plan to link federal grant money to modern criminal justice goals – to use that grant money more strategically – to promote innovative crime-reduction policies nationwide and to reduce crime, incarceration and the costs of both.

The proposal, titled “Success-Oriented Funding,” would change the federal government’s $352 million Edward Byrne Memorial Justice Assistance Grant (JAG) Program by revising the criteria the government currently uses to determine whether a grant has been successful. (Fortunately, given the gridlock in Congress, it could be implemented by the DOJ – it wouldn’t require legislation.)

The fundamental premise of the program is that “what gets measured gets done.” If you are measuring the number of arrests, you’ll increase arrests. If you are measuring reductions in crime or recidivism, you’ll get reductions in crime and recidivism.

A blue-ribbon panel including prosecutors and defense lawyers, Republicans and Democrats, academics and officeholders has signed on to the proposal. As Orszag noted in the Foreword:

Based on rough calculations, less than $1 out of every $100 of government spending is backed by even the most basic evidence that the money is being spent wisely. With so little performance data, it is impossible to say how many of the programs are effective. The consequences of failing to measure the impact of so many of our government programs — and of sometimes ignoring the data even when we do measure them — go well beyond wasting scarce tax dollars. Every time a young person participates in a program that doesn’t work but could have participated in one that does, that represents a human cost. And failing to do any good is by no means the worst sin possible: Some state and federal dollars flow to programs that actually harm the people who participate in them.

Figuring out what we taxpayers are paying for, and whether what we are paying for works.

Evidence. What a novel idea!


Translating Anecdote into Data

There’s an old academic adage that reminds researchers “the plural of anecdote is not data.” It’s a worthwhile caution against drawing too broad a conclusion from one or two examples.

This caution came to mind last night at my grandson’s tenth birthday party. (This was the one for family–I am very grateful that my son and daughter-in-law separate the wild kids’ celebration from the more staid event for grandparents and aunts and uncles.) My brother-in-law said something about “those Republicans” in a way that made it clear he did not consider himself to be one of them. This from the person who is easily the most conservative member of our family.

Nor is ours a family that was considered “liberal” until relatively recently. My husband and I met when we served in a Republican Administration; my sister and brother-in-law were active Republicans (my sister was one of those good citizens who polled her neighborhood for the precinct committee person). Our daughter used to work for a group called Republicans for Choice (yes, Virginia, there really were pro-choice Republicans once upon a time); she now works for a group called Democrats for Education Reform.

Little by little, as the GOP became more and more extreme, more inhospitable to science, diversity and modernity, we left.

This is one family, one anecdote.

I wonder how widespread our experience is, and whether there’s any data to confirm a wider exodus.

Anyone know?
