Conducting Cost-Benefit Analysis: The role of assumptions

I am currently working on a cost-benefit analysis of water quality in Ohio. After spending two weeks preparing and thinking about this analysis, I am starting to assemble some preliminary models. This step is often known as projecting the outcomes and is almost always the most difficult part of a cost-benefit analysis.

One of the main challenges when trying to project outcomes is fully understanding what sort of assumptions we as analysts have to make in order to make projections at all. Our assumptions define how we think about our analysis and they shape our results. Assumptions cannot be avoided in any sort of analysis, but as analysts it is our job to be fully aware of which ones we make. 

Generally speaking, each additional assumption makes the analysis easier to perform and to understand, but harder to generalize. A common misconception is that stronger assumptions make an analysis less accurate or less meaningful; in practice, making additional assumptions sometimes actually improves our models.

This tradeoff is extremely common and is even described mathematically by the bias-variance tradeoff. In statistical terms, we evaluate estimators using a criterion called mean squared error, which can be written as the sum of an estimator's variance and its squared bias: MSE = variance + bias². Decreasing one of those quantities often means increasing the other. In our context, adding more assumptions simplifies the models and reduces the variance, but because we are assuming more things, we may be adding some bias to our results. It is okay to keep some bias, especially if we understand that it exists and it lowers the variance of our estimate enough to make the estimate more useful.
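As a toy illustration of this tradeoff (a simulation sketch, not anything from the actual water quality analysis), consider estimating a mean from small samples. A "biased" estimator that shrinks the sample mean toward zero embeds an assumption that the truth is near zero, and in noisy settings that assumption can lower mean squared error overall:

```python
import numpy as np

rng = np.random.default_rng(0)
true_mean = 2.0
n, trials = 10, 20_000

# Many small, noisy samples drawn around the true mean.
samples = rng.normal(loc=true_mean, scale=5.0, size=(trials, n))

# Unbiased estimator: the plain sample mean.
unbiased = samples.mean(axis=1)

# Biased estimator: shrink the sample mean halfway toward zero.
# This "assumes" the truth is near zero, trading bias for lower variance.
shrunk = 0.5 * samples.mean(axis=1)

mse_unbiased = np.mean((unbiased - true_mean) ** 2)  # ~ variance = 2.5
mse_shrunk = np.mean((shrunk - true_mean) ** 2)      # ~ 0.625 + 1.0 = 1.625

print(f"MSE of unbiased estimator: {mse_unbiased:.3f}")
print(f"MSE of shrunken estimator: {mse_shrunk:.3f}")
```

Even though the shrunken estimator is biased, its much lower variance gives it a smaller mean squared error here, which is exactly the sense in which an extra assumption can make an estimate more useful.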

At the beginning of my model building, I chose to make as many assumptions as I had to in order to get some preliminary results. It may not make sense to assume that certain things will remain constant well into the future, but because this is an iterative process, it is important to get a working model to build off of. Building a simple model can give you a general idea of potential results and an answer you can then refine.

Making assumptions is also useful because it helps you identify what you need to learn more about. After building all of the first models, it is important to ask which assumptions you can potentially get rid of. Does removing these assumptions make the model sufficiently more generalizable?

As analysts, it is our job to make clear that predicting the future is extremely difficult. I once heard someone say that making predictions is like trying to drive a car by only looking in the rearview mirror. Sometimes trying to work with real data forces us into making strong assumptions. Still, by using the best available methods and fully understanding the ramifications of the assumptions we make, we can help policy makers decide on the best course of action with better information than they would have on their own.

This blog post is part of a series of posts on conducting cost-benefit analysis for newcomers by Scioto Analysis Policy Analyst Michael Hartnett.

Is it time for all Ohio children to get free meals at schools?

Last month, the Ohio State Board of Education officially recommended that the state of Ohio use American Rescue Plan Act dollars to provide free breakfast and lunch to Ohio students through the end of the year.

The federal government has been providing free lunch to children from low-income families since 1946, when President Harry Truman signed the National School Lunch Act into law. The program was expanded in 1966 through the Child Nutrition Act, which added breakfast and summer meals to the program.

The federal free school lunch and breakfast program is one of the largest antipoverty programs in the United States. In a study Scioto Analysis released last year, we estimated from American Community Survey and Current Population Survey data that more than 8,800 Ohioans were pulled out of poverty in 2018 by the federal free school lunch and breakfast program. 

Nationally, the impact is even larger. The Census Bureau estimates 300,000 Americans were pulled out of poverty by school lunch and breakfast programs in 2020, and an additional 3.2 million were pulled out of poverty by a combination of SNAP food assistance and free school lunch.

Free lunches have a big impact on the children who receive them. Half of the food consumption for these children comes from free or reduced-price meals. Recent expansions in the availability of free lunch have also meant better outcomes for children. Recent studies of expansions of free school lunch programs suggest the expansions have led to increases in math scores for students, especially among elementary school students and Hispanic students. These expansions have also led to decreases in suspensions among white elementary-school-age boys.

School lunch programs have been expanding for years. In 2011, schools in Illinois, Kentucky, and Michigan began piloting a program that would make all children in low-income districts eligible for free lunch. By 2019, two-thirds of all low-income schools across the country were providing free meals to their students.

The pandemic brought a sea change to this landscape. The United States Department of Agriculture, which administers the free school lunch and breakfast programs, suspended all eligibility requirements for free and reduced-price meals, making free lunch universal with a single administrative change.

Last summer, this expansion lapsed, and Ohio reverted to a limited system of free and reduced-price lunches.

Five states—California, Maine, Massachusetts, Nevada, and Vermont—have passed legislation providing no-cost meals to all students. Pennsylvania recently joined this list, passing legislation making breakfast free in schools.

It seems that the biggest reason free school lunch and breakfast has not been expanded even more aggressively in the United States is simple budget constraints. In the particular case of the policy the state Board of Education calls for, American Rescue Plan dollars have a number of alternate uses. I have not done an analysis of all possible uses for these funds, but I have to imagine this would be a use that would yield relatively high economic and equity benefits.

One potential objection to universal school meal programs is that they would mainly benefit middle- and upper-income households and do little for children from low-income households, since low-income households are likely already covered by the current program. The benefits of administrative simplicity, though, have already led to universal provision in low-income schools. Providing this benefit to all children and then clawing back costs through progressive income taxation is likely a more efficient program design than a system focused on splitting hairs around eligibility.

The pandemic changed a lot of assumptions about the U.S. safety net. Maybe a permanent change could come to the program: maybe all children in Ohio should get free meals at school.

This commentary first appeared in the Ohio Capital Journal.

How should an analyst decide on a policy recommendation?

Deciding is maybe the strangest step of Eugene Bardach’s Eightfold Path of policy analysis. This is because the job of the analyst is generally not to say how the world should be, but to describe how it is and how it could be. One way to think of the Eightfold Path is to split its steps among these three categories: describing how the world is, how it could be, and how it should be.

You could describe selecting criteria as an exercise in describing how the world “should be.” After all, selecting criteria is about making value judgments about what a policymaker should care about. But we often lean on values universal enough (effectiveness, efficiency, equity) that any policymaker should be interested in them, and we tailor our selection of criteria based on the desires put forth by the client, so that step is still more descriptive than prescriptive.

Deciding is a different story. 

In A Practical Guide to Policy Analysis, Bardach says that

[T]he object of all your analytic effort should not be merely to present the client with a list of well-worked-out options. It should be to ensure that at least one of them—and more than one, if possible—would be an excellent choice to take aim at solving, or mitigating the problem.

The reason this step is a leap from the previous six steps lies in its interaction with step four. Surely any policy analysis could include dozens of different criteria by which to evaluate a policy. But policy analysis is also by its nature time-limited, so selection of criteria cannot be exhaustive. Policy analysts are thus working with imperfect information.

In the classic understanding of the relationship between the policy analyst and the policymaker, the policymaker makes the decision of which policy to choose and the analyst provides information to make the decision with. Under this framework, the policy analyst deciding for the policymaker can be seen as inappropriate. It also can be seen as unhelpful: for some criteria—I’m thinking of “political feasibility” in particular—surely the policymaker has better information than the analyst does.

So why does Bardach include it? I would hazard to guess that it has to do with the actual desires of policymakers.

When I was a budget analyst at the Ohio Legislative Service Commission, one tension we dealt with when working with legislators was between analysis and recommendation. While our charge was to provide the policymakers with analysis, legislators would often ask analysts for their opinions on the policy. They weren’t looking just for advice, but for collaborators. And sure, maybe they wouldn’t make the exact same decision the analyst tells them to, but having that guidance was helpful to them.

Some analysts felt uncomfortable being asked a question like this because they felt that it undermined their credibility as analysts. They felt that the jump from “could be” to “should be” was one that an analyst shouldn’t make. But by putting yourself in the shoes of the decisionmaker, the analyst can bring clarity to their findings and guide a policymaker in a way that is helpful to them: by telling them what you would do if you were making the decision.

Ultimately, the policymaker will make the final decision. But if the policymaker is looking for a recommendation, give it to her. It will only make her job in deciding easier.

How do Policymakers Value Risk of Death Reduction?

How much should society pay to save a life? According to a recent meta-analysis published by Cambridge University Press in the Journal of Benefit-Cost Analysis, $8 million is a good place to start.

In policy analysis, this number is often referred to as the Value of Statistical Life (VSL). Many people are initially hesitant when they are presented with the idea of VSL, pointing out that it is unpleasant to assign a monetary value to life. So why should we put a dollar value on lives saved at all? 

Imagine that a local government is trying to figure out how many traffic lights to install in a city. Traffic lights are good because they make road intersections safer, but they also require resources to install and maintain. How many traffic lights should the city build?

If we assume that this local government has to raise taxes in order to finance these new traffic lights, we need some way to measure the value they provide in order to reach the optimal solution. Each additional traffic light may reduce traffic deaths by some small amount, but if it only provides a small benefit, those resources may be better spent somewhere else. Would you be willing to pay an additional $8 in taxes in a year to reduce your chance of death by one in a million? These are the tradeoffs policymakers confront when they make decisions on behalf of the public.
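To make the arithmetic behind VSL concrete, here is a small sketch using purely hypothetical numbers (the $8 willingness to pay and the one-in-a-million risk reduction are illustrative assumptions, chosen to line up with the meta-analysis's $8 million central estimate, not figures from any study):

```python
# Hypothetical: each resident would pay $8 per year for a
# one-in-a-million reduction in their annual risk of death.
wtp_per_person = 8.0            # dollars
risk_reduction = 1 / 1_000_000  # probability of death avoided

# VSL is dollars paid per unit of mortality risk reduced.
vsl = wtp_per_person / risk_reduction
print(f"Implied VSL: ${vsl:,.0f}")

# Equivalently: a million such residents collectively pay $8 million
# and, in expectation, one death among them is avoided.
population = 1_000_000
total_paid = population * wtp_per_person        # $8,000,000
expected_lives_saved = population * risk_reduction  # 1 statistical life
print(total_paid / expected_lives_saved)
```

This is why the "value of statistical life" is not a price on any identified person's life: it aggregates many individuals' small payments for small risk reductions into a cost per expected life saved.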

Once we realize why it is important to have a measure for VSL, we must figure out the best way to calculate it. The article from the Journal of Benefit-Cost Analysis groups VSL studies into three main categories.

First are the studies that try to measure VSL through choices people make in the labor market. Hedonic wage studies use labor market data to see what sorts of workplace risk individuals are willing to accept for higher wages. While these estimates have some drawbacks (e.g. the labor market does not capture the entire population), they are based on reliable and easily accessible data. 

There are also studies that measure VSL by looking at individual decisions like people’s willingness to wear helmets while riding bikes. The authors mention that these sorts of studies often rely on researchers making significant assumptions and are therefore not used as often. 

The final type of VSL study tries to measure people's willingness to pay for safety through methods such as contingent valuation studies. The biggest benefit of these studies is that they can more specifically ask about different risks that may be more applicable to a public policy context. The drawback is that they hinge on stated preference for risk of death reduction, which can sometimes be different than the revealed preferences people make when faced with real decisions. Words don’t necessarily speak louder than actions.

This new study in the Journal of Benefit-Cost Analysis is unique because it does not draw its information from individual studies of VSL, but rather from other meta-analyses of VSL. By taking into account the widest possible set of VSL estimates, the authors are able to get the best possible picture of where VSL estimates currently stand.

The authors estimate that the central value for VSL is about $8.0 million, with a 90% confidence interval ranging from $2.4 million to $14.0 million. This number largely falls in line with what we see policymakers actually use. The U.S. Department of Transportation recommends a VSL of $10.9 million after adjusting for inflation.

The literature on VSL continues to evolve as researchers work to better understand tradeoffs between small increases and decreases in risk of death and everything else in life. For policy analysts and policy makers, how human lives are impacted is the most significant part of any proposal. The better information we have about VSL, the more efficiently we can allocate our resources and reach the best possible outcomes.

Conducting Cost-Benefit Analysis: What am I Measuring?

Analysts often start a cost-benefit analysis with a policy option in mind. Sometimes, though, an analyst must tackle a policy question without predetermined policy options. This requires the analyst to spend serious time constructing policy options to address the given problem. This has been the focus of my last week working on Scioto Analysis’ current cost-benefit analysis. 

In policy analysis, this stage is often referred to as constructing the alternatives. “Alternatives” is a somewhat confusing term mainly used in the policy analysis literature. When we talk about “alternatives,” we mean “policy options” worth considering.

The obvious alternative to consider in a cost-benefit analysis is the status quo. What is the current plan to address the issue of interest? What happens if policymakers do nothing and just let current trends continue? Knowing where you are currently is a great way to see what policy choices you have to adjust. 

Understanding the status quo is also important because projections about future trends could lead you to construct different alternative policy options. For instance, a cost-benefit analysis of housing affordability policies in a city will require very different policy options if the population is projected to grow over the next ten years than if it is projected to stay flat.

Beyond the status quo, looking for similar situations in comparable states is a great way to find viable policy alternatives. Are neighboring cities or states dealing with the same problem? Have they been successful in their efforts? Are there further-off jurisdictions that might provide alternatives for more forward-thinking policymakers?

These sorts of general questions are a good way to get started when thinking about alternatives, but it is important to keep in mind that the more specific you can be with constructing an alternative, the better. When we are more specific about the mechanics of the policies we are analyzing, we can better predict what effects changes to current policy might have. 

For my current cost-benefit analysis, I was able to find that the policy choice I want to analyze has been adopted in slightly different forms across the country. Looking at other states in this context, I was able to find a few specific changes that the state of Ohio could adopt. 

Choosing policy alternatives from scratch can be a daunting proposition. But construction of alternatives can be one of the most useful steps in a cost-benefit analysis. If a policymaker is open to thinking of new solutions to a problem, this can be one of the most important parts of the analysis: bringing forth ideas in the policymaking process that weren’t there before. By fully understanding the current context of the problem and by finding how similar situations have been handled in the past, we can find practical, creative, and effective solutions to analyze and potentially implement in the public sector.

This blog post is part of a series of posts on conducting cost-benefit analysis for newcomers by Scioto Analysis Policy Analyst Michael Hartnett.

Scioto Analysis releases landmark income inequality study

This morning, Scioto Analysis released a landmark study on income inequality in Ohio. Analysts used data from the American Community Survey and Current Population Survey to assess the extent of inequality in the state.

“According to our analysis, a very high-income Ohioan makes five to nine times as much as a very low-income Ohioan,” said Scioto Analysis Principal Rob Moore.

Analysts found large returns to education, with workers without high school degrees making less than $10,000 a year on average and those with college degrees making upwards of $40,000.

Notably, race was a confounding factor in returns to education. While Black Ohioans without high school degrees averaged the same income as white Ohioans without high school degrees, Black Ohioans with high school degrees, some college, or college degrees consistently averaged around $10,000 less in income than their white counterparts.

Geography was also associated with inequality in Ohio. While very high-income households in rural areas averaged five to nine times as much income as low-income households in rural areas, a high-income household in an urban area in Ohio had income anywhere from 12 to 21 times as high as a low-income household in an urban area.

Analysts also used American Community Survey and Current Population Survey data to simulate different interventions to reduce inequality. Their simulations found that expanding the state Earned Income Tax Credit and increasing the state minimum wage could have modest impacts on inequality across the state. They estimated that a negative income tax proposal that provided cash to low-income households using taxes levied on high-income households would have a much larger impact.

“Traditional tools like Earned Income Tax Credits and minimum wage increases can ameliorate inequality,” said Moore, “but according to our simulations, a negative income tax would have an impact three times as large as either of those measures.”

This study is part of a larger project by Scioto Analysis to study well-being in the state of Ohio. UC Berkeley graduate students Nithya Nagarathinam, Julia Rosales, and Diego Villegas conducted analysis on this project.

Beginning a Cost-Benefit Analysis: Where Do I Start?

As the newest member of the team at Scioto Analysis, I have the opportunity this month to begin work on my first cost-benefit analysis. Cost-benefit analysis is one of the main tools analysts use to help policy makers understand the tradeoffs between policy options. 

The first step in a cost-benefit analysis, or any policy analysis for that matter, is to define the problem. Our research is only ever as good as the questions we ask. For this project, I am analyzing policies for cleaning up Ohio’s rivers, a topic close to home for us here at Scioto. 

I spent the first week of this project assembling evidence. I found that assembling evidence for a cost-benefit analysis is slightly different from doing so for most other research projects, mainly because the scope of a cost-benefit analysis is often larger than that of research I have done in the past. I am most familiar with projects where the intervention and outcomes are prespecified. Policy choices almost always have far-reaching side effects, and the better we understand them, the more accurately we can represent the trade-offs.

I began with a general understanding of the evidence I was looking for, but it took a lot of reading to fully understand the scope of the question of how to improve water quality in Ohio. Seeking out reports from experts in this field brought my attention to a number of side effects I would never have considered by myself. Even a brief conversation with a friend who studies marine biology helped me find new angles to explore!

After exploring how clean water can affect people’s lives, I finally had to make sure there were ways to monetize impacts. This is a cost-benefit analysis after all, and in order to compare the costs of policy changes (which are often but not always strictly financial) to the benefits, we need a standard unit of account, in this case dollars. 

The quality of the evidence collected directly affects the quality of the final report. As the saying goes, “garbage in, garbage out.” It is important to find evidence that is as recent as possible, as well as research from comparable geographic and cultural contexts. A study estimating willingness to pay for clean water in a developing country would not be very helpful in determining willingness to pay for clean water in a place where safe drinking water is taken for granted.

All of this research can take time and be frustrating. Information can be out of date, hidden behind paywalls, and sometimes just nonexistent. That being said, collecting good evidence is arguably the most important step in cost-benefit analysis and deserves care and attention from the analyst. A solid base of knowledge makes every part of the process go more smoothly and prevents us from being in a situation where we have garbage going in, and garbage coming out.

This blog post is part of a series of posts on conducting cost-benefit analysis for newcomers by Scioto Analysis Policy Analyst Michael Hartnett.

Alcohol abuse still a public health problem in Ohio

In 2020, early in the pandemic, Gov. Mike DeWine was the talk of the country, earning national praise for his public health efforts to slow the spread of coronavirus.

At the same time that the DeWine administration was effectively shutting down a third of the economy to slow the spread of the virus, it counterintuitively relaxed one public health policy. Through executive order, DeWine allowed restaurants to sell to-go alcohol to customers.

This order was ostensibly to give people more reason to stay at home and to create a new revenue stream for struggling businesses. Later that year, the order became law as the Ohio General Assembly made the policy permanent.

The CDC estimates that over 5,700 Ohioans die from excessive alcohol use per year, defined as binge drinking or heavy drinking throughout a week. The leading causes of death related to alcohol stem from poisonings, heart disease, liver disease, suicide, motor vehicle crashes, and homicide. In this way, alcohol harms people’s physiology, but also their psychology, loosening them up to unintentionally or intentionally harm themselves and others.

An interesting wrinkle with alcohol consumption is that its effects run opposite demographically to many other health trends. According to CDC survey data, college-educated Ohioans are 21% more likely to abuse alcohol than those without a college degree. Ohioans making more than $75,000 are 78% more likely to abuse alcohol than those making $25,000 or less.

White, male, and young Ohioans are more likely to drink excessively than their Black, female, and older counterparts. White Ohioans are 18% more likely to drink excessively than Black Ohioans. Men in Ohio are 47% more likely to drink excessively than women. And Ohioans age 18-44 are 200% more likely to drink excessively than those age 65 and up.

While many worry about the impact of alcohol in the wake of the pandemic, self-reported excessive alcohol use decreased in Ohio from 2019 to 2020 by nearly a percentage point. A survey by an organization that provides resources on substance abuse suggested excessive alcohol consumption also fell in 2021.

Despite this progress, alcohol is still killing more people in Ohio than opioids are. So what options do we have for curbing excessive alcohol use?

A simple option is taxation. The U.S. Community Preventive Services Task Force reviewed 72 studies looking at the relationship between taxes, prices, and excessive alcohol consumption, and found that a 10% increase in the price of alcohol leads to 5% lower consumption of beer, 6.4% lower consumption of wine, and 7.9% lower consumption of spirits. A can of Budweiser in 2010 cost one-fifth its 1950 price, and cheap liquor in 2010 cost one-fifteenth its 1950 price. Higher prices could reduce consumption and abuse.
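Those Task Force figures imply own-price elasticities of roughly -0.5 for beer, -0.64 for wine, and -0.79 for spirits. A quick back-of-the-envelope sketch (assuming, as a simplification, that these responses scale linearly with the price change; real demand responses need not) shows how an analyst might project consumption under a larger tax-driven price increase:

```python
# Implied own-price elasticities from the Task Force figures
# (a 10% price increase lowers consumption by 5%, 6.4%, 7.9%).
elasticities = {"beer": -0.50, "wine": -0.64, "spirits": -0.79}

def projected_change(price_increase_pct: float, elasticity: float) -> float:
    """Approximate % change in consumption for a given % price increase,
    assuming a constant linear response (a simplifying assumption)."""
    return elasticity * price_increase_pct

# Hypothetical scenario: a tax that raises prices 20%.
for drink, e in elasticities.items():
    change = projected_change(20.0, e)
    print(f"{drink}: {change:+.1f}% change in consumption")
```

Under that linearity assumption, a 20% price increase would cut beer consumption by about 10% and spirits consumption by about 16%; a more careful projection would use the underlying demand curves rather than a constant elasticity.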

A lighter-touch intervention to reduce excessive alcohol consumption is screening for alcohol abuse, known as “screening and brief intervention.” Instituting this in medical and other settings can help identify people who need help reducing excessive drinking behaviors and deliver information to them to curb such behaviors.

Alcohol abuse has been slowly waning in Ohio, but it still kills more people than opioid overdoses. Having an intentional public health approach to alcohol abuse will help curb the social ills of excessive alcohol use and save lives.

This commentary first appeared in the Ohio Capital Journal.

Ohio economic experts disagree on electric vehicle policy

In a survey published by Scioto Analysis this morning, economists in Ohio were split on questions of progressivity of electric vehicles and cost-effectiveness of electric vehicle investments.

Nine of nineteen economists agreed with the statement that the current $200 annual fee for registering electric vehicles in Ohio is progressive. Among those who agreed, some said this was because higher-income people tend to purchase electric vehicles. Among the 11 economists who were uncertain or who disagreed, one said the fee is currently progressive but will become more regressive as more low-income people purchase electric vehicles and another pointed to the regressivity of flat fees.

Ten of nineteen economists agreed public expenditure on infrastructure to support electric vehicles is likely to be more cost-effective than providing equivalent amounts as tax credits/purchase rebates for buyers. Some who agreed commented that “range anxiety” is likely holding back electric vehicle purchases and that infrastructure will help grow the industry. Those who were uncertain wanted more data on the current state of electric vehicles and clarity on the specifics of policy implementation.

The Ohio Economic Experts Panel is a panel of over 40 Ohio economists from over 30 Ohio higher education institutions, conducted by Scioto Analysis. The goal of the panel is to promote better policy outcomes by providing policymakers, policy influencers, and the public with the informed opinions of Ohio’s leading economists.

Introducing Michael Hartnett, Policy Analyst

Hello! My name is Michael Hartnett, and this week I joined the Scioto Analysis team as a policy analyst. 

I attended Bates College for my undergraduate degree where I studied economics. My main area of interest was using applied microeconomic methods to study policy impacts. Specifically, I focused on the economics of the criminal justice system and rates of recidivism in Maine.

Now, my area of expertise is statistics, as I recently completed my Master’s degree at the University of Minnesota. My goal is to use that knowledge to help make Scioto’s research even more robust. Better models lead to better information, and better information can inform better policy decisions.

I believe that as data becomes increasingly available, policymakers need to rely on accurate and unbiased analyses in order to achieve the most efficient and equitable outcomes. As a policy analyst, my goal is to use the best data possible to answer pressing questions in a way that is fair and honest.

I am super excited to have this opportunity, and I am looking forward to all of the challenges and responsibilities that come with it. If you have any questions for me, please feel free to email me at michael@sciotoanalysis.com.