War for the Web: Countering ISIS and violent extremism online

In the wake of the recent terrorist attacks in London, British Prime Minister Theresa May has been at the forefront of international calls for technology companies to do more to combat online extremism. The British government announced its intent to stamp out extremism “in all its forms, both across society and on the internet.”

In the United States, the Department of Homeland Security just announced a $10 million, two-year grant program for organizations that work to improve cybersecurity and thwart terrorism. Countering violent extremism online requires taking proactive steps to stop extremist groups from recruiting and radicalizing followers worldwide. This effort, now more than ever, requires increased cooperation across the private, public and academic sectors, among others. For their part, tech companies have been experimenting with new techniques and guidelines.

These are complex issues at the intersection of freedom of expression and national security. How will all of the proposed changes and solutions express themselves online, domestically and abroad? How do these efforts to identify and prevent early online radicalization square with the First Amendment and notions of freedom of expression?

Arthur Rizer, R Street’s director of national security and justice policy, took part in a July 21, 2017 panel discussion on these and related issues hosted by the Advisory Committee to the Congressional Internet Caucus. Other panelists included Kevin Adams of the British Embassy, Alexander Meleagrou-Hitchens of George Washington University’s Project on Extremism, Mark MacCarthy of the Software & Information Industry Association and Clara Tsao of DHS’ Countering Violent Extremism Task Force.

Video of the discussion is embedded below:

Trump wisely rejects emergency order for coal


The Trump administration this week confirmed it has rejected a coal-industry request for an emergency order imposing a two-year moratorium on coal-plant closures. This avoids what would have been an unprecedented and economically damaging intervention in electricity markets, with no offsetting reliability benefit. The move marks a sharp break from the administration’s all-tools-considered approach to reinvigorating coal, especially since the president reportedly had committed to the measure in private conversations with industry executives.

The Federal Power Act grants the U.S. Energy Department emergency authority to order continued operations of power facilities. In April, Energy Secretary Rick Perry announced the possibility of federal intervention to protect coal and nuclear plants in the name of national security, which would pre-empt state policies. The announcement coincided with the launch of an Energy Department study on so-called “baseload” power-plant retirements.

A massive moratorium on power-plant closures, especially those brought about by market forces, would heavily distort electricity markets and deter, if not outright freeze, new capital investment. Fatally undermining the investment climate could paradoxically worsen energy reliability by muting the price signals that competitive electricity markets use to meet reliability requirements. Furthermore, invoking a national security mechanism when there is no national security concern would be an abuse of the emergency authority. Doing so while overriding the states would also leave a deep federalist scar.

This may even beat coal-production subsidies as the worst energy policy idea. Fortunately, many productive energy-policy corrections are on the table for the administration.

A reset on coal policy should be consistent with market principles, not a form of reverse-industrial policy to counter the prior administration’s favoritism to renewables. Thoughtful deregulation is an appropriate approach. So is lifting restrictions on coal exports or international financing for coal development. But subsidies and knee-jerk responses—a protectionist emergency order being the worst among them—would be deeply damaging and harm the economy.

One hopes this is a sign the president and his senior energy advisors recognize that economic transitions are necessary and healthy when they are driven by market forces. Coal’s biggest foe is shifting market fundamentals – namely, inexpensive natural gas and declining demand. Subsidies for coal’s competitors are a lesser factor, and the administration should address those in a manner that predictably and sustainably reduces the subsidy regime, not one that further entrenches it.

The surge in coal-plant retirements this decade was due mostly to a combination of environmental rules and market forces, with the latter the main driver going forward. The mid-Atlantic region has already experienced more than 20 gigawatts of coal retirements (equivalent to about three-quarters of New England’s peak demand), and markets facilitated new resources to take coal’s place. As leading industry economists note, the emergence of these alternative resources has been surprisingly robust and has posed no clear reliability concern. Overall, most trends in electric-reliability metrics are stable or improving.

Clearly, the doomsday reliability claims (e.g., coal retention as a national security issue) of some uncompetitive industries have proved unfounded. Still, maintaining reliability requires market rules and monopoly-utility planning processes to evolve as unconventional resources become more economical. The administration can aid this by listening to industry experts, not the desperate claims of rent-seeking industry members.

The dismissal of a blatantly anti-market idea could, one hopes, point the way toward a more refined approach for this administration’s energy policy. The forthcoming U.S. Energy Department study has much potential to assess the regulatory and market environment fairly and to suggest market-enhancing improvements. Further work to improve the alignment of wholesale electricity market rules with electric reliability requirements is one such path to let markets, not government, decide the fate of the coal industry and all other power sources.

Image by Rudmer Zwerver


Trump’s ‘energy dominance’ strategy starting to crack Eastern European markets


The U.S. Energy Department announced Aug. 21 that a cargo ship full of Pennsylvania coal would be sailing out of Baltimore, 5,600 nautical miles across the Atlantic Ocean, Mediterranean and Black Seas to Ukraine, the first such shipment of its kind.

Such shipments hold significance in a variety of ways, and offer a possible window into the Trump administration’s desire to use energy trade to offset the aggressive geopolitical behavior of Russian President Vladimir Putin. In March, Ukraine cut off deliveries of coal from the Russian-controlled region of Donbass, where much of Ukraine’s coal industry resided before the 2014 conflict between Ukraine and Russia began.

Centrenergo, the Ukrainian power utility, has been struggling since March to replace the blocked coal supplies. Now, Latrobe, Pennsylvania-based XCoal Energy will send 700,000 tons of anthracite coal to Odessa over the next several months. The agreement follows a June meeting between Ukrainian President Petro Poroshenko and U.S. President Donald Trump.

This week also marks the first shipment of liquefied natural gas from the United States to the Baltic state of Lithuania, a former Soviet satellite that until recently was completely dependent on Russian gas supplies. Lithuania and the other Baltic states, Latvia and Estonia, have at times been under withering political pressure from their former Cold War patron, particularly since Moscow occupied Crimea in 2014.

Trump’s visit to Poland in early July included a speech that highlighted his administration’s desire to exert a counter-force on Russia through the energy markets, “so that you can never be held hostage to a single supplier,” Trump said.

Poland received its first shipment of U.S. LNG in July, and the single U.S. LNG export terminal, at Sabine Pass on the Texas-Louisiana border, has sent out more than 160 cargoes since starting up in February 2016. Several of those cargoes have reached Spain, Italy, the United Kingdom and the Netherlands in the past 18 months.

U.S. LNG export capacity is set to grow five-fold by 2020, potentially leaving Eastern European markets awash in natural gas just as many of the long-term delivery contracts between Eastern Europe and Russia’s natural gas monopoly, Gazprom, expire in the early 2020s.

Image by Anatoly Menzhiliy


Rep. Meadows introduces bill to lock in regulatory budgeting


Those who champion slowing the growth of the regulatory state earned a victory earlier this year with President Donald Trump’s “two-out-one-in” executive order, requiring federal agencies to eliminate two old regulations for every new one they enact. The order also established a type of regulatory budget that caps the amount of regulatory costs agencies can impose on the economy during a given year.

But as R Street previously has argued, such executive branch actions, particularly in the area of deregulation, are unlikely to be lasting unless they are codified. Codification ensures that deregulatory efforts are locked in and not subject to reversal by a future president.

Toward that end, the latest good news is that H.R. 2623, legislation that effectively would codify Trump’s order, has been introduced in the U.S. House by Rep. Mark Meadows, R-N.C., chairman of the House Freedom Caucus.

Unlike past regulatory budgeting legislation, Meadows’ bill would not task Congress with setting the regulatory budget, instead granting that responsibility to the White House Office of Management and Budget. While such a structure may be the best short-term option for codifying a regulatory budget, Congress ideally would be the branch responsible for setting how much regulatory cost agencies could impose each year. A further concern is ensuring that OMB has the resources and manpower necessary to administer the regulatory budget.

Regardless, the Meadows bill should be welcomed as a step toward a more sustainable deregulatory effort.

Image by Maythaphorn Piyaprichart

New internet tax threatens privacy of Washington customers


In their zeal to shake a few more tax dollars out of Washington residents’ pockets, state lawmakers are brushing aside legitimate privacy concerns raised recently by civil-liberties groups. Under the new internet sales-tax law signed by Gov. Jay Inslee last month, the Washington Department of Revenue could learn more than most of us want it to know about our online purchases.

State officials vow the information provided to them by online retailers to facilitate the collection of the so-called “use” tax will be held in the utmost confidence. But from police agencies to the Internal Revenue Service, government bureaucracies have far from an unblemished record when it comes to protecting private records.

If you’ve bought nothing weird, then maybe you’ve got nothing to hide. And maybe the retailers, or third-party websites like eBay, will do the state’s bidding and collect the tax for the department without turning over any information. Or maybe not. The new law gives out-of-state sellers the option either to “voluntarily” collect Washington sales taxes or to provide the names, addresses and purchase information to the revenue folks in Olympia. As a consumer, the decision won’t be yours to make.

The stores must provide purchase amounts rather than a list of specific items, but that is small comfort for those who patronized, say, a mental-health provider, a paraphernalia shop or a company that sells sex toys. Current law requires sellers with a brick-and-mortar presence in the state to collect taxes from in-state consumers, but the tax collectors say they want to “level the playing field.” Your privacy wasn’t much of a concern when they passed the law.

Even if your raciest online purchase is a calendar with cutesy cat photos, you ought to be concerned about the costly implications. First, there are your personal costs: the new law is, after all, a tax increase on Washington residents’ purchases. Then there’s the likely cost to the general fund as state officials defend the law for years in the federal courts. The state balanced its budget on revenue assumptions from the tax (an estimated $1 billion over the next four years), but those collections will be on hold for the length of the litigation.

Lawmakers are confident they are on solid legal ground because the 10th U.S. Circuit Court of Appeals, after six and a half years of litigation, upheld Colorado’s internet sales-tax law. The U.S. Supreme Court recently declined to review the Colorado decision, which cleared the way for that state to begin collections from out-of-state online customers.

But there’s no guarantee the 9th U.S. Circuit Court of Appeals, which oversees Washington state matters, will reach the same conclusion. There are other significant differences between the Washington and Colorado laws, even though Washington legislators used Colorado as a model. Those, too, could lead to a different outcome.

Both states require sellers that don’t collect the sales tax to provide personal information about online purchasers to their respective revenue departments. But the Washington law applies to companies that gross more than $10,000 a year in sales to in-state residents, whereas Colorado’s threshold is 10 times higher. “Washington’s puts more responsibility on so-called ‘marketplace facilitators’ and other internet ‘middlemen,'” according to a Tacoma News Tribune report.

That Colorado case centered on the U.S. Constitution’s Commerce Clause, which governs business among the states. The Direct Marketing Association challenged the law based on a seminal 1992 U.S. Supreme Court ruling (Quill Corp. v. North Dakota) holding that state officials can collect sales taxes from a business only if it has a physical presence in the state. For instance, Seattle-based Amazon has long collected taxes on sales to Washington residents.

But the 10th Circuit ruled that “Quill applies only to the collection of sales and use taxes, and the Colorado law does not require the collection or remittance of sales and use taxes. Instead, it imposes notice and reporting obligations.”

Nevertheless, there are many reasons to question the Washington law. The $10,000 threshold imposes a burden on small businesses, given that they will need to maintain detailed reports on buyers in the state. If this law withstands court scrutiny, similar tax schemes will spread like a bad internet rumor. Even the tiniest enterprises, located here and elsewhere, will have to collect data to meet the varied demands of 50 state revenue offices – or face $20,000-plus penalties.

All U.S. internet companies, not just the small ones, will face disadvantages given that Olympia’s tax grabbers will not be able to enforce the statutes on sellers based in Shanghai or New Delhi. And Commerce Clause arguments won’t be the only ones that might tie it up in court.

In 2010, the U.S. district court in Seattle rebuffed efforts by the North Carolina secretary of revenue to receive detailed purchase information from Amazon as part of a tax audit of North Carolinians’ purchases. The court found that “citizens are entitled to receive information and ideas through books, films, and other expressive materials anonymously.”

That’s true even when the government seeks the information for tax purposes rather than to censor. Do we really trust any government agency with such personal information? Unfortunately, we’re now at the mercy of the courts and Congress to protect such privacy rights.

Image by koya979


The FDA’s words and actions do not match


In a new paper in the New England Journal of Medicine, newly minted Food and Drug Administration Commissioner Scott Gottlieb and Mitch Zeller, longtime director of the FDA Center for Tobacco Products, commit themselves to a science-based regulatory framework that takes into account “the continuum of risk for nicotine-containing products” to reduce tobacco-related addiction, illness and death. To do this, the paper promises an FDA commitment to reduce nicotine levels in cigarettes to “non-addictive levels” and to “foster innovation in less harmful nicotine delivery.”

The proposal to reduce the nicotine content of cigarettes is based on two recent papers by Eric C. Donny and colleagues – one that also appeared in the NEJM and another in the journal Preventive Medicine. The Preventive Medicine paper outlines the issues that would need to be addressed for such regulations to be developed. The NEJM paper details a six-week trial in which smokers were given free low-nicotine cigarettes to use in addition to their usual cigarettes. The trial showed that smokers of the low-nicotine cigarettes did not smoke more cigarettes on the last day of the trial than they had on the first day. Neither paper provides a sound basis to replace traditional cigarettes as we know them with a product that does not deliver significant amounts of nicotine.

For the six years that Zeller has directed the Center for Tobacco Products, he has spoken of a “continuum of risk” and said he favors “fostering innovation.” Unfortunately, FDA policy has reflected the direct opposite of those notions. The FDA has imposed no regulatory burden on cigarettes that wasn’t already in place before adoption of the Tobacco Control Act of 2009. Meanwhile, the agency continues to impose nearly impossible-to-meet requirements for approving any new product for the marketplace, including those claimed to pose lower risk than cigarettes.

Despite overwhelming evidence that e-cigarettes are far lower in risk than cigarettes, and that they do not recruit teens to nicotine who otherwise would not have smoked, the FDA has done nothing to confirm or deny these research findings. Indeed, it has just announced a new anti-e-cigarette campaign.

The smokeless tobacco products currently on the American market have long been known to be far lower in risk than cigarettes. The FDA has done nothing to share this information with the public, and has even proposed a new set of smokeless tobacco regulations that threaten to remove almost all current smokeless products from the market.

FDA regulations continue to mean that even the smallest change to any tobacco-related product now on the market requires immediate removal of that product, pending a costly new application for FDA approval of the modified version. So much for encouraging innovation.

The bottom line is this: neither science nor the fine words in this latest NEJM piece have anything to do with FDA policies that continue to protect cigarette sales and profits from competition from lower-risk nicotine-delivery products.

Image by Gustavo Frazao


How big a bank is too big to fail?


The notion of “too big to fail”—an idea that would play a starring role in banking debates from then to now—was introduced by then-Comptroller of the Currency Todd Conover in testimony before Congress in 1984. Conover was defending the bailout of Continental Illinois National Bank. Actually, since the stockholders lost all their money, the top management was replaced and most of the board was forced out, it was more precisely a bailout of the bank’s creditors.

Continental was the largest crisis at an individual bank in U.S. history up to that time. It has since been surpassed, of course.

Conover told the House Banking Committee that “the federal government won’t currently allow any of the nation’s 11 largest banks to fail,” as reported by The Wall Street Journal. Continental was No. 7, with total assets of $41 billion. The reason for protecting the creditors from losses, Conover said, was that if Continental had “been treated in a way in which depositors and creditors were not made whole, we could very well have seen a national, if not an international crisis the dimensions of which were difficult to imagine.” This is the possibility that no one in authority ever wants to risk having happen on their watch; therefore, it triggers bailouts.

Rep. Stewart McKinney, R-Conn., responded during the hearing that Conover had created a new kind of bank, one “too big to fail,” and the phrase thus entered the lexicon of banking politics.

It is still not clear why Conover picked the largest 11, as opposed to some other number, although presumably he needed Continental to appear somewhere toward the middle of the pack. In any case, here were the 11 banks said to be too big to fail in 1984, with their year-end 1983 total assets – which, to current banking eyes, look medium-sized:

[Chart: the 11 banks deemed too big to fail in 1984, with year-end 1983 total assets]

If you are young enough, you may not remember some of the names of these once prominent banks that were pronounced too big to fail. Only two of the 11 still exist as independent companies: Chemical Bank (which changed its name to Chase in 1996 and then merged with J.P. Morgan & Co. in 2000 to become JPMorgan Chase) and Citibank (now Citigroup), which has since been bailed out, as well. All the others have disappeared into mergers, although the acquiring bank adopted the name of the acquired bank in the cases of Bank of America, Morgan and Wells Fargo.

The Dodd-Frank Act is claimed by some to have ended too big to fail, but the relevant Dodd-Frank provisions are actually about how to bail out creditors, just as was the goal with Continental. Thus in the opposing view, it has simply reinforced too big to fail. I believe this latter view is correct, and the question of who is too big to fail is very much alive, controversial, relevant and unclear.

Just how big is too big to fail?

Would Continental’s $41 billion make the cut today? That size now would make it the 46th biggest bank.

If we adjust Continental’s size for more than three decades of inflation and express it in 2016 dollars, it would have $97 billion in inflation-adjusted total assets, ranking it 36th as of the end of 2016. Is 36th biggest big enough to be too big to fail, assuming its failure would still, as in 1984, have imposed losses on hundreds of smaller banks and on large amounts of uninsured deposits?
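The arithmetic behind that adjustment is just a ratio of price levels. A minimal sketch in Python, where the CPI values (roughly 99.6 for 1983 and 240.0 for 2016) are approximate figures supplied for illustration, not numbers from the text:

```python
def inflation_adjust(amount, cpi_then, cpi_now):
    """Re-express a historical dollar amount in later-year dollars
    by scaling with the ratio of consumer price indexes."""
    return amount * cpi_now / cpi_then

# Continental Illinois' $41 billion (1983) restated in 2016 dollars.
# The result lands near the ~$97 billion figure cited above; small
# differences come from which CPI reading (annual vs. year-end) is used.
adjusted = inflation_adjust(41.0, 99.6, 240.0)
print(f"${adjusted:.0f} billion")
```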

If a bank is a “systemically important financial institution” at $50 billion in assets, as Dodd-Frank stipulates, does that mean it is too big to fail?  Is it logically possible to be one and not the other?

Let us shift to Conover’s original cutoff, the 11th biggest bank. In 2016, that was Bank of New York Mellon, with assets of $333 billion. Conover would without question have considered that—could he have imagined it in 1984—too big to fail. But now, is the test still the top 11?  Is it some other number?

Is $100 billion in assets a reasonable round number to serve as a cutoff? That would give us 35 too big to fail banks. At $250 billion, it would be 12. That’s close to 11. At $500 billion, it would be six. We should throw in Fannie Mae and Freddie Mac, which have been demonstrated beyond doubt to be too big to fail, and call it eight.

A venerable theory of central banking is always to maintain ambiguity. A more recent theory is to have clear communication of plans. Which approach is right when it comes to too big to fail?

My guess is that regulators and central bankers would oppose anything that offers as bright a line as “the 11 biggest”; claim to reject too big to fail as a doctrine; strive to continue ambiguity; and all the while be ready to bail out whichever banks turn out to be perceived as too big to fail whenever the next crisis comes.

Image by Steve Heap


PACER might be the government’s worst website


The following is a guest post by Tom Lee, former chief technology officer for the Sunlight Foundation.

When hackers are able to steal your money, it’s usually safe to call that a website’s least appealing feature. Astonishingly, that’s not true of PACER—the Public Access to Court Electronic Records system, run by the Administrative Office of the Courts—which charges for downloads of essential federal court records. In its case, hackability comes second to the bad, and perhaps even illegal, deal it offers the public.

The exploit is real, mind you. The good people of the Free Law Project uncovered it months ago as part of their work to democratize legal information. Now that PACER has patched the vulnerability, FLP has disclosed the gory details.

The problem revolves around a cross-site request forgery attack. When you connect to a website, it’s normally able to store small amounts of data called “cookies” on your computer. Any time your browser makes a request to that site, it will send those cookies, along with the request. Sites can tell if a request comes from a logged-in user by examining the request for unique cookie values that were set after a successful authentication attempt and comparing those values to copies stored in the site’s database.

Code running on a different malicious website that you visit can’t look at the cookies of other websites. But it can make requests to other websites, and those requests will carry the other sites’ cookies. If those cookies identify a logged-in user, the malicious site can make invisible requests that trigger real actions on that user’s behalf on the target site.

There are standard ways to detect and defend against this, but PACER hadn’t used them. Although there is no proof that it happened, a malicious site could have made requests on behalf of logged-in users, downloading documents and racking up fees.
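The most common such defense is the synchronizer-token pattern: the server issues a secret, per-session token that a genuine page echoes back with each state-changing request, which a forged cross-site request cannot do because the attacker can’t read the token. A minimal sketch in Python (the names and structure are illustrative, not PACER’s actual code):

```python
import hmac
import secrets

def issue_csrf_token(session):
    """Generate a random per-session token; the site embeds this
    in its own forms as a hidden field."""
    token = secrets.token_hex(32)
    session["csrf_token"] = token
    return token

def is_request_allowed(session, submitted_token):
    """Allow a state-changing request only if it echoes back the
    session's token. A cross-site forgery carries the cookies but
    cannot read or guess the token, so it fails this check."""
    expected = session.get("csrf_token")
    if not expected or not submitted_token:
        return False
    # Constant-time comparison avoids leaking the token via timing.
    return hmac.compare_digest(expected, submitted_token)

# Demo: a legitimate submission passes; a forged one does not.
session = {}
token = issue_csrf_token(session)
print(is_request_allowed(session, token))           # True
print(is_request_allowed(session, "forged-value"))  # False
```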

That’s bad. But it’s not the worst thing about PACER—that would be the fees themselves. PACER makes some kinds of documents free, but for many others, it charges 10 cents per page. Barring some truly incredible technical mistakes, that number is vastly more than the cost of serving a page of content. And it has remained at that level for many years, despite advancing technology and falling bandwidth and storage costs.

Legal actions often involve huge page counts, which means that PACER fees add up. And they render some kinds of research and scholarly work totally impractical.

Even worse, those fees might be illegal. The Administrative Office of the Courts is barred by the E-Government Act of 2002 from charging more for PACER than it costs to maintain the system. But there is evidence that AO is not in compliance with the law. In 2014, PACER collected $145 million in fees. Five years earlier, it had been projected to cost $30 million per year to maintain. Many suspect that PACER fees are being used to subsidize other line items in the agency’s budget.

A class-action lawsuit is underway that aims to untangle all of this; if you used PACER between 2010 and 2016, you might be a part of it. But even if you’re not, you can still help to democratize the system’s information. Since the government doesn’t hold copyright over PACER records, there’s nothing stopping you from sharing them with the world after you pay your 10 cents per page. The RECAP project is run by the Free Law Project and Princeton University’s CITP program, and provides browser extensions that automate and centralize this process. It will let you download records from the RECAP archive when they’re available, or contribute newly purchased PACER records to the archive automatically when they’re not.

PACER doesn’t charge for balances less than $15 per quarter, so if you’re feeling civic-minded, why not download RECAP, make a PACER account and liberate some court records for the public good? Now that they’ve patched their vulnerability, it might even be safe to do so.

Image by fizkes


How messaging smart flood planning as ‘climate’ policy led to its demise


When a bunch of reporters called me to discuss President Donald Trump’s decision to roll back Obama-era flood-protection standards, I was happy to criticize the administration, because I think the standards were one of the few unalloyed good things the Obama administration did. They sent a clear message from the federal government: federal taxpayers won’t pay to build in flood-prone areas, and federally funded infrastructure will be designed to stand up to nature.

The Federal Flood Risk Management Standards, promulgated by a January 2015 executive order, drew on the principles of President Ronald Reagan’s Coastal Barrier Resources Act, which forbade development subsidies for barrier islands and barrier beaches while leaving the private sector free to do as it pleased. That’s great policy.

But as I wrote in the Weekly Standard not long after the standards came out, the Obama administration made a serious political (and, arguably, factual) error by choosing in their public statements to label the standards a climate-change-adaptation measure. Now, it’s absolutely true that greenhouse gas emissions have resulted in thermal expansion of seawater and some ice melt in polar regions. These factors (mostly the former) have resulted in sea-level rise. This results in more flooding. In fact, an increase in “sunny day” flooding is one of the very few easy-to-observe widespread phenomena that we can link to greenhouse gas emissions in a convincing fashion.

That said, the areas most at-risk now and in the near future are almost all places where climate change isn’t the dominant concern. Changes in the levels of continental plates, as well as land loss caused by hydrological projects and other human activity, can have local impacts hundreds of times larger than those caused by global warming. Purely natural processes like erosion and seasonal plant growth also can change which particular areas will flood, how badly and how often. In any given area, these factors can be far more likely to make the difference than sea-level rise, which generally proceeds at a scale noticeable only after decades have passed. The folks who wrote the Obama executive order—I talked with them a bunch—knew this well and wrote the order in a neutral fashion to deal with whatever was causing flooding.

In its press statements and publicity, however, the Obama administration insisted on positioning the executive order as a response to climate change. While any number of factors—including a genuine desire to cut red tape surrounding infrastructure projects, pressure from builders and Trump’s own career as a real-estate developer—played a role in his decision to rescind the order, I can’t help but think that simple distaste for anything the Obama administration labeled “climate policy” may have been the driving motivation to repeal the standards.

In part because climate change policy has become such a political hot potato—and because so many on the left have turned it into a culture war issue—focusing on climate change was clearly the wrong move for the Obama administration. As a result, the wrong messaging may have contributed to a very unfortunate policy decision.

Image by MaryValery


Clark Packard talks NAFTA renegotiation on Fox


Negotiators for President Donald Trump, Mexican President Enrique Peña Nieto and Canadian Prime Minister Justin Trudeau are set to meet in Washington today and over the next three days for the first round of talks to renegotiate the North American Free Trade Agreement.

R Street Trade Policy Analyst Clark Packard, who back in June co-authored R Street’s comments to the Office of the U.S. Trade Representative on the subject of NAFTA renegotiation, discusses the history of the agreement, its benefits and ways it still could be improved in a new FoxNews.com video profile, embedded below.

Andrew Heaton on how to stop patent trolls

In his latest video for Reason’s Mostly Weekly series, R Street Associate Fellow Andrew Heaton takes on the subject of patent trolls and what to do about them — particularly in light of a recent decision by the U.S. Court of Appeals for the Federal Circuit that Personal Audio LLC doesn’t own the patent on the entire podcasting industry.

Kosar talks postage rates on APM’s Marketplace

R Street Vice President of Policy Kevin Kosar appeared on American Public Media’s Marketplace Morning Report to discuss efforts by the U.S. Postal Service to gain more flexibility to raise rates without congressional approval, and how that could allow the agency to cross-subsidize businesses where it competes directly with the private sector.

R Street’s voting guide for SXSW panels!



We’ve put together some great policy panels for next year’s SXSW conference in Austin, Texas. BUT WE NEED YOUR HELP to get into the final conference program!

Please vote for us and help bring free-market ideas to Austin’s annual gathering of technologists, activists and entrepreneurs.

Panels featuring R Streeters:

Global Ecosystems and the Policies that Support Them: CLICK TO VOTE!

  • Featuring: Melissa Blaustein, founder and CEO, Allied for Startups;
  • Zach Graves, tech policy program director and senior fellow, R Street Institute;
  • David McCabe, technology reporter, Axios; and
  • U.S. Rep. Blake Farenthold, R-Texas.

How Scientology and Porn Shaped the Internet: CLICK TO VOTE!

  • Featuring: Sasha Moss, technology policy manager, R Street Institute;
  • Christian Dawson, co-founder and executive director, Internet Infrastructure Coalition (i2C);
  • Aaron Perzanowski, professor of law, Case Western Law School; and
  • Katie Oyama, Google.

RoboCop: Is Artificial Intelligence the Future of Criminal Justice? CLICK TO VOTE!

  • Featuring: Arthur Rizer, national security and justice policy director, R Street Institute;
  • Ryan Calo, assistant professor, University of Washington School of Law;
  • Heather West, senior policy manager, Americas principal, Mozilla; and
  • Vikrant Reddy, senior research fellow, Charles Koch Institute.

Virtual Reality Codes of Conduct in the Virtual Wild West: CLICK TO VOTE!

  • Featuring: Anne Hobson, associate fellow in technology policy, R Street Institute;
  • James Hairston, head of public policy, Oculus;
  • Alexis Kramer, legal editor, Bloomberg BNA; and
  • Matthew Schruers, adjunct professor of law, Georgetown University Law Center.

We’ve also put together a great list of policy panels from our friends! 

If you have any panels you’d like us to add to this list, please email Sasha Moss: smoss@rstreet.org.


Congress may be more bipartisan than you think


At the Library of Congress’ Congress and History conference, political scientists James Curry and Frances Lee presented their working paper “Non-Party Government: Bipartisan Lawmaking and Theories of Party Power in Congress.” In the paper, the authors examine the degree to which increases in polarization and the centralization of power in Congress have resulted in strictly partisan lawmaking.

In short, they want to know if the common characterization of Congress is accurate: in our current era of hyperpolarization and confrontational politics, do majorities in Congress skip bipartisan legislating and pass bills over the strong objections of the minority? Turns out – not so much. Curry and Lee “find that lawmaking today is not significantly more partisan than it was in the 1970s and 1980s.”

Such a conclusion is a bit counterintuitive, given seemingly constant claims that parties are unwilling and unable to work together. Both parties have accused the other of ramming legislation down the throats of the minority without even a semblance of compromise or debate. Democrats have most recently leveled that charge at the GOP’s maneuvers regarding the American Health Care Act.

The perception that majorities run roughshod over minorities is based on a couple of observable characteristics of recent Congresses. First, increased polarization has diminished the overlap in policy preferences between the parties, theoretically decreasing the likelihood of reaching bipartisan agreements. Second, stronger, more cohesive party organizations have developed and have centralized power in leadership offices in order to facilitate partisan lawmaking. As the authors put it, “Members have provided their leaders a bevy of procedural and agenda-setting tools to structure the legislative process in ways that stand to benefit the majority party.” These tools include bypassing the traditional committee-driven legislative process in favor of leadership-managed policy creation and granting leadership a near-monopoly on deciding which issues come up for a vote.

Both of these factors—polarization and more cohesive parties with centralized power—lead observers to hold two important expectations:

  1. Bills that are actually signed into law are likely to be passed without bipartisan support; and
  2. The majority party is more effective at realizing its legislative agenda in spite of minority opposition.

Curry and Lee, however, show that neither of these expectations is supported by the data.

For their analysis, the authors compile all passage votes in both chambers for bills that became law in the 93rd-113th congresses (1973-2014). Additionally, Curry and Lee use a subset of bills identified as “landmark legislation” by fellow political scientist David Mayhew to examine whether these more significant bills received less bipartisan support due to their increased impact and salience.

Below is a brief discussion of three key findings from the paper, all of which suggest that lawmaking in Congress still generally requires, and receives, bipartisan support.

[Figure 1: Average minority-party support on bills that became law, U.S. House, 1973-2014]
Most laws, including landmark legislation, are passed with strong bipartisan support. The above figure shows the average percentage of minority-party support on all bills that became law during each congress from 1973 to 2014 in the House of Representatives. Contrary to expectations, the figure shows no clear trend line of decreased minority support. On all bills that became law during this period, more than 60 percent of minority lawmakers voted in favor of passage on average, and in many congresses more than 80 percent of the minority voted yes. In fact, in the most recent congresses where polarization is most intense, we find the percentage of minority support is even higher than in less-partisan congresses of previous decades.

On landmark laws we see more variation in minority support across congresses, but still find that, on average, more than 65 percent of minority lawmakers vote in favor of these laws. Only in two congresses, the 103rd and the 111th, does the percentage of minority support fall below 50 percent. Similar patterns are found in the Senate, though not discussed here. (Please see the linked paper for the data and analysis for the upper chamber.)

[Figure 2: Minority-party roll rates on bills that became law, U.S. House, 1973-2014]

Only rarely does the majority pass laws over the opposition of a majority of the minority party. The above figure shows the percent of laws that were passed in the House despite a majority of the minority voting no – this is referred to as the minority getting rolled by the majority. On average, the minority roll rates were less than 15 percent for all laws passed during the period under study. In only a handful of congresses does the roll rate get above 25 percent, with the 103rd Congress showing the highest roll rate of more than 30 percent. Again, we see no upward trend in roll rates despite stronger parties and increased centralized power in leadership offices.
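Curry and Lee’s roll-rate metric is easy to reproduce from passage-vote tallies. The short sketch below uses invented vote counts (not the authors’ data) purely to illustrate the definition: the minority is “rolled” on a law when a majority of its members voted against passage.

```python
# Illustrative only: computes a minority "roll rate" from hypothetical
# passage votes. Each tuple is (minority yes votes, minority no votes).
votes = [
    (150, 40),   # broad bipartisan support -- not a roll
    (30, 160),   # majority of the minority voted no -- a roll
    (95, 90),    # narrow minority support -- not a roll
    (20, 170),   # a roll
]

# A law counts as a roll when minority "no" votes exceed minority "yes" votes.
rolled = sum(1 for yes, no in votes if no > yes)
roll_rate = 100 * rolled / len(votes)
print(f"Minority rolled on {rolled} of {len(votes)} laws ({roll_rate:.0f}%)")
# -> Minority rolled on 2 of 4 laws (50%)
```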

Roll rates are moderately higher in the House on landmark laws, particularly in more recent congresses. However, even on these major bills, the minority is rolled only about 30 percent of the time. Notable exceptions are the 103rd and 111th congresses, in which the minority was rolled on more than 70 percent of landmark laws.

In the Senate, there is more variation in roll rates across congresses, but on average, the minority is rolled on less than 15 percent of all laws. On landmark laws in the Senate, there is only a slight increase in roll rates, with 19 percent of major bills being passed with the majority of the minority voting no.

[Figure 3: Majority-party success in enacting its legislative agenda, 1973-2017]

Despite increased majority party tools, congressional majorities do not pass a greater portion of their legislative agenda than congresses in less partisan eras. In addition to looking at levels of minority support on legislation and roll rates, Curry and Lee also assess the degree to which majorities are able to enact their legislative agendas. Because of the increased cohesion of parties and tools granted majority leaders, we would expect to find that majorities are more effective in realizing their policy goals. Instead, the authors find that “congressional majorities rarely are able to enact new laws addressing priority agenda items that achieve most of what they set out to achieve. Far more frequently, majorities achieve none of what they set out to achieve or just some of it.”

The figure above displays the percentage of majority party agenda items enacted from 1973 to 2017, categorizing the majority’s success on prioritized issues as achieving some, most or none of its policy goals. While there is notable variation in the majority party’s ability to implement its agenda, the most frequent outcome is that the majority realizes none of its legislative goals. Only rarely does the majority get most of what it wants on agenda items, particularly in more recent congresses. Even in congresses with unified party control, the majority struggles to get even some of what it’s after. Having congressional majorities—as Senate Majority Leader Mitch McConnell, R-Ky., and House Speaker Paul Ryan, R-Wis., could tell you—does not automatically translate to the majority dictating policy terms to the minority. Instead, it appears the majority must make concessions on policy goals to ensure passage.

In spite of stronger, more cohesive parties and more powerful leaders equipped with tools for partisan lawmaking, most laws Congress passes win the support of large percentages of the minority. Contrary to persistent claims of majority-party dominance over the minority, laws, including landmark bills, are typically passed with majorities of both parties in support.

Here’s the bottom line, in the words of the authors:

After decades of partisan change and institutional evolution in Congress, lawmaking remains a process of bipartisan accommodation.

Image by Lightspring


Courts deal another blow to Obama climate legacy


Attempts by the Environmental Protection Agency to regulate greenhouse gases suffered another setback Tuesday, when a panel of the U.S. Court of Appeals for the D.C. Circuit invalidated an Obama-era EPA rule governing the use of hydrofluorocarbons (HFCs).

HFCs are greenhouse gases. They’re less well-known than, say, carbon dioxide, but they still have a warming effect when present in the atmosphere, and the rapid rise of HFC emissions in recent years has been a growing concern for policymakers.

Ironically, HFC use has been encouraged by EPA regulation, which authorizes manufacturers to use HFCs as a replacement for other substances that negatively affect the ozone layer. The regulation struck down this week was EPA’s belated attempt to walk back this legacy, telling companies to forget what it said previously, because HFCs are bad now.

The problem is that the statute EPA claimed gave it the authority to restrict HFCs is about restricting ozone-depleting substances. But as everyone (including the EPA) concedes, HFCs don’t deplete ozone. According to the court, since the EPA had already OK’d manufacturers using HFCs as replacements for actual ozone-depleting substances, it couldn’t use the law governing ozone to bootstrap regulation of HFCs.

All this is somewhat technical, but it raises a broader issue. The EPA’s HFC regulation is one example of a larger strategy adopted by the Obama administration and some in the environmental movement to circumvent Congress when it comes to climate change policies. Instead of working out a viable legislative solution that would deal with the problem, the administration looked for ways to commandeer existing statutory and regulatory provisions as a basis for limiting greenhouse gas emissions. Often, this involved stretching the meaning or purpose of particular provisions until they bore little resemblance to how they traditionally were used. The biggest example of this, of course, was the Clean Power Plan.

Now I can almost hear the shouting as I type these words. Obama had no choice! Republicans in Congress were obstructionists, and never would have passed anything. This overlooks that Democrats controlled the House of Representatives and held a filibuster-proof Senate majority during the first years of Obama’s presidency and still couldn’t enact their climate plan, but let’s leave that aside. My point is this: whatever the rationale for trying to act on climate without Congress, recent events have shown that it is a very fragile strategy.

When the EPA stretches its authority to act without congressional sanction, it risks having its work undone by the courts. And even where an EPA action might survive judicial scrutiny, it is vulnerable to being revoked by a future EPA with a different political bent. What can be done without Congress probably can be undone without Congress. This week’s court decision is simply more evidence that any lasting action on climate is going to have to involve Congress.

Image by Evan El-Amin

Diverse voices unite to ask Congress not to gut Section 230


It’s hard to argue against a bill as unassailably titled as the Stop Enabling Sex Trafficking Act, introduced in the Senate last week as S. 1693. The measure already enjoys broad bipartisan support and boasts 27 cosponsors.

However, in its effort to punish online sex traffickers, this legislation appears likely to have unintended damaging consequences wholly unrelated to that goal. Since its introduction, a broad array of voices—including civil liberties groups, think tanks, startups and tech industry groups—have come out, despite obvious reputational risks, to point out ways the bill would be counterproductive and damaging to internet freedom.

The proposed legislation includes overly broad language that would modify Section 230 of the Communications Decency Act, which provides online platforms a limited liability shield for user-generated content. Were that shield weakened, online platforms would be liable for the behavior of their users. Critics of the legislation agree that without these protections, America’s unique and innovative internet ecosystem would collapse.

As R Street wrote in a bipartisan coalition letter with other think tanks and civil-society organizations, this well-intentioned bill threatens to weaken the pillars of internet freedom. Human rights and civil liberties organizations have voiced concerns that the bill would lead to increased censorship across the web. Moreover, it would hinder existing voluntary incentives to stop trafficking and discourage platforms’ proactive efforts to address evidence of trafficking, for fear of being implicated and prosecuted.

Currently, online communications pass through multiple intermediaries—including web-hosting platforms, email providers, messaging services, search engines, online advertisers and more—all of whom depend on protection from misdirected legal threats. Without the protection of Section 230, each intermediary could face potential lawsuits based on the millions of videos, posts and pictures uploaded to their platforms every day. Many stakeholders have pointed out that it’s unlikely the bill will do anything to combat trafficking, but it will certainly invite trial lawyers to bring a deluge of frivolous lawsuits that target law-abiding platforms.

The Electronic Frontier Foundation has called Section 230 “one of the most important laws protecting free expression online.” To be clear, Section 230 does not provide blanket immunity and has never prevented intermediaries from facing federal criminal charges. The U.S. Justice Department can already pursue anyone who violates trafficking statutes on internet platforms, without any changes to existing law.

“If online intermediaries were held responsible for the actions of each and every user, the potential liability would be so massive that no reasonable person would start or invest in such a business,” the Consumer Technology Association stated.

A multitude of tech coalitions also have highlighted how the overly broad legislation would harm legitimate U.S. tech companies. Without the protection provided by Section 230, every internet platform would have to engage in self-censorship and resource-intensive review of all user-generated content. While some tech giants might be able to shoulder the cost, the burden undoubtedly would stifle development of smaller websites and startups. Law-abiding citizens would be left dealing with the repercussions, while bad actors could easily escape by moving abroad or changing their domain names.

Section 230 promotes positive legal behavior. The tech industry has been cooperative in the fight against trafficking, working closely with law enforcement to identify potentially illegal activities. The Copia Institute and Engine Advocacy highlighted in their letter how the industry has created its own tools, combining cutting-edge technology and big data, to eradicate trafficking in the online sphere. This bill could have a chilling effect on the industry’s relationship with law enforcement. Trade associations spanning the breadth of the U.S. media and technology industries have described how the measure would be counterproductive to those companies’ efforts to combat sex trafficking. Ultimately, it would create incentives not to filter proactively for evidence that might expose companies to criminal liability.

New legislation is not necessary to hold bad actors accountable for their participation in illegal activity. The internet is built on user-generated content, and the ramifications of a bill like this would be devastating.

Image by KreativKolors


Free to Brew: Alabama’s war on margaritas

Cameron Smith uncovered an attempt by Alabama’s overzealous alcohol control board to ban the sale of pitchers of margaritas to adults. He explains how his team helped pressure the nannies in Alabama to reverse their decision and once again let consenting adults voluntarily purchase pitchers of margaritas. He also talks about how people can replicate the success!

Why should conservatives care about urbanism and city development?

Jonathan Coppage, visiting senior fellow with the R Street Institute, where he researches urbanism and the built environment, joins host Gracy Olmstead on this episode of Federalist Radio. They discuss the ways that design can have an impact on our communities and neighborhoods.

“Building a house to engage and to face the street is the first step of reviving a public space,” he said. “Having a public space that orients people towards it is not just part of good community foundation…it’s part of public safety.”


They discuss Jane Jacobs, Wendell Berry, and others who have written about the spaces in which we live.

What the budget process can tell us about the state of the Senate


Congress is running out of time to fund the federal government for the upcoming fiscal year that begins Oct. 1.

In July, the House of Representatives passed four appropriations bills bundled together in a so-called minibus. But senators chose to leave town for their August recess rather than take up that spending package.

And there won’t be much time to do so when they return in September. The Senate is currently scheduled to be in session for only 17 days next month. The House and Senate will be on the job at the same time for only 12 of those days.

That doesn’t leave a lot of time for the Senate to take up and debate the House-passed minibus, much less the other eight appropriations bills that have yet to be considered by the full House or Senate. A short-term continuing resolution to keep the government open while Congress finishes its work appears inevitable.

Often overlooked in reporting on this state of play is the fact that Congress has yet to pass a budget resolution for the fiscal year that begins at the end of next month. This is significant, because the budget provides the framework in which the appropriations process unfolds. That is, it governs annual spending decisions in the House and Senate. As such, its consideration is meant to precede that of the appropriations bills.

But that rarely happens these days.

Instead, Congress routinely fails to pass a budget at all. For example, Congress passed only two budgets in the seven years since 2010. And only one of those (in 2016) can be thought of as a budget in any meaningful sense. Congress passed the other one (in 2017) simply to make it possible for Republicans in the House and Senate to repeal and replace Obamacare via reconciliation. Members were focused on the budget’s reconciliation instructions and not its top-line spending, revenue and debt numbers.

A recent paper from Brookings Institution Fellow Molly Reynolds and the Center for Effective Public Management tackles this phenomenon and, in the process, provides valuable insight into why the Senate has been reluctant to take up a budget in recent years.

According to Reynolds, two developments are to blame. First, the budget process has become a partisan exercise. This aligns with how we typically think about the resolution itself: as a symbolic document reflecting the priorities and governing agenda of the majority party. Given the controversial nature of budgetary politics today, it is hard to imagine a policy area that generates a comparable degree of conflict on such a consistent basis.

As a result, budget votes have become party-line affairs, where senators from one side of the aisle reflexively line up in opposition to those on the other. In this environment, members of the minority party rarely cross over to support the majority’s budget.

One consequence of this is that it is now harder for Senate majorities to pass a budget when they are divided. Achieving party unity is made even more difficult with the strict statutory limits placed on defense and nondefense discretionary spending by the Budget Control Act of 2011.

Given that Senate minorities cannot obstruct budget resolutions, this dynamic also provides insight into how we should expect the institution to operate if a majority uses the nuclear option to eliminate the legislative filibuster in the future. If recent experience with the budget is any guide, empowering a majority to pass measures in the Senate unencumbered by the minority will not necessarily guarantee a sudden burst of legislative productivity.

Reynolds also suggests that the Senate’s reluctance to consider the budget resolution may be driven by the broader breakdown in the institution’s decision-making process more generally. That is, members increasingly offer more floor amendments during the consideration of the budget because it represents one of the few instances when they know they will have the opportunity to do so.

Overall amendment activity in the Senate has declined. While the number of amendments that are filed to legislation considered on the floor has remained relatively consistent, the number of those amendments that are eventually offered (i.e., made pending) to bills has dropped considerably. The reason is that leaders from both parties have utilized a complex assortment of rules and practices to exert greater control over the Senate floor than at any point in the institution’s history. The principal means by which they establish such control is their ability to fill the amendment tree, or offer the maximum allowable number of amendments to legislation. No amendments are in order once all the extant branches on the tree are occupied. As a result, senators are blocked from offering their own amendments.

But it is harder for leaders to block amendments during the budget’s consideration because members can continue offering amendments during the so-called vote-a-rama period once all debate time on the resolution has expired. The budget thus offers members a relatively easy way to engage in credit-claiming and position-taking activities on the Senate floor.

In highlighting these problems, Reynolds underscores the various ways in which the contemporary budget process is in tension with itself. Acknowledging the trade-offs inherent in such contradictions is an important first step in designing reforms that can help reverse Congress’ current trend of not considering a budget.

Several of these reforms are reviewed in the paper, including setting an overall limit on the number of amendments a senator may offer during floor consideration and creating a cloture-like filing deadline for those amendments to give members more time to review them before having to cast their votes.

Another possibility is to revise the contents of the budget resolution to include more information that would help rank-and-file members and their staff independently assess the budget. Currently, budget enforcement mechanisms are tied to committee allocations, but few members (and few staff outside of the leadership and budget committees) fully understand how those allocations relate to the functional categories in the budget resolution text. The allocations are not publicly available until they are published in the conference report’s statement of managers at the end of the process. Requiring the budget’s major functional categories to be replaced in the text, or at least supplemented, with specific committee allocations for budget authority, outlays, contract authority (where appropriate) and revenues (where appropriate) would enhance senators’ ability to evaluate how any amendments offered, as well as the underlying resolution itself, would affect their priorities for the upcoming year.

Reforms like these would certainly make it easier for members to weigh the merits of various amendments and the budget resolution itself. But Reynolds concludes with the astute observation that such changes may be insufficient so long as senators are not able to offer amendments freely to other measures on the Senate floor. That is, the budget resolution and vote-a-rama are likely to remain an outlet for pent-up member demand to participate in the legislative process without changes to how the Senate makes decisions more generally.

Image by nelzajama


Juvenile justice reform finally clears its U.S. Senate hurdle


The following post was co-authored by R Street Research Assistant Megha Bhattacharya.

It’s been 10 years since the expiration of the Juvenile Justice and Delinquency Prevention Act, which created America’s federal standards for the treatment of juvenile offenders. Efforts to reauthorize the legislation have failed repeatedly.

However, the Senate last week passed its version of the JJDPA reauthorization bill—S. 860, the Juvenile Justice and Delinquency Prevention Reauthorization Act—a development that gives hope to juvenile justice reform advocates across the country.

Previous reauthorization attempts faced significant hurdles. Sen. Tom Cotton, R-Ark., held the bill last year over an objection to the phase-out of the valid court order (VCO) exception. VCOs allow state and local systems to detain youth for committing so-called “status offenses” like running away from home, truancy, underage smoking and curfew violations – things that wouldn’t be crimes but for the age of the perpetrator. The VCO exception, Cotton argued, grants state courts additional options when dealing with juvenile offenders.

But Cotton’s opposition prompted a hold from Sen. Rand Paul, R-Ky., who stated he would not support the bill without the phase-out. Ultimately at an impasse, last year’s negotiations ran out of time.

For reauthorization to be successful, both the House and Senate bills must be agreed upon in conference committee and then passed by both chambers of Congress. Leaders on the issue in the House released a statement shortly after news of S. 860’s passage, expressing commitment to crafting a final reauthorization bill alongside their Senate colleagues.

Senate Judiciary Committee Chairman Chuck Grassley, R-Iowa, and Sen. Sheldon Whitehouse, D-R.I., are leading the Senate effort. It is anticipated the bill will reach the president’s desk before the end of this congressional session.

Image by Air Images

Virgin Islands follow Puerto Rico into the debt day of reckoning


What do Puerto Rico and the U.S. Virgin Islands have in common?  They are both islands in the Caribbean, they are both territories of the United States and they are both broke.

Moreover, they both benefited (or so it seemed in the past) from a credit subsidy unwisely granted by the U.S. Congress: their municipal bonds are triple-tax-exempt everywhere in the country, something U.S. states and their component municipalities never get. This tax subsidy helped induce investors and savers to lend imprudently to both territorial governments, financing their ongoing annual deficits and thus creating the present and future financial pain of both.

Puerto Rico, said a Forbes article from earlier this year—as could equally be said of the Virgin Islands—“could still be merrily chugging along if investors hadn’t lost confidence and finally stopped lending.” Well, of course: as long as lenders foolishly keep making you new loans to pay the interest and principal on the old ones, the day of reckoning does not yet arrive.

In other words, both of these insolvent territories experienced the Financial Law of Lending. This, as an old banker explained to me in the international lending crisis of the 1980s, is that there is no crisis as long as the lenders are merrily lending. The crisis arrives when they stop lending, as they inevitably do when the insolvency becomes glaring. Then everybody says how dumb they are for not having stopped sooner.

Adjusted for population size, the Virgin Islands’ debt burden is of the same scale as Puerto Rico’s. The Virgin Islands, according to Moody’s, has public debt of $2 billion, plus unfunded government pension liabilities of $2.6 billion, for a total of $4.6 billion. The corresponding numbers for Puerto Rico are $74 billion and $48 billion, respectively, for a total of $122 billion.

The population of the Virgin Islands is 106,000, while Puerto Rico’s is 3.4 million, or 32 times bigger. So we multiply the Virgin Islands’ obligations by 32 to see how they compare. This gives us a population-adjusted comparison of $64 billion in public debt and $83 billion in unfunded pensions, for a total of $147 billion. The two territories are in the same league of disastrous debt burden.
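The population adjustment here is simple arithmetic, and the cited figures can be checked directly. The sketch below assumes the Moody’s numbers quoted in this post and the rounded 32-to-1 population ratio:

```python
# Check the population-adjusted debt comparison (dollar figures in billions,
# as cited from Moody's in the text above).
vi_debt, vi_pensions = 2.0, 2.6        # U.S. Virgin Islands
pr_debt, pr_pensions = 74.0, 48.0      # Puerto Rico

ratio = 3_400_000 / 106_000            # Puerto Rico pop. / Virgin Islands pop.
print(round(ratio))                    # -> 32

# Scale the Virgin Islands' obligations up to Puerto Rico's population.
scaled_debt = vi_debt * 32
scaled_pensions = vi_pensions * 32
print(round(scaled_debt), round(scaled_pensions),
      round(scaled_debt + scaled_pensions))
# -> 64 83 147  (vs. Puerto Rico's actual $74B + $48B = $122B)
```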

What comes next? The Virgin Islands will follow along Puerto Rico’s path of insolvency, financial crisis, ultimate reorganization of debt, required government budgetary reform and hoped-for economic improvements.

A final similarity: The Virgin Islands’ economy, like Puerto Rico’s, is locked into a currency union with the United States from which, in my opinion, it should be allowed to escape. This would add external adjustment to the imperative internal adjustment as the debt day of reckoning arrives.

Image by Peter Hermes Furian


Free-marketers, environmentalists both have reasons to hate the RFS


The Renewable Fuel Standard, created more than a decade ago, remains the source of strong divisions today. But as an Aug. 1 hearing of the Environmental Protection Agency showed, it also can be the source of rare bipartisan agreement, with experts from across the political spectrum testifying to the need to update and reform the RFS.

Under terms of the Energy Policy Act of 2005, the RFS “requires a certain volume of renewable fuel to replace or reduce the quantity of the petroleum-based transportation fuel, heating fuel, or jet fuel.” Two years later, the Energy Independence and Security Act of 2007 updated the RFS and set a projection for the volume of renewable fuels, particularly ethanol, that are mandated to be mixed into the nation’s fuel supply.

Under current projections, by 2022, 15 billion gallons of corn-based ethanol and 2.1 billion gallons of non-corn biofuel will be required in the nation’s fuel supply. While these numbers simply continue existing statutory requirements, both environmental and free-market groups have noted the updated volumes will have harmful effects on the fuel market and on car engines, as well as contribute to pollution from farm runoff.

Before the RFS was passed, oil companies already had been producing gasoline with a 10 percent blend of ethanol—what’s commonly called E10—as corn-based ethanol is generally cheaper than its counterparts derived from petroleum. However, the RFS mandates do not stop at E10. In the effort to “create a market” for advanced fuels, the RFS now calls for blending more ethanol into gasoline than consumers are willing to buy.

Most vehicles on the road can use E10, the highest ethanol blend that does not void vehicle warranties. But many car engines are not warrantied for higher ethanol blends, which can cause severe damage and corrosion.

“We were pleased to see that the Environmental Protection Agency acknowledged ‘real constraints’ in the market, in terms of demand, infrastructure and production, toward accommodating higher blends of ethanol,” the National Taxpayers Union’s Nan Smith testified before the EPA. “If admitting you have a problem is the first step toward recovery, this and the slightly lower [renewable volume obligations] recommended in the 2018 proposal are good signs for taxpayers.”

Unfortunately, the RFS itself makes no consideration for the consequences faced by consumers. Due to the strict requirements built into the law, the EPA is unable to adjust the volume requirements downward in the face of lower-than-expected demand. This mismatch leaves companies scrambling for ways to comply, rather than dedicating their energies toward real, market-driven innovation.

These market distortions alone would be reason enough to oppose the RFS, but regrettably, it turns out the mandate is also damaging to the environment, particularly by encouraging heavy use of the nitrogen-rich fertilizers needed to grow corn. The ever-growing call for corn-based ethanol under the RFS creates larger demand for corn and more pollution from its production. The runoff from large farms in the Midwest and Great Plains makes its way into the Mississippi River and has created a large dead zone in the Gulf of Mexico.

While environmental groups overall are split on the effectiveness of the RFS mandates, Friends of the Earth has opposed the standards because of the pollution they cause. “As it ignores the significant environmental damage created by runoff from biofuels production, the RFS will likely exacerbate the problem,” the group notes.

The RFS safeguards do require that biofuels meet a greenhouse gas emissions-reduction standard for each biofuel type. Ethanol made from corn must reduce greenhouse gas emissions by 20 percent; advanced biofuels must reduce greenhouse gas emissions by 50 percent; and cellulosic biofuel must reduce greenhouse gas emissions by 60 percent. These are good standards to have, but they have loopholes that cause the effort to fall short of its desired effect. As it stands, 15 billion gallons of corn ethanol are exempt from the safeguards. FOE adds that the EPA uses flawed data on the true impact of biofuels:

For example, the EPA uses a questionable analysis to predict that corn ethanol will produce less pollution than regular gasoline one day in the future, and then uses that analysis to excuse the use of extremely dirty corn ethanol today.

Rather than hold RFS volumes steady, the EPA should work with Congress to correct what is a fundamentally flawed statute, with the goal of creating an environment where market innovation is encouraged, rather than creating fake markets for industries with powerful lobbyists. As R Street’s Lori Sanders testified at the EPA’s recent RFS hearing:

Rather than continue down this failed path, we at R Street encourage the EPA to work with Congress to pass reforms that work. The federal government does have a role to play in creating an environment in which new fuels and technologies can take root in the marketplace and, in the process, reduce emissions and preserve the environment for generations of citizens to come. Sadly, the RFS does not fit the bill, and the new EPA should seek better solutions.

Image by Jonathan Weiss


U.S. steel requirements for pipelines undermine American energy, trade


The United States is a free-trading nation, regardless of what President Donald Trump says on any given day. Any doubters about current U.S. trade policy should look no further than an Aug. 1 op-ed in The Wall Street Journal written by U.S. Commerce Secretary Wilbur Ross entitled “Free Trade is a Two-Way Street.”

The article and associated graph clearly show how much lower U.S. tariffs are for nearly all imported products from the European Union and China than vice versa, with China being the bigger protectionist. The Trump administration is preparing to launch a major attack on China’s trade barriers, but the trade-barrier proposals the president has made at home are deeply inconsistent with free trade in ways that undermine U.S. jobs and energy security.

In particular, the Commerce Department is expected to submit a proposal that would require domestic steel be used in all domestic pipelines, a proposal that could dramatically upend the ability of pipeline operators to source materials at a time of booming demand.

The United States has seen a pipeline boom this past decade, thanks to surging shale gas and tight oil production, with roughly 20,000 miles of oil pipeline added since 2010 and more than 10,000 miles of natural gas pipeline added each year since 2008, according to the U.S. Transportation Department.

But few U.S. firms make the type of steel pipe used in large pipelines, and 77 percent of the steel used in line pipe comes, one way or another, from foreign sources, particularly China, Japan, Turkey and South Korea.

According to ICF International, requiring domestic steel could add dramatically to pipeline costs, both in money and time, since disrupting the current international supply chain would cause shortages and possibly curtail future pipeline investments.

Depending only on U.S.-produced pipe “could lead to long construction delays and higher costs, potentially canceling planned pipeline projects or blocking new projects,” wrote a group of oil and gas associations to the Chamber of Commerce back in April. Pipe operators cannot simply substitute other materials or products when constructing and repairing pipelines, ICF wrote.

Such restrictions on trade fly in the face of everything the U.S. energy space has learned since the marriage of hydraulic fracturing and horizontal drilling caused oil and gas development to explode forward around 2008.

Since that time, $1 trillion in capital—much of it foreign investment—has been raised and spent to boost the drilling and transportation of oil and gas from shale fields around the country. As we speak, five separate pipelines—the Atlantic Sunrise Pipeline, the Nexus Pipeline, the Dakota Access Oil Pipeline, the Rover Pipeline and the Mariner East II—are either complete or within months of completion, moving tens of millions of dollars of oil and gas to market every day using steel sourced from around the world.

Trump’s attention—some say, fixation—on the United States’ structural trade deficit and his proposals to solve it no doubt are among the reasons for his election. But it makes no sense to place trade restrictions on the energy supply chain when the product being produced, oil and gas, has much higher value and can have a dramatically greater impact on the country’s long-term health than demand for domestic steel pipe.

Image by fuyu liu


Congressional Pit Stop: How legislative dysfunction deters young talent


Young people yearn to enact change and make their mark upon the world. Many of them, however, no longer see government as a viable arena in which to do so, in no small part due to congressional dysfunction.

Nurtured in a country constantly at war for most of my life, and thrust into maturity during the worst financial crisis in decades, my generation has developed a deep sense of political skepticism. Large swaths of young Americans no longer have faith in political institutions and processes, and view the government as powerless to combat injustice or solve problems.

Yet without fail, throughout the school year, the University of Chicago Institute of Politics invites myriad political speakers to campus. From members of Congress to idealist activists, their message remains unanimous: There is an unmet need for a new generation of public servants.

Each summer, D.C. is inundated with an influx of young student interns and staffers looking to make a difference. And while Congress remains a powerful attraction, more people are pursuing options beyond the Hill: turning down competitive government internships in favor of more fulfilling private-sector opportunities. As someone who’s made this exact decision, I am a part of the problem. The decision should not come as a surprise when many congressional internships have become dreary positions filled with administrative work and little connection to professional development.

And while interning itself is a temporary commitment, the disinterest in long-term governmental work among young people is indicative of a larger problem among congressional staffers. Amid high disapproval ratings, political gridlock and hyperpartisanship, the frustration within government is palpable, particularly among individuals my age. The decline of faith in political institutions, combined with a growth of opportunities to enact societal change outside of government, has led millennials to choose private-sector missions in growing numbers.

Though Congress will have little trouble filling many of its staffing positions, a serious underlying issue remains: are positions being filled by the most qualified candidates? Feelings of pessimism make it hard to attract young people to serve Congress, and even harder to retain them. As a result, it is difficult to generate institutional growth when each new wave of public servants views its time in our national legislature as a steppingstone to other, more meaningful opportunities.

Congress is supposed to be the foundation upon which the rest of the government edifice rests. It is the first branch, and was designed to be the driving force of policymaking, the repository of national powers and the channel of popular energy. Article I assigned Congress diverse and immense powers to govern so as to properly reflect property, people and political communities. Congress was once the bedrock institution but has fallen victim to its vices.

Established to make policy and respond to shifting social and economic needs, our national legislature is gridlocked by ideological strife. Because of this, Congress does not offer younger candidates an environment conducive to sustainable or meaningful growth. But more than that, the inability to govern signals a lack of congressional demand for the ready supply of ideas and talent – talent that therefore flows to workplaces off Capitol Hill.

While recent attention has been focused on President Donald Trump’s inability to fill high-level government positions, the bigger story is that decades of disinvestment in Congress have left rampant staffing problems within its daily structure. Legislative branch staffing has not grown proportionally with the expanding size of the government or the U.S. population, which has weakened the most democratic branch of government.

Experienced staff are a rarity. By the time congressional staffers gain high-level expertise, they’ve typically begun the process of cycling out of the institution to pursue other prospects. The continuous influx of bright and energetic staff is no substitute for staffers with policy experience. Disinvesting in the legislative branch talent pool has led to a dependence on external resources—mainly, interest groups—which have smarts but inevitably have an agenda. The decay of institutional knowledge is hampering effective governance.

Congressional reform should focus on battling the external pressures and strengthening the crumbling institutional structures through an increase in motivated staff with a focus on retention. While social and political issues continue growing in complexity, Congress remains unable to address them properly. The government is responsible for processing more information than ever before, and is doing so with even fewer resources. Why should Congress continue to rely on private research, elite op-eds and corporate lobbyists when it can strengthen itself from within?

Young professionals are demoralized by the behavior of Washington officials, but their disengagement is rooted in frustration, not apathy. It is misinformed to fault millennials for remaining unengaged with the Hill when the government itself has repeatedly and publicly divested from young talent. Without a clear solution, the dysfunction of Congress is condemned to spiral further. Instead, Congress should invest in creating long-term career paths and continuing-education opportunities for staffers. This is what congressional internships should be about.

A job on the Hill should be more than a pit stop. But it won’t be anything but that until Congress reforms itself.

Alex Pollock on the Peak Prosperity podcast

Appearing on the Peak Prosperity podcast, R Street Distinguished Senior Fellow Alex Pollock details his assessment of the Federal Reserve’s major transgressions against the interests of the general public. But perhaps more interestingly, he shares his observations from a recent House Financial Services Committee hearing on the same topic (at which he testified) and how it struck him that many of the members of Congress who convened it appear to be growing increasingly concerned about the Fed’s lack of accountability, as well as its potential fallibility.

For Harry Potter’s birthday, try on the federal affairs Sorting Hat


Today is Harry Potter’s 37th birthday. In honor of The Boy Who Lived and savior of the wizarding world, we had some administration officials and members of Congress try on the Sorting Hat to determine which house of Hogwarts is their true home.

As a proud Slytherin, I’d like to remind everyone that this is all in good fun, and each of the four houses has its merits. (Even Hufflepuff; J.K. Rowling herself would have been one.)

Do you agree with our sorting? Who did we miss? Let us know in the comments or tweet to us at @RSI! And always remember:



Sen. Mike Lee, R-Utah

Energy Secretary Rick Perry

Rep. Darrell Issa, R-Calif.

Sen. John McCain, R-Ariz.



Education Secretary Betsy DeVos

Rep. Blake Farenthold, R-Texas

Housing and Urban Development Secretary Ben Carson



Sen. Al Franken, D-Minn.

Sen. Tom Cotton, R-Ark.

Rep. Justin Amash, R-Mich.

Sen. Ted Cruz, R-Texas



Rep. Jared Polis, D-Colo.

Rep. Bob Goodlatte, R-Va.

Sen. Ron Wyden, D-Ore.

Transportation Secretary Elaine Chao

Sen. Ben Sasse, R-Neb.


Whitehouse-Schatz carbon tax moves in right direction, but falls far short


Sens. Sheldon Whitehouse, D-R.I., and Brian Schatz, D-Hawaii, are serious about tackling the challenge of climate change and they’re out this year with another carbon proposal intended to be an “olive limb” to the right. As Whitehouse describes it:

Virtually every person on the Republican side who has thought the climate change problem through to a solution has come to the same place: price carbon emissions to encourage cleaner energy and return the revenue to the American people.

That’s just what their new legislation intends to do. From 10,000 feet, it’s a promising start. The proposal imposes a tax on carbon emissions from fossil-fuel combustion and other major emitters; establishes a border adjustment to address concerns about competitiveness; and returns all the revenue, keeping none for the federal coffers.

The devil, however, is in the details. And that’s where the American Opportunity Climate Fee Act falls short.

First, there are the revenues. We know from the literature that a revenue-neutral carbon price can boost economic growth if revenues are devoted to cutting taxes on capital. Other ways of recycling the revenue—cutting payroll taxes, offering lump-sum rebates or reducing sales taxes—all act as a drag on the economy by comparison. The Whitehouse-Schatz proposal spends the revenue several ways: it reduces the top corporate income tax rate to 29 percent; offers a refundable tax credit to working Americans; offers additional payments to Social Security and veterans’ benefits recipients; and delivers $10 billion in annual block grants to the states.

The cuts to the corporate income tax rate are a good start, but insufficient. Any redesign of the corporate income tax should make the United States a more competitive place to do business; the Whitehouse-Schatz proposal would leave the United States with a tax rate that’s still 50 percent higher than the European average. That’s not exactly the ground-breaking shift we’re looking for.

Refundable tax credits to workers and additional payments to Social Security and veterans’ benefits recipients are intended to address the regressivity of a new tax on carbon. That’s a worthy goal; reducing greenhouse gas emissions shouldn’t increase the burden on those least able to pay. But the senators’ proposed structure creates a national constituency for something akin to a new entitlement. That constituency will support a tax just high enough to maintain annual payments and just low enough to not actually phase down the greenhouse gas emissions that support the new annual payment.

Lastly, the $10 billion in block grants, distributed on a per-capita basis, is intended to fund individual states’ efforts to help those who can least afford to pay the new taxes on energy, or those whose industries are hardest hit. That creates a serious issue for the most rural states with the lowest populations – Alaska, the Dakotas, Montana and Wyoming. These states also would be disproportionately affected; energy development is among the top five industries in Alaska, North Dakota and Wyoming.

Then there’s the matter of the tax itself. Whitehouse-Schatz would start at $49/ton of carbon dioxide in 2018, rising 2 percent above inflation year-over-year until an emissions target is attained. That’s a pretty high starting value: when the Congressional Budget Office modeled the Waxman-Markey cap-and-trade proposal in 2009, it estimated first year prices around $15/ton.
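For a sense of that trajectory, the proposed real price path can be sketched as a simple compounding function. This is my own illustration of the $49/ton and 2-percent-above-inflation figures, not language from the bill:

```python
def whitehouse_schatz_price(year, base=49.0, start_year=2018, real_growth=0.02):
    """Real (inflation-adjusted) tax per ton of CO2 in a given year,
    compounding 2 percent above inflation from $49/ton in 2018."""
    return base * (1 + real_growth) ** (year - start_year)

# The real price roughly reaches $60/ton about a decade in.
print(round(whitehouse_schatz_price(2028), 2))
```

Even before compounding, the 2018 starting point is already more than triple the CBO’s first-year Waxman-Markey estimate.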

More troubling, however, is how the tax is applied. The good news: it’s designed to be administratively simple, capturing emissions at as few collection points as possible and as accurately as possible. The bad news: in capitulating to environmentalists’ demands, it actually discourages industry best practices and safe, clean infrastructure. Whitehouse-Schatz requires that the tax be applied to “greenhouse gases that escape throughout the fossil fuel supply chains.” It would not be applied at the points of emission; rather, an adjustment to the tax would be applied equally to all producers and importers of fuel. Companies that use the best practices and the most advanced infrastructure with the fewest leaks will pay just as stiff a penalty as companies that wisely avoid investing in equipment from which they won’t benefit.

Finally, the Whitehouse-Schatz proposal doesn’t include any mechanism for regulatory preemption. The Environmental Protection Agency is obligated to regulate greenhouse gas emissions under the Clean Air Act, a mandate that created the faulty, expensive and ineffective Clean Power Plan. No tool within the CAA creates a proper framework for a regulatory solution. Even the Waxman-Markey cap-and-trade bill included provisions that would prevent the EPA from regulating carbon under certain provisions in the Clean Air Act. The senators, however, see this regulatory burden as a bargaining chip, not a problem to remedy.

For all its faults, the Whitehouse-Schatz proposal is promising in one respect: it demonstrates that motivated environmentalists know that market-based instruments can address the climate challenge effectively. An appropriately designed revenue-neutral carbon price can encourage economic growth, draw investment, boost innovation and achieve more emissions reductions at a lower cost than the regulatory machine. Toward that end, R Street has proposed a carbon tax that would finance the outright elimination of the corporate income tax, a proposal we believe would unleash capital markets and boost employment while untethering economic growth from a carbon-based fuel supply.

Sen. Whitehouse is right – conservative solutions can work. The American Opportunity Climate Fee Act, however, is a far cry from conservative.

Image by visualdestination

The Hillsborough PTC is dead; long live the Hillsborough PTC


After years of tormenting ridesharing companies Uber and Lyft, as well as their customers, with burdensome regulations designed to prop up area taxi cab companies, the Hillsborough County Public Transportation Commission is set to be dissolved later this year by an act of the Legislature. Founded in 1987, the Hillsborough PTC regulates ground transportation companies such as cabs and limousines, as well as overseeing tow-truck companies in the Tampa Bay area.

Now that the Hillsborough PTC’s days are numbered, some of its remaining proponents warn that consumers will lack the kinds of protections that apparently only the PTC can provide. A recent local news report homed in on the PTC’s oversight of tow-truck companies as an example.

Indeed, tow-truck company activities can and should be regulated by local and state authorities. However, it does not take an entire government agency to do just that. In fact, the PTC was the only such local transportation board in the entire state of Florida. Other counties delegate ground transportation, towing and other such oversight and regulation to police departments, consumer-protection bureaus and other departmental offices within county government.

In Miami-Dade County, for example, tow-truck companies are regulated by the Department of Regulatory and Economic Resources, which also enforces consumer-protection measures like maximum towing rates, background checks on tow-truck operators, vehicle-safety standards, insurance requirements and other protections and remedies established by the Miami-Dade County Commission for consumers who have been towed. Orange County, which includes Orlando, has a consumer fraud unit that deals with all sorts of consumer-related issues, ranging from house repairs and construction to towing grievances.

Many municipalities also enact their own regulations that either work in harmony with the county’s or add additional layers to them. Florida state law also establishes basic guidelines. While towing is an industry inherently prone to angry customers, Florida’s is a relatively stable market.

The Hillsborough County Commission is currently exploring ways to distribute the PTC’s regulatory responsibilities across existing county agencies. Tow-truck oversight, for example, is likely to be transferred to the Sheriff’s Office. The commission is set to consider this and other staff recommendations related to the PTC’s impending dissolution at its next meeting Aug. 16.

Residents should praise the Legislature for dissolving an obsolete, unnecessary government agency that had been undermining competition and restricting transportation choice. However, county residents should remain vigilant of commission proceedings to ensure it preserves the rules and regulations the PTC enacted once upon a time—those that were reasonable and worked. This exercise should not be used as an opportunity by local politicians, bureaucrats and entrenched interests to foist the kinds of unnecessary, burdensome regulations that led to the PTC’s dissolution in the first place.

Image by CrispyPork


Great ECPA expectations


When the Electronic Communications Privacy Act first was passed back in 1986, lawmakers mostly didn’t even imagine that email might play a central role in American life. Scarcely anyone in 1986—whether inside or outside of Congress—foresaw a day when we’d use the internet to help us find our misplaced phones and watches.

The digital landscape for Americans has vastly changed over the last three decades, but the central law spelling out when government needs to get a warrant to capture electronic communications has not. Because the internet is central to most of our lives, and because the potential scope of government intrusion on our lives has thus become vastly greater, it’s high time (or, really, past time) for Congress to update ECPA. That’s why we are pleased to see today’s introduction of the ECPA Modernization Act of 2017 by Sens. Mike Lee, R-Utah, and Patrick Leahy, D-Vt.

Congress is now poised to update the law in ways that reflect how pervasively we use digital communications and tools (computers, phones, watches, fitness trackers, and many other devices) in our everyday lives. The act aims to fix some serious flaws in the older law. The ECPA Modernization Act is not just about the content of digital communications; it’s also about the geolocation features (and other non-email, non-messaging features) that internet services increasingly offer us.

That’s not to say that the ECPA Modernization Act is perfect. It is a fundamental principle of liberal democracy that there should be limits on what government can grab from your digital world. These limits are essential to understanding the Fourth Amendment in the 21st century. Even as we see progress toward updating digital-privacy laws, it’s essential to point out that plenty of issues, such as the gathering and analysis of metadata, still need to be revisited and more thoroughly reviewed from a pro-privacy standpoint. (I’ve written about the underlying problems with ECPA’s inadequate protections for metadata here.)

And as Chris Calabrese of the Center for Democracy and Technology testified in 2015, the last time the Senate considered updating ECPA, the consequence of failing to update this creaky 1980s statute has been ambiguity and inconsistency. Is a Google Doc subject to the law if you’re only using Google Docs to store a document for later editing? Or, if it isn’t, does it become subject to ECPA provisions when you share the document for others to edit? Inquiring minds wanted to know.

This latest ECPA-revision language takes steps toward addressing both my concerns about metadata and Calabrese’s concerns about ambiguity. It adds warrant requirements for information stored in the cloud and for location information, as well as adding new limits on metadata collection. The ECPA Modernization Act may not be perfect (and what legislation is, really?), but it’s a good start, and it ought to serve as a good reminder that we shouldn’t wait another three decades—or even another three years—before we take another comprehensive look at how our individual privacy, and Fourth-Amendment-based limits on government snooping on citizens, should be updated for our fast-evolving digital landscape.

Image by Maksim Kabakou


Why city officials should welcome the autonomous revolution


The following post was co-authored by R Street Tech Policy Associate Caleb Watney. 

With tech and car companies racing to advance the state of self-driving car technology, the House Energy and Commerce Committee just gave the burgeoning industry a measure of regulatory certainty. Earlier today, the committee marked up and unanimously passed H.R. 3388, the Safely Ensuring Lives Future Deployment and Research In Vehicle Evolution (SELF DRIVE) Act, a draft version of which previously moved through the panel’s Digital Commerce and Consumer Protection Subcommittee.

The bill would reserve for states and localities the power to regulate their streets and the rules of the road, as is appropriate. But when it comes to regulating vehicle design, performance and safety standards, the federal government would continue to take the lead through the National Highway Traffic Safety Administration.

Even though that basic division of regulatory labor has been a successful model for 60 years, groups representing city and transportation departments, along with allied activists, are sounding an alarmist warning that the House bill would “preempt state and local governments from regulating their own streets.” A joint letter from the National Association of City Transportation Officials, National League of Cities, Transportation for America and the Natural Resources Defense Council proclaims:

The bill would allow autonomous vehicle companies to self-certify the safety of their vehicles without an independent reviewer, and would severely limit any government from protecting the well-being of its citizens. This is akin to trusting the fox to protect the hen house, and would clear the way for automakers and tech companies to deploy hundreds of thousands of automated vehicles without adhering to stringent safety standards.

In fact, traditionally operated vehicles aren’t subject to pre-market approval either, because that would be a slow and costly system without any concrete benefit. What’s more, the safety standards already in place for traditionally operated vehicles also would apply to autonomous vehicles under the committee’s bill, just as they do now. Manufacturers of autonomous vehicles must go through a lengthy regulatory process to receive exemption from any NHTSA safety standard and must justify each deviation by demonstrating that an exempted development provides an equivalent level of safety. Ultimately, if manufacturers fail to live up to the agreement they make during the exemption process, or if vehicles prove to be problematic in practice, NHTSA still would have full authority to take them off the road using its expansive recall authority.

The legislation thus leaves the federal government well-positioned to continue protecting the well-being of all Americans with regard to vehicle safety—autonomous or otherwise—just as it has been doing with human-piloted vehicles for decades. Because the bill raises the cap on exemptions, companies will be able to conduct much more rigorous testing and deploy autonomous technologies more quickly. And because it avoids a patchwork of design, performance and safety standards promulgated by local governments, companies will not be driven to “shop” for friendlier regulatory environments across state lines or be forced into the compliance nightmare of 50 or more conflicting standards.

NACTO and allies rightly point out that local governments “have made great strides to manage traffic congestion, reduce emissions and air pollution, and improve safety and mobility for people accessing jobs and opportunities.” After decades when American street design and transportation planning lagged behind international standards, many localities are catching up by implementing effective road diets, narrowing lanes and making multi-modal accommodations. But this legislation does nothing to interfere with that fine work. In fact, it relieves city and transportation planners of responsibilities that are beyond both their budgets and their core competencies.

Nothing in the House legislation prevents state and local governments from continuing to enhance the safety of their streets through improved design and regulation. Autonomous vehicles, just like human-piloted vehicles, will be responsible for following “rules of the road,” including speed limits and rights of way. And in fact, testing thus far shows that autonomous vehicles promise to be far more compliant with road regulations than citizen drivers and to provide dramatically better safety outcomes.

With more than 40,000 auto fatalities in 2016, 94 percent of which were due to human error, every day that autonomous vehicles aren’t on the road is a day lives are lost. No one knows the safety dangers posed by human-operated automobiles better than the transportation officials NACTO represents. Those officials should welcome the addition of highly autonomous vehicles to the toolkit of advocates for street safety.

Image by Scharfsinn


How Congress can use evidence-based policymaking

The Legislative Branch Capacity Working Group examined the use of data and analysis in policymaking at the group’s July 17 meeting. Panelists explored the challenges Congress faces in attempting to implement evidence-based policymaking and how increased congressional capacity could lead to more and better evidence-based lawmaking.

Collectively, panelists Lucas Hitt of the Commission on Evidence-Based Policymaking, Andrew Reamer of George Washington University, Timothy Shaw of the Bipartisan Policy Center and R Street Vice President of Policy Kevin R. Kosar noted that Congress always has sought data and evidence to help it make policy, but legislators will disregard that evidence for at least a few reasons: values, distrust, and parochial and other pluralistic interests.

The Commission on Evidence-Based Policymaking is set to release its report this fall, which will advise Congress on how to increase the use of data and research in legislating and oversight.

Video of the panel is embedded below:


Moss on whether copyright is a property right

With Congress possibly set to consider new ideas on copyright, R Street Tech Policy Manager Sasha Moss participated in a recent panel convened by America’s Future Foundation to debate the constitutional and philosophical underpinnings of intellectual property and to explore whether today’s copyright laws are excessive or insufficiently protective. Alongside co-panelist Kristian Stout of the International Center for Law and Economics and moderator Jim Harper of the Competitive Enterprise Institute, Sasha observed that current U.S. copyright law is not in line with what the founders intended. Full video is embedded below:

Is the real estate double bubble back?


Average U.S. commercial real estate prices are now far over their 2007 bubble peak, about 22 percent higher than they were in the excesses of a decade ago, just before their last big crash. In inflation-adjusted terms, they are also well over their bubble peak, by about 6 percent.

In the wake of the bubble, the Federal Reserve set out to create renewed asset-price inflation. It certainly succeeded with commercial real estate – a sector often at the center of financial booms and busts.

Commercial real estate prices dropped like a rock after 2007, far more than did house prices, falling on average 40 percent to their trough in 2010. Since then, the asset price inflation has been dramatic: up more than 100 percent from the bottom. In inflation-adjusted terms, they are up 83 percent.
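The nominal-versus-real comparisons above are straightforward index arithmetic. The sketch below illustrates it; the index levels and the ~15 percent cumulative inflation deflator are illustrative assumptions chosen to mirror the article’s figures, not the actual data behind the graphs:

```python
def pct_change(old, new):
    # Percent change from old to new.
    return (new - old) / old * 100.0

# Hypothetical index levels: peak = 100 in 2007, a 40 percent drop to the
# 2010 trough, and a current level 22 percent above the old peak.
cre_2007, cre_2010, cre_now = 100.0, 60.0, 122.0
cpi_deflator = 1.15  # assumed ~15 percent cumulative inflation since 2007

rebound = pct_change(cre_2010, cre_now)                  # nominal gain off the bottom
real_vs_peak = pct_change(cre_2007, cre_now / cpi_deflator)

print(round(rebound, 1), round(real_vs_peak, 1))
```

With these assumed inputs, the rebound works out to slightly more than 100 percent in nominal terms while prices sit about 6 percent above the old peak in real terms, consistent with the figures cited above.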

This remarkable price history is shown in Graph 1.

Graph 1

Bank credit to commercial real estate has been notably expanding. It is up $238 billion, or 21 percent, since the end of 2013 to $1.35 trillion. It has grown in the last two years at more than 7 percent a year, which is twice the growth rate of nominal gross domestic product, although not up to the annual loan growth rate of more than 9 percent in the bubble years of 2000-2007.
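To see how a 21 percent cumulative rise squares with the annual growth rates cited, the implied compound annual growth rate can be back-computed from the text’s figures. This is a rough sketch; the 3.25-year window (end-2013 to roughly the first quarter of 2017) is my assumption about the measurement period:

```python
def cagr(start, end, years):
    # Compound annual growth rate (in percent) implied by start and end levels.
    return ((end / start) ** (1.0 / years) - 1.0) * 100.0

end_level = 1.35e12              # $1.35 trillion today, per the text
start_level = end_level - 238e9  # back out the end-2013 level (~$1.11 trillion)
years = 3.25                     # assumed window: end-2013 to Q1 2017

print(round(cagr(start_level, end_level, years), 1))
```

The full-window average comes out near 6 percent a year, which is consistent with growth having accelerated to more than 7 percent in just the last two years.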

The Federal Reserve also succeeded in promoting asset-price inflation in houses. U.S. average house prices are also back over their bubble peak—by about 2 percent, in this case. They have rebounded 41 percent from their 2012 trough. In inflation-adjusted terms, house prices have climbed back to the level of 2004, when we were about two-thirds of the way into the bubble. See Graph 2.

Graph 2

The rapid house price increases since 2012 have not been matched by growth in bank residential mortgage loans or aggregate mortgage credit. Banks’ total residential mortgage loans were $2.45 trillion in 2012 and $2.41 trillion in the first quarter of 2017. Total U.S. 1-4 family mortgages outstanding went from $10.04 trillion to $10.33 trillion in the same period. Thus, there is a marked difference between the two real estate markets, with commercial real estate having even more price inflation and more bank credit expansion than houses. The interest rate environment is, of course, the same for both.

House prices and commercial real estate prices are closely related. As shown in Graph 3, they made an obvious double bubble, a double collapse and a double big rebound. The statistical correlation between the two since 2001 is 86 percent.

Graph 3
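The 86 percent figure is a standard Pearson correlation between the two price series. A minimal sketch of the computation, using made-up annual index levels rather than the actual series behind the graphs:

```python
import math

def pearson(xs, ys):
    # Pearson correlation coefficient of two equal-length series.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical house and CRE indices tracing a shared boom, bust and rebound.
house = [100, 110, 125, 140, 150, 130, 105, 100, 105, 120, 140, 150]
cre   = [100, 115, 135, 160, 175, 130,  95,  90, 110, 145, 180, 200]

print(round(pearson(house, cre), 2))
```

Two series that boom, bust and rebound together, as in Graph 3, produce a coefficient close to 1; a value of 0.86 indicates strong but imperfect co-movement.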

Is what we have now a new double bubble, or something else? Considering where these charts may go from here, we may ponder three key questions:

  1. If interest rates go up 1 percent or 2 percent, what will happen to commercial real estate and house prices?
  2. If the Fed stopped being a big buyer of mortgage-backed securities and bonds, what would happen to interest rates?
  3. Having driven asset prices up, by buying and maintaining huge long positions, can the Fed get out of these positions without driving prices down?

We will know the answers when, sometime in the future, somebody explains it all to us ex post. For now, we know that real estate prices are back to the levels of the last bubble, reflecting the Federal Reserve’s production of asset-price inflation through its interest rate and bond market manipulations.

Image by Noah Wegryn


New DOJ asset-forfeiture rules trample basic rights


In a speech Monday to the National District Attorneys Association annual conference, Attorney General Jeff Sessions announced the U.S. Justice Department plans to ramp up the use of civil asset forfeiture to “combat crime.”

If this sounds like a cliché ripped from a 1980s political speech, that’s not far off. The truth is, the DOJ’s new effort has less to do with fighting crime than with funding law enforcement.

Sadly, what Sessions actually is doing is green-lighting escalation of DOJ and local law-enforcement efforts to seize property from people who have never been convicted of a crime, thus allowing government agencies to reap major monetary rewards. To put it another way, if the government can’t convict you of a crime, they will just take your stuff instead.

One could argue the road to asset forfeiture was paved with good intentions. The practice re-emerged at the height of the 1980s drug war, when law-enforcement agencies across the country were trying to bring down the drug trade. Civil asset forfeiture programs gave government agencies the power to seize cash, cars, guns or anything else of value that was potentially bought with drug money. Suspected drug dealers would then be forced to prove in civil court that they obtained everything legally. Once seized, the cash and other items would be used to fund both federal and local agencies’ drug war efforts, creating something of a vicious circle.

Like so many powers granted to government, this one has been expanded massively, with the end result being blatant violations of Americans’ civil rights. This country was founded on the principles of property rights and protection from unreasonable government search and seizure. Yet we have drifted a long way from the guarantee, outlined in our founding documents, that all are protected by the due process of law.

Unsurprisingly, asset forfeiture has become a cash cow for the federal government and a slush fund for local law-enforcement agencies across the country. Local agencies build their budgets around expected seizures, which creates an incentive to seize assets just to keep the lights on. All in all, civil asset forfeiture is a $5 billion “industry.” The government has so perfected the art of seizure that it now outperforms actual criminals: in 2014 alone, the government seized more in assets than burglars stole.

For a while, things had been looking up. During the Obama administration, the Justice Department took some real steps toward curbing civil asset forfeiture. More importantly, many states across the country started to take a stand by passing laws to make it tougher for the government to seize assets. As of today, according to the Institute for Justice, 13 states require a criminal conviction before the government can take someone’s property. However, these state-level reforms are about to become moot thanks to the Justice Department.

Along with increased interest in asset forfeiture, Sessions and the DOJ announced Wednesday that the department will also reinstate “adoptive” forfeiture, which gives state and local agencies a workaround to state laws by allowing them to use a federal statute to seize property. Not only is this a direct challenge to states’ rights, it also gives local agencies an incentive to continue pursuing these actions with little regard for civil liberties.

Few think criminals should profit from their crimes. There’s also no doubt that it is challenging for state and federal law enforcement agencies to investigate and prosecute complex criminal enterprises like drug cartels and human traffickers. But the current system violates some of the basic principles this nation was built upon—due process of law, innocent until proven guilty and freedom—all in the pursuit of innocent people’s property.

Image by hafakot


Using the CPP to boost coal is just as bad


President Donald Trump has spoken repeatedly of his support for coal mining, pledging publicly that “we will put our miners back to work.”

It probably should not be surprising, then, that the White House would give serious consideration to a pitch made by several coal-mining union representatives to the Office of Management and Budget that would see the Environmental Protection Agency rewrite the Obama administration’s Clean Power Plan in ways that help the coal industry.

Alas, the ends the industry wants to achieve using the CPP are at least as wrongheaded as the command-and-control model that was used to craft the emissions plan in the first place.

What the proposal by the AFL-CIO, the International Brotherhood of Electrical Workers and the Utility Workers Union of America recommends is for EPA Administrator Scott Pruitt to initiate only the first of the CPP’s four “building blocks.” Such a plan would reward coal-fired power plants if they improved their boiler heat-rate efficiency, even though the improvements could cut greenhouse gas emissions by only 2-3 percent, as opposed to the additional 10-12 percent the previous administration wanted to see.

The CPP’s other three building blocks—natural gas switching, renewable energy and energy-efficiency programs—would be eliminated, leaving a rump emissions plan that could pass muster in the courts.

Unlike the recent decision to exit the Paris Climate Accords, in which the United States simply said it wouldn’t follow through on a prior commitment, the Clean Power Plan’s regulation of existing power plants was finalized in June 2015. That makes it legally hazardous to jettison the plan, which remains before the U.S. Supreme Court, without a replacement. Only an unprecedented legal stay issued by the court in February 2016 – shortly before the death of Justice Antonin Scalia – kept the regulations from coming into force.

It’s worth remembering that the Clean Power Plan was the Obama administration’s grand attempt to regulate emissions from coal-fired power plants. The White House sought to expand the scope of the Clean Air Act beyond “the fence line” of power plants to cut state-level emissions coercively, whether states agreed to the federal actions or not.

But just because the revised rule wouldn’t be as powerful doesn’t mean it wouldn’t be just as damaging to the economy over the long run. Dictating winners and losers in energy markets is always a bad idea. This is as true of the bias against coal and nuclear energy shown by regulators during the second Obama term as it would be of this new proposal to upgrade coal-powered electricity plants to a point where they still won’t be as clean as a new natural gas-fired plant.

The natural gas fracking revolution – driven entirely by market forces and private property rights – has contributed to the 14 percent reduction in energy-related U.S. carbon emissions since 2005, leaving us roughly where we were in the early 1990s. Leaving an ineffective regulatory structure in place of the original CPP may save the Trump administration a lot of time and effort, but it isn’t the principled approach to energy development this country needs in the 21st century.

Image by 1968

Microsoft’s alternative power deal could be breakthrough for consumer choice


Washington state regulators approved a settlement last week between Microsoft Corp. and its monopoly utility, Puget Sound Energy Inc. (PSE), that will enable Microsoft to buy its own wholesale energy or develop its own supply. The agreement represents a more cordial approach amid a widespread trend of large customers seeking alternative power suppliers, but it underscores the inherent choice-constraining limitations of the monopoly model, even with favorable amendments.

The monopoly model, premised on a single power provider with captive customers, does not easily accommodate customer preferences. However, a glimmer of choice has emerged recently. Microsoft is just one of many corporate customers to pursue third-party purchases or direct-access policies that enable one-off customer choice within a monopoly footprint.

Spurred by less expensive alternative suppliers and corporate commitments to clean energy, corporations have procured more than 6 gigawatts of wind and solar in the last two years alone. In 2016, Microsoft and Amazon led the pack in corporate clean-energy procurement. Based on public commitments, this trend looks likely to continue, with the likes of Google, Apple, Johnson & Johnson and more committing to source all of their consumption from renewables.

At a time when climate and clean-energy policy too often reverts to a culture war, voluntary clean-energy procurement by corporate leaders marks a refreshing intersection of the conservative and green agendas. Bill Hogan, a Harvard professor and electricity markets expert, emphasizes that customers spending their own money to contract for green power is consistent with market principles. He clarifies that the “problem comes when governments spend other people’s money, using their power to mandate, that is a public policy concern.”

This may blossom into the new chapter of voluntary environmentalism, which has roots in the kinds of conventional pollution reduction (beyond legal requirements) that preceded today’s amplified climate discussion. For some companies, the reputational or branding benefits of contributing to a cleaner environment can provide substantial incentives. It appears those benefits are magnifying at the same time that the cost of renewables has fallen, spearheaded by merchant wind developers providing very competitive power purchase agreements.

Some have voiced concerns that an exodus of big customers from monopoly service may leave other customers with higher bills. A large customer’s departure could create stranded costs for the utility, which it will shift to other customers if permitted by regulators. To cover these costs, regulators may require customers seeking to leave the monopoly to pay exit fees. Companies like Microsoft might even go beyond the exit fee by pledging support for local community programs.

Proper exit fees can prove technically challenging to calculate. In addition, monopoly utilities often leverage those fees to impose a regulatory barrier to exit. In particular, they frequently will underplay the benefits to their remaining customers of the reduced costs and expanded opportunities to sell excess power.

Litigated exit fee cases have proven contentious and inefficient. In Nevada, numerous cases have led to prolonged regulatory battles and deterred some companies (e.g., Las Vegas Sands Corp.) from seeking to buy power on the open wholesale market. In a recent filing before the Nevada Public Utilities Commission (PUC), Wynn Las Vegas argued the exit fee imposed by the PUC—whose staff changed their methodology from the one applied to the previous exit request of the data storage company Switch—was unfair and discriminatory.

In fact, Switch incurred regulatory headaches of its own. The PUC rejected its initial proposal to switch to an alternative provider in 2015. Other Nevada resorts and casinos, including Caesars Entertainment Corp., are either considering or already have applied to leave monopoly service, with the MGM Grand agreeing to pay an $87 million exit fee.

Even with direct access, regulatory delays and inflated exit fees can serve as chronic limits to customer choice, not to mention that clinging to the monopoly model results in an underdeveloped market for alternative suppliers. Even the Microsoft settlement revealed differences between the customer and the utility over how to calculate the exit fees. In its initial testimony, Microsoft argued that its departure would provide a net benefit and estimated that, using generally accepted rate-setting standards, the utility would compensate Microsoft between $15 million and $35 million to leave (the two sides differed over the timeframe used to calculate the useful life of the utility’s assets and market value of excess generation).

However, in the end, Microsoft agreed to pay an inflated $24 million exit fee. The settlement represents a deal between numerous parties that is likely more efficient than prolonged litigation. Such a collaborative approach may serve as the preferred interim model in monopoly states (i.e., negotiated special contracts), short of a new customer tariff that would streamline the process.

Despite the niceties of settlements, such agreements retain undertones of the fundamental rift between increasingly heterogeneous customers and the choice-constraining monopoly model. In restructured or “retail choice” states, customers choose their power provider freely, and large customers often negotiate contract terms tailored to their unique profile.

Restructured states present a big advantage for corporate consumers, and policymakers increasingly have noted this advantage for retaining and attracting businesses. Enabling third-party service or direct access is certainly not the “end game” regulatory structure, but it offers a great incremental step to introduce customer choice, with benefits both for customers and for the environment.

Image by Katherine Welles

Jonathan Coppage all over your TV screen


Visiting Senior Fellow Jonathan Coppage’s recent Washington Post op-ed taking apart the alarmist coverage of a purported trend of millennials living at home as adults (tl;dr, it’s a normal thing, historically, and there’s a lot to recommend it in practice) drew quite a bit of attention, earning Jon invitations to sit down on a pair of national cable news shows. First, there was a two-part spot on CNBC’s Squawk Box:

Next, he was on CNN, discussing the piece with Smerconish host Michael Smerconish:

Why quality will trump quantity in the net-neutrality debate

Also appeared in: TechDirt


If you count just by numbers alone, net-neutrality activists have succeeded in their big July 12 push to get citizens to file comments with the Federal Communications Commission. As I write this, it looks as if 8 million or more comments have now been filed on FCC Chairman Ajit Pai’s proposal to roll back the expansive network-neutrality authority the commission asserted under its previous chairman in 2015.

There’s some debate, though, about whether the sheer number of comments—which is unprecedented not only for the FCC but for any federal agency—is a thing that matters. I think it does, but not in any simple way. If you look at the legal framework under which the FCC is authorized to regulate, you see that the commission has an obligation to open its proposed rulemakings (or revisions or repeals of standing rules) for public comments. In the internet era, of course, this has meant enabling the public (and companies, public officials and other stakeholders) to file online. So naturally enough, given the comparative ease of filing comments online, controversial public issues are going to generate more and more public comments over time. Not impossibly, this FCC proceeding—centering as it does on our beloved public internet—marks a watershed moment, after which we’ll see increasing flurries of public participation on agency rulemakings.

Columbia University law professor Tim Wu—who may fairly be considered the architect of net neutrality, thanks to his having spent a decade and a half building his case for it—tweeted July 12 that it would be “undemocratic” if the commission ends up “ignoring” the (as of then) 6.8 million comments filed in the proceeding.

But a number of critics immediately pointed out, correctly, that the high volume of comments (presumed mostly to oppose Pai’s proposal) doesn’t entail that the commission bow to the will of any majority or plurality of the commenters.

I view the public comments as relevant, but not dispositive. I think Wu overreaches to suggest that ignoring the volume of comments is “undemocratic.” We should keep in mind that there is nothing inherently or deeply democratic about the regulatory process – at least at the FCC. (In fairness to Wu, he could also mean that the comments need to be read and weighed substantively, not merely be tallied and dismissed.)

But I happen to agree with Wu that the volume of comments is relevant to regulators, and that it ought to be. Chairman Pai (whose views on the FCC’s framing of net neutrality as a Title II function predate the Trump administration) has made it clear, I think, that quantity is not quality with regard to comments. The purpose of saying this upfront (as the chairman did when announcing the proposal) is reasonably interpreted by Wu (and by me and others) as indicating he believes the commission is at liberty to regulate in a different way from what a majority (or plurality) of commenters might want. Pai is right to think this, I strongly believe.

But the chairman also has said he wants (and will consider more deeply) substantive comments, ideally based on economic analysis. This seems to me to identify an opportunity for net-neutrality advocates to muster their own economists to argue for keeping the current Open Internet Order or modifying it more to their liking. And, of course, it’s also an opportunity for opponents of the order to do the same.

But it’s important for commenters not to miss the forest for the trees. The volume of comments both in 2014 and this year (we can call this “the John Oliver Effect”) has in some sense put net-neutrality advocates in a bind. Certainly, if there were far fewer comments (in number alone) this year, it might be interpreted as showing declining public concern over net neutrality. Obviously, that’s not how things turned out. So the net-neutrality activists had to get similar or better numbers this year.

At the same time, advocates on all sides shouldn’t be blinded by the numbers game. Given that the chairman has said the sheer volume of comments won’t be enough to make the case for Title II authority (or other strong interventions) from the commission, it seems clear to me that while racking up a volume of comments is a necessary condition to be heard, it is not a sufficient condition to ensure the policy outcome you want.

Ultimately, what will matter most, if you want to persuade the commissioners one way or another on the net-neutrality proposal, is how substantive, relevant, thoughtful and persuasive your individual comments prove to be. My former boss at Public Knowledge, Gigi Sohn, a net-neutrality advocate who played a major role in crafting the FCC’s current Open Internet Order, has published helpful advice for anyone who wants to contribute to the debate. I think it ought to be required reading for anyone with a perspective to share on this or any other proposed federal regulation.

If you want to weigh in on net neutrality and the FCC’s role in implementing it—whether you’re for such regulation or against it, or if you think it can be improved—you should follow Sohn’s advice and file your original comments no later than Monday, July 17, or reply comments no later than Aug. 16. If you miss the first deadline, don’t panic—there’s plenty of scope to raise your issues in the reply period.

My own feeling is, if you truly care about the net-neutrality issue, the most “undemocratic” reaction would be to miss this opportunity to be heard.

Image by Inspiring


Alabama backs down on targeting margarita pitchers


In these hot summer months, nothing refreshes like a margarita. But in Alabama, the state Alcoholic Beverage Control Board had banned pitchers of this limey and refreshing libation. Seriously.

R Street’s Cameron Smith exposed the ban and advocated for its repeal in AL.com after a series of email exchanges with ABC representatives:

The Alabama Alcoholic Beverage Control Board (ABC) doesn’t want you wasting away in Margaritaville, so they’ve banned pitchers of the frozen concoction outright.

No, I’m not joking.

But we shouldn’t be surprised. This is the ABC that cracked down on people drinking while dining on the sidewalks in Mobile. It’s the same ABC that cut a deal to impose a 5 percent liquor mark-up to help the legislature and the governor enact a back-door tax hike.

Now the agency has taken to reminding licensees of its legal ‘interpretation’ that beer is the only alcoholic beverage that may be served in a pitcher…

ABC claimed it was concerned with the tequila in margarita pitchers “settling” over time, which could lead to situations where the first few drinks poured from the pitcher had less alcohol than the ones from the bottom of the pitcher. As Smith pointed out, this amounted to an argument that a group of legal adults “can’t figure out how to handle a pitcher of margaritas shared among them.”

Smith’s column generated enough outcry among Alabama residents that Dean Argo, ABC’s government relations manager, took to AL.com to announce that the board would no longer target margarita pitchers. In short, ABC has backed off, at least for now. (The Associated Press also covered the reversal).

While this was a clear win for margarita lovers across the state, Argo ominously suggested that the state may still draw a line between which types of drinks can be served in pitchers and which cannot. The dividing line would appear to be if the drink in question is “customarily” served in pitchers. So, margaritas and beer would seem to be safe, but what about less clear cases like mojitos? Mojitos are certainly served in pitchers sometimes, but is it “customary” to serve them that way? And how about bottled cocktails, which have become all the rage in the cocktail world? Are they a “pitcher,” and if so, are they “customary”?

The ABC’s decision to draw the line at what types of drinks are “customarily” put into pitchers rests on exactly the sort of ambiguous legal phrase that only a government lawyer could love. Call it “pitcher ambiguity,” and suffice it to say R Street’s team will be the first to blow the whistle if more pitcher shenanigans go down in Alabama.

Note: Cameron Smith has also been tracking and writing about the Alabama ABC’s attempt to enact a stealth tax increase by increasing the state liquor mark-up. Read more about that here.

Image by Danny E Hooks


Welcome to Climate Junior High


The new kid in the class is glib and loud, while the gal in charge of the “cool kids” pretended he hadn’t even entered the classroom. At least, that’s the way it seems from watching President Donald Trump and German Chancellor Angela Merkel in Hamburg last weekend at the Group of 20 (G-20) summit involving a majority of the world’s most industrialized countries.

In the weeks before the meeting, analysts and partisans were praying for some kind of moral reckoning for Trump on his arrival in Hamburg, the heart of Germany’s political left. Trump’s withdrawal from the Paris Climate Accords in early June had sent many European leaders into a state of shock, given that the European Union’s plan to cut its climate emissions dramatically is its pre-eminent geopolitical strategy.

Speaking before the German Parliament in late June, Merkel said of the U.S. withdrawal that “the climate treaty is irreversible and is not negotiable” – a direct rebuke of Trump’s decision to go it alone concerning climate change.

In other words, a beat-down in the lunchroom was expected.

Nevertheless, Trump and Merkel played nice in front of the dignitaries during the July 8-9 summit and the United States dissented from the 19 other countries’ consensus language on climate change in the final joint declaration with relative ease. The White House even was allowed to insert language saying the United States “will endeavor to work closely with other countries to help them access and use fossil fuels more cleanly and efficiently.”

The addition of the clean fossil fuel language was a “poker tell” revealing the radically divergent strategies at the heart of the chasm between the United States and the European Union on energy and climate policy. The problems undermining the Paris Accord—its voluntary and top-down nature, in particular—have been highlighted repeatedly by R Street and others. The facts of the case remain unchanged.

The United States, through the development of hydraulic fracturing and subsequent very low natural gas prices, has cut its energy-related carbon emissions more than any other member of the G-20 since 2005. The reason has nothing to do with international agreements or top-down approaches.

Instead, market forces drove natural gas drillers in the late 2000s to develop the hydraulic fracturing of shale basins in Pennsylvania and Texas. The explosion of natural gas supplies soon made it the fossil fuel of choice, over coal, for electricity plants around the country. The rest is history.

Since peaking in 2007, U.S. energy-related carbon emissions are down roughly 14 percent, while Germany, which sees itself as the world leader on climate change, saw its carbon emissions fall just 7 percent during the same period.

Given the size of the U.S. economy, the scale of the emissions savings has been enormous, with U.S. emissions falling 600 million metric tons compared to Germany’s 70 million tons over the same time period. All this while the European Union spent $1.2 trillion on wind, solar and bio-energy subsidies and an emissions trading scheme (ETS) that priced carbon too low to be effective.

Merkel waited until the very end of the summit to express her disdain: “Unfortunately – and I deplore this – the United States of America left the climate agreement,” she said in her closing statement.

As it stands, the differences in energy and climate outlook between the United States and Europe could not be starker. The United States looks to export both oil and natural gas to Europe. Meanwhile, Germany and France alike are constraining nuclear power and all fossil-fuel use as they aim for a dramatic cut in emissions by midcentury.

Perhaps French President Emmanuel Macron, who is also a new kid in the class, has a different plan to bring Trump into the climate club when he hosts Trump for Bastille Day celebrations in Paris July 14.

Image by Rawpixel.com


Kosar talks CRS reports on FedSoc podcast

In episode 2 of the Federalist Society’s Necessary & Proper podcast, the R Street Institute’s Kevin Kosar discusses the Congressional Research Service, a nonpartisan government think tank housed in the Library of Congress. CRS assists Congress in lawmaking and oversight, yet Congress, lamentably, has downsized the agency. CRS also has struggled to adapt to the hyper-partisan, internet-connected Hill environment.

The full episode is embedded below:

South Miami solar mandate would trample property rights


Expanding solar energy to rely less on oil, gas and other nonrenewable resources is an almost universal goal, regardless of one’s political persuasion. Indeed, with growing concerns about the climate and the economic and even national security implications of relying on nonrenewable and oftentimes foreign energy sources, it makes sense to look at solar as a viable means to power more of society’s needs.

But as noble as the expansion of solar energy might be, its pursuit should never infringe on individual rights, as some local governments appear to be doing. For example, the City of South Miami is considering an ordinance that would require installation of solar panels on all newly constructed homes, as well as older homes whose owners elect to renovate 50 percent or more of the square footage.

Indeed, although the cost of solar-energy-generating devices has dropped in recent years, they still remain cost-prohibitive to most. This ordinance would not only increase the price of homes in a city where cost-of-living is already way above the national average, but may actually serve as a disincentive to existing homeowners who wish to make their older homes more energy efficient. Residents who might otherwise consider remodeling their homes with energy-efficient doors, windows, roof shingles, insulation and appliances may think twice if they were also forced to purchase expensive solar panels.

But even that is not the point.

This is a clear and egregious example of government trampling on individual property rights. Local and state authorities can and should develop building codes to ensure safety; Miami-Dade County already has a strict building code due to its vulnerability to hurricanes. However, residents should not be forced to purchase an expensive product that serves no health or safety purpose as a condition to develop or improve their own properties, just so politicians can feel good about themselves.

It is fair to debate how to expand solar-energy production and who should pay for it. Should government subsidize research? Should government grants or tax credits be offered to entice individuals to install solar panels? Should utility companies purchase excess power generated by privately owned solar devices?

These are all relevant public-policy issues that well-intended people with differing opinions can debate, and they all revolve around the notion that solar-device installation is a choice, not a mandate. Government should not pick one industry over another through subsidies or unfair incentives or penalties. Allowing energy producers to compete on a level playing field will encourage them to innovate and make their products more efficient and thus more economically viable over time.

Image by ND700


What’s in the FY2018 House legislative branch appropriation?


The House Appropriations Committee approved the Fiscal Year 2018 legislative branch appropriations bill via a June 29 voice vote. The bill calls for $3.58 billion in funding for House and joint-chamber operations (Senate-specific items are not included), a full $100 million more than enacted FY2017 funding levels, though still well below the FY2010 appropriation.

On the same day, the committee released a full report explaining the appropriating rationale.

What is actually included in the bill? Who won and who lost the funding battles?

Big Winners

Security: In light of the recent shooting of Rep. Steve Scalise, R-La., staffer Zachary Barth, and Capitol Police officers Crystal Griner and David Bailey, the committee clearly saw a need to boost various forms of security for members and the government. The Capitol Police received an increase of $29 million, the House sergeant-at-arms budget was upped $5 million to $20.5 million, and $10 million was provided to enhance the cybersecurity program of the chief administrative officer (CAO).

Architect of the Capitol (AOC): The stewards of the Capitol complex, responsible for everything from building maintenance to landscaping, received a $48.4 million increase in funds over FY2017 enacted levels. The committee instructed the AOC to spend the appropriated $577.8 million on efforts that “promote the safety and health of workers and occupants, decrease the deferred maintenance backlog, and invest to achieve future energy savings.”

Library of Congress (LOC): For FY2018, the Library of Congress’ appropriations were increased $16.9 million to $648 million in an effort to modernize information technology and copyright efforts, as well as provide more funds ($3.5 million) to Congress’ nonpartisan think tank, the Congressional Research Service (CRS). Additionally, $29 million of the AOC’s appropriation was itemized for improvement and maintenance of LOC buildings and grounds.

Transparency: After years of debating the issue, the appropriators directed CRS to make all of its nonconfidential reports available to the public. The agency was given 90 days to submit an implementation plan, including cost estimates, to its oversight committees.

Big Losers

House Office Buildings: Despite the AOC receiving a sizable bump in appropriations, the amount allocated for the maintenance and care of the four House office buildings initially was chopped by $27.4 million from FY2017 levels. A voice-vote amendment later restored $4 million, leaving the cut at $23.4 million.

Members’ Representational Allowance (MRA): The funding stream that would allow members of the House to hire more congressional staffers and better compensate current ones remained at FY2017 levels ($562.6 million). “This level of funding will allow the MRAs to operate at authorized levels as approved by the Committee on House Administration,” declared the committee. What it will not do is reverse the long decline in congressional staff levels and salaries.

Government Accountability Office (GAO): Though not a decrease in funding levels, GAO was granted only a $450,000 bump, despite requesting a $46 million increase over FY2017 enacted levels. The agency sought the substantial increase chiefly to expand staffing in order to reduce improper governmental payments, identify ways to close the gap between taxes owed and taxes paid, and assist Congress in determining the “policy implications of increasingly complex and rapidly evolving development of science and technology.” Unlike sister agencies CRS ($3.5 million) and the Congressional Budget Office ($2 million), which enjoyed comparably larger increases, GAO’s appropriation remained relatively flat at $568 million.

Legislative Branch Appropriation Bill Specifics

Capitol Police: FY2018 funding levels increased $29 million to $422.5 million, including an increase of $7.5 million to “enhance Member protection, increased training, equipment and technology-related support items”; an increase of $13.2 million for Capitol Police buildings and grounds; and half-year funds to hire 48 additional sworn officers.

House Sergeant-at-Arms: An increase of $5 million with the “intent of enhancing security for Members when they are away from the Capitol complex. The Committee is aware that a specific plan is still evolving and once fully developed a plan will be presented to the Committee.”

Members’ Representational Allowance (MRA): Though the MRA remains at FY2017 levels ($562.6 million), “the Committee has provided resources necessary to support the Committee on House Administration’s plan to increase Member’s Representational Allowance (MRA) by $25,000 per account this year for the purpose of providing Member security when away from the Capitol complex.”

Chief Administrative Officer (CAO): The CAO received an additional $10 million for strengthened cybersecurity measures. Additionally, the committee suggested that “with effective management of the program and continued support in appropriations, sufficient funding exists” to increase the number of two-year fellows partaking in the CAO’s Wounded Warrior Program from 54 to 85.

House Leadership Offices: FY2018 funding levels remained constant at $22.3 million.

House Committees: Appropriations for the salaries and expenses of House committees decreased by $45,004, from $150,324,377 in FY2017 to $150,279,373 for FY2018.

Joint Committees: The Joint Committee on Taxation received an increase of $360,000 to $10.46 million, while the Joint Economic Committee’s funding remained at $4.2 million.

Congressional Budget Office (CBO): Funding levels increased $2 million, from $46.5 million in FY2017 to $48.5 million for FY2018.

Architect of the Capitol (AOC): FY2018 funding levels increased $48 million to $578 million, including a $12.7 million increase for care and maintenance of the U.S. Capitol; a $20 million increase in funding for the Capitol Power Plant; a $29 million increase for Library of Congress buildings and grounds; and a decrease of $27.4 million for House office buildings maintenance.

Congressional Research Service (CRS): Funding levels increased $3.5 million from $108 million in FY2017 to $111.5 million for FY2018.

Government Publishing Office (GPO): FY2018 funding levels remained constant at $117 million.

Office of Compliance: FY2018 funding levels remained flat at $3.6 million.


Two amendments to the FY2018 legislative branch appropriations bill were adopted by the Appropriations Committee.

  1. The manager’s amendment from Rep. Kevin Yoder, R-Kan., added $4 million to House office building maintenance. Instead of a decrease of $27.4 million, the amendment makes the decrease $23.4 million.
  2. Rep. Barbara Lee, D-Calif., sponsored an amendment that directed the CAO to submit a report to the committee within 90 days “addressing the ways in which Members and staff who have hiring and management responsibilities can be given the tools to combat unconscious bias in hiring and promotion, and with education on the negative impact of bias.”

Image by Golden Brown


States still stuck when it comes to pension plan fixes


I spoke recently with Bill Howell, the longtime speaker of Virginia’s House of Delegates. While he is not standing again for election, he is the kind of person who wants to use the last portion of his authority with the state government to work on the most important issues facing his state.

Number one on his list is pension reform. Nobody will be able to pin on him the consequences of inaction today or the failure of an unsustainable system over time. Making the choice to spend the last months of his time in office with a virtual shovel on his shoulder is leadership one doesn’t see much across the Potomac these days.

Other places will certainly provide awareness through “canary in the coal mine” warnings about the fiscal challenges of our retirement security system, but our political system and culture are generally less responsive to these kinds of virtually certain problems than they are to perceived future environmental hazards. As one example, due in large part to the one-child policy instituted in 1979, China is now contemplating the “4:2:1” situation of one grandchild in the workforce struggling to support two parents and four grandparents. For perspective, China is physically roughly the same size as the United States, with more than four times its population. That country alone is projecting a population over age 60 of more than 300 million people by 2024. The pressure on offspring to care for this number of elderly is mirrored in public programs.

Somewhere in my files is a page of dates in the not-so-distant future that represent each state’s technical bankruptcy, if something isn’t done in the meantime to alter the math. There is also Medicaid, of course, the budget issue du jour, but these dates are only based on pensions and state employee health care. In those jurisdictions where local governments participate in the state systems, their figures are included.

Pennsylvania is a good example of the political and financial pressures on governments to keep promises to their employees. The state barely celebrated passage of needed reforms a few days ago, and there is already serious discussion of allowing it to borrow the money it just required itself to put aside to fund those reforms.

Not even a month ago, Pennsylvania lawmakers enacted bipartisan legislation that required them to fully fund the employer (state) share of their defined contribution plan. When Gov. Tom Wolf signed the bill his public comment was: “Here in Harrisburg we can get important things done in a way that I think a lot of other places cannot.”

The new law provides that only hazardous-duty state employees, such as law enforcement, will stay eligible for the once-ubiquitous defined-benefit plans that characterized public pensions for decades but have been mostly phased out in the private sector. Both state and school employees who start jobs in 2019 will have three retirement options, and current employees will have to choose one, as well. Two of the new plans combine features of a guaranteed pension amount with an investment vehicle similar to private-sector plans. The third is a full defined-contribution plan like a 401(k), in which the state pays 2 percent of salaries into the plan for school employees and 3.25 percent for other state workers, matching against their 7.5 percent minimum contributions.

Now there are rumblings that the state will authorize—as Illinois and other states with shaky finances have—sales of pension obligation bonds to cover a portion of its share. It is theoretically possible to invest the bond proceeds at a return greater than the interest owed on the bonds, but successes are few, and the risk to future workers and taxpayers accordingly great. Both Illinois and New Jersey have sold billions of dollars of pension obligation bonds. This year, 80 percent of the money Illinois pays out for state teacher pensions is going toward the unfunded liability; the state has never paid its full share, according to the Teachers’ Retirement System. Racking up long-term losses on these instruments, Illinois jacked up its income tax by 66 percent in 2011, and another 32 percent increase was enacted over Gov. Bruce Rauner’s veto this past week. These are not unrelated stories.
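The gamble embedded in pension obligation bonds can be made concrete with a little arithmetic. In a minimal sketch (all figures hypothetical, not drawn from any state's actual issuance), a government borrows at a fixed rate, invests the proceeds, and comes out ahead only if market returns beat its borrowing cost over the life of the bonds:

```python
# Hypothetical illustration of the pension-obligation-bond bet:
# borrow at a fixed rate, invest the proceeds, and hope investment
# returns outrun the compounded cost of servicing the debt.

def pob_outcome(principal, bond_rate, market_return, years):
    """Net gain (or loss) from investing bond proceeds, versus the
    compounded amount owed on the bonds over the same horizon."""
    invested = principal * (1 + market_return) ** years
    owed = principal * (1 + bond_rate) ** years
    return invested - owed

principal = 1_000_000_000  # a hypothetical $1 billion issue
bond_rate = 0.05           # assumed 5 percent borrowing cost

# If markets return 7 percent a year for 20 years, the bet pays off...
print(pob_outcome(principal, bond_rate, 0.07, 20))  # positive

# ...but a 2 percent average return leaves taxpayers deep in the hole.
print(pob_outcome(principal, bond_rate, 0.02, 20))  # negative
```

Compounding cuts both ways: under these assumed figures, a sustained shortfall of a few points turns a $1 billion issue into a more-than-billion-dollar loss over two decades, which is why the strategy is so risky for future taxpayers.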

New Jersey has suffered the indignity of being sued by federal regulators for securities fraud over its pension bond sales. The Garden State’s pension system was rated dead last among the 50 states in the most recent Pew Charitable Trusts national study. State workers have been paying in higher amounts since 2011 reforms, but the state has not kept up its commitment. Ironically, the latest reform proposal for the worst-funded pension system among the states is to give it the billion-dollar state lottery. This would immediately increase the funded ratio to 65 percent – a dramatic improvement. If there is a better metaphor for a New Jersey solution, I don’t know what it would be. People in the Garden State will be encouraged to keep on gambling.

Pennsylvania should stay the course, and allow the reforms to nudge the retirement plans for state workers and teachers back toward stability.

Image by Aaban


Carbon tax versus clean tax cuts policy wonk rumble

Back in April, R Street Energy Policy Director Josiah Neeley moderated a panel at Earth Day Texas in Dallas. Billed as a “Policy Wonk Rumble,” the panel compared the merits of different ways to use the tax code to encourage clean energy and reduce greenhouse-gas emissions. Also featured on the panel were Peter Bryn of Citizens Climate Lobby, Travis Bradford of Columbia University, Rob Sisson of ConservAmerica, and Rod Richardson of the Grace Richardson Fund.

The future of aviation demands privatized air-traffic control


American air-traffic control is safe, but as currently constituted, the system won’t be able to keep up with increasing demand for domestic and international air travel. To ensure the Federal Aviation Administration can continue to modernize and operate efficiently, free of budget uncertainty and political interference, air-traffic control should be turned over to an independent nonprofit corporation, as proposed by H.R. 2997, the 21st Century Aviation Innovation, Reform, and Reauthorization Act.

From 1996 to 2012, the FAA’s budget doubled, even though staff levels stayed roughly constant and the agency’s productivity actually fell. A 2016 inspector general’s report found that, of the system’s 15 most recent major system acquisitions, eight had gone over budget by a total of $3.8 billion and eight were behind schedule by an average of more than four years. These sorts of problems illustrate the difficulties the FAA faces in adapting to market conditions driven by higher and more complex demand.

The 21st Century AIRR Act—sponsored by Rep. Bill Shuster, R-Pa., chairman of the House Transportation and Infrastructure Committee, which cleared the bill June 27 in a 32-25 vote—would assign oversight of America’s air-traffic control system to a new nonprofit corporation, with a CEO who is answerable to a board of directors made up of “a diverse cross-section of the aviation system’s stakeholders and users.” The act would refocus the FAA on federal safety oversight and streamline the FAA certification process, making it easier for companies to get their products out on time. This would encourage innovation in aviation technology by lowering the cost of implementation.

The proposal has support from President Donald Trump, who included a version of it in his proposed FY 2018 budget. As the National Taxpayers Union Foundation detailed in a recent piece, “the budget forecasts that taxes would be reduced by $115 billion from FY 2021 to FY 2027. The FAA’s budget for ATC would be reduced by $70 billion, leaving the agency to focus on regulating aviation safety.”


But the measure also faces pushback from a variety of aviation interests. They prefer the Senate’s FAA reauthorization bill from Sen. John Thune, R-S.D., which does not include air-traffic control privatization. The Shuster proposal should be considered commonsense legislation, not only cutting government waste but also making the world a little bit safer. Let’s hope it moves to the House floor soon.

Image by Stoyan Yotov


Private flood insurance should be allowed to compete on a level playing field


Since 1968, the National Flood Insurance Program (NFIP)—in a well-intentioned but ill-designed effort to help home and business owners in flood-prone regions—has provided flood insurance at below-market rates. Predictably, the program has racked up a significant amount of debt, discouraged private competition and innovation and distorted consumers’ ability to calculate the risk of living and building in flood-prone areas.

As Congress considers NFIP reauthorization this summer and fall, lawmakers ought to implement structural reforms that will benefit both insurance consumers and the American taxpayers.

It is a well-known economic adage that “if you subsidize something, you get more of it.” In this case, the NFIP’s practice of subsidizing insurance premiums for high-risk areas has created a moral hazard problem where the government insurance program actually encourages higher levels of risk-taking. This has turned out to be quite costly for the American taxpayer, as the NFIP is now over $25 billion in debt to the U.S. Treasury. The Government Accountability Office has found the program is unlikely ever to generate enough revenue to cover its costs, exposing the federal government to further financial risk.

Yet the subsidies keep flowing to areas where floods are common and where it may not otherwise be cost-effective to rebuild. There is no better evidence that the NFIP encourages risk than the fact that 25 to 30 percent of flood insurance claims in the NFIP system are generated by a mere 1 percent of the properties that have government-backed insurance. This distortion of risk will keep the program fiscally unsustainable until the government ceases to offer insurance premiums at significantly below-market rates.

Unsurprisingly, regulations on what kinds of private market insurance lenders can accept, along with the subsidized rates, historically have made it difficult for insurance companies to offer competitive flood insurance plans. Private companies do not have the luxury of losing $25 billion. Though previous reforms sought to level the playing field and move the NFIP toward risk-based rates, unclear language has continued to stymie private market development, limiting choice for consumers and putting taxpayers at continued risk. Among the issues that put private entities at a disadvantage is that NFIP policyholders who make the switch to private insurance are not considered to have continuous coverage, and therefore may have to pay significantly more should they ever decide to switch back.

Congress should look to Florida as an example of how to salvage a failing insurance system. Before the state enacted reforms in 2010, Florida’s public insurance program, Citizens Property Insurance Corp., was fiscally unsound and Florida taxpayers were exposed to high levels of risk in the event of another hurricane. State lawmakers incrementally raised premiums to bring them in line with market rates and allowed private companies to assume many of the policies previously written by Citizens. As they did, the fiscal burden shifted from taxpayers to private entities.

The Flood Insurance Market Parity and Modernization Act, introduced in both the House and the Senate, would be an important first step toward enabling private market insurance to compete on a level playing field with government insurance. It would clarify federal lending rules, allow insurers who participate in the NFIP’s Write Your Own program also to underwrite private flood insurance and end the practice of penalizing those who choose to purchase private coverage. It would also further the move toward a less distorted system and thus shift some of the burden off taxpayers.

Despite passing the House unanimously in 2016, and passing the House Financial Services Committee unanimously last month as part of its package to reauthorize the NFIP, the bill has not yet moved in the Senate. The Senate Banking Committee should take a lesson from its House colleagues and include this important clarification in its own legislation to reauthorize the NFIP. Failing to do so would only ensure that, for many years to come, American homeowners will remain at the mercy of a failing government program, all on the taxpayer’s dime.

Image by humphery

Pollock before the Subcommittee on Monetary Policy and Trade

R Street Distinguished Senior Fellow Alex Pollock testifies before the House Financial Services Committee’s Subcommittee on Monetary Policy and Trade in a June 28 hearing on “The Federal Reserve’s Impact on Main Street, Retirees and Savings.”

Coppage at R Street-CNU event in Salt Lake City

The R Street Institute recently co-hosted an event in Salt Lake City with the Utah chapter of the Congress for the New Urbanism on how to make both housing affordability and strong communities possible in a red-state boom town like Utah’s capital. Alongside Sutherland Institute Director of Public Policy Derek Monson and Health Hansen, a staffer to Sen. Mike Lee, R-Utah, R Street Visiting Fellow Jonathan Coppage reviewed the need to allow small solutions to big problems, such as relegalizing accessory dwelling units and “missing middle” housing forms.

The downsides of using executive agency detailees

In a previous post, I recounted the advantages of using executive detailees as a means to combat staffing shortages on Capitol Hill. In short, agency detailees can serve as a free source of policy expertise to Congress, providing committees with experience and insight into agency decisionmaking and likely responses to congressional actions.

But, as with all governing arrangements, executive-branch detailees are not always an unalloyed good. Detailees, as some Hill veterans will explain, can come with costs.

  1. Detailees can have divided loyalties

Detailees can have a hard time shedding their agency allegiances, ultimately resulting in divided loyalties between their parent agency and their new congressional committee. These allegiances may be unconscious byproducts of spending a career in the executive branch.

Other agency employees, however, may arrive with more deliberate prejudices against Congress. Such detailees view Congress and its committees as institutions unfamiliar with the intricate inner workings of their agency, and as ones attempting to encroach on their expertise and operations with new laws and a constant barrage of oversight information requests. In these instances, detailees may struggle to work in support of the institutional interests of Congress.

  2. Detailees can have fixed policy preferences

Relatedly, borrowed agency employees may bring with them explicit policy preferences, often within specific issue areas they handled within their parent agency. Serving as a policy expert on a relevant committee may provide an opportunity to grind such a policy ax and, in turn, warp the policymaking processes within their new committee.

  3. Detailees often need training

Detailees are often unfamiliar with the legislative process and require basic training in congressional procedures once they get to the Hill. Given that committee resources are already severely strapped, providing such training further saps the time of permanent committee staff.

The time and resources spent bringing detailees up to speed on the ways of the Hill can result in a small return on the investment for Congress. What’s more, because detailees are loaned out for a limited time—often a year or less before returning to the executive branch—a constant cycle of orientation, training, working and departing can develop where very little time is spent on intricate policymaking.

  4. Detailees can mute the call to increase staffing capacity

A growing dependence on detailees as a means to compensate for decreasing congressional capacity may prompt some to argue that increasing the number of permanent congressional staff isn’t necessary. Detailees are seen by some as capacity Band-Aids covering up the more threatening conditions of limited expertise and too few staff in Congress. Increasing committee reliance on their use may perpetuate a situation of inadequate congressional staffing levels.

Agency detailees can be a source of policy expertise for congressional committees, but their contributions can’t be assumed. Detailees, themselves, can be a drain on the already limited capacity of Congress, and ultimately make Congress less effective, less productive and more susceptible to outside influence.

FDA misinterprets massive victory on teen smoking


As detailed this morning by the Food and Drug Administration, cigarette smoking by U.S. high school students has been cut in half since 2011—from 15.8 percent to 8.0 percent—a remarkable and previously unanticipated public health victory.

Unfortunately, it appears federal authorities may be misattributing the cause. In his announcement earlier today, FDA Commissioner Scott Gottlieb attributes most, if not all, of this reduction in smoking to a federally sponsored program that has only been in place since 2014. Despite substantial evidence in federally sponsored surveys in the United States and abroad showing that remarkable reductions in teen and adult smoking have been concurrent with the increasing popularity of e-cigarettes, the FDA announcement makes no reference to the possibility that much, if not most, of the recent reductions in teen smoking may be attributable to e-cigarettes.

In fact, Gottlieb urges continuing efforts to reduce teen use of all nonpharmaceutical nicotine delivery products, while endorsing expanded efforts at smoking cessation that rely on the pharmaceutical nicotine gums, patches and other products that have proved to be of only marginal effectiveness over the past four decades.

This public health victory is too important to leave to chance and guesswork. If Commissioner Gottlieb has evidence to support the claim that The Real Cost campaign “has already helped prevent nearly 350,000 kids from smoking cigarettes since it launched in 2014,” he should present it to the public. Regulators and public health authorities also should present and discuss the evidence for and against the possibility that the availability of e-cigarettes and related vapor products may, in fact, have played a major role in securing these reductions in smoking.

This is not an academic question. Recently promulgated regulations from Gottlieb’s own FDA threaten to eliminate more than 99 percent of e-cigarette products from the marketplace before the end of 2018, including all or almost all of the vape-shop segment of the industry. The limited data available strongly suggest that vape-shop products—with their ability to customize devices, flavors and nicotine strengths to satisfy each smoker’s preferences, and to modify flavors and nicotine strength over time to prevent relapse to cigarettes—may be more effective than mass-market products in achieving and maintaining reductions in smoking among both youth and adults.

Image by Sabphoto


Harm reduction is about making better choices, not perfect ones


Dr. Mark Boom, president and CEO of the Houston Methodist hospital system in Texas, suggests in a recent piece in The Hill that proponents of vaping are simply ignoring evidence that vapor products are not 100 percent safe.

Of course, people in the vaping community do not think that e-cigarettes are 100 percent safe. And if these products were found to increase the incidence of teen smoking of combustible cigarettes, we don’t want that either.

However, Boom appears to misunderstand the philosophy of harm reduction. Boom no doubt would encourage his patients who use intravenous drugs to, at the very least, use clean needles, rather than sharing. If he did not, he would be grossly abusing his privileged position as a healthcare authority. Similarly, applying a harm reduction philosophy by encouraging smokers to switch to e-cigarettes could save the vast majority of the 480,000 lives taken by combustible cigarettes every year.

As Boom rightly points out, e-cigarettes do, in fact, contain toxins. These are, however, at a very low concentration in the excipients – the products that make up the aerosol suspension that delivers the active ingredient of nicotine. What he neglects to add is that the excipients in nicotine liquid are strikingly similar to those in asthma inhalers. We certainly wouldn’t suggest to an asthma patient to forgo their medication because they are also inhaling toxins.

As a pharmacologist, I would encourage every person who ingests toxins to stop doing so. Of course I would. But my years in addiction research have made clear that you cannot simply tell someone to not pick up that cigarette, syringe or beer. Until that is possible, we have to encourage people to make better choices – which, unsurprisingly, is very easy to do.

When people do things we don’t approve of, we often write them off as not caring about their own health or personhood. But my work at community organizations that distribute clean needles to curb transmission of infectious disease, naloxone to reverse overdoses and HIV drugs to prevent new infections has made clear that people do recognize the risks they take every day and embrace opportunities to reduce the consequences associated with risky behaviors.

Image by Grey Carnation


Setting the record straight on copyright modernization


There’s a lot to be said for the adage that “we shouldn’t let the perfect be the enemy of the good.” While true in many situations, it also requires that there be enough “good” to be worth the effort you’re engaged in, and that you’re not wasting energy better deployed doing something else.

In a recent blog post on Truth on the Market, Kristian Stout of the International Center for Law and Economics takes issue with my framing of a bill that would require the register of copyrights—the person who heads the Copyright Office within the Library of Congress—to be a presidential appointment. I should add the proposal comes during a time when President Donald Trump is considerably behind in selecting and confirming his appointees to a broad range of executive branch positions.

Unfortunately, Stout mischaracterizes and misreads my position. In my TechDirt piece, I described both points of view about the bill, writing that “opponents argue the bill will make the register and the Copyright Office more politicized and vulnerable to capture by special interests.” Stout takes this out of context and represents it as my position, rather than a description of what others have said.

There are a number of other issues with Stout’s piece, not all of which are worth addressing. But I will tackle the main ones.

It’s true, as Stout claims, that the idea for making the register a nominated and confirmed position has been under discussion for several years as part of the House Judiciary Committee’s copyright review, but so were a lot of other things that didn’t come to fruition. My point is not that this idea is totally new, but that the impetus for the bill to be rushed through now is motivated by the political dynamic between Congress and Librarian of Congress Carla Hayden, as well as her removal last year of then-Register of Copyrights Maria Pallante. Stout asserts that Hayden’s nomination was not politicized when, in fact, it was. The Heritage Foundation, among other conservative groups, argued against her confirmation. Heritage Action even urged senators to vote “no” on her nomination, a position with which we disagreed.

To set the record straight — I don’t think it’s a terrible bill. As I’ve argued in TechDirt and The Hill, there are some reasonable arguments in its favor. There are also some plausible arguments against it. I simply don’t think it does much to move the ball either way.

The main point of the bill, according to many of its proponents, would be to make the Copyright Office position more politically accountable. In theory, with congressional input, stakeholders on all sides would have an opportunity to weigh in on who gets confirmed for the position. This could limit edge cases where there is a truly awful candidate. But the Senate rarely, if ever, rejects presidential appointments who are otherwise broadly qualified — particularly for what is not a Cabinet-level position. And there wouldn’t be many groups capable of mounting a successful opposition fight over this position, as they might over a Supreme Court seat (even then, it’s rarely the primary factor). Even for Heritage, likely the most powerful conservative group in Washington, key-vote scoring against Hayden in a Republican-controlled Senate only got them 18 votes.

This, in itself, is not much of a justification for a bill.

One of the key points of Stout’s argument for the legislation is that: “Separating the Copyright Office from the Library is a straightforward and seemingly apolitical step toward modernization.” But changing who appoints the register shouldn’t be conflated with separation or modernization. Indeed, the librarian of Congress still has final authority over all of the office’s substantive regulatory powers. Changing who picks the register also has nothing to do with meeting the challenges of modernizing the office’s information technology infrastructure. If an independent office is what you want, this bill isn’t that.

For the record, we at R Street are not necessarily opposed to an independent (or relocated) Copyright Office. Some scholars, including former Register Pallante, make a plausible case that the systemic bureaucracies of the Library are part of what’s holding the Copyright Office back. But it’s also hard to separate the Library’s well-documented IT problems from the decadeslong tenure of the previous librarian, James Billington. Additionally, there are IT modernization challenges at every level of the federal government, including independent agencies, and it may be worth giving the new librarian a chance to fix them.

At heart, the location of the Copyright Office is a complex question of public administration that is worthy of deep consideration and review. An immediate step I have suggested in conversations with colleagues is to have Congress ask the National Academy of Public Administration to conduct a review of the internal structural challenges of the Library and its component agencies (as it did for the PTO in 2005). This would inject a much-needed dose of objectivity into a discussion that has unfortunately served as another proxy battle between the entrenched sides of the intellectual property debate.

In his conclusion, Stout makes an excellent point: “Sensible process reforms should be implementable without the rancor that plagues most substantive copyright debates.” I agree. Regardless of how strong you think our nation’s copyright laws ought to be, you should be in favor of making the system’s core functions work better. This bill will do little, if anything, to advance that goal. I look forward to working with stakeholders on all sides, including Stout, to find solutions that do.

Image by Jirsak


PACE Act would prosecute teen sexting as kiddie porn


Crimes against children, particularly those that involve sexual exploitation, are beyond the pale. But while society needs to make sure it protects children from sexual abuse, recent legislation passed by the U.S. House could cause more problems than it solves – hurting minors, expanding mandatory minimums and creating redundant federal authority where there already are similar laws at the state level.

By a 368-51 margin, the House voted May 25 to approve H.R. 1761, the Protecting Against Child Exploitation (PACE) Act of 2017. The bill is intended to strengthen federal laws dealing with the production and distribution of child pornography by making the transmission of sexual images of minors a federal crime. The measure has moved on to the upper chamber, where it will be considered by the Senate Judiciary Committee.

While the bill’s purpose is to punish child predators, its unintended consequence will be to create more criminals out of teenagers whose main crime is simply lacking common sense.

As written, the law could apply to minors who send sexual images to other minors, or what is commonly referred to as “sexting.” The House-passed bill provides no exemption or provision to deal with minors who engage in sexting, meaning they could be subject to a mandatory minimum sentence of 15 years in prison and lifetime registration as a sex offender. Because of how broadly the text is written, even a teenager who merely views a sexual image or requests that one be sent could be subject to the mandatory minimum.

Sexting among teenagers increasingly has become the norm. While the phenomenon is worth a larger discussion, most would agree that locking teenagers up for 15 years is not the best way to handle the situation. Few believe these minors are committing crimes on a par with actual child predators. They should not be treated the same way under the law.

Teenagers are still minors in the eyes of the court. By creating an inflexible law that cannot take into account the ages of those involved, the law will force the courts to punish minors for having poor judgment. For numerous other crimes, the court system is purposely designed differently when it comes to how and whether to prosecute and sentence minors. Judges are given more tools to keep them out of jail and without criminal records. By retaining local jurisdiction, communities could respond more effectively to offenders and victims, as well as to the community at large. Child pornography laws should protect children from terrible acts, not punish teenagers for lapses in judgment.

Such concerns could have been addressed in the PACE Act, were it not for pure laziness on the part of the House of Representatives. The bill was passed without any hearings or input from experts, and approved as members fled Washington for their Memorial Day recess. The American people deserve better than that.

There is still hope that the Senate will take notice of these issues. Law enforcement agencies at both the state and federal levels already have multiple tools at their disposal to prosecute child predators. This expansion of federal power is nothing but Congress creating a solution to a problem that does not exist.

Image by nito


Dual-class shares and the shareholder empowerment movement


The shareholder empowerment movement has renewed its effort to eliminate, restrict or, at the very least, discourage use of dual-class share structures—that is, classes of common stock with unequal voting rights—in initial public offerings. Of particular interest to the movement, which is made up primarily of public pension funds and union-related funds that hold more than $3 billion in assets, was the recent Snap Inc. IPO that sold nonvoting stock to the public, a first for IPOs with dual-class shares.

Typically, a company will issue a class of common stock (“ordinary shares”) to the public that carries one vote per share, as Facebook Inc. did in its IPO, while reserving a separate “super-voting” class that provides founders like Mark Zuckerberg with at least 10 votes per share. This structure allows the founders to maintain control of the company without having to own the majority of outstanding common stock.
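A quick back-of-the-envelope sketch makes the mechanics concrete. The share counts below are hypothetical, chosen only to illustrate how 10-votes-per-share super-voting stock lets founders control a company with a minority economic stake:

```python
# Hypothetical dual-class capitalization (illustrative numbers, not any
# real company's figures): the public holds Class A shares with one vote
# each; founders hold Class B shares carrying 10 votes each.
class_a_public = 850   # Class A shares, 1 vote per share
class_b_founder = 150  # Class B shares, 10 votes per share

founder_votes = class_b_founder * 10
public_votes = class_a_public * 1
total_votes = founder_votes + public_votes

# Economic ownership vs. voting control diverge under dual-class terms.
founder_economic_stake = class_b_founder / (class_a_public + class_b_founder)
founder_voting_power = founder_votes / total_votes

print(f"economic stake: {founder_economic_stake:.0%}")  # 15%
print(f"voting power:   {founder_voting_power:.0%}")    # 64%
```

With just 15 percent of the shares outstanding, the founders command a comfortable voting majority; Snap’s nonvoting public shares push this logic to its limit, since public holders contribute zero votes regardless of their economic stake.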

Even though it offered no voting rights in the shares sold to the public, the Snap IPO was a huge success. Snap priced its IPO at $17 per share, giving it a market valuation of roughly $24 billion. The book was more than 10 times oversubscribed, and Snap could have priced the IPO as high as $19 per share.

The Council of Institutional Investors, the trade organization that represents the shareholder empowerment movement, has asked the S&P Dow Jones Indices, MSCI Inc. and FTSE Russell to exclude Snap Inc. and other companies with nonvoting stock from their indices unless they include extremely restrictive provisions, such as maximum sunset provisions—triggers that would terminate the super-voting characteristics of the founders’ shares—of three to five years. Moreover, consistent with the CII’s general policy, the letters the council sent also advocate for a forced conversion of all dual-class share structures to one-share, one-vote, unless the majority of ordinary shares vote to extend the dual-class structures for a maximum of five years.

The movement’s advocacy is not confined to those IPOs with dual-class shares listed on the U.S. stock exchanges. It also is attempting to persuade the Singapore stock exchange not to allow dual-class share structures of any kind.

If the movement is successful, this shift would not be trivial, as many of our most valuable and dynamic companies have gone public by offering shares with unequal voting rights. Besides Snap and Facebook, other companies that have gone public with dual-class shares include Alphabet Inc. (Google); LinkedIn (acquired by Microsoft for $26 billion in 2016); Comcast; Zoetis Inc.; Nike, Inc.; and Alibaba Group Holding Ltd. Two of these companies, Alphabet and Facebook, rank in the top 10 in the world based on market valuation. Berkshire Hathaway Inc., a company that also uses a dual-class share structure, also ranks in the top 10, although it only started using the structure after Warren Buffett bought control of the company.

Public companies with dual-class share structures have an aggregate market value of close to $4 trillion. As reflected in their market valuations, they are some of our most important companies, helping to fuel the growth of the economy.

The movement’s vigorous response to Snap’s hugely successful IPO was unsurprising. The CII, since its founding in 1985, has promoted a “one-share, one-vote” policy as one of its bedrock principles. But this policy of “shareholder democracy” should not be confused with political democracy, where each person gets one vote. In shareholder democracy, voting power is assigned according to property ownership – i.e., how many shares the person or entity owns. Dual-class share structures clearly violate the CII’s policy of shareholder democracy and are an obvious threat to the movement’s power. That is, the more public companies that utilize a dual-class share structure, the more controlled companies exist and the less power the movement has.

Most importantly, the movement’s advocacy comes into strong conflict with what many believe to be the great strength of our system of corporate governance: the private ordering of corporate governance arrangements, with dual-class share structures being an optimal result of that ordering. Consistent with this understanding, NASDAQ Inc. recently declared:

One of America’s greatest strengths is that we are a magnet for entrepreneurship and innovation. Central to cultivating this strength is establishing multiple paths entrepreneurs can take to public markets. Each publicly-traded company should have flexibility to determine a class structure that is most appropriate and beneficial for them, so long as this structure is transparent and disclosed up front so that investors have complete visibility into the company. Dual class structures allow investors to invest side-by-side with innovators and high growth companies, enjoying the financial benefits of these companies’ success.

At its core, the shareholder empowerment movement advocates shifting corporate decision-making authority to shareholders, and thus away from boards of directors and executive management, the most informed loci of corporate authority. Shareholder empowerment, not maximizing shareholder wealth, is the movement’s objective. This movement must be stopped from opportunistically interfering with the use of dual-class share structures in IPOs.

Image by create jobs 51


Lehmann before the House Financial Services Committee

R Street Senior Fellow R.J. Lehmann testifies before a June 7 hearing of the House Financial Services Committee on “Flood Insurance Reform: A Taxpayer’s Perspective.”

How Congress became colonized by the imperial presidency


Ever since Arthur Schlesinger’s 1973 book coined the phrase, the so-called “imperial presidency” has been a perennial topic of our national political discourse. At a time when the American branches of government are separate but unequal, the seven essays collected in The Imperial Presidency and the Constitution trace when fears of an imperial presidency first arose, the extent to which such fears are justified and what can be done about it.

Adam J. White’s contribution, “The Administrative State and the Imperial Presidency,” cautions not to conflate the “imperial presidency” with the administrative state itself. As White points out, the administrative state is “first and foremost a creation of Congress,” and “to at least some extent, a necessary creation.”

By contrast, the imperial presidency refers to the power the president wields through his office. While this power can be channeled and enhanced through the apparatus of the administrative state, an imperial presidency also “can restrain the administrative state, as in the Reagan administration … and, less obviously, the administrative state can restrain an imperial president.”

In modern times, of course, the power of the presidency and the administrative state have grown in tandem. “The president wields executive power broadly to expand the administrative state, and the administrative state acts in service of the current president’s agenda,” White writes.

After various failed attempts by Congress itself to act as an administrative body during the Articles of Confederation era, the U.S. Constitution provided for an energetic executive, which Alexander Hamilton described as “essential to the steady administration of the laws.” Despite this, the Constitution offered little in the way of an affirmative vision of the administrative bureaucracy, an omission some scholars have referred to as “the hole in the Constitution.”

Although there were earlier antecedents, Congress’ creation of the Interstate Commerce Commission in 1887 marked the modern administrative state’s arrival. Over time, the ICC’s powers were enhanced by Congress to encompass both judicial and legislative powers, given its ability to both set rates and adjudicate disputes. During the Progressive Era and through the New Deal, more administrative agencies were built on the ICC model, including the Federal Trade Commission and Federal Communications Commission.

Importantly, these agencies were distinct from the traditional executive branch departments and thus operated “outside of the direct oversight of the president,” White notes. Progressive policymakers—starting with some in the Franklin Roosevelt administration—quickly grew frustrated with the agencies’ ability to “impede an energetic liberal president’s regulatory agenda.”

Years later, conservatives also began to bemoan the independent nature of certain agencies. As the Reagan administration sought to cut back on the regulatory state, it attempted to increase the president’s power over the administrative state through mechanisms such as centralized regulatory review under the Office of Information and Regulatory Affairs. Since Reagan, presidents of both parties increasingly have embraced greater presidential control over federal agencies. Some used that control to expand the administrative state’s power, while others have sought to curtail it.

The “most straightforward” way to shrink the administrative state, White argues, “would be for Congress to do the work of taking delegated powers away from the agencies, by amending statutes.” Since many legislators prefer to delegate their power in an effort to avoid responsibility, White views this option as unrealistic.

This leads White to the “second best option,” which is to pass some form of broad regulatory reform legislation that revamps the processes through which agencies enact rules. He mentions the REINS Act and the Regulatory Accountability Act as two possible options. R Street actually has identified a whole menu of options from which Congress feasibly could choose.

More broadly, White points out that using the imperial presidency as a means to control and direct the administrative state is no longer an effective mechanism to rein it in. Rather, it’s far past time that the other branches assert themselves and join the fray. One possibility is for the judicial branch to revisit its doctrines that grant significant deference to federal agencies.

In many ways, Andrew Rudalevige’s contribution, “Constitutional Structure, Political History and the Invisible Congress,” picks up where White’s essay leaves off. When the system of separated powers works as intended, the legislative and executive branches operate as “rivals for power,” making their relationship contentious, rather than cooperative. Although the Founding Fathers were more concerned about the legislature accreting power than the executive, Rudalevige’s chapter retraces how both structural and political factors have created the exact opposite dynamic.

Rudalevige lays out an obvious—but often underappreciated—truth: the president has a built-in advantage in that he is just a single person. By contrast, Congress must function as a 535-member conglomeration of legislators spread across two different chambers and hailing from different political parties and geographical regions. Given that each member carries “their own localized electoral incentives,” they will “rarely find it in their interests to work together, much less to confront the executive branch.”

Another factor Rudalevige pinpoints for Congress’ decline is the rise of political polarization. Politics has increasingly become a team sport: “A vote against presidential overreach is now seen by the president’s party colleagues as damaging to the party brand, and thus to their own careers.” The result is that legislators are more likely to toe the party line in pursuit of short-term policy victories, rather than vote to strengthen Congress as an institution.

Rudalevige also highlights how modern travel has allowed congressmen to shuttle back and forth between their home districts and Washington with relative ease. This has led to the rise of the “Tuesday-Thursday club of drop-in legislators,” who spend more time pressing the flesh with donors and constituents back home than doing the hard work of hammering out legislative compromises. One option is for Congress to extend its work weeks, which could increase the amount of floor time available to conduct legislative business.

Exercising more effective oversight doesn’t just mean finding more time; it also requires more capacity. Rudalevige cites R Street’s Kevin Kosar, who has chronicled the decline in congressional staff and pay levels over the past 40 years. Beefing up congressional staff, as well as support systems like the Congressional Research Service, would help address this deficiency.

Other possibilities include forming new institutions such as a Congressional Regulation Office—as proposed by Kosar and the Brookings Institution’s Phillip Wallach—to provide independent cost-benefit analyses and retrospective reviews of regulations. A final idea—and one long advocated by policy wonks—is a return to “regular order” budgeting, in which Congress breaks the federal budget into bite-sized pieces rather than relying on last-second, thousand-page omnibus spending bills to keep the government’s lights on.

While all of these ideas are available and ripe for the picking, Rudalevige admits that “current returns are unpromising” that Congress will actually implement any of them. Nonetheless, he’s correct in warning that “the matter demands our attention even so.” Let’s hope Congress—and the American citizenry—heeds his call.

Image by Ed-Ni Photo


How executive ‘detailees’ could help ease Congress’ staffing problems


It is becoming more widely acknowledged that Congress has a staffing problem. While the executive branch employs more than 4 million people, the legislative branch has only about 30,000. This number includes personnel toiling for agencies that do not readily come to mind as legislative, like the Government Publishing Office, the Architect of the Capitol and the U.S. Capitol Police.

While congressional capacity advocates shout for more funding and personnel to be allocated to the legislative branch, political scientists Russell Mills and Jennifer Selin examine the use of an often-overlooked stream of expertise available to congressional committees: federal agency detailees. Detailees are executive agency personnel with a particular policy mastery who are temporarily loaned out to congressional committees. The typical detailee assignment runs one year.

Hill operators and observers have long known policy expertise resides primarily in congressional committee staff. Compared to House and Senate personal office aides, committee staffers typically have more experience and narrower portfolios, both of which enhance the abilities of committees and their members to conduct oversight, draft legislation and develop fruitful lines of communication with relevant agency stakeholders.

However, as Mills and Selin point out in a recent piece in Legislative Studies Quarterly, there are only about half as many committee staff as there were in 1980, while inflation-adjusted pay levels have fallen 20 percent for many committee aides. This reduction in resources has hampered committees’ oversight capabilities, in addition to abetting the centralization of policymaking in leadership offices or its complete delegation to the executive branch.

House versus Senate committee staff, 1977-2014


SOURCE: Russell Mills and Jennifer Selin, 2017

Mills and Selin argue detailees offer at least three specific benefits to supplement Congress’ legislative and oversight responsibilities:

  1. Detailees provide additional legislative support. Though committee staffers are usually issue specialists, “detailees often have specialized, expert knowledge of a policy, [and] they are able to provide awareness more traditional congressional staff may not have.” Moreover, given their personal experience within the agencies, detailees offer committees important insight into the decision-making processes and likely agency responses to potential congressional action.
  2. Detailees assist with executive branch oversight. “The process for securing information through requests directly to a federal agency is slower and involves agency coordination with the presidential administration. Detailees provide a way around these problems.” Simply having agency contacts and being able to connect committee staffers directly to those agency personnel most likely to respond quickly with accurate information can expedite the frustratingly slow information-gathering process vital to conducting effective congressional oversight.
  3. Detailees supplement interest-group engagement. In developing policy, committee staffers spend much of their time meeting with relevant policy stakeholders. “Committee staff routinely assists members of Congress by meeting with interest groups to gather their input for legislative initiatives as well as to hear their objections or support for actions taken by executive agencies.” Detailees provide the committee more, and different, stakeholder contacts established from the agency perspective, which allows for better information filtering and a more informed assessment of legislative potential.

Finally, and importantly, Mills and Selin point out that use of detailees is a rare win-win for both the legislative and the executive branches. The benefits to Congress are clear: committees gain expert-level staffers with experience and connections to the agencies under the committee’s purview, all on the agencies’ dime. Sen. Susan Collins, R-Maine, has noted:

These detailees apply their expertise in researching issues, staffing hearings, and working on legislation. In return, they gain valuable experience, which develops their careers and benefits their agencies.

The gains for the executive branch are less intuitive. After all, the agency loses a competent staffer who then offers Congress firsthand insight into agency operations, even potentially providing increased oversight to the very agency from which the staffer originated.

But Mills and Selin note that, from qualitative interviews they conducted with current and former detailees, they discovered that “detailees gain experience in the legislative process, can represent the interests and perspectives of the agency, and give the agency a conduit to committee decision making.”

In other words, just as detailees provide insider information to committees on agency operations, agencies profit from their detailees returning to the agency with intelligence on committee decision-making, policymaking and oversight capabilities. All of which our personnel-strapped national legislature badly needs.

Five years of R Street


Five years ago today, Deborah Bailin, Christian Cámara, Julie Drenner, R.J. Lehmann, Alan Smith and I resigned our jobs at the Heartland Institute over a horrifically ill-advised billboard advertisement and began a new think tank called R Street. Tonight, we’ll celebrate our fifth anniversary.

We’re now almost 40 strong and have a budget about 10 times that of our first year. In honor of our anniversary, here are five bits of trivia about R Street that I like to share:

  1. R Street’s first hire was Erica Schoder, now our senior vice president for operations. Our first office, previously the Heartland Institute’s Washington office, was a converted art gallery above a vintage clothing store.
  2. Some other names we considered were the Metis Institute (after the Greek goddess of common sense) and JuneFirst (after the day we officially opened). Our offices were near R Street and R is the first mostly residential street off Connecticut Avenue, which is arguably the main street in Washington. So it’s the place where real life begins in the nation’s capital.
  3. One huge advantage of the name R Street was that we could get the short URL org. That’s actually a big deal. It makes our email addresses much easier to type. Many other think tanks that have started recently have long and unwieldy URLs. We don’t.
  4. To my knowledge, we remain the only right-of-center think tank that both reimburses bike sharing and maintains a gender-identity nondiscrimination policy. I’m a cyclist and support civil rights protections for the gender nonconforming. But I’d argue that both policies are simply grounded in common sense.
  5. We believe that pirates are much cooler than ninjas. By a lot.

Image by Africa Studio


Reports of the taxi industry’s death have been greatly exaggerated


Co-written with Jonathan Haggerty 

It seems like nearly every time ridesharing comes up in New York City, someone will inevitably mention the dramatic decline in taxi medallion prices. Dubbed the “Uber effect” by American Enterprise Institute scholar Mark Perry, the theory is that increased competition from companies like Uber and Lyft has eroded the legal monopoly that taxi medallion holders previously held in the on-demand automobile transport market.

By competing against this once isolated market, transportation network companies like Uber and Lyft have made these medallions significantly less valuable. One proxy for this decline can be found in share prices of Medallion Financial Corp., a publicly traded consumer and commercial lending firm that is a major creditor in the taxi medallion lending business. When looking at the period from 2013 to 2016, the decline certainly looks precipitous:


This may not be the complete story, however. After all, the stock price may vary depending on the specific quality of loans the company issues, its underlying cost of capital and general market confidence. Furthermore, the stock price doesn’t make any distinction across the numerous categories of medallion ownership.

To the extent that news reports cite changes in the actual market value of a medallion, they usually do so anecdotally, comparing the peak value in 2014 of more than $1 million to the current trough of under $300,000.

Given the clamor and potential policy implications, a more detailed analysis seemed appropriate. We examined medallion price trends over time and differentiated across the different medallion categories. NYC’s Taxi and Limousine Commission compiles monthly records of medallion transactions for each of six categories: individual unrestricted, handicap accessible and fuel alternative, as well as corporate (minifleet) unrestricted, accessible and fuel alternative. Unrestricted cabs are the general-purpose yellow taxis that everyone thinks of, handicap accessible cabs are specially retrofitted to allow persons with disabilities easier access, and fuel alternative cabs have specific fuel requirements tilted toward being more environmentally friendly.

The primary breakdown is between individual and minifleet. Where an individual medallion owner has to spend a minimum number of hours per year (usually the equivalent of 200 separate nine-hour shifts) driving the cab, a minifleet owner can lease out taxis to other drivers.
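The kind of grouping involved can be sketched in a few lines of Python. The transaction records below are made-up placeholders in the rough shape of the TLC’s monthly transfer files, not actual figures; the point is simply averaging sale prices by month and medallion category:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical transfer records: (month, medallion category, sale price).
# Prices here are illustrative placeholders, not real TLC data.
transfers = [
    ("2014-01", "individual_unrestricted", 1_050_000),
    ("2014-01", "minifleet_unrestricted", 1_300_000),
    ("2017-03", "individual_unrestricted", 250_000),
    ("2017-03", "minifleet_unrestricted", 600_000),
]

def monthly_averages(records):
    """Group transfers by (month, category) and average the sale prices."""
    buckets = defaultdict(list)
    for month, category, price in records:
        buckets[(month, category)].append(price)
    return {key: mean(prices) for key, prices in buckets.items()}

averages = monthly_averages(transfers)
for (month, category), avg in sorted(averages.items()):
    print(f"{month} {category}: ${avg:,.0f}")
```

Running this per-category breakdown over the full TLC dataset is what separates the headline decline in unrestricted medallions from the quieter trends in the accessible and alternative-fuel categories discussed below.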

By far, the largest categories are the individual and minifleet unrestricted licenses, and the general decline here tracks fairly well with Medallion Financial’s stock price:


Immediately we can see that there is a clear and substantial price premium for minifleet licenses over individual licenses. This makes sense intuitively. A license with strict personal driving requirements is going to be more restrictive on your time, and less valuable, than one without. Another factor that stands out is how messy the data is, with transfers at price points both significantly cheaper and significantly more expensive than the average in any given month. Unfortunately, it’s difficult to tell whether this was an issue with the NYC taxi commission’s data recording or whether these were due to external factors, like family transfer discounts or business liquidations.

However, it is important to recognize that the towering price high in 2014 was spurred partially by fleet owners borrowing against the rising value of the medallions they already owned to finance further purchases. So while medallion prices are undoubtedly dropping, it may look worse because prices were experiencing a bit of a bubble in the first place. Indeed, a former head of the TLC stated in April “the (taxi) industry’s performance has not been as bad as the decline in medallion prices would suggest.” In other words, don’t mistake the price of medallions for the health of the industry overall.

Another obvious factor here is the decrease in liquidity since 2014. One sale in March and two in February of 2017 means one of two things is happening: either medallion owners can’t find buyers, or owners are holding on because they see a price rise or stabilization on the horizon. The prospect of a bailout could keep buoying prices, while easing restrictions on medallion transfers has increased the potential pool of buyers.

Unfortunately, there were so few alternative fuel licenses released or transferred that there was not much data to analyze. Handicap accessible licenses, however, had a more interesting story to tell:


Here you can see that the handicap accessible licenses have actually appreciated in value over the same timeframe. (If the graph looks funky with the straight lines, that’s due to the initial auctions where these licenses were sold.) This is not an apples-to-apples comparison, because we have so little data post-2014, but the total lack of sales (for minifleet accessible) may be an indication that it’s not an asset worth liquidating.

One reason for this may be that Uber partners with cab drivers who own these handicap accessible licenses to help provide rides on their platform to users with disabilities. It seems intuitive then, that these specific medallions would continue to hold value.

But perhaps the most important factor in all this is the total size of the market. The market share of taxis has shrunk with the emergence of Uber and Lyft, but the overall size of the market is larger today:


Note that taxi trip volume has begun to level out in late 2016 and 2017. Taxis can coexist with TNCs in some markets, especially in densely populated cities where the value of a street hail is higher.

Put all of this together, and it appears the reports of taxi death have been greatly exaggerated. While some form of the Uber effect certainly exists, insofar as general medallion prices are concerned, the decline is not quite as precipitous as some have reported and taxi ride volume is not disappearing overnight. Furthermore, the future price of all these medallions likely will be more dependent on the success or failure of autonomous vehicles than on competition from ridesharing services from here on out.



The data we compiled for the piece can be found here.

Image by Cameris

Even without Durbin Amendment repeal, Congress should pass the CHOICE Act


The following post was co-authored by R Street Outreach Manager Clark Packard.

House Financial Services Committee Chairman Jeb Hensarling, R-Texas, has done the yeoman’s work of putting together a host of fundamental conservative reforms in the CHOICE Act. Although repeal of the Durbin amendment would have been a positive, pro-market reform, Congress should pass the bill even if this repeal is not included.

The most important provision of the bill allows banks the very sensible choice of maintaining substantial equity capital in exchange for a reduction in onerous and intrusive regulation. This provision puts before banks a reasonable and fundamental trade-off: more capital, less intrusive regulation. This is reason enough to support the CHOICE Act. Its numerous other reforms, including improved constitutional governance of administrative agencies, are another key reason to support the bill.

Accountability of banks

The 10 percent tangible leverage capital ratio, conservatively calculated, as proposed in the CHOICE Act, is a fair and workable level.

A key lesson of the housing bubble was that mortgage loans made with 0 percent skin in the game are much more likely to cause trouble. To be fully accountable for the credit risk of its loans, a bank can keep them on its own balance sheet. This is 100 percent skin in the game. The CHOICE Act rightly gives relief to banks holding mortgage loans in portfolio from regulations that try to address problems of a zero skin in the game model – problems irrelevant to the incentives of the portfolio lender.

Accountability of regulatory agencies

The CHOICE Act is Congress asserting itself to clarify that regulatory agencies are derivative bodies accountable to the legislative branch. They cannot be sovereign fiefdoms, not even the dictatorship of the Consumer Financial Protection Bureau. The most classic and still most important power of the legislature is the power of the purse.  The CHOICE Act accordingly puts all the financial regulatory agencies under the democratic discipline of congressional appropriations. This notably would end the anti-constitutional direct grab from public funds that was granted to the CFPB precisely to evade the democratic power of the purse.

The CHOICE Act also requires of all financial regulatory agencies the core discipline of cost-benefit analysis. Overall, this represents very significant progress in the governance of the administrative state and brings it under better constitutional control.

Accountability of the Federal Reserve

The CHOICE Act includes the text of The Fed Oversight Reform and Modernization Act, which improves governance of the Federal Reserve by Congress. As a former president of the New York Federal Reserve Bank once testified to the House Committee on Banking and Currency: “Obviously, the Congress which set us up has the authority and should review our actions at any time they want to, and in any way they want to.” That is entirely correct. Under the CHOICE Act, such reviews would happen at least quarterly. These reviews should include having the Fed quantify and discuss the effects of its monetary policies on savings and savers.

Reform for community banks

A good summary of the results of the Dodd-Frank Act is supplied by the Independent Community Bankers of America’s “Community Bank Agenda for Economic Growth.” “Community banks,” it states, “need relief from suffocating regulatory mandates. The exponential growth of these mandates affects nearly every aspect of community banking. The very nature of the industry is shifting away from community investment and community building to paperwork, compliance and examination,” and “the new Congress has a unique opportunity to simplify, streamline and restructure.”

So it does. The House of Representatives should pass the CHOICE Act.

Image by Erce


How congressional power became separate, but unequal


Recent polling shows that Americans are increasingly turned off by the rancor and high-stakes nature of our recent presidential elections. But don’t expect contests for the presidency to calm down anytime soon. Today, the modern American presidency is more powerful than ever, making the importance of the office paramount to partisans on both sides of the political aisle.

It’s important to remember, however, that the presidency wasn’t always viewed this way. The system established by our Founding Fathers went to great lengths to separate the powers of government both vertically and horizontally. If anything, the founders actually were more concerned about power accreting in the legislature than in the executive.

As James Madison warned: “[i]n republican government, the legislative authority necessarily predominates,” rendering it necessary to take certain “precautions” to “guard against dangerous encroachments.” In contrast, he noted that the “weakness of the executive … might require it to be fortified” in order to resist legislative power grabs. The text of the Constitution reflected the primacy of Congress, too: Article I of the document, which lays out the legislative powers, is more than twice as long as Article II, which describes the executive’s role.

Over the past several decades, though, Congress has gradually lost its influential role, while the presidency has been ascendant. Today, the executive branch is a sprawling behemoth with more than 4 million employees, and presidents routinely advance policy goals by executive fiat rather than by working with Congress. Given Congress’ diminished state, it is important to consider how and why Congress has failed to maintain its role as the country’s “first branch.” A recent paper by Matthew Glassman of the Congressional Research Service lays out a primer on the history of the separation of powers, as well as providing clues about Congress’ dwindling status within that system.

As Glassman recounts, the notion of governmental power being comprised of distinct functions—lawmaking, administration and adjudication—can be traced back to the ancients, including greats like Aristotle, Polybius and Cicero. The theory was more fully developed in the 17th and 18th centuries by Locke and Montesquieu, who acted as intellectual guideposts to the American founders.

The key feature of the American tripartite system is that it placed the legislative, executive and judicial powers of government into distinct spheres, but also ensured that their powers overlapped in certain areas. For example, the president has veto power over congressionally passed legislation, while Congress has a say in executive branch appointments. In Glassman’s words, this setup produces conflict “by design,” allowing each branch to guard its power against encroachment from the other branches.

Glassman also identifies several institutional features that have allowed our system of separated powers to remain effective throughout most of our country’s history, such as distinct personnel, independent electoral bases and separate resources for each branch. But using a system of separated power to guard against the accumulation of power is only effective if the numerous branches are operating in relative equipoise.

Glassman’s paper is particularly insightful in analyzing why the power of different branches can ebb and flow over time. He highlights the perverse incentives individual actors within each branch face—incentives that can cause them to undermine their own branch’s long-term institutional power. These forces at least partly explain why Congress’ power has declined in recent times.

For one, Glassman notes that an individual actor within a branch may have personal policy positions that conflict with the long-term institutional interests of his or her branch. An example might be a member of Congress agreeing on policy grounds with a president’s decision to engage in a unilateral military strike, despite the fact that the president acted without consulting Congress.

Partisan affiliations also might cause individuals to take actions that undermine their branch’s institutional power. This phenomenon is commonly seen when members of Congress refuse to criticize a president of their own party publicly, even if they believe the president is acting beyond his power. The electoral goals and strategies of individual members of Congress can conflict with their own branch’s long-term interests.

Glassman recognizes that the problem of a branch’s institutional power conflicting with the personal goals of individual branch members is “particularly acute for Congress”:

As individual members of a large body, Representatives and Senators may not believe they have the responsibility or the capacity to defend the institution… Even when Congress does choose to institutionally defend itself, it often finds itself speaking with less than a unified voice, as only the most vital institutional powers have the ability to unanimously unify Congress.

These problems of collective action—the responsibility/capacity to defend the institution, the ability to speak with a unified voice, and the conflict with party or policy goals—rarely if ever occur in the executive branch. The unitary nature of the presidency ensures that the executive branch will ultimately always speak with one voice, and past presidents have often expressed— both in office and after retirement—a deep feeling of responsibility for the maintenance of the powers of the presidency.

These trends, of course, are not irreversible. Congress can fight back against executive branch encroachment, if it so chooses.

R Street’s Legislative Branch Capacity Working Group has identified numerous “Madisonian solutions” that would allow Congress to rebalance the separation of powers. Options include strategies to strengthen Congress itself—for example, by beefing up committee staffs and providing more funding for entities like the Congressional Research Service and the Government Accountability Office. Alternatively, Congress could seek to reduce the power of the presidency by clawing back power from federal agencies through comprehensive regulatory reform legislation.

In other words, Congress has the tools at its disposal to return our branches of government to a more equal footing. Members of Congress simply need to start prioritizing their branch’s long-term institutional interests over their personal preferences and predilections. Until that happens, we can expect the preeminence of the presidency—and the vitriol of presidential elections—to continue unabated.

Image by Pozdeyev Vitaly

Does Congress have the capacity it needs to conduct oversight?


Envisioned by the founders as the “first” branch of government, Congress has the responsibility of overseeing and managing the other two arms of our constitutional system. And yet, as the executive branch has grown in power and prestige, Congress has increasingly lost its authority.

What resources does Congress currently employ when overseeing federal agencies? Which current resources are well-used; which are under-utilized? What additional tools and resources does Congress need to engage in truly effective oversight? The Legislative Branch Capacity Working Group recently hosted a panel on these questions, moderated by R Street’s Kevin Kosar and featuring Morton Rosenberg of The Constitution Project and Justin Rood of the Project on Government Oversight. Video of the panel is embedded below:

To keep jobs in Missouri, special session should allow more options for renewable energy


As the legislature continues work during the special session, it needs to keep sight of the big picture. The case that motivated Gov. Eric Greitens to call the session—the loss of two plants in southeast Missouri due to high electricity costs—highlights the importance of cheap, reliable electricity to the economic health of the state. But if Missouri politicians are interested in sustainable growth in its energy sector, they need to go beyond legislating single cases and take a broader look at how the electrical system can become more attractive to employers and consumers alike.

Of course, there is an easy way to reform Missouri’s electrical regulations that will increase the state’s attractiveness to business while advancing the free market principles that the legislature—and voters—support.

As things stand, consumers are restricted to buying power from their utility company, local municipality or electric cooperative. This lack of choice can be burdensome, but it is a particular problem for businesses that have internal sustainability goals regarding energy use. Indeed, many large companies have set goals to receive a set percentage of their energy from renewable sources. Businesses adopt these goals to save on costs, satisfy consumer preferences and to underscore good corporate stewardship. In Missouri, however, many companies may not be able to meet their energy goals, because local utilities simply do not offer sufficient renewable electricity. For a business deciding whether to locate or expand facilities in the state, lack of options makes the choice clear.

During the recently ended regular session, the legislature considered the Missouri Energy Freedom Act, by Rep. Bill Kidd, which would have solved the scarcity problem by allowing companies to purchase renewable electricity from someone other than their official local provider. This legal structure has worked in other parts of the country, and has the potential to attract thousands of jobs to the state, both from energy-conscious employers and from potential renewable generators. Companies save money on their energy bills, while private parties, rather than ratepayers, shoulder the risk of new clean energy projects. That means one simple rule change can bring Missouri huge new investments, more profitable businesses, jobs in the community and clean energy to fuel the economy, all at no risk to the state.

Most important, however, this approach would bring more jobs to the state without increasing the role of government. Allowing power-purchase agreements (PPAs) involves no mandates, subsidies or government heavy-handedness. It simply provides companies with another option. The proposal also requires utilities to be reimbursed for any costs associated with allowing other power generators access to the grid, essentially leveling the playing field.

Seven of Missouri’s largest companies – General Mills, General Motors, Nestle, Procter & Gamble, Target, Unilever and Wal-Mart – are on record supporting this approach. Even the Department of Defense is supportive. But, it’s not just the big guys who stand to benefit. As long as you use enough power, you’d be able to lock in long-term, low prices for electricity through this new structure—a benefit small mom and pop firms will appreciate.

By allowing PPAs for renewable energy, Missouri can help keep tens of thousands of jobs in the state by opening up greater access to clean energy and increasing competition and free markets. Adding this element of competition should be part of the final legislative package for the special session.

Image by Gino Santa Maria

Nebraska should be on the cutting edge of spacesharing


Nebraska’s fame as a place for innovation and leadership is legendary. Even before Warren Buffett became the “Oracle of Omaha,” Nebraskans had invented CliffsNotes, the Reuben sandwich, Vise-Grip locking pliers, TV dinners, Kool-Aid and Arbor Day, just to name a few.

But that history makes it even more perplexing why this state, which so often has been on the entrepreneurial leading-edge, suddenly would turn against that heritage and ban useful modern innovations like Airbnb and other short-term-rental platforms that help travelers visit Nebraska’s great and historic cities. These services have helped fill the market gap for people coming to the College World Series and other exceptional crowd-drawing events, who sometimes have trouble finding a place to stay overnight within commuting distance.

The Unicameral, itself a unique feature of Nebraska innovation, has been considering legislation that would prevent local governments from outlawing short-term rentals by those who wish to make a little extra cash and perhaps meet some nice folks from out-of-state. Alas, it is having trouble finding a spot on the agenda.

Like similar bills around the country, the Nebraska bill would continue to allow local governments to prohibit sexually oriented businesses, unlicensed liquor sales, sales of street drugs and anything else in a short-term rental that would constitute a genuine public-safety hazard. Municipalities also could still regulate for noise, animal control, fire and building codes, nuisances and the like.

The crux of the resistance to statewide regulation seems to be that hotels, motels, resorts, inns, licensed bed and breakfasts and other clearly commercial operations are just flat-out opposed to what they view as additional competition unburdened by many of the fees and requirements of commercial hospitality. One of the compromises suggested is that the legislation be amended to require short-term-rental customers to pay the applicable hotel tax when they book, which companies like Airbnb already collect in many other communities.

Indeed, there are lessons for Nebraska in how these conflicts already have been resolved elsewhere around the country. New York City recently settled a lawsuit that challenged its statute setting fines of up to $7,500 for hosts who illegally list a property on a short-term-rental platform, with the platforms concerned that the vague language could leave them on the hook.

A more recent settlement with the City of San Francisco could set a pattern for future legislation, in that two of the major short-term-rental platforms agreed to a registration process with the city, allowing hosts to know the requirements and giving them confidence that they are operating legally. Processes in Denver and New Orleans similarly work to pass host registrations through to the local governments.

There can be reasonable regulations to protect neighborhoods and public safety that stop well short of prohibition. Lawmakers and regulators should craft targeted rules that allow opportunities for people with a room to spare to match with tourists who can take advantage of an overnight stay. Nebraska—a reservoir of both good sense and an innovation ethic—has the chance to be a great model for other states with a well-crafted new law.

Image by paulrommer


WannaCry underscores a need for cyber hygiene and insurance

“Oops, your important files are encrypted” read the pop-up message on hundreds of thousands of Windows operating systems across the world. The ransomware cyberattack, infamously labeled “WannaCry,” paralyzed computers by encrypting their data and holding it ransom pending payments from the afflicted.

In the days following, headlines bemoaned the arrival of the long-feared “ransomware meltdown,” while critics jumped to blame Microsoft for product insecurities and condemned the National Security Agency for stockpiling vulnerabilities. While it’s easy to assign blame and stoke fear, policymakers should, instead, use the attack as an opportunity to encourage better cybersecurity behavior and sensible risk management practices – including cyber insurance.

Cyber insurance was first touted during the dot-com boom of the early 2000s, but has only recently grown in popularity. Like other types of insurance, cyber insurance offers financial protection from sudden and unexpected losses.

For instance, in addition to coverage for WannaCry-like ransom attacks, many policies now encompass a wide range of possible costs businesses may face associated with a breach, including regulatory fines, legal costs, public relations services and costs associated with internet downtime. Because cyberattacks can result in all sorts of unexpectedly large expenses, coverage designed to insulate a business from the financial shock of a cyberattack is vital.

In the case of WannaCry, the total illicit haul of the ransom is projected to be less than $100,000. Yet downstream damages are expected to tally in the billions. In fact, one firm projects that up to $8 billion in global computer-downtime costs may accrue to services ranging from hospitals and government agencies to car companies.

The consequences of that damage may, for some, be ruinous. According to Symantec, ransomware attacks increased 36 percent from 2015 to 2016, while the average ransom demand rose 266 percent over the same period, to $1,077.

With the number of attacks on the rise, it is important to note that cyber insurance can both facilitate resilience and assist in maintaining system security. That’s because the underwriting process, during which the insurer assesses the risk it considers taking on, often requires a cyber risk assessment. Once a policy is written, specific policy terms often require adherence to basic security practices such as patching or regular network assessments. Companies that do not meet a threshold of cyber preparedness may not be eligible for coverage, may face higher premiums and could risk losing their coverage entirely. Put another way, cyber insurance coverage contributes to a culture of preparedness.

Cyber insurance take-up rates are growing, but the market is still evolving and penetration is uneven. According to a recent survey by Aon, only 33 percent of companies worldwide had cyber insurance coverage. Companies outside the United States are at a particular disadvantage when it comes to recovery, as they hold less than 10 percent of all cyber insurance policies.

This is particularly worrying because WannaCry revealed a geographic gap in cyber preparedness. Russia and China saw the largest incidence of infected computers, suggesting that lax patching practices and overreliance on pirated or outdated systems is more common abroad. Those companies without coverage today face the full brunt of the costs associated with the WannaCry attack.

Though the domestic cyber insurance picture is better, more should be done to encourage coverage. For instance, while the White House’s recent cybersecurity executive order reiterated that cybersecurity is a priority area for the Trump administration, it was silent on the role cyber insurance can play in incentivizing agencies and their contractors to internalize cyber preparedness. This is a missed opportunity. The government can use the power of the purse to promote cyber insurance adoption in the market as a whole by requiring federal contractors to acquire certain types of cyber risk coverage.

High-profile cyberattacks like WannaCry highlight the need for cyber preparedness and cyber insurance. A policy approach that emphasizes both—and cyber insurance in particular as a market solution to the global ransomware problem—will be a boon for companies and consumers alike.

UPDATE (May 30, 2017): This piece originally cited a statistic attributed to the National Cyber Security Alliance that the alliance says is outdated.  

Another bright idea from Mitch Daniels


Purdue University President Mitch Daniels was in Washington last week to receive the Order of the Rising Sun, Gold and Silver Star from the government of Japan at an embassy ceremony. The award is one of Japan’s highest, and was given for “significant contributions to the strengthening of economic relations and mutual understanding” between the two countries.

During his time as governor of Indiana, Daniels saw 70 new direct investments in the state from Japan, including a Honda assembly plant that was the biggest “greenfields” investment in the United States in 2006. Over the following six years, Japan brought more than $2.6 billion of new investment and 8,400 jobs to the Hoosier State, as the governor led five economic missions to the country.

Since Daniels came to Purdue in January 2013, nearly $5 million of Japanese corporate research has come to campus. Largely because of the groundwork he laid, Indiana ranks second this year among the 50 states for best economic outlook, as measured through 15 important state fiscal policy variables laid out in the 10th annual edition of the American Legislative Exchange Council’s “Rich States, Poor States” study.

He’s also accomplished a number of significant milestones at Purdue, including a six-year tuition freeze. There may not be another university in the country that plans to charge students less tuition in 2019 than was paid in 2012. The student loan default rate for Purdue graduates hovers around 1 percent. The Milken Institute ranked Purdue No. 1 for technology transfer among public universities without a medical school.

Now, the university is going to expand its offerings to millions of people online. Instead of committing to a multiyear project to build a significant online learning university, Purdue announced April 27 that it is creating a new public university (temporarily named “New U”) by acquiring most of the assets of Kaplan University, a competency-based online-learning business with 15 campuses in the United States, 32,000 nontraditional students and nearly 80 years of remote-learning experience.

Kaplan offered the nation’s first totally online law school and has created study courses to review vast amounts of material for various accreditation and professional certification exams. It is a global provider of education programs in more than 30 countries and has forged partnerships with many colleges, universities, school districts and more than 2,600 corporations. The educational networking possibilities are nearly limitless. A whole new chapter of efforts to produce more affordable post-secondary opportunities, particularly for working adults, is likely to be launched by this marriage of a top public research institution and an online juggernaut in competency-based education.

According to reports, the Purdue faculty is not yet prepared to give its blessing to New U, a reluctance that is an endemic feature of both disruptive initiatives and university faculties generally. Though quick to embrace every progressive policy fad, those in higher education whose sinecures are anchored in the traditional business model are less likely to give Purdue’s management an immediate pass. But the new model deserves more consideration as workplace needs drive demand for sophisticated technical knowledge and skills, and as it leans toward affordable learning for the benefit of students and the good of the country.


Sessions’ charging memo underscores need for Congress to pass reform


Attorney General Jeff Sessions’ memorandum instructing federal prosecutors to “charge and pursue the most serious, readily provable offense” against defendants signals a desire to return to a tough-on-crime stance. From the perspective of criminal justice reform, the most daunting aspect of these developments is a likely resurgent dependence on mandatory minimums.

As has been noted by John Malcolm, a criminal justice expert and director of the Heritage Foundation’s Edwin Meese III Center for Legal and Judicial Studies, reinstatement of stricter charging and sentencing policies is fully within the attorney general’s authority. We’ve seen it before from Attorney General Richard Thornburgh, who issued his own guidelines in 1989 requiring strict enforcement of all provable offenses. In the years since, there have been back-and-forth directives from Thornburgh’s successors Janet Reno, John Ashcroft, Eric Holder and, now, Sessions.

But in the interim, experts have gathered evidence against mandatory minimums, finding that heavy use of these sentencing laws failed to reduce drug use or recidivism. Mandatory minimum sentences are fixed prison terms applied to specific crimes, which can range from five years to life imprisonment. They strip judges of the ability to use their own professional discretion to determine sentencing based on the facts at hand.

The aim of mandatory minimums during the height of the 1980s crack epidemic was, of course, to target drug kingpins and cartel leaders in order to improve public safety. Instead, prison populations surged, driven primarily by an influx of low-level offenders. With prisons now bursting at the seams and calls mounting for construction of new prisons to house an ever-growing population, taxpayers have had to shoulder the costs.

The most notable portions of Sessions’ memo are where he instructs that “the most serious offenses are those that carry the most substantial guidelines sentence, including mandatory minimum sentences,” which marks a deviation from the “smart-on-crime” approach under Holder. Sessions’ memo would ensure the U.S. Justice Department “fully utilizes the tools Congress has given” the agency.

But to the extent that it is Congress that provides the DOJ tools to enforce federal laws, Congress itself needs to reassess those tools. Sen. Rand Paul, R-Ky., has taken that exact strategy. Alongside Sen. Patrick Leahy, D-Vt., Paul has introduced the Justice Safety Valve Act in the Senate, while Reps. Thomas Massie, R-Ky., and Bobby Scott, D-Va., have done the same in the House. The legislation authorizes federal judges to provide more fitting sentences outside of a mandatory-minimum requirement.

During a press call Wednesday, Paul noted that momentum for change is likely to build if more members introduce more criminal justice reform bills. While acknowledging that reform advocates face an “uphill battle,” he also indicated that he is “having conversations with people” within the Trump administration willing to listen.

The call to enforce harsher charging and sentencing methods is a serious concern, especially the goal to revive the one-size-fits-all use of mandatory minimum sentences. However, having Congress set the tone and dictate what tools are available to judges and to the DOJ is currently the most effective way to remedy this recent course of events. It’s checks and balances at its finest.

Image by Brad McPherson

OPEN Government Data Act moves to Senate floor after markup


Legislation requiring federal agencies to publish their data online in a searchable, nonproprietary, machine-readable format has been cleared for the Senate following a May 17 markup by the Senate Homeland Security and Governmental Affairs Committee.

Sponsored by Sen. Brian Schatz, D-Hawaii, S. 760, the Open Public Electronic and Necessary Government Data Act is identical to an earlier Schatz bill that passed the Senate unanimously last year after analysis by the Congressional Budget Office determined it wouldn’t cost taxpayers any money.

What it would do is modernize government agencies and increase their effectiveness, while also allowing taxpayers to see how their money is spent. For these reasons, R Street joined more than 80 organizations—including trade groups, businesses and other civil-society organizations—in urging the Senate committee to pass these badly needed reforms.

The status quo makes it difficult for engaged citizens to view the spending data of the agencies they fund. A taxpayer interested in viewing the companies and organizations that receive federal grants and contract awards would need to have a license for the proprietary Data Universal Numbering System (DUNS). Dun & Bradstreet Inc., the company that owns DUNS, functions as a monopoly with respect to government contractor data.

In a 2012 report, the Government Accountability Office claimed the costs of moving away from DUNS to a different system would be too great, but that was in a time of fewer alternatives. More recently, a study by 18F, the General Services Administration’s digital services team, showed that government agencies around the world are beginning to use a 20-character code called the Legal Entity Identifier (LEI). The LEI is free for organizations and companies to use, as it is managed by the Global LEI Foundation, a nonprofit organization based in Switzerland. It would require no expensive upgrades.
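One reason an open identifier is easy to adopt is that the LEI’s format and check digits are published in ISO 17442, so anyone can verify a code with a few lines of code. The sketch below is a minimal Python implementation of that published mod-97 check; the sample 18-character prefix in the usage note is made up for demonstration and is not a real registered entity.

```python
def lei_checksum_ok(lei: str) -> bool:
    """Validate the ISO 17442 check digits of a 20-character LEI."""
    if len(lei) != 20 or not lei.isalnum():
        return False
    # Letters map to 10..35 (A=10 ... Z=35); decimal digits stay as-is.
    digits = "".join(str(int(c, 36)) for c in lei.upper())
    return int(digits) % 97 == 1

def with_check_digits(prefix18: str) -> str:
    """Append the two check digits that make an 18-character prefix a valid LEI."""
    digits = "".join(str(int(c, 36)) for c in (prefix18.upper() + "00"))
    check = 98 - int(digits) % 97
    return prefix18.upper() + "{:02d}".format(check)
```

For example, `with_check_digits("RSTREET0000000TEST")` (a fabricated prefix) yields a 20-character code that `lei_checksum_ok` accepts, while changing any character breaks the checksum.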

Both the current and previous administrations have publicly supported transparency reforms for federal agencies. President Barack Obama introduced an Open Data Policy in 2013, and Matt Lira, a special assistant to President Donald Trump for innovation policy and initiatives, told an audience in April that financial transparency is still a priority for the White House.

Vested interests likely will still oppose the bill, which also has companion legislation, H.R. 1770, in the U.S. House. But given that it has support from both parties—an incredibly rare thing these days—as well as from the present and prior administrations, transparency advocates have room for optimism. The case for nonproprietary data standards and government transparency will now be in the hands of Congress.

Image by zimmytws

Big names weigh in on FCC’s net-neutrality rules


We seldom see a cadre of deceased Founding Fathers petition the Federal Communications Commission, but this past week was an exception. All the big hitters—from George Washington to Benjamin Franklin—filed comments in favor of a free internet. Abraham Lincoln also weighed in from beyond the grave, reprising his threat “to attack with the North” if the commission doesn’t free the internet.

These dead Sons of Liberty likely are pleased that the FCC’s proposed rules take steps to protect innovation and free the internet from excessive regulation. But it shouldn’t surprise us that politicians have strong opinions. What about some figures with a broader perspective?

Jesus weighed in with forceful, if sometimes incomprehensible, views that take both sides on the commission’s Notice of Proposed Rulemaking, which seeks comment on scaling back the FCC’s 2015 decision to subject internet service to the heavy hand of Title II of the Communications Act of 1934. Satan, on the other hand, was characteristically harsher, entreating the commissioners to “rot in Florida.”

Our magical friends across the pond also chimed in with some thoughts. Harry Potter, no doubt frustrated with the slow Wi-Fi at Hogwarts, seems strongly in favor of keeping Title II. His compatriot Hermione Granger, however, is more supportive of the current FCC’s efforts to move away from laws designed to regulate a now-defunct telephone monopoly, perhaps because she realizes the 2015 rules won’t do much to improve internet service. Dumbledore used his comments to give a favorable evaluation of both Title II and the casting of Jude Law to portray his younger self in an upcoming film.

A few superheroes also deigned to join the discourse. Wonder Woman, Batman and Superman joined a coalition letter which made up in brevity what it lacked in substance. The same can’t be said for the FCC’s notice itself, which contains dozens of pages of analysis and seeks comments on many substantive suggestions designed to reduce regulatory burdens on infrastructure investment and the next generation of real-time, internet-based services. Another, more diverse, coalition letter was joined by Morgan Freeman, Pepe the Frog, a “Mr. Dank Memes” and the Marvel villain (and Norse trickster god) Loki. It contained a transcript of Jerry Seinfeld’s Bee Movie.

Speaking of villains, Josef Stalin made known his preference that no rules be changed. But Adolf Hitler attacked Stalin’s position like it was 1941.

Then there are those with advanced degrees. Doctor Bigfoot and Doctor Who filed separate comments in support of net neutrality.

In a debate too often characterized by shrill and misleading rhetoric, it’s heartening to see the FCC’s comment process is engaging such lofty figures to substantively inform the policymaking process. I mean, it sure would be a shame if taxpayer money supporting the mandatory review of the 1,500,000+ comments in this proceeding was wasted on fake responses.

Image by Bonezboyz

Coppage talks urbanism on the Matt Lewis Show

R Street Visiting Senior Fellow Jonathan Coppage was a recent guest on the Matt Lewis Show, where he made the case for the Federal Housing Administration to re-legalize Main Street. Full audio is embedded below.

How Scott Gottlieb’s ‘boring’ approach could transform the FDA


Dr. Scott Gottlieb, confirmed earlier this week by the U.S. Senate to become the new commissioner of the Food and Drug Administration, has a pragmatic—some might even say boring—approach to public health that could revolutionize how FDA regulations can fight the consequences of addiction.

With his vision of the future of tobacco, Gottlieb takes all the fun out of the heated arguments that anti-tobacco and pro-vaping individuals engage in on a regular basis, offering instead a reasonable solution to the disease burden of cigarettes. In a 2013 Forbes essay, he stated:

Whatever one thinks of cigarette makers, if the industry was earnest about transitioning away from the manufacture of smoked cigarettes, and getting into the development of new products that would still satisfy peoples’ taste for nicotine (with hopefully much lower risks) there could be public health virtue. The overall incidence of smoking related disease could be sharply diminished.

He acknowledges the enormous power the FDA has in the future of public health, particularly as it relates to tobacco consumption. He even has the guts to imply that “big tobacco” could actually be an ally in solving a problem many think they created, by encouraging cigarette manufacturers to focus on safer products and the e-cigarette market.

He recognizes that e-cigarettes present a viable alternative to other smoking-cessation products and that they have the potential to contribute to a future without combustible cigarettes. During his confirmation hearings, Gottlieb stated that reduced-harm products should be available to help consumers transition off combustible cigarettes, and he has taken note of the burdensome regulations that currently scheduled FDA vaping rules will impose on small businesses that want to enter the e-cigarette market.

These comments suggest that he would be open to regulations that make it easier for safer products to enter the market, rather than the currently planned deeming regulations, which would require nearly all existing e-cigarette products to go through a pre-market tobacco application (PMTA) process that would cost approximately $300,000 for each combination of flavor, strength, mixture and device. In a harm-reduction model, this is important, because increased competition from small businesses in the e-cigarette market will increase innovation and production of even safer products, while decreasing the price point of products that are at least 95 percent safer than combustible cigarettes.
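Because the per-application cost multiplies across every product combination, the burden scales combinatorially. The back-of-the-envelope sketch below illustrates that scaling; the small shop’s product counts are hypothetical, and only the roughly $300,000 per-application figure comes from the text above.

```python
PMTA_COST = 300_000  # approximate cost per application, as cited above

def pmta_burden(flavors: int, strengths: int, mixtures: int, devices: int) -> int:
    """Total filing cost if every flavor/strength/mixture/device
    combination requires its own pre-market tobacco application."""
    return PMTA_COST * flavors * strengths * mixtures * devices

# A hypothetical small vape shop: 10 flavors, 4 nicotine strengths,
# 1 mixture, 2 devices -> 80 separate applications.
print(pmta_burden(10, 4, 1, 2))  # 24000000, i.e. $24 million in filings
```

Even a modest product line quickly reaches costs that only the largest manufacturers could absorb, which is the competitive concern the paragraph above raises.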

Furthermore, this harm-reduction approach also could be applied to the opioid epidemic, which Gottlieb has stated is the FDA’s top priority. Medication-assisted treatments—such as methadone and Suboxone—help nearly 40 percent of people with opioid-use disorders to abstain from heroin and other commonly abused opioids. Opioid antagonists—such as Narcan and Vivitrol—can be used to reverse overdoses and cut cravings. Pharmaceutical companies, both big and small, have an opportunity to improve upon medications that can be used to treat opioid addiction and its consequences. Gottlieb’s willingness to embrace a harm-reduction philosophy and his recognition that it is important to have a practical approach to expensive and time-consuming FDA regulations could further encourage small pharmaceutical companies to enter the pipeline of life-saving opioid addiction treatments.

During the confirmation process, Gottlieb received criticism for his ties to the pharmaceutical industry. But frankly, his recognition that the tobacco and pharmaceutical industries can help solve an addiction crisis that kills nearly half a million people a year is to be applauded. That level-headed vision is exactly what the FDA needs to reduce the economic and health burden of addiction in the United States.

In reshaping U.S. energy policy, Perry’s best model is his home state


The Federal Energy Regulatory Commission (FERC) held a high-stakes conference May 1 and 2 to address the contentious interplay of state policies and interstate wholesale-electricity markets. The week prior, Energy Secretary Rick Perry announced his department might intervene in state energy planning to protect baseload coal and nuclear generation.

Ironically, market experts speaking before FERC identified state interventions to bail out coal and nuclear as the most damaging form of intervention. Market experts noted that state subsidies and mandates for renewables also continue to stress market performance, but do not displace system “capacity” needs the same way that baseload subsidies do.

This is a fundamental debate over the role of government in competitive electricity markets, one that intersects with federalist themes in various ways. In the case of the Northeast and some Mid-Atlantic states, FERC seeks to uphold the competitive functionality of electricity markets, while some states have undertaken anti-competitive interventions to dictate outcomes that are better determined by markets. On the flip side, the prospect of federal intervention in state energy planning runs completely counter to conservative arguments against the Clean Power Plan, even when such intervention is (incorrectly) justified in the name of national security.

This paints a convoluted picture for federalists, but the pro-market case is clear: interventions at both the state and federal levels are unwarranted and destroy wealth.

At the FERC conference, state representatives reiterated their support for relying on markets because of the clear economic benefits (i.e., little to no sign of interest in “re-regulation”). Yet they also wanted to preserve the option to pick government-preferred investments, which runs counter to the very premise of liberalized electricity markets. Constructively, states expressed a willingness to engage in dialogue and to identify guiding policy principles, though they struggled to articulate what those principles were.

Much of the challenge that state representatives—often, commissioners of public utility commissions—had in articulating an energy vision consistent with market principles is that politicians back in their states often support industrial policy (i.e., government explicitly picking winners), creating an agenda that is difficult to reconcile with FERC’s obligation to uphold competition. The most common policy theme was to reduce emissions, yet the states largely rebuffed market-compatible approaches to doing so – namely, emissions pricing. A couple of states brought up the need for state actions to improve reliability, claiming (contrary to the evidence) that markets aren’t able to provide reliable service. They rehashed generic slogans, like the need for fuel diversity, which has no direct bearing on whether an electricity system is reliable.

Competitive electricity markets are complex and poorly understood by state and federal policymakers. Given the rapid transition of electricity technologies and fuels, coupled with the persistent political obsession to dictate what this mix “ought” to be, the scene is set for half-truths and false narratives to prevail. Whether it’s progressives pushing for more renewables, confused conservatives supporting interventions to preserve baseload or any other such combination, all these narratives fundamentally miss the point that the goal of smart policy is to encourage well-functioning markets.

Fortunately, the narrative that we should level the playing field and let technologies compete on their merits still holds some political weight. Some FERC reforms could move in that direction, such as enabling participation of energy storage and pricing of fast-start resources. However, FERC reforms to appease the industrial-policy ambitions of some states (or the U.S. Department of Energy) would fundamentally deviate from the core objectives of competitive electricity markets and could easily produce extensive unintended consequences. It’s not FERC’s job to validate state policy, but it must pass judgment on anti-competitive conduct. Conference participants offered ideas on how to define such conduct, and FERC would be wise to continue that dialogue.

There continues to be an immense need to educate state and federal policymakers on how electricity markets function and of the consequences of industrial policy, especially ad hoc subsidies. While the uptick in state interventions stirred controversy, it has also spurred productive dialogue in the Northeast and between the states and FERC. The conference demonstrated a clear need and willingness among states, stakeholders and current FERC commissioners to continue and deepen such a dialogue.

The concern of the Northeastern and some Mid-Atlantic states to reduce emissions is laudable. Pollution is a valid market failure that can be corrected, efficiently, by market-based policies. Such policies have excelled in competitive markets, where strong cost-reduction incentives have driven emissions reductions and innovations that lower the cost of emissions abatement. That’s an example of where state and federal interests align, as well-functioning markets that internalize all costs create the most benefit for society. But many policymakers do not understand these benefits clearly, and FERC and states should engage market experts in forums that help foster and disseminate this research. One example to highlight is Texas, which has seen reductions in costs and emissions without the controversy and distortions of industrial policy.

Encouragingly, Perry noted that the president asked him to reshape U.S. energy policy in the mold of Texas, where he spent 14 years as governor. Texas relies on competitive markets to signal power-plant investments and price-responsive demand. These markets do not explicitly value baseload, nor should they. Rather, they value reliable operations by providing revenues to resources that perform, especially when supply scarcity drives price spikes.

The Texas markets handsomely rewarded baseload coal and nuclear in the past, when they were highly competitive during periods of higher-priced natural gas. Now, inexpensive gas and cost declines for gas generation and renewables (the latter partially resulting from subsidies) have heavily cut into baseload margins, even driving some plants into retirement. Yet, according to the independent monitor of the Texas market, the reliability outlook remains strong, as more gas and wind generation come online (and, in the long term, solar). Meanwhile, consumer costs have fallen sharply. The monitor emphasizes that the new resource mix underscores the need for efficient price formation. This is the product of quality market design (e.g., “scarcity pricing” to address the market failure surrounding resource adequacy) and market discipline, as interventions can dramatically distort investment decisions and freeze capital markets.
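For readers unfamiliar with the mechanism, scarcity pricing ties real-time prices to the risk of losing load: as operating reserves shrink, a price adder approaching the value of lost load (VOLL) is layered onto the energy price, rewarding resources that perform when supply is tight. The sketch below is a stylized illustration of that idea, not ERCOT’s actual operating reserve demand curve; the reserve thresholds and the $9,000/MWh figure are illustrative assumptions.

```python
VOLL = 9_000.0  # $/MWh, illustrative value of lost load

def lolp(reserves_mw: float) -> float:
    """Stylized loss-of-load probability: rises as operating reserves shrink.
    A linear ramp stands in for the probabilistic curve a real market uses."""
    if reserves_mw >= 5_000:
        return 0.0
    if reserves_mw <= 1_000:
        return 1.0
    return (5_000 - reserves_mw) / 4_000

def scarcity_adder(reserves_mw: float, energy_price: float) -> float:
    """Price adder = LOLP x (VOLL - energy price), floored at zero."""
    return max(0.0, lolp(reserves_mw) * (VOLL - energy_price))
```

With ample reserves the adder is zero and prices reflect marginal cost; as reserves tighten, prices climb toward VOLL, which is the signal that pays performing resources without any explicit subsidy for “baseload.”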

Perry would serve America well by encouraging the Texas model. Over the past few years, Texas has bolstered its scarcity pricing, while Texas legislators and regulators have let the market work. The Northeast and Mid-Atlantic have not done so, causing the need for FERC’s conference to address the uptick in disruptive state interventions. Once FERC re-establishes a quorum, it will face the tasks of improving price formation and moderating the effects of state interventions. Competitive markets will drive cost reductions, innovation and emissions reductions, but only if state and federal policymakers keep interventions at bay. As Marlo Lewis Jr. of the Competitive Enterprise Institute recently remarked, “subsidizing uneconomic energy to the detriment of consumers and taxpayers is no way to drain the swamp.”

Image by Crush Rush

Ridesharing a victim of Alaska’s budget battle


As Alaska’s legislative session ends, some perplexing in-house political gamesmanship has kept a popular bipartisan measure from making it to the House floor. It’s a shame, because freedom-loving Alaska now remains one of just a handful of states that still doesn’t allow ridesharing services such as Uber and Lyft to operate within its boundaries.

Senate Bill 14 has passed the full Senate and needs only a vote of the House to move it along to the governor’s office. But according to news reports, the House has bottled up the legislation in the Rules Committee, which rarely meets. The measure reportedly is being used as leverage by legislators fighting a contentious budget battle.

The Alaska Journal of Commerce reports that, instead of letting S.B. 14 get to the House floor and then presumably to the governor for his signature, “the House is effectively starting the legislative process anew by advancing its own version of Uber legislation.”

That’s bad for Alaskans, given the obvious benefits of allowing these services to operate. The bill wisely clarifies that these drivers are independent contractors, thus restricting various efforts to mandate the payment of myriad employee benefits and thereby keeping this a cost-effective option. It also would prevent Alaska cities from imposing their own onerous local restrictions on these services. The bill requires background checks for transportation network company (TNC) drivers, but it’s still overall a good step forward that’s backed by ridesharing companies.

Although Alaska political observers expect that a ridesharing bill will eventually get to the floor this year or next, it’s a mistake to delay the ability of these companies to offer not only jobs, but rides to people who need a convenient – or safe – way to get home.

The Economist recently reported on a new study suggesting that “the arrival of Uber to New York City may have helped reduce alcohol-related traffic accidents by 25-35 percent.” According to the U.S. Centers for Disease Control and Prevention, Alaska has one of the nation’s highest rates of excessive drinking. Uber and Lyft aren’t a panacea for such a significant social problem, but they could make the streets of Anchorage, Juneau and Fairbanks a bit safer.

With a week to go before the session’s end and a budget crisis looming, legislators might have other things on their mind. But the budget problem will eventually get fixed. Alaska residents who want to use ridesharing services shouldn’t be held hostage to that process.

Image by Joseph Sohm

California looks to finally end the Cold War


The Cold War ended decades ago, but vestiges of the conflict still surround us. In the California Legislature, Assemblyman Rob Bonta, D-Oakland, has introduced Assembly Bill 22, which seeks to bring one chapter of that history to a close.

A.B. 22 would repeal a nearly 80-year-old prohibition that bars members of the Communist Party, or individuals who otherwise advocate communist ideals, from employment by the State of California. In its place, the bill would impose an ideologically neutral prohibition on employing anyone who actively seeks the forceful or violent overthrow of the government of the United States.

There can be no doubt that communism was a blight on the 20th century. In its name and under its red banners, hundreds of millions of people were killed. And it is well-known that the chief geopolitical rival of the United States through the second half of that century—the Soviet Union—was a power animated by communist ideology.

It is therefore no wonder that, in an effort to ensure the state’s government institutions were not subverted by those who would like to see the Soviet Union best the United States, legislators placed in statute a prohibition on employment for anyone with ongoing ties to, or outspoken sympathy for, the Communist Party. The party, like the Soviet Union, was understandably viewed with extreme prejudice by lawmakers who felt threatened by those who sought to topple market-oriented liberal democratic institutions. In fact, the text of the existing California law goes into great detail about the consequences of communism and spells out unambiguously the threat posed by a Communist fifth column.

But while it remains helpful to examine the history of communism to better understand dictatorial barbarism and anti-democratic preferences, the time has come to correct the mistakes that legislators of decades past made when they needlessly trampled their own values by targeting people’s beliefs, rather than their actions. A.B. 22 does that.

To be clear, there aren’t legions of communists waiting to enmesh themselves in California’s bureaucracy, so it’s a bit strange that a lawmaker would feel so strongly as to want to carry the legislation. But Republican opposition to Bonta’s bill is no more explicable.

Bonta’s bill doesn’t diminish our recognition of the repugnant nature of communist ideology. That ideology was, and remains, an affront to individual liberty and dignity. But it is the liberal aspirations of the United States—which preclude discriminating on the basis of one’s political beliefs—that set the country apart from the Soviet Union in the first place.

Image by StrelaStudio

R Street hosts Justice for Work coalition panel

The R Street Institute hosted an April 17 launch party for the Justice for Work Coalition. Justice for Work is a coalition of organizations spanning the ideological spectrum that seeks to raise awareness and advocate for lowering the barriers created by laws and regulations that unnecessarily restrict economic participation.

The event included a panel discussion featuring former law enforcement officers, an ex-offender, and policy and legal experts.

The panelists were:
Arthur Rizer, R Street Institute
Ed Chung, Center for American Progress (moderator)
Teresa Hodge, Mission: Launch
Marcus Bullock, Flikshop
Alvaro Bedoya, Georgetown University Law Center

Full audio of the panel is embedded below.

CRS should stop fighting access to its own reports


The Congressional Research Service plays an essential role in policymaking and oversight. It makes Congress smarter about issues and teaches new legislators how to legislate. I would not have spent 11 years working at CRS if I did not think very highly of the institution.

But there is one topic on which the widely esteemed and nonpartisan agency has been embarrassingly biased: the proposals to make its reports more equitably available to the public.

As a practical matter, CRS reports are available – 27,000 copies can be found on government and private-sector websites. EveryCRSReport.com, for example, has more than 8,000 reports. But official congressional policy does not provide for consistent public release of the reports, which explain the workings of Congress, agencies and myriad public policies.

Legislation has been introduced in this Congress and the last, and a number of times previously, to fix this situation. The bill from Reps. Mike Quigley, D-Ill., and Leonard Lance, R-N.J., would have the Government Publishing Office post the reports on GovInfo.gov. This solution would give citizens a central repository where they could read authenticated copies of the reports, and would relieve CRS and congressional staff of the hassle of responding to reporters, lobbyists and constituents who ask for copies.

Invariably, CRS proclaims that it takes no position on the issue and will do whatever Congress directs. But how are we to square that claim with this 2015 memorandum that CRS’ leadership shopped to legislators? The memorandum is modestly titled: “Considerations arising from the dissemination of CRS products.” The content, however, is nothing but scare-mongering speculation about bad things that might happen if more Americans had access to CRS reports. Proponents of expanded access to CRS reports quickly demolished the claims made in CRS’ “considerations” memo.

As someone who once reviewed CRS reports before they were published, I can tell you that, had a CRS analyst written this memo, it never would have seen the light of day. And said analyst would have been rebuked by his or her supervisor. The memorandum not only misconstrues what is being proposed—nobody is advocating that CRS itself distribute the reports—but it also makes no mention of the many possible benefits of a change in policy (like increased public understanding of how Congress and government operate).

That means the memo violates CRS’ own very clear policies that its work for Congress must be accurate and unbiased, and must consider the possible benefits and costs of any proposed policy. (This internal CRS rule not only is intellectually honest, it also, ahem, protects the agency from having its work give the appearance of bias.)

One hopes that someone in Congress will call CRS leadership on the carpet for this tartuffery and demand that the agency disavow the memorandum. At a time when federal budget cuts are being seriously discussed, the agency does itself, its employees and Congress no favors by being the lone voice advocating against common-sense reform.

Image by Micolas

Fierce debates dominate D.C.’s first E-Cigarette Summit


If you imagined that an e-cigarette conference full of policymakers at a Marriott in Washington would be a tame event, you would be wrong. I suppose I shouldn’t be surprised that e-cigarettes could be a polarizing topic, but I will not soon forget the cheers and boos from the crowd as people stood up to state their opinions and present their research at the first E-Cigarette Summit here in Washington.

A running theme of this conference came down to the existential question: are you a skeptic or are you an enthusiast? Are e-cigarettes addictive products designed to hook teenagers or should they be marketed to current smokers as a quitting tool?

It’s important to understand that e-cigarettes are much safer than combustible cigarettes. Every panelist—including professors, physicians, economists and industry folks—agreed with reports that e-cigarettes are at least 95 percent safer than traditional cigarettes. What is not so easily understood is how best to use e-cigarettes to promote a healthier society.

We’ve seen debates like this before. Will needle-exchange sites keep injection users free from infectious disease or will they tacitly encourage people to try heroin? Does condom distribution in high schools prevent teen pregnancy or lead to a breakdown of morals? There are valid points in support of either argument, but whichever way we as a society land will have long-lasting effects.

The truth is there are a lot of specific questions that need to be answered before people will feel comfortable with novel devices. When it comes to e-cigarettes, there needs to be a balance between consumer protection, trust and the application of science, so that sound policy can best direct public health goals. Some of the discussions at this forum centered on questions for which we don’t yet have definitive answers:

  • Does a standalone nicotine product at the concentrations found in e-cigarettes (with the absence of other chemicals that are present in tobacco) produce changes in the brain consistent with addiction?
  • What environmental or product factors are predictors of successful transition from combustibles to e-cigarettes?

As we move forward in our research and advocacy endeavors, the answers to these questions will help shape both tobacco and e-cigarette policy and will form a foundation for U.S. harm-reduction policy.

Some of the more contentious issues created even more forceful debates. While e-cigarettes are effective smoking-cessation tools, physicians are reluctant to recommend them over medications, gum or the patch. Although teen smoking rates are at historic lows, the rise in experimentation with e-cigarettes is concerning (though it is noteworthy that daily use of e-cigarettes among teens is 2 percent). While it is unethical to perpetuate the myth that e-cigarettes are nearly as harmful as traditional cigarettes, some suggest there might be an ethical dilemma in marketing e-cigarettes to recreational users.

It is fair to say that more information is better to avoid hooking a new generation on cigarettes, but it is more important to use the tools we have now to encourage smokers to switch to safer products. We cannot forget that, just today, more than 1,300 people will die from smoking in the United States alone. Getting people to stop smoking combustible cigarettes should be our No. 1 priority and there is now a product to make that happen.

Image by LezinAV

Fixing California’s bloated sex-offender registry


R Street just signed a letter calling for commonsense reform of the California sex-offender registry, based on a bill proposed by our friend and Legislative Advisory Board member Sen. Joel Anderson, R-Alpine.

The bill we’re supporting in California, backed strongly by our own research, creates a tiered system for adult sex offenders. This is a step in the right direction to reform California’s overgrown sex-offender registry. A registry that includes too many people is likely even worse than one that includes too few: it diverts resources to monitoring low-risk people, resources that should instead be devoted to the relative handful of truly dangerous offenders. The best available research on sex-offender registries, which I summarized in this article for National Affairs, indicates that risk-based approaches like the one contemplated in the bill are good public policy.

While taking this first step is important, it doesn’t solve what is likely the single biggest problem with sex-offender registries: their inclusion of offenders who were adjudicated as juveniles. As I’ve written about here with my friends Nicole Pittman and Stacie Rumenap, it’s unjust, cruel and undermines the purpose of the juvenile justice system—which, at least in theory, is supposed to act in offenders’ own best interests. Youth registration, as R Street research has shown, costs millions of dollars more than it could possibly save. It’s the single greatest inefficiency in our sex-offender registration system.

The California bill is a good start, but it’s only a start. If the Golden State really wants to fix its registry, it’s going to have to end the registration of children.

Image by Jeffrey B. Banke

Congressmen reintroduce bill to make CRS reports public


The Government Publishing Office would be required to make Congressional Research Service reports publicly accessible over the internet, under legislation reintroduced last week by Reps. Leonard Lance, R-N.J., and Mike Quigley, D-Ill.

The CRS, a division of the Library of Congress, is known as Congress’ in-house “think tank.” House offices and committees historically have been free to publish CRS reports on their own websites for constituents to view, and some third parties aggregate CRS data on websites like everyCRSreport.com.

But while taxpayers spend more than $100 million annually to fund CRS, timely access to these important documents usually is reserved for Washington insiders. There exists no official, aggregated source for taxpayers to access the CRS’ valuable and informative work.

R Street Vice President for Policy Kevin Kosar, himself a veteran CRS analyst, testified recently before the House Legislative Branch Appropriations Subcommittee, where he presented the panel with a letter signed by 25 former CRS employees with more than 570 combined years of service who all support an open public database of nonconfidential CRS reports.

There is strong precedent for public access to legislative support agency documents. In his subcommittee testimony, Kevin noted the Government Accountability Office, Congressional Budget Office and the Law Library of Congress all make their reports public, as do the 85 percent of G-20 countries whose parliaments have subject-matter experts.

Proposals like the Lance-Quigley bill would place publishing responsibilities with another entity, to ameliorate CRS concerns about the service having to publish the reports itself. Briefings and confidential memoranda would not be disclosed and data issued to the public through a searchable, aggregated database would only include nonconfidential information.

As Kevin noted in his testimony, the public deserves to be on equal footing with lobbyists and the Hill.


Applying a BAT to reinsurance would be a big swing and a miss by Congress


As we all saw in recent media coverage of President Donald Trump’s 100th day in office, many observers treat the first 100 days of a new presidential administration as if it were the only time that matters, a legacy that has been with us since President Franklin Roosevelt passed most of his New Deal agenda in the first three months of his administration in 1933.

But in some ways, the first 100 days of any new Congress or presidential administration actually are more like baseball’s spring training. They offer lawmakers the chance to warm up, get their teams set and plot out a game plan for the coming year. For baseball, the end of spring training is marked by the start of competitive play. As of last week, Washington’s spring training is over and it is time to play ball.

The president, congressional leaders and Washington’s many think tanks all have their versions of what comprehensive tax reform should look like, and frankly, everyone is all over the field. One of the biggest issues under debate is a plank from the House Republicans’ plan called the border-adjustment tax, or “BAT.” If Washington isn’t careful, this plan could turn into one giant swing and a miss, particularly when it comes to the reinsurance market.

For a quick trip around the bases: under the BAT, companies no longer would be able to deduct the costs of imported goods and services. Meanwhile, any company that exports or profits from foreign sales would enjoy that income tax-free. The debates over whether this will be a good thing for the U.S. economy tend to focus on a few select points. However, if the subjects of insurance and reinsurance are left on the bench, we are going to find ourselves wishing for a rainout.

Right now, it’s unknown whether House Republicans still intend to go forward with their plans for a BAT, much less whether it would apply to financial services like reinsurance – something only one (China) of the 160 countries that employ the conceptually similar value-added tax does. If Congress chooses to follow in China’s footsteps, we have a problem.

In order to take on the risks of events like Texas hailstorms, Missouri tornadoes, Florida hurricanes and California earthquakes, property insurance companies cede portions of those risks to the global reinsurance market, where they are pooled with risks like earthquakes in Japan, floods in the United Kingdom or terrorist events in France. By pooling portions of these uncorrelated risks from around the globe, the reinsurance market makes it possible for Americans to buy affordable insurance for their homes, vehicles and businesses.

If Congress decides to pass a BAT system that would apply to reinsurance, the cost to American consumers would be painful. A recently released study by the Brattle Group looked at the effects of a BAT on the reinsurance market and found U.S. consumers would have to pay between $8.4 billion and $37.4 billion more each year just to get the same coverage. Several of my colleagues recently have conducted more targeted research finding that, over the next decade, the tax would add $3.39 billion to the cost of property insurance in Texas and $1.11 billion in Louisiana.

Applying a border-adjustment tax to reinsurance would be a pitch in the dirt for American consumers and Congress shouldn’t swing. Insurance companies will be put in the unwinnable position of having to raise their prices and offer less coverage. The end result is higher costs, with more risk concentrated on American shores. That’s a bad call for everyone.

Image by smspsy

Three years in, what does the DATA Act tell us about agency spending?


Trying to figure out exactly how much money the federal government spends long has been an exercise in futility for those few brave souls who endeavor to try it. Though the U.S. Treasury has published financial data since the beginning of the republic, the government has an uneven history, to say the least, when it comes to reporting agency expenditures.

Agencies traditionally have employed a hodgepodge of data and spending models that fail to adhere to a common metric. This makes it difficult for lawmakers and policy experts to wrap their arms fully around federal agency spending. Since at least the 1970s, efforts have been afoot to standardize government data, culminating in 2014’s Digital Accountability and Transparency Act, also known as the DATA Act.

The bill’s purpose was to make expenditures both more transparent and more accessible. It requires Treasury to establish common reporting standards across all federal agencies, with the data posted online in a publicly accessible format.

The DATA Act has been in the news again recently because the first agency reporting deadline is May 9, the third anniversary of the law’s passage. Right on cue, the DATA Coalition hosted a panel discussion and “hackathon” last week to let teams of data wonks work with some of the early datasets the agencies have provided.

Keynote speaker Rep. Jim Jordan, R-Ohio, emphasized the potential for uniform spending data to shape policy by helping lawmakers better understand the scope and size of government. That, in turn, could allow them to enact more meaningful reforms. As he put it: “If you don’t know where you are, it’s impossible to know where you’re going.”

The coalition also hosted a panel featuring three individuals who have been key to creating the uniform financial data standards the agencies now must use: Christina Ho, deputy assistant Treasury secretary for accounting policy and financial transparency; Dave Zvenyach, executive director of General Services Administration’s 18F project; and Kristen Honey, senior policy adviser for the Office of Management and Budget’s chief information officer.

The panelists generally were optimistic about the implementation process, though each noted the difficulty involved in pursuing new endeavors within a convoluted bureaucracy like the federal government. Honey was sanguine about the potential for agencies to follow the lead of private industries that use open datasets for productive ends, noting that American taxpayers have “already paid for this data, so they should have access to it.”

She pointed to the example of a synthetic dataset the Department of Veterans Affairs published last fall that will help researchers study mental health issues among military veterans. Honey also predicted that state and local governments were likely to follow suit on open data initiatives, which she hoped would help expose and weed out inefficiencies in government spending and operations across all levels of government.

The panelists also cautioned that many agencies likely will encounter difficulties aggregating and successfully publishing their spending data by the May 9 deadline. The concern was that if reports from the Government Accountability Office and agency inspectors general catalog widespread deficiencies around the first reporting deadline, it could lead the public and lawmakers to doubt the DATA Act’s efficacy.

James Madison famously claimed that the power of the purse was “the most complete and effectual weapon” that could be wielded by government. Increasing the standardization and transparency of government spending data will only help strengthen that power.

Image by zimmytws

Eli Lehrer at the New American Jobs Summit

R Street President Eli Lehrer was featured on a recent panel at the New American Jobs Summit, joined by Micaela Fernandez Allen of Wal-Mart, Tom Kamber from Older Adults Technology Services and Bill Kamela of Microsoft Corp., to discuss how technology and shifting economic needs are changing how workers prepare to join or rejoin the workforce. Video of the full panel is embedded below.

What’s wrong with e-cigarettes?

R Street Policy Analyst Caroline Kitchens recorded this recent video for PragerU on e-cigarettes, a safer alternative to traditional tobacco cigarettes that could help millions of smokers to quit.

Let’s get rid of Puerto Rico’s triple-tax exemption


Let’s ask a simple and necessary question: Why in the world is the interest on Puerto Rican bonds triple-tax exempt all over the United States, when no U.S. state or municipality gets such favored treatment?

The municipal bond market got used to that disparity, but in fact, it makes no sense. It is an obvious market distortion, on top of being unfair to all the other municipal borrowers. It helped lure investors and savers, and mutual funds as intermediaries, into supporting years of overexpansion of Puerto Rican government debt, ultimately with disastrous results. It is yet another example of a failed government notion to push credit in some politically favored direction. Investors profited from their special exemption from state and local income taxes on interest paid by Puerto Rico; now, in exchange, they will have massive losses on their principal. Just how big the losses will be is still uncertain, but they are certainly big.

Where did that triple-tax exemption come from? In fact, from Congress in 1917. The triple-tax exemption is celebrating its 100th anniversary this year with the entry of the government of Puerto Rico into effective bankruptcy. Said the 1917 Jones-Shafroth Act:

All bonds issued by the government of Porto Rico, or by its authority, shall be exempt from taxation by the Government of the United States, or by the government of Porto Rico or of any political or municipal subdivision thereof, or by any State, or by any county, municipality, or other municipal subdivision of any State or Territory of the United States, or by the District of Columbia.

That’s clear enough. But why? Said U.S. Sen. James K. Vardaman, D-Miss., at the time: “Those people are underdeveloped, and it is for the purpose of enabling them to develop their country to make the securities attractive by extending that exemption.” All right, but 100 years of a special favor to encourage development is enough, especially when the result was instead to encourage massive overborrowing and insolvency.

It’s time to end Puerto Rico’s triple-tax exemption for any newly issued bonds (as there will be again someday). As we observe the unhappy 100th birthday of this financial distortion, it’s time to give it a definitive farewell.

Image by Filipe Frazao

Lehmann talks NFIP reform on NPR’s Marketplace

In the wake of devastating floods in Missouri, R Street Editor-in-Chief and Senior Fellow R.J. Lehmann was a guest on National Public Radio’s “Marketplace” to discuss why reforms to the National Flood Insurance Program that encourage more private market participation and risk-based rates are essential. The audio is embedded below.

Kevin Kosar on Fox 5 DC ‘On The Hill’

R Street Institute Vice President of Policy Kevin Kosar appeared on Fox 5 DC’s “On The Hill” to discuss President Donald Trump’s first 100 days in office.

Cameron Smith talks Alabama’s backdoor booze tax

R Street’s Cameron Smith joined the Matt & Aunie Show on Birmingham’s Talk 99.5 to discuss backdoor booze taxes in Alabama. Audio of the show is embedded below.

Kosar testifies to House Legislative Branch Appropriations Subcommittee on CRS reports

On May 3, 2017, R Street Vice President of Policy Kevin Kosar testified before the House Legislative Branch Appropriations Subcommittee in support of making Congressional Research Service reports available to the public.

More from Kevin Kosar on why CRS reports should be publicly available can be found here.

Greenhut on ‘damning’ UC audit

R Street Western Region Director Steven Greenhut was a recent guest on the John and Ken Show on KFI AM 640 in Los Angeles to discuss his piece in the Orange County Register about the recent unfavorable audit of the University of California system. Audio of the show is embedded below.

Puerto Rico’s inevitable debt restructuring arrives


“Debt that cannot be repaid will not be repaid” is Pollock’s Law of Finance. It applies in spades to the debt of the government of Puerto Rico, which is dead broke.

Puerto Rico is the biggest municipal market insolvency and, now, court-supervised debt restructuring in history. Its bond debt, in a complex mix of multiple governmental issuers, totals $74 billion. On top of this, there are $48 billion in unfunded public-pension liabilities, for a grand total of $122 billion. This is more than six times the $18.5 billion with which the City of Detroit, the former municipal insolvency record holder, entered bankruptcy.

The Commonwealth of Puerto Rico will not enter technical bankruptcy under the general bankruptcy code, which does not apply to Puerto Rico. But today, sponsored by the congressionally created Financial Oversight and Management Board of Puerto Rico, it petitioned the federal court to enter a similar debtor protection and debt-settlement proceeding. This framework was especially designed by Congress for Puerto Rico under Title III of the Puerto Rico Oversight, Management, and Economic Stability Act (PROMESA) of 2016. It was modeled largely on Chapter 9 municipal bankruptcy and will operate in similar fashion.

This moment was inevitable, and Congress was right to provide for it. It is a necessary part of the recovery of Puerto Rico from its hopeless financial situation, fiscal crisis and economic malaise. But it will make neither the creditors, nor the debtor government, nor the citizens of Puerto Rico happy, for all have now reached the hard part of an insolvency: sharing out the losses. Who gets which losses and how much the various interested parties lose is what the forthcoming proceeding is all about.

The proceedings will be contentious, as is natural when people are losing money or payments or public services, and the Oversight Board will get criticized from all sides. But it is responsibly carrying out its duty in a situation that is difficult, to say the least.

There are three major problems to resolve to end the Puerto Rican financial and economic crisis:

  • First, reorganization of the government of Puerto Rico’s massive debt: this began today and will take some time. In Detroit, the bankruptcy lasted about a year and a half.
  • Second, major reforms of the Puerto Rican government’s fiscal and financial management, systems and controls. Overseeing the development and implementation of these is a key responsibility of the Oversight Board.
  • Third—and by far the most difficult step and the most subject to uncertainty—is that Puerto Rico needs to move from a failed dependency economy to a successful market economy. Economic progress from internally generated enterprise, employment and growth is the necessary long-term requirement. Here there are a lot of historical and political obstacles to be overcome. Not least, as some of us think, is that Puerto Rico is trapped in the dollar zone so it cannot have external adjustment by devaluing its currency.

The first and second problems can be settled in a relatively short time; the big long-term challenge, needing the most thought, is the third problem.

The story of the Puerto Rican financial and economic crisis just entered a new chapter, but it is a long way from over.

Image by bobby20

Rep. Ken Buck on the Federal Budget Accountability Act


The Federal Budget Accountability Act—introduced last month by U.S. Rep. Ken Buck, R-Colo., as H.R. 1999—is a short bill, barely two pages long. But it aims to help Congress answer a basic oversight question: how much revenue does the federal government actually receive each year from offsets?

As part of the congressional budget process, Congress gathers estimates of revenues to be received by the federal government, which can be used to “offset” authorizations for spending. For example, as a Buck press release points out, Congress authorizes the Strategic Petroleum Reserve to sell oil. “However, the price of crude oil continuously fluctuates … [which] creates uncertainty regarding the accuracy of Congressional Budget Office projections versus actual revenue received through offsets.”

I had the chance to speak about the bill with Buck, who came upon the issue soon after he arrived in the House in January 2015. “There was not a moment when a lightbulb went off. It was a series of statements about how new spending was ‘paid for,'” he said.

On its face, Buck’s bill may seem utterly unobjectionable. It requires nothing more than that the Office of Management and Budget annually report to Congress on the actual revenues received from offsets. Obviously, it is a basic fiduciary duty to discern whether the revenues received actually cover the costs as intended. A few members of the House Budget Committee are cosponsoring the legislation.

But will H.R. 1999 advance? It’s not clear. Buck suspects that additional spending is being passed off as budget neutral by the misuse of overly optimistic offsets. (On offsets and spending amendments in the House, see this CRS report.) “If they pass the bill, the misrepresentations will be known,” he told me. Enacting the legislation could collectively call out Congress and make the already tough debates over mandatory spending more difficult. “Nobody wants to know what the answer is,” Buck reports, “but we all know. … We just don’t know how bad it is.”

Image by lkeskinen

Dodd-Frank reform must include repealing the Durbin amendment


Many of us know what a “seven-year itch” is. Between the famous Marilyn Monroe movie of the 1950s and the legendary Rosanne Cash song of the 1980s, it is a fairly well-understood turn of phrase.

Congress finally got around this past week to scratching one of the most economically painful and fairly literal “seven-year itches” by starting the process to roll back the Dodd-Frank Act, which will turn seven this July.

The Financial CHOICE (Creating Hope and Opportunity for Investors, Consumers and Entrepreneurs) Act—currently before the House Financial Services Committee—has many bright ideas and could serve as a great replacement for the burdensome Dodd-Frank bill of the Obama years. However, in the midst of this happy occasion, the American consumer needs to pay close attention, because Congress may in the end do something stupid.

A behind-the-scenes effort is underway to let a Dodd-Frank provision commonly referred to as the “Durbin amendment” remain in the law. If you have a checking account, you should not let Congress keep this law on the books. Chairman Jeb Hensarling, R-Texas, took a strong stand in calling for repeal of the Durbin amendment as part of the CHOICE Act, and the committee should follow his lead by keeping that repeal in the final mark-up.

The Durbin amendment affects literally anyone with a checking account and a debit card. It requires the Federal Reserve to impose artificial government price controls to cap what banks charge to retailers for what are referred to as “interchange fees,” which banks use to pay for the security they provide for customers’ accounts. The cap is set far lower than it would be in a free market, creating a host of unintended consequences.

Before the government interference, banks and credit unions would use these fees to cover more than just security. They would use the revenues to offer perks to their customers, like free checking or point-rewards systems similar to what we see with traditional credit cards. Studies have shown these perks are worth millions in value to customers. But thanks to the Durbin amendment, banks have been forced to scale back their perks dramatically. The end result has hurt consumers, particularly those—like lower-income families or younger customers—who rely heavily on their checking accounts to conduct financial transactions.

While checking-account customers lost out, retailers (especially big-box retailers) made out like bandits. In 2010, the major retailers’ lobby sold Congress on limiting these transaction fees, promising they would pass along the savings to their customers. As of today, there is no evidence that has ever happened. In fact, an analysis of Federal Reserve data shows retailers have made off with more than $42 billion in foregone interchange fees over the last seven years. Shoppers have seen virtually no decrease in prices, even as they watched many of their banking benefits disappear.

As the Financial Services Committee wraps up its hearings on the CHOICE Act, it’s important for the American people not to sit by idly. The Durbin amendment was sold in 2010 as protection for the American people, but the data prove the only protection it offers is to the major retailers’ profit margins. The House Financial Services Committee should strive to repeal the Durbin amendment, as should the full House when it hits the floor.

Image by alice-photo

Congress’ ‘cotton fix’ just another corporate welfare handout


Spring is a special time in Washington, filled with many wonderful traditions. Between the blooming of the cherry blossoms, the White House Easter Egg Roll and the Washington Nats’ Opening Day, the nation’s capital is full of action.

However, none of these events compare to Congress’ favorite perennial tradition: trying not to shut down the government. After a two-week spring break, Congress is back, ready to work and horse-trading for votes to prevent a government shutdown. One of the items for “trade” currently being kicked around is a massive expansion of two corporate welfare programs. In what’s being called the “cotton fix,” Congress is poised to expand the U.S. Department of Agriculture’s Agriculture Risk Coverage and Price Loss Coverage programs to include cotton as a covered crop.

The ARC and PLC programs already are hardly the gold standard of fiscal responsibility. When Congress created the programs in the 2014 farm bill, the projected costs were $18 billion over five years. They now are projected to cost $32 billion over that same time frame. If Congress is successful in adding cotton into the mix, the projected costs easily could be topped up by an additional $1 billion a year.

This might be understandable if there were some crisis in the domestic cotton industry that needed to be averted, but Big Cotton already has a pretty cozy deal with Washington. Between subsidized marketing loans, trade promotion programs and economic assistance to cotton mills, the industry is well taken care of by American taxpayers.

And that’s not all the federal government does for them. Unlike many other crops, cotton growers can participate in the Federal Crop Insurance Program and get to ask taxpayers to cover 62 percent of their premiums. Furthermore, during negotiations that produced the last farm bill in 2014, the cotton lobby was able to convince Congress to create a special program just for them called the Stacked Income Protection Plan (STAX). This cotton-only program has taxpayers covering 80 percent of the cost for policies that protect against “shallow losses” too minor to be covered under traditional crop insurance.

The cotton industry’s costs to American taxpayers don’t end there. The federal government is in the process of paying out $300 million to the Brazilian cotton industry as part of a 2014 settlement agreement with the World Trade Organization. The settlement was a way to resolve a longstanding trade dispute with Brazil over U.S. domestic cotton subsidies that violated WTO rules. The $300 million payment comes on top of about $500 million the United States paid Brazil from 2010 to 2013 over the same set of issues.

The STAX program was created in hopes that it would stave off future disputes with Brazil, but whether STAX meets WTO rules is itself still an open question among experts. What is certain is that adding cotton to the ARC and PLC programs would only raise the odds of more trade disputes that ultimately cost Americans more money.

Let’s be clear: cotton is still king in Texas and some other parts of the country, and Congress knows it. Adding cotton to ARC and PLC isn’t a noble gesture to a struggling industry. It’s more about making sure multimillion-dollar companies maintain their profit levels at U.S. taxpayers’ expense.

Congress made a deliberate decision to exclude cotton from these two programs when they were created in 2014. For Congress to sneak more cotton in the back door of a must-pass bill would amount to yet another corporate welfare payoff, with taxpayers once again left holding the bag.

Image by Kent Weakley

Statewide ridesharing rules on the table in Louisiana


Louisiana may soon join the more than 40 states that have adopted some kind of statewide ridesharing rules, under legislation that would pre-empt parish and local governments from setting regulations and taxes on transportation network companies.

Sponsored by state House Transportation Committee Chairman Kenny Havard, R-St. Francisville, H.B. 527 would require TNCs to register with the Louisiana Department of Transportation and Development and to charge a “local assessment fee” equal to 1 percent of each “gross trip fare.” The 1 percent fee would be sent to the local governments where rides originated, and part also would be collected by the state to administer the permitting process.

TNCs would be required, through their apps, to display the driver’s picture and license plate before the passenger enters the vehicle. The TNC would also be required to transmit an electronic receipt of the trip.

The legislation also imposes minimum requirements for drivers. The state would bar from working as TNC drivers all sex offenders, those convicted of a felony within the past seven years and those convicted of an offense involving drugs or alcohol. The legislation also requires TNCs to adhere to all state anti-discrimination laws and laws providing for the transport of service animals. The bill bans drivers from using drugs or alcohol while on duty and requires TNCs to post that policy on their websites and provide a means for reporting violations.

In exchange for these requirements, the state would bar local governments and other authorities (including airports) from imposing their own requirements or additional fees. Airports would be permitted to impose only those fees that taxi drivers already pay. Finally, the statute would clarify that TNCs are not taxi operators and are not bound by the regulations governing taxis.

Understandably, the proposal isn’t being received kindly by some in local government:

New Orleans Councilwoman Susan Guidry, who authored the city’s ordinance regulating ride-hailing services, said just a quick overview of the proposed law showed it fell short of the city’s ordinance in a number of ways. It has fewer insurance requirements, less stringent background checks, does not require random drug tests or drug tests after crashes and does not prohibit surge pricing in emergencies.

The proposed state law also does not include prohibitions on discrimination in pick-ups and drop-offs and would not require the ride-hailing services to provide data that could be used to verify whether such discrimination is occurring, something that is included in the city ordinance.

“Why would you create a law that was less protective when they have already agreed to operate under our city’s law which is more protective?” Guidry asked.

Of course, ridesharing companies already operate under a patchwork of rules and regulations. For example, three of the largest parishes in the metro New Orleans area—Jefferson, Orleans and St. Tammany—each has its own ridesharing ordinance, and the three differ from one another in their details. Theoretically, it is possible to drive through all three parishes within an hour, depending on traffic. It makes little sense to have to navigate that maze of regulatory regimes over such a short distance.

The Legislature should unleash the potential of the sharing economy statewide. It’s good for consumers and provides new opportunities for drivers to make ends meet.

Image by Ionut Catalin Parvu

Permissionless innovation vs. the precautionary principle

Jonathan Taplin worries the “unfettered monoliths” of Google, Apple, Facebook and Amazon undermine democracy and should be broken up. In Europe and elsewhere, this combination of companies is referred to collectively by the pejorative “GAFA,” a ubiquitous bogeyman and symbol of American cultural imperialism. Never mind that all four got where they are by creating tremendous value for consumers. Google organizes information, Apple makes the best phones, etc. They aren’t harming us, they’re making our lives better.

They also aren’t actual monopolies. Amazon faces off with online retailers operating on razor-thin margins. The iPhone only has 18 percent market share. Google has thousands of competitors in digital ads. Facebook could go the way of Myspace. None of these companies is free from competition, or in a position to exert monopoly power callously.

The author wants us to embrace precautionary regulation like the EU’s. But there’s a reason few big tech firms start there. It’s a good thing America’s best companies don’t have to ask permission to innovate or forgiveness for succeeding.

Westinghouse bankruptcy epitomizes failures of electricity monopolies


Westinghouse Electric Co. LLC—the nuclear power company that traces its lineage to the original Westinghouse Electric Corp., founded in 1886—has been forced to declare Chapter 11 bankruptcy, largely the result of immense delays and cost overruns at two nuclear construction sites, Alvin Vogtle and V.C. Summer.

The bankruptcy places a potentially huge financial burden on electric ratepayers in South Carolina and Georgia and underscores the need for nuclear technologies to reduce cost overruns. But it would be a mistake to blame the current state of nuclear technology itself for Westinghouse’s failure. The mess really stems from the perverse incentives of the natural-monopoly model, which rewards utilities for building capital-intensive “mega-projects” irrespective of investment risk.

The story dates back to the late 2000s, when Southern Co. subsidiary Georgia Power Co. and SCANA Corp. subsidiary South Carolina Electric & Gas Co. received state regulatory approval to build two reactors each at the Vogtle and V.C. Summer sites, respectively. To their credit, the utilities entered into fixed-price contracts (with cost-escalator provisions) with Westinghouse to build the nuclear facilities by a guaranteed date. This helped mitigate some of the ratepayer risk of cost overruns.

However, the Westinghouse bankruptcy diminishes these guarantees, causing legal disarray amid speculation of rate increases to recover costs of finding new contractors to finish the projects. Both utilities have filed interim agreements with Westinghouse to administer cost-to-complete assessments over a transition period.

The original sales pitch to approve the nuclear projects rested largely on hedging high natural-gas prices, federal carbon regulation, meeting customer demand growth and taking advantage of federal nuclear subsidies. Over the past decade, natural-gas prices tanked, federal carbon regulation (cap-and-trade) never materialized and demand weakened. Now, it appears the utilities may lose the cost advantages of federal nuclear subsidies. Terminating the Westinghouse contracts may force Southern to prepay the outstanding balance on the $8.3 billion loan guarantee provided by the Department of Energy. Billions in cost escalations would continue to spiral if the projects don’t start operations by the end of 2020, which would render them unqualified for the federal production tax credit for nuclear.

Many independent analysts project that delay beyond 2020 is a given. But as the interim assessment period trudges along, the utilities are telling their regulators a different story. Both downplay the remaining time and costs of completing the projects, while expressing their desire to push forward. Meanwhile, Morgan Stanley & Co. asserts that abandoning the nuclear projects is the most likely outcome. If regulators elect to complete construction, Morgan Stanley predicts further delays and estimates additional cost overruns at $5.2 billion for SCANA and $3.3 billion for Southern. By comparison, building an efficient natural-gas power plant would cost roughly $2 billion for an amount of capacity equivalent to each nuclear project.

A strong case can be made that the utilities don’t even need the plants’ full capacity. The Southeast has a surplus of regional capacity, meaning that third-party supplies would be available at little cost. But because regulated utilities have little incentive to buy from third parties, they exhibit a well-documented bias to self-build.

State legislation championed by the utilities exacerbated the perverse incentives of the regulated-monopoly model. Georgia and South Carolina passed laws in the 2000s enabling utilities to recover costs via rate hikes during construction, rather than waiting until completion. The laws lower financing costs, but shift risk to ratepayers. The change also diminishes regulatory scrutiny of costs, thus dampening utilities’ cost-control incentives. The South Carolina Small Business Chamber of Commerce has criticized the unintended consequences, which include undermining utility incentives to avoid cost overruns and a lack of transparency and public input on construction contracts.

The Westinghouse bankruptcy makes one thing clear: when legislators and regulators socialize risks and costs, consumers suffer. The regulated-monopoly model creates moral hazard, epitomized by capital-intensive mega-projects in which companies insulated from investment risks lack incentives to guard against those risks. These nuclear projects are just new cases of a century-old problem.

By the late 1980s, monopoly utilities around the world faced high costs and unwanted assets. The subsequent political pressure led to electricity-industry reforms to change incentives, the locus of decisions and risk allocation. Some states liberalized their electric industries in the late 1990s and 2000s and, despite transition challenges, realized the benefits of competitive markets, as merchant suppliers internalized investment risk. In these states, the investment consequences of unexpected policy changes and drops in natural-gas prices and electricity demand have been borne by the private sector, which has repositioned itself to maximize value in a new investment climate. Meanwhile, regulated utilities have sat on power plants that no longer offer the most economical means of producing electricity in order to continue collecting a rate of return on their asset base. Worse, some have embarked on ill-advised investments on the backs of captive ratepayers.

States that failed to learn from the boondoggle projects of regulated monopolies have repeated them. Electric ratepayers will eat much of the cost, even if regulators elect to abandon the nuclear projects, as was the case with mega-projects decades ago. Perhaps the silver lining is that policymakers in regulated-monopoly states finally will learn the appropriate lesson and join the second wave of competitive-electricity reforms.

Federal policymakers should keep in mind that nuclear still provides a strong value proposition as a reliable, zero-emissions resource. However, any technology that takes a decade to build and carries huge capital demands creates an enormous investment risk. For nuclear, the best hope comes in the form of small modular reactors (SMRs). These reactors offer major safety and operational benefits with potential for much lower cost-overrun risk. NuScale Power announced the first SMR submission to the Nuclear Regulatory Commission in January. Easing the regulatory burdens on SMRs would reduce artificial barriers to entry. If SMRs become commercially viable, procurement decisions should come from competitive forces, not rent-seeking monopolies and their regulators.

Image by Martin Lisner

Does Congress have the capacity it needs in foreign affairs?


The Constitution assigns Congress the power to declare war, fund the military, approve treaties and regulate commerce with other nations. Yet, over the past century, presidents have taken the leading role in foreign affairs. Today, the president heads an expanding executive branch security apparatus—one which has found itself mired in controversy many times.

What role does Congress play in foreign affairs in the 21st century? What duties should it have? Does Congress have the resources it needs? The Legislative Branch Capacity Working Group recently hosted a panel on the questions, moderated by R Street’s Kevin Kosar and featuring Kurt Couchman of the Defense Priorities Foundation and Katherine Kidder of the Center for a New American Security. Video of the panel is embedded below:

Florida House bill would make solar installations a pain


Getting home improvements approved and ultimately completed in Florida is enough of a hassle already. If monopoly power companies get their way, installing solar panels will be even worse.

A bill currently under consideration in Florida’s Capitol would impose extensive disclosure and needless paperwork requirements on sellers of rooftop-solar panels and other renewable energy systems—to include everything from performance guarantees to tax advice, insurance and a requirement to project future utility rates.

H.B. 1351 by state Rep. Ray Rodrigues, R-Estero, and S.B. 90 by state Sen. Jeff Brandes, R-St. Petersburg, both would implement provisions of Amendment 4 by exempting solar and other renewable-energy devices from ad valorem property taxes. The Senate bill sticks to its objective by simply codifying the amendment, which was approved by 73 percent of Florida voters last August. The House version, however, goes beyond implementing the amendment by regulating the sale, financing and lease of these energy-generation systems, in addition to imposing other conditions.

Some requirements prescribed in the bill appear reasonable at first glance, as they relate to safety and reliability. However, they are superfluous: installers of these devices already are regulated by the Department of Business and Professional Regulation and are required to be licensed and insured. Additionally, consumers already enjoy legal protections against fraud and other deceptive transactions under Florida’s very tough Deceptive and Unfair Trade Practices Act.

One provision in the bill even requires installers to comply with undefined “standards” set by the local utility company, which would promote an inherent conflict of interest between the renewable electricity source and the utility that stands to lose business from it.

Nevertheless, proponents cite “consumer protection” as justification for these onerous requirements, as so often is the case with excuses for a swelling nanny state to protect us from ourselves. In reality, all too often, these are nothing more than crony capitalist attempts to protect other industry players.

That, in fact, appears to be the case here. Utility companies historically have been the only option available to purchase electricity. With the rise of solar and dramatic decreases in the cost of renewable energy, consumers now have an alternative. Utilities obviously perceive this as a threat to their business model, and businesses unaccustomed to competition generally do not like it.

So while they cannot altogether ban the sale of solar panels and the like, what better way to discourage their purchase than to complicate the process to obtain them? According to a recent Miami Herald investigation, some of H.B. 1351’s language actually was drafted by Florida Power & Light, the state’s largest utility.

If there are legitimate safety or consumer protection concerns with the sale of renewable-energy generation systems that current law does not address, a debate should indeed be had and legislation to address it considered. However, the bills currently under consideration should stick to implementing and codifying the amendment Floridians overwhelmingly approved—not shielding utility companies.

Image by travelfoto

Missouri ridesharing bill moves to Gov. Greitens’ desk


Legislation legalizing ridesharing services in the Show-Me State now sits on Gov. Eric Greitens’ desk, after the Missouri House passed statewide rules for transportation network companies by a 144-7 vote last week. The state Senate had already cleared the measure by a vote of 31-1 a few days earlier.

As the Associated Press described the bill:

The legislation would require that companies pay a licensing fee and adhere to a nondiscrimination policy. It would exempt them from local and municipal taxes and require drivers to submit to background checks and purchase vehicle liability insurance.

Missouri cities, like many others around the country, initially were cool to ridesharing, throwing up regulatory impediments to halt the services’ spread. By the time R Street issued its second Ridescore report in December 2015, only 15 states neither had nor were considering statewide legislation, typically focused on mandatory insurance, taxes and background checks. Today, only a handful of states have not yet passed statewide rules.

In the first Ridescore report in November 2014, Kansas City earned a D- for overall friendliness to for-hire transportation services and an F for its treatment of TNCs. Those grades improved slightly to a C and a D, respectively, in the second report, though both remained several grades lower than the average and median scores in the 50-city study.

Enacting a statewide law has been a priority for House Speaker Todd Richardson, R-Poplar Bluff, and other Missouri lawmakers focused on job creation. Uber has projected an additional 10,000 jobs for the state through expansion of its ridesharing app service. Floor remarks by legislators from Springfield—where both Uber and Lyft now operate—indicated more people have been able to get downtown since that city moved to allow ridesharing services.

The compromise that attracted enough support for the lopsided votes in both houses specifies that Uber, Lyft and other ridesharing services must pay city taxes and be liable for pickup fees at airports. They do not have to pay meter-inspection or other license fees, and they are permitted to charge higher prices at busier times, such as rush hour or bad weather, when demand escalates. These surge charges must be accepted by the customer in the app, of course. Moreover, both Kansas City and St. Louis won the right to audit the newly authorized services up to twice a year, to alleviate concerns about public safety and chiseling on fees.

State lawmakers have a lot on their plates, since Congress appears unlikely to address more than a few of the adjustments the 21st century requires to maintain a reasonable level of civilization. It is encouraging that citizen enthusiasm for popular disruptive services has produced an environment where many more people on both sides of the transaction can participate with the government’s blessing and oversight.

Image by Nagel Photography

Carbon taxes are about climate issues, not budgets


A good test for whether politicians are serious about battling climate change, or merely using the problem as an excuse to advance a grab-bag of progressive issues, is to examine what they would do with the revenues collected from a carbon tax.

If the answer involves anything other than offsetting cuts to other taxes, then I suspect the politician’s motives are less than pure. Carbon taxes are not about raising revenue. They are about placing a price on emissions so companies and consumers have incentives to choose lower-emitting options. The goal is to put a price on an “externality” – the economic term for ill side effects that aren’t included in the price of production.

Unfortunately, Washington Gov. Jay Inslee has failed this test with his carbon-tax proposal to help fund the state’s budget. As the Tacoma News Tribune reported in late March: “Not only does Inslee say it would combat climate change, a major priority of the governor’s, but it also would raise $2.1 billion in the next two years to help make court-ordered changes to the public school system and fund construction projects.”

Climate activists routinely warn about the dire consequences for the planet if the public doesn’t get serious about the issue. They also like to harangue global-warming skeptics for their refusal to jump aboard their campaign. Yet when they have the chance to ameliorate the concerns of those with other political views, they fail to do so.

It’s hard to blame skeptics who worry that the global warming fight is mostly about helping the state grab more tax revenue when leaders in that movement make clear they see a carbon tax as a way to help the state grab more tax revenue. Fortunately, Washington legislators from both parties failed to include a carbon tax in their $44.7 billion budget plan, which the newspaper described as a “one-two punch in Inslee’s eyes.”

Carbon-tax proponents believe the tax would internalize the social cost of carbon emissions in a way that’s more efficient and cost-effective than command-and-control regulations. Its purpose is not to fund all sorts of programs or balance the budget. A carbon tax accompanied by cuts in other taxes and paired with reductions in the regulatory burden has the best shot to win over people who suspect the whole thing is a sleight of hand.

Carbon taxes are a hard enough sell even when their backers are not looking for a tax grab. On Nov. 4, Washington voters handily defeated Initiative 732, which would have been the nation’s first carbon tax. The Seattle Times reports, ironically, that “the measure had trouble marshaling consensus among progressive and environmental groups” because of “budgetary and other concerns.” Apparently, they didn’t like that its authors tried to make it revenue neutral.

That’s a sad commentary on the priorities of some activists and politicians, who claim to be urgently alarmed by global warming’s threat to the planet. Voters from across the political spectrum might start to take their dire warnings more seriously when they introduce a carbon tax that is about curbing emissions – not raising taxes to pay for a bunch of programs and subsides. Until then, expect tax-burdened voters to keep giving these proposals a failing grade.

Image by Andre Lefrancois

Are you paying your fair share of taxes?


The following is a guest post by attorney and freelance writer Mark Meuser.


Today, many Americans will finalize their federal income tax returns and send their 1040 forms to the Internal Revenue Service to make tomorrow’s Tax Day deadline. Whether you are receiving a refund or will need to send a check to Uncle Sam, if you worked more than 35 hours a week and did not make at least $164,500 in 2016, you will not be paying your fair share of taxes this year. Shame on you.

Obviously, I am joking, but the per-capita burden of federal spending is no laughing matter. In 2016, the federal government spent approximately $12,387.29 per resident of the United States. Some might think that $12,387.29 in taxes sounds reasonable. Under current tax rates, it would mean each and every man, woman and child must earn at least $66,450 to pay his or her fair share.

Obviously, not everyone works or earns anywhere near that amount. Some 47 percent of all Americans are either too young or too old to be gainfully employed full-time. Even if 100 percent of Americans between the ages of 25 and 65 were to pay taxes, federal spending would be equivalent to approximately $23,072.30 per working-age adult.

But even among the able-bodied, we don’t see 100 percent workforce participation. Whether because of a disability or lack of necessary job skills, or because a parent chooses to stay home and raise their children, some people just don’t work. According to the Bureau of Labor Statistics, there are approximately 100 million Americans over age 25 who work 35 hours a week or more. To cover total federal spending costs, each would need to pay $39,104.43 in taxes for the government to balance its budget. That would require each to have at least $164,500 in individual (not household) earnings per year.
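The chain of divisions above is easy to sanity-check. A minimal sketch, using the article’s rounded figures (the roughly $3.91 trillion spending total and the implied headcounts are back-of-the-envelope approximations taken from the post’s own numbers, not official budget data):

```python
# Rough check of the post's per-capita arithmetic. All inputs are the
# article's approximations (the $3.91 trillion total is backed out of
# the per-worker figure), not official budget data.
total_spending = 3.91e12          # implied 2016 federal outlays, in dollars
residents = 315_700_000           # implied by ~$12,387 per resident
working_age_adults = 169_500_000  # implied by ~$23,072 per adult aged 25-65
full_time_workers = 100_000_000   # BLS count cited in the post

for label, headcount in [("per resident", residents),
                         ("per working-age adult", working_age_adults),
                         ("per full-time worker", full_time_workers)]:
    print(f"{label}: ${total_spending / headcount:,.2f}")
```

Dividing one fixed spending total by ever-smaller headcounts is the whole argument: the fewer people actually paying, the larger each payer’s share.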

An American’s fair share of government spending has not always been this high. When my grandfather was born 95 years ago, per-capita federal spending was just $30.14 ($437.04, when adjusted for inflation). The run-up in federal spending amounts to a 3,000 percent increase.

All of which raises the question each of us should ask as we send off our tax filings: how much government am I really willing to pay for?

Image by Steve Heap

How cronyism threatens Louisiana’s craft breweries


Louisiana is well-known for its love of both food and alcohol. The state is a tourist destination for those looking both to enjoy excellent dining and to have a good time. Louisiana’s love affair with food has made its cuisine well-known worldwide. New Orleans’ Mardi Gras festival has few rivals around the world.

Meanwhile, across the country, craft-beer breweries and so-called “gastropubs” have been growing. The craft-beer revolution proceeded at a slower pace in Louisiana, with Abita one of the few local craft beers to gain national exposure. Much of the reason for this disparity is the hostility the state has shown to brewers, in line with its general reputation as a terrible state in which to do business, thanks to high taxes and crippling regulations. Louisiana has the 12th-highest beer excise tax in the country, at 40 cents a gallon. In fact, the tax-hungry state recently raised that tax.

If the tax increase were not enough, the state now is going after craft breweries who also serve food and hold events. Last fall, Louisiana’s craft breweries received “cease and desist” letters and were cited by the Louisiana Office of Alcohol and Tobacco Control for everything from holding yoga classes to serving food. The breweries had been holding those events for years without any complaints, but the ATC suddenly found regulations that limit what breweries could provide on their premises.

The craft brewers got angry and demanded a change in the regulations. In March, the ATC released new rules that, on the surface, would permit many such events. Alas, the devil was in the details.

The ATC ruled that live entertainment was permitted at breweries only so long as it was “not the primary purpose of the facility.” Breweries also could serve food and even charge a cover for some shows. But food sales must be “incidental to the beer sales,” meaning they could not exceed 25 percent of on-premise beer sales. The ATC also banned on-site restaurants from serving alcohol produced off-site. Finally, the ATC ruled that breweries could host fundraisers and events for nonprofits, but the groups must be registered 501(c)(3), 501(c)(6) or 501(c)(8) organizations and all proceeds from the event must go to the nonprofit.

While the new rules clarify old regulations, they still threaten the existence of craft breweries and gastropubs across the state. NOLA Brewing Co. CEO Kirk Coco told The Advocate that he was concerned about how the regulations would affect his brewery’s recently opened barbecue restaurant, part of its $1.6 million expansion. Coco also warned of job losses, saying he “would guarantee you that there would be at least three or four closures in the next six months and that’s all jobs.”

Meanwhile, other brewers have threatened to take their operations out of state. One brewer considering leaving Louisiana is Parish Brewing Co. “I am in the process of planning a multimillion dollar expansion and I am considering doing so across the border in Texas or Mississippi if the government is against breweries here,” Parish Brewing owner Andrew Godley told The Advocate.

Craft brewers believe the regulations were issued at the behest of the Louisiana Restaurant Association, which sees breweries as competitors, particularly to sports bars. Instead of going to the Legislature to change the law, entrenched interests merely had to complain to an unaccountable executive-branch agency.

Serving food and holding events is an important part of the craft-brewery business. It helps breweries build brand recognition and provide jobs for their employees. Louisiana should keep in mind the maxim “do no harm” when it regulates this growing segment of the state economy.

Image by f11photo

Discussing the future of the GSEs on the Investors Unite podcast

I recently joined Investors Unite founder Tim Pagliara on the group’s housing podcast for a broad-ranging discussion about what a future arrangement for Fannie Mae and Freddie Mac might look like. Audio of the full show is embedded below.

R Street launches Justice for Work coalition with April 17 D.C. event

As the bipartisan movement for criminal-justice reform continues to move forward in the states and at the federal level, it’s time to reconsider government-imposed barriers to economic opportunity, such as occupational licensing, mandatory background and biometric checks, and other restrictions on the ability of ex-offenders to find financial stability and meaningful work.

In that vein, R Street will host an April 17 event to announce a new ideologically diverse coalition to highlight the issue of “Justice for Work.” To be held 6 p.m., April 17 at the Stanton & Greene loft (319 Pennsylvania Ave. SE), the launch will be occasioned by an expert panel that includes ex-offenders, former law-enforcement officers, and policy and legal experts. It will be followed by an open-bar social mixer.

We are joined in this new coalition by the American Civil Liberties Union, Right on Crime, Impact Justice, Tech Freedom, FreedomWorks, Americans for Tax Reform and the American Conservative Union Foundation. Together, these members agree that prescriptive mandates may serve a purpose where there is a demonstrated public safety risk that cannot effectively be addressed otherwise. But in areas where access to work is denied solely to signal the empty political slogan of being “tough on crime,” the Justice for Work coalition seeks to make meaningful change.

RSVP here.

The ‘fixed AI’ fallacy


As Andy Kessler points out in The Wall Street Journal, a tax on robots would hinder entrepreneurial activity in automation and artificial intelligence (AI). The same algorithms that make job-displacing robots smarter and more effective also make us more productive at translating documents, searching for information and streamlining daily tasks. We can’t have our cake and tax it too. As Winston Churchill once said, “I contend that for a nation to try to tax itself into prosperity is like a man standing in a bucket and trying to lift himself up by the handle.”

Gates and others who bemoan the changing job market fall prey to the fixed pie fallacy—the assumption that available jobs and the wages those jobs pay are fixed quantities. Developments in information technology have led to jobs unimagined by macroeconomists and technologists of previous decades, such as social-media managers, website designers, bloggers and virtual assistants. Crafting policy based on “fixed AI” thinking will prevent new jobs from arising.

Job displacement is an inevitable consequence of technological development and economic growth. Instead of taxing our digital co-workers, thought leaders such as Gates should argue for policy changes that permit experimentation in skills-based education and workplace benefits to better equip workers with the skills and financial flexibility to adapt to the changing jobs market. To realize AI’s full benefits of productivity and convenience, we need to view it as a feature, not a bug, of our tech-imbued future.

Image by Jinning Li

Caleb Watney talks self-driving cars on KVOI

In light of last month’s high-speed crash in Tempe, Arizona, involving a self-driving Uber car (reports say the car had the right of way), R Street Tech Policy Associate Caleb Watney was a guest on Mike Check with Mike Shaw on KVOI-AM in Tucson to discuss the technology and public policy around autonomous vehicles. Audio of the segment is embedded below.

Kosar talks congressional reform on The Golden Mean

R Street Governance Project Director Kevin Kosar recently joined host Michael Golden’s podcast The Golden Mean to discuss the Legislative Branch Capacity Working Group and the prospects for congressional reform. The full show is embedded below.

Holding the administrative state accountable

R Street Senior Fellow Kevin Kosar joined the Manhattan Institute’s Oren Cass and Adam White of The Hoover Institution on the Federalist Society’s podcast to discuss the Legislative Capacity Working Group and efforts to restore Congress’ role as a check on the executive branch. The full show is embedded below.

Throwing cold water on the insurance industry’s dog bite numbers


Today is National Pet Day, a day to cherish the love, entertainment and fulfillment provided to us by our animal companions. Or, if you’re in the insurance industry, it’s a day to stoke fear of dog bites.

“Dog-Bite Claims Surge 18% as Children Bear Brunt of Attacks” reads the headline from Bloomberg, based on a press release from the Insurance Information Institute. Indeed, the III produces a similar release every year, in recognition of National Dog Bite Prevention Week, which runs April 9 to April 15.

The calendar-making gods are sending some decidedly mixed messages.

As is their wont, insurers want to highlight safety, which is a perfectly commendable goal. Dog bites and other pet-related injuries befall thousands of people each year, and better care can and should be taken to mitigate and avoid them. They also constitute a significant portion of the loss costs associated with the liability portion of one’s homeowners insurers policy, which explains the motivation for the public education campaign.

However, when one drills down on the numbers, there’s little to justify the alarmist rhetoric. Dog bites are not “surging” at all.

It first bears noting that liability isn’t actually an especially big-ticket item for homeowners insurers. The III notes that the industry paid out $602.2 million in dog-related claims in 2016. That sounds like a lot. But it represents just a tiny portion—just a little more than 1 percent—of the more than $48 billion in claims they paid out, much less the $91.4 billion in direct premiums they collected, according to S&P Global’s statutory insurance data.

Also worth mentioning is that, while the headlines tout a rise in dog “bites,” the data actually refer to “dog-related injuries.” If you break your neck after tripping over your shih tzu, that gets included. How often does that happen? A lot. Falls are the number one cause of nonfatal injuries in this country. A 2009 study from the Centers for Disease Control and Prevention found an average of 87,000 fall injuries treated in emergency rooms each year were associated with cats and dogs. Dogs represented 88 percent of the total, or about 76,000 dog-related falls that send Americans to emergency rooms every year.

Of course, that 76,000 figure far exceeds the 18,123 dog-related claims reported by the III, so the vast majority of people who suffer dog-related falls never file a homeowners claim, even if they went to the emergency room. No doubt the same is true of dog bites. Of the claims we know about, what proportion are dog bites and what proportion are other kinds of injuries? We don’t know. The III doesn’t break out those numbers. We do know that dog bites sound scarier than dog falls (even though the latter might actually produce more serious injuries) so it shouldn’t be surprising that’s what gets the headline.

Speaking of headlines, let’s look at Bloomberg’s choice to characterize the rise in dog-related claims as a “surge.” It’s true that claims rose about 18 percent from 15,352 in 2015 to 18,123 in 2016. Is that really a surge? Bear in mind that there are nearly 90 million dogs in the United States. Even if we assume no single dog was responsible for more than one insurance claim, it would still mean only about 0.02 percent of American dogs contributed to an injury that sparked an insurance claim. A difference of less than 3,000 claims per year, in a universe that big, amounts to statistical noise.

But even if we were to take the incredibly small sample size at face value, note that this year’s increase followed back-to-back years when the number of dog-related injury claims declined. From 2013 to 2015, the number of pet-related claims fell 12 percent, from 17,359 to 15,352. But were we treated to headlines about how dog bites had “plummeted?” No. No, we were not.

For that matter, it is just frankly irresponsible to represent these numbers without making basic adjustments for factors like inflation and population growth. The III notes that the average cost of a dog-related claim has risen by 73.4 percent from 2003 to 2016. This would leave one with the impression that pets have become more dangerous or, specifically, that bitey dogs have become more vicious.

But that’s just not true. Of course the average injury claim has gone up since 2003, because the cost of health care has gone up since 2003. Using a medical cost inflation calculator, one would expect the average claim to rise by about 56 percent over that period. Again, dealing with a small sample size, the mix of the kinds of claims in a given year could make the average claim go up by more or less than the baseline cost of medical inflation. Indeed, from 2015 to 2016, the average claim went down by 11 percent.

Even more significant to the overall picture is that neither the III, nor any of the news outlets reporting their findings, make even the slightest effort to put into perspective that, over the long term, the number of claims has been relatively flat, even as the number of people and dogs continues to increase.

According to the III, from 2003 to 2016, the number of dog-related claims rose by 7 percent, from 16,919 to 18,123. But the population of the United States rose by 11 percent over that same period, from 290.1 million to 322.8 million. And as the chart below makes clear, the population of U.S. dogs surged by a whopping 35 percent.


So, this actually means both that a declining proportion of Americans are being bitten by dogs each year and that a far smaller percentage of dogs are biting (or tripping, or what have you) people. In a nutshell, we’ve gone from one dog-related injury for every 17,146 people and 3,841 dogs to one for every 17,811 people and 4,949 dogs.

That’s the kind of good news we should be celebrating on National Pet Day.

Image by everydoghasastory

Kevin Kosar at TPA postal reform panel


R Street Senior Fellow Kevin Kosar took part in a recent Capitol Hill briefing on U.S. Postal Service reform. The panel was hosted by Taxpayers Protection Alliance and also featured representatives of Americans for Tax Reform, the American Consumer Institute and Frontiers of Freedom.

It’s time to kill the Durbin amendment


After six years of unfulfilled promises, it’s time the Durbin amendment finally was repealed. A last-minute addition to the Dodd-Frank Act—itself a political overreaction to the financial crisis of 2007-2009—the amendment passed without a hearing or adequate discussion of how it would work in practice. We now know it hasn’t worked at all.

Interchange fees are charged by banks to retailers to allow customers to use that bank’s debit card in that store. The Durbin amendment gave the Federal Reserve power to cap those fees, which at the time averaged $0.44 per transaction, for banks with more than $10 billion in assets.

Proponents of the rule hoped that what would have been banks’ revenues would translate instead into lower retail prices for consumers. Indeed, retailers were projected to save an estimated $8 billion yearly. But nearly six years after the price controls went into effect, consumers have not benefited; a fair number, in fact, were made worse off.

The cost savings have, for the most part, become profits for retailers. The Federal Reserve Bank of Richmond found recently that three-quarters of retailers it surveyed did not change prices since interchange fee caps went into effect, and nearly one-quarter actually increased prices.

By the Richmond Fed’s estimate, the goal that retailers would pass savings on to customers in the form of lower prices has had a success rate of just 1.2 percent. These findings are confirmed elsewhere, providing evidence to conclude that consumers experienced effectively no savings at the register.

For any student of history, it should come as no surprise that governments cannot divine the “fair prices” of things. Rent control laws in New York have created enough abandoned housing units to house all of the city’s homeless. Regulation Q, which allowed for government price fixing in deposits, encouraged complex arrangements that discriminated against smaller and less wealthy savers. One can go back as far as ancient Egypt and Babylon to find examples of people not understanding that prices convey economic realities that remain fixed, even after the government changes the prices.

That the Durbin Amendment would suffer the same fate as these other price controls was not hard to predict. To offset revenue losses and remain competitive, banks needed to find ways to raise their deposit account fees. Some did it through higher monthly service charges, while others cut back on free services like checking. A large number of financial institutions—especially small issuers like community banks and credit unions—essentially were pushed out of the competition due to the administrative costs and red tape of various provisions. And all financial institutions saw reduced incentives to innovate in the payment card industry.

As a result, financial markets suffered fewer free checking accounts, fewer debit-card rewards programs, higher costs of entry into financial services and continued reliance on payment networks more susceptible to fraud. These consequences hurt all bank customers, but especially those with lower incomes. Up to 1 million customers were pushed out of the banking system, presumably into the domain of alternative financial providers such as check-cashers and pawnshops.

From the observable consequences, one would be hard-pressed to find the amendment as accomplishing any legitimate objective, other than unintentionally enshrining benefits to particular kinds of retailers. The rule created market distortions that hurt all financial institutions, especially smaller ones, and hurt all depository customers, especially the poor. The Durbin amendment is a case study in how rushing into legislation—without give-and-take deliberation—tends to produce the opposite of what was intended.

Image by alice-photo

Adams talks self-driving cars at Institute for Legal Reform


The threat of litigation could derail the promise of autonomous vehicles to save lives. R Street Senior Fellow Ian Adams recently joined a panel hosted by the U.S. Chamber of Commerce’s Institute for Legal Reform to discuss how to address potential liability issues while allowing the technology to achieve its full potential. Video of the full panel is embedded below.

Short-term rentals are an opportunity Missouri can’t afford to miss


Whether it’s the cars in their garages or the rooms in their homes, Americans are realizing they’re leaving money on the table when their property remains idle. House Bill 608, making its way through the Missouri Legislature, ensures that Missourians are able to take advantage of economic opportunity in the short-term home rental space.

The economic impact of short-term rentals is significant, as one study after another confirms there is growing demand. The National University Institute for Policy Research found that short-term rentals generated a total economic impact of $285 million in San Diego from 2014 to 2015. A study commissioned by Homeaway earlier this year found the economic impact of short-term rentals in Nashville was $477.2 million.

While H.B. 608 isn’t perfect, reasonable statewide standards for short-term rentals make a lot of sense. If Missouri’s legislators want the “gold standard” of short-term-rental laws, Arizona is a good place to start. The Grand Canyon State collects a number of lodging-related taxes on short-term rentals, but prevents cities, towns and counties from restricting short-term rentals simply because the property in use isn’t classified as a hotel.

One of H.B. 608’s more significant departures from the Arizona model is a provision that allows “any county, city, town, village, township, fire district, sewer district, or water district” essentially to ban short-term rentals before April 1, 2018. The bill’s new statewide provisions won’t affect those “political subdivisions” that act before the grandfather date. This might create an incentive for local governments to race to restrict short-term rentals, simply to retain the option to do so in the future.

Oddly, that’s one of the chief problems that a commonsense short-term-rental law should correct. Missourians across the state should have the same basic opportunity to generate additional income with their properties, not a patchwork of local ordinances that grant opportunity to some and remove it from others.

I’ve seen firsthand how short-term rentals benefit the little guy. On a trip to Charleston last year, I crossed paths with an Uber driver who used the income from short-term rentals in his basement to purchase his car. That’s not some corporation horning in on a neighborhood; it’s the American dream of being able to work hard and succeed, using every tool at your disposal.

Having a transparent and predictable legal foundation for short-term rentals at the state level probably means H.B. 608 is worth supporting, even with the grandfathering provisions. But the Missouri Legislature would make a better choice by ensuring the economic opportunity of short-term rentals is open to all its citizens.

Image by f11photo

Ohio and Indiana take different approaches to opioid epidemic


Ohio Gov. John Kasich described drug addiction as “a common enemy” in this week’s state-of-the-state address. Kasich highlighted the challenge in terms similar to those laid out by state Speaker of the House Cliff Rosenberger, R-Clarksville, back in January when members were sworn in. But there does not yet appear to be regional consensus on how to engage this blight on civilization.

Just imagine what kind of relief Ohio could be afforded in health care, where most of the $1 billion in state and federal Medicaid addiction-treatment funds goes, if this problem were to be resolved. Nearly as large a cost, in terms of both wasted lives and government expenditures, stems from corrections programs for drug abusers. The costs in education, housing, social services and workplace productivity are incalculable.

As J.D. Vance, author of last year’s bestseller Hillbilly Elegy: A Memoir of a Family and Culture in Crisis, pointed out last week in a keynote address to the Federalist Society in Columbus, a policy that works has got to do something about the addict, but also for the aunts, uncles and grandmothers who shoulder the burden of child care for a mother who has succumbed to a drug overdose. In 2015, Ohio led the nation in this tragic category with 3,050 overdose deaths, and the 2016 total may have topped 4,000. As reported in the Columbus Dispatch story linked above, Senate Minority Leader Joe Schiavoni, D-Boardman, noted at this week’s state-of-the-state joint session of the Legislature that two Ohio counties have had to rent refrigerator trucks to handle the surge in the number of corpses from lives snuffed out by overdose.

The first of Kasich’s major proposals to tackle the issue is a $20 million grant fund to accelerate treatment programs and technologies that promise to serve as useful tools in the fight against drug abuse. The money would come from the Ohio Third Frontier Commission, which votes to dole out bond proceeds for 21st century innovations. The idea is that these resources might bring some breakthrough addiction-mitigating technology to market that otherwise would stall out due to lack of funding.

Currently, prescriptions for pain medications can be written for 30-90 days. According to the Ohio Department of Health, nearly 800 million doses of pain pills were prescribed in Ohio in 2012, although the Dispatch noted that general awareness of the overdose problem has helped curb that figure to about 631 million doses last year.

Number of opioid doses dispensed to Ohio patients, 2011-2015

ohio opioids

The governor’s second proposal is that prescriptions be limited to shorter terms—seven days for adults and five days for minors with acute pain, but not chronic conditions. Doctors could use their judgment to exceed these limits if they document the reasons. Apparently, the Ohio Medical Board, Dental Board and Ohio Boards of Pharmacy and Nursing will all have to sign off on the proposed legislation.

Next door in Indiana, a legislative proposal passed on the House floor this week gives up on the modern approaches to criminal justice, which include giving judges more discretion and preferring treatment over punishment. S.B. 324 instead aims to crack down on heroin dealers and those who rob pharmacies, increasing the severity of the penalties for dealing and lessening the judiciary’s discretion in sentencing. Critics argue the Legislature is “backsliding” to previous, failed attempts to address the drug epidemic, but the bill was approved by a huge 72-18 margin.

As state Rep. Ed Delaney, D-Indianapolis, noted before the vote, taking away judges’ discretion means giving more discretion to prosecutors, which isn’t an unalloyed good in the current criminal justice landscape. Even though nearly all lawmakers agree with the proposition that the goal of incarceration is to deal with the “people we are afraid of, and not the people we are mad at,” it proves difficult to convince them not to be more afraid of drug dealers than rapists and armed robbers.

I can’t yet fault the approach in either state, since all serious policymakers are at their wits’ end about the drug problem. But I have to root for Ohio’s search for innovative breakthroughs. As mentioned above, opioid abuse costs many precious lives and careers, and billions in government expenditures.

Perhaps it is time for a serious discussion of the ameliorative potential of marijuana extracts for pain relief. According to public opinion polls, most Americans would like to give medical marijuana a chance to prove its value. However, this is a place where there is a clear conflict between not just science and law, but two distinct sets of cultural values.

Image by Steve Heap

Toward a global norm against manipulating integrity of financial data


The following is a guest post by Tim Maurer, who co-directs the Cyber Policy Initiative at the Carnegie Endowment for International Peace, and Steven Nyikos, research analyst at the Carnegie Endowment for International Peace.

The February 2016 theft of $81 million from Bangladesh’s central bank, which recent reports suggest may have been perpetrated by agents of North Korea, demonstrated the scale of risk that malicious hackers pose to financial institutions.

Cyberattacks to manipulate the integrity of financial data pose a distinct set of systemic risks. While a cyberattack on an electrical grid, for example, will be mostly limited to a single country’s territory or its immediate neighbors, the effects of an attack on the financial system are not bound by geography. Such attacks could lead to bankruptcies that, in turn, send shock waves throughout the global system.

The G-20 finance ministers and central bank governors recognized the threat in a March 18 communiqué:

The malicious use of Information and Communication Technologies (ICT) could disrupt financial services crucial to both national and international financial systems, undermine security and confidence and endanger financial stability.

Now the G-20 heads of state have an opportunity to take further action. A new white paper by the Carnegie Endowment for International Peace proposes the G-20 heads of state explicitly commit not to undermine the integrity of financial institutions’ data—whether in peacetime or during war—or allow their nationals to do so, and to cooperate with the international community when such attacks do occur.

Most states already demonstrate restraint when it comes to cyberattacks that could compromise the integrity of financial institutions’ data. By making such restraint explicit, they could:

  • Send a clear signal that global financial stability depends on preserving the integrity of financial data and that the international community considers attacks on that integrity off limits;
  • Build confidence among states that restraint in this domain is already the norm and thereby make it easier to mobilize the international community when that norm is violated;
  • Foster greater international collaboration to tackle nonstate actors who target financial institutions with cyber-enabled means; and
  • Complement and enhance existing agreements and efforts, namely the 2015 G-20 communiqué, the 2015 UNGGE report and the 2016 cyber guidance from the Committee on Payments and Market Infrastructures and the International Organization of Securities Commissions (CPMI-IOSCO).

The agreement proposed in the Carnegie white paper would commit states not to conduct or knowingly support any activity that intentionally manipulates the integrity of financial institutions’ data and algorithms, wherever they are stored or when in transit. It also binds states, to the extent permitted by law, to respond to requests by other states to assist in halting cyberattacks that target financial institutions’ data and algorithms and that either pass through or emanate from the state in question.

Elements of the proposed agreement are mutually reinforcing. The commitment by states to provide assistance and information, upon request, shifts the burden of attribution from the victim of attack to states that have professed interest in helping to respond to and ultimately prevent such attacks. Linking an agreement on state restraint with expectations for the private sector to implement due-diligence standards addresses potential moral-hazard problems.

The agreement would build on existing international law and on recent international efforts to develop rules for cyberspace. These include the 2015 report of the U.N. Group of Governmental Experts, which proclaimed:

States must not use proxies to commit internationally wrongful acts using ICTs, and should seek to ensure that their territory is not used by non-State actors to commit such acts.

The G-20 heads of state could advance this norm powerfully, building on the finance ministers’ statement, by articulating it formally when they meet in July.

Of course, in the 21st century, a few states that are relatively cut off from the global economy, and nonstate actors who may or may not be affiliated with them, could conduct cyberattacks against financial institutions. But states that endorse the norm explicitly would be more united and would have a clear basis to demand potential retaliatory action against violators—be they states, terrorists or cybercriminals.

Image by vectorfusionart

Vivek Murthy on vaping and public health

R Street Associate Fellow Damon L. Jacobs attended the recent National Council for Behavioral Health’s NatCon conference in Seattle, where he got to ask Surgeon General Vivek Murthy to weigh in on the role vaping could play in harm reduction and public health. Video of the exchange is embedded below.

Juvenile justice legislation now moves to U.S. House floor


A decade after Congress allowed the Juvenile Justice and Delinquency Prevention Act’s authorization to expire, legislation to reauthorize the bill is moving to the House floor following today’s successful markup by the Committee on Education and the Workforce.

First authorized in 1974, the JJDPA has been an important tool in protecting children who are in the custody of the criminal-justice system. Based on broad consensus standards of care, the law ensures that children held for “status offenses”—that is, those that are only illegal because they were committed by someone under the age of majority—can’t be held in jails or prisons unless the child also committed a criminal offense. Another crucial provision of the law requires that, if a child is to be detained, there must be a “sight and sound” separation from adult offenders.

The JJDPA has not been reauthorized since it expired in 2007. The current House bill is the Juvenile Justice Reform Act, introduced last week by Reps. Bobby Scott, D-Va., and Jason Lewis, R-Minn. A Senate companion is expected to be introduced next week.

While one should always bear federalism concerns in mind when the federal government sets out standards for issues that clearly are in the states’ purview, it’s encouraging that the JJDPA is back on Congress’ agenda. This is an important piece of legislation that helps ensure children are protected and gives them the opportunity to grow and flourish in their communities.

Image by niceregionpics

Watney talks digital privacy with Chad Benson

R Street Research Assistant Caleb Watney was a guest recently on Radio America’s “The Chad Benson Show” to discuss Congress’ recent move to vacate Federal Communications Commission privacy rules using the Congressional Review Act.  The full interview is embedded below.

Sanders talks Pence and masculinity on WMAL

R Street Senior Fellow Lori Sanders joined “The Larry O’Connor Show” on WMAL 105.9 FM in Washington to discuss her recent piece in The Federalist about Vice President Mike Pence’s marital rules and a Canadian college’s demonstration on “hypermasculinity.” The full interview is embedded below.

Dieterle talks privatization of Michigan’s Soo Locks

R Street Governance Policy Fellow Jarrett Dieterle recently was a guest on Interlochen Public Radio in Northern Michigan to discuss proposals either to privatize Michigan’s crumbling Soo Locks shipping channel or to have the U.S. Army Corps of Engineers charge user fees to fund upgrades. The interview is embedded below.

Pollock to speak at the 30th World Congress of the International Union for Housing Finance

R Street Distinguished Senior Fellow Alex Pollock will speak on a panel at the 30th World Congress of the International Union for Housing Finance. The conference is scheduled for June 25-27, 2017, in Washington, D.C., and will feature housing finance insights from around the world. For conference program and details, click here.

Pollock’s panel focuses on the housing finance debate in the United States and also features Mike Fratantoni, chief economist and senior vice president of the Mortgage Bankers Association, and Ed DeMarco, senior fellow at the Center for Financial Markets at the Milken Institute.

Register and RSVP here (requires a registration fee).

Kentucky’s new leadership tries to pull it back from the pension cliff


Having become the last legislative chamber in the entire South to flip to Republican control last November, the Kentucky House of Representatives wasted no time this session in moving through a red-meat conservative agenda.

H.B. 1 (calling for right-to-work), H.B. 2 (requiring an ultrasound prior to an abortion) and H.B. 3 (deleting various union-backed “prevailing wage” provisions and abolishing the commonwealth’s Prevailing Wage Review Board) all were signed by Gov. Matt Bevin on Jan. 9. Last week, Bevin signed H.B. 520, authorizing charter schools in Kentucky, which had been one of only seven states not to allow them.

But one issue that lingers as unfinished business on both sides of the aisle is the Bluegrass State’s public employees’ pension system.  Kentucky has, by some accounts, the worst-funded state pensions in the country. That’s a pretty notable distinction, given the severe pension challenges faced by states and local governments all over the country.

And it’s despite Kentucky having already attempted several fixes in recent years.  Retiree health benefits were cut in 2004 and the Legislature voted in 2008 to abolish various pension “spiking” gimmicks that awarded much larger benefits based on increased earnings at the very end of a worker’s career.  A law passed in 2013 required the commonwealth to make its full pension payments to the system, a hybrid cash-balance plan was formulated for new employees, and cost-of-living increases were all but eliminated.

Alas, the Kentucky Employees Retirement System’s assets dropped sharply in 2014, due to poor investment performance, and the 2015 assets were calculated to cover only 17 percent of its total liabilities over the next 30 years. The Lexington Herald-Leader reported that the funded percentage had dropped to 16 percent last year, although several of the other pension funds—for teachers, university faculty, state police and local government employees—are in better shape. Kentucky’s employer share doubled and became a full one-third of the total payroll costs for state employees.  Credit downgrades followed, but the real worry is the cash-flow problem. The point of no return will be when the assets drop in valuation to about $1.3 billion.  If this happens, the plan will be forced to convert all its assets into cash, and a portfolio held entirely in cash has no investment returns left with which to recover.

In the first month after he was sworn in as governor, Bevin announced independent audits of every state pension system. He is calling for substantive structural change, along with extra contributions to help make up the deficit.  One of the changes would be to replace the current state plan with a 401(k) retirement plan for new employees, which also would be open to any current employees who would like to transfer their traditional pensions.

The governor also proposes lowering the assumed rate of return to something more closely resembling the actual financial landscape. At 6.75 percent, even the rate of return assumed for the pension plans is much higher than the 2.344 percent risk-free rate, which reflects an average of the 10- and 20-year U.S. Treasury bond yields from March 2015 to March 2016.  Depending on whether one uses the current rate-of-return assumption or this risk-free rate of return, Kentucky’s pension systems are between $35 billion and $95 billion short of what would be required to pay off the promises made to state workers over the years.  Bevin’s plan would add another $1.1 billion in contributions, while reserving an additional $1 billion for future pension payments.

These are matters of culture, leadership and responsibility.  Putting off fixes to large, politically sensitive benefit systems is a habit shared by most of the nation, and by governments from Puerto Rico to Greece. Kentucky is certainly not the only state facing real consequences from inaction, but it is among the closest to the edge, having already stripped away many of the ancillary benefits like health care and cost-of-living increases for retirees.

Seven of the 10 largest states each have total unfunded pension liabilities exceeding $200 billion.  Why not 10 out of 10? Leadership and responsibility seem to matter more than red or blue, North or South. Ohio, Illinois and Kentucky all rank among the worst-funded states per capita in the United States, but Indiana was second-best in a study released last fall by the Center for State Fiscal Reform at the American Legislative Exchange Council.  It’s hard to blame Rust Belt economics, differences in work ethic, longevity, demographics or geography for this difference in the charts. Instead, one could speculate that having Mitch Daniels—the former director of the federal Office of Management and Budget—in the governor’s seat for several years had something to do with this. The good news for Kentucky is that Gov. Bevin does not intend to disregard the leadership imperative.

It is difficult to fault a short legislative session dominated by a new and energized Republican majority for not getting to some of the bigger picture items. But hope for the state’s thousands of state workers who were promised retirement benefits and for the taxpayers who will have to provide them lies in a bipartisan effort to shore up the pension system sooner and not later.

Image by Aaban

Murphy’s Law and a banking career


Murphy’s law is well-known in the form: “Whatever can go wrong, will go wrong” and similar variations on the theme. But the intellectually interesting substance of Murphy’s law is:  “Whatever can go wrong, will go wrong, given enough time.”

When a financial calamity has a very small probability of occurring—let’s say a 1 percent chance that it will and 99 percent that it won’t in any given year—we tend not, as a practical matter, to worry about it much. In most years, nothing will happen, and when it hasn’t happened for a long time, we may even start to treat the risk as essentially zero. Professors Jack Guttentag and Richard Herring authored a classic paper that gave this tendency the provocative name “disaster myopia.”

Banking and finance are full of events with a very small expected probability, but which are very costly when they do happen—a financial crisis, for example.

Suppose the chance of a financial crisis is 1 percent annually. Suppose you optimistically start your banking career at the age of 23 and work to age 68, by which time you will be seasoned and cynical. That will be 45 years. Because you have given it enough time, the probability that you will experience at least one crisis during your career grows from that 1 percent in your trainee year to a pretty big number: 36 percent.

We observe in the real world that financial crises occur pretty frequently—every decade or two—and that there are a lot of different countries where a financial crisis can start. We also observe that virtually no one—not central bankers, regulators, bankers, economists, stock brokers or anybody else—is good at predicting the financial future successfully. Do we really believe the risk management and credit screens of banks, regulators and central banks are efficient enough to screen down to a 1 percent probability?  I don’t.

Suppose instead that the probability of a banking crisis is 2 percent, with 98 percent probability that it won’t happen in a given year. Are banks even that good?  How about 5 percent, with a 95 percent probability of not happening?  That would still feel pretty safe. One more dubious of the risk-management skills of bankers, regulators and the rest might guess the probability, in reality, is more like 10 percent, rather than 1 percent. Even then, in most years, nothing will happen.

How does our banker fare over 45 years with these alternate probabilities?  At a 2 percent chance per year, over 45 years, there is a 60 percent probability he will experience at least one crisis. At 5 percent, the probability becomes 90 percent of at least one crisis, with a 67 percent chance to see two or more. If it’s 10 percent, then over 45 years, the probability of experiencing at least one crisis is 99 percent, and the probability of experiencing at least two is 95 percent. Since we learn from troubles and failures, banking looks like it all but guarantees an educational career.
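These figures are simple binomial arithmetic. A minimal sketch, assuming (as the discussion above does) a 45-year career and an independent, constant crisis probability each year:

```python
from math import comb

def p_at_least(k, p, n=45):
    """Probability of at least k crisis-years in an n-year career,
    given an independent annual crisis probability p."""
    # Subtract the binomial probabilities of 0, 1, ..., k-1 crises.
    return 1 - sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k))

for p in (0.01, 0.02, 0.05, 0.10):
    # At least one crisis, and at least two, over a 45-year career.
    print(p, round(p_at_least(1, p), 2), round(p_at_least(2, p), 2))
```

Running this reproduces the numbers in the text: roughly 36 percent at 1 percent per year, 60 percent at 2 percent, 90 percent (with 67 percent for two or more) at 5 percent, and 99 percent (95 percent for two or more) at 10 percent.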

In the last 45 years, there have been financial crises in the 1970s, 1980s, 1990s and 2000s. In the 2010s, we have so far had a big sovereign default in Greece, set the record for a municipal insolvency with the City of Detroit, and then broke that record with the insolvency of Puerto Rico. And the decade is not over. All of these decades of crises have fallen within my own career in and around banking systems, now close to 48 often-eventful years. The first one—the Penn Central Railroad bankruptcy and the ensuing panic in the commercial paper market—occurred when I was a trainee.

Since 1982, on average, a little less than 1 percent of U.S. financial institutions failed per year, but in the aggregate, there were 3,464 failures. Failures are lumped together in crisis periods, while some periods are calm. There were zero failures in the years 2005-2006, just as the housing bubble was at its peak and the risks were at their maximum, and very few failures in 2003-2004, as the bubble dangerously inflated. Of course, every failure in any period was a crisis from the point of view of the careers of then-active managers and employees.

A further consideration is that the probability of a crisis does not stay the same over long periods—especially if there has not been a crisis for some time. As Guttentag and Herring pointed out, risks may come to be treated as if they were zero, which makes them increase a lot. The behavior induced by the years in which nothing happens makes the chance that something bad will happen go up. In a more complex calculation than ours, the probability of the event would rise over each period it doesn’t occur, thanks to human behavior.

But we don’t need that further complexity to see that, even with quite small and unchanging odds of crises, given enough time across a career, the probability that our banker will have one or more intense learning experiences is very high, just as Mr. Murphy suggests.

Image by Ionut Catalin Parvu

Rorke talks carbon tax on Infinite Earth podcast

R Street Senior Fellow Catrina Rorke spoke recently with Michael Green and Mike Hancox on the Infinite Earth podcast. The podcast focuses on the unlimited potential of human capital in solving pressing resource, social, and health challenges. They spoke about the recent carbon tax proposal advanced by the Climate Leadership Council, how conservatives approach the problem of climate change and what opportunities there are for action in the states and at the federal level. The full audio is embedded below.

‘Right to Know Act’ puts lawyers’ interests ahead of consumers


The Right to Know Act has a great ring to it. Even if you don’t care to read the substance of the legislation—sponsored by state Sen. Michael Hastings, D-Tinley Park—it’s good retail politics. It wouldn’t be as compelling if it were called the “Next Great App Killer Act” or the “Helping Lawyers Sue Tech Companies Act.” Unfortunately, the latter titles are probably closer to accurate.

The proposed legislation boldly declares that “all individuals have a right to privacy in information pertaining to them” that is “protected by the United States Constitution.” That’s absolutely true, if we’re talking about the Constitution’s Fourth Amendment limitations on what government may do with our private information. The Constitution’s privacy protections do not extend to private exchanges of information. In fact, those generally are protected as “speech” by the First Amendment.

When it comes to commercial consumer data, conceptions of data privacy continue to evolve, as they require balancing what consumers reasonably expect to be kept confidential with what consumers want in terms of convenience and performance. Currently, we have a patchwork of federal laws that cover the topic, including the Health Insurance Portability and Accountability Act, the Gramm-Leach-Bliley Act and the Children’s Online Privacy Protection Rule. A smattering of state-based consumer information laws form another layer of privacy rules.

Most of these laws outline rules for how to treat specific kinds of nonpublic information, such as health or credit information. They also require any number of consumer notifications when data security has been breached. When it comes to items like our financial transactions, phone records and health information disclosed to a physician, most of us have a reasonable expectation of privacy.

But that’s not necessarily the case when we use social media, search engines and GPS-based apps like Uber and Yelp. Unless you’ve been living under a rock, you probably recognize the reason many technology companies develop free applications isn’t out of a sense of philanthropy—it’s because the treasure trove of user data is so valuable. That’s a strong incentive for new market entrants to create the next generation of technologies and apps we want.

For example, I know my information is collected to improve how various app and website services perform for me. The trade-off is that companies I patronize have a massive amount of data about my habits, purchases and location. That information is extremely valuable to companies that want my business, politicians who want my vote and charities that want my support. But that’s the deal. When I breeze through those boilerplate legal declarations while installing the software, I’m essentially agreeing that my use of the app or website is worth the trade-off and the associated risks. If I don’t want to take the risk, I forfeit the convenience of the technology.

The problem with the Right to Know Act is that it puts the judgment of legislators and the financial interests of lawyers ahead of consumers. The proposed legislation modifies the boilerplate agreements you’re already ignoring to give Illinois consumers a right to request information about which categories of personal information are disclosed to which third parties. Most notably, it creates a right to sue for $10 in liquidated damages (where actual damages are less than that total), injunctive relief and, of course, attorneys’ fees.

Since most consumers would rather simply use the technology in question than file a lawsuit for $10, this legislation appears custom-built by enterprising politicians for attorneys interested in putting together class-action lawsuits. Nothing says “thank you” to the plaintiff’s bar for supporting your campaign quite like inventing a new cause of action.

In addition to empowering lawyers, the bill also burdens commercial websites and other online services with new compliance costs, in the form of information-disclosure, data-collection and data-retention requirements. This can mean the difference between a technology company adding a new feature to an app, or spending more on legal counsel. While politicians and lawyers line their pockets, consumers will bear the cost in the form of less effective apps.

Image by RRuntsch

Dieterle talks Virginia distilling regs on Freedom and Prosperity Radio

R Street Governance Policy Fellow C. Jarrett Dieterle joined host Joe Thomas on Virginia’s Freedom and Prosperity Radio — sponsored by the Virginia Public Policy Institute — to discuss his recent piece in the American Spectator about Virginia distilling regulations. The full audio is embedded below.

Pollock before Oversight Subcommittee

Here’s more from R Street Distinguished Senior Fellow Alex Pollock’s testimony before the House Committee on Financial Services on the arbitrary and inconsistent non-bank SIFI designation process.

Alaska’s ridesharing bill one step closer to becoming law


With the state Senate’s passage of S.B. 14, Alaska is one step closer to a legal framework for transportation network companies such as Uber and Lyft. The Senate on March 23 voted 14-5 to pass the “Let’s Ride Alaska Act,” which would make Alaska the 45th state to enact comprehensive ridesharing legislation.

Uber left “The Last Frontier” after about six months in 2014, due to a spat between the company and the state Department of Labor and Workforce Development over worker classification. The state was concerned with drivers classified as contractors instead of employees, meaning they would lack company-purchased workers’ compensation insurance.

This bill codifies the classification of drivers as independent contractors. Given the complete autonomy TNC drivers have regarding their work schedule, it is hard to justify mandating additional employee benefits that likely would increase costs to consumers and prevent TNCs from taking on additional drivers.

The bill also preempts localities from imposing further regulations on TNCs, joining a growing number of states that are choosing to preempt county and municipal restrictions. This drew the ire of the capital city of Juneau, which requested the preemption clause be removed because of its desire to apply sales tax ordinances and to “require transportation network drivers to register as a business with the municipality, in the same manner as other businesses.”

TNCs could be a big revenue stream for cities and another backdoor tax on consumers. But creating a patchwork of local regulations has proven to be a bad idea. For instance, overly restrictive regimes in Houston, San Antonio and Austin have undermined transportation options in Texas. While decentralizing governmental responsibility is normally sensible, city governments are the creation of states (unlike states’ relationship to the federal government), and a jumble of municipal ordinances inevitably creates unnecessary compliance costs in an arena that’s inherently inter-jurisdictional.

Alaska’s Senate bill also requires background checks for TNC drivers. In addition to the vetting done by passengers through the TNC apps’ rating systems, the TNCs have a strong incentive to protect their customers and already screen drivers who might be likely to commit crimes while on the job. Juneau’s city government wants the freedom to add to the background-check process, including the ability to mandate fingerprinting checks with the FBI’s NGI database. Juneau justifies its request on grounds that it is “a requirement currently enforced for taxi drivers doing business in Juneau.”

But we shouldn’t impose new regulations just to “level the playing field.” Indeed, mandating fingerprint background checks through the FBI raises some serious concerns. For instance, they pose a disproportionately unfair obstacle to labor participation by minority communities. The FBI database tracks arrests, not convictions. It was built to capture lots of data, but can be prone to false positives. Nearly half the FBI’s records fail to include information on the final disposition of a case, such as whether someone was acquitted or had charges dropped.

This bill may face an uphill battle, given the interest groups it may anger, but the House Labor Committee chairman, who killed a similar bill last year, already has intimated he will allow S.B. 14 to go to the floor for a vote. This would be another step in the right direction for all Alaskans.

Image by Rocky Grimes

What does the executive order mean for the climate? Not much


President Donald Trump’s long-awaited executive order on climate policy vastly remakes the executive branch approach to climate-change risk, adaptation and mitigation. Whereas the Obama administration was focused on reducing emissions and managing climate risks, the Trump administration is executing program changes that suggest such concerns were folly.

It’s a substantial policy overhaul, prompting the executive branch to reconsider seven regulations or guidance documents that pertain to reducing emissions from the power sector and fossil-fuel development or considering climate risk in federal policy. The order rolls back regulations on oil and gas; lifts the moratorium on federal coal leasing; reconsiders the much-maligned social cost of carbon estimate of climate damages; and requests that agencies abandon the practice of evaluating climate impacts when considering regulations, land-use decisions or new programs or projects.

But at its heart is remaking President Barack Obama’s regulations on coal facilities in the electric power sector. A major source of anguish in the coal community and among Trump’s supporters was a pair of Environmental Protection Agency regulations that would make it nearly impossible for the power sector to increase its reliance on coal.

  1. A “new source performance standard” required that no new coal facility be built without expensive and unproven carbon-capture-and-sequestration technology.
  2. Separately, the Clean Power Plan pushed states to cut greenhouse-gas emissions from the power sector by about a third below 2005 levels within 25 years.

Both regulations currently are working their way through the courts, though the Supreme Court issued an unprecedented stay of the Clean Power Plan to prevent it from going into effect. Trump’s order directs the EPA to rewrite both rules. This isn’t the outright elimination that many expected, and there’s a good reason why.

A 2007 Supreme Court decision directed the EPA to address greenhouse-gas emissions if those emissions proved to be a threat to public health and welfare. In 2009, the EPA finalized its “endangerment finding,” a regulatory declaration that emissions of greenhouse gases contribute to factors that threaten the public. Taken together, these mean the EPA must take steps to limit greenhouse-gas emissions. An order to rewrite Obama’s climate regulations puts the EPA on the slow and methodical road to issuing a new regulation; in the interim, coal has an opportunity to see if it can fight its way back into the power sector.

If greenhouse-gas policy is in flux, what does this executive order mean for the climate? Probably not much.

Government policy to limit greenhouse-gas emissions has, to date, been written as though the private sector were incapable of reducing emissions without bureaucratic intervention. The prior administration was certainly no exception, issuing rules to reduce climate emissions from the power sector, vehicles, industrial use and even household appliances. The regulatory onslaught suggested that reducing greenhouse-gas emissions was a pre-eminent government priority and that the market needed the influence of regulation to respond accordingly.

Such an assumption is patently wrong. What we’ve learned in the last decade from the fracking boom, precipitously dropping prices for wind and solar, increased urbanization, technology adoption, automation and a wide variety of additional trends is that the free market can reduce emissions even in the absence of government policy.

So before heeding the claims of the environmentalist left that this executive order and the president’s policies more generally will set the United States on an unacceptable carbon-emissions trajectory, be mindful that the private sector is hard at work delivering incidental carbon-emissions reductions that meet or exceed the goals of Obama-era policies.

This latest executive order, coupled with substantial cuts to some of our core environmental and risk-management programs in the president’s budget, suggests that the administration’s eagerness to remake federal greenhouse-gas policy may yet prove to be shortsighted. While there is considerable and justifiable debate over whether to reduce greenhouse-gas emissions through federal policy, a clear-eyed assessment of climate risk, like any other risk, remains an indispensable fiduciary responsibility of the federal government.

As the White House rethinks carbon and climate policy, the pressure falls to Congress to serve as ballast. Conservatives may be deeply suspicious of government-expanding approaches to limit greenhouse-gas emissions, but they are equally suspicious of unwise, uninformed government spending. There is ample opportunity to liberate the private sector from regulatory overkill, while providing federal agencies the information they need to handle climate risks appropriately.

Just last week, 17 Republicans released a resolution to use “our tradition of American ingenuity, innovation, and exceptionalism” to address climate change. With bicameral majorities, perhaps conservatives will find their voice and offer an authentic, principled counterpoint to tackle that challenge.

Image by Evan El-Amin

Let’s eliminate the SEC’s Investment Advisory Committee


Looking at the Securities and Exchange Commission’s Investment Advisory Committee as a proxy for the relative influence of the shareholder empowerment movement gives one the distinct impression that the SEC is, at the least, unduly influenced, if not fully captured.

Shareholder empowerment advocates—primarily, but not exclusively, those who represent the interests of public pension funds and union-related funds—call for shifting corporate decision-making authority toward shareholders and away from boards of directors and executive management. The effect on corporate governance is to allow uninformed shareholders an ever-increasing power to interfere with the decision-making of the most informed loci of corporate authority. The end results of this approach are suboptimal board and executive decision-making, fewer successful companies willing to become or remain publicly traded and constraints on society’s ability to create economic wealth.

Congress created the IAC under Section 911 of the Dodd-Frank Act, with the drafted purpose to:

advise and consult with the Commission [SEC] on … regulatory priorities of the Commission; … issues relating to the regulation of securities products, trading strategies, and fee structures, and the effectiveness of disclosure; … initiatives to protect investor interest; and … initiatives to promote investor confidence and the integrity of the securities marketplace; and … submit to the Commission such findings and recommendations as the Committee determines are appropriate, including recommendations for proposed legislative changes.

This sounds innocuous enough. Moreover, it appears Section 911’s authors expected membership to be broadly based and to represent a variety of interests:

The members of the Committee shall be … the Investor Advocate [who heads the Office of the Investor Advocate, a new office established by Section 915 of the Dodd-Frank Act]; … a representative of State securities commissions; … a representative of the interests of senior citizens; and … not fewer than 10, and not more than 20, members appointed by the Commission, from among individuals who … represent the interests of individual equity and debt investors, including investors in mutual funds; … represent the interests of institutional investors, including the interests of pension funds and registered investment companies; … are knowledgeable about investment issues and decisions; and … have reputations of integrity.

Nevertheless, like a lot of legislation, Section 911 has had unintended consequences. The IAC’s membership has been dominated by shareholder empowerment advocates. I estimate at least seven of the 18 members can be considered movement supporters, including Chairman Kurt Schacht of the CFA Institute and Vice Chairman Anne Sheehan of the California State Teachers’ Retirement System, who presumably set the committee’s agenda.

Given this bias, it should not come as a surprise that one item on the IAC’s meeting agenda earlier this month was a discussion of Snap Inc.’s recent initial public offering with a dual-class share structure that did not offer voting rights to purchasers. The shareholder empowerment movement’s abhorrence of dual-class share structures—based solely on the fact that they reduce or eliminate the voting power of the typical stockholder—is nonsensical. This structure has been used by some of the most successful companies in the world—including Alphabet (Google), Berkshire Hathaway, Alibaba Group, Facebook, Under Armour and LinkedIn—to create enormous wealth for their stockholders.

Moreover, the Snap IPO was hugely successful. Snap priced its offering at $17 per share, giving it a market valuation of roughly $24 billion. The book was more than 10 times oversubscribed and Snap could have priced the IPO at up to $19 per share. So whose interest are shareholder empowerment advocates trying to protect when they attack dual-class share structures?  It appears the interest of most concern to the movement is its own, as the more dual-class share structures there are, the less power the movement has.

Shareholder empowerment advocates do not need any extra help from the Dodd-Frank Act to have a major impact on corporate governance. As Congress and the White House continue to review sections of the Dodd-Frank Act to be amended or repealed altogether, Section 911 should be included on that list.

Image by g0d4ather

Novato’s tobacco ordinance is discriminatory


The City of Novato’s recently enacted anti-tobacco ordinance—which will go into effect in 2018, but is subject to reconsideration next week—would give landlords a tool with which to discriminate against renters. The measure discriminates against poor people, too.

The law bans smoking in apartments and condominiums, and also bans the use of electronic cigarettes and smokeless tobacco. Other communities have passed similar bans, but Novato’s ban goes much further. Use of any tobacco product—or even ownership of an ashtray—would be considered a material breach of a lease, thus giving landlords wide latitude to evict tenants.

The city exempts single-family homeowners from the law, which means the brunt of the measure will fall on less-affluent residents. It will also fall on those who attempt to quit their dangerous cigarette-smoking habit, given that the law treats less-dangerous vaping and smokeless tobacco products the same as cigarettes.

The law also encourages snitches, by making it illegal to abet or conceal a violation of the law. It encourages litigation, in that it gives absolutely anyone standing to sue and seek damages against those who vape in, say, an apartment’s common area.

This law is a godsend for landlords. They receive broad eviction authority if they catch residents or even their guests with any type of tobacco product. It’s a creepy and discriminatory measure and its worst sections should be repealed.

Hobson talks autonomous vehicles and organ donation on Polish TV

R Street Tech Policy Fellow Anne Hobson recently gave an interview to TVN, a private television network in Poland, appearing on the morning show “Dzien dobry TVN” to discuss the recent Slate piece she wrote with Senior Fellow Ian Adams on how the rise of autonomous vehicles could affect the market for organ donation.

According to the National Highway Traffic Safety Administration, autonomous vehicles could prevent many of the 94 percent of road deaths attributable to human error. However, reducing vehicle accidents might reduce organ donations by up to 20 percent, exacerbating existing organ shortages. As the incidence of life-threatening chronic diseases such as kidney and liver disease increases, we need to consider promoting the market for organ donation by introducing presumed-consent rules or legalizing incentive packages that can include things like fixed payments, health insurance and paid donor leave.

Video of the appearance (overdubbed with simultaneous Polish translation) can be found at the link below.



Senate to FCC: Privacy regulation is not your job


The U.S. Senate narrowly passed a resolution Thursday that would halt Federal Communications Commission efforts to create rules for the way internet service providers use their customers’ personal information.

While some critics characterized the vote as anti-consumer, the FCC’s privacy rules—drafted under former Chairman Thomas Wheeler—represented agency overreach and would have created unnecessary overlap and confusion with general privacy rules established and enforced by the Federal Trade Commission.

The Senate’s move is in line with current Chairman Ajit Pai’s decision to delay enactment of the Obama administration’s FCC privacy rules and instead work with the FTC to seek a “comprehensive and uniform regulatory framework.”

The internet’s impact on consumer privacy is a significant issue. The combination of rapid processing speeds, networked databases and universal connectivity makes it easier than ever to search and analyze personal information and transform it into a marketable commodity. We may debate the degree to which all of this should be regulated, but those who understand the Constitution know it is Congress’ prerogative to authorize which federal agency should perform which regulatory task.

As much as progressives might wish it, government regulatory agencies cannot simply do what they want. Just because the FCC regulates wide portions of ISP business doesn’t mean it may regulate every portion of ISP business. It is not the FCC’s job to write rules on privacy regulation. That role, by law, falls to the FTC.

Put another way, it would be as if the Securities and Exchange Commission were to decide to write its own rules for disability access at Goldman Sachs’ headquarters, on grounds that it has regulatory authority over securities broker-dealers. Congress authorized the Justice Department and the Equal Employment Opportunity Commission to enforce the Americans with Disabilities Act, and would be right to curtail another agency’s attempt to usurp that authority. It would be misleading to say such an action is insensitive to the disabled—the ADA rules are still in force. It’s just not the SEC’s job to write new ones.

From a practical standpoint, the FCC regulates only one group of companies in the internet ecosystem—service providers like AT&T, Verizon, Comcast and T-Mobile. Even if its privacy rules were sterling examples of regulatory discretion, they would not apply to other internet companies that collect as much or more consumer information. That would create an imbalance by saddling different companies within the same industry with different rules.

One set of regs, from one agency, for all players, is the correct way to allow these companies to work within the ecosystem while protecting consumers.

Image by Mark Van Scyoc

Building infrastructure by walking across ideological lines


Talk of infrastructure is in the air, as President Donald Trump and his advisors call for $1 trillion in investment and Democrats propose their own wish lists. Is there an artful deal to be struck? The R Street Institute hosted a recent conversation (which I moderated) on bipartisan solutions to get the economy moving and let those who benefit fund transportation.

The panel featured:

  • Christopher Leinberger, the Charles Bendit Distinguished Scholar and Research Professor and chair of the Center for Real Estate & Urban Analysis at the George Washington University School of Business;
  • Robert Puentes, president and CEO of the Eno Center for Transportation;
  • Christopher Coes, vice president of real estate policy and external affairs at Smart Growth America; and
  • Salim Furth, research fellow in macroeconomics at the Heritage Foundation.

Video of the full panel is embedded below.

Tito’s Vodka isn’t the only good distilled spirit out of Texas


When you think of Texas alcohol, you usually think of good ole Shiner beer. The state is more famous for its beer than anything else. However, some new spirits have come out of the Lone Star State that might change that.

Texas is quickly becoming one of the leaders in the craft distilling industry, which has exploded in recent years, with at least 1,315 craft distillers now operating in the United States. In 2010, craft distillers made up only 0.8 percent of market share. By 2015, their share had nearly tripled to 2.2 percent.

Of course, the most famous of these new Texas distillers also happens to be the oldest, Tito’s Vodka out of Austin. Tito’s has been around long enough to be sued over its labeling. The company became famous after it won the double gold medal for best vodka at the 2001 San Francisco World Spirits Competition. It has since gone nationwide and, perhaps most notably, is served by several airlines on their flights.

But vodka is not the only distilled spirit to come out of Texas. Texas whiskey also has made big strides over the past few years. The Lone Star State is embracing its distillers, with Houston even hosting a whiskey festival. While Texas distillers do not yet have the reach or the reputation of Kentucky or Tennessee, the product certainly stands right there with the more well-known distillers.

One of the best known whiskies out of Texas is Rebecca Creek, based in San Antonio. Unlike many other Texas distillers, they have had success selling their whiskey outside Texas. The distillery, which also produces vodka, was founded in 2009 by Steve Ison. The whiskey is cut to bottle proof by using purified water and has performed well in competitions.

Another Texas distiller is Trinity River Distillery in Fort Worth, a veteran-owned company established in 2011. Trinity River produces the Texas Silver Star line of spirits, which includes a bourbon, a honey whiskey and a vodka. Their whiskey and vodka are cut using 100 percent rainwater, which they claim provides an already purified water source. The rainwater is collected through a drainage system on the roof of the building and is stored at the distillery.

Finally, Houston is represented by Whitmeyer’s Distilling Co., founded in 2012 by a couple of veterans. They make an assortment of whiskies, vodkas and even a gin. Their distillery is a small operation, located in a warehouse, and uses purified water because it helps the products drink smoother.

Most Texas distilleries offer tours via Groupon to bring in locals and tourists alike to see their facilities and enjoy samples of their drinks. It’s a way both to market themselves as tourist attractions and to spread the word about their products, a low-cost form of advertising that gives the distilleries multiple income streams.

I have toured the Trinity River and Whitmeyer distilleries. The Texas Silver Star bourbon was a very smooth whiskey, one of the best I have ever tasted. I even brought a bottle back home to Louisiana.

The Whitmeyer distillery offers a wide range of products. The expensive whiskies were very good and the basic Texas whiskey offers a good selection in the $20-$30 range. It, too, holds its own against similar competitors.

One of the problems the Texas distillers have is the lack of national reach. Texas itself is a big market, but they cannot secure as many markets outside the state. The only way they will show up on the national radar is by winning plaudits from whiskey connoisseurs both in the United States and around the world.

Texas craft whiskey is a product with enormous potential. It needs to be embraced.

Image by  lev radin

Shifting drinks law landscape in Pennsylvania is a maze for businesses


The alcohol-policy landscape in Pennsylvania was static—almost glacial—for most of the past 40 years. However, these days, the glaciers are melting and the landscape is changing rapidly. New laws and interpretations are coming every few months.

The evolution in our drinks law is welcome and overdue. But in the short term, it has meant chaos for small businesses struggling to keep up and left some consumers confused about what the changes actually mean.

For decades, Pennsylvania’s unique mess of odd options embodied in its archaic post-Repeal liquor code was tweaked only incrementally. Gradually, we got shelves in our state monopoly wine and liquor stores (before, we had to order at a counter, by catalog number); we were allowed to pay for beer with a credit card; and a fraction of the state stores opened for a few hours on Sundays. These were all tiny changes.

The beverage alcohol retailer/wholesaler situation had been pretty quiet, as well. The state maintained its monopoly on wine and liquor wholesale operations and off-premise retail sales. Beer was supplied by privately owned wholesalers and sold by the case and keg only at retail stores (“beer distributors”) and in 12-bottle maximum purchases at bars and restaurants. Producers were allowed to sell directly. It was confusing, but we were used to it, and family businesses planned major investments based on this system.

Much has changed in the past year, as we’ve entered an activist phase in liquor law. Beer distributors first were granted the ability to sell 12-packs as well as cases. Nine months later, the policy was changed to no minimum sale. It’s also no longer always illegal to sell beer at gas stations, so long as the station also has a restaurant license and a 30-seat “café.” License holders may also sell up to four bottles of wine to go, as long as the price is no lower than that charged by the state’s monopoly stores. Grocery stores are buying licenses (and adding seating) to take advantage of this change.

We’re pathetically overjoyed by this. Almost everyone who talks about it says: “We can buy beer and wine at supermarkets and gas stations now!” This isn’t exactly true: fewer than 300 of the state’s thousands of gas stations and supermarkets have licenses. With licenses going for as much as $560,000 (yes, just the license), it’s clear that this is not a business opportunity for Mom & Pop’s Corner Market.

A tiny number of new licenses are being created in the tax-abatement Keystone Opportunity Zones, but nowhere near enough to meet this new demand. The price of transferable licenses is spiraling upward, and will soon only be affordable to chain stores and restaurants.

The solution could be a new license for off-premise sales: plentiful, affordable and nontransferable. But every time a grocery or convenience store buys a tavern license for more than $200,000, that solution becomes less attractive to legislators, who are not moving to do anything to fix this situation.

Another recent change allowed Pennsylvania wineries, breweries, distilleries, cideries, and meaderies to sell each other’s products, and also allowed each to have up to two off-premises stores. It was quickly realized that this created a loophole that potentially allowed a privately owned Pennsylvania-only full-service booze store. One has already opened in Pittsburgh, the first private liquor store in the state since Prohibition.

To be fair, the Legislature is moving in the right direction. State House Speaker Mike Turzai, R-Marshall, has promised a bill to privatize the state’s wholesale booze business this session. State Sen. Randy Vulakovich, R-Shaler, has reintroduced a bill that would add spirits to the wine and beer sales allowed to licensees (with the same four-bottle limit as wine), and news outlets are already anticipating its passage. Gov. Tom Wolf and the Democrats in the Legislature seem willing to compromise on more booze freedom, so long as it doesn’t mean outright privatization of the state monopoly stores (and their unionized workforce). The forecast is wet.

Meanwhile, the state’s longtime private beer retailers are being shut out. I recently talked to one who was furious about the lobbying performance of his industry’s trade group, the Pennsylvania Malt Beverage Distributors Association (MBDA). “They could have made a deal,” he spat out, “they could have gotten us wine sales. But there’s only one word in their vocabulary: ‘No!’”

The MBDA for years stonewalled on changes to the liquor laws, rightly seeing the state’s monopoly as crippling their wine and liquor competition. They fought mightily to keep beer sales out of groceries and gas stations. But when things started to change, it would have been smart to cut a deal. Now they’re looking at a shrinking business: “the volume-value end” in cases and kegs, as my source put it.

When things move this quickly, a business has to be nimble and knowledgeable in the ways of lobbying and legal interpretation. That’s practice Pennsylvania businesses simply haven’t had, and it represents skills they’re going to have to acquire as the liquor landscape continues to slip, slide and slowly advance toward privatization.

Image by ra2studio

Guest blogger Lew Bryson is the author of “Tasting Whiskey” (2014) and books on the breweries of Pennsylvania, New York, Virginia, Maryland and Delaware. 

SXSW Roundup: What will AR/VR revolutionize next?

The experiential factor in augmented and virtual reality leads to a more vivid sense of presence and immersion, when compared to other media like television or radio. This makes AR and VR powerful platforms for social engagement, education and adventure.

As part of Innovation Policy Day at SXSW, I took part in a panel discussion led by the Consumer Technology Association’s Michael Hayes on what AR and VR will revolutionize next. Joining me were Tim Hwang, Google’s public policy counsel, and James Hairston, head of public policy for Oculus. You can see the full video below:

Imagining what AR and VR will revolutionize is no small task. Entrepreneurs are trying to make the next “killer app,” which may be as simple as viewing your two-dimensional computer screen in virtual reality or as complicated as exploring a vast, immersive, open virtual world reminiscent of The Elder Scrolls V: Skyrim. Augmented and virtual reality can aid in job training by simulating flight or manufacturing processes. VR can help paraplegics learn to walk again by retraining the brain to recognize limbs.

AR and VR will also change how we will communicate. While video games often are viewed as antisocial, in VR, this does not have to be the case. Users can interact with artificially intelligent non-player characters or hang out with friends in virtual spaces. By practicing in VR, people can overcome social anxiety in public speaking or other social experiences.

Imagining these good applications for VR and AR is best left to entrepreneurs. According to economist Israel Kirzner, entrepreneurs rely on local knowledge—their own relevant experiences—to envision opportunities for profit. For example, a local Austin resident suggested that an AR headset that could display true north would be helpful to individuals who set up antennae for telecommunications infrastructure. The role of policymakers should be to let these entrepreneurs experiment.

SXSW Roundup: International cooperation in cybersecurity

At the recent South by Southwest festival in Austin, Texas, I took part in a panel discussion hosted by the European Union that explored opportunities for international cooperation on cybersecurity policy.

Joining me on the panel were Chris Painter, coordinator for cyber issues for the U.S. State Department; Rafael Laguna, CEO of Open-Xchange; Michael Farrell, editor of CSM Passcode; and Andrea Glorioso, counselor for digital economy for the EU delegation. You can see the full video embedded below:

Some takeaways I had from the discussion:

  • The internet of things is a complex, global system. There is no silver bullet solution or simple regulatory fix. Instead, industry, governments, consumers and third-party stakeholders will have to work together on a variety of efforts to improve data-security and privacy outcomes.
  • Threat information-sharing efforts, device cybersecurity certification programs, after-market smart products, consumer awareness initiatives and efforts to improve cyber insurance adoption are all pieces of a broad strategy to mitigate cyber risk in the internet of things.
  • Artificial intelligence empowers consumers and firms to detect and mitigate cyber threats. At SXSW, IBM demonstrated its “cognitive security” program, which leverages the Watson machine-learning system to analyze unstructured data in ways that could help businesses identify threats. On the consumer end, smart routers and firewalls can monitor traffic patterns and metadata to detect when your home’s connected devices are compromised.
  • There is a role for government to encourage solutions that foster a more secure internet of things. For example, the U.S. Commerce Department’s National Institute of Standards and Technology’s industry-led voluntary cybersecurity framework creates a common language for government to engage with stakeholders. In a recent green paper, the department outlined its role in promoting an open global environment for internet-of-things development. R Street filed comments supporting a light-touch regulatory approach and advocates for continued engagement with stakeholders on cybersecurity issues domestically and internationally.

For more ideas about addressing IoT cyber vulnerabilities, see our recent paper, “Aligning Cybersecurity Incentives in an Interconnected World,” which examines the role of government in fostering market-based solutions to device insecurity.

R Street’s criminal justice project goes to SXSW


R Street was busy at the recent South by Southwest festival in Austin, Texas, including co-hosting a series of discussions on criminal justice reform with the Texas Public Policy Foundation, along with support from the Charles Koch Institute and the Coalition for Public Safety.

Our first panel, “The New Wave of Justice Innovators,” focused on how emerging technologies can help solve some longstanding criminal justice issues. It was moderated by Jasmine Heiss, director of coalitions and outreach at the Coalition for Public Safety, and also featured Jon Tippens of Expunge.us, Jordan Richardson of the Charles Koch Institute, Lauren Krisai of the Reason Foundation, Rick Lane of Verie and Derek Cohen, deputy director of Right on Crime and the Center for Effective Justice at the Texas Public Policy Foundation.

We also screened filmmaker Ondi Timoner’s documentary “The Last Mile: Inside San Quentin’s Tech Incubator.” After screening the film, we hosted a discussion with Natrina Gandana, program manager at The Last Mile Project, and Tulio Cardozo, technical manager for the Last Mile Project.

Our second panel focused on background-check policies and how these approaches have an adverse effect on the economic prospects of the most vulnerable.

Moderated by Greg Glod, manager of state initiatives for Right on Crime, the panel featured R Street Criminal Justice Policy Director Arthur Rizer; Texas state Sen. Konni Burton, R-Fort Worth; Malcolm Glenn, public policy manager at Uber; Teresa Hodge, co-founder of Mission: Launch Inc.; and Bill Cobb, deputy director of the ACLU’s Campaign for Smart Justice.

We wrapped up the day’s programming with a presentation by Marcus Bullock, the founder and CEO of FlikShop. His firm offers technology to help inmates stay in contact with their families – particularly important given that those connections help inmates with reentry when they are released.

Whither the conference committee?


The following post was co-authored by Adam Chan, a former Institute of Politics summer research assistant at the R Street Institute.

Hong Min Park, Steven S. Smith and Ryan J. Vander Wielen recently presented a monograph, “The Changing Politics of Conciliation: How Institutional Reforms and Partisanship Have Killed the Conference Committee,” detailing the “near evaporation” of conference committees in House-Senate conciliation processes. Given the constitutional necessity of conciliation and its significant impact on policy outcomes, this paper is crucial to understanding recent congressional dysfunction.

The Constitution’s Bicameralism and Presentment Clauses require both houses of Congress to pass identical versions of legislative bills. The Constitution is silent, however, about how Congress is to go about reconciling differences that exist when the chambers pass different versions of a bill. This process, known as conciliation, can be accomplished several ways. For example, one chamber can simply approve a bill that was initially passed by the other chamber, or the chambers can continue to exchange amendments back and forth until both finally pass an identical bill (a process known as “ping-ponging”).

But traditionally—at least for more complex pieces of legislation where the potential differences between the chamber versions of a bill are numerous—Congress has used conference committees made up of delegates from each chamber to hammer out any differences. A conference report is then agreed upon and returned to the chambers to be passed on a simple up-or-down vote. This longstanding conference process, as the monograph’s authors lay out in detail, is now broken. Fewer and fewer conference committees are convened to resolve differences between the houses of Congress, and the authors set out to study why this decline has occurred, as well as its ramifications for future policymaking.

Essentially, the monograph has four components:

  1. A description of conciliation when conference committees predominated, before the 1970s;
  2. The waves of change since the 1970s that caused the decline in conference committees;
  3. Current methods of interchamber conciliation;
  4. The effects these changes have had on policy outcomes.

The traditional use of conference committees, pre-1970

The practice of using conference committees traces its roots to English parliamentary practice as early as the 14th century. The use of conference committees hopped the Atlantic and found a home in numerous colonial legislatures and our country’s first Congress. The practice continued well into the mid-20th century, when the conciliation process was still dominated by conference committees.

As has recently been shown by Jeffrey A. Jenkins and Charles Stewart III, this time period featured a decentralized system of independent and influential committees managing Congress. Conciliation was conducted by House and Senate delegations and controlled by relevant committee chairmen (chosen by seniority), who produced a “conference report” for full chamber votes, with little party influence. Starting in the 1970s, however, this state of affairs began to fall apart.

The decline of conference committees: two waves of institutional change

Changes in conciliation since the 1970s occurred amid rapidly increasing polarization, which led to greater differences in potential policy outcomes and more intense party competition. This, in turn, rendered congressional control increasingly uncertain. As a result, party leaders became more attuned to—and involved in—the process by which legislative differences between chambers would be resolved. A series of changes that sought to de-emphasize and reduce the power of congressional committees had the effect of reducing the role of conference committees.

As the authors note, the conciliation process was fundamentally altered by two waves of institutional change. The first, initiated by Democrats in the 1970s, came about as a result of the effort to “bring committees and conferences in line with the preferences of most majority party Democrats by expanding conference delegations.” Specifically, because of longstanding seniority rules, conservative Southern Democrats at the time disproportionately dominated committee chairmanships, allowing them to water down or stymie the civil rights agenda of the more activist wing of the Democratic Party. Seeking to reduce this power, the Democratic-controlled legislature at the time passed a series of measures that eliminated the seniority monopoly on chairmanships, asserted greater party control over committees and conferences, and gave rank-and-file members increased oversight over conference committee reports.

The second wave, “initiated by House Republicans after the 1994 elections, was about partisan efficiency and control.” This wave centralized power in party leadership, rather than committees; cut member and committee staff (while increasing leadership staff); and increased the Senate majority leader’s power over the amendment process. Among other effects, these changes increased party leadership’s influence and control over the conciliation and conference process, since conferees were increasingly expected to “toe the party line.” This more rigidly hierarchical structure eventually decreased the use of conference committees altogether in favor of other methods of conciliation.

Current preferred methods of conciliation

The result of the aforementioned institutional changes in committee power had direct ramifications for the conciliation process. Instead of conference committees, party leadership often chooses to engage in high-level, closed-door negotiations to resolve interchamber differences. This ad hoc, secretive process has only gained in popularity in recent years.

Because of their obvious partisan implications, fiscal issues were the first to be consumed by party leadership. In more recent times, party leadership has taken the lead in conciliations involving even supposedly nonpartisan issues, like defense authorization and farm bills. As the authors point out, the apotheosis of this trend away from conference committees to more ad hoc methods for resolving differences between House and Senate versions of bills was seen in the initial passage of the Affordable Care Act. In order to both avoid a Republican filibuster and to reconcile the differences between the House and Senate versions of the bill, Democratic leadership used a series of complex legislative maneuvers to gain passage.

Perhaps unsurprisingly, the move toward greater party control of conciliation has been bipartisan: “[B]y the time the Republicans had assumed majorities in both chambers following the 2014 elections, the parties had taken over post-passage action from committees on most major legislation,” according to the authors.

How changes in conciliation have affected policy outcomes      

These changes to conciliation procedure have affected policy outcomes in numerous important ways. First, because conference committees have at least some minority input, they typically result in outcomes that are closer to the median congressman than the more overtly partisan outcomes of leadership negotiation. Thus, the move toward leadership-dominated conciliation has in turn led to more sharply partisan legislation.

Second, these changes shifted “primary responsibility [for conciliation] from legislators with policy expertise to legislators with political expertise,” which can work to reduce general policy expertise among members of Congress. By and large, party leadership is predominantly focused on electoral outcomes, rather than on the minutiae of particular policy issues. This can eventually create perverse incentives for rank-and-file legislators as, over time, “the focus on party-based policymaking and a lack of reliance on committees to write legislation may reduce the incentive for legislators to develop genuine policy expertise.”

Finally, these trends have implications for legislative transparency, as well. In contrast to the traditional conference committee process—which was public and included joint explanatory statements detailing the key results of the conference negotiations—party leadership now mostly relies on ad hoc closed-door bargaining sessions to reconcile differences in legislation. This reduces transparency and provides the public with less insight and guidance regarding agreed-upon compromises.

In their extensive monograph, Park, Smith and Vander Wielen provide important context and history concerning the evolution (and decline) of conference committees. Their analysis is a welcome addition in the effort to understand current congressional dysfunction and its potential impact on policymaking.

Image by holbox

Cameron Smith talks prison reform on Birmingham Talk 99.5

Filling in as host on the Andrea Lindenberg Show, R Street State Programs Director Cameron Smith discussed Rachel Maddow’s “big reveal” on Donald Trump’s taxes, as well as some of the prison reform topics he raised in this column. Full audio of the show is embedded below.

Florida lawmakers weigh streamlining short-term-rental rules statewide


New legislation introduced in the Florida Legislature would establish a framework that does away with the hodgepodge of regulations governing vacation rentals across the state.

The most publicized aspect of the bills is the impact they promise to have on short-term-rental companies like Airbnb and HomeAway, which have faced inconsistency and, at times, outright hostility in their attempts to operate in the Sunshine State. Consider the data compiled in R Street’s Roomscore report, which graded the top 59 American cities on how friendly their laws are toward short-term rentals. While Orlando and Fort Lauderdale received mostly passable grades (B and B-, respectively), Miami received a D+ and Jacksonville received an F.

Increasingly, the policy battle over short-term rentals has mirrored that faced by ridesharing companies like Uber and Lyft, whose rapid rise in popularity saw an unprepared cab industry seek to stifle their new competitors through a glut of regulatory attacks, often dishonestly couched in a supposed concern for public safety. The concerns voiced by representatives of the hotel industry often ring similarly hollow.

In a Miami Herald piece outlining the protracted fight by the hospitality industry to increase regulatory burdens on short-term-rental companies, Wendy Kallergis, CEO of the Greater Miami & the Beaches Hotel Association, said:

We want to make sure the guests are entitled to the same safeguards as our hotel guests and that the properties are registered as a business, fully insured, regulated to basic health, safety and cleanliness guidelines, ADA guidelines, and that they are appropriately zoned and that all their taxes and fees are paid in full.

It would, perhaps, be impractical to expect the hotel industry simply to come out and say, “we do not particularly like that a more modern competitor has become very popular and is potentially eating into our profits.” But it nonetheless stretches the bounds of credibility to believe that the industry’s motives are quite so pure – that it simply wants anyone who might ever decide to spend the night away from home, whether in a hotel or otherwise, to do so with a guaranteed baseline of luxury.

Other complaints from the hotel industry have focused on a purported affordable housing crisis, as Airbnb continues to expand; suggestions that allowing spacesharing in private residences will lead to neighborhoods overrun with deviant behavior and noise complaints; and charges that Airbnb is simply being used as a loophole by commercial rental companies, rather than individuals.

However, data released by Airbnb regarding its experience in San Francisco—an ideal market for testing claims about housing scarcity—found that 0.09 percent of rentals in the city were booked frequently enough to compete with a long-term rental, while from 2005 to 2013, “the number of vacant units in San Francisco has remained essentially unchanged.” Concerns about an Airbnb-created housing crisis appear to be unsubstantiated.

And municipal solutions to rowdy behavior and noise complaints already exist and are best dealt with broadly, through the same avenues one would use to deal with a noisy neighbor in a long-term rental, or a noisy group at a hotel.

Finally, the claims about Airbnb being a covert outlet for commercial undertakings are not borne out by publicly available Airbnb data. According to data from the aforementioned Miami Herald piece, only 1 percent of Airbnb listings in Miami were booked for more than 300 days out of the year. Meanwhile, the average host made $6,400 per year, averaging 42 booked nights in 2015. Hardly the numbers one would expect of slumlord tycoons.
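Those Miami figures also hold up to a quick back-of-the-envelope check – a minimal sketch, where the $6,400 and 42-night figures come from the Herald data cited above and the implied nightly rate is our own derived arithmetic, not a reported number:

```python
# Back-of-the-envelope check of the Miami Airbnb figures cited above.
# The $6,400 average annual income and 42 booked nights per host are from
# the Miami Herald data; the implied nightly rate is derived here.
avg_annual_income = 6400   # dollars per host, 2015
avg_booked_nights = 42     # nights booked per host, 2015

implied_nightly_rate = avg_annual_income / avg_booked_nights
print(f"Implied average nightly rate: ${implied_nightly_rate:.2f}")
```

An implied rate of roughly $150 a night is consistent with occasional home-sharing, not a commercial operation.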

However obvious the hotel industry’s true root motivation may be, even their concerns about potential lost profits may be overblown. R Street’s Andrew Moylan, author of Roomscore, notes in the study:

The American Hotel and Lodging Association reported that revenue grew from $163 billion in 2014 to $176 billion in 2015. A Morgan Stanley equity analyst report projected increases in hotel-occupancy rates from an already strong 65 percent in 2014 to more than 69 percent in 2017. The number of hotels and number of rooms both expanded, as well.

He concludes, “for all the signs pointing to short-term rentals taking a growing share of the lodging pie, there’s substantial evidence that they simultaneously are serving to expand the size of that pie.”

AirDNA, an online tool that tracks Airbnb data, currently lists nearly 66,000 active listings in Florida. Last year, data from the state indicated that more than 750,000 guests had used the service to stay in Florida, a growth of 149 percent over the previous year.
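To put that growth rate in perspective, here is a quick sketch of the implied prior-year guest count – the 750,000-guest and 149 percent figures are from the state data cited above, while the prior-year estimate is our own derived arithmetic:

```python
# Rough arithmetic on the Florida Airbnb guest figures cited above. The
# 750,000 guests and 149 percent year-over-year growth come from the state
# data; the implied prior-year count is derived here.
guests_latest = 750_000
growth_rate = 1.49  # 149 percent growth over the previous year

guests_prior_year = guests_latest / (1 + growth_rate)
print(f"Implied prior-year guests: about {guests_prior_year:,.0f}")
```

In other words, Florida went from roughly 300,000 Airbnb guests to three-quarters of a million in a single year.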

The Legislature has before it an opportunity to score a victory for the state – a victory for consumer choice, for the private property rights of Floridians who share their homes to supplement their incomes and for the industries that benefit from the tourism surge that affordable rental options help enable.

Image by Fotoluminate LLC

Build infrastructure like a perennial football contender


The American Society of Civil Engineers’ latest report card gave U.S. infrastructure a grade of D+. The report deems that rating “at risk” – just one step above failing and unfit for purpose.

But readers shouldn’t see this poor grade as a justification for hasty public spending. It’s time for thoughtful evaluation of existing infrastructure programs and reforms that make infrastructure investments cost-efficient, while respecting the consequences of increased public spending in a constrained fiscal environment.

Shiny, modern public infrastructure makes for great optics but often-lousy investment returns. When considering infrastructure spending, it’s important to look at things from an economic as well as engineering perspective. Engineers evaluate infrastructure by functional performance, whereas economists focus on maximizing the returns of scarce resources. This can result in different conclusions. For example, an economist may be content with an engineering infrastructure rating of C if it’s not worth the cost to upgrade to a B. If B is the desired policy objective, the economist would stress finding the most cost-effective means to get there, rather than just throwing money at the problem.

Football management offers a timely analogy, given the start of NFL free agency and the forthcoming draft. Armchair general managers often “know” that their team must make the splashiest signings to fill positions of need. Then there are the prudent GMs (who are much better paid!) who seek to maximize returns with finite resources. Commonly, this translates into the pursuit of C and B caliber players for contracts that cost a fraction of those extended to A players. That’s why GMs who sign quality players for reasonable contracts (bargain deals), build robust rosters through the draft (less expensive personnel) and draft the best player available (versus sacrificing quality to meet a short-term need) have sustained success.

As with athletes, proponents of new public infrastructure investment often use age as a proxy for physical condition. Yet a fit 32-year-old athlete may perform better than an injury-riddled 27-year-old. The same applies in the electric industry. For example, coal and nuclear plants built a half-century ago commonly had an engineering life of 40 years. As they reached 40 years, cost-effective improvements to extend their life were often more economical than investing in new power plants. On the flip side, many power plants recently became unprofitable after natural-gas prices plummeted. Bailout proponents have claimed shutting these units down before their engineering life is through would be premature. Economics tells us there’s nothing premature about closing an unprofitable facility when a less expensive and/or more profitable one can take its place.

The age argument has also led to claims that our electric transmission and distribution infrastructure is crumbling. Such statements are often exaggerated (e.g., age overstates performance risk) or misapplied (i.e., failing to note the effectiveness of processes to replace or repair existing infrastructure). The ASCE report highlights reliability concerns from aging T&D lines built in the 1950s and 1960s, given their 50-year life expectancies. But a look at reliability metrics themselves tells a different story.

Independent studies generally do not find widespread T&D reliability concerns that existing processes can’t handle. Most indicators developed by the North American Electric Reliability Corp. actually show an improving reliability trend of the domestic bulk high-voltage power system. A 2016 study highlighted that, despite aging low-voltage electric distribution infrastructure, existing investments to modernize infrastructure have contributed to a likely decrease in the number and duration of power outages since the 2000s.

A 2015 study found that the frequency of power outages remains unchanged in recent years, but the total number of minutes customers are without power increased. Drilling down, the culprit is major power loss events resulting from severe weather. Importantly, the study did not find a link between reliability and increased expenditures on transmission and distribution, but rather highlighted differences in the effectiveness of utility maintenance policies.

The ASCE report cites growing T&D congestion (lines carrying electric current at their full capacity) as cause for concern. An economic view is that we’re utilizing T&D infrastructure more efficiently. Unused infrastructure is a wasted expense, though it should be granted that excessive congestion can cause reliability problems. Since the early 2000s, major advances in “organized” electricity market structure and operation have enabled far more efficient use of existing infrastructure to manage transmission congestion.

At the same time, new economic paradigms are under consideration for improved distribution management. These market-based approaches bolster reliability, avoid the need for some infrastructure investment and signal efficient infrastructure investment when needed. Still, there’s plenty of room for improvement in T&D planning, especially on joint planning with generation infrastructure. The main value is to lower costs. Reliability processes generally are already robust.

The takeaway from all this: we don’t need to throw public money haphazardly at energy infrastructure. Rather, we need to pinpoint areas of need and to reform any flaws in existing processes to encourage cost-effective private investment.

The ASCE’s report offers insight into areas to expedite and lower the cost of infrastructure-planning processes. These include streamlining permitting processes for new transmission lines and natural-gas pipelines. That’s a worthy pursuit, as is encouraging the Federal Energy Regulatory Commission and states to further pursue competitive models for T&D planning.

Given the White House’s expressed desire for a massive federal infrastructure bill, it’s especially critical that policymakers eye cost-effective investments and maintain fiscal discipline. Laying out the benefits of improved energy infrastructure (e.g., avoided outages) alone – a common engineering perspective – should not determine policy priorities. Policy decisions must weigh benefits alongside costs. Costs for expanded federal outlays take on new meaning as we approach $20 trillion in national debt. Plus, the case for fiscal stimulus is especially weak with the economy on relatively solid footing. Congress must carefully weigh digging a deeper debt hole to fill some potholes.

Facilitating efficient infrastructure investments largely comes down to aligning private investment incentives with the public interest. ASCE’s recommendation to use performance-based regulations for pipeline safety is consistent with this. Policymakers should expand such performance-based constructs to electric-distribution utilities, rather than enacting strict equipment-design standards, which seldom weigh costs and benefits effectively. At the same time, policymakers must carefully parse the report’s conclusion that a lack of federal energy policy has caused a lag in energy investment.

America’s infrastructure team is quite strong already, but we could use some roster upgrades. Put another way: to make America’s team (not the Cowboys) great again, we need to do the most with scarce resources. That lends support for cutting red tape, not spending money we don’t have.

Image by Debby Wong

Congress needs to take back its war powers in the fight against ISIS


“We’re not considering any boots-on-the-ground approach,” then-President Barack Obama said during an Aug. 30, 2013 news conference about the situation in Syria. The former president would repeat his promise not to deploy “boots on the ground” over the next few years. But by 2016, U.S. ground forces were operating in Syria as part of the war against Islamic State.

President Donald Trump now plans to expand the war against ISIS. U.S. Marines have deployed alongside Syrian rebels planning an assault on the ISIS capital of Raqqa and are expected to provide artillery support for the upcoming offensive. Before U.S. Marines engage in ground combat against the enemy, it’s time for a debate about the mission against ISIS. When Obama first ordered U.S. military action against ISIS in the summer of 2014, he did so without congressional approval.

The U.S. Constitution gives Congress alone the power to declare war, although the reality has always been more complicated than that. Under the War Powers Resolution that was passed in the aftermath of the Vietnam War, the president is required to report to Congress whenever U.S. military forces are sent into combat and withdraw them within 60 days unless Congress expressly authorizes the use of force.

Congress has never authorized specific military action against ISIS in Iraq, Syria or any country. The Obama administration—and, presumably, the Trump administration, as well—claimed they were authorized to fight ISIS under the resolution passed after Sept. 11 that allowed the U.S. military to fight the perpetrators of the attack. While ISIS is an offshoot of al-Qaida, it is a stretch, to say the least, to claim that the group was behind the Sept. 11 attacks.

Congress needs to take back its war powers. The last time the United States declared war was after Pearl Harbor. Since then, military force has been used both with and without congressional approval. For the 2011 air war against Libya, President Obama did not even bother to consult Congress or seek its approval. Now is the time for Congress to put its foot down and stand up for its own prerogatives.

Congress should threaten to defund all military operations against ISIS unless it specifically authorizes the war. It should force the Trump administration to justify the war against ISIS and the commitment of U.S. ground troops to both Iraq and Syria. Doing so would force lawmakers and the American public to debate and think long and hard before deploying the military to yet another war in the Middle East.

In many ways, the war against ISIS resembles the Vietnam War, which spurred the first attempts to rein in presidential war powers. Both wars have seen “mission creep,” the gradual ratcheting up of U.S. military presence over time in the absence of a clear objective. The Vietnam War started as a U.S. train-and-equip mission for South Vietnam, just as the war against ISIS started out as an air bombardment campaign in Iraq.

Involving Congress in the decision to go to war not only forces the executive branch to justify the war, but also to detail what kind of resources will be used to prosecute the mission. Involving Congress also could be a way to unite the country behind a war, provided the public determines the war is just. To his credit, Obama essentially did this in 2013, as his promise about “boots on the ground” aligned with the American people’s conclusion that a war in Syria was not in America’s best interests.

Of course, getting congressional approval does not always mean a war will be easy or quickly concluded. The Iraq War was, of course, authorized by Congress. Before American forces get bogged down in further quagmires in Iraq and Syria, Congress needs to ask some tough questions about the mission. What are we trying to achieve in the fight against ISIS? How is the United States going to achieve those goals? What kind of force is needed to achieve those goals? Is there a better option?

Once Congress gets the answers to those questions, it can and should serve as the deliberative body charged with deciding whether to authorize war against ISIS. Alternatively, it could determine it’s time to pull the plug.

Image by BPTU

Promoting transparency and stakeholder engagement in an era of complex government


It is a well-known tenet of democracy that citizens must have access to information about the government’s activities, as well as the means by which to interact with the government to spur policy changes. Unfortunately, the increasing size and complexity of modern government has made it ever-more difficult for the public to be aware of—and engage with—policymaking that emanates from the federal government.

As part of a recent series of papers compiled by the Congressional Research Service, Clinton T. Brass and Wendy Ginsburg focus on how Congress has evolved over time to promote the principles of transparency and “stakeholder engagement” via legislative reforms.

In particular, they discuss how Congress has passed numerous laws over the years that “embed values of transparency, participation and representation into agency activities.” These laws help ensure the public is aware of important laws and regulations, and give nongovernmental stakeholders the ability to participate in the policymaking process.

The authors start by discussing the Budget and Accounting Act (1921) and the Federal Register Act (1935), two early efforts to increase government transparency. The Budget Act created a more formal budget process, mandated executive branch reporting requirements, and established the watchdog General Accounting Office, which eventually became the Government Accountability Office.

The Federal Register Act paved the way for the Code of Federal Regulations, the government periodical in which government rules are memorialized and recorded. The code’s genesis is especially interesting, as it arose in response to an embarrassing Supreme Court incident during the New Deal era in which the government had to admit to the court that it was seeking to enforce a law that didn’t exist (since an improper version of the regulation had been submitted to the printer).

Each of these early laws served a dual purpose: they gave citizens and stakeholders the ability to track the activities of federal agencies, while also giving Congress an enhanced ability to oversee agency activities and hold agencies accountable. In other words, they equipped those who were outside federal agencies with greater information about executive branch activity.

Perhaps the most significant effort Congress made to standardize and democratize the rulemaking process of agencies was the Administrative Procedure Act, passed in 1946. As the authors recap, the point of the APA was to:

  1. Require agencies to keep the public informed and up-to-date on agency activities;
  2. Provide for public participation in the rulemaking process;
  3. Prescribe uniform standards for rulemaking and adjudicatory proceedings; and
  4. Restate the standards for judicially reviewing agency actions.

The APA gave outside stakeholders the tools they needed to inform themselves about government policies and allowed them to communicate directly with the government about those policies. Promoting this type of “stakeholder engagement” was also the rationale behind other congressional legislation that sought to increase transparency and public participation in agency activities, including the Freedom of Information Act, the Federal Advisory Committee Act and the Government in the Sunshine Act.

In more modern times, Congress has taken advantage of new technologies like the internet to promote these goals. For example, the Government Performance and Results Act of 1993 required agencies to articulate mission statements and create multiyear strategic plans and retrospective annual reports. The GPRA Modernization Act of 2010, which updated the 1993 law, required the Office of Management and Budget to create a public website that contains metrics and information on agency performance.

Although these legislative reforms were well-intended and broadly effective, the authors also note that increased transparency and stakeholder engagement come with costs. The advent of new technologies for disseminating information, coupled with increased opportunities for public involvement in rulemaking, has left Congress swamped with information and stakeholder demands. This overload is particularly concerning, given Congress’ recent habit of cutting legislative branch staff and resources. The authors also point out that diverting resources within an agency toward promoting greater transparency can undermine other important agency priorities.

In a similar vein, increased transparency and stakeholder engagement could alter how information is used and controlled. The authors use the Obama administration’s Open Government Directive, which required federal agencies to release certain datasets to the public, as an example. While the release of datasets can improve data quality through tools like “crowdsourcing,” it can lead to outside groups (intentionally or unintentionally) manipulating datasets and/or presenting skewed interpretations of data. It can also once again add to information overload that actually makes it more difficult for the public and Congress to get a clear picture of agency policymaking.

As the authors put it, Congress has made much progress over time in enhancing transparency and stakeholder engagement “in a way that increases the intensity with which agencies interact with non-federal stakeholders.” This trend has been accelerated by changes in technology and has helped address the information asymmetries between federal agencies and outsiders.

But Congress also needs to look long and hard at itself, and consider ways to adapt to this new context where the executive branch is immense, information is plentiful and pluralist demands are intense.

Image by holbox

Texas bill would end wine protectionism


For her last birthday, my wife got a gift she couldn’t use. Literally. A thoughtful family member had given her a gift card for an online wine retailer. But when she went to redeem the card, she found she was barred from using it because of a Texas law restricting interstate wine sales.

She was, of course, pretty upset by the injustice of all this, as was I. Sadly, it’s just one of many examples of Texas alcohol regulation being used for an anti-competitive purpose.

I hope she hasn’t thrown away the card, though, as things may be about to change:

Last week, in a move that took wine industry observers by surprise, Texas state lawmaker Matt Rinaldi, a Republican from Dallas County, filed a bill that would lift a long-standing ban prohibiting out-of-state retailers from shipping wines to consumers here.

Currently, it is illegal for an out-of-state wine shop to sell and ship wines to Texans. In other words, if you live in Texas, you cannot call or email a wine shop in New York or San Francisco and ask the merchant to sell and ship you its products. If Rinaldi’s bill were to be approved by the Texas Legislature, it would mark a historic break from a restrictive policy that regulates how Texans buy their wines.

The bill is expected to face stiff opposition from the Texas beer, wine and spirits wholesaler and retailer lobbies. As wine industry blogger and wine trade veteran Tom Wark wrote on his site last week, it is “the kind of legislation that Texas wholesalers and most Texas alcohol beverage retailers will oppose with their last dying breath.”

The current law is protectionism at its worst. I can only wish Rep. Rinaldi and the Texas Legislature Godspeed in passing this legislation.

Image by Africa Studio

What role should Congress play in regulation?


Historically, Congress has delegated great authority to the executive branch when it came to regulatory matters. For the most part, the executive branch has had a free hand, and when regulators exceed the law, effective pushback frequently has come via the judicial branch.

Lately, however, Congress has begun to reassert itself in regulatory decisionmaking by using the Congressional Review Act to curb new regulations. But the executive branch struck back: President Donald J. Trump recently mandated a regulatory budget, the workings of which will be decided by his Office of Management and Budget.

In light of these developments, what role should Congress play in regulatory policy? Does it have the capacity to play a meaningful role? What tools does it have and need?

The Legislative Capacity Working Group hosted a recent discussion on these questions, featuring R Street’s Jarrett Dieterle and Kevin Kosar, along with Philip Wallach of the Brookings Institution. Video of the panel discussion is embedded below.

California DMV takes a ‘self-driving’ news dump


Friday news dumps are a long and proud tradition, and one in which the California Department of Motor Vehicles took part today by releasing its long-awaited “Proposed Driverless Testing and Deployment Regulations.” The filing, made with the California Office of Administrative Law, signals the first official action taken by the department to codify regulations that are now two years behind their legislatively mandated schedule.

Comments on the proposed regulations are due April 24, with a public hearing scheduled for April 25 in Sacramento.

Disappointingly, the proposed rules would still require automakers—as part of their application to operate in California—to file a “safety assessment letter” with the National Highway Traffic Safety Administration before their vehicles could be deployed on California roads.

This shows there’s still confusion about the nature of the Federal Automated Vehicle Policy (FAVP), which was crafted as a voluntary guidance document. Requiring automakers to certify compliance with the FAVP’s 15-point safety checklist is something that was never contemplated during the policy’s creation. Neither state nor federal law should force automakers to comply with standards that haven’t gone through notice-and-comment rulemaking.

While California remains confused on the matter, NHTSA and Transportation Secretary Elaine Chao should underscore the FAVP’s voluntary nature so that other states do not make similar mistakes.

Some other notable aspects of the proposed regulations include:

  • The proposed regulations include a provision to allow autonomous vehicles to be operated without drivers. This provision is particularly notable in light of the DMV’s original inclination to disallow testing vehicles that lack driver-input mechanisms, like pedals and a steering wheel.
  • The definition of “autonomous mode” has been modified to encompass systems like those operated by Uber – a provision likely spurred by the showdown earlier this year.
  • The regulations’ privacy-sharing provisions have been narrowed and completely reworked, in what is likely a nod to federal supremacy in that area.
  • A requirement that local law enforcement be notified within 24 hours of an accident involving an autonomous vehicle has been removed.
  • Testing permits are now valid for two years, instead of one.
  • The problematic disengagement-reporting requirement also was maintained.

Analysis of these, and all of the other provisions, will appear on R Street in the days to come.

California legislators consider self-driving vehicles and the need for Prop 103 reform


Proposition 103—California’s restrictive regulatory regime for insurance—may need a few tweaks as fewer and fewer cars on the road have human drivers. During a March 8 informational hearing, members of the California Senate Insurance Committee heard from a panel of experts about the future of self-driving technologies and their likely impact on California’s insurance market.

Since Prop 103 passed in 1988, Californians’ auto-insurance rates have been dictated by a hierarchy of rating factors tied directly to drivers’ experience. These so-called “mandatory” factors include the driver’s safety record, their annual mileage driven and their years of driving experience. While other rating factors exist, they cannot be weighted more heavily than the three mandatory factors.

The problem with this state of affairs was immediately apparent to the committee members hearing testimony. As vehicles begin to drive themselves, and human interaction with the driving process declines, the Prop 103 formula for insurance rates increasingly will become divorced from reality.

That’s going to be a problem for drivers. Prop 103—in what is characterized by its author and principal beneficiary, Harvey Rosenfield, as an effort to “protect consumers”—was designed to ensure that it is exceedingly difficult to change rates. Subsequent regulatory developments have had a further chilling effect on insurers’ willingness to even file for rate changes.

As a result, even though self-driving vehicles will, in all likelihood, be safer than today’s human-driven vehicles, the law’s predictable effects will be to “protect” Californians from those lower insurance rates. That is an odd approach to consumer protection.

Rosenfield, who testified at the hearing with camera crew and orange caution cone in tow, was adamant that, so long as drivers face any liability at all, Prop 103 will still be necessary. This line of reasoning surprised state Sen. Tony Mendoza, D-Artesia. The committee’s new chairman responded to Rosenfield’s testimony by noting astutely that “strict adherence to Prop 103 does not fit” with self-driving vehicles.

While Rosenfield was unmoved by reason, testimony offered by the California Department of Insurance struck a more moderate tone. Deputy Insurance Commissioner Chris Schultz told the committee that the department believes Prop 103 works, for now, and that no immediate changes are needed. The basis for his contention was that, if a manufacturer were to release an autonomous vehicle today, the industry would have the ability to insure it using the Prop 103 structure.

Toward that end, the department posited that the state’s experience overseeing the products developed to cover transportation network company vehicles could prove instructive. Under that framework, policies have different “periods” that relate to qualitatively different activities, each with different coverages and coverage limits. In the context of self-driving vehicles—in which drivers will be switching between automated modes and human-piloted modes—the appeal of activity-specific coverage is clear.

Fortuitously, a study related to the way the TNC system has worked in California will be forthcoming this summer. It will be interesting to see if the department will take a leadership role in applying its findings to self-driving vehicles, given that a period-specific approach still likely would require either tweaks to Prop 103, or a creative application of its class plan or affinity group provisions.

At the hearing’s end, it was apparent that there are widely divergent perspectives on the impact that self-driving vehicles will have upon arrival, but there was a near-consensus that Prop 103 is not ideally suited to the reality posed by the technology. Whether characterized as “changes” or “tweaks,” a departure of some kind from Prop 103’s status quo now seems inevitable.

Though Mendoza is new to the issue, he offered what might be the most pressing question of all: “do we (need to) look at a product outside of Prop 103?” Without a doubt, the answer to that question is yes. It’s time to look both outside—and beyond—Prop 103.

Image by PP77LSK

The problem of legislating from inside a silo


In a recent example of how the best intentions often lead to the worst policy, Massachusetts legislators are considering a bill that would tax autonomous vehicles based on the number of miles they drive. There are any number of problems with the approach, not least that singling out autonomous vehicles for a usage-based tax would slow their adoption in Massachusetts and could have a chilling effect on their development elsewhere.

In coming up with this flawed proposal, state Rep. Tricia Farley-Bouvier, D-Pittsfield, and state Sen. Jason Lewis, D-Winchester, shouldn’t be blamed too harshly. Autonomous vehicles represent a paradigm shift that will require bold new policies. Both legislators are striving to think outside of the box. However, their bill speaks to a larger problem – a failure to communicate.

In the current political environment, the inability to cultivate honest dialogue about important issues is a significant barrier to developing needed public policies. Well-meaning people of all political stripes too often are working in ideological silos. Encouragingly, the development and regulation of autonomous vehicles may prove a unique point of bipartisan interest and exchange.

Toward that end, the Center for the Study of the Presidency and Congress—a nonprofit dedicated to serving as an honest broker between public-sector leaders, industry and the policy community— hosted a series of off-the-record roundtables in Washington, San Francisco and Seattle on the topic of autonomous vehicles. The roundtables brought together experts from various policy areas, who lent their time and insights to identify key themes and areas of concern that surround the development, deployment and regulation of self-driving cars.

CSPC now has issued a new report that stems from those roundtable discussions, and it could serve as a valuable resource for policymakers at both the state and federal levels. In fact, had the report been available to Farley-Bouvier and Lewis, they might have learned that voices from across the political spectrum agree that autonomous vehicles must not be disadvantaged compared to traditionally operated vehicles, because doing so will stifle and slow their adoption.

Another essential takeaway from the roundtable report is that policymakers and regulators must be careful not to discriminate among autonomous-vehicle developers based on their prior experience as vehicle manufacturers. To ensure the public is willing to adopt autonomous technology, it’s vital that the technology be safe. But the way to ensure the technology is safe is to see that it undergoes rigorous real-world testing. Preventing firms from testing their technology, simply because their legacy business is not focused on vehicle manufacturing, has no demonstrable safety benefit. Over the long term, it will hamper the competition that would otherwise lead to the best technologies.

The report also recommends that the National Highway Traffic Safety Administration work to avoid a patchwork of standards. Sensible distinctions between state and federal authority will help state lawmakers better understand where they can play a constructive role. One way to accomplish that goal would be for NHTSA to affirm its Federal Automated Vehicle Policy and repudiate the confusing State Model Policy.

Working from a point of consensus, like the one represented in the CSPC’s report, is an antidote to the kind of legislation introduced in Massachusetts. As new challenges arise, the vital importance of dialogue will only grow.

Image by sarahjlewis

Dieterle talks distilling restrictions on ‘Free to Brew’ podcast

R Street Governance Project Fellow Jarrett Dieterle joined the Free to Brew podcast to discuss regulatory issues around alcohol distillers, particularly in Virginia, which has some of the most onerous restrictions in the country. The full audio is embedded below:

You can check out more Free to Brew podcasts HERE.

Hobson talks tech on ‘Mike Check’ show

R Street Tech Policy Fellow Anne Hobson appeared last week as a guest on the popular “Mike Check” show on KVOI-FM in Tucson, Arizona. Along with hosts Mike Shaw and Ray Alan, she discussed the benefits—and dangers—of emerging technologies like virtual reality, artificial intelligence and the so-called internet of things.

TV shows like Netflix’s “Black Mirror” and movies like “The Terminator” do a good job imagining the pitfalls of future technology run amok. But these on-screen depictions, interesting though they may be, aren’t reflective of the many beneficial applications advanced technology gives us today. For instance, internet-of-things devices are making daily tasks easier and increasing productivity. The owner of a “smart lightbulb” no longer has to get out of bed to turn off the lights, while a baker can simply ask Amazon Echo how many teaspoons are in a tablespoon without having to burn the cookies.

Of course, all this new technology has its downsides. The scale, scope and interconnected nature of new devices creates unique cybersecurity challenges. There are more than 17 billion connected devices of all sorts, from internet-enabled toasters, to smart TVs, to EZ-Pass devices with RFID chips. Put another way, there are now 17 billion points of vulnerability to protect.

Their interconnectivity also means one person’s compromised toaster can affect another person’s ability to access their email. Malware can infect devices and conscript them into a botnet “zombie army” ordered to barrage a target with traffic and render websites or web services inaccessible. Botnets aren’t the only cybersecurity problem. A majority of data breaches are caused by human error — people sending information to the wrong party, clicking on a malicious link in a phishing email or using default or simple passwords.

While the cybersecurity challenges to connected devices are significant, market-based solutions such as certification programs, threat information sharing efforts and aftermarket cybersecurity products are developing to address them.

Audio of the full show is embedded below.

Nitzkin talks THR on America Tonight

R Street Senior Fellow Joel Nitzkin joined host Kate Delaney recently to discuss tobacco harm reduction on her America Tonight radio show, which is heard nationally over 220 AM and FM stations. Full audio of the show is embedded below.

At NCOIL, state lawmakers look to claw back power from NAIC


Newly assertive leadership of the National Conference of Insurance Legislators appears eager to confront what it views as an ongoing usurpation of authority from state legislatures.

Thomas B. Considine—now NCOIL’s chief executive, but previously commissioner of the New Jersey Department of Banking and Insurance—chose the organization’s spring meeting in New Orleans as the venue to raise public concerns about states becoming subject to the authority of the National Association of Insurance Commissioners, a private trade association composed of the nation’s insurance regulators.

Appearing at Considine’s invitation, Rutgers Law School professor Robert F. Williams—an expert on state constitutional law— detailed the process by which NAIC decisions are transmuted into state law. While the NAIC serves nominally as a private venue for insurance regulators to meet and share information and best practices for insurance industry oversight, it also promulgates standards and models for states to adopt.

Because of the group’s status as the pre-eminent body on such matters, states across the country have adopted statutes that incorporate NAIC work product into state law by reference. That is to say, changes to NAIC models and standards, in effect, serve to change state law without the explicit consent of state elected officials.

Delegation of authority between the states and the federal government, among the states themselves and between the various branches of government all have a clear constitutional basis. The circumstances under which lawmaking authority may be delegated to private organizations—Williams told this audience of legislators, regulators, industry members and academics—is considerably narrower.

While there are limited circumstances where public bodies might need to outsource highly technical matters to those with expertise, such questions should never extend to cases where policy judgment is involved. Williams raised concerns about legislative bodies’ propensity to bind themselves to such arrangements prospectively.

Williams’ sentiments were echoed by Neil Alldredge of the National Association of Mutual Insurance Companies, who expressed particular alarm that the NAIC’s recent work on corporate governance standards amounted to legislating matters of substantive policy. NCOIL’s current vice president—state Sen. Jason Rapert, R-Ark.—speculated that many lawmakers are likely unaware of the arrangement with the NAIC.

From the perspective of constitutional law, incorporation-by-reference statutes are interesting in that they largely are untested in court; no case involving insurance regulation has ever been tried. A judicial confrontation might be avoided if the NAIC rededicated itself to focusing on nonsubstantive matters.

Alternatively, states could make it a regular practice to readopt the NAIC’s incorporation-by-reference statutes each legislative session, to ensure newly elected lawmakers are reminded of the power that they are ceding to the commissioners’ trade association. The readoption approach likely is preferable. It would eliminate the NAIC’s temptation to oversee substantive matters, while simultaneously allowing the people’s representatives to re-examine their faith in the relationship with the NAIC on a regular basis.

Another problem with the NAIC’s ongoing power to incorporate standards by reference is that, to fund its operations, the NAIC restricts access to both the information it gathers and to participation in its meetings, in a manner inconsistent with the transparency otherwise available in the public lawmaking process. In fact, members of the public face substantial obstacles should they wish to participate in the process by which standards that directly impact them are set.

Ironically, if anything, the current arrangement is a good argument for the Treasury Department’s Federal Insurance Office, an agency whose very existence has been actively questioned by both NCOIL and the NAIC (as well as some elements of the industry). FIO could make public the information over which the NAIC currently holds an effective monopoly and thereby address the information asymmetry under which members of the public currently labor. Interestingly, though it has in the past been highly skeptical of FIO, the NCOIL body declined, on the final day of its session, to take the position that FIO should be abolished.

Under Considine’s direction, NCOIL is seeking to chart a rapid course to renewed relevance. At the summer meeting in Chicago, the group is expected to consider a model law from state Assemblyman Ken Cooley, D-Calif., that would require state insurance departments to help fund NCOIL, which would put the group at much greater parity with the NAIC.

It’s striking that it should take a former commissioner, a consummate insider, to bring attention to the worst excesses of the NAIC’s quiet empire. In doing so, he may just return to the people a power they didn’t even realize had been taken from them.

Image by Andrey_Kuzmin

Short-term-rental rules on the docket in Indiana


Carmel, Indiana, is a beautiful and completely modern community of more than 85,000 folks, just north of Indianapolis. Its 100 traffic roundabouts are the most of any U.S. city and, last year, one of them was named the most beautiful in the world. Carmel’s charming Arts and Design District hosts the annual Carmel International Arts Festival and is marked by the Museum of Miniature Houses and public sculptures by John Seward Johnson II. The 1,600-seat Palladium concert hall in the Center for Performing Arts is home to the Carmel Symphony Orchestra. In September, the city hosts the Rollfast Gran Fondo cycling tournament.

A lot of people want to be in Carmel and some of them want to stay overnight, both for these attractions and for the major sporting events frequently hosted 13 miles to the south in Indianapolis. The Indianapolis market has about 33,000 hotel rooms, which were sold out in March for the Indianapolis 500 on Memorial Day weekend.

Airbnb, an online short-term rental service, has been a very popular lodging option across the Midwest, and is particularly adept at making rooms available during busy times when hotel rooms sell out. In the Indianapolis area, the service grew by more than 200 percent last year. That includes about 300 spaces available for rent in Carmel.

But building commissioner Jim Blanchard recently sent out notices to city homeowners that they had 10 days to remove themselves from the Airbnb website and similar services or face city code enforcement for illegal activity. Due to differences in zoning requirements, apartment dwellers can continue to list with the services, but owners of single-family dwellings cannot. City officials maintain they’ve received complaints from residents distressed about noise, traffic and other issues they associate with short-term rental visitors. But the crackdown means Carmel residents who relied on the extra cash from those rentals—the typical Airbnb host rents about 22 nights per year and earns about $3,000—were suddenly left out in the cold.

In response to actions like Carmel’s, House Majority Floor Leader Matt Lehman, R-Berne, is proposing legislation to pre-empt communities from completely banning rentals of less than 30 days. Lehman has gained recognition for his work on insurance issues with state lawmakers from across the country. He helped to craft a compromise approach on the insurance aspects of ridesharing services offered by companies like Uber and Lyft, which has since gone on to become a national model. He sees no reason why Indiana couldn’t similarly become the pace car for a national effort to work out differences between hotels, homeowners, bed-and-breakfast establishments and elected officials passionate about local control.

While his bill would prevent cities and counties from banning spacesharing services outright, it includes a number of sensible limits. If a property is rented out for more than six months, the owner would have to acquire a regular business license and pay merchant innkeeper taxes. It specifies that local regulation of fire, safety, sanitation, pollution, sexually oriented businesses, nuisances, noise, traffic control and the like would continue unimpeded, as long as the regulation and enforcement is applied equitably to all residential housing. The measure also includes minimum insurance requirements, as hosts would have to maintain primary first-dollar liability coverage of at least $1 million.

After two tries, the Lehman bill passed its house of origin this past week, squeaking by after fending off several amendments from legislators who are more sympathetic to local control than emerging technology businesses. Because of the narrow passage in the House, there may need to be more work done in the Senate to address some of the perceived problems. The main issue, as in many other areas in the emerging “sharing” economy, is how to draw the line on how much professional activity a nominal amateur can engage in before he or she is recognized as a business competing with other businesses, and paying the requisite licensing fees and taxes.

The answer may not come easily, but it seems likely that a reasonable compromise can be found on short-term rentals. This issue will likely get a hearing or two in a great many laboratories of democracy across 21st century America.

Image by sevenMaps7

Criminal justice reform takes center stage at CPAC


The right end of the political spectrum historically has favored getting tough on crime and increasing criminal penalties as a solution to rising crime rates. But to be “right on crime” nowadays means something else entirely. The lack of measurable results in straightening out the lives of wrongdoers and the huge expense of incarceration both have produced a new ethic focused on data, common sense and a much better cost-benefit ratio for government corrections policy.

Fiscal questions about how and when incarceration is appropriate have generated significant new interest, as well as experimentation by some states that are trying to find the criminal-justice formula that makes the most sense. Texas, Georgia, Oklahoma, Kentucky and many other states have in recent years changed laws, rationalized systems, diminished recidivism and saved billions. The phrase heard several times from presenters at this year’s Conservative Political Action Conference was: “We want to imprison the people we are afraid of, not the people we are mad at.”

FreedomWorks points out that the U.S. Constitution defined just three federal crimes – piracy, treason and counterfeiting. In 1870, Congress added a baker’s dozen more federal crimes, including murder and manslaughter, larceny and perjury. There are now about 5,000 federal crimes, and according to some estimates, around 400,000 federal regulations that can be enforced criminally. No one is quite sure exactly how many.

For the last few years, concern has focused on the expanding number of new crimes created by executive-agency regulatory processes. In contrast to the traditional legal doctrine of mens rea, which requires proof of criminal intent as an element of a crime, there’s no need to prove intent for many of these infractions. Author Harvey Silverglate claims Americans today unknowingly commit an average of three felonies a day.

This year’s CPAC featured a number of both main-stage appearances and expert breakout panels composed of conservatives intent on reformulating aspects of the nation’s criminal-justice system. One of the most interesting issues was highlighted by Stephen Mills, a retired Army military policeman who now serves as chief of police in Lindsay, Oklahoma. Mills also happened to be one of the law-enforcement first responders to the 2009 terrorist attack on Fort Hood.

Mills described how, after retiring from 25 years of active military duty, he became a rancher and hired some help to run his cattle operation. One day, one of his ranch hands stole a big roll of copper wire. When the ranch hand was apprehended, Chief Mills’ pickup truck was confiscated as “an instrumentality of a crime.” He couldn’t get it back, because he couldn’t prove he knew nothing about the theft.

This is the process of civil asset forfeiture. Law enforcement considers it a valuable tool in the fight against crime, particularly in drug-related cases. The prototypical case proponents of the practice usually cite is the traffic stop that uncovers thousands of dollars in cash from drug deals. Unfortunately, there seem to be a lot of cases more like the one Mills faced. Many of these forfeiture laws now are under review – for instance, to compel that confiscated items be returned when the owners are never charged with a crime. Several states already have enacted asset-forfeiture reform and many more are currently considering it.

Another reform widely enacted by states in recent years is to prohibit government employers from asking about criminal convictions on job applications. A job is the most important factor in reintegrating ex-offenders into productive life. Prospective employees likely would still have to explain any convictions before they could be hired by a government agency, but this would cut down on automatic rejections at the application stage.

Another panel highlighted stories about political prosecutions that ultimately were overturned, but not before they had ruined careers, families and finances. It sparked a lot of passion, which is understandable when it comes to outcomes that clearly seem unjust. But the most important thing about the new criminal justice reform agenda is how practical and data-driven reforms have proven out when tested against real world challenges.

Why the FCC should wait on privacy rules


Chairman Ajit Pai wants an emergency vote this week by the Federal Communications Commission to halt implementation of new rules on how internet service providers like AT&T, Verizon and Comcast can share the consumer data they collect. The rules—implemented by Pai’s predecessor, Thomas Wheeler—will take effect March 2 unless the FCC votes to stay them.

With internet connectivity as pervasive as it is, consumer-privacy protection has been at the center of policy debates worldwide. Part of this debate requires thinking critically about which government agencies are best suited to take the lead.

The Federal Trade Commission has privacy rules that govern all companies collecting personal information. Indeed, Pai’s objective isn’t to kill privacy rules outright, but to make sure they are built into a “comprehensive and uniform regulatory framework.” It’s on this point—not ISP privacy regulations, per se—where Pai departs from Wheeler.

Wheeler’s FCC aggressively sought to expand its purview. Wheeler successfully reclassified ISPs as monopoly carriers and, while promising forbearance, immediately launched regulatory inquiries not just on privacy rules, but on the operation of cable TV set-top boxes, streaming video and pricing plans that offer unlimited data. That’s a long way from its original congressional mandate to manage use of the public airwaves and landline telephone service.

When Wheeler pushed the frontiers of FCC mission creep, he was only following a trend. Well before he arrived, the FCC had been trying to expand broadcast decency rules to cable programming and demanding a say in merger approvals, a role historically reserved to the Justice Department and FTC. That’s why Pai’s taking a breath for a regulatory reset is so welcome.

Within the communications industry, an uneven regulatory field can’t help but create arbitrage opportunities, rent-seeking and other market distortions that, in the end, cost consumers value, as well as real dollars. It also creates a situation in which ISPs may be bound by different and contradictory rules. Eventually, the courts would have to sort it out, a process that could take years.

This is why Pai is right. The FCC should defer to the FTC. Right now, the FTC is the right place for policymaking on the use of consumer data. We are all best served when privacy rules come from one place.

Image by Mark Van Scyoc

What the Wall Street Journal gets wrong about farming in 2017


With the Farm Bill up for reauthorization in 2018 and legislative debate poised to heat up later this year, there’s been a lot of talk about the plight of the American farmer. A recent Wall Street Journal piece proclaimed that we’re on the brink of a major national farm bust, as a shrinking global grain market and low prices increasingly will drive small family farms out of business.

Citing research from the U.S. Department of Agriculture, the Journal predicts that farm incomes will drop 9 percent in 2017, “extending the steepest slide since the Great Depression into a fourth year.”

Declining farm incomes have not gone unnoticed on Capitol Hill. The House Agriculture Committee recently held its first hearing of the 115th Congress, titled “Rural Economic Outlook: Setting the Stage for the Next Farm Bill.” Chairman Michael Conaway, R-Texas, said in his opening remarks:

America’s farmers and ranchers are facing very difficult times right now… As we begin consideration of the next Farm Bill, current conditions in farm and ranch country must be front and center.

For taxpayer advocates hoping for meaningful reforms in the next farm bill, this doom and gloom does not bode well. Instead of leading to reforms that help farms struggling to stay afloat, it will likely only result in more of the status quo. That means more taxpayer-funded subsidies flowing to wealthy agribusinesses, while small farms become increasingly obsolete.

It’s true that commodity prices are down and many farmers are struggling. But it’s also true that, relatively speaking, the farm economy is doing pretty well. As the Environmental Working Group points out in a response to the Journal piece, median farm household income is expected to grow in 2017. At $76,735, median farm household income is actually $20,000 more than the median income for all U.S. households. While there is certainly risk in starting a farm operation—as there is with any business in a market-based economy—the annual failure rate for businesses generally is 14 times greater than the annual failure rate for farms.

The alarmists imply that struggling commodity farmers are being hung out to dry, but that couldn’t be further from the truth. Not only are commodity farmers protected by the Agricultural Risk Coverage program, which triggers payments when revenues fall below an anticipated threshold, and the Price Loss Coverage program, which pays out when market-year average prices fall below what’s called the reference price, but they also have the option to purchase government-subsidized crop insurance with lavish coverage options. On average, the government subsidizes 62 percent of farmers’ crop-insurance premiums, regardless of the size of the farm operation.

Farmers also have the option to insure not only their projected yields, but also their revenue. Under the most extravagant federal crop insurance product, the “harvest price option,” farmers can cash in at either the locked-in price at the time they planted or the current market price, whichever is higher. As we’ve said before, it’s the “crop insurance equivalent of your auto insurer surprising you with a new Cadillac Escalade after you’ve totaled your Toyota Corolla.”
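For illustration only, here is a minimal Python sketch of how a harvest-price-option policy works. The yields, prices and 75 percent coverage level are hypothetical numbers chosen for the example, not figures from an actual USDA product:

```python
# Illustrative sketch of revenue insurance with a "harvest price option" (HPO):
# the per-acre guarantee is set at the HIGHER of the planting-time ("projected")
# price and the harvest-time price, so the guarantee can rise, but never fall,
# between planting and harvest. All figures below are hypothetical.

def hpo_revenue_guarantee(expected_yield, projected_price, harvest_price,
                          coverage_level=0.75):
    """Per-acre revenue guarantee under a harvest-price-option policy."""
    return expected_yield * max(projected_price, harvest_price) * coverage_level

def indemnity(actual_yield, harvest_price, guarantee):
    """Payout is the shortfall of actual revenue below the guarantee."""
    actual_revenue = actual_yield * harvest_price
    return max(0.0, guarantee - actual_revenue)

# Example: 180 bu/acre expected corn, $4.00 projected price, the price rises
# to $4.50 by harvest, but a poor crop comes in at 120 bu/acre.
g = hpo_revenue_guarantee(180, 4.00, 4.50)  # 180 * 4.50 * 0.75 = 607.50
print(indemnity(120, 4.50, g))              # 607.50 - 540.00 = 67.50
```

Because the guarantee is recomputed at the higher price, the payout exceeds what a fixed, planting-time guarantee of 180 * 4.00 * 0.75 = $540 would have produced, which is the dynamic the quote above is describing.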

The Wall Street Journal correctly notes that the consolidation of large, industrial-scale farm operations has driven many small farms out of business. But what it doesn’t mention is that our crony agriculture policy is likely driving much of that consolidation. An EWG analysis found that the top 10 percent of U.S. farms are getting more than 50 percent of the subsidies, with 26 farm operations receiving subsidies of $1 million or more. Congress should address this cronyism by putting a cap on the amount of premium support a single farm operation can receive and enacting a means test, so that farmers who are making high incomes cannot receive subsidies. This would help to level the playing field for small family farmers and ensure that our taxpayer dollars are not being spent to boost the incomes of mega-farm agribusinesses.

Accounts like the Journal article invoke nostalgia for the hard-working family farmer, but this sympathy will only be misplaced if lawmakers do not seize on this opportunity to craft a reformed farm bill that puts the interests of taxpayers and struggling farmers above those of the Big Ag lobby. As we look ahead to the next farm bill, let’s not allow dismal, exaggerated narratives distract us from the fact that our current farm-support system is not working and badly needs reform.

Image by Lost Mountain Studio

A cyber mandate isn’t the way to address cyber-insurance takeup


To improve cyber preparedness and help companies recover from cyberattacks, it’s essential that the takeup rate for cyber insurance continues to rise. The insurance capacity plainly exists to write virtually all of the risks for which the market currently seeks coverage. What’s missing is demand.

The federal government can play a role to bolster that demand, but prescriptive measures, like a cyber-insurance backstop or a cyber-insurance mandate, would have a negative impact on those efforts. A recently released paper from R Street Technology Policy Fellow Anne Hobson instead recommends that so-called “internet-of-things” vendors and contractors that do business with the federal government be held financially responsible should a cybersecurity failure on their part result in costs to U.S. taxpayers.

Some have proposed the best way to accomplish this goal would be to require that federal vendors and contractors buy cyber insurance. It’s a proposal that has some attractive features. Insurance often has the benefit of forcing private actors to take account of their practices and reduce risks that otherwise would cause their premiums to rise or render them unable to obtain coverage at all.

But it’s important to remember that insurance exists to transfer risk. Whether it’s appropriate to buy cyber coverage to address directors and officers liability, or to deal with the potential for business interruption, is a decision each firm’s management team must make for itself. The government has no special insight into what’s right for every company with which it does business. What it can and should require is that taxpayers be protected from having risks directly or implicitly transferred to them as a result of those private risk-management decisions.

While firms could pursue other mechanisms to address their financial responsibility—from letters of credit to surety bonds to cash—most would find cyber insurance the most efficient means to transfer the risk of cyber-related liabilities. When it comes to risk management, private firms have a number of questions they must ask, including what risks they face, what strategies can be employed to mitigate those risks and whether it is more cost-effective to retain those risks, which will be borne by shareholders and creditors, or to transfer them to third parties like insurers. The key is to make clear that firms know they will be held liable for any risks they create for others—in this case, the government and individuals whose private data are entrusted to the government. There will be no bailout if things go wrong.

If the government simply wants its contractors to undertake cybersecurity measures, it can do that and, to an extent, it already has. It’s not only appropriate, but it’s absolutely essential that federal agencies vet the vendors and contractors with which they do business, and not offer contracts to those who practice poor cybersecurity hygiene. This should include examining vendors’ overall risk-management practices and taking as a positive sign that a given contractor has prepared for contingencies by obtaining insurance.

But as a technical matter, while it’s possible to require and define the scope of financial responsibilities a government contractor owes to the agency with which it contracts, there is no magic formula to determine what kind and how much insurance every potential government contractor should carry. Firms of different sizes, engaged in different activities, facing different risks and operating under different contract terms will have vastly different insurance needs.

The larger point here is that before you can enjoy the benefits of cyber insurance, which are many, you first must have a definable need for insurance. There is much to praise about how the underwriting process can act as a kind of cybersecurity audit. But the social benefits of cyber insurance do not, themselves, create the need for a cyber-insurance policy. First, you must have a risk that it actually would be prudent to transfer.

Image by Rawpixel.com

Should we have a long-term budget for entitlements?


Federal law treats Social Security, Medicare, Medicaid and other entitlements as “mandatory” spending programs, which means they are not subject to the annual appropriations process like the rest of the budget. These programs annually take on an ever-greater share of the federal budget — now almost 60 percent. They also are racking up huge unfunded long-term obligations, as the nonpartisan Congressional Budget Office has reported. The Government Accountability Office has chimed in by warning that entitlements are on an unsustainable growth trajectory.

So what to do? Stuart Butler of the Brookings Institution and Maya MacGuineas of the Committee for a Responsible Federal Budget suggest a long-term budget for entitlements. Doing this, they posit, would establish an “orderly pathway for helping to resolve inherent tensions” in current budgeting. Additionally, a long-term entitlement budget would “encourage Congress to make clear choices about long-term spending.” Presently, the auto-pilot growth of entitlements is crowding out spending on other priorities and fueling bruising budget fights.

The authors identify two steps to enact a long-term budget: designing a long-term budget plan and treating the plan as binding going forward (unless Congress takes steps to change it). For the initial design phase, Congress would need to map out a 25-year spending plan for major entitlement programs, as well as a funding plan to cover their costs (presumably by identifying specific taxes or revenue, savings, or by proposing a debt increase). The authors advocate that this long-term budget plan should also include tax expenditures, which, like entitlements, tend to grow inexorably.

CBO would then be tasked with publishing an annual 10-year “moving average” (based on the results of the previous five years and projections five years into the future) that would act as the baseline used to determine whether the budget was veering outside what the authors call the “corridors” of the long-term plan.
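As a rough illustration, the moving-average baseline and corridor test could be sketched as follows. The spending figures and the 5 percent corridor width are hypothetical assumptions for the example, not numbers from the Butler-MacGuineas paper:

```python
# Illustrative sketch of the proposed entitlement-budget baseline: a 10-year
# moving average built from the previous five years of actual spending plus
# the next five years of projections, then tested against the long-term plan's
# "corridors." All dollar figures and the corridor width are hypothetical.

def baseline(actuals, projections):
    """10-year moving average: last 5 actual years + next 5 projected years."""
    window = actuals[-5:] + projections[:5]
    return sum(window) / len(window)

def within_corridor(baseline_value, plan_target, tolerance=0.05):
    """True if the baseline stays within +/- tolerance of the plan target."""
    return abs(baseline_value - plan_target) <= tolerance * plan_target

actuals = [2.40, 2.48, 2.55, 2.66, 2.75]      # trillions, prior five years
projections = [2.85, 2.96, 3.08, 3.21, 3.35]  # trillions, next five years
b = baseline(actuals, projections)
print(round(b, 3), within_corridor(b, plan_target=2.80))
```

If the computed baseline drifted outside the corridor, the enforcement mechanism the authors describe (the commission and super-committee process) would be triggered to bring spending back within it.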

Congress would engage in quadrennial reviews of the long-term budget during the year after each presidential election. This formal review could be used to alter the long-term budget going forward, if Congress deemed it desirable. Once finalized, the budget timeline would be extended by another four years to create a new 25-year budget. Critically, revenue and entitlement levels that had been established during prior reviews would only be alterable by an act of Congress signed by the president.

In between these quadrennial reviews, agencies and Congress would have free rein to change revenues and entitlements, so long as the changes did not cause the long-term budget to diverge from the established corridors.

Once a plan is developed, Congress would vote to enact it, thereby creating a framework moving forward and teeing up the second element of the authors’ proposal. That element requires the long-term budget plan to be the default budget for entitlement programs moving forward, and automatic procedures would be triggered if the spending plan failed to stay within the agreed-upon corridors.

The authors sensibly note that a long-term budget plan would be unlikely to survive if it required Congress to take proactive action to maintain it. Therefore, automatic enforcement mechanisms are necessary. The authors criticize the idea of using automatic triggers—modeled after mechanisms like the Medicare Sustainable Growth Rate—that would initiate cuts whenever the long-term plan veered outside the set corridors. As they point out, such proposals are politically difficult given their blunt nature of cutting expenditures across the board.

Instead, they propose establishing a commission similar to the Defense Base Closure and Realignment Commission (established in the late 1980s to identify and close unnecessary military bases) that would act as the default enforcement mechanism to maintain the long-term budget. In the event the long-term budget started to careen outside the guardrails (for example, if entitlement spending rose more than anticipated), the commission would be empowered to take the necessary course-correction actions to keep the long-term budget within the corridors (for example, by cutting spending or identifying additional revenues).

This commission would be jointly selected by Congress and the president, and its recommendations would be final — unless Congress took action to override it. This could be done by a congressional “super committee” that could develop an alternative method and package to maintain the long-term budget’s proper trajectory. The super committee’s alternative would then be considered on an expedited up-or-down vote in Congress and would replace the commission’s plan if approved.

In their paper, the authors address a few other points regarding their proposal, including some counterarguments to the idea. They note, for example, that it might be wise to allow the automatic-enforcement mechanisms (the commission and super-committee process) to be suspended during a certified recession. They also suggest the importance of setting an explicit fiscal objective that the long-term budget would be trying to help achieve, in an effort to discourage overpromising by politicians. Finally, the authors acknowledge that, while one Congress cannot legally bind future Congresses, the legislature is empowered to establish congressional procedures that can work to shape future congressional behavior and politics.

Congress has struggled mightily in recent years to carry out the fantastically complex annual budget and appropriations processes. A new process is needed, one that reduces the steps that must be completed each year, narrows the realm of conflict and soberly confronts fiscal reality. As such, Butler and MacGuineas’ proposal is a welcome effort to tackle a problem that has gone unaddressed for too long.

Image by Digital Storm

How the federal government could lead by example in cyber insurance

The takeup rate of cyber insurance is rising, but the market’s growth to date has been uneven. Anne Hobson, a technology policy fellow at the R Street Institute, may have a solution to that problem. She proposes that the federal government, as a high-profile cyber target and large user of connected devices, is well-positioned to address the uniformity problems currently afflicting cyber-insurance policies by introducing a financial responsibility requirement for some of its most vulnerable vendors and contractors.

In a new paper, Hobson takes stock of the risks presented by growth of the so-called “internet of things.” The “things” are objects, hitherto unconnected, that are now being networked for our convenience. Each of these objects has the ability to send or receive data, often to do both, and is, as a result, susceptible to breach and malicious misuse.

She reasons that, beyond outright ignorance of the existence of cyber insurance, the principal reason firms fail to carry coverage is that the policies are both complex and nonstandard. Thus, while there is reason to believe the takeup rate will continue to climb as the market grows, a step taken to introduce some level of uniformity to the market may speed that process further.

But finding a benign way to introduce greater uniformity to cyber-insurance offerings is challenging, particularly given that the long-predicted rapid growth of the cyber-insurance market is now meaningfully underway.

Nigel Pearson, global head of fidelity at Allianz Global Corporate & Specialty, recently noted that “the cyber market is growing by double-digit figures year-on-year, and could reach $20 billion or more in the next 10 years.” His prediction was echoed by similarly bullish analysis from Allied Market Research, which projects the cyber-insurance market will reach $14 billion in written premium in five years, by 2022.

Despite those developments, growth has been uneven. While firms in some sectors, particularly large firms in financial services, are now more likely than not to carry some level of cyber insurance, the vast majority of small and midsize businesses do not. The risk for such firms is large and growing. In fact, according to Hartford Steam Boiler, 60 percent of small and midsized businesses that experience a cyber attack go out of business within six months.

Hobson argues that prescriptive regulations establishing cybersecurity standards would do more harm than good for firms of all sizes, but that federal agencies can help encourage the fast-developing cyber-insurance market by insisting that internet-of-things contractors be held financially responsible for any liabilities created for taxpayers as a result of cyber-attacks on their products or services.

The rationale for such a requirement is twofold. First, it is important to insulate taxpayers from the costs associated with a breach. In Hobson’s own words:

In the case of a cyber-attack or data breach that stems from the insecurity of a contractor or vendor’s system, the contracting agency…could have to expend resources on a host of ancillary costs, which can include DDoS mitigation services, forensic investigations, user notifications and data recovery. Rather than pass such costs onto the taxpayers, agencies and government purchasing agents should assert in contractual language their right to subrogate these liabilities from the contractor or vendor.

Second, greater adoption of cyber insurance would help to improve cybersecurity itself, as it would align security incentives. As firms go through the cyber-insurance underwriting process, they are made to audit their cyber vulnerability and to address problems as they are uncovered. For their part, insurers have every reason to ensure that firms maintain a vigilant cyber defense. Thus, each party has an independent pecuniary incentive to foster an effective ongoing cyber defense.

Congress is eager to improve the nation’s cybersecurity preparedness. But instead of a cyber-insurance backstop for large risks, or a prescriptive set of security requirements for small firms to follow, Hobson concludes that the best thing that it can do is set an example as a market participant. By taking a modest step, Congress can both expand the universe of firms with cyber insurance and bolster the nation’s cyber preparedness. That’s a win-win.

CTA previews Innovation Policy Day at SXSW


The Consumer Technology Association welcomed R Street Tech Policy Fellow Anne Hobson to take part in a recent Facebook Live panel to preview the Innovation Policy Day programming CTA will host at the upcoming SXSW festival in Austin, Texas. Hobson’s comments focused on augmented reality (AR) and virtual reality (VR), the roles these emerging technologies will play in improving people’s lives, and the concerns they have raised in such policy areas as cybersecurity, privacy, intellectual property, e-commerce, free expression and health and safety. Video of the panel is embedded below.

Ian Adams on KVOI in Tucson

R Street Senior Fellow Ian Adams joined Mike Shaw of KVOI in Tucson, Arizona, to discuss the REINS Act (Regulations from the Executive in Need of Scrutiny) and what it means for rebalancing the distribution of power between the legislative and executive branches of government. The half-hour segment also touched on NHTSA’s decision to clear Tesla of any wrongdoing stemming from a recent fatal accident involving its Autopilot system, as well as President Donald Trump’s use of a mobile device that might, in fact, already be hacked.

The full show is embedded below.

Transportation regulators are determined to stretch their powers to the limit


Since 2008, the freight-railroad industry has contested an effort by two federal regulators to rebalance the relationship between freight-rail operators and Amtrak. The issue is one of control and timing.

Amtrak, which is government-funded but operates as a for-profit corporation, predominantly runs its many passenger rail routes over lines owned by freight-railroad companies. Even though freights own most of the lines, Congress has historically granted preference to Amtrak trains, meaning that freight trains must generally yield to Amtrak trains if their routes conflict. Even with this priority, however, Amtrak has proven remarkably poor at running its trains on time. Amtrak, in turn, has blamed its poor performance on the freights, arguing that they do not sufficiently prioritize Amtrak trains along their routes.

In an effort to rectify Amtrak’s lagging performance, Congress passed the Passenger Rail Investment and Improvement Act of 2008, which ordered the Federal Railroad Administration and Amtrak to promulgate joint metrics and standards for measuring the performance and quality of intercity passenger trains. Under PRIIA, the metrics are used to investigate substandard performance and, in some situations, may be used to award Amtrak damages if freight railroads are found to be the true cause of Amtrak’s poor performance.

While the statute required the FRA and Amtrak to promulgate the performance metrics, it charged a different federal agency, the Surface Transportation Board, with undertaking any such investigations into laggard performance. This framework drew a constitutional challenge from the freight-rail operators. Because the law empowered FRA and Amtrak to come up with the performance metrics together, the freight railroads argue the law violates the so-called “non-delegation doctrine.” That doctrine prohibits Congress from delegating its legislative powers to other entities, particularly private entities. Given that Amtrak is managed as a for-profit corporation, the rail industry argues that it qualifies as a private entity.

The rail industry initially won on this argument at the D.C. Circuit Court of Appeals, which agreed that PRIIA’s delegation of metric-setting power to a quasi-private entity like Amtrak was an unconstitutional delegation of Congress’ powers. On appeal to the Supreme Court, though, the tide turned. The high court held that Amtrak was, in fact, a government entity rather than a private entity. As a result, Congress enjoyed more leeway in delegating its power to Amtrak.

The drama didn’t end there. The D.C. Circuit revisited the case and again struck down the metric-setting section of PRIIA—this time on alternative grounds. Namely, the court held the law violates due process because it allows Amtrak effectively to act as a regulator in a market in which it has “skin in the game.” In other words, Amtrak competes as an economic actor with the freight railroads, while also having the power to regulate and impact the conduct of its competitors.

The federal government is expected once again to appeal that ruling to the Supreme Court. In the meantime, another related skirmish has broken out. While the non-delegation dispute was working its way up and down from the Supreme Court, the Surface Transportation Board stepped in and declared its authority to define PRIIA’s metrics and standards. As laid out above, PRIIA clearly awarded the power to define metrics—such as on-time performance—to the FRA and Amtrak, while it reserved the powers of investigation and enforcement to the STB. But since the FRA/Amtrak version of the regulations is tied up in court, the STB claims it has the power to step into the fray and define its own metrics.

Unsurprisingly, this development sparked another court case by the freight-rail industry—this one in the 8th U.S. Circuit Court of Appeals—arguing that the STB usurped powers not granted to it by the law. At stake in this case is the proposition that, just because a protracted legal battle has halted the implementation of a law, another regulator is not suddenly empowered to fill the gaps.

The best option would be for Congress to step in again to fix the statutory defect, clarifying whether the FRA or the STB has metric-setting power (and, at the same time, that Amtrak does not). Either way, the drama surrounding this controversy is just one more demonstration of federal agencies pursuing creative methods to implement regulations and accrue power—even in the face of seemingly clear statutory language and court decisions to the contrary. If their power to do so is affirmed, both regulated industries and Congress have something new to fear.

Image by Sherman Cahal

The administration’s bad start on civil asset forfeiture


It appears that President Donald Trump has officially taken a position on civil asset forfeiture. This week, the president offered to help “destroy” the career of a Texas state senator who was supporting reform of the state’s asset-forfeiture laws (he later claimed that he was joking).

Joke or not, the threat came during an exchange with Rockwall County Sheriff Harold Eavenson, who told the president: “We’ve got a state senator in Texas that was talking about introducing legislation to require conviction before we could receive that forfeiture money.” Eavenson went on to say that the Mexican cartels would “build a monument to [the unnamed senator] in Mexico if he could get that legislation passed.”

The particular bill Eavenson was referring to is co-sponsored by state Sens. Konni Burton, R-Fort Worth, and Juan “Chuy” Hinojosa, D-McAllen, with the goal of reforming civil asset forfeiture in Texas. The two senators championed this reform effort because of the inherent injustices associated with the asset-forfeiture process.

Because such seizures are civil, a victim of forfeiture does not have access to appointed counsel, and the standard of proof is typically “preponderance of the evidence,” much lower than the criminal standard of “beyond a reasonable doubt.” Maybe most egregious, the practice has encouraged a policing-for-profit mentality, in which officers pursue much-needed funds via the forfeiture mechanism, rather than acting as keepers of the peace—protecting and serving. This mentality has increased the tension between the police and the policed.

A pillar of President Trump’s campaign (and now his presidency) has been the populist defense of “the little guy.” It is mind-boggling that he would defend a practice that targets his base. The rich elite can afford lawyers, but the little guy cannot.

The Supreme Court has performed legal somersaults defending forfeiture without charge or conviction. Thus, federal courts continue to hear cases with titles like United States v. Articles Consisting of 50,000 Cardboard Boxes More or Less, Each Containing One Part of Clacker Balls.

But the legislation in question here is a state bill. Federalism, according to the GOP’s official platform, is “the foundation of personal liberty.” The platform goes on to say:

Federalism is a cornerstone of our constitutional system. Every violation of state sovereignty by federal officials is not merely a transgression of one unit of government against another; it is an assault on the liberties of individual Americans.

It seems reasonable, then, that the state of Texas should decide what is right for the state of Texas, without interference or threats from the federal executive. The notion that the president would use his bully pulpit to “destroy” duly elected state senators because a sheriff with a shiny star on his chest bitched about having his toys taken away is a Nixonesque abuse of power.

Image by Shevs

Neeley addresses press conference on ‘Conservative Texas Budget’


R Street Southwest Region Director Josiah Neeley joined with Texas state Sen. Don Huffines, R-Dallas, and other members of the Conservative Texas Budget Coalition at an Austin press conference to unveil a proposed 2018-19 state budget that includes significant tax and spending reforms: capping spending at no more than $218.5 billion, eliminating the state’s costly business margins tax and enacting structural reforms to property taxes. Video of the press conference is embedded below.

Adams talks self-driving cars on Tech Policy Podcast

R Street Institute Senior Fellow Ian Adams recently joined TechFreedom’s Tech Policy Podcast for a discussion about the many implications of autonomous vehicles. Full audio of the show is embedded below.

Debunking the ‘experts’ on vaping


A recent article on the millennial-focused website Mic.com purported to summarize “what the experts say” about the long-term effects of vaping, which certainly is an important topic. Alas, the piece by author Aric Suber-Jenkins was hardly objective or unbiased. In fact, it’s wrong in nearly every major assertion it makes and every bit of analysis it offers.

Rather than feature diverse perspectives from a variety of experts, the piece focuses almost entirely on the views of one person: Stanton Glantz, director of the Center for Tobacco Control Research and Education at the University of California, San Francisco. To be sure, Glantz is a respected researcher and his perspective deserves a hearing. He’s also wrong on a lot of things. Four major myths he and the Mic piece present very much need to be debunked:

MYTH:  “The scientific community is beginning to see things differently, however. Its consensus: vaping is a scam.”

FACT: There is no consensus that vaping is a scam. If anything, the voices that oppose vaping as a harm-reduction option are being increasingly drowned out by rational applications of science. In the United Kingdom, public-health authorities acknowledge that vaping is 95 percent less harmful than traditional cigarettes. In Ireland, vaping now is seen as a crucial tool to help people quit cigarettes. Indeed, even in the United States, experts increasingly regard vaping as a valuable harm-reduction strategy. The only “consensus” lies in the Mic author’s refusal to include diverse views from the scientific community that aren’t represented by Glantz’s claims alone.

MYTH: “The most dangerous thing about e-cigarettes is that they keep people smoking cigarettes.”

FACT: The Mic piece cites a 2012 study co-authored by David Abrams to back up this assertion. However, the author fails to include 2015 research from that same Abrams making the case that e-cigarettes can greatly assist some smokers’ efforts to quit. Furthermore, there is an entire body of international research that has illustrated to great effect how vaping has helped many quit or greatly reduce their smoking. Why this plentiful research is ignored in favor of Glantz’s unsupported assertion is unclear.

MYTH: “E-cigarettes deliver as much or more ultrafine particles as the ones found in cigarettes.”

FACT: Here, Glantz contradicts his own 2014 research, which stated:

It is not clear whether the ultrafine particles delivered by e-cigarettes have health effects and toxicity similar to the ambient fine particles generated by conventional cigarette smoke or secondhand smoke.

Did Glantz change his mind? He certainly didn’t publish any new discoveries on the issue. From this contradiction, we find a lack of consensus even among Glantz’s own beliefs, much less those of the scientific community at large. It would be much more accurate to report what is empirically known and what remains uncertain.

MYTH:  “The e-cigarette industry … [has a] hold on adolescents.”

FACT: The National Institutes of Health reports teen use of e-cigarettes declined significantly in 2016, from 16.2 percent to 12.4 percent.  No one actually wants adolescents to begin consuming nicotine. But kids smoked in high school long before e-cigarettes were invented. Encouragingly, teen rates of tobacco use have been declining consistently ever since vaping became available. Wouldn’t it make sense to guide them toward less destructive options, rather than withholding access altogether?

As a gay man in the United States, I lived through the devastation and grief of the AIDS epidemic, and celebrated when treatment medications began to save the lives of many people who were dying of this deadly disease. At the same time, I’ve seen many of those same individuals and their loved ones consume nicotine through cigarettes out of a sense of habit, dependence and fear of the physical and emotional discomfort of quitting.

As we’ve seen HIV-related deaths in the United States drop to historical lows of around 12,000 a year, we have continued to see lesbian, gay, bisexual and transgender tobacco-related deaths remain stagnant at 30,000 per year.  When the technology and the empirical evidence overwhelmingly demonstrated an opportunity for vaping to save lives, I thought for sure our LGBT and supportive media would support such an advancement. Sadly, this has not been the case. And Mic.com isn’t helping.

Image by totallypic

Seventh Circuit strikes down Indiana’s protectionist vaping rules


Apparently we weren’t the only observers who thought the Indiana law regulating vaping products was a little over the top. A major reason why Indianapolis rated a grade of “D-” in the 52-city survey of local vaping rules that R Street released in December was a state law with some peculiarities.

We expect and support appropriately tailored regulation related to quality control of anything that is designed to be inhaled by humans. We also support bans on sales to minors and requirements for child-proof packaging. In that vein, I’m sure it appeared to most members of the Indiana General Assembly who voted on the Hoosier State’s legislation in 2015 that its very specific cleanliness provisions were in line with what most food-service operations are required to maintain.

But earlier this week, the U.S. 7th Circuit Court of Appeals ruled portions of the law unenforceable against out-of-state manufacturers. They were just a little too specific to get a pass under the Commerce Clause’s protection of interstate commerce. The Indiana law required particular sinks and even specific cleaning products. It required manufacturers of vaping liquid to sign five-year security contracts that provide 24-hour video monitoring and high-security key systems. One telltale clue as to the draconian nature of the law was its requirement that manufacturing facilities have “clean rooms” that comply with the Indiana Kitchen Code.

I’m sure that you have guessed by now that there were only a few companies that could meet all of the Indiana requirements. Judge David Hamilton’s opinion stated that the Indiana law “is written so as to have extraterritorial reach that is unprecedented, imposing detailed requirements of Indiana law on out-of-state manufacturing organizations.” The decision noted that 99 percent of vaping-liquid revenue had come from out-of-state manufacturers before the legislation was enacted.

The security provisions are a special case, even beyond the impact of the lawsuit. Although the basic legislation was passed in 2015, an amendment to the law last year effectively granted a single Lafayette company monopoly power to approve security requirements for all manufacturers. It further set a deadline for permit applications that was backdated one week before the bill was signed into law. Involved lawmakers swear that state lawyers and the Indiana Alcohol and Tobacco Commission signed off on the language. But under the new law, only six manufacturers met the security requirements to allow Indiana sales, and four of those turned out—not so surprisingly—to bear Indiana home addresses.

The Republican-controlled Legislature will try again on this particular provision, which may or may not have to do with a rumored but unconfirmed FBI investigation. Speaker Brian Bosma, R-Indianapolis, has publicly stated that his House of Representatives leadership team will support a change to the language to solve this problem, and we can assume it will take up changes that need to be made because of the court decision.

Vaping has become exceedingly popular in this country, including in the Midwest. The out-of-state companies that challenged the Indiana law maintain in their lawsuit that there are currently about 138 vaping shops or locations in the state, which produce annual sales of more than $77 million. Since the overwhelming majority of vapers are former smokers who are looking to quit or cut down, this is good news. The evidence indicates that vaping is 95 percent less harmful than combustible tobacco cigarettes.

We are rooting for Indianapolis to rate a higher score in our next vaping friendliness survey.

Image by FabrikaSimf

Does Congress have the technology it needs to govern?

During its first two centuries of operations, Congress conducted business the old-fashioned way: with paper and pencil and face-to-face. Over the past 25 years, however, the legislative branch began—hesitantly—using computer technology and the internet.

How is this digital transition going? What has gone well and what needs improvement? And does Congress have the tech tools it needs to govern in the 21st century?


Meag Doherty, operations associate, OpenGov Foundation
Sasha Moss, technology policy fellow, R Street Institute
Christian Hoehner, policy director, Data Coalition
Daniel Schuman, policy director, Demand Progress
Kevin Kosar, governance project director, R Street Institute

Video: Orange County short-term rentals panel

Internet-based short-term rental companies such as Airbnb have built a burgeoning industry connecting vacationers with homeowners, who rent out rooms or entire houses for short-term stays. It’s a fascinating example of the New Economy, but this new business model has run up against stiffer opposition than expected.

On Jan. 27, 2017, the R Street Institute and the Orange County Register sponsored a panel discussion and breakfast with some key local thought leaders.

Hobson talks ‘Poképolicy’ on the Tech Policy Podcast

Last summer’s megahit game Pokémon Go was many Americans’ first introduction to the notion of “augmented reality,” in which artificial visuals are superimposed onto the real world. R Street Tech Policy Fellow Anne Hobson explored some of the policy implications of augmented reality in a recent policy study, including how it could affect cybersecurity, privacy, intellectual property and public safety. More recently, she sat down with TechFreedom’s Tech Policy Podcast to explore the issue in-depth. That show is embedded below.

FERC chair’s departure spells policy uncertainty and infrastructure delays


In his first week in office, President Donald Trump elevated Commissioner Cheryl LaFleur to serve as acting chairman of the Federal Energy Regulatory Commission. This action removed that designation from then-Chairman Norman Bay, who promptly issued his resignation.

Bay’s decision to leave is customary—most chairmen do not stay following demotion—but the surprising immediacy of the Feb. 3 effective date leaves FERC with only two sitting commissioners. That’s one short of the quorum required to issue orders. Commonly, if a quorum is at stake, a FERC commissioner will retain their position until a replacement is imminent. Bay’s unusually rapid departure leaves the agency’s policy apparatus and infrastructure-approval processes paralyzed.

The apparent motivation for the change was a perceived lack of alignment with the president’s agenda. It also suggests Bay never overcame the distrust of Republican leadership. During Bay’s confirmation, GOP leaders expressed concern that he would serve as a rubber stamp for the Obama administration’s “extreme anti-coal agenda.” On the contrary, Bay’s policy record proved fuel-agnostic, in addition to being otherwise quite consistent with market principles and supportive of fossil-fuel infrastructure expansion.

In an extensive resignation letter, Bay highlighted that FERC issued certificates for more than 1,000 miles of natural gas pipeline in 2016, the largest amount since 2007, and authorized two liquefied natural gas export facilities. It’s worth noting these actions came despite intense, disruptive grassroots opposition, as well as criticism from the U.S. Environmental Protection Agency asking that FERC expand its environmental reviews. Bay also improved market transparency and expanded education tools and other public resources. He modernized FERC staff’s data capabilities, drove robust analysis into decision-making and tore down silos between FERC offices.

Current and former FERC staff (including this author) attest to Bay’s commitment to economically sound market design. His genuine intent was to level the playing field. This manifested most notably in a notice of proposed rulemaking to reduce regulatory barriers to entry for energy storage and distributed energy resources. This departed thematically from some prior FERC policies that gave preferential treatment to certain resources (and still need correction). He also continued FERC’s price formation initiative, which epitomizes the appropriate role of a regulator—to foster competition and healthy markets. But these rules remain merely proposed; the Trump administration’s choice for permanent FERC chair will determine their fate.

Going forward, the theme of FERC policy should be more one of refinement than course correction. In most areas, Bay’s agenda was consistent with conservative principles that support markets and transparency. But his otherwise pro-market legacy is overshadowed by his reputation as a backer of overzealous enforcement practices.

Enforcement policy marked Bay’s clearest disconnect with conservatives and has grown contentious within industry, policymaker and media circles. Criticisms were prone to hyperbole, such as The Wall Street Journal declaring Bay “Harry Reid’s personal prosecutor.” More thoughtful critiques emerged from scholars who were concerned with FERC enforcement’s legal processes and how it determined what constitutes “manipulative” behavior. LaFleur herself expressed concerns with FERC penalty guidelines and procedural aspects affecting investigation targets. This discrepancy with Bay may have tipped the political scales in her favor.

LaFleur’s policy stances matter little, as her chair status remains temporary until the administration fills other vacancies and presumably appoints a new chair. In the meantime, FERC lacks a quorum to advance a policy agenda.

It could easily take two to three months to confirm a new FERC commissioner. In the interim, FERC cannot act on major orders, rules and policy pronouncements. At least some routine business will continue under authority delegated to FERC office directors, which FERC may expand in the interim. But given pending rulemakings and infrastructure applications, a hobbled FERC cannot approve proposed mergers, including a major pending power plant purchase by Dynegy. Nor can it respond to pressing complaints or enact rule reforms slated to enhance price formation and enable market access for advanced technologies. Lack of a quorum also stalls FERC approval of natural gas infrastructure projects, including pipelines and LNG facilities.

It’s unlikely that delaying pipeline approvals several months would disrupt service, but it does add costs, as pipeline congestion increases the delivered price of natural gas to customers. In particular, extended delay may cause four major Appalachian pipeline projects to miss their in-service dates. This coincides with market forces driving a rebound in natural gas prices—perhaps the highest since 2014. That’s a tricky political calculus for a new administration promising lower energy prices, more infrastructure and less red tape.

The Trump administration should consider accelerating one commissioner nomination. As it mulls candidates, it’s worth noting that LaFleur’s leadership, especially in an interim role, should mostly align with administration priorities and not raise red flags. It may be worth nominating the least-controversial candidate first, to restore FERC promptly under LaFleur’s watch. That would serve as a bridge to the permanent chair, whose confirmation process may face greater scrutiny and delay.

Senate leadership also needs to prioritize FERC restoration. Fortunately, Sen. Lisa Murkowski, R-Alaska, has warned of a FERC crisis and promised to move nominees rapidly to re-establish a working quorum. In addition to avoiding infrastructure delays, Senate Republicans should remain mindful that returning FERC to its full functional capacity is simply good governance. After all, the major pending rulemakings would, on balance, enhance market competition. They’d also reduce barriers and bolster compensation for some advanced clean technologies, which should appeal to Democrats. Dems should also note that natural gas expansion has led to large emissions reductions, and pipeline congestion causes the most customer headaches in the Northeast.

Promptly restoring FERC will require that the Senate avoid a drawn-out nominations process. This should be a bipartisan exercise in search of good governance. However, some on the environmental left have already voiced opposition to Trump’s nominees, even though they haven’t been announced. If nominations prove unworkably contentious, it’s plausible that a pro-market Democrat would make the best choice, even for the Republican agenda. Such a nominee could swiftly cut through partisan debate and serve as the eventual replacement for Democratic Commissioner Colette Honorable, whose term ends in June (Republicans can fill only three of the five FERC spots). It would delay seating a Republican until the expiration of Honorable’s term, a price the GOP may be willing to pay to expedite FERC restoration. Regardless of the political calculus, expediting FERC nominations is a pressing economic need both parties must address.

Image by QiuJu Song

Why process matters in congressional appropriations



As recent legislative sessions repeatedly have ground to a halt with threats of government shutdown, we’re forced to wonder: what has gone so terribly wrong in the appropriations process? Peter Hanson of the University of Denver explores this question in his recent white paper for the Brookings Institution. For much of Congress’ history, discretionary spending was set by “regular order,” in which appropriations bills were debated and passed individually. Hanson investigates how this method of appropriations went extinct and ultimately calls for restoring the time-honored process.

Regular order allowed the budget to be broken into bite-sized pieces, encouraging legislators to exercise greater control over spending. Today, by contrast, Congress has taken to bundling appropriations bills into thousand-page omnibus packages that are rammed through at the end of legislative sessions—often in the face of a looming government shutdown. Lawmakers hardly have time to read these voluminous bills, let alone exercise oversight.

Unsurprisingly, an inferior process leads to inferior results. As Hanson and political scientists like Matthew Green and Daniel Burns have argued, regular order helps encourage debate among legislators and reduces the risk of “substandard legislation” being stuck into appropriations bills. Furthermore, as Yuval Levin has noted, breaking up the budget plays better to the innate strengths of Congress, in that it allows individual lawmakers to exercise greater influence over discrete funding choices.

Hanson pins most of the blame for the collapse of regular order on the Senate, since its rules allow senators to engage in disruptive tactics like filibustering or attaching controversial amendments to bills in order to score political points. This leads Senate leadership to stifle debate by pushing “must-pass” omnibus bills under tight deadlines at the end of legislative sessions.

Hanson proposes four fixes for the appropriations process: reforming the filibuster to allow a simple majority vote to end debate on spending bills; allowing bills to be considered concurrently by the House and Senate; restoring earmarks on a limited basis, to help grease the skids to pass legislation; and shifting toward nonpublic deal-making to shield individual lawmakers from negative publicity.

While there’s room to debate the efficacy of these different proposals—such as whether restoring earmarks would really help pass more legislation—Hanson’s plea to restore congressional authority over the federal budget is timely. Given the well-documented diminution in power Congress has experienced vis-à-vis the executive branch in recent years, it’s time for our First Branch of government to reassert its leadership role.

Congress’ power of the purse is a good place to start; James Madison himself described it as “the most complete and effectual weapon” to be wielded by the people’s representatives. But even the most effectual weapon is useless when wielded incorrectly. Until Congress recognizes why process matters when it comes to appropriations, it will have trouble restoring its control over our government’s spending.

Image by mj007

Reforming the administrative state—and reining it in


National Affairs released a special report earlier this month on comprehensive proposals to rein in the regulatory state. The report’s three authors—Hoover Institution Research Fellow Adam White, Manhattan Institute Senior Fellow Oren Cass and R Street Institute Senior Fellow Kevin Kosar—took part in a recent panel moderated by National Affairs editor Yuval Levin at an event the magazine co-hosted with Hoover. Video of the discussion is embedded below:

What Dow 20,000 looks like in inflation-adjusted terms


The Dow Jones industrial average closing Jan. 25 at more than 20,000 inspired big, top-of-the-fold, front-page headlines in both the Wall Street Journal and the Financial Times (although the story was on Page 14 of The Washington Post). The Journal and FT both ran long-term graphs of the DJIA, but both were in nominal dollars. In nominal dollars, the 100-year history looks like Graph 1—the DJIA is 211 times its Dec. 31, 1916, level.

This history includes two world wars, a Great Depression, several other wars, the great inflation of the 1970s, numerous financial crises both domestic and international, booms and recessions, amazing innovations, unending political debate and 18 U.S. presidents (10 Republicans and eight Democrats). Through all this, there has been, up until yesterday, an average annual nominal price increase of 5.5 percent in the DJIA.

Graph 1
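The two figures above are consistent with one another: compounding at 5.5 percent annually for 100 years yields roughly a 211-fold increase. A quick sanity check of that arithmetic, sketched in Python (the 1916 index level shown is hypothetical and used only to illustrate the calculation; only the multiple and time span come from the text):

```python
# Check that a 211x multiple over 100 years implies the quoted
# 5.5 percent average annual nominal increase in the DJIA.

start_level = 95.00   # hypothetical DJIA close, Dec. 31, 1916 (illustration only)
multiple = 211        # growth multiple cited above
years = 100           # Dec. 31, 1916, through the Jan. 25 close, roughly

end_level = start_level * multiple

# Compound annual growth rate: (end / start) ** (1 / years) - 1
cagr = (end_level / start_level) ** (1 / years) - 1
print(f"Implied nominal annual increase: {cagr:.1%}")  # → 5.5%
```

The same formula run in reverse (1.055 ** 100 ≈ 211) confirms the article’s multiple.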

Using nominal dollars makes the graphs rhetorically more impressive, but ignores that, for much of that long history, the Federal Reserve and the government have been busily depreciating the dollar. A dollar now is worth 4.8 percent of what