Governor, legislators spend more and ignore obstacles to reform


Legislators announced a budget deal last week that spends a record $125 billion in the general fund. But most interesting isn’t what’s in the deal, but what isn’t.

There’s plenty of new spending, of course, but not so much that it outpaces the rate of inflation. There are controversial “trailer” bills that attempt to change the rules in an ongoing recall election and take away power from elected members of the Board of Equalization, the state’s tax board. Missing are any attempts at serious reform of existing government programs or ways to stretch the already hefty tax dollars Californians send to Sacramento.

The budget’s authors talk quite a lot about funding important priorities, especially the public-education programs that consume an awe-inspiring 43 percent of the general fund. Yet Gov. Jerry Brown and the Democrat-dominated Legislature refuse to confront the main reason such programs typically are so costly and ineffective: public-sector unions.

These unions are so powerful that they stifle cost-saving reforms in every conceivable area of government – from the prison system to policing to transportation programs to the public school and college systems. Union work rules don’t allow for experimentation and creativity, or even the firing of poorly performing employees. The state is thus left with just one approach: throwing more money at the problem.

This is why every year’s budget kerfuffle centers on figuring out ways to come up with more money to spend in the exact same ways. The only difference this year is that, because of Democratic supermajorities in both houses of the Legislature, the state now plans to spend more than ever. What else would you expect, given that the minority party has no power to thwart such efforts?

The investigative news site CALmatters provides perhaps the best example of the disconnect between higher spending and better outcomes, noting in a June 18 report that there’s no evidence the tens of billions of dollars the state has pumped into failing schools under its new public education system have done much of anything to help the most disadvantaged students. The investigation concluded that “the biggest districts with the greatest clusters of needy children found limited success with the policy’s goal: to close the achievement gap between these students and their more privileged peers. Instead, test scores in most of those districts show the gap is growing.”

The same is true for myriad programs, but as the single largest chunk of the budget, any failures in the K-14 education system certainly have the deepest financial ramifications. As I reported recently for the California Policy Center, while voters in the Los Angeles Unified School District and elsewhere are supporting candidates who back expanded access to charter schools for poor children, state legislators are backing legislation pushed by the California Teachers Association that would make it much harder for locals to start such schools.

Charters operate with less funding than comparable school districts, yet often (but not always, of course) show remarkable progress in closing the achievement gaps that aren’t being closed by truckloads of new state spending. Think of it this way: If a system is failing, there’s little chance that giving the same agencies more money to do things in the same way will yield significantly different results. It’s obvious, but not to legislators or Gov. Brown.

The budget deal also includes a provision that lets the state borrow $6 billion from a short-term investment fund to pay down some of California’s growing pension debt. It’s another example of the state’s money-dumping approach to a massive financial problem. Instead of taking aim at overly generous pension formulas, or myriad pension-spiking and disability abuses, the state is borrowing money at low interest rates from one account and putting the money into another account (that is supposed to earn higher rates) to chisel away at some pension debt.

Some would compare this to a homeowner borrowing money from a low-interest home-equity account to pay off a higher-interest credit card. But there’s a disturbing downside to the governor’s approach here – namely that it shifts more of the costs and risks from public employees to the state’s taxpayers. The plan lets the California Public Employees’ Retirement System (CalPERS) keep its investment predictions artificially high, as the higher the predicted rate of return, the lower the predicted liability. The result—as observed recently by David Crane, a Stanford University lecturer who had served as Gov. Arnold Schwarzenegger’s pension adviser—is that “using special fund cash to finance pension contributions would reward CalPERS’s board for keeping normal-cost contributions – the only pension costs shared by employees — unreasonably low.”

The budget plan also increases spending on Medi-Cal, expands benefits that poor people receive under the state Earned Income Tax Credit, expands funding for the two university systems (although it punishes the University of California Office of the President, following the recent scathing audit about its spending practices from the state auditor), appropriates new funding for water projects and includes a long list of expanded spending for myriad programs.

Gov. Brown said the budget is “balanced and progressive,” but columnist Dan Walters gets to the heart of the problem: “The budget does little or nothing to whittle down that burden on future generations of taxpayers.”

Again, it’s not what the budget does, but what it doesn’t do. It doesn’t deal with debt, or even try to reform the education system. It doesn’t try to reform any existing programs. How could it? That would mean tackling the elephant in the room, the public-employee unions who view government not as a service provider but as a jobs program. Until California officials decide to tackle that problem, expect more spending, higher taxes and no improvement in the state’s long-term fiscal health or the way it provides the public with services.

Image by underverse


FDA misinterprets massive victory on teen smoking


As detailed this morning by the Food and Drug Administration, cigarette smoking by U.S. high school students has been cut in half since 2011—from 15.8 percent to 8.0 percent—a remarkable and previously unanticipated public health victory.

Unfortunately, it appears federal authorities may be misattributing the cause. In his announcement earlier today, FDA Commissioner Scott Gottlieb attributes most, if not all, of this reduction in smoking to a federally sponsored program that has only been in place since 2014. Despite substantial evidence in federally sponsored surveys in the United States and abroad showing that remarkable reductions in teen and adult smoking have been concurrent with the increasing popularity of e-cigarettes, the FDA announcement makes no reference to the possibility that much, if not most, of the recent reductions in teen smoking may be attributable to e-cigarettes.

In fact, Gottlieb urges continuing efforts to reduce teen use of all nonpharmaceutical nicotine delivery products, while endorsing expanded efforts at smoking cessation that rely on the pharmaceutical nicotine gums, patches and other products that have proved to be of only marginal effectiveness over the past four decades.

This public health victory is too important to leave to chance and guesswork. If Commissioner Gottlieb has evidence to support the claim that The Real Cost campaign “has already helped prevent nearly 350,000 kids from smoking cigarettes since it launched in 2014,” he should present it to the public. Regulators and public health authorities also should present and discuss the evidence for and against the possibility that the availability of e-cigarettes and related vapor products may, in fact, have played a major role in securing these reductions in smoking.

This is not an academic question. Recently promulgated regulations from Gottlieb’s own FDA threaten to eliminate more than 99 percent of e-cig products from the marketplace before the end of 2018, including all or almost all of the vape-shop component of this industry. The limited data available strongly suggest that the vape-shop products—with their ability to customize devices, flavors and strengths of nicotine to satisfy the preferences of each smoker, and modify the flavors and strength of nicotine over time to prevent relapse to cigarettes—may be more effective than the mass-market products in achieving and maintaining reductions in smoking in both youth and adults.

Image by Sabphoto


Harm reduction is about making better choices, not perfect ones


Dr. Mark Boom, president and CEO of the Houston Methodist hospital system in Texas, suggests in a recent piece in The Hill that proponents of vaping are simply ignoring evidence that vapor products are not 100 percent safe.

Of course, people in the vaping community do not think that e-cigarettes are 100 percent safe. And if these products were found to increase the incidence of teen smoking of combustible cigarettes, we don’t want that either.

However, Boom appears to misunderstand the philosophy of harm reduction. Boom no doubt would encourage his patients who use intravenous drugs to, at the very least, use clean needles, rather than sharing. If he did not, he would be grossly abusing his privileged position as a healthcare authority. Similarly, applying a harm reduction philosophy by encouraging smokers to switch to e-cigarettes could save the vast majority of the 480,000 lives taken by combustible cigarettes every year.

As Boom rightly points out, e-cigarettes do, in fact, contain toxins. These are, however, at a very low concentration in the excipients – the products that make up the aerosol suspension that delivers the active ingredient of nicotine. What he neglects to add is that the excipients in nicotine liquid are strikingly similar to those in asthma inhalers. We certainly wouldn’t suggest to an asthma patient to forgo their medication because they are also inhaling toxins.

As a pharmacologist, I would encourage every person who ingests toxins to stop doing so. Of course I would. But my years in addiction research have made clear that you cannot simply tell someone to not pick up that cigarette, syringe or beer. Until that is possible, we have to encourage people to make better choices – which, unsurprisingly, is very easy to do.

When people do things we don’t approve of, we often write them off as not caring about their own health or personhood. But having worked at community organizations that distribute clean needles to curb transmission of infectious disease, naloxone to reverse overdoses and HIV drugs to prevent new infections, I have seen clearly that people do recognize the risks they take every day and embrace opportunities to reduce the consequences associated with risky behaviors.

Image by Grey Carnation


Setting the record straight on copyright modernization


There’s a lot to be said for the adage that “we shouldn’t let the perfect be the enemy of the good.” While true in many situations, the adage also requires that there be enough “good” to make the effort worthwhile, rather than wasting energy better deployed doing something else.

In a recent blog post on Truth on the Market, Kristian Stout of the International Center for Law and Economics takes issue with my framing of a bill that would require the register of copyrights—the person who heads the Copyright Office within the Library of Congress—to be a presidential appointment. I should add the proposal comes during a time when President Donald Trump is considerably behind in selecting and confirming his appointees to a broad range of executive branch positions.

Unfortunately, Stout mischaracterizes and misreads my position. In my TechDirt piece, I described both points of view about the bill, writing that “opponents argue the bill will make the register and the Copyright Office more politicized and vulnerable to capture by special interests.” Stout takes this out of context and represents it as my position, rather than a description of what others have said.

There are a number of other issues with Stout’s piece, not all of which are worth addressing. But I will tackle the main ones.

It’s true, as Stout claims, that the idea for making the register a nominated and confirmed position has been under discussion for several years as part of the House Judiciary Committee’s copyright review, but so were a lot of other things that didn’t come to fruition. My point is not that this idea is totally new, but that the impetus for the bill to be rushed through now is motivated by the political dynamic between Congress and Librarian of Congress Carla Hayden, as well as her removal last year of then-Register of Copyrights Maria Pallante. Stout asserts that Hayden’s nomination was not politicized when, in fact, it was. The Heritage Foundation, among other conservative groups, argued against her confirmation. Heritage Action even urged senators to vote “no” on her nomination, a position with which we disagreed.

To set the record straight — I don’t think it’s a terrible bill. As I’ve argued in TechDirt and The Hill, there are some reasonable arguments in its favor. There are also some plausible arguments against it. I simply don’t think it does much to move the ball either way.

The main point of the bill, according to many of its proponents, would be to make the Copyright Office position more politically accountable. In theory, with congressional input, stakeholders on all sides would have an opportunity to weigh in on who gets confirmed for the position. This could limit edge cases where there is a truly awful candidate. But the Senate rarely, if ever, rejects presidential appointments who are otherwise broadly qualified — particularly for what is not a Cabinet-level position. And there wouldn’t be many groups capable of mounting a successful opposition fight over this position, as they might over a Supreme Court seat (even then, it’s rarely the primary factor). Even for Heritage, likely the most powerful conservative group in Washington, key-vote scoring against Hayden in a Republican-controlled Senate only got them 18 votes.

This, in itself, is not much of a justification for a bill.

One of the key points of Stout’s argument for the legislation is that: “Separating the Copyright Office from the Library is a straightforward and seemingly apolitical step toward modernization.” But changing who appoints the register shouldn’t be conflated with separation or modernization. Indeed, the librarian of Congress still has final authority over all of the office’s substantive regulatory powers. Changing who picks the register also has nothing to do with meeting the challenges of modernizing the office’s information technology infrastructure. If an independent office is what you want, this bill isn’t that.

For the record, we at R Street are not necessarily opposed to an independent (or relocated) Copyright Office. Some scholars, including former Register Pallante, make a plausible case that the systemic bureaucracies of the Library are part of what’s holding the Copyright Office back. But it’s also hard to separate the Library’s well-documented IT problems from the decadeslong tenure of the previous librarian, James Billington. Additionally, there are IT modernization challenges at every level of the federal government, including independent agencies, and it may be worth giving the new librarian a chance to fix them.

At heart, the location of the Copyright Office is a complex question of public administration that is worthy of deep consideration and review. An immediate step I have suggested in conversations with colleagues is to have Congress ask the National Academy of Public Administration to conduct a review of the internal structural challenges of the Library and its component agencies (as it did for the PTO in 2005). This would inject a much-needed dose of objectivity into a discussion that has unfortunately served as another proxy battle between the entrenched sides of the intellectual property debate.

In his conclusion, Stout makes an excellent point: “Sensible process reforms should be implementable without the rancor that plagues most substantive copyright debates.” I agree. Regardless of how strong you think our nation’s copyright laws ought to be, you should be in favor of making the system’s core functions work better. This bill will do little, if anything, to advance that goal. I look forward to working with stakeholders on all sides, including Stout, to find solutions that do.

Image by Jirsak


PACE Act would prosecute teen sexting as kiddie porn


Crimes against children, particularly those that involve sexual exploitation, are beyond the pale. But while society needs to make sure it protects children from sexual abuse, recent legislation passed by the U.S. House could cause more problems than it solves – hurting minors, expanding mandatory minimums and creating redundant federal authority where there already are similar laws at the state level.

By a 368-51 margin, the House voted May 25 to approve H.R. 1761, the Protecting Against Child Exploitation (PACE) Act of 2017. The bill is intended to strengthen federal laws dealing with the production and distribution of child pornography by making the transmission of sexual images of minors a federal crime. The measure has moved on to the upper chamber, where it will be considered by the Senate Judiciary Committee.

While the bill’s purpose is to punish child predators, its unintended consequence will be to create more criminals out of teenagers whose main crime is simply lacking common sense.

As written, the law could apply to minors who send sexual images to other minors, or what is commonly referred to as “sexting.” The House-passed bill provides no exemption or provision to deal with minors who engage in sexting, meaning they could be subject to a mandatory minimum sentence of 15 years in prison and lifetime registration as a sex offender. Because of how broadly the text is written, even a teenager who merely views a sexual image or requests that one be sent could be subject to the mandatory minimum.

Sexting among teenagers increasingly has become the norm. While the phenomenon is worth a larger discussion, most would agree that locking teenagers up for 15 years is not the best way to handle the situation. Few believe these minors are committing crimes on a par with actual child predators. They should not be treated the same way under the law.

Teenagers are still minors in the eyes of the court. By creating an inflexible law that cannot take into account the ages of those involved, the law will force the courts to punish minors for having poor judgment. For numerous other crimes, the court system is purposely designed differently when it comes to how and whether to prosecute and sentence minors. Judges are given more tools to keep them out of jail and without criminal records. By retaining local jurisdiction, communities could respond more effectively to offenders and victims, as well as to the community at large. Child pornography laws should protect children from terrible acts, not punish teenagers for lapses in judgment.

Such concerns could have been addressed in the PACE Act, were it not for pure laziness on the part of the House of Representatives. The bill was passed without any hearings or input from experts, and approved as members fled Washington for their Memorial Day recess. The American people deserve better than that.

There is still hope that the Senate will take notice of these issues. Law enforcement at both the state and federal level already have multiple tools at their disposal to prosecute child predators. This expansion of federal power is nothing but Congress creating a solution to a problem that does not exist.

Image by nito


Dual-class shares and the shareholder empowerment movement


The shareholder empowerment movement has renewed its effort to eliminate, restrict or, at the very least, discourage use of dual-class share structures—that is, classes of common stock with unequal voting rights—in initial public offerings. Of particular interest to the movement, which is made up primarily of public pension funds and union-related funds that hold more than $3 billion in assets, was the recent Snap Inc. IPO that sold nonvoting stock to the public, a first for IPOs with dual-class shares.

Typically, a company will issue a class of common stock (“ordinary shares”) to the public that carries one vote per share, as Facebook Inc. did in its IPO, while reserving a separate “super-voting” class that provides founders like Mark Zuckerberg with at least 10 votes per share. This structure allows the founders to maintain control of the company without having to own the majority of outstanding common stock.

Even though it offered no voting rights in the shares sold to the public, the Snap IPO was a huge success. Snap priced its IPO at $17 per share, giving it a market valuation of roughly $24 billion. The book was more than 10 times oversubscribed and Snap could have priced the IPO at a price of up to $19 per share.

The Council of Institutional Investors, the trade organization that represents the shareholder empowerment movement, has asked the S&P Dow Jones Indices, MSCI Inc. and FTSE Russell to exclude Snap Inc. and other companies with nonvoting stock from their indices unless they include extremely restrictive provisions, such as maximum sunset provisions—triggers that would terminate the super-voting characteristics of the founders’ shares—of three to five years. Moreover, consistent with the CII’s general policy, the letters the council sent also advocate for a forced conversion of all dual-class share structures to one-share, one-vote, unless the majority of ordinary shares vote to extend the dual-class structures for a maximum of five years.

The movement’s advocacy is not confined to those IPOs with dual-class shares listed on the U.S. stock exchanges. It also is attempting to persuade the Singapore stock exchange not to allow dual-class share structures of any kind.

If the movement is successful, this shift would not be trivial, as many of our most valuable and dynamic companies have gone public by offering shares with unequal voting rights. Besides Snap and Facebook, other companies that have gone public with dual-class shares include Alphabet Inc. (Google); LinkedIn (acquired by Microsoft for $26 billion in 2016); Comcast; Zoetis Inc.; Nike, Inc.; and Alibaba Group Holding Ltd. Two of these companies, Alphabet and Facebook, rank in the top 10 in the world based on market valuation. Berkshire Hathaway Inc., a company that also uses a dual-class share structure, also ranks in the top 10, although it only started using the structure after Warren Buffett bought control of the company.

Public companies with dual-class share structures have an aggregate market value of close to $4 trillion. As reflected in their market valuations, they are some of our most important companies, helping to fuel the growth of the economy.

The movement’s vigorous response to Snap’s hugely successful IPO was unsurprising. The CII, since its founding in 1985, has promoted a “one-share, one-vote” policy as one of its bedrock principles. But this policy of “shareholder democracy” should not be confused with political democracy, where each person gets one vote. In shareholder democracy, voting power is assigned according to property ownership – i.e., how many shares the person or entity owns. Dual-class share structures clearly violate the CII’s policy of shareholder democracy and are an obvious threat to the movement’s power. That is, the more public companies that utilize a dual-class share structure, the more controlled companies exist and the less power the movement has.

Most importantly, the movement’s advocacy comes into strong conflict with what many believe to be the great strength of our system of corporate governance: the private ordering of corporate governance arrangements, with dual-class share structures being an optimal result of that ordering. Consistent with this understanding, NASDAQ Inc. recently declared:

One of America’s greatest strengths is that we are a magnet for entrepreneurship and innovation. Central to cultivating this strength is establishing multiple paths entrepreneurs can take to public markets. Each publicly-traded company should have flexibility to determine a class structure that is most appropriate and beneficial for them, so long as this structure is transparent and disclosed up front so that investors have complete visibility into the company. Dual class structures allow investors to invest side-by-side with innovators and high growth companies, enjoying the financial benefits of these companies’ success.

At its core, the shareholder empowerment movement advocates shifting corporate decision-making authority to shareholders, and thus away from boards of directors and executive management, the most informed loci of corporate authority. Shareholder empowerment, not maximizing shareholder wealth, is the movement’s objective. This movement must be stopped from opportunistically interfering with the use of dual-class share structures in IPOs.

Image by create jobs 51


How Congress became colonized by the imperial presidency


Ever since Arthur Schlesinger’s 1973 book coined the phrase, the so-called “imperial presidency” has been a perennial topic of our national political discourse. At a time when the American branches of government are separate but unequal, the seven essays collected in The Imperial Presidency and the Constitution trace when fears of an imperial presidency first arose, the extent to which such fears are justified and what can be done about it.

Adam J. White’s contribution, “The Administrative State and the Imperial Presidency,” cautions not to conflate the “imperial presidency” with the administrative state itself. As White points out, the administrative state is “first and foremost a creation of Congress,” and “to at least some extent, a necessary creation.”

By contrast, the imperial presidency refers to the power the president wields through his office. While this power can be channeled and enhanced through the apparatus of the administrative state, an imperial presidency also “can restrain the administrative state, as in the Reagan administration … and, less obviously, the administrative state can restrain an imperial president.”

In modern times, of course, the power of the presidency and the administrative state have grown in tandem. “The president wields executive power broadly to expand the administrative state, and the administrative state acts in service of the current president’s agenda,” White writes.

After various failed attempts by Congress itself to act as an administrative body during the Articles of Confederation era, the U.S. Constitution provided for an energetic executive, which Alexander Hamilton described as “essential to the steady administration of the laws.” Despite this, the Constitution offered little in the way of an affirmative vision of the administrative bureaucracy, an omission some scholars have referred to as “the hole in the Constitution.”

Although there were earlier antecedents, Congress’ creation of the Interstate Commerce Commission in 1887 marked the modern administrative state’s arrival. Over time, the ICC’s powers were enhanced by Congress to encompass both judicial and legislative powers, given its ability to both set rates and adjudicate disputes. During the Progressive Era and through the New Deal, more administrative agencies were built on the ICC model, including the Federal Trade Commission and Federal Communications Commission.

Importantly, these agencies were distinct from the traditional executive branch departments and thus operated “outside of the direct oversight of the president,” White notes. Progressive policymakers—starting with some in the Franklin Roosevelt administration—quickly grew frustrated with the agencies’ ability to “impede an energetic liberal president’s regulatory agenda.”

Years later, conservatives also began to bemoan the independent nature of certain agencies. As the Reagan administration sought to cut back on the regulatory state, it attempted to increase the president’s power over the administrative state through mechanisms such as centralized regulatory review under the Office of Information and Regulatory Affairs. Since Reagan, presidents of both parties increasingly have embraced greater presidential control over federal agencies. Some used that control to expand the administrative state’s power, while others have sought to curtail it.

The “most straightforward” way to shrink the administrative state, White argues, “would be for Congress to do the work of taking delegated powers away from the agencies, by amending statutes.” Since many legislators prefer to delegate their power in an effort to avoid responsibility, White views this option as unrealistic.

This leads White to the “second best option,” which is to pass some form of broad regulatory reform legislation that revamps the processes through which agencies enact rules. He mentions the REINS Act and the Regulatory Accountability Act as two possible options. R Street actually has identified a whole menu of options from which Congress feasibly could choose.

More broadly, White points out that using the imperial presidency as a means to control and direct the administrative state is no longer an effective mechanism to rein it in. Rather, it’s far past time that the other branches assert themselves and join the fray. One possibility is for the judicial branch to revisit its doctrines that grant significant deference to federal agencies.

In many ways, Andrew Rudalevige’s contribution, “Constitutional Structure, Political History and the Invisible Congress,” picks up where White’s essay leaves off. When the system of separated powers works as intended, the legislative and executive branches operate as “rivals for power,” making their relationship contentious, rather than cooperative. Although the Founding Fathers were more concerned about the legislature accreting power than the executive, Rudalevige’s chapter retraces how both structural and political factors have created the exact opposite dynamic.

Rudalevige lays out an obvious—but often underappreciated—truth: the president has a built-in advantage in that he is just a single person. By contrast, Congress must function as a 535-member conglomeration of legislators spread across two different chambers and hailing from different political parties and geographical regions. Given that each member carries “their own localized electoral incentives,” they will “rarely find it in their interests to work together, much less to confront the executive branch.”

Another factor Rudalevige pinpoints for Congress’ decline is the rise of political polarization. Politics has increasingly become a team sport: “A vote against presidential overreach is now seen by the president’s party colleagues as damaging to the party brand, and thus to their own careers.” The result is that legislators are more likely to toe the party line in pursuit of short-term policy victories, rather than vote to strengthen Congress as an institution.

Rudalevige also highlights how modern travel has allowed congressmen to transit back-and-forth from their home districts to Washington with relative ease. This has led to the rise of the “Tuesday-Thursday club of drop-in legislators,” who spend more time pressing the flesh with donors and constituents back home than doing the hard work of hammering out legislative compromises. One option is for Congress to extend its work weeks, which could increase the amount of floor time available to conduct legislative business.

Exercising more effective oversight doesn’t just mean finding more time; it also requires more capacity. Rudalevige cites R Street’s Kevin Kosar, who has chronicled the decline in congressional staff and pay levels over the past 40 years. Beefing up congressional staff, as well as support systems like the Congressional Research Service, would help address this deficiency.

Other possibilities include forming new institutions such as a Congressional Regulation Office—as proposed by Kosar and the Brookings Institution’s Philip Wallach—to provide independent cost-benefit analyses and retrospective reviews of regulations. A final idea—and one long advocated by policy wonks—is a return to “regular order” budgeting, in which Congress breaks the federal budget into bite-sized pieces rather than relying on last-second, thousand-page omnibus spending bills to keep the government’s lights on.

While all of these ideas are ripe for the picking, Rudalevige admits that “current returns are unpromising” that Congress will actually implement any of them. Nonetheless, he’s correct in warning that “the matter demands our attention even so.” Let’s hope Congress—and the American citizenry—heeds his call.

Image by Ed-Ni Photo


How executive ‘detailees’ could help ease Congress’ staffing problems


It is becoming more widely acknowledged that Congress has a staffing problem. While the executive branch employs more than 4 million people, the legislative branch has only about 30,000. This number includes personnel toiling for agencies that do not readily come to mind as legislative, like the Government Publishing Office, the Architect of the Capitol and the U.S. Capitol Police.

While congressional capacity advocates call for more funding and personnel for the legislative branch, political scientists Russell Mills and Jennifer Selin examine the use of an often-overlooked stream of expertise available to congressional committees: federal agency detailees. Detailees are executive agency personnel with a particular policy mastery who are temporarily loaned out to congressional committees. The typical detailee assignment runs one year.

Hill operators and observers have long known policy expertise resides primarily in congressional committee staff. Compared to House and Senate personal office aides, committee staffers typically have more experience and narrower portfolios, both of which enhance the abilities of committees and their members to conduct oversight, draft legislation and develop fruitful lines of communication with relevant agency stakeholders.

However, as Mills and Selin point out in a recent piece in Legislative Studies Quarterly, there are only about half as many committee staff as there were in 1980, while inflation-adjusted pay levels have fallen 20 percent for many committee aides. This reduction in resources has hampered committees’ oversight capabilities, in addition to abetting the centralization of policymaking in leadership offices or its complete delegation to the executive branch.

House versus Senate committee staff, 1977-2014


SOURCE: Russell Mills and Jennifer Selin, 2017

Mills and Selin argue detailees offer at least three specific benefits to supplement Congress’ legislative and oversight responsibilities:

  1. Detailees provide additional legislative support. Though committee staffers are usually issue specialists, “detailees often have specialized, expert knowledge of a policy, [and] they are able to provide awareness more traditional congressional staff may not have.” Moreover, given their personal experience within the agencies, detailees offer committees important insight into the decision-making processes and likely agency responses to potential congressional action.
  2. Detailees assist with executive branch oversight. “The process for securing information through requests directly to a federal agency is slower and involves agency coordination with the presidential administration. Detailees provide a way around these problems.” Simply having agency contacts and being able to connect committee staffers directly to those agency personnel most likely to respond quickly with accurate information can expedite the frustratingly slow information-gathering process vital to conducting effective congressional oversight.
  3. Detailees supplement interest-group engagement. In developing policy, committee staffers spend much of their time meeting with relevant policy stakeholders. “Committee staff routinely assists members of Congress by meeting with interest groups to gather their input for legislative initiatives as well as to hear their objections or support for actions taken by executive agencies.” Detailees provide the committee more, and different, stakeholder contacts established from the agency perspective, which allows for better information filtering and a more informed assessment of legislative potential.

Finally, and importantly, Mills and Selin point out that use of detailees is a rare win-win for both the legislative and the executive branches. The benefits to Congress are clear: committees gain expert-level staffers with experience and connections to the agencies under the committee’s purview, all on the agencies’ dime. Sen. Susan Collins, R-Maine, has noted:

These detailees apply their expertise in researching issues, staffing hearings, and working on legislation. In return, they gain valuable experience, which develops their careers and benefits their agencies.

The gains for the executive branch are less intuitive. After all, the agency loses a competent staffer who then offers Congress firsthand insight into agency operations, even potentially providing increased oversight to the very agency from which the staffer originated.

But from qualitative interviews they conducted with current and former detailees, Mills and Selin found that “detailees gain experience in the legislative process, can represent the interests and perspectives of the agency, and give the agency a conduit to committee decision making.”

In other words, just as detailees provide insider information to committees on agency operations, agencies profit from their detailees returning to the agency with intelligence on committee decision-making, policymaking and oversight capabilities. All of which our personnel-strapped national legislature badly needs.

Five years of R Street


Five years ago today, Deborah Bailin, Christian Cámara, Julie Drenner, R.J. Lehmann, Alan Smith and I resigned our jobs at the Heartland Institute over a horrifically ill-advised billboard advertisement and began a new think tank called R Street. Tonight, we’ll celebrate our fifth anniversary.

We’re now almost 40 strong and have a budget about 10 times that of our first year. In honor of our anniversary, here are five bits of trivia about R Street that I like to share:

  1. R Street’s first hire was Erica Schoder, now our senior vice president for operations. Our first office, previously the Heartland Institute’s Washington office, was a converted art gallery above a vintage clothing store.
  2. Some other names we considered were the Metis Institute (after the Greek goddess of common sense) and JuneFirst (after the day we officially opened). Our offices were near R Street, and R Street is the first mostly residential street off Connecticut Avenue, arguably Washington’s main street. So it’s the place where real life begins in the nation’s capital.
  3. One huge advantage of the name R Street was that we could get the short URL rstreet.org. That’s actually a big deal. It makes our email addresses much easier to type. Many other think tanks that have started recently have long and unwieldy URLs. We don’t.
  4. To my knowledge, we remain the only right-of-center think tank that both reimburses bike sharing and maintains a gender-identity nondiscrimination policy. I’m a cyclist and support civil rights protections for the gender nonconforming. But I’d argue that both policies are simply grounded in common sense.
  5. We believe that pirates are much cooler than ninjas. By a lot.

Image by Africa Studio


Reports of the taxi industry’s death have been greatly exaggerated


Co-written with Jonathan Haggerty 

It seems like nearly every time ridesharing is brought up in New York City, someone will inevitably bring up the dramatic decline in taxi medallion prices. Dubbed the “Uber effect” by American Enterprise Institute scholar Mark Perry, the theory is that increased competition from companies like Uber and Lyft has eroded the legal monopoly that taxi medallion holders previously held over the on-demand automobile transport market.

By competing in this once-insulated market, transportation network companies like Uber and Lyft have made these medallions significantly less valuable. One proxy for this decline can be found in the share price of Medallion Financial Corp., a publicly traded consumer and commercial lending firm that is a major creditor in the taxi medallion lending business. Looking at the period from 2013 to 2016, the decline certainly looks precipitous:


This may not be the complete story, however. After all, the stock price may vary with the specific quality of the loans the company issues, its underlying cost of capital and general market confidence. Furthermore, the stock price doesn’t make any distinction across the numerous categories of medallion ownership.

To the extent that news reports cite changes in the actual market value of a medallion, they usually do so anecdotally, comparing the peak value in 2014 of more than $1 million to the current trough of under $300,000.

Given the clamor and potential policy implications, a more detailed analysis seemed appropriate. We examined medallion price trends over time and differentiated across the different medallion categories. NYC’s Taxi and Limousine Commission compiles monthly records of medallion transactions for each of six categories: individual unrestricted, handicap accessible and fuel alternative, as well as corporate (minifleet) unrestricted, accessible and fuel alternative. Unrestricted cabs are the general-purpose yellow taxis everyone thinks of; handicap-accessible cabs are specially retrofitted to give persons with disabilities easier access; and alternative-fuel cabs must meet specific fuel requirements tilted toward being more environmentally friendly.

The primary breakdown is between individual and minifleet. Where an individual medallion owner has to spend a minimum number of hours per year (usually the equivalent of 200 separate nine-hour shifts) driving the cab, a minifleet owner can lease out taxis to other drivers.
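For readers who want to see the shape of the analysis, a grouping along these lines can be sketched in a few lines of Python. The records and field names below are invented for illustration; the TLC’s actual transfer files use their own schema and many more rows:

```python
from collections import defaultdict
from statistics import median

# Hypothetical transfer records: (month, license category, sale price).
# These figures are illustrative, not actual TLC data.
sales = [
    ("2014-03", "individual_unrestricted", 1_050_000),
    ("2014-03", "minifleet_unrestricted", 1_300_000),
    ("2016-11", "individual_unrestricted", 250_000),
    ("2016-11", "minifleet_unrestricted", 410_000),
]

# Group transfer prices by (month, license category), mirroring the
# six-category breakdown the TLC uses.
groups = defaultdict(list)
for month, category, price in sales:
    groups[(month, category)].append(price)

# Median sale price per category per month; medians blunt the effect of
# outlier transfers such as family discounts or business liquidations.
monthly_median = {key: median(prices) for key, prices in groups.items()}
```

With real data, the resulting per-category series is what would be plotted against Medallion Financial’s stock price.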

By far, the largest categories are the individual and minifleet unrestricted licenses, and the general decline here tracks fairly well with Medallion Financial’s stock price:


Immediately we can see that there is a clear and substantial price premium for minifleet licenses over individual licenses. This makes sense intuitively. A license with strict personal driving requirements is going to be more restrictive on your time, and less valuable, than one without. Another factor that stands out is how messy the data is, with transfers at price points both significantly cheaper and significantly more expensive than the average in any given month. Unfortunately, it’s difficult to tell whether this was an issue with the NYC taxi commission’s data recording or whether these were due to external factors, like family transfer discounts or business liquidations.

However, it is important to recognize that the towering price high in 2014 was spurred partly by fleet owners borrowing against the rising value of the medallions they already owned to finance further purchases. So while medallion prices are undoubtedly dropping, the decline may look worse because prices were experiencing a bit of a bubble in the first place. Indeed, a former head of the TLC stated in April that “the (taxi) industry’s performance has not been as bad as the decline in medallion prices would suggest.” In other words, don’t mistake the price of medallions for the health of the industry overall.

Another obvious factor here is the decrease in liquidity since 2014. One sale in March and two in February of 2017 means one of two things is happening: either medallion owners can’t find buyers, or owners are holding on because they see a price rise or stabilization on the horizon. The prospect of a bailout could keep buoying prices, while easing restrictions on medallion transfers has increased the potential pool of buyers.

Unfortunately, there were so few alternative fuel licenses released or transferred that there was not much data to analyze. Handicap accessible licenses, however, had a more interesting story to tell:


Here you can see that the handicap accessible licenses have actually appreciated in value over the same timeframe. (If the graph looks funky with the straight lines, that’s due to the initial auctions where these licenses were sold.) This is not an apples-to-apples comparison, because we have so little data post-2014, but the total lack of sales (for minifleet accessible) may be an indication that it’s not an asset worth liquidating.

One reason for this may be that Uber partners with cab drivers who own these handicap-accessible licenses to help provide rides on its platform to users with disabilities. It seems intuitive, then, that these specific medallions would continue to hold value.

But perhaps the most important factor in all this is the total size of the market. The market share of taxis has shrunk with the emergence of Uber and Lyft, but the overall size of the market is larger today:


Note that taxi trip volume has begun to level out in late 2016 and 2017. Taxis can coexist with TNCs in some markets, especially in densely populated cities where the value of a street hail is higher.

Put all of this together, and it appears the reports of taxi death have been greatly exaggerated. While some form of the Uber effect certainly exists, the decline in general medallion prices is not quite as precipitous as some have reported, and taxi ride volume is not disappearing overnight. Furthermore, from here on out, the price of these medallions likely will depend more on the success or failure of autonomous vehicles than on competition from ridesharing services.



The data we compiled for the piece can be found here.

Image by Cameris

Even without Durbin Amendment repeal, Congress should pass the CHOICE Act


The following post was co-authored by R Street Outreach Manager Clark Packard.

House Financial Services Committee Chairman Jeb Hensarling, R-Texas, has done the yeoman’s work of putting together a host of fundamental conservative reforms in the CHOICE Act. Although repeal of the Durbin amendment would have been a positive, pro-market reform, Congress should pass the bill even if this repeal is not included.

The most important provision of the bill gives banks the very sensible choice of maintaining substantial equity capital in exchange for a reduction in onerous and intrusive regulation. This puts before banks a reasonable and fundamental trade-off: more capital, less intrusive regulation. That alone is reason enough to support the CHOICE Act, and its numerous other reforms, including improved constitutional governance of administrative agencies, strengthen the case further.

Accountability of banks

The 10 percent tangible leverage capital ratio, conservatively calculated, as proposed in the CHOICE Act, is a fair and workable level.
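The trade-off turns on a simple, unweighted calculation. As a rough sketch, with invented balance-sheet figures rather than numbers from the bill or any real bank, the tangible leverage ratio is tangible equity divided by tangible assets:

```python
# Illustrative balance-sheet figures in $ millions -- invented for this
# sketch, not drawn from the CHOICE Act or any actual institution.
tangible_equity = 12_000   # common equity minus goodwill and other intangibles
tangible_assets = 100_000  # total assets minus goodwill and other intangibles

# Tangible leverage ratio: equity as a share of assets, with no risk
# weighting of the asset side.
leverage_ratio = tangible_equity / tangible_assets

# The bill's regulatory-relief threshold is 10 percent.
qualifies_for_relief = leverage_ratio >= 0.10
```

Because nothing is risk-weighted, the measure is straightforward to verify, which is part of what “conservatively calculated” implies.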

A key lesson of the housing bubble was that mortgage loans made with 0 percent skin in the game are much more likely to cause trouble. To be fully accountable for the credit risk of its loans, a bank can keep them on its own balance sheet. This is 100 percent skin in the game. The CHOICE Act rightly gives relief to banks holding mortgage loans in portfolio from regulations that try to address problems of a zero skin in the game model – problems irrelevant to the incentives of the portfolio lender.

Accountability of regulatory agencies

The CHOICE Act is Congress asserting itself to clarify that regulatory agencies are derivative bodies accountable to the legislative branch. They cannot be sovereign fiefdoms, not even the dictatorship of the Consumer Financial Protection Bureau. The most classic and still most important power of the legislature is the power of the purse. The CHOICE Act accordingly puts all the financial regulatory agencies under the democratic discipline of congressional appropriations. This notably would end the anti-constitutional direct draw on public funds that was granted to the CFPB precisely to evade the democratic power of the purse.

The CHOICE Act also requires of all financial regulatory agencies the core discipline of cost-benefit analysis. Overall, this represents very significant progress in the governance of the administrative state and brings it under better constitutional control.

Accountability of the Federal Reserve

The CHOICE Act includes the text of The Fed Oversight Reform and Modernization Act, which improves governance of the Federal Reserve by Congress. As a former president of the New York Federal Reserve Bank once testified to the House Committee on Banking and Currency: “Obviously, the Congress which set us up has the authority and should review our actions at any time they want to, and in any way they want to.” That is entirely correct. Under the CHOICE Act, such reviews would happen at least quarterly. These reviews should include having the Fed quantify and discuss the effects of its monetary policies on savings and savers.

Reform for community banks

A good summary of the results of the Dodd-Frank Act is supplied by the Independent Community Bankers of America’s “Community Bank Agenda for Economic Growth.” “Community banks,” it states, “need relief from suffocating regulatory mandates. The exponential growth of these mandates affects nearly every aspect of community banking. The very nature of the industry is shifting away from community investment and community building to paperwork, compliance and examination,” and “the new Congress has a unique opportunity to simplify, streamline and restructure.”

So it does. The House of Representatives should pass the CHOICE Act.

Image by Erce


How congressional power became separate, but unequal


Recent polling shows that Americans are increasingly turned off by the rancor and high-stakes nature of our recent presidential elections. But don’t expect contests for the presidency to calm down anytime soon. Today, the modern American presidency is more powerful than ever, making the importance of the office paramount to partisans on both sides of the political aisle.

It’s important to remember, however, that the presidency wasn’t always viewed this way. The system established by our Founding Fathers went to great lengths to separate the powers of government both vertically and horizontally. If anything, the founders actually were more concerned about power accreting in the legislature than in the executive.

As James Madison warned: “[i]n republican government, the legislative authority necessarily predominates,” rendering it necessary to take certain “precautions” to “guard against dangerous encroachments.” In contrast, he noted that the “weakness of the executive … might require it to be fortified” in order to resist legislative power grabs. The text of the Constitution reflected the primacy of Congress, too: Article I of the document, which lays out the legislative powers, is more than twice as long as Article II, which describes the executive’s role.

Over the past several decades, though, Congress has gradually lost its influential role, while the presidency has been ascendant. Today, the executive branch is a sprawling behemoth with more than 4 million employees, and presidents routinely advance policy goals by executive fiat rather than by working with Congress. Given Congress’ diminished state, it is important to consider how and why Congress has failed to maintain its role as the country’s “first branch.” A recent paper by Matthew Glassman of the Congressional Research Service lays out a primer on the history of the separation of powers, as well as providing clues about Congress’ dwindling status within that system.

As Glassman recounts, the notion that governmental power comprises distinct functions—lawmaking, administration and adjudication—can be traced back to the ancients, including greats like Aristotle, Polybius and Cicero. The theory was more fully developed in the 17th and 18th centuries by Locke and Montesquieu, who acted as intellectual guideposts to the American founders.

The key feature of the American tripartite system is that it placed the legislative, executive and judicial powers of government into distinct spheres, but also ensured that their powers overlapped in certain areas. For example, the president has veto power over congressionally passed legislation, while Congress has a say in executive branch appointments. In Glassman’s words, this setup produces conflict “by design,” allowing each branch to guard its power against encroachment from the other branches.

Glassman also identifies several institutional features that have allowed our system of separated powers to remain effective throughout most of our country’s history, such as distinct personnel, independent electoral bases and separate resources for each branch. But using a system of separated powers to guard against the accumulation of power is only effective if the branches are operating in relative equipoise.

Glassman’s paper is particularly insightful in analyzing why the power of different branches can ebb and flow over time. He highlights the perverse incentives individual actors within each branch face—incentives that can cause them to undermine their own branch’s long-term institutional power. These forces at least partly explain why Congress’ power has declined in recent times.

For one, Glassman notes that an individual actor within a branch may have personal policy positions that conflict with the long-term institutional interests of his or her branch. An example might be a member of Congress agreeing on policy grounds with a president’s decision to engage in a unilateral military strike, despite the fact that the president acted without consulting Congress.

Partisan affiliations also might cause individuals to take actions that undermine their branch’s institutional power. This phenomenon is commonly seen when members of Congress refuse to criticize a president of their own party publicly, even if they believe the president is acting beyond his power. The electoral goals and strategies of individual members of Congress can conflict with their own branch’s long-term interests.

Glassman recognizes that the problem of a branch’s institutional power conflicting with the personal goals of individual branch members is “particularly acute for Congress”:

As individual members of a large body, Representatives and Senators may not believe they have the responsibility or the capacity to defend the institution… Even when Congress does choose to institutionally defend itself, it often finds itself speaking with less than a unified voice, as only the most vital institutional powers have the ability to unanimously unify Congress.

These problems of collective action—the responsibility/capacity to defend the institution, the ability to speak with a unified voice, and the conflict with party or policy goals—rarely if ever occur in the executive branch. The unitary nature of the presidency ensures that the executive branch will ultimately always speak with one voice, and past presidents have often expressed—both in office and after retirement—a deep feeling of responsibility for the maintenance of the powers of the presidency.

These trends, of course, are not irreversible. Congress can fight back against executive branch encroachment, if it so chooses.

R Street’s Legislative Branch Capacity Working Group has identified numerous “Madisonian solutions” that would allow Congress to rebalance the separation of powers. Options include strategies to strengthen Congress itself—for example, by beefing up committee staffs and providing more funding for entities like the Congressional Research Service and the Government Accountability Office. Alternatively, Congress could seek to reduce the power of the presidency by clawing back power from federal agencies through comprehensive regulatory reform legislation.

In other words, Congress has the tools at its disposal to return our branches of government to a more equal footing. Members of Congress simply need to start prioritizing their branch’s long-term institutional interests over their personal preferences and predilections. Until that happens, we can expect the preeminence of the presidency—and the vitriol of presidential elections—to continue unabated.

Image by Pozdeyev Vitaly

Does Congress have the capacity it needs to conduct oversight?


Envisioned by the founders as the “first” branch of government, Congress has the responsibility of overseeing and managing the other two arms of our constitutional system. And yet, as the executive branch has grown in power and prestige, Congress has increasingly lost its authority.

What resources does Congress currently employ when overseeing federal agencies? Which current resources are well-used; which are under-utilized? What additional tools and resources does Congress need to engage in truly effective oversight? The Legislative Branch Capacity Working Group recently hosted a panel on these questions, moderated by R Street’s Kevin Kosar and featuring Morton Rosenberg of The Constitution Project and Justin Rood of the Project on Government Oversight. Video of the panel is embedded below:

To keep jobs in Missouri, special session should allow more options for renewable energy


As the legislature continues work during the special session, it needs to keep sight of the big picture. The case that motivated Gov. Eric Greitens to call the session—the loss of two plants in southeast Missouri due to high electricity costs—highlights the importance of cheap, reliable electricity to the economic health of the state. But if Missouri politicians are interested in sustainable growth in its energy sector, they need to go beyond legislating single cases and take a broader look at how the electrical system can become more attractive to employers and consumers alike.

Of course, there is an easy way to reform Missouri’s electrical regulations that will increase the state’s attractiveness to business while advancing the free market principles that the legislature—and voters—support.

As things stand, consumers are restricted to buying power from their utility company, local municipality or electric cooperative. This lack of choice can be burdensome, but it is a particular problem for businesses that have internal sustainability goals regarding energy use. Indeed, many large companies have set goals to receive a set percentage of their energy from renewable sources. Businesses adopt these goals to save on costs, satisfy consumer preferences and to underscore good corporate stewardship. In Missouri, however, many companies may not be able to meet their energy goals, because local utilities simply do not offer sufficient renewable electricity. For a business deciding whether to locate or expand facilities in the state, lack of options makes the choice clear.

During the recently ended regular session, the legislature considered the Missouri Energy Freedom Act, by Rep. Bill Kidd, which would have solved the scarcity problem by allowing companies to enter power-purchase agreements (PPAs) to buy renewable electricity from someone other than their official local provider. This legal structure has worked in other parts of the country and has the potential to attract thousands of jobs to the state, both from energy-conscious employers and from potential renewable generators. Companies save money on their energy bills, but also shoulder the risk of new clean-energy projects. That means one simple rule change can bring Missouri huge new investments, more profitable businesses, jobs in the community and clean energy to fuel the economy – all at no risk to the state.

Most important, however, this approach would bring more jobs to the state without increasing the role of government. Allowing PPAs involves no mandates, subsidies or government heavy-handedness. It simply provides companies with another option. The proposal also requires utilities to be reimbursed for any costs associated with allowing other power generators access to the grid, essentially leveling the playing field.

Seven of Missouri’s largest companies – General Mills, General Motors, Nestle, Procter & Gamble, Target, Unilever and Wal-Mart – are on record supporting this approach. Even the Department of Defense is supportive. But it’s not just the big guys who stand to benefit. Any firm that uses enough power could lock in low, long-term electricity prices through this new structure – a benefit small mom-and-pop firms will appreciate.

By allowing PPAs for renewable energy, Missouri can help keep tens of thousands of jobs in the state by opening up greater access to clean energy and increasing competition and free markets. Adding this element of competition should be part of the final legislative package for the special session.

Image by Gino Santa Maria

Nebraska should be on the cutting edge of spacesharing


Nebraska’s fame as a place for innovation and leadership is legendary. Even before Warren Buffett became the “Oracle of Omaha,” Nebraskans had invented CliffsNotes, the Reuben sandwich, Vise-Grip locking pliers, TV dinners, Kool-Aid and Arbor Day, just to name a few.

But that history makes it even more perplexing why this state, which so often has been on the entrepreneurial leading-edge, suddenly would turn against that heritage and ban useful modern innovations like Airbnb and other short-term-rental platforms that help travelers visit Nebraska’s great and historic cities. These services have helped fill the market gap for people coming to the College World Series and other exceptional crowd-drawing events, who sometimes have trouble finding a place to stay overnight within commuting distance.

The Unicameral, itself a unique feature of Nebraska innovation, has been considering legislation that would prevent local governments from outlawing short-term rentals by those who wish to make a little extra cash and perhaps meet some nice folks from out-of-state. Alas, it is having trouble finding a spot on the agenda.

Like similar bills around the country, the Nebraska bill would continue to allow local governments to prohibit sexually oriented businesses, unlicensed liquor sales, sales of street drugs and anything else in a short-term rental that would constitute a genuine public-safety hazard. Municipalities also could still regulate for noise, animal control, fire and building codes, nuisances and the like.

The crux of the resistance to statewide regulation seems to be that hotels, motels, resorts, inns, licensed bed and breakfasts and other clearly commercial operations are just flat-out opposed to what they view as additional competition unburdened by many of the fees and requirements of commercial hospitality. One of the compromises suggested is that the legislation be amended to require short-term-rental customers to pay the applicable hotel tax when they book, which companies like Airbnb already collect in many other communities.

Indeed, there are lessons for Nebraska in how these conflicts already have been resolved elsewhere around the country. New York City recently settled a lawsuit that challenged its statute setting fines of up to $7,500 for hosts who illegally list a property on one of the short-term-rental platforms, with the platforms concerned that the vague language could leave them on the hook.

A more recent settlement with the City of San Francisco could set a pattern for future legislation, in that two of the major short-term-rental platforms agreed to a registration process with the city, allowing hosts to know the requirements and giving them confidence that they are operating legally. Processes in Denver and New Orleans similarly work to pass host registrations through to the local governments.

There can be reasonable regulations to protect neighborhoods and public safety that stop well short of prohibition. Lawmakers and regulators should craft targeted rules that allow opportunities for people with a room to spare to match with tourists who can take advantage of an overnight stay. Nebraska—a reservoir of both good sense and an innovation ethic—has the chance to be a great model for other states with a well-crafted new law.

Image by paulrommer


WannaCry underscores a need for cyber hygiene and insurance

“Oops, your important files are encrypted” read the pop-up message on hundreds of thousands of Windows operating systems across the world. The ransomware cyberattack, infamously labeled “WannaCry,” paralyzed computers by encrypting their data and holding it ransom pending payments from the afflicted.

In the days following, headlines bemoaned the arrival of the long-feared “ransomware meltdown,” while critics jumped to blame Microsoft for product insecurities and condemned the National Security Agency for stockpiling vulnerabilities. While it’s easy to assign blame and stoke fear, policymakers should, instead, use the attack as an opportunity to encourage better cybersecurity behavior and sensible risk management practices – including cyber insurance.

Cyber insurance was first touted during the dot-com boom of the early 2000s, but has only recently grown in popularity. Like other types of insurance, cyber insurance offers financial protection from sudden and unexpected losses.

For instance, in addition to coverage for WannaCry-like ransom attacks, many policies now encompass a wide range of costs businesses may face in connection with a breach, including regulatory fines, legal costs, public relations services and costs associated with internet downtime. Because cyberattacks can result in all sorts of unexpectedly large expenses, coverage designed to insulate a business from the financial shock of a cyberattack is vital.

In the case of WannaCry, the total illicit haul of the ransom is projected to be less than $100,000. Yet downstream damages are expected to tally in the billions. In fact, one firm projects that up to $8 billion in global computer-downtime costs may accrue across sectors ranging from hospitals and government agencies to car companies.

The consequences of that damage may, for some, be ruinous. According to Symantec, ransomware attacks increased 36 percent from 2015 to 2016, while the average ransom increased 266 percent over that period, to $1,077.

With the number of attacks on the rise, it is important to note that cyber insurance can both facilitate resilience and assist in maintaining system security. That’s because the underwriting process, during which the insurer assesses the risk it considers taking on, often requires a cyber risk assessment. Once a policy is written, specific policy terms often require adherence to basic security practices such as patching or regular network assessments. Companies that do not meet a threshold of cyber preparedness may not be eligible for coverage, may face higher premiums and could risk losing their coverage entirely. Put another way, cyber insurance coverage contributes to a culture of preparedness.

Cyber insurance take-up rates are growing, but the market is still evolving and penetration is uneven. According to a recent survey by Aon, only 33 percent of companies worldwide had cyber insurance coverage. Companies outside the United States are at a particular disadvantage when it comes to recovery, as they hold less than 10 percent of all cyber insurance policies.

This is particularly worrying because WannaCry revealed a geographic gap in cyber preparedness. Russia and China saw the largest incidence of infected computers, suggesting that lax patching practices and overreliance on pirated or outdated systems are more common abroad. Those companies without coverage today face the full brunt of the costs associated with the WannaCry attack.

Though the domestic cyber insurance picture is better, more should be done to encourage coverage. For instance, while the White House’s recent cybersecurity executive order reiterated that cybersecurity is a priority area for the Trump administration, it was silent on the role cyber insurance can play in incentivizing agencies and their contractors to internalize cyber preparedness. This is a missed opportunity. The government can use the power of the purse to promote cyber insurance adoption in the market as a whole by requiring federal contractors to acquire certain types of cyber risk coverage.

High-profile cyberattacks like WannaCry highlight the need for cyber preparedness and cyber insurance. A policy approach that emphasizes both—and cyber insurance in particular as a market solution to the global ransomware problem—will be a boon for companies and consumers alike.

UPDATE (May 30, 2017): This piece originally cited a statistic attributed to the National Cyber Security Alliance that the alliance says is outdated.  

Another bright idea from Mitch Daniels


Purdue University President Mitch Daniels was in Washington last week to receive the Order of the Rising Sun, Gold and Silver Star from the government of Japan at an embassy ceremony. The award is one of Japan’s highest, and was given for “significant contributions to the strengthening of economic relations and mutual understanding” between the two countries.

During his time as governor of Indiana, Daniels saw 70 new direct investments in the state from Japan, including a Honda assembly plant that was the biggest “greenfields” investment in the United States in 2006. Over the following six years, Japan brought more than $2.6 billion of new investment and 8,400 jobs to the Hoosier State, as the governor led five economic missions to the country.

Since Daniels came to Purdue in January 2013, nearly $5 million of Japanese corporate research has come to campus. Largely because of the groundwork he laid, Indiana ranks second this year among the 50 states for best economic outlook, as measured through 15 important state fiscal policy variables laid out in the 10th annual edition of the American Legislative Exchange Council’s “Rich States, Poor States” study.

He’s also accomplished a number of significant milestones at Purdue, including a six-year tuition freeze. There may not be another university in the country that plans to charge students less tuition in 2019 than was paid in 2012. The student loan default rate for Purdue graduates hovers around 1 percent. The Milken Institute ranked Purdue No. 1 for technology transfer among public universities without a medical school.

Now, the university is going to expand its offerings to millions of people online. Instead of committing to a multiyear project to build a significant online learning university, Purdue announced April 27 that it is creating a new public university (temporarily named “New U”) by acquiring most of the assets of Kaplan University, a competency-based online-learning business with 15 campuses in the United States, 32,000 nontraditional students and nearly 80 years of remote-learning experience.

Kaplan offered the nation’s first totally online law school and has created study courses to review vast amounts of material for various accreditation and professional certification exams. It is a global provider of education programs in more than 30 countries and has forged partnerships with many colleges, universities, school districts and more than 2,600 corporations. The educational networking possibilities are nearly limitless. A whole new chapter of efforts to produce more affordable post-secondary opportunities, particularly for working adults, is likely to be launched by this marriage of a top public research institution and an online juggernaut in competency-based education.

According to reports, the Purdue faculty is not yet prepared to give its blessing to New U, a resistance that is an endemic feature of both disruptive initiatives and university faculties generally. Quick to embrace every progressive policy fad, those in higher education with sinecures anchored in the traditional business model are less likely to give Purdue’s management an immediate pass. But the new model deserves more consideration, as workplace needs drive the absorption of sophisticated technical knowledge and skills and as it leans toward affordable learning for the benefit of students and the good of the country.


Sessions’ charging memo underscores need for Congress to pass reform


Attorney General Jeff Sessions’ memorandum instructing federal prosecutors to “charge and pursue the most serious, readily provable offense” against defendants signals a desire to return to a tough-on-crime stance. From the perspective of criminal justice reform, the most daunting aspect of these developments is a likely resurgent dependence on mandatory minimums.

As has been noted by John Malcolm, a criminal justice expert and director of the Heritage Foundation’s Edwin Meese III Center for Legal and Judicial Studies, reinstatement of stricter charging and sentencing policies is fully within the attorney general’s authority. We’ve seen it before from Attorney General Richard Thornburgh, who issued his own guidelines in 1989 requiring strict enforcement of all provable offenses. In the years since, there’s been back-and-forth directives from Thornburgh’s successors Janet Reno, John Ashcroft, Eric Holder and, now, Sessions.

But in the intervening years, experts have gathered evidence against mandatory minimums, finding that heavy use of these sentencing laws has failed to reduce drug use or recidivism. Mandatory minimum sentences are fixed prison terms applied to specific crimes, which can range from five years to life imprisonment. They strip judges of the ability to use their own professional discretion to determine sentencing based on the facts at hand.

The aim of mandatory minimums during the height of the 1980s crack epidemic was, of course, to target drug kingpins and cartel leaders in order to improve public safety. Prison populations surged, but primarily due to an influx of low-level offenders. With prisons now bursting at the seams and calls mounting to build new prisons to house an ever-growing population of prisoners, taxpayers have had to shoulder the costs.

The most notable portions of Sessions’ memo are where he instructs that “the most serious offenses are those that carry the most substantial guidelines sentence, including mandatory minimum sentences,” which marks a deviation from the “smart-on-crime” approach under Holder. Sessions’ memo would ensure the U.S. Justice Department “fully utilizes the tools Congress has given” the agency.

But to the extent that it is Congress that provides the DOJ tools to enforce federal laws, Congress itself needs to reassess those tools. Sen. Rand Paul, R-Ky., has adopted exactly that strategy. Alongside Sen. Patrick Leahy, D-Vt., Paul has introduced the Justice Safety Valve Act in the Senate, while Reps. Thomas Massie, R-Ky., and Bobby Scott, D-Va., have done the same in the House. The legislation authorizes federal judges to impose more fitting sentences outside of a mandatory-minimum requirement.

During a press call Wednesday, Paul noted that momentum for change is likely to build if more members introduce more criminal justice reform bills. While acknowledging that reform advocates face an “uphill battle,” he also indicated that he is “having conversations with people” within the Trump administration willing to listen.

The call to enforce harsher charging and sentencing methods is a serious concern, especially the goal to revive the one-size-fits-all use of mandatory minimum sentences. However, seeking ways for Congress to set the tone and dictate what tools are available for judges and parties at the DOJ is currently the most effective way to remedy this recent course of events. It’s checks and balances at its finest.

Image by Brad McPherson

OPEN Government Data Act moves to Senate floor after markup


Legislation requiring federal agencies to publish their data online in a searchable, nonproprietary, machine-readable format has been cleared for the Senate floor following a May 17 markup by the Senate Homeland Security and Governmental Affairs Committee.

Sponsored by Sen. Brian Schatz, D-Hawaii, S. 760, the Open Public Electronic and Necessary Government Data Act is identical to an earlier Schatz bill that passed the Senate unanimously last year after analysis by the Congressional Budget Office determined it wouldn’t cost taxpayers any money.

What it would do is modernize government agencies and increase their effectiveness, while also allowing taxpayers to see how their money is spent. For these reasons, R Street joined more than 80 organizations—including trade groups, businesses and other civil-society organizations—in urging the Senate committee to pass these badly needed reforms.

The status quo makes it difficult for engaged citizens to view the spending data of the agencies they fund. A taxpayer interested in viewing the companies and organizations that receive federal grants and contract awards would need to have a license for the proprietary Data Universal Numbering System (DUNS). Dun & Bradstreet Inc., the company that owns DUNS, functions as a monopoly with respect to government contractor data.

In a 2012 report, the GAO claimed the costs of moving away from DUNS to a different system would be too great, but that was at a time when there were fewer alternatives. More recently, a study by 18F, a technology team within the General Services Administration, showed that government agencies across the world are beginning to use a 20-digit code called the Legal Entity Identifier (LEI). The LEI is free for organizations and companies to use, as it is managed by the Global LEI Foundation, a nonprofit organization based in Switzerland. It would require no expensive upgrades.

Both the current and previous administrations have publicly supported transparency reforms for federal agencies. President Barack Obama introduced an Open Data Policy in 2013, and Matt Lira, a special assistant to President Donald Trump for innovation policy and initiatives, told an audience in April that financial transparency is still a priority for the White House.

Vested interests likely will still oppose the bill, which also has companion legislation, H.R. 1770, in the U.S. House. But given that it has support from both parties—an incredibly rare thing these days—as well as from the present and prior administrations, transparency advocates have room for optimism. The case for nonproprietary data standards and government transparency will now be in the hands of Congress.

Image by zimmytws

Big names weigh in on FCC’s net-neutrality rules


We seldom see a cadre of deceased Founding Fathers petition the Federal Communications Commission, but this past week was an exception. All the big hitters—from George Washington to Benjamin Franklin—filed comments in favor of a free internet. Abraham Lincoln also weighed in from beyond the grave, reprising his threat “to attack with the North” if the commission doesn’t free the internet.

These dead Sons of Liberty likely are pleased that the FCC’s proposed rules take steps to protect innovation and free the internet from excessive regulation. But it shouldn’t surprise us that politicians have strong opinions. What about some figures with a broader perspective?

Jesus weighed in with forceful, if sometimes incomprehensible, views that take both sides on the commission’s Notice of Proposed Rulemaking, which seeks comment on scaling back the FCC’s 2015 decision to subject internet service to the heavy hand of Title II of the Communications Act of 1934. Satan, on the other hand, was characteristically harsher, entreating the commissioners to “rot in Florida.”

Our magical friends across the pond also chimed with some thoughts. Harry Potter, no doubt frustrated with the slow Wi-Fi at Hogwarts, seems strongly in favor of keeping Title II. His compatriot Hermione Granger, however, is more supportive of the current FCC’s efforts to move away from laws designed to regulate a now defunct telephone monopoly, perhaps because she realizes the 2015 rules won’t do much to improve internet service. Dumbledore used his comments to give a favorable evaluation of both Title II and the casting of Jude Law to portray his younger self in an upcoming film.

A few superheroes also deigned to join the discourse. Wonder Woman, Batman and Superman joined a coalition letter that made up in brevity for what it lacked in substance. The same can’t be said for the FCC’s notice itself, which contains dozens of pages of analysis and seeks comments on many substantive suggestions designed to reduce regulatory burdens on infrastructure investment and the next generation of real-time, internet-based services. Another, more diverse, coalition letter was joined by Morgan Freeman, Pepe the Frog, a “Mr. Dank Memes” and the Marvel villain (and Norse trickster god) Loki. It contained a transcript of Jerry Seinfeld’s Bee Movie.

Speaking of villains, Josef Stalin made known his preference that no rules be changed. But Adolf Hitler attacked Stalin’s position like it was 1941.

Then there are those with advanced degrees. Doctor Bigfoot and Doctor Who filed separate comments in support of net neutrality.

In a debate too often characterized by shrill and misleading rhetoric, it’s heartening to see the FCC’s comment process is engaging such lofty figures to substantively inform the policymaking process. I mean, it sure would be a shame if taxpayer money supporting the mandatory review of the 1,500,000+ comments in this proceeding was wasted on fake responses.

Image by Bonezboyz

Coppage talks urbanism on the Matt Lewis Show

R Street Visiting Senior Fellow Jonathan Coppage was a recent guest on the Matt Lewis Show, where he made the case for the Federal Housing Administration to re-legalize Main Street. Full audio is embedded below.

How Scott Gottlieb’s ‘boring’ approach could transform the FDA


Dr. Scott Gottlieb, confirmed earlier this week by the U.S. Senate to become the new commissioner of the Food and Drug Administration, has a pragmatic—some might even say boring—approach to public health that could revolutionize how FDA regulations can fight the consequences of addiction.

With his vision of the future of tobacco, Gottlieb takes all the fun out of the heated arguments that anti-tobacco and pro-vaping individuals engage in on a regular basis – offering a reasonable solution to the disease burden of cigarettes. In a 2013 Forbes essay, he stated:

Whatever one thinks of cigarette makers, if the industry was earnest about transitioning away from the manufacture of smoked cigarettes, and getting into the development of new products that would still satisfy peoples’ taste for nicotine (with hopefully much lower risks) there could be public health virtue. The overall incidence of smoking related disease could be sharply diminished.

He acknowledges the enormous power the FDA has in the future of public health, particularly as it relates to tobacco consumption. He even has the guts to imply that “big tobacco” could actually be an ally in solving a problem many think they created, by encouraging cigarette manufacturers to focus on safer products and the e-cigarette market.

He recognizes that e-cigarettes present a viable alternative to other smoking-cessation products and that they have the potential to contribute to a future without combustible cigarettes. During his confirmation hearings, Gottlieb stated that reduced-harm products should be available to help consumers transition off combustible cigarettes, and he has taken note of the burdensome regulations that currently scheduled FDA vaping rules will impose on small businesses that want to enter the e-cigarette market.

These comments suggest that he would be open to regulations that make it easier for safer products to enter the market, rather than the currently planned deeming regulations, which would require nearly all existing e-cigarette products to go through a pre-market tobacco application (PMTA) process that would cost approximately $300,000 for each combination of flavor, strength, mixture and device. In a harm-reduction model, this is important, because increased competition from small businesses in the e-cigarette market will increase innovation and production of even safer products, while decreasing the price point of products that are at least 95 percent safer than combustible cigarettes.

Furthermore, this harm-reduction approach also could be applied to the opioid epidemic, which Gottlieb has stated is the FDA’s top priority. Medication-assisted treatments—such as methadone and Suboxone—help nearly 40 percent of people with opioid-use disorders to abstain from heroin and other commonly abused opioids. Opioid antagonists—such as Narcan and Vivitrol—can be used to reverse overdoses and cut cravings. Pharmaceutical companies, both big and small, have an opportunity to improve upon medications that can be used to treat opioid addiction and its consequences. Gottlieb’s willingness to embrace a harm-reduction philosophy and his recognition that it is important to have a practical approach to expensive and time-consuming FDA regulations could further encourage small pharmaceutical companies to enter the pipeline of life-saving opioid addiction treatments.

During the confirmation process, Gottlieb received criticism for his ties to the pharmaceutical industry. But frankly, his recognition that the tobacco and pharmaceutical industries can help solve an addiction crisis that kills nearly half a million people a year is to be applauded. That level-headed vision is exactly what the FDA needs to reduce the economic and health burden of addiction in the United States.

In reshaping U.S. energy policy, Perry’s best model is his home state


The Federal Energy Regulatory Commission (FERC) held a high-stakes conference May 1 and 2 to address the contentious interplay of state policies and interstate wholesale-electricity markets. The week prior, Energy Secretary Rick Perry announced his department might intervene in state energy planning to protect baseload coal and nuclear generation.

Ironically, market experts speaking before FERC identified state interventions to bail out coal and nuclear as the most damaging form of intervention. Market experts noted that state subsidies and mandates for renewables also continue to stress market performance, but do not displace system “capacity” needs the same way that baseload subsidies do.

This marks a fundamental debate over the role of government in competitive electricity markets, one that happens to intersect with federalist themes in various ways. In the case of the Northeast and some Mid-Atlantic states, FERC seeks to uphold the competitive functionality of electricity markets, while some states have undertaken anti-competitive interventions to dictate outcomes that are better determined by markets. On the flip side, the prospect of federal intervention in state energy planning runs completely counter to conservative arguments against the Clean Power Plan, even when it is (incorrectly) made in the name of national security.

This paints a convoluted picture for federalists, but the pro-market case is clear – interventions at both the state and federal level are unwarranted and destroy wealth.

At the FERC conference, state representatives reiterated their support for relying on markets because of the clear economic benefits (i.e., little to no sign of “re-regulation” interest). Yet they also wanted to preserve the option to pick government-preferred investments, which runs counter to the very premise of liberalized electricity markets. Constructively, states expressed a willingness to engage in a dialogue aimed at identifying policy principles, but they struggled to articulate what those principles were.

Much of the challenge that state representatives—often, commissioners of public utility commissions—had in articulating an energy vision consistent with market principles is that politicians back in their states often support industrial policy (i.e., government explicitly picking winners), creating an agenda that is difficult to reconcile with FERC’s obligation to uphold competition. The most common policy theme was reducing emissions, yet the states largely rebuffed market-compatible approaches to doing so – namely, emissions pricing. A couple of states brought up the need for state actions to improve reliability, claiming (contrary to the evidence) that markets aren’t able to provide reliable service. They rehashed generic slogans, like the need for fuel diversity, which has no direct bearing on whether an electricity system is reliable.

Competitive electricity markets are complex and poorly understood by state and federal policymakers. Given the rapid transition of electricity technologies and fuels, coupled with the persistent political obsession to dictate what this mix “ought” to be, the scene is set for half-truths and false narratives to prevail. Whether it’s progressives pushing for more renewables, confused conservatives supporting interventions to preserve baseload or any other such combination, all these narratives fundamentally miss the point that the goal of smart policy is to encourage well-functioning markets.

Fortunately, the narrative that we should level the playing field and let technologies compete on their merits still holds some political weight. Some FERC reforms could move in that direction, such as enabling participation of energy storage and pricing of fast-start resources. However, FERC reforms to appease the industrial-policy ambitions of some states (or the U.S. Department of Energy) would fundamentally deviate from the core objectives of competitive electricity markets and could easily result in extensive unintended consequences. It’s not FERC’s job to validate state policy, but it must pass judgment on anti-competitive conduct. Conference participants offered ideas on how to define such conduct, and FERC would be wise to continue that dialogue.

There continues to be an immense need to educate state and federal policymakers on how electricity markets function and on the consequences of industrial policy, especially ad hoc subsidies. While the uptick in state interventions stirred controversy, it also has spurred productive dialogue in the Northeast and between the states and FERC. The conference demonstrated a clear need and willingness among states, stakeholders and current FERC commissioners to continue and deepen such a dialogue.

The desire of the Northeastern and some Mid-Atlantic states to reduce emissions is laudable. Pollution is a valid market failure that can be corrected efficiently by market-based policies. Such policies have excelled in competitive markets, where strong cost-reduction incentives have driven emissions reductions and innovations that lower the cost of emissions abatement. That’s an example of where state and federal interests align, as well-functioning markets that internalize all costs create the most benefit for society. But many policymakers do not understand these benefits clearly, and FERC and the states should engage market experts in forums that help foster and disseminate this research. One example to highlight is Texas, which has seen reductions in costs and emissions without the controversy and distortions of industrial policy.

Encouragingly, Perry noted that the president asked him to reshape U.S. energy policy in the mold of Texas, where he spent 14 years as governor. Texas relies on competitive markets to signal power-plant investments and price-responsive demand. These markets do not explicitly value baseload, nor should they. Rather, they value reliable operations by providing revenues to resources that perform, especially when supply scarcity drives price spikes.

The Texas markets handsomely rewarded baseload coal and nuclear in the past, when they were highly competitive during periods of higher-priced natural gas. Now, inexpensive gas and cost declines for gas generation and renewables (the latter partially resulting from subsidies) have heavily cut into baseload margins, even driving some plants into retirement. Yet, according to the independent monitor of the Texas market, the reliability outlook remains strong as more gas and wind generation come online (and, in the long term, solar). Meanwhile, consumer costs have fallen sharply. The monitor emphasizes that the new resource mix underscores the need for efficient price formation. This is the product of quality market design (e.g., “scarcity pricing” to address the market failure surrounding resource adequacy) and market discipline, as interventions can dramatically distort investment decisions and freeze capital markets.

Perry would serve America well by encouraging the Texas model. Over the past few years, Texas has bolstered its scarcity pricing, while Texas legislators and regulators have let the market work. The Northeast and Mid-Atlantic have not done so, prompting FERC’s conference to address the uptick in disruptive state interventions. Once FERC re-establishes a quorum, it will face the tasks of improving price formation and moderating the effects of state interventions. Competitive markets will drive cost reductions, innovation and emissions reductions, but only if state and federal policymakers keep interventions at bay. As Marlo Lewis Jr. of the Competitive Enterprise Institute recently remarked, “subsidizing uneconomic energy to the detriment of consumers and taxpayers is no way to drain the swamp.”

Image by Crush Rush

Ridesharing a victim of Alaska’s budget battle


As Alaska’s legislative session ends, some perplexing in-house political gamesmanship has kept a popular bipartisan measure from making it to the House floor. It’s a shame, because freedom-loving Alaska now remains one of just a handful of states that still doesn’t allow ridesharing services such as Uber and Lyft to operate within its boundaries.

Senate Bill 14 has passed the full Senate and needs only a vote of the House to move it along to the governor’s office. But according to news reports, the House has bottled up the legislation in the Rules Committee, which rarely meets. The measure reportedly is being used as leverage by legislators fighting a contentious budget battle.

The Alaska Journal of Commerce reports that, instead of letting S.B. 14 get to the House floor and then presumably to the governor for his signature, “the House is effectively starting the legislative process anew by advancing its own version of Uber legislation.”

That’s bad for Alaskans, given the obvious benefits of allowing these services to operate. The bill wisely clarifies that these drivers are independent contractors, thus restricting various efforts to mandate the payment of myriad employee benefits and thereby keeping this a cost-effective option. It also would prevent Alaska cities from imposing their own onerous local restrictions on these services. The bill requires background checks for transportation network company (TNC) drivers, but overall it’s still a good step forward that’s backed by ridesharing companies.

Although Alaska political observers expect that a ridesharing bill eventually will get to the floor this year or next, it’s a mistake to delay the ability of these companies to offer not only jobs, but rides to people who need a convenient – or safe – way to get home.

The Economist recently reported on a new study suggesting that “the arrival of Uber to New York City may have helped reduce alcohol-related traffic accidents by 25-35 percent.” According to the U.S. Centers for Disease Control, Alaska has one of the nation’s highest rates of excessive drinking. Uber and Lyft aren’t a panacea for such a significant social problem, but they could make the streets of Anchorage, Juneau and Fairbanks a bit safer.

With a week to go before the session’s end and a budget crisis looming, legislators might have other things on their mind. But the budget problem will eventually get fixed. Alaska residents who want to use ridesharing services shouldn’t be held hostage to that process.

Image by Joseph Sohm

California looks to finally end the Cold War


The Cold War ended decades ago, but vestiges of the conflict still surround us. In the California Legislature, Assemblyman Rob Bonta, D-Oakland, has introduced Assembly Bill 22, which seeks to bring one chapter of that history to a close.

A.B. 22 would repeal a nearly 80-year-old prohibition that bars members of the Communist Party, or individuals who otherwise advocate communist ideals, from employment by the State of California. In its place, the bill would impose an ideologically neutral prohibition on employing anyone who actively seeks the forceful or violent overthrow of the government of the United States.

There can be no doubt that communism was a blight on the 20th century. In its name and under its red banners, hundreds of millions of people were killed. And it is well-known that the chief geopolitical rival of the United States through the second half of that century—the Soviet Union—was a power animated by communist ideology.

It is therefore no wonder that, in an effort to ensure the state’s government institutions were not subverted by those who would like to see the Soviet Union best the United States, legislators placed in statute a prohibition on employment for anyone with ongoing ties to, or outspoken sympathy for, the Communist Party. The party, like the Soviet Union, was understandably viewed with extreme prejudice by lawmakers who felt threatened by those who sought to topple market-oriented liberal democratic institutions. In fact, the text of the existing California law goes into great detail about the consequences of communism and spells out unambiguously the threat posed by a Communist fifth column.

But while it remains helpful to examine the history of communism to better understand dictatorial barbarism and anti-democratic preferences, the time has come to correct the mistakes that legislators of decades past made when they needlessly trampled their own values by targeting people’s beliefs, rather than their actions. A.B. 22 does that.

To be clear, there aren’t legions of communists waiting to enmesh themselves in California’s bureaucracy, so it’s a bit strange that a lawmaker would feel so strongly as to want to carry the legislation. But Republican opposition to Bonta’s bill is no more explicable.

Bonta’s bill doesn’t diminish our recognition of the repugnant nature of communist ideology. That ideology was, and remains, an affront to individual liberty and dignity. But it is the liberal aspirations of the United States—which preclude discriminating on the basis of one’s political beliefs—that set the country apart from the Soviet Union in the first place.

Image by StrelaStudio

R Street hosts Justice for Work coalition panel

The R Street Institute hosted an April 17 launch party for the Justice for Work Coalition. Justice for Work is a coalition of organizations spanning the ideological spectrum that seeks to raise awareness and advocate for lowering the barriers created by laws and regulations that unnecessarily restrict economic participation.

The event included a panel discussion featuring former law enforcement officers, an ex-offender, and policy and legal experts.

The panelists were:
Arthur Rizer, R Street Institute
Ed Chung, Center for American Progress (moderator)
Teresa Hodge, Mission: Launch
Marcus Bullock, Flikshop
Alvaro Bedoya, Georgetown University Law Center

Full audio of the panel is embedded below.

CRS should stop fighting access to its own reports


The Congressional Research Service plays an essential role in policymaking and oversight. It makes Congress smarter about issues and teaches new legislators how to legislate. I would not have spent 11 years working at CRS if I did not think very highly of the institution.

But there is one topic on which the widely esteemed and nonpartisan agency has been embarrassingly biased: the proposals to make its reports more equitably available to the public.

As a practical matter, CRS reports are widely available – some 27,000 copies can be found on government and private-sector websites, and one private aggregator alone hosts more than 8,000 reports. But official congressional policy does not provide for consistent public release of the reports, which explain the workings of Congress, agencies and myriad public policies.

Legislation has been introduced in this Congress and the last – and a number of times previously – to fix this situation. Reps. Mike Quigley, D-Ill., and Leonard Lance, R-N.J., would have the Government Publishing Office post the reports online. This solution would give citizens a central repository where they could read authenticated copies of the reports, and would relieve CRS and congressional staff of the hassle of responding to reporters, lobbyists and constituents who ask for copies.

Invariably, CRS proclaims that it takes no position on the issue and will do whatever Congress directs. But how are we to square that claim with the 2015 memorandum that CRS’ leadership shopped to legislators? The memorandum is modestly titled: “Considerations arising from the dissemination of CRS products.” The content, however, is nothing but scare-mongering speculation about bad things that might happen if more Americans had access to CRS reports. Proponents of expanded access to CRS reports quickly demolished the claims made in CRS’ “considerations” memo.

As someone who once reviewed CRS reports before they were published, I can tell you that, had a CRS analyst written this memo, it never would have seen the light of day. And said analyst would have been rebuked by his or her supervisor. The memorandum not only misconstrues what is being proposed — nobody is advocating that CRS itself distribute the reports—but it also makes no mention of the many possible benefits of a change in policy (like increased public understanding of how Congress and government operates).

That means the memo violates CRS’ own very clear policies that its work for Congress must be accurate and unbiased, and must consider the possible benefits and costs of any proposed policy. (This internal CRS rule not only is intellectually honest, it also, ahem, protects the agency from having its work give the appearance of bias.)

One hopes that someone in Congress will call CRS leadership on the carpet for this tartuffery and demand that the agency disavow the memorandum. In a time when federal budget cuts are being seriously discussed, the agency does itself, its employees and Congress no favors by being the lone voice advocating against common-sense reform.

Image by Micolas

Fierce debates dominate D.C.’s first E-Cigarette Summit


If you imagined an e-cigarette conference full of policymakers at a Marriott in Washington would be a tame event, you would be wrong. I suppose I shouldn’t be surprised that e-cigarettes could be a polarizing topic, but I will not soon forget the cheers and boos in the crowd as people stood up to state their opinions and present their research at the first E-Cigarette Summit here in Washington.

A running theme of this conference came down to the existential question: are you a skeptic or are you an enthusiast? Are e-cigarettes addictive products designed to hook teenagers or should they be marketed to current smokers as a quitting tool?

It’s important to understand that e-cigarettes are much safer than combustible cigarettes. Every panelist—including professors, physicians, economists and industry folks—agreed with reports that e-cigarettes are at least 95 percent safer than traditional cigarettes. What is not so easily understood is how best to use e-cigarettes to promote a healthier society.

We’ve seen debates like this before. Will needle-exchange sites keep injection users free from infectious disease or will they tacitly encourage people to try heroin? Does condom distribution in high schools prevent teen pregnancy or lead to a breakdown of morals? There are valid points in support of either argument, but whichever way we as a society land will have long-lasting effects.

The truth is there are a lot of specific questions that need to be answered before people will feel comfortable with novel devices. When it comes to e-cigarettes, there needs to be a balance between consumer protection, trust and the application of science, so that sound policy can best direct public health goals. Some of the discussions at this forum centered on questions for which we don’t yet have definitive answers:

  • Does a standalone nicotine product at the concentrations found in e-cigarettes (with the absence of other chemicals that are present in tobacco) produce changes in the brain consistent with addiction?
  • What environmental or product factors are predictors of successful transition from combustibles to e-cigarettes?

As we move forward in our research and advocacy endeavors, the answers to these questions will help shape both tobacco and e-cigarette policy and will form a foundation for U.S. harm-reduction policy.

Some of the more contentious issues created even more forceful debates. While e-cigarettes are effective smoking-cessation tools, physicians are reluctant to recommend them over medications, gum or the patch. Although teen smoking rates are at historic lows, the rise in experimentation with e-cigarettes is concerning (it is noteworthy that daily use of e-cigarettes among teens is 2 percent). While it is unethical to perpetuate the myth that e-cigarettes are nearly as harmful as traditional cigarettes, some suggest there might be an ethical dilemma in marketing e-cigarettes to recreational users.

It is fair to say that more information is better to avoid hooking a new generation on cigarettes, but it is more important to use the tools we have now to encourage smokers to switch to safer products. We cannot forget that, just today, more than 1,300 people will die from smoking in the United States alone. Getting people to stop smoking combustible cigarettes should be our No. 1 priority and there is now a product to make that happen.

Image by LezinAV

Fixing California’s bloated sex-offender registry


R Street just signed a letter calling for commonsense reform of the California sex-offender registry, based on a bill proposed by our friend and Legislative Advisory Board member Sen. Joel Anderson, R-Alpine.

The bill we’re supporting in California, backed strongly by our own research, creates a tiered system for adult sex offenders. This is a step in the right direction toward reforming California’s overgrown sex-offender registry. A registry that includes too many people is likely even worse than one that includes too few: it diverts resources toward monitoring low-risk people that should be devoted to monitoring the relative handful of truly dangerous offenders. The best available research on sex-offender registries, which I summarized in this article for National Affairs, indicates that risk-based approaches like the one contemplated in the bill are good public policy.

While taking this first step is important, it doesn’t solve what is likely the single biggest problem with sex-offender registries: their inclusion of offenders who were adjudicated as juveniles. As I’ve written about here with my friends Nicole Pittman and Stacie Rumenap, it’s unjust, cruel and undermines the purpose of the juvenile justice system—which, at least in theory, is supposed to act in offenders’ own best interests. Youth registration, as R Street research has shown, costs millions of dollars more than it could possibly save. It’s the single greatest inefficiency in our sex-offender registration system.

The California bill is a good start, but it’s only a start. If the Golden State really wants to fix its registry, it’s going to have to end the registration of children.

Image by Jeffrey B. Banke

Congressmen reintroduce bill to make CRS reports public


The Government Publishing Office would be required to make Congressional Research Service reports publicly accessible over the internet, under legislation reintroduced last week by Reps. Leonard Lance, R-N.J., and Mike Quigley, D-Ill.

The CRS, a division of the Library of Congress, is known as Congress’ in-house “think tank.” House offices and committees historically have been free to publish CRS reports on their own websites for constituents to view, and some third-party websites aggregate CRS reports as well.

But while taxpayers spend more than $100 million annually to fund CRS, timely access to these important documents is usually reserved for Washington insiders. There exists no official, aggregated source through which taxpayers can access the CRS’ valuable and informative work.

R Street Vice President for Policy Kevin Kosar, himself a veteran CRS analyst, testified recently before the House Legislative Branch Appropriations Subcommittee, where he presented the panel with a letter signed by 25 former CRS employees with more than 570 combined years of service who all support an open public database of nonconfidential CRS reports.

There is strong precedent for public access to legislative support agency documents. In his subcommittee testimony, Kevin noted the Government Accountability Office, Congressional Budget Office and the Law Library of Congress all make their reports public, as do the 85 percent of G-20 countries whose parliaments have subject-matter experts.

Proposals like the Lance-Quigley bill would place publishing responsibilities with another entity, to ameliorate CRS concerns about the service having to publish the reports itself. Briefings and confidential memoranda would not be disclosed and data issued to the public through a searchable, aggregated database would only include nonconfidential information.

As Kevin noted in his testimony, the public deserves to be on equal footing with lobbyists and the Hill.


Applying a BAT to reinsurance would be a big swing and a miss by Congress


As we all saw in recent media coverage of President Donald Trump’s 100th day in office, many observers treat the first 100 days of a new presidential administration as if it were the only time that matters, a legacy that has been with us since President Franklin Roosevelt won passage of most of his New Deal agenda in the first three months of his administration in 1933.

But in some ways, the first 100 days of any new Congress or presidential administration actually is more like baseball’s spring training. It offers lawmakers the chance to warm up, get their teams set and plot out a game plan for the coming year. For baseball, the end of spring training is marked by the start of competitive play. As of last week, Washington’s spring training is closed and it is time to play ball.

The president, congressional leaders and Washington’s many think tanks all have their versions of what comprehensive tax reform should look like, and frankly, everyone is all over the field. One of the biggest issues under debate is a plank from the House Republicans’ plan called the border-adjustment tax, or “BAT.” If Washington isn’t careful, this plan could turn into one giant swing and a miss, particularly when it comes to the reinsurance market.

For a quick trip around the bases, essentially, under the BAT, companies will no longer be able to deduct the costs of imported goods and services. Meanwhile, any company that exports or profits from foreign sales will now enjoy that income tax-free. The debates over whether or not this will be a good thing for the U.S. economy tend to focus on a very few select points. However, if the subjects of insurance and reinsurance are left on the bench, we are going to find ourselves wishing for a rainout.

Right now, it’s unknown whether House Republicans still intend to go forward with their plans for a BAT, much less whether it would apply to financial services like reinsurance – something that only one (China) of the 160 countries that employ the conceptually similar value-added tax does. If Congress chooses to follow in China’s footsteps, we have a problem.

In order to take on the risks of events like Texas hailstorms, Missouri tornadoes, Florida hurricanes and California earthquakes, property insurance companies cede portions of those risks to the global reinsurance market, where they are pooled with risks like earthquakes in Japan, floods in the United Kingdom or terrorist events in France. By pooling portions of these uncorrelated risks from around the globe, the reinsurance market makes it possible for Americans to buy affordable insurance for their homes, vehicles and businesses.

If Congress decides to pass a BAT system that would apply to reinsurance, the cost to American consumers would be painful. A recently released study by the Brattle Group looked at the effects of a BAT on the reinsurance market and found U.S. consumers would have to pay between $8.4 billion and $37.4 billion more each year just to get the same coverage. Several of my colleagues recently have conducted more targeted research finding that, over the next decade, the tax would add $3.39 billion to the cost of property insurance in Texas and $1.11 billion in Louisiana.

Applying a border-adjustment tax to reinsurance would be a pitch in the dirt for American consumers and Congress shouldn’t swing. Insurance companies will be put in the unwinnable position of having to raise their prices and offer less coverage. The end result is higher costs, with more risk concentrated on American shores. That’s a bad call for everyone.

Image by smspsy

Three years in, what does the DATA Act tell us about agency spending?


Trying to figure out exactly how much money the federal government spends long has been an exercise in futility for those few brave souls who endeavor to try it. Though the U.S. Treasury has published financial data since the beginning of the republic, the government has an uneven history, to say the least, when it comes to reporting agency expenditures.

Agencies traditionally have employed a hodgepodge of data and spending models that fail to adhere to a common metric. This makes it difficult for lawmakers and policy experts to wrap their arms fully around federal agency spending. Since at least the 1970s, efforts have been afoot to standardize government data, culminating in 2014’s Digital Accountability and Transparency Act, also known as the DATA Act.

The bill’s purpose was to make expenditures both more transparent and more accessible. It requires Treasury to establish common reporting standards across all federal agencies, with the data posted online in a publicly accessible format.

The DATA Act has been in the news again recently because the first agency reporting deadline is May 9, the third anniversary of the law’s passage. Right on cue, the DATA Coalition hosted a panel discussion and “hackathon” last week to let teams of data wonks work with some of the early datasets the agencies have provided.

Keynote speaker Rep. Jim Jordan, R-Ohio, emphasized the potential for uniform spending data to shape policy by helping lawmakers better understand the scope and size of government. That, in turn, could allow them to enact more meaningful reforms. As he put it: “If you don’t know where you are, it’s impossible to know where you’re going.”

The coalition also hosted a panel featuring three individuals who have been key to creating the uniform financial data standards the agencies now must use: Christina Ho, deputy assistant Treasury secretary for accounting policy and financial transparency; Dave Zvenyach, executive director of General Services Administration’s 18F project; and Kristen Honey, senior policy adviser for the Office of Management and Budget’s chief information officer.

The panelists generally were optimistic about the implementation process, though each noted the difficulty involved in pursuing new endeavors within a convoluted bureaucracy like the federal government. Honey was sanguine about the potential for agencies to follow the lead of private industries that use open datasets for productive ends, noting that American taxpayers have “already paid for this data, so they should have access to it.”

She pointed to the example of a synthetic dataset the Department of Veterans Affairs published last fall that will help researchers study mental health issues among military veterans. Honey also predicted that state and local governments were likely to follow suit on open data initiatives, which she hoped would help expose and weed out inefficiencies in government spending and operations across all levels of government.

The panelists also cautioned that many agencies likely will encounter difficulties aggregating and successfully publishing their spending data by the May 9 deadline. The concern was that if reports from the Government Accountability Office and agency inspectors general catalog widespread deficiencies around the first reporting deadline, it could lead the public and lawmakers to doubt the DATA Act’s efficacy.

James Madison famously claimed that the power of the purse was “the most complete and effectual weapon” that could be wielded by government. Increasing the standardization and transparency of government spending data will only help strengthen that power.

Image by zimmytws

Eli Lehrer at the New American Jobs Summit

R Street President Eli Lehrer was featured on a recent panel at the New American Jobs Summit, joined by Micaela Fernandez Allen of Wal-Mart, Tom Kamber from Older Adults Technology Services and Bill Kamela of Microsoft Corp., to discuss how technology and shifting economic needs are changing how workers prepare to join or rejoin the workforce. Video of the full panel is embedded below.

What’s wrong with e-cigarettes?

R Street Policy Analyst Caroline Kitchens recorded this recent video for PragerU on e-cigarettes, a safer alternative to traditional tobacco cigarettes that could help millions of smokers to quit.

Let’s get rid of Puerto Rico’s triple-tax exemption


Let’s ask a simple and necessary question: Why in the world is the interest on Puerto Rican bonds triple-tax exempt all over the United States, when no U.S. state or municipality gets such favored treatment?

The municipal bond market got used to that disparity, but in fact, it makes no sense. It is an obvious market distortion, on top of being unfair to all the other municipal borrowers. It helped lure investors and savers, and mutual funds as intermediaries, into supporting years of overexpansion of Puerto Rican government debt, ultimately with disastrous results. It is yet another example of a failed government notion to push credit in some politically favored direction. Investors profited from their special exemption from state and local income taxes on interest paid by Puerto Rico; now, in exchange, they will have massive losses on their principal. Just how big the losses will be is still uncertain, but they are certainly big.

Where did that triple-tax exemption come from? In fact, from Congress, in 1917. The triple-tax exemption is celebrating its 100th anniversary this year with the entry of the government of Puerto Rico into effective bankruptcy. Said the 1917 Jones-Shafroth Act:

All bonds issued by the government of Porto Rico or by its authority, shall be exempt from taxation by the Government of the United States, or by the government of Porto Rico or of any political or municipal subdivision thereof, or by any State, or by any county, municipality, or other municipal subdivision of any State or Territory of the United States, or by the District of Columbia.

That’s clear enough. But why?  Said U.S. Sen. James K. Vardaman, D-Miss., at the time: “Those people are underdeveloped, and it is for the purpose of enabling them to develop their country to make the securities attractive by extending that exemption.” All right, but 100 years of a special favor to encourage development is enough, especially when the result was instead to encourage massive overborrowing and insolvency.

It’s time to end Puerto Rico’s triple-tax exemption for any newly issued bonds (as there will be again someday). As we observe the unhappy 100th birthday of this financial distortion, it’s time to give it a definitive farewell.

Image by Filipe Frazao

Lehmann talks NFIP reform on NPR’s Marketplace

In the wake of devastating floods in Missouri, R Street Editor-in-Chief and Senior Fellow R.J. Lehmann was a guest on National Public Radio’s “Marketplace” to discuss why reforms to the National Flood Insurance Program that encourage more private market participation and risk-based rates are essential. The audio is embedded below.

Kevin Kosar on Fox 5 DC ‘On The Hill’

Vice president of policy at the R Street Institute Kevin Kosar appeared on Fox 5 DC’s “On The Hill” to discuss President Donald Trump’s first 100 days in office.

Cameron Smith talks Alabama’s backdoor booze tax

R Street’s Cameron Smith joined the Matt & Aunie Show on Birmingham’s Talk 99.5 to discuss backdoor booze taxes in Alabama. Audio of the show is embedded below.

Kosar testifies to House Legislative Branch Appropriations Subcommittee on CRS reports

On May 3, 2017, R Street Vice President for Policy Kevin Kosar testified before the House Legislative Branch Appropriations Subcommittee in support of making Congressional Research Service reports available to the public.

More from Kevin Kosar on why CRS reports should be publicly available can be found here.

Greenhut on ‘damning’ UC audit

R Street Western Region Director Steven Greenhut was a recent guest on the John and Ken Show on KFI AM 640 in Los Angeles to discuss his piece in the Orange County Register on the recent unfavorable audit of the University of California system. Audio of the show is embedded below.

Puerto Rico’s inevitable debt restructuring arrives


“Debt that cannot be repaid will not be repaid” is Pollock’s Law of Finance. It applies in spades to the debt of the government of Puerto Rico, which is dead broke.

Puerto Rico is the biggest municipal market insolvency and, now, court-supervised debt restructuring in history. Its bond debt, in a complex mix of multiple governmental issuers, totals $74 billion. On top of this, there are $48 billion in unfunded public-pension liabilities, for a grand total of $122 billion. This is more than six times the $18.5 billion with which the City of Detroit, the former municipal insolvency record holder, entered bankruptcy.

The Commonwealth of Puerto Rico will not enter technical bankruptcy under the general bankruptcy code, which does not apply to Puerto Rico. But today, sponsored by the congressionally created Financial Oversight and Management Board of Puerto Rico, it petitioned the federal court to enter a similar debtor protection and debt-settlement proceeding. This framework was especially designed by Congress for Puerto Rico under Title III of the Puerto Rico Oversight, Management, and Economic Stability Act (PROMESA) of 2016. It was modeled largely on Chapter 9 municipal bankruptcy and will operate in similar fashion.

This moment was inevitable, and Congress was right to provide for it. It is a necessary part of the recovery of Puerto Rico from its hopeless financial situation, fiscal crisis and economic malaise. But it will make neither the creditors, nor the debtor government, nor the citizens of Puerto Rico happy, for all have now reached the hard part of an insolvency: sharing out the losses. Who gets which losses and how much the various interested parties lose is what the forthcoming proceeding is all about.

The proceedings will be contentious, as is natural when people are losing money or payments or public services, and the Oversight Board will get criticized from all sides. But it is responsibly carrying out its duty in a situation that is difficult, to say the least.

There are three major problems to resolve to end the Puerto Rican financial and economic crisis:

  • First, reorganization of the government of Puerto Rico’s massive debt: this began today and will take some time. In Detroit, the bankruptcy lasted about a year and a half.
  • Second, major reforms of the Puerto Rican government’s fiscal and financial management, systems and controls. Overseeing the development and implementation of these is a key responsibility of the Oversight Board.
  • Third—and by far the most difficult step and the most subject to uncertainty—is that Puerto Rico needs to move from a failed dependency economy to a successful market economy. Economic progress from internally generated enterprise, employment and growth is the necessary long-term requirement. Here there are a lot of historical and political obstacles to be overcome. Not least, as some of us think, is that Puerto Rico is trapped in the dollar zone so it cannot have external adjustment by devaluing its currency.

The first and second problems can be settled in a relatively short time; the big long-term challenge, needing the most thought, is the third problem.

The story of the Puerto Rican financial and economic crisis just entered a new chapter, but it is a long way from over.

Image by bobby20

Rep. Ken Buck on the Federal Budget Accountability Act


The Federal Budget Accountability Act—introduced last month by U.S. Rep. Ken Buck, R-Colo., as H.R. 1999—is a short bill, barely two pages long. But it aims to help Congress answer a basic oversight question: how much revenue does the federal government actually receive each year from offsets?

As part of the congressional budget process, Congress gathers estimates of revenues to be received by the federal government, which can be used to “offset” authorizations for spending. For example, as a Buck press release points out, Congress authorizes the Strategic Petroleum Reserve to sell oil. “However, the price of crude oil continuously fluctuates … [which] creates uncertainty regarding the accuracy of Congressional Budget Office projections versus actual revenue received through offsets.”

I had the chance to speak about the bill with Buck, who came upon the issue soon after he arrived in the House in January 2015. “There was not a moment when a lightbulb went off. It was a series of statements about how new spending was ‘paid for,'” he said.

On its face, Buck’s bill may seem utterly unobjectionable. It requires nothing more than that the Office of Management and Budget annually report to Congress on the actual revenues received from offsets. Obviously, it is a basic fiduciary duty to discern whether the revenues received actually cover the costs as intended. A few members of the House Budget Committee are cosponsoring the legislation.

But will H.R. 1999 advance? It’s not clear. Buck suspects that additional spending is being passed off as budget neutral by the misuse of overly optimistic offsets. (On offsets and spending amendments in the House, see this CRS report.) “If they pass the bill, the misrepresentations will be known,” he told me. Enacting the legislation could collectively call out Congress and make the already tough debates over mandatory spending more difficult. “Nobody wants to know what the answer is,” Buck reports, “but we all know. … We just don’t know how bad it is.”

Image by lkeskinen

Dodd-Frank reform must include repealing the Durbin amendment


Many of us know what a “seven-year itch” is. Between the famous Marilyn Monroe movie of the 1950s and the legendary Rosanne Cash song of the 1980s, it is a fairly well-understood turn of phrase.

Congress finally got around this past week to scratching one of the most economically painful and fairly literal “seven-year itches” by starting the process to roll back the Dodd-Frank Act, which will turn seven this July.

The Financial CHOICE (Creating Hope and Opportunity for Investors, Consumers and Entrepreneurs) Act—currently before the House Financial Services Committee—has many bright ideas and could serve as a great replacement for the burdensome Dodd-Frank bill of the Obama years. However, in the midst of this happy occasion, the American consumer needs to pay close attention, because Congress may in the end do something stupid.

A behind-the-scenes effort is underway to let a Dodd-Frank provision commonly referred to as the “Durbin amendment” remain in the law. If you have a checking account, you should not let Congress keep this law on the books. Chairman Jeb Hensarling, R-Texas, took a strong stand in calling for repeal of the Durbin amendment as part of the CHOICE Act, and the committee should follow his lead by keeping that repeal in the final mark-up.

The Durbin amendment affects literally anyone with a checking account and a debit card. It requires the Federal Reserve to impose artificial government price controls to cap what banks charge to retailers for what are referred to as “interchange fees,” which banks use to pay for the security they provide for customers’ accounts. The cap is set far lower than it would be in a free market, creating a host of unintended consequences.

Before the government interference, banks and credit unions would use these fees to cover more than just security. They would use the revenues to offer perks to their customers, like free checking or points-rewards systems similar to what we see with traditional credit cards. Studies have shown these perks are worth millions in value to customers. But thanks to the Durbin amendment, banks have been forced to scale back their perks dramatically. The end result has hurt consumers, particularly those—like lower-income families or younger customers—who rely heavily on their checking accounts to conduct financial transactions.

While checking-account customers lost out, retailers (especially big-box retailers) made out like bandits. In 2010, the major retailers’ lobby sold Congress on limiting these transaction fees, promising they would pass along the savings to their customers. As of today, there is no evidence that has ever happened. In fact, an analysis of Federal Reserve data shows retailers have made off with more than $42 billion in foregone interchange fees over the last seven years. Shoppers have seen virtually no decrease in prices, even as they watched many of their banking benefits disappear.

As the Financial Services Committee wraps up its hearings on the CHOICE Act, it’s important for the American people not to sit by idly. The Durbin amendment was sold in 2010 as protection for the American people, but the data prove the only protection it offers is to the major retailers’ profit margins. The House Financial Services Committee should strive to repeal the Durbin amendment, as should the full House when it hits the floor.

Image by alice-photo

Congress’ ‘cotton fix’ just another corporate welfare handout


Spring is a special time in Washington, filled with many wonderful traditions. Between the blooming of the cherry blossoms, the White House Easter Egg Roll and the Washington Nats’ Opening Day, the nation’s capital is full of action.

However, none of these events compare to Congress’ favorite perennial tradition: trying not to shut down the government. After a two-week spring break, Congress is back, ready to work and horse-trading for votes to prevent a government shutdown. One of the items for “trade” currently being kicked around is a massive expansion of two corporate welfare programs. Referred to as the “cotton fix,” Congress is poised to expand the U.S. Department of Agriculture’s Agriculture Risk Coverage and Price Loss Coverage programs to include cotton as a covered crop.

The ARC and PLC programs already are hardly the gold standard of fiscal responsibility. When Congress created the programs in the 2014 farm bill, the projected costs were $18 billion over five years. They now are projected to cost $32 billion over that same time frame. If Congress succeeds in adding cotton to the mix, the projected costs easily could grow by an additional $1 billion a year.

This might be understandable if there were some crisis in the domestic cotton industry that needed to be averted, but Big Cotton already has a pretty cozy deal with Washington. Between subsidized marketing loans, trade-promotion programs and economic assistance to cotton mills, the industry is well taken care of by American taxpayers.

And that’s not all the federal government does for them. Unlike many other crops, cotton growers can participate in the Federal Crop Insurance Program and get to ask taxpayers to cover 62 percent of their premiums. Furthermore, during negotiations that produced the last farm bill in 2014, the cotton lobby was able to convince Congress to create a special program just for them called the Stacked Income Protection Plan (STAX). This cotton-only program has taxpayers covering 80 percent of the cost for policies that protect against “shallow losses” too minor to be covered under traditional crop insurance.

The cotton industry’s costs to American taxpayers don’t end there. The federal government is in the process of paying out $300 million to the Brazilian cotton industry as part of a 2014 settlement agreement with the World Trade Organization. The settlement was a way to resolve a longstanding trade dispute with Brazil over U.S. domestic cotton subsidies that violated WTO rules. The $300 million payment comes on top of about $500 million the United States paid Brazil from 2010 to 2013 over the same set of issues.

The STAX program was created in hopes that it would stave off future disputes with Brazil, but whether STAX meets WTO rules is itself still an open question among experts. What is certain is that adding cotton to the ARC and PLC programs would only raise the odds of more trade disputes that ultimately cost Americans more money.

Let’s be clear: cotton is still king in Texas and some other parts of the country, and Congress knows it. Adding cotton to ARC and PLC isn’t a noble gesture to a struggling industry. It’s about making sure multimillion-dollar companies maintain their profit levels at U.S. taxpayers’ expense.

Congress made a deliberate decision to exclude cotton from these two programs when they were created in 2014. For Congress to sneak cotton in through the back door of a must-pass bill would amount to yet another corporate welfare payoff, with taxpayers once again left holding the bag.

Image by Kent Weakley

Statewide ridesharing rules on the table in Louisiana


Louisiana may soon join the more than 40 states that have adopted some kind of statewide ridesharing rules, under legislation that would pre-empt parish and local governments from setting regulations and taxes on transportation network companies.

Sponsored by state House Transportation Committee Chairman Kenny Havard, R-St. Francisville, H.B. 527 would require TNCs to register with the Louisiana Department of Transportation and Development and to charge a “local assessment fee” equal to 1 percent of each “gross trip fare.” The fee would be remitted to the local governments where rides originated, with a portion also collected by the state to administer the permitting process.

TNCs would be required, through their apps, to display the driver’s picture and license plate before the passenger enters the vehicle. The TNC would also be required to transmit an electronic receipt of the trip.

The legislation also imposes minimum requirements for drivers. The state would ban from working as TNC drivers all sex offenders, felons for up to seven years after their conviction and those convicted of an offense involving drugs or alcohol. The legislation also requires TNCs to adhere to all state anti-discrimination laws and laws providing for the transport of service animals. The law bans drivers from using drugs or alcohol while on duty and requires TNCs to post the policy on their website and provide a means for reporting violations.

In exchange for these requirements, the state would bar local governments and other authorities (including airports) from imposing their own requirements or additional fees. Airports would be permitted to impose only those fees that taxi drivers already pay. Finally, the statute would clarify that TNCs are not taxi operators and are not bound by the taxi code of regulations.

Understandably, the proposal isn’t being received kindly by some in local government:

New Orleans Councilwoman Susan Guidry, who authored the city’s ordinance regulating ride-hailing services, said just a quick overview of the proposed law showed it fell short of the city’s ordinance in a number of ways. It has fewer insurance requirements, less stringent background checks, does not require random drug tests or drug tests after crashes and does not prohibit surge pricing in emergencies.

The proposed state law also does not include prohibitions on discrimination in pick-ups and drop-offs and would not require the ride-hailing services to provide data that could be used to verify whether such discrimination is occurring, something that is included in the city ordinance.

“Why would you create a law that was less protective when they have already agreed to operate under our city’s law which is more protective?” Guidry asked.

Of course, ridesharing companies already operate under a patchwork of rules and regulations. For example, three of the largest parishes in the metro New Orleans area—Jefferson, Orleans and St. Tammany—each has its own ridesharing ordinance, and the three differ from one another in their details. Depending on traffic, it is possible to drive through all three parishes within an hour; it makes little sense to have to navigate that maze of regulatory regimes over such a short distance.

The Legislature should unleash the potential of the sharing economy statewide. It’s good for consumers and provides new opportunities for drivers to make ends meet.

Image by Ionut Catalin Parvu

Permissionless innovation vs. the precautionary principle

Jonathan Taplin worries the “unfettered monoliths” of Google, Apple, Facebook and Amazon undermine democracy and should be broken up. In Europe and elsewhere, this combination of companies is referred to collectively by the pejorative “GAFA,” a ubiquitous bogeyman and symbol of American cultural imperialism. Never mind that all four got where they are by creating tremendous value for consumers. Google organizes information, Apple makes the best phones, etc. They aren’t harming us, they’re making our lives better.

They also aren’t actual monopolies. Amazon faces off with online retailers operating on razor-thin margins. The iPhone has only 18 percent market share. Google has thousands of competitors in digital ads. Facebook could go the way of Myspace. None of these companies is free from competition or in a position to exert monopoly power.

The author wants us to embrace precautionary regulation like the EU’s. But there’s a reason few big tech firms start there. It’s a good thing America’s best companies don’t have to ask permission to innovate or forgiveness for succeeding.

Westinghouse bankruptcy epitomizes failures of electricity monopolies


Westinghouse Electric Co. LLC—the nuclear power company that traces its lineage to the original Westinghouse Electric Corp., founded in 1886—has been forced to declare Chapter 11 bankruptcy, largely the result of immense delays and cost overruns at two nuclear construction sites, Alvin Vogtle and V.C. Summer.

The bankruptcy places a potentially huge financial burden on electric ratepayers in South Carolina and Georgia and underscores the need for nuclear technologies to reduce cost overruns. But it would be a mistake to blame the current state of nuclear technology itself for Westinghouse’s failure. The mess really stems from the perverse incentives of the natural-monopoly model, which rewards utilities for building capital-intensive “mega-projects” irrespective of investment risk.

The story dates back to the late 2000s, when Southern Co. subsidiary Georgia Power Co. and SCANA Corp. subsidiary South Carolina Electric & Gas Co. received state regulatory approval for their shares to build two reactors each at the Vogtle and V.C. Summer sites, respectively. To their credit, the utilities entered into fixed-price contracts (with cost-escalator provisions) with Westinghouse to build the nuclear facilities by a guaranteed date. This helped to mitigate some of the ratepayer risk of cost overruns.

However, the Westinghouse bankruptcy diminishes these guarantees, causing legal disarray amid speculation of rate increases to recover costs of finding new contractors to finish the projects. Both utilities have filed interim agreements with Westinghouse to administer cost-to-complete assessments over a transition period.

The original sales pitch to approve the nuclear projects rested largely on hedging high natural-gas prices, federal carbon regulation, meeting customer demand growth and taking advantage of federal nuclear subsidies. Over the past decade, natural-gas prices tanked, federal carbon regulation (cap-and-trade) never materialized and demand weakened. Now, it appears the utilities may lose the cost advantages of federal nuclear subsidies. Terminating the Westinghouse contracts may force Southern to prepay the outstanding balance on the $8.3 billion loan guarantee provided by the Department of Energy. Billions in cost escalations would continue to spiral if the projects don’t start operations by the end of 2020, which would render them unqualified for the federal production tax credit for nuclear.

Many independent analysts project that delay beyond 2020 is a given. But as the interim assessment period trudges along, the utilities are telling their regulators a different story. Both downplay the remaining time and costs of completing the projects, while expressing their desire to push forward. Meanwhile, Morgan Stanley & Co. asserts that abandoning the nuclear projects is the most likely outcome. If regulators elect to complete construction, Morgan Stanley predicts further delays and estimates additional cost overruns at $5.2 billion for SCANA and $3.3 billion for Southern. By comparison, building an efficient natural-gas power plant would cost roughly $2 billion for an amount of capacity equivalent to each nuclear project.

A strong case can be made that the utilities don’t even need the plants’ full capacity. The Southeast has a surplus of regional capacity, meaning third-party sources would be available at little cost. But regulated utilities have little incentive to buy from third parties, which leads to a well-documented bias to self-build.

State legislation championed by the utilities exacerbated the perverse incentives of the regulated-monopoly model. Georgia and South Carolina passed laws in the 2000s enabling utilities to recover costs via rate hikes during construction, rather than waiting until completion. The laws lower financing costs but shift risk to ratepayers. The change also diminishes regulatory scrutiny of costs, thus dampening utilities’ cost-control incentives. The South Carolina Small Business Chamber of Commerce has criticized the unintended consequences, which include undermining utility incentives to avoid cost overruns, as well as a lack of transparency and of any process for public input on construction contracts.

The Westinghouse bankruptcy makes one thing clear: when legislators and regulators socialize risks and costs, consumers suffer. The regulated-monopoly model creates moral hazard, epitomized by capital-intensive mega-projects in which companies insulated from investment risks lack incentives to guard against those risks. These nuclear projects are just new cases of a century-old problem.

By the late 1980s, monopoly utilities around the world faced high costs and unwanted assets. The subsequent political pressure led to electricity-industry reforms to change incentives, the locus of decisions and risk allocation. Some states liberalized their electric industries in the late 1990s and 2000s and, despite transition challenges, realized the benefits of competitive markets, as merchant suppliers internalized investment risk. In these states, the investment consequences of unexpected policy changes and drops in natural-gas prices and electricity demand have been borne by the private sector, which has repositioned itself to maximize value in a new investment climate. Meanwhile, regulated utilities have sat on power plants that no longer offer the most economical means of producing electricity in order to continue collecting a rate of return on their asset base. Worse, some have embarked on ill-advised investments on the backs of captive ratepayers.

States that failed to learn from the boondoggle projects of regulated monopolies have repeated them. Electric ratepayers will eat much of the cost, even if regulators elect to abandon the nuclear projects, as was the case with mega-projects decades ago. Perhaps the silver lining is that policymakers in regulated-monopoly states finally will learn the appropriate lesson and join the second wave of competitive-electricity reforms.

Federal policymakers should keep in mind that nuclear still provides a strong value proposition as a reliable, zero-emissions resource. However, any technology that takes a decade to build and carries huge capital demands creates an enormous investment risk. For nuclear, the best hope comes in the form of small modular reactors (SMRs). These reactors offer major safety and operational benefits with potential for much lower cost-overrun risk. NuScale Power announced the first SMR submission to the Nuclear Regulatory Commission in January. Easing the regulatory burdens on SMRs would reduce artificial barriers to entry. If SMRs become commercially viable, procurement decisions should come from competitive forces, not rent-seeking monopolies and their regulators.

Image by Martin Lisner

Does Congress have the capacity it needs in foreign affairs?


The Constitution assigns Congress the power to declare war, fund the military, approve treaties and regulate commerce with other nations. Yet, over the past century, presidents have taken the leading role in foreign affairs. Today, the president heads an expanding executive branch security apparatus—one which has found itself mired in controversy many times.

What role does Congress play in foreign affairs in the 21st century? What duties should it have? Does Congress have the resources it needs? The Legislative Branch Capacity Working Group recently hosted a panel on the questions, moderated by R Street’s Kevin Kosar and featuring Kurt Couchman of the Defense Priorities Foundation and Katherine Kidder of the Center for a New American Security. Video of the panel is embedded below:

Florida House bill would make solar installations a pain


If you think getting home improvements approved and completed in Florida is a hassle now, just wait until you try to install solar panels, should monopoly power companies get their way.

A bill currently under consideration in Florida’s Capitol would impose extensive disclosure and needless paperwork requirements on sellers of rooftop-solar panels and other renewable energy systems—to include everything from performance guarantees to tax advice, insurance and a requirement to project future utility rates.

H.B. 1351 by state Rep. Ray Rodrigues, R-Estero, and S.B. 90 by state Sen. Jeff Brandes, R-St. Petersburg, both would implement provisions of Amendment 4 by exempting solar and other renewable-energy devices from ad valorem property taxes. The Senate bill sticks to its objective by simply codifying the amendment, which was approved by 73 percent of Florida voters last August. The House version, however, goes beyond implementing the amendment by regulating the sale, financing and lease of these energy-generation systems, in addition to imposing other conditions.

Some requirements prescribed in the bill appear reasonable at first glance, as they relate to safety and reliability. However, they are superfluous, since installers of these devices already are regulated by the Department of Business and Professional Regulation and are required to be licensed and insured. Additionally, consumers already enjoy legal protections against fraud and other deceptive transactions under Florida’s very tough Deceptive and Unfair Trade Practices Act.

One provision in the bill even requires installers to comply with undefined “standards” set by the local utility company, which would promote an inherent conflict of interest between the renewable electricity source and the utility that stands to lose business from it.

Nevertheless, proponents cite “consumer protection” as justification for these onerous requirements, as so often is the case with excuses for a swelling nanny state to protect us from ourselves. In reality, all too often, these are nothing more than crony capitalist attempts to protect other industry players.

That, in fact, appears to be the case here. Utility companies historically have been the only option available to purchase electricity. With the rise of solar and dramatic decreases in the cost of renewable energy, consumers now have an alternative. Utilities obviously perceive this as a threat to their business model, and businesses unaccustomed to competition generally do not like it.

So while they cannot altogether ban the sale of solar panels and the like, what better way to discourage their purchase than to complicate the process to obtain them? According to a recent Miami Herald investigation, some of H.B. 1351’s language actually was drafted by Florida Power & Light, the state’s largest utility.

If there are legitimate safety or consumer protection concerns with the sale of renewable-energy generation systems that current law does not address, a debate should indeed be had and legislation to address it considered. However, the bills currently under consideration should stick to implementing and codifying the amendment Floridians overwhelmingly approved—not shielding utility companies.

Image by travelfoto

Missouri ridesharing bill moves to Gov. Greitens’ desk


Legislation legalizing ridesharing services in the Show-Me State now sits on Gov. Eric Greitens’ desk, after the Missouri House passed statewide rules for transportation network companies by a 144-7 vote last week. The state Senate had already cleared the measure by a vote of 31-1 a few days earlier.

As the Associated Press described the bill:

The legislation would require that companies pay a licensing fee and adhere to a nondiscrimination policy. It would exempt them from local and municipal taxes and require drivers to submit to background checks and purchase vehicle liability insurance.

Missouri cities, like many others around the country, initially were cool to ridesharing, throwing up regulatory impediments to halt the services’ spread. By the time R Street issued its second Ridescore report in December 2015, only 15 states neither had nor were considering statewide legislation, typically focused on mandatory insurance, taxes and background checks. Today, only a handful of states have not yet passed statewide rules.

In the first Ridescore report in November 2014, Kansas City earned a D- for overall friendliness to for-hire transportation services and an F for its treatment of TNCs. Those grades improved slightly to a C and a D, respectively, in the second report, though both remained several grades lower than the average and median scores in the 50-city study.

Enacting a statewide law has been a priority for House Speaker Todd Richardson, R-Poplar Bluff, and other Missouri lawmakers focused on job creation. Uber has projected an additional 10,000 jobs for the state through expansion of its ridesharing app service. Floor remarks by legislators from Springfield—where both Uber and Lyft now operate—indicated more people have been able to get downtown since that city moved to allow ridesharing services.

The compromise that attracted enough support for the lopsided votes in both houses specifies that Uber, Lyft and other ridesharing services must pay city taxes and be liable for pickup fees at airports. They do not have to pay meter-inspection or other license fees, and they are permitted to charge higher prices at busier times, such as rush hour or bad weather, when demand escalates. These surge charges must, of course, be accepted by the customer in the app. Moreover, both Kansas City and St. Louis won the right to audit the newly authorized services up to twice a year, to alleviate concerns about public safety and chiseling on fees.

State lawmakers have a lot on their plates, since Congress appears unlikely to make more than a few of the 21st-century adjustments required to maintain a reasonable level of civilization. It is encouraging that citizen participation in popular disruptive services has produced an environment where many more people on both sides of the transaction can participate with the government’s blessing and oversight.

Image by Nagel Photography

Carbon taxes are about climate issues, not budgets


A good test for whether politicians are serious about battling climate change, or merely using the problem as an excuse to advance a grab-bag of progressive issues, is to examine what they would do with the revenues collected from a carbon tax.

If the answer involves anything other than offsetting cuts to other taxes, then I suspect the politician’s motives are less than pure. Carbon taxes are not about raising revenue. They are about placing a price on emissions so companies and consumers have incentives to choose lower-emitting options. The goal is to put a price on an “externality” – the economic term for ill side effects that aren’t included in the price of production.

Unfortunately, Washington Gov. Jay Inslee has failed this test with his carbon-tax proposal to help fund the state’s budget. As the Tacoma News Tribune reported in late March: “Not only does Inslee say it would combat climate change, a major priority of the governor’s, but it also would raise $2.1 billion in the next two years to help make court-ordered changes to the public school system and fund construction projects.”

Climate activists routinely warn about the dire consequences for the planet if the public doesn’t get serious about the issue. They also like to harangue global-warming skeptics for their refusal to jump aboard their campaign. Yet when they have the chance to ameliorate the concerns of those with other political views, they fail to do so.

It’s hard to blame skeptics who worry that the global warming fight is mostly about helping the state grab more tax revenue when leaders in that movement make clear they see a carbon tax as a way to help the state grab more tax revenue. Fortunately, Washington legislators from both parties failed to include a carbon tax in their $44.7 billion budget plan, which the newspaper described as a “one-two punch in Inslee’s eyes.”

Carbon-tax proponents believe the tax would internalize the social cost of carbon emissions in a way that’s more efficient and cost-effective than command-and-control regulations. Its purpose is not to fund all sorts of programs or balance the budget. A carbon tax accompanied by cuts in other taxes and paired with reductions in the regulatory burden has the best shot to win over people who suspect the whole thing is a sleight of hand.

Carbon taxes are a hard enough sell even when their backers are not looking for a tax grab. Last November, Washington voters handily defeated Initiative 732, which would have been the nation’s first carbon tax. The Seattle Times reports, ironically, that “the measure had trouble marshaling consensus among progressive and environmental groups” because of “budgetary and other concerns.” Apparently, they didn’t like that its authors tried to make it revenue neutral.

That’s a sad commentary on the priorities of some activists and politicians, who claim to be urgently alarmed by global warming’s threat to the planet. Voters from across the political spectrum might start to take their dire warnings more seriously when they introduce a carbon tax that is about curbing emissions – not raising taxes to pay for a bunch of programs and subsidies. Until then, expect tax-burdened voters to keep giving these proposals a failing grade.

Image by Andre Lefrancois

Are you paying your fair share of taxes?


The following is a guest post by attorney and freelance writer Mark Meuser.


Today, many Americans will finalize their federal income tax returns and send their 1040 forms to the Internal Revenue Service to make tomorrow’s Tax Day deadline. Whether you are receiving a refund or will need to send a check to Uncle Sam, if you worked more than 35 hours a week and did not make at least $164,500 in 2016, you will not be paying your fair share of taxes this year. Shame on you.

Obviously, I am joking, but the per-capita burden of federal spending is no laughing matter. In 2016, the federal government spent approximately $12,387.29 per resident of the United States. Some might think that $12,387.29 in taxes sounds reasonable. Under current tax rates, it would mean each and every man, woman and child must earn at least $66,450 to pay his or her fair share.

Obviously, not everyone works or earns anywhere near that amount. Some 47 percent of all Americans are either too young or too old to be gainfully employed full-time. Even if 100 percent of all Americans between the ages of 25 and 65 were to pay taxes, federal spending would be equivalent to approximately $23,072.30 per working-age adult.

But even among the able-bodied, we don’t see 100 percent workforce participation. Whether because of a disability or lack of necessary job skills, or because a parent chooses to stay home and raise their children, some people just don’t work. According to the Bureau of Labor Statistics, there are approximately 100 million Americans over age 25 who work 35 hours a week or more. To cover total federal spending costs, each would need to pay $39,104.43 in taxes for the government to balance its budget. That would require each to have at least $164,500 in individual (not household) earnings per year.
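The arithmetic above can be sanity-checked with a quick back-of-envelope script. The dollar figures are the article’s; the totals and the implied tax rate are simply derived from them:

```python
# Back-of-envelope check of the per-capita spending figures cited above.
# The constants come from the article; everything else is derived.
PER_RESIDENT = 12_387.29          # 2016 federal spending per U.S. resident
PER_FULL_TIME_WORKER = 39_104.43  # each worker's share if only full-time workers paid
FULL_TIME_WORKERS = 100_000_000   # BLS count cited above: 35-hour-plus workers over 25
REQUIRED_EARNINGS = 164_500       # income needed to owe that much under current rates

total_spending = PER_FULL_TIME_WORKER * FULL_TIME_WORKERS
implied_population = total_spending / PER_RESIDENT
implied_rate = PER_FULL_TIME_WORKER / REQUIRED_EARNINGS

print(f"Total federal spending: ${total_spending / 1e12:.2f} trillion")
print(f"Implied U.S. population: {implied_population / 1e6:.0f} million")
print(f"Implied average tax burden: {implied_rate:.0%} of earnings")
```

Running it shows the worker-level and resident-level figures describe roughly the same pot of money: about $3.9 trillion in 2016 federal outlays, spread over a population on the order of 316 million, with each full-time worker needing to surrender roughly a quarter of $164,500 in earnings to cover the bill.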

An American’s fair share of government spending has not always been this high. When my grandfather was born 95 years ago, per-capita federal spending was just $30.14 ($437.04, when adjusted for inflation). The run-up in federal spending amounts to a nearly 3,000 percent increase.

All of which raises the question each of us should ask as we send off our tax filings: how much government am I really willing to pay for?

Image by Steve Heap

How cronyism threatens Louisiana’s craft breweries


Louisiana is well-known for its love of both food and alcohol. The state is a tourist destination for those looking both to enjoy excellent dining and to have a good time. Louisiana’s love affair with food has made its cuisine famous worldwide, while New Orleans’ Mardi Gras festival has few rivals anywhere.

Across the country, meanwhile, craft-beer breweries and so-called “gastropubs” have been growing. The craft-beer revolution has proceeded at a slower pace in Louisiana, with Abita one of the few local craft beers to gain national exposure. Much of the reason for this disparity is the hostility the state has shown to brewers, in line with its profile generally as a terrible state in which to do business, thanks to its high taxes and crippling regulations. Louisiana has the 12th-highest beer excise tax in the country, at 40 cents a gallon. In fact, the tax-hungry state recently raised it.

If the tax increase were not enough, the state now is going after craft breweries that also serve food and hold events. Last fall, Louisiana’s craft breweries received “cease and desist” letters and were cited by the Louisiana Office of Alcohol and Tobacco Control for everything from holding yoga classes to serving food. The breweries had been holding such events for years without any complaints, but the ATC suddenly unearthed regulations limiting what breweries may provide on their premises.

The craft brewers got angry and demanded a change in the regulations. In March, the ATC released new rules that, on the surface, would permit many such events. Alas, the devil was in the details.

The ATC ruled that live entertainment was permitted at breweries only so long as it was “not the primary purpose of the facility.” Breweries also could serve food and even charge a cover for some shows. But food sales must be “incidental to the beer sales,” meaning they could not exceed 25 percent of on-premise beer sales. The ATC also banned on-site restaurants from serving alcohol produced off-site. Finally, the ATC ruled that breweries could host fundraisers and events for nonprofits, but only if the nonprofit is a registered 501(c)(3), 501(c)(6) or 501(c)(8) and all proceeds from the event go to the nonprofit.

While the new rules clarify old regulations, they still threaten the existence of craft breweries and gastropubs across the state. NOLA Brewery Co. CEO Kirk Coco told The Advocate that he was concerned about how the regulations would affect his brewery’s recently opened barbecue restaurant, a part of its $1.6 million expansion. Coco also warned of job losses, saying he “would guarantee you that there would be at least three or four closures in the next six months and that’s all jobs.”

Meanwhile, other brewers have threatened to take their operations out of state. One brewer considering leaving Louisiana is Parish Brewing Co. “I am in the process of planning a multimillion dollar expansion and I am considering doing so across the border in Texas or Mississippi if the government is against breweries here,” Parish Brewing owner Andrew Godley told The Advocate.

Craft brewers believe the regulations were issued at the behest of the Louisiana Restaurant Association, which sees breweries as competitors, particularly to sports bars. Instead of going to the Legislature to change the law, entrenched interests merely had to complain to an unaccountable executive-branch agency.

Serving food and holding events are an important part of the craft-brewery business; they help breweries build brand recognition and provide jobs for their employees. Louisiana should keep in mind the maxim “do no harm” as it regulates this growing segment of the state economy.

Image by f11photo

Discussing the future of the GSEs on the Investors Unite podcast

I recently joined Investors Unite founder Tim Pagliara on the group’s housing podcast for a broad-ranging discussion about what a future arrangement for Fannie Mae and Freddie Mac might look like. Audio of the full show is embedded below.

R Street launches Justice for Work coalition with April 17 D.C. event

As the bipartisan movement for criminal-justice reform continues to move forward in the states and at the federal level, it’s time to reconsider government-imposed barriers to economic opportunity, such as occupational licensing, mandatory background and biometric checks, and other restrictions on the ability of ex-offenders to find financial stability and meaningful work.

In that vein, R Street will host an April 17 event to announce a new ideologically diverse coalition to highlight the issue of “Justice for Work.” To be held at 6 p.m. April 17 at the Stanton & Greene loft (319 Pennsylvania Ave. SE), the launch will feature an expert panel that includes ex-offenders, former law-enforcement officers, and policy and legal experts. It will be followed by an open-bar social mixer.

We are joined in this new coalition by the American Civil Liberties Union, Right on Crime, Impact Justice, Tech Freedom, FreedomWorks, Americans for Tax Reform and the American Conservative Union Foundation. Together, these members agree that prescriptive mandates may serve a purpose where there is a demonstrated public safety risk that cannot effectively be addressed otherwise. But in areas where access to work is denied solely to signal the empty political slogan of being “tough on crime,” the Justice for Work coalition seeks to make meaningful change.

RSVP here.

The ‘fixed AI’ fallacy


As Andy Kessler points out in The Wall Street Journal, a tax on robots would hinder entrepreneurial activity in automation and artificial intelligence (AI). The same algorithms that make job-displacing robots smarter and more effective also make us more productive at translating documents, searching for information and streamlining daily tasks. We can’t have our cake and tax it too. As Winston Churchill once said, “I contend that for a nation to try to tax itself into prosperity is like a man standing in a bucket and trying to lift himself up by the handle.”

Gates and others who bemoan the changing job market fall prey to the fixed pie fallacy—the assumption that available jobs and the wages those jobs pay are fixed quantities. Developments in information technology have led to jobs unimagined by macroeconomists and technologists of previous decades, such as social-media managers, website designers, bloggers and virtual assistants. Crafting policy based on “fixed AI” thinking will prevent new jobs from arising.

Job displacement is an inevitable consequence of technological development and economic growth. Instead of taxing our digital co-workers, thought leaders such as Gates should argue for policy changes that permit experimentation in skills-based education and workplace benefits to better equip workers with the skills and financial flexibility to adapt to the changing jobs market. To realize AI’s full benefits of productivity and convenience, we need to view it as a feature, not a bug, of our tech-imbued future.

Image by Jinning Li

Caleb Watney talks self-driving cars on KVOI

In light of last month’s high-speed crash in Tempe, Arizona, involving a self-driving Uber car (reports say the car had the right of way), R Street Tech Policy Associate Caleb Watney was a guest on Mike Check with Mike Shaw on KVOI-AM in Tucson to discuss the technology and public policy around autonomous vehicles. Audio of the segment is embedded below.

Kosar talks congressional reform on The Golden Mean

R Street Governance Project Director Kevin Kosar recently joined host Michael Golden’s podcast The Golden Mean to discuss the Legislative Branch Capacity Working Group and the prospects for congressional reform. The full show is embedded below.

Holding the administrative state accountable

R Street Senior Fellow Kevin Kosar joined the Manhattan Institute’s Oren Cass and the Hoover Institution’s Adam White on the Federalist Society’s podcast to discuss the Legislative Branch Capacity Working Group and efforts to restore Congress’ role as a check on the executive branch. The full show is embedded below.

Throwing cold water on the insurance industry’s dog bite numbers


Today is National Pet Day, a day to cherish the love, entertainment and fulfillment provided to us by our animal companions. Or, if you’re in the insurance industry, it’s a day to stoke fear of dog bites.

“Dog-Bite Claims Surge 18% as Children Bear Brunt of Attacks” reads the headline from Bloomberg, based on a press release from the Insurance Information Institute. Indeed, the III produces a similar release every year, in recognition of National Dog Bite Prevention Week, which runs April 9 to April 15.

The calendar-making gods are sending some decidedly mixed messages.

As is their wont, insurers want to highlight safety, which is a perfectly commendable goal. Dog bites and other pet-related injuries befall thousands of people each year, and better care can and should be taken to avoid and mitigate them. They also constitute a significant portion of the loss costs associated with the liability portion of a homeowners insurance policy, which explains the motivation for the public-education campaign.

However, when one drills down on the numbers, there’s little to justify the alarmist rhetoric. Dog bites are not “surging” at all.

It first bears noting that liability isn’t actually an especially big-ticket item for homeowners insurers. The III notes that the industry paid out $602.2 million in dog-related claims in 2016. That sounds like a lot. But it represents just a tiny portion—a little more than 1 percent—of the more than $48 billion in claims insurers paid out, much less the $91.4 billion in direct premiums they collected, according to S&P Global’s statutory insurance data.

Also worth mentioning is that, while the headlines tout a rise in dog “bites,” the data actually refer to “dog-related injuries.” If you break your neck after tripping over your shih tzu, that gets included. How often does that happen? A lot. Falls are the number one cause of nonfatal injuries in this country. A 2009 study from the Centers for Disease Control and Prevention found an average of 87,000 fall injuries treated in emergency rooms each year were associated with cats and dogs. Dogs represented 88 percent of the total, or about 76,000 dog-related falls that send Americans to emergency rooms every year.

Of course, that 76,000 figure far exceeds the 18,123 dog-related claims reported by the III, so the vast majority of people who suffer dog-related falls never file a homeowners claim, even if they went to the emergency room. No doubt the same is true of dog bites. Of the claims we know about, what proportion are dog bites and what proportion are other kinds of injuries? We don’t know; the III doesn’t break out those numbers. We do know that dog bites sound scarier than dog falls (even though the latter might actually produce more serious injuries), so it shouldn’t be surprising that’s what gets the headline.

Speaking of headlines, let’s look at Bloomberg’s choice to characterize the rise in dog-related claims as a “surge.” It’s true that claims rose about 18 percent from 15,352 in 2015 to 18,123 in 2016. Is that really a surge? Bear in mind that there are nearly 90 million dogs in the United States. Even if we assume no single dog was responsible for more than one insurance claim, it would still mean only about 0.02 percent of American dogs contributed to an injury that sparked an insurance claim. A difference of less than 3,000 claims per year, in a universe that big, amounts to statistical noise.
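
The percentages in the paragraph above are easy to reproduce. A quick check, using the III’s claim counts and the approximate 90 million-dog population figure (the one assumption here is that each claim involves a distinct dog, as the text itself stipulates):

```python
# Reproduce the dog-claim "surge" figures from the III's counts.
claims_2015 = 15_352
claims_2016 = 18_123
us_dogs = 90_000_000  # approximate U.S. dog population

pct_rise = (claims_2016 - claims_2015) / claims_2015 * 100
share_of_dogs = claims_2016 / us_dogs * 100

print(f"Year-over-year rise in claims: {pct_rise:.1f}%")
print(f"Share of U.S. dogs tied to any claim: {share_of_dogs:.3f}%")
```

The rise comes out to about 18 percent, and the share of dogs linked to any claim to about 0.02 percent, matching the figures above.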

But even if we were to take the incredibly small sample size at face value, note that this year’s increase followed back-to-back years when the number of dog-related injury claims declined. From 2013 to 2015, the number of pet-related claims fell 12 percent, from 17,359 to 15,352. But were we treated to headlines about how dog bites had “plummeted”? No. No, we were not.

For that matter, it is just frankly irresponsible to represent these numbers without making basic adjustments for factors like inflation and population growth. The III notes that the average cost of a dog-related claim has risen by 73.4 percent from 2003 to 2016. This would leave one with the impression that pets have become more dangerous or, specifically, that bitey dogs have become more vicious.

But that’s just not true. Of course the average injury claim has gone up since 2003, because the cost of health care has gone up since 2003. Using a medical cost inflation calculator, one would expect the average claim to rise by about 56 percent over that period. Again, dealing with a small sample size, the mix of the kinds of claims in a given year could make the average claim go up by more or less than the baseline cost of medical inflation. Indeed, from 2015 to 2016, the average claim went down by 11 percent.

Even more significant to the overall picture is that neither the III nor any of the news outlets reporting its findings makes even the slightest effort to put into perspective that, over the long term, the number of claims has been relatively flat, even as the number of people and dogs continues to increase.

According to the III, from 2003 to 2016, the number of dog-related claims rose by 7 percent, from 16,919 to 18,123. But the population of the United States rose by 11 percent over that same period, from 290.1 million to 322.8 million. And as the chart below makes clear, the population of U.S. dogs surged by a whopping 35 percent.


So, this actually means both that a declining proportion of Americans are being bitten by dogs each year and that a far smaller percentage of dogs are biting (or tripping, or what have you) people. In a nutshell, we’ve gone from one dog-related injury for every 17,146 people and 3,841 dogs to one for every 17,811 people and 4,949 dogs.
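
Those per-person and per-dog ratios follow directly from the claim counts and population figures cited above. A quick sketch: the dog-population totals below are back-calculated from the per-dog ratios rather than read off the chart, so small rounding discrepancies are expected (the implied dog-population growth, for instance, comes out closer to 38 percent than the quoted 35 percent).

```python
# Back out the per-person ratios and implied dog populations
# from the figures cited in the text. Assumption: dog totals are
# derived from the dogs-per-claim ratios, not taken from the chart.
claims = {2003: 16_919, 2016: 18_123}
people = {2003: 290.1e6, 2016: 322.8e6}
dogs_per_claim = {2003: 3_841, 2016: 4_949}

for year in (2003, 2016):
    people_per_claim = people[year] / claims[year]
    implied_dogs = claims[year] * dogs_per_claim[year]
    print(f"{year}: one claim per {people_per_claim:,.0f} people; "
          f"implied dog population ~{implied_dogs / 1e6:.0f} million")
```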

That’s the kind of good news we should be celebrating on National Pet Day.

Image by everydoghasastory

Kevin Kosar at TPA postal reform panel


R Street Senior Fellow Kevin Kosar took part in a recent Capitol Hill briefing on U.S. Postal Service reform. The panel was hosted by Taxpayers Protection Alliance and also featured representatives of Americans for Tax Reform, the American Consumer Institute and Frontiers of Freedom.

It’s time to kill the Durbin amendment


After six years of unfulfilled promises, it’s time the Durbin amendment finally was repealed. A last-minute addition to the Dodd-Frank Act—itself a political overreaction to the financial crisis of 2007-2009—the amendment passed without a hearing or adequate discussion of how it would work in practice. We now know it hasn’t worked at all.

Interchange fees are charged by banks to retailers to allow customers to use that bank’s debit card in that store. The Durbin amendment gave the Federal Reserve power to cap those fees, which at the time averaged $0.44 per transaction, for banks with more than $10 billion in assets.

Proponents of the rule hoped that the revenue banks lost would translate into lower retail prices for consumers. Indeed, retailers were projected to save an estimated $8 billion yearly. But nearly six years since the price controls went into effect, consumers have not benefited; a fair number, in fact, were made worse off.

The cost savings have, for the most part, become profits for retailers. The Federal Reserve Bank of Richmond found recently that three-quarters of retailers it surveyed did not change prices since interchange fee caps went into effect, and nearly one-quarter actually increased prices.

The Richmond Fed estimates the goal that retailers would pass savings on to customers in the form of lower prices has had just a 1.2 percent success rate. These findings are confirmed elsewhere, providing evidence to conclude that consumers experienced effectively no savings at the register.

For any student of history, it should come as no surprise that governments cannot divine the “fair prices” of things. Rent control laws in New York have created enough abandoned housing units to house all of the city’s homeless. Regulation Q, which allowed for government price fixing in deposits, encouraged complex arrangements that discriminated against smaller and less wealthy savers. One can go back as far as ancient Egypt and Babylon to find examples of people not understanding that prices convey economic realities that remain fixed, even after the government changes the prices.

That the Durbin Amendment would suffer the same fate as these other price controls was not hard to predict. To offset revenue losses and remain competitive, banks needed to find ways to raise their deposit account fees. Some did it through higher monthly service charges, while others cut back on free services like checking. A large number of financial institutions—especially small issuers like community banks and credit unions—essentially were pushed out of the competition due to the administrative costs and red tape of various provisions. And all financial institutions saw reduced incentives to innovate in the payment card industry.

As a result, financial markets suffered fewer free checking accounts, fewer debit-card rewards programs, higher costs of entry into financial services and continued reliance on payment networks more susceptible to fraud. These consequences hurt all bank customers, but especially those with lower incomes. Up to 1 million customers were pushed out of the banking system, presumably into the domain of alternative financial providers such as check-cashers and pawnshops.

From the observable consequences, one would be hard-pressed to conclude the amendment accomplished any legitimate objective, other than unintentionally enshrining benefits to particular kinds of retailers. The rule created market distortions that hurt all financial institutions, especially smaller ones, and hurt all depository customers, especially the poor. The Durbin amendment is a case study in how rushing into legislation—without give-and-take deliberation—tends to produce the opposite of what was intended.

Image by alice-photo

Adams talks self-driving cars at Institute for Legal Reform


The threat of litigation could derail the promise of autonomous vehicles to save lives. R Street Senior Fellow Ian Adams recently joined a panel hosted by the U.S. Chamber of Commerce’s Institute for Legal Reform to discuss potential liability issues and how to allow the technology to achieve its full potential. Video of the full panel is embedded below.

Short-term rentals are an opportunity Missouri can’t afford to miss


Whether it’s the cars in their garages or the rooms in their homes, Americans are realizing they’re leaving money on the table when their property remains idle. House Bill 608, making its way through the Missouri Legislature, ensures that Missourians are able to take advantage of economic opportunity in the short-term home rental space.

The economic impact of short-term rentals is significant, as one study after another confirms there is growing demand. The National University Institute for Policy Research found that short-term rentals generated a total economic impact of $285 million in San Diego from 2014 to 2015. A study commissioned by Homeaway earlier this year found the economic impact of short-term rentals in Nashville was $477.2 million.

While H.B. 608 isn’t perfect, reasonable statewide standards for short-term rentals make a lot of sense. If Missouri’s legislators want the “gold standard” of short-term-rental laws, Arizona is a good place to start. The Grand Canyon State collects a number of lodging-related taxes on short-term rentals, but prevents cities, towns and counties from restricting short-term rentals simply because the property in use isn’t classified as a hotel.

One of H.B. 608’s more significant departures from the Arizona model is a provision that allows “any county, city, town, village, township, fire district, sewer district, or water district” essentially to ban short-term rentals before April 1, 2018. The bill’s new statewide provisions won’t affect those “political subdivisions” that act before the grandfather date. This might create an incentive for local governments to race to restrict short-term rentals, simply to retain the option to do so in the future.

Oddly, that’s one of the chief problems that a commonsense short-term-rental law should correct. Missourians across the state should have the same basic opportunity to generate additional income with their properties, not a patchwork of local ordinances that grant opportunity to some and remove it from others.

I’ve seen firsthand how short-term rentals benefit the little guy. On a trip to Charleston last year, I crossed paths with an Uber driver who used the income from short-term rentals in his basement to purchase his car. That’s not some corporation horning in on a neighborhood; it’s the American dream of being able to work hard and succeed, using every tool at your disposal.

Having a transparent and predictable legal foundation for short-term rentals at the state level probably means H.B. 608 is worth supporting, even with the grandfathering provisions. But the Missouri Legislature would make a better choice by ensuring the economic opportunity of short-term rentals is open to all its citizens.

Image by f11photo

Ohio and Indiana take different approaches to opioid epidemic


Ohio Gov. John Kasich described drug addiction as “a common enemy” in this week’s state-of-the-state address. Kasich highlighted the challenge in terms similar to those laid out by Ohio House Speaker Cliff Rosenberger, R-Clarksville, back in January when members were sworn in. But there does not yet appear to be regional consensus on how to engage this blight on civilization.

Just imagine what kind of relief Ohio could be afforded in health care, where most of the $1 billion in state and federal Medicaid addiction-treatment funds go, if this problem were resolved. Nearly as large a cost, in terms of both wasted lives and government expenditures, stems from corrections programs for drug abusers. The costs in education, housing, social services and workplace productivity are incalculable.

As J.D. Vance, author of last year’s bestseller Hillbilly Elegy: A Memoir of a Family and Culture in Crisis, pointed out last week in a keynote address to the Federalist Society in Columbus, a policy that works has got to do something about the addict, but also for the aunts, uncles and grandmothers who shoulder the burden of child care for a mother who has succumbed to a drug overdose. In 2015, Ohio led the nation in overdose deaths with 3,050, and the 2016 total may have topped 4,000. As reported in the Columbus Dispatch story linked above, Senate Minority Leader Joe Schiavoni, D-Boardman, noted at this week’s state-of-the-state joint session of the Legislature that two Ohio counties have had to rent refrigerator trucks to handle the surge in the number of corpses from lives snuffed out by overdose.

The first of Kasich’s major proposals to tackle the issue is a $20 million grant fund to accelerate treatment programs and technologies that promise to serve as useful tools in the fight against drug abuse. The money would come from the Ohio Third Frontier Commission, which votes to dole out bond proceeds for 21st century innovations. The idea is that these resources might bring some breakthrough addiction-mitigating technology to market that otherwise would stall out due to lack of funding.

Currently, prescriptions for pain medications can be written for 30-90 days. According to the Ohio Department of Health, nearly 800 million doses of pain pills were prescribed in Ohio in 2012, although the Dispatch noted that general awareness of the overdose problem has helped curb that figure to about 631 million doses last year.

Number of opioid doses dispensed to Ohio patients, 2011-2015


The governor’s second proposal is that prescriptions be limited to shorter terms—seven days for adults and five days for minors with acute pain, but not chronic conditions. Doctors could use their judgment to exceed these dosage guidelines, if they document the reasons. Apparently, the Ohio Medical Board, Dental Board and Ohio Boards of Pharmacy and Nursing will all have to sign off on the proposed legislation.

Next door in Indiana, a legislative proposal passed on the House floor this week gives up on the modern approaches to criminal justice, which include giving judges more discretion and preferring treatment over punishment. S.B. 324 instead aims to crack down on heroin dealers and those who rob pharmacies, increasing the severity of the penalties for dealing and lessening the judiciary’s discretion in sentencing. Critics argue the Legislature is “backsliding” to previous, failed attempts to address the drug epidemic, but the bill was approved by a huge 72-18 margin.

As state Rep. Ed Delaney, D-Indianapolis, noted before the vote, taking away judges’ discretion means giving more discretion to prosecutors, which isn’t an unalloyed good in the current criminal justice landscape. Even though nearly all lawmakers agree with the proposition that the goal of incarceration is to deal with the “people we are afraid of, and not the people we are mad at,” it proves difficult to convince them not to be more afraid of drug dealers than rapists and armed robbers.

I can’t yet fault the approach in either state, since all serious policymakers are at their wits’ end about the drug problem. But I have to root for Ohio’s search for innovative breakthroughs. As noted above, opioid abuse destroys precious lives and careers and consumes billions in government expenditures.

Perhaps it is time for a serious discussion of the ameliorative potential of marijuana extracts for pain relief. According to public opinion polls, most Americans would like to give medical marijuana a chance to prove its value. However, this is a place where there is a clear conflict between not just science and law, but two distinct sets of cultural values.

Image by Steve Heap

Toward a global norm against manipulating integrity of financial data


The following is a guest post by Tim Maurer, who co-directs the Cyber Policy Initiative at the Carnegie Endowment for International Peace, and Steven Nyikos, research analyst at the Carnegie Endowment for International Peace.

The February 2016 theft of $81 million from Bangladesh’s central bank, which recent reports suggest may have been perpetrated by agents of North Korea, demonstrated the scale of risk that malicious hackers pose to financial institutions.

Cyberattacks to manipulate the integrity of financial data pose a distinct set of systemic risks. While a cyberattack on an electrical grid, for example, will be mostly limited to a single country’s territory or its immediate neighbors, the effects of an attack on the financial system are not bound by geography. Such attacks could lead to bankruptcies that, in turn, send shock waves throughout the global system.

The G-20 finance ministers and central bank governors recognized the threat in a March 18 communiqué:

The malicious use of Information and Communication Technologies (ICT) could disrupt financial services crucial to both national and international financial systems, undermine security and confidence and endanger financial stability.

Now the G-20 heads of state have an opportunity to take further action. A new white paper by the Carnegie Endowment for International Peace proposes the G-20 heads of state explicitly commit not to undermine the integrity of financial institutions’ data—whether in peacetime or during war—or allow their nationals to do so, and to cooperate with the international community when such attacks do occur.

Most states already demonstrate restraint when it comes to cyberattacks that could compromise the integrity of financial institutions’ data. By making such restraint explicit, they could:

  • Send a clear signal that global financial stability depends on preserving the integrity of financial data and that the international community considers attacks on that integrity off limits;
  • Build confidence among states that restraint in this domain is already the norm and thereby make it easier to mobilize the international community when that norm is violated;
  • Foster greater international collaboration to tackle nonstate actors who target financial institutions with cyber-enabled means; and
  • Complement and enhance existing agreements and efforts, namely the 2015 G-20 communiqué, the 2015 UNGGE report and the 2016 cyber guidance from the Committee on Payments and Market Infrastructures and the International Organization of Securities Commissions (CPMI-IOSCO).

The agreement proposed in the Carnegie white paper would commit states not to conduct or knowingly support any activity that intentionally manipulates the integrity of financial institutions’ data and algorithms, wherever they are stored or when in transit. It also binds states, to the extent permitted by law, to respond to requests by other states to assist in halting cyberattacks that target financial institutions’ data and algorithms and that either pass through or emanate from the state in question.

Elements of the proposed agreement are mutually reinforcing. The commitment by states to provide assistance and information, upon request, shifts the burden of attribution from the victim of attack to states that have professed interest in helping to respond to and ultimately prevent such attacks. Linking an agreement on state restraint with expectations for the private sector to implement due-diligence standards addresses potential moral-hazard problems.

The agreement would build on existing international law and on recent international efforts to develop rules for cyberspace. These include the 2015 report of the U.N. Group of Governmental Experts, which proclaimed:

States must not use proxies to commit internationally wrongful acts using ICTs, and should seek to ensure that their territory is not used by non-State actors to commit such acts.

The G-20 heads of state could advance this norm powerfully, building on the finance ministers’ statement, by articulating it formally when they meet in July.

Of course, in the 21st century, a few states that are relatively cut off from the global economy, and nonstate actors who may or may not be affiliated with them, could conduct cyberattacks against financial institutions. But states that endorse the norm explicitly would be more united and would have a clear basis to demand potential retaliatory action against violators—be they states, terrorists or cybercriminals.

Image by vectorfusionart

Vivek Murthy on vaping and public health

R Street Associate Fellow Damon L. Jacobs attended the recent National Council for Behavioral Health’s NatCon conference in Seattle, where he got to ask Surgeon General Vivek Murthy to weigh in on the role vaping could play in harm reduction and public health. Video of the exchange is embedded below.

Juvenile justice legislation now moves to U.S. House floor


A decade after Congress allowed the Juvenile Justice and Delinquency Prevention Act’s authorization to expire, legislation to reauthorize the law is moving to the House floor following today’s successful markup by the Committee on Education and the Workforce.

First authorized in 1974, the JJDPA has been an important tool in protecting children who are in the custody of the criminal-justice system. Based on broad consensus standards of care, the law ensures that children held for “status offenses”—that is, acts that are illegal only because they were committed by someone under the age of majority—can’t be held in jails or prisons unless the child also committed a criminal offense. Another crucial provision of the law requires that, if a child is to be detained, there must be “sight and sound” separation from adult offenders.

The JJDPA has not been reauthorized since it expired in 2007. The current House bill is the Juvenile Justice Reform Act, introduced last week by Reps. Bobby Scott, D-Va., and Jason Lewis, R-Minn. A Senate companion is expected to be introduced next week.

While one should always bear federalism concerns in mind when the federal government sets out standards for issues that clearly are in the states’ purview, it’s encouraging that the JJDPA is back on Congress’ agenda. This is an important piece of legislation that helps ensure children are protected and gives them the opportunity to grow and flourish in their communities.

Image by niceregionpics

Watney talks digital privacy with Chad Benson

R Street Research Assistant Caleb Watney was a guest recently on Radio America’s “The Chad Benson Show” to discuss Congress’ recent move to vacate Federal Communications Commission privacy rules using the Congressional Review Act. The full interview is embedded below.

Sanders talks Pence and masculinity on WMAL

R Street Senior Fellow Lori Sanders joined “The Larry O’Connor Show” on WMAL 105.9 FM in Washington to discuss her recent piece in The Federalist about Vice President Mike Pence’s marital rules and a Canadian college’s demonstration on “hypermasculity.” The full interview is embedded below.

Dieterle talks privatization of Michigan’s Soo Locks

R Street Governance Policy Fellow Jarrett Dieterle recently was a guest on Interlochen Public Radio in Northern Michigan to discuss proposals either to privatize Michigan’s crumbling Soo Locks shipping channel or to have the U.S. Army Corps of Engineers charge user fees to fund upgrades. The interview is embedded below.

Pollock to speak at the 30th World Congress of the International Union for Housing Finance

R Street Distinguished Senior Fellow Alex Pollock will speak on a panel at the 30th World Congress of the International Union for Housing Finance. The conference is scheduled for June 25-27, 2017, in Washington, D.C., and will feature housing finance insights from around the world. For conference program and details, click here.

Pollock’s panel focuses on the housing finance debate in the United States and also features Mike Fratantoni, chief economist and senior vice president of the Mortgage Bankers Association, and Ed DeMarco, senior fellow at the Center for Financial Markets at the Milken Institute.

Register and RSVP here (requires a registration fee).

Kentucky’s new leadership tries to pull it back from the pension cliff


Having become the last legislative chamber in the entire South to flip to Republican control last November, the Kentucky House of Representatives wasted no time this session in moving through a red-meat conservative agenda.

H.B. 1 (calling for right-to-work), H.B. 2 (requiring an ultrasound prior to an abortion) and H.B. 3 (deleting various union-backed “prevailing wage” provisions and abolishing the commonwealth’s Prevailing Wage Review Board) all were signed by Gov. Matt Bevin on Jan. 9. Last week, Bevin signed H.B. 520, authorizing charter schools in Kentucky, which had been one of only seven states not to allow them.

But one issue lingers as unfinished business on both sides of the aisle: the Bluegrass State’s public employees’ pension system. Kentucky has, by some accounts, the worst-funded state pensions in the country. That’s a pretty notable distinction, given the severe pension challenges faced by states and local governments all over the country.

And it’s despite Kentucky having already attempted several fixes in recent years.  Retiree health benefits were cut in 2004 and the Legislature voted in 2008 to abolish various pension “spiking” gimmicks that awarded much larger benefits based on increased earnings at the very end of a worker’s career.  A law passed in 2013 required the commonwealth to make its full pension payments to the system, a hybrid cash-balance plan was formulated for new employees, and cost-of-living increases were all but eliminated.

Alas, the Kentucky Employees Retirement System’s assets dropped sharply in 2014, due to poor investment performance, and the 2015 assets were calculated to cover only 17 percent of its total liabilities over the next 30 years. The Lexington Herald-Leader reported that the funded percentage had dropped to 16 percent last year, although several of the other pension funds—for teachers, university faculty, state police and local government employees—are in better shape. Kentucky’s employer share doubled and now accounts for a full one-third of total payroll costs for state employees. Credit downgrades followed, but the real worry is the cash-flow problem. The point of no return will come when the assets drop in valuation to about $1.3 billion. If that happens, the plan will be forced to convert all its assets into cash, and a cash-only portfolio cannot be fixed.

In the first month after he was sworn in as governor, Bevin announced independent audits of every state pension system. He is calling for substantive structural change, along with extra contributions to help make up the deficit.  One of the changes would be to replace the current state plan with a 401(k) retirement plan for new employees, which also would be open to any current employees who would like to transfer their traditional pensions.

The governor also proposes lowering the assumed rate of return to something more closely resembling the actual financial landscape. At 6.75 percent, even the rate of return assumed for the pension plans is much higher than the 2.344 percent risk-free rate, which reflects an average of the 10- and 20-year U.S. Treasury bond yields from March 2015 to March 2016.  Depending on whether one uses the current rate-of-return assumption or this risk-free rate of return, Kentucky’s pension systems are between $35 billion and $95 billion short of what would be required to pay off the promises made to state workers over the years.  Bevin’s plan would add another $1.1 billion in contributions, while reserving an additional $1 billion in future pension payments.
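To see why the discount-rate assumption moves the liability estimate so dramatically, here is a rough present-value sketch. The 6.75 percent assumed return and the 2.344 percent risk-free rate come from the article; the $3 billion-a-year, 30-year payment stream is purely hypothetical, not Kentucky’s actual benefit schedule:

```python
# Illustrative (not Kentucky's actual) calculation of how the discount
# rate drives a pension plan's reported liability. The cash flows are
# hypothetical: a flat $3 billion per year owed to retirees for 30 years.

def present_value(annual_payment, rate, years):
    """PV of a level stream of year-end payments discounted at `rate`."""
    return sum(annual_payment / (1 + rate) ** t for t in range(1, years + 1))

payment = 3.0  # $ billions per year (hypothetical)
assumed = present_value(payment, 0.0675, 30)     # plan's 6.75% assumption
risk_free = present_value(payment, 0.02344, 30)  # ~2.344% risk-free rate

# Lower discount rates make the same promises look far more expensive.
print(f"liability at 6.75%:  ${assumed:.0f}B")
print(f"liability at 2.344%: ${risk_free:.0f}B")
```

Under these made-up cash flows, the same promises cost roughly $38 billion at the assumed return but about $64 billion at the risk-free rate, illustrating the direction, if not the magnitude, of the article’s $35 billion to $95 billion range.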

These are matters of culture, leadership and responsibility.  Putting off fixes to large, politically sensitive benefit systems is a habit shared by most of the nation, and by governments from Puerto Rico to Greece. Kentucky is certainly not the only state facing real consequences from inaction, but it is among the closest to the edge, having already stripped away many of the ancillary benefits like health care and cost-of-living increases for retirees.

Seven of the 10 largest states each have total unfunded pension liabilities exceeding $200 billion.  Why not 10 out of 10? Leadership and responsibility seem to matter more than red or blue, North or South. Ohio, Illinois and Kentucky all rank among the worst-funded states per capita in the United States, but Indiana was second-best in a study released last fall by the Center for State Fiscal Reform at the American Legislative Exchange Council.  It’s hard to blame Rust Belt economics, differences in work ethic, longevity, demographics or geography for this difference in the charts. Instead, one could speculate that having Mitch Daniels—the former director of the federal Office of Management and Budget—in the governor’s seat for several years had something to do with this. The good news for Kentucky is that Gov. Bevin does not intend to disregard the leadership imperative.

It is difficult to fault a short legislative session dominated by a new and energized Republican majority for not getting to some of the bigger picture items. But hope for the state’s thousands of state workers who were promised retirement benefits and for the taxpayers who will have to provide them lies in a bipartisan effort to shore up the pension system sooner and not later.

Image by Aaban

Murphy’s Law and a banking career


Murphy’s law is well-known in the form: “Whatever can go wrong, will go wrong” and similar variations on the theme. But the intellectually interesting substance of Murphy’s law is:  “Whatever can go wrong, will go wrong, given enough time.”

When a financial calamity has a very small probability of occurring—let’s say a 1 percent chance that it will and 99 percent that it won’t in any given year—we tend not, as a practical matter, to worry about it much. In most years, nothing will happen, and when it hasn’t happened for a long time, we may even start to treat the risk as essentially zero. Professors Jack Guttentag and Richard Herring authored a classic paper that gave this tendency the provocative name “disaster myopia.”

Banking and finance are full of events with a very small expected probability, but which are very costly when they do happen – e.g., a financial crisis.

Suppose the chance of a financial crisis is 1 percent annually. Suppose you optimistically start your banking career at the age of 23 and work to age 68, by which time you will be seasoned and cynical. That will be 45 years. Because you have given it enough time, the probability that you will experience at least one crisis during your career grows from that 1 percent in your trainee year to a pretty big number: 36 percent.

We observe in the real world that financial crises occur pretty frequently—every decade or two—and that there are a lot of different countries where a financial crisis can start. We also observe that virtually no one—not central bankers, regulators, bankers, economists, stock brokers or anybody else—is good at predicting the financial future successfully. Do we really believe the risk management and credit screens of banks, regulators and central banks are efficient enough to screen down to a 1 percent probability?  I don’t.

Suppose instead that the probability of the banking crisis is 2 percent, with 98 percent probability that it won’t happen in a given year. Are banks even that good?  How about 5 percent, with a 95 percent probability of not happening?  That would still feel pretty safe. One more dubious of the risk-management skills of bankers, regulators and the rest might guess the probability, in reality, is more like 10 percent, rather than 1 percent. Even then, in most years, nothing will happen.

How does our banker fare over 45 years with these alternate probabilities?  At a 2 percent chance per year, over 45 years, there is a 60 percent probability he will experience at least one crisis. At 5 percent, the probability becomes 90 percent for at least one crisis, with a 67 percent chance of seeing two or more. If it’s 10 percent, then over 45 years, the probability of experiencing at least one crisis is 99 percent, and the probability of experiencing at least two is 95 percent. Since we learn from troubles and failures, banking looks like it all but guarantees an educational career.
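The arithmetic above is the standard binomial "at least one event in n independent trials" calculation, treating each year as an independent draw. A minimal sketch (the 45-year career and the annual probabilities come from the article; the function name is my own):

```python
# Probability a banker sees at least one (or at least two) crises over a
# career, modeling crises as independent annual Bernoulli trials.

def career_crisis_odds(annual_p, years=45):
    """Return P(at least one crisis) and P(at least two crises)."""
    p_none = (1 - annual_p) ** years
    p_exactly_one = years * annual_p * (1 - annual_p) ** (years - 1)
    return 1 - p_none, 1 - p_none - p_exactly_one

for p in (0.01, 0.02, 0.05, 0.10):
    at_least_one, at_least_two = career_crisis_odds(p)
    print(f"annual p={p:.0%}: >=1 crisis {at_least_one:.0%}, "
          f">=2 crises {at_least_two:.0%}")
```

Running this reproduces the article’s figures: a 1 percent annual chance compounds to about a 36 percent career-long chance, and a 10 percent annual chance makes at least one crisis a near certainty.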

In the last 45 years, there have been financial crises in the 1970s, 1980s, 1990s and 2000s. In the 2010s, we have so far had a big sovereign default in Greece, set the record for a municipal insolvency with the City of Detroit, and then broke that record with the insolvency of Puerto Rico. And the decade is not over. All of these crises by decade have been included in my own career around banking systems, of now close to 48 often-eventful years. The first one—the Penn Central Railroad bankruptcy and the ensuing panic in the commercial paper market—occurred when I was a trainee.

Since 1982, on average, a little less than 1 percent of U.S. financial institutions failed per year, but in the aggregate, there were 3,464 failures. Failures are lumped together in crisis periods, while some periods are calm. There were zero failures in the years 2005-2006, just as the housing bubble was at its peak and the risks were at their maximum, and very few failures in 2003-2004, as the bubble dangerously inflated. Of course, every failure in any period was a crisis from the point of view of the careers of then-active managers and employees.

A further consideration is that the probability of a crisis does not stay the same over long periods—especially if there has not been a crisis for some time. As Guttentag and Herring pointed out, risks may come to be treated as if they were zero, which makes them increase a lot. The behavior induced by the years in which nothing happens makes the chance that something bad will happen go up. In a more complex calculation than ours, the probability of the event would rise over each period it doesn’t occur, thanks to human behavior.

But we don’t need that further complexity to see that, even with quite small and unchanging odds of crises, given enough time across a career, the probability that our banker will have one or more intense learning experiences is very high, just as Mr. Murphy suggests.

Image by Ionut Catalin Parvu

Rorke talks carbon tax on Infinite Earth podcast

R Street Senior Fellow Catrina Rorke spoke recently with Michael Green and Mike Hancox on the Infinite Earth podcast. The podcast focuses on the unlimited potential of human capital in solving pressing resource, social, and health challenges. They spoke about the recent carbon tax proposal advanced by the Climate Leadership Council, how conservatives approach the problem of climate change and what opportunities there are for action in the states and at the federal level. The full audio is embedded below.

‘Right to Know Act’ puts lawyers’ interests ahead of consumers


The Right to Know Act has a great ring to it. Even if you don’t care to read the substance of the legislation—sponsored by state Sen. Michael Hastings, D-Tinley Park—it’s good retail politics. It wouldn’t be as compelling if it were called the “Next Great App Killer Act” or the “Helping Lawyers Sue Tech Companies Act.” Unfortunately, the latter titles are probably closer to accurate.

The proposed legislation boldly declares “all individual have a right to privacy in information pertaining to them” that is “protected by the United States Constitution.” That’s absolutely true, if we’re talking about the Constitution’s Fourth Amendment limitations on what government may do with our private information. The Constitution’s privacy protections do not extend to private exchanges of information. In fact, those generally are protected as “speech” by the First Amendment.

When it comes to commercial consumer data, conceptions of data privacy continue to evolve, as they require balancing what consumers reasonably expect to be kept confidential with what consumers want in terms of convenience and performance. Currently, we have a patchwork of federal laws that cover the topic, including the Health Insurance Portability and Accountability Act, the Gramm-Leach-Bliley Act and the Children’s Online Privacy Protection Rule. A smattering of state-based consumer information laws form another layer of privacy rules.

Most of these laws outline rules for how to treat specific kinds of nonpublic information, such as health or credit information. They also require any number of consumer notifications when data security has been breached. When it comes to items like our financial transactions, phone records and health information disclosed to a physician, most of us have a reasonable expectation of privacy.

But that’s not necessarily the case when we use social media, search engines and GPS-based apps like Uber and Yelp. Unless you’ve been living under a rock, you probably recognize the reason many technology companies develop free applications isn’t out of a sense of philanthropy—it’s because the treasure trove of user data is so valuable. That’s a strong incentive for new market entrants to create the next generation of technologies and apps we want.

For example, I know my information is collected to improve how various app and website services perform for me. The trade-off is that companies I patronize have a massive amount of data about my habits, purchases and location. That information is extremely valuable to companies that want my business, politicians who want my vote and charities that want my support. But that’s the deal. When I breeze through those boilerplate legal declarations while installing the software, I’m essentially agreeing that my use of the app or website is worth the trade-off and the associated risks. If I don’t want to take the risk, I forfeit the convenience of the technology.

The problem with the Right to Know Act is that it puts the judgment of legislators and the financial interests of lawyers ahead of consumers. The proposed legislation modifies those boilerplate agreements you’re already ignoring to give Illinois consumers the right to request information about which categories of personal information are disclosed to which third parties. Most notably, it creates a right to sue for $10 in liquidated damages (where actual damages are less than that total), injunctive relief and, of course, attorneys’ fees.

Since most consumers would rather simply use the technology in question than file a lawsuit over $10, this legislation appears custom-built by enterprising politicians for attorneys interested in putting together class-action lawsuits. Nothing says “thank you” to the plaintiffs’ bar for supporting your campaign quite like inventing a new cause of action.

In addition to empowering lawyers, the bill also burdens commercial websites and other online services with new compliance costs, in the form of information-disclosure, data-collection and data-retention requirements. These costs can mean the difference between a technology company adding a new feature to an app and spending more on legal counsel. While politicians and lawyers line their pockets, consumers will bear the cost in the form of less effective apps.

Image by RRuntsch

Dieterle talks Virginia distilling regs on Freedom and Prosperity Radio

R Street Governance Policy Fellow C. Jarrett Dieterle joined host Joe Thomas on Virginia’s Freedom and Prosperity Radio — sponsored by the Virginia Public Policy Institute — to discuss his recent piece in the American Spectator about Virginia distilling regulations. The full audio is embedded below.

Pollock before Oversight Subcommittee

Here’s more from R Street Distinguished Senior Fellow Alex Pollock’s testimony before the House Committee on Financial Services on the arbitrary and inconsistent non-bank SIFI designation process.

Alaska’s ridesharing bill one step closer to becoming law


With the state Senate’s passage of S.B. 14, Alaska is one step closer to a legal framework for transportation network companies such as Uber and Lyft. The Senate on March 23 voted 14-5 to pass the “Let’s Ride Alaska Act,” which would make Alaska the 45th state to enact comprehensive ridesharing legislation.

Uber left “The Last Frontier” after about six months in 2014, due to a spat between the company and the state Department of Labor and Workforce Development over worker classification. The state was concerned with drivers classified as contractors instead of employees, meaning they would lack company-purchased workers’ compensation insurance.

This bill codifies the classification of drivers as independent contractors. Given the complete autonomy TNC drivers have regarding their work schedule, it is hard to justify mandating additional employee benefits that likely would increase costs to consumers and prevent TNCs from taking on additional drivers.

The bill also preempts localities from imposing further regulations on TNCs, joining a growing number of states that are choosing to preempt county and municipal restrictions. This drew the ire of the capital city of Juneau, which requested the preemption clause be removed because of its desire to apply sales tax ordinances and to “require transportation network drivers to register as a business with the municipality, in the same manner as other businesses.”

TNCs could be a big revenue stream for cities and another backdoor tax on consumers. But creating a patchwork of local regulations has proven to be a bad idea. For instance, overly restrictive regimes in Houston, San Antonio and Austin have undermined transportation options in Texas. While decentralizing governmental responsibility is normally sensible, city governments are the creation of states (unlike states’ relationship to the federal government), and a jumble of municipal ordinances inevitably creates unnecessary compliance costs in an arena that’s inherently inter-jurisdictional.

Alaska’s Senate bill also requires background checks for TNC drivers. In addition to the vetting done by passengers through the TNC apps’ rating systems, the TNCs have a strong incentive to protect their customers and already screen out drivers who might be likely to commit crimes while on the job. Juneau’s city government wants the freedom to add to the background-check process, including the ability to mandate fingerprint checks against the FBI’s NGI database. Juneau justifies its request on grounds that it is “a requirement currently enforced for taxi drivers doing business in Juneau.”

But we shouldn’t impose new regulations just to “level the playing field.” Indeed, mandating fingerprint background checks through the FBI raises some serious concerns. For instance, they pose a disproportionately unfair obstacle to labor participation by minority communities. The FBI database tracks arrests, not convictions. It was built to capture lots of data, but can be prone to false positives. Nearly half the FBI’s records fail to include information on the final disposition of a case, such as whether someone was acquitted or had charges dropped.

This bill may face an uphill battle, given the interest groups it may anger, but the House Labor Committee chairman, who killed a similar bill last year, already has intimated he will allow S.B. 14 to go to the floor for a vote. That would be another step in the right direction for all Alaskans.

Image by Rocky Grimes

What does the executive order mean for the climate? Not much


President Donald Trump’s long-awaited executive order on climate policy vastly remakes the executive branch approach to climate-change risk, adaptation and mitigation. Whereas the Obama administration was focused on reducing emissions and managing climate risks, the Trump administration is executing program changes that suggest such concerns were folly.

It’s a substantial policy overhaul, prompting the executive branch to reconsider seven regulations or guidance documents that pertain to reducing emissions from the power sector and fossil-fuel development or considering climate risk in federal policy. The order rolls back regulations on oil and gas; lifts the moratorium on federal coal leasing; reconsiders the much-maligned social cost of carbon estimate of climate damages; and requests that agencies abandon the practice of evaluating climate impacts when considering regulations, land-use decisions or new programs or projects.

But at its heart is the remaking of President Barack Obama’s regulations on coal facilities in the electric power sector. A major source of anguish in the coal community and among Trump’s supporters were two Environmental Protection Agency regulations that would make it nearly impossible for the power sector to increase reliance on coal:

  1. A “new source performance standard” required that no new coal facility be built without expensive and unproven carbon-capture-and-sequestration technology.
  2. Separately, the Clean Power Plan pushed states to cut greenhouse-gas emissions from the power sector by about a third below 2005 levels within 25 years.

Both regulations currently are working their way through the courts, though the Supreme Court issued an unprecedented stay to prevent the Clean Power Plan from going into effect. Trump has directed the EPA to rewrite both of these rules. This isn’t the outright elimination that many expected, and there’s a good reason why.

A 2007 Supreme Court decision directed the EPA to address greenhouse-gas emissions if those emissions proved to be a threat to public health and welfare. In 2009, the EPA finalized its “endangerment finding,” a regulatory declaration that emissions of greenhouse gases contribute to factors that threaten the public. Taken together, the EPA must take steps to limit greenhouse-gas emissions. An order to rewrite Obama’s climate regulations puts the EPA on the slow and methodical road to issuing a new regulation; in the interim, coal has an opportunity to see if it can fight its way back into the power sector.

If greenhouse-gas policy is in flux, what does this executive order mean for the climate? Probably not much.

Government policy to limit greenhouse-gas emissions has, to-date, been written as though the private sector were incapable of reducing emissions without bureaucratic intervention. The prior administration was certainly no exception, issuing rules to reduce climate emissions from the power sector, vehicles, industrial use and even household appliances. The regulatory onslaught suggested that reducing greenhouse-gas emissions was a pre-eminent government priority and that the market needs the influence of regulation to respond accordingly.

Such an assumption is patently wrong. What we’ve learned in the last decade from the fracking boom, precipitously dropping prices for wind and solar, increased urbanization, technology adoption, automation and a wide variety of additional trends is that the free market can reduce emissions even in the absence of government policy.

So before heeding the claims of the environmentalist left that this executive order, and the president’s policies more generally, will set the United States on an unacceptable carbon-emissions trajectory, be mindful that the private sector is hard at work delivering incidental carbon-emissions reductions that meet or exceed the goals of Obama-era policies.

This latest executive order, coupled with substantial cuts to some of our core environmental and risk-management programs in the president’s budget, suggests that the administration’s eagerness to remake federal greenhouse-gas policy may yet prove to be shortsighted. While there is considerable and justifiable debate over whether to reduce greenhouse-gas emissions through federal policy, a clear-eyed assessment of climate risk, like any other risk, remains an indispensable fiduciary responsibility of the federal government.

As the White House rethinks carbon and climate policy, the pressure falls to Congress to serve as ballast. Conservatives may be deeply suspicious of government-expanding approaches to limit greenhouse-gas emissions, but they are equally suspicious of unwise, uninformed government spending. There is ample opportunity to liberate the private sector from regulatory overkill, while providing federal agencies the information they need to handle climate risks appropriately.

Just last week, 17 Republicans released a resolution to use “our tradition of American ingenuity, innovation, and exceptionalism” to address climate change. With bicameral majorities, perhaps conservatives will find their voice and offer an authentic, principled counterpoint to tackle that challenge.

Image by Evan El-Amin

Let’s eliminate the SEC’s Investment Advisory Committee


Looking at the Securities and Exchange Commission’s Investment Advisory Committee as a proxy for the relative influence of the shareholder empowerment movement gives one the distinct impression that the SEC is, at the least, unduly influenced, if not fully captured.

Shareholder empowerment advocates—primarily, but not exclusively, those who represent the interests of public pension funds and union-related funds—call for shifting corporate decision-making authority toward shareholders and away from boards of directors and executive management. The effect on corporate governance is to allow uninformed shareholders an ever-increasing power to interfere with the decision-making of the most informed loci of corporate authority. The end results of this approach are suboptimal board and executive decision-making, fewer successful companies willing to become or remain publicly traded and constraints on society’s ability to create economic wealth.

Congress created the IAC under Section 911 of the Dodd-Frank Act, with the drafted purpose to:

advise and consult with the Commission [SEC] on … regulatory priorities of the Commission; … issues relating to the regulation of securities products, trading strategies, and fee structures, and the effectiveness of disclosure; … initiatives to protect investor interest; and … initiatives to promote investor confidence and the integrity of the securities marketplace; and … submit to the Commission such findings and recommendations as the Committee determines are appropriate, including recommendations for proposed legislative changes.

This sounds innocuous enough. Moreover, it appears Section 911’s authors expected membership to be broadly based and to represent a variety of interests:

The members of the Committee shall be … the Investor Advocate [heads the Office of the Investor Advocate, a new office established by Section 915 of the Dodd-Frank Act]; … a representative of State securities commissions; … a representative of the interests of senior citizens; and … not fewer than 10, and not more than 20, members appointed by the Commission, from among individuals who … represent the interests of individual equity and debt investors, including investors in mutual funds; … represent the interests of institutional investors, including the interests of pension funds and registered investment companies; … are knowledgeable about investment issues and decisions; and … have reputations of integrity.

Nevertheless, like a lot of legislation, Section 911 has had unintended consequences. The IAC’s membership has been dominated by shareholder empowerment advocates. I estimate at least seven of the 18 members can be considered movement supporters, including Chairman Kurt Schacht of the CFA Institute and Vice Chairman Anne Sheehan of the California State Teachers’ Retirement System, who presumably set the committee’s agenda.

Given this bias, it should not come as a surprise that one item on the IAC’s meeting agenda earlier this month was a discussion of Snap Inc.’s recent initial public offering with a dual-class share structure that did not offer voting rights to purchasers. The shareholder empowerment movement’s abhorrence of dual-class share structures—based solely on the fact that they reduce or eliminate the voting power of the typical stockholder—is nonsensical. This structure has been used by some of the most successful companies in the world—including Alphabet (Google), Berkshire Hathaway, Alibaba Group, Facebook, Under Armour and LinkedIn—to create enormous wealth for their stockholders.

Moreover, the Snap IPO was hugely successful. Snap priced its offering at $17 per share, giving it a market valuation of roughly $24 billion. The book was more than 10 times oversubscribed and Snap could have priced the IPO at up to $19 per share. So whose interest are shareholder empowerment advocates trying to protect when they attack dual-class share structures?  It appears the interest of most concern to the movement is its own, as the more dual-class share structures there are, the less power the movement has.

Shareholder empowerment advocates do not need any extra help from the Dodd-Frank Act to have a major impact on corporate governance. As Congress and the White House continue to review sections of the Dodd-Frank Act to be amended or repealed altogether, Section 911 should be included on that list.

Image by g0d4ather

Novato’s tobacco ordinance is discriminatory


The City of Novato’s recently enacted anti-tobacco ordinance—which will go into effect in 2018, but is subject to reconsideration next week—would give landlords a tool with which to discriminate against renters. The measure discriminates against poor people, too.

The law bans smoking in apartments and condominiums, and also bans the use of electronic cigarettes and smokeless tobacco. Other communities have passed similar bans, but Novato’s ban goes much further. Use of any tobacco product—or even ownership of an ashtray—would be considered a material breach of a lease, thus giving landlords wide latitude to evict tenants.

The city exempts single-family homeowners from the law, which means the brunt of the measure will fall on less-affluent residents. It will also fall on those who attempt to quit their dangerous cigarette-smoking habit, given that the law treats less-dangerous vaping and smokeless tobacco products the same as cigarettes.

The law also encourages snitches, by making it illegal to abet or conceal a violation of the law. It encourages litigation, in that it gives absolutely anyone standing to sue and seek damages against those who vape in, say, an apartment’s common area.

This law is a godsend for landlords. They receive broad eviction authorities if they catch residents or even their guests with any type of tobacco product. It’s a creepy and discriminatory measure and its worst sections should be repealed.

Hobson talks autonomous vehicles and organ donation on Polish TV

R Street Tech Policy Fellow Anne Hobson recently gave an interview to TVN, a private television network in Poland, appearing on the morning show “Dzien dobry TVN” to discuss the recent Slate piece she wrote with Senior Fellow Ian Adams on how the rise of autonomous vehicles could affect the market for organ donation.

According to the National Highway Traffic Safety Administration, autonomous vehicles could prevent many of the 94 percent of road deaths attributable to human error. However, reducing vehicle accidents might reduce organ donations by up to 20 percent, exacerbating existing organ shortages. As the incidence of life-threatening chronic diseases such as kidney and liver disease increases, we need to consider promoting the market for organ donation by introducing presumed-consent rules or legalizing incentive packages that can include things like fixed payments, health insurance and paid donor leave.

Video of the appearance (overdubbed with simultaneous Polish translation) can be found at the link below.


Senate to FCC: Privacy regulation is not your job


The U.S. Senate narrowly passed a resolution Thursday that would halt Federal Communications Commission efforts to create rules for the way internet service providers use their customers’ personal information.

While some critics characterized the vote as anti-consumer, the FCC’s privacy rules—drafted under former Chairman Thomas Wheeler—represented agency overreach and would have created unnecessary overlap and confusion with general privacy rules established and enforced by the Federal Trade Commission.

The Senate’s move is in line with current Chairman Ajit Pai’s decision to delay enactment of the Obama administration’s FCC privacy rules and instead work with the FTC to seek a “comprehensive and uniform regulatory framework.”

The internet’s impact on consumer privacy is a significant issue. The combination of rapid processing speeds, networked databases and universal connectivity make it easier than ever to search and analyze personal information and transform it into a marketable commodity. We may debate the degree to which all of this should be regulated, but those who understand the Constitution know it is Congress’ prerogative to authorize which federal agency should perform which regulatory task.

As much as progressives might wish it, government regulatory agencies cannot simply do what they want. Just because the FCC regulates wide portions of ISP business doesn’t mean it may regulate every portion of ISP business. It is not the FCC’s job to write rules on privacy regulation. That role, by law, falls to the FTC.

Put another way, it would be as if the Securities and Exchange Commission were to decide to write its own rules for disability access at Goldman Sachs’ headquarters, on grounds that it has regulatory authority over securities broker-dealers. Congress charged other agencies – chiefly the Justice Department and the Equal Employment Opportunity Commission – with enforcing the Americans with Disabilities Act, and would be right to curtail another agency’s attempt to usurp that authority. It would be misleading to say such an action is insensitive to the disabled—the ADA rules are still in force. It’s just not the SEC’s job to write new ones.

From a practical standpoint, the FCC regulates only one group of companies in the internet ecosystem—service providers like AT&T, Verizon, Comcast and T-Mobile. Even if its privacy rules were sterling examples of regulatory discretion, they would not apply to other internet companies that collect as much or more consumer information. That would create an imbalance by saddling different companies within the same industry with different rules.

One set of regs, from one agency, for all players, is the correct way to allow these companies to work within the ecosystem while protecting consumers.

Image by Mark Van Scyoc

Building infrastructure by walking across ideological lines


Talk of infrastructure is in the air, as President Donald Trump and his advisors call for $1 trillion in investment and Democrats propose their own wish lists. Is there an artful deal to be struck? The R Street Institute hosted a recent conversation (which I moderated) on bipartisan solutions to get the economy moving and let those who benefit fund transportation.

The panel featured:

  • Christopher Leinberger, the Charles Bendit Distinguished Scholar and Research Professor and chair of the Center for Real Estate & Urban Analysis at the George Washington University School of Business;
  • Robert Puentes, president and CEO of the Eno Center for Transportation;
  • Christopher Coes, vice president of real estate policy and external affairs at Smart Growth America; and
  • Salim Furth, research fellow in macroeconomics at the Heritage Foundation.

Video of the full panel is embedded below.

Tito’s Vodka isn’t the only good distilled spirit out of Texas


When you think of Texas alcohol, you usually think of good ole Shiner beer. The state is more famous for its beer than anything else. However, some new spirits have come out of the Lone Star State that might change that.

Texas is quickly becoming one of the leaders in the craft distilling industry, which has exploded in recent years, with at least 1,315 craft distillers now operating in the United States. In 2010, craft distillers made up only 0.8 percent of the market. By 2015, their market share had more than doubled to 2.2 percent.

Of course, the most famous of these new Texas distillers also happens to be the oldest, Tito’s Vodka out of Austin. Tito’s has been around long enough to be sued over its labeling. The company became famous after it won the double gold medal for best vodka at the 2001 San Francisco World Spirits Competition. It has since gone nationwide and, perhaps most notably, is served by several airlines on their flights.

But vodka is not the only distilled spirit to come out of Texas. Texas whiskey also has made big strides over the past few years. The Lone Star State is embracing its distillers, with Houston even hosting a whiskey festival. While Texas distillers do not yet have the reach or the reputation of Kentucky or Tennessee, the product certainly stands right there with the more well-known distillers.

One of the best-known whiskies out of Texas is Rebecca Creek, based in San Antonio. Unlike many other Texas distillers, it has had success selling its whiskey outside Texas. The distillery, which also produces vodka, was founded in 2009 by Steve Ison. The whiskey is cut to bottle proof using purified water and has performed well in competitions.

Another Texas distiller is Trinity River Distillery in Fort Worth, a veteran-owned company established in 2011. Trinity River produces the Texas Silver Star line of spirits, which includes a bourbon, a honey whiskey and a vodka. Its whiskey and vodka are cut using 100 percent rainwater, which the company claims provides an already purified water source. The rainwater is collected through a drainage system on the roof of the building and stored at the distillery.

Finally, Houston is represented by Whitmeyer’s Distilling Co., founded in 2012 by a couple of veterans. They make an assortment of whiskies, vodkas and even a gin. Their distillery is a small operation, located in a warehouse, and uses purified water because it helps the products drink smoother.

Most Texas distilleries offer tours via Groupon to bring in locals and tourists alike to see their facilities and enjoy samples of their drinks. It’s a way both to market themselves as tourist attractions and to spread the word about their products – a low-cost form of advertising that gives the distilleries multiple income streams.

I have toured the Trinity River and Whitmeyer distilleries. The Texas Silver Star bourbon was a very smooth whiskey, one of the best I have ever tasted. I even brought a bottle back home to Louisiana.

The Whitmeyer distillery offers a wide range of products. The more expensive whiskies were very good, and the basic Texas whiskey is a solid option in the $20-$30 range. It, too, holds its own against similar competitors.

One of the problems the Texas distillers have is the lack of national reach. Texas itself is a big market, but they cannot secure as many markets outside the state. The only way they will show up on the national radar is by winning plaudits from whiskey connoisseurs both in the United States and around the world.

Texas craft whiskey is a product with enormous potential. It needs to be embraced.

Image by lev radin

Shifting drinks law landscape in Pennsylvania is a maze for businesses


The alcohol-policy landscape in Pennsylvania was static—almost glacial—for most of the past 40 years. However, these days, the glaciers are melting and the landscape is changing rapidly. New laws and interpretations are coming every few months.

The evolution in our drinks law is welcome and overdue. But in the short term, it has meant chaos for small businesses struggling to keep up and left some consumers confused about what the changes actually mean.

For decades, Pennsylvania’s unique mess of odd options embodied in its archaic post-Repeal liquor code was tweaked only incrementally. Gradually, we got shelves in our state monopoly wine and liquor stores (before, we had to order at a counter, by catalog number); we were allowed to pay for beer with a credit card; and a fraction of the state stores opened for a few hours on Sundays. These were all tiny changes.

The beverage alcohol retailer/wholesaler situation had been pretty quiet, as well. The state maintained its monopoly on wine and liquor wholesale operations and off-premise retail sales. Beer was supplied by privately owned wholesalers and sold by the case and keg only at retail stores (“beer distributors”) and in 12-bottle maximum purchases at bars and restaurants. Producers were allowed to sell directly. It was confusing, but we were used to it, and family businesses planned major investments based on this system.

Much has changed in the past year, as we’ve entered an activist phase in liquor law. Beer distributors first were granted the ability to sell 12-packs as well as cases. Nine months later, the policy was changed to no minimum sale. It’s also no longer always illegal to sell beer at gas stations, so long as the station also has a restaurant license and a 30-seat “café.” License holders may also sell up to four bottles of wine to go, as long as the price is no lower than that charged by the state’s monopoly stores. Grocery stores are buying licenses (and adding seating) to take advantage of this change.

We’re pathetically overjoyed by this. Almost everyone who talks about it says: “We can buy beer and wine at supermarkets and gas stations now!” This isn’t exactly true – fewer than 300 of the state’s thousands of gas stations and supermarkets have licenses. With the licenses going for as much as $560,000 (yes, just the license), it’s clear that this is not a business opportunity for Mom & Pop’s Corner Market.

A tiny number of new licenses are being created in the tax-abatement Keystone Opportunity Zones, but nowhere near enough to meet this new demand. The price of transferable licenses is spiraling upward, and will soon only be affordable to chain stores and restaurants.

The solution could be a new license for off-premise sales: plentiful, affordable and nontransferable. But every time a grocery or convenience store buys a tavern license for more than $200,000, that solution becomes less attractive to legislators, who are not moving to do anything to fix this situation.

Another recent change allowed Pennsylvania wineries, breweries, distilleries, cideries and meaderies to sell each other’s products, and also allowed each to have up to two off-premises stores. It was quickly realized that this created a loophole that potentially allowed a privately owned, Pennsylvania-only, full-service booze store. One has already opened in Pittsburgh, the first private liquor store in the state since Prohibition.

To be fair, the Legislature is moving in the right direction. State House Speaker Mike Turzai, R-Marshall, has promised a bill to privatize the state’s wholesale booze business this session. State Sen. Randy Vulakovich, R-Shaler, has reintroduced a bill that would add spirits to the wine and beer sales allowed to licensees (with the same four-bottle limit as wine), and news outlets are already anticipating its passage. Gov. Tom Wolf and the Democrats in the Legislature seem willing to compromise on more booze freedom, so long as it doesn’t mean outright privatization of the state monopoly stores (and their unionized workforce). The forecast is wet.

Meanwhile, the state’s longtime private beer retailers are being shut out. I recently talked to one who was furious about the lobbying performance of his industry’s trade group, the Pennsylvania Malt Beverage Distributors Association (MBDA). “They could have made a deal,” he spat out, “they could have gotten us wine sales. But there’s only one word in their vocabulary: ‘No!’”

The MBDA for years stonewalled on changes to the liquor laws, rightly seeing the state’s monopoly as crippling their wine and liquor competition. They fought mightily to keep beer sales out of groceries and gas stations. But when things started to change, it would have been smart to cut a deal. Now they’re looking at a shrinking business; “the volume-value end” in cases and kegs, as my source put it.

When things move this quickly, a business has to be nimble and knowledgeable in the ways of lobbying and legal interpretation. That’s experience Pennsylvania businesses simply haven’t had, and those are skills they’re going to have to acquire as the liquor landscape continues to slip, slide and slowly advance toward privatization.

Image by ra2studio

Guest blogger Lew Bryson is the author of “Tasting Whiskey” (2014) and books on the breweries of Pennsylvania, New York, Virginia, Maryland and Delaware. 

SXSW Roundup: What will AR/VR revolutionize next?

The experiential factor in augmented and virtual reality leads to a more vivid sense of presence and immersion, when compared to other media like television or radio. This makes AR and VR powerful platforms for social engagement, education and adventure.

As part of Innovation Policy Day at SXSW, I took part in a panel discussion led by the Consumer Technology Association’s Michael Hayes on what AR and VR will revolutionize next. Joining me were Tim Hwang, Google’s public policy counsel, and James Hairston, head of public policy for Oculus. You can see the full video below:

Imagining what AR and VR will revolutionize is no small task. Entrepreneurs are trying to make the next “killer app,” which may be as simple as viewing your two-dimensional computer screen in virtual reality or as complicated as exploring a vast, immersive, open virtual world reminiscent of The Elder Scrolls V: Skyrim. Augmented and virtual reality can aid in job training by simulating flight or manufacturing processes. VR can help paraplegics learn to walk again by retraining the brain to recognize limbs.

AR and VR will also change how we communicate. While video games often are viewed as antisocial, in VR this does not have to be the case. Users can interact with artificially intelligent non-player characters or hang out with friends in virtual spaces. By practicing in VR, people can overcome social anxiety around public speaking or other social experiences.

Imagining these good applications for VR and AR is best left to entrepreneurs. According to economist Israel Kirzner, entrepreneurs rely on local knowledge—their own relevant experiences—to envision opportunities for profit. For example, a local Austin resident suggested that an AR headset that could display true north would be helpful to individuals who set up antennae for telecommunications infrastructure. The role of policymakers should be to let these entrepreneurs experiment.

SXSW Roundup: International cooperation in cybersecurity

At the recent South by Southwest festival in Austin, Texas, I participated in a panel discussion hosted by the European Union that explored opportunities for international cooperation on cybersecurity policy.

Joining me on the panel were Chris Painter, coordinator for cyber issues for the U.S. State Department; Rafael Laguna, CEO of Open-Xchange; Michael Farrell, editor of CSM Passcode, and Andrea Glorioso, counselor for digital economy for the EU delegation. You can see the full video embedded below:

Some takeaways I had from the discussion:

  • The internet of things is a complex, global system. There is no silver bullet solution or simple regulatory fix. Instead, industry, governments, consumers and third-party stakeholders will have to work together on a variety of efforts to improve data-security and privacy outcomes.
  • Threat information-sharing efforts, device cybersecurity certification programs, after-market smart products, consumer awareness initiatives and efforts to improve cyber insurance adoption are all pieces of a broad strategy to mitigate cyber risk in the internet of things.
  • Artificial intelligence empowers consumers and firms to detect and mitigate cyber threats. At SXSW, IBM demonstrated its “cognitive security” program, which leverages the machine-learning system Watson to analyze unstructured data in ways that could help businesses identify threats. On the consumer end, smart routers and firewalls can monitor traffic patterns and metadata to detect when your home’s connected devices are compromised.
  • There is a role for government to encourage solutions that foster a more secure internet of things. For example, the U.S. Commerce Department’s National Institute of Standards and Technology’s industry-led voluntary cybersecurity framework creates a common language for government to engage with stakeholders. In a recent green paper, the department outlined its role in promoting an open global environment for internet-of-things development. R Street filed comments supporting a light-touch regulatory approach and advocates for continued engagement with stakeholders on cybersecurity issues domestically and internationally.

For more ideas about addressing IoT cyber vulnerabilities, see our recent paper, “Aligning Cybersecurity Incentives in an Interconnected World,” which examines the role of government in fostering market-based solutions to device insecurity.

R Street’s criminal justice project goes to SXSW


R Street was busy at the recent South by Southwest festival in Austin, Texas, including co-hosting a series of discussions on criminal justice reform with the Texas Public Policy Foundation, along with support from the Charles Koch Institute and the Coalition for Public Safety.

Our first panel, “The New Wave of Justice Innovators,” focused on how emerging technologies can help solve some longstanding criminal justice issues. It was moderated by Jasmine Heiss, director of coalitions and outreach at the Coalition for Public Safety, and also featured Jon Tippens of, Jordan Richardson of the Charles Koch Institute, Lauren Krisai of the Reason Foundation, Rick Lane of Verie and Derek Cohen, deputy director of Right on Crime and the Center for Effective Justice at the Texas Public Policy Foundation.

We also screened filmmaker Ondi Timoner’s documentary “The Last Mile: Inside San Quentin’s Tech Incubator.” After screening the film, we hosted a discussion with Natrina Gandana, program manager at The Last Mile Project, and Tulio Cardozo, technical manager for the Last Mile Project.

Our second panel focused on background-check policies and how these approaches have an adverse effect on the economic prospects of the most vulnerable.

Moderated by Greg Glod, manager of state initiatives for Right on Crime, the panel featured R Street Criminal Justice Policy Director Arthur Rizer; Texas state Sen. Konni Burton, R-Fort Worth; Malcolm Glenn, public policy manager at Uber; Teresa Hodge, co-founder of Mission: Launch Inc.; and Bill Cobb, deputy director of the ACLU’s Campaign for Smart Justice.

We wrapped up the day’s programming with a presentation by Marcus Bullock, the founder and CEO of FlikShop. His firm offers technology to help inmates stay in contact with their families – particularly important given that those connections help inmates with reentry when they are released.

Whither the conference committee?


The following post was co-authored by Adam Chan, a former Institute of Politics summer research assistant at the R Street Institute.

Hong Min Park, Steven S. Smith and Ryan J. Vander Wielen recently presented a monograph, “The Changing Politics of Conciliation: How Institutional Reforms and Partisanship Have Killed the Conference Committee,” detailing the “near evaporation” of conference committees in House-Senate conciliation processes. Given the constitutional necessity of conciliation and its significant impact on policy outcomes, this paper is crucial to understanding recent congressional dysfunction.

The Constitution’s Bicameralism and Presentment Clauses require both houses of Congress to pass identical versions of legislative bills. The Constitution is silent, however, about how Congress is to go about reconciling differences that exist when the chambers pass different versions of a bill. This process, known as conciliation, can be accomplished several ways. For example, one chamber can simply approve a bill that was initially passed by the other chamber, or the chambers can continue to exchange amendments back and forth until both finally pass an identical bill (a process known as “ping-ponging”).

But traditionally—at least for more complex pieces of legislation where the potential differences between the chamber versions of a bill are numerous—Congress has used conference committees made up of delegates from each chamber to hammer out any differences. A conference report is then agreed upon and returned to the chambers to be passed on a simple up-or-down vote. This longstanding conference process, as the monograph’s authors lay out in detail, is now broken. Fewer and fewer conference committees are convened to resolve differences between the houses of Congress, and the authors set out to study why this decline has occurred, as well as its ramifications for future policymaking.

Essentially, the monograph has four components:

  1. A description of conciliation when conference committees predominated, before the 1970s;
  2. The waves of change since the 1970s that caused the decline in conference committees;
  3. Current methods of interchamber conciliation;
  4. The effects these changes have had on policy outcomes.

The traditional use of conference committees, pre-1970

The practice of using conference committees traces its roots to English parliamentary practice as early as the 14th century. The use of conference committees hopped the Atlantic and found a home in numerous colonial legislatures and our country’s first Congress. The practice continued well into the mid-20th century, when the conciliation process was still dominated by conference committees.

As has recently been shown by Jeffrey A. Jenkins and Charles Stewart III, this time period featured a decentralized system of independent and influential committees managing Congress. Conciliation was conducted by House and Senate delegations and controlled by relevant committee chairmen (chosen by seniority), who produced a “conference report” for full chamber votes, with little party influence. Starting in the 1970s, however, this state of affairs began to fall apart.

The decline of conference committees: two waves of institutional change

Changes in conciliation since the 1970s occurred amid rapidly increasing polarization, which led to greater differences in potential policy outcomes and more intense party competition. This, in turn, rendered congressional control increasingly uncertain. As a result, party leaders became more attuned to—and involved in—the process by which legislative differences between chambers would be resolved. A series of changes that sought to de-emphasize and reduce the power of congressional committees had the effect of reducing the role of conference committees.

As the authors note, the conciliation process was fundamentally altered by two waves of institutional change. The first, initiated by Democrats in the 1970s, came about as a result of the effort to “bring committees and conferences in line with the preferences of most majority party Democrats by expanding conference delegations.” Specifically, because of longstanding seniority rules, conservative Southern Democrats at the time disproportionately dominated committee chairmanships, allowing them to water down or stymie the civil rights agenda of the more activist wing of the Democratic Party. Seeking to reduce this power, the Democratic-controlled legislature at the time passed a series of measures that eliminated the seniority monopoly on chairmanships, asserted greater party control over committees and conferences, and gave rank-and-file members increased oversight over conference committee reports.

The second wave, “initiated by House Republicans after the 1994 elections, was about partisan efficiency and control.” This wave centralized power in party leadership, rather than committees; cut member and committee staff (while increasing leadership staff); and increased the Senate majority leader’s power over the amendment process. Among other effects, these changes increased party leadership’s influence and control over the conciliation and conference process, since conferees were increasingly expected to “toe the party line.” This more rigidly hierarchal structure eventually decreased the use of conference committees altogether in favor of other methods of conciliation.

Current preferred methods of conciliation

The aforementioned institutional changes in committee power had direct ramifications for the conciliation process. Instead of conference committees, party leadership often chooses to engage in high-level, closed-door negotiations to resolve interchamber differences. This ad hoc, secretive process has only gained in popularity in recent years.

Because of their obvious partisan implications, fiscal issues were the first to be consumed by party leadership. In more recent times, party leadership has taken the lead in conciliations involving even supposedly nonpartisan issues, like defense authorization and farm bills. As the authors point out, the apotheosis of this trend away from conference committees to more ad hoc methods for resolving differences between House and Senate versions of bills was seen in the initial passage of the Affordable Care Act. In order to both avoid a Republican filibuster and to reconcile the differences between the House and Senate versions of the bill, Democratic leadership used a series of complex legislative maneuvers to gain passage.

Perhaps unsurprisingly, the move toward greater party control of conciliation has been bipartisan: “[B]y the time the Republicans had assumed majorities in both chambers following the 2014 elections, the parties had taken over post-passage action from committees on most major legislation,” according to the authors.

How changes in conciliation have affected policy outcomes      

These changes to conciliation procedure have affected policy outcomes in numerous important ways. First, because conference committees have at least some minority input, they typically result in outcomes that are closer to the median congressman than the more overtly partisan outcomes of leadership negotiation. Thus, the move toward leadership-dominated conciliation has in turn led to more sharply partisan legislation.

Second, these changes shifted “primary responsibility [for conciliation] from legislators with policy expertise to legislators with political expertise,” which can work to reduce general policy expertise among members of Congress. By and large, party leadership is predominantly focused on electoral outcomes, rather than on the minutiae of particular policy issues. This can eventually create perverse incentives for rank-and-file legislators as, over time, “the focus on party-based policymaking and a lack of reliance on committees to write legislation may reduce the incentive for legislators to develop genuine policy expertise.”

Finally, these trends have implications for legislative transparency, as well. In contrast to the traditional conference committee process—which was public and included joint explanatory statements detailing the key results of the conference negotiations—party leadership now mostly relies on ad hoc closed-door bargaining sessions to reconcile differences in legislation. This reduces transparency and provides the public with less insight and guidance regarding agreed-upon compromises.

In their extensive monograph, Park, Smith and Vander Wielen provide important context and history concerning the evolution (and decline) of conference committees. Their analysis is a welcome addition in the effort to understand current congressional dysfunction and its potential impact on policymaking.

Image by holbox

Cameron Smith talks prison reform on Birmingham Talk 99.5

Filling in as host on the Andrea Lindenberg Show, R Street State Programs Director Cameron Smith discussed Rachel Maddow’s “big reveal” on Donald Trump’s taxes, as well as some of the prison reform topics he raised in this column. Full audio of the show is embedded below.

Florida lawmakers weigh streamlining short-term-rental rules statewide


New legislation introduced in the Florida Legislature would establish a framework that does away with the hodgepodge of regulations governing vacation rentals across the state.

The most publicized aspect of the bills is the impact they promise to have on short-term-rental companies like Airbnb and HomeAway, which have faced inconsistency and, at times, outright hostility in their attempts to operate in the Sunshine State. Consider the data compiled in R Street’s Roomscore report, which graded the top 59 American cities on how friendly their laws are toward short-term rentals. While Orlando and Fort Lauderdale received mostly passable grades (B and B-, respectively), Miami received a D+ and Jacksonville received an F.

Increasingly, the policy battle over short-term rentals has mirrored that faced by ridesharing companies like Uber and Lyft, whose rapid rise in popularity saw an unprepared cab industry seek to stifle their new competitors through a glut of regulatory attacks, often dishonestly couched in a supposed concern for public safety. The concerns voiced by representatives of the hotel industry often ring similarly hollow.

In a Miami Herald piece outlining the protracted fight by the hospitality industry to increase regulatory burdens on short-term-rental companies, Wendy Kallergis, CEO of the Greater Miami & the Beaches Hotel Association, said:

We want to make sure the guests are entitled to the same safeguards as our hotel guests and that the properties are registered as a business, fully insured, regulated to basic health, safety and cleanliness guidelines, ADA guidelines, and that they are appropriately zoned and that all their taxes and fees are paid in full.

Though it would, perhaps, be impractical to suggest that the hotel industry simply come out and say, “we do not particularly like the fact that a more modern competitor has become very popular and is potentially eating into our profits,” it nonetheless stretches the bounds of credibility to believe that the hotel industry’s motives are quite so pure, that they simply want anyone who might ever decide to spend the night away from home, whether in a hotel or otherwise, to do so with a guaranteed baseline of luxury!

Other complaints from the hotel industry have focused on a purported affordable housing crisis, as Airbnb continues to expand; suggestions that allowing spacesharing in private residences will lead to neighborhoods overrun with deviant behavior and noise complaints; and charges that Airbnb is simply being used as a loophole by commercial rental companies, rather than individuals.

However, data released by Airbnb regarding its experience in San Francisco—an ideal market for testing claims about housing scarcity—found that 0.09 percent of rentals in the city were booked frequently enough to compete with a long-term rental, while from 2005 to 2013, “the number of vacant units in San Francisco has remained essentially unchanged.” Concerns about an Airbnb-created housing crisis appear to be unsubstantiated.

And municipal solutions to rowdy behavior and noise complaints already exist and are best dealt with broadly, through the same avenues one would use to deal with a noisy neighbor in a long-term rental, or a noisy group at a hotel.

Finally, the claims about Airbnb being a covert outlet for commercial undertakings are not borne out by publicly available Airbnb data. According to data from the aforementioned Miami Herald piece, only 1 percent of Airbnb listings in Miami were booked for more than 300 days out of the year. Meanwhile, the average host made $6,400 per year, averaging 42 booked nights in 2015. Hardly the numbers one would expect of slumlord tycoons.

However obvious the hotel industry’s true root motivation may be, even their concerns about potential lost profits may be overblown. R Street’s Andrew Moylan, author of Roomscore, notes in the study:

The American Hotel and Lodging Association reported that revenue grew from $163 billion in 2014 to $176 billion in 2015. A Morgan Stanley equity analyst report projected increases in hotel-occupancy rates from an already strong 65 percent in 2014 to more than 69 percent in 2017. The number of hotels and number of rooms both expanded, as well.

He concludes, “for all the signs pointing to short-term rentals taking a growing share of the lodging pie, there’s substantial evidence that they simultaneously are serving to expand the size of that pie.”

AirDNA, an online tool that tracks Airbnb data, currently lists nearly 66,000 active listings in Florida. Last year, data from the state indicated that more than 750,000 guests had used the service to stay in Florida, a growth of 149 percent over the previous year.

The Legislature has before it an opportunity to score a victory for the state – a victory for consumer choice, for the private property rights of Floridians sharing their homes to supplement their own income and for the industries benefited by the tourism surge affordable rental options help enable.

Image by Fotoluminate LLC

Build infrastructure like a perennial football contender


The American Society of Civil Engineers’ latest report card gave U.S. infrastructure a grade of D+. The report considers that rating “at risk” – just one step above failing and unfit for purpose.

But readers shouldn’t see this poor grade as a justification for hasty public spending. It’s time for thoughtful evaluation of existing infrastructure programs and reforms that make infrastructure investments cost-efficient, while respecting the consequences of increased public spending in a constrained fiscal environment.

Shiny, modern public infrastructure makes for great optics but often lousy investment returns. When considering infrastructure spending, it’s important to look at things from an economic as well as engineering perspective. Engineers evaluate infrastructure by functional performance, whereas economists focus on maximizing the returns of scarce resources. This can result in different conclusions. For example, an economist may be content with an engineering infrastructure rating of C if it’s not worth the cost to upgrade to a B. If B is the desired policy objective, then they’d stress finding the most cost-effective means to get there, rather than just throwing money at the problem.

Football management offers a timely analogy, given the start of NFL free agency and the forthcoming draft. Armchair general managers often “know” that their team must make the splashiest signings to fill positions of need. Then there are the prudent GMs (who are much better paid!) who seek to maximize returns with finite resources. Commonly, this translates into the pursuit of C- and B-caliber players for contracts that cost a fraction of those extended to A players. That’s why GMs who sign quality players for reasonable contracts (bargain deals), build robust rosters through the draft (less expensive personnel) and draft the best player available (versus sacrificing quality to meet a short-term need) have sustained success.

As with athletes, proponents of new public infrastructure investment often use age as a proxy for physical condition. Yet a fit 32-year-old athlete may perform better than an injury-riddled 27-year-old. The same applies in the electric industry. For example, coal and nuclear plants built a half-century ago commonly had an engineering life of 40 years. As they reached 40 years, cost-effective improvements to extend their life were often more economical than investing in new power plants. On the flip side, many power plants recently became unprofitable after natural-gas prices plummeted. Bailout proponents have claimed shutting these units down before their engineering life is through would be premature. Economics tells us there’s nothing premature about closing an unprofitable facility when a less expensive and/or more profitable one can take its place.

The age argument has also led to claims that our electric transmission and distribution infrastructure is crumbling. Such statements are often exaggerated (e.g., age overstates performance risk) or misapplied (i.e., failing to note the effectiveness of processes to replace or repair existing infrastructure). The ASCE report highlights reliability concerns from aging T&D lines built in the 1950s and 1960s, given their 50-year life expectancies. But a look at reliability metrics themselves tells a different story.

Independent studies generally do not find widespread T&D reliability concerns that existing processes can’t handle. Most indicators developed by the North American Electric Reliability Corp. actually show an improving reliability trend of the domestic bulk high-voltage power system. A 2016 study highlighted that, despite aging low-voltage electric distribution infrastructure, existing investments to modernize infrastructure have contributed to a likely decrease in the number and duration of power outages since the 2000s.

A 2015 study found that the frequency of power outages has remained unchanged in recent years, but that the total number of minutes customers are without power has increased. Drilling down, the culprit is major power-loss events resulting from severe weather. Importantly, the study did not find a link between reliability and increased expenditures on transmission and distribution, but rather highlighted differences in the effectiveness of utility maintenance policies.

The ASCE report cites growing T&D congestion (lines carrying electric current at their full capacity) as cause for concern. An economic view is that we’re utilizing T&D infrastructure more efficiently. Unused infrastructure is a wasted expense, though it should be granted that excessive congestion can cause reliability problems. Since the early 2000s, major advances in “organized” electricity market structure and operation have enabled far more efficient use of existing infrastructure to manage transmission congestion.

At the same time, new economic paradigms are under consideration for improved distribution management. These market-based approaches bolster reliability, avoid the need for some infrastructure investment and signal efficient infrastructure investment when needed. Still, there’s plenty of room for improvement in T&D planning, especially on joint planning with generation infrastructure. The main value is to lower costs. Reliability processes generally are already robust.

The takeaway from all this: we don’t need to throw public money haphazardly at energy infrastructure. Rather, we need to pinpoint areas of need and to reform any flaws in existing processes to encourage cost-effective private investment.

The ASCE’s report offers insight into areas to expedite and lower the cost of infrastructure-planning processes. These include streamlining permitting processes for new transmission lines and natural-gas pipelines. That’s a worthy pursuit, as is encouraging the Federal Energy Regulatory Commission and states to further pursue competitive models for T&D planning.

Given the White House’s expressed desire for a massive federal infrastructure bill, it’s especially critical that policymakers eye cost-effective investments and maintain fiscal discipline. Laying out the benefits of improved energy infrastructure (e.g., avoided outages) alone should not drive policy action (a common engineering perspective). Policy decisions must weigh benefits alongside costs. Costs for expanded federal outlays take on new meaning as we approach $20 trillion in national debt. Plus, the case for fiscal stimulus is especially weak with the economy on relatively solid footing. Congress must carefully weigh digging a deeper debt hole to fill some potholes.

Facilitating efficient infrastructure investments largely comes down to aligning private investment incentives with the public interest. ASCE’s recommendation to use performance-based regulations for pipeline safety is consistent with this. Policymakers should expand such performance-based constructs to electric-distribution utilities, rather than enacting strict equipment-design standards, which seldom weigh costs and benefits effectively. At the same time, policymakers must carefully parse the report’s conclusion that a lack of federal energy policy has caused a lag in energy investment.

America’s infrastructure team is quite strong already, but we could use some roster upgrades. For semantics, let’s just say that to make America’s team (not the Cowboys) great again, we need to do the most with scarce resources. That lends support for cutting red tape, not spending money we don’t have.

Image by Debby Wong

Congress needs to take back its war powers in the fight against ISIS


“We’re not considering any boots-on-the-ground approach,” then-President Barack Obama said during an Aug. 30, 2013 news conference about the situation in Syria. The former president would repeat his promise not to deploy “boots on the ground” over the next few years. But by 2016, U.S. ground forces were operating in Syria as part of the war against Islamic State.

President Donald Trump now plans to expand the war against ISIS. U.S. Marines have deployed alongside Syrian rebels as they plan an assault on the ISIS capital of Raqqa and expect to provide artillery support for the upcoming offensive. Before U.S. Marines engage in ground combat against the enemy, it’s time for a debate about the mission against ISIS. When Obama first ordered U.S. military action against ISIS in the summer of 2014, he did so without congressional approval.

The U.S. Constitution gives Congress alone the power to declare war, although the reality has always been more complicated than that. Under the War Powers Resolution that was passed in the aftermath of the Vietnam War, the president is required to report to Congress whenever U.S. military forces are sent into combat and withdraw them within 60 days unless Congress expressly authorizes the use of force.

Congress has never authorized specific military action against ISIS in Iraq, Syria or any country. The Obama administration—and, presumably, the Trump administration, as well—claimed they were authorized to fight ISIS under the resolution passed after Sept. 11 that allowed the U.S. military to fight the perpetrators of the attack. While ISIS is an offshoot of al-Qaida, it is a stretch, to say the least, to claim that the group was behind the Sept. 11 attacks.

Congress needs to take back its war powers. The last time the United States declared war was after Pearl Harbor. Since then, military force has been used both with and without congressional approval. For the 2011 air war against Libya, Obama did not even bother to consult Congress or seek its approval. Now is the time for Congress to put its foot down and stand up for its own prerogatives.

Congress should threaten to defund all military operations against ISIS unless it specifically authorizes the war. It should force the Trump administration to justify both the war against ISIS and the commitment of U.S. ground troops to Iraq and Syria. That would force lawmakers and the American public to debate, and think long and hard, before deploying the military to yet another war in the Middle East.

The war against ISIS bears many resemblances to the Vietnam War, which spurred the first attempts to rein in presidential war powers. Both wars have seen “mission creep” – the gradual ratcheting up of U.S. military presence absent a clear objective. The Vietnam War started as a U.S. train-and-equip mission for South Vietnam, just as the war against ISIS started out as an air-bombardment campaign in Iraq.

Involving Congress in the decision to go to war not only forces the executive branch to justify the war, but also to detail what kind of resources will be used to prosecute the mission. Involving Congress also could be a way to unite the country behind a war, provided the public determines the war is just. To his credit, Obama essentially did this in 2013, as his promise about “boots on the ground” aligned with the American people’s conclusion that a war in Syria was not in America’s best interests.

Of course, getting congressional approval does not always mean a war will be easy or quickly concluded. The Iraq War was, of course, authorized by Congress. Before American forces get bogged down in further quagmires in Iraq and Syria, Congress needs to ask some tough questions about the mission. What are we trying to achieve in the fight against ISIS? How is the United States going to achieve those goals? What kind of force is needed to achieve those goals? Is there a better option?

Once Congress gets the answers to those questions, it can and should serve as the deliberative body charged with whether to authorize war against ISIS. Alternatively, it could determine it’s time to pull the plug.

Image by BPTU

Promoting transparency and stakeholder engagement in an era of complex government


It is a well-known tenet of democracy that citizens must have access to information about the government’s activities, as well as the means by which to interact with the government to spur policy changes. Unfortunately, the increasing size and complexity of modern government has made it ever-more difficult for the public to be aware of—and engage with—policymaking that emanates from the federal government.

As part of a recent series of papers compiled by the Congressional Research Service, Clinton T. Brass and Wendy Ginsburg focus on how Congress has evolved over time to promote the principles of transparency and “stakeholder engagement” via legislative reforms.

In particular, they discuss how Congress has passed numerous laws over the years that “embed values of transparency, participation and representation into agency activities.” These laws help ensure the public is aware of important laws and regulations, and give nongovernmental stakeholders the ability to participate in the policymaking process.

The authors start by discussing the Budget and Accounting Act (1921) and the Federal Register Act (1935), two early efforts to increase government transparency. The Budget Act created a more formal budget process, mandated executive branch reporting requirements, and established the watchdog General Accounting Office, which eventually became the Government Accountability Office.

The Federal Register Act paved the way for the Code of Federal Regulations, the official periodical in which federal rules are memorialized and recorded. The code’s genesis is especially interesting, as it arose in response to an embarrassing Supreme Court incident during the New Deal era in which the government had to admit to the court that it was seeking to enforce a law that didn’t exist (since an improper version of the regulation had been submitted to the printer).

Each of these early laws served a dual purpose: they gave citizens and stakeholders the ability to track the activities of federal agencies, while also giving Congress an enhanced ability to oversee agency activities and hold agencies accountable. In other words, they equipped those who were outside federal agencies with greater information about executive branch activity.

Perhaps the most significant effort Congress made to standardize and democratize the rulemaking process of agencies was the Administrative Procedure Act, passed in 1946. As the authors recap, the point of the APA was to:

  1. Require agencies to keep the public informed and up-to-date on agency activities;
  2. Provide for public participation in the rulemaking process;
  3. Prescribe uniform standards for rulemaking and adjudicatory proceedings; and
  4. Restate the standards for judicially reviewing agency actions.

The APA gave outside stakeholders the tools they needed to inform themselves about government policies and allowed them to communicate directly with the government about those policies. Promoting this type of “stakeholder engagement” was also the rationale behind other congressional legislation which sought to increase transparency and public participation in agency activities, including laws like the Freedom of Information Act, the Federal Advisory Committee Act, and the Government in the Sunshine Act.

In more modern times, Congress has taken advantage of new technologies like the internet to promote these goals. For example, the Government Performance and Results Act of 1993 required agencies to articulate mission statements and create multiyear strategic plans and retrospective annual reports. The GPRA Modernization Act of 2010, which updated the 1993 act, required the Office of Management and Budget to create a public website that contains metrics and information on agency performance.

Although these legislative reforms were well-intended and broadly effective, the authors also note that increased transparency and stakeholder engagement come with costs. The advent of new technologies for disseminating information, coupled with increased opportunities for public involvement in rulemaking, has left Congress swamped with information and stakeholder demands. This overload is particularly concerning, given Congress’ recent habit of cutting legislative branch staff and resources. The authors also point out that diverting resources within an agency toward promoting greater transparency can undermine other important agency priorities.

In a similar vein, increased transparency and stakeholder engagement could alter how information is used and controlled. The authors use the Obama administration’s Open Government Directive, which required federal agencies to release certain datasets to the public, as an example. While the release of datasets can improve data quality through tools like “crowdsourcing,” it can lead to outside groups (intentionally or unintentionally) manipulating datasets and/or presenting skewed interpretations of data. It can also once again add to information overload that actually makes it more difficult for the public and Congress to get a clear picture of agency policymaking.

As the authors put it, Congress has made much progress over time in enhancing transparency and stakeholder engagement “in a way that increases the intensity with which agencies interact with non-federal stakeholders.” This trend has been accelerated by changes in technology and has helped address the information asymmetries between federal agencies and outsiders.

But Congress also needs to look long and hard at itself, and consider ways to adapt to this new context where the executive branch is immense, information is plentiful and pluralist demands are intense.

Image by holbox

Texas bill would end wine protectionism


For her last birthday, my wife got a gift she couldn’t use. Literally. A thoughtful family member had given her a gift card for an online wine retailer. But when she went to redeem the card, she found she was barred from using it because of a Texas law restricting interstate wine sales.

She was, of course, pretty upset by the injustice of all this, as was I. Sadly, it’s just one of many examples of Texas alcohol regulation being used for an anti-competitive purpose.

I hope she hasn’t thrown away the card, though, as things may be about to change:

Last week, in a move that took wine industry observers by surprise, Texas state lawmaker Matt Rinaldi, a Republican from Dallas County, filed a bill that would lift a long-standing ban prohibiting out-of-state retailers from shipping wines to consumers here.

Currently, it is illegal for an out-of-state wine shop to sell and ship wines to Texans. In other words, if you live in Texas, you cannot call or email a wine shop in New York or San Francisco and ask the merchant to sell and ship you its products. If Rinaldi’s bill were to be approved by the Texas Legislature, it would mark a historic break from a restrictive policy that regulates how Texans buy their wines.

The bill is expected to face stiff opposition by the Texas beer, wine and spirits wholesaler and retailer lobbies. As wine industry blogger and wine trade veteran Tom Wark wrote on his site last week, it is ‘the kind of legislation that Texas wholesalers and most Texas alcohol beverage retailers will oppose with their last dying breath.’

The current law is protectionism at its worst. I can only wish Rep. Rinaldi and the Texas Legislature Godspeed in passing this legislation.

Image by Africa Studio

What role should Congress play in regulation?


Historically, Congress has delegated great authority to the executive branch when it came to regulatory matters. For the most part, the executive branch has had a free hand, and when regulators exceed the law, effective pushback frequently has come via the judicial branch.

Lately, however, Congress has begun reasserting itself in regulatory decisionmaking by using the Congressional Review Act to curb new regulations. But the executive branch struck back: President Donald J. Trump recently mandated a regulatory budget, the workings of which will be decided by his Office of Management and Budget.

In light of these developments, what role should Congress play in regulatory policy? Does it have the capacity to play a meaningful role? What tools does it have and need?

The Legislative Capacity Working Group hosted a recent discussion on these questions, featuring R Street’s Jarrett Dieterle and Kevin Kosar, along with Philip Wallach of the Brookings Institution. Video of the panel discussion is embedded below.

California DMV takes a ‘self-driving’ news dump


Friday news dumps are a long and proud tradition, and one in which the California Department of Motor Vehicles took part today by releasing its long-awaited “Proposed Driverless Testing and Deployment Regulations.” The filing, made with the California Office of Administrative Law, signals the first official action taken by the department to codify regulations that are now two years behind their legislatively mandated schedule.

Comments on the proposed regulations are due April 24, with a public hearing scheduled for April 25 in Sacramento.

Disappointingly, the proposed rules would still require automakers—as part of their application to operate in California—to file a “safety assessment letter” with the National Highway Traffic Safety Administration before their vehicles could be deployed on California roads.

This shows there’s still confusion about the nature of the Federal Automated Vehicle Policy (FAVP), which was crafted as a voluntary guidance document. States requiring automakers to certify compliance with the FAVP’s 15-point safety checklist is something that was never contemplated during the policy’s creation. Neither state nor federal law should force automakers to comply with standards that haven’t gone through notice-and-comment rulemaking.

While California remains confused on the matter, NHTSA and Transportation Secretary Elaine Chao should underscore the FAVP’s voluntary nature so that other states do not make similar mistakes.

Some other notable aspects of the proposed regulations include:

  • The proposed regulations include a provision to allow autonomous vehicles to be operated without drivers. This provision is particularly notable in light of the DMV’s original inclination to disallow testing vehicles that lack driver-input mechanisms, like pedals and a steering wheel.
  • The definition of “autonomous mode” has been modified to encompass systems like those operated by Uber – a provision likely spurred by the showdown earlier this year.
  • The regulations’ privacy-sharing provisions have been narrowed and completely reworked, in what is likely a nod to federal supremacy in that area.
  • A requirement that local law enforcement be notified within 24 hours of an accident involving an autonomous vehicle has been removed.
  • Testing permits are now valid for two years, instead of one.
  • The problematic disengagement-reporting requirement also was maintained.

Analysis of these, and all of the other provisions, will appear on R Street in the days to come.

California legislators consider self-driving vehicles and the need for Prop 103 reform


Proposition 103—California’s restrictive regulatory regime for insurance—may need a few tweaks as fewer and fewer cars on the road have human drivers. During a March 8 informational hearing, members of the California Senate Insurance Committee heard from a panel of experts about the future of self-driving technologies and their likely impact on California’s insurance market.

Since Prop 103 passed in 1988, Californians’ auto-insurance rates have been dictated by a hierarchy of rating factors tied directly to drivers’ experience. These so-called “mandatory” factors include the driver’s safety record, their annual mileage driven and their years of driving experience. While other rating factors exist, they cannot be weighted more heavily than the three mandatory factors.

The problem with this state of affairs was immediately apparent to the committee members hearing testimony. As vehicles begin to drive themselves, and human interaction with the driving process declines, the Prop 103 formula for insurance rates increasingly will become divorced from reality.

That’s going to be a problem for drivers. Prop 103—in what is characterized by its author and principal beneficiary, Harvey Rosenfield, as an effort to “protect consumers”—was designed to ensure that it is exceedingly difficult to change rates. Subsequent regulatory developments have had a further chilling effect on insurers’ willingness to even file for rate changes.

As a result, even though self-driving vehicles will, in all likelihood, be safer than today’s human-driven vehicles, the law’s predictable effects will be to “protect” Californians from those lower insurance rates. That is an odd approach to consumer protection.

Rosenfield, who testified at the hearing with camera crew and orange caution cone in tow, was adamant that, so long as drivers face any liability at all, Prop 103 will still be necessary. This line of reasoning surprised state Sen. Tony Mendoza, D-Artesia. The committee’s new chairman responded to Rosenfield’s testimony by noting astutely that “strict adherence to Prop 103 does not fit” with self-driving vehicles.

While Rosenfield was unmoved by reason, testimony offered by the California Department of Insurance struck a more moderate tone. Deputy Insurance Commissioner Chris Schultz told the committee that the department believes Prop 103 works, for now, and that no immediate changes are needed. The basis for his contention was that, if a manufacturer were to release an autonomous vehicle today, the industry would have the ability to insure it using the Prop 103 structure.

Toward that end, the department posited that the state’s experience overseeing the products developed to cover transportation network company vehicles could prove instructive. Under that framework, policies have different “periods” that relate to qualitatively different activities, each with different coverages and coverage limits. In the context of self-driving vehicles—in which drivers will be switching between automated modes and human-piloted modes—the appeal of activity-specific coverage is clear.

Fortuitously, a study related to the way the TNC system has worked in California will be forthcoming this summer. It will be interesting to see if the department will take a leadership role in applying its findings to self-driving vehicles, given that a period-specific approach still likely would require either tweaks to Prop 103, or a creative application of its class plan or affinity group provisions.

At the hearing’s end, it was apparent that there are widely divergent perspectives on the impact that self-driving vehicles will have upon arrival, but there was a near-consensus that Prop 103 is not ideally suited to the reality posed by the technology. Whether characterized as “changes” or “tweaks,” a departure of some kind from Prop 103’s status quo now seems inevitable.

Though Mendoza is new to the issue, he offered what might be the most pressing question of all: “do we (need to) look at a product outside of Prop 103?” Without a doubt, the answer to that question is yes. It’s time to look both outside—and beyond—Prop 103.

Image by PP77LSK

The problem of legislating from inside a silo


In a recent example of how the best intentions often lead to the worst policy, Massachusetts legislators are considering a bill that would tax autonomous vehicles based on the number of miles they drive. There are any number of problems with the approach, not least that singling out autonomous vehicles for a usage-based tax would slow their adoption in Massachusetts and could have a chilling effect on their development elsewhere.

In coming up with this flawed proposal, state Rep. Tricia Farley-Bouvier, D-Pittsfield, and state Sen. Jason Lewis, D-Winchester, shouldn’t be blamed too harshly. Autonomous vehicles represent a paradigm shift that will require bold new policies. Both legislators are striving to think outside of the box. However, their bill speaks to a larger problem – a failure to communicate.

In the current political environment, the inability to cultivate honest dialogue about important issues is a significant barrier to developing needed public policies. Well-meaning people of all political stripes too often are working in ideological silos. Encouragingly, the development and regulation of autonomous vehicles may prove a unique point of bipartisan interest and exchange.

Toward that end, the Center for the Study of the Presidency and Congress—a nonprofit dedicated to serving as an honest broker between public-sector leaders, industry and the policy community—hosted a series of off-the-record roundtables in Washington, San Francisco and Seattle on the topic of autonomous vehicles. The roundtables brought together experts from various policy areas, who lent their time and insights to identify key themes and areas of concern that surround the development, deployment and regulation of self-driving cars.

CSPC now has issued a new report that stems from those roundtable discussions, and it could serve as a valuable resource for policymakers at both the state and federal levels. In fact, had the report been available to Farley-Bouvier and Lewis, they might have learned that voices from across the political spectrum agree that autonomous vehicles must not be disadvantaged compared to traditionally operated vehicles, because doing so will stifle and slow their adoption.

Another essential takeaway from the roundtable report is that policymakers and regulators must be careful not to discriminate among autonomous-vehicle developers based on their prior experience as vehicle manufacturers. To ensure the public is willing to adopt autonomous technology, it’s vital that the technology be safe. But the way to ensure the technology is safe is to see that it undergoes rigorous real-world testing. Preventing firms from testing their technology, simply because their legacy business is not focused on vehicle manufacturing, has no demonstrable safety benefit. Over the long term, it will hamper the competition that would otherwise lead to the best technologies.

The report also recommends that the National Highway Traffic Safety Administration work to avoid a patchwork of standards. Sensible distinctions between state and federal authority will help state lawmakers better understand where they can play a constructive role. One way to accomplish that goal would be for NHTSA to affirm its Federal Automated Vehicle Policy and repudiate the confusing State Model Policy.

Working from a point of consensus, like the one represented in the CSPC’s report, is an antidote to the kind of legislation introduced in Massachusetts. As new challenges arise, the vital importance of dialogue will only grow.

Image by sarahjlewis

Dieterle talks distilling restrictions on ‘Free to Brew’ podcast

R Street Governance Project Fellow Jarrett Dieterle joined the Free to Brew podcast to discuss regulatory issues around alcohol distillers, particularly in Virginia, which has some of the most onerous restrictions in the country. The full audio is embedded below:

You can check out more Free to Brew podcasts HERE.

Hobson talks tech on ‘Mike Check’ show

R Street Tech Policy Fellow Anne Hobson appeared last week as a guest on the popular “Mike Check” show on KVOI-FM in Tucson, Arizona. Along with hosts Mike Shaw and Ray Alan, she discussed the benefits—and dangers—of emerging technologies like virtual reality, artificial intelligence and the so-called internet of things.

TV shows like Netflix’s “Black Mirror” and movies like “The Terminator” do a good job imagining the pitfalls of future technology run amok. But these on-screen depictions, interesting though they may be, aren’t reflective of the many beneficial applications advanced technology gives us today. For instance, internet-of-things devices are making daily tasks easier and increasing productivity. The owner of a “smart lightbulb” no longer has to get out of bed to turn off the lights, while a baker can simply ask Amazon Echo how many teaspoons are in a tablespoon without having to burn the cookies.

Of course, all this new technology has its downsides. The scale, scope and interconnected nature of new devices create unique cybersecurity challenges. There are more than 17 billion connected devices of all sorts, from internet-enabled toasters to smart TVs to EZ-Pass devices with RFID chips. Put another way, there are now 17 billion points of vulnerability to protect.

Their interconnectivity also means one person’s compromised toaster can affect another person’s ability to access their email. Malware can infect devices and conscript them into a botnet “zombie army” ordered to barrage a target with traffic and render websites or web services inaccessible. Botnets aren’t the only cybersecurity problem. A majority of data breaches are caused by human error — people sending information to the wrong party, clicking on a malicious link in a phishing email or using default or simple passwords.

While the cybersecurity challenges to connected devices are significant, market-based solutions such as certification programs, threat information sharing efforts and aftermarket cybersecurity products are developing to address them.

Audio of the full show is embedded below.

Nitzkin talks THR on America Tonight

R Street Senior Fellow Joel Nitzkin joined host Kate Delaney recently to discuss tobacco harm reduction on her America Tonight radio show, which is heard nationally on more than 220 AM and FM stations. Full audio of the show is embedded below.

At NCOIL, state lawmakers look to claw back power from NAIC


The newly assertive leadership of the National Conference of Insurance Legislators appears eager to confront what it views as an ongoing usurpation of authority from state legislatures.

Thomas B. Considine—now NCOIL’s chief executive, but previously commissioner of the New Jersey Department of Banking and Insurance—chose the organization’s spring meeting in New Orleans as the venue to raise public concerns about states becoming subject to the authority of the National Association of Insurance Commissioners, a private trade association composed of the nation’s insurance regulators.

Appearing at Considine’s invitation, Rutgers Law School professor Robert F. Williams—an expert on state constitutional law—detailed the process by which NAIC decisions are transmuted into state law. While the NAIC serves nominally as a private venue for insurance regulators to meet and share information and best practices for insurance industry oversight, it also promulgates standards and models for states to adopt.

Because of the group’s status as the pre-eminent body on such matters, states across the country have adopted statutes that incorporate NAIC work product into state law by reference. That is to say, changes to NAIC models and standards, in effect, serve to change state law without the explicit consent of state elected officials.

Delegation of authority between the states and the federal government, among the states themselves and between the various branches of government all have a clear constitutional basis. The circumstances under which lawmaking authority may be delegated to private organizations—Williams told this audience of legislators, regulators, industry members and academics—are considerably narrower.

While there are limited circumstances where public bodies might need to outsource highly technical matters to those with expertise, such questions should never extend to cases where policy judgment is involved. Williams raised concerns about legislative bodies’ propensity to bind themselves to such arrangements prospectively.

Williams’ sentiments were echoed by Neil Alldredge of the National Association of Mutual Insurance Companies, who expressed particular alarm that the NAIC’s recent work on corporate governance standards amounted to legislating matters of substantive policy. NCOIL’s current vice president—state Sen. Jason Rapert, R-Ark.—speculated that many lawmakers are likely unaware of the arrangement with the NAIC.

From the perspective of constitutional law, the problems with incorporation-by-reference statutes are interesting in that they largely are untested in court. No cases have ever been tried involving insurance regulation. A judicial confrontation might be avoided if the NAIC rededicated itself to focusing on nonsubstantive matters.

Alternatively, states could make it a regular practice to readopt the NAIC’s incorporation-by-reference statutes each legislative session, to ensure newly elected lawmakers are reminded of the power that they are ceding to the commissioners’ trade association. The readoption approach likely is preferable. It would eliminate the NAIC’s temptation to oversee substantive matters, while simultaneously allowing the people’s representatives to re-examine their faith in the relationship with the NAIC on a regular basis.

Another problem with the NAIC’s ongoing power to incorporate standards by reference is that, to fund its operations, the NAIC restricts access to both the information it gathers and to participation in its meetings, in a manner inconsistent with the transparency otherwise available in the public lawmaking process. In fact, members of the public face substantial obstacles should they wish to participate in the process by which standards that directly impact them are set.

Ironically, if anything, the current arrangement is a good argument for the need for the Treasury Department’s Federal Insurance Office, an agency whose very existence has actively been questioned by both NCOIL and NAIC (as well as some elements of the industry). FIO could make public the information over which the NAIC currently has an effective monopoly and thereby address the information asymmetry that members of the public currently labor under. Interestingly, though it has in the past been highly skeptical of FIO, the NCOIL body declined, on the final day of its session, to take the position that FIO should be abolished.

Under Considine’s direction, NCOIL is seeking to chart a rapid course to renewed relevance. At the summer meeting in Chicago, the group is expected to consider a model law from state Assemblyman Ken Cooley, D-Calif., that would require state insurance departments to help fund NCOIL, which would put the group at much greater parity with the NAIC.

It’s striking that it should take a former commissioner, a consummate insider, to bring attention to the worst excesses of the NAIC’s quiet empire. In doing so, he may just return power to the people that they didn’t even realize had been taken from them.

Image by Andrey_Kuzmin

Short-term-rental rules on the docket in Indiana


Carmel, Indiana, is a beautiful and completely modern community of more than 85,000 folks, just north of Indianapolis. Its 100 traffic roundabouts are the most of any U.S. city and, last year, one of them was named the most beautiful in the world. Carmel’s charming Arts and Design District hosts the annual Carmel International Arts Festival and is marked by the Museum of Miniature Houses and public sculptures by John Seward Johnson II. The 1,600-seat Palladium concert hall in the Center for Performing Arts is home to the Carmel Symphony Orchestra. In September, the city hosts the Rollfast Gran Fondo cycling tournament.

A lot of people want to be in Carmel and some of them want to stay overnight, both for these attractions and for the major sporting events frequently hosted 13 miles to the south in Indianapolis. The Indianapolis market has about 33,000 hotel rooms, which were sold out in March for the Indianapolis 500 on Memorial Day weekend.

Airbnb, an online short-term rental service, has been a very popular lodging option across the Midwest, and is particularly adept at making rooms available during busy times when hotel rooms sell out. In the Indianapolis area, the service grew by more than 200 percent last year. That includes about 300 spaces available for rent in Carmel.

But building commissioner Jim Blanchard recently sent out notices to city homeowners that they had 10 days to remove themselves from the Airbnb website and similar services or face city code enforcement for illegal activity. While apartment dwellers can continue to list with the services, due to differences in zoning requirements, single-family dwellings cannot. City officials maintain they’ve received complaints from residents distressed about noise, traffic and other issues they associate with short-term rental visitors. But it means Carmel residents who relied on the extra cash from those rentals—the typical Airbnb host rents about 22 nights per year and earns about $3,000—were suddenly left out in the cold.

In response to actions like Carmel’s, House Majority Floor Leader Matt Lehman, R-Berne, is proposing legislation to pre-empt communities from completely banning rentals of less than 30 days. Lehman has gained recognition for his work on insurance issues with state lawmakers from across the country. He helped to craft a compromise approach on the insurance aspects of ridesharing services offered by companies like Uber and Lyft, which has since gone on to become a national model. He sees no reason why Indiana couldn’t similarly become the pace car for a national effort to work out differences between hotels, homeowners, bed-and-breakfast establishments and elected officials passionate about local control.

While his bill would prevent cities and counties from banning spacesharing services outright, it includes a number of sensible limits. If a property is rented out for more than six months, the owner would have to acquire a regular business license and pay merchant innkeeper taxes. It specifies that local regulation of fire, safety, sanitation, pollution, sexually oriented businesses, nuisances, noise, traffic control and the like would continue unimpeded, as long as the regulation and enforcement is applied equitably to all residential housing. The measure also includes minimum insurance requirements, as hosts would have to maintain primary first-dollar liability coverage of at least $1 million.

After two tries, the Lehman bill passed its house of origin this past week, squeaking by after fending off several amendments from legislators who are more sympathetic to local control than emerging technology businesses. Because of the narrow passage in the House, there may need to be more work done in the Senate to address some of the perceived problems. The main issue, as in many other areas in the emerging “sharing” economy, is how to draw the line on how much professional activity a nominal amateur can engage in before he or she is recognized as a business competing with other businesses, and paying the requisite licensing fees and taxes.

The answer may not come easily, but it seems likely that a reasonable compromise can be found on short-term rentals. This issue will likely get a hearing or two in a great many laboratories of democracy across 21st century America.

Image by sevenMaps7

Criminal justice reform takes center stage at CPAC


The right end of the political spectrum historically has favored getting tough on crime and increasing criminal penalties as a solution to rising crime rates. But to be “right on crime” nowadays means something else entirely. The lack of measurable results in straightening out the lives of wrongdoers and the huge expense of incarceration both have produced a new ethic focused on data, common sense and a much better cost-benefit ratio for government corrections policy.

Fiscal questions about how and when incarceration is appropriate have generated significant new interest, as well as experimentation by some states that are trying to find the criminal-justice formula that makes the most sense. Texas, Georgia, Oklahoma, Kentucky and many other states have in recent years changed laws, rationalized systems, diminished recidivism and saved billions. The phrase heard several times from presenters at this year’s Conservative Political Action Conference was: “We want to imprison the people we are afraid of, not the people we are mad at.”

FreedomWorks points out that the U.S. Constitution defined just three federal crimes – piracy, treason and counterfeiting. In 1870, Congress added a baker’s dozen more federal crimes, including murder and manslaughter, larceny and perjury. There are now about 5,000 federal crimes, and according to some estimates, around 400,000 federal regulations that can be enforced criminally. No one is quite sure exactly how many.

For the last few years, concern has been focused on the expanding number of new crimes that executive agency regulatory processes have created. In contrast to the traditional legal doctrine of mens rea, which holds that intent is an element of a crime, there’s no need to prove intent for many of these infractions. Author Harvey Silverglate claims Americans today unknowingly commit an average of three felonies a day.

This year’s CPAC featured a number of both main-stage appearances and expert breakout panels composed of conservatives intent on reformulating aspects of the nation’s criminal-justice system. One of the most interesting issues was highlighted by Stephen Mills, a retired Army military policeman who now serves as chief of police in Lindsay, Oklahoma. Mills also happened to be one of the law-enforcement first responders to the 2009 terrorist attack on Fort Hood.

Mills described how, after retiring from 25 years of active military duty, he became a rancher and hired some help to run his cattle operation. One of his ranch hands was out one day and stole a big roll of copper wire. When the ranch hand was apprehended, Chief Mills’ pickup truck was confiscated as “an instrumentality of a crime.” He couldn’t get it back, because he couldn’t prove he knew nothing about the theft.

This is the process of civil asset forfeiture. Law enforcement considers it a valuable tool in the fight against crime, particularly in drug-related cases. The prototypical case proponents of the practice will usually cite is the traffic stop that uncovers thousands of dollars in cash from drug deals. Unfortunately, there seem to be a lot of cases more like the one Mills faced. Many of these forfeiture laws now are under review – for instance, to require that confiscated items be returned when the owners are never charged with a crime. Several states already have enacted asset-forfeiture reform and many more are currently considering it.

Another reform widely enacted by states in recent years is to prohibit government employers from asking about criminal convictions on job applications. A job is the single most important factor in reintegrating ex-convicts into productive life. Prospective employees likely would ultimately have to explain any convictions before they could be hired by a government agency, but this would cut down on automatic rejections at the application level.

Another panel highlighted stories about political prosecutions that ultimately were overturned, but not before they had ruined careers, families and finances. It sparked a lot of passion, which is understandable when it comes to outcomes that clearly seem unjust. But the most important thing about the new criminal justice reform agenda is how practical and data-driven reforms have proven out when tested against real world challenges.

Why the FCC should wait on privacy rules


Chairman Ajit Pai wants an emergency vote this week by the Federal Communications Commission to halt implementation of new rules on how internet service providers like AT&T, Verizon and Comcast can share the consumer data they collect. The rules—implemented by Pai’s predecessor, Thomas Wheeler—will take effect March 2 unless the FCC votes to stay them.

With internet connectivity as pervasive as it is, consumer-privacy protection has been at the center of policy debates worldwide. Part of this debate requires thinking critically about which government agencies are best suited to take the lead.

The Federal Trade Commission has privacy rules that govern all companies collecting personal information. Indeed, Pai’s objective isn’t to kill privacy rules outright, but to make sure they are built into a “comprehensive and uniform regulatory framework.” It’s on this point—not ISP privacy regulations, per se—where Pai departs from Wheeler.

Wheeler’s FCC aggressively sought to expand its purview. Wheeler successfully reclassified ISPs as common carriers and, while promising forbearance, immediately launched regulatory inquiries not just on privacy rules, but on the operation of cable TV set-top boxes, streaming video and pricing plans that offer unlimited data. That’s a long way from the agency’s original congressional mandate to manage use of the public airwaves and landline telephone service.

When Wheeler pushed the frontiers of FCC mission creep, he was only following a trend. Well before he arrived, the FCC had been trying to expand broadcast decency rules to cable programming and demanding a say in merger approvals, a role historically reserved to the Justice Department and FTC. That’s why Pai’s taking a breath for a regulatory reset is so welcome.

Within the communications industry, an uneven regulatory field can’t help but create arbitrage opportunities, rent-seeking and other market distortions that, in the end, cost consumers value, as well as real dollars. It also creates a situation in which ISPs may be bound by different and contradictory rules. Eventually, the courts would have to sort it out, a process that could take years.

This is why Pai is right. The FCC should defer to the FTC. Right now, the FTC is the right place for policymaking on the use of consumer data. We are all best served when privacy rules come from one place.

Image by Mark Van Scyoc

What the Wall Street Journal gets wrong about farming in 2017


With the Farm Bill up for reauthorization in 2018 and legislative debate poised to heat up later this year, there’s been a lot of talk about the plight of the American farmer. A recent Wall Street Journal piece proclaimed that we’re on the brink of a major national farm bust, as a shrinking global grain market and low prices increasingly will drive small family farms out of business.

Citing research from the U.S. Department of Agriculture, the Journal predicts that farm incomes will drop 9 percent in 2017, “extending the steepest slide since the Great Depression into a fourth year.”

Declining farm incomes have not gone unnoticed on Capitol Hill. The House Agriculture Committee recently held its first hearing of the 115th Congress, titled “Rural Economic Outlook: Setting the Stage for the Next Farm Bill.” Chairman Michael Conaway, R-Texas, said in his opening remarks:

America’s farmers and ranchers are facing very difficult times right now… As we begin consideration of the next Farm Bill, current conditions in farm and ranch country must be front and center.

For taxpayer advocates hoping for meaningful reforms in the next farm bill, this doom and gloom does not bode well. Instead of leading to reforms that help farms struggling to stay afloat, it will likely only result in more of the status quo. That means more taxpayer-funded subsidies flowing to wealthy agribusinesses, while small farms become increasingly obsolete.

It’s true that commodity prices are down and many farmers are struggling. But it’s also true that, relatively speaking, the farm economy is doing pretty well. As the Environmental Working Group points out in a response to the Journal piece, median farm household income is expected to grow in 2017. At $76,735, median farm household income is actually $20,000 more than the median income for all U.S. households. While there is certainly risk in starting a farm operation—as there is with any business in a market-based economy—the annual failure rate for U.S. businesses overall is 14 times greater than the annual failure rate for farms.

The alarmists imply that struggling commodity farmers are being left out to dry, but that couldn’t be further from the truth. Not only are commodity farmers protected by the Agricultural Risk Coverage program, which triggers payments when revenues fall below an anticipated threshold, and the Price Loss Coverage program, which pays out when market-year average prices fall below what’s called the reference price, but they also have the option to purchase government-subsidized crop insurance with lavish coverage options. On average, the government subsidizes 62 percent of farmers’ crop-insurance premiums, regardless of the size of the farm operation.

Farmers also have the option to insure not only their projected yields, but also their revenue. Under the most extravagant federal crop insurance product, the “harvest price option,” farmers can collect based on either the price locked in at planting time or the current market price, whichever is higher. As we’ve said before, it’s the “crop insurance equivalent of your auto insurer surprising you with a new Cadillac Escalade after you’ve totaled your Toyota Corolla.”
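For readers who want the mechanics spelled out, the payout logic amounts to a simple "higher of two prices" rule. The sketch below is purely illustrative—the function name and all figures are invented for this example, not drawn from any actual policy document:

```python
# Hypothetical illustration of the "harvest price option" payout rule:
# the indemnity is computed at whichever is higher, the price locked in
# at planting or the market price at harvest. All figures are invented.

def harvest_price_option(locked_in_price, harvest_price, lost_bushels):
    """Return the indemnity, paid at the higher of the two prices."""
    payout_price = max(locked_in_price, harvest_price)
    return payout_price * lost_bushels

# A farmer locks in $4.00/bushel at planting; the market rises to $4.60
# by harvest, and 10,000 bushels are lost. The indemnity is paid at $4.60.
print(harvest_price_option(4.00, 4.60, 10_000))  # 46000.0
```

Either way prices move, the insured farmer collects at the more favorable number—which is precisely why the coverage is described as lavish.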

The Wall Street Journal correctly notes that the consolidation of large, industrial-scale farm operations has driven many small farms out of business. But what it doesn’t mention is that our crony agriculture policy is likely driving much of that consolidation. An EWG analysis found that the top 10 percent of U.S. farms are getting more than 50 percent of the subsidies, with 26 farm operations receiving subsidies of $1 million or more. Congress should address this cronyism by putting a cap on the amount of premium support a single farm operation can receive and enacting a means test, so that farmers who are making high incomes cannot receive subsidies. This would help to level the playing field for small family farmers and ensure that our taxpayer dollars are not being spent to boost the incomes of mega-farm agribusinesses.

Accounts like the Journal article invoke nostalgia for the hard-working family farmer, but this sympathy will be misplaced if lawmakers do not seize on this opportunity to craft a reformed farm bill that puts the interests of taxpayers and struggling farmers above those of the Big Ag lobby. As we look ahead to the next farm bill, let’s not allow dismal, exaggerated narratives to distract us from the fact that our current farm-support system is not working and badly needs reform.

Image by Lost Mountain Studio

A cyber mandate isn’t the way to address cyber-insurance takeup


To improve cyber preparedness and help companies recover from cyberattacks, it’s essential that the takeup rate for cyber insurance continues to rise. The insurance capacity plainly exists to write virtually all of the risks for which the market currently seeks coverage. What’s missing is demand.

The federal government can play a role to bolster that demand, but prescriptive measures, like a cyber-insurance backstop or a cyber-insurance mandate, would have a negative impact on those efforts. A recently released paper from R Street Technology Policy Fellow Anne Hobson instead recommended that so-called “internet-of-things” vendors and contractors that do business with the federal government be held financially responsible should a cybersecurity failure on their part result in costs to U.S. taxpayers.

Some have proposed that the best way to accomplish this goal would be to require that federal vendors and contractors buy cyber insurance. It’s a proposal that has some attractive features. Insurance often has the benefit of forcing private actors to take account of their practices and reduce risks that otherwise would cause their premiums to rise or render them unable to obtain coverage at all.

But it’s important to remember that insurance exists to transfer risk. Whether it’s appropriate to buy cyber coverage to address directors and officers liability, or to deal with the potential for business interruption, is a decision each firm’s management team must make for itself. The government has no special knowledge of what’s right for every company with which it does business. What it can and should require is that taxpayers be protected from having risks directly or implicitly transferred to them as a result of those private risk-management decisions.

While firms could pursue other mechanisms to address their financial responsibility—from letters of credit to surety bonds to cash—most would find cyber insurance the most efficient means to transfer the risk of cyber-related liabilities. When it comes to risk management, private firms have a number of questions they must ask, including what risks they face, what strategies can be employed to mitigate those risks and whether it is more cost-effective to retain those risks, which will be borne by shareholders and creditors, or to transfer them to third parties like insurers. The key is to make clear that firms know they will be held liable for any risks they create for others—in this case, the government and individuals whose private data are entrusted to the government. There will be no bailout if things go wrong.

If the government simply wants its contractors to undertake cybersecurity measures, it can do that and, to an extent, it already has. It’s not only appropriate, but it’s absolutely essential that federal agencies vet the vendors and contractors with which they do business, and not offer contracts to those who practice poor cybersecurity hygiene. This should include examining vendors’ overall risk-management practices and taking as a positive sign that a given contractor has prepared for contingencies by obtaining insurance.

But as a technical matter, while it’s possible to require and define the scope of financial responsibilities that a government contractor holds to the agency with which it contracts, there is no magic formula to determine what kind and how much insurance every potential government contractor should get. Firms of different sizes, engaged in different kinds of activities, facing different kinds of risks and operating under different contract terms will have vastly different insurance needs.

The larger point here is that before you can enjoy the benefits of cyber insurance, which are many, you first must have a definable need for insurance. There is much to praise about how the underwriting process can act as a kind of cybersecurity audit. But the social benefits of cyber insurance do not, themselves, create the need for a cyber-insurance policy. First, you must have a risk that it actually would be prudent to transfer.


Should we have a long-term budget for entitlements?


Federal law treats Social Security, Medicare, Medicaid and other entitlements as “mandatory” spending programs, which means they are not subject to the annual appropriations process like the rest of the budget. These programs annually take on an ever-greater share of the federal budget — now almost 60 percent. They also are racking up huge unfunded long-term obligations, as the nonpartisan Congressional Budget Office has reported. The Government Accountability Office has chimed in by warning that entitlements are on an unsustainable growth trajectory.

So what to do? Stuart Butler of the Brookings Institution and Maya MacGuineas of the Committee for a Responsible Federal Budget suggest a long-term budget for entitlements. Doing this, they posit, would establish an “orderly pathway for helping to resolve inherent tensions” in current budgeting. Additionally, a long-term entitlement budget would “encourage Congress to make clear choices about long-term spending.” Presently, the auto-pilot growth of entitlements is crowding out spending on other priorities and fueling bruising budget fights.

The authors identify two steps to enact a long-term budget: designing a long-term budget plan and treating the plan as binding going forward (unless Congress takes steps to change it). For the initial design phase, Congress would need to map out a 25-year spending plan for major entitlement programs, as well as a funding plan to cover their costs (presumably by identifying specific taxes or revenue, savings, or by proposing a debt increase). The authors advocate that this long-term budget plan should also include tax expenditures, which, like entitlements, tend to grow inexorably.

CBO would then be tasked with publishing an annual 10-year “moving average” (based on the results of the previous five years and projections five years into the future) that would act as the baseline from which it would be determined if the budget was veering outside what the authors call the “corridors” of the long-term plan.
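The baseline arithmetic the authors describe is simple enough to sketch. The following is an illustrative calculation only—the spending figures are hypothetical, and the 5 percent corridor width is an assumption made for the example, not a number taken from the proposal:

```python
# Illustrative sketch of the 10-year "moving average" baseline described
# above: five years of actual entitlement outlays plus five years of
# projections, checked against the long-term plan's "corridors."
# All dollar figures (in billions) and the corridor width are hypothetical.

def moving_average_baseline(past_five, projected_five):
    """Average of the last five years of actuals and the next five
    years of projections (a 10-year moving average)."""
    window = past_five + projected_five
    assert len(window) == 10
    return sum(window) / len(window)

def within_corridor(baseline, plan_target, tolerance=0.05):
    """Check whether the baseline stays inside the plan's corridor,
    modeled here as +/- 5 percent around the long-term plan's target."""
    lower = plan_target * (1 - tolerance)
    upper = plan_target * (1 + tolerance)
    return lower <= baseline <= upper

# Example: entitlement outlays drifting upward (invented figures).
actuals = [2400, 2450, 2510, 2580, 2660]       # last five years
projections = [2750, 2840, 2930, 3030, 3140]   # next five years
baseline = moving_average_baseline(actuals, projections)

print(round(baseline, 1))                # 2729.0
print(within_corridor(baseline, 2700))   # True: inside the +/-5% corridor
```

Under this kind of rule, the trigger question each year reduces to a single comparison: is the moving-average baseline still inside the corridor, or must the enforcement mechanism kick in?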

Congress would engage in quadrennial reviews of the long-term budget during the year after each presidential election. This formal review could be used to alter the long-term budget going forward, if Congress deemed it desirable. Once finalized, the budget timeline would be extended by another four years to create a new 25-year budget. Critically, revenue and entitlement levels that had been established during prior reviews would only be alterable by an act of Congress signed by the president.

In between these quadrennial reviews, agencies and Congress would have free rein to change revenues and entitlements, so long as the changes did not cause the long-term budget to diverge from the established corridors.

Once a plan is developed, Congress would vote to enact it, thereby creating a framework going forward and teeing up the second element of the authors’ proposal. That element requires the long-term budget plan to be the default budget for entitlement programs, with automatic procedures triggered if the spending plan failed to stay within the agreed-upon corridors.

The authors sensibly note that a long-term budget plan would be unlikely to survive if it required Congress to take proactive action to maintain it. Therefore, automatic enforcement mechanisms are necessary. The authors criticize the idea of using automatic triggers—modeled after mechanisms like the Medicare Sustainable Growth Rate—that would initiate cuts whenever the long-term plan veered outside the set corridors. As they point out, such proposals are politically difficult given their blunt nature of cutting expenditures across the board.

Instead, they propose establishing a commission similar to the Defense Base Realignment Commission (which was established in the late 1980s to identify and close unnecessary military bases) that would act as the default enforcement mechanism to maintain the long-term budget. This would mean that, in the event the long-term budget started to careen outside the guardrails (for example, entitlement spending started to rise more than anticipated), the commission would be able to engage in the necessary course-correction actions to keep the long-term budget within the corridors (for example, by cutting spending or identifying more revenues for funding).

This commission would be jointly selected by Congress and the president, and its recommendations would be final — unless Congress took action to override it. This could be done by a congressional “super committee” that could develop an alternative method and package to maintain the long-term budget’s proper trajectory. The super committee’s alternative would then be considered on an expedited up-or-down vote in Congress and would replace the commission’s plan if approved.

In their paper, the authors address a few other points regarding their proposal, including some counterarguments to the idea. They note, for example, that it might be wise to allow the automatic-enforcement mechanisms (the commission and super-committee process) to be suspended during a certified recession. They also suggest the importance of setting an explicit fiscal objective that the long-term budget would be trying to help achieve, in an effort to discourage overpromising by politicians. Finally, the authors acknowledge that, while one Congress cannot legally bind future Congresses, the legislature is empowered to establish congressional procedures that can work to shape future congressional behavior and politics.

Congress has struggled mightily in recent years to carry out the fantastically complex annual budget and appropriations processes. A new process is needed, one that reduces the steps that must be completed each year, narrows the realm of conflict and soberly confronts fiscal reality. As such, Butler and MacGuineas’ proposal is a welcome effort to tackle a problem that has gone unaddressed for too long.

Image by Digital Storm

How the federal government could lead by example in cyber insurance

The takeup rate of cyber insurance is rising, but the market’s growth to date has been uneven. Anne Hobson, a technology policy fellow at the R Street Institute, may have a solution to that problem. She argues that the federal government, as a high-profile cyber target and large user of connected devices, is well-positioned to address the uniformity problems currently afflicting cyber-insurance policies, and proposes that it introduce a financial-responsibility requirement for some of its most vulnerable vendors and contractors.

In a new paper, Hobson takes stock of the risks presented by growth of the so-called “internet of things.” The “things” are objects, hitherto unconnected, that are now being networked for our convenience. Each of these objects has the ability to send or receive data, often to do both, and is, as a result, susceptible to breach and malicious misuse.

She reasons that, beyond outright ignorance of the existence of cyber insurance, the principal reason firms fail to carry coverage is that the policies are both complex and nonstandard. Thus, while there is reason to believe the takeup rate will continue to climb as the market grows, a step taken to introduce some level of uniformity to the market may speed that process further.

But finding a benign way to introduce greater uniformity to cyber-insurance offerings is challenging, particularly given that the long-predicted rapid growth of the cyber-insurance market is now meaningfully underway.

Nigel Pearson, global head of fidelity at Allianz Global Corporate & Specialty, recently noted that “the cyber market is growing by double-digit figures year-on-year, and could reach $20 billion or more in the next 10 years.” His prediction was echoed by similarly bullish analysis from Allied Market Research, which projects the cyber-insurance market will reach $14 billion in written premium by 2022.

Despite those developments, growth has been uneven. While firms in some sectors, particularly large firms in financial services, are now more likely than not to carry some level of cyber insurance, the vast majority of small and midsize businesses do not. The risk for such firms is large and growing. In fact, according to Hartford Steam Boiler, 60 percent of small and midsized businesses that experience a cyber attack go out of business within six months.

Hobson argues that prescriptive regulations establishing cybersecurity standards would do more harm than good for firms of all sizes, but that federal agencies can help encourage the fast-developing cyber-insurance market by insisting that internet-of-things contractors be held financially responsible for any liabilities created for taxpayers as a result of cyber-attacks on their products or services.

The rationale for such a requirement is twofold. First, it is important to insulate taxpayers from the costs associated with a breach. In Hobson’s own words:

In the case of a cyber-attack or data breach that stems from the insecurity of a contractor or vendor’s system, the contracting agency…could have to expend resources on a host of ancillary costs, which can include DDoS mitigation services, forensic investigations, user notifications and data recovery. Rather than pass such costs onto the taxpayers, agencies and government purchasing agents should assert in contractual language their right to subrogate these liabilities from the contractor or vendor.

Second, greater adoption of cyber insurance would help to improve cybersecurity itself, as it would align security incentives. As firms go through the cyber-insurance underwriting process, they are made to audit their cyber vulnerability and to address problems as they are uncovered. For their part, insurers have every reason to ensure that firms maintain a vigilant cyber defense. Thus, each party has an independent pecuniary incentive to foster an effective ongoing cyber defense.

Congress is eager to improve the nation’s cybersecurity preparedness. But instead of a cyber-insurance backstop for large risks, or a prescriptive set of security requirements for small firms to follow, Hobson concludes that the best thing that it can do is set an example as a market participant. By taking a modest step, Congress can both expand the universe of firms with cyber insurance and bolster the nation’s cyber preparedness. That’s a win-win.

CTA previews Innovation Policy Day at SXSW

sxsw preview

The Consumer Technology Association welcomed R Street Tech Policy Fellow Anne Hobson to take part in a recent Facebook Live panel to preview the Innovation Policy Day programming CTA will be hosting at the upcoming SXSW festival in Austin, Texas. Hobson’s comments focus on augmented reality (AR) and virtual reality (VR), the roles these emerging technologies will play in improving people’s lives, and the concerns they have raised in such policy areas as cybersecurity, privacy, intellectual property, e-commerce, free expression and health and safety. Video of the panel is embedded below.

Ian Adams on KVOI in Tucson

R Street Senior Fellow Ian Adams joined Mike Shaw of KVOI in Tucson, Arizona, to discuss the Regulations from the Executive in Need of Scrutiny (REINS) Act and what it means for rebalancing the distribution of power between the legislative and executive branches of government. The half-hour segment also focused on NHTSA’s decision to clear Tesla of any wrongdoing stemming from a recent fatal accident involving its autopilot system, as well as President Donald Trump’s use of a mobile device that might, in fact, already be hacked.

The full show is embedded below.

Transportation regulators are determined to stretch their powers to the limit


Since 2008, the freight-railroad industry has contested an effort by two federal regulators to rebalance the relationship between freight-rail operators and Amtrak. The issue is one of control and timing.

Amtrak, which is government-funded but operates as a for-profit corporation, predominantly runs its many passenger rail routes over lines owned by freight-railroad companies. Even though freights own most of the lines, Congress has historically granted preference to Amtrak trains, meaning that freight trains must generally yield to Amtrak trains if their routes conflict. Even with this priority, however, Amtrak has proven remarkably poor at running its trains on time. Amtrak, in turn, has blamed its poor performance on the freights, arguing that they do not sufficiently prioritize Amtrak trains along their routes.

In an effort to rectify Amtrak’s lagging performance, Congress passed the Passenger Rail Investment and Improvement Act of 2008, which ordered the Federal Railroad Administration and Amtrak to promulgate joint metrics and standards for measuring the performance and quality of intercity passenger trains. Under PRIIA, the metrics are used to investigate substandard performance and, in some situations, may be used to award Amtrak damages if freight railroads are found to be the true cause of Amtrak’s poor performance.

While the statute required the FRA and Amtrak to promulgate the performance metrics, it charged a different federal agency, the Surface Transportation Board, with undertaking any such investigations into laggard performance. This framework drew a constitutional challenge from the freight-rail operators. Because the law empowered FRA and Amtrak to come up with the performance metrics together, the freight railroads argue the law violates the so-called “non-delegation doctrine.” That doctrine prohibits Congress from delegating its legislative powers to other entities, particularly private entities. Given that Amtrak is managed as a for-profit corporation, the rail industry argues that it qualifies as a private entity.

The rail industry initially won on this argument at the D.C. Circuit Court of Appeals, which agreed that PRIIA’s delegation of metric-setting power to a quasi-private entity like Amtrak was an unconstitutional delegation of Congress’ powers. On appeal to the Supreme Court, though, the tide turned. The high court held that Amtrak was, in fact, a government entity rather than a private entity. As a result, Congress enjoyed more leeway in delegating its power to Amtrak.

The drama didn’t end there. The D.C. Circuit revisited the case and again struck down the metric-setting section of PRIIA, this time on alternative grounds—namely, that the law violated due process because it allowed Amtrak effectively to act as a regulator in a market in which it had “skin in the game.” In other words, Amtrak was competing as an economic actor with the freight railroads, while also having power to regulate and impact the conduct of its competitors.

The federal government is expected once again to appeal that ruling to the Supreme Court. In the meantime, another related skirmish has broken out. While the non-delegation dispute was working its way up and down from the Supreme Court, the Surface Transportation Board stepped in and declared its authority to define PRIIA’s metrics and standards. As laid out above, PRIIA clearly awarded the power to define metrics (such as on-time performance) to the FRA and Amtrak, while it reserved the powers of investigation and enforcement to the STB. But since the FRA/Amtrak version of the regulations is tied up in court, the STB claims it has the power to step into the fray and define its own metrics.

Unsurprisingly, this development sparked another court case by the freight-rail industry—this one in the 8th U.S. Circuit Court of Appeals—arguing that the STB usurped powers not granted to it by the law. At stake in this case is the proposition that, just because a protracted legal battle has halted the implementation of a law, another regulator is not suddenly empowered to fill the gaps.

The best option would be for Congress to step in again, fix the statutory defect and clarify whether the FRA or the STB has metric-setting power (and, at the same time, clarify that Amtrak does not have such power). Either way, the drama surrounding this controversy is just one more demonstration of federal agencies pursuing creative methods to implement regulations and accrue power—even in the face of seemingly clear statutory language and court decisions to the contrary. If their power to do so is affirmed, both regulated industry and Congress have something new to fear.

Image by Sherman Cahal

The administration’s bad start on civil asset forfeiture


It appears that President Donald Trump has officially taken a position on civil asset forfeiture. This week, the president offered to help “destroy” the career of a Texas state senator who supported reforming the state’s asset-forfeiture laws (he later claimed that he was joking).

Joke or not, the threat came during an exchange with Rockwall County Sheriff Harold Eavenson, who told the president: “We’ve got a state senator in Texas that was talking about introducing legislation to require conviction before we could receive that forfeiture money.” Eavenson went on to say that the Mexican cartels would “build a monument to [the unnamed senator] in Mexico if he could get that legislation passed.”

The particular bill Eavenson was referring to is co-sponsored by state Sens. Konni Burton, R-Fort Worth, and Juan “Chuy” Hinojosa, D-McAllen, with the goal to reform civil asset forfeiture in Texas. The two senators championed this reform effort because of the inherent injustices associated with the asset-forfeiture process.

Because such seizures are civil, a victim of forfeiture does not have access to appointed counsel, and the standard of proof is typically “preponderance of the evidence,” much lower than the criminal standard of “beyond a reasonable doubt.” Maybe most egregious, the practice has encouraged a policing-for-profit mentality in which officers pursue much-needed funds via the forfeiture mechanism, rather than acting as keepers of the peace—protecting and serving. This mentality has increased the tension between the police and the policed.

A pillar of President Trump’s campaign (and now his presidency) has been the populist defense of “the little guy.” It is mind-boggling that he would defend a practice that targets his base. The rich elite can afford lawyers, but the little guy cannot.

The Supreme Court has performed legal somersaults defending forfeiture without charge or conviction. Thus, federal courts continue to hear cases with titles like United States v. Articles Consisting of 50,000 Cardboard Boxes More or Less, Each Containing One Part of Clacker Balls.

But the legislation in question here is a state bill. Federalism, according to the GOP’s official platform, is “the foundation of personal liberty.” The platform goes on to say:

Federalism is a cornerstone of our constitutional system. Every violation of state sovereignty by federal officials is not merely a transgression of one unit of government against another; it is an assault on the liberties of individual Americans.

It seems reasonable, then, that the state of Texas should decide what is right for the state of Texas, without interference or threats from the federal executive. The notion that the president would use his bully pulpit to “destroy” duly elected state senators because a sheriff with a shiny star on his chest bitched about having his toys taken away is a Nixonesque abuse of power.

Image by Shevs

Neeley addresses press conference on ‘Conservative Texas Budget’

neely conf

R Street Southwest Region Director Josiah Neeley joined with Texas state Sen. Don Huffines, R-Dallas, and other members of the Conservative Texas Budget Coalition at an Austin press conference to unveil a proposed 2018-19 state budget that includes significant tax and spending reforms: capping spending at no more than $218.5 billion, eliminating the state’s costly business margins tax and enacting structural reforms to property taxes. Video of the press conference is embedded below.

Adams talks self-driving cars on Tech Policy Podcast

R Street Institute Senior Fellow Ian Adams recently joined TechFreedom’s Tech Policy Podcast for a discussion about the many implications of autonomous vehicles. Full audio of the show is embedded below.

Debunking the ‘experts’ on vaping


A recent article on the millennial-focused website Mic deigned to summarize “what the experts say” about the long-term effects of vaping, which certainly is an important topic. Alas, the piece by author Aric Suber-Jenkins was hardly objective or unbiased. In fact, it’s wrong in nearly every major assertion it makes and every bit of analysis it offers.

Rather than feature diverse perspectives from a variety of experts, the piece focuses almost entirely on the views of one person: Stanton Glantz, director of the Center for Tobacco Control Research and Education at the University of California, San Francisco. To be sure, Glantz is a respected researcher and his perspective deserves a hearing. He’s also wrong on a lot of things. Four major myths he and the Mic piece present very much need to be debunked:

MYTH:  “The scientific community is beginning to see things differently, however. Its consensus: vaping is a scam.”

FACT: There is no consensus that vaping is a scam. If anything, the voices that oppose vaping as a harm-reduction option are being increasingly drowned out by rational applications of science. In the United Kingdom, public-health authorities acknowledge that vaping is 95 percent less harmful than traditional cigarettes. In Ireland, vaping now is seen as a crucial tool to help people quit cigarettes. Indeed, even in the United States, experts increasingly regard vaping as a valuable harm-reduction strategy. The only “consensus” lies in the Mic author’s refusal to include diverse views from the scientific community that aren’t represented by Glantz’s sole claims.

MYTH: “The most dangerous thing about e-cigarettes is that they keep people smoking cigarettes.”

FACT: The Mic piece cites a 2012 study, co-authored by David Abrams, to back up this assertion. However, it fails to include 2015 research from that same Abrams making the case that e-cigarettes can greatly assist some smokers’ efforts to quit. Furthermore, there is an entire body of international research illustrating how vaping has helped many quit or greatly reduce their smoking. Why this plentiful research is ignored in favor of Glantz’s unsupported assertion is unclear.

MYTH: “E-cigarettes deliver as much or more ultrafine particles as the ones found in cigarettes.”

FACT: Here, Glantz contradicts his own 2014 research, which stated:

It is not clear whether the ultrafine particles delivered by e-cigarettes have health effects and toxicity similar to the ambient fine particles generated by conventional cigarette smoke or secondhand smoke.

Did Glantz change his mind? He certainly didn’t publish any new discoveries on the issue. From this contradiction, we find a lack of consensus even among Glantz’s own beliefs, much less those of the scientific community at large. It would be much more accurate to report what is empirically known and what remains uncertain.

MYTH:  “The e-cigarette industry … [has a] hold on adolescents.”

FACT: The National Institutes of Health reports teen use of e-cigarettes declined significantly in 2016, from 16.2 percent to 12.4 percent.  No one actually wants adolescents to begin consuming nicotine. But kids smoked in high school long before e-cigarettes were invented. Encouragingly, teen rates of tobacco use have been declining consistently ever since vaping became available. Wouldn’t it make sense to guide them toward less destructive options, rather than withholding access altogether?

As a gay man in the United States, I lived through the devastation and grief of the AIDS epidemic, and celebrated when treatment medications began to save the lives of many people who were dying of this deadly disease. At the same time, I’ve seen many of those same individuals and their loved ones consume nicotine through cigarettes out of a sense of habit, dependence and fear of the physical and emotional discomfort of quitting.

As we’ve seen HIV-related deaths in the United States drop to historical lows of around 12,000 a year, we have continued to see lesbian, gay, bisexual and transgender tobacco-related deaths remain stagnant at 30,000 per year. When the technology and the empirical evidence overwhelmingly demonstrated an opportunity for vaping to save lives, I thought for sure our LGBT and supportive media would support such an advancement. Sadly, this has not been the case. And it isn’t helping.

Image by totallypic

Seventh Circuit strikes down Indiana’s protectionist vaping rules


Apparently we weren’t the only observers who thought the Indiana law regulating vaping products was a little over the top. A major reason why Indianapolis rated a grade of “D-” in the 52-city survey of local vaping rules that R Street released in December was a state law with some peculiarities.

We expect and support appropriately tailored regulation related to quality control of anything that is designed to be inhaled by humans. We also support bans on sales to minors and requirements for child-proof packaging. In that vein, I’m sure it appeared to most members of the Indiana General Assembly who voted on the Hoosier State’s legislation in 2015 that its very specific cleanliness provisions were in line with what most food-service operations are required to maintain.

But earlier this week, the U.S. 7th Circuit Court of Appeals ruled portions of the law unenforceable against out-of-state manufacturers. They were just a little too specific to get a pass under the Commerce Clause’s protection of interstate commerce. The Indiana law required particular sinks and even specific cleaning products. It required manufacturers of vaping liquid to sign five-year security contracts that provide 24-hour video monitoring and high-security key systems. One telltale clue as to the draconian nature of the law was its requirement that manufacturing facilities have “clean rooms” that comply with the Indiana Kitchen Code.

I’m sure that you have guessed by now that there were only a few companies that could meet all of the Indiana requirements. Judge David Hamilton’s opinion stated that the Indiana law “is written so as to have extraterritorial reach that is unprecedented, imposing detailed requirements of Indiana law on out-of-state manufacturing organizations.” The decision noted that 99 percent of vaping-liquid revenue had come from out-of-state manufacturers before the legislation was enacted.

The security provisions are a special case, even beyond the impact of the lawsuit. Although the basic legislation was passed in 2015, an amendment to the law last year effectively granted a single Lafayette company monopoly power to approve security requirements for all manufacturers. It further set a deadline for permit applications that was backdated one week before the bill was signed into law. Involved lawmakers swear that state lawyers and the Indiana Alcohol and Tobacco Commission signed off on the language. But under the new law, only six manufacturers met the security requirements to allow Indiana sales, and four of those turned out—not so surprisingly—to bear Indiana home addresses.

The Republican-controlled Legislature will try again on this particular provision, which may or may not have to do with a rumored but unconfirmed FBI investigation. Speaker Brian Bosma, R-Indianapolis, has publicly stated that his House of Representatives leadership team will support a change to the language to solve this problem, and we can assume it will take up changes that need to be made because of the court decision.

Vaping has become exceedingly popular in this country, including in the Midwest. The out-of-state companies that challenged the Indiana law maintain in their lawsuit that there are currently about 138 vaping shops or locations in the state, which produce annual sales of more than $77 million. Since the overwhelming majority of vapers are former smokers who are looking to quit or cut down, this is good news. The evidence indicates that vaping is 95 percent less harmful than combustible tobacco cigarettes.

We are rooting for Indianapolis to rate a higher score in our next vaping friendliness survey.

Image by FabrikaSimf

Does Congress have the technology it needs to govern?

During its first two centuries of operations, Congress conducted business the old-fashioned way: with paper and pencil and face-to-face. Over the past 25 years, however, the legislative branch began—hesitantly—using computer technology and the Internet.

How is this digital transition going? What has gone well and what needs improvement? And does Congress have the tech tools it needs to govern in the 21st century?


Meag Doherty, operations associate, OpenGov Foundation
Sasha Moss, technology policy fellow, R Street Institute
Christian Hoehner, policy director, Data Coalition
Daniel Schuman, policy director, Demand Progress
Kevin Kosar, governance project director, R Street Institute

Video: Orange County short-term rentals panel

Internet-based short-term rental companies such as Airbnb have built a burgeoning industry connecting vacationers with homeowners who rent out rooms or entire houses for short-term stays. It’s a fascinating example of the New Economy, but this new business model has run up against stiffer opposition than expected.

On Jan. 27, 2017, the R Street Institute and the Orange County Register sponsored a panel discussion and breakfast with some key local thought leaders.

Hobson talks ‘Poképolicy’ on the Tech Policy Podcast

Last summer’s megahit game Pokémon Go was many Americans’ first introduction to the notion of “augmented reality,” in which artificial visuals are superimposed onto the real world. R Street Tech Policy Fellow Anne Hobson explored some of the policy implications of augmented reality in a recent policy study, including how it could affect cybersecurity, privacy, intellectual property and public safety. More recently, she sat down with TechFreedom’s Tech Policy Podcast to explore the issue in-depth. That show is embedded below.

FERC chair’s departure spells policy uncertainty and infrastructure delays


In his first week in office, President Donald Trump elevated Commissioner Cheryl LaFleur to serve as acting chairman of the Federal Energy Regulatory Commission. This action removed that designation from then-Chairman Norman Bay, who promptly issued his resignation.

Bay’s decision to leave is customary—most chairmen do not stay following demotion—but the surprising immediacy of the Feb. 3 effective date leaves FERC with only two sitting commissioners. That’s one short of the quorum required to issue orders. Commonly, if a quorum is at stake, a FERC commissioner will retain their position until a replacement is imminent. Bay’s unusually rapid departure leaves the agency’s policy apparatus and infrastructure-approval processes paralyzed.

The apparent motivation for the change was a perceived lack of alignment with the president’s agenda. It also suggests Bay never overcame the distrust of Republican leadership. During Bay’s confirmation, GOP leaders expressed concern that he would serve as a rubber-stamp for the Obama administration’s “extreme anti-coal agenda.” On the contrary, Bay’s policy record proved fuel-agnostic, in addition to being otherwise quite consistent with market principles and supportive of fossil-fuel infrastructure expansion.

In an extensive resignation letter, Bay highlighted that FERC issued certificates for more than 1,000 miles of natural gas pipeline in 2016, the largest amount since 2007, and authorized two liquefied natural gas export facilities. It’s worth noting these actions came despite intense, disruptive grassroots opposition, as well as criticism from the U.S. Environmental Protection Agency asking that FERC expand its environmental reviews. Bay also improved market transparency and expanded education tools and other public resources. He modernized FERC staff’s data capabilities, drove robust analysis into decision-making and tore down silos between FERC offices.

Current and former FERC staff (including this author) attest to Bay’s commitment to economically sound market design. His genuine intent was to level the playing field. This manifested most notably in a notice of proposed rulemaking to reduce regulatory barriers to entry for energy storage and distributed energy resources. This departed thematically from some prior FERC policies that gave preferential treatment to certain resources (and still need correction). He also continued FERC’s price formation initiative, which epitomizes the appropriate role of a regulator – to foster competition and healthy markets. But these rules remain merely proposed; the Trump administration’s choice for permanent FERC chair will determine their fate.

Going forward, the theme of FERC policy should be more one of refinement than course correction. In most areas, Bay’s agenda was consistent with conservative principles that support markets and transparency. But his otherwise pro-market legacy is overshadowed by his reputation as a backer of exuberant enforcement practices.

Enforcement policy marked Bay’s clearest disconnect with conservatives and has grown contentious within industry, policymaker and media circles. Criticisms were prone to hyperbole, such as The Wall Street Journal declaring Bay “Harry Reid’s personal prosecutor.” More thoughtful critiques emerged from scholars who were concerned with FERC enforcement’s legal processes and how it determined what constitutes “manipulative” behavior. LaFleur herself expressed concerns with FERC penalty guidelines and procedural aspects affecting investigation targets. This discrepancy with Bay may have tipped the political scales in her favor.

LaFleur’s policy stances matter little, as her chair status remains temporary until the administration fills other vacancies and presumably appoints a new chair. In the meantime, FERC lacks a quorum to advance a policy agenda.

It could easily take two to three months to confirm a new FERC commissioner. In the interim, FERC cannot act on major orders, rules and policy pronouncements. At least some routine business will continue under authority delegated to FERC office directors, which FERC may expand in the interim. But given pending rulemakings and infrastructure applications, a hobbled FERC cannot approve proposed mergers, including a major pending power plant purchase by Dynegy. Nor can it respond to pressing complaints or enact rule reforms slated to enhance price formation and enable market access for advanced technologies. Lack of a quorum also stalls FERC approval of natural gas infrastructure projects, including pipelines and LNG facilities.

It’s unlikely that delaying pipeline approvals by several months would disrupt service, but it does add costs, as pipeline congestion increases the delivered price of natural gas to customers. In particular, extended delay may cause four major Appalachian pipeline projects to miss their in-service dates. This coincides with market forces driving a rebound in natural gas prices – perhaps to their highest level since 2014. That’s a tricky political calculus for a new administration promising lower energy prices, more infrastructure and less red tape.

The Trump administration should consider accelerating one commissioner nomination. As it mulls candidates, it’s worth noting that LaFleur’s leadership, especially in an interim role, should mostly align with administration priorities and not raise red flags. It may be worth nominating the least-controversial candidate first, to restore FERC promptly under LaFleur’s watch. That would serve as a bridge to the permanent chair, whose confirmation process may face greater scrutiny and delay.

Senate leadership also needs to prioritize FERC restoration. Fortunately, Sen. Lisa Murkowski, R-Alaska, has warned of a FERC crisis and promised to move nominees rapidly to re-establish a working quorum. In addition to avoiding infrastructure delays, Senate Republicans should remain mindful that returning FERC to its full functional capacity is simply good governance. After all, the major pending rulemakings would, on balance, enhance market competition. They’d also reduce barriers and bolster compensation for some advanced clean technologies, which should appeal to Democrats. Dems should also note that natural gas expansion has led to large emissions reductions, and pipeline congestion causes the most customer headaches in the Northeast.

Promptly restoring FERC will require that the Senate avoid a drawn-out nominations process. This should be a bipartisan exercise in search of good governance. However, some on the environmental left have already voiced opposition to Trump’s nominees, even though they haven’t been announced. If nominations prove unworkably contentious, it’s plausible that a pro-market Democrat would make the best choice, even for the Republican agenda. Such a nominee could swiftly cut through partisan debate and eventually replace Democratic Commissioner Colette Honorable, whose term ends in June (Republicans can only fill three of the five FERC seats). It would delay seating a Republican until the expiration of Honorable’s term, a price the GOP may be willing to pay to expedite FERC restoration. Regardless of the political calculus, expediting FERC nominations is a pressing economic need both parties must address.

Image by QiuJu Song

Why process matters in congressional appropriations



As recent legislative sessions repeatedly have ground to a halt amid threats of government shutdown, we’re forced to wonder: what has gone so terribly wrong in the appropriations process? Peter Hanson of the University of Denver explores this question in his recent white paper for the Brookings Institution. For much of Congress’ history, discretionary spending was set by regular order, in which appropriations bills were debated and passed individually. Hanson investigates how the “regular order” method of appropriations went extinct and ultimately calls for restoring this time-honored process.

Regular order allowed the budget to be broken into bite-sized pieces, encouraging legislators to exercise greater control over spending. Today, by contrast, Congress has taken to bundling appropriations bills into thousand-page omnibus packages that are rammed through at the end of legislative sessions—often in the face of a looming government shutdown. Lawmakers hardly have time to read these voluminous bills, let alone exercise oversight.

Unsurprisingly, an inferior process leads to inferior results. As Hanson and political scientists like Matthew Green and Daniel Burns have argued, regular order helps encourage debate among legislators and reduces the risk of “substandard legislation” being stuck into appropriations bills. Furthermore, as Yuval Levin has noted, breaking up the budget plays better to the innate strengths of Congress, in that it allows individual lawmakers to exercise greater influence over discrete funding choices.

Hanson pins most of the blame for the collapse of regular order on the Senate, since its rules allow senators to engage in disruptive tactics like filibustering or attaching controversial amendments to bills in order to score political points. This leads Senate leadership to stifle debate by pushing “must-pass” omnibus bills under tight deadlines at the end of legislative sessions.

Hanson proposes four fixes for the appropriations process: reforming the filibuster to allow a simple majority vote to end debate on spending bills; allowing bills to be considered concurrently by the House and Senate; restoring earmarks on a limited basis, to help grease the skids to pass legislation; and shifting toward nonpublic deal-making to shield individual lawmakers from negative publicity.

While there’s room to debate the efficacy of these different proposals—such as whether restoring earmarks would really help pass more legislation—Hanson’s plea to restore congressional authority over the federal budget is timely. Given the well-documented diminution in power Congress has experienced vis-à-vis the executive branch in recent years, it’s time for our First Branch of government to reassert its leadership role.

Congress’ power of the purse is a good place to start; James Madison himself described it as “the most complete and effectual weapon” to be wielded by the people’s representatives. But even the most effectual weapon is useless when wielded incorrectly. Until Congress recognizes why process matters when it comes to appropriations, it will have trouble restoring its control over our government’s spending.

Image by mj007

Reforming the administrative state—and reining it in


National Affairs released a special report earlier this month on comprehensive proposals to rein in the regulatory state. The report’s three authors — Hoover Institution Research Fellow Adam White, Manhattan Institute Senior Fellow Oren Cass and R Street Institute Senior Fellow Kevin Kosar — took part in a recent panel moderated by National Affairs editor Yuval Levin at an event the magazine co-hosted with Hoover. Video of the discussion is embedded below:

What Dow 20,000 looks like in inflation-adjusted terms


The Dow Jones industrial average closing Jan. 25 at more than 20,000 inspired big, top-of-the-fold front-page headlines in both the Wall Street Journal and the Financial Times (although the story was on Page 14 of The Washington Post). The Journal and FT both ran long-term graphs of the DJIA, but both were in nominal dollars. In nominal dollars, the 100-year history looks like Graph 1—the DJIA is 211 times its Dec. 31, 1916, level.

This history includes two world wars, a Great Depression, several other wars, the great inflation of the 1970s, numerous financial crises, both domestic and international, booms and recessions, amazing innovations, unending political debate and 18 U.S. presidents (10 Republicans and eight Democrats). Through all this, there has been, up until yesterday, an average annual nominal price increase of 5.5 percent in the DJIA.

[Graph 1: 100 years of the DJIA in nominal dollars]

Using nominal dollars makes the graphs rhetorically more impressive, but ignores that, for much of that long history, the Federal Reserve and the government have been busily depreciating the dollar. A dollar now is worth 4.8 percent of what it was 100 years ago, or about 5 cents in end-of-1916 dollars. To rightly understand the returns, we have to adjust for a lot of inflation when we look at history.

Graph 2 shows 100 years of the DJIA in inflation-adjusted terms, stated in end-of-1916 dollars:

[Graph 2: 100 years of the DJIA in end-of-1916 dollars]

Average annual inflation over these 100 years is 3.1 percent. Adjusting for this, and measuring in constant end-of-1916 dollars, 20,069 on the DJIA becomes 964. Compared to a level of 95 as of Dec. 31, 1916, the DJIA in real terms has increased about 10 times. Still very impressive, but quite different from the nominal picture. The average annual real price increase of the DJIA is 2.3 percent for the 100 years up to yesterday.
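The arithmetic above can be verified with a few lines of Python. This is a quick sketch using the article’s rounded inputs (a DJIA level of 95 at the end of 1916, the Jan. 25, 2017 close of 20,069, and a 2017 dollar worth about 4.8 cents in end-of-1916 terms), so small rounding differences from the article’s exact figures are expected:

```python
def cagr(multiple: float, years: float) -> float:
    """Compound annual growth rate implied by a total price multiple."""
    return multiple ** (1.0 / years) - 1.0

# Rounded inputs quoted in the article.
djia_1916 = 95        # DJIA, Dec. 31, 1916
djia_2017 = 20_069    # DJIA close, Jan. 25, 2017
dollar_value = 0.048  # a 2017 dollar in end-of-1916 dollars (~5 cents)
years = 100

nominal_multiple = djia_2017 / djia_1916  # ~211x in nominal terms
real_djia = djia_2017 * dollar_value      # ~964 in end-of-1916 dollars
real_multiple = real_djia / djia_1916     # ~10x in real terms

print(f"nominal multiple: {nominal_multiple:.0f}x")          # 211x
print(f"real DJIA in 1916 dollars: {real_djia:.0f}")         # ~963
print(f"nominal CAGR: {cagr(nominal_multiple, years):.1%}")  # 5.5%
print(f"inflation: {cagr(1 / dollar_value, years):.1%}")     # 3.1%
print(f"real CAGR: {cagr(real_multiple, years):.1%}")        # 2.3%
```

The gap between the 5.5 percent nominal and 2.3 percent real growth rates is the point of the exercise: roughly three percentage points of the DJIA’s average annual gain were dollar depreciation, not real appreciation.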

Growth rates of 2 percent, let alone 3 percent, extended over a century do remarkable things.

Image by Vintage Tone

The erosion of ownership in the digital age


When you buy a book from the store, no one doubts that you own it completely. You can take it home, write in its margins, lend it to a friend or turn around and resell it to a used bookstore.

But what about a digital version of that same book? Or what of other digital goods, like video games, movies and music? In recent years, many consumers have switched from purchasing physical copies of media to downloading digital equivalents. Others have given up on the ownership model altogether — instead relying on subscription services like Netflix or Pandora. For example, U.S. music-streaming revenues now exceed those from digital downloads or physical sales.

The R Street Institute recently hosted a panel of experts at its Washington headquarters to discuss these and related issues, featuring Aaron Perzanowski of Case Western Reserve University School of Law; Jason Schultz of New York University School of Law; Todd Dupler of The Recording Academy; Jonathan Band of Policy Bandwidth; Steven Tepp of Sentinel Worldwide; and R Street’s Sasha Moss, who served as moderator. Video of the full panel is embedded below.

The Future of Work: Capturing the American Dream in an internet-enabled economy


As we race toward a technologically enhanced future, internet-enabled technologies continue to disrupt markets and industries at a breathtaking clip. Corporations see exponential increases in productivity from internet-enabled automation and the “internet of things.” That disruption is pushing the American workforce to a place where it needs to be more and more technologically adroit. With artificial intelligence, autonomous cars, virtual reality, 3D-printing, advanced manufacturing and robotics, the skills required of the American labor market bear little resemblance to what was needed just a generation ago.

R Street Senior Fellow Lori Sanders recently joined a panel of experts to discuss these topics at the 13th Annual State of the Net Internet Policy Conference. Video of the full panel is embedded below.

It’s time for the Senate to pull in the REINS


Much of the debate about the new administration’s regulatory-reform agenda centers on the Regulations from Executive in Need of Scrutiny (REINS) Act, which would require both houses of Congress to approve all new “major regulations” that have an annual economic impact of $100 million or more.

The Huffington Post’s Carl Pope calls the REINS Act “the most dangerous bill you’ve never heard of” and asserts it is an effort by the “Tea Party-dominated Republican caucus in the House” to repeal regulations designed to protect the public. Pope concludes that passage of the REINS Act would be nothing short of disastrous:

Over time, as new health, safety, consumer and labor protection issues arise, all of these laws will effectively have been repealed, with no public debate and no accountability.

Pope’s assessment could not be less accurate. In fact, the REINS Act would remove the bureaucrat-driven rulemaking process from behind closed doors and hold elected officials accountable for new regulations.

Under the current process, Congress escapes scrutiny when regulatory agencies issue new rules that affect the lives of Americans. Unlike executive bureaucrats, elected officials can be held accountable by their constituents. Regulators, lacking this type of accountability, are free to promulgate rules without much regard for the costs they will impose.

Today, federal agencies are so pervasive and seemingly immune to reform that they have been labeled the “unaccountable fourth branch of government.” The costs imposed by the regulatory state—$1.9 trillion in 2015—have now reached more than 10 percent of U.S. gross domestic product. The REINS Act would curb costly or overreaching regulations by subjecting the largest regulations (those with $100 million or more in effects) to the congressional approval process.

Other criticism of the REINS Act asserts that the legislation dismisses scientific expertise. Pope makes a similarly incorrect argument:

Congress totally lacks the technical competence to review these kinds of complex rules. Do we really want members of Congress deciding whether a chemical can safely be used in food packaging? Or the proper procedures for approving new drugs as safe and effective? Or setting the allowable safety standard for heavy metals in drinking water?

The REINS Act doesn’t require that members of Congress be scientists, but it does ask that they evaluate whether a specific policy would be worth the various costs it imposes on those it aims to protect. As Reason’s Ronald Bailey points out:

As necessary and valuable as scientific expertise is, scientists and federal bureaucrats are not experts at evaluating and making benefit-risk trade-offs. If members of Congress get those trade-offs wrong, voters can fire those whom they believe are not acting in ways that adequately protect their health, safety and livelihoods.

Simply put, the REINS Act would leave the science to the scientists, while leaving Congress with the responsibility to govern.

Contrary to Pope’s argument that the REINS Act is a “dangerous bill you’ve never heard of,” the legislation has passed the House of Representatives in four consecutive sessions of Congress. Pope also misrepresents its legislative history:

Progressives may be counting on the fact that the Senate has previously refused to pass the bill, and that its broad overreaching will doom it. But these are not ordinary times and past behavior is far from reliable in predicting today’s politics.

The Senate previously failed to pass the REINS Act, not because the measure overreached, but because Senate Democrats were unwilling to take power away from their own party’s administration. Without 60 votes, the measure couldn’t progress. Even if it had, the Obama administration would have vetoed any attempt to restrain executive branch lawmaking.

When the legislature fails to create policies, the executive branch arrogates policymaking authority to itself. Democratic self-government requires that policies be made by the branch closest to the people.

The president’s endorsement of the REINS Act presents a unique opportunity for both parties. Through the REINS Act, Republicans can reduce the cost of the regulatory state and Democrats can cut the Trump administration’s executive power.

If Congress really wants to assert its power over a potentially unchecked executive branch, the Senate should not hold its horses on the REINS Act. Because what is wrong with Congress voting on major policies?

Image by Four Oaks

Ohio and Michigan both kick off 2017 with asset-forfeiture reform


Ohio House Speaker Cliff Rosenberger, R-Clarksville, opened the new legislative session last week by declaring the legislative branch’s job was to improve quality of life for all Ohioans. But it’s clear that the first challenge for state lawmakers is to mitigate the erosion of civilization.

Rosenberger designated legislation to curb domestic abuse as House Bill 1 of the new session. His opening remarks also made note of the immense problem of misused and illegal drugs in Ohio, which now ranks first in the nation in the number of drug-related overdose fatalities.

Ever since the state capital of Columbus famously saw 27 people die from drug overdoses during a single 24-hour period in September—following closely on the heels of the viral photo of the East Liverpool, Ohio couple who overdosed in the front seat of an SUV with the woman’s grandchild secured in a safety seat behind them—Sen. Rob Portman, U.S. Rep. Pat Tiberi and the Ohio General Assembly have been vocal about the need to find answers to Ohio’s tragic opiate-addiction problem.

At the biannual Impact Ohio Post Election Conference in Columbus, Scott Schertzer—mayor of Marion, a beautiful little town 45 miles northeast of Columbus—detailed how even cities like his are scrambling to marshal the resources needed to keep drug abuse from upending small-town life. If we could mitigate the impacts of drug abuse, the results would be measurable improvements in education, health care, public housing, transportation, the criminal justice system, poverty rates and maybe even overall economic productivity.

But as we search for solutions, we also must be cognizant of likely impediments. Every policy tool must be assessed by whether it does the job for which it was designed. Currently in Ohio, law enforcement can seize private property believed to have been used in the commission of a crime through civil actions, even where the accused has not been convicted, or sometimes even where no charges are filed.

Earlier this month, at the urging of the Buckeye Institute, the national Right on Crime coalition, the American Civil Liberties Union, the Faith and Freedom Coalition, the U.S. Justice Action Network and other committed public policy organizations, the Ohio General Assembly passed, and Gov. John Kasich signed, legislation that reforms the rules around civil asset forfeiture.

For its part, Michigan took two tries to fix its asset-forfeiture rules, the first in 2015. The second came earlier this month, when Gov. Rick Snyder signed a law abolishing the requirement that defendants post a 10 percent bond within 20 days of having their property seized in order to start the process of getting their property back.

Against the backdrop of what might be the worst environment for law enforcement in decades, perhaps reforms like these will help to ease negative perceptions of policing, helping us all get down to important matters for which we badly need law enforcement’s help.


Six quick takes on President Trump’s speech

  1. Thank you for making your speech short. Really, the days when folks enjoyed hearing long orations have passed.
  2. I am glad you thanked the Obamas for being “magnificent” to you during the transition. Offering “gracious aid” to an incoming president is part of the orderly transfer of government. I hope you’ll be as helpful when your time to depart comes.
  3. These statements are true and I hope you work with Congress to do something about it: “Americans want great schools for their children, safe neighborhoods for their families and good jobs for themselves … But for too many of our citizens, a different reality exists.”
  4. What in God’s green earth did you mean by this exclamation? “We must protect our borders from the ravages of other countries making our product, stealing our companies and destroying our jobs. Protection will lead to great prosperity and strength.” Ever heard of Adam Smith or taken Economics 101? Free trade is good; protectionism is bad. And the fact is that nobody stole anything from us. Companies move overseas because they find it advantageous. Toyota and Honda came to America because they found it advantageous.
  5. Sir, do you imagine yourself an almighty despot? What else am I to think when I hear you declare: “The crime and the gangs and the drugs that have stolen too many lives and robbed our country of so much unrealized potential. This American carnage stops right here and stops right now.” Five bucks says that in three years, we will still have junkies and organized crime. They both have been around for a century, and are driven by a complex amalgam of socioeconomic factors.
  6. This is a good sentiment: “We are one nation, and their pain is our pain. Their dreams are our dreams, and their success will be our success. We share one heart, one home and one glorious destiny. The oath of office I take today is an oath of allegiance to all Americans.” But are you truly prepared to govern in a less divisive way?

Image by White House photographer – Official White House Facebook page, public domain.

R Street Panel: The expertise gap between Silicon Valley and Washington


It shouldn’t be surprising that our elected representatives don’t always understand new technology. Rather than being early adopters, most of them are at least a decade behind the rest of us. For instance, there are a number of congressmen who don’t use email or smartphones. Some even prefer a typewriter.

Of course, they can’t be expected to know everything about every issue. They rely on research help from their personal staff and expert agencies like the Congressional Research Service (CRS). But the unfortunate fact is that congressional staff can’t do everything. They’re underpaid, suffer from high turnover and often lack the institutional knowledge and incentives to be effective. Additionally, while CRS produces a great deal of excellent research, it isn’t equipped to cover everything.

When it comes to complex technology policy issues, this lack of institutional expertise can lead to ignorant or even technically illiterate legislation. Rather than demanding that America’s innovators “nerd harder” to conform to ill-conceived legislative mandates, perhaps Congress itself should focus on building its own expertise.

These topics and more were discussed in a recent R Street panel, moderated by Zach Graves, R Street’s technology policy program director, and featuring Adam Keiper, editor of The New Atlantis; Robin Greene, policy counsel at the New America Foundation’s Open Technology Institute; Mike Masnick, CEO and founder of TechDirt; and Daniel Schuman, policy director of Demand Progress. Full video of the event is embedded below.

Kosar talks Congressional Regulation Office on C-Span


R Street Governance Project Director Kevin Kosar was on C-Span’s Washington Journal program recently to discuss his recent piece in National Affairs proposing a Congressional Regulation Office and the concept of regulatory budgeting. The full video is embedded below.

Self-driving car bill would authorize Cal DMV to do what’s already done


New legislation introduced in the California Assembly would mandate the state Department of Motor Vehicles revoke the registration of any autonomous vehicle that isn’t operated in compliance with state law.

Sponsored by Assemblyman Phil Ting, D–San Francisco, Assembly Bill 87 also includes fines of $25,000 per day and a provision prohibiting manufacturers or operators found to violate the law from operating any other autonomous vehicle in California for two years. For manufacturers, in particular, this would be a catastrophic punishment. Given how fast the technology is progressing, a two-year ban from operating any autonomous vehicles amounts to “one strike and you are out” of the self-driving car market.

Ting’s bill is a classic example of legislating by headline, which is a terrible way to make law. But it’s also notable in that it appears to create an authority—to revoke the registrations of self-driving cars—that the DMV has already used.

Back in December, ridesharing firm Uber argued its autonomous vehicles tests were legal, based on the plain meaning of the state’s testing regulations. The DMV ultimately disagreed, revoking the vehicle registrations of all of the Uber testing vehicles. Uber would later roll its shiny new Volvo XC90 test vehicles onto semitrailers bound for Arizona, where the company expected a more reasonable regulatory regime.

If the DMV already had the authority to revoke registrations in cases like Uber’s, why is it necessary to grant that authority again, as Ting’s bill does? Perhaps the bill, as written, seeks to make revocation compulsory in such cases. But that seems unlikely, given the department’s own explanation of the registration revocations.

Speaking on the revocations at the time, a DMV spokesperson said:

…consistent with the department’s position that Uber’s vehicles are autonomous vehicles, the DMV has taken action to revoke the registration of 16 vehicles owned by Uber. It was determined that the registrations were improperly issued for these vehicles because they were not properly marked as test vehicles.

Under the Vehicle Code, the DMV has broad discretion when it comes to revoking a vehicle’s registration and enjoys substantial deference in how it interprets the code. But there are only a limited number of areas of authority from which the DMV could draw its power to revoke Uber’s registrations.

The first, Vehicle Code Section 38750, empowers the department to draft regulations related to testing autonomous vehicles. Under that section, it may draft “special rules for the registration of autonomous vehicles.” In fairness to the DMV, within the adopted rules, there are special registration requirements, including a marking requirement. But there is no explicit authority to revoke a vehicle’s registration because it failed to comply with the rules. Whether it’s appropriate to infer that authority is unclear.

Then there’s Vehicle Code Section 8880, which states that registration may be revoked on the basis of fraud, mechanical safety and a slew of other rationales. The DMV could have, for instance, argued that the Uber test vehicles were mechanically unsafe and thus subject to Sec. 8880(a)(2). But the explanation I cited above from the DMV spokesperson does not jibe particularly well with the authority granted under Section 8880.

Relying on the vagaries of Section 38750 and the rules promulgated under its authority was risky business for the DMV. Had it chosen to litigate, Uber could plausibly have argued that the DMV’s interpretation was dead wrong. Ultimately, it didn’t matter whether the DMV actually had the authority it was claiming because, having marshaled the Office of the Attorney General, the department likely recognized that asserting the pretense of authority would be enough to force Uber’s hand. Litigating the matter, from Uber’s perspective, would have made little sense, given that other testing locales are available.

Now, only after publicly bringing an end to Uber’s testing in California, is the DMV seeking to explicitly affirm its authority to do what it has already done. Whether Ting’s bill is redundant will remain an open question. What is abundantly clear is that the discord between the rhetoric of “innovation” in California’s state government and the reality of legislative opportunism and regulatory capriciousness is more striking than ever.

Image by Cassiohabib

Join us tonight for Le Hackie Awards


DC Legal Hackers hosts its third annual Le Hackie Awards tonight, an event to honor companies, organizations, legal hackers and the best legal hacks of 2016.

For those not familiar, a legal hacker is:

  1. One who uses technology to improve law; or
  2. One who uses law to improve technology.

The Legal Hackers network initially was formed in late 2011 by a group of Brooklyn Law students troubled by the U.S. House’s anti-piracy legislation, the Stop Online Piracy Act, and its Senate counterpart, the PROTECT IP Act, also known as SOPA and PIPA. Like many in the internet community, the students were concerned that Congress had, without consulting the public, crafted bills that so plainly limited users’ right to digital free speech.

Legal Hackers had its first meeting in New York in 2012 and has since grown into a movement with chapters in Washington, Los Angeles, Miami and many other cities around the globe.  The group consists of like-minded individuals in the fields of law and technology who enjoy getting together to discuss pertinent legislation, case law and the state of play within the tech and legal communities.

R Street looked to do its own small part to promote legal hacking in 2016 by creating the Technology Policy Working Group and the Legislative Branch Capacity Working Group. That’s why we are co-sponsoring tonight’s event, along with allies like the Open Government Foundation, the Data Coalition, the Data Foundation and FastCase. We invite you to stop by, grab a slice of pizza and help us honor this year’s recipients.

Image by GaudiLab

Making Congress great again


It hasn’t exactly been a smooth first week for the Republican-controlled 115th Congress. On Monday, House Republicans voted to gut the Office of Congressional Ethics—only to reverse their decision a day later after receiving harsh backlash from constituents, ethics watchdogs and President-elect Donald Trump himself.

But despite the bad optics of the OCE fiasco, which dominated the news cycle, House Republicans actually took some important steps this week to improve democratic accountability and strengthen their ability to carry out their constitutional role in our system of checks and balances.

On Wednesday, the House passed the Midnight Rule Relief Act, a bill that protects Americans from a last-minute avalanche of Obama administration regulations by allowing Congress to repeal any rule finalized in the last 60 days of the administration with a single vote. And on Thursday, the House passed the Regulations from the Executive in Need of Scrutiny (REINS) Act, which provides a meaningful check on regulations by requiring Congress to vote on any executive branch regulation with an economic impact of $100 million or more.

Both ideas have been supported by Republicans in Congress for a while, and the REINS Act has passed the House three times – in the 112th, 113th and 114th Congresses. Now, with a Republican-controlled Senate and an incoming Republican administration, there’s hope that they will finally be enacted as law.

While House Republicans should be applauded for passing these important bills and placing such high priority on regulatory reform during their first week in session, there remains much to be done to stem regulatory overreach. For one, the bills obviously still have to pass the Senate and be signed by President Trump—not necessarily a given, since they actively weaken the power of the executive branch. And even if they are signed into law, clawing back power from the massive regulatory state remains a formidable task that requires continued vigilance and reform.

There are many other reform ideas and regulation-whacking tools at Congress’ disposal that should continue to be pursued and given due consideration in the 115th Congress. R Street Governance Project Director Kevin Kosar has highlighted a number of promising reform ideas, including regulatory budgeting, utilizing the Congressional Review Act and creating a nonpartisan Congressional Regulation Office to analyze significant regulations.

As House Republicans turn their attention to what will be the defining feature of their first 100 days in session—the repeal and replace of Obamacare—it’s important that legislators continue to pursue sensible regulatory reform and hold the executive branch accountable. The time may be ripe for Republican and Democratic lawmakers to come together in the effort to Make Congress Great Again.

The intelligence community should be more accountable to Congress


The sparring between President-elect Donald Trump and the U.S. intelligence community continues to intensify. Having downplayed and dismissed unanimous intelligence agency conclusions about Russian involvement in the DNC hack, Trump now wants to restructure the Office of the Director of National Intelligence and the Central Intelligence Agency because he finds our nation’s top spy agencies “bloated and politicized,” the Wall Street Journal reports.

To be sure, there are problems in the intelligence community.  But rather than restructure the agencies, the first step should be to improve congressional oversight of the community’s practices. Rank-and-file members of Congress still have not been briefed on the DNC hack or any possible Russian involvement. They are left entirely in the dark.

The intelligence community has briefed President Barack Obama and senators, and briefed the president-elect this morning. But of the 435 members of the House of Representatives, only the 24 members of the House Permanent Select Committee on Intelligence (HPSCI) have been briefed. The rest of Congress has been stonewalled, unable to fulfill its constitutional oversight responsibility.

This is par for the course. House rules are structured to make rank-and-file members completely dependent on the HPSCI for morsels of intelligence-related information. Representatives are not properly staffed or guided through the nuances of these often tricky matters.  They are dissuaded from exercising “healthy American skepticism” to question surveillance practices.

Questions about the future of American-Russian relations, or how the intelligence community does its job, are of the utmost national importance. Yet most of our elected officials are on the sidelines. It is time to make sure all our duly elected legislators are fully informed and can exercise proper oversight over these delicate issues.

Image by jgolby

Yes, fewer traffic fatalities is a good thing!

Outlet bias is a hallmark of the bored and lazy, and also of The Daily Wire‘s Ben Shapiro.

Seizing on the recent headline of a piece I wrote with colleague Anne Hobson, “Self-driving cars will make organ shortages even worse,” Ben declares, in a wonder of incoherence:

Slate thinks that self-driving cars are the problem. Meaning that the real problem is that healthy young people aren’t dying at sufficient rates.


Ben’s histrionic content can be exhausting even when it bothers to reflect reality, but in this case, it simply doesn’t. In sum, the piece describes self-driving technology and a challenge that it may present—since even positive safety outcomes have externalities associated with them—before venturing toward a possible free-market solution.

It should be noted that R Street has advocated consistently for the rapid development and deployment of automated technology. At the same time, that rapid deployment will require considering the otherwise unexpected ways in which self-driving technology may affect society. The point of the Slate piece was to highlight a hitherto underappreciated issue. This is why we were as clear as we could possibly be in writing:

We’re all for saving lives—we aren’t saying that we should stop self-driving cars so we can preserve a source of organ donation. But we also need to start thinking now about how to address this coming problem.

Should folks like Ben wish to learn more about the technology, its upsides and the case for regulatory restraint, here are some links to R Street’s latest work on the issue:

Autonomous vehicles could change everything you know about traffic stops

Self-driving car makers shouldn’t have to ask NHTSA ‘mother, may I?’

In internet of things era, cybersecurity for autonomous vehicles will require restraint

Uber and California DMV fight over definition of self-driving cars

Don’t over-regulate driverless technology

The new federal safety guidelines for self-driving cars are too vague… and states are already making them mandatory

Don’t demonize bitcoin, blockchain, Tor and other online privacy tools


“Can bitcoin be used for good?” read the headline of an April article in The Atlantic.

It’s not just a headline writer’s attempt at provocative rhetoric. The phrases “darkest corners of the web” and “illicit transactions” appear in paragraphs 1 and 2, respectively.

The implication, of course, is that bitcoin was created to facilitate illegal transactions over the internet. Only midway through does the article begin to discuss the potential benefits of the cryptocurrency. The negative slant is indicative of the attitude the mainstream media are taking, in general, toward technologies that anonymize users—as if the desire for privacy itself indicates ulterior motives.

We have seen similar treatment of other online privacy tools, such as the Tor browser, which media and law enforcement never miss a chance to associate with child pornography. On the other hand, very little is reported about Tor’s importance in circumventing internet filters and firewalls set up by the world’s repressive governments.

Yet in the larger scheme, anonymizing technologies like these will be important countermeasures for safeguarding privacy, as underlying internet technology allows small bits of personal information spread across countless databases to be searched, collected and analyzed cheaply and instantaneously.

Individuals today are asked to provide greater amounts of personal information just to navigate the routine aspects of daily life. This isn’t simply social networking and search applications; whether it’s banking, credit cards, purchasing (online and off), health insurance, travel, school enrollment or any one of dozens of other daily interactions, governments and institutions are requesting—demanding—more of your personal information for the privilege of doing business. There’s no opt-out.

Right now, as you’re reading this, think of how much of your daily activities third parties have recorded and documented. How much water did you use to shower, shave and brush your teeth? Metered and documented. Did you ride public transportation? Your fare card reported where you got on and where you got off. Surveillance cameras recorded you boarding a specific bus or subway car. Did you drive to work? The EZ pay toll and traffic cams can show the route you took. That coffee and doughnut purchase paid by debit card? Documented and searchable. That cigarette you smoked midmorning outside the building? Security cameras recorded it. Every site you visited, whether on your phone, tablet, home or work PC—logged and documented in the cloud.

None of this information is within your control. It is not your property and, as of now, your rights to correct, edit or delete it are extremely limited. Any of it, at any time, can be used against you.

Today, the bulk of information collected is used for marketing purposes. It’s not all bad. Amazon will remember that great pair of running shoes you bought, so you can purchase them again when you need to. Google can tailor news for you based on what its algorithms discern as your interests. As the “internet of things” takes shape, intelligent systems will be able to crunch massive amounts of data to make traffic flow better; to help businesses manage inventories, supply chains and transactions faster and more securely. That means fewer wasted resources, including fossil fuels. It means shorter lines at the checkout counter (or no checkout counter at all) and reduced chances of small mistakes that can have huge consequences, such as an error on medical records.

But just because there are social benefits to sharing some personal information doesn’t mean there are greater social benefits to sharing all information. This is what makes the oft-heard claim that information gathering shouldn’t bother those with “nothing to hide” so infuriating. Privacy is not about keeping secrets. Privacy is about the individual right to set personal boundaries. Legally, this idea is implicit in the First, Second, Fourth, Fifth and Sixth Amendments. To put it more mundanely, I have no obligation to notify the world every time I buy a bottle of milk and a loaf of bread.

So, yes, bitcoin can be used for good. Bitcoin is a cryptocurrency, a form of digital cash that can be exchanged online for goods and services. It is accepted by a growing number of merchants, including Apple’s iTunes, Virgin Galactic and MGM Resorts (a good list is here). Unlike conventional currencies, transfers do not need to go through a third-party payment system, like a debit or credit card transaction does. This eliminates the cost of that transaction. The protocol that underlies bitcoin, known as blockchain, allows both anonymity and verification, two factors often thought to be mutually exclusive in e-commerce.

Blockchain itself may be the most significant development in safeguarding online privacy. Although the processes involved are complex, it essentially creates a method of secure data authentication independent of personal identification. This could make it ideal for networked device applications by allowing extensive database interaction to analyze dynamic real-time situations without compromising identifiable individuals. That is, unless government interference ruins it. Unsurprisingly, Russia and China are promoting an international effort to regulate the use of blockchain.
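The core idea—data authentication without personal identification—can be sketched as a toy hash chain. This is purely illustrative, not the actual Bitcoin protocol (which adds digital signatures, proof-of-work and a peer-to-peer network); the point is that each record commits to the hash of the one before it, so any tampering is detectable without knowing who wrote the records:

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents deterministically."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, data):
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "data": data})
    return chain

def verify(chain):
    """Tampering with any earlier block breaks every later link."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, "pay 0.1 BTC to pseudonymous address A")
add_block(chain, "pay 0.2 BTC to pseudonymous address B")
assert verify(chain)

chain[0]["data"] = "pay 0.1 BTC to pseudonymous address C"  # tamper
assert not verify(chain)
```

Note that nothing in the chain names a person: verification depends only on the data and the hashes, which is what makes the design attractive for networked-device applications.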

As for Tor, an FBI sting—aided by a hyperventilating media—damaged its reputation by associating Tor with child pornography in the public mind. In the sting, which itself raised ethical questions, the FBI for one week kept a seized child porn site online and operating in order to gather web addresses of its users. The site was part of the so-called “dark web,” accessible only through open-source browsers such as Tor (dark websites can’t be reached with mainstream browsers such as Chrome, Internet Explorer and Safari, and search engines such as Bing don’t index them). Tor, short for The Onion Router, was created by volunteer programmers and is designed to route browser inquiries through a large number of nodes, making it difficult to trace the IP address and location of the user.
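The “onion” in the name refers to layered encryption: the client wraps a request once per relay, and each relay peels exactly one layer, learning only the next hop. The sketch below is a toy illustration of that structure only; the XOR “encryption,” relay names and keys are illustrative stand-ins for Tor’s real public-key cryptography:

```python
import json

def xor(data: bytes, key: int) -> bytes:
    """Toy stand-in for real encryption: XOR with a one-byte key."""
    return bytes(b ^ key for b in data)

def build_onion(payload: str, route):
    """route is a list of (relay_name, key) pairs, entry relay first.
    Wrap the payload in one encryption layer per relay, exit layer first."""
    body, next_hop = payload, "destination"
    for name, key in reversed(route):
        layer = json.dumps({"next": next_hop, "body": body})
        body = xor(layer.encode(), key).hex()
        next_hop = name
    return body  # what the client hands to the entry relay

def peel(onion_hex: str, key: int):
    """A relay removes exactly one layer: it learns the next hop,
    but nothing about the payload or the rest of the route."""
    layer = json.loads(xor(bytes.fromhex(onion_hex), key))
    return layer["next"], layer["body"]

route = [("relay_A", 7), ("relay_B", 42), ("relay_C", 99)]
onion = build_onion("GET https://example.org", route)

hop, onion = peel(onion, 7)     # relay_A learns only: forward to relay_B
assert hop == "relay_B"
hop, onion = peel(onion, 42)    # relay_B learns only: forward to relay_C
assert hop == "relay_C"
hop, payload = peel(onion, 99)  # the exit relay finally sees the request
assert hop == "destination" and payload == "GET https://example.org"
```

Because no single relay sees both the user’s address and the final destination, tracing a request requires compromising the whole route, which is why the design offers a high (though not absolute) degree of anonymity.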

In addition to child porn, the dark web is known for supporting other criminal transactions such as arms purchasing and drug trafficking. Former U.K. Prime Minister David Cameron raised the thought of banning Tor, and Tor and its defenders regularly battle false reports that its use is illegal.

But Tor serves a vital function by allowing individuals to access websites, exchange emails and post video and photos while retaining a high degree of anonymity. (Tor isn’t completely secure in this regard). While for Americans that might mean fewer targeted web ads, for someone in China, Iran or Saudi Arabia, it means there will be no 3 a.m. knock on the door.

Moreover, Tor’s role in propagating child porn likely is overstated. The U.K. Parliament’s Office of Science and Technology cited a finding by the U.K. National Crime Agency’s Child Exploitation and Online Protection Command (CEOP) that Tor hidden services play “only a minor role in the online viewing and distribution of indecent images of children.”

Anonymity tools themselves are a marketplace response to public perception of a lack of internet security. Just last week, Yahoo reported that information in 1 billion customer accounts may have been hacked and compromised. Most of this year’s presidential candidates, including President-elect Donald Trump, supported the FBI’s demand that Apple employees be forcibly deputized to unlock the cellphone used by the San Bernardino attackers. Despite debating it for several years, Congress still hasn’t extended constitutional due process protections to personal information stored in the cloud, even though users rely on web-based storage for app functionality across devices and protection against loss due to a local hard drive failure.

If the internet of things is to succeed, consumers must be confident that the information they share will be anonymized as a matter of course. When that is infeasible, they must have utmost confidence that their data won’t be shared improperly or used against them. Anonymizing tools should not be banned or demonized because the government—or industry—can’t or won’t meet a desired level of security. Banning them will not solve the problem; it will only stunt the development of the internet of things.

Image by Zapp2Photo

January promises an end to FCC’s zero rating attack


There’s a quote posted in the guidance counselor’s office at my son’s middle school: “Fairness isn’t everyone gets the same.”

Unfortunately, that is how outgoing Federal Communications Commission Chairman Tom Wheeler has defined internet policy throughout his term, typified by the adoption of network neutrality rules under which no Internet Service Provider may provide speedier transmission, error correction or any other added value to any application crossing its network.

To do so, Wheeler believed, would favor certain applications with advantages unavailable or unaffordable to others. Never mind that consumers, applications providers and ISPs all benefit when some applications, like 4K streaming video, are handled differently. When that happens, according to Wheeler, everyone isn’t getting the same thing and it’s unfair.

Now that Wheeler has said he will step down on Inauguration Day, backing off a plan to retain his chairmanship through the beginning of the Trump administration, the FCC can move quickly away from ideologically driven rulemaking to more sensible policies that confront problems that actually exist.

In the days before he said he would step down, it looked as if Wheeler, an Obama appointee, was going to take a network neutrality inquiry well into 2017. Earlier this month, the FCC notified AT&T of a preliminary finding that the company’s DirecTV Now violates net neutrality because it does not apply streamed DirecTV programming against customers’ data caps.

The pricing strategy, known as zero rating, has become popular, especially among wireless carriers that, with limited spectrum, face more network management challenges in delivering bandwidth-intensive services. Zero rating is touted in that T-Mobile ad in which a young driver must choose between streaming Ariana Grande and using her navigation app. The driver gets her music and her app without paying more! Wheeler openly questions whether these promotions are net neutrality violations.

He’s not the first. For all we know, Wheeler ultimately might decide he’s fine with zero rating, given its consumer benefits. Still, the fact that he is reviewing it at all underlines the arbitrary nature of the regulation and the “Mother, May I?” climate it has created.

Even if I thought net neutrality was a good idea, zero rating is not the first battle I would pick. Why attack a feature that gives consumers more utility for less cost? Even in the days when strict common carriage rules applied to phone service, the FCC didn’t ban toll-free 800 calling as unfair, even though a small bed-and-breakfast could hardly equal the telecom and marketing budget of a hotel chain like Sheraton. But when ideology dictates that every app gets treated the same, it doesn’t matter if the market has devised a way to foster more usage, more innovative apps and, in general, greater internet connectivity.

This is also an example of how technology regulations always lag the current market reality, as noted in my recent blog post on tech antitrust. The FCC’s network neutrality policy was formulated in the early 2000s, when most video files were 10-minute max uploads by YouTube users and massive multiplayer gaming was nascent.

In 2016, regulators are still trying to impose a one-size-fits-all regime on a medium that not only is replacing multichannel cable TV, but stands to form the foundation for everything from smart homes to autonomous vehicles to automated manufacturing. In this environment, a government policy that deliberately blocks any attempt at specialized differentiation in the pricing or handling of internet applications is downright foolish.

Returning to my son’s middle school wisdom, the full quote, attributed to author Rick Riordan, goes, “Fairness does not mean everyone gets the same. Fairness means everyone gets what they need.” Consumers need zero rating and other such strategies to get full value for their internet dollar. Audio and video streaming providers, and any advertisers that support them, need zero rating to reach a wider audience. ISPs need zero rating to be competitive. It’s that simple.

Image by Mark Van Scyoc

Michigan looks to become autonomous vehicle mecca


Michigan Gov. Rick Snyder last week signed a package of four bills (S.B. 995-998) intended to signal the state’s commitment to the autonomous future, including by opening up 122,000 miles of roadway for autonomous vehicle testing in all kinds of conditions, allowing fleets of autonomous commercial trucks to travel together at a set speed and allowing the driverless vehicles of the future to travel on nearly every roadway in the state.

Michigan first passed legislation in 2013 to allow partially autonomous vehicles to operate on specific stretches of road, but only with a driver at the wheel. This new package should put Michigan ahead of the eight other states that currently have enacted legislation for self-driving cars, as well as a few others, like Arizona and Massachusetts, that have regulated the technology via executive order.

The package also grants legal authority for an AV testing and design center, creates a Michigan Council of Future Mobility and will permit networks of driverless cars to pick up passengers on demand.

Separately, the University of Michigan is readying a 32-acre site at its North Campus in Ann Arbor to become the Mobility Transformation Center, nicknamed Mcity. As the university describes it:

Mcity simulates the broad range of complexities vehicles encounter in urban and suburban environments. It includes approximately five lane-miles of roads with intersections, traffic signs and signals, sidewalks, benches, simulated buildings, street lights, and obstacles such as construction barriers.

California has been a locus of AV testing, but Michigan thinks its four-season climate and history as home to the Big Three auto manufacturers make for a compelling case. The Michigan Department of Transportation, engaged at every level in the new plans, is already collecting digital information for the autonomous snowplows of the near future.

Image by Karsten Neglia

R Street panel on NHTSA rules for self-driving cars


When the National Highway Traffic Safety Administration released its much-anticipated self-driving vehicle guidelines, instant reaction was generally favorable. But upon further reflection, a more complicated picture emerged.

In light of the guidance, there are important questions that confront Congress, the states and industry. What new authority will Congress need to consider to oversee the development and deployment of self-driving vehicles effectively? Do the guidelines strike the correct balance between the state and federal spheres of authority? Do self-driving vehicles require heightened regulatory scrutiny?

I moderated an Oct. 11 R Street panel to consider those questions, joined by David Strickland, former NHTSA administrator and counsel to the Self-Driving Coalition for Safer Streets; Hilary Cain, director of technology and innovation policy at Toyota; Gary Shapiro, president of the Consumer Technology Association; Marc Scribner, a fellow at the Competitive Enterprise Institute; and Adam Thierer, a senior research fellow with the Mercatus Center at George Mason University. Video of the event is embedded below.

Robot scalpers and the BOTS Act


The current system for allocating tickets to popular events leaves a lot to be desired. Most tickets for the most popular shows are pre-allocated to fan clubs, VIP clubs, credit-card deals and to managers and artists, who then resell those tickets.

Other ticket resellers operating within that system have in recent years deployed “bots” – software designed to buy up tickets to “scalp,” or resell them on secondary ticket websites. The volume of tickets captured by bots is less than 1 percent of all tickets sold per year.

The BOTS Act—recently passed by Congress and on its way to President Barack Obama’s desk—criminalizes the circumvention of technical measures used by online ticket sellers to stop bots. Those who support the bill say it will level the playing field for people to buy tickets. But will it? I recently participated in a podcast discussion with Evan Swarztrauber of TechFreedom on this topic.

Resold tickets better represent the market value that fans are willing to pay to attend various events. The secondary market makes tickets available to those who desire them the most, rather than those lucky enough to click the “purchase” button at the right time. Scalpers also take on the risk of stale inventory from venues. If the tickets don’t sell, scalpers have to move their tickets at below face value, thus clearing the market and ensuring that the venue fills.

The BOTS Act is unnecessary and introduces unintended consequences. State laws and private litigation already exist to punish the use of bots. Federal legislation like this stalls innovation in technologies that look to better match ticket buyers and sellers, such as better user interfaces or ways of incorporating dynamic pricing algorithms for the hottest tickets.

So why is Live Nation pushing the bill? Probably because the BOTS Act will shift enforcement costs from the ticket-selling industry to the Federal Trade Commission. The precedent is a dangerous one: to have the federal government enforce private companies’ terms of service.

Focusing ire on bots detracts from other ways to make tickets more accessible. Artists can follow Garth Brooks’ lead and add concerts in cities based on demand. Venues can invest in virtual reality technologies so fans can experience concerts live without needing to purchase a seat. The BOTS Act would solidify the way tickets are currently sold, which harms fans.

Image by Dim Dimich

Tax reformers must take care not to kill reinsurance market


During the 2016 presidential election, one theme from President-elect Donald Trump rang louder than any other: the importance of keeping jobs in America and finding ways to encourage (or perhaps force) companies that have sent jobs overseas to bring them back. The PEOTUS suggested countless ways to do so, from imposing tariffs to rolling back regulations to renegotiating trade deals.

While some of Trump’s ideas make congressional Republicans squirm, one place of firm agreement between the Republican in the White House and those in Congress is tax reform, and for good reason. Our nation’s corporate tax code combines the highest rates in the OECD with a complex system of loopholes and deductions to create a behemoth that only serves to drive businesses (and jobs) overseas.

But in its efforts to entice jobs back to our shores, Congress must be careful not to inadvertently bring an outsized portion of insurance risk back with them. The issue at hand is a deduction for reinsurance premiums taken by U.S. companies that cede premiums to offshore affiliates.

An insurance company purchases reinsurance to help cover its losses in the case of unexpectedly high claims – for example, massive property damage in a region due to a hurricane. The ceding insurer deducts the premiums it pays as a business cost, and those premiums become income for the reinsurer receiving them. If that reinsurer is based in the United States, those premiums are part of its U.S. taxable income. If the insurance company’s losses turn out to be high and its reinsurance contract kicks in, the payouts the reinsurer makes can become a loss carried forward on future taxes. Obviously, if the reinsurer is foreign, it neither pays U.S. taxes on that income nor can carry forward any loss to reduce U.S. tax liability.
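The mechanics above can be made concrete with made-up numbers. This is a deliberate simplification that ignores the many real-world wrinkles of the tax code; the rate and premium figures are illustrative only:

```python
US_CORPORATE_RATE = 0.35  # pre-2018 statutory rate, for illustration only

def taxable_effect(ceded_premium, reinsurer_is_domestic):
    """Net U.S. taxable income created by one reinsurance transaction.

    The ceding insurer deducts the premium in either case; only a
    domestic reinsurer books the same premium as U.S. taxable income.
    """
    cedent_deduction = -ceded_premium
    reinsurer_income = ceded_premium if reinsurer_is_domestic else 0
    return cedent_deduction + reinsurer_income

# $100M ceded to a U.S. affiliate: a wash for the U.S. tax base
assert taxable_effect(100e6, reinsurer_is_domestic=True) == 0

# The same premium ceded offshore removes $100M from the U.S. tax base,
# roughly $35M in forgone tax at the illustrative rate
offshore = taxable_effect(100e6, reinsurer_is_domestic=False)
assert offshore == -100e6
assert abs(offshore * US_CORPORATE_RATE + 35e6) < 1.0
```

This asymmetry is what the proposed legislation targets; the question, as discussed below, is whether closing it is worth distorting the global risk pool.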

For more than a decade, bills have been introduced in each session of Congress to deny this deduction to U.S. insurers that purchase reinsurance from foreign affiliates. It has also made it into the president’s budget for the last five years. Proponents of this change claim that it would reap billions in tax revenue, despite outside analysis showing the federal revenues would come to only $440 million. It has been floated as a pay-for in recent tax-reform efforts, but never made it across the finish line.

This is fortunate, because proponents’ second claim is even more dangerous. Supporters of ending the deduction claim that these offshore reinsurance purchases are simply done for tax avoidance purposes, encouraging capital to fly overseas. However, this is simply not the case.

First, these purchases are screened and approved by insurance commissioners to verify that the purchase is legitimate, rather than a vehicle for income stripping. But second and more significantly, it’s incredibly important to the structure of the reinsurance system that risk is spread all over the globe. Spreading risk lets reinsurers diversify their exposure; concentration in any given geography or line of business would pose solvency concerns and require higher premiums to justify the increased risk. Under the global reinsurance system, when Louisiana is hit by a large hurricane or a devastating earthquake hits Japan, the capital needed to rebuild floods in from all over the world.

Discouraging the purchase of foreign reinsurance would obviously provide incentives for more reinsurance purchases from U.S. companies, but only by distorting the global risk pool and reducing competition. According to a study by J. David Cummins of the Wharton School, conducted with the Brattle Group, reinsurance costs could rise by between $11 billion and $13 billion as a result of this change.

It doesn’t stop supporters from continuing to press their case, however, and unfortunately for consumers, the chief architect of the legislation to end the deduction—Rep. Richard Neal, D-Mass.—soon will sit as the ranking member on the House Ways and Means Committee, which is charged with crafting U.S. tax policy. Also of concern is that House Republicans appear to be considering the proposal as part of their “blueprint” for tax reform – specifically in the context of a “border adjustment” system, in which all sales to U.S. customers are taxed and all sales to foreign customers are exempt.

Closing the outlandish number of tax loopholes riddled throughout our corporate tax code is a Herculean task, but one that is necessary to ensure lower corporate rates don’t drive up our already high national debt. However, the point of tax reform is to create a level playing field that attracts companies, jobs and products to our shores, and Congress should be wary of calls to disadvantage one segment of an industry over another.

Driving up the cost of insurance for consumers and increasing our exposure to risk from disaster is a wrongheaded way to pay for tax reform. After all, each of the companies we hope will expand their U.S. presence will need insurance. It would be a shame for them to not bring the jobs back due to higher operating costs created by the very tax reform intended to lower them.

Image by SK Design

Time for a reset breath on tech antitrust


Just as the Senate was holding hearings on the $85 billion AT&T-Time Warner merger last week, an interesting tidbit of information crossed my tablet: a story finding that Alphabet Corp.’s Google is no longer the No. 1 search engine for product searches.

A few paragraphs down in an article on VentureBeat, author Soeren Stamer reports that 55 percent of consumers use Amazon for product searches, up from 44 percent last year. The article questioned whether sites like Google that use advertising models have the correct strategy and whether internet companies like China’s Tencent, which derive revenue from transactions, might be in a better position for growth, given today’s online user habits.

Google itself was the subject of a huge U.S. antitrust case that never happened. In 2012, the Federal Trade Commission staff produced a 160-page critique of the company’s competitive practices, along with a recommendation to proceed with antitrust action. However, the commissioners declined to prosecute and instead used the report to pressure the company for changes in certain business practices.

The European Union, for its part, has not given up, and is pushing its case against the company on a number of fronts.

The Senate Judiciary Committee’s hearing on AT&T and Time Warner came in response to concerns about industry consolidation in broadband and home entertainment. To Sen. Bernie Sanders, I-Vt., the merger is anti-consumer and should be blocked. President-elect Donald Trump also criticized the merger while on the campaign trail this fall, although more recently, his transition team said the deal would be reviewed “without prejudice.”

If cooler heads prevail on AT&T-Time Warner, as they did with Google, maybe we can hope that policymakers have at last accepted the reality that, in technology, no one stays on top forever – or even for very long.

The crux of the Google complaint wasn’t so much its status as the de facto search site. It was whether Google could leverage its search dominance to the point that no other web-based ad delivery platform could be viable. The fact that more consumers now use Amazon for product searches—the searches easiest to monetize—completely upends this premise.

The antitrust rumblings about Google, of course, were just the most recent in a long history of government attacks on big tech platforms that were feared as unstoppable, but ultimately proved ephemeral. The Department of Justice spent 11 years prosecuting IBM for monopolizing the computer market. The suit lasted through the rise of distributed computing and data networking (think Digital Equipment Corp. and Hewlett-Packard) and into the early days of personal computers. Only then did the U.S. government decide that IBM, a maker of stand-alone million-dollar mainframes, did not illegally control the computer market after all.

Later came Microsoft, which was sued because the government (don’t laugh, kids) thought that its bundling of a browser with an operating system would give the company complete control of the internet. Then there was the FTC’s decision to halt Blockbuster Video’s merger with Hollywood Video because, if permitted, Blockbuster would end up dominating home video distribution and rental for decades (stop laughing, I mean it!). Finally, there was the fuss over the Sirius-XM satellite radio merger. The Justice Department dragged its feet for more than a year over fears that the combination would close off all future consumer options for streaming music and audio entertainment. (Cut it out, already! Don’t make me come over there!).

Members of the Senate Judiciary Committee expressed the same basic thoughts on AT&T and Time Warner at Tuesday’s hearing. Sen. Chuck Grassley, R-Iowa, raised “concern that this acquisition will concentrate too much power into one conglomerate [and] concern about the merger’s implications for a free and diverse press.” Yet cable TV giant Comcast has owned NBC Universal since 2011 and appears to have done nothing to rein in the pro-regulatory, anti-business editorial sentiment regularly heard on programming outlets such as MSNBC, or to shut down production of films such as “Straight Outta Compton,” which, in chronicling the rise of urban gangster rap, actually confronts issues of free expression and boasts a largely nonwhite cast.

No industry is in as much flux right now as television and video distribution. To say that a merger of a studio and a broadband provider would endanger competition and choice in home entertainment delivery is risible. Viewer habits are changing. Following the lead of Netflix, Amazon and Hulu, virtually all the major TV and cable networks now offer streaming options independent of their cable channels. A la carte programming, so long unfeasible for cable companies because of their business model, has come about via over-the-top mechanisms. Regulators need to make decisions based on present market realities. Companies like AT&T and Comcast can’t count on once-predictable revenues from multichannel cable TV to fund broadband infrastructure.

Yet if consumers can cut the cord, it means content providers like Time Warner must lay down new avenues to reach them. That’s why the AT&T deal, like the Comcast acquisition of NBCU, stands to benefit consumers rather than hurt them. Would there be as much outcry if Time Warner, rather than AT&T, had initiated the deal, or would it be seen as a savvy studio staking out the latest means of audience connection? We could see something like that next. Don’t be surprised if another major studio soon makes a bid for Verizon, or vice versa.

The government’s record on tech antitrust is poor because lawmakers and regulators fail to look past the market conditions of the moment or to appreciate that tech companies don’t fit into silos. IBM wasn’t upended by Sperry or Amdahl, its principal competitors back in the 1960s; it was undone by new companies with completely different product and technology approaches. Google’s greatest competitive threat isn’t Microsoft’s Bing; it’s Tencent, with a completely different business model. Likewise, it’s doubtful that multichannel cable will be the prevailing choice for home entertainment delivery over the next decade. In this time of flux, when consumers stand to gain from more choice than ever, the government should step back and let market forces play out.

Image by hafakot

R Street’s guide to policy events at SXSW 2017


In addition to its high-profile music and film festivals, Austin’s South By Southwest (SXSW) also hosts one of the biggest annual gatherings of technologists, activists, government officials, entrepreneurs and academics in the country. SXSW Interactive only lasts a week, but features more than 100 panels.

While SXSW offers incredible opportunities to mingle with like-minded individuals and promote free-market ideas about tech policy, it also can be a difficult landscape to navigate. How does one choose among such panels as “What Do Unicorns, Smallpox & Beer Have in Common?” or “How To Hijack the Pizza Delivery Drone“?

Each year, we publish a guide offering our assessments of the best policy-focused events. Check it out below. A (★) indicates a member of Congress will be participating. An (R) designates an R Street policy expert is a panelist.

March 10, 2017

March 11, 2017

March 12, 2017

March 13, 2017

March 14, 2017

March 15, 2017

March 16, 2017

March 18, 2017

We will update this list as more panels are confirmed. If yours isn’t listed, feel free to shoot me an email.

Image by Alfie Photography

It’s time for real innovation in U.S. labor law


Over the next few days, I am sure there will be a fair amount of ink spilled over Andy Puzder, President-elect Donald Trump’s designate to be secretary of the U.S. Labor Department. From what I have read—and I don’t know the man—he seems like a good choice.

He does have a significant road ahead of him. Our labor laws need significant updates to better protect workers while facilitating new business models.

I think he may be up to the job, because one thing I do know for sure: Puzder is an innovator. He is the man who spearheaded the invention that I think may be the greatest achievement of the fast-food industry in modern history: the Philly Cheesesteak Thickburger. This sandwich is an epic American innovation similar in magnitude to the practical electric lightbulb, the Apollo program, internet protocols and the Pumpecapple Piecake. For those who have not already experienced its greatness: Puzder’s company invented the idea of using meat as a topping for meat. Its fantastic invention consists of a one-third-pound burger with a full cheesesteak as a topping.

Someone capable of this level of innovation also ought to be open to other new ideas, such as the concept of labor law waivers that I’ve helped the great labor leader Andy Stern to develop. We outlined the concept for The Washington Examiner a few months ago and have a longer article on it coming out soon in National Affairs. As we describe:

Federal lawmakers should [authorize] waivers from federal labor laws, similar to those granted under sections 1115 and 1119 of the Social Security Act, which allow state experiments with Medicaid, Medicare and other benefits programs. In effect, Congress should clear the way for wide-ranging experiments with new business models, responsibilities and roles for unions, employers, government and workers alike.

An innovator capable of creating the Philly Cheesesteak Thickburger should eat up this idea for true policy innovation.

The insidious campaign of ‘soft’ prohibitionists


This week’s Repeal Day (Dec. 5) marked 83 years since the dreadful social experiment of Prohibition finally was killed.

We think we live in strange times today, but what a weird era that was. Politics makes strange bedfellows, and it was an odd coalition that lobbied for Prohibition — some women, Christian fundamentalists, nativists, racists, progressives, capitalists, socialists and health nuts. With few exceptions, these reformers all believed drinking was bad. Respectively, these anti-alcohol crusaders declared that alcohol made men irresponsible; humankind impious; immigrants uppity; Blacks violent; humanity retrograde; workers lazy; the proletariat imprisoned in false consciousness; and human bodies, sick.

Thus far, it does not appear the old heavy-handed Prohibition will rise from the grave in America. So we can lift our glasses in toast to that.

But all the news is not good news, friends of the cup. Let me mention a few dour matters:

  • While Prohibition lasted just 13 years (1920 to 1933), its deleterious effects linger still. We see it when we try to order a bottle of wine online, only to find it cannot be shipped to our home state. (The 21st Amendment, which repealed Prohibition, permits states to erect barriers to the booze trade.) We see it when we visit a distillery and find we cannot buy hooch directly. (Instead, under the three-tier system, the distiller must sell it to a distributor, who must sell it to a retailer, from whom you can purchase it.) These are affronts to the liberty of producers, retailers and drinkers alike, and we should keep pushing to wipe irrational drink restrictions from the books.
  • Outside of America, there are places where movements toward prohibition are rising. Prohibitionist regimes can be found in Africa, Asia and the subcontinent, bringing bans on drink. To cite just one example, Sunday’s New York Times Sunday Review carried a piece titled “Pakistan’s Drinking Problem.” The problem is not the drinkers, but the policies. The author writes:

In the province of Sindh, where I live, licensed shops, usually called wine stores, have operated even since prohibition. The stores are supposed to sell only to non-Muslims, but they don’t discriminate. Owners have to pay off the police, though, and any dispute can result in the shops having to close down … In late October, a High Court judge ordered the closure of all these stores after accepting a petition that said alcohol is prohibited not only in Islam but in Christianity and Hinduism, too. This ban means that only those who can afford imported liquor will keep buying from a flourishing network of bootleggers. Others will have to buy one of the many versions of moonshine brewed all over the country, which routinely blind and kill consumers.

  • This leads me to my third bit of grim news: a soft form of prohibitionism is rising in Europe and creeping into our nation. Its proponents deny they are against drinking per se, but they advocate measures that push beverages further and further from consumers. They promote shortening the hours during which bars and stores can sell; stopping companies from advertising drinks; raising beverage taxes; and setting high minimum prices. All of this is done with the explicit objective of reducing the amount of alcohol consumed by everyone.

Unlike the hard Prohibition that struck us a century ago, this soft prohibition does not have adherents from all over society. Rather, it is an elite-driven phenomenon, mostly composed of people like public health researchers and socioeconomic development wonks.

The strategy deployed by these soft prohibitionists is to tar alcohol as inherently dangerous: alcohol=death. “There is no safe level of drinking,” declared Dame Sally Davies, the United Kingdom’s chief medical officer. One can get a sense of their worldview from this rant posted on a Listserv by a Boston University medical professor. He was steamed that the surgeon general’s report insufficiently demonized alcohol.

[N]ot once does the report inform readers that alcohol consumption itself causes cancer of the mouth, pharynx, esophagus, colon and rectum, and larynx. Instead, the report tells the public that only alcohol ‘misuse’ – in other words, ‘irresponsible’ drinking – can result in these cancers. This is the party line taken directly from Anheuser-Busch. The alcohol companies want the public to believe that there are absolutely no potential health consequences of alcohol use unless a person drinks irresponsibly (i.e., underage or binge drinking). The Surgeon General’s report goes a long way to reinforce this myth.

If this is not neo-prohibitionism, I do not know what is. Indeed, the soft prohibitionists frame alcohol solely in terms of cost and danger. You won’t find in their studies any attention to the positives that drink can bring: a Champagne toast at a wedding; a fun beer-soaked night on the town with friends; belts of whiskey at a wake. The neo-prohibitionists would deny us these moments and memories thereof, for our own good.

Bad news aside, we the thirsty really are living in the best of times. We have more and better choices of fermented and distilled drinks available than ever before. And yet, per capita, we are drinking less than we did 40 years ago. (So much for the neo-prohibitionist insistence that increased consumer access will lead to more or worse drinking.) But we must remain vigilant, because the freedom to consume might be stolen. And it wouldn’t be the first time.

Image by Everett Historical

Regulators’ vaping crackdown should concern other innovative industries

On the Food and Drug Administration’s webpage, Commissioner Robert M. Califf asserts “a successful FDA is a critical factor for better public health in this changing world” and that the organization is “committed to strengthening programs and policies that enable the agency to carry out its mission to protect and promote public health.”

Of course, if the promotion of public health is the agency’s mission, one wonders how the regulatory approach delineated in the deeming rules fosters tobacco harm reduction for adult consumers. Indeed, even a cursory read of the nearly 150-page document gives little reason to believe the FDA is interested in hewing to a consistent or rational policy of health promotion.

The text informs us the deeming rules were drafted in order to “reduce the death and disease from tobacco products.” Excellent. But the FDA’s insistence that it classify electronic cigarettes alongside their vastly more harmful cousins is baffling in light of the later claim that the “FDA recognizes that completely switching from combusted cigarettes to [electronic cigarettes] may reduce the risk of tobacco related disease for individuals currently using combusted tobacco products.”

Perhaps more shocking, however, is the following statement:

Although FDA is not required to meet a particular public health standard to deem tobacco products, regulation of the newly deemed products will be beneficial to public health… Over time, since the ‘‘appropriate for the protection of the public health’’ standard involves comparison to the general tobacco product market, FDA believes the employment of the premarket authorities could create incentives for producers to develop products that are less dangerous when consumed, less likely to lead to initiation of tobacco use, and/or easier to quit.

If you are rubbing your eyes in disbelief, stop. You read this correctly. The FDA here is claiming that it has set no public-health standard for its preemptive regulation of e-cigarettes, among other products. This renders the FDA’s equivalence among all tobacco products contradictory. From a public-health standpoint, the FDA’s purposeful obscurantism is far worse than contradictory; it’s downright harmful.

In the absence of a standard, the crusade against tobacco products can continue as long as the FDA wishes. While the tobacco giants and the thousands of small-business vape shops around the country have few public defenders, whenever the government justifies regulatory overreach on such slippery pretenses, it behooves all American industries to step in and say something. Kicking around tobacco these days is fashionable, and the ease with which imprecise justifications can be used to strangle an industry is enticing for anyone looking to score easy points. Watch out, Silicon Valley: you’re next.

CBO confirms OPEN Government Data Act won’t cost taxpayers


The bicameral, bipartisan Open Public Electronic and Necessary (“OPEN”) Government Data Act will not cost taxpayers a dime, according to new scoring analysis from the Congressional Budget Office.

The bill, which was moved to the Senate floor by the Homeland Security and Governmental Affairs Committee back in May, is closely modeled on President Barack Obama’s 2013 Open Data policy. If enacted, it would require all federal agencies to publish their data in a machine-readable, open and nonproprietary format and to use open licenses.

The bill also directs agencies to find innovative uses of their collected data and to adopt consistent best practices for open data, all of which will be housed on a central federal website. Not only will the bill modernize our agencies, it will also make it easier for citizens to learn how their agencies operate without having to visit several different websites.

Most importantly, the bill will help taxpayers discover how and where their money is being spent. That could help avoid costly mishaps like the Department of Veterans Affairs scandal of 2014, in which, due to a tremendous caseload, many patients were not seen at all or were forced to wait longer than the mandated 14-30 day target, and many died while on the waiting list. The scandal might have been avoided if the VA’s data had been available for citizens and lawmakers to scrutinize.

Critics of the bill often have said, in effect: “If it ain’t broke, don’t fix it.” But the proprietary Data Universal Numbering System (DUNS), administered by Dun & Bradstreet Inc., is broken. Any taxpayer who wants to see grantee or contractor information currently must purchase a license to use DUNS. If the government is going to be transparent, it must switch to a nonproprietary identifier. As Hudson Hollister, executive director of the Data Coalition, wrote in an op-ed for The Hill, there already is a nonproprietary alternative to DUNS called the Legal Entity Identifier, or LEI:

The LEI has already been adopted by government agencies in dozens of countries….A study by the General Services Administration’s 18F technology team shows that the DUNS Number could be replaced with a temporary code with the same number of digits, linked to the LEI – which means no expensive system upgrades.
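The replacement scheme the 18F study describes — a placeholder with the same nine-digit width as a DUNS number, linked behind the scenes to a 20-character LEI — can be sketched as a simple lookup. The temporary code, the mapping and the LEI-format string below are all invented for illustration; they are not real registry entries:

```python
# Hypothetical mapping: a 9-digit temporary code (same field width legacy
# systems expect from a DUNS number) resolves to a nonproprietary
# 20-character Legal Entity Identifier (LEI).
lei_by_temp_code = {
    "000000001": "5493001KJTIIGC8Y1R12",  # illustrative LEI-format string
}

def lookup_lei(temp_code: str) -> str:
    """Resolve a legacy-width temporary code to its linked LEI."""
    # Enforce the 9-digit width so existing systems need no schema changes.
    if len(temp_code) != 9 or not temp_code.isdigit():
        raise ValueError("temporary code must be 9 digits, like a DUNS number")
    return lei_by_temp_code[temp_code]

print(lookup_lei("000000001"))  # prints the linked 20-character identifier
```

The point of the study’s design is visible in the width check: because the stand-in code keeps the DUNS field size, agencies avoid expensive system upgrades while the underlying identifier becomes open.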

Back in May, R Street joined with a broad coalition that includes tech companies, open-data advocates and free-market groups to press the case for reform. By setting a presumption that the government must use nonproprietary data standards, the OPEN Government Data Act will eliminate monopolies like the DUNS system and give taxpayers full access to download and use the data that they’ve already paid for.

Image by corund

The housing bubble renewed?


Average U.S. house prices are back over their 2006 bubble top, as measured by the Case-Shiller Home Price Indices. “Home Prices Recover Ground Lost During Bust” read the Wall Street Journal headline.

But these prices are in nominal dollars, not inflation-adjusted dollars. While the Federal Reserve assures us that inflation is “low,” it tirelessly works to depreciate the dollar. Over the decade since the housing bubble peak, aggregate inflation has been 19 percent, so 2016 dollars are worth 84 cents compared to 2006 dollars.

House prices since 1987 in nominal terms look like this:


In inflation-adjusted terms, the chart is different. Average house prices in real terms are indeed very high, but still 16 percent below their bubble top. They have reached the level of March 2004, when the bubble was well advanced into exuberance but not yet at its maximum, having completed about 54 percent of its 1999-2006 run.

From 1999 to 2004, real house prices increased at an average rate of 7.2 percent per year. In our renewed house price boom from 2012 to now, real prices have increased at 6.6 percent per year—pretty similar.
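The deflation arithmetic above can be checked in a few lines; all figures are the article’s own:

```python
# Check the article's inflation adjustment of nominal house prices.
cumulative_inflation = 0.19  # aggregate inflation, 2006 -> 2016, per the article

# A 2016 dollar is worth about 84 cents in 2006 dollars:
real_value = 1 / (1 + cumulative_inflation)
print(round(real_value, 2))  # 0.84

# If nominal prices are just back at the 2006 peak (index = 100), the
# real (2006-dollar) level is still about 16 percent below that peak:
real_index = 100 / (1 + cumulative_inflation)
print(round(100 - real_index, 1))  # 16.0

# Average annual real growth cited: 7.2% (1999-2004) vs. 6.6% (2012 on).
```

This is why the nominal headline (“prices recover ground lost during bust”) and the real picture (still a sixth below the top) can both be true at once.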

All this is depicted in Chart 2:


The Federal Reserve had reduced short-term interest rates to very low levels in 2001 to 2004, which fed the bubble. In 2004, it started to raise them. The house price run up since 2012 has also been fed by extremely low interest rates. Now the Fed must raise rates again and is getting ready to do so. Long-term mortgage interest rates have already increased sharply.

Should being back at 2004 levels in real terms worry us? Yes.

Image by Cranach

Kevin Kosar on Federal News Radio: How do we draw a line between government information and propaganda?

Governance Project director Kevin Kosar discussed executive branch propaganda with Federal News Radio’s Tom Temin. Their conversation focused on the new R Street Institute report, Government Information and Propaganda: How to Draw a Line?, which was authored by Dr. Kosar and Louisiana State University professor John Maxwell Hamilton. Over the past year, the Department of Labor and the Environmental Protection Agency both have been caught running public relations campaigns for policies they want enacted. These are troubling misuses of taxpayers’ dollars, and such instances are likely to continue unless Congress and agencies enact reforms.

More consumer choice in electricity should be on policymakers’ holiday wish lists


As we head into the holiday shopping season, consumers should consider adding one more item to their lists: an alternative electricity supplier.

The process of switching energy providers long has been considered difficult or opaque, and many customers haven’t understood the value proposition. But new digital technology is rapidly changing this equation, as sleek customer-facing tools make electricity shopping a breeze.

Retail electricity customers—homes and businesses—can choose their provider in more than a dozen states and the District of Columbia. These areas have adopted a policy known as “retail choice,” part of a broader effort to restructure the electricity industry over the past two decades. Renewed interest in the topic—largely stemming from gripes over the monopoly utility model—has prompted states like Nevada to consider adopting or expanding choice.

To date, the biggest customers happen to be the biggest fans of retail choice. Within retail choice states, roughly half of commercial and industrial demand has switched to competitive suppliers, with small companies less likely to do so. Consumers representing about one-tenth of residential demand have done the same.

This comes as little surprise, as the financial benefits of switching suppliers are proportional to a customer’s size. In 2014, the average industrial customer’s monthly electricity bill was more than $7,000, compared with $114 for a residential customer. If switching providers saves each customer 10 percent, then the industrial customer saves $700 per month and the residential customer only about $11. The former is enough to motivate sizable businesses to research and pursue alternative suppliers. But to save the cost of a pizza every month, the process for residential customers would have to be fairly hassle-free.
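The proportionality argument is simple arithmetic, using the 2014 average bills cited above and an assumed 10 percent savings rate:

```python
# Monthly savings from switching scale directly with bill size.
industrial_bill = 7000    # $/month, average industrial customer (2014)
residential_bill = 114    # $/month, average residential customer (2014)
savings_rate = 0.10       # assumption: switching saves 10 percent

industrial_savings = industrial_bill * savings_rate    # ~ $700/month
residential_savings = residential_bill * savings_rate  # ~ $11.40/month

print(industrial_savings, residential_savings)
```

The same percentage saving is roughly sixty times larger in dollars for the industrial customer, which is why transaction costs that a large business shrugs off can wipe out the residential customer’s entire benefit.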

No surprise, some customers don’t find it worth the hassle. Economists call the time and effort of switching providers “transaction costs.” These include gathering information, evaluating providers and offers and making necessary arrangements with a new company (e.g., paperwork and communications). Fortunately, the digital revolution is helping to remove these costs from the equation.

The introduction of customer choice initially led to some customer confusion and distrust over the process to switch providers. Many small, less-sophisticated customers were initially duped by misleading and fraudulent advertising, undermining the credibility of providers’ offers. Small customers may also lack the information or financial savvy needed to make a fully informed decision. As evidence, low-income customers are more likely to pick alternative suppliers who charge more than the incumbent utility. Some small-business owners have struggled to understand retail electricity options and employ brokers to assist them, raising their cost to switch providers.

After early struggles with small customers, states recognized the educational hurdle of transitioning to a choice paradigm. They launched public education campaigns. State public utilities commissions—such as the Illinois Commerce Commission—developed online tools to help consumers compare price and service options across qualifying suppliers.

Some states took it a step further. In Texas, for example, customers can use a tool that narrows the list of suppliers based on a customer’s input preferences. Improved customer confidence and understanding helped contribute to a dramatic uptick in residential customer choice accounts, which more than doubled between 2009 and 2013 alone.

The greatest prospects for customer engagement emanate from the private sector, which has incentive to legitimize the industry’s reputation and boasts nimbler means to engage customers. This motive has begun to mesh with the latest automated customer-facing technology, and the results could be transformative. When you sprinkle in shifts in consumer preferences, you have a recipe for customer engagement that resembles those that already have swept telecommunications and transportation.

The “Uber-ization” of the electric industry has created digital markets that stimulate demand for customer sovereignty and product customization. Customers in digitized markets have proven adept with new technologies that provide flexibility, improve quality, lower transaction costs and accommodate choice among a diverse customer base. For example, digitization can offer low-income customers more flexible payment options; fixed-income customers may choose more rate stability than working customers; techies may pursue automated usage control (e.g., via smart appliances); and green-minded customers may purchase green energy and customize their electric usage to minimize environmental impact.

The traditional monopoly utility model—marked by rigid rates and service terms—has become archaic. Customer engagement innovation is leaps and bounds ahead in retail choice jurisdictions, where entrepreneurs are continuously driving product innovations.

Zentility is one such tech pioneer transforming the customer experience. Its service monitors competitive electricity providers around the clock and automatically switches the customer to the best deal available. The traditional broker approach is often too manually intensive to provide such continuous market evaluation affordably, which is where automation changes the game. By creating a fully automated energy decision engine, Zentility keeps overhead costs minimal (no need for a large staff or corporate office). This allows the service to charge fees that are 50 to 90 percent lower than those of traditional brokers or existing aggregator platforms. Increasing numbers of small- and medium-sized businesses see this and have jumped on board.

To sweeten the deal, Zentility monitors and manages a customer’s electricity consumption. Zentility makes usage recommendations after analyzing a customer’s past and current consumption patterns. Eventually, this learning function could enable the platform to provide advice on building improvement investments, like energy-efficiency upgrades or solar-panel installations.

Perhaps most importantly, an app-based platform is exhaustive but not exhausting. In fact, it’s a remedy for the customer headaches associated with doing it yourself or coordinating through brokers. It asks basic questions to optimize the provider choice based on the customer’s preferences, such as contract length, budget goals (e.g., bill-stability requirements) and any green-energy preference.

The app-based model provides superior service at a fraction of the cost. Competitive markets breed such entrepreneurial breakthroughs. Yet technology entrepreneurs will only engage consumers if policymakers enable choice; companies like Zentility are shut out under the monopoly utility model.

Digital engagement elevates the value of electricity choice. This propels the case for competitive electricity reforms in states where consumers remain handcuffed to a monopoly utility. A retail choice resurgence should be on policymakers’ holiday lists.

Image by Inara Prusakova

What’s the outlook for legal sports betting in the Trump era?


With the state-level battles over daily fantasy sports (DFS) largely settled, the online gambling industry, along with professional and recreational sports bettors, are speculating how the incoming Trump administration and Republican-controlled Congress will respond to growing calls for greater legalization of sports betting, particularly in the online sphere.

In 2015 and early 2016, there was a flurry of inquiries and accusations by numerous state attorneys general as to whether DFS constituted a form of illegal online sports betting. In the end, only a few states pushed ahead with prosecution or legislative bans. That relatively libertarian outcome has raised hopes that 2017 will see progress on a rewrite or repeal of the 1992 Professional and Amateur Sports Protection Act (PASPA), the federal law that limits legal sports betting to four states: Nevada, Oregon, Delaware and Montana. PASPA, the Unlawful Internet Gambling Enforcement Act (UIGEA) and the Wire Act are the only major federal laws that specifically regulate or prohibit gambling. All other gambling laws and regulations are at the state level.

Still, debate over legal gambling divides the GOP. PASPA and UIGEA, which effectively banned internet poker and casino-type games in the United States, were products of Republican lawmakers. On the other hand, while internet gambling in general did not come up during the Republican primary debates, most candidates—including Jeb Bush and New Jersey Gov. Chris Christie—were cool to the idea of banning DFS.

There’s already been some discussion as to how the new administration might approach sports betting (examples here and here). There are reasons for sports bettors to be encouraged and disheartened. Here are some of them:

Why legal sports betting will expand

  • Both the federal and state governments need sources of revenue

The federal debt has hit $20 trillion. Entitlement spending continues to grow. Even if the Affordable Care Act is repealed or changed, it is likely to be a year or two before there would be any potential impact on Medicare and Medicaid costs. President-elect Donald Trump has proposed a $1 trillion infrastructure plan while promising to reduce marginal tax rates on individuals and businesses. Meanwhile, states and cities are bumping up against their own fiscal issues, especially in regard to pension obligations. Gambling revenue becomes increasingly attractive as a solution to budget challenges.

Legalization of sports betting would bring what a National Basketball Association report claims could be as much as $380 billion a year in underground wagering activity out of the shadows. Even if that figure is inflated, other research has shown that Americans wager $95 billion a year on professional and college football and $9 billion on the annual National Collegiate Athletic Association’s men’s basketball tournament alone.

To be sure, a portion of this money makes up office pools and friendly wagers. A fraction is handled by sports books in Nevada, the only state where wagering on individual games is legal. Yet that fraction produced $224 million in revenue for Nevada casinos in 2014. Global gaming research firm GamblingCompliance projects that a fully developed legal American market—where bets are placed at casinos, online and at retail bookmaking shops—would produce $12.4 billion in annual revenue, all of which would be subject to taxation.

The $70 million Colorado raised in 2015 from taxing marijuana sales has shown states that there is genuine economic benefit to legalizing and taxing activities that otherwise law-abiding citizens enjoy in private. Greater legalization will reduce the risks consumers take when they place illegal bets. It will also encourage offshore sports books to set up operations in the United States. Currently, these sites take billions in action from U.S. bettors, but pay taxes on those earnings to other governments.

In addition, as sports books would be operating in the daylight, wagering activity would be documented and auditable by third parties. Like casinos, legal sports books would be obliged to report big payoffs through W-2G forms filed with the Internal Revenue Service.

  • Trump has a background in the casino business

While not as successful as international casino magnates Steve Wynn or Sheldon Adelson, Trump—who built casinos in Atlantic City and Las Vegas—has no record of moral opposition to gambling as a business. There’s no reason to believe that expanding prohibition is on his agenda, or that he would be opposed to greater liberalization, especially if it can create or increase revenue streams for his economic agenda.

  • Momentum

Now that the popularity of DFS has shown bans can be politically problematic, some see 2017 as an opportunity to overturn PASPA. DFS also cracked the wall of uniform and adamant opposition that professional sports leagues have had to legal sports betting for decades. NBA Commissioner Adam Silver openly supports legalized sports betting, acknowledging that it generates interest—and TV viewers—for games outside local markets. Major League Baseball and the National Hockey League are sending mixed signals, embracing DFS but stopping short of endorsing legal wagering on games. Only the NFL remains steadfastly opposed to expanding sports betting, despite the fact that two team owners own stakes in DFS companies and that the league’s own cable channel, NFL Network, promotes DFS and “game picks” during its regular programming.

Why legal sports betting won’t expand

  • Republican opposition in Congress

Republicans will control both houses of Congress in 2017, but gambling is a social issue that divides the party. The only meaningful legislation introduced this year, a bill to “harmonize” PASPA, the Wire Act and UIGEA, came from Rep. Frank Pallone, D-N.J., ranking member of the House Committee on Energy and Commerce. That bill went nowhere. Any new gambling legislation is likely to go through the Energy and Commerce Committee, which is chaired by Fred Upton, R-Mich. Upton himself has consistently voted to keep internet gambling out of the United States. In addition, Vice President-elect Mike Pence, who will preside over the Senate, has come out in the past against online poker.

Prospective opposition, however, must be balanced against the fact that the Republican Party dropped opposition to online gambling from its 2016 platform. Also, the proposed Restoration of America’s Wire Act (RAWA), which would have reversed the 2011 Department of Justice (DOJ) interpretation that the Wire Act does not ban in-state online gambling, gained no traction despite support from Sens. Lindsey Graham, R-S.C., Tom Cotton, R-Ark., and Mike Lee, R-Utah.

  • Jeff Sessions as attorney general

Enforcement of the Wire Act now falls to Jeff Sessions, who stands to be the most conservative attorney general on social issues since John Ashcroft. It was a memo from President Barack Obama’s Justice Department that opened the door to intrastate internet poker in Nevada, New Jersey and Delaware. Although it would be a daring move that probably would lead to mountains of lawsuits, there’s nothing to stop Sessions from reversing that decision.

Trump appears to have nominated Sessions for his hard-line record on immigration, and that stands to be the department’s priority at the outset. The rest comes down to how much freedom Trump gives Sessions to set an agenda. An early signal might be the amount of resources given to enforcement of existing gambling laws, particularly crackdowns on illegal movement of cash to offshore sites. The feds like to score public relations points around the Super Bowl and March Madness. The scope of these operations and the quality of the targets they take down could be a good barometer as to the administration’s attitude toward gambling expansion.

Image by wavebreakmedia

Moving forward on unwinding proxy access


The shareholder empowerment movement consists of activists who advocate shifting corporate decision-making authority to shareholders, and thus away from boards of directors and executive management. In effect, this means allowing uninformed shareholders to interfere with the decision making of the most informed locus of corporate authority.

Among the expected effects of such shifts are suboptimal board and executive decision-making, fewer successful companies willing to become or remain publicly traded and constraints on society’s ability to create economic wealth.

The aspirations of the shareholder empowerment movement—whose members include public pension funds, labor-union-related funds and the ever-growing number of individuals who feed off the trough of such activism—are most apparent in the movement’s advocacy of shareholder proposals that seek to implement proxy access. Under proxy access, certain privileged large shareholders—primarily institutional investors who can meet the standard three-year holding period and 3 percent ownership threshold—may have their own slate of director nominees included in a public company’s proxy materials (the proxy statement and voting card), whether or not the company’s board of directors approves.

But there is a fundamental problem with proxy access. Even the large institutional investors who are eligible and most willing to participate in the process tend to be no more informed about the companies in which they invest than the average investor. That is, no matter how well the corporate governance departments of these eligible investors have memorized the latest principles of what is considered good corporate governance, they do not have enough inside information to decide who should sit on the boards of the thousands of large and complex companies in which they hold a stake.

Typically, the only locus of corporate authority that has this vital information is the board itself and its nominating committee. The board understands how current members interact, both with each other and with executive officers. They know what kind of background and experience would enhance board decision making and what kind of personality the nominee should have in order to meld well with current board members and executive management.

The board, as an informed locus of authority, makes proxy access both inefficient and unnecessary. As I put it in an earlier post:

On efficiency grounds, proxy access may have value where the investor base is made up primarily of informed investors. It does not make any practical sense in the context of a large public company with thousands of shareholders, both institutional and retail, who overwhelmingly are uninformed about critical aspects of how the companies they invest in operate or are managed.

When a company does actually suffer from large agency costs or inefficient decision making due to its current board composition, it isn’t institutional investors who will save the day through proxy access. Rather, it’s the informed investors—such as activist hedge funds and those who make either friendly or hostile bids for corporate control—who will be able to react quickly enough to reduce those inefficiencies before it is too late.

For those in the shareholder empowerment movement, having proxy access is a great negotiating tool when dealing with a company’s board and executive officers. It’s great if you are looking to expand employee benefits or increase wages. It’s great if you are looking to present yourself as a populist political leader fighting the evil corporate interests on topics like climate change and sustainability. But for the core question that should preoccupy every board—how to enhance shareholder value—it has no relevance.

Fortunately, the U.S. House of Representatives already has taken a first step that could help to unwind some of the recent damage. In the Financial CHOICE Act of 2016, a bill that has been approved by the House Financial Services Committee, there is a provision to repeal the Securities and Exchange Commission’s authority to issue rules on proxy access. Specifically, this would mean repealing Section 971 of the Dodd-Frank Act.

Moreover, the SEC should soon be back to its full complement of five members. However, for the first time in many years, the commission will most likely be composed of three Republicans and two Democrats. When that occurs, the SEC should be positioned to modify Rule 14a-8(i)(8), the election exclusion rule for shareholder proposals, and thereby return the SEC to its traditional position on proxy access, i.e., providing the board with discretion to omit shareholder proposals on proxy access from a company’s proxy materials.

Image by Imagentle

Yesterday we talked about Congress reclaiming the power of the purse — a bit


I co-direct the Legislative Branch Capacity Working Group, a nonpartisan gathering of scholars and congressional staff that aims to “make Congress great again.” We meet each month on the Hill to discuss aspects of congressional capacity, and to commission and produce research on the subject.

So, obviously I was REALLY delighted to be invited to testify at this House Oversight and Government Reform subcommittee hearing yesterday, which considered Congress’ power of the purse.

It’s a big topic, and we focused on just a sliver of it — agencies’ authority to collect money from the public in the form of user fees, fines and the like, and then to spend it. The president’s FY2017 budget reports the government collected $516 billion from the public this past year.


We also discussed a potentially major piece of legislation, H.R. 5499, the Agency Accountability Act of 2016. Rep. Gary J. Palmer, R-Ala., introduced this concise piece of legislation, which would require agencies to turn over fees, fines, penalties and settlement proceeds to the U.S. Treasury, allowing Congress to choose whether to reappropriate them (or not).

A big problem with Congress over the past century is that it frequently has delegated its fundamental powers to the executive branch. That has weakened our separation-of-powers system. Allowing agencies to collect fees and spend them, as I noted in my testimony, is not a new thing. The very first Congress authorized customs officials to pay themselves from the duties they collected from ships using American ports. But the more often Congress allows agencies to do this, the more difficult it becomes to oversee and control the executive branch.

One of the huge takeaways from this hearing is that Congress does not really know which agencies are charging fees, how those fees are set (rarely are the fee levels written into law), and how much freedom agencies have to spend the fees.

An issue I brought up is that the very complexity of the current budget makes parsing the data difficult. Money flowing from the public is put under various categories and subcategories: receipts versus offsetting receipts and offsetting collections, fees, duties, taxes, penalties and fines, settlements, etc. The complexity is so great that very few folks (beyond appropriators and other budget wonks) can comprehend what’s what, which is a problem for self-governance. (To appreciate how baffling federal budgeting is, have a quick look at pp. 5–6 here. Careful — your head may throb.)

Again, agencies’ authority to collect and spend fees is just one aspect of the power of the purse. Nonetheless, it is central to our constitutional system, and it is heartening to see our legislature taking up the issue. Hopefully, the 115th Congress will study the subject further and come up with reforms.

Consumer Review Fairness Act heads to Obama’s desk


The U.S. Senate this week gave unanimous consent to H.R. 5111, the Consumer Review Fairness Act, legislation that would protect reputational information online. Having previously passed the House in September, the bill now moves to President Barack Obama’s desk.

Reputational information is essential to functioning online markets. Every day, countless Americans use consumer review sites like Yelp, TripAdvisor and Zenefits to share their experiences and opinions on the businesses and services upon which they rely. These reviews have become instrumental in markets as diverse as restaurants, hotels, pet shops and even physicians.

Such rating systems have made big cities feel like small towns by helping consumers confidently seek out retailers through online word-of-mouth. Indeed, nearly 70 percent of customers rely on these online reviews by their peers before making a purchase.

Unfortunately, some unscrupulous vendors have tried to silence dissatisfied customers by burying nondisparagement clauses in terms of service agreements or claiming a copyright in information shared about a given business. The CRFA strengthens free-speech protections by prohibiting businesses from using certain clauses in their contracts to deter honest feedback from consumers. When signed into law, the bill will render void any contract provision either that prohibits customers from writing, speaking or otherwise communicating honest reviews or that imposes a penalty on those who do.

In order to avoid unintended consequences, the legislation wouldn’t apply to nonform contracts, such as those between employers and employees. Nor would it impede a vendor’s legitimate grounds for removing content, such as reviews that are unlawful, false, misleading or confidential.

Given the bill’s swift passage, members also should consider action on H.R. 2304, the Speak Free Act, introduced by Reps. Blake Farenthold, R-Texas, and Anna Eshoo, D-Calif. That measure—the subject of a hearing by the House Subcommittee on the Constitution and Civil Justice earlier this year—would strengthen consumer protections against “strategic lawsuits against public participation.” R Street joined a broad coalition—from industry groups like the Consumer Technology Association to nonprofit advocates like TechFreedom and Public Knowledge—to encourage Judiciary Committee Chairman Bob Goodlatte, R-Va., to move the bill through his panel and to a vote on the House floor.

Image by Joseph Sohm

Ohio wants to be testing ground for self-driving cars


The world knows Ohio as home of the Wright brothers and the first man to step on the moon. But locals also know it as home to aviatrix Jerrie Mock, the first woman to fly around the world solo. She began the 23,000-mile journey at Columbus’ airport, recently named for John Glenn, another Ohioan famous for blasting through barriers to human travel.

The Buckeye State opens a new chapter in transportation history this week, when driverless trucks will power up and down a 30-plus-mile stretch of divided highway that connects Columbus’ northwestern suburbs with the Transportation Research Center, a national facility that does independent automotive testing. A separate set of tests will be run this week or next on the Ohio Turnpike. As announced by Ohio Gov. John Kasich, a string of sensors, a fiber-optic cable network and other investments all will be part of the state’s plan to become a center for autonomous-vehicle research.

It is a notable change of course for the governor, as practically his first act in office was to kill high-speed rail plans, citing the state’s budget instability.

In addition to Kasich’s support of this new technology, a meeting organized in mid-November by state Reps. Cheryl Grossman, R-Grove City, and Bill Reineke, R-Tiffin, drew interested parties from manufacturers, auto companies, universities, chambers of commerce, transportation network companies and many other fields. Columbus earned a $40 million “smart cities” grant from the U.S. Department of Transportation that will underwrite part of its campaign to become a national center for autonomous-vehicle research on actual streets and highways, in addition to another $10 million grant from Paul G. Allen’s Vulcan Inc. to transition to an electrified, low-emissions transportation system.

In fact, Ohio has several advantages for advanced transportation research, including the state government’s network of partnerships with tech companies and universities, its history with auto and auto parts manufacturing and its four distinct seasons of weather, which are needed to test how vehicles respond to suboptimal conditions.

This is evidence we in the Midwest are making good progress on transportation freedom – also seen in recent legislative, regulatory and (most recently, in Chicago and Milwaukee) legal decisions to liberalize the rules around transportation network companies like Uber and Lyft. Advances in commercial transportation have the potential to save thousands of lives by avoiding crashes; it’s helpful to have a robot driver who never fails a drug test.  When the research produces the necessary reliability, Ohio may play a significant role in the process.

Image by Chesky

What Joe Maddon can teach Illinois lawmakers about competition


Illinois policymakers are considering a sprawling, damaging and wasteful energy bill that seeks to bail out two unprofitable Exelon Corp. nuclear plants. This massive 446-page piece of legislation—dubbed the Future Energy Jobs Bill, S.B. 2814—must be rejected quickly.

The bill reflects a log-rolled accumulation of special interests with everyday Illinois citizens left to foot the bill.  Rather than allow competitive electricity markets to continue to produce benefits for the Illinois economy, Exelon hopes to appeal to tepid political partners by pushing a “Christmas tree” of giveaways that could cost up to $24 billion over 23 years. There are provisions for energy efficiency and renewables to entice environmental groups. A last-minute addition of subsidies for coal plants sought to build support in downstate Illinois. Various provisions to change the structure of electric ratepayer charges—issues the Illinois Commerce Commission is better equipped to handle—have come and gone.

Wrangling this array of interests may well sink the bloated bill, as many provisions conflict with the interests of those favoring other provisions. Indeed, the far-reaching bill includes “something for everyone to hate.” The haggling is happening in haste with just days left in the Illinois legislative session. This hectic process undermines the thoughtful policy Illinois citizens deserve.

While the eventual winners remain unclear, as the bill’s provisions fluctuate wildly, consumers are sure to lose. The chief of the Public Interest Division for Illinois’ attorney general said the bill would add $10 billion in costs for Illinois customers through 2030, with the nuclear provision alone coming to $285 million a year. Even considering the claims of 1,400 nuclear jobs saved, it would cost more than $200,000 a year for each job saved, ignoring the job losses that would be caused by higher electricity bills.

One alarmed business group called it “the largest rate hike in U.S. history.” Another de facto tax hike is something the tapped-out state of Illinois can’t afford.

Illinois Gov. Bruce Rauner has valiantly fought an uphill battle to straighten out the state’s finances. His conservative credentials suggest he’d clearly reject the bill; the governor’s office once called the bill’s rate increases “insane.” After all, special-interest-driven economic policy led the state into its current financial mess. Rauner and other conservatives should be alarmed by the direct cost to consumers, as well as the damage done to competitive markets. But unfortunately, some bill tweaking may have garnered his support.

In the haste of special-interest pandering, it’s important to reflect on what’s at stake. Illinois was a pioneer in the late 1990s, when it broke with the regulated monopoly model and made power companies compete, while enabling customers to choose their power provider. At the time, Illinois electric rates were well above the national average and double-digit rate increases were the norm.

After the reforms, Illinois electricity prices have been consistently below the national average. A report by four business groups concluded that competitive reforms saved Illinois customers up to $37 billion over 16 years, with the average household saving $3,600. Competition drove electric prices down by demanding efficiency and innovation from companies that want to be successful. Fundamental to this is that competition puts the risk of capital investment on investors, not ratepayers.

Bailouts for unprofitable power plants break this commitment to markets. The political precedent they would set encourages businesses to seek subsidies, not business practice improvements. Low-priced natural gas has shifted electricity economics dramatically, with consumers benefiting most in states with competitive markets. Captive ratepayers in monopoly-utility states have been left to finance uneconomic coal and nuclear plants, but in competitive states the investment loss is borne by shareholders. That is, unless states abandon their commitment to markets and cave to corporate welfare pleas.

Economic development relies on allowing markets to work. Companies should succeed or fail based on their market positions, not political ones. This can create difficulty for unprofitable enterprises, whether that’s Kodak and Blockbuster failing in the digital age or retiring power plants in the low-priced natural gas age. A more appropriate political response would be to examine worker retraining and other ways to reduce the sting of a local economic shift.

But subsidizing unprofitable enterprises is an unnecessarily expensive route to help power plant communities and undercuts the benefits of competitive markets. It also can severely distort market prices, creating artificial investment risk for competitive suppliers. This escalates investment costs and exerts upward pressure on electricity prices.

Clearly, competition is a win for the economy, but it’s also a victory for the environment. The geographic scope and transparency of prices in competitive electricity markets are essential to clean-energy development. Competition spurs innovation to cut fossil-fuel consumption, as demonstrated by more efficient operation of power plants by competitive “merchant” power producers compared to monopoly utilities. Competition also democratizes clean energy, evidenced by customers with choice increasingly opting for “green” power supply. This creates organic demand for cleaner energies.

Proponents of nuclear subsidies argue the plants deserve compensation for something they lack – pollution. But this misapprehends the underlying concepts. The market failure is underpricing of pollution, not overpricing of green energy. Failing to enact efficient emissions pricing (e.g., an emissions tax) does not warrant abandoning market principles (e.g., distortionary subsidies). As former Exelon Chairman and CEO John Rowe stated, “in a world that’s driven by unfriendly market prices and unfriendly public policy, you shut them down… it is the proper market-driven answer.”

Illinois legislators should take a page from the World Champion Chicago Cubs’ playbook. As skipper Joe Maddon noted, he’s “not into overlegislating the human race.” S.B. 2814 embodies an excessive legislative burden. After a lifetime of struggle, the refreshing mantra of personal freedoms yielded a cohesive clubhouse that brought the “lovable losers” their first championship in 108 years. Likewise, the liberation of energy consumers has made winners of hurting Illinoisans. Let’s keep the momentum going by protecting energy choice and competition to keep the Illinois economy a winner for years to come.

Image by Derek Henkle

Household incomes can fall even when everyone’s getting richer


One of the politically hottest statistics right now is median household income, especially its slow growth. But there is a big problem with understanding what this statistic means, since it mixes up two different things: the changing composition of households and changes in incomes. If the makeup of households is altering dramatically, as it has in recent decades, median household income may be a quite misleading number.

For example, it is mathematically possible for everyone’s income to be rising, while the median household income is falling. How is that possible?  The paradox is caused by counting by households, when the relationship between individuals and households keeps shifting.

To take the simplest possible case: Consider a population of one household, a married couple, each of whom has an income of $50,000. The median household income is $100,000. Their incomes each rise by 10 percent to $55,000, but they get divorced. Now there are two households, and the median household income is $55,000, a drop of 45 percent! Obviously, we have a demographic event, not an income event.

Suppose instead our married couple stays married with their new household income of $110,000. An immigrant joins the population, making $20,000, which is three or four times his previous income. Now the median household income is $65,000, down 35 percent from the original $100,000! But everybody is better off than they were before.
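For readers who want to verify the arithmetic, the two toy scenarios can be checked in a few lines of Python (a sketch using only the numbers from the examples above):

```python
import statistics

# Scenario 1: one household, a married couple earning $50,000 each.
before = statistics.median([50_000 + 50_000])            # $100,000

# Both incomes rise 10 percent to $55,000, then the couple divorces
# into two single-earner households: every person is richer, yet
# the median household income falls.
after_divorce = statistics.median([55_000, 55_000])      # $55,000

# Scenario 2: the couple stays together ($110,000 household) and an
# immigrant household earning $20,000 joins the population.
after_immigration = statistics.median([110_000, 20_000]) # $65,000

print(before, after_divorce, after_immigration)
```

With an even number of households, `statistics.median` averages the middle two values, which is why the second scenario lands midway between $110,000 and $20,000.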

In what is naturally a more complicated way, just these sorts of major changes have been going on inside the statistics that count income by household. If the composition of households were unchanged, the statistics would be more straightforward. But this is obviously not the case. Until the demographic changes are untangled from the results, it’s not clear what the changes in median household income tell us.

Image by Photografeus

Panel Video: Legislative Reform: Then and Now

In the early 1970s, Congress found itself overpowered by the executive branch. In response, it reorganized, resuming power over budgeting and augmenting legislative support agencies. This was not the first such instance. Congress also reorganized itself in the mid-1940s, restoring the First Branch to its pride of place and adjusting operating procedures to meet the challenges of a changing world.

The 21st century is upon us, and many citizens are concerned that Congress is now the “broken branch.” Is it time for legislative reorganization? If so, how should Congress undertake it?

Kevin R. Kosar of the R Street Institute, Walter Oleszek of the Congressional Research Service, Mark Strand of the Congressional Institute and Lee Drutman of New America discussed these issues at a recent R Street event, video of which is embedded below.

The kids suing over climate change better *not* be our best hope


Following election results and a subsequent transition period that probably has been fairly dispiriting to those concerned about tackling the problem of climate change, recent days have seen what could be characterized by some as one unusual ray of sunshine in an otherwise cloudy sky.

An improbable ruling handed down by a federal district court in Eugene, Oregon, will allow a collection of plaintiffs to proceed in a case brought to compel the federal government to “prepare and implement an enforceable national remedial plan to phase out fossil fuel emissions and draw down excess atmospheric CO2.”

Or, as one Slate writer put it, “the kids suing the government over climate change are our best hope now.”

For the sake of both our democracy and the climate, let’s hope not.

To comprehend how this fairytale of judicial activism was conceived, cast one’s gaze toward the brick façade of the University of Oregon School of Law, situated in that very same Eugene, Oregon. This unusual legal theory comes from U of O property-law professor Mary Wood, who—to give it the imprimatur of legitimate and binding law—retooled the common law doctrine of the “public trust” and recast it as an “atmospheric trust.”

Wood’s pivot was clever, though unsubtle. The well-established public trust doctrine states that some assets—specifically coastlines and navigable waterways—are held by the government in trust for the public at-large. Thus, the government must act to enjoin private parties from damaging or otherwise appropriating them.

The “atmospheric trust” doctrine purports to function in a similar manner, but aspires to extend its reach to an altogether grander scale. Rather than be limited to navigable waterways, the entirety of the natural environment falls under its reach. In practice, claims under the atmospheric trust doctrine could conceivably proceed on the basis of harm caused to the climate as a whole, provided the plaintiffs involved are particularly aggrieved and, as a result, have standing to sue.

Which brings us to the plaintiffs in the Eugene case and the amazing order out of the Oregon District Court. The plaintiffs are a collection of young people—ranging in age from nine to 20—assembled by a group called “Our Children’s Trust” for the twin purposes of supporting the organization’s novel standing argument and attracting fawning publicity. In U.S. District Judge Ann Aiken’s courtroom, they succeeded on both fronts. Aiken, another product of U of O Law, produced a 54-page opinion and order drafted with evident deference to the plaintiffs’ cause.

Alas, despite that length and passion, the document fails to obscure a mountain of legal and public-policy problems that are sure to be exposed on appeal.

To establish standing for the young plaintiffs, the court eagerly transmuted their facially generalized grievances about the climate into specific addressable harms on the basis of the children’s experiences of climate change-related events. For instance, one plaintiff lamented that “he has been unable to ski during winter as a result of decreased snow pack” while another alleged that fires caused by climate change have aggravated her asthma. Aiken found these individual harms sufficient to allow the plaintiffs to move forward with their effort to transform the lives of every citizen in the United States.

Given the scale of the equitable relief sought—and that it literally entails spontaneously legislating from the bench in ways that could change the course of the entire global economy—Aiken at least felt the need to try to address the political question doctrine, which precludes courts from interfering with matters more properly within the purview of the elected branches of government. In a nutshell, the judge is unconcerned that matters of this kind are, historically, left for the elected branches of government to decide and is instead persuaded that the gravity of the plaintiffs’ claims demands that they be heard.

On both issues, the order presses the very limits of standing and political question jurisprudence. A less ideologically sympathetic court will almost certainly look with extreme skepticism on the order’s conclusions. Time will tell.

More importantly, what this case actually signals is a new level of hopelessness with the electoral process within the environmental community. The proponents of the atmospheric trust doctrine have given up on politics and are so convinced of the immediate and existential threat of climate change that they will even take down the pillars of American jurisprudence to realize their policy goals.

Climate change is a serious problem that begs for a solution. But making farcical legal arguments to change the behavior of an entire nation will only harden opposition to the legitimate concerns expressed by groups like Our Children’s Trust. No matter the outcome of this case, if the environmental movement has given up on persuasion, it has already lost.

Image by anetta

How the coming Republican Congress could cut regulations lickety-split


For the better part of eight years, Republicans have tried to stop the Obama administration from issuing new regulations. They have not had much success. But this may well change in January, when President-elect Donald Trump arrives in the Oval Office.

Congress’ primary regulation-whacking tool is the Congressional Review Act (CRA), a bipartisan statute enacted 20 years ago and signed by then-President Bill Clinton. The CRA established a process for Congress to vote to nullify a regulation from taking effect.

To date, just one regulation has been struck down using the CRA. Why? For one, it requires both House and Senate leadership to schedule CRA resolutions for a vote, which takes away time that could be spent on other legislative matters. For another, a president needs to sign the joint disapproval resolution after both chambers of Congress pass it, and presidents tend not to want to kill rules issued by their agencies. Obama himself has vetoed four CRA bills in the past two years.

Trump’s arrival changes that. Now Republicans can schedule votes in both chambers with the expectation that they will win every vote and their work will meet a willing presidential pen. The American Action Forum’s Dan Goldbeck and Sam Batkins observe:

Utilizing the CRA, Congress and President Trump could potentially repeal at least 48 major regulations with, at a minimum, total regulatory costs of more than $42 billion and 53 million hours of paperwork.

Now, which regulations might get nixed is far from clear. Do the new Head Start Program Performance Standards impose more costs than benefits? Would deleting the Department of Labor’s rule expanding the number of workers who must be paid overtime be cheered or jeered by voters? (The nonpartisan Congressional Research Service, by the way, has produced a list of major rules subject to the CRA.)

President-elect Trump has promised to reduce regulation, and Republicans pride themselves on being anti-red tape. But shooting down every major regulation issued since late May of this year would be foolish. Which means that—between now and Jan. 20, 2017—Republicans are going to need to delve into the policy thicket and think a great deal about how they can use the CRA wisely. Unified government is rare these days, and Republicans would be wise not to squander the rare opportunity to lighten the regulatory burden.

Image by Jason Salmon

Self-driving car makers shouldn’t have to ask NHTSA ‘mother, may I?’


The National Highway Traffic Safety Administration’s recently issued safety guidelines for self-driving cars would upend five decades of federal auto-safety policy by embracing a system of pre-market approval.

The potential for such a dramatic shift tops a list of concerns with the proposed Federal Automated Vehicles Policy that the R Street Institute highlights in new public comments filed jointly with the Competitive Enterprise Institute and TechFreedom.

A pre-market approval system would require manufacturers to delay deploying technologies as they await regulatory approval from a process that, based on the experience of the Federal Aviation Administration, could take literally years to complete. In our comments, we strongly recommend that Congress exercise great scrutiny before it considers granting the NHTSA any such authority. This is particularly important since, in the agency’s own words, under the current system: “instances of non-compliance [with federal motor vehicle safety standards], especially non-compliance having substantial safety implications, are rare.”

Other areas of concern raised in the joint comments include:

  1. Vehicle performance guidance for highly automated vehicles

Allowing automated vehicle owners to opt out of the mere collection of personally identifiable information could undermine data analysis of all kinds, including crash reconstruction. Denying manufacturers the ability to collect needed data also could negatively affect the evolving tort liability environment and the development of new insurance products. The NHTSA needs to find a way to better balance these elements as it continues to develop a robust consumer-notice system for data usage.

The agency also needs to avoid attempting to regulate the vehicles’ so-called “ethical” considerations. The issues raised by philosophical abstractions like the “trolley car problem” are better addressed simply by modernizing the rules of the road in ways that ease safe compliance for automated vehicles and reduce the need for vehicles to exercise complex judgments.

Finally, the NHTSA needs to clarify that its FAVP guidelines are not compliance yardsticks for manufacturers, particularly when it comes to state-level permitting policy. At best, the guidelines represent today's best practices. They will evolve—they have to evolve—as manufacturers learn more. We suggest instead that federal regulators focus on continually revisiting and refining the 15-point voluntary checklist and avoid the urge to cement more restrictive standards into law.

  2. Model state policy

The NHTSA should be commended for explicitly delineating and affirming current federal and state authorities. But the agency needs to chart a different course when it comes to mandating compliance with the FAVP, as in its suggestion that states require manufacturers to certify their “accordance” with the 15-point safety checklist.

By its own terms, the FAVP is explicitly voluntary. It did not go through notice-and-comment rulemaking and, thus, there remain seriously problematic provisions within the safety checklist that have not been subjected to rigorous scrutiny. If the guidance is treated as binding, as it has been by states like California, it’s not at all clear what that compliance would entail or even what regulator would assess it.

As it considers revisions to the FAVP, the NHTSA should make clear that it is inappropriate for states to mandate compliance with a nonbinding federal guidance document.

  3. NHTSA’s current regulatory tools

It’s not just the NHTSA’s proposed regulatory tools that are a source of concern. The agency also needs to be much more transparent about how it uses its existing tools. The guidance should include a summary table containing information on requests for letters of interpretation; requests for temporary exemptions from existing standards; petitions for rulemaking regarding vehicle automation systems; and what enforcement actions NHTSA has taken against vehicle-automation-system manufacturers. The table also should contain all relevant information, including dates, the statutory and regulatory provisions at issue, the vehicle component at issue and a description of any actions NHTSA has taken. Parties looking to engage should not have to be industry insiders to understand how to partake in the process.

Image by Neirfy

GDP per-worker vs. GDP per-capita


We have previously compared the growth in real per-capita gross domestic product between the United States and Japan and among the 10 largest advanced economies. Growth in GDP per-capita measures the increase in the average economic well-being in the country, and adjusts gross GDP growth for whether the population is increasing, stable or declining.

We now shift to comparisons of growth in GDP per-worker (more precisely, per employed person). This addresses productivity, rather than overall economic well-being, and adjusts for shifts in the share of the population that is employed. Those who are not employed include, for example, children, full-time students, retired people, those unemployed and looking for work, those unemployed and not looking for work, and those (especially mothers) who do plenty of work in the home, but not as paid employees.

If the overall population is growing, it’s possible for GDP to grow while GDP per-capita does not. Similarly, if there is a shift within the population toward greater workforce participation, GDP per-capita might grow, while GDP per-worker does not. More generally, the growth rates of these measures of economic performance may be quite different.

Table 1 compares the growth of real GDP, both per-capita and per-worker, in the last half of the 20th century and the first 15 years of the 21st, revealing a striking slowdown. The slowdown, while marked, is less extreme when measured per-worker (1.82 percent to 1.11 percent) than when measured per-capita (2.25 percent to 0.90 percent). In other words, the productivity slowdown is smaller than the deceleration in overall economic welfare. This reflects demographic changes: from 1959 to 2000, the number of workers grew faster than the population as a whole. In the 21st century, it has grown more slowly.
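The demographic point above can be checked with a little arithmetic. Since GDP per-capita equals GDP per-worker times the worker-to-population ratio, the compound growth rates are linked multiplicatively. A minimal sketch, assuming only the growth rates quoted in the text (the period labels are shorthand, not precise data vintages):

```python
# Growth-rate identity behind Table 1:
#   GDP/capita = (GDP/worker) * (workers/capita)
# so in compound-growth terms:
#   (1 + g_per_capita) = (1 + g_per_worker) * (1 + g_worker_share)

def worker_share_growth(g_per_capita: float, g_per_worker: float) -> float:
    """Implied annual growth (percent) of the worker-to-population ratio."""
    return ((1 + g_per_capita / 100) / (1 + g_per_worker / 100) - 1) * 100

# Late 20th century: per-capita 2.25%, per-worker 1.82%
late_20th = worker_share_growth(2.25, 1.82)
# Early 21st century: per-capita 0.90%, per-worker 1.11%
early_21st = worker_share_growth(0.90, 1.11)

print(f"{late_20th:+.2f}% per year")   # positive: workers outgrew population
print(f"{early_21st:+.2f}% per year")  # negative: workers grew more slowly
```

The sign flip, from roughly +0.4 percent to roughly -0.2 percent per year, is exactly the demographic shift the text describes.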


How does the United States compare to Japan, when measured by growth in real GDP per-worker? Here our data make us shift to the period 1960 to 2014, still a more-than-50-year run. The relative growth performance of the two countries flips dramatically between the 20th and 21st centuries, although both are significantly slower, as shown in Table 2. Japan will remain an interesting case: a very technically advanced, but rapidly aging, economy with falling employment and a falling population.


Seemingly small differences in compound growth rates make for big differences if they continue over time. Table 3 shows the multiple of real GDP per-worker over 50 years in the actual second half of the 20th century, compared to a projection for 50 years of the 21st century if the century’s current trends continue. The result is a drop from an aggregate improvement of 2.5 times, to 1.7 times.
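The compounding claim is easy to verify. A quick sketch, assuming the per-worker growth rates quoted earlier in the text (1.82 percent and 1.11 percent) are the trends being projected:

```python
# Back-of-the-envelope check of Table 3's point that seemingly small
# differences in compound growth rates produce big cumulative differences.

def multiple_over(years: int, annual_pct: float) -> float:
    """Cumulative multiple after compounding annual_pct percent for `years` years."""
    return (1 + annual_pct / 100) ** years

twentieth = multiple_over(50, 1.82)     # actual late-20th-century per-worker rate
twenty_first = multiple_over(50, 1.11)  # 21st-century rate, projected forward

print(f"{twentieth:.2f}x vs {twenty_first:.2f}x")
```

A 0.7-point difference in the annual rate, sustained for 50 years, is the gap between roughly a 2.5-fold and a 1.7-fold improvement in output per worker.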


Can the growth in real GDP per-worker reaccelerate? That is indeed the question.

Image by Sean K

Ohio’s fight to preserve energy competition and choice


Ohio regulators recently approved a problematic, watered-down subsidy for investor-owned utility FirstEnergy Corp. The subsidy amounts to corporate welfare that protects captive ratepayers of the company’s monopoly utility segment from the poor management of its competitive generation portfolio. But the real fight is yet to come.

FirstEnergy announced plans to join American Electric Power Ohio (AEP) in lobbying the state Legislature to re-regulate its generation assets. This would backtrack groundbreaking reforms enacted last decade that opened up the Ohio power industry to competition and enabled customers to choose their supplier. Worse, it would set a dangerous precedent of destroying market institutions to rescue fledgling companies at the expense of successful competitors and consumers.

The Public Utilities Commission of Ohio (PUCO) approved the new subsidy plan for FirstEnergy in October. The potential $600 million electric rate plan is a fraction of the $4 billion FirstEnergy requested. The company’s request simply reformulated a plan to subsidize unprofitable power plants that PUCO approved in March but that the Federal Energy Regulatory Commission struck down in April (a similar plan for AEP was also rejected).

According to PUCO’s chairman, the primary purpose of the revised plan is “to ensure that FirstEnergy retains a certain level of financial health and creditworthiness.” This rationale flies in the face of what electricity competition is supposed to do – leave markets to determine the health of power companies. Shifts in economic fundamentals create investment risk. Those risks are socialized under the regulated monopoly utility model, but are supposed to be shifted to the private sector under a market regime. This improves how the private sector manages risk and drives performance improvements. Subsidizing poor performers completely undercuts incentives to perform, socializes risk and damages competitors.

PUCO’s decision highlights a potential problem with Ohio’s power industry structure, where the financial health of a power company’s competitive arm (i.e., generation) affects that of its regulated arm (i.e., distribution, which is a monopoly utility). PUCO noted that the infusion of capital it permitted would ensure FirstEnergy had the financial health to make future investments in grid modernization (FirstEnergy’s bond rating is barely investment grade status). In plain language, this is corporate welfare to protect captive ratepayers of the distribution utility from the poor management of the company’s competitive arm.

A company that can subsidize its competitive generation segment through its regulated distribution segment raises major competitive concerns in the generation market. In such a case, economics tells us that “rival generation firms, without recourse to such subsidies, might go out of business even if they were actually more efficient generators, to the detriment of consumers.” Full separation of the competitive and monopoly functions would eliminate the rationale for corporate welfare and avoid market distortions.

Since the pursuit of ratepayer-backed subsidies largely backfired, FirstEnergy and AEP are going for the competition’s jugular. They’re teaming up to push re-regulation of their generation assets. Exactly how this structure would look is unknown, but it would, at the least, severely limit or kill customer choice. It would undoubtedly cause customer rates to rise by keeping unprofitable plants in operation at ratepayers’ expense, as well as harm competitive generation owners who still play by the rules. As the independent monitor of the wholesale electricity market noted: “Ohio customers have nothing to gain from paying above market prices to preserve aging and obsolete assets.”

Todd Snitchler, the former chairman of the Public Utilities Commission of Ohio, called this “crony capitalism at its worst.” At its best, it’s rent-seeking behavior that undermines consumers, innovation and the environment. But the worst implication is the precedent it could set in other states, where losers of the low-price natural gas era see an opportunity to claw back profits via government assistance.

Fortunately, the two-year drag of multiple regulatory disputes over subsidies enabled proponents of competition and choice to organize. The Alliance for Energy Choice formed and provides a voice for customers and competitive suppliers hurt by diminished competition. The Environmental Defense Fund has taken a leadership role in recognizing that protecting competition and the environment are one and the same. Ohio conservatives, such as the state’s Buckeye Institute, increasingly are alarmed by the anti-market games of AEP and FirstEnergy.

This debate comes at a time when the consumer benefits of competition and choice are becoming increasingly clear. Low-priced natural gas benefits customers in restructured markets more, and the investment risks it presents are better managed by competitive generation owners than by monopoly utilities. Competitive wholesale electric markets have increased generation efficiency and innovation. As the market monitor put it, “competition remains [the] best choice for Ohio customers.” Furthermore, the competitive wholesale market that includes Ohio has seen strong emissions reductions, in part connected to the incentives competition creates to use fuel more efficiently.

When it comes down to it, companies should excel based on their market positions, not political ones. The dynamism of capitalism can only be captured by letting markets pick winners and losers. That requires political discipline.

Ohio was a pioneer for competition and choice. That’s a legacy Buckeyes should hang their hat on. They now face a challenge that would move capitalism backward. Ohio policymakers must protect the integrity of their market principles and the reputation of the Buckeye State.

Image by Joseph Sohm

Does Congress finally get that business as usual should not continue?


Nearly everyone is gobsmacked by last night’s election results. But should we really be so surprised? No, because a significant portion of the public has been down on the federal government for a long time.

Candidates for the presidency have been running as change-agents for decades. Jimmy Carter was the first of the recent outsider candidates: he was a Georgia governor with zero Washington experience. Then came Ronald Reagan, whom much of the media wrote off as an actor – until he won. In what must be a painful irony for Hillary Clinton, Bill Clinton took the presidency from incumbent George H.W. Bush. His theme? Change. Current President Barack Obama also ran against Washington and promised change.

Donald Trump ran his campaign on the theme of change. He promised to “drain the swamp” and “make America great again.” Hillary Clinton, a former secretary of state and first lady with extensive Washington experience, could not credibly cast herself as a change agent.

Ever since the tumult of the 1960s, Americans’ trust in the federal government has been eroding, according to both Gallup and Pew studies. The public rates new presidents highly, but the bloom inevitably falls from the rose. Voters have been particularly down on Congress of late.


Gerrity Congressional Approval 1974-2014

“Congressional approval,” writes Jessica Gerrity, has been greater than 50 percent only four times in the past 40 years – 1985, 1987, 2001 and 2002. On average, in any given year, only about 32 percent of the public approve of the work Congress is doing. And in a recent development, no longer does John Q. Public hate Congress but think well of his own representatives and senators. He increasingly dislikes them, too.

It’s not difficult to see. Our national legislature struggles to keep the government open, takes forever to enact new laws and rarely abolishes failed programs. Partisan warfare is the norm, and reams of political science data show that the public hates to see such politicking.

So what’s the takeaway for Congress? When Rep. Paul Ryan, R-Wis., took the speaker’s gavel, he hit the nail on the head:

[If] there were ever a time for us to step up, this would be that time. America does not feel strong anymore because the working people of America do not feel strong anymore. I’m talking about the people who mind the store and grow the food and walk the beat and pay the taxes and raise the family. They do not sit in this House. They do not have fancy titles. But they are the people who make this country work, and this House should work for them … [And when] they look at Washington, and all they see is chaos. What a relief to them it would be if we finally got our act together.

Congress needs to reform itself so that it can meet the public’s expectations. But it has a hard time doing that, because it continues to operate as if it is still 1900. It is in session only part of the year, valuable time is spent on hoary ceremonies and gasbaggery, and moving legislation requires working through Rube Goldberg-esque procedures. Government waste and fraud are common because Congress lacks sufficient staff and technical expertise to manage the $3.4 trillion colossus.

It is long past time for Congress to adjust to the realities of governing in the 21st century. The Constitution empowers both chambers to set their own rules, to organize themselves as they deem best and to invest in the legislature’s capacity to get stuff done. For certain, if business as usual continues, we can’t be surprised if voters vent their spleen in the next election.

The anarchist case for climate action


A spate of recent articles has purported to offer the “conservative case for a carbon tax” or the “libertarian case for climate action,” or some similar riff. R Street itself has contributed to the genre on occasion.

I want to do something even more challenging: present the case for action on climate change from an anarchist—specifically, anarcho-capitalist—perspective.

For those not up on the terminology, an anarcho-capitalist is someone who believes the free market can and should handle all government functions. An anarcho-capitalist takes the sorts of arguments that a conservative or libertarian might use to argue that Amtrak and the U.S. Postal Service ought to be privatized and applies them to things like roads, police, courts and national defense. It is, needless to say, a very uncompromising point of view.

I am not an anarchist of any sort. But I do think it can be useful to give the anarcho-capitalist perspective on an issue like climate change, because it clarifies how principles like property rights and individual liberty would apply, if we all weren’t such statist squishes (if this sort of philosophical musing isn’t your thing, you can go read about the Environmental Protection Agency and Star Wars instead).

One might think the anarcho-capitalist position on climate change would be obvious. Calls for action on climate typically involve proposing the government do something to deal with the situation. An anarcho-capitalist, by definition, doesn’t believe in government – that is, in an agency that enjoys a monopoly on the lawful use of force within a given territory. The anarcho-capitalist approach to climate change would be roughly the same as Michael Corleone’s offer to Sen. Geary: nothing.

But in fact, anarcho-capitalists have worked out responses to pollution and environmental issues generally that are at odds with common stereotypes of defenders of the free market. In fact, as some anarcho-capitalists would see it, the problem with most environmental regulation is not that it unjustly restricts business, but that it isn’t restrictive enough.

According to the late anarcho-capitalist economist Murray Rothbard, for instance, pollution should be seen as a violation of property rights. Just as it would violate my rights if a business decided to dump its garbage in my backyard, so too is it a violation of my rights when a factory pollutes the air I breathe or otherwise damages my health or my property without my consent. In keeping with the strict stand against all violations of property rights, Rothbard called for an absolute ban on pollution, regardless of cost to industry:

The remedy is simply for the courts to return to their function of defending person and property rights against invasion, and therefore to enjoin anyone from injecting pollutants into the air…

The argument that such an injunctive prohibition against pollution would add to the costs of industrial production is as reprehensible as the pre-Civil War argument that the abolition of slavery would add to the costs of growing cotton, and that therefore abolition, however morally correct, was ‘impractical.’

Rothbard later added some procedural requirements (such as clarifying the burden of proof) that would make it more difficult for plaintiffs to get an injunction without showing harm. Nevertheless, under the Rothbardian approach, if greenhouse-gas emissions are shown to result in harm to individuals or their property (e.g., by causing rising sea levels that flood the property and render it uninhabitable), then the property owner should be able to go to court and get an injunction against the emitters.

You don’t have to take my word for it. A while back, two of my favorite anarcho-capitalists, Tom Woods and Bob Murphy, took precisely this approach when discussing the proper free-market response to climate change on Woods’ radio show. The show was mostly devoted to trashing the case for a carbon tax as a nonlibertarian response to the risks of climate change. Toward the end, however, Woods asked a very interesting question:

Let’s suppose for the sake of argument [climate change is happening]… and various forms of human activity are contributing to it and so therefore we want to discourage this kind of activity so the glaciers don’t all melt and we all drown and die and California falls into the sea, whatever the results would be. What kind of libertarian response could there be? This would seem to be the classic case of needing a global government, right? You need a global government to go out and deal with a global problem… What is the libertarian answer?  

After several attempts to evade the issue (positing that, even if there is a problem, you can’t trust government to solve it; that it won’t get bad for more than 50 years, etc.) Murphy finally responds the same way Rothbard’s approach suggests: treat pollution as a violation of property rights that people can go to court to stop.

Ultimately the solution would be to have property rights in the integrity of the atmosphere and that you’re not allowed as a business – just as you can’t dump chemicals in the river – you’re not allowed to emit tons of CO2 into the atmosphere if that really is causing physical demonstrable harm to everyone else on the planet. So that would be a property rights violation… And people who were being harmed could go to a libertarian judge and say ‘hey that power plant is emitting all sorts of CO2 and here’s the science and see they’re directly causing us property damage.’ And so then they could get an injunction or they could have some kind of agreement where they work out with the owner some kind of compensation scheme.

In order to get an injunction, a property owner would, of course, have to show that they would be harmed by the continued emissions. However, it’s important to note what they wouldn’t have to prove under the Rothbard-Murphy approach. They wouldn’t have to prove that climate change poses serious risks to human civilization, or that the costs from climate change are greater than the costs of stopping all emissions. It could even be the case that climate change overall was beneficial to humanity. All they need to show is that their property rights have been violated.

Now I’ll be the first to admit that this method of dealing with climate change is not very practical. Consider, for example, the residents of Lima, Peru. Climate change is altering the melt rate for the mountain ice on which the city depends, compounding the already difficult problem of providing a reliable and safe water supply. Suppose the residents sue and are able to prove that greenhouse-gas emissions are causing or will cause this problem. Under the Rothbard-Murphy approach, they would be able to get an injunction against most of the industrial activity in the world (even if you are skeptical of this particular example, recall that all it would take is for someone, somewhere to prove some harm from greenhouse-gas emissions to produce the same result).

Clearly, anarcho-capitalists don’t want to see the end of industrial civilization. But how can they avoid it, without abandoning their principles? One possibility would be to fall back on contract. A factory owner, for example, might agree to pay the residents of Lima for the right to continue to emit greenhouse gases. If the value of being able to operate the factory was greater than the cost to the residents, it should be possible to work out some mutually beneficial agreement that would allow the factory to keep running.

This might work for cases where the numbers of polluters and property owners are both reasonably small. In the case of greenhouse gases, however, we are dealing with millions of emitters and potentially billions of property-owner plaintiffs. The sheer cost of negotiating contracts that allow for continued emissions would be prohibitive.

As I mentioned before, I am not an anarchist. While I think government should be as small as possible, there are some areas, including environmental protection, where I think the impracticality of a pure property-rights approach means the government can play a role. I do, however, find the Rothbard-Murphy approach important in highlighting an often-overlooked feature of debates on environmental policy – namely that pollution is a violation of property rights. Indeed, from a pure free-market perspective, the problem with things like a carbon tax may not be that they are too burdensome on business, but that they aren’t nearly burdensome enough.

Image by Darryl Brooks

Discussion: Government regulation and the internet of things

The R Street Institute hosted an expert panel last month to explore ways in which the “internet of things” is affecting cybersecurity, privacy and ownership rights. With cyberattacks on the rise and increased adoption of networked devices, these issues are now at the forefront of tech-policy debates.

The panel included University of Washington law professor Ryan Calo, Pepperdine University law and public-policy professor Gregory McNeal, Owners’ Rights Initiative Executive Director Andrew Shore and George Mason University law professor James Cooper.

The internet of things erodes traditional concepts of ownership. If the software in your fridge or thermostat is licensed, how does one transfer the license to new homeowners? Shore advocated for the You Own Devices Act—better known as “YODA”—which would allow the transfer of software ownership for the purposes of security updates and bug patches.

Privacy is another challenge. Calo highlighted a scenario in which the devices in one’s home become, in effect, “tiny salespeople,” offering products for sale and eroding spaces traditionally reserved for privacy. He added that the internet of things presents opportunities for companies to communicate privacy information in ways people can digest.

Cooper argued the traditional notice-and-consent privacy model inadequately informs consumers. However, the existence of sensor-heavy environments does not mean that we need a new regulatory regime to protect privacy. Instead, the Federal Trade Commission should continue to pursue a harm-based approach. Both Cooper and Calo agreed that notice is better than the heavy-handed approach of dictating how devices are designed.

McNeal said he believes companies’ incentives need to change before they will improve their cybersecurity and privacy practices. Right now, privacy and cybersecurity are viewed as compliance tasks, rather than mission-critical features. Thus, cyberattacks—such as “ransomware” and distributed denial of service (DDoS) attacks—are becoming more prevalent and attracting the attention of lawmakers.

But sloppy government mandates for device security could harm innovation, by adding costs to businesses. They also could diminish security by mandating outdated solutions or lead to baked-in vulnerabilities, such as backdoors for government access.

Video of the full panel is embedded below:

If elected, would Clinton declare war on the sharing economy?


The following post was co-authored by Senior Fellow Lori Sanders.

This coming Tuesday, Hillary Clinton is expected to claim a resounding victory at the climax of a long, hard-fought election. But even if the country moves decidedly to the left, congressional Republicans will soon have to pivot to prepare for new policy battles that will take center stage in 2017. Regardless of who is elected, one of these fights is likely to be over the status of millions of contract workers in the so-called “gig” or “sharing” economy, and beyond.

While Donald Trump has made some favorable comments about the sharing economy, prominent conservatives like Grover Norquist have expressed fears that, if Clinton is elected, she will wage a crusade against the employment practices of companies like Uber, Postmates or TaskRabbit, which many on the progressive left see as “exploiting” their workers by classifying them as independent contractors. As they see it, this “share-the-scraps economy” is transforming us into a nation of part-timers with low pay and no job security.

The answer, they say, is to reclassify all of these workers. As former Labor Secretary Robert Reich notes: “Congress doesn’t have to pass a new law” to change the test for employment classification. The Labor Department can simply issue a new rule. President-elect Clinton could make this happen quite easily if she wanted to. But does she?

Clinton has already circulated an extensive policy agenda. But nowhere in her more than 7,000-word tech policy plan—which deals with everything from cybersecurity to intellectual property—is there a mention of the “sharing economy.” This is no small omission.

In a July 2015 speech at the New School—the first major economic address of her campaign—Clinton alluded to many of the strong criticisms of the gig economy that have taken root on the left, remarking that the rise of the sharing economy “[raises] hard questions about workplace protections and what a good job will look like in the future.” Clinton also noted that many young people—particularly in minority communities—“cannot find a job.” While talent is “everywhere,” she observed, “opportunity is not.” She subsequently added that, as president, she would work to make this trend “strengthen, not hollow out” the American middle class.

For sure, the gig economy is a big deal. Research conducted in 2014 by PricewaterhouseCoopers predicts global sharing-economy revenues could reach $335 billion by 2025. Uber, the largest sharing-economy company, is valued at more than $60 billion. That’s more than most Fortune 500 companies. The sector is only going to get bigger. Already, there are two dozen “unicorns”—privately or closely held companies valued at more than $1 billion—in the sharing-economy space.

With that said, Clinton appears to misdiagnose the problem. There’s next-to-no evidence that the gig economy is taking over the world. Over the past two decades, the number of people working part-time has declined as a percentage of the labor force. Bigger companies have come to employ more workers and startup activity has declined precipitously. Tenure at the same job has risen. The data simply doesn’t support the idea that we’re becoming a nation of part-timers thanks to the rise of Uber and its ilk.

Some real changes have happened. Job schedules have become less regular for many unskilled workers, for example, and the percentage of people employed full-time by one company but working at another has also risen. Still, most people in the sharing economy are using it to supplement their income, rather than to replace an existing job. For example, 69 percent of Uber drivers have other full-time or part-time work.

Perhaps it’s no surprise that, following a resounding backlash to her speech, the Clinton campaign walked back her remarks, stopped talking about the issue and left it out of her innovation policy plan. When the campaign is over, rather than forcing innovative companies into the outdated dichotomy of employee versus contractor, or giving in to everything that old-line unions and their advocates want, the next president should view the gig economy as an opportunity to start a conversation about modernizing American labor policy for the 21st century.

The difficult truth is that our labor regulations are out of date. Americans need fresher and better skills, newer models for delivering benefits and, in many cases, flexible opportunities to supplement their income. Any change that forces companies and individuals to comply with a more stringent interpretation of employment-classification rules will not only do harm in the short term, it will stymie our ability to create something better suited to our current and future needs.

The quickest and easiest change the next president could make would be to encourage the creation of regulatory safe harbors for these companies to experiment with new ways to offer benefits and structure work. This would build on efforts the Obama administration began in June when it awarded competitive grants to organizations seeking to develop new portable retirement-benefit systems.

In some cases, these experiments are possible under current labor law. In others, they may require special legislation or a process to allow for labor-law waivers.

In the process, we will not only see improvements in the lives of those drawing parts of their income from the sharing economy, but will