Through what administrative means should a democratic society in an advanced economy implement regulation? In practice, democratic governments opt for a variety of solutions to this challenge. Historically, these approaches earned their legitimacy by allocating power to elected officials who make the laws or directly oversee their agents.
Increasingly, however, governments have chosen to implement policy through agencies with varying degrees of independence from both the legislature and the executive. Under what circumstances does it make sense in a democracy to delegate powers to the unelected officials of independent agencies (IA) who are shielded from political influence? How should those powers be allocated to ensure both legitimacy and sustainability?
These are the critical issues that Paul Tucker addresses in his ambitious and broad-ranging book, Unelected Power. In addition to suggesting areas where delegation has gone too far, Tucker highlights others—such as the maintenance of financial resilience (FR)—where agencies may be insufficiently shielded from political influence to ensure effective governance. His analysis raises important questions about the regulatory framework in the United States.
In this post, we discuss Tucker’s principles for delegating authority to an IA. A key premise—one that we share with Tucker—is that better governance can substitute for simple policy rules where such rules are insufficient to guide optimal decisions…. Read More
Blockchain is all the rage. We are constantly bombarded by reports of how it will change the world. While it may alter many aspects of our lives, we suspect the changes will come in areas that we experience only indirectly. That is, blockchain technology mostly will change the implementation of invisible processes—what businesses think of as their back-office functions.
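At its core, a blockchain is a tamper-evident ledger: each record commits to its predecessor through a cryptographic hash, so any alteration of history breaks the chain. Here is a minimal sketch in Python; the function names and the toy "settlement records" are ours, purely for illustration:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's canonical JSON encoding."""
    encoded = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(encoded).hexdigest()

def make_block(data: str, prev_hash: str) -> dict:
    return {"data": data, "prev_hash": prev_hash}

def verify_chain(chain: list) -> bool:
    """Each block must reference the hash of its predecessor."""
    for prev, curr in zip(chain, chain[1:]):
        if curr["prev_hash"] != block_hash(prev):
            return False
    return True

# Build a three-block chain of (toy) settlement records.
genesis = make_block("genesis", "0" * 64)
b1 = make_block("pay A -> B: 100", block_hash(genesis))
b2 = make_block("pay B -> C: 40", block_hash(b1))
chain = [genesis, b1, b2]

assert verify_chain(chain)           # intact chain passes
genesis["data"] = "pay A -> B: 999"  # tamper with history...
assert not verify_chain(chain)       # ...and the links no longer match
```

This is exactly the property that makes the technology attractive for back-office record-keeping: a shared ledger that participants can audit without trusting one another.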
In this post, we briefly describe blockchain technology, the problem it is designed to solve and the impact it might have on finance. Read More
The two leading financial trends of our time are the integration of digital technology and the advance of financial inclusion. The latter involves both provision of access to those who have no account (the “unbanked”) and increased usage of financial services by those with a tenuous link to the formal system (the “underbanked”). A combination of swift technological change and government promotion is speeding the rise of inclusion.
Six years ago, the World Bank estimated that roughly 2.5 billion adults (15 or older) had no bank deposit, no formal credit, and no means of payment other than cash or barter. Stunningly, in its Global Findex Database 2017 published last month, the Bank now estimates that the number of unbanked adults has plummeted to 1.7 billion. Over the past six years, more than 1.2 billion adults have gained at least basic financial access through a financial institution or their mobile phone.
Alongside this range of technological progress, India’s government-led financial inclusion program has been a second key factor in the recent advance of inclusion. By our estimate, the gains in India account for more than one-half of the 515 million persons who acquired access globally between 2014 and 2017!
In the remainder of this post, we briefly describe the benefits of financial inclusion, and highlight key trends regarding access since 2011 as well as prospects for achieving the World Bank’s goal of universal financial access. We conclude with a short discussion of Africa, where the largest gains still lie ahead. Reflecting long-term demographic prospects, we emphasize that the advance of financial inclusion in Africa will matter for the global economy, not just for Africa. Read More
Digital currency is all the rage. Bitcoin has more than one thousand crypto cousins. There is even a token called dentacoin, whose issuers claim it will transform dentistry! In the past, we have been clear in our views. We agree with BIS General Manager Agustín Carstens: these are just like past attempts by private parties to issue their own money. As Carstens said on another occasion, these tokens are “a combination of a bubble, a Ponzi scheme and an environmental disaster.”
Regardless of whether the blockchain will revolutionize dental health, the appearance of cryptocurrencies has driven central banks to think about one particular aspect of their business: paper currency issuance.
In this post, we expand on some aspects of our earlier discussion of central bank digital currency (CBDC). What is it, and what would its wider introduction mean for the financial system? Our conclusion is unambiguous: Be careful what you wish for! …. Read More
Over the past decade, there has been an explosion of research aimed at understanding the dynamics of the distribution of consumption, income, and wealth. While important debates about measurement and data interpretation continue, a range of evidence points to two important conclusions. First, over the past two centuries, the global income distribution has become far more equal. But, while the gap between countries is now much smaller, inequality within some advanced countries, especially the United States, has risen in recent decades.
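Comparisons of inequality like these are commonly summarized by the Gini coefficient, which is 0 under perfect equality and approaches 1 under maximal concentration. A sketch with entirely hypothetical numbers (not actual U.S. data):

```python
def gini(values):
    """Gini coefficient via the sorted-rank identity:
    G = sum((2i - n - 1) * x_i) / (n * sum(x)), x sorted ascending."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    weighted = sum((2 * (i + 1) - n - 1) * x for i, x in enumerate(xs))
    return weighted / (n * total)

income = [30, 40, 50, 60, 70]       # hypothetical household incomes
wealth = [0, 5, 20, 75, 400]        # hypothetical net worth, far more skewed
assert gini(income) < gini(wealth)  # wealth is more concentrated than income
```

The qualitative pattern the code illustrates—wealth far more concentrated than income—is the one documented in the data discussed below.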
In this post, we focus on the distribution of wealth rather than that of income or consumption. Wealth affects welfare in at least two key ways. First, in the presence of borrowing constraints, it provides a buffer against fluctuations of income, allowing households to smooth consumption in the face of temporary bouts of illness or unemployment. Second, it provides the basis for household spending in retirement.
As we will see, the distribution of wealth is far less equal than that of income. Moreover, recent research shows that, following the Great Financial Crisis of 2007-2009, the U.S. wealth distribution has become decidedly more unequal. As a result, a large portion of U.S. households appears to have little scope for meeting retirement needs out of their current net worth, making federal insurance programs key to their future well-being. Read More
Guest post by Richard Berner, Executive-in-Residence (Center for Global Economy and Business) and Adjunct Professor, NYU Stern School of Business
In response to the fragility of LIBOR and other interest-rate benchmarks, regulators globally are working with industry to identify sturdy alternatives. Despite significant progress, concerns persist that the transition to these new reference rates will be disruptive.
While these concerns are legitimate (see Eclipsing LIBOR), both U.S. and global authorities and market participants have begun to address them in ways that should go a long way to managing the risks. In this post, we review why LIBOR’s persistent fragility makes reform critical, and examine progress on some of the ongoing reforms.... Read More
Modern bank regulation has two complementary parts: capital and liquidity requirements. The first restricts liabilities given the structure of assets and the second limits assets based on the composition of liabilities.
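In stylized form, the two constraints are simple balance-sheet ratios. The sketch below uses the familiar Basel benchmarks (an 8% risk-based minimum and a 100% LCR floor), but the bank itself and all of its numbers are hypothetical:

```python
def meets_capital_requirement(capital, risk_weighted_assets, minimum=0.08):
    """Risk-based capital: capital must cover a minimum share of
    risk-weighted assets (restricts liabilities given the assets)."""
    return capital / risk_weighted_assets >= minimum

def meets_lcr(hqla, net_outflows_30d):
    """Liquidity coverage ratio: high-quality liquid assets must cover
    projected net cash outflows over a 30-day stress (limits assets
    given the composition of liabilities)."""
    return hqla / net_outflows_30d >= 1.0

# A stylized bank: $100 of risk-weighted assets backed by $9 of capital,
# and $25 of HQLA against $20 of projected 30-day outflows.
assert meets_capital_requirement(capital=9, risk_weighted_assets=100)
assert meets_lcr(hqla=25, net_outflows_30d=20)
```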
While capital regulation―especially in its risk-based form―is a creation of the last quarter of the 20th century, liquidity regulation is much older. In fact, the newly implemented liquidity coverage ratio (LCR) harks back to the system in place over 100 years ago. In the United States, before the advent of the Federal Reserve in 1914, both national and state-chartered banks were required to hold substantial liquid reserves to back their deposits (see Carlson). These are the reserve requirements (RR) that remain in effect in most jurisdictions today, the United States included.
In this post, we briefly examine the long experience with RR as a way to gain insight regarding the LCR. We draw two conclusions. First, we argue strongly against using the LCR as a monetary policy tool in advanced economies with well-developed financial markets. Like RR, it is simply too blunt and unpredictable. Second, for the LCR to work as a prudential policy tool, it should probably be supplemented by something like a fee-based line of credit at the central bank.... Read More
This month, in the guise of supporting community banks, the U.S. Senate passed a bill (S.2155) that eases regulation of large banks. We share the critics’ views that this wide-ranging dilution of existing regulation will reduce the resilience of the U.S. financial system.
In its best known and most publicized feature, the Senate bill raises the asset size threshold that Dodd-Frank established for subjecting a bank to strict scrutiny (such as the imposition of stress tests, liquidity requirements, and resolution plans) from $50 billion to $250 billion. In this post, we examine the role of asset size in determining the systemic importance of a financial intermediary. It turns out that (aside from the very largest institutions, where it does in fact dominate) balance sheet size is not a terribly useful indicator of the vulnerability a bank creates. We conclude that Congress should ease the strict oversight burden on institutions that pose little threat to the financial system without raising the Dodd-Frank threshold dramatically.
Kathryn Judge makes an elegant proposal for accomplishing this. For institutions with assets between $100 billion and $250 billion, Congress should simply flip the default. Rather than obliging the Fed to prove a mid-sized bank’s riskiness, give the bank the opportunity to prove that it is safe. This approach gives institutions the incentive to limit the systemic risk they create in ways that they can verify. It also sharply reduces the risk of litigation by banks that the Fed deems risky... Read More
Banks continue to lobby for weaker financial regulation: capital requirements are excessive, liquidity requirements are overly restrictive, and stress tests are too burdensome. Yes, in the aftermath of the 2007-09 financial crisis, we needed reforms, they say, but Basel III and Dodd-Frank have gone too far.
Unfortunately, these complaints are finding sympathetic ears in a variety of places. U.S. authorities are considering changes that would water down existing standards. In Europe, the news is not promising either. These developments are not only discouraging but also self-defeating. Higher capital clearly improves resilience. And, at current levels of capitalization, it does not limit banks’ ability to support economic activity.
As it turns out, on this particular subject, there may be less of a discrepancy between private and social interests than is commonly believed. The reason is that investors reward banks in jurisdictions where regulators and supervisors promote social welfare through tougher capital standards.... Read More
Ten years ago this week, the run on Bear Stearns kicked off the second of three phases of the Great Financial Crisis (GFC) of 2007-2009. In an earlier post, we argued that the crisis began in earnest on August 9, 2007, when BNP Paribas suspended redemptions from three mutual funds invested in U.S. subprime mortgage debt. In that first phase of the crisis, the financial strains reflected a scramble for liquidity combined with doubts about the capital adequacy of a widening circle of intermediaries.
In responding to the run on Bear, the Federal Reserve transformed itself into a modern version of Bagehot’s lender of last resort (LOLR) directed at managing a pure liquidity crisis (see, for example, Madigan). Consequently, in the second phase of the GFC—between Bear’s March 14 rescue and the September 15 failure of Lehman—the persistent financial strains primarily reflected, in our view, an emerging solvency crisis. In the third phase, following Lehman’s collapse, the focus necessarily turned to recapitalization of the financial system—far beyond the role (or authority) of any LOLR.
In this post, we trace the evolution of the Federal Reserve during the period between Paribas and Bear, as it became a Bagehot LOLR. This sets the stage for a future analysis of the solvency issues that threatened to convert the GFC into another Great Depression. Read More
Retail bank runs are mostly a thing of the past. Every jurisdiction with a banking system has some form of deposit insurance, whether explicit or implicit. So, most customers can rest assured that they will be compensated even if their bank fails. But, while small and medium-sized depositors are extremely unlikely to feel the need to run, the same cannot be said for large short-term creditors (whose claims usually exceed the cap on deposit insurance). As we saw in the crisis a decade ago, when banks (and other intermediaries) are funded by short-term borrowing, not only are they vulnerable, but the entire financial system becomes fragile.
This belated realization has motivated a large shift in the structure of bank funding since the crisis. Two complementary forces have been at work, one coming from within the institutions and the other from the authorities overseeing the system. This post highlights the biggest of these changes: the spectacular fall in uncollateralized interbank lending and the smaller, but still dramatic, decline in the use of repurchase agreements. The latter—also called repo—amounts to a short-term collateralized loan.... Read More
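The mechanics of a repo are straightforward: cash is lent against collateral, reduced by a haircut, and repaid with interest at the repo rate. The sketch below uses illustrative numbers (the 2% haircut and 1.8% rate are assumptions, not market data), with the standard actual/360 money-market day count:

```python
def repo_cash_lent(collateral_value, haircut):
    """Cash lent equals the collateral value reduced by the haircut,
    which protects the lender if the collateral loses value."""
    return collateral_value * (1 - haircut)

def repurchase_price(cash_lent, repo_rate, days, day_count=360):
    """The borrower buys the collateral back for principal plus simple
    interest at the repo rate (actual/360 convention)."""
    return cash_lent * (1 + repo_rate * days / day_count)

# $100 of collateral, 2% haircut, overnight repo at 1.8% annualized.
cash = repo_cash_lent(100.0, 0.02)         # lender advances $98
payoff = repurchase_price(cash, 0.018, 1)  # borrower repays $98.0049
assert abs(payoff - 98.0049) < 1e-9
```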
Last week’s 12th annual U.S. Monetary Policy Forum focused on the effectiveness of Fed large-scale asset purchases (LSAPs) as an instrument of monetary policy. Despite notable disagreements, the report and discussion reveal a broad (if not universal) consensus on key issues:
--- In a world of low equilibrium real interest rates and low inflation, policymakers could easily hit the zero lower bound (ZLB) in the next recession.
--- At the ZLB, the Fed should again use a combination of balance-sheet tools and interest-rate forward-guidance to achieve its mandated objectives of stable prices and maximum sustainable employment (see our earlier post).
--- Yet, significant uncertainties about the impact of balance-sheet expansion mean that LSAPs may not provide sufficient stimulus at the ZLB.
--- Fed policymakers should undertake a thorough (and potentially lengthy) assessment of alternative policy tools and frameworks—ranging from negative interest rates to a higher inflation target to forms of price-level targeting—to ensure they remain as effective as possible.
The remainder of this post discusses the challenges of measuring the impact of balance-sheet policies. As the now-extensive literature on the subject implies, balance-sheet expansions ease financial conditions. However, as this year’s USMPF report emphasizes, there is substantial uncertainty about the scale of that impact.... Read More
When migrants send money across borders to their families, it promotes economic activity and supports incomes in some of the poorest countries of the world. Annual cross-border remittances are running about US$600 billion, three quarters of which flow to low- and middle-income countries. To put that number into perspective, total development assistance worldwide is $150 billion.
Yet, despite the remarkable technological advances of recent decades, remittances remain extremely expensive. On average, the charge for sending $200―the benchmark used by authorities to evaluate cost―is $14. That is, the combination of fees (including charges from both the sender and recipient intermediaries) and the exchange rate margin typically eats up fully 7% of the amount sent. While it is less expensive to send larger amounts, the aggregate cost of sending remittances in 2017 was about US$30 billion, roughly equivalent to the total non-military foreign aid budget of the United States!
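The cost arithmetic works as follows. The split between flat fees and the exchange-rate margin in the example is an assumed illustration; only the 7% total comes from the figures above:

```python
def remittance_cost(amount, fee, fx_margin):
    """Total cost share = (flat fees + FX margin on the amount sent)
    as a fraction of the amount sent."""
    return (fee + amount * fx_margin) / amount

# The World Bank's $200 benchmark: say $8 in fees plus a 3% FX margin
# (a hypothetical split that sums to the 7% average cited above).
cost = remittance_cost(200, fee=8.0, fx_margin=0.03)
assert abs(cost - 0.07) < 1e-9   # $14 on $200, i.e. 7% of the amount sent
```

Because part of the charge is a flat fee, the same schedule makes larger transfers proportionally cheaper, consistent with the pattern noted above.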
In this post, we discuss remittances, why their costs remain high, and what might be done to lower them. Read More
Over the past 40 years, U.S. capital markets have grown much faster than banks, so that banks’ share of credit to the private nonfinancial sector has dropped from 55% to 34% (see BIS statistics here). Nevertheless, banks remain a critical part of the financial system. They operate the payments system, supply credit, and serve as agents and catalysts for a wide range of other financial transactions. As a result, their well-being remains a key concern. A resilient banking system is, above all, one that has sufficient capital to weather the loan defaults and declines in asset values that will inevitably come.
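The buffer role of capital is easy to see on a stylized balance sheet: the same loan loss wipes out a thinly capitalized bank but leaves a well-capitalized one solvent. All figures below are hypothetical:

```python
def net_worth(assets, liabilities):
    """Capital (net worth) is the excess of assets over liabilities."""
    return assets - liabilities

def survives_loss(assets, liabilities, loss):
    """A bank remains solvent only if capital can absorb the loss."""
    return net_worth(assets - loss, liabilities) > 0

# Two stylized banks facing the same $6 loan loss on $100 of assets:
# one holds $10 of capital, the other only $4.
assert survives_loss(assets=100, liabilities=90, loss=6)      # stays solvent
assert not survives_loss(assets=100, liabilities=96, loss=6)  # wiped out
```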
In this primer, we explain the nature of bank capital, highlighting its role as a form of self-insurance providing both a buffer against unforeseen losses and an incentive to manage risk-taking. We describe some of the challenges in measuring capital and briefly discuss a range of approaches for setting capital requirements. While we do not know the optimal level of capital that banks (or other intermediaries) should be required to hold, we suggest a practical approach for setting requirements that would promote the safety of the financial system without diminishing its efficiency.... Read More
When most people think of investment, what comes to mind is the purchase of new equipment and structures. A restaurant might start with construction, and then fill its new building with tables, chairs, stoves, and the like. This is the world of tangible capital.
We still need buildings and machines (and restaurants). But, over the past few decades, the nature of business capital has changed. Much of what firms invest in today—especially the biggest and fastest growing ones—is intangible. This includes software, data, market analysis, scientific research and development (R&D), employee training, organizational design, development of intellectual and entertainment products, mineral exploration, and the like.
In this post, we discuss the implications of this shift for the structure of finance. Tangible capital can serve as collateral, providing lenders with some protection against default. As a result, firms with an abundance of physical assets can finance themselves readily by issuing debt. By contrast, a company that focuses on software development, employee training, or improving the efficiency of its organization will find it more difficult and costly to borrow because the resulting assets cannot easily be re-sold. That means relying more on retained earnings or the issuance of equity.... Read More
The problem of time consistency is one of the most profound in social science. With applications in areas ranging from economic policy to counterterrorism, it arises whenever the effectiveness of a policy today depends on the credibility of the commitment to implement that policy in the future.
For simplicity, we will define a time-consistent policy as one where a future policymaker lacks the opportunity or the incentive to renege. Conversely, a policy lacks time consistency when a future policymaker has both the means and the motivation to break the commitment.
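The canonical economic illustration is the inflation-bias model associated with Kydland-Prescott and Barro-Gordon: a zero-inflation pledge is not time consistent because, once the public believes it, the policymaker gains by reneging. The parameter values in this sketch are arbitrary, chosen only to make the mechanics visible:

```python
# Policymaker loss: L = pi^2 + b*(y - y_target)^2,
# with output y = y_bar + c*(pi - pi_e) responding to inflation surprises.
b, c = 1.0, 1.0
y_bar, y_target = 0.0, 1.0   # the policymaker wants output above potential

def loss(pi, pi_e):
    y = y_bar + c * (pi - pi_e)
    return pi**2 + b * (y - y_target)**2

def best_response(pi_e):
    """Inflation the policymaker actually chooses, given expectations
    pi_e (grid search over [0, 2] in steps of 0.001)."""
    grid = [i / 1000 for i in range(0, 2001)]
    return min(grid, key=lambda pi: loss(pi, pi_e))

# If the public believes a zero-inflation pledge, the policymaker
# gains by reneging: the pledge is not time consistent.
assert best_response(pi_e=0.0) > 0.0

# With rational expectations (pi_e = pi), the economy settles at the
# discretionary rate: higher inflation, no output gain, higher loss
# than under a credible commitment to zero inflation.
pi_disc = 0.0
for _ in range(100):             # iterate expectations to a fixed point
    pi_disc = best_response(pi_disc)
assert loss(pi_disc, pi_disc) > loss(0.0, 0.0)
```

The punchline is the one in the text: everyone is worse off in the discretionary equilibrium, which is why commitment devices (like central bank independence) are valuable.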
In this post, we describe the conceptual origins of time consistency. To emphasize its broad importance, we provide three economic examples—in monetary policy, prudential regulation, and tax policy—where the impact of the idea is especially notable.... Read More
Last month, the Federal Reserve Board published proposed refinements to its annual Comprehensive Capital Analysis and Review (CCAR) exercise—the supervisory stress test that evaluates the capital adequacy of the largest U.S. banks (34 in the 2017 test). In our view, the Federal Reserve has an effective framework for carrying out these all-important stress tests. CCAR began only in 2011, so the Fed is now embarking on just its seventh exercise. That means that everyone is still learning how best to structure and execute the tests. The December proposals are clearly in this spirit.
With this same goal in mind, we make the following proposals for enhancing the stress tests and preserving their effectiveness:
--- Change the scenarios more aggressively and unexpectedly, continuing to disclose them only after banks’ exposures are fixed.
--- Introduce an experimental scenario (that will not be used in “grading” the bank’s relative performance or capital plans) to assess the implications of events outside of historical experience and to probe for weaknesses in the system.
--- As a way to evaluate banks’ internal models, require publication of loss rates or risk-weighted assets for the same hypothetical portfolios for which the Fed is disclosing its estimates.
--- Stick with the annual CCAR cycle.... Read More
Shortly after Lehman failed in 2008, investors began to flee from money market mutual funds (MMMFs). To halt the run, the U.S. Treasury guaranteed all $3.8 trillion in outstanding MMMF liabilities. That rescue created enduring moral hazard: the expectation that a future crisis will lead to another bailout.
Aside from their legal form as mutual funds, MMMFs functioned much like banks engaged in the transformation of liquidity, credit and (to some extent) maturity. Similar to banks that redeem deposits at face value, they promised investors a fixed share value of $1 (a “buck”) on demand. Unlike depositories, however, MMMFs had no capital, no deposit insurance, and—at least officially—no access to the lender of last resort. So, when the Reserve Primary Fund “broke the buck” (by failing to redeem at the $1 par value) in September 2008, MMMF investors panicked.
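The "buck" arithmetic is simple: a stable-value fund breaks it when its mark-to-market (shadow) NAV falls below roughly $0.995, the point at which it can no longer round to $1 under the SEC's penny-rounding convention. The fund in this sketch is stylized, though its $0.97 NAV matches the level to which Reserve Primary re-priced:

```python
def shadow_nav(portfolio_value, shares_outstanding):
    """Mark-to-market net asset value per share."""
    return portfolio_value / shares_outstanding

def breaks_the_buck(nav, threshold=0.995):
    """Below roughly $0.995, a stable-value fund can no longer
    round its share price to $1 (half-cent rounding convention)."""
    return nav < threshold

# A stylized fund: 50 billion $1 shares, assets marked down 3% after
# a default among its holdings.
nav = shadow_nav(portfolio_value=48.5e9, shares_outstanding=50e9)
assert breaks_the_buck(nav)         # NAV = $0.97: the buck is broken
assert not breaks_the_buck(0.999)   # small dips can still round to $1
```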
Somewhat surprisingly, it took until 2014 for the Securities and Exchange Commission (SEC) to resolve political conflicts and introduce significant rule changes for MMMFs (see our earlier posts here and here). The SEC now requires that institutional prime MMMFs—which (like Reserve Primary) frequently invest in short-term corporate liabilities—operate like other mutual funds with a floating net asset value (NAV). The same rule applies to institutional municipal MMMFs. Retail MMMFs, as well as those investing in federal government (and agency) securities, are exempt.
In light of a recent legislative proposal to water it down, in this post we review the impact of the SEC’s 2014 reform. To highlight our conclusions: (1) it did not go far enough to reduce run risk; (2) aside from temporary dislocations, it has not raised nonfinancial sector funding costs by more than would be accounted for by reducing the implicit taxpayer guarantee for MMMFs; and (3) reversing the floating-NAV requirement would weaken the safety of the U.S. financial system.... Read More
After nearly a decade of negotiations, last month, the Basel Committee on Banking Supervision completed the Basel III post-crisis reforms to capital regulation. The final standards include refinements to: credit risk measurement and the computation of risk-weighted assets; the calculation of off-balance-sheet exposures and of the requirements to address operational risk; and the leverage ratio requirement for global systemically important banks (G-SIBs).
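The risk-weighting computation itself is mechanical: each exposure is multiplied by a supervisory weight and the products are summed. The weights below echo the familiar standardized-approach buckets but are illustrative, not the final Basel III calibration:

```python
def risk_weighted_assets(exposures):
    """RWA = sum over the book of (exposure amount x risk weight)."""
    return sum(amount * weight for amount, weight in exposures)

# A stylized $1,000 balance sheet with illustrative weights.
book = [
    (200.0, 0.00),   # government bonds: zero weight
    (100.0, 0.20),   # high-grade bank claims
    (300.0, 0.50),   # residential mortgages
    (400.0, 1.00),   # corporate loans
]
rwa = risk_weighted_assets(book)
assert rwa == 570.0           # well below the 1000 of total assets
assert 50.0 / rwa >= 0.08     # $50 of capital clears an 8% risk-based minimum
```

The sensitivity of required capital to the weights is the heart of the debate: changing how those weights are computed changes effective requirements without touching the headline ratio.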
In this post, we focus on revisions to the way in which banks compute risk-weighted assets. To foreshadow our conclusion: the new approach adds unnecessarily to regulatory complexity. If the concern is that current risk-based requirements result in insufficient capital, it would be better simply to raise the requirements. Read More
Many features of our financial system—institutions like banks and insurance companies, as well as the configuration of securities markets—are a consequence of legal conventions (the rules about property rights and taxes) and the costs associated with obtaining and verifying information. When we teach money and banking, three concepts are key to understanding the structure of finance: adverse selection, moral hazard, and free riding. The first two arise from asymmetric information, either before (adverse selection) or after (moral hazard) making a financial arrangement (see our earlier primers here and here).
This primer is about the third concept: free riding. Free riding is tied to the concept of a public good, so we start there. Then, we offer three examples where free riding plays a key role in the organization of finance: credit ratings; schemes like the Madoff scandal; and efforts to secure financial stability more broadly.... Read More