Commentary

Africa’s Fallen Liberators

OXFORD – The leaders of two key African countries resigned their posts within 24 hours of each other last week. South Africa’s Jacob Zuma finally buckled under pressure from his own party to resign the presidency. The following day, Ethiopian Prime Minister Hailemariam Desalegn announced his decision to step down in the face of sustained mass protests and political turmoil.


In both cases, two of the oldest liberation parties in Africa, which have remained in office since first coming to power a quarter-century ago, were forced by deep popular discontent to push their leaders aside. The historical trajectories of the two parties are largely similar. Nevertheless, the effects of their leaders’ exits could not be more different.

Yes, both the African National Congress (ANC) in South Africa and the Ethiopian People’s Revolutionary Democratic Front (EPRDF) grew complacent and corrupt, and suffered political decay, over the past quarter-century. But whereas South Africa had put in place a robust set of institutional safeguards in the wake of its transition from apartheid, Ethiopia, after the overthrow of Mengistu Haile Mariam’s dictatorship, never managed to build national institutions strong enough to save the country from the ruling party.

Despite obvious differences in the two countries’ histories and economic conditions, the way their dominant parties conduct business, and the economic model they claim to have adopted, are strikingly similar. Both the ANC and the EPRDF espouse the Leninist principle of democratic centralism, according to which party members are expected to abide by the policies established by the central party leadership. Both parties deploy cadres widely to ensure that the civil service carries out political decisions. More recently, party elites in both countries have moved to embrace heterodox economic policies.

The ANC has been accused of losing touch with its impoverished voters in one of the most unequal countries on earth. Making matters worse, the 2012 police massacre of 34 miners at Marikana revived memories of the apartheid regime’s contempt for blacks. Even so, the ANC, accustomed to winning well above 65% of the vote (mainly owing to its liberation credentials), continued to back the embattled Zuma, who faces accusations ranging from bribery to rape. Governance deteriorated as the economy stagnated and corruption and state capture proceeded apace. The party of Nelson Mandela risked succumbing to internal rot – and taking the country down with it.

Meanwhile, in Ethiopia, the EPRDF harangues the public about how it freed the country from a brutal military dictatorship decades ago, even as millions of young people born and raised under the party’s rule face crippling unemployment. The EPRDF continues to imply that no failure by its leadership undermines its right to govern the country, while portraying political challenges from opposition groups as treasonous. As with the ANC, the longer the EPRDF has remained in power, the less it can imagine Ethiopia without itself at the helm.

When Zuma announced his resignation, the South African rand jumped to a three-year high. But after Desalegn announced his decision to step down, Ethiopia’s dollar bond fell to a six-month low. These indicators, however, are just the start of the divergent implications the two resignations have had for their respective countries’ politics and economies.

The ANC and EPRDF, like many of Africa’s liberation movements in the 1960s, stopped responding to evolving political and economic demands, while predatory politicians and their cronies hid behind the party’s banner. But while Zuma’s successor, Cyril Ramaphosa, promised a “new dawn” for South Africa when he addressed Parliament within days of Zuma’s resignation, Ethiopia declared a state of emergency around the same time, amid widespread concern about whether the state would survive the ethnically charged power struggle to succeed Desalegn.

The crucial difference is that South Africa had visionary leaders who were aware of the danger that an out-of-touch dominant party could pose. Mandela reportedly urged newly enfranchised black South Africans, “If the ANC does to you what the apartheid government did to you, then you must do to the ANC what you did to the apartheid government.” This sentiment was reflected in the democratic checks and balances enshrined by South Africa’s post-apartheid constitution.

Ethiopia, however, has not been so fortunate, particularly because the current regime spent its first decade in office shoring up its precarious power base and fighting a war with neighboring Eritrea. But the EPRDF made matters worse by undermining the constitution that it helped promulgate, and appearing to have no plan for nation building beyond supposedly rectifying historical inequalities between ethnic communities (though it succeeded in stimulating faster economic growth than the country had ever recorded in modern times).

The most telling indicator of how differently the two countries’ democratic institutions responded to the perceived decay of their dominant parties comes from recent election results. The ANC kept losing electoral ground to rival parties as its leadership failed to manage South Africa’s multifaceted problems. It was to stem further electoral losses that the party decided to push Zuma aside.

The EPRDF, by contrast, turned Ethiopia into a de facto single-party state. The country began its descent into political turmoil a few months after the party claimed to have won, together with its allies, all parliamentary seats in the 2015 elections. Last week, South Africa’s democratic institutions appeared to have saved the ANC from itself. Ethiopia is unlikely to be so fortunate. Too few institutional guardrails have survived the neglect or active dismantling that the EPRDF brought about.

Biniam Bedasso is a Global Leaders Fellow at the Blavatnik School of Government, University of Oxford.

By Biniam Bedasso

The Data-Driven City

SAN JOSÉ – When you look at your phone or tablet, what do you see? Pixels? Pictures? Digital distraction? I see data.


Every day, we generate enormous amounts of information, a binary trail of breadcrumbs that forms a map of our interests, habits, and interactions. For those of us in the business of urban planning, these disparate datasets represent a goldmine of opportunity. If harnessed properly, user-generated data can help planners build cities that are more in tune with people’s actual needs.

There is just one problem: the world is drowning in data. To make use of all the information that people unwittingly produce, planners must improve how data is captured, analyzed, and shared between the public and private sectors. If we succeed, some of the biggest obstacles that the world faces – from poverty to climate change – could become a bit more manageable.

One of the most significant innovations being embraced by the world’s planning agencies is the concept of “open data,” information that can be used by anyone to improve any aspect of public life. In an open-data environment, datasets from transportation, education, health care, and countless other municipal sectors are made available to optimize current services, or to create new ones. For example, France’s La Base Adresse Nationale pulls together location information to improve spatial analysis and emergency response times, while the European Union’s Urban Data Platform facilitates data sharing across the EU.

When governments, universities, research centers, and innovation hubs work together to share information, they become true partners in the urban-planning process. Open data usage can also promote transparency and build trust in government decision-making and official policies.

To be sure, many challenges must be overcome before governments can open the data floodgates. Legal frameworks must protect personal anonymity; data protocols should be adopted to ensure that decisions don’t exclude populations that are cut off from technology; and trustworthy platforms are needed to enable data sharing between agencies and municipalities without risk of sabotage.

But, once the hurdles are overcome, the opportunities for better planning will be virtually endless. Nuggets of digital detail illuminate how people move in and interact with the built environment. When combined with data from government sources – such as information on air quality, traffic patterns, crime, or health statistics – user-generated information can lead to more sustainable cities. For example, by mapping how and when people travel, planners can determine where to invest in cleaner modes of transportation – such as bike-sharing systems or electric car-charging stations.
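To make that concrete, here is a minimal sketch of the kind of analysis involved, written in Python with pandas. It assumes a hypothetical file of anonymized trip records; the file name, column names, and threshold are illustrative, not drawn from any real city’s data platform.

    import pandas as pd

    # Hypothetical CSV of anonymized, user-generated trip records.
    # Assumed columns: origin_zone (str), mode ("car", "bike", "transit"), hour (0-23).
    trips = pd.read_csv("trips.csv")

    # For each zone, count total trips and the share made by car.
    summary = (
        trips.assign(is_car=trips["mode"] == "car")
             .groupby("origin_zone")
             .agg(total_trips=("mode", "size"), car_trips=("is_car", "sum"))
    )
    summary["car_share"] = summary["car_trips"] / summary["total_trips"]

    # Busy zones with heavy car use are candidate sites for bike-sharing
    # docks or electric car-charging stations.
    candidates = (
        summary.query("total_trips > 1000")
               .sort_values("car_share", ascending=False)
               .head(10)
    )
    print(candidates)

Even a toy ranking like this illustrates the point: once trip data are open and standardized, deciding where a bike dock or charging station would do the most good becomes an empirical question rather than a guess.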

The interaction between transportation and climate-related challenges is one of the most promising areas for testing open-data solutions. Today, roughly half of the world’s population lives in cities, but cities account for approximately 75% of global carbon dioxide emissions, which are largely attributable to transportation. In many parts of the world, twentieth-century urban development strategies created sprawling, car-centered cities; but accelerating rates of urbanization have made this approach unsustainable.

Faced with such challenges, open data has become a key tool in redefining the urban-planning process. That is why my government is using it to understand how sectors like transportation – as well as agriculture, energy, and others – are affecting strategies to mitigate climate change. Costa Rica’s National Climate Change Metrics System, which is currently being developed, is planned as an open-source data tool that will integrate information from across national agencies into a single public portal.

The goal is to help improve decision-making and enhance the country’s ability to monitor and meet its climate-change goals. Eventually, the system will be used to engage Costa Ricans in sustainability programs, while the source code will be shared with other developing countries.

A world of truly open data will take time to build; people will need to become comfortable with the idea of user-generated information circulating freely. But governments have already recognized the importance of open data in solving key planning challenges. Costa Rica’s climate change portal is just one example of how improved access to information can streamline urban planning.

So, the next time you pick up your smartphone, pause for a moment to consider the full potential of what you’re holding. A more sustainable future for everyone is in the palm of your hand.

Ana Lucía Moya is Sustainable Transportation and Mobility Coordinator at the Center for Urban Development in Costa Rica.

By Ana Lucía Moya

Three Keys to a New South Africa

JOHANNESBURG – Jacob Zuma has resigned as South Africa’s president – an inevitable move, following the African National Congress’ withdrawal of its support. Two decades after Nelson Mandela tried – and failed – to pass the presidency to Cyril Ramaphosa, the former deputy president and current ANC head has become South Africa’s leader. And the challenges that Ramaphosa will face are almost as daunting as those Mandela confronted in lifting his country from the ruins of apartheid.


Nearly a quarter-century ago, four years after Mandela was released from prison, South Africans celebrated the birth of an inclusive, constitutional state. During Zuma’s tenure, however, that euphoria evaporated. Amid allegations of endemic corruption, ratings downgrades, corporate malfeasance, and deepening malaise among state-owned enterprises (SOEs), South Africa’s regional and international standing weakened.

For many, Ramaphosa represents a return to national strength. He has vowed to restore credibility to the management of South Africa’s affairs, and to reinvigorate the values of democratic inclusion. His simple gestures, like starting meetings and rallies on time, are departures from Zuma’s more aloof approach.

But returning accountability and good governance to South Africa will require much more than punctuality. Three key areas will need urgent attention if the country’s incoming leader is to chart a new course.

The first challenge, restoring faith in the country’s rule of law, may be the hardest to meet. Zuma’s “capture” of businesses, the National Prosecuting Authority, and cabinet appointments was so complete that untangling the webs of influence will take time. But restoring public confidence in these vital institutions must be a top priority.

Second, Ramaphosa’s government, whenever it is seated, will need to move quickly to reform the state’s relationship with SOEs. Zuma treated these businesses as vehicles for personal gain, and their mismanagement undermined economic growth and development. An economy characterized by poverty, inequality, and unemployment will never recover if the drivers of wealth are not operating effectively.

For example, mining continues to be a significant contributor to the South African economy; if managed properly, the sector could be a powerful lever for supporting the growth of upstream manufacturing. South Africa has some of the world’s largest deposits of chrome and manganese, minerals that are essential for the manufacture of electric vehicles, wind turbines, and other components of the so-called Fourth Industrial Revolution.

Unfortunately, because Zuma’s government misused resource wealth by redistributing mineral rents to loyal clients, trust between the mining industry and the state is nonexistent. And the only way to restore it – and thereby increase exploration and production – will be to overhaul legislation and regulation to ensure stronger protection of industry interests.

Restoring trust and accountability to the business environment would attract investment, create jobs, fill state coffers, and improve redistribution, especially to those for whom employment prospects remain limited. This final point is key; in recent years, South Africa’s welfare programs have been threatened by poor governance and mismanagement, and can be reformed only if economic growth returns.

Finally, Ramaphosa will need to invest heavily in South Africa’s education system, a sector that Zuma largely neglected. A good place to begin would be with early childhood education, where spending often yields high long-term rewards. With the youth unemployment rate currently at a staggering 39%, putting more young people to work will require rethinking how future generations are trained.

South Africa is a small country, but with the right reform-minded leadership, it can reassume its regional role as an economic and political powerhouse. In fact, this may be the ideal time to make changes at the top; much of Africa is undergoing similar shifts, which could bring new opportunities for economic cooperation. In neighboring Zimbabwe, for example, the end of Robert Mugabe’s misrule could reignite growth linked not only to natural resources, but also to value-added products, services, and trade.

As South Africa navigates its own presidential transition, the country must redefine its role in an evolving geopolitical landscape. To do that, the country must reassert its role as an influential actor, while pursuing a more dynamic, effective, and integrated investment strategy. Strong diplomacy and commercial outreach will be essential, and South Africa’s leaders should embrace and develop membership in economic clubs, like the BRICS group of major emerging economies (which also includes Brazil, Russia, India, and China).

South Africans are ready for new leadership. But to achieve a future defined by full employment, social justice, strong governance, and international credibility – the era that Mandela represented – Ramaphosa will need to return to the path from which Zuma so egregiously strayed.

Fred Phaswana is National Chairman of the South African Institute of International Affairs.

By Fred Phaswana

What Happens When Primary Health Care Is Universal?

BOSTON – Two years after the death of her husband, Valeria, a 67-year-old grandmother in San José, Costa Rica, lives alone. Last year, she was diagnosed with high blood pressure and diabetes, conditions that, while not immediately life threatening, require health care to manage. But, thanks to the quality of Costa Rica’s primary health-care system, Valeria has been able to maintain her independence and her health, even in the absence of family.


Costa Rica, a middle-income country committed to universal health care for its people, produces better health outcomes than most other countries, while spending less. In fact, Costa Rica has achieved the third-highest life expectancy in the Americas – behind only Canada and Bermuda, and well ahead of the United States. The secret of its success is revealed in our new report, “Building a Thriving Primary Health-Care System: The Story of Costa Rica.”

After her diabetes diagnosis, Valeria was automatically entered into the diabetes chronic-care program at her assigned clinic, a ten-minute walk from her house. She visits with her primary-care team every three months to have her blood pressure checked, and to make sure her diabetes is under control. And, once a year, a community health worker visits her home to ensure that it is safe, to administer vaccinations, and to share information about maintaining a healthy lifestyle.

As the global health community works to implement the United Nations Sustainable Development Goals, including SDG 3 – which targets good health and wellbeing for all by 2030 – Costa Rica offers a model worth emulating. Around the world, chronic diseases are increasing and populations are aging, making universal access to affordable care a top priority. Unfortunately, many patients will suffer far worse health outcomes than Valeria, simply because they are unable to access quality primary health-care services.

Well-organized primary health systems – which emphasize promotive, preventive, and chronic care, with general practitioners serving as the first point of contact – increase quality and reduce service fragmentation. Studies show that areas with more primary-care physicians have lower mortality and better health outcomes than those with fewer primary-care physicians. Primary health care is also a key pathway to achieving universal health coverage, a stated goal of the international community.

Over the past 20 years, Costa Rica’s social-security agency, the Caja Costarricense de Seguro Social, has built a primary health-care system that today reaches nearly every person in the country. Primary providers are the first place Costa Ricans turn when they have a health problem, as practitioners offer acute, chronic, and preventive services. A similar system has been used successfully in other countries, such as New Zealand, and allows patients and their families to build long-term relationships with providers.

Costa Rica’s approach began with reforms in the 1990s, when the country committed to a few simple changes designed to upgrade its health-care services. Some of these could be emulated by countries today.

For starters, officials in San José merged multiple health-care agencies into one, giving the new body authority over financing decisions and service delivery – everything from vaccinations to complex surgeries. While a consolidated approach might not work for every country, many could benefit from a more integrated bureaucratic approach.

Second, Costa Rica divided the country into 104 coverage areas, assigning every citizen a primary health-care team. This helped providers track health trends more precisely, and allowed for proactive and cost-effective health management.

Third, the authorities created multidisciplinary primary health-care teams capable of delivering preventive care services, such as vaccinations and education, in conjunction with acute and chronic medical support. This holistic approach draws on the combined expertise of physicians, nurses, community health workers, pharmacists, and data clerks.

Finally, the health department created a system for statistically gauging the quality of the care it was delivering. The data are currently used for ongoing monitoring to improve health-care provision in real time.

These four enhancements have had a dramatic effect on the system. Access to primary health care has surged, from 25% of the population in the early 1990s to 93% in 2006. Today, more than 94% of the population is assigned to a specific primary health-care team. Quality has also increased and, thanks to efficiency gains, the costs are a fraction of what other countries pay.

As countries pursue universal health coverage, they will need proven ways to bring higher quality, more affordable care to the underserved. Costa Rica offers one successful approach. By placing primary health care at the center of the system, the country has improved coverage rates and outcomes, while delivering more personalized treatment.

For patients like Valeria, this has meant a system that is accessible, easy to use, and caring on an ongoing basis. Costa Rica’s reforms have greatly improved her quality of life, and there are no doubt many other patients just like her, in every corner of the world, who could benefit from a similar approach.

Asaf Bitton is the director of primary health care at Ariadne Labs, a joint center of Brigham and Women’s Hospital and the Harvard T. H. Chan School of Public Health. Madeline Pesec is a primary-health-care researcher at Ariadne Labs.

By Asaf Bitton and Madeline Pesec

The Ethics of Fighting Drug Resistance

GOTHENBURG – In 2014, the World Health Organization reported that drug resistance – especially resistance to antibiotics – is a growing threat to human health, food security, and “the achievements of modern medicine.” Far from being an “apocalyptic fantasy,” the WHO said, a post-antibiotic era “is instead a very real possibility for the twenty-first century.”


Drug resistance threatens the effective treatment of a growing list of communicable diseases – from bacterial infections to viral and fungal diseases. When people recklessly use antibiotics to fight a common cold, when farmers use antibiotics to boost livestock productivity, or when pharmaceutical factories emit antibiotics into the environment to cut production costs, the bacteria that the drugs are designed to kill become resistant. The more antibiotics consumed and emitted, the faster resistance develops, leading to “superbugs” that jeopardize human health, both by raising the risk of massive deadly epidemics and by compromising medical services, such as surgery and cancer treatment, that rely on effective antibiotics.

This scary reality continues to frustrate health-care professionals. To be sure, there are solutions to the drug resistance crisis: restricted consumption, better diagnostics and disease surveillance, and expanded clinical development of new drugs are three. And some initial coordinated action has been taken in the WHO global action plan. But every fix has an ethical component, and four years after the WHO’s assessment, the ethical roadmap for addressing this medical emergency remains dangerously ill-defined.

Health-care policies that pursue long-term goals often have short-term costs for human, animal, and environmental wellbeing. For example, restricting antibiotic consumption in certain populations could lead to job losses for those prone to illness. Actions taken to prevent infections may also infringe on personal privacy, as epidemiologists seek to identify and track people who carry resistant bacteria. Controls may even require curbing individual freedoms, like accessing hospitals or getting on airplanes.

Moreover, capping antibiotic use could lead to higher drug prices, threatening access for those who need the medication. And, while many people might prefer a status quo approach that speeds up the development of new antibiotics while leaving current consumption unchanged, this solution brings its own set of ethical considerations – such as how and when to reduce the length of clinical trials.

For all of these reasons, ethicists, health-care researchers, and social scientists have begun to examine how best to ensure that strategies for tackling drug resistance are ethically responsible. In 2015, the year after the release of the WHO’s report, the journal Public Health Ethics published a special issue devoted entirely to this topic.

Then, in November 2017, the Centre for Antibiotic Resistance Research (CARe) at my own university held the first-ever major symposium on the topic, bringing together leading scholars in economics, ethics, law, policy, social science, and health care. The two-day conference provided a platform for the development of collaborative synergies, and the research output is scheduled to appear in the journal Bioethics.

These scholarly gatherings have helped to foster academic interest in the ethical considerations of drug resistance, but represent only a tiny fraction of what is needed to help the world safely navigate the looming moral minefield. Any effort to restrict antibiotic consumption, regulate the food and pharmaceutical industries, or change human behaviors – all strategies that are currently being discussed – will require complex ethical reflection and analysis.

The first ethical hurdle is to reach a consensus on how to characterize drug resistance. Many ethicists see it as a “collective action problem,” a public-health concern that must be addressed in an organized, holistic manner. There is less agreement, however, on what kind of collective action problem it is. Is it similar to other global challenges like climate change, poverty, or inequality? Or is it more of a national issue, best left to sovereign authorities? How we define the problem will determine what trade-offs people and governments are willing to make.

Several participants at the CARe symposium highlighted this problem, noting that to implement drug-resistance strategies successfully, governments must strike a balance between global medical responsibility and local public good. One idea that has been proposed is to tax meat produced with antibiotics, an approach that could move animal agriculture in a more sustainable direction. While meat costs might climb, drug resistance in livestock would decline, as would adverse environmental effects. The ethical question is whether a solution like this would be fair on a global level, especially if the result is more expensive food.

As drug resistance-related challenges become more urgent, one might think that ethical debates are an unaffordable luxury. But, given the risks implied by deploying ill-considered solutions, careful consideration of the ethical implications of drug resistance strategies is essential.

Christian Munthe is a bioethicist and professor of philosophy at the University of Gothenburg.

By Christian Munthe

Tackling Science’s Gender-Parity Problem

LONDON – Two years ago, the United Nations designated February 11 the International Day of Women and Girls in Science. As we approach another commemoration, it is worth reflecting on female scholars’ countless contributions to science and technology.


But even more important is to consider why the UN acted in the first place. Simply put, women have long suffered in their pursuit of science careers, and the global scientific community must recommit to making them full partners in the quest for human knowledge.

Achieving gender parity would yield an enormous payoff for scientific discovery. Last year marked the 150th anniversary of the birth of Poland’s Marie Curie, one of the greatest scientists of all time. Curie was the first woman to win a Nobel Prize, the only woman to win two, and the only person to do so in two different sciences: physics in 1903 and chemistry in 1911.

Curie faced immense gender barriers during her career. In 1891, having been blocked from studying or working at universities in Poland, she moved to Paris to study at the Sorbonne. Working with her husband, Pierre Curie, she conducted groundbreaking research on radioactivity. But when their work was nominated for the 1903 physics prize, her name was omitted. After her husband complained, the Nobel committee made an exceptional concession, and she was added to the award (she and her husband shared it with the French physicist Henri Becquerel).

Much has changed since then, and gender equality in the sciences has greatly improved. For example, the L'Oréal-UNESCO For Women in Science awards program, which honors female researchers working in the life and physical sciences, is now in its 20th year. Past winners have included experts in everything from quantum electronics to molecular biology (one of us, Vivian Wing-Wah Yam, won the prize in 2011).

Nonetheless, gender parity in the sciences remains a distant goal. Evidence suggests that bias is endemic in nearly every scientific field, and that institutional discrimination is still crippling careers and impeding scientific innovation.

The gender gap in science begins at a young age. As early as elementary school, girls are discouraged from pursuing careers in math and science, and this bias continues into university, where fewer women study for PhDs, hold research positions, or join the faculty. Globally, fewer than 30% of researchers are women.

Even for women who do get on the academic ladder, the climb is slowed by inadequate opportunities for grants, promotions, and leadership. One measure of this is seen in publication rates. Producing scholarly papers is critical for career advancement, but studies show that women publish fewer articles than their male colleagues, are less likely to be primary authors, and rarely serve as reviewers.

Worse, sexual harassment is prevalent in science-related academia and industry. Like many other professions, the science community needs to do more to address the issue in a meaningful way.

The cumulative effect of this discrimination is to rob the world of talented female scientists. Even among those with science-related degrees, fewer women than men remain in their field of study or are ever recognized for their work. Of the 599 Nobel Prizes awarded in the sciences since 1901, only 18 have gone to women, just 3% of the total.

Major changes – from grade schools to technology companies – are needed to build gender parity into science-related fields. Easy fixes can target individual industries. For example, bringing more female editors into the field of science publishing could raise the percentage of women appearing in peer-reviewed publications.

Other adjustments would have broader reach. A recent study of grant programs in Canada found that when referees are trained to recognize gender discrimination, funding outcomes naturally rebalance. Launching similar training efforts in other countries could have a profound impact on how science grants are awarded – and how many are awarded to women.

And yet, while individual tweaks can be beneficial, the world’s scientific community must move beyond piecemeal solutions to tackle gender bias in a more holistic way. Academic institutions, research centers, and science-related employers must commit to diversifying their bases of recruitment, and improve efforts to recognize and respond to discrimination. Moreover, by improving cultural competencies (the ability to recognize and respond to biases), organizations can create environments that are equitable and physically, spiritually, socially, and emotionally safe for both women and men.

Achieving gender equity, diversity, and inclusion in the sciences will require cooperation across many sectors. It will also take time. But, 150 years after Marie Curie’s birth, it is clear that action is long overdue.

That is why this February 11, as the world observes the third International Day of Women and Girls in Science, scientists from across the disciplines should take a moment to reflect on how far their female colleagues have come, and to remember how far we still have to go.

Stephen Matlin is an adjunct professor at the Institute of Global Health Innovation, Imperial College London. Vivian Wing-Wah Yam is Professor of Chemistry and Energy at the University of Hong Kong. Henning Hopf is a professor in the Institute of Organic Chemistry, Technische Universität Braunschweig. Alain Krief is Executive Director of the International Organization for Chemical Sciences in Development, Emeritus Professor in the Chemistry Department at Belgium’s Namur University, and an adjunct professor in the HEJ Research Institute of Chemistry, University of Karachi. Goverdhan Mehta is University Distinguished Professor and Chair in the School of Chemistry at the University of Hyderabad.

By Stephen Matlin, Vivian Wing-Wah Yam, Henning Hopf, Alain Krief, and Goverdhan Mehta

Liberia’s Presidential Legacy

DALLAS – Last month, Liberians witnessed something remarkable: a peaceful transfer of power in their country. After 12 years in office, Ellen Johnson Sirleaf stepped down as president, and former soccer star George Weah was inaugurated. It was the first time since 1944 that one democratically elected leader voluntarily made way for another.


To be sure, one well-managed election does not a stable democracy make. But in a region often associated with coups d’état and authoritarian rule, Liberia’s progress is worthy of celebration, as it can help to lay the foundation for a better, more democratic future. As representative government in Liberia enters a new phase of maturity, it is worth reflecting on how the country got here.

In 2005, when Sirleaf was elected, Liberia was a shambles, following 25 years of civil war and dictatorship. Few predicted that Sirleaf, Africa’s first democratically elected female head of state, could set her country on a better path. But, though her tenure was not without challenges – from the Ebola crisis to endemic corruption and fiscal difficulties – she did just that.

Perhaps Sirleaf’s defining legacy will be the improved rights of Liberian women and girls. Female voters powered Sirleaf to victory; the Women of Liberia Mass Action for Peace movement, which helped end Liberia’s second civil war in 2003, was among her strongest political backers. During her tenure, Sirleaf increased the participation of women in all aspects of society and aimed to ensure greater rights and protections for women and girls. Sirleaf, along with another Liberian, Leymah Gbowee, and a Yemeni, Tawakkol Karman, was awarded the 2011 Nobel Peace Prize largely for her work in this field.

The empowerment of women was only one area where Sirleaf made gains. She recognized that peace, strong governance, and growth would be the pillars of her country’s future. So she spearheaded efforts to secure justice for human-rights abuses that had occurred during the civil wars; reignited the economy through debt relief; rebuilt war-torn infrastructure; improved access to clean water and sanitation; and strengthened Liberia’s democratic institutions, including by enacting the country’s first Freedom of Information Act. Much work remains to be done, but this progress should not be underestimated.

As Sirleaf noted in her Nobel Peace Prize acceptance speech, “rebuilding a nation nearly destroyed by war and plunder” was her greatest political priority. While “there was no roadmap for post-conflict transformation,” she continued, “we knew that we could not let our country slip back into the past.” Her government’s “greatest responsibility,” therefore, was to “keep the peace.”

Sirleaf’s leadership served as a catalyst for a more stable, prosperous, and freer Liberia. In its 2017 “Freedom in the World” report, Freedom House concluded that Liberia had made significant progress in human and political rights. Sirleaf’s commitment to democratic ideals was instrumental in enabling these gains, which helped her country secure greater international support.

For example, in October 2015, the United States awarded Liberia a $257 million grant for energy and infrastructure initiatives under the Millennium Challenge Corporation. This type of assistance is granted only to countries that show improvements in democratic governance and economic development.

Still, Liberia faces many daunting challenges. Human development indicators, such as life expectancy and per capita income, remain well below the regional average. A sluggish economy and rising inflation are also testing economic stability. Liberia’s new president will need to focus on these and other issues to ensure continued progress.

For starters, Weah should continue Sirleaf’s work of investing in women and girls; after all, improving gender equality is a proven catalyst for enhancing national prosperity. Moreover, Weah and his government would do well to embrace Sirleaf’s statement that “poverty, illiteracy, disease, and inequality do not belong in the twenty-first century.” Empowering all Liberians is the only way to keep the country on its positive trajectory.

Finally, like Sirleaf, Weah’s government must embrace coalition-building and active engagement at the local, regional, and global levels. Cooperation will be essential to strengthen existing partnerships and to open new avenues for development.

Liberia now has in place a democratic system that will support continued progress. But that system cannot be taken for granted. On the contrary, Weah must build on the good work of his predecessor to strengthen and sustain it. By embracing and deepening their country’s democracy, Liberians and their new president can ensure a better future.

Natalie Gonnella-Platts is Deputy Director of the Women’s Initiative at the George W. Bush Institute. Lindsay Lloyd is Deputy Director of the Human Freedom Initiative at the George W. Bush Institute.

By Natalie Gonnella-Platts and Lindsay Lloyd

Donald Trump Is Playing to Lose

BERKELEY – America certainly has a different kind of president than it is used to. What distinguishes Donald Trump from his predecessors is not just his temperament and generalized ignorance, but also his approach to policymaking.


First, consider Bill Clinton, who in 1992 was, like Trump, elected without a majority of the popular vote. Once in office, Clinton appealed to the left with fiscal-stimulus and health-care bills (both unsuccessful), but also tacked to the center with a pro-growth deficit-reduction bill. He appealed to the center right by concluding the North American Free Trade Agreement (NAFTA), which had been conceived under his Republican predecessors, and by signing a major crime bill. And he reappointed the conservative stalwart Alan Greenspan to chair the US Federal Reserve.

Clinton hoped to achieve three things with this “triangulation” strategy: to enact policies that would effectively address the country’s problems; to convince voters who hadn’t supported him that he was looking out for their interests, too; and to keep his own base intact.

In 2008, Barack Obama was elected with a popular majority. But, like Clinton, he moderated many of his positions once in office. He tacked to the center with technocratic financial-rescue and fiscal-stimulus plans. And he pushed through a market-oriented health-care bill modeled after legislation that Mitt Romney had enacted while serving as the Republican governor of Massachusetts.

Obama also appealed directly to the right with an (unsuccessful) attempt at a “grand bargain” to cut deficits and social spending. His market-oriented cap-and-trade plan to regulate greenhouse-gas emissions was almost indistinguishable from that of his Republican opponent in the 2008 presidential election, Arizona Senator John McCain. And he reappointed Ben Bernanke, originally nominated by Republican President George W. Bush, to chair the Fed.

Obama strove to represent not “red” or “blue” America, but “purple” America. He pursued cautious and technocratic policies that he hoped would attract Republican support. And when his own supporters objected, he reminded them that national unity and mutual respect, not narrow partisanship, would eventually bend the moral arc of the universe toward justice.

Trump, by contrast, won the presidency while losing the popular vote by a wide margin. Yet, once in office, he promptly appealed to right-wing white nativists by issuing his promised travel ban against Muslims. He tried to destroy the 2010 Affordable Care Act (Obamacare) without having a plan for what would replace it. He again appealed to the nativist right by dismissing police brutality against African-Americans, and by describing white supremacists as “very fine people.” And he finished his first year by signing legislation that cuts taxes for the rich, but does little to win over anyone else.

This is not normal politics. Trump clearly has no interest in unifying the country or enacting policies that will actually work. He has not given the majority of Americans who oppose him any reason to change their minds, nor has he counseled his base on the need for durable policies rather than evanescent legislative victories. Most importantly, he has done nothing to help himself get re-elected.

Of course, the same now applies to many Republicans. Here in California last year, we were treated to a remarkable spectacle in which the state’s Republican delegation in the US House of Representatives did not even bother to argue for a tax package that would benefit their constituents. It was as if they had already given up on winning re-election, and were all looking forward to leaving Congress to take high-paying jobs as lobbyists.

According to the Trump administration, its next legislative priority is infrastructure. That sounds like an issue where Trump could tack left, by devising a plan with egalitarian distributional effects and evidence-based provisions to boost economic growth.

But we shouldn’t count on that outcome. The Trump administration doesn’t seem to have any coherent policy-design process. There have been no hearings or white papers to assess the costs and benefits of various infrastructure proposals. Nor have there been any discussions with lawmakers to establish a rough consensus upon which to base legislation. As with the travel ban and the attempt to repeal Obamacare, there has been no public deliberation whatsoever. All we have are the president’s tweets.

Back in 1776, Adam Smith argued that, in a system founded on “natural liberty,” the government’s three tasks are to provide national defense, ensure public safety and the enforcement of property rights and contracts, and supply infrastructure. According to Smith, the government has the duty to “[erect and maintain] certain public works and certain public institutions, which it can never be for the interest of any individual, or small number of individuals, to erect and maintain.”

To Smith, the reason why governments must take up the task of building infrastructure was clear: “the profit could never repay the expense to any individual or small number of individuals, though it may frequently do much more than repay it to a great society.” Today, we know that public goods actually can be made profitable, but only by granting monopolies, which comes at a high cost to society.

Unfortunately, Trump’s staff does not seem to have gotten Smith’s memo about good government. The administration will most likely propose an infrastructure program based on public subsidies for private investors, who will then select projects from which they can profit by charging monopoly prices. The plan will be well-received at Fox News, and possibly even by pundits at The New York Times, who might stroke their chins and lament that the Democrats are rejecting Trump’s open hand on infrastructure.

But, unlike Clinton and Obama, Trump will have shown yet again that he does not intend to be the president of most, let alone all, Americans. Rather than use the opportunity provided by a debate over infrastructure to advance the cause of national unity, he will instead push the US further toward kleptocracy.

J. Bradford DeLong, a former deputy assistant US Treasury secretary, is Professor of Economics at the University of California at Berkeley and a research associate at the National Bureau of Economic Research. 

By J. Bradford DeLong

When Will Tech Disrupt Higher Education?

CAMBRIDGE – In the early 1990s, at the dawn of the Internet era, an explosion in academic productivity seemed to be around the corner. But the corner never appeared. Instead, teaching techniques at colleges and universities, which pride themselves on spewing out creative ideas that disrupt the rest of society, have continued to evolve at a glacial pace.


Sure, PowerPoint presentations have displaced chalkboards, enrollments in “massive open online courses” often exceed 100,000 (though the number of engaged students tends to be much smaller), and “flipped classrooms” replace homework with watching taped lectures, while class time is spent discussing homework exercises. But, given education’s centrality to raising productivity, shouldn’t efforts to reinvigorate today’s sclerotic Western economies focus on how to reinvent higher education?

One can understand why change is slow to take root at the primary and secondary school level, where the social and political obstacles are massive. But colleges and universities have far more capacity to experiment; indeed, in many ways, that is their raison d’être.

For example, what sense does it make for each college in the United States to offer its own highly idiosyncratic lectures on core topics like freshman calculus, economics, and US history, often with classes of 500 students or more? Sometimes these giant classes are great, but anyone who has gone to college can tell you that is not the norm.

At least for large-scale introductory courses, why not let students everywhere watch highly produced recordings by the world’s best professors and lecturers, much as we do with music, sports, and entertainment? This does not mean a one-size-fits-all scenario: there could be a competitive market, as there already is for textbooks, with perhaps a dozen people dominating much of the market.

And videos could be used in modules, so a school could choose to use, say, one package to teach the first part of a course, and a completely different package to teach the second part. Professors could still mix in live lectures on their favorite topics, but as a treat, not as a boring routine.

A shift to recorded lectures is only one example. The potential for developing specialized software and apps to advance higher education is endless. There is already some experimentation with using software to help understand individual students’ challenges and deficiencies in ways that guide teachers on how to give the most constructive feedback. But so far, such initiatives are very limited.

Perhaps change in tertiary education is so glacial because the learning is deeply interpersonal, making human teachers essential. But wouldn’t it make more sense for the bulk of faculty teaching time to be devoted to helping students engage in active learning through discussion and exercises, rather than to sometimes hundredth-best lecture performances?

Yes, outside of traditional brick-and-mortar universities, there has been some remarkable innovation. The Khan Academy has produced a treasure trove of lectures on a variety of topics, and it is particularly strong in teaching basic mathematics. Although the main target audience is advanced high school students, there is a lot of material that college students (or anyone) would find useful.

Moreover, there are some great websites, including Crash Course and Ted-Ed, that contain short general education videos on a huge variety of subjects, from philosophy to biology to history. But while a small number of innovative professors are using such methods to reinvent their courses, the tremendous resistance they face from other faculty holds down the size of the market and makes it hard to justify the investments needed to produce more rapid change.

Let’s face it: college faculty are no keener to see technology cut into their jobs than any other group. And, unlike most factory workers, university faculty members have enormous power over the administration. Any university president who tries to run roughshod over them will usually lose her job long before any faculty member does.

Of course, change will eventually come, and when it does, the potential effect on economic growth and social welfare will be enormous. It is difficult to suggest an exact monetary figure, because, like many things in the modern tech world, money spent on education does not capture the full social impact. But even the most conservative estimates suggest the vast potential. In the US, tertiary education accounts for over 2.5% of GDP (roughly $500 billion), and yet much of this is spent quite inefficiently. The real cost, though, is not the squandered tax money, but the fact that today’s youth could be learning so much more than they do.

Universities and colleges are pivotal to the future of our societies. But, given impressive and ongoing advances in technology and artificial intelligence, it is hard to see how they can continue playing this role without reinventing themselves over the next two decades. Education innovation will disrupt academic employment, but the benefits to jobs everywhere else could be enormous. If there were more disruption within the ivory tower, economies just might become more resilient to disruption outside it.

Kenneth Rogoff, a former chief economist of the IMF, is Professor of Economics and Public Policy at Harvard University.

By Kenneth Rogoff

Burying the Legal Ghosts of Brazil’s Hyperinflation

SÃO PAULO – A decades-old legal fight between consumers and financial institutions over the impact of Brazil’s economic policies of the 1980s and 1990s is nearing conclusion. In December, lawyers representing claimants presented Brazil’s Supreme Federal Court with a request to ratify a settlement reached with the banks.


If the court approves the deal, the settlement would put billions of reais into the pockets of savers. But more than a long-awaited payday for some one million claimants, the court-ordered restitution would also mark an official end to Brazil’s seemingly endless war on hyperinflation.

During the late 1980s and early 1990s, the Brazilian government struggled to stabilize the country’s economy and currency. At the height of the crisis, annual inflation reached 2,477%; at that rate, prices for food and household goods increased daily. A string of unsuccessful policies had entrenched inflation adjustments in public and private contracts, affecting wages, rents, and bank deposits. Some highly controversial measures – such as a move in 1990 to commandeer deposits – briefly halted inflation but contributed to a deep recession.
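To see why that rate implied daily price increases, convert the annual figure to a daily one – a back-of-the-envelope calculation, assuming the rate compounded steadily over a 365-day year:

\[
(1 + 24.77)^{1/365} - 1 \approx 0.0089,
\]

or roughly 0.9% per day – enough, if sustained, to double prices in about two and a half months.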

The implementation in 1994 of Plano Real (“The Real Plan”) brought some relief, by introducing a set of stabilization measures that created the current currency. But these were also troubled times for Brazil politically, as the country was still consolidating its transition to democracy after 20 years of military dictatorship. Hyperinflation and its social consequences amounted to an economic gauntlet for Brazil’s new leaders, and rising inequality posed a severe threat to the country’s democratic hopes.

Although politicians eventually navigated the currency crisis, the economic tensions never really vanished, even after the 1994 plan took hold. Many of the failed monetary measures cost people significant savings; as a result, nearly every stabilization plan from that period was litigated, including the successful Plano Real. Lawsuits in that case are still pending at the Supreme Court.

Many in Brazil, especially central bankers, have warned that continued litigation of past monetary policies could result in a breakdown of the current financial system, leading to new levels of economic dysfunction, such as insufficient credit.

That possibility seems to have had a sobering effect on the country’s top court. Historically, the Supreme Court has sided with consumers in cases related to inflation adjustments on savings deposits. But the justices have also tempered their decisions in cases with sweeping economic implications. And, in the case of the lawsuits implicating economic policy, no final decision has been reached, despite their being on the docket for years. That is why the settlement reached in December – which was ratified by the attorney general – could be interpreted as giving the court a way out.

Full details of the settlement have not been disclosed. But it seems increasingly clear that after nearly three decades of legal wrangling, consumers and financial institutions have agreed that a negotiated approach is the only way forward.

As a result, whatever the final tally, banks will likely get off easier than authorities had feared. According to reports, the final settlement will be in the vicinity of R$12 billion ($3.8 billion), a far cry from a previous central bank estimate of R$150 billion, or the R$341 billion predicted by Febraban, the Brazilian Federation of Banks.

If this long legal journey does indeed end this year, updates to the Brazilian Code of Civil Procedure will deserve much of the credit. Changes implemented in March 2016 encourage litigants to pursue mediation and arbitration, a move meant to reduce the tens of millions of civil lawsuits that are currently clogging the courts. These revisions could also help other old legal cases move toward conclusion.

An approved settlement would mark the end of a complex and divisive legal fight that for too long has prevented Brazil’s leaders from putting a legacy of failed economic initiatives behind them. That would be welcome news for Brazil’s consumers, financial institutions, and overall economic health.

Camila Villard Duran is a professor of law at the University of São Paulo, and a former Oxford-Princeton Global Leaders Fellow at the Woodrow Wilson School of Public and International Affairs. Arnoldo Wald is a professor of law at Rio de Janeiro State University.

By Camila Villard Duran and Arnoldo Wald
