MIT News - Economics - Abdul Latif Jameel Poverty Action Lab (J-PAL) - Behavioral economics - Game theory MIT News is dedicated to communicating to the media and the public the news and achievements of the students, faculty, staff and the greater MIT community. en Tue, 03 Mar 2020 19:01:01 -0500 QS World University Rankings rates MIT No. 1 in 12 subjects for 2020 Institute ranks second in five subject areas. Tue, 03 Mar 2020 19:01:01 -0500 MIT News Office <p>MIT has been honored with 12 No. 1 subject rankings in the QS World University Rankings for 2020.</p> <p>The Institute received a No. 1 ranking in the following QS subject areas: Architecture/Built Environment; Chemistry; Computer Science and Information Systems; Chemical Engineering; Civil and Structural Engineering; Electrical and Electronic Engineering; Mechanical, Aeronautical and Manufacturing Engineering; Linguistics; Materials Science; Mathematics; Physics and Astronomy; and Statistics and Operational Research.</p> <p>MIT also placed second in five subject areas: Accounting and Finance; Biological Sciences; Earth and Marine Sciences; Economics and Econometrics; and Environmental Sciences.</p> <p>Quacquarelli Symonds Limited subject rankings, published annually, are designed to help prospective students find the leading schools in their field of interest. Rankings are based on research quality and accomplishments, academic reputation, and graduate employment.</p> <p>MIT has been ranked as the No. 
1 university in the world by QS World University Rankings for eight straight years.</p> Afternoon light streams into MIT’s Lobby 7. Image: Jake Belcher The case for economics — by the numbers A multidecade study shows economics increasingly overlaps with other disciplines, and has become more empirical in nature. Tue, 03 Mar 2020 00:00:00 -0500 Peter Dizikes | MIT News Office <p>In recent years, criticism has been leveled at economics for being insular and unconcerned about real-world problems. But a new study led by MIT scholars finds the field increasingly overlaps with the work of other disciplines, and, in a related development, has become more empirical and data-driven, while producing less work of pure theory.</p> <p>The study examines 140,000 economics papers published over a 45-year span, from 1970 to 2015, tallying the “extramural” citations that economics papers received in 16 other academic fields — ranging from other social sciences such as sociology to medicine and public health. In seven of those fields, economics is the social science most likely to be cited, and it is virtually tied for first in citations in another two disciplines.</p> <p>In psychology journals, for instance, citations of economics papers have more than doubled since 2000.
Public health papers now cite economics work twice as often as they did 10 years ago, and citations of economics research in fields from operations research to computer science have risen sharply as well.</p> <p>While citations of economics papers in the field of finance have risen slightly in the last two decades, that rate of growth is no higher than it is in many other fields, and the overall interaction between economics and finance has not changed much. That suggests economics has not been unusually oriented toward finance issues — as some critics have claimed since the banking-sector crash of 2007-2008. And the study’s authors contend that as economics becomes more empirical, it is less dogmatic.</p> <p>“If you ask me, economics has never been better,” says Josh Angrist, an MIT economist who led the study. “It’s never been more useful. It’s never been more scientific and more evidence-based.”</p> <p>Indeed, the proportion of economics papers based on empirical work — as opposed to theory or methodology — cited in top journals within the field has risen by roughly 20 percentage points since 1990.</p> <p>The paper, “Inside Job or Deep Impact? Extramural Citations and the Influence of Economic Scholarship,” appears in this month’s issue of the <em>Journal of Economic Literature</em>.</p> <p>The co-authors are Angrist, who is the Ford Professor of Economics in the MIT Department of Economics; Pierre Azoulay, the International Programs Professor of Management at the MIT Sloan School of Management; Glenn Ellison, the Gregory K.
Palm Professor of Economics and associate head of the Department of Economics; Ryan Hill, a doctoral candidate in MIT’s Department of Economics; and Susan Feng Lu, an associate professor of management in Purdue University’s Krannert School of Management.</p> <p><strong>Taking critics seriously</strong></p> <p>As Angrist acknowledges, one impetus for the study was the wave of criticism the economics profession has faced over the last decade, after the finance-sector crash of 2008 and the “Great Recession” of 2008-2009 that followed. The paper’s title alludes to the film “Inside Job” — whose thesis holds that, as Angrist puts it, “economics scholarship as an academic enterprise was captured somehow by finance, and that academic economists should therefore be blamed for the Great Recession.”</p> <p>To conduct the study, the researchers used the Web of Science, a comprehensive bibliographic database, to examine citations between 1970 and 2015. The scholars developed machine-learning techniques to classify economics papers into subfields (such as macroeconomics or industrial organization) and by research “style” — meaning whether papers are primarily concerned with economic theory, empirical analysis, or econometric methods.</p> <p>“We did a lot of fine-tuning of that,” says Hill, noting that for a study of this size, a machine-learning approach is a necessity.</p> <p>The study also details the relationship between economics and four additional social science disciplines: anthropology, political science, psychology, and sociology. Among these, political science has overtaken sociology as the discipline most engaged with economics.
Psychology papers now cite economics research about as often as they cite works of sociology.</p> <p>The new intellectual connectivity between economics and psychology appears to be a product of the growth of behavioral economics, which examines the irrational, short-sighted financial decision-making of individuals — a different paradigm than the assumptions about rational decision-making found in neoclassical economics. During the study’s entire time period, one of the economics papers cited most often by other disciplines is the classic article “Prospect Theory: An Analysis of Decision under Risk,” by behavioral economists Daniel Kahneman and Amos Tversky.</p> <p>Beyond the social sciences, other academic disciplines for which the researchers studied the influence of economics include four classic business fields — accounting, finance, management, and marketing — as well as computer science, mathematics, medicine, operations research, physics, public health, and statistics.</p> <p>The researchers believe these “extramural” citations of economics are a good indicator of economics’ scientific value and relevance.</p> <p>“Economics is getting more citations from computer science and sociology, political science, and psychology, but we also see fields like public health and medicine starting to cite economics papers,” Angrist says. “The empirical share of the economics publication output is growing. That’s a fairly marked change. But even more dramatic is the proportion of citations that flow to empirical work.”</p> <p>Ellison emphasizes that because other disciplines are citing empirical economics more often, it shows that the growth of empirical research in economics is not just a self-reinforcing change, in which scholars chase trendy ideas. Instead, he notes, economists are producing broadly useful empirical research. 
</p> <p>“Political scientists would feel totally free to ignore what economists were writing if what economists were writing today wasn’t of interest to them,” Ellison says. “But we’ve had this big shift in what we do, and other disciplines are showing their interest.”</p> <p>It may also be that the empirical methods used in economics now more closely match those in other disciplines.</p> <p>“What’s new is that economics is producing more accessible empirical work,” Hill says. “Our methods are becoming more similar … through randomized controlled trials, lab experiments, and other experimental approaches.”</p> <p>But as the scholars note, there are exceptions to the general pattern in which greater empiricism in economics corresponds to greater interest from other fields. Computer science and operations research papers, which increasingly cite economists’ research, are mostly interested in the theory side of economics. And the growing overlap between psychology and economics involves a mix of theory and data-driven work.</p> <p><strong>In a big country</strong></p> <p>Angrist says he hopes the paper will help journalists and the general public appreciate how varied economics research is.</p> <p>“To talk about economics is sort of like talking about [the United States of] America,” Angrist says. “America is a big, diverse country, and economics scholarship is a big, diverse enterprise, with many fields.”</p> <p>He adds: “I think economics is incredibly eclectic.”</p> <p>Ellison emphasizes this point as well, observing that the sheer breadth of the discipline gives economics the ability to have an impact in so many other fields.</p> <p>“It really seems to be the diversity of economics that makes it do well in influencing other fields,” Ellison says. “Operations research, computer science, and psychology are paying a lot of attention to economic theory.
Sociologists are paying a lot of attention to labor economics, marketing and management are paying attention to industrial organization, statisticians are paying attention to econometrics, and the public health people are paying attention to health economics. Just about everything in economics is influential somewhere.”</p> <p>For his part, Angrist notes that he is a biased observer: He is a dedicated empiricist and a leading practitioner of research that uses quasiexperimental methods. His studies leverage circumstances in which, say, policy changes or random assignments in civic life allow researchers to study two otherwise similar groups of people separated by one thing, such as access to health care.</p> <p>Angrist was also a graduate-school advisor of Esther Duflo PhD ’99, who won the Nobel Prize in economics last fall, along with MIT’s Abhijit Banerjee — and Duflo thanked Angrist at their Nobel press conference, citing his methodological influence on her work. Duflo and Banerjee, as co-founders of MIT’s Abdul Latif Jameel Poverty Action Lab (J-PAL), are advocates of using field experiments in economics, which is still another way of producing empirical results with policy implications.</p> <p>“More and more of our empirical work is worth paying attention to, and people do increasingly pay attention to it,” Angrist says. “At the same time, economists are much less inward-looking than they used to be.”</p> A new study examines 140,000 economics papers published from 1970 to 2015, tallying the “extramural” citations that economics papers received in 16 other academic fields, including sociology, medicine, and public health. Image: Christine Daniloff, MIT How door-to-door canvassing slowed an epidemic Study finds that in Liberia, volunteers limited damage from Ebola by distributing information within their own communities.
Wed, 26 Feb 2020 23:59:59 -0500 Peter Dizikes | MIT News Office <p>Liberia was the epicenter of a high-profile Ebola outbreak in 2014-15, which led to more than 10,000 deaths in West Africa. But for all the devastation the illness caused, it could have been worse without an innovative, volunteer-based outreach program Liberia’s government deployed in late 2014.</p> <p>Now, a study co-authored by an MIT professor shows how much that program, consisting of door-to-door canvassing by community volunteers, spread valuable information and changed public practices during the epidemic. The findings also demonstrate how countries with minimal resources can both fight back against epidemics and gain public trust in difficult circumstances. &nbsp;</p> <p>“Mediated [volunteer-based] government outreach had a positive impact on all of the [health] outcomes we measured,” says Lily Tsai, a professor of political science at MIT and co-author of a new paper detailing the study’s findings. “People knew more [about Ebola], had a more factual understanding of the epidemic, and were more willing to comply with government control measures. And downstream, they’re more likely to trust government institutions.”</p> <p>Indeed, after talking to canvassers, residents of Monrovia, Liberia’s capital, were 15 percentage points more supportive of disease control policies, 10 percentage points less likely to violate a ban on public gatherings (to limit the spread of Ebola), 26 percentage points more likely to support victims’ burials by government workers, and 9 percentage points more likely to trust Liberia’s Ministry of Health, among other outcomes. 
They were also 10 percentage points more likely to use hand sanitizer.</p> <p>Intriguingly, the volunteer-based outreach program succeeded after an earlier 2014 campaign, using Ministry of Health staff, was abandoned, having been “met with disbelief and outright violence,” as the new paper states.</p> <p>“There’s often an assumption that government outreach doesn’t work,” says Tsai, the Ford Professor of Political Science at MIT. “What we find is that it does work, but it really matters how that government outreach is conducted and structured.”</p> <p>The research shows that, crucially, 30 percent of the people who spoke with canvassers already knew those volunteers, adding a layer of social trust to the program. And all volunteers canvassed in communities where they lived.</p> <p>“They were building interpersonal trust and enabling people to hold them accountable for any misinformation,” Tsai says. “They were like guarantors for a loan. It’s a way of saying, ‘You can trust me. I’m going to co-sign for the government. I’m going to guarantee it.’”</p> <p>The paper, “Building Credibility and Cooperation in Low-Trust Settings: Persuasion and Source Accountability in Liberia During the 2014-2015 Ebola Crisis,” appears in advance online form in the journal <em>Comparative Political Studies.</em></p> <p>In addition to Tsai, the authors are Benjamin S. Morse PhD ’19, a senior training manager and researcher at MIT’s Abdul Latif Jameel Poverty Action Lab (J-PAL), and Robert A. Blair, an assistant professor of political science and international and public affairs at Brown University.&nbsp;&nbsp;</p> <p><strong>When “costly signals” build confidence</strong></p> <p>Liberia faced many challenges while responding to the Ebola crisis. 
The nation’s brutal civil wars, from 1989 to 2003, stripped away much of the government’s functionality, and while the country has since taken major steps toward stability, there is still deep and widespread suspicion about government.</p> <p>“In Liberia, you have a postconflict setting where citizens already mistrusted the government strongly,” Tsai explains. “When citizens say they don’t trust the government, they sometimes think the government is actually out to hurt them, physically.”</p> <p>To conduct the study, the researchers carried out multiple public-opinion surveys in Liberia in 2014 and 2015, and added 80 in-depth interviews with government leaders and residents in 40 randomly sampled communities in Monrovia.</p> <p>To be sure, Ebola was a substantial problem in Liberia. Overall, there were 10,678 reported cases of Ebola and 4,810 deaths attributed to the illness. In June 2014, the surveys showed, 38 percent of Monrovia residents thought the government’s statements about Ebola constituted a “lie” designed to generate more funding from outside aid groups.</p> <p>However, the study found, once the volunteer-based program got underway, canvassers were able to not only reach large numbers of residents but persuade residents to believe what they were saying.</p> <p>While knocking on doors in their own communities, the canvassers were equipped with bibs and badges to identify themselves as program volunteers. They distributed information and had conversations with other residents, and even offered their own contact information to people — a significant (and potentially risky) gesture providing a form of accountability to other citizens.</p> <p>“A large part of what worked was that the outreach workers made it possible for the people that they were canvassing to track them down,” Tsai says.
“That’s a pretty big commitment, what we call a ‘costly signal.’ Costly signals help build trust, because it’s not cheap talk.”</p> <p>Ultimately, while Ebola took a significant toll in Liberia, the volunteer campaign was “remarkably (and surprisingly) effective” in changing both behavior and attitudes, the paper concludes. &nbsp;</p> <p><strong>A case study in rebuilding trust? </strong></p> <p>Tsai believes that beyond the specific contours of Liberia’s Ebola response, there are larger issues that can be applied to the study of other countries. For one, while Liberia received significant aid in combatting Ebola from the World Health Organization and other nongovernmental organizations, she thinks the need for short-term aid should not preclude the long-term building of government capacity.</p> <p>“In the short term, it can make sense for external actors to substitute for the government,” Tsai says. “In the medium and long term we need to think about what that substitution might do to the trust and confidence that people have in their government.” For many people, she adds, “the assumption is the government either isn’t capable of doing it, or shouldn’t be doing it,” when in fact even underresourced governments can make progress on serious issues.</p> <p>Another point is that the Liberia case shows some ways governments can build confidence among their citizens.</p> <p>“In so many countries these days, trust in institutions, trust in authorities, trust in sources of information is so low, and in the past there’s been very little research on how to rebuild trust,” Tsai notes. “There’s a lot of research on what lowers trust.”</p> <p>However, she adds, “That’s what I think is special about this case. 
Trust was successfully built and constructed under a pretty unlikely set of circumstances.”</p> <p>Support for the study was provided by the International Growth Centre, the Omidyar Network, and the MIT School of Humanities, Arts, and Social Sciences.</p> A billboard in Liberia urges people to help stop the spread of Ebola, which was widespread in 2014-2015. A new study shows how a public awareness campaign helped people understand and cooperate with government efforts to control the disease. Photo: United Nations Mission in Liberia/Emmanuel Tobey The trouble with round numbers Study shows people prefer monthly payments in multiples of $100, even when it may cost them money. Thu, 20 Feb 2020 18:04:50 -0500 Peter Dizikes | MIT News Office <p>Do you have a monthly car payment, or a similar loan? Is each payment a nice round number, like $300? If so, you are hardly alone. But the appeal of that easy-to-remember payment figure may be costing you money.</p> <p>That’s one implication of a new study co-authored by an MIT economist, which shows how much consumers prefer monthly payment figures that are multiples of $100 — indeed, the number of monthly consumer payments at dollar figures just above such multiples drops by 16 percent. That likely makes monthly budgeting easier. But as the study also shows, people select potentially unfavorable loan terms as a result.</p> <p>“People budget with these round numbers and are trained to think in these monthly payment terms, going for the smallest monthly payment possible,” says MIT economist Christopher Palmer, co-author of a newly published paper detailing the results.
“In particular, people really bunch around $200 or $300 or $400 a month in payments, which probably keeps them from overspending month-to-month, but it still might not be the best approach if it leads them to pay more interest over the length of the loan.”</p> <p>In fact, after digging into auto loans held by more than 2 million people, Palmer and his colleagues found that this is precisely the case: Given multiple financing options, many people smooth out the monthly figures, lowering each payment but notably increasing long-term costs.</p> <p>And while lower monthly payments are important for many, the study shows that borrowers often take such an approach when they can afford to pay more.</p> <p>“One thing we did [in this study] is look at data for people with a lot of debt capacity, a low debt-to-income ratio or high credit scores, and even those people seem to make decisions based on the monthly payment amount, while ignoring the total cost of the loan,” notes Palmer, the Albert and Jeanne Clear Career Development Professor in the MIT Sloan School of Management.</p> <p>The paper, “Monthly Payment Targeting and the Demand for Maturity,” appears in advance online form in the <em>Review of Financial Studies</em>. In addition to Palmer, the authors are Bronson Argyle and Taylor Nadauld, finance professors at Brigham Young University’s Marriott School of Business.</p> <p><strong>The natural experiment</strong></p> <p>To conduct the study, Palmer, Argyle, and Nadauld studied auto loan contracts held by 2.4 million borrowers, from 319 different lenders. The anonymized information came from a data company that works with lending firms. About 70 percent of the loans originated during the period 2012-2015, though some date to 2005.
The researchers also examined another 1.3 million loan applications to get a further sense of borrowers’ fiscal circumstances.</p> <p>A key feature of the study — giving the research a quasiexperimental form — involves its use of FICO scores, a basic credit rating. FICO scores range from 300 to 850, but at certain thresholds, some banks offer markedly different loans to customers. When you have a FICO score of 700, which is close to average, you may qualify for much better terms than if your score is slightly lower.</p> <p>“If you have a 701 FICO score, at some banks you can get a much lower interest rate than someone with a 699 FICO score, even though if you asked the company that makes FICO scores, you’re basically the same person,” Palmer says. “But if a bank is treating similar consumers very differently, it becomes this nice laboratory for a natural experiment.”</p> <p>That is, if borrowers offered a variety of loan terms have the same tendency — such as winding up with round-number monthly payments — it suggests how strongly that tendency is rooted in the behavior of consumers. The phenomenon of round-number monthly payments was quickly obvious to the researchers.</p> <p>“This just jumped out of the data,” Palmer says. “You plot the data and people are bunching at hundred-dollar multiples.”</p> <p><strong>So what’s the problem, exactly? </strong></p> <p>To see why this can be a bad personal-finance habit — and clearly is, for some people — note that loans with lower monthly payments will have a greater long-term total cost, given initial purchases of the same amount.</p> <p>That point applies to a second finding of the study: When consumers are offered loan terms, they respond more to changes in the maturity — the length of the loan — than changes in the interest rate.</p> <p>As Palmer, Argyle, and Nadauld found, a bank offer of a 10 percent increase in loan length raises the chances that a borrower will accept the terms by 8.3 percentage points. 
But a bank offer of a 10 percent decrease in the interest rate raises the chances that a borrower will accept the terms by only 1 percentage point.</p> <p>Why is this? As it happens, changing the maturity of the loan has a bigger impact on monthly payments, which lets more consumers bring those payments to the magic levels of $200, $300, and $400.</p> <p>However, changes in loan length also bring higher long-term costs for consumers. Consider a $20,000 loan with a five-year maturity and a 5 percent interest rate. Increasing the maturity of that loan by one year lowers monthly payments by $55 but raises total interest paid by $546.</p> <p>In short, by having a nose for round numbers, consumers in the new study really are paying more for their cars.</p> <p><strong>Lessons about loans</strong></p> <p>That said, Palmer acknowledges that for different people, there is not necessarily one clear answer about which approach is better: lower monthly payments or a lower long-term repayment.</p> <p>“There’s not great theory on what you should do,” Palmer says. “What we would say you should do is figure out if that tradeoff is worth it for you. If having lower payments today is worth paying more interest over the life of the loan, great, and there could be many reasons for that. But for many people I’d expect it could be better to try to get that loan over with more quickly with a shorter maturity.”</p> <p>Palmer hopes that one practical implication of the study would be getting people to recognize that there is a tradeoff in the first place.</p> <p>“Many people think monthly payments are the responsible way to talk about how much a car costs,” Palmer says. “But if you tell me you’re only going to spend $300 a month on a car, I can sell you a Mercedes if I make the car loan long enough.”</p> <p>As the study shows, a significant number of people are gravitating toward a rule of thumb — round-number payments — when doing homework and comparison-shopping about loans is more useful.
Still, perhaps it is the nature of auto purchasing that leads people to underinvest in shopping for loans.</p> <p>“I get to test-drive the car,” Palmer says. “I don’t get to test-drive the loan.”</p> A new study shows that even when people are given multiple financing options, many smooth out the monthly payments into multiples of $100, often lowering each payment but notably increasing long-term costs, even when they can afford to pay more. A road map for artificial intelligence policy In a Starr Forum talk, Luis Videgaray, director of MIT’s AI Policy for the World Project, outlines key facets of regulating new technologies. Thu, 20 Feb 2020 14:08:04 -0500 Peter Dizikes | MIT News Office <p>The rapid development of artificial intelligence technologies around the globe has led to increasing calls for robust AI policy: laws that let innovation flourish while protecting people from privacy violations, exploitive surveillance, biased algorithms, and more.</p> <p>But the drafting and passing of such laws has been anything but easy.</p> <p>“This is a very complex problem,” Luis Videgaray PhD ’98, director of MIT’s AI Policy for the World Project, said in a lecture on Wednesday afternoon. “This is not something that will be solved in a single report. This has got to be a collective conversation, and it will take a while. It will be years in the making.”</p> <p>Throughout his talk, Videgaray outlined an ambitious vision of AI policy around the globe, one that is sensitive to economic and political dynamics, and grounded in material fairness and democratic deliberation.</p> <p>“Trust is probably the most important problem we have,” Videgaray said.</p> <p>Videgaray’s talk, “From Principles to Implementation: The Challenge of AI Policy Around the World,” was part of the Starr Forum series of public discussions about topics of global concern.
The Starr Forum is hosted by MIT’s Center for International Studies. Videgaray gave his remarks to a standing-room crowd of over 150 in MIT’s Building E25.</p> <p>Videgaray, who is also a senior lecturer at the MIT Sloan School of Management, previously served as the finance minister of Mexico from 2012 to 2016, and foreign minister of Mexico from 2017 to 2018. Videgaray has also worked extensively in investment banking.</p> <p><strong>Information lag and media hype</strong></p> <p>In his talk, Videgaray began by outlining several “themes” related to AI that he thinks policymakers should keep in mind. These include government uses of AI; the effects of AI on the economy, including the possibility it could help giant tech firms consolidate market power; social responsibility issues, such as privacy, fairness, and bias; and the implications of AI for democracy, at a time when bots can influence political discussion. Videgaray also noted a “geopolitics” of AI regulation — from China’s comprehensive efforts to control technology to the looser methods used in the U.S.</p> <p>Videgaray observed that it is difficult for AI regulators to stay current with technology.</p> <p>“There’s an information lag,” Videgaray said. “Things that concern computer scientists today might become the concerns of policymakers a few years in the future.”</p> <p>Moreover, he noted, media hype can distort perceptions of AI and its applications. Here Videgaray contrasted the <a href="">recent report</a> of MIT’s Task Force on the Future of Work, which finds uncertainty about how many jobs will be replaced with technology, with a recent television documentary presenting a picture of automated vehicles replacing all truck drivers.</p> <p>“Clearly the evidence is nowhere near [indicating] that all jobs in truck driving, in long-distance driving, are going to be lost,” he said. “That is not the case.”</p> <p>With these general issues in mind, what should policymakers do about AI now? 
Videgaray offered several concrete suggestions. For starters: Policymakers should no longer just outline general philosophical principles, something that has been done many times, with a general convergence of ideas occurring.</p> <p>“Working on principles has very, very small marginal returns,” Videgaray said. “We can go to the next phase … principles are a necessary but not sufficient condition for AI policy. Because policy is about making hard choices in uncertain conditions.”</p> <p>Indeed, he emphasized, more progress can be made by having many AI policy decisions be particular to specific industries. When it comes to, say, medical diagnostics, policymakers want technology “to be very accurate, but you also want it to be explainable, you want it to be fair, without bias, you want the information to be secure … there are many objectives that can conflict with each other. So, this is all about the tradeoffs.”&nbsp;</p> <p>In many cases, he said, algorithm-based AI tools could go through a rigorous testing process, as required in some other industries: “Pre-market testing makes sense,” Videgaray said. “We do that for drugs, clinical trials, we do that for cars, why shouldn’t we do pre-market testing for algorithms?”</p> <p>But while Videgaray sees value in industry-specific regulations, he is not as keen on having a patchwork of varying state-level AI laws being used to regulate technology in the U.S.</p> <p>“Is this a problem for Facebook, for Google? I don’t think so,” Videgaray said. “They have enough resources to navigate through this complexity. But what about startups? 
What about students from MIT or Cornell or Stanford that are trying to start something, and would have to go through, at the extreme, 55 [pieces of] legislation?”</p> <p><strong>A collaborative conversation</strong></p> <p>At the event, Videgaray was introduced by Kenneth Oye, a professor of political science at MIT who studies technological regulation, and who asked Videgaray questions after the lecture. Among other things, Oye suggested U.S. states could serve as a useful laboratory for regulatory innovation.</p> <p>“In an area characterized by significant uncertainty, complexity, and controversy, there can be benefits to experimentation, having different models being pursued in different areas to see which works best or worse,” Oye suggested.</p> <p>Videgaray did not necessarily disagree, but emphasized the value of an eventual convergence in regulation. The U.S. banking industry, he noted, also followed this trajectory, until “eventually the regulation we have for finance [became] federal,” rather than determined by states.</p> <p>Prior to his remarks, Videgaray acknowledged some audience members, including his PhD thesis adviser at MIT, James Poterba, the Mitsui Professor of Economics, whom Videgaray called “one of the best teachers, not only in economics but about a lot of things in life.” Mexico’s Consul General in Boston, Alberto Fierro, also attended the event.</p> <p>Ultimately, Videgaray emphasized to the audience, the future of AI policy will be collaborative.</p> <p>“You cannot just go to a computer lab and say, ‘Okay, get me some AI policy,’” he stressed. 
“This has got to be a collective conversation.”</p> Luis Videgaray, director of MIT’s AI Policy for the World Project, talking at his Starr Forum lecture, hosted by the Center for International Studies, on February 19, 2020.Images: courtesy of Laura Kerwin, Center for International StudiesArtificial intelligence, Law, Ethics, Computer science and technology, Political science, Economics, Special events and guest speakers, Global, Center for International Studies, School of Humanities, Arts, and Social Sciences, Sloan School of Management Esther Duflo PhD ’99 to speak at 2020 Investiture of Doctoral Hoods and Degree Conferral Ceremony MIT professor and alumna shared the 2019 Nobel Prize in economics, which recognized collaborators’ “experimental approach to alleviating global poverty.” Thu, 20 Feb 2020 10:10:09 -0500 Institute Events <p>Esther Duflo PhD ’99, the Abdul Latif Jameel Professor of Poverty Alleviation and Development Economics at MIT, will be the guest speaker at the 2020 Investiture of Doctoral Hoods and Degree Conferral Ceremony on Thursday, May 28.</p> <p>“Professor Duflo is an impressive and inspiring leader — someone whose brilliant insight and relentless hard work have improved the lives of millions of people in poverty,” says Chancellor Cynthia Barnhart, host of the ceremony. “I have no doubt that hearing about her research and journey to the Nobel Prize — a path that was marked by hands-on problem-solving, collaboration, and selflessness — will capture the imaginations of our doctoral graduates. Her story will remind them of the impact MIT community members can have when we apply our minds, hands, and hearts to solving society’s most pressing challenges.”</p> <p>Duflo, known for her leadership and innovation in development economics, is a faculty member in the MIT Department of Economics, as well as co-founder and co-director of the Abdul Latif Jameel Poverty Action Lab (J-PAL). 
She is the second woman and the youngest person ever to receive the Nobel Prize in economic sciences.</p> <p>In her <a href="" target="_blank">Nobel speech</a>, given in December 2019 and titled “Field experiments and the practice of economics,” Duflo framed her own work to understand the economic lives of the poor in the context of a movement that leverages research in guiding social policy. She lauded the worldwide J-PAL network of antipoverty researchers, whose rigorous collection and evaluation of data has shaped policy in Africa, Asia, Europe, North America, and South America. Duflo — whose early ambitions included becoming a “changemaker” — said she hopes that J-PAL’s influence will foment a self-sustaining culture of learning within governments.</p> <p>The guest speaker is selected by a working group of doctoral students, from among nominees who hold a PhD or ScD from MIT. The group was unanimous and enthusiastic about Duflo’s nomination. Lily Bui, who will graduate in May with a PhD in urban studies and planning, participated in this year’s selection process. “Our committee is thrilled that Dr. Duflo will be our speaker,” she says. “We look forward to the wisdom that she will impart from both her extraordinary professional and personal experiences.”</p> <p>Following her study of history and economics at École Normale Supérieure in Paris, Duflo came to MIT, earning a PhD in economics and joining the faculty in 1999. The extraordinary list of her academic honors and prizes includes the Princess of Asturias Award for Social Sciences (2015), the A.SK Social Science Award (2015), the Infosys Prize (2014), the David N.
Kershaw Award (2011), the John Bates Clark Medal (2010), and a MacArthur “Genius Grant” Fellowship (2009).&nbsp;With Abhijit Banerjee, the Ford International Professor of Economics at MIT, she wrote&nbsp;“Good Economics for Hard Times” (2019) and “Poor Economics: A Radical Rethinking of the Way to Fight Global Poverty” (2011), the latter of which won the <em>Financial Times</em> and Goldman Sachs Business Book of the Year Award in 2011 and has been translated into more than 17 languages. Duflo is the editor of the&nbsp;<em>American Economic Review</em>, a member of the National Academy of Sciences, and a Corresponding Fellow of the British Academy.</p> <p>Duflo’s passionate commitment to research toward the betterment of humankind led her to a momentous choice: She and co-laureates Banerjee and Professor Michael Kremer of Harvard University made news again in December 2019 for the decision to donate their combined Nobel prize money to support grants sponsored by the Weiss Fund for Research in Development Economics. The Associated Press <a href="" target="_blank">reported</a> that Duflo was inspired in this gift by Marie Curie, who used her Nobel money to buy a gram of radium for research. The three professors’ donation to the Weiss Fund will support development economics for years to come.</p> <p>Nancy Rose, head of the MIT Department of Economics and the Charles P. Kindleberger Professor of Applied Economics, praised Duflo’s teaching and her relationships with MIT students. She commented, “Esther is not only an extraordinary scholar and educator, but a much-loved mentor and advisor for generations of students.&nbsp;As MIT’s first alumna to be recognized with the Nobel Prize, I can think of no finer choice to acknowledge the promise of our current graduates and to inspire them on the launch of their careers.”</p> <p>The 2020 Investiture of Doctoral Hoods and Degree Conferral Ceremony will take place on Thursday, May 28 at 10:30 a.m. on Killian Court.
The ceremony is open to family, friends, and mentors of doctoral candidates; no tickets are required.</p> Esther DufloImage: Peter Tenzer/Abdul Latif Jameel Poverty Action LabCommencement, Community, Special events and guest speakers, Administration, Chancellor, Economics, Nobel Prizes, Alumni/ae, School of Humanities Arts and Social Sciences, Abdul Latif Jameel Poverty Action Lab (J-PAL) The complex effects of colonial rule in Indonesia Evidence links Dutch-era sugar production and greater economic activity today. Wed, 05 Feb 2020 23:59:59 -0500 Peter Dizikes | MIT News Office <p>The areas of Indonesia where Dutch colonial rulers built a huge sugar-producing industry in the 1800s remain more economically productive today than other parts of the country, according to a study co-authored by an MIT economist.</p> <p>The research, focused on the Indonesian island of Java, introduces new data into the study of the economic effects of colonialism. The finding shows that around villages where the Dutch built sugar-processing factories from the 1830s through the 1870s, there is today greater economic activity, more extensive manufacturing, and even more schools, along with higher local education levels.</p> <p>“The places where the Dutch established [sugar factories] persisted as manufacturing centers,” says <a href="">Benjamin Olken</a>, a professor of economics at MIT and co-author of a <a href="">paper</a> detailing the results, which appears in the January issue of the <em>Review of Economic Studies</em>.</p> <p>The historical link between this “Dutch Cultivation System” and economic activity today has likely been transmitted “through a couple of forces,” Olken suggests. One of them, he says, is the building of “complementary infrastructure” such as railroads and roads, which remain in place in contemporary Indonesia.</p> <p>The other mechanism, Olken says, is that “industries grew up around the sugar [industry], and those industries persisted.
And once you have this manufacturing environment, that can lead to other changes: More infrastructure and more schools have persisted in these areas as well.”</p> <p>To be sure, Olken says, the empirical conclusions of the study do not represent validation of Dutch colonial rule, which lasted from the early 1600s until 1949 and significantly restricted the rights and self-constructed political institutions of Indonesians. Dutch rule had long-lasting effects in many areas of civic life; the Dutch Cultivation System, for one, relied on forced labor.</p> <p>“This paper is not trying to argue that the [Dutch] colonial enterprise was a net good for the people of the time,” Olken emphasizes. “I want to be very clear on that. That’s not what we’re saying.”</p> <p>Instead, the study was designed to evaluate the empirical effects of the Dutch Cultivation System, and the outcome of the research was not necessarily what Olken would have anticipated.</p> <p>“The results are striking,” Olken says. “They just jump out at you.”</p> <p>The paper, “The Development Effects of the Extractive Colonial Economy: The Dutch Cultivation System in Java,” is co-authored by Olken and <a href="">Melissa Dell</a> PhD ’12, a professor of economics at Harvard University.</p> <p><strong>On the ground</strong></p> <p>Historically in Java, the most populous of Indonesia’s many islands, the main crop had been rice. Starting in the 1830s, the Dutch instituted a sugar-growing system in some areas, building 94 sugar-processing factories, as well as roads and railroads to transport materials and products.</p> <p>Generally the Dutch would export high-quality sugar from Indonesia while keeping lower-quality sugar in the country. Overall, the system became massive; at one point in the mid-19th century, sugar production in Java accounted for one-third of the Dutch government’s revenues and 4 percent of Dutch GDP.
By one estimate, a quarter of the population was involved in the industry.</p> <p>In developing their research, Olken and Dell used 19th century data from government archives in the Netherlands, as well as modern data from Indonesia. The Dutch built the processing plants next to rivers in places with enough flat land to sustain extensive sugar crops; to conduct the study, the researchers looked at economic activity near sugar-processing factories and compared it with economic activity in similar areas that lacked factories.</p> <p>“In the 1850s, the Dutch spent four years on the ground collecting detailed information for the over 10,000 villages that contributed land and labor to the Cultivation System,” Dell notes. The researchers digitized those records and, as she states, “painstakingly merged them” with economic and demographic records from the same locations today.</p> <p>As the results show, places close to factories are 25-30 percentage points less agricultural in economic composition than those away from factories, and they have more manufacturing, by 6-7 percentage points. They also have 9 percent more employment in retail.</p> <p>Areas within 1 kilometer of a sugar factory have a railroad density twice that of similar places 5 to 20 kilometers from factories; by 1980, they were also 45 percent more likely to have electricity and 4 percent more likely to have a high school. They also have local populations with a full year more of education, on average, than areas not situated near old sugar factories.</p> <p>The study shows there is also about 10 to 15 percent more public-land use in villages that were part of the Dutch Cultivation System, a data point that holds steady in both 1980 and 2003.</p> <p>“The key thing that underlies this paper, in multiple respects, is the linking of the historical data and the modern data,” Olken says.
The researchers also observed that the disparity between industrialized places and their more rural counterparts has not arisen since 1980, further suggesting how much Java’s deep economic roots matter.</p> <p><strong>Net effects?</strong></p> <p>The paper blends the expertise of Olken, who has spent years conducting antipoverty studies in Indonesia, and Dell, whose work at times examines the effects of political history on current-day economic outcomes.</p> <p>“I had never really done a historical project before,” Olken says. “But the opportunity to collaborate with Melissa on this was really exciting.”</p> <p>One of Dell’s best-known papers, published in 2010 while she was still a PhD student at MIT, shows that in areas of Peru where colonial Spanish rulers instituted a system of forced mining labor from the 1500s to the 1800s, there are significant and negative economic effects that persist today.</p> <p>However, somewhat to their surprise, the researchers did not observe similarly pronounced effects from the Dutch Cultivation System.</p> <p>“One might have thought that could have had negative consequences on local social capital and local development in other respects,” says Olken, adding that he “wasn’t sure what to expect” before looking at the data.</p> <p>“The differences between the long-run effects of forced labor in Peru and Java suggest that for understanding persistent impacts on economic activity, we need to know more than just whether there was forced labor in a location,” Dell says. “We need to understand how the historical institutions influenced economic incentives and activities initially, and how these initial effects may or may not have persisted moving forward.”</p> <p>Olken adds that the study “can’t measure every possible thing,” and that “it’s possible there are other effects we didn’t see.”</p> <p>Moreover, Olken notes, the paper cannot determine the net effect of the Dutch Cultivation System on Indonesian economic growth.
That is, in the absence of Dutch rule, Indonesia’s economy would certainly have grown on its own — but it is impossible to say whether it would have expanded at a rate faster, slower, or equivalent to the trajectory it had under the Dutch.</p> <p>“We can’t say what would have happened if the Dutch had never showed up in Indonesia,” Olken says. “And of course the Dutch [colonizing] Indonesia had all kinds of effects well beyond the scope of this paper, many of them negative for the contemporaneous population.”</p> Researchers find that Dutch sugar production in Indonesia in the 19th century entailed industrialization whose economic benefits are still evident today. Image collage: Christine Daniloff, MITEconomics, Asia, Europe, History, Research, Global, School of Humanities Arts and Social Sciences MIT launches master’s in data, economics, and development policy, led by Nobel laureates The first cohort of 22 students from 14 countries share a common ambition: harnessing data to help others. Tue, 04 Feb 2020 09:00:00 -0500 Abdul Latif Jameel Poverty Action Lab (J-PAL) <p>This week, the first cohort of 22 students begin classes in MIT’s new master’s program in Data, Economics, and Development Policy (DEDP). The graduate program was created jointly by MIT’s Department of Economics and the Abdul Latif Jameel Poverty Action Lab (<a href="">J-PAL</a>), a research center at MIT led by professors Abhijit Banerjee, Esther Duflo, and Benjamin Olken. Banerjee and Duflo are co-recipients of the 2019 Nobel Memorial Prize in Economics.&nbsp;</p> <p>The 22 students beginning the master’s program this week hail from 14 countries around the world, including Brazil, India, Jordan, Lithuania, Mexico, Nigeria, the United States, and Zimbabwe.&nbsp;</p> <p>The students are pioneers of a new approach to higher education: College degrees and standardized test scores are not required for admission.
Instead, applicants prove their readiness through their performance in online <em>MITx </em><a href="">MicroMasters</a> courses, completing weekly assignments and taking proctored final exams.&nbsp;</p> <p>The program’s unique admissions process reflects Banerjee, Duflo, and Olken’s ambition to democratize higher education, leveling the playing field to enable students from all backgrounds to succeed.</p> <p>The makeup of the <a href="" target="_blank">cohort</a> reflects this nontraditional approach to admissions. Students joining the Data, Economics, and Development Policy program possess a range of professional backgrounds, with experience in finance, management consulting, and government; and with organizations like UNICEF, Google, and <em>The New York Times</em> — one incoming student is even joining <a href="" target="_blank">directly from high school</a>.&nbsp;</p> <p><strong>Applying data for better public policy</strong></p> <p>The <a href="">master’s program</a> combines five challenging MicroMasters courses, one semester of on-campus learning, and a summer capstone experience to provide students with an accessible yet rigorous academic experience. The curriculum is designed to equip students with the tools to apply data for more effective decision-making in public policy, with a focus on social policies that target poverty alleviation.&nbsp;</p> <p>This includes coursework in microeconomics, econometrics, political economy, psychology, data science, and more — all designed to provide a practical, well-rounded graduate education. Many students hope to apply the knowledge they gain in the DEDP program to improve the lives of people in their home countries.</p> <p>Helena Lima, an incoming student from Brazil, plans to return to Brazil after graduation. 
“My goal [after completing this program] is to move the needle in Brazilian public education, contributing to increase access to high-quality schools for the most vulnerable people and communities,” says Helena.&nbsp;</p> <p>Lovemore Mawere, an incoming student from Zimbabwe, shares this sentiment. “I intend to return home to Africa after the master’s program. I believe the experience and the skills gained will embolden me to take action and lead the fight against poverty.”</p> <p><strong>Expanding access for all students</strong></p> <p>The blended online and in-person structure of the program means that students spend just one semester on campus at MIT, but program administrators recognize that costs of tuition and living expenses can still be prohibitive. Administrators say that they are working on bringing these costs down and providing scholarship funding.&nbsp;</p> <p>“We’ve partnered with the Hewlett Foundation to provide scholarships for students from sub-Saharan Africa, and are actively seeking other funding partners who share our vision,” says Maya Duru, associate director of education at J-PAL. “The individuals who apply to this program are incredibly smart, motivated, and resourceful. We want to work with donors to establish a sustainable scholarship fund to ensure that finances are never a barrier to participation.”&nbsp;</p> <p>Esther Duflo, the MIT professor and Nobel laureate who helped create the program, emphasized the critical importance of the program’s mission.&nbsp;</p> <p>“It is more important now than ever to ensure that the next generation of leaders understand how best to use data to inform decisions, especially when it comes to public policy,” says Duflo. 
“We are preparing our students to succeed in future leadership positions in government, NGOs, and the private sector — and, hopefully, to help shift their institutional cultures toward a more data-driven approach to policy.”</p> The first students to enroll in MIT’s new master’s program in Data, Economics, and Development Policy arrived at MIT in January.Photo: Amanda Kohn/J-PALEconomics, Abdul Latif Jameel Poverty Action Lab (J-PAL), MITx, Massive open online courses (MOOCs), EdX, Office of Digital Learning, International development, Policy, Poverty, Data, Education, teaching, academics, Social sciences, School of Humanities Arts and Social Sciences Experts join J-PAL North America in advancing conversation on the work of the future Academic, government, and advocacy leaders gathered to promote collaborative research partnerships to identify strategies that help workers thrive in today’s labor market. Fri, 31 Jan 2020 15:30:01 -0500 J-PAL North America <p>On Jan. 14, J-PAL North America’s <a href="" target="_blank">Work of the Future Initiative</a> hosted an afternoon of conversation on how to address the changing nature of work while advancing equity and opportunity. The event, entitled <a href="" target="_blank">Building A Future That Works For All</a>, was attended by 35 leaders from nonprofits, academia, government, philanthropy, and advocacy organizations.&nbsp;</p> <p>“The assumption that we can solve these problems without workers in the conversation is one that we need to leave behind,” said Ai-jen Poo, co-founder and executive director of the National Domestic Workers Alliance, as she kicked off the first panel of the day. This theme was echoed throughout the day’s conversations, which were hosted by the Gerri and Rich Wong family at the Accel office in Palo Alto, California.
Rich Wong is an alumnus of MIT engineering and the MIT Sloan School of Management.&nbsp;</p> <p>The event sought to continue J-PAL North America and the Work of the Future Initiative’s efforts to shift the conversations surrounding the future of work to focus on working people and collaborative research partnerships. As J-PAL North America Executive Director Mary Ann Bates stated in her introductory remarks: “We’re here to talk about the work of the future, which is about many big ideas — automation, artificial intelligence, and more — but we care about this work because of the people.”&nbsp;</p> <p>J-PAL North America launched the Work of the Future Initiative in April 2019 to identify effective, evidence-based strategies to increase opportunities, reduce disparities, and help all workers navigate and thrive in the labor markets of the future.&nbsp;</p> <p>Research partnerships are vital to generating the rigorous evidence necessary to identify these effective strategies. The recent event’s conversations sought to provide attendees with a chance to forge new partnerships and discuss innovative ideas for new programs and evaluations.&nbsp;</p> <p>The first panel discussed the role of rigorous research to inform worker-centered policies. Ai-jen Poo focused her discussion on the care sector — a workforce that will grow at five times the rate of any other sector in the coming years. Specifically, Poo noted the creative and innovative measures that the National Domestic Workers Alliance is taking to ensure that care work is dignified and that domestic workers are protected, including turning to technology: “What we’re trying to do is deploy technology to solve for dignity and equity.”&nbsp;</p> <p>Harvard professor and J-PAL North America Co-Scientific Director Lawrence Katz followed Poo’s remarks by discussing the growing divergence between real wages and worker productivity. 
Katz cited rising inequality as a primary driver of the decline in upward mobility and the stagnation of wages, more so than slow economic growth.&nbsp;</p> <p>Lastly, Aneesh Raman, senior advisor to California Governor Gavin Newsom, closed the conversation with a discussion on why collaboration across sectors and a willingness to innovate are crucial to progress: “We live in a world where politicians have very little opportunity to fail, which makes it very hard to innovate. We need to create a shared ownership of risk. Philanthropy, government, the private sector, and the nonprofit community need to come together to innovate and make a difference.”&nbsp;</p> <p>Other highlights of the day included a discussion of an ongoing research partnership between MIT Professor and Work of the Future Initiative Co-Chair David Autor, Rutgers University professor and J-PAL-affiliated researcher Amanda Agan, and Irene Liu and Jen Yeh of <a href="" target="_blank">Checkr</a>.&nbsp;</p> <p>Checkr is a selected partner through the Work of the Future Initiative’s <a href="" target="_blank">inaugural innovation competition</a>. The company partnered with Autor and Agan to evaluate whether their Positive Adjudication Matrix (PAM) can reduce bias in the background-check and hiring process. PAM allows employers to deem certain types of offenses irrelevant to the roles for which they’re hiring. Companies can then choose to either filter out or de-emphasize these criminal records.&nbsp;</p> <p>The candid conversation addressed the challenging aspects of partnering to design an evaluation and discussed what conditions must hold for more productive research partnerships to form in the future. Autor discussed the need for a champion within a partner organization, stating, “Data is threatening in the sense that it can produce results that you’re not looking for.
You need a champion within your organization to move this forward.”&nbsp;</p> <p>The Checkr team expressed their hope that the evaluation of their product can inform policy decisions in the future: “There are states that have laws dictating who can and cannot apply to these companies. If we have evidence there, that can be really helpful.”</p> <p>Other panelists, such as Katy Hamilton of the <a href="" target="_blank">Center for Work Education and Employment</a> and Jukay Hsu of <a href="" target="_blank">Pursuit</a>, run organizations that provide direct support to workers seeking quality jobs. Hamilton and Hsu discussed the programs that they hope to evaluate and turned to the audience for advice and constructive questions to inform their evaluation design processes.&nbsp;</p> <p>To wrap up the day, representatives from academia, philanthropy, the private sector, and government offered a call to action to other leaders within their sectors. Themes included centering workers’ voices and collaborating across sectors.&nbsp;&nbsp;</p> <p>Katy Knight of the Siegel Family Endowment discussed the steps that philanthropic organizations should take to promote people-centered practices: “We need to bring other people into the conversation and listen to their personal expertise to make sure we really understand the work we’re doing.” Mark Gorenberg of Zetta Venture Partners echoed these statements, stressing the private sector’s obligation to invest responsibly in programs that promote dignity.&nbsp;</p> <p>José Cisneros, elected treasurer for the City and County of San Francisco, discussed how collaboration is crucial for innovation: “The government is ready to be creative and work in partnership with philanthropy and the private sector to see if we can do things differently.” Columbia University professor and J-PAL-affiliated researcher Peter Bergman advocated for a similar type of collaboration within the academic community, calling for larger and more diverse research 
teams to conduct both quantitative and qualitative analyses of programs.&nbsp;</p> <p>The Work of the Future Initiative will continue to shape the dialogue surrounding the future of work by bringing together leaders and innovators across sectors to engage in conversations and research partnerships that center worker voices and concerns. By generating research on strategies to help workers thrive in today’s labor market, the initiative seeks to shape a more equitable future of work.</p> <div></div> Katy Knight (left) of the Siegel Family Endowment and José Cisneros (right), elected treasurer for the City and County of San Francisco, listen as MIT professor and Work of the Future Initiative co-chair David Autor provides feedback on how to design an effective evaluation of a labor force development program.Photo: J-PAL North AmericaAbdul Latif Jameel Poverty Action Lab (J-PAL), Economics, Technology and society, Jobs, Sloan School of Management, School of Humanities Arts and Social Sciences Hospital rankings hold up Some basic metrics do effectively diagnose care quality, according to MIT economists. Thu, 30 Jan 2020 23:59:59 -0500 Peter Dizikes | MIT News Office <p>Given the complexities of health care, do basic statistics used to rank hospitals really work well? A study co-authored by MIT economists indicates that some fundamental metrics do, in fact, provide real insight about hospital quality.</p> <p>“The results suggest a substantial improvement in health if you go to a hospital where the quality scores are higher,” says Joseph Doyle, an MIT economist and co-author of a new paper detailing the study’s results.</p> <p>The study was designed to work around a difficult problem in evaluating hospital quality: Some high-performing hospitals may receive an above-average number of very sick patients. 
Accepting those difficult cases could, on the surface, worsen the aggregate outcomes of a given hospital’s patients and make such hospitals seem less effective than they are.</p> <p>However, the scholars found a way to study equivalent pools of patients, thus allowing them to judge the hospitals on equal terms. Overall, the study shows, when patient sickness levels are accounted for, hospitals that score well on quality measures have 30-day readmission rates that are 15 percent lower than a set of lesser-rated hospitals, and 30-day mortality rates that are 17 percent lower.</p> <p>“It wasn’t clear going in whether these quality measures do a good job of sorting hospitals out,” Doyle adds. “These results suggest that they have predictive power.”</p> <p>The paper, “Evaluating Measures of Hospital Quality: Evidence from Hospital Referral Patterns,” was written by Doyle, the Erwin H. Schell Professor of Management and Applied Economics at the MIT Sloan School of Management; John Graves, an assistant professor in the Department of Health Policy at Vanderbilt University; and Jonathan Gruber, the Ford Professor of Economics at MIT. It appears in the latest issue of the <em>Review of Economics and Statistics</em>.</p> <p><strong>Randomized evaluations</strong></p> <p>To conduct the study, the researchers used a method that eliminates the issue of studying a skewed sample of admissions. They studied areas across the country where dispatchers’ calls are assigned randomly to different ambulance companies. Those ambulance companies tend to deliver patients to particular hospitals.
Thus, otherwise similar groups of patients are admitted to different hospitals in what is essentially a random pattern; this allows outcomes to be compared among hospitals.</p> <p>The patient data came primarily from Medicare claims made across the country during the period 2008-2012, and covered over 170,000 hospital admissions for patients who had just suffered a health event requiring “nondiscretionary” hospital admission. The patients also fit some basic criteria, such as not having been admitted recently for the same condition.</p> <p>In addition to analyzing 30-day readmission and mortality rates, the researchers looked at patient satisfaction levels. All these criteria, and more, are commonly used in hospital assessments.</p> <p>The researchers also found a 37 percent difference in one-year mortality between highly rated and lower-rated hospitals.</p> <p>“I thought our results were reasonable,” says Doyle. “They’re not too big to be believed, but they suggest a substantial improvement in health if you go to a hospital where the quality scores are much higher.”</p> <p>As the authors note in the paper, the subject is topical in the health policy world. Some lawmakers and experts want the hospital payment system to evolve in the direction of reimbursement for quality and outcomes, rather than treatment. As such, it is important to be able to tell if those quality measures are sturdy.&nbsp;</p> <p>“There’s been a lot of interest in whether these quality measures are informative or not, because there is a shift away from paying for the quantity of care provided to the quality of care provided,” Doyle says. “Most of the policymakers I’ve talked to want to use these quality measures.”</p> <p><strong>Management matters</strong></p> <p>Further research will be needed to help illuminate issues surrounding hospital quality in further depth.
For instance, the current study is more focused on emergency care and not on care for chronic conditions; Doyle says that analysis of chronic care is “a fascinating question” that merits further investigation.</p> <p>Doyle also acknowledges the need for further study to explain why certain hospitals fare better than others on basic quality measures. He notes that some were historically quicker than others to adopt what are now almost universal practices — the administration of blood-thinning drugs to heart patients, for instance — and suggests the rate of adoption of new practices is an important factor in this area.</p> <p>“Coming from a management school, we see that a lot of the variation in outcomes stems in large part from differences in management,” Doyle says. “Do you have the right procedures in place so that it’s easy for providers to do what the guidelines suggest? Improving management could yield big improvements in patient health.”</p> <p>The research was supported by the National Institutes of Health.</p> A new study by MIT economists indicates that some metrics used in hospital rankings do, in fact, provide real insight about hospital quality.Economics, Health care, Medicine, Social sciences, Policy, Sloan School of Management, School of Humanities Arts and Social Sciences MindHandHeart announces a record 21 new Innovation Fund winners The 10th round of MindHandHeart Innovation Fund projects is bringing diversity, equity, and inclusion, wellness, and community-building programming to campus. Wed, 15 Jan 2020 10:30:01 -0500 Maisie O’Brien | MindHandHeart <p>A meditative nature retreat, healthy cooking projects, and several initiatives advancing diversity, equity, and inclusion are coming to MIT courtesy of the <a href="">MindHandHeart Innovation Fund</a>. 
Sponsored by the <a href="">Office of the Chancellor</a>, the MindHandHeart Innovation Fund offers grants of up to $10,000 to advance ideas that make MIT a more welcoming, inclusive, and healthy place.</p> <p>This cycle, MindHandHeart (MHH) awarded $51,534 to 21 projects selected from 45 applications. Seventy-six percent of awarded projects are spearheaded by students and 24 percent are driven by staff members.</p> <p>Applications were reviewed by Chancellor Cynthia Barnhart, MHH’s Faculty Chair Roz Picard, and members of MindHandHeart’s volunteer coalition comprising MIT students, faculty, and staff members, as well as representatives from Active Minds, the Undergraduate Association Wellness Committee, the Undergraduate Association Innovation Committee, and the Graduate Student Council.</p> <p>“It’s wonderful to see community members using their many talents to launch projects that bring more ‘heart’ to MIT,” says Barnhart. “From the development of proposals to the review process to the implementation of projects, the Innovation Fund is truly a community-building effort.”</p> <p>Nine projects aim to build community and advance diversity, equity, and inclusion at MIT.</p> <p>The “Graduate Student Council Diversity, Equity, and Inclusion (GSC-DEI) Fellows Program + gradCommunity Dialogues Series” seeks to make MIT a more equitable, inclusive, and engaging place through peer-to-peer dialogues.</p> <p>One of the project’s founders, graduate student Bianca Lepe, describes the project, saying, “The GSC DEI Graduate Fellows and subsequent gradCommunity Dialogues will give students the space to have thoughtful conversations across social and cultural differences. Students will gain a better understanding of how to identify inequities, engage in challenging discussions about inequities, and lower the barrier to engage comfortably in these conversations in their classrooms and research groups. 
We hope that this will help transform MIT’s climate by giving individuals a space to learn and empathize.”</p> <p>Another graduate student-led project, “Spill the Tea,” is a monthly program connecting graduate students of color and their allies around tea for open-ended conversation, connection to MIT resources, and a sense of belonging. The goal of the “Aunties and Uncles Freshman Mentorship Program” is to strengthen the support network for first-year students in the MIT African Students Association. “Noches de Cultura” is bringing a series of events showcasing Latin American arts and culture to campus to foster spaces of community and engagement.</p> <p>Organized by the Communications Forum, “Sexual Harassment Culture at MIT” is a moderated panel exploring how harassment affects the MIT community, the experiences of survivors, and what institutional change looks like. The student-driven “MIT Women in Econ Lunch” project is a series of lunches designed to support women in the Department of Economics.</p> <p>“Queer Film and Crafting Nights” is a monthly event series bringing together LGBTQ+ individuals and allies within the Department of Biology. The “VISTA Holiday Celebration” outlines a plan to bring international visiting students and graduate students together to mark the holiday season and reduce potential isolation.</p> <p>“There’s a SPXCE for Everyone” is a campaign to host events in areas that are traditionally not seen as being inclusive of certain marginalized identity groups. “There’s a SPXCE for Everyone” organizer and Assistant Director of Intercultural Engagement for LGBTQ+ Services Lauryn McNair describes the project, saying, “The takeover campaign is to extend the inclusive environment for students to be their authentic selves while experiencing events off campus in spaces that are traditionally populated by dominant identities. 
The MHH Innovation Fund allows SPXCE to take students to see a classical music performance at Symphony Hall with other students in their communities, to 'take over' the space, and to learn more about how diversity and inclusion is shaping Symphony Hall performances.”</p> <p>A number of newly funded projects promote wellness and self-care.</p> <p>Spearheaded by Graduate Resident Advisor in MacGregor House Kaitlyn Gee, “Discovering and Personalizing Self-Care: A Series of Workshops for MIT Students” encourages undergraduate residents of MacGregor to develop self-care practices through events focused on painting, nature, and food. “Natural Inspiration,” led by Integrated Design and Management student Western Bonime, is a nature retreat where participants will meditate, take mindful walks, and admire the natural world.</p> <p>Spearheaded by Buddhist Chaplain Tenzin Priyadarshi, the “Gratitude Project” motivates MIT community members to pause, reflect, and cultivate gratitude. “Mindful MIT” is an initiative to distribute 120 mindfulness journals to Sloan students, along with materials advertising campus support resources. “IDSS Presents: Intelligence Demands Super Relaxation” is a student-led project to add de-stressing tools and furniture to Institute for Data, Systems, and Society common spaces.&nbsp;</p> <p>Led by Hindu Chaplain Sadananda Dasa, “Handling Negativity” consists of a series of workshops where participants will learn techniques from ancient Vedic texts to confront negativity and cultivate positive thoughts.</p> <p>Four projects are designed to build community and promote healthy eating. “EZhealth” is a student-led group hosting cooking classes in independent living groups. “ChopStirHack” is a student-led food magazine, building off the success of their <a href="">cookbook</a>. 
“Recipes from Home” is a cookbook project that seeks to share cultural and culinary traditions within the Department of Urban Studies and Planning’s 2020 Master in City Planning graduating class. Lastly, the “Cambridge Culinary Cooking Class” brings students and faculty members in the Department of Chemical Engineering together for an interactive cooking class.</p> <p>Other projects include the “Graduate Student Book Exchange,” an event where students can connect over their favorite books, and “Save TFP: Grand Care Package Event,” a large-scale event where undergraduates will make care packages for their friends during Random Acts of Kindness Week in March.</p> <p>MHH has supported <a href="">138 Innovation Fund projects</a> to date, 17 of which are now self-sustaining.</p> <p>The next <a href="" target="_blank">MindHandHeart Innovation Fund</a> cycle opens March 1-31. MIT staff, faculty, students, and students’ spouses with ideas to make MIT a more welcoming, inclusive, and healthy place are encouraged to apply.</p> Fall 2019 MindHandHeart Innovation Fund granteesPhoto: Maisie O'BrienMindHandHeart, Biology, Economics, Urban studies and planning, Chemical engineering, Community, Mental health, Student life, Chancellor, MIT Medical, Grants, Lesbian, gay, bisexual, transgender, queer/questioning (LGBTQ), Campus services, Diversity and inclusion Zeroing in on decarbonization Wielding complex algorithms, nuclear science and engineering doctoral candidate Nestor Sepulveda spins out scenarios for combating climate change. Wed, 15 Jan 2020 00:00:00 -0500 Leda Zimmerman | Department of Nuclear Science and Engineering <p>To avoid the most destructive consequences of climate change, the world’s electric energy systems must stop producing carbon by 2050. 
It seems like an overwhelming technological, political, and economic challenge — but not to Nestor Sepulveda.</p> <p>“My work has shown me that we&nbsp;do&nbsp;have the means to tackle the problem, and we can start now,” he says. “I am optimistic.”</p> <p>Sepulveda’s research, first as a master’s student and now as a doctoral candidate in the MIT Department of Nuclear Science and Engineering (NSE), involves complex simulations that describe potential pathways to decarbonization. In work published last year in the journal&nbsp;<em>Joule,&nbsp;</em>Sepulveda and his co-authors made a powerful case for using a mix of renewable and “firm” electricity sources, such as nuclear energy, as the least costly, and most likely, route to a low- or no-carbon grid.</p> <p>These insights, which flow from a unique computational framework blending optimization and data science, operations research, and policy methodologies, have attracted interest from&nbsp;<em>The New York Times&nbsp;</em>and&nbsp;<em>The Economist,&nbsp;</em>as well as from such notable players in the energy arena as Bill Gates. For Sepulveda, the attention could not come at a more vital moment.</p> <p>“Right now, people are at extremes: on the one hand worrying that steps to address climate change might weaken the economy, and on the other advocating a Green New Deal to transform the economy that depends solely on solar, wind, and battery storage,” he says. 
“I think my data-based work can help bridge the gap and enable people to find a middle point where they can have a conversation.”</p> <p><strong>An optimization tool</strong></p> <p>The computational model Sepulveda is developing to generate this data, the centerpiece of his dissertation research, was sparked by classroom experiences at the start of his NSE master’s degree.</p> <p>“In courses like Nuclear Technology and Society [22.16], which covered the benefits and risks of nuclear energy, I saw that some people believed the solution for climate change was definitely nuclear, while others said it was wind or solar,” he says. “I began wondering how to determine the value of different technologies.”</p> <p>Recognizing that “absolutes exist in people’s minds, but not in reality,” Sepulveda sought to develop a tool that might yield an optimal solution to the decarbonization question. His inaugural effort in modeling focused on weighing the advantages of utilizing advanced nuclear reactor designs against exclusive use of existing light-water reactor technology in the decarbonization effort.</p> <p>“I showed that in spite of their increased costs, advanced reactors proved more valuable to achieving the low-carbon transition than conventional reactor technology alone,” he says. This research formed the basis of Sepulveda’s master’s thesis in 2016, for a degree spanning NSE and the Technology and Policy Program. It also informed the MIT Energy Initiative’s report,&nbsp;“The Future of Nuclear Energy in a Carbon-Constrained World.”</p> <p><strong>The right stuff</strong></p> <p>Sepulveda comes to the climate challenge armed with a lifelong commitment to service, an appetite for problem-solving, and grit. 
Born in Santiago, he enlisted in the Chilean navy, completing his high school and college education at the national naval academy.</p> <p>“Chile has natural disasters every year, and the defense forces are the ones that jump in to help people, which I found really attractive,” he says. He opted for the most difficult academic specialty, electrical engineering, over combat and weaponry. Early in his career, the climate change issue struck him, he says, and for his senior project, he designed a ship powered by hydrogen fuel cells.</p> <p>After he graduated, the Chilean navy rewarded his performance with major responsibilities in the fleet, including outfitting a $100 million amphibious ship intended for moving marines and for providing emergency relief services. But Sepulveda was anxious to focus fully on sustainable energy, and petitioned the navy to allow him to pursue a master’s at MIT in 2014.</p> <p>It was while conducting research for this degree that Sepulveda confronted a life-altering health crisis: a heart defect that led to open-heart surgery. “People told me to take time off and wait another year to finish my degree,” he recalls. Instead, he decided to press on: “I was deep into ideas about decarbonization, which I found really fulfilling.”</p> <p>After graduating in 2016, he returned to naval life in Chile, but “couldn’t stop thinking about the potential of informing energy policy around the world and making a long-lasting impact,” he says. “Every day, looking in the mirror, I saw the big scar on my chest that reminded me to do something bigger with my life, or at least try.”</p> <p>Convinced that he could play a significant role in addressing the critical carbon problem if he continued his MIT education, Sepulveda successfully petitioned naval superiors to sanction his return to Cambridge, Massachusetts.</p> <p><strong>Simulating the energy transition</strong></p> <p>Since resuming studies here in 2018, Sepulveda has wasted little time. 
He is focused on refining his modeling tool to play out the potential impacts and costs of increasingly complex energy technology scenarios on achieving deep decarbonization. This has meant rapidly acquiring knowledge in fields such as economics, math, and law.</p> <p>“The navy gave me discipline, and MIT gave me flexibility of mind — how to look at problems from different angles,” he says.</p> <p>With mentors and collaborators such as Associate Provost and Japan Steel Industry Professor Richard Lester and MIT Sloan School of Management professors Juan Pablo Vielma and Christopher Knittel, Sepulveda has been tweaking his models. His simulations, which can involve more than 1,000 scenarios, factor in existing and emerging technologies, uncertainties such as the possible emergence of fusion energy, and different regional constraints, to identify optimal investment strategies for low-carbon systems and to determine what pathways generate the most cost-effective solutions.</p> <p>“The idea isn’t to say we need this many solar farms or nuclear plants, but to look at the trends and value the future impact of technologies for climate change, so we can focus money on those with the highest impact, and generate policies that push harder on those,” he says.</p> <p>Sepulveda hopes his models won’t just lead the way to decarbonization, but do so in a way that minimizes social costs. “I come from a developing nation, where there are other problems like health care and education, so my goal is to achieve a pathway that leaves resources to address these other issues.”</p> <p>As he refines his computations with the help of MIT’s massive computing clusters, Sepulveda has been building a life in the United States. 
He has found a vibrant Chilean community at MIT&nbsp;and discovered local opportunities for venturing out on the water, such as summer sailing on the Charles.</p> <p>After graduation, he plans to leverage his modeling tool for the public benefit, through direct interactions with policy makers (U.S. congressional staffers have already begun to reach out to him), and with businesses looking to bend their strategies toward a zero-carbon future.</p> <p>It is a future that weighs even more heavily on him these days: Sepulveda is expecting his first child. “Right now, we’re buying stuff for the baby, but my mind keeps going into algorithmic mode,” he says. “I’m so immersed in decarbonization that I sometimes dream about it.”</p> “In courses like Nuclear Technology and Society, which covered the benefits and risks of nuclear energy, I saw that some people believed the solution for climate change was definitely nuclear, while others said it was wind or solar,” says doctoral student Nestor Sepulveda. “I began wondering how to determine the value of different technologies.”Photo: Gretchen ErtlNuclear science and engineering, MIT Energy Initiative, School of Engineering, Technology and policy, Students, Research, Alternative energy, Energy, Energy storage, Greenhouse gases, Climate change, Global Warming, Sustainability, Emissions, Renewable energy, Economics, Policy, Nuclear power and reactors, Profile, graduate, Graduate, postdoctoral J-PAL North America seeks partners to research homelessness Housing Stability Evaluation Incubator will provide funding and technical assistance to help partners build evidence on strategies to reduce and prevent homelessness. 
Mon, 13 Jan 2020 12:50:01 -0500 J-PAL North America <p>J-PAL North America, a research center in the MIT Department of Economics, has announced a new <a href=";utm_medium=email&amp;utm_campaign=hsei_2019" target="_blank">Housing Stability Evaluation Incubator</a> to support organizations fighting homelessness in developing randomized evaluations that test the impacts of their policies, programs, and services.&nbsp;</p> <p>To many, rising rates of homelessness in some U.S. cities might seem like an intractable challenge. In the United States, more than 500,000 people experience homelessness on a given night, and 1.4 million people pass through emergency shelters in a given year. Many more individuals experience housing instability in other, often uncounted forms, whether living doubled-up with friends or family, living in temporary accommodations such as motels, or living under threat of eviction.</p> <p>However, the challenge of housing instability is not insurmountable. There is strong evidence on some strategies for ending homelessness, and there are powerful tools for learning even more about how to support unhoused individuals and families in accessing and maintaining safe, affordable housing. For example, <a href="" target="_blank">several randomized evaluations</a> of Housing First programs helped demonstrate that providing permanent supportive housing with no preconditions was a more effective approach to housing unhoused individuals with severe mental illness than conventional transitional housing programs.&nbsp;</p> <p>The results from these studies changed many people’s perceptions about how to best help house people experiencing homelessness. 
These results also led to dramatic reductions in chronic homelessness among communities that adopted a robust Housing First approach and expanded the number of permanent supportive housing units in their jurisdiction.</p> <p>However, many questions remain on how to best design, implement, and target services aimed at reducing and preventing homelessness. To answer these questions, J-PAL North America will support organizations fighting homelessness to evaluate their own programs and learn more about what works for promoting housing stability.&nbsp;</p> <p>Through the Housing Stability Evaluation Incubator, organizations can apply for technical assistance from J-PAL North America staff, connections with J-PAL’s network of leading researchers, and flexible proposal development funding to develop one or more high-quality randomized evaluations.</p> <p>Any organization interested in answering policy-relevant research questions on strategies to reduce homelessness is invited to apply. This may include nonprofit service providers, government agencies or offices, public housing authorities, Continuums of Care, and other organizations that operate programs or policies aimed at reducing homelessness, preventing eviction, or promoting housing stability.</p> <p>To guide the development of future research, J-PAL North America also released an <a href="" target="_blank">evidence review</a> summarizing results from 40 rigorous evaluations of 18 distinct programs related to homelessness prevention and reduction. The publication focuses mainly on questions that can be answered through rigorous impact-evaluation methods and outlines a research agenda for additional evaluation. 
The Housing Stability Evaluation Incubator is a next step toward supporting new evaluations to fill gaps in the evidence base.&nbsp;</p> <p>J-PAL-affiliated researchers are working on <a href="" target="_blank">five ongoing research projects related to homelessness</a> with local jurisdictions across the United States. For example, the County of Santa Clara’s Office of Supportive Housing and local nonprofit provider, HomeFirst, are working with J-PAL North America and researchers from the University of Notre Dame’s Wilson Sheehan Lab for Economic Opportunities to develop a randomized <a href="" target="_blank">evaluation of a new rapid re-housing program for single adults</a>. The ongoing study in Santa Clara County, California, will inform decisions around expansion of the program in the county and can contribute new evidence to inform other governments facing similar challenges.&nbsp;</p> <p>“Partnering with researchers to improve evidence-based programs is critically important to reducing and preventing homelessness,” says Ky Le, director of the Office of Supportive Housing for the County of Santa Clara. “With J-PAL’s assistance, we are striving to optimize our impact and efficiency.”</p> <p>Interested organizations are encouraged to submit a letter of interest by April 6. Detailed instructions on how to apply to the Housing Stability Evaluation Incubator can be found on the <a href=";utm_medium=email&amp;utm_campaign=hsei_2019" target="_blank">initiative webpage</a>. Please contact Initiative Manager <a href=";utm_medium=email&amp;utm_campaign=hsei_2019">Rohit Naimpally</a> with questions.</p> <p>J-PAL North America will host a <a href="" target="_blank">webinar</a> on Feb. 10 at 2 p.m. to provide an introduction to the evaluation incubator, review the application process, and respond to questions. 
To receive information about the webinar and other updates about the evaluation incubator, sign up for J-PAL North America’s <a href=";utm_medium=website%20link&amp;utm_campaign=JPALNA_newsletter" target="_blank">mailing list on reducing and preventing homelessness</a>.</p> Santa Clara County’s Office of Supportive Housing in California is working with J-PAL North America and affiliated researchers to test the impact of rapid re-housing on homeless shelter entry, housing moves, and hospital visits for single adults.Abdul Latif Jameel Poverty Action Lab (J-PAL), Economics, Poverty, Research, School of Humanities Arts and Social Sciences, Housing In health care, does “hotspotting” make patients better? Study shows no effect from program intended to reduce repeated hospitalizations by targeting high-cost patients. Wed, 08 Jan 2020 16:59:59 -0500 Peter Dizikes | MIT News Office <p>The new health care practice of “hotspotting” — in which providers identify very high-cost patients and attempt to reduce their medical spending while improving care — has virtually no impact on patient outcomes, according to a new study led by MIT economists.&nbsp;</p> <p>The finding underscores the challenge of reducing spending on “superutilizers” of health care, the roughly 5 percent of patients in the U.S. who account for half the nation’s health care costs. The concept of hotspotting, a little more than a decade old, consists of programs that give at-risk patients sustained contact with doctors, other caregivers, and social service providers, in an attempt to prevent rehospitalizations and other intensive, expensive forms of care.&nbsp;</p> <p>The MIT study was developed in cooperation with the Camden Coalition of Healthcare Providers, which runs one of the nation’s best-known hotspotting programs. The researchers conducted a four-year analysis of the program and found that being enrolled in it makes no significant difference to patients’ health care use. 
&nbsp;</p> <p>“This intervention had no impact in reducing hospital readmissions,” says Amy Finkelstein, an MIT health care economist who led the study.</p> <p>Significantly, the new study was a randomized, controlled trial, in which two otherwise similar groups of patients in Camden were separated by one large factor: Some were randomly selected to be part of the hotspotting program, and an equal number of randomly selected patients were not. The two groups generated virtually the same results over time.</p> <p>“The reason it was so important we did a randomized, controlled trial,” Finkelstein says, “is that if you just look at the individuals in the intervention group, it would look like the program caused a huge reduction in readmissions. But when you look at the individuals in the control group — who were eligible for the program but were not randomly selected to get it — you see the exact same pattern.”</p> <p>The paper, “Health Care Hotspotting — A Randomized, Controlled Trial,” is being published today in the&nbsp;<em>New England Journal of Medicine</em>. The co-authors are Finkelstein, the John and Jennie S. MacDonald Professor of Economics at MIT, who is the paper’s corresponding author; Joseph Doyle, an economist who is the Erwin H. Schell Professor of Management at the MIT Sloan School of Management; Sarah Taubman, a research scientist at J-PAL North America, part of MIT’s Abdul Latif Jameel Poverty Action Lab; and Annetta Zhou, a postdoc at the National Bureau of Economic Research.</p> <p><strong>Camden Coalition “fabulous partners” in seeking answers</strong></p> <p>To conduct the study, the MIT-led research team evaluated 800 patients enrolled in the Camden Coalition of Healthcare Providers program from 2014 to 2017. The patients in the study had been hospitalized at least once in the six months prior to admission and had at least two chronic medical conditions, among other health care issues. 
The study was constructed after extensive consultation with the coalition.</p> <p>“They were fabulous partners,” Finkelstein says about the coalition. “Because they’re so data-driven, they had the data infrastructure in place, which made this possible.”</p> <p>Finkelstein particularly cites the founder of the Camden Coalition of Healthcare Providers, Jeffrey Brenner, who served as executive director of the organization from 2006 through 2017, and whose development of “hotspotting” concepts has received substantial public attention. In Camden, where 2 percent of patients represent 33 percent of medical expenses, preventing the need for acute care is a pressing concern.&nbsp;</p> <p>“Dr. Brenner is a really extraordinary person, and he’s trying to solve a very hard problem,” Finkelstein says, crediting Brenner for actively seeking data about his organization’s results without knowing what those outcomes would be.</p> <p>Half of the study’s 800 patients were placed in a group that used the program’s services, and half were in a control group that did not take part in the program. The Camden hotspotting program includes extensive home care visits, coordinated follow-up care, and medical monitoring — all designed to help stabilize the health of patients after hospitalization. 
It also helps patients apply for social services and behavioral health programs.</p> <p>Overall, the study found that the 180-day hospital readmission rate was 62.3 percent for people in the program and 61.7 percent for people not in the program.&nbsp;</p> <p>Additional measurements in the study — such as the number of hospital readmissions for patients, aggregate number of days spent in the hospital, and multiple financial statistics — also showed very similar outcomes between the two groups.</p> <p>The study shows that while the overall number of people in hotspotting programs who need rehospitalization declines over the course of the program, it does not decline by a larger amount than it would if those people were outside the program’s reach.</p> <p>In short, people in hotspotting programs require fewer rehospitalizations because any group of patients currently using a lot of health care resources will tend to have lower health care use in the future. Previous reports about hotspotting programs had focused on the roughly 40 percent decline in six-month hospital readmissions — while not comparing that to the rate for comparable patient groups outside such programs.</p> <p>“If you think about health care interventions, almost by definition they’re occurring at a time of unusually poor health or unusually high cost,” Finkelstein says. “That’s why you’re intervening. So they’re almost by construction going to be plagued by the issue of regression to[ward] the mean. I think that’s a really important lesson as we continue to try to figure out how to improve health care delivery, especially as so much of the work focuses on these high-cost patients.”</p> <p><strong>“We’re not going to give up” </strong></p> <p>To be sure, as Finkelstein notes, the new study is a local one, and hotspotting programs exist in many locations. 
It also examines the four-year results of the program, which underwent some evolution during the study period; if the program had made a breakthrough change in, say, 2016, that would only partially be reflected in the four-year data. As it happens, however, the study found no such large changes over time.&nbsp;</p> <p>Brenner’s perspective about studying the effectiveness of his own initiative, Finkelstein says, was that, by analogy, “if you have a new medication to try to cure cancer, and you run a clinical trial on it and it doesn’t work, you don’t just say, ‘I guess that’s it, we’re stuck with cancer.’ You keep trying other things. … We’re not going to give up on improving the efficiency of health care delivery and the well-being of this incredibly under-served population. We need to continue to develop potential solutions and rigorously evaluate them.”</p> <p>Finkelstein also notes that the current study is just one piece of research in the complicated area of improving health care and reducing costs for people in need of extensive treatment, and says she welcomes additional research in this area.</p> <p>“I hope it inspires more research and that more organizations will partner with us to study [these issues],” Finkelstein says.</p> <p>Finkelstein also serves as the scientific director of J-PAL North America at MIT, which backs randomized controlled trials on a variety of social issues.</p> <p>The data for the study came from the Camden Coalition of Healthcare Providers; Camden’s four hospitals; and the state of New Jersey.&nbsp;</p> <p>The research was supported by the National Institute on Aging of the National Institutes of Health; the Health Care Delivery Initiative of J-PAL North America; and the MIT Sloan School of Management.</p> A new MIT-led study set in Camden, New Jersey (pictured here), finds that “hotspotting” healthcare programs have a very limited effect when it comes to improving care and reducing costs for high-risk patients.Health care, 
Research, Economics, Aging, Medicine, Health, Abdul Latif Jameel Poverty Action Lab (J-PAL), National Institutes of Health (NIH), Sloan School of Management, School of Humanities Arts and Social Sciences Tracking emissions in China Evaluating a 2014 policy change yields some good news and some concerns. Mon, 30 Dec 2019 11:10:01 -0500 Nancy W. Stauffer | MIT Energy Initiative <p>In January 2013, many people in Beijing experienced a multiweek period of severely degraded air, known colloquially as the “Airpocalypse,” which made them sick and kept them indoors. As part of its response, the central Chinese government accelerated implementation of tougher air pollution standards for power plants, with limits to take effect in July 2014. One key standard limited emissions of <span class="st">sulfur dioxide (</span>SO<sub>2</sub>), which contributes to the formation of airborne particulate pollution and can cause serious lung and heart problems. The limits were introduced nationwide, but varied by location. Restrictions were especially stringent in certain “key” regions, defined as highly polluted and populous areas in Greater Beijing, the Pearl River Delta, and the Yangtze River Delta.</p> <p>All power plants had to meet the new standards by July 2014. So how did they do? “In most developing countries, there are policies on the books that look very similar to policies elsewhere in the world,” says&nbsp;<a href="">Valerie J. Karplus</a>, an assistant professor of global economics and management at the MIT Sloan School of Management. “But there have been few attempts to look systematically at plants’ compliance with environmental regulation. We wanted to understand whether policy actually changes behavior.”</p> <p><strong>Focus on power plants</strong></p> <p>For China, focusing environmental policies on power plants makes sense. Fully 60 percent of the country’s primary energy use is coal, and about half of it is used to generate electricity. 
With that use comes a range of pollutant emissions. In 2007, China’s Ministry of Environmental Protection required thousands of power plants to install continuous emissions monitoring systems (CEMS) on their exhaust stacks and to upload hourly, pollutant-specific concentration data to a publicly available website.</p> <p>Among the pollutants tracked on the website was SO<sub>2</sub>. To Karplus and two colleagues — Shuang Zhang, an assistant professor of economics at the University of Colorado at Boulder, and Douglas Almond, a professor in the School of International and Public Affairs and the Department of Economics at Columbia University — the CEMS data on SO<sub>2</sub>&nbsp;emissions were an as-yet-untapped resource for exploring the on-the-ground impacts of the 2014 emissions standards, over time and plant-by-plant.</p> <p>To begin their study, Karplus, Zhang, and Almond examined changes in the CEMS data around July 2014, when the new regulations went into effect. Their study sample included 256 power plants in four provinces, among them 43 that they deemed “large,” with a generating capacity greater than 1,000 megawatts (MW). They examined the average monthly SO<sub>2</sub>&nbsp;concentrations reported by each plant starting in November 2013, eight months before the July 2014 policy deadline.</p> <p>Emissions levels from the 256 plants varied considerably. The researchers were interested in relative changes within individual facilities before and after the policy, so they determined changes relative to each plant’s average emissions — a calculation known as demeaning. For each plant, they calculated the average emissions level over the whole time period being considered. They then calculated how much that plant’s reading for each month was above or below that baseline. 
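<p>The within-plant demeaning calculation described above can be sketched in a few lines of pandas. This is an illustrative reconstruction, not the researchers’ actual code; the column names and values are assumptions.</p>

```python
import pandas as pd

# Toy monthly CEMS readings (mg/m^3) for two plants; values are made up.
df = pd.DataFrame({
    "plant": ["A", "A", "A", "B", "B", "B"],
    "month": ["2014-01", "2014-07", "2015-01"] * 2,
    "so2":   [500.0, 400.0, 300.0, 260.0, 200.0, 140.0],
})

# Demean: subtract each plant's own average over the whole period, so each
# reading becomes a deviation from that plant's baseline.
df["so2_demeaned"] = df["so2"] - df.groupby("plant")["so2"].transform("mean")

# Average the deviations across plants within each month to trace how the
# group moved relative to its baseline over time.
monthly = df.groupby("month")["so2_demeaned"].mean()
print(monthly)
```

<p>Because each plant is compared only with itself, large and small emitters contribute on an equal footing to the monthly average.</p>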
By taking the averages of those changes-from-baseline numbers at all plants in each month, they could see how much emissions from the group of plants changed over time.</p> <p>The demeaned CEMS concentrations are plotted in the first accompanying graph, labeled “SO<sub>2</sub> concentrations (demeaned).” At zero on the Y axis in Figure 1 in the slideshow above, levels at all plants — big emitters and small — are on average equal to their baseline. As the figure shows, in January 2014 plants were well above their baseline, and by July 2016 they were well below it. So average plant-level SO<sub>2</sub>&nbsp;concentrations were declining slightly before the July 2014 compliance deadline, but they dropped far more dramatically after it.</p> <p><strong>Checking the reported data</strong></p> <p>Based on the CEMS data from all the plants, the researchers calculated that total SO<sub>2</sub>&nbsp;emissions fell by 13.9 percent in response to the imposition of the policy in 2014. “That’s a substantial reduction,” notes Karplus. “But are those reported CEMS readings accurate?”</p> <p>To find out, she, Zhang, and Almond compared the measured CEMS concentrations with SO<sub>2</sub>&nbsp;concentrations detected in the atmosphere by NASA’s Ozone Monitoring Instrument. “We believed that the satellite data could provide a kind of independent check on the policy response as captured by the CEMS measurements,” she says.</p> <p>For the comparison, they limited the analysis to the 43 plants with generating capacity above 1,000 MW — large plants that should generate the strongest signal in the satellite observations. Figure 2 in the slideshow above shows data from both the CEMS and the satellite sources. Patterns in the two measures are similar, with substantial declines in the months just before and after July 2014. 
That general agreement suggests that the CEMS measurements can serve as a good proxy for atmospheric concentrations of SO<sub>2</sub>.</p> <p>To double-check that outcome, the researchers selected 35 relatively isolated power plants whose capacity makes up at least half of the total capacity of all plants within a 35-kilometer radius. Using that restricted sample, they again compared the CEMS measurements and the satellite data. They found that the new emissions standards reduced both SO<sub>2</sub>&nbsp;measures. However, the SO<sub>2</sub>&nbsp;concentrations in the CEMS data fell by 36.8 percent after the policy, while concentrations in the satellite data fell by only 18.3 percent. So the CEMS measurements showed twice as great a reduction as the satellite data did. Further restricting the sample to isolated power plants with capacity larger than 1,000 MW produced similar results.</p> <p><strong>Key versus non-key regions</strong></p> <p>One possible explanation for the mismatch between the two datasets is that some firms overstated the reductions in their CEMS measurements. The researchers hypothesized that the difficulty of meeting targets would be higher in key regions, which faced the biggest cuts. In non-key regions, the limit fell from 400 to 200 milligrams per cubic meter (mg/m<sup>3</sup>). But in key regions, the limit went from 400 to 50 mg/m<sup>3</sup>. Firms may have been unable to make such a dramatic reduction in so short a time, so the incentive to manipulate their CEMS readings may have increased. For example, they may have put monitors on only a few of all their exhaust stacks, or turned monitors off during periods of high emissions.</p> <p>Figure 3 in the slideshow above shows results from analyzing non-key and key regions separately. At large, isolated plants in non-key regions, the CEMS measurements show a 29.3 percent reduction in SO<sub>2</sub>&nbsp;and the satellite data a 22.7 percent reduction. 
The ratio of the estimated post-policy declines is 77 percent — not too far out of line.</p> <p>But a comparable analysis of large, isolated plants in key regions produced very different results. The CEMS measurements showed a 53.6 percent reduction in SO<sub>2</sub>, while the satellite data showed no statistically significant change at all.</p> <p>One possible explanation is that power plants actually did decrease their SO<sub>2</sub>&nbsp;emissions after 2014, but at the same time nearby industrial facilities or other sources increased theirs, with the net effect being that the satellite data showed little or no change. However, the researchers examined emissions from neighboring high-emitting facilities during the same time period and found no contemporaneous jump in their SO<sub>2</sub>&nbsp;emissions. With that possibility dismissed, they concluded that manipulation of the CEMS data in regions facing the toughest emissions standards was “plausible,” says Karplus.</p> <p><strong>Compliance with the new standards</strong></p> <p>Another interesting question was how often the reported CEMS emissions levels were within the regulated limits. The researchers calculated the compliance rate at individual plants — that is, the fraction of time their emissions were at or below their limits — in non-key and key regions, based on their reported CEMS measurements. The results appear in Figure 4 in the slideshow above. In non-key regions, the compliance rate at all plants was about 90 percent in early 2014. It dropped a little in July 2014, when plants had to meet their (somewhat) stricter limits, and then went back up to almost 100 percent. In contrast, the compliance rate in key regions was almost 100 percent in early 2014 and then plummeted to about 50 percent at and after July 2014.</p> <p>Karplus, Zhang, and Almond interpret that result as an indication of the toughness of complying with the stringent new standards. 
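<p>A compliance rate of the kind plotted in Figure 4 is simply the share of reported readings at or below the applicable limit. A minimal sketch (the data layout and values are assumed for illustration, not taken from the study):</p>

```python
# Reported SO2 readings for one plant, in mg/m^3 (illustrative values).
readings = [45, 55, 48, 52, 49, 60, 40, 50]

def compliance_rate(readings, limit):
    """Fraction of readings at or below the regulatory limit."""
    return sum(r <= limit for r in readings) / len(readings)

# The same readings look very different under the two regimes:
print(compliance_rate(readings, limit=200))  # old limit: 1.0 (fully compliant)
print(compliance_rate(readings, limit=50))   # new key-region limit: 0.625
```

<p>Tightening the limit from 200 to 50 mg/m<sup>3</sup> drops this hypothetical plant from full compliance to roughly the 50-percent rates seen in key regions after July 2014.</p>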
“If you think about it from the plant’s perspective, complying with tighter standards is a lot harder than complying with more lenient standards, especially if plants have recently made investments to comply with prior standards, but those changes are no longer adequate,” she says. “So in these key regions, many plants fell out of compliance.”</p> <p>She makes another interesting observation. Their analyses had already produced evidence that firms in key areas may have falsified their reported CEMS measurements. “So that means they could be both manipulating their data and complying less,” she says.</p> <p><strong>Encouraging results plus insights for policymaking</strong></p> <p>Karplus stresses the positive outcomes of their study. She’s encouraged that the CEMS and satellite data both show emission levels dropping at most plants. Compliance rates were down at some plants in key regions, but that’s not surprising when the required cuts were large. And she notes that even though firms may not have complied, they still reduced their emissions to some extent as a result of the new standard.</p> <p>She also observes that, for the most part, there’s close correlation between the CEMS and satellite data. So the quality of the CEMS data isn’t all bad. And where it’s bad — where firms may have manipulated their measurements — it may have been because they’d been set a seemingly impossible task and timeline. “At some point, plant managers might just throw up their hands,” says Karplus. The lesson for policymakers may be to set emissions-reduction goals that are deep but long-term so that firms have enough time to make the necessary investment and infrastructure adjustments.</p> <p>To Karplus, an important practical implication of the study is “demonstrating that you can look at the alignment between ground and remote data sources to evaluate the impact of specific policies.” A series of tests confirmed the validity of their method and the robustness of their results. 
For example, they performed a comparable analysis focusing on July 2015, when there was no change in emissions standards. There was no evidence of the same effects. They accounted for SO<sub>2</sub>&nbsp;emitted by manufacturing facilities and other sources, and their results were unaffected. And they demonstrated that when clouds or other obstructions interfered with satellite observations, the resulting data gap had no impact on their results.</p> <p>The researchers note that their approach can be used for other short-lived industrial air pollutants and by any country seeking low-cost tools to improve data quality and policy compliance, especially when plants’ emissions are high to begin with. “Our work provides an illustration of how you can use satellite data to obtain an independent check on emissions from pretty much any high-emitting facility,” says Karplus. “And, over time, NASA will have instruments that can take measurements that are even more temporally and spatially resolved, which I think is quite exciting for environmental protection agencies and for those who would seek to improve the environmental performance of their energy assets.”</p> <p>This research was supported by a seed grant from the Samuel Tak Lee Real Estate Entrepreneurship Laboratory at MIT and by the U.S. National Science Foundation.</p> <div> <p><em>This article appears in the <a href="" rel="noreferrer" target="_blank">Autumn 2019 issue</a> of&nbsp;</em>Energy Futures<em>, the magazine of the MIT Energy Initiative.&nbsp;</em></p> </div> Assistant Professor Valerie Karplus and her collaborators have demonstrated that measurements of air pollutants taken by NASA satellites are often a good indicator of emissions on the ground. 
Their approach provides regulators with a low-cost tool to ensure that industrial firms are complying with emissions standards.Photo: Kelley TraversMIT Energy Initiative, Sloan School of Management, Energy, China, Emissions, Economics, Policy, Pollution, Research, Government, Business and management When machine learning packs an economic punch Study: After eBay improved its translation software, international commerce increased sharply. Fri, 20 Dec 2019 10:04:08 -0500 Peter Dizikes | MIT News Office <p>A new study co-authored by an MIT economist shows that improved translation software can significantly boost international trade online — a notable case of machine learning having a clear impact on economic activity.</p> <p>The research finds that after eBay improved its automatic translation program in 2014, commerce shot up by 10.9 percent among pairs of countries where people could use the new system.&nbsp; &nbsp;</p> <p>“That’s a striking number. To have it be so clear in such a short amount of time really says a lot about the power of this technology,” says Erik Brynjolfsson, an MIT economist and co-author of a new paper detailing the results.</p> <p>To put the results in perspective, he adds, consider that physical distance is, by itself, also a significant barrier to global commerce. The 10.9 percent change generated by eBay’s new translation software increases trade by the same amount as “making the world 26 percent smaller, in terms of its impact on the goods that we studied,” he says.</p> <p>The paper, “Does Machine Translation Affect International Trade? Evidence from a Large Digital Platform,” appears in the December issue of <em>Management Science</em>. The authors are Brynjolfsson, who is the Schussel Family Professor of Management Science at the MIT Sloan School of Management, and Xiang Hui and Meng Liu, who are both assistant professors in the Olin Business School at Washington University in St. 
Louis.</p> <p><strong>Just cause</strong></p> <p>To conduct the study, the scholars examined what happened after eBay, in 2014, introduced its new eBay Machine Translation (eMT) system — a proprietary machine-learning program that, by several objective measures, significantly improved translation quality on eBay’s site. The new system initially was focused on English-Spanish translations, to facilitate trade between the United States and Latin America.</p> <p>Previously, eBay had used Bing Translator to render the titles of objects for sale. By one evaluation measure, called the Human Acceptance Rate (HAR), in which three experts accept or reject translations, the eMT system increased the share of acceptable Spanish-language item titles on eBay from 82 percent to 90 percent.</p> <p>Using administrative data from eBay, the researchers then examined the volume of trade on the platform, between countries, after the eMT system went into use. Other factors being equal, the study showed that the new translation system not only had an effect on sales overall, but that trade increased by 1.06 percent for each additional word in the titles of items on eBay.</p> <p>That is a substantial change for a commerce platform on which, as the paper notes, items for sale often have long, descriptive titles such as “Diamond-Cut Stackable Thin Wedding Ring New .925 Sterling Silver Band Sizes 4-12,” or “Alpine Swiss Keira Women’s Trench Coast Double Breasted Wool Jacket Belted.” In those cases, making the translation clearer helps potential buyers understand exactly what they might be purchasing.</p> <p>Given the study’s level of specificity, Brynjolfsson calls it “a really fortunate natural experiment, with a before-and-after that sharply distinguished what happened when you had machine translation and when you didn’t.”</p> <p>The structure of the study, he adds, has enabled the researchers to say with confidence that the new eBay program, and not outside factors, directly generated the change 
in trade volume among affected countries.</p> <p>“In economics, it’s often hard to do causal analyses and prove that A caused B, not just that A was associated with B,” says Brynjolfsson. “But in this case, I feel very comfortable using causal language and saying that improvement in machine translation caused the increase in international trade.”</p> <p><strong>Larger puzzle: The productivity issue</strong></p> <p>The paper grew out of an ongoing question about new technology and economic productivity. While many forms of artificial intelligence have been developed and expanded in the last couple of decades, the impact of AI, including things like machine-translation systems, has not been obvious in economic statistics.</p> <p>“There’s definitely some amazing progress in the core technologies, including in things like natural language processing and translation,” Brynjolfsson says. “But what’s been lacking has been evidence of an economic impact, or business impact. So that’s a bit of a puzzle.”</p> <p>When looking to see if an economic impact for various forms of AI could be measured, Brynjolfsson, Hui, and Liu thought machine translation “made sense, because it’s a relatively straightforward implementation,” Brynjolfsson adds. That is, better translations could influence economic activity, at least on eBay, without any other changes in technology occurring.</p> <p>In this vein, the findings fit with a larger postulation Brynjolfsson has developed in recent years — that the adoption of AI technologies produces a “J-curve” in productivity. 
As Brynjolfsson has previously written, broad-ranging AI technologies nonetheless “require significant complementary investments, including business process redesign, co-invention of new products and business models, and investments in human capital” to have a large economic impact.</p> <p>As a result, when AI technologies are introduced, productivity may appear to slow down, and when the complementary technologies are developed, productivity may appear to take off — in the “J-curve” shape.</p> <p>So while Brynjolfsson believes the results of this study are clear, he warns against generalizing too much on the basis of this finding about the impact of machine learning and other forms of AI on economic activity. Every case is different, and AI will not always produce such notable changes by itself.</p> <p>“This was a case where not a lot of other changes had to happen in order for the technology to benefit the company,” Brynjolfsson says. “But in many other cases, much more complicated, complementary changes are needed. That’s why, in most cases with machine learning, it takes longer for the benefits to be delivered.”</p> A study co-authored by an MIT economist shows that an improved, automated language-translation system significantly boosted commerce on eBay’s website.Sloan School of Management, Business and management, Machine learning, Artificial intelligence, Economics, Technology and society, Social sciences, Innovation and Entrepreneurship (I&E) MIT Press authors earn coveted “best of” book honors in 2019 The book publisher continues to produce intellectually daring, scholarly work. Wed, 18 Dec 2019 15:30:01 -0500 MIT Press <p>The MIT Press recently announced that six MIT Press authors were awarded “best of” recognition in 2019. From Bill Gates’ recommendation of “Growth,” by one of his “favorite authors,” to “2016 in Museums, Money, and Politics,” which was selected as the <em>ARTnews</em> No. 
1 pick for “Best Art Books of the Decade,” the authors of the MIT Press continue to produce intellectually daring, scholarly work.</p> <p>“We are thrilled to have this recognition given to our forward-thinking authors,” says Amy Brand, director of the MIT Press. “Their work and expertise continue to drive our mission and foster the exchange of ideas, reinforcing the importance of intellectual conversations across the arts and sciences&nbsp;that advance our world.”</p> <p>Awards were given to the following books:</p> <p>“Gyorgy Kepes: Undreaming the Bauhaus,” by John R. Blakinger, was selected by <em>The New York Times</em> as a top art book of 2019 by critic Martha Schwendener.</p> <p>“An overdue treatment of the Hungarian-born artist and designer Gyorgy Kepes explores his career,” wrote Schwendener. “Technology and war are often common threads in Kepes’s work. Innovating forms of camouflage during World War II, his designs coincided with clashes around M.I.T.’s connections with the military during the Vietnam War. Mr. Blakinger argues that Kepes represents a new form of modern artist fluent in and influenced by technology: ‘the artist as technocrat.’”</p> <p>“2016 in Museums, Money, and Politics,” by Andrea Fraser, was the No. 1 pick on “The Best Art Books of the Decade” list by Alex Greenberger, senior editor for <em>ARTnews.</em></p> <p>“Where would we be without Andrea Fraser’s ‘2016 in Museums, Money, and Politics’?” asked Greenberger. “This book has become a touchstone at a time when activists are calling out board members for their political leanings … seeing it all collected neatly in one tome is powerful — as a cool-headed study, an intelligent research-based artwork, and a clarion call for change all in one.”</p> <p>“Mass Effect: Art and the Internet in the Twenty-First Century,” edited by Lauren Cornell and Ed Halter, was No. 
4 on Greenberger’s “Best Art Books of the Decade.”</p> <p>He wrote, “The closest thing to a movement that emerged this decade was a new kind of digital art — one that was termed ‘post-internet’ by some for the way it moved the slick aesthetics of the web into the world at large. Mass Effect has become the go-to critical companion to this style and work made by the artists whose pioneering pieces inspired it.”</p> <p>“Growth,” by Vaclav Smil, was recommended by Bill Gates on <em>Gates Notes
.</em></p> <p>“When I first heard that one of my favorite authors was working on a new book about growth, I couldn’t wait to get my hands on it,” said Gates. “(Two years ago, I wrote that I wait for new Smil books the way some people wait for the next Star Wars movie. I stand by that statement.) His latest doesn’t disappoint. As always, I don’t agree with everything Smil says, but he remains one of the best thinkers out there at documenting the past and seeing the big picture.”</p> <p>“Fables and Futures,” by George Estreich, was featured on <em>NPR Science Friday</em> as among “The Best Science Books of 2019.”</p> <p>“As new prenatal screening tools enter the market and we begin to seriously grapple with the idea of human genome editing, we would do well to think deeply about the consequences of such technologies on the rights and welfare of individuals we consider disabled,” wrote Valerie Thompson, editor for <em>Science Friday.</em> “I recommend 'Fables and Futures' to anyone who wants to seriously engage in the human genome editing debate at the society and species levels.”</p> <p>“Find Your Path: Unconventional Lessons from 36 Leading Scientists and Engineers,” by Daniel Goodman, was featured as a “Selected New Book on Higher Education” by <em>The Chronicle of Higher Education.</em></p> Six MIT Press authors were awarded “best of” recognition in 2019.Image courtesy of The MIT Press.Awards, honors and fellowships, Books and authors, MIT Press, Science communication, Arts, Economics, Politics, History, Science writing The uncertain role of natural gas in the transition to clean energy MIT study finds that challenges in measuring and mitigating leakage of methane, a powerful greenhouse gas, prove pivotal. Mon, 16 Dec 2019 10:43:54 -0500 David L. 
Chandler | MIT News Office <p>A new MIT study examines the opposing roles of natural gas in the battle against climate change — as a bridge toward a lower-emissions future, but also a contributor to greenhouse gas emissions.</p> <p>Natural gas, which is mostly methane, is viewed as a significant “bridge fuel” to help the world move away from the greenhouse gas emissions of fossil fuels, since burning natural gas for electricity produces about half as much carbon dioxide as burning coal. But methane is itself a potent greenhouse gas, and it currently leaks from production wells, storage tanks, pipelines, and urban distribution pipes for natural gas. Increasing its usage, as a strategy for decarbonizing the electricity supply, will also increase the potential for such “fugitive” methane emissions, although there is great uncertainty about how much to expect. Recent studies have documented the difficulty in even measuring today’s emissions levels.</p> <p>This uncertainty adds to the difficulty of assessing natural gas’ role as a bridge to a net-zero-carbon energy system, and in knowing when to transition away from it. But strategic choices must be made now about whether to invest in natural gas infrastructure. This inspired MIT researchers to quantify timelines for cleaning up natural gas infrastructure in the United States or accelerating a shift away from it, while recognizing the uncertainty about fugitive methane emissions.</p> <p>The study shows that in order for natural gas to be a major component of the nation’s effort to meet greenhouse gas reduction targets over the coming decade, present methods of controlling methane leakage would have to improve by anywhere from 30 to 90 percent. Given current difficulties in monitoring methane, achieving those levels of reduction may be a challenge. Methane is a valuable commodity, and therefore companies producing, storing, and distributing it already have some incentive to minimize its losses. 
Even so, intentional natural gas venting and flaring (which emit carbon dioxide) continue.</p> <p>The study also finds that policies favoring a direct move to carbon-free power sources, such as wind, solar, and nuclear, could meet the emissions targets without requiring such improvements in leakage mitigation, even though natural gas use would still be a significant part of the energy mix.</p> <p>The researchers compared several different scenarios for curbing methane from the electric generation system in order to meet a target for 2030 of a 32 percent cut in carbon dioxide-equivalent emissions relative to 2005 levels, which is consistent with past U.S. commitments to mitigate climate change. The findings appear today in the journal <em>Environmental Research Letters</em>, in a paper by MIT postdoc Magdalena Klemun and Associate Professor Jessika Trancik.</p> <p>Methane is a much stronger greenhouse gas than carbon dioxide, although how much stronger depends on the timeframe considered. Methane traps far more heat, but it doesn’t last as long once it’s in the atmosphere — decades rather than centuries. When averaged over a 100-year timeline, which is the comparison most widely used, methane is approximately 25 times more powerful than carbon dioxide. But averaged over a 20-year period, it is 86 times stronger.</p> <p>The actual leakage rates associated with the use of methane are widely distributed, highly variable, and very hard to pin down. Using figures from a variety of sources, the researchers found the overall range to be somewhere between 1.5 percent and 4.9 percent of the amount of gas produced and distributed. Some of this happens right at the wells, some occurs during processing and from storage tanks, and some is from the distribution system. 
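<p>The carbon dioxide-equivalence arithmetic behind these comparisons is straightforward; here is a sketch using the multipliers quoted above, with a made-up leakage quantity for illustration:</p>

```python
# Warming potency of methane relative to CO2, per the figures above.
GWP_100YR = 25  # averaged over a 100-year timeline
GWP_20YR = 86   # averaged over a 20-year timeline

def co2_equivalent(methane_tons, gwp):
    """Tons of CO2 with the same warming effect as the given methane."""
    return methane_tons * gwp

# Suppose 1,000 tons of gas are delivered and 3 percent leaks (mid-range
# of the 1.5-4.9 percent estimates cited above).
leaked_tons = 1000 * 0.03
print(co2_equivalent(leaked_tons, GWP_100YR))  # 750.0
print(co2_equivalent(leaked_tons, GWP_20YR))   # 2580.0
```

<p>The choice of averaging window alone changes the apparent impact of the same leak by more than a factor of three, which is one reason the study works with ranges rather than single numbers.</p>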
Thus, a variety of different kinds of monitoring systems and mitigation measures may be needed to address the different conditions.</p> <p>“Fugitive emissions can be escaping all the way from where natural gas is being extracted and produced, all the way along to the end user,” Trancik says. “It’s difficult and expensive to monitor it along the way.”</p> <p>That in itself poses a challenge. “An important thing to keep in mind when thinking about greenhouse gases,” she says, “is that the difficulty in tracking and measuring methane is itself a risk.” If researchers are unsure how much there is and where it is, it’s hard for policymakers to formulate effective strategies to mitigate it. This study’s approach is to embrace the uncertainty instead of being hamstrung by it, Trancik says: The uncertainty itself should inform current strategies, the authors say, by motivating investments in leak detection to reduce uncertainty, or a faster transition away from natural gas.</p> <p>“Emissions rates for the same type of equipment, in the same year, can vary significantly,” adds Klemun. “It can vary depending on which time of day you measure it, or which time of year. There are a lot of factors.”</p> <p>Much attention has focused on so-called “super-emitters,” but even these can be difficult to track down. “In many data sets, a small fraction of point sources contributes disproportionately to overall emissions,” Klemun says. “If it were easy to predict where these occur, and if we better understood why, detection and repair programs could become more targeted.” But achieving this will require additional data with high spatial resolution, covering wide areas and many segments of the supply chain, she says.</p> <p>The researchers looked at the whole range of uncertainties, from how much methane is escaping to how to characterize its climate impacts, under a variety of different scenarios. 
One approach places strong emphasis on replacing coal-fired plants with natural gas, for example; others increase investment in zero-carbon sources while still maintaining a role for natural gas.</p> <p>In the first approach, methane emissions from the U.S. power sector would need to be reduced by 30 to 90 percent from today’s levels by 2030, along with a 20 percent reduction in carbon dioxide. Alternatively, that target could be met through even greater carbon dioxide reductions, such as through faster expansion of low-carbon electricity, without requiring any reductions in natural gas leakage rates. The higher end of the published ranges reflects greater emphasis on methane’s short-term warming contribution.</p> <p>One question raised by the study is how much to invest in developing technologies and infrastructure for safely expanding natural gas use, given the difficulties in measuring and mitigating methane emissions, and given that virtually all scenarios for meeting greenhouse gas reduction targets call for phasing out, by mid-century, any natural gas use that doesn’t include carbon capture and storage. “A certain amount of investment probably makes sense to improve and make use of current infrastructure, but if you’re interested in really deep reduction targets, our results make it harder to make a case for that expansion right now,” Trancik says.</p> <p>The detailed analysis in this study should provide guidance for local and regional regulators as well as policymakers all the way to federal agencies, they say. The insights also apply to other economies relying on natural gas. 
The best choices and exact timelines are likely to vary depending on local circumstances, but the study frames the issue by examining a variety of possibilities that include the extremes in both directions — that is, toward investing mostly in improving the natural gas infrastructure while expanding its use, or accelerating a move away from it.</p> <p>The research was supported by the MIT Environmental Solutions Initiative. The researchers also received support from MIT’s Policy Lab at the Center for International Studies.</p> Methane is a potent greenhouse gas, and it currently leaks from production wells, storage tanks, pipelines, and urban distribution pipes for natural gas.IDSS, Research, Solar, Energy, Renewable energy, Alternative energy, Climate change, Technology and society, Oil and gas, Economics, Policy, MIT Energy Initiative, Emissions, Sustainability, ESI, Greenhouse gases New health insurance insights Economists analyze how patients and health care providers value Medicaid. Sun, 15 Dec 2019 23:59:59 -0500 Peter Dizikes | MIT News Office <p>A new analysis of a randomized health insurance program in Oregon sheds light on the value the program has for enrollees and providers alike.</p> <p>The study, by MIT economist Amy Finkelstein and two co-authors, suggests that adults with low incomes value Medicaid at only about 20 cents to 50 cents per dollar of medical spending paid on their behalf.</p> <p>“The value of Medicaid for most low-income adults is much lower than the medical expenditures paid by the insurance,” says Finkelstein, the John and Jennie S. MacDonald Professor at MIT and a leading health care economist.</p> <p>That finding reinforces the results of another, separate study that Finkelstein and multiple co-authors conducted in Massachusetts. 
In that case, for 70 percent of people in the Massachusetts state health insurance program for low-income adults, the valuation of the program was less than 50 percent of their expected insurance costs.&nbsp;</p> <p>While it might seem puzzling that recipients value health insurance at less than the covered medical expenditures, the study also offers an explanation for this: Low-income individuals who do not have insurance still pay only a fraction of their medical costs. In the Oregon data, this figure was roughly 20 percent of medical costs; prior studies have found similar results nationwide. The remainder of the spending on the low-income uninsured comes from a variety of sources, including charity care from nonprofit hospitals, publicly funded health clinics that offer free care, state funding to hospitals for uncompensated care, and unpaid medical debt.</p> <p>“The nominally uninsured have a fair amount of implicit insurance,” Finkelstein says. “Once you put it in that light, it becomes a lot less surprising that Medicaid spending is valued by them at a lot less than dollar for dollar.”</p> <p>One further implication of the findings is that a significant portion of public spending on health insurance for low-income individuals effectively acts as a subsidy for health care providers and state programs that cover the costs of uninsured patients.</p> <p>The new paper, “The Value of Medicaid: Interpreting Results from the Oregon Health Experiment,” appears in the December issue of the <em>Journal of Political Economy</em>. Its co-authors are Finkelstein; Nathan Hendren PhD ’12, a professor of economics at Harvard University; and Erzo F.P. 
Luttmer, a professor of economics at Dartmouth College.</p> <p>The <a href=";within%5Bauthor%5D=on&amp;journal=1&amp;q=finkelstein&amp;from=j" target="_blank">previous paper</a>, “Subsidizing Health Insurance for Low-Income Adults: Evidence from Massachusetts,” was published last spring in the <em>American Economic Review</em>. Its co-authors are Finkelstein; Hendren; and Mark Shepard, an assistant professor at the Harvard Kennedy School of Government.</p> <p><strong>A random walk in Oregon</strong></p> <p>The latest paper examines a distinctive Medicaid policy that Oregon implemented in 2008. With funding to cover only about 10,000 eligible adults, Oregon conducted a lottery to decide who would be eligible to apply for Medicaid.</p> <p>That random assignment of slots allowed the researchers to develop a study comparing two otherwise similar groups of Oregon residents: those who had obtained Medicaid coverage via lottery and those who entered the lottery but did not gain coverage. In effect, Oregon had developed a randomized controlled trial, which the scholars used for their research.</p> <p>Medicaid eligibility regulations and administrative practices can vary by state. In Oregon, adults and children generally qualify for Medicaid when they live in a household with income no greater than 133 percent of the poverty level defined by the U.S. federal government; in 2016, in the 48 contiguous states, that was $11,800 for a single person and $24,300 for a family of four.</p> <p>Previous studies of the Oregon experiment that Finkelstein has led have shown that, among other things, emergency room use increases among Medicaid recipients, contrary to expectations of many experts.</p> <p>Being covered by Medicaid also increases patient visits to doctors, prescription drug use, and hospital admissions, while reducing out-of-pocket medical expenses and lowering unpaid medical debt for recipients. 
Medicaid coverage also appears to lower the incidence of depression, although it does not seem to change the available measures of physical health.</p> <p>The current study uses data from the prior Oregon studies, as well as state Medicaid records, and survey data from individuals who applied for Oregon’s lottery. The survey data show how much people used health care, including prescription drugs, outpatient visits, emergency-room visits, and hospital visits.</p> <p>In line with previous studies, the current paper shows that having Medicaid increases total spending on health care — about $3,600 reimbursed to providers annually on behalf of each Medicaid enrollee, compared to $2,721 annually for each low-income uninsured individual. Of that $2,721, the low-income uninsured paid about $569 in annual out-of-pocket costs — the source of the paper’s estimate that uninsured individuals pay about 20 percent of charged costs.</p> <p>Using this data, the researchers also estimated an annual <em>net</em> cost of Medicaid in Oregon of $1,448 per recipient. This is the average annual increase in health care spending by Medicaid recipients, plus their average annual decrease in out-of-pocket spending. Thus moving a low-income uninsured individual in Oregon onto Medicaid results in a $1,448 increase in insured health care spending on behalf of that person.</p> <p>Because the Oregon Medicaid program’s reimbursements to health care providers are an average of $3,600 annually per recipient, the researchers estimate that about 40 percent of Medicaid spending underwrites costs incurred by enrollees. The other 60 percent is, as they write in the paper, “best conceived of as … a monetary transfer to external parties who would otherwise subsidize the medical care for the low-income uninsured.”</p> <p>Simultaneously, the researchers refined their “willingness to pay” metric by using multiple methods to estimate how much having health insurance affects consumer spending generally. 
These methods yielded three estimates ranging from $793 to $1,675 in annual health care spending for low-income individuals. This is the source of the paper’s conclusion that people value Medicaid at 20 percent to 50 percent of charged costs.</p> <p><strong>Two approaches, similar results</strong></p> <p>Significantly, the two studies use different methodological approaches to study different programs in different states, and arrive at similar conclusions. In Massachusetts, the scholars used data from the state’s health insurance program — a forerunner of the federal Affordable Care Act — to see how the share of eligible individuals who signed up for insurance changed as their subsidy level changed.</p> <p>“Despite a different design and different setting, even though it’s Massachusetts and not Oregon, and different method, we got pretty much the same result,” Finkelstein observes.</p> <p>Overall, Finkelstein says, it will be valuable to keep learning about the care obtained by uninsured people, as well as the ultimate destination of Medicaid funding, including the 60 percent that is routed to other parties that subsidize care for the low-income uninsured. 
Understanding who ultimately gets those transfers, she notes, could help illuminate how redistributive Medicaid actually is, as a program intended to benefit lower-income Americans.</p> <p>Moreover, Finkelstein says, more research will be needed to study how best to provide health care for lower-income Americans.</p> <p>“Right now we have an implicit, informal insurance system that likely reduces demand for formal insurance but provides a sort of patchwork of care that may not be very good,” Finkelstein says.</p> <p>Funding for the two studies was provided by the National Institute of Aging, the National Science Foundation, and the Harvard University Lab for Economic Applications and Policy.</p> A new analysis of a randomized health insurance program in Oregon sheds light on the value Medicaid has for enrollees and providers alike.School of Humanities Arts and Social Sciences, Economics, Health, Medicine, National Institutes of Health (NIH), National Science Foundation (NSF), Health care, Research Taking the carbon out of construction with engineered wood Substituting lumber for materials such as cement and steel could cut building emissions and costs. Wed, 11 Dec 2019 12:55:01 -0500 Mark Dwortzan | Joint Program on the Science and Policy of Global Change <p>To meet the long-term goals of the Paris Agreement on climate change — keeping global warming well below 2 degrees Celsius and ideally capping it at 1.5 C — humanity will ultimately need to achieve net-zero emissions of greenhouse gases (GHGs) into the atmosphere. To date, emissions reduction efforts have largely focused on decarbonizing the two economic sectors responsible for the most emissions, electric power and transportation. Other approaches aim to remove carbon from the atmosphere and store it through carbon capture technology, biofuel cultivation, and massive tree planting. 
&nbsp;</p> <p>As it turns out, planting trees is not the only way forestry can help in climate mitigation; how we use wood harvested from trees may also make a difference. Recent studies have shown that engineered wood products — composed of wood and various types of adhesive to enhance physical strength — involve far fewer carbon dioxide emissions than mineral-based building materials, and at lower cost. Now <a href="" target="_blank">new research</a> in the journal <em>Energy Economics</em> explores the potential environmental and economic impact in the United States of substituting lumber for energy-intensive building materials such as cement and steel, which account for <a href="" target="_blank">nearly 10 percent</a> of human-made GHG emissions and are among the hardest to reduce.</p> <p>“To our knowledge, this study is the first economy-wide analysis to evaluate the economic and emissions impacts of substituting lumber products for more CO<sub>2</sub>-intensive materials in the construction sector,” says the study’s lead author <a href="">Niven Winchester</a>, a research scientist at the MIT Joint Program on the Science and Policy of Global Change and Motu Economic and Public Policy Research. “There is no silver bullet to reduce GHGs, so exploiting a suite of emission-abatement options is required to mitigate climate change.”</p> <p>Comparing the economic and emissions impacts of replacing CO<sub>2</sub>-intensive building materials (e.g., steel and concrete) with lumber products in the United States under an economy-wide cap-and-trade policy consistent with the nation’s Paris Agreement GHG emissions-reduction pledge, the study found that the CO<sub>2</sub> intensity (tons of CO<sub>2</sub> emissions per dollar of output) of lumber production is about 20 percent less than that of fabricated metal products, under 50 percent that of iron and steel, and under 25 percent that of cement. 
In addition, shifting construction toward lumber products lowers the GDP cost of meeting the emissions cap by approximately $500 million and reduces the carbon price.</p> <p>The authors caution that these results only take into account emissions resulting from the use of fossil fuels in harvesting, transporting, fabricating, and milling lumber products, and neglect potential increases in atmospheric CO<sub>2</sub> associated with tree harvesting or beneficial long-term carbon sequestration provided by wood-based building materials.</p> <p>“The source of lumber, and the conditions under which it is grown and harvested, and the fate of wood products deserve further attention to develop a full accounting of the carbon implications of expanded use of wood in building construction,” they write. “Setting aside those issues, lumber products appear to be advantageous compared with many other building materials, and offer one potential option for reducing emissions from sectors like cement, iron and steel, and fabricated metal products — by reducing the demand for these products themselves.”</p> <p>Funded, in part, by Weyerhaeuser and the Softwood Lumber Board, the study develops and utilizes a customized economy-wide model that includes a detailed representation of energy production and use and represents production of construction, forestry, lumber, and mineral-based construction materials.</p> A 70-unit British Columbia lakeside resort hotel was built with local engineered wood products, including cross-laminated timber. 
New research explores the potential environmental and economic impact in the United States of substituting lumber for energy-intensive building products such as cement and steel.Photo: Province of British Columbia/FlickrResearch, Climate change, Greenhouse gases, Emissions, Climate, Environment, Energy, Economics, Policy, Carbon dioxide, Building, Sustainability, Materials Science and Engineering, Cement, Joint Program on the Science and Policy of Global Change MIT conference focuses on preparing workers for the era of artificial intelligence As automation rises in the workplace, speakers explore ways to train students and reskill workers. Fri, 22 Nov 2019 16:35:55 -0500 Rob Matheson | MIT News Office <p>In opening yesterday’s AI and the Work of the Future Congress, MIT Professor Daniela Rus presented diverging views of how artificial intelligence will impact jobs worldwide.</p> <p>By automating certain menial tasks, experts think AI is poised to improve human quality of life, boost profits, and create jobs, said Rus, director of the Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science.</p> <p>Rus then quoted a World Economic Forum study estimating AI could help create 133 million new jobs worldwide over the next five years. Juxtaposing this optimistic view, however, she noted a recent survey that found about two-thirds of Americans believe machines will soon rob humans of their careers. “So, who is right? The economists, who predict greater productivity and new jobs? The technologists, who dream of creating better lives? Or the factory line workers who worry about unemployment?” Rus asked. 
“The answer is, probably all of them.”</p> <p>Her remarks kicked off an all-day conference in Kresge Auditorium that convened experts from industry and academia for panel discussions and informal talks about preparing humans of all ages and backgrounds for a future of AI automation in the workplace. The event was co-sponsored by CSAIL, the MIT Initiative on the Digital Economy (IDE), and the MIT Work of the Future Task Force, an Institute-wide effort launched in 2018 that aims to understand and shape the evolution of jobs during an age of innovation.</p> <p>Presenters were billed as “leaders and visionaries” rigorously measuring technological impact on enterprise, government, and society, and generating solutions. Apart from Rus, who also moderated a panel on dispelling AI myths, speakers included Chief Technology Officer of the United States Michael Kratsios; executives from Amazon, Nissan, Liberty Mutual, IBM, Ford, and Adobe; venture capitalists and tech entrepreneurs; representatives of nonprofits and colleges; journalists who cover AI issues; and several MIT professors and researchers.</p> <p>Rus, a self-described “technology optimist,” drove home a point that echoed throughout all discussions of the day: AI doesn’t automate jobs; it automates tasks. Rus quoted a recent McKinsey Global Institute study that estimated 45 percent of tasks that humans are paid to do can now be automated. But, she said, humans can adapt to work in concert with AI —&nbsp;meaning job tasks may change dramatically, but jobs may not disappear entirely. 
“If we make the right choices and the right investments, we can ensure that those benefits get distributed widely across our workforce and our planet,” Rus said.</p> <p><strong>Avoiding the “job-pocalypse”</strong></p> <p>Common topics throughout the day included reskilling veteran employees to use AI technologies; investing heavily in training young students in AI through tech apprenticeships, vocational programs, and other education initiatives; ensuring workers can make livable incomes; and promoting greater inclusivity in tech-based careers. The hope is to avoid, as one speaker put it, a “job-pocalypse,” where most humans will lose their jobs to machines.</p> <p>A panel moderated by David Mindell, the Dibner Professor of the History of Engineering and Manufacturing and a professor of aeronautics and astronautics, focused on how AI technologies are changing workflow and skills, especially within sectors resistant to change. Mindell asked panelists for specific examples of implementing AI technologies into their companies.</p> <p>In response, David Johnson, vice president of production and engineering at Nissan, shared an anecdote about pairing an MIT student with a 20-year employee in developing AI methods to autonomously predict car-part quality. In the end, the veteran employee became immersed in the technology and is now using his seasoned expertise to deploy it in other areas, while the student learned more about the technology’s real-world applications. “Only through this synergy, when you purposely pair these people with a common goal, can you really drive the skills forward … for mass new technology adoption and deployment,” Johnson said.</p> <p>In a panel about shaping public policies to ensure technology benefits society — which included U.S. 
CTO Kratsios — moderator Erik Brynjolfsson, director of IDE and a professor in the MIT Sloan School of Management, got straight to the point: “People have been dancing around this question: Will AI destroy jobs?”</p> <p>“Yes, it will — but not to the extent that people presume,” replied MIT Institute Professor Daron Acemoglu. AI, he said, will mostly automate mundane operations in white-collar jobs, which will free up humans to refine their creative, interpersonal, and other high-level skills for new roles. Humans, he noted, also won’t be stuck doing low-paying jobs, such as labeling data for machine-learning algorithms.</p> <p>“That’s not the future of work,” he said. “The hope is we use our amazing creativity and all these wonderful and technological platforms to create meaningful jobs in which humans can use their flexibility, creativity, and all the things … machines won’t be able to do — at least in the next 100 years.”</p> <p>Kratsios emphasized a need for public and private sectors to collaborate to reskill workers. Specifically, he pointed to the Pledge to America’s Workers, the federal initiative that now has 370 U.S. companies committed to retraining roughly 4 million American workers for tech-based jobs over the next five years.</p> <p>Responding to an audience question about potential public policy changes, Kratsios echoed sentiments of many panelists, saying education policy should focus on all levels of education, not just college degrees. “A vast majority of our policies, and most of our departments and agencies, are targeted toward coaxing people toward a four-year degree,” Kratsios said. “There are incredible opportunities for Americans to live and work and do fantastic jobs that don’t require four-year degrees. 
So, [a change is] thinking about using the same pool of resources to reskill, or retrain, or [help students] go to vocational schools.”</p> <p><strong>Inclusivity and underserved populations</strong></p> <p>Entrepreneurs at the event explained how AI can help create diverse workforces. For instance, a panel about creating economically and geographically diverse workforces, moderated by Devin Cook, executive producer of IDE’s Inclusive Innovation Challenge, included Radha Basu, who founded Hewlett Packard’s operations in India in the 1970s. In 2012, Basu founded iMerit, which hires employees — half are young women and more than 80 percent come from underserved populations —&nbsp;to provide AI services for computer vision, machine learning, and other applications.</p> <p>A panel hosted by Paul Osterman, co-director of the MIT Sloan Institute for Work and Employment Research and an MIT Sloan professor, explored how labor markets are changing in the face of technological innovations. Panelist Jacob Hsu is CEO of Catalyte, which uses an AI-powered assessment test to predict a candidate’s ability to succeed as a software engineer, and hires and trains those who are most successful. 
Many of the company’s employees don’t have four-year degrees, and their ages range anywhere from 17 to 72.</p> <p>A “media spotlight” session, in which journalists discussed their reporting on the impact of AI on the workplace and the world, included David Fanning, founder and producer of the investigative documentary series FRONTLINE, which recently ran a documentary titled “In the Era of AI.” Fanning briefly discussed how, during his investigations, he learned about the profound effect AI is having on workplaces in the developing world, which rely heavily on manual labor, such as manufacturing lines.</p> <p>“What happens as automation expands, the manufacturing ladder that was opened to people in developing countries to work their way out of rural poverty — all that manufacturing gets replaced by machines,” Fanning said. “Will we end up across the world with people who have nowhere to go? Will they become the new economic migrants we have to deal with in the age of AI?”</p> <p><strong>Education: The great counterbalance</strong></p> <p>Elisabeth Reynolds, executive director for the MIT Task Force on the Work of the Future and of the MIT Industrial Performance Center, and Andrew McAfee, co-director of IDE and a principal research scientist at the MIT Sloan School of Management, closed out the conference and discussed next steps.</p> <p>Reynolds said the MIT Task Force on the Work of the Future, over the next year, will further study how AI is being adopted, diffused, and implemented across the U.S., as well as issues of race and gender bias in AI. 
In closing, she charged the audience with helping tackle the issues: “I would challenge everybody here to say, ‘What on Monday morning is [our] organization doing in respect to this agenda?’”&nbsp;</p> <p>In paraphrasing economist Robert Gordon, McAfee reemphasized the shifting nature of jobs in the era of AI: “We don’t have a job quantity problem, we have a job quality problem.”</p> <p>AI may generate more jobs and company profits, but it may also have numerous negative effects on employees. Proper education and training are keys to ensuring the future workforce is paid well and enjoys a high quality of life, he said: “Tech progress, we’ve known for a long time, is an engine of inequality. The great counterbalancing force is education.”</p> Daniela Rus (far right), director of the Computer Science and Artificial Intelligence Laboratory (CSAIL), moderated a panel on dispelling the myths of AI technologies in the workplace. The AI and the Work of the Future Congress was co-organized by CSAIL, the MIT Initiative on the Digital Economy, and the MIT Work of the Future Task Force.Image: Andrew KubicaResearch, Computer science and technology, Algorithms, Computer Science and Artificial Intelligence Laboratory (CSAIL), Sloan School of Management, Technology and society, Jobs, Economics, Policy, Artificial intelligence, Machine learning, Innovation and Entrepreneurship (I&E), Business and management, Manufacturing, Careers, Special events and guest speakers New MIT fellowship program aims to improve student access to quality schools Program to provide leaders of America’s largest school districts, state agencies, and education nonprofits with tools to improve school performance and enrollment. 
Thu, 21 Nov 2019 13:50:01 -0500 Stefanie Koperniak | Office of Open Learning <p>School districts nationwide are striving to offer more school options and to increase the overall quality of education for students, yet families everywhere struggle to enroll their children in a school that is the right fit.</p> <p>In an effort to help state and local education leaders improve enrollment systems and address some of education’s most substantial challenges, MIT has launched the MIT School Access and Quality Fellowship Program. This yearlong program, designed by the MIT School Effectiveness and Inequality Initiative (SEII) and the MIT Integrated Learning Initiative (MITili) and supported by the Michael and Susan Dell Foundation and Arnold Ventures, will engage the leaders of America’s largest school districts, state agencies, and education nonprofits, equipping them with the latest tools to improve school performance and enrollment policies.&nbsp;</p> <p>SEII, a research lab based in MIT’s Department of Economics and supported by MITili, conducts research to inform policy with school districts, state agencies, nonprofits, and higher education institutions throughout the United States. Locally, SEII’s scholars have worked with Boston Public Schools for almost 15 years to study student assignment and school choice processes. The lab helps BPS and other large urban districts develop fair and efficient enrollment systems.</p> <p>Now MIT professors and SEII co-directors Joshua Angrist and Parag Pathak hope to extend this work to school districts nationwide through the fellowship.&nbsp;</p> <p>“There is a lot of potential to use data to facilitate data-driven policy-making,” says Pathak. 
“It became apparent that it would be a missed opportunity not to equip district leaders with the latest thinking and tools.”</p> <p>In addition to exploring the latest evidence on enrollment practices, fellowship discussions also center on the close ties between school quality and school choice. “Every city measures and shares information about school quality — but this information can sometimes be deceiving,” says Angrist. For example, a highly selective school with successful alumni may not necessarily be “better” than other schools when considering the available data. Rather, it may be that the school admits students who are already on track to have good outcomes.</p> <p>The fellowship officially launched at the MIT School Access and Quality Summit Nov. 11-12. At the event, the first cohort of 14 fellows met with researchers and other education leaders to learn and share enrollment best practices. The program is intended to be a “two-way conversation” between fellows and researchers — with the fellows providing valuable perspectives about the specific policy challenges they face, and the researchers sharing approaches to interpreting data. The interaction between the fellows and MIT researchers will continue beyond the summit, with fellows participating in activities throughout the year.</p> <p>The collaborative nature of the program, bringing together practitioners and researchers, is an innovative approach to tackling some of the ongoing challenges that education leaders face. Dana Peterson, assistant state superintendent and CEO of the Baton Rouge Achievement Zone, explains that this program provides an opportunity to expand the infrastructure around school choice in light of the growing charter school sector in his region. 
He says that families in the area that had very few schooling choices 10-15 years ago may now have 20 options.</p> <p>“With charter schools, magnet schools, and also some scholarship opportunities, parents now have an abundance of choice,” says Peterson, “but that just creates more challenges for parents.”</p> <p>He looks forward to continuing to streamline Baton Rouge’s enrollment process and to empower parents to make well-informed school decisions.</p> <p>An abundance of choice is also a challenge in the New York City school system, the largest in the United States, which serves more than 1 million students in over 1,500 schools. &nbsp;</p> <p>“Many families learn about schools through word-of-mouth, or look at the schools closest to home,” says Nadiya Chadha, senior director of enrollment research and policy at the New York City Department of Education. “Given that reality, how can we highlight schools that may not have a strong reputation based on the usual metrics — such as graduation rate, or state exam performance — but are showing strong signs of growth and success on less-traditional measures?”</p> <p>Jorge Robles, chief operating officer of Tulsa Public Schools, identifies some of the district’s key challenges as not having easily accessible information about all schools, as well as an overall lack of awareness of the enrollment system.</p> <p>“Tulsa Public Schools is perceived to have a ‘dual system’ where magnet schools provide quality seats, and traditional neighborhood schools and charters do not,” says Robles. “Access to the perceived quality seats is limited and seen as not equitable. Consequently, in several of the traditional schools, enrollment is so low that it can be difficult to provide quality programming.”</p> <p>Robles says that he hopes to learn how to leverage unified enrollment system data to advance his work to improve quality school options in Tulsa, Oklahoma. 
In addition, he sees the program as a critical opportunity to learn directly from researchers and other K-12 leaders about enrollment policies that can further improve equitable access to first-rate educational experiences for all students.</p> <p>“I hope this will spark new ideas regarding the intersection of admissions and school quality, through examples from other districts and collective problem-solving,” says Chadha. “I’m excited to harness the brainpower of the group and create connections that we can continue beyond the conference to continually improve our work in NYC and across the country.”&nbsp;</p> MIT SEII and MITili Co-Director Parag Pathak (center) speaks with unified enrollment experts Gabriela Fighetti of New Orleans, Louisiana, and Catherine Peretti of Washington, D.C. Photo: Christopher McIntoshOffice of Open Learning, Economics, K-12 education, Education, teaching, academics, School of Humanities Arts and Social Sciences New research partnership evaluates innovation in family engagement Randomized evaluation of the TalkingPoints multilingual family engagement platform will assess the intervention&#039;s impact on student achievement. Tue, 19 Nov 2019 10:40:01 -0500 J-PAL North America <p>This fall, <a href="">J-PAL North America</a> partnered with <a href="">TalkingPoints</a>, an education technology non-profit, and the<a href="" style="text-decoration-line: none;"> </a><a href="">Behavioral Insights and Parenting Lab</a> (BIP Lab) at the University of Chicago, to evaluate the TalkingPoints multilingual family engagement platform. The platform will be assessed through a year-long randomized evaluation that will be conducted in more than 50 third-grade classrooms across the country. 
This evaluation will produce insights on whether the TalkingPoints platform increases parental engagement, and if so, whether there is a resulting increase in children’s executive function — a precursor to improved academic outcomes across all education levels such as literacy, numeracy, and high school graduation rates.</p> <p>In 2015, local and federal laws began requiring schools to provide programming intended to promote parental engagement in their children’s education. TalkingPoints was founded that year to drive student success — especially in underserved, diverse communities — by using accessible technology to unlock the potential of families to support their children's education. TalkingPoints developed a multilingual family engagement platform that allows educators to communicate directly with English and non-English speaking parents. Currently, it supports two-way messaging in more than 100 languages and provides parents with tips for communicating with teachers and other information about their children’s education.&nbsp;</p> <p>“We are excited for this opportunity to rigorously test our family engagement platform to understand its true impact on parental engagement and, ultimately, student achievement,” says Heejae Lim, founder and CEO of TalkingPoints.&nbsp;</p> <p>“We also hope that this evaluation can raise awareness of the value that rigorous evaluations like this can contribute to the field of using technology in education and leveraging parents and families as key partners to schools,” says Nancy Bromberger, vice president of partnerships at TalkingPoints.</p> <p>The evaluation will be led by the BIP Lab at the University of Chicago, which conducts rigorous research on the science of parental decision-making. 
The BIP Lab specializes in research to identify light-touch behavioral interventions for parents that work to change child outcomes, particularly for disadvantaged children.&nbsp;</p> <p>“This research partnership highlights the BIP Lab&nbsp;and TalkingPoints’ mutual interest in identifying effective behavioral tools and our shared focus on low cost, accessible interventions,” says Professor Ariel Kalil, BIP Lab co-founder.&nbsp;</p> <p>“Working to improve the quality and quantity of parent engagement to positively affect child outcomes is central to the mission of both of our organizations,” says Susan Mayer, BIP Lab co-founder.&nbsp; “We are excited to be launching this rigorous evaluation to contribute to the evidence base.”&nbsp;</p> <p>The evaluation is funded through the <a href="">J-PAL North America Education, Technology, and Opportunity Initiative</a>, which supports education leaders in using randomized evaluations to generate evidence on how and to what extent uses of technology and innovation work to improve student learning.&nbsp;&nbsp;</p> <p>“We are thrilled to be catalyzing this rigorous evaluation of a promising education technology platform,” says Kim Dadisman, J-PAL North America Education, Technology and Opportunity Initiative manager. “We are inspired by committed researchers and implementing partners like the BIP Lab and TalkingPoints that we connect to identify policy-relevant research questions and translate research into action.”&nbsp;</p> <p>The Education, Technology and Opportunity Initiative,&nbsp;supported by Arnold Ventures and the Overdeck Family Foundation, has funded seven evaluations to date on educational technology programs, ranging from computer-assisted learning to technology-enabled behavioral interventions. The TalkingPoints evaluation will be piloted in spring 2020, followed by full implementation beginning in fall 2020. 
J-PAL North America, TalkingPoints, and the BIP Lab are committed to sharing study results and identifying relevant policy lessons to inform the broader field of family engagement.</p> The TalkingPoints multilingual family engagement platform allows educators to communicate directly with English- and non-English-speaking parents through two-way messaging. J-PAL North America, TalkingPoints, and the BIP Lab are partnering to rigorously evaluate whether the innovative intervention increases parental engagement and children’s executive function.Photo: TalkingPointsAbdul Latif Jameel Poverty Action Lab (J-PAL), School of Humanities Arts and Social Sciences, Economics, Learning, K-12 education, Technology and society, Education, teaching, academics Times Higher Education ranks MIT No. 1 university worldwide for economics and business for 2020 Top honors awarded to fields in the School of Humanities, Arts, and Social Sciences and in the MIT Sloan School of Management for a second year in a row. Thu, 14 Nov 2019 22:55:01 -0500 School of Humanities, Arts, and Social Sciences <p>For the second year in a row, MIT has achieved the top ranking globally for the Business and Economics subject category in the 2020 Times Higher Education World University Rankings.<br /> <br /> MIT has also been ranked No. 1 in the world for the Social Science fields for 2020 by <em>Times Higher Education (THE)</em>, a leading British education magazine.<br /> <br /> At MIT, business and economic studies are housed in the Department of Economics, within the MIT School of Humanities, Arts, and Social Sciences (SHASS) and in the MIT Sloan School of Management.<br /> <br /> Using a set of 13 rigorous performance indicators, <em>THE</em> compiles and publishes its annual World University Rankings.
These rankings evaluate schools as a whole and within individual fields.&nbsp;<em>THE</em> discerns a university’s quality in a given subject area through five area metrics: the learning environment; the&nbsp;volume, income, and reputation of its research; the influence of its citations&nbsp;in other research; the&nbsp;international&nbsp;outlook of its staff, students, and research; and&nbsp;its knowledge transfer to various industries.<br /> <br /> “The work of both MIT SHASS and MIT Sloan continues to advance the highest areas of scholarship and practical application,” says Melissa Nobles, the Kenan Sahin Dean of the MIT School of Humanities, Arts, and Social Sciences. "MIT's longstanding commitment to cross-disciplinary thinking and collaboration fosters the strength of the collective business and economics programs across the Institute.<br /> <br /> We warmly congratulate our colleagues in MIT Sloan with whom we share this honor.&nbsp;From poverty alleviation to the future of work, the combined knowledge and experience of MIT’s experts helps shape economic policy, drive business growth, and prepare future leaders."<br /> &nbsp;<br /> <strong>The MIT Sloan School of Management</strong></p> <p>The MIT Sloan School of Management, which evolved out of Course XV/Engineering Administration, is a powerful force within MIT’s entrepreneurial environment, training alumni whose businesses — which include HubSpot, ZipCar, Akamai, and E*Trade — have created millions of jobs and generate nearly $2 trillion a year in revenue.&nbsp;At the intersection of business and technology, MIT Sloan is exploring the future of work and launching companies that kick-start local economies in the developing world. The school is retooling systems to make health care work better and to engage people around the world in addressing climate change. 
For&nbsp;students, this means different kinds of opportunities, hands-on learning, global experience — and a relentless focus on impact.<br /> <br /> “We are thrilled to share the honors of being first in business and management for the second year in a row with our colleagues in the MIT Department of Economics and MIT SHASS,” says David Schmittlein, dean of MIT Sloan.<br /> &nbsp;<br /> <strong>The MIT Department of Economics</strong></p> <p>For more than a century, MIT’s Department of Economics has been at the forefront of economics education, research, and public service. Its master's and doctoral programs are renowned worldwide, and graduates of the department are well-represented on the faculties of virtually all leading economics departments.<br /> <br /> “We’re proud of this recognition of the contributions of MIT Economics,” says Nancy L. Rose, department head and the Charles P. Kindleberger Professor of Applied Economics. “MIT Economics continues to play a key role in advancing the frontier in economics research and education, with four of our faculty recognized with Nobel Prizes over just the past decade, and eight more of our graduate alumni among the Nobel ranks since 2001.&nbsp;Our undergraduate and graduate alumni amplify the reach of our research and education program,” adds Rose, “through their impact on industry, public policy, and the economics profession.”<br /> <br /> Last month, faculty in the department were honored with the Nobel Prize in Economic Sciences, awarded to professors&nbsp;Abhijit Banerjee and Esther Duflo, who lead the department’s Abdul Latif Jameel Poverty Action Lab, for their transformative work in poverty alleviation and development economics.</p> "The work of both MIT SHASS and MIT Sloan continues to advance the highest areas of scholarship and practical application," says SHASS Dean Melissa Nobles.
"From poverty alleviation to the future of work, the combined knowledge and experience of MIT’s experts helps shape economic policy, drive business growth, and prepare future leaders."Photo: Evan LiebermanBusiness and management, Economics, Rankings, School of Humanities Arts and Social Sciences, Sloan School of Management Economics for hard times In new book, Nobel laureates Banerjee and Duflo examine what we know about the global economy and how to improve it. Mon, 11 Nov 2019 23:59:59 -0500 Peter Dizikes | MIT News Office <p>Economists, on the whole, favor open immigration and free trade policies, which they regard as catalysts for economic growth. But as polling shows, many people in the U.S. and Europe disagree. They are wary of losing jobs and earning power where there is immigration, and they believe free trade pushes industry abroad. So who’s right, the economists, or the people?</p> <p>Well, according to MIT’s newest Nobel Prize laureates, economists Esther Duflo and Abhijit Banerjee, each side gets it right on one count and wrong on the other.&nbsp;</p> <p>“If you look at the best evidence, it tells us that economists’ view of migration is more correct,” Duflo says. “It is not a big problem to let more migrants in.” Study after study shows that increased immigration does not affect wages, for instance. And the presence of migrants tends to let more women who are longtime residents enter the work force.</p> <p>Okay, what about trade?</p> <p>“On trade it is the opposite,” Duflo says. “The evidence shows that people’s instinctive view of trade, that it does hurt them, has a lot that is true about it, and economists’ instinctive view on trade, that it should be good for everyone, is not correct.”</p> <p>Although free trade boosts overall growth, it also produces concentrated pockets of job losses. And while economic theory has long held that displaced workers will move to new job opportunities, this rarely happens.
In countries that started trading with China during the last two decades, for example, the working-age population has not decreased in the areas hardest hit by imports from China.</p> <p>What the mistaken ideas about both immigration and trade ignore, Banerjee says, is the “stickiness” of real life. Most people do not want to uproot themselves.</p> <p>“One thing that ties those two issues together is the idea of stickiness,” Banerjee says. “Ordinary people like to stay in place. [Economists] think trade should be fine because, yes, it could hurt some people, but people are going to move to other jobs in other places. But people are very reluctant to do that. They don’t want to go to a different sector and a different place and a different life.”</p> <p>Until recently, these were not the kinds of issues Banerjee and Duflo often discussed. But now, in their second book, “Good Economics for Hard Times,” published today by Public Affairs Press, the MIT duo examines large-scale, politically fraught issues with economic implications, including immigration, trade, social identity, inequality, automation, and more.</p> <p>In each case, the book examines what empirical research tells us about the world — as well as the limits of our knowledge.
Only on that basis, Banerjee and Duflo suggest, can we think effectively about economic policy.</p> <p>Or, as the authors write in the new book, “The world is a sufficiently complicated and uncertain place that the most valuable thing economists have to share is often not their conclusion, but the path they took to reach it — the facts they knew, the way they interpreted those facts, the deductive steps they took, the remaining sources of their uncertainty.”</p> <p><strong>Scaling up</strong></p> <p>The new work by Banerjee and Duflo follows “Poor Economics” (PublicAffairs, 2011), their first book, which focused on helping the world’s 1 billion poorest people, who exist on the equivalent of $1 per day.</p> <p>“Poor Economics” stemmed from research Banerjee and Duflo have conducted and facilitated as co-founders (with Sendhil Mullainathan) of MIT’s Abdul Latif Jameel Poverty Action Lab (J-PAL), a leading antipoverty research network. These smaller-scale, empirical projects are what won Duflo and Banerjee the Nobel Prize in Economic Sciences last month, which they shared with Michael Kremer of Harvard University.</p> <p>By contrast, “Good Economics for Hard Times” examines issues of global scale, while maintaining the authors’ taste for empiricism. Trade is a hotly debated issue, but how much does it contribute to growth? As the authors note, it produces a notably modest benefit in the U.S., where it equals about 2.5 percent of GDP, no more than what a good year of growth is worth.</p> <p>Similarly, while immigration stereotypes abound — think of the “Polish plumber” who supposedly takes away fix-it jobs from the British or French — people do not migrate as often as the popular perception suggests.
About 3 percent of Greeks have left the country this decade, despite unemployment rates reaching as high as 27 percent and the presence of open borders in the European Union.</p> <p>“The Polish plumber is an iconic figure in France but lives mostly in Poland,” Duflo says.</p> <p>To be sure, large sections of “Good Economics for Hard Times” focus on other issues. In one chapter, Banerjee and Duflo contend that people’s sense of ethnic or partisan identity is more flexible than is often assumed. If so, that would be good news for some policy advocates. In many countries, ethnic or political divisions can create a barrier to public spending if people are unwilling to be taxed for the sake of other social groups. But Banerjee and Duflo suggest that a significant part of this is a public-perception problem.&nbsp;</p> <p>“At the core of this is a lie,” Banerjee says. “And I think we have to start by saying that. It’s just not true that all federal and state spending goes to ‘other’ people. There’s a lot of polarization that was created by such lies, and while we won’t fix these prejudices in a day, I do think it’s worth pushing back.”</p> <p>As Duflo points out, it is also tough to establish cause and effect when examining why some governments tax and spend more than others.&nbsp;</p> <p>“It’s true the U.S. is a diverse society and Denmark is not diverse, and Denmark has much higher taxes than the U.S.,” she says. “But I’m not sure whether that comes from the fact that Denmark is [more socially homogenous] or from the fact that the government as an enterprise is [seen as] a more legitimate enterprise generally.”</p> <p><strong>“We have much more to learn”</strong></p> <p>While the intent of “Good Economics for Hard Times” may be to get people to think sharply about pressing problems, Banerjee and Duflo also discuss the kinds of policy interventions they think are promising. Some of these aid people in “transitions” during life, especially job loss. 
These transitions have significant social impact; research shows that people who lose jobs after age 50 have lower life expectancy than those who keep working. In a rapidly changing economy, we need to worry about the many people who will have a hard time in the labor market.&nbsp;</p> <p>“The idea that people are going to find the opportunity, left to themselves, is implausible,” Banerjee says. “[Our] view is we need to act on this collectively: You aren’t a failure because you lost your job. This is a transition. It’s society’s problem rather than only yours.”</p> <p>While there are a variety of policy measures to do this — such as improved Trade Adjustment Assistance for people displaced by trade-induced job losses — Banerjee and Duflo also favor what they call the “somewhat radical idea” of subsidizing entire firms and older workers affected by trade, keeping them in business and at work, respectively. A robust effort to do this, they write, would help “prevent communities from falling apart” when firms struggle.&nbsp;&nbsp;</p> <p>And, as Duflo says, “People don’t want just money. They want dignity. Giving them that is not betraying some deep philosophical principle.”</p> <p>For this reason, the authors are more skeptical of universal basic income proposals; as they note, U.S.-based surveys show that about 80 percent of workers have a strong sense of satisfaction, usefulness, or personal accomplishment tied to their jobs and careers.&nbsp;</p> <p>Perhaps more conventionally, Banerjee and Duflo also strongly favor greater support for “labor-intensive public services” such as public education and care for the elderly.
Crucially, these kinds of jobs are unlikely to be either replaced by technology, or outsourced to another country, since they are firmly situated in particular places.</p> <p>As “Good Economics for Hard Times” also points out, a wealth of research strongly suggests the high social value of, say, early childhood education; such investments would clearly pay for themselves, on a society-wide basis.</p> <p>In all cases, Banerjee and Duflo write, “The goal of social policy, in these times of change and anxiety, is to help people absorb the shocks that affect them without allowing those shocks to affect their sense of themselves.” And, as they note, “we clearly don’t have all the solutions, and suspect that nobody else does either. We have much more to learn. But as long as we understand what the goal is, we can win.”&nbsp;</p> MIT economists Abhijit Banerjee and Esther Duflo and their new book, “Good Economics for Hard Times.”Image: Bryce VickmarkSchool of Humanities Arts and Social Sciences, Economics, Faculty, Books and authors, Social sciences, Abdul Latif Jameel Poverty Action Lab (J-PAL), Trade, Development, Policy, Politics 3 Questions: Jonathan Gruber on academics engaging with policymakers Professor of economics cites the importance of initiatives like the MIT Policy Lab, which helps academics focus some energy on influencing public policy. Wed, 06 Nov 2019 12:45:01 -0500 Center for International Studies <p><em>Jonathan Gruber is the Ford Professor of Economics at MIT and director of the Health Care Program at the National Bureau of Economic Research. An associate editor of both the&nbsp;</em>Journal of Public Economics&nbsp;<em>and the&nbsp;</em>Journal of Health Economics<em>, he has been heavily involved in crafting public health policy. 
Gruber joined the MIT Policy Lab at the Center for International Studies as a core faculty member in 2017.</em>&nbsp;</p> <p><em>The Policy Lab works with faculty to create, support,&nbsp;and execute strategies to influence the policy community in an effort to maximize the impact of research on public policy. Launched in 2014, the Policy Lab has sponsored more than 90 projects with more than&nbsp;50 principal investigators from all five schools at MIT. The lab distilled its experience connecting researchers to policymakers into a <a href="" target="_blank">short online resource</a> on the EdX platform, and it recently</em><em> issued its fifth <a href="">call for proposals</a>. </em></p> <p><em>Gruber sat down to discuss the importance of faculty members engaging in public policy, as well as some successes of the Policy Lab.</em></p> <p><strong>Q: </strong>What is the MIT Policy Lab and why did you decide to get involved with the program?</p> <p><strong>A: </strong>The MIT Policy Lab is a vital initiative begun out of the Center for International Studies, which&nbsp;provided seed funding that was then supplemented by the dean of the School of Humanities, Arts, and Social Sciences (SHASS) and the provost.&nbsp;The idea of this initiative is to build a series of connections between MIT faculty and policymakers.</p> <p>While there is a whole host of policy-relevant research being carried out at MIT, there are two important barriers to that work influencing policy decisions: the translation of sometimes quite complicated research findings into policy relevant lessons, and making the connections between MIT faculty and relevant government policymakers.</p> <p>So far, this initiative has been very successful in overcoming both of these barriers. 
Our excellent managing director, Daniel Pomeroy, has wide-ranging experience in a variety of scientific fields, as well as experience in science policy in Washington, D.C., making him a perfect person to help faculty members translate their work into policy-relevant discussions. Our generous funding from SHASS and the provost has allowed us to provide financial support to faculty who want to dedicate time to this activity and/or hire students to help. And the connections of all of our leadership in government, as well as our partnership with the MIT Washington Office, have allowed us to make valuable connections between researchers and policymakers.</p> <p>I learned about the Policy Lab through discussions with MIT leadership about my frustrations with the lack of translation of MIT's research to the policy landscape. When I found out about the Policy Lab I was very excited to realize that an institution already existed to facilitate this translation.</p> <p><strong>Q: </strong>Why do you think it is important for faculty to engage with public policymakers?</p> <p><strong>A:</strong> In my view, one of the central fights in the U.S. today is over the role of expertise and the scientific method more generally. Traditionally, when policymakers wanted to make decisions over technically complicated issues, they and their staff turned to subject-matter experts to help. This process was supported by the public's respect for such experts; after all,&nbsp;<em>Time </em>magazine once named “U.S. Scientists” as Man of the Year!</p> <p>Both the public support for scientific expertise and policymakers’ willingness to rely on evidence have diminished over time. Partly this reflects a set of political developments which have led to a general lack of respect for expertise or the use of the scientific method over personal intuition and biases.
But the problem is exacerbated by an increasingly specialized and distant base of academics who are interested solely in impressing each other, and not in translating their insights for the general public.</p> <p>For both of these reasons, it is a critical time for academics to focus some of their energy on engaging with policymakers and the public. The Policy Lab is an excellent resource for facilitating those interactions.</p> <p><strong>Q: </strong>Since joining the MIT Policy Lab as a core faculty member in 2017, what have you seen as the most successful aspects of the program?</p> <p><strong>A: </strong>Two different aspects of the Policy Lab have been very pleasant surprises to me. The first is faculty excitement and willingness to engage with the program. I thought that the Policy Lab would have to work hard to get any faculty to participate. But I was, fortunately, very wrong: From the beginning there has been an abundance of faculty who are very excited about this initiative and eager to participate. Indeed, we have been unable to support all of the requests that we have received! The fact that there is this pre-existing demand for an initiative of this type is very exciting.</p> <p>The second is the ability of the Policy Lab to leverage the relatively limited time of our staff and small grants to make real and valuable connections in the policy world. A variety of projects have already yielded significant impacts, on topics as diverse as using fluid dynamics to predict the transmission of disease and using predictive modeling to help assess the environmental implications of deep-sea mining.
These are vital policy issues that cannot be effectively addressed without the kind of scholarly work that MIT brings to bear — and we are seeing that expertise being used to make a real difference.</p> <p>The model that the MIT Policy Lab has created over the last five years has proven to be an effective and efficient way to connect MIT research to public policy. I hope that we can continue to build on these successes to provide a platform for broadly sharing the enormous policy-relevant knowledge base at MIT with the world.</p> <p><em>Editorial team: Dan Pomeroy &amp; Michelle English</em></p> According to MIT Professor Jonathan Gruber, there are two important barriers to MIT research influencing policy decisions: the translation of sometimes complicated research findings into policy-relevant lessons, and making connections between MIT faculty and relevant government policymakers. The MIT Policy Lab has been very successful in overcoming both barriers.Photo: Laura Kerwin/Center for International StudiesCenter for International Studies, Provost, Economics, Policy, School of Humanities Arts and Social Sciences, 3 Questions, Government, Political science, Faculty Optimizing kidney donation and other markets without money MIT economist Nikhil Agarwal analyzes the efficiency of markets that match suppliers and consumers but don’t use prices. Mon, 04 Nov 2019 23:59:59 -0500 Peter Dizikes | MIT News Office <p>When people die, they can become organ donors for a period of about 24 to 48 hours. But 20 percent of kidneys in the U.S. that could be transplanted in these situations are never used.</p> <p>Meanwhile, by some estimates, 30 to 50 percent of living people who are willing to donate a kidney never find a recipient. With around 100,000 Americans waiting for kidney transplants at any given time, those are suboptimal situations.&nbsp;&nbsp;</p> <p>What can be done to help fix this? 
Give the problem to a market design scholar, such as MIT economist Nikhil Agarwal, who has studied the issue in close detail.</p> <p>From within the walls of MIT’s Building E52, where economics equations litter the whiteboards, Agarwal’s work has now leapt out to the medical establishment. In the last year, a new method he and some colleagues formulated for a more efficient kidney-donation system has been approved for implementation by the Alliance for Paired Donation, the second-largest platform for such transplants in the U.S.</p> <p>“It’s particularly exciting,” says Agarwal, who is low-key about his accomplishments but allows that he is thrilled to see his work having a tangible effect. Currently there are about 800 kidney exchange transplants in the U.S. annually; by Agarwal’s estimation, a more efficient exchange market could increase that number by 30 to 60 percent.</p> <p>Though Agarwal’s work is still being implemented, and it is not yet easy to quantify its impact, it is simple enough to see his rising trajectory in academia. For his research and teaching, Agarwal was granted tenure at MIT earlier this year.</p> <p><strong>“That’s not how a lot of markets work” </strong></p> <p>At first glance, transplants might not seem to be a problem for an economist. But a growing cadre of economists have made notable progress understanding markets that match pairs of things — transplant donors and recipients, applicants and schools — and do not use money to settle matters.</p> <p>“In economics,” Agarwal says, “we often [assume] there’s the demand, the supply, the price, and the market clears, somehow. It just happens.” And yet, he says, “That’s not how a lot of markets work. There are all these different important markets where we do not allow prices.”</p> <p>Scholars in the field of “market design,” therefore, closely examine these nonfinancial markets, observing how their rules and procedures affect outcomes.
Agarwal calls himself a specialist in “resource allocation systems that do not use prices.” These include kidney donations: The law forbids selling vital organs. Many education systems and entry-level labor markets, for example, also fit into this category.&nbsp;</p> <p>In Agarwal’s case, he has a specialty within his specialty. Some market-design scholars are theorists. Agarwal is an empiricist who locates data on nonpriced markets, evaluates their efficiency, and works out improvements.</p> <p>“Data can teach you new things you maybe wouldn’t have otherwise thought,” Agarwal says.</p> <p>In a series of papers examining the inefficiencies of kidney transplant systems in the U.S., Agarwal and a variety of co-authors looked at the numbers and came back with solutions. One major source of inefficiency, Agarwal has discovered, is a lack of scale. Bigger networks of hospitals could better match donors and recipients. Right now, 62 percent of kidney donor-and-recipient pairings consist of patients at the same hospitals; that number would be lower in a more efficient system.</p> <p>One reason for this: Donors and recipients must have matching blood types. People with type O blood can donate kidneys across blood types, but they can only receive kidneys from other type O people. Due to the timing of when people enter kidney markets, a bigger network is more efficient in this regard. In single-hospital networks, 22.8 percent of type O donors give a kidney to a non-type O recipient (for whom other donors might be found), while in the biggest U.S. kidney network, just 6.5 percent do, meaning its type O participants are connecting more optimally.</p> <p>Agarwal’s research also suggests that hospitals tend to be very concerned about the financial and administrative costs they incur while handling the transplant process — although such costs are small compared to the overall social value of transplants. 
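The blood-type rule described above can be sketched in a few lines of Python. This is purely illustrative: the compatibility table reflects standard blood-type rules, but the greedy two-way matching and the hospital data are hypothetical stand-ins, not the optimization actually used by kidney exchange platforms. The sketch shows why pooling incompatible pairs across hospitals can unlock swaps that no single hospital could make internally.

```python
# Blood types each donor type can give to: O is a universal donor,
# while AB donors can give only to AB patients.
COMPATIBLE = {
    "O": {"O", "A", "B", "AB"},
    "A": {"A", "AB"},
    "B": {"B", "AB"},
    "AB": {"AB"},
}

def can_donate(donor: str, patient: str) -> bool:
    """True if a kidney from `donor` is blood-type compatible with `patient`."""
    return patient in COMPATIBLE[donor]

def two_way_swaps(pairs):
    """Greedily count two-way exchanges among (donor, patient) pairs.

    Each pair is a willing donor who is incompatible with their own
    intended patient; two pairs can swap if each donor is compatible
    with the other pair's patient. (Toy heuristic, not the real method.)
    """
    matched = [False] * len(pairs)
    swaps = 0
    for i, (d_i, p_i) in enumerate(pairs):
        if matched[i]:
            continue
        for j in range(i + 1, len(pairs)):
            if matched[j]:
                continue
            d_j, p_j = pairs[j]
            if can_donate(d_i, p_j) and can_donate(d_j, p_i):
                matched[i] = matched[j] = True
                swaps += 1
                break
    return swaps

# Two hospitals, each holding one internally incompatible pair:
# in isolation neither can transplant, but pooling enables one swap.
hospital_1 = [("A", "B")]  # donor type A, patient type B
hospital_2 = [("B", "A")]
print(two_way_swaps(hospital_1))               # → 0
print(two_way_swaps(hospital_1 + hospital_2))  # → 1
```

The same logic illustrates the type O asymmetry the article describes: a type O donor in any pair can match almost any patient elsewhere in a pooled network, while a type O patient waits for a type O donor, so larger networks use type O donors less wastefully.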
Well-crafted subsidies and mandates, as he has detailed, can help address this particular problem.</p> <p><strong>Open questions in need of answers</strong></p> <p>Agarwal was an economics and math double major at Brandeis University, where he received his BA in 2008. Directly out of college, Agarwal was accepted into Harvard University’s PhD program in economics, but, as he recounts it, he did not have a clear idea of what he wanted to study. Before long, though, Agarwal connected at Harvard with Alvin Roth, an innovative market-design theorist who would soon be awarded the Nobel Prize, in 2012; Roth’s work helped create new mechanisms for school-choice programs.</p> <p>Working with Roth, as well as Harvard professors Susan Athey (now of Stanford University) and Ariel Pakes, and MIT Professor Parag Pathak, Agarwal began focusing on market-design problems and developing his taste for empiricism. The theorists had broken the field of market design open; as a result, unanswered questions about the activity in many markets had been identified but not necessarily answered.</p> <p>“I’ve always liked combining different ways of learning about something,” Agarwal says. “Initially I was training as a theorist, but then I got interested in data, because I just saw a big set of open questions there, which wasn’t informed by numbers.” Pakes, who Agarwal cites as a major influence, “showed me what data, especially when combined with theory, can teach us.”</p> <p>Agarwal joined the MIT faculty in 2014 and began publishing papers on a range of topics, on a variety of markets. 
He has studied online advertising and school-choice systems; one of his first prominent papers, in the <em>American Economic Review</em> in 2015, <a href="">examined</a> the system used to allocate medical students to residencies.</p> <p>Still, the majority of Agarwal’s work has been on kidney transplants specifically, a field of knowledge he has gradually built up.</p> <p>“You need to have domain expertise,” Agarwal says. “It’s very important to have that. Otherwise [theories] may not be directly implementable. For that reason, people really do specialize, so they understand the setting.” One of Agarwal’s co-authors is a kidney transplant surgeon.</p> <p>“I’ve learned a lot from other people,” Agarwal notes.</p> <p>He has also benefitted, as he tells it, from his home in the MIT Department of Economics, where all kinds of work are valued — even work on nonpriced markets, which, as Agarwal quips, can seem like “kind of a weird thing to study,” at least to outsiders.</p> <p>“The economics department is an intellectually amazing place to think about things,” Agarwal adds. “People value good work on the merits and they’re open-minded.”</p> <p>Now Agarwal is also encouraging others to research markets of all kinds: His students are studying topics as diverse as electricity markets, the palm oil industry in Indonesia, and water markets in Australia, among many others.
Every such market, he notes, can differ from others, in its practices and in the behavior of its participants.</p> <p>“We have to think a little more carefully about how markets work and demand meets supply, and what are all the implications of that,” Agarwal says.</p> <p>After all, as Agarwal has already seen, a little more careful thought about markets could have a lot more real-world impact.</p> Nikhil AgarwalImage: Jared CharneyEconomics, Medicine, Health care, Profile, Faculty, School of Humanities Arts and Social Sciences J-PAL North America announces five new partnerships with state and local governments New partners will work with J-PAL to develop rigorous evaluations of policies related to criminal justice, health, housing stability, and economic security. Mon, 28 Oct 2019 12:00:01 -0400 J-PAL North America <p>J-PAL North America, a research center in MIT’s Department of Economics, announced new partnerships with five state and local governments across the United States.</p> <p>The California Department of State Hospitals, Minnesota Board of Pharmacy, Minnesota State Court Administrator’s Office, Shasta County Superior Court, and Virginia Department of Social Services were selected to partner with J-PAL North America and its network of leading academic researchers through the<a href=""> J-PAL State and Local Innovation Initiative</a>. These partnerships will develop randomized evaluations, also known as randomized controlled trials (RCTs), which have the potential to yield rigorous evidence about which programs and policies are most effective.&nbsp;</p> <p>“We are thrilled to partner with these five state and local governments to address pressing policy problems across the country through rigorous evaluation,” says Mary Ann Bates, J-PAL North America executive director and co-chair of the State and Local Innovation Initiative. 
“We are particularly excited about how many of these projects will build on evidence from prior randomized evaluations to test if similar interventions can be effective in different settings.”</p> <p>These proposals will examine a wide variety of topics and intervention methods, including reducing over-prescription of opioids, reducing failures to appear for arraignment, addressing homelessness and housing instability, and increasing the take-up of federal tax credits for low-income families.</p> <p>In California, individuals with serious mental illness who face felony charges and are likely to be found incompetent to stand trial are directed to the Department of State Hospitals (DSH) to receive treatments to regain competency. DSH has launched a new Pre-trial Felony Mental Health Diversion program, through which approximately 20 counties will receive funding to divert individuals who are incompetent to stand trial out of the criminal justice system and into wrap-around community treatment services. DSH is partnering with J-PAL North America to evaluate the effects of the Pre-Trial Felony Mental Health Diversion program. This evaluation will contribute significant insight into how diverting those found to be incompetent to stand trial due to mental health reasons to community mental health services may impact their individual well-being, as well as broader outcomes in the criminal justice and behavioral health systems.</p> <p>“The California Department of State Hospitals is pleased to partner with J-PAL to evaluate how Felony Mental Health Diversion impacts individuals living with serious mental illness in California,” says Stephanie Clendenin, director of the California Department of State Hospitals. 
“The Felony Mental Health Diversion program seeks to provide long-term community mental health treatment and other services for individuals with serious mental illness so that they avoid criminalization and institutionalization and receive the critical mental health care and supportive services they need.”</p> <p>While the opioid prescribing rate among physicians has declined in recent years, the number of opioids prescribed per person remains three times higher than in 1999. Many states are testing interventions to reduce the overprescription of opioids among physicians. The Minnesota Board of Pharmacy and Minnesota Management and Budget will partner with J-PAL North America to identify ways to increase the use of Minnesota’s prescription monitoring program (PMP) and measure the impact of that increased use on prescribers’ rate of controlled-substance prescriptions. The PMP database maintains a secure record of all controlled-substance prescriptions, and more frequent use of the database may help avoid prescribing to individuals misusing opioids and, instead, allow prescribers to make referrals to treatment services.</p> <p>"We are excited about this innovative partnership that will help Minnesota use data to increase the use of Prescription Monitoring Programs and reduce overprescribing of opioids,” says Myron Frans, commissioner of Minnesota Management and Budget. “The opioid crisis has caused tremendous damage to our families and communities. 
To achieve better results, we will continue to collaborate and use evidence-based governing principles to combat this crisis.”</p> <p>“The Minnesota Board of Pharmacy looks forward to working with Minnesota Management and Budget and J-PAL to evaluate our PMP, with the aim to increase use of the PMP and to analyze its impact on controlled substance prescription rates,” says Cody Wiberg, executive director of the Minnesota Board of Pharmacy.</p> <p>In 2017, Minnesota’s largest county piloted a text reminder program for individuals with court hearings; after promising results suggested the reminders increased court appearances, the Minnesota Judicial Branch decided to make these e-reminders available statewide. The Minnesota State Court Administrator’s Office is now working with Minnesota Management and Budget and J-PAL North America to test the content and timing of different messages to determine which behavioral strategies are most effective in reducing failures to appear for court hearings. Previous <a href="">research</a> suggests that behavioral nudges, like text messages and redesigned summonses, can reduce failures to appear for criminal hearings. This evaluation will measure the effectiveness of different message content and whether these reminders can also be effective for increasing court appearances among tenants facing eviction proceedings.</p> <p>“Hearing eReminders are about keeping the justice system, and those within it, from incurring additional costs. We are focused on keeping the court process moving efficiently, because we know that justice delayed is justice denied,” says Minnesota State Court Administrator Jeff Shorba. “We have already seen from the 18-month pilot that parties who received some form of Hearing eReminder were 35 percent more likely to appear for their hearing.
Rigorous evaluation of Hearing eReminder messages and sequencing will ensure we are delivering the most effective messages on the most impactful schedule.”</p> <p>Similarly, in Shasta County, California, individuals who commit low-level offenses receive court summonses that require them to appear in court. Failure to appear in court results in the issuance of an arrest warrant, which is costly for the criminal justice system and recipients. The Shasta County Superior Court will partner with J-PAL North America to evaluate behavioral interventions to reduce defendants’ failure to appear at arraignments. This evaluation will expand the previous literature on reducing failures to appear and provide insight into the effectiveness of these interventions in a different setting.</p> <p>“The Shasta County Superior Court is excited to join with J-PAL to analyze strategies that could help reduce the incidence of homeless individuals failing to appear in court,” says Melissa Fowler-Bradley, court executive officer of the Shasta County Superior Court. “In Shasta County, about one-third of the people who fail to appear for their court cases are homeless, and similar statistics exist throughout the country. Unfortunately, a failure to appear has an added impact on those who are economically challenged. The Shasta County Superior Court’s goal is to reduce those failures to appear among the homeless population as much as possible using strategies that are proven effective through the rigorous evaluation made possible by J-PAL. A successful project created under this state and local initiative could have nationwide impact, improve the plight of the impoverished, and increase the efficiency of the criminal justice system.”</p> <p>Millions of dollars of the Earned Income Tax Credit (EITC), a federal tax credit for low-income households, go unclaimed every year.
Previous <a href="">research</a> suggests that behavioral interventions, like messages and simplified materials, can increase the uptake of the EITC. The Virginia Department of Social Services will partner with J-PAL North America to develop an evaluation of a text-messaging intervention to generate higher rates of tax filing and EITC claims. This evaluation will add to the growing body of literature on behavioral interventions to increase EITC claims.</p> <p>“This new partnership with J-PAL is an opportunity to transform how we pursue our mission of triumphing over poverty, abuse, and neglect,” says Duke Storen, commissioner of the Virginia Department of Social Services. “We will gain valuable insight from the RCT about how to more effectively encourage eligible Virginians to claim the EITC, which has been found to be one of the most effective federal antipoverty programs. We anticipate the results of this work will not only improve how we communicate with our customers as it relates to the EITC and the full spectrum of our programs and services, but will prove to benefit other states as well.”&nbsp;</p> <p>The California Department of State Hospitals, Minnesota Board of Pharmacy, Minnesota State Court Administrator’s Office, Shasta County Superior Court, and Virginia Department of Social Services join <a href="">13 state and local governments selected through previous rounds</a> of the J-PAL State and Local Innovation Initiative: Baltimore, Maryland; King County, Washington; Minneapolis, Minnesota; Philadelphia, Pennsylvania; Rochester, New York; Santa Clara, California; and Washington, D.C.; the states of California, Washington, Massachusetts, Pennsylvania, New Mexico, and South Carolina; and the U.S. territory of Puerto Rico.
These state and local governments are part of a growing movement to use evidence to improve the effectiveness of policies and programs and ultimately the lives of people experiencing poverty.</p> <p>Anyone wishing to&nbsp;learn more about the initiative or to receive updates about its progress is invited to&nbsp;<a href="">visit online</a>.&nbsp;The J-PAL contact for&nbsp;more information is Initiative Manager&nbsp;<a href="">Rohit Naimpally</a>.</p> The State of Minnesota is one of J-PAL North America's five new partners across the U.S. The Minnesota Board of Pharmacy and Minnesota Management and Budget will look to identify ways to increase the use of Minnesota’s prescription monitoring program. The Minnesota State Court Administrator’s Office and Minnesota Management and Budget will seek to determine which behavioral strategies are most effective in reducing failures to appear for court hearings.Abdul Latif Jameel Poverty Action Lab (J-PAL), Economics, School of Humanities Arts and Social Sciences, Policy, Poverty, Government, Funding, Research J-PAL North America announces first partners through Work of the Future Innovation Competition Partners will work with J-PAL North America to develop randomized evaluations addressing today’s rapidly shifting labor market. Fri, 25 Oct 2019 10:00:00 -0400 J-PAL North America <p>J-PAL North America, a research center at MIT, will partner with four organizations to test promising models to help workers navigate the shifting labor market as part of the center’s inaugural <a href="" target="_blank">Work of the Future Innovation Competition</a>.&nbsp;</p> <p>Currently in its first year, J-PAL North America’s <a href="" target="_blank">Work of the Future Initiative</a> provides direct support to organizations and agencies interested in evaluating programs or policies related to the changing nature of work in North America. 
In the coming year, J-PAL North America will partner with the <a href="">Center for Work Education and Employment (CWEE)</a>, <a href="">Checkr</a>, the <a href="">City of Los Angeles Mayor's Innovation Team</a>, and the <a href="">Montana Department of Labor and Industry</a> to develop and support rigorous evaluations of programs seeking to improve outcomes for workers. These programs aim to reduce barriers to employment, support workers as they navigate the complex job market, and bolster jobseekers’ abilities to secure and retain quality work.&nbsp;</p> <p>"It's exciting to see so many promising proposals in just the first year of the Work of the Future Initiative," says Lawrence Katz, co-scientific director of J-PAL North America and academic advisor to the Work of the Future Initiative. "We're hopeful that the initiative can continue to generate this level of enthusiasm as it seeks to develop promising programs and identify effective methods to help workers navigate the shifting labor market."</p> <p>CWEE is a workforce development agency based in Denver, Colorado, that provides low-income parents, the majority of whom are Temporary Assistance for Needy Families recipients, with complete career readiness and retention skills. CWEE will partner with J-PAL North America to develop an evaluation of the impact of its intensive case management and career readiness program on employment outcomes.&nbsp;</p> <p>“CWEE has been supporting low-income families in the Denver metro community for almost four decades, so it feels incumbent upon us to take a deeper dive and better understand how the programs and support provided contribute to economic mobility,” says Katy Hamilton, CEO of CWEE.
“It’s rare for a nonprofit of our size to have an opportunity like this to learn from the foremost thinkers in the space of academic assessments and social programs.”</p> <p>Checkr is a background-check company focused on making hiring more inclusive and efficient. J-PAL North America will partner with Checkr to evaluate if the implementation of the Positive Adjudication Matrix (PAM) can reduce bias in the background-check and hiring process. PAM allows employers to deem certain types of offenses irrelevant to the roles for which they’re hiring. Companies can then choose to either filter out or de-emphasize these criminal records.&nbsp;</p> <p>“We believe advanced technology plays a critical role in reducing hiring bias,” says Checkr Co-Founder and CEO Daniel Yanisse. “With rigorous research, we can better develop products to ensure a fairer, more inclusive hiring process.”</p> <p>The Mayor’s Innovation Team (i-team) works with Los Angeles, California, city departments to closely examine complex challenges and discover innovative solutions that can improve the quality of life in LA. The LA i-team will partner with J-PAL North America to design an evaluation of a program that helps job seekers better access employment services at the city's WorkSource Centers. The i-team seeks to apply behavioral science techniques and use technology to support job seekers, increase usage of the centers’ resources, and improve job placements that create opportunities for Angelenos and lift families out of poverty.</p> <p>“Our mission is simple: We want to help people get access to resources that can change their lives,” says Los Angeles Mayor Eric Garcetti. 
“This partnership will help us expand that work — so that more Angelenos have opportunities to find a career, support their families, contribute to the economy, and strengthen our communities.”&nbsp;</p> <p>The Montana Department of Labor and Industry (MTDLI) is a government agency that promotes the well-being of Montana’s workers, employers, and citizens. MTDLI will partner with J-PAL North America to evaluate the effectiveness of Reemployment Services and Eligibility Assessments, a national program for individuals claiming unemployment insurance (UI) who have been identified as likely to exhaust their UI benefits.</p> <p>“All organizations must set priorities and find the most effective solutions using limited resources, and MTDLI is no different. MTDLI strives to provide top-notch services to Montana’s workforce in cost-effective and successful ways,” says Barbara Wagner, chief economist at MTDLI. “Using data and research to drive decision-making helps us focus our resources on the best solutions, therefore allowing us to have a greater impact on our workers, businesses, and economy.”</p> <p>In working with these four organizations, J-PAL North America looks to support the development of randomized evaluations of strategies and innovations that address the changing nature of work in North America. Partnerships with the Center for Work Education and Employment, Checkr, the City of Los Angeles, and the Montana Department of Labor and Industry will help J-PAL North America generate solutions that increase opportunity, reduce disparities, and help all workers navigate shifts in the labor market.</p> <p>J-PAL North America is a regional office of the Abdul Latif Jameel Poverty Action Lab. J-PAL was established in 2003 as a research center at MIT’s Department of Economics.
Since then, it has built a global network of affiliated professors based at over 58 universities and regional offices in Africa, Europe, Latin America and the Caribbean, North America, South Asia, and Southeast Asia. J-PAL North America was established with support from the Alfred P. Sloan Foundation and Arnold Ventures and works to improve the effectiveness of social programs in North America through three core activities: research, policy outreach, and capacity building. The Work of the Future Initiative was launched with support from Arnold Ventures, the Bill and Melinda Gates Foundation, and the Gerri and Rich Wong Family.</p> Two career seekers work in the Center for Work Education and Employment’s (CWEE) computer lab with a digital literacy instructor. CWEE is one of four organizations to partner with the Work of the Future Initiative to design an evaluation of their program. Photo: Center for Work Education and EmploymentAbdul Latif Jameel Poverty Action Lab (J-PAL), Economics, Technology and society, Jobs, School of Humanities Arts and Social Sciences Meet the 2019 tenured professors in the School of Humanities, Arts, and Social Sciences SHASS faculty members Nikhil Agarwal, Sana Aiyar, Stephanie Frampton, Daniel Hidalgo, and Miriam Schoenfield were recently granted tenure. Tue, 22 Oct 2019 15:30:01 -0400 School of Humanities, Arts, and Social Sciences <p>Dean Melissa Nobles and the School of Humanities, Arts, and Social Sciences (SHASS) announced that five members of the school's faculty have received tenure. Their extensive research and writing investigate a wide variety of topics, from&nbsp;the history of Western thought to electoral behavior in low-income areas. They are:</p> <p><a href="" target="_blank">Nikhil Agarwal</a>, associate professor of economics, joined the MIT faculty in 2014 after earning his PhD at Harvard University and teaching economic policy at Stanford University.
He has received grants from the National Institutes of Health and a Sloan Research Fellowship. He teaches Microeconomic Theory and Public Policy (Course 14.03), and courses on industrial organization.</p> <p><a href="" target="_blank">Sana Aiyar</a>, associate professor of history, is a specialist in the history of modern South Asia. She is the author of "Indians in Kenya: The Politics of Diaspora" and her research focuses on colonial and postcolonial politics and society in the Indian Ocean region. She formerly taught at the University of Wisconsin at Madison.</p> <p><a href="" target="_blank">Stephanie Frampton</a>, associate professor of literature, is a classicist, comparatist, historian of media in antiquity, and the author of "Empire of Letters." She joined the MIT faculty in fall 2012 after teaching at Harvard University and the College of the Holy Cross.</p> <p><a href="" target="_blank">F. Daniel Hidalgo</a>, the Cecil and Ida Green Associate Professor of Political Science, focuses on the political economy of elections, campaigns, and representation in developing democracies, especially in Latin America, as well as quantitative methods in the social sciences.<br /> <br /> <a href="" target="_blank">Miriam Schoenfield PhD '12</a>, associate professor of philosophy, returned to MIT in 2017 after holding teaching positions at the University of Texas at Austin and at New York University. Her primary research interests are in epistemology, along with ethics and normativity more broadly.</p> Newly tenured faculty in the School of Humanities, Arts, and Social Sciences: (clockwise from top left) Nikhil Agarwal, Sana Aiyar, Miriam Schoenfield, F. Daniel Hidalgo, and Stephanie Frampton.School of Humanities Arts and Social Sciences, Faculty, Economics, History, Literature, Political science, Philosophy, Awards, honors and fellowships Economist Stanley Fischer calls for autonomy in central banking In MIT talk, the former vice chair of the U.S.
Federal Reserve reflects on his career as a policy leader. Wed, 16 Oct 2019 14:05:03 -0400 Peter Dizikes | MIT News Office <p>Former U.S. Federal Reserve vice chair Stanley Fischer PhD ’69 emphasized the importance of independence in central banking, while outlining key aspects of his own career as a policy leader, in an MIT lecture on Sept. 30.</p> <p>“Should a central bank be independent? The answer is yes,” Fischer said, emphasizing the need for policymakers to have maximum flexibility to determine interest rates while grappling with complex economic situations.</p> <p>Specifically regarding the U.S., Fischer noted, “We are the country with the highest interest rate in the G7, because our economy is in the best shape.” For that reason, he observed, the U.S. has the most leverage to address future economic slowdowns — but would still need to be judicious about it.</p> <p>“We need to be careful not to let the system degenerate” and head too quickly toward a zero interest rate, Fischer said, which would then likely limit room for the Federal Reserve to spur the economy by lowering rates at a future point when it might be more useful.</p> <p>Fischer noted that current uncertainty surrounding U.S. economic conditions is considerable. Fears of a recession have lessened in the last two months, he said, but the gains of recent years were not automatically going to continue.&nbsp;</p> <p>“We’re not guaranteed to have a recession, but we’re not guaranteed to not have a recession,” he said.</p> <p>In his remarks, Fischer added that the Fed’s supervisory role in the banking system was vital, and suggested that the 2010 Dodd-Frank financial-sector legislation — which provided additional banking oversight and limited some forms of banking activity — should be fully enforced.</p> <p>“The regulations have been eased back,” Fischer said. 
“I think that’s a mistake.”</p> <p>Fischer was an MIT economics professor from 1976 to 1998 and built an influential career in global monetary policy after leaving the Institute. Besides being vice chair of the U.S. Federal Reserve, from 2014 to 2017, where he worked with then-chair Janet Yellen, Fischer served as governor of the Bank of Israel from 2005 to 2013; first deputy managing director of the International Monetary Fund (IMF) from 1994 to 2001; and chief economist of the World Bank from 1988 to 1990.</p> <p>Fischer is a native of Zambia, who attended school in multiple countries before working across the world professionally. Still, he told the audience, when people ask him what he considers to be his home, “I say, and I mean it, MIT.”</p> <p>Fischer’s talk was delivered to a standing-room-only audience of over 125 people in MIT’s lecture hall 1-190. The event was jointly sponsored by MIT’s Undergraduate Economics Association and the Finance and Policy Club of the MIT Sloan School of Management.</p> <p>Fischer was introduced by James A. Poterba, the Mitsui Professor of Economics at MIT, who called Fischer a “remarkably effective policymaker” and “an incredibly thoughtful and informed source of wisdom about how to think through policy challenges.”</p> <p>At MIT, Poterba added, Fischer made his mark “not just as a stellar researcher, but as one of the absolute clearest teachers and most successful mentors of graduate students and undergraduates alike.” Poterba added that Fischer was known for the quality of his lectures in MIT’s course 14.02 (Principles of Macroeconomics): “People used to hang from the rafters just to get into Stan’s 14.02 lectures.”</p> <p>Fischer was also the principal PhD thesis adviser of Ben Bernanke PhD ’79, chair of the U.S. Federal Reserve from 2006 to 2014.</p> <p>In his remarks, Fischer also talked about gender issues in central banking. 
He noted that Yellen, whom he called an “excellent economist,” would prepare intensively for four or five days ahead of public Fed meetings. After a while, he suggested to Yellen that her performance would be equally good with less prep time, noting her strong record of the last two years. However, Yellen told him, “I’ve always done that. I’ve always prepared absolutely fully.” In part, Fischer suggested, Yellen thought the attention she might draw for a public misstatement, as the first woman to chair the Fed, would be considerable.</p> <p>Fischer later raised the subject with Christine Lagarde, head of the International Monetary Fund, who will become president of the European Central Bank in November and who had a similar perspective. As Fischer recalled, Lagarde told him, “You men simply don’t understand the pressure that is on women in the public sector. We know if we make a mistake, we will be fried. Whereas if a man makes a mistake, no one gets very excited.”</p> <p>Fischer also talked in some detail about his work as governor of the Bank of Israel — equivalent to the role of Fed chair — where he introduced a monetary policy committee, among other reforms intended to diffuse the governor’s power.</p> <p>The idea, Fischer said, “was to precisely change the model of the single decision-maker.” By intentionally giving himself less power, he added, jokingly, “I was very idealistic, or stupid, or both.” But he felt the change would align Israel with the practice of allowing more voices into the process of setting rates — where a lot of information must be processed and multiple interpretations of data can arise, making extensive discussion useful.</p> <p>Expanding the Bank of Israel’s administration required some additional investment, Fischer noted, drawing laughs by observing, “What you can’t say as a central banker is, ‘We don’t have the money.’ [In fact,] you have all the money you can print.”</p> <p>In Israel, Fischer faced unusual economic conditions: Israel
made it through the financial crisis relatively unscathed but faced a resulting inflow of global capital and had to work to keep economic conditions relatively stable. He assessed his own performance in the job as “pretty good.”</p> <p>Fischer said he thought Yellen’s Fed had been “very successful” at its postcrash efforts at normalization, and noted that its leaders, including himself, spent a significant amount of time examining the prospect of interest rates hitting the “Zero Lower Bound,” beyond which they would become negative. Fischer said he was “stunned” there was not more general public discussion about that issue at the moment.</p> <p>Noting that he had taken on the governorship of the Bank of Israel with a list of 15 policy goals to accomplish, Fischer also offered some career advice to the audience members, most of whom were MIT students: “If you take a job, it’s a good idea to decide what you want to do there.”</p> Stanley FischerImage: Zach WinnSchool of Humanities Arts and Social Sciences, Economics, Students, Undergraduate, Special events and guest speakers, Policy, Alumni/ae, Global MIT economists Esther Duflo and Abhijit Banerjee win Nobel Prize Professors share prize with Michael Kremer of Harvard University, are cited for breakthrough antipoverty work. Mon, 14 Oct 2019 06:09:00 -0400 Peter Dizikes | MIT News Office <p>Esther Duflo and Abhijit Banerjee, MIT economists whose work has helped transform antipoverty research and relief efforts, have been named co-winners of the 2019 Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel, along with Harvard University economist Michael Kremer.&nbsp;</p> <p>“We are incredibly happy and humbled,” Duflo told <em>MIT News</em> after learning of the award.
“We feel very fortunate to see this kind of work being recognized.”</p> <p>Banerjee told <em>MIT News</em> it was “wonderful” to receive the award, adding “you don’t get this lucky many times in your life.”</p> <p>The work of Duflo and Banerjee, which has long been intertwined with Kremer’s, has been highly innovative in the area of development economics, emphasizing the use of field experiments in research in order to realize the benefits of laboratory-style randomized, controlled trials. Duflo and Banerjee have applied this new precision while studying a wide range of topics implicated in global poverty, including health care, education, agriculture, and gender issues, and have developed new antipoverty programs based on their research.</p> <p>Duflo and Banerjee also co-founded MIT’s Abdul Latif Jameel Poverty Action Lab (J-PAL) in 2003, along with a third co-founder, Sendhil Mullainathan, now of the University of Chicago. J-PAL, a global network of antipoverty researchers that conducts field experiments, has now become a major center of research, facilitating work across the world.</p> <p>J-PAL also examines which kinds of local interventions have the greatest impact on social problems, and works to implement those programs more broadly, in cooperation with governments and NGOs.
Among J-PAL’s notable interventions are deworming programs that have been adopted widely.</p> <p>In the statement released this morning, the Royal Swedish Academy of Sciences, which grants the Nobel awards, noted that the work of Duflo, Banerjee, and Kremer has “dramatically improved our ability to fight poverty in practice” and cited their “new approach to obtaining reliable answers about the best ways to fight global poverty.”</p> <div class="cms-placeholder-content-video"></div> <p><strong>“A collective effort”</strong></p> <p>Duflo, 46, is the second woman and the youngest person ever to receive the Nobel in economic sciences.</p> <p>“We’re fortunate to see this kind of work being recognized,” Duflo told <em>MIT News</em>, noting that their work was “born at MIT and supported by MIT.” She called the work in this area a “collective effort” and said that “we could not have created a movement without hundreds of researchers and staff members.” The Nobel award, she said, also represented this collective enterprise, and was “larger than our work.”</p> <p>Banerjee, 58, noted that experiment-based work in development economics was a little-explored area of research 20 years ago but has grown significantly since then.</p> <p>“The kind of work we’ve done over the years, when we started, was marginal in economics,” Banerjee said. In that light, he added, the Nobel award is “great for the development field” within economics, reflecting the significance of work done by many of his colleagues.&nbsp;</p> <p>Duflo added that she and Banerjee were “absolutely delighted to share this award with Michael Kremer,” calling his work an “inspiration” for antipoverty researchers. Kremer is a former MIT faculty member and postdoc who served at the Institute from 1992 to 1999, and remains an affiliated professor with J-PAL; he is currently the Gates Professor of Developing Societies at Harvard University.
The three award-winners have known each other since the mid-1990s and have long viewed their research efforts as being intellectually aligned. The Nobel statement also cited Kremer’s research on education in Kenya as a key launching point for the new experimental method.</p> <p>While J-PAL researchers conduct experiments globally, Duflo and Banerjee have situated much of their own research in Africa and India. They have studied a wide range of issues implicated in global poverty, producing significant results over time. In one widely noted experiment, Duflo and Banerjee found that immunization rates for children in rural India jump dramatically (from 5 percent to 39 percent) when their families are offered modest incentives for immunization, such as lentils.</p> <p>They have also studied educational issues extensively, often with additional co-authors, uncovering new results about improvements in student achievement (when classes are divided into small groups) and ways to improve teacher attendance. But the range of topics Duflo and Banerjee have studied is immense, and includes fertilizer use by Kenyan farmers, physician training in India, HIV prevention in Africa, the effects of small-scale lending programs, and the impact of aid programs in Indonesia, among many other studies.</p> <p>In one study conducted on three continents, Duflo and Banerjee also reported significant welfare gains from an intervention that helps the poor simultaneously in multiple ways, including job training, productive assets, and health information.</p> <p>Duflo and Banerjee have published dozens of research papers, together and with other co-authors. They have also co-written two books together, “Poor Economics” (2011) and the forthcoming “Good Economics for Hard Times” (2019).</p> <p>A significant part of J-PAL’s mission is to scale up successful experiments that can be applied more widely in society. 
When Kremer and economist Edward Miguel demonstrated the immense value of deworming children in the developing world, J-PAL helped start Deworm the World, a nonprofit that has treated millions of children in Africa.</p> <p><strong>Scholarship and impact</strong></p> <p>At a press conference for Duflo and Banerjee held today in MIT’s Building E51, MIT President L. Rafael Reif introduced the two economists, praising their scholarship and the impact of their work.</p> <p>“By providing an experimental basis for development economics, professors Banerjee and Duflo have reimagined their field and profoundly changed how governments and agencies around the world intervene to help people beat poverty,” Reif said. “In doing so, they provide a proud reminder of MIT’s commitment to bringing knowledge to bear on the world’s great challenges.” He added: “We’re deeply proud of our newest Nobel laureates and the entire economics department.”</p> <p>After an extended round of applause from students, faculty, and administrators at the start of the press conference, Banerjee joked, “It feels like I wandered onto the set of the wrong movie.”</p> <p>Speaking to <em>MIT News</em>, Nancy Rose, the economics department head and the Charles P. Kindleberger Professor of Applied Economics, lauded Duflo and Banerjee’s scholarship and mentorship, as well as their extensive efforts to turn their findings into real-world policy.</p> <p>“Esther and Abhijit have been exceptional colleagues and contributors to the MIT economics department,” said Rose. 
“Their passion for the power of economics to do good in the world inspires us all, and their generosity and compassion in working with students and colleagues have propelled countless careers forward.&nbsp;We couldn’t be more thrilled for this recognition of all they have done.”<br /> <br /> Rose added that “Abhijit, Esther, and Michael’s work shows economic research at its finest.&nbsp;They have not only transformed the way economists approach the study of poverty and development economics, but deployed their findings to improve the lives of hundreds of millions of people across the globe.&nbsp;Their founding&nbsp;of MIT’s J-PAL has created a vibrant network of scholars who are bringing evidence-based antipoverty policy into every corner of the world.”</p> <p>Melissa Nobles, the Kenan Sahin Dean of MIT’s School of Humanities, Arts, and Social Sciences, praised the ethical foundations guiding the work of Duflo and Banerjee.</p> <p>“The significance of Abhijit’s and Esther’s scholarship is not only that it has transformed the ways in which economists and policymakers think about and approach poverty alleviation, but that, at the core, their research is guided by deeply humanistic values,” Nobles said. “In their vision, the materially poor are at the center, as are remedies for global poverty that actually work, that open doors for millions to education, health care, economic well-being, and safe communities — to the full promise of human life.”</p> <p>Duflo received her undergraduate degree from the École Normale Supérieure in Paris in 1994, after studying both history and economics. She earned a master’s degree in economics the next year, jointly through the École Normale Supérieure and the École Polytechnique. Duflo then earned her PhD in economics from MIT in 1999. She joined the MIT faculty the same year, and has remained at MIT her entire career.&nbsp;</p> <p>Currently Duflo is the Abdul Latif Jameel Professor of Poverty Alleviation and Development Economics at MIT. 
Banerjee is the Ford International Professor of Economics at MIT.</p> <p>Duflo has previously earned a series of awards and honors, including a MacArthur Foundation fellowship (2009), the BBVA Foundation Frontiers of Knowledge Award for Development Cooperation (2009), and the John Bates Clark Medal from the American Economic Association (2010).</p> <p>Duflo has also helped create an MITx MicroMasters program in <a href="">Data, Economics, and Development Policy</a>, which the Institute launched in 2016.&nbsp;</p> <p>In her remarks at the press conference, Duflo thanked a variety of people instrumental in the development of J-PAL, including Bengt Holmström, the 2016 Nobel laureate in economics, who encouraged Duflo and Banerjee to pursue the idea when he was department chair; former MIT president Susan Hockfield; Mohammed Abdul Latif Jameel, the foundational supporter of the organization; and Rachel Glennerster, the long-time executive director of J-PAL (who is currently on leave and working as chief economist of Great Britain’s Department for International Development).</p> <p>Duflo also thanked her students, as well as another of her graduate advisors, MIT professor Joshua Angrist, a long-time advocate of using rigorous empirical methods in the social sciences.</p> <p>Asked at today’s press conference about the significance of being only the second woman to win the Nobel Prize for Economic Sciences, Duflo said she strongly wants to encourage other women to enter the discipline.</p> <p>“There are not enough women in the economics profession at all levels,” Duflo said. 
“That has to change.” The issue, she noted, “is something the profession is starting to reckon with.” Banerjee, for his part, observed that development economics has a higher percentage of female scholars than other subfields within the discipline, and he agreed that women should be encouraged to become scholars in economics.</p> <p>Banerjee received his undergraduate degree from the University of Calcutta, and a master’s degree from Jawaharlal Nehru University in New Delhi. He earned his PhD in Economics from Harvard University in 1988. He spent four years on the faculty at Princeton University, and one year at Harvard, before joining the MIT faculty in 1993.</p> <p>Among other honors and awards, Banerjee was elected a fellow of the American Academy of Arts and Sciences in 2004, and was granted the BBVA Foundation Frontiers of Knowledge Award for Development Cooperation in 2009.</p> <p>Duflo and Banerjee are the sixth and seventh people to win&nbsp;the award while serving as MIT faculty members, following Paul Samuelson (1970), Franco Modigliani (1985), Robert Solow (1987), Peter Diamond (2010), and Bengt Holmström (2016). There are now 12 MIT alumni, including Duflo, who have won the Nobel in economics; eight former faculty have also won the award.</p> MIT economists Abhijit Banerjee and Esther Duflo stand outside their home after learning that they have been named co-winners of the 2019 Nobel Prize in economic sciences. They will share the prize with Michael Kremer of Harvard University.Photo: Bryce VickmarkSchool of Humanities Arts and Social Sciences, Economics, Awards, honors and fellowships, Faculty, Nobel Prizes, Social sciences, Developing countries, Abdul Latif Jameel Poverty Action Lab (J-PAL), Poverty, International development, Office of Open Learning, EdX, MITx, India, Africa Greener and fairer: Balancing pollution, energy prices, and household income New research looks at how environmental taxes can work for everyone, in Spain and beyond. 
Wed, 25 Sep 2019 12:00:02 -0400 Mark Dwortzan | Joint Program on the Science and Policy of Global Change <p>Governments that impose taxes on carbon dioxide and other greenhouse gas emissions can benefit from a cleaner, more climate-friendly environment and a revenue stream that can be tapped to lower other taxes and create jobs. But environmental taxes may also exact an excessive financial burden on low-income households, which spend a much greater fraction of their budgets than richer households do on heating oil, natural gas, and electricity. This concern has limited the use of green taxes in Spain, where emissions are taxed at levels far below average for the European Union, which seeks to lower emissions across the continent to fulfill its 2015 Paris Agreement climate pledge.</p> <p>Now a new <a href=";id=283">study</a> by researchers at the MIT Joint Program on the Science and Policy of Global Change, the University of Oldenburg in Germany, and the Basque Center for Climate Change in Spain shows that low-income households in Spain can actually benefit from environmental taxes if revenues are redistributed to all taxpayers. Using a computational model to assess the environmental and economic impacts of a green tax reform policy in which revenues are recycled in equal amounts to households in annual lump-sum payments, the researchers found that the policy significantly reduces emissions without imposing economic hardship on any segment of the population. The study appears in the journal <em>Economics of Energy and Environmental Policy.</em></p> <p>“There may be a tradeoff between efficiency and equity in climate policy design,” says <a href="">Xaquin Garcia-Muros</a>, a co-author of the study and postdoctoral associate at the MIT Joint Program. 
Noting that the perfect can be the enemy of the good, as indicated by the <a href="">November 2018 Yellow Vest protests</a> against fuel tax hikes in France, he adds, “Governments that seek to introduce environmental policies need to show they can cut emissions equitably in order for the public to support them. Otherwise, climate mitigation measures will be rejected by public opinion, and attempts to tackle climate change will be unsuccessful.”</p> <p>The proposed policy includes a tax on carbon dioxide (CO<sub>2</sub>) of 40 euros per metric ton in all sectors (except transportation) not covered by the EU emissions trading system, tax increases on fossil fuels to match the EU average of 1.5 percent of GDP, and economy-wide taxes on air pollutant emissions — nitrogen oxides (NOx) and sulfur dioxide (SO<sub>2</sub>) — at 1,000 euros per metric ton. In addition, it provides annual lump-sum rebates to private households based on household income.</p> <p>Combining a “computable general equilibrium” model of the Spanish economy with a “micro-simulation” sub-model that characterizes households of different income levels, the researchers determined the tax reform policy’s impact on pollution levels, energy prices, and household net income. They found that the policy would significantly reduce emissions of CO<sub>2</sub> (10 percent), NOx (13 percent), and SO<sub>2</sub> (20 percent); produce an estimated 7.3 billion euros in annual revenues; and enable annual lump-sum rebates of 400 euros. Most importantly, the rebates would offset the cost of the green taxes for the bottom half of income levels, with the poorest households receiving an average annual net benefit of 203 euros and the richest paying a net cost of 599 euros.</p> <p>“We expect similar results in other southern European and public transit-oriented countries,” says Garcia-Muros. 
“But while results will differ for each country, all can benefit by ensuring that green tax policies accommodate economic inequality.” An earlier MIT Joint Program <a href="">study</a> showed how this principle can be applied in the design of carbon pricing policies in the United States.</p> Traffic in Madrid, SpainJoint Program on the Science and Policy of Global Change, Climate change, Greenhouse gases, Emissions, Environment, Energy, Economics, Policy, Carbon dioxide, Research, Sustainability Computing and artificial intelligence: Humanistic perspectives from MIT How the humanities, arts, and social science fields can help shape the MIT Schwarzman College of Computing — and benefit from advanced computing. Tue, 24 Sep 2019 00:00:00 -0400 School of Humanities, Arts, and Social Sciences <p><em>The MIT Stephen A. Schwarzman College of Computing </em><em>(SCC) </em><em>will reorient the Institute to bring the power of computing and artificial intelligence to all fields at MIT, and to allow the future of computing and AI to be shaped by all MIT disciplines.</em></p> <p><em>To support ongoing planning for the new college, Dean Melissa Nobles invited faculty from all 14 of MIT’s humanistic disciplines in the School of Humanities, Arts, and Social Sciences to respond to two questions:&nbsp;&nbsp; </em></p> <p><em>1) What domain knowledge, perspectives, and methods from your field should be integrated into the new MIT Schwarzman College of Computing, and why? 
</em><br /> <br /> <em>2) What are some of the meaningful opportunities that advanced computing makes possible in your field?&nbsp; </em></p> <p><em>As Nobles says in her foreword to the series, “Together, the following responses to these two questions offer something of a guidebook to the myriad, productive ways that technical, humanistic, and scientific fields can join forces at MIT, and elsewhere, to further human and planetary well-being.” </em></p> <p><em>The following excerpts highlight faculty responses, with links to full commentaries. The excerpts are sequenced by fields in the following order: the humanities, arts, and social sciences. </em></p> <p><strong>Foreword by Melissa Nobles, professor of political science and the Kenan Sahin Dean of the MIT School of Humanities, Arts, and Social Sciences </strong></p> <p>“The advent of artificial intelligence presents our species with an historic opportunity — disguised as an existential challenge: Can we stay human in the age of AI?&nbsp; In fact, can we grow in humanity, can we shape a more humane, more just, and sustainable world? 
With a sense of promise and urgency, we are embarked at MIT on an accelerated effort to more fully integrate the technical and humanistic forms of discovery in our curriculum and research, and in our habits of mind and action.” <a href="" target="_blank">Read more &gt;&gt;</a></p> <p><strong>Comparative Media Studies: William Uricchio, professor of comparative media studies</strong></p> <p>“Given our research and practice focus, the CMS perspective can be key for understanding the implications of computation for knowledge and representation, as well as computation’s relationship to the critical process of how knowledge works in culture — the way it is formed, shared, and validated.”</p> <p>Recommended action: “Bring media and computer scholars together to explore issues that require both areas of expertise: text-generating algorithms (that force us to ask what it means to be human); the nature of computational gatekeepers (that compels us to reflect on implicit cultural priorities); and personalized filters and texts (that require us to consider the shape of our own biases).” <a href="" target="_blank">Read more &gt;&gt;</a></p> <p><strong>Global Languages: Emma J. Teng, the T.T. and Wei Fong Chao Professor of Asian Civilizations</strong></p> <p>“Language and culture learning are gateways to international experiences and an important means to develop cross-cultural understanding and sensitivity. Such understanding is essential to addressing the social and ethical implications of the expanding array of technology affecting everyday life across the globe.”</p> <p>Recommended action: “We aim to create a 21st-century language center to provide a convening space for cross-cultural communication, collaboration, action research, and global classrooms. 
We also plan to keep the intimate size and human experience of MIT’s language classes, which only increase in value as technology saturates the world.” <a href="" target="_blank">Read more &gt;&gt;</a></p> <p><strong>History: Jeffrey Ravel, professor of history and head of MIT History </strong></p> <p>“Emerging innovations in computational methods will continue to improve our access to the past and the tools through which we interpret evidence. But the field of history will continue to be served by older methods of scholarship as well; critical thinking by human beings is fundamental to our endeavors in the humanities.”</p> <p>Recommended action: “Call on the nuanced debates in which historians engage about causality to provide a useful frame of reference for considering the issues that will inevitably emerge from new computing technologies. This methodology of the history field is a powerful way to help imagine our way out of today’s existential threats.” <a href="" target="_blank">Read more &gt;&gt;</a></p> <p><strong>Linguistics: Faculty of MIT Linguistics</strong></p> <p>“Perhaps the most obvious opportunities for computational and linguistics research concern the interrelation between specific hypotheses about the formal properties of language and their computational implementation in the form of systems that learn, parse, and produce human language.”</p> <p>Recommended action: “Critically, transformative new tools have come from researchers at institutions where linguists work side-by-side with computational researchers who are able to translate back and forth between computational properties of linguistic grammars and of other systems.” <a href="" target="_blank">Read more &gt;&gt;</a></p> <p><strong>Literature: Shankar Raman, with Mary C. Fuller, professors of literature</strong></p> <p>“In the age of AI, we could invent new tools for reading. 
Making the expert reading skills we teach MIT students even partially available to readers outside the academy would widen access to our materials in profound ways.”</p> <p>Recommended action: “At least three priorities of current literary engagement with the digital should be integrated into the SCC’s research and curriculum: democratization of knowledge; new modes of and possibilities for knowledge production; and critical analysis of the social conditions governing what can be known and who can know it.” <a href="" target="_blank">Read more &gt;&gt;</a></p> <p><strong>Philosophy: Alex Byrne, professor of philosophy and head of MIT Philosophy; and Tamar Schapiro, associate professor of philosophy</strong></p> <p>“Computing and AI pose many ethical problems related to: privacy (e.g., data systems design), discrimination (e.g., bias in machine learning), policing (e.g., surveillance), democracy (e.g., the&nbsp;Facebook-Cambridge Analytica data scandal), remote warfare, intellectual property, political regulation, and corporate responsibility.”</p> <p>Recommended action: “The SCC presents an opportunity for MIT to be an intellectual leader in the ethics of technology. The ethics lab we propose could turn this opportunity into reality.” <a href="" target="_blank">Read more &gt;&gt;</a></p> <p><strong>Science, Technology, and Society: Eden Medina and Dwaipayan Banerjee, associate professors of science, technology, and society</strong></p> <p>“A more global view of computing would demonstrate a broader range of possibilities than one centered on the American experience, while also illuminating how computer systems can reflect and respond to different needs and systems. Such experiences can prove generative for thinking about the future of computing writ large.”</p> <p>Recommended action: “Adopt a global approach to the research and teaching in the SCC, an approach that views the U.S. 
experience as one among many.” <a href="" target="_blank">Read more &gt;&gt;</a></p> <p><strong>Women's and Gender Studies: Ruth Perry, the Ann Friedlaender Professor of Literature; with Sally Haslanger, the Ford Professor of Philosophy, and Elizabeth Wood, professor of history</strong></p> <p>“The SCC presents MIT with a unique opportunity to take a leadership role in addressing some of the most pressing challenges that have emerged from the role computing technologies play in our society — including how these technologies are reinforcing social inequalities.”</p> <p>Recommended action: “Ensure that women’s voices are heard and that coursework and research are designed with a keen awareness of the difference that gender makes. This is the single most powerful way that MIT can address the inequities in the computing fields.” <a href="" target="_blank">Read more &gt;&gt;</a></p> <p><strong>Writing: Tom Levenson, professor of science writing </strong></p> <p>“Computation and its applications in fields that directly affect society cannot be an unexamined good. Professional science and technology writers are a crucial resource for the mission of the new college of computing, and they need to be embedded within its research apparatus.”</p> <p>Recommended action: “Intertwine writing and the ideas in coursework to provide conceptual depth that purely technical mastery cannot offer.” <a href="" target="_blank">Read more &gt;&gt;</a></p> <p><strong>Music: Eran Egozy, professor of the practice in music technology</strong></p> <p>“Creating tomorrow’s music systems responsibly will require a truly multidisciplinary education, one that covers everything from scientific models and engineering challenges to artistic practice and societal implications. The new music technology will be accompanied by difficult questions. Who owns the output of generative music algorithms that are trained on human compositions? 
How do we ensure that music, an art form intrinsic to all humans, does not become controlled by only a few?”</p> <p>Recommended action: “Through the SCC, our responsibility will be not only to develop the new technologies of music creation, distribution, and interaction, but also to study their cultural implications and define the parameters of a harmonious outcome for all.” <a href="" target="_blank">Read more &gt;&gt;</a></p> <p><strong>Theater Arts: Sara Brown, assistant professor of theater arts and MIT Theater Arts director of design</strong></p> <p>“As a subject, AI problematizes what it means to be human. There is an unending series of questions posed by the presence of an intelligent machine. The theater, as a synthetic art form that values and exploits liveness, is an ideal place to explore the complex and layered problems posed by AI and advanced computing.”</p> <p>Recommended action: “There are myriad opportunities for advanced computing to be integrated into theater, both as a tool and as a subject of exploration. As a tool, advanced computing can be used to develop performance systems that respond directly to a live performer in real time, or to integrate virtual reality as a previsualization tool for designers.” <a href="" target="_blank">Read more &gt;&gt;</a></p> <p><strong>Anthropology: Heather Paxson, the William R. Kenan, Jr. Professor of Anthropology</strong></p> <p>“The methods used in anthropology —&nbsp;a field that systematically studies human cultural beliefs and practices — are uniquely suited to studying the effects of automation and digital technologies in social life. For anthropologists, ‘Can artificial intelligence be ethical?’ is an empirical, not a hypothetical, question. Ethical for what? To whom? 
Under what circumstances?”</p> <p>Recommended action: “Incorporate anthropological thinking into the new college to prepare students to live and work effectively and responsibly in a world of technological, demographic, and cultural exchanges. We envision an ethnography lab that will provide digital and computing tools tailored to anthropological research and projects.” <a href="" target="_blank">Read more &gt;&gt;</a></p> <p><strong>Economics: Nancy L. Rose, the Charles P. Kindleberger Professor of Applied Economics and head of the Department of Economics; and David Autor, the Ford Professor of Economics and co-director of the MIT Task Force on the Work of the Future</strong></p> <p>“The intellectual affinity between economics and computer science traces back almost a century, to the founding of game theory in 1928. Today, the practical synergies between economics and computer science are flourishing. We outline some of the many opportunities for the two disciplines to engage more deeply through the new SCC.”</p> <p>Recommended action: “Research that engages the tools and expertise of economics on matters of fairness, expertise, and cognitive biases in machine-supported and machine-delegated decision-making; and on market design, industrial organization, and the future of work. Scholarship at the intersection of data science, econometrics, and causal inference. Cultivate depth in network science, algorithmic game theory and mechanism design, and online learning. Develop tools for rapid, cost-effective, and ongoing education and retraining for workers.” <a href="" target="_blank">Read more &gt;&gt;</a></p> <p><strong>Political Science: Faculty of the Department of Political Science</strong></p> <p>“The advance of computation gives rise to a number of conceptual and normative questions that are political, rather than ethical in character. 
Political science and theory have a significant role in addressing such questions as: How do major players in the technology sector seek to legitimate their authority to make decisions that affect us all? And where should that authority actually reside in a democratic polity?”</p> <p>Recommended action: “Incorporate the research and perspectives of political science in SCC research and education to help ensure that computational research is socially aware, especially with issues involving governing institutions, the relations between nations, and human rights.” <a href="" target="_blank">Read more &gt;&gt;</a></p> <p><span style="font-size:11px;"><em>Series prepared by SHASS Communications<br /> Series Editor and Designer: Emily Hiestand<br /> Series Co-Editor: Kathryn O’Neill</em></span></p> Image: Christine Daniloff, MITEducation, teaching, academics, Humanities, Arts, Social sciences, Computer science and technology, Artificial intelligence, Technology and society, MIT Schwarzman College of Computing, Anthropology, School of Humanities Arts and Social Sciences, Comparative Media Studies/Writing, Economics, Global Studies and Languages, History, Linguistics, Literature, Music, Philosophy, Political science, Program in STS, Theater, Music and theater arts, Women's and Gender Studies Meet Carolyn Stein: Researching the economics of science MIT PhD student explores the impact of scientists being &quot;scooped&quot; when a competing research team publishes results first, a concern for many disciplines. Mon, 23 Sep 2019 09:00:00 -0400 School of Humanities, Arts, and Social Sciences <p>Carolyn Stein says she’s not a morning person. And yet …</p> <p>“All of a sudden I’m going on bike rides with people that leave at 5:30 a.m.,” she says, shaking her head in surprise.</p> <p>Such is the appeal of MIT Cycling Club for Stein, a doctoral student in MIT’s Department of Economics, located within the School of Humanities, Arts, and Social Sciences. 
After inheriting an old road bike last year, she has been shifting gears, literally and figuratively.</p> <p>“It’s a wonderful thing to have happened and it’s how I’ve met people across the Institute,” Stein says.</p> <p>After graduating from Harvard University with degrees in applied mathematics and economics, Stein worked for a Boston hedge fund for two years. Upon arriving at MIT, she planned to study labor economics and explore why some people reach their potential in the labor force while others do not. But before long, Stein had decided to shift her area of research to the economics of science.</p> <p><strong>The economics of science</strong></p> <p>“The focus on science was influenced by one of my advisers, Professor Heidi Williams,” she says, “and also just by being at MIT surrounded by people who do science all the time. I’ve been learning what an interesting and difficult career path science is. On its surface, academic science is different from other jobs that economists typically study. For one, scientists are often motivated by factors other than wages.<br /> <br /> “But many insights from labor economics can still help us understand how the field of science functions. Incentive and career concerns still matter. And risk is a big concern in science. You could have a very good idea, but get scooped. That can derail a scientist, and a whole year’s worth of work could be lost. That’s where this research idea began.”<br /> <br /> Stein and her research partner, Ryan Hill, also a doctoral student in the MIT economics department, are working on two projects simultaneously, both of which focus on the careers of scientists and the incentives they face. Their first paper explores what happens when a scientist is “scooped” or, in other words, what happens to scientists when a competing research team publishes their results first. 
It’s a concern that resonates with researchers across many disciplines.<br /> <br /> <strong>The impact of being scooped</strong></p> <p>“Economists often worry that while we’re working on something we’re going to flip open a journal and see that someone else has already written the same paper,” Stein says. “This is an even bigger deal in science. In our project, we're studying a particular field of structural biology where we can actually look at data at the level of proteins and find cases where two scientists are simultaneously trying to solve the structure of the same protein.<br /> <br /> “But one person gets there first and publishes. We’re trying to learn what happens to the other scientist, who has been scooped. Are they still able to publish? Do they get published in a lower-ranked journal, or receive fewer citations? Anecdotally, scientists say they’re very stressed about being scooped, so we’re trying to measure how much they’re penalized, if they are.”<br /> <br /> <strong>The tension between quality and competition</strong></p> <p>Stein's and Hill's second paper examines the tradeoff between competition and quality in science. If competition is fierce and scientists are working overtime to get their work done sooner, the science may progress faster, Stein reasons. But if the fear of being scooped is high, scientists may decide to publish early. As a result, the work may not be as thorough.<br /> <br /> “In that case, we miss out on the highest quality work these scientists could produce,” Stein says. “You’re looking at a trade-off. Competition means that science progresses faster, but corners may have been cut. How we as a society should feel about this probably depends on the balance of that trade-off. 
That’s the tension that we’re trying to explore.”<br /> <br /> <strong>Work that resonates</strong></p> <p>After several years working and studying at MIT, Stein is now excited to see how things have coalesced: Her research topic has received positive feedback from the MIT community; she’s “super happy” with her advisers — professors Heidi Williams and Amy Finkelstein in the Department of Economics, and Pierre Azoulay, a professor of management in the MIT Sloan School of Management — and collaborating with Hill has “made the whole experience much more fun and companionable.” (Williams, who continues to serve as Stein’s adviser, is now on the faculty of Stanford University.)<br /> <br /> “I want to do things that resonate with people inside and outside the economics field,” Stein reflects. “A really rewarding part of this project has been talking to people who do science and asking them if our work resonates with them. Having scientists completely understand what we’re talking about is a huge part of the fun for me.”<br /> <br /> Another activity Stein is enthusiastic about is her teaching experience with professors Williams and David Autor, which has affirmed her interest in an academic career. “I find teaching incredibly gratifying,” Stein says. “And I’ve had the privilege of being a teaching assistant here for professors who care a great deal about teaching.”<br /> <br /> <strong>Women in economics</strong></p> <p>Stein would also like to encourage more women to explore a career in economics. She notes that if you were to poll students in their first year, they would likely say that economics is about what they read in <em>The Wall Street Journal:</em> finance, international trade, and money.<br /> <br /> “But it’s much more than that,” Stein says. “Economics is more like a set of tools that you can apply to an astonishingly wide variety of things. 
I think that if more people knew this, and knew it sooner in their college career, a much more diverse group of people would want to study the field.”<br /> <br /> Career options in the private sector are also increasing for economists, she says. “A lot of tech companies now realize they love economics PhDs. These companies collect so much data. It’s an opportunity to actually do a job that uses your degree.”<br /> <br /> <strong>A sport with data</strong></p> <p>As the 2019 fall academic term gets underway, Stein is focused on writing her thesis and preparing for the academic job market. To explore her native New England as well as to escape the rigors of thesis-writing, she’s also looking forward to rides with the MIT Cycling Club.</p> <p>“A few weekends ago," she says, "we drove up to Vermont to do this completely insane ride over six mountain passes. The club is such a wonderful group of people. And cycling can be a very nerdy sport with tons of data to analyze.”</p> <p>So, maybe not a total escape.<br /> &nbsp;</p> <h5><em>Story by MIT SHASS Communications<br /> Editorial Team: Emily Hiestand and Maria Iacobo </em></h5> "Scientists are often motivated by factors other than wages,” says Carolyn Stein, "but many insights from labor economics still help us understand how the field of science functions. Incentive and career concerns still matter. And risk is a big concern.”Photo: Maria Iacobo Economics, School of Humanities Arts and Social Sciences, Social sciences, Profile, Women, History of science, Data, Analytics, Behavioral economics, Students, Graduate, postdoctoral Using rigorous evaluation to reduce and prevent homelessness in North America New J-PAL North America publication highlights how rigorous research can improve policies to help people access and maintain stable, affordable housing. 
Mon, 09 Sep 2019 12:20:01 -0400 J-PAL North America <p>For millions of people in the United States, the struggle for stable housing both shapes and is shaped by numerous factors, such as employment opportunities and wages, housing market dynamics, access to health care, financial stability, and involvement with the criminal justice system. In the United States, <a href="" target="_blank">more than 500,000 people experience homelessness</a> on a given night, and <a href=";utm_medium=mit_news&amp;utm_campaign=homelessness_evidence_review" target="_blank">1.4 million people pass through emergency shelters</a> in a given year. Many more individuals experience housing instability in other, often uncounted forms, whether living doubled up with friends or family, living in temporary accommodations such as motels, or living under threat of eviction.</p> <p>The scope and complexity of housing instability and homelessness in the United States highlight the need for rigorous evidence on the effectiveness of strategies to prevent and reduce homelessness. Each year, <a href="">billions of dollars in public financial resources</a> are devoted to combatting housing instability, between federal expenditures and additional spending within local jurisdictions. It is critical that these resources fund policies and programs that will efficiently help to end homelessness.</p> <p>In the past few decades, many organizations have shifted the types of services offered to individuals and families experiencing housing instability to prioritize immediate housing, referred to as a Housing First approach. 
Evidence played a fundamental role in <a href=";utm_medium=mit_news&amp;utm_campaign=homelessness_evidence_review">building support for this new model from the beginning</a>, with several randomized evaluations demonstrating that a Housing First approach could more effectively house people experiencing chronic homelessness than shelter-based approaches.</p> <p>While the rigorous evidence on the Housing First model and other approaches to reducing and preventing homelessness provides a start, open questions remain as to the effectiveness of the current organization of homelessness programs in North America. How can rigorous evaluation continue to drive improvements to policies and services aimed at helping people experiencing housing instability access and maintain stable, affordable housing?</p> <p>To answer this question, <a href="">J-PAL North America</a> released an <a href=";utm_medium=mit_news&amp;utm_campaign=homelessness_evidence_review">Evidence Review</a> summarizing results from 40 rigorous evaluations of 18 distinct programs related to homelessness prevention and reduction. The publication focuses mainly on questions that can be answered through rigorous impact evaluation methods and outlines a research agenda for additional evaluation based on a recently released academic working paper on homelessness, “<a href=";utm_medium=mit_news&amp;utm_campaign=homelessness_evidence_review">Reducing and Preventing Homelessness: A Review of the Evidence and Charting a Research Agenda</a>,” by William Evans and David Phillips of the University of Notre Dame and Krista Ruffini of the Goldman School of Public Policy at the University of California at Berkeley.</p> <p>The body of evidence suggests some areas of promise, but demonstrates that additional research on the effectiveness of other strategies to reduce homelessness is needed.</p> <p>First, homelessness prevention is an area that demands more rigorous evaluation. 
An existing body of research demonstrates that emergency financial assistance and more comprehensive interventions that provide a range of financial assistance, counseling, and legal supports can prevent homelessness among families at risk of eviction. Additionally, legal representation for tenants at risk of losing their homes holds promise for reducing evictions. However, more research is needed on how prevention programs can best be delivered and targeted towards those most in need.</p> <p>Second, permanent supportive housing programs, which provide long-term housing support and wrap-around services with no preconditions, can increase housing stability for veterans and individuals with severe mental illness. Based on the body of rigorous evidence behind Housing First approaches to homelessness reduction, many organizations have shifted toward this model of intensive assistance, and away from the traditional model of requiring preconditions, such as sobriety and employment, before obtaining permanent housing. However, there has been little rigorous evaluation of the impact of permanent supportive housing programs for other groups of people.</p> <p>Third, although rapid re-housing is a potentially cost-effective solution to provide immediate access to housing, there is limited evidence on the impacts of rapid re-housing on long-term housing stability. Rapid re-housing aims to house people experiencing homelessness as quickly as possible by providing short-term rental assistance and services to help households overcome barriers to long-term housing stability.</p> <p>Fourth, subsidized long-term housing assistance in the form of housing vouchers helps low-income families avoid homelessness and stay stably housed. The federally subsidized housing program with the most rigorous evidence to date is the Housing Choice Voucher program. 
Also known as Section 8, the program provides eligible low-income households with rental assistance to pay for private-market housing in units that they select.</p> <p>The publication also identifies existing gaps in the literature and outlines key open questions about the effectiveness of strategies to reduce and prevent homelessness to consider going forward. For instance, it is important to rigorously test the impact of existing programs with a limited evidence base, such as rapid re-housing. Additional questions remain on how homelessness programs and services affect non-housing related outcomes and how best to design and target services to maximize potential impact.&nbsp;</p> <p>To help answer these questions, J-PAL North America’s <a href="">work on homelessness</a> seeks to expand the base of rigorous evidence on strategies to reduce and prevent homelessness and promote housing stability by partnering directly with nonprofits and government agencies.&nbsp;<br /> <br /> Organizations interested in being paired with researchers to rigorously evaluate strategies to ameliorate homelessness are encouraged to contact project manager <a href="">Rohit Naimpally</a>. J-PAL North America may be able to offer technical assistance, matchmaking with researchers, and funding to cover the cost of an evaluation. For more information, please see our <a href=";utm_medium=mit_news&amp;utm_campaign=homelessness_evidence_review">Housing Stability Evaluation Incubator</a>.</p> <p>J-PAL North America is a regional office of the Abdul Latif Jameel Poverty Action Lab (J-PAL), a global research center based at MIT.</p> J-PAL North America's recently released publication summarizes 40 rigorous evaluations of policies and programs aiming to reduce and prevent homelessness.Photo: A. 
KatzAbdul Latif Jameel Poverty Action Lab (J-PAL), Economics, Research, Housing, Poverty, School of Humanities Arts and Social Sciences Reaching climate solutions through negotiation At “SimPlanet” event, students test-drive new computer simulation to reveal outcomes of different policy decisions. Fri, 06 Sep 2019 12:32:13 -0400 David L. Chandler | MIT News Office <p>Evaluating the many possible strategies for curbing greenhouse gas emissions and limiting the destructive effects of a warming planet is a daunting and contentious task. This week, about 50 MIT students got a chance to try out new software that can visually demonstrate how different policy choices could affect the global outcome.</p> <p>At Tuesday’s “SimPlanet” event at the MIT Media Lab, students had a chance to beta-test a new interactive energy and climate policy simulation model, <a href="" target="_blank">En-ROADS</a>, developed jointly by the <a href="">MIT Sloan Sustainability Initiative</a> and <a href="" target="_blank">Climate Interactive</a>, a nonprofit, nonpartisan think tank.&nbsp; At the event, sponsored by MIT’s Environmental Solutions Initiative, students worked in teams representing each of eight different interest groups — including developed and developing nations, environmental activists, and industries that make and use energy — using the model to explore the impacts of dozens of possible policies and how hard to push for each.</p> <p>Then John Sterman, the Jay W. 
Forrester Professor of Management and a lead developer of the simulation, entered each group’s policies into the En-ROADS dashboard, instantly showing how those policies would affect energy use and greenhouse gas emissions between now and the year 2100, and what the expected change in global temperatures, sea level, and other impacts would be.</p> <p>The nations of the world, through the 2015 <a href="" target="_blank">Paris Agreement</a>, committed to holding the increase in the global average temperature to well below 2 degrees Celsius compared to preindustrial levels, and pursuing efforts to limit the temperature increase to 1.5 C. As shown by the En-ROADS display, without any new policies, the expected increase by the end of the century would be roughly twice that limit, at approximately 4.1 C.</p> <p>As the group discovered during the four-hour interactive event, cutting greenhouse gas emissions enough to achieve the goal is difficult. But it is achievable — even without any unproven technological breakthroughs.</p> <p>That encouraging bottom line was not at all apparent as the role-play began. Each team was provided a confidential briefing memo outlining the priorities of their constituency. The delegations represented the developed nations, rapidly emerging nations (including China, India, and Brazil), developing nations, conventional energy companies (coal, oil, natural gas, and nuclear), clean tech (renewable energy companies), industry and commerce, agriculture and forestry, and climate activists. Conflict quickly emerged as the interests of some groups were directly contrary to those of others — for example, developing versus developed nations, or clean tech versus conventional energy — so in many cases the policies proposed by one group were promptly reversed by another, revealing the complexity of negotiating to find common ground.</p> <p>The outcome? 
Less global warming, but not by much: 3.7 degrees Celsius by 2100, far short of the reduction needed and enough to cause significant harm, including extreme weather, declining crop yield, and sea level rise.</p> <p>To dramatize the reality of such changes, volunteers dragged a large blue sheet over the heads of the participants, providing a visceral feel for being caught under the rising waters. Then Sterman showed images of various coastal cities, revealing how they would be affected by the rising seas and what would happen if these cities were then hit by the storm surge from hurricanes and typhoons such as Dorian. In many cases, such as Miami and Shanghai, entire cities would be inundated.</p> <p>“The participants saw, for themselves, that it is in their own interests to reach a stronger agreement,” Sterman says. He encouraged members of each group to get up, walk around, and negotiate with the other groups to arrive at a stronger proposal. Adding a touch of realism to the simulation, during an ice-cream break, members of the team representing climate activists and indigenous peoples formed a human chain to block access to the ice cream servers until members of other teams agreed to implement a price on carbon — a policy that the simulation showed to be one of the most effective drivers of emissions reductions.</p> <p>Throughout the session, Sterman, playing the role of the UN Secretary General, provided background information on climate change and its consequences, and on the pros and cons of the many policy options under consideration, including incentives for new technology, subsidies or taxes on different forms of energy, and policies to encourage efficiency and other changes in energy systems, buildings, transportation, and land use.</p> <p>The En-ROADS simulation, Sterman explained, is an updated and far more detailed version of one that he has used for several years. 
Both models have been used by members of Congress and their staffs, national representatives at the actual UN climate negotiations, and political and business leaders around the world, including then-Secretary of State John Kerry and, during a visit to MIT, the Dalai Lama. Participants in SimPlanet used the same model as was used in these briefings.</p> <p>By the end of the session, participants had tried many different proposals, learning which worked best — and which could withstand attempts by other interest groups to roll them back. They discovered that there’s no single policy that can achieve the goal, but the group finally arrived at a set of policies that held the expected warming to just 1.9 C. And they did so using policies that are well within the bounds of what are considered to be technically achievable and economically affordable. “There is no silver bullet, but there is silver buckshot,” Sterman commented.</p> <p>“Climate change is no longer a science problem,” he declared. “It’s not an engineering problem — we have the technologies we need and they are improving rapidly. It’s not even an economic problem — the costs of action are far lower than the costs of inaction, and many policies generate cobenefits that create jobs and improve public health.” Rather, he said, tackling climate change is a social and political problem that requires personal and political actions to implement the policies needed to cut emissions in time.</p> <p>“<a href="" target="_blank">Research shows</a> that showing people research doesn’t work,” Sterman says. 
“Interactive simulations such as SimPlanet enable people to learn for themselves, not only building their knowledge of the issue, but motivating them to take action, personally, professionally, and as citizens working for change.”</p> Professor John Sterman displays maps showing the consequences of sea-level rise on various coastal cities, as part of the “SimPlanet” event at MIT.Images: Melanie Gonick, MITSloan School of Management, Environment, Climate change, Climate, Policy, Government, International relations, Sustainability, Developing countries, Economics, Special events and speakers, Students, Emissions, Behavioral economics How “information gerrymandering” influences voters Study analyzes how networks can distort voters’ perceptions and change election results. Wed, 04 Sep 2019 13:00:00 -0400 Peter Dizikes | MIT News Office <p>Many voters today seem to live in partisan bubbles, where they receive only partial information about how others feel regarding political issues. Now, an experiment developed in part by MIT researchers sheds light on how this phenomenon influences people when they vote.</p> <p>The experiment, which placed participants in simulated elections, found not only that communication networks (such as social media) can distort voters’ perceptions of how others plan to vote, but also that this distortion can increase the chance of electoral deadlock or bias overall election outcomes in favor of one party.&nbsp;&nbsp;</p> <p>“The structure of information networks can really fundamentally influence the outcomes of elections,” says David Rand, an associate professor at the MIT Sloan School of Management and a co-author of a new paper detailing the study. 
“It can make a big difference and is an issue people should be taking seriously.”</p> <p>More specifically, the study found that “information gerrymandering” can bias the outcome of a vote, such that one party wins up to 60 percent of the time in simulated elections of two-party situations where the opposing groups are equally popular. In a follow-up empirical study of the U.S. federal government and eight European legislative bodies, the researchers also identified actual information networks that show similar patterns, with structures that could skew over 10 percent of the vote in the study’s experiments.</p> <p>The paper, “Information gerrymandering and undemocratic decisions,” is being published today in&nbsp;<em>Nature</em>.</p> <p>The authors are Alexander J. Stewart of the University of Houston; Mohsen Mosleh, a research scientist at MIT Sloan; Marina Diakonova of the Environmental Change Institute at Oxford University; Antonio Arechar, an associate research scientist at MIT Sloan and a researcher at the Center for Research and Teaching in Economics (CIDE) in Aguascalientes, Mexico; Rand, who is also the principal investigator for MIT Sloan’s Human Cooperation Lab; and Joshua B. Plotkin of the University of Pennsylvania. Stewart is the lead author.</p> <p><strong>Formal knowledge</strong></p> <p>While there is a burgeoning academic literature on media preferences, political ideology, and voter choices, the current study is an effort to create general models of the fundamental influence that information networks can have. 
Through abstract mathematical models and experiments, the researchers can analyze how strongly networks can influence voter behavior, even when long-established layers of voter identity and ideology are removed from the political arena.</p> <p>“Part of the contribution here is to try to formalize how information about politics flows through social networks, and how that can influence voters’ decisions,” says Stewart.</p> <p>The study used experiments involving 2,520 participants, who played a “voter game” in one of a variety of conditions. (The participants were recruited via Amazon’s Mechanical Turk platform and took part in the simulated elections via Breadboard, a platform generating multiplayer network interactions.) The players were divided into two teams, a “yellow” team and a “purple” team, usually with 24 people on each side, and were allowed to change their voting intentions in response to continuously updated polling data.</p> <p>The participants also had incentives to try to produce certain vote outcomes reflective of what the authors call a “compromise worldview.” For instance, players would receive a (modest) payoff if their team received a super-majority vote share; a smaller payoff if the other team earned a super-majority; and zero payoff if neither team reached that threshold. The election games usually lasted four minutes, during which time each voter had to decide how to vote.</p> <p>In general, voters almost always voted for their own party when the polling data showed it had a chance of reaching a super-majority share. They also voted for their own side when the polling data showed a deadlock was likely. 
But when the opposing party was likely to achieve a super-majority, half the players would vote for it, and half would continue to vote for their own side.</p> <p>During a baseline series of election games where all the players had unbiased, random polling information, each side won roughly a quarter of the time, and a deadlock without a super-majority resulted about half the time. But the researchers also varied the game in multiple ways. In one iteration of the game, they added information gerrymandering to the polls, such that some members of one team were placed inside the other team’s echo chamber. In another iteration, the research team deployed online bots, comprising about 20 percent of voters, to behave like “zealots,” as the scholars called them; the bots would strongly support one side only.</p> <p>After months of iterations of the game, the researchers concluded that election outcomes could be heavily biased by the ways in which the polling information was distributed over the networks, and by the actions of the zealot bots. When members of one party were led to believe that most others were voting for the other party, they often switched their votes to avoid deadlock.</p> <p>“The network experiments are important, because they allow us to test the predictions of the mathematical models,” says Mosleh, who led the experimental portion of the research. “When we added echo chambers, we saw that deadlock happened much more often — and, more importantly, we saw that information gerrymandering biased the election results in favor of one party over the other.”</p> <p><strong>The empirical case</strong></p> <p>As part of the larger project, the team also sought out some empirical information about similar scenarios among elected governments. There are many instances where elected officials might either support their first-choice legislation, settle for a cross-partisan compromise, or remain in deadlock. 
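</p> <p>The voter-game mechanics described earlier — super-majority payoffs, live polls, echo chambers, and zealot bots — can be sketched as a small agent-based simulation. The following toy model is a hypothetical illustration, not the authors’ actual Breadboard implementation; the 60 percent super-majority threshold, the 0.8 poll-distortion factor, and the 50 percent defection rule are assumptions chosen only to mirror the behavior the article describes:</p>

```python
import random

def run_election(n_per_team=24, rounds=20, supermaj=0.60,
                 gerrymander=0.0, zealot_frac=0.0, rng=None):
    """Toy two-party voter game (illustrative only).

    gerrymander: probability a yellow voter's poll undercounts their own side
    zealot_frac: fraction of purple voters who never change their vote
    """
    rng = rng or random.Random()
    teams = ["Y"] * n_per_team + ["P"] * n_per_team
    intentions = list(teams)  # each voter starts by backing their own team
    zealots = {i for i, t in enumerate(teams)
               if t == "P" and rng.random() < zealot_frac}

    for _ in range(rounds):
        share_y = intentions.count("Y") / len(intentions)
        new_intentions = []
        for i, team in enumerate(teams):
            if i in zealots:
                new_intentions.append(team)  # zealots never waver
                continue
            own = share_y if team == "Y" else 1.0 - share_y
            if team == "Y" and rng.random() < gerrymander:
                own *= 0.8  # echo chamber: own side looks weaker than it is
            if own >= supermaj or abs(own - 0.5) < 0.05:
                new_intentions.append(team)  # own win or deadlock: hold firm
            elif (1.0 - own) >= supermaj:
                # other side nears a super-majority: half defect to avoid
                # the zero-payoff deadlock, as in the article's description
                new_intentions.append(
                    ("P" if team == "Y" else "Y")
                    if rng.random() < 0.5 else team)
            else:
                new_intentions.append(team)
        intentions = new_intentions

    share_y = intentions.count("Y") / len(intentions)
    if share_y >= supermaj:
        return "Y"
    if 1.0 - share_y >= supermaj:
        return "P"
    return "deadlock"
```

<p>Run over many random seeds, even this crude sketch reproduces the qualitative pattern reported: with unbiased polls the symmetric game deadlocks, while distorted polls push one side to defect and tip outcomes toward the favored party.</p> <p>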
In those cases, having unbiased information about the voting intentions of other legislators would seem to be very important.</p> <p>Looking at the co-sponsorship of bills in the U.S. Congress from 1973 to 2007, the researchers found that the Democratic Party had greater “influence assortment” — more exposure to the voting intentions of people in their own party — than the Republican Party during the same period. However, after Republicans gained control of Congress in 1994, their own influence assortment became equivalent to that of the Democrats, as part of a highly polarized pair of legislative influence networks. The researchers found similar levels of polarization in the influence networks of six out of the eight European parliaments they evaluated, generally during the last decade.</p> <p>Rand says he hopes the current study will help generate additional research by other scholars who want to keep exploring these dynamics empirically.</p> <p>“Our hope is that laying out this information gerrymandering theory, and introducing this voter game, we will spur new research around these topics to understand how these effects play out in real-world networks,” Rand says.</p> <p>Support for the research was provided by the U.S. Defense Advanced Research Projects Agency, the Ethics and Governance of Artificial Intelligence Initiative of the Miami Foundation, the Templeton World Charity Foundation and the John Templeton Foundation, the Army Research Office, and the David and Lucile Packard Foundation.</p> A new study co-authored by MIT scholars examines the impact of political information on voter behavior, under a variety of conditions. 
The schematic images here represent voter information networks, ranging from those with connections across political parties, upper left, to those with no contact between opposing party members, bottom right.Image: Courtesy of the researchersPolitics, Voting and elections, Behavioral economics, Game theory, Networks, Research, Social media, Technology and society, Film and Television, Government, Political science, Sloan School of Management MIT report examines how to make technology work for society Task force calls for bold public and private action to harness technology for shared prosperity. Wed, 04 Sep 2019 08:59:59 -0400 Peter Dizikes | MIT News Office <p>Automation is not likely to eliminate millions of jobs any time soon — but the U.S. still needs vastly improved policies if Americans are to build better careers and share prosperity as technological changes occur, according to a new MIT report about the workplace.</p> <p><a href="">The report</a>, which represents the initial findings of MIT’s Task Force on the Work of the Future, punctures some conventional wisdom and builds a nuanced picture of the evolution of technology and jobs, the subject of much fraught public discussion.</p> <p>The likelihood of robots, automation, and artificial intelligence (AI) wiping out huge sectors of the workforce in the near future is exaggerated, the task force concludes — but there is reason for concern about the impact of new technology on the labor market. In recent decades, technology has contributed to the polarization of employment, disproportionately helping high-skilled professionals while reducing opportunities for many other workers, and new technologies could exacerbate this trend.</p> <p>Moreover, the report emphasizes, at a time of historic income inequality, a critical challenge is not necessarily a lack of jobs, but the low quality of many jobs and the resulting lack of viable careers for many people, particularly workers without college degrees. 
With this in mind, the work of the future can be shaped beneficially by new policies, renewed support for labor, and reformed institutions, not just new technologies. Broadly, the task force concludes, capitalism in the U.S. must address the interests of workers as well as shareholders.</p> <p>“At MIT, we are inspired by the idea that technology can be a force for good. But if as a nation we want to make sure that today’s new technologies evolve in ways that help build a healthier, more equitable society, we need to move quickly to develop and implement strong, enlightened policy responses,” says MIT President L. Rafael Reif, who called for the creation of the Task Force on the Work of the Future in 2017.</p> <p>“Fortunately, the harsh societal consequences that concern us all are not inevitable,” Reif adds. “Technologies embody the values of those who make them, and the policies we build around them can profoundly shape their impact. Whether the outcome is inclusive or exclusive, fair or laissez-faire, is therefore up to all of us. I am deeply grateful to the task force members for their latest findings and their ongoing efforts to pave an upward path.”</p> <p>“There is a lot of alarmist rhetoric about how the robots are coming,” adds Elisabeth Beck Reynolds, executive director of the task force, as well as executive director of the MIT Industrial Performance Center. “MIT’s job is to cut through some of this hype and bring some perspective to this discussion.”</p> <p>Reynolds also calls the task force’s interest in new policy directions “classically American in its willingness to consider innovation and experimentation.”</p> <p><strong>Anxiety and inequality</strong></p> <p>The core of the task force consists of a group of MIT scholars. 
Its research has drawn upon new data, expert knowledge of many technology sectors, and a close analysis of both technology-centered firms and economic data spanning the postwar era.</p> <p>The report addresses several workplace complexities. Unemployment in the U.S. is low, yet workers have considerable anxiety, from multiple sources. One is technology: A 2018 survey by the Pew Research Center found that 65 to 90 percent of respondents in industrialized countries think computers and robots will take over many jobs done by humans, while less than a third think better-paying jobs will result from these technologies.</p> <p>Another concern for workers is income stagnation: Adjusted for inflation, 92 percent of Americans born in 1940 earned more money than their parents, but only about half of people born in 1980 can say that.</p> <p>“The persistent growth in the quantity of jobs has not been matched by an equivalent growth in job quality,” the task force report states.</p> <p>Applications of technology have fed inequality in recent decades. High-tech innovations have displaced “middle-skilled” workers who perform routine tasks, from office assistants to assembly-line workers, but these innovations have complemented the activities of many white-collar workers in medicine, science and engineering, finance, and other fields. Technology has also not displaced lower-skilled service workers, leading to a polarized workforce. Higher-skill and lower-skill jobs have grown, middle-skill jobs have shrunk, and increased earnings have been concentrated among white-collar workers.</p> <p>“Technological advances did deliver productivity growth over the last four decades,” the report states. “But productivity growth did not translate into shared prosperity.”</p> <p>Indeed, says David Autor, who is the Ford Professor of Economics at MIT, associate head of MIT’s Department of Economics, and a co-chair of the task force, “We think people are pessimistic because they’re on to something. 
Although there’s no shortage of jobs, the gains have been so unequally distributed that most people have not benefited much. If the next four decades of automation are going to look like the last four decades, people have reason to worry.”</p> <p><strong>Productive innovations versus “so-so technology”</strong></p> <p>A big question, then, is what the next decades of automation have in store. As the report explains, some technological innovations are broadly productive, while others are merely “so-so technologies” — a term coined by economists Daron Acemoglu of MIT and Pascual Restrepo of Boston University to describe technologies that replace workers without markedly improving services or increasing productivity.</p> <p>For instance, electricity and light bulbs were broadly productive, allowing the expansion of other types of work. But automated technology allowing for self-check-out at pharmacies or supermarkets merely replaces workers without notably increasing efficiency for the customer or productivity.</p> <p>“That’s a strong labor-displacing technology, but it has very modest productivity value,” Autor says of these automated systems. “That’s a ‘so-so technology.’ The digital era has had fabulous technologies for skill complementarity [for white-collar workers], but so-so technologies for everybody else. Not all innovations that raise productivity displace workers, and not all innovations that displace workers do much for productivity.”</p> <p>Several forces have contributed to this skew, according to the report. “Computers and the internet enabled a digitalization of work that made highly educated workers more productive and made less-educated workers easier to replace with machinery,” the authors write.</p> <p>Given the mixed record of the last four decades, does the advent of robotics and AI herald a brighter future, or a darker one? The task force suggests the answer depends on how humans shape that future. 
New and emerging technologies will raise aggregate economic output and boost wealth, and offer people the potential for higher living standards, better working conditions, greater economic security, and improved health and longevity. But whether society realizes this potential, the report notes, depends critically on the institutions that transform aggregate wealth into greater shared prosperity instead of rising inequality.</p> <p>One thing the task force does not foresee is a future where human expertise, judgment, and creativity are less essential than they are today. &nbsp;</p> <p>“Recent history shows that key advances in workplace robotics — those that radically increase productivity — depend on breakthroughs in work design that often take years or even decades to achieve,” the report states.</p> <p>As robots gain flexibility and situational adaptability, they will certainly take over a larger set of tasks in warehouses, hospitals, and retail stores — such as lifting, stocking, transporting, cleaning, as well as awkward physical tasks that require picking, harvesting, stooping, or crouching.</p> <p>The task force members believe such advances in robotics will displace relatively low-paid human tasks and boost the productivity of workers, whose attention will be freed to focus on higher-value-added work. The pace at which these tasks are delegated to machines will be hastened by slowing growth, tight labor markets, and the rapid aging of workforces in most industrialized countries, including the U.S.</p> <p>And while machine learning — image classification, real-time analytics, data forecasting, and more — has improved, it may just alter jobs, not eliminate them: Radiologists do much more than interpret X-rays, for instance. 
The task force also observes that developers of autonomous vehicles, another hot media topic, have been “ratcheting back” their timelines and ambitions over the last year.</p> <p>“The recent reset of expectations on driverless cars is a leading indicator for other types of AI-enabled systems as well,” says David A. Mindell, co-chair of the task force, professor of aeronautics and astronautics, and the Dibner Professor of the History of Engineering and Manufacturing at MIT. “These technologies hold great promise, but it takes time to understand the optimal combination of people and machines. And the timing of adoption is crucial for understanding the impact on workers.”</p> <p><strong>Policy proposals for the future</strong></p> <p>Still, if the worst-case scenario of a “job apocalypse” is unlikely, the continued deployment of so-so technologies could make the future of work worse for many people.</p> <p>If people are worried that technologies could limit opportunity, social mobility, and shared prosperity, the report states, “Economic history confirms that this sentiment is neither ill-informed nor misguided. There is ample reason for concern about whether technological advances will improve or erode employment and earnings prospects for the bulk of the workforce.”</p> <p>At the same time, the task force report finds reason for “tempered optimism,” asserting that better policies can significantly improve tomorrow’s work.</p> <p>“Technology is a human product,” Mindell says. “We shape technological change through our choices of investments, incentives, cultural values, and political objectives.”</p> <p>To this end, the task force focuses on a few key policy areas. 
One is renewed investment in postsecondary workforce education outside of the four-year college system — and not just in the STEM skills (science, technology, engineering, math) but also in reading, writing, and the “social skills” of teamwork and judgment.</p> <p>Community colleges are the biggest training providers in the country, with 12 million for-credit and non-credit students, and are a natural location for bolstering workforce education. A wide range of new models for gaining educational credentials is also emerging, the task force notes. The report also emphasizes the value of multiple types of on-the-job training programs for workers.</p> <p>However, the report cautions, investments in education may be necessary but not sufficient for workers: “Hoping that ‘if we skill them, jobs will come,’ is an inadequate foundation for constructing a more productive and economically secure labor market.”</p> <p>More broadly, therefore, the report argues that the interests of capital and labor need to be rebalanced. The U.S., it notes, “is unique among market economies in venerating pure shareholder capitalism,” even though workers and communities are business stakeholders too.</p> <p>“Within this paradigm [of pure shareholder capitalism], the personal, social, and public costs of layoffs and plant closings should not play a critical role in firm decision-making,” the report states.</p> <p>The task force recommends greater recognition of workers as stakeholders in corporate decision making. Redressing the decades-long erosion of worker bargaining power will require new institutions that bend the arc of innovation toward making workers more productive rather than less necessary. The report holds that the adversarial system of collective bargaining, enshrined in U.S. labor law adopted during the Great Depression, is overdue for reform.</p> <p>The U.S. tax code can be altered to help workers as well.
Right now, it favors investments in capital rather than labor — for instance, capital depreciation can be written off, and R&amp;D investment receives a tax credit, whereas investments in workers produce no such equivalent benefits. The task force recommends new tax policy that would also incentivize investments in human capital, through training programs, for instance.</p> <p>Additionally, the task force recommends restoring support for R&amp;D to past levels and rebuilding U.S. leadership in the development of new AI-related technologies, “not merely to win but to lead innovation in directions that will benefit the nation: complementing workers, boosting productivity, and strengthening the economic foundation for shared prosperity.”</p> <p>Ultimately the task force’s goal is to encourage investment in technologies that improve productivity, and to ensure that workers share in the prosperity that could result.</p> <p>“There’s no question technological progress that raises productivity creates opportunity,” Autor says. “It expands the set of possibilities that you can realize. But it doesn’t guarantee that you will make good choices.”</p> <p>Reynolds adds: “The question for firms going forward is: How are they going to improve their productivity in ways that can lead to greater quality and efficiency, and aren’t just about cutting costs and bringing in marginally better technology?”</p> <p><strong>Further research and analyses</strong></p> <p>In addition to Reynolds, Autor, and Mindell, the central group within MIT’s Task Force on the Work of the Future consists of 18 MIT professors representing all five Institute schools. Additionally, the project has a 22-person advisory board drawn from the ranks of industry leaders, former government officials, and academia; a 14-person research board of scholars; and eight graduate students. 
The task force also consulted with business executives, labor leaders, and community college leaders, among others.</p> <p>The task force follows other influential MIT projects such as the Commission on Industrial Productivity, an intensive multiyear study of U.S. industry in the 1980s. That effort resulted in the widely read book, “Made in America,” as well as the creation of MIT’s Industrial Performance Center.</p> <p>The current task force taps into MIT’s depth of knowledge across a full range of technologies, as well as its strengths in the social sciences.</p> <p>“MIT is engaged in developing frontier technology,” Reynolds says. “Not necessarily what will be introduced tomorrow, but five, 10, or 25 years from now. We do see what’s on the horizon, and our researchers want to bring realism and context to the public discourse.”</p> <p>The current report is an interim finding from the task force; the group plans to conduct additional research over the next year, and then will issue a final version of the report.</p> <p>“What we’re trying to do with this work,” Reynolds concludes, “is to provide a holistic perspective, which is not just about the labor market and not just about technology, but brings it all together, for a more rational and productive discussion in the public sphere.”</p> MIT’s Task Force on the Work of the Future has released a report that punctures some conventional wisdom and builds a nuanced picture of the evolution of technology and jobs.School of Engineering, School of Architecture and Planning, School of Humanities Arts and Social Sciences, School of Science, Sloan School of Management, Jobs, Economics, Aeronautical and astronautical engineering, Urban studies and planning, Program in STS, Industrial Performance Center, employment, Artificial intelligence, Industry, President L.
Rafael Reif, Policy, Machine learning, Faculty, Technology and society, Innovation and Entrepreneurship (I&E), Poverty, Business and management, Manufacturing, Careers, STEM education, MIT Schwarzman College of Computing New science blooms after star researchers die, study finds Deaths of prominent life scientists tend to be followed by a surge in highly cited research by newcomers. Thu, 29 Aug 2019 00:00:01 -0400 Peter Dizikes | MIT News Office <p>The famed quantum physicist Max Planck had an idiosyncratic view about what spurred scientific progress: death. That is, Planck thought, new concepts generally take hold after older scientists with entrenched ideas vanish from the discipline.</p> <p>“A great scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it,” Planck once wrote.</p> <p>Now a new study co-authored by MIT economist Pierre Azoulay, an expert on the dynamics of scientific research, concludes that Planck was right. In many areas of the life sciences, at least, the deaths of prominent researchers are often followed by a surge in highly cited research by newcomers to those fields.</p> <p>Indeed, when star scientists die, their subfields see a subsequent 8.6 percent increase, on average, of articles by researchers who have not previously collaborated with those star scientists. Moreover, those papers published by the newcomers to these fields are much more likely to be influential and highly cited than other pieces of research.</p> <p>“The conclusion of this paper is not that stars are bad,” says Azoulay, who has co-authored a new paper detailing the study’s findings. 
“It’s just that, once safely ensconced at the top of their fields, maybe they tend to overstay their welcome.”</p> <p>The paper, “Does Science Advance One Funeral at a Time?” is co-authored by Azoulay, the International Programs Professor of Management at the MIT Sloan School of Management; Christian Fons-Rosen, an assistant professor of economics at the University of California at Merced; and Joshua Graff Zivin, a professor of economics at the University of California at San Diego and faculty member in the university’s School of Global Policy and Strategy. It is forthcoming in the <em>American Economic Review</em>.</p> <p>To conduct the study, the researchers used a database of life scientists that Azoulay and Graff Zivin have been building for well over a decade. In it, the researchers chart the careers of life scientists, looking at accomplishments that include funding awards, published papers and the citations of those papers, and patent statistics.</p> <p>In this case, Azoulay, Graff Zivin, and Fons-Rosen studied what occurred after the unexpected deaths of 452 life scientists who were still active in their disciplines. In addition to the 8.6 percent increase in papers by new entrants to those subfields, there was a 20.7 percent decrease in papers by the rather smaller number of scientists who had previously co-authored papers with the star scientists.</p> <p>Overall, Azoulay notes, the study provides a window into the power structures of scientific disciplines. Even if well-established scientists are not intentionally blocking the work of researchers with alternate ideas, a group of tightly connected colleagues may wield considerable influence over journals and grant awards.
In those cases, “it’s going to be harder for those outsiders to make a mark on the domain,” Azoulay notes.</p> <p>“The fact that if you’re successful, you get to set the intellectual agenda of your field, that is part of the incentive system of science, and people do extraordinary positive things in the hope of getting to that position,” Azoulay notes. “It’s just that, once they get there, over time, maybe they tend to discount ‘foreign’ ideas too quickly and for too long.”</p> <p>Thus what the researchers call “Planck’s Principle” serves as an unexpected — and tragic — mechanism for diversifying bioscience research.</p> <p>The researchers note that in referencing Planck, they are extending his ideas to a slightly different setting than the one he himself was describing. In his writing, Planck was discussing the birth of quantum physics — the kind of epochal, paradigm-setting shift that rarely occurs in science. The current study, Azoulay notes, examines what happens in everyday “normal science,” in the phrase of philosopher Thomas Kuhn.</p> <p>The process of bringing new ideas into science, and then hanging on to them, is only to be expected in many areas of research, according to Azoulay. Today’s seemingly stodgy research veterans were once themselves innovators facing an old guard.</p> <p>“They had to hoist themselves atop the field in the first place, when presumably they were [fighting] the same thing,” Azoulay says. 
“It’s the circle of life.”</p> <p>Or, in this case, the circle of life science.</p> <p>The research received support from the National Science Foundation, the Spanish Ministry of Economy and Competitiveness, and the Severo Ochoa Programme for Centres of Excellence in R&amp;D.</p> A study co-authored by MIT professor Pierre Azoulay has shown that in many areas of the life sciences, the deaths of prominent researchers are often followed by a surge in highly cited research by newcomers to those fields.Sloan School of Management, Research, Faculty, Economics, History of science, National Science Foundation (NSF) Shift to renewable electricity a win-win at statewide level MIT research finds health savings from cleaner air exceed policy costs. Wed, 14 Aug 2019 11:10:01 -0400 Mark Dwortzan | Joint Program on the Science and Policy of Global Change <p>Amid rollbacks of the Clean Power Plan and other environmental regulations at the federal level, several U.S. states, cities, and towns have resolved to take matters into their own hands and implement policies to promote renewable energy and reduce greenhouse gas emissions. One popular approach, now in effect in 29 states and the District of Columbia, is to set Renewable Portfolio Standards (RPS), which require electricity suppliers to source a designated percentage of electricity from available renewable-power generating technologies.</p> <p>Boosting levels of renewable electric power not only helps mitigate global climate change, but also reduces local air pollution. Quantifying the extent to which this approach improves air quality could help legislators better assess the pros and cons of implementing policies such as RPS. Toward that end, a research team at MIT has developed a new modeling framework that combines economic and air-pollution models to assess the projected subnational impacts of RPS and carbon pricing on air quality and human health, as well as on the economy and on climate change. 
In a <a href="" target="_blank">study</a> focused on the U.S. Rust Belt, their assessment showed that the financial benefits associated with air quality improvements from these policies would more than pay for the cost of implementing them. The results appear in the journal <em>Environmental Research Letters.</em></p> <p>“This research helps us better understand how clean-energy policies now under consideration at the subnational level might impact local air quality and economic growth,” says the study’s <a href="" target="_blank">lead author Emil Dimanchev</a>, a senior research associate at MIT’s Center for Energy and Environmental Policy Research, former research assistant at the MIT Joint Program on the Science and Policy of Global Change, and a 2018 graduate of the MIT Technology and Policy Program.</p> <p>Burning fossil fuels for energy generation results in air pollution in the form of fine particulate matter (PM2.5). Exposure to PM2.5 can lead to adverse health effects that include lung cancer, stroke, and heart attacks. But avoiding those health effects — and the medical bills, lost income, and reduced productivity that come with them — through the adoption of cleaner energy sources translates into significant cost savings, known as health co-benefits.</p> <p>Applying their modeling framework, the MIT researchers estimated that existing RPS in the nation’s Rust Belt region generate a health co-benefit of $94 per ton of carbon dioxide (CO<sub>2</sub>) reduced in 2030, or 8 cents (in 2015 dollars) for each kilowatt-hour (kWh) of renewable energy deployed. Their central estimate is 34 percent larger than total policy costs.
The team also determined that carbon pricing delivers a health co-benefit of&nbsp;$211 per ton of CO<sub>2</sub> reduced in 2030, 63 percent greater than the health co-benefit of reducing the same amount of CO<sub>2</sub> through an RPS approach.</p> <p>In an extension to their published work focused on the state of Ohio, the researchers evaluated the health effects and economy-wide costs of Ohio’s RPS using economic and atmospheric chemistry modeling. According to their best estimates, an average of 50 premature deaths per year will be avoided as a result of Ohio’s RPS in 2030. This translates to an economic benefit of $470 million per year, or 3 cents per kWh of renewable generation supported by the RPS. With costs of the RPS estimated at $300 million, that implies an annual net health benefit of $170 million in 2030.</p> <p>When the Ohio state legislature took up Ohio House Bill No. 6, which proposed to repeal the state’s RPS, Dimanchev shared these results on the Senate floor.</p> <p>“According to our calculations, the magnitude of the air quality benefits resulting from Ohio’s RPS is substantial and exceeds its economic costs,” he argued. “While the state legislature ultimately weakened the RPS, our research concludes that this will worsen the health of Ohio residents.”</p> <p>The MIT research team’s results for the Rust Belt are consistent with previous studies, which found that the health co-benefits of climate policy (including RPS and other instruments) tend to exceed policy costs.</p> <p>“This work shows that there are real, immediate benefits to people’s health in states that take the lead on clean energy,” says MIT Associate Professor <a href="" target="_blank">Noelle Selin</a>, who led the study and holds a joint appointment in the Department of Earth, Atmospheric and Planetary Sciences and Institute for Data, Systems and Society.
“Policymakers should take these impacts into account as they consider modifying these standards.”</p> <p>The study was supported by the U.S. Environmental Protection Agency’s Air, Climate and Energy Centers Program.</p> A wind turbine on the coast of Lake Erie in Cleveland, Ohio Photo: Sam Bobko/FlickrJoint Program on the Science and Policy of Global Change, EAPS, IDSS, School of Science, School of Engineering, MIT Energy Initiative, Climate change, Economics, Emissions, Environment, Global Warming, Greenhouse gases, Health, Pollution, Research, Sustainability, Policy, Government Lowering emissions without breaking the bank New research provides a look at how India could meet its climate targets while maintaining economic growth Wed, 31 Jul 2019 14:20:01 -0400 Mark Dwortzan | Joint Program on the Science and Policy of Global Change <p>India’s economy is booming, driving up electric power consumption to unprecedented levels. The nation’s installed electricity capacity, which increased fivefold in the past three decades, is expected to triple over the next 20 years. At the same time, India has committed to limiting its carbon dioxide emissions growth; its Paris Agreement climate pledge is to decrease its carbon dioxide emissions intensity of GDP (CO<sub>2</sub>&nbsp;emissions per unit of GDP) by 33 to 35 percent by 2030 from 2005 levels, and to boost carbon-free power to about 40 percent of installed capacity in 2030.</p> <p>Can India reach its climate targets without adversely impacting its rate of economic growth — now estimated at 7 percent annually — and what policy strategy would be most effective in achieving that goal?</p> <p>To address these questions, researchers from the MIT Joint Program on the Science and Policy of Global Change developed an economy-wide model of India with energy-sector detail, and applied it to simulate the achievement of each component of the nation’s Paris pledge. 
Representing the emissions intensity target with an economy-wide carbon price and the installed capacity target with a Renewable Portfolio Standard (RPS), they assessed the economic implications of three policy scenarios — carbon pricing, an RPS, and a combination of carbon pricing with an RPS. <a href="" target="_blank">Their findings</a> appear in the journal <em>Climate Change Economics. </em></p> <p>As a starting point, the researchers determined that imposing an economy-wide emissions reduction policy alone to meet the target emissions intensity, simulated through a carbon price, would result in the lowest cost to India’s economy. This approach would lead to emissions reductions not only in the electric power sector but throughout the economy. By contrast, they found that an RPS, which would enforce a minimum level of currently more expensive carbon-free electricity, would have the highest per-ton cost — more than 10 times higher than the economy-wide CO<sub>2</sub> intensity policy.</p> <p>“In our modeling framework, allowing emissions reduction across all sectors of the economy through an economy-wide carbon price ensures that the least-cost pathways for reducing emissions are observed,” says <a href="">Arun Singh</a>, lead author of the study. “This is constrained when electricity sector-specific targets are introduced. If renewable electricity costs are higher than the average cost of electricity, a higher share of renewables in the electricity mix makes electricity costlier, and the impacts of higher electricity prices reverberate across the economy.” A former research assistant at the MIT Joint Program and graduate student at the MIT Institute for Data, Systems and Society’s Technology and Policy Program, Singh now serves as an energy specialist consultant at the World Bank.</p> <p>Combining an economy-wide carbon price with an RPS would, however, bring the price per ton of CO<sub>2</sub> down from $23.38/tCO<sub>2</sub>&nbsp;(in 2011 U.S.
dollars) under a standalone carbon-pricing policy to a far more politically viable $6.17/tCO<sub>2</sub> when an RPS is added. If wind and solar costs decline significantly, the cost to the economy would decrease considerably; at the lowest wind and solar cost levels simulated, the model projects that economic losses under a carbon price with RPS would be only slightly higher than those under a standalone carbon price. Thus, declining wind and solar costs could enable India to set more ambitious climate policies in future years without significantly impeding economic growth.</p> <p>“Globally, it has been politically impossible to introduce CO<sub>2</sub>&nbsp;prices high enough to mitigate climate change in line with the Paris Agreement goals,” says <a href="">Valerie Karplus</a>, co-author and assistant professor at the MIT Sloan School of Management. “Combining pricing approaches with technology-specific policies may be important in India, as they have elsewhere, for the politics to work.”</p> <p>Developed by Singh in collaboration with his master’s thesis advisors at MIT (Karplus and MIT Joint Program Principal Research Scientist&nbsp;<a href="">Niven Winchester</a>, who also co-authored the study), the economy-wide model of India enables researchers to gauge the cost-effectiveness and efficiency of different technology and policy choices designed to transition the country to a low-carbon energy system.</p> <p>“The study provides important insights about the costs of different policies, which are relevant to nations that have pledged emission targets under the Paris Agreement but have not yet developed policies to meet those targets,” says Winchester, who is also a senior fellow at Motu Economic and Public Policy Research.</p> <p>The study was supported by the MIT Tata Center for Technology and Design, the Energy Information Administration of the U.S.
Department of Energy, and the MIT Joint Program.</p> Haloe Energie solar photovoltaic plant in Andhra Pradesh, India Photo: Penn StateJoint Program on the Science and Policy of Global Change, Institute for Data, Systems, and Society, Sloan School of Management, MITEI, School of Engineering, Tata Center, Climate, India, Carbon, Alternative energy, Climate change, Development, Economics, Global Warming, Greenhouse gases, Renewable energy, Research, Department of Energy (DoE) Health effects of China’s climate policy extend across Pacific Improved air quality in China could prevent nearly 2,000 premature deaths in the U.S. Mon, 29 Jul 2019 16:00:01 -0400 Mark Dwortzan | Joint Program on the Science and Policy of Global Change <p>Improved air quality can be a major bonus of climate mitigation policies aimed at reducing greenhouse gas emissions. By cutting air pollution levels in the country where emissions are produced, such policies can avoid significant numbers of premature deaths. But other nations downwind from the host country may also benefit.</p> <p>A <a href="">new MIT study</a> in the journal <em>Environmental Research Letters</em> shows that if the world’s top emitter of greenhouse gas emissions, China, fulfills its climate pledge to peak carbon dioxide emissions in 2030, the positive effects would extend all the way to the United States, where improved air quality would result in nearly 2,000 fewer premature deaths.</p> <p>The study estimates China’s climate policy air quality and health co-benefits resulting from reduced atmospheric concentrations of ozone, as well as co-benefits from reduced ozone and particulate air pollution (PM2.5) in three downwind and populous countries: South Korea, Japan, and the United States.
As ozone and PM2.5 give a well-rounded picture of air quality and can be transported over long distances, accounting for both pollutants enables a more accurate projection of associated health co-benefits in the country of origin and those downwind.</p> <p>Using a modeling framework that couples an energy-economic model with an atmospheric chemistry model, and assuming a climate policy consistent with China’s pledge to peak CO<sub>2</sub> emissions in 2030, the researchers found that atmospheric ozone concentrations in China would fall by 1.6 parts per billion in 2030 compared to a no-policy scenario, and thus avoid 54,300 premature deaths — nearly 60 percent of those resulting from PM2.5. Total avoided premature deaths in South Korea and Japan are 1,200 and 3,500, respectively, primarily due to PM2.5; for the U.S. (1,900 premature deaths in total), ozone is the main contributor, due to its longer lifetime in the atmosphere.</p> <p>Total avoided deaths in these countries amount to about 4 percent of those in China. The researchers also found that a more stringent climate policy would lead to even more avoided premature deaths in the three downwind countries, as well as in China.</p> <p>The study breaks new ground in showing that co-benefits of climate policy from reducing ozone-related premature deaths in China are comparable to those from PM2.5, and that co-benefits from reduced ozone and PM2.5 levels are not insignificant beyond China’s borders.</p> <p>“The results show that climate policy in China can influence air quality even as far away as the U.S.,” says <a href="">Noelle Eckley Selin</a>, an associate professor in MIT’s Institute for Data, Systems, and Society and Department of Earth, Atmospheric and Planetary Sciences (EAPS), who co-led the study.
“This shows that policy action on climate is indeed in everyone’s interest, in the near term as well as in the longer term.”</p> <p>The other co-leader of the study is <a href="">Valerie Karplus</a>, the assistant professor of global economics and management in MIT’s Sloan School of Management. Both co-leaders are faculty affiliates of the MIT Joint Program on the Science and Policy of Global Change. Their co-authors include former EAPS graduate student and lead author Mingwei Li, former Joint Program research scientist Da Zhang, and former MIT postdoc Chiao-Ting Li.&nbsp;</p> Polluted air over Beijing, ChinaPhoto: Patrick He/FlickrEAPS, Joint Program on the Science and Policy of Global Change, School of Science, IDSS, School of Engineering, China, Climate change, Developing countries, Economics, Emissions, Environment, Global Warming, Greenhouse gases, Health, International initiatives, Pollution, Research, Sustainability, Earth and atmospheric sciences J-PAL North America announces second round of competition partners Education, Technology, and Opportunity Innovation Competition aims to improve student learning. Wed, 17 Jul 2019 09:25:01 -0400 J-PAL North America <p><a href="">J-PAL North America</a>, a research center at MIT, will partner with two leading education technology nonprofits to test promising models to improve learning, as part of the center’s second Education, Technology, and Opportunity Innovation Competition.&nbsp;</p> <p>Running in its second year, J-PAL North America’s Education, Technology, and Opportunity Innovation Competition supports education leaders in using randomized evaluations to generate evidence on&nbsp;how technology can improve student learning, particularly for students experiencing poverty or facing barriers to academic success. 
Last year, J-PAL North America partnered with the <a href="">Family Engagement Lab</a> to develop an evaluation of a multilingual digital messaging platform, and with <a href="">Western Governors University</a>’s Center for Applied Learning Science to evaluate scalable models to improve student learning in math.</p> <p>This year, J-PAL North America will continue its work to support rigorous evaluations of educational technologies aimed at reducing disparities by partnering with <a href="">Boys &amp; Girls Clubs of Greater Houston</a>, a youth-development organization that provides education and social services to students from low-income families, and <a href="">MIND Research Institute</a>, a nonprofit committed to improving math education.</p> <p>“Even just within the first and second year of the J-PAL ed-tech competition, there continues to be an explosion in promising new initiatives,” says <a href="">Philip Oreopoulos</a>, professor of economics at the University of Toronto and co-chair of the J-PAL <a href="">Education, Technology, and Opportunity Initiative</a>. “We’re excited to try to help steer this development towards the most promising and effective programs for improving academic success and student well-being.”</p> <p>Boys &amp; Girls Clubs of Greater Houston will partner with J-PAL North America to develop an evaluation of the <a href="">BookNook</a> reading app, a research-based intervention technology that aims to improve literacy skills of K-8 students.</p> <p>“One of our commitments to our youth is to prepare them to be better citizens in life, and we do this through our programming, which supplements the education they receive in school,” says Michael Ewing, director of programs at Boys &amp; Girls Clubs of Greater Houston. “BookNook is one of our programs that we know can increase reading literacy and help students achieve at a higher level.
We are excited about this opportunity to conduct a rigorous evaluation of BookNook’s technology because we can substantially increase our own accountability as an organization, ensuring that we are able to track the literacy gains of our students when the program is implemented with fidelity.”</p> <p>Children who do not master reading by a young age are often <a href="">placed at a significant disadvantage to their peers</a> throughout the rest of their development. However, many <a href="">effective interventions for students struggling with reading</a> involve one-on-one or small-group instruction that places a heavy demand on school resources and teacher time. This makes it particularly challenging for schools that are already resource-strapped and face a shortage of teachers to meet the needs of students who are struggling with reading.</p> <p>The BookNook app offers a channel to bring research-proven literacy intervention strategies to greater numbers of students through accessible technology. The program is heavily scaffolded so that both teachers and non-teachers can use it effectively, allowing after-school staff like those at Boys &amp; Girls Clubs of Greater Houston to provide adaptive instruction to students struggling with reading.</p> <p>“Our main priority at BookNook is student success,” says Nate Strong, head of partnerships for the BookNook team. “We are really excited to partner with J-PAL and with Boys &amp; Girls Clubs of Greater Houston to track the success of students in Houston and learn how we can do better for them over the long haul.”</p> <p>MIND Research Institute seeks to partner with J-PAL North America to develop a scalable model that will increase students’ conceptual understanding of mathematical concepts.
<a href="">MIND’s Spatial Temporal (ST) math program</a> is a pre-K-8 visual instructional program that leverages the brain's spatial-temporal reasoning ability using challenging visual puzzles, non-routine problem solving, and animated informative feedback to understand and solve mathematical problems.</p> <p>“We’re thrilled and honored to begin this partnership with J-PAL to build our capacity to conduct randomized evaluations,” says Andrew Coulson, chief data science officer for MIND. “It's vital we continue to rigorously evaluate the ability of ST Math's spatial-temporal approach to provide a level playing field for every student, and to show substantial effects on any assessment. With the combination of talent and experience that J-PAL brings, I expect that we will also be exploring innovative research questions, metrics and outcomes, methods and techniques to improve the applicability, validity and real-world usability of the findings.”</p> <p>J-PAL North America is excited to work with these two organizations and continue to support rigorous evaluations that will help us better understand the role technology should play in learning. Boys &amp; Girls Clubs of Greater Houston and MIND Research Institute will help J-PAL contribute to a growing evidence base on education technology that can help guide decision-makers in understanding which uses of education technology are truly helping students learn amidst a rapidly changing technological landscape.</p> <p>J-PAL North America is a regional office of the Abdul Latif Jameel Poverty Action Lab. J-PAL was established in 2003 as a research center at MIT’s Department of Economics. Since then, it has built a global network of affiliated professors based at over 58 universities and regional offices in Africa, Europe, Latin America and the Caribbean, North America, South Asia, and Southeast Asia. 
J-PAL North America was established with support from the Alfred P. Sloan Foundation and Arnold Ventures and works to improve the effectiveness of social programs in North America through three core activities: research, policy outreach, and capacity building. J-PAL North America’s education technology work is supported by the Overdeck Family Foundation and Arnold Ventures.</p> J-PAL North America’s Education, Technology, and Opportunity Innovation Competition supports education leaders in using randomized evaluations to generate evidence on how technology can improve student learning, particularly for students from disadvantaged backgrounds.Abdul Latif Jameel Poverty Action Lab (J-PAL), Economics, School of Humanities Arts and Social Sciences, Learning, K-12 education, Technology and society, STEM education, Mathematics, Education, teaching, academics Visiting lecturer to spearhead project exploring the geopolitics of artificial intelligence At MIT, Luis Videgaray, alumnus and former foreign minister of Mexico, will launch project to help shape international AI policies. Fri, 12 Jul 2019 10:18:13 -0400 Rob Matheson | MIT News Office <p>Artificial intelligence is expected to have tremendous societal impact across the globe in the near future. Now Luis Videgaray PhD ’98, former foreign minister and finance minister of Mexico, is coming to MIT to spearhead an effort that aims to help shape global AI policies, focusing on how such rising technologies will affect people living in all corners of the world.</p> <p>Starting this month, Videgaray, an expert in geopolitics and AI policy, will serve as director of the MIT Artificial Intelligence Policy for the World Project (MIT AIPW), a collaboration between the MIT Sloan School of Management and the new MIT Stephen A. Schwarzman College of Computing. 
Videgaray will also serve as a senior lecturer at MIT Sloan and as a distinguished fellow at the MIT Internet Policy Research Initiative.</p> <p>The MIT AIPW will bring together researchers from across the Institute to explore and analyze the best AI policies for countries around the world based on various geopolitical considerations. The end result of the year-long effort, Videgaray says, will be a report with actionable policy recommendations for national and local governments, businesses, international organizations, and universities —&nbsp;including MIT.</p> <p>“The core idea is to analyze, raise awareness, and come up with useful policy recommendations for how the geopolitical context affects both the development and use of AI,” says Videgaray, who earned his PhD at MIT in economics. “It’s called AI Policy for the World, because it’s not only about understanding the geopolitics, but also includes thinking about people in poor nations, where AI is not really being developed but will be adopted and have significant impact in all aspects of life.”</p> <p>“When we launched the MIT Stephen A. Schwarzman College of Computing, we expressed the desire for the college to examine the societal implications of advanced computational capabilities,” says MIT Provost Martin Schmidt. “One element of that is developing frameworks which help governments and policymakers contemplate these issues. I am delighted to see us jump-start this effort with the leadership of our distinguished alumnus, Dr. Videgaray.”</p> <p><strong>Democracy, diversity, and de-escalation</strong></p> <p>As Mexico’s finance minister from 2012 to 2016, Videgaray led Mexico’s energy liberalization process, a telecommunications reform to foster competition in the sector, a tax reform that reduced the country’s dependence on oil revenues, and the drafting of the country’s laws on financial technology. 
In 2012, he was campaign manager for President Peña Nieto and head of the presidential transition team.</p> <p>As foreign minister from 2017 to 2018, Videgaray led Mexico’s relationship with the Trump White House, including the renegotiation of the North American Free Trade Agreement (NAFTA). He is one of the founders of the Lima Group, created to promote regional diplomatic efforts toward restoring democracy in Venezuela. He also directed Mexico’s leading role in the UN toward an inclusive debate on artificial intelligence and other new technologies. In that time, Videgaray says, AI went from being a “science-fiction” concept in the first year to a major global political issue the following year.</p> <p>In the past few years, academic institutions, governments, and other organizations have launched initiatives that address those issues, and more than 20 countries have strategies in place that guide AI development. But they miss a very important point, Videgaray says: AI’s interaction with geopolitics.</p> <p>MIT AIPW will have three guiding principles to help shape policy around geopolitics: democratic values, diversity and inclusion, and de-escalation.</p> <p>One of the most challenging and important issues MIT AIPW faces is whether AI “can be a threat to democracy,” Videgaray says. In that way, the project will explore policies that help advance AI technologies, while upholding the values of liberal democracy.</p> <p>“We see some countries starting to adopt AI technologies not for the improvement of the quality of life, but for social control,” he says. “This technology can be extremely powerful, but we are already seeing how it can also be used to … influence people and have an effect on democracy. In countries where institutions are not as strong, there can be an erosion of democracy.”</p> <p>A policy challenge in that regard is how to deal with private data restrictions in different countries. 
If some countries don’t put any meaningful restrictions on data usage, it could potentially give them a competitive edge. “If people start thinking about geopolitical competition as more important than privacy, biases, or algorithmic transparency, and the concern is to win at all costs, then the societal impact of AI around the world could be quite worrisome,” Videgaray says.</p> <p>In the same vein, MIT AIPW will focus on de-escalation of potential conflict, by promoting an analytical, practical, and realistic collaborative approach to developing and using AI technologies. While the media have dubbed the worldwide rise of AI a type of “arms race,” Videgaray says that type of thinking is potentially hazardous to society. “That reflects a sentiment that we’re moving again into an adversarial world, and technology will be a huge part of it,” he says. “That will have negative effects on how technology is developed and used.”</p> <p>For inclusion and diversity, the project will make AI’s ethical impact “a truly global discussion,” Videgaray says. That means promoting awareness and participation from countries around the world, including those that may be less developed and more vulnerable. Another challenge is deciding not only what policies should be implemented, but also where those policies might be best implemented. That could mean at the state level or national level in the United States, in different European countries, or with the UN.</p> <p>“We want to approach this in a truly inclusive way, which is not just about countries leading development of technology,” Videgaray says. “Every country will benefit and be negatively affected by AI, but many countries are not part of the discussion.”</p> <p><strong>Building connections</strong></p> <p>While MIT AIPW won’t be drafting international agreements, Videgaray says another aim of the project is to explore different options and elements of potential international agreements. 
He also hopes to reach out to decision makers in governments and businesses around the world to gather feedback on the project’s research.</p> <p>Part of Videgaray’s role includes building connections across MIT departments, labs, and centers to pull in researchers to focus on the issue. “For this to be successful, we need to integrate the thinking of people from different backgrounds and expertise,” he says.</p> <p>At MIT Sloan, Videgaray will teach classes alongside Simon Johnson, the Ronald A. Kurtz Professor of Entrepreneurship and a professor of global economics and management. His lectures will focus primarily on the issues explored by the MIT AIPW project.</p> <p>Next spring, MIT AIPW plans to host a conference at MIT to convene researchers from the Institute and around the world to discuss the project’s initial findings and other topics in AI.</p> Luis Videgaray PhD ’98, former foreign minister and finance minister of Mexico, is coming to MIT to spearhead an effort that aims to help shape global AI policies, focusing on how such rising technologies will affect people living in all corners of the world.Credit: Courtesy of Luis VidegarayArtificial intelligence, Machine learning, Classes and programs, Computer science and technology, Technology and society, Economics, Policy, Data, Latin America, Government, Industry, Community, Alumni/ae, International relations, Global, Diversity and inclusion, Political science, MIT Schwarzman College of Computing, Sloan School of Management, School of Engineering, School of Humanities Arts and Social Sciences Daron Acemoglu named Institute Professor Versatile economist awarded MIT’s highest faculty honor. 
Wed, 10 Jul 2019 10:00:00 -0400 Peter Dizikes | MIT News Office <p>Economist Daron Acemoglu, whose far-ranging research agenda has produced influential studies about government, innovation, labor, and globalization, has been named Institute Professor, MIT’s highest faculty honor.</p> <p>Acemoglu is one of two MIT professors earning that distinction in 2019. The other, political scientist Suzanne Berger, has <a href="">been named the inaugural John M. Deutch Institute Professor</a>.</p> <p>Acemoglu and Berger join a select group of people holding the <a href="">Institute Professor</a> title at MIT. There are now 12 Institute Professors, along with 11 Institute Professors Emeriti. The new appointees are the first faculty members to be named Institute Professors since 2015.</p> <p>“As an Institute Professor, Daron Acemoglu embodies the essence of MIT: boldness, rigor and real-world impact,” says MIT President L. Rafael Reif. “From the John Bates Clark Medal to his decades of pioneering contributions to the literature, Daron has built an exceptional record of academic accomplishment. And because he has focused his creativity on broad, deep questions around the practical fate of nations, communities and workers, his work will be essential to making a better world in our time.”</p> <p>In a letter sent to the MIT faculty today, MIT Provost Martin A. 
Schmidt and MIT Chair of the Faculty Susan Silbey noted that the honor recognizes “exceptional distinction by a combination of leadership, accomplishment, and service in the scholarly, educational, and general intellectual life of the Institute and wider community.” Schmidt and Silbey also cited Acemoglu’s “significant impacts in diverse fields of economics” and praised him as “one of the most dedicated teachers and mentors in his department.”</p> <p>Nominations for faculty to be promoted to the rank of Institute Professor may be made at any time, by any member of the faculty, and should be directed to MIT’s Chair of the Faculty.</p> <p>A highly productive scholar with a broad portfolio of research interests, Acemoglu has spent more than 25 years at MIT examining complicated, large-scale economic questions — and producing important answers.</p> <p>“I’m greatly honored,” he says. “I’ve spent all my career at MIT, and this is a recognition that makes me humbled and happy.”</p> <p>At different times in his career, Acemoglu has published significant research on topics ranging from labor economics to network effects within economies. However, his most prominent work in the public sphere examines the dynamics of political institutions, democracy, and economic growth.</p> <p>Working with colleagues, Acemoglu has built an extensive empirical case that the existence of government institutions granting significant rights for individuals has spurred greater economic activity over the last several hundred years. At the same time, he has also produced theoretical work modeling political changes in many countries.</p> <p>He has researched the relationship between institutions and economics most extensively with political scientist James Robinson at the University of Chicago, as well as with Simon Johnson of the MIT Sloan School of Management. 
However, he has published papers about political dynamics with many other scholars as well.</p> <p>Acemoglu has also been keenly interested in other issues during the course of his career. In labor economics, Acemoglu’s work has helped account for the wage gap between higher-skill and lower-skill workers; he has also shown why firms benefit from investing in improving employee skills, even if those workers might leave or require higher wages.</p> <p>In multiple papers over the last decade, Acemoglu has also examined the labor-market implications of automation, robotics, and AI. Using both theoretical and empirical approaches, Acemoglu has shown how these technologies can reduce employment and wages unless accompanied by other, counterbalancing innovations that increase labor productivity.</p> <p>In still another area of recent work, Acemoglu has shown how economic shocks within particular industrial sectors can produce cascading effects that propagate through an entire economy, work that has helped economists re-evaluate ideas about the aggregate performance of economies.</p> <p>Acemoglu credits the intellectual ethos at MIT and the environment created by his colleagues as beneficial to his own research.</p> <p>“MIT is a very down-to-earth, scientific, no-nonsense environment, and the economics department here has been very open-minded, in an age when economics is more relevant than ever but also in the midst of a deep transformation,” he says. “I think it’s great to have an institution, and colleagues, open to new ideas and new things.”</p> <p>Acemoglu has authored or co-authored over 120 (and still rapidly counting) peer-reviewed papers. His fifth book, “The Narrow Corridor,” co-authored with Robinson, will be published in September. It takes a global look at the development of, and pressures on, individual rights and liberties. 
He has advised over 60 PhD students at MIT and is known for investing considerable time reading the work of his colleagues.</p> <p>As a student, Acemoglu received his BA from the University of York, and his MSc and PhD from the London School of Economics, the latter in 1992. His first faculty appointment was at MIT in 1993, and he has been at the Institute ever since. He was promoted to full professor in 2000, and since 2010 has been the Elizabeth and James Killian Professor of Economics.</p> <p>Among Acemoglu’s honors, in 2005 he won the John Bates Clark Medal, awarded by the American Economic Association to the best economist under age 40. Acemoglu has also won the Nemmers Prize in Economics and the BBVA Foundation Frontiers of Knowledge Award, and has been elected to the National Academy of Sciences. This month, Acemoglu also received the Global Economy Prize 2019, from the Institute for the World Economy.</p> Daron AcemogluImage: Jared CharneyFaculty, Awards, honors and fellowships, Administration, School of Humanities Arts and Social Sciences, Economics, President L. Rafael Reif Pathways to a low-carbon China Study projects a key role for carbon capture and storage. Mon, 08 Jul 2019 15:50:01 -0400 Mark Dwortzan | Joint Program on the Science and Policy of Global Change <p>Fulfilling the ultimate goal of the 2015 Paris Agreement on climate change — keeping global warming well below 2 degrees Celsius, if not 1.5 C — will be impossible without dramatic action from the world’s largest emitter of greenhouse gases, China. Toward that end, China began in 2017 developing an emissions trading scheme (ETS), a national carbon dioxide market designed to enable the country to meet its initial Paris pledge with the greatest efficiency and at the lowest possible cost. 
China’s pledge, or nationally determined contribution (NDC), is to reduce its CO<sub>2</sub>&nbsp;intensity of gross domestic product (emissions produced per unit of economic activity)&nbsp;by 60 to 65 percent in 2030 relative to 2005, and to peak CO<sub>2</sub>&nbsp;emissions around 2030.</p> <p>When it’s rolled out, China’s carbon market will initially cover the electric power sector (which currently produces more than 3 billion tons of CO<sub>2</sub>) and likely set CO<sub>2</sub>&nbsp;emissions intensity targets (e.g., grams of CO<sub>2</sub> per kilowatt hour) to ensure that its short-term NDC is fulfilled. But to help the world achieve the long-term 2 C and 1.5 C Paris goals, China will need to continually decrease these targets over the course of the century.</p> <p>A new study of China’s long-term power generation mix under the nation’s ETS projects that until 2065, renewable energy sources will likely expand to meet these targets; after that, carbon capture and storage (CCS) could be deployed to meet the more stringent targets that follow. Led by researchers at the MIT Joint Program on the Science and Policy of Global Change, the <a href="">study</a> appears in the journal <em>Energy Economics.</em></p> <p>“This research provides insight into the level of carbon prices and mix of generation technologies needed for China to meet different CO<sub>2</sub> intensity targets for the electric power sector,” says <a href="">Jennifer Morris</a>, lead author of the study and a research scientist at the MIT Joint Program. 
“We find that coal CCS has the potential to play an important role in the second half of the century, as part of a portfolio that also includes renewables and possibly nuclear power.”</p> <p>To evaluate the impacts of multiple potential ETS pathways — different starting carbon prices and rates of increase — on the deployment of CCS technology, the researchers enhanced the MIT Economic Projection and Policy Analysis (<a href="">EPPA</a>) model to include the joint program’s latest assessments of the costs of low-carbon power generation technologies in China. Among the technologies included in the model are natural gas, nuclear, wind, solar, coal with CCS, and natural gas with CCS. Assuming that power generation prices are the same across the country for any given technology, the researchers identify different ETS pathways in which CCS could play a key role in lowering the emissions intensity of China’s power sector, particularly for targets consistent with achieving the long-term 2 C and 1.5 C Paris goals by 2100.</p> <p>The study projects a two-stage transition — first to renewables, and then to coal CCS. The transition from renewables to CCS is driven by two factors. First, at higher levels of penetration, renewables incur increasing costs related to accommodating the intermittency challenges posed by wind and solar. This paves the way for coal CCS. 
Second, as experience with building and operating CCS technology is gained, CCS costs decrease, allowing the technology to be rapidly deployed at scale after 2065 and replace renewables as the primary power generation technology.</p> <p>The study shows that carbon prices of $35-40 per ton of CO<sub>2</sub>&nbsp;make CCS technologies coupled with coal-based generation cost-competitive against other modes of generation, and that carbon prices higher than $100 per ton of CO<sub>2</sub>&nbsp;allow for a significant expansion of CCS.</p> <p>“Our study is at the aggregate level of the country,” says Sergey Paltsev, deputy director of the joint program. “We recognize that the cost of electricity varies greatly from province to province in China, and hope to include interactions between provinces in our future modeling to provide deeper understanding of regional differences. At the same time, our current results provide useful insights to decision-makers in designing more substantial emissions mitigation pathways.”</p> Coal-fired electric plant, Henan Province, China Photo: V.T. Polywoda/FlickrJoint Program on the Science and Policy of Global Change, MIT Energy Initiative, Climate change, Alternative energy, Energy, Environment, Economics, Greenhouse gases, Carbon dioxide, Research, Policy, Emissions, China, Technology and society Among India’s working poor, sobriety may boost savings Economist’s study of rickshaw drivers shows effects of alcohol consumption on financial decision-making. 
Fri, 21 Jun 2019 00:00:00 -0400 Peter Dizikes | MIT News Office <p>Trying to stay sober does not change the earnings of some workers — but it does increase the amount of money they save, according to an MIT economist’s field experiment about low-income workers in India.</p> <p>The experiment, involving a large group of rickshaw drivers in the city of Chennai, on India’s eastern coast, produced a number of intriguing results and paints a detailed picture of the effects of alcohol consumption on low-income workers in the developing world.</p> <p>In the study, workers who were offered monetary incentives for staying sober decreased their daytime drinking. While this did not significantly affect their earnings, their savings shot up by more than 50 percent, indicating an overall change of spending habits.</p> <p>In addition to examining alcohol’s effects on earnings and savings, the study showed that many workers value sobriety quite highly: They would prefer to receive less income in exchange for periodic (but smaller) financial rewards for staying sober.</p> <p>“Workers are willing to give up money in order to provide themselves incentives for future sobriety,” says Frank Schilbach, who is an assistant professor of economics at MIT, a faculty affiliate with MIT’s Abdul Latif Jameel Poverty Action Lab, and the author of a new paper detailing the results of the experiment.</p> <p>“These choices reveal self-control problems with respect to alcohol consumption, to a greater extent than we found in other domains such as exercising or smoking,” Schilbach adds. “These workers would like to change their behavior in the future.”</p> <p>Schilbach’s paper, “Alcohol and Self-Control: A Field Experiment in India,” appears in the <em>American Economic Review</em>.</p> <p><strong>Preferring a path to sobriety</strong></p> <p>The impetus for the study came from interviews that Schilbach, a development economist, conducted with poor families in India as part of a different project. 
Struck by how often alcohol was mentioned as a problem during these interviews, he conducted a 2014 survey of 1,227 low-income workers in Chennai; about 76 percent of respondents reported that they had consumed alcohol the previous day.</p> <p>The experiment itself consisted of a three-week study in 2014 of 229 rickshaw drivers. (India still has some hand-pulled rickshaws, in which workers pull passengers in small wheeled vehicles.) Schilbach divided the workers into three groups. Some received monetary incentives for staying sober; others received unconditional payments, whether they stayed sober or not; and a third group could choose which approach to take. Participants were given breathalyzer tests to assess their consumption levels.</p> <p>The experiment produced a 13-percentage-point decrease in daytime drinking, although some participants self-reported that they shifted their drinking from daytime hours to nights, such that overall consumption did not change. And the earnings of drivers did not change significantly during the study period — something Schilbach was not necessarily anticipating.</p> <p>“Perhaps surprisingly, I didn’t find evidence of impacts on labor outcomes,” Schilbach says, although he does not rule out such an effect in future studies with larger samples or bigger reductions in drinking.</p> <p>The increase in savings, however, was significant, especially since it outpaced any decline in spending on alcohol. This suggests that at least some workers found sobriety led to more cautious spending habits.</p> <p>“The effect of increased sobriety on savings goes beyond any mechanical effect of having more money available,” Schilbach says.</p> <p>Moreover, the study revealed significant demand for programs that could help workers stay sober. 
Over one-third of the study’s participants actually preferred to be given smaller incentive payments for staying sober, rather than have larger payments that were not linked to sobriety.</p> <p>The average rickshaw worker in the study earned from 300 to 500 rupees per day, or about $5 to $8 in U.S. dollars; those opting into the sobriety-incentive system were willing to forgo about 30 rupees per day.</p> <p>“The amounts that people are willing to pay are fairly large,” Schilbach notes. “They’re willing to forego about 10 percent of their daily income.”</p> <p>As Schilbach notes, the result suggests that heavy drinkers themselves might also prefer higher taxation of alcohol, since it could help them reduce their drinking in the future.</p> <p><strong>Drinking to do the job</strong></p> <p>Overall, Schilbach says, the study helps us better comprehend the full relationship between drinking and work among some blue-collar laborers. For rickshaw drivers, he observes, drinking may help them as they perform their jobs.</p> <p>“The nature of work is such that maybe alcohol helps some people in some ways,” Schilbach says. “The two primary reasons why people tell you they drink is, one, physical pain, and two, they’ve been drinking for years and now it’s hard to stop. Rickshaw driving is a very tough job. This is very hard work, it’s a painful job to do. That’s a more general issue in low-income populations.”</p> <p>The study is part of a broader research agenda Schilbach is pursuing that aims to better understand the psychological lives of poor people around the world and issues they grapple with, including alcohol consumption, sleep deprivation, and depression. 
These and related issues, he observes, have not been extensively studied, even relative to other elements in the lives of the working poor.</p> <p>For his part, Schilbach readily states that the experiment is just a first step toward quantifying the study of these complicated social issues.</p> <p>“You can think of this work as a start to try to analyze alcohol use and abuse in developing countries, which is a very understudied topic in general,” Schilbach says. “I think there is much more to be studied, and learned, and the hope is that leads to improved policies.”</p> <p>The study was supported by funding from the Weiss Family Fund for Research in Development Economics, the Lab for Economic Applications and Policy, the Warburg Funds, the Inequality and Social Policy Program, the Pershing Square Venture Fund for research on the Foundations of Human Behavior, and an anonymous donor.</p> Rickshaw drivers at work in India. A study by MIT economist Frank Schilbach explores the effects of sobriety programs among rickshaw drivers in the country.Stock ImageEconomics, Social sciences, India, Labor, Work and family life, Abdul Latif Jameel Poverty Action Lab (J-PAL), Behavior, Poverty, Developing countries, Behavioral economics, School of Humanities Arts and Social Sciences QS ranks MIT the world’s No. 1 university for 2019-20 Ranked at the top for the eighth straight year, the Institute also places first in 11 of 48 disciplines. Tue, 18 Jun 2019 20:01:00 -0400 MIT News Office <p>MIT has again been named the world’s top university by the QS World University Rankings, which were announced today. This is the eighth year in a row MIT has received this distinction.</p> <p>The full 2019-20 rankings — published by Quacquarelli Symonds, an organization specializing in education and study abroad — can be found at <a href=""></a>. 
The QS rankings were based on academic reputation, employer reputation, citations per faculty, student-to-faculty ratio, proportion of international faculty, and proportion of international students. MIT earned a perfect overall score of 100.</p> <p>MIT was also ranked the world’s top university in <a href="">11 of 48 disciplines ranked by QS</a>, as announced in February of this year.</p> <p>MIT received a No. 1 ranking in the following QS subject areas: Chemistry; Computer Science and Information Systems; Chemical Engineering; Civil and Structural Engineering; Electrical and Electronic Engineering; Mechanical, Aeronautical and Manufacturing Engineering; Linguistics; Materials Science; Mathematics; Physics and Astronomy; and Statistics and Operational Research.</p> <p>MIT also placed second in six subject areas: Accounting and Finance; Architecture/Built Environment; Biological Sciences; Earth and Marine Sciences; Economics and Econometrics; and Environmental Sciences.</p> Image: Christopher HartingRankings, Architecture, Chemical engineering, Chemistry, Civil and environmental engineering, Electrical Engineering & Computer Science (eecs), Economics, Linguistics, Materials Science and Engineering, DMSE, Mechanical engineering, Aeronautical and astronautical engineering, Physics, Business and management, Accounting, Finance, Arts, Design, Mathematics, EAPS, School of Architecture and Planning, School of Humanities Arts and Social Sciences, School of Science, School of Engineering, Sloan School of Management Engineers set the standards MIT business historian’s new book chronicles the emergence of global standardization in technology. Wed, 12 Jun 2019 00:00:00 -0400 Peter Dizikes | MIT News Office <p>It might not seem consequential now, but in 1863, <em>Scientific American</em> weighed in on a pressing technological issue: the standardization of screw threads in U.S. machine shops. 
Given standard-size threads — the ridges running around screws and bolts — screws missing from machinery could be replaced with hardware from any producer. But without a standard, fixing industrial equipment would be harder or even impossible.</p> <p>Moreover, Great Britain had begun standardizing the size of screw threads, so why couldn’t the U.S.? After energetic campaigning by a mechanical engineer named William Sellers, both the U.S. Navy and the Pennsylvania Railroad got on board with the idea, greatly helping standardization take hold.</p> <p>Why did it matter? The latter half of the 1800s was an unprecedented time of industrial expansion. But the products and tools of the time were not necessarily uniform. Making them compatible served as an accelerant for industrialization. The standardization of screw threads was a signature moment in this process — along with new standards for steam boilers (which had a nasty habit of exploding) and for the steel rails used in train tracks.</p> <p>Moreover, what goes for 19th-century hardware goes for hundreds of things used in daily life today. From software languages to batteries, transmission lines to power plants, cement, and more, standardization still helps fuel economic growth.</p> <p>“Everything around us is full of standards,” says JoAnne Yates, the Sloan Distinguished Professor of Management at MIT. “None of us could function without standards.”</p> <p>But how did this all come about? One might expect government treaties to be essential for global standards to exist. But time and again, Yates notes, industrial standards are voluntary and have the same source: engineers. 
Or, more precisely, nongovernmental standard-setting bodies dominated by engineers, which work to make technology uniform across borders.</p> <p>“On one end of a continuum is government regulation, and on the other are market forces, and in between is an invisible infrastructure of organizations that helps us arrive at voluntary standards without which we couldn’t operate,” Yates says.</p> <p>Now Yates is the co-author of a new history that makes the role of engineers in setting standards more visible than ever. The book, “Engineering Rules: Global Standard Setting since 1880,” is being published this week by Johns Hopkins University Press. It is co-authored by Yates, who teaches in the MIT Sloan School of Management, and Craig N. Murphy, who is the Betty Freyhof Johnson ’44 Professor of International Relations at Wellesley College.</p> <p><strong>Joint research project</strong></p> <p>As it happens, Murphy is also Yates’ husband — and, for the first time, they have collaborated on a research project.</p> <p>“He’s a political scientist and I’m a business historian, but we had said throughout our careers, ‘Some day we should write a book together,’” Yates says. When it crossed their radar as a topic, the evolution of standards “immediately appealed to both of us,” she adds. “From Craig’s point of view, he studies global governance, which also includes nongovernmental institutions like this. I saw it as important because of the way firms play a role in it.”</p> <p>As Yates and Murphy see it, there have been three distinct historical “waves” of technological standardization. The first, the late 19th- and early 20th-century industrial phase, was spurred by the professionalization of engineering itself. Those engineers were trying to impose order on a world far less organized than ours: Although the U.S. Constitution gives Congress the power to set standards, a U.S. 
National Bureau of Standards was not created until 1901, when there were still 25 different basic units of length — such as “rods” — being used in the country.</p> <p>Much of this industrial standardization occurred country by country. But by the early 20th century, engineers ramped up their efforts to make standards international — and some, like the British engineer Charles le Maistre, a key figure in the book, were very aspirational about global standards.</p> <p>“Technology evangelists, like le Maistre, spread the word about the importance of standardizing and how technical standards should transcend politics and transcend national boundaries,” Yates says, adding that many had a “social movement-like fervor, feeling that they were contributing to the common good. They even thought it would create world peace.”</p> <p>It didn’t. Still, the momentum for standards created by le Maistre carried into the post-World War II era, the second wave detailed in the book. This new phase, Yates notes, is exemplified by the creation of the standardized shipping container, which made worldwide commerce vastly easier in terms of logistics and efficiency.</p> <p>“This second wave was all about integrating the global market,” Yates says.&nbsp;</p> <p>The third and most recent wave of standardization, as Yates and Murphy see it, is centered on information technology — where engineers have once again toiled, often with a sense of greater purpose, to develop global standards.</p> <p>To some degree this is an MIT story; Tim Berners-Lee, inventor of the World Wide Web, moved to MIT to establish a global standards consortium for the web, W3C, which was founded in 1994, with the Institute’s backing. 
More broadly, Yates and Murphy note, the era is marked by efforts to speed up the process of standard-setting, “to respond to a more rapid pace of technological change” in the world.</p> <p><strong>Setting a historical standard</strong></p> <p>Intriguingly, as Yates and Murphy document, many efforts to standardize technologies required firms and business leaders to put aside their short-term interests for a longer-term good — whether for a business, an industry, or society generally.</p> <p>“You can’t explain the standards world entirely by economics,” Yates says. “And you can’t explain the standards world entirely by power.”</p> <p>Other scholars regard the book as a significant contribution to the history of business and globalization. Yates and Murphy “demonstrate the crucial impact of private and informal standard setting on our daily lives,” according to Thomas G. Weiss, a professor of international relations and global governance at the Graduate Center of the City University of New York. Weiss calls the book “essential reading for anyone wishing to understand the major changes in the global economy.”</p> <p>For her part, Yates says she hopes readers will, among other things, reflect on the idealism and energy of the engineers who regarded international standards as a higher cause.</p> <p>“It is a story about engineers thinking they could contribute something good for the world, and then putting the necessary organizations into place,” Yates notes. “Standardization didn’t create world peace, but it has been good for the world.”</p> JoAnne Yates and her new book “Engineering Rules: Global Standard Setting Since 1880.” Image: Ed CollierSloan School of Management, Business, History, Economics, Innovation and Entrepreneurship (I&E), Information Technology, Books and authors, Technology and society, Faculty