MIT News - Aeronautical and astronautical engineering MIT News is dedicated to communicating to the media and the public the news and achievements of the students, faculty, staff and the greater MIT community. Fri, 06 Mar 2020 13:30:01 -0500 School of Engineering fourth quarter 2019 awards Faculty members recognized for excellence via a diverse array of honors, grants, and prizes over the last quarter. Fri, 06 Mar 2020 13:30:01 -0500 School of Engineering <p>Members of the MIT engineering faculty receive many awards in recognition of their scholarship, service, and overall excellence. Every quarter, the School of Engineering publicly recognizes their achievements by highlighting the honors, prizes, and medals won by faculty working in our academic departments, labs, and centers.</p> <p>Hal Abelson, of the Department of Electrical Engineering and Computer Science,&nbsp;received an <a href="">honorary doctorate in education from the Education University of Hong Kong</a>&nbsp;on Nov. 22, 2019.</p> <p>Jesús del Alamo, of the Department of Electrical Engineering and Computer Science, <a href="">won the University Researcher Award</a> from the Semiconductor Industry Association and the Semiconductor Research Corporation on Nov. 7, 2019.</p> <p>Mohammad Alizadeh, of the Department of Electrical Engineering and Computer Science,&nbsp;won the&nbsp;<a href="">2019 VMware Systems Research Award</a>&nbsp;on Dec. 18, 2019.</p> <p>Hari Balakrishnan, of the Department of Electrical Engineering and Computer Science,&nbsp;was named a&nbsp;<a href="">2020 fellow of the Institute of Electrical and Electronics Engineers</a> (IEEE)&nbsp;on Dec. 3, 2019.</p> <p>Irmgard Bischofberger, of the Department of Mechanical Engineering, won the&nbsp;<a href="">2019 APS/DFD Milton van Dyke Award</a>&nbsp;on Dec.
4, 2019.</p> <p>Adam Chlipala, of the Department of Electrical Engineering and Computer Science,&nbsp;was named a distinguished member of the Association for Computing Machinery on Dec. 20, 2019.</p> <p>William Freeman, of the Department of Electrical Engineering and Computer Science, <a href="">won the Distinguished Researcher Award</a> from the IEEE Computer Society's Technical Committee on Pattern Analysis and Machine Intelligence on Oct. 30, 2019.</p> <p>Shafi Goldwasser, of the Department of Electrical Engineering and Computer Science,&nbsp;received an <a href="">honorary doctorate of science from Oxford University</a> on June 26, 2019, and she received an <a href="">honorary doctorate in mathematics from the University of Waterloo</a>&nbsp;on June 13, 2019.</p> <p>Wesley L. Harris, of the Department of Aeronautics and Astronautics, was named a&nbsp;<a href="">2019 AAAS Fellow</a>&nbsp;on Nov. 26, 2019.</p> <p>Jonathan How, of the Department of Aeronautics and Astronautics, won the&nbsp;<a href="">2020 AIAA Intelligent Systems Award</a>&nbsp;on Dec. 5, 2019.</p> <p>Roger Kamm, of the Department of Mechanical Engineering,&nbsp;won the&nbsp;<a href="">Shu Chien Achievement Award</a>&nbsp;on Jan. 2.</p> <p>David Karger, of the Department of Electrical Engineering and Computer Science, was <a href="">inducted into the American Academy of Arts and Sciences</a>&nbsp;on Nov. 12, 2019.</p> <p>Heather Lechtman, of the Department of Materials Science and Engineering, <a href="">won the Pomerance Award for Scientific Contributions to Archaeology</a>&nbsp;on Jan. 4.</p> <p>Charles Leiserson, of the Department of Electrical Engineering and Computer Science, <a href="">won the Test of Time Award for 1999</a> from IEEE Symposium on the Foundations of Computer Science on Nov. 9, 2019.</p> <p>Nancy Leveson, of the Department of Aeronautics and Astronautics, won the&nbsp;<a href="">2020 IEEE Medal for Environmental and Safety Technologies</a> on Dec. 
18, 2019.</p> <p>Barbara Liskov, Institute Professor Emerita of the Department of Electrical Engineering and Computer Science, received an <a href="">honorary doctorate in mathematics from the University of Waterloo</a>&nbsp;on June 13, 2019.</p> <p>Leonid Mirny, of the Institute for Medical Engineering and Science,&nbsp;was selected for the&nbsp;<a href="">Chaires Blaise Pascal 2019</a>&nbsp;on Oct. 30, 2019.</p> <p>Dava Newman, of the Department of Aeronautics and Astronautics, was <a href="">elected to the Aerospace Corporation’s Board of Trustees</a>&nbsp;on Dec. 23, 2019.</p> <p>Wim van Rees, of the Department of Mechanical Engineering,&nbsp;won the&nbsp;<a href="">2019 APS/DFD Milton van Dyke Award</a>&nbsp;on Dec. 4, 2019.</p> <p>Ellen Roche, of the Department of Mechanical Engineering,&nbsp;was named <a href="">associate scientific advisor of <em>Science Translational Medicine</em></a>&nbsp;on Jan. 17.</p> <p>Kripa Varanasi, of the Department of Mechanical Engineering, won the&nbsp;<a href="">2019 APS/DFD Milton van Dyke Award</a>&nbsp;on Dec. 4, 2019.</p> <p>Alan Willsky (post-tenure), of the Department of Electrical Engineering and Computer Science,&nbsp;won the&nbsp;<a href="">IEEE Jack S. Kilby Signal Processing Medal</a>&nbsp;on May 17, 2019.</p> <p>Maria Yang, Sang-Gook Kim, and Caitlin Mueller, of the Department of Mechanical Engineering, won the&nbsp;<a href="">National Science Foundation LEAP HI Award</a>&nbsp;on Dec. 4, 2019.</p> <p>Xuanhe Zhao, of the Department of Mechanical Engineering,&nbsp;won the&nbsp;<a href="">Thomas J.R. Hughes Young Investigator Award</a>&nbsp;on Jan. 
2.</p> Members of the MIT engineering faculty receive many awards in recognition of their scholarship, service, and overall excellence.Photo: Lillie Paquette/School of EngineeringSchool of Engineering, Mechanical engineering, Awards, honors and fellowships, Faculty, Electrical Engineering & Computer Science (eecs), DMSE, Aeronautical and astronautical engineering Showing robots how to do your chores By observing humans, robots learn to perform complex tasks, such as setting a table. Thu, 05 Mar 2020 23:59:59 -0500 Rob Matheson | MIT News Office <p>Training interactive robots may one day be an easy job for everyone, even those without programming expertise. Roboticists are developing automated robots that can learn new tasks solely by observing humans. At home, you might someday show a domestic robot how to do routine chores. In the workplace, you could train robots like new employees, showing them how to perform many duties.</p> <p>Making progress on that vision, MIT researchers have designed a system that lets these types of robots learn complicated tasks that would otherwise stymie them with too many confusing rules. One such task is setting a dinner table under certain conditions. &nbsp;</p> <p>At its core, the researchers’ “Planning with Uncertain Specifications” (PUnS) system gives robots the humanlike planning ability to simultaneously weigh many ambiguous —&nbsp;and potentially contradictory —&nbsp;requirements to reach an end goal. In doing so, the system always chooses the most likely action to take, based on a “belief” about some probable specifications for the task it is supposed to perform.</p> <p>In their work, the researchers compiled a dataset with information about how eight objects — a mug, glass, spoon, fork, knife, dinner plate, small plate, and bowl — could be placed on a table in various configurations. A robotic arm first observed randomly selected human demonstrations of setting the table with the objects. 
Then, the researchers tasked the arm with automatically setting a table in a specific configuration, in real-world experiments and in simulation, based on what it had seen.</p> <p>To succeed, the robot had to weigh many possible placement orderings, even when items were purposely removed, stacked, or hidden. Normally, all of that would confuse robots too much. But the researchers’ robot made no mistakes over several real-world experiments, and only a handful of mistakes over tens of thousands of simulated test runs.</p> <p>“The vision is to put programming in the hands of domain experts, who can program robots through intuitive ways, rather than describing orders to an engineer to add to their code,” says first author Ankit Shah, a graduate student in the Department of Aeronautics and Astronautics (AeroAstro) and the Interactive Robotics Group, who emphasizes that their work is just one step in fulfilling that vision. “That way, robots won’t have to perform preprogrammed tasks anymore. Factory workers can teach a robot to do multiple complex assembly tasks. Domestic robots can learn how to stack cabinets, load the dishwasher, or set the table from people at home.”</p> <p>Joining Shah on the paper are AeroAstro and Interactive Robotics Group graduate student Shen Li and Interactive Robotics Group leader Julie Shah, an associate professor in AeroAstro and the Computer Science and Artificial Intelligence Laboratory.</p> <p><strong>Bots hedging bets</strong></p> <p>Robots are fine planners in tasks with clear “specifications,” which help describe the task the robot needs to fulfill, considering its actions, environment, and end goal. Learning to set a table by observing demonstrations is full of uncertain specifications. Items must be placed in certain spots, depending on the menu and where guests are seated, and in certain orders, depending on an item’s immediate availability or social conventions.
Present approaches to planning are not capable of dealing with such uncertain specifications.</p> <p>A popular approach to planning is “reinforcement learning,” a trial-and-error machine-learning technique that rewards and penalizes a robot for its actions as it works to complete a task. But for tasks with uncertain specifications, it’s difficult to define clear rewards and penalties. In short, robots never fully learn right from wrong.</p> <p>The researchers’ system, called PUnS (for Planning with Uncertain Specifications), enables a robot to hold a “belief” over a range of possible specifications. The belief itself can then be used to dish out rewards and penalties. “The robot is essentially hedging its bets in terms of what’s intended in a task, and takes actions that satisfy its belief, instead of us giving it a clear specification,” Ankit Shah says.</p> <p>The system is built on “linear temporal logic” (LTL), an expressive language that enables robotic reasoning about current and future outcomes. The researchers defined templates in LTL that model various time-based conditions, such as what must happen now, must eventually happen, and must happen until something else occurs. The robot’s observations of 30 human demonstrations for setting the table yielded a probability distribution over 25 different LTL formulas. Each formula encoded a slightly different preference — or specification — for setting the table. That probability distribution becomes its belief.</p> <p>“Each formula encodes something different, but when the robot considers various combinations of all the templates, and tries to satisfy everything together, it ends up doing the right thing eventually,” Ankit Shah says.</p> <p><strong>Following criteria</strong></p> <p>The researchers also developed several criteria that guide the robot toward satisfying the entire belief over those candidate formulas.
One, for instance, satisfies the most likely formula, which discards everything else apart from the template with the highest probability. Others satisfy the largest number of unique formulas, without considering their overall probability, or they satisfy several formulas that represent the highest total probability. Another simply minimizes error, so the system ignores formulas with a high probability of failure.</p> <p>Designers can choose any one of the four criteria to preset before training and testing. Each has its own tradeoff between flexibility and risk aversion. The choice of criteria depends entirely on the task. In safety-critical situations, for instance, a designer may choose to limit the possibility of failure. But where the consequences of failure are not as severe, designers can choose to give robots greater flexibility to try different approaches.</p> <p>With the criteria in place, the researchers developed an algorithm to convert the robot’s belief — the probability distribution pointing to the desired formula — into an equivalent reinforcement learning problem. This model will ping the robot with a reward or penalty for an action it takes, based on the specification it’s decided to follow.</p> <p>In simulations asking the robot to set the table in different configurations, it only made six mistakes out of 20,000 tries. In real-world demonstrations, it showed behavior similar to how a human would perform the task. If an item such as a fork wasn’t initially visible, for instance, the robot would finish setting the rest of the table without it. Then, when the fork was revealed, it would set the fork in the proper place. “That’s where flexibility is very important,” Ankit Shah says. “Otherwise it would get stuck when it expects to place a fork and not finish the rest of the table setup.”</p> <p>Next, the researchers hope to modify the system to help robots change their behavior based on verbal instructions, corrections, or a user’s assessment of the robot’s performance.
“Say a person demonstrates to a robot how to set a table at only one spot. The person may say, ‘do the same thing for all other spots,’ or, ‘place the knife before the fork here instead,’” Ankit Shah says. “We want to develop methods for the system to naturally adapt to handle those verbal commands, without needing additional demonstrations.”&nbsp;&nbsp;</p> Roboticists are developing automated robots that can learn new tasks solely by observing humans. At home, you might someday show a domestic robot how to do routine chores.Image: Christine Daniloff, MITResearch, Computer science and technology, Algorithms, Artificial intelligence, Machine learning, Robots, Robotics, Assistive technology, Aeronautical and astronautical engineering, Computer Science and Artificial Intelligence Laboratory (CSAIL), School of Engineering QS World University Rankings rates MIT No. 1 in 12 subjects for 2020 Institute ranks second in five subject areas. Tue, 03 Mar 2020 19:01:01 -0500 MIT News Office <p>MIT has been honored with 12 No. 1 subject rankings in the QS World University Rankings for 2020.</p> <p>The Institute received a No. 1 ranking in the following QS subject areas: Architecture/Built Environment; Chemistry; Computer Science and Information Systems; Chemical Engineering; Civil and Structural Engineering; Electrical and Electronic Engineering; Mechanical, Aeronautical and Manufacturing Engineering; Linguistics; Materials Science; Mathematics; Physics and Astronomy; and Statistics and Operational Research.</p> <p>MIT also placed second in five subject areas: Accounting and Finance; Biological Sciences; Earth and Marine Sciences; Economics and Econometrics; and Environmental Sciences.</p> <p>Quacquarelli Symonds Limited subject rankings, published annually, are designed to help prospective students find the leading schools in their field of interest. 
Rankings are based on research quality and accomplishments, academic reputation, and graduate employment.</p> <p>MIT has been ranked as the No. 1 university in the world by QS World University Rankings for eight straight years.</p> Afternoon light streams into MIT’s Lobby 7.Image: Jake BelcherRankings, Computer science and technology, Linguistics, Chemical engineering, Civil and environmental engineering, Mechanical engineering, Chemistry, Materials science, Mathematics, Physics, Economics, EAPS, Business and management, Accounting, Finance, DMSE, School of Engineering, School of Science, School of Architecture and Planning, Sloan School of Management, School of Humanities Arts and Social Sciences, Electrical Engineering & Computer Science (eecs), Architecture, Biology, Aeronautical and astronautical engineering A new way to prepare graduate students to lead in tech A new graduate certificate offered through the Bernard M. Gordon-MIT Engineering Leadership Program will launch this fall. Wed, 26 Feb 2020 11:20:01 -0500 School of Engineering <p>Before coming to MIT, Benjamin Lienhard focused most of his energy exploring fragile quantum states, dwelling in the world of nanotechnology and filling in gaps in the research to help steer and stabilize new technologies. Now that he’s a fifth-year graduate student in electrical engineering and computer science, he’s still investigating tiny quantum bits, looking for novel ways to support enormous breakthroughs in quantum computing.</p> <p>But for all his advanced technical knowledge and forward-thinking momentum, Lienhard found himself suddenly in a tenuous state in 2017. 
Asked to coordinate a conference, he realized developing leadership skills was an aspect of his work that he’d overlooked through all those years investigating quantum states at exceptionally small scales.</p> <p>Not wanting to miss an opportunity, Lienhard accepted the conference role and other leadership roles like it, and each time he agreed to step in to lead, he arrived at the same uneasy conclusion. “I really noticed the only way to improve yourself and learn [leadership] is by actually experiencing it, executing it yourself and seeing how the people around you react to your leadership style,” Lienhard says. A background in leadership theory could have made those transitions smoother, helping him recognize new situations on the job and adjust at a faster pace.</p> <p>Since then, Lienhard has joined the <a href="">Graduate Student Advisory Group</a> (GradSAGE) in the School of Engineering, a group established by Anantha P. Chandrakasan, dean of the School of Engineering and the Vannevar Bush Professor of Electrical Engineering and Computer Science, to hear from students and bolster initiatives. Through GradSAGE, Lienhard is positioned to help other MIT students. On the GradSAGE Leadership Sub-Committee with engineering graduate students Vamsi Mangena, Laureen Meroueh, Lucio Milanese, Clinton Wang, and Elise Wilcox, he’s provided input that has helped pave the way for a new MIT offering this spring, designed to make those transitions from lab research into leadership roles less of a shock to the system for MIT graduate students.</p> <p>Becoming a leader is nearly inevitable for engineering students, says Milanese, a fourth-year nuclear science and engineering graduate student, even for those planning to remain in academia. “In most cases, MIT graduate students will be leading,” Milanese says. “If you become a professor, the first thing you do is set up your lab.
You hire a couple graduate students, you hire a couple postdocs, and you are already, early in your 30s, essentially a manager of a small research enterprise.”</p> <p>Meroueh, a fifth-year mechanical engineering graduate student and entrepreneur, puts it another way: “It’s not just our technical skills we need to make a change in the real world.” She became interested in thinking beyond the tech after co-founding a startup company called MetaStorage during her master’s program. She plans to launch a new startup after graduating, and advancing her leadership skills is part of that plan.</p> <p>Recognizing that many engineering graduate students lacked a leadership program catering to their future goals, the GradSAGE Leadership Sub-Committee approached the Bernard M. Gordon-MIT Engineering Leadership Program (GEL). This led to the creation of a new interim MIT Graduate Certificate in Technical Leadership, which will launch in a permanent form this fall. Earning the certificate requires a course called Leading Creative Teams, an additional 12 units of graduate leadership courses, and attendance at four workshops. It’s designed to deliver both leadership theory and practical experience to engineering students by providing technical leadership-focused courses alongside the hands-on workshops required to complete the certificate.</p> <p>For engineering students, the GEL courses cover how to conduct multi-stakeholder negotiations, influence others, and provide leadership in the age of artificial intelligence — with coursework all contextualized within tech companies. The program also offers custom paths for graduate students in any program to create a leadership certificate that suits different career goals, with GEL’s Leading Creative Teams as the only required course. It’s taught by David Niño, who has been piloting Leading Creative Teams for the past three years.
For the GradSAGE students enrolled, taking Niño’s course served as inspiration for building the certificate, and forms its essential core. To complete the additional units, students from any program can choose from dozens of graduate courses from across MIT to build their own certificate, including subjects in building successful careers and organizations; advanced leadership communications; and science, technology, and public policy. “We envision it as being for everybody,” Milanese says of the certificate in technical leadership.</p> <p>This spring, there are six workshops available, <a href="" target="_blank">scheduled at different dates and times</a> to accommodate a range of student schedules. Workshops will cover topics like how to deliver objectives in technical organizations, leadership paths in technical organizations, what to do during your first 90 days in a new professional role, and what happens when technical leaders fail to stand up to unrealistic or unethical pressures.</p> <p>“If you want to improve your leadership skills, you need to exercise them in practice,” Lienhard says, adding that the workshops are not simply extensions of these courses, but immersive experiences of their own.</p> <p>In addition to delivering educational value, another goal of the workshops is to build a community among graduate students interested in technical leadership. Meroueh says the workshops present an opportunity to meet students with different engineering backgrounds. “We wanted to create a sense of community,” she says. Their plan seems to be working (or perhaps it’s the free pizza). Earlier this month, Meroueh and Wilcox both attended the first workshop on technical leadership and finance, led by Olivier L. de Weck, professor of aeronautics and astronautics and engineering systems, and faculty co-director of GEL. 
The workshop drew twice as many attendees as the GradSAGE sub-committee had predicted.</p> <p>Wilcox, a fifth-year graduate student in medical engineering and medical physics, says she left de Weck’s workshop with a fresh perspective on approaching the job market, taking away actionable advice like how to check a company’s financial health before agreeing to come on board. She also learned how companies make decisions based on finances, a way of thinking she says will help her better pitch her ideas. Citing a need for female leadership in engineering, Meroueh adds that participating in leadership programs can help women navigate to the top in a male-dominated field.</p> <p>To earn the certificate, students must complete four out of six workshops, attendance of which can be spread out over different semesters. The workshops take two hours to complete, with registration required and food and drinks provided to attendees.</p> <p>Although half of the engineering graduate students the GradSAGE sub-committee surveyed indicated an interest in a leadership certificate like GEL’s new initiative, two-thirds of respondents were concerned they wouldn’t have time to hone leadership skills during their graduate degree program. Lienhard says for doctoral programs that require minors, the leadership certificate’s courses can be simultaneously used to meet that requirement, which provides the further benefit of acquiring leadership skills while working closely with an advisor.</p> <p>This spring, an Interim Certificate in Technical Leadership will be available through the Graduate Program in Engineering Leadership. Any eligible courses completed can be retroactively applied once the certificate debuts next fall.
For Lienhard, this bundling of tailored courses combined with practical workshops gives MIT graduate students a “less painful” and more productive adjustment period on the path to specific ambitions, so somebody who is gunning to be chief technology officer doesn’t waste time learning insights more appropriate for tomorrow’s next top CEO.</p> <p>Milanese says the first thing the GradSAGE subcommittee did when they met was land on their own definition of leadership, which serves as a simple summation of the wide array of ambitions being pursued by aspiring tech leaders at MIT. According to Milanese, GradSAGE hopes the new certificate instills in graduate students interested in developing leadership skills “the ability to work with others to create great things.”</p> GradSAGE Leadership Sub-Committee members (left to right): Vamsi Mangena, Elise Wilcox, Benjamin Lienhard, Lucio Milanese, Laureen Meroueh, and Clinton WangPhoto: Lillie Paquette/School of EngineeringElectrical engineering and computer science (EECS), Nuclear science and engineering, Mechanical engineering, Leadership, Aeronautical and astronautical engineering, Classes and programs, School of Engineering, GEL Program How to deflect an asteroid MIT engineers devise a decision map to identify the best mission type to deflect an incoming asteroid. Tue, 18 Feb 2020 23:59:59 -0500 Jennifer Chu | MIT News Office <p>On April 13, 2029, an icy chunk of space rock, wider than the Eiffel Tower is tall, will streak by Earth at 30 kilometers per second, grazing the planet’s sphere of geostationary satellites. 
It will be the closest approach by one of the largest asteroids crossing Earth’s orbit in the next decade.</p> <p>Observations of the asteroid, known as 99942 Apophis, for the Egyptian god of chaos, once suggested that its 2029 flyby would take it through a gravitational keyhole — a location in Earth’s gravity field that would tug the asteroid’s trajectory such that on its next flyby, in the year 2036, it would likely make a devastating impact.</p> <p>Thankfully, more recent observations have confirmed that the asteroid will sling by Earth without incident in both 2029 and 2036. Nevertheless, most scientists believe it is never too early to consider strategies for deflecting an asteroid if one were ever on a crash course with our home planet.</p> <p>Now MIT researchers have devised a framework for deciding which type of mission would be most successful in deflecting an incoming asteroid. Their decision method takes into account an asteroid’s mass and momentum, its proximity to a gravitational keyhole, and the amount of warning time that scientists have of an impending collision — all of which have degrees of uncertainty, which the researchers also factor in to identify the most successful mission for a given asteroid.</p> <p>The researchers applied their method to Apophis, and Bennu, another near-Earth asteroid which is the target of OSIRIS-REx, an operational NASA mission that plans to return a sample of Bennu’s surface material to Earth in 2023. REXIS, an instrument designed and built by students at MIT, is also part of this mission and its task is to characterize the abundance of chemical elements at the surface.</p> <p>In a paper appearing this month in the journal <em>Acta Astronautica</em>, the researchers use their decision map to lay out the type of mission that would likely have the most success in deflecting Apophis and Bennu, in various scenarios in which the asteroids may be headed toward a gravitational keyhole. 
They say the method could be used to design the optimal mission configuration and campaign to deflect a potentially hazardous near-Earth asteroid.</p> <p>“People have mostly considered strategies of last-minute deflection, when the asteroid has already passed through a keyhole and is heading toward a collision with Earth,” says Sung Wook Paek, lead author of the study and a former graduate student in MIT’s Department of Aeronautics and Astronautics. “I’m interested in preventing keyhole passage well before Earth impact. It’s like a preemptive strike, with less mess.”</p> <p>Paek’s co-authors at MIT are Olivier de Weck, Jeffrey Hoffman, Richard Binzel, and David Miller.</p> <p><strong>Deflecting a planet-killer</strong></p> <p>In 2007, NASA concluded in a report submitted to the U.S. Congress that in the event that an asteroid were headed toward Earth, the most effective way to deflect it would be to launch a nuclear bomb into space. The force of its detonation would blast the asteroid away, though the planet would then have to contend with any nuclear fallout. 
The use of nuclear weapons to mitigate asteroid impacts remains a controversial issue in the planetary defense community.</p> <p>The second-best option was to send up a “kinetic impactor” — a spacecraft, rocket, or other projectile that, if aimed in just the right direction, with adequate speed, should collide with the asteroid, transfer some fraction of its momentum, and veer it off course.</p> <p>“The basic physics principle is sort of like playing billiards,” Paek explains.</p> <p>For any kinetic impactor to be successful, however, de Weck, a professor of aeronautics and astronautics and engineering systems, says the properties of the asteroid, such as its mass, momentum, trajectory, and surface composition, must be known “as precisely as possible.” That means that, in designing a deflection mission, scientists and mission managers need to take uncertainty into account.</p> <p>“Does it matter if the probability of success of a mission is 99.9 percent or only 90 percent? When it comes to deflecting a potential planet-killer, you bet it does,” de Weck says. “Therefore we have to be smarter when we&nbsp;design missions as a function of the level of uncertainty. No one has looked at the problem this way before.”</p> <p><strong>Closing a keyhole</strong></p> <p>Paek and his colleagues developed a simulation code to identify the type of asteroid deflection mission that would have the best possibility of success, given an asteroid’s set of uncertain properties.</p> <p>The missions they considered include a basic kinetic impactor, in which a projectile is shot into space to nudge an asteroid off course.
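The billiards analogy can be made concrete with a back-of-the-envelope momentum balance. This sketch is not from the paper: the function name, the numbers, and the "momentum enhancement factor" beta are illustrative assumptions only.

```python
# Hypothetical back-of-the-envelope kinetic-impactor sketch.
# By conservation of momentum, an impactor of mass m striking an
# asteroid of mass M at relative speed v changes the asteroid's
# speed by roughly dv = beta * m * v / M, where beta >= 1 is a
# "momentum enhancement factor" accounting for ejecta thrown off
# the surface (beta = 1 corresponds to a perfectly inelastic hit).

def deflection_delta_v(m_impactor_kg: float, v_rel_m_s: float,
                       m_asteroid_kg: float, beta: float = 1.0) -> float:
    """Approximate change in the asteroid's speed, in m/s."""
    return beta * m_impactor_kg * v_rel_m_s / m_asteroid_kg

# Illustrative numbers only: a 500 kg projectile at 6 km/s against
# a 6e10 kg asteroid produces a nudge of tens of micrometers per
# second -- tiny, which is why a deflection must happen years
# before a keyhole passage to add up to a miss.
dv = deflection_delta_v(500, 6_000, 6e10)
print(f"{dv:.1e} m/s")  # 5.0e-05 m/s
```

A nudge this small only works because it is applied years in advance, letting a minuscule velocity change accumulate into a large displacement along the orbit.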
Other variations involved sending a scout to first measure the asteroid to hone the specs of a projectile that would be sent up later, or sending two scouts, one to measure the asteroid and the other to push the asteroid slightly off course before a larger projectile is subsequently launched to make the asteroid miss Earth with near certainty.</p> <p>The researchers fed into the simulation specific variables such as the asteroid’s mass, momentum, and trajectory, as well as the range of uncertainty in each of these variables. Most importantly, they factored in an asteroid’s proximity to a gravitational keyhole, as well as the amount of time scientists have before an asteroid passes through the keyhole.</p> <p>“A keyhole is like a door — once it’s open, the asteroid will impact Earth soon after, with high probability,” Paek says.</p> <p>The researchers tested their simulation on Apophis and Bennu, two of only a handful of asteroids for which the locations of their gravitational keyholes with respect to Earth are known. They simulated various distances between each asteroid and their respective keyhole, and also calculated for each distance a “safe harbor” region where an asteroid would have to be deflected so that it would avoid both an impact with Earth and passing through any other nearby keyhole.</p> <p>They then evaluated which of the three main mission types would be most successful at deflecting the asteroid into a safe harbor, depending on the amount of time scientists have to prepare.</p> <p>For instance, if Apophis will pass through a keyhole in five years or more, then there is enough time to send two scouts — one to measure the asteroid’s dimensions and the other to nudge it slightly off track as a test — before sending a main impactor. If keyhole passage occurs within two to five years, there may be time to send one scout to measure the asteroid and tune the parameters of a larger projectile before sending the impactor up to divert the asteroid. 
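The timing thresholds described for Apophis can be read as a simple decision rule. The sketch below is a hypothetical simplification: the function name and return strings are our own, the one-to-two-year regime is not spelled out in the article, and the researchers' actual decision map also weighs the asteroid's mass, momentum, keyhole proximity, and the uncertainty in each.

```python
# Hypothetical simplification of the mission "decision map" for a
# keyhole-bound asteroid, keyed only on warning time in years.

def choose_mission(years_to_keyhole: float) -> str:
    if years_to_keyhole >= 5:
        # Enough time to measure the asteroid and perform a test
        # nudge before launching the main impactor.
        return "two scouts, then main impactor"
    if years_to_keyhole >= 2:
        # Time for one scout to tune the projectile's parameters.
        return "one scout, then main impactor"
    if years_to_keyhole <= 1:
        # Even a main impactor may not reach the asteroid in time.
        return "likely too late"
    # The one-to-two-year regime is not covered in the article;
    # a lone impactor is the only option left to attempt.
    return "main impactor alone (marginal)"

print(choose_mission(6))  # two scouts, then main impactor
```

In the real framework each threshold would shift with the asteroid's properties and their uncertainties rather than being a fixed number of years.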
If Apophis passes through its keyhole within one Earth year or less, Paek says it may be too late.</p> <p>“Even a main impactor may not be able to reach the asteroid within this timeframe,” Paek says.</p> <p>Bennu is a similar case, although scientists know a bit more about its material composition, which means that it may not be necessary to send up investigatory scouts before launching a projectile.</p> <p>With the team’s new simulation tool, Paek plans to estimate the success of other deflection missions in the future.</p> <p>“Instead of changing the size of a projectile, we may be able to change the number of launches and send up multiple smaller spacecraft to collide with an asteroid, one by one. Or we could launch projectiles from the moon or use defunct satellites as kinetic impactors,” Paek says. “We’ve created a decision map which can help in prototyping a mission.”</p> <p>This research was supported, in part, by NASA, Draper, and the Samsung Foundation of Culture.</p> MIT researchers have devised a framework for deciding which type of mission would be most successful in deflecting an incoming asteroid, taking into account an asteroid’s mass and momentum, its proximity to a gravitational keyhole, and the amount of warning time that scientists have of an impending collision.Photo collage: Christine Daniloff, MITAeronautical and astronautical engineering, Asteroids, Astronomy, NASA, Research, Satellites, School of Engineering, Space, astronomy and planetary science Half of U.S. deaths related to air pollution are linked to out-of-state emissions Study tracks pollution from state to state in the 48 contiguous United States.
Wed, 12 Feb 2020 12:59:59 -0500 Jennifer Chu | MIT News Office <p>More than half of all air-quality-related early deaths in the United States are a result of emissions originating outside of the state in which those deaths occur, MIT researchers report today in the journal <em>Nature</em>.</p> <p>The study focuses on the years between 2005 and 2018 and tracks combustion emissions of various polluting compounds from a range of sectors, looking at every state in the contiguous United States, from season to season and year to year.</p> <p>In general, the researchers find that when air pollution is generated in one state, half of that pollution is lofted into the air and carried by winds across state boundaries, to affect the health of out-of-state residents and increase their risk of early death.</p> <p>Electric power generation is the greatest contributor to out-of-state pollution-related deaths, the findings suggest. In 2005, for example, deaths caused by sulfur dioxide emitted by power plant smokestacks occurred in another state in more than 75 percent of cases.</p> <p>Encouragingly, the researchers found that since 2005, early deaths associated with air pollution have gone down significantly. They documented a decrease of 30 percent in 2018 compared to 2005, equivalent to about 30,000 avoided early deaths, or people who did not die early as a result of pollution.
In addition, the fraction of deaths that occur due to emissions in other states is falling — from 53 percent in 2005 to 41 percent in 2018.</p> <p>Perhaps surprisingly, this reduction in cross-state pollution also appears to be related to electric power generation: In recent years, regulations such as those issued by the Environmental Protection Agency under the Clean Air Act, along with other changes, have helped to significantly curb emissions from this sector across the country.</p> <p>The researchers caution, however, that today, emissions from other sectors are increasingly contributing to harmful cross-state pollution.</p> <p>“Regulators in the U.S. have done a pretty good job of hitting the most important thing first, which is power generation, by reducing sulfur dioxide emissions drastically, and there’s been a huge improvement, as we see in the results,” says study leader Steven Barrett, an associate professor of aeronautics and astronautics at MIT. “Now it’s looking like other emissions sectors are becoming important. To make further progress, we should start focusing on road transportation and commercial and residential emissions.”</p> <p>Barrett’s coauthors on the paper are Sebastian Eastham, a research scientist at MIT; Irene Dedoussi, formerly an MIT graduate student and now an assistant professor at Delft University of Technology; and Erwan Monier, formerly an MIT research scientist and now an assistant professor at the University of California at Davis.
The research was a collaboration between MIT’s Laboratory for Aviation and the Environment and the MIT Joint Program on the Science and Policy of Global Change.</p> <p><strong>Death and the matrix</strong></p> <p>Scientists have long known that pollution observes no boundaries, one of the prime examples being acid rain.</p> <p>“It’s been known in Europe for over 30 years that power stations in England would create acid rain that would affect vegetation in Norway, but there’s not been a systematic way to capture how that translates to human health effects,” Barrett says.</p> <p>In the case of the United States, tracking how pollution from one state affects another state has historically been tricky and computationally difficult, Barrett says. For each of the 48 contiguous states, researchers would have to track emissions to and from the other 47 states.</p> <p>“But now there are modern computational tools that enable you to do these assessments in a much more efficient way,” Barrett says. “That wasn’t really possible before.”</p> <p>He and his colleagues developed such tools, drawing on fundamental work by Daven Henze at the University of Colorado at Boulder, to track how every state in the contiguous U.S. affects pollution and health outcomes in every other state. They looked at multiple species of pollutants, such as sulfur dioxide, ozone, and fine particulates, from various emissions sectors, including electric power generation, road transportation, marine, rail, and aviation, and commercial and residential sources, at intervals of every hour of the year.</p> <p>They first obtained emissions data from each of seven sectors for the years 2005, 2011, and 2018. They then used the GEOS-Chem atmospheric chemistry transport model to track where these emissions ended up, from season to season and year to year, based on wind patterns and a pollutant’s chemical reactions in the atmosphere.
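The state-to-state bookkeeping this pipeline produces can be pictured as a lookup keyed by source state, sector, pollutant, time, and receptor state. The sketch below is a toy table with invented numbers; the study derives such sensitivities from GEOS-Chem simulations and epidemiological models, not from a hand-built dictionary:

```python
# Toy source-receptor table: (source, sector, pollutant, month, receptor)
# -> premature deaths in the receptor state attributable to those
# emissions. All values below are invented for illustration.
impact = {
    ("AZ", "road_transport",   "NOx", "Jul", "TX"): 3.2,
    ("AZ", "road_transport",   "NOx", "Jul", "AZ"): 5.1,
    ("WY", "power_generation", "SO2", "Jan", "NY"): 7.8,
}

def exported_deaths(source_state: str) -> float:
    """Impacts of a state's emissions felt outside that state
    (the sense in which a state is a 'net exporter' of pollution)."""
    return sum(deaths for (src, _, _, _, rcv), deaths in impact.items()
               if src == source_state and rcv != source_state)

print(exported_deaths("AZ"))  # → 3.2 (the Texas share; in-state 5.1 excluded)
```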
Finally, they used an epidemiologically derived model to relate a population’s pollutant exposure to its risk of early death.</p> <p>“We have this multidimensional matrix that characterizes the impact of a state’s emissions of a given economic sector of a given pollutant at a given time, on any other state’s health outcomes,” Barrett says. “We can figure out, for example, how much NOx emissions from road transportation in Arizona in July affects human health in Texas, and we can do those calculations instantly.”</p> <p><strong>Importing pollution</strong></p> <p>The researchers also found that emissions traveling out of state could affect the health of residents beyond immediate, neighboring states.</p> <p>“It’s not necessarily just the adjacent state, but states over 1,000 miles away that can be affected,” Barrett says. “Different kinds of emissions have a different kind of range.”</p> <p>For example, electric power generation has the greatest range, as power plants can loft pollutants far into the atmosphere, allowing them to travel over long distances. In contrast, commercial and residential sectors generally emit pollutants that chemically do not last as long in the atmosphere. &nbsp;</p> <p>“The story is different for each pollutant,” Barrett says.</p> <p>In general, the researchers found that out-of-state air pollution was associated with more than half of all pollution-related early deaths in the U.S. from 2005 to 2018.</p> <p>In terms of the impact on individual states, the team found that many of the northern Midwest states such as Wyoming and North Dakota are “net exporters” of pollution-related health impacts, partly because the populations there are relatively low and the emissions these states generate are carried away by winds to other states. Those states that “import” health impacts tend to lie along the East Coast, in the path of the U.S.
winds that sweep eastward.</p> <p>New York in particular is what the researchers call “the biggest importer of air pollution deaths”; 60 percent of its air pollution-related early deaths are from out-of-state emissions.</p> <p>“There’s a big archive of data we’ve created from this project,” Barrett says. “We think there are a lot of things that policymakers can dig into, to chart a path to saving the most lives.”</p> <p>This research was supported, in part, by the U.S. Environmental Protection Agency, the MIT Martin Family Fellowship for Sustainability, the George and Marie Vergottis Fellowship at MIT, and the VoLo Foundation.</p> New MIT study finds more than half of all air-quality-related early deaths in the United States are a result of cross-state pollution, or emissions originating outside of the state in which those deaths occur.Image: Chelsea Turner, MITEmissions, Environment, Health, Policy, Pollution, Research, School of Engineering, Joint Program on the Science and Policy of Global Change, Aeronautical and astronautical engineering A college for the computing age With the initial organizational structure in place, the MIT Schwarzman College of Computing moves forward with implementation. Tue, 04 Feb 2020 12:30:01 -0500 Terri Park | MIT Schwarzman College of Computing <p>The mission of the MIT Stephen A.
Schwarzman College of Computing is to address the opportunities and challenges of the computing age — from hardware to software to algorithms to artificial intelligence (AI) — by transforming the capabilities of academia in three key areas: supporting the rapid evolution and growth of computer science and AI; facilitating collaborations between computing and other disciplines; and focusing on social and ethical responsibilities of computing through combining technological approaches and insights from social science and humanities, and through engagement beyond academia.</p> <p>Since starting his position in August 2019, Daniel Huttenlocher, the inaugural dean of the MIT Schwarzman College of Computing, has been working with many stakeholders in designing the initial organizational structure of the college. Beginning with the <a href="" target="_blank">College of Computing Task Force Working Group reports</a> and feedback from the MIT community, the structure has been developed through an iterative process of draft plans yielding a <a href="" target="_blank">26-page document</a> outlining the initial academic organization of the college that is designed to facilitate the college mission through improved coordination and evolution of existing computing programs at MIT, improved collaboration in computing across disciplines, and development of new cross-cutting activities and programs, notably in the social and ethical responsibilities of computing.</p> <p>“The MIT Schwarzman College of Computing is both bringing together existing MIT programs in computing and developing much-needed new cross-cutting educational and research programs,” says Huttenlocher. “For existing programs, the college helps facilitate coordination and manage the growth in areas such as computer science, artificial intelligence, data systems and society, and operations research, as well as helping strengthen interdisciplinary computing programs such as computational science and engineering. 
For new areas, the college is creating cross-cutting platforms for the study and practice of social and ethical responsibilities of computing, for multi-departmental computing education, and for incubating new interdisciplinary computing activities.”</p> <p>The following existing departments, institutes, labs, and centers are now part of the college:</p> <ul> <li>Department of Electrical Engineering and Computer Science (EECS), which has been <a href="" target="_self">reorganized</a> into three overlapping sub-units of electrical engineering (EE), computer science (CS), and artificial intelligence and decision-making (AI+D), and is jointly part of the MIT Schwarzman College of Computing and School of Engineering;</li> <li>Operations Research Center (ORC), which is jointly part of the MIT Schwarzman College of Computing and MIT Sloan School of Management;</li> <li>Institute for Data, Systems, and Society (IDSS), which will be increasing its focus on the societal aspects of its mission while also continuing to support statistics across MIT, and including the Technology and Policy Program (TPP) and Sociotechnical Systems Research Center (SSRC);</li> <li>Center for Computational Science and Engineering (CCSE), which is being renamed from the Center for Computational Engineering and broadening its focus in the sciences;</li> <li>Computer Science and Artificial Intelligence Laboratory (CSAIL);</li> <li>Laboratory for Information and Decision Systems (LIDS); and</li> <li>Quest for Intelligence.</li> </ul> <p>With the initial structure in place, Huttenlocher, the college leadership team, and the leaders of the academic units that are part of the college, in collaboration with departments in all five schools, are actively moving forward with curricular and programmatic development, including the launch of two new areas, the Common Ground for Computing Education and the Social and Ethical Responsibilities of Computing (SERC).
Still in the early planning stages, these programs are the aspects of the college that are designed to cut across lines and involve a number of departments throughout MIT. Other programs are expected to be introduced as the college continues to take shape.</p> <p>“The college is an Institute-wide entity, working with and across all five schools,” says Anantha Chandrakasan, dean of the School of Engineering and the Vannevar Bush Professor of Electrical Engineering and Computer Science, who was part of the task force steering committee. “Its continued growth and focus depend greatly on the input of our MIT community, a process which began over a year ago. I’m delighted that Dean Huttenlocher and the college leadership team have engaged the community for collaboration and discussion around the plans for the college.”</p> <p>With these organizational changes, students, faculty, and staff in these units are members of the college, and in some cases, jointly with a school, as will be those who are engaged in the new cross-cutting activities in SERC and Common Ground. “A question we get frequently,” says Huttenlocher, “is how to apply to the college. As is the case throughout MIT, undergraduate admissions are handled centrally, and graduate admissions are handled by each individual department or graduate program.”<strong> </strong></p> <p><strong>Advancing computing</strong></p> <p>Despite the unprecedented growth in computing, there remains substantial unmet demand for expertise. In academia, colleges and universities worldwide are faced with oversubscribed programs in computer science and the constant need to keep up with rapidly changing materials at both the graduate and undergraduate level.</p> <p>According to Huttenlocher, the computing fields are evolving at a pace today that is beyond the capabilities of current academic structures to handle. 
“As academics, we pride ourselves on being generators of new knowledge, but academic institutions themselves don’t change that quickly. The rise of AI is probably the biggest recent example of that, along with the fact that about 40 percent of MIT undergraduates are majoring in computer science, where we have 7 percent of the MIT faculty.”</p> <p>In order to help meet this demand, MIT is increasing its academic capacity in computing and AI with 50 new faculty positions — 25 will be core computing positions in CS, AI, and related areas, and 25 will be shared jointly with departments. Searches are now active to recruit core faculty in CS and AI+D, and for joint faculty with MIT Philosophy, the Department of Brain and Cognitive Sciences, and several interdisciplinary institutes.</p> <p>The new shared faculty searches will largely be conducted around the concept of “clusters” to build capacity at MIT in important computing areas that cut across disciplines, departments, and schools. Huttenlocher, the provost, and the five school deans will work to identify themes based on input from departments so that recruiting can be undertaken during the next academic year.</p> <p><strong>Cross-cutting collaborations in computing</strong></p> <p>Building on the history of strong faculty participation in interdepartmental labs, centers, and initiatives, the MIT Schwarzman College of Computing provides several forms of membership in the college based on cross-cutting research, teaching, or external engagement activities. 
While computing is affecting intellectual inquiry in almost every discipline, Huttenlocher is quick to stress that “it’s bi-directional.” He notes that existing collaborations across various schools and departments, such as MIT Digital Humanities, as well as opportunities for new such collaborations, are key to the college mission because in the same way that “computing is changing thinking in the disciplines; the disciplines are changing the way people do computing.”</p> <p>Under the leadership of Asu Ozdaglar, the deputy dean of academics and department head of EECS, the college is developing the Common Ground for Computing Education, an interdepartmental teaching collaborative that will facilitate the offering of computing classes and coordination of computing-related curricula across academic units.</p> <p>The objectives of this collaborative are to provide opportunities for faculty across departments to work together, including co-teaching classes, creating new undergraduate majors or minors such as in AI+D, as well as facilitating undergraduate blended degrees such as 6-14 (Computer Science, Economics, and Data Science), 6-9 (Computation and Cognition), 11-6 (Urban Science and Planning with Computer Science), 18-C (Mathematics with Computer Science), and others.</p> <p>“It is exciting to bring together different areas of computing with methodological and substantive commonalities as well as differences around one table,” says Ozdaglar. “MIT faculty want to collaborate in topics around computing, but they are increasingly overwhelmed with teaching assignments and other obligations. I think the college will enable the types of interactions that are needed to foster new ideas.”</p> <p>Thinking about the impact on the student experience, Ozdaglar expects that the college will help students better navigate the computing landscape at MIT by creating clearer paths. 
She also notes that many students have passions beyond computer science, but realize the need to be adept in computing techniques and methodologies in order to pursue other interests, whether it be political science, economics, or urban science. “The idea for the college is to educate students who are fluent in computation, but at the same time, creatively apply computing with the methods and questions of the domain they are mostly interested in.”</p> <p>For Deputy Dean of Research Daniela Rus, who is also the director of CSAIL and the Andrew and Erna Viterbi Professor in EECS, developing research programs “that bring together MIT faculty and students from different units to advance computing and to make the world better through computing” is a top priority. She points to the recent launch of the <a href="" target="_self">MIT Air Force AI Innovation Accelerator</a>, a collaboration between the MIT Schwarzman College of Computing and the U.S. Air Force focused on AI, as an example of the types of research projects the college can facilitate.</p> <p>“As humanity works to solve problems ranging from climate change to curing disease, removing inequality, ensuring sustainability, and eliminating poverty, computing opens the door to powerful new solutions,” says Rus. “And with the MIT Schwarzman College as our foundation, I believe MIT will be at the forefront of those solutions. 
Our scholars are laying theoretical foundations of computing and applying those foundations to big ideas in computing and across disciplines.”</p> <p><strong>Habits of mind and action</strong></p> <p>A critically important cross-cutting area is the Social and Ethical Responsibilities of Computing, which will facilitate the development of responsible “habits of mind and action” for those who create and deploy computing technologies, and the creation of technologies in the public interest.</p> <p>“The launch of the MIT Schwarzman College of Computing offers an extraordinary new opportunity for the MIT community to respond to today’s most consequential questions in ways that serve the common good,” says Melissa Nobles, professor of political science, the Kenan Sahin Dean of the MIT School of Humanities, Arts, and Social Sciences, and co-chair of the Task Force Working Group on Social Implications and Responsibilities of Computing.</p> <p>“As AI and other advanced technologies become ubiquitous in their influence and impact, touching nearly every aspect of life, we have increasingly seen the need to more consciously align powerful new technologies with core human values — integrating consideration of societal and ethical implications of new technologies into the earliest stages of their development. Asking, for example, of every new technology and tool: Who will benefit? What are the potential ecological and social costs? Will the new technology amplify or diminish human accomplishments in the realms of justice, democracy, and personal privacy?</p> <p>“As we shape the college, we are envisioning an MIT culture in which all of us are equipped and encouraged to think about such implications. In that endeavor, MIT’s humanistic disciplines will serve as deep resources for research, insight, and discernment. 
We also see an opportunity for advanced technologies to help solve political, economic, and social issues that trouble today’s world by integrating technology with a humanistic analysis of complex civilizational issues — among them climate change, the future of work, and poverty, issues that will yield only to collaborative problem-solving. It is not too much to say that human survival may rest on our ability to solve these problems via collective intelligence, designing approaches that call on the whole range of human knowledge.”</p> <p>Julie Shah, an associate professor in the Department of Aeronautics and Astronautics and head of the Interactive Robotics Group at CSAIL, who co-chaired the working group with Nobles and is now a member of the college leadership, adds that “traditional technologists aren’t trained to pause and envision the possible futures of how technology can and will be used. This means that we need to develop new ways of training our students and ourselves in forming new habits of mind and action so that we include these possible futures into our design.”</p> <p>The associate deans of Social and Ethical Responsibilities of Computing, Shah and David Kaiser, the Germeshausen Professor of the History of Science and professor of physics, are designing a systemic framework for SERC that will not only effect change in computing education and research at MIT, but one that will also inform policy and practice in government and industry. 
Activities that are currently in development include multi-disciplinary curricula embedded in traditional computing and AI courses across all levels of instruction, the commission and curation of a series of case studies that will be modular and available to all via MIT’s open access channels, active learning projects, cross-disciplinary monthly convenings, public forums, and more.&nbsp;</p> <p>“A lot of how we’ve been thinking about SERC components is building capacity with what we already have at the Institute as a very important first step. And that means how do we get people interacting in ways that can be a little bit different than what has been familiar, because I think there are a lot of shared goals among the MIT community, but the gears aren’t quite meshing yet. We want to further support collaborations that might cut across lines that otherwise might not have had much traffic between them,” notes Kaiser.</p> <p><strong>Just the beginning</strong></p> <p>While he’s excited by the progress made so far, Huttenlocher points out there will continue to be revisions made to the organizational structure of the college. “We are at the very beginning of the college, with a tremendous amount of excellence at MIT to build on, and with some clear needs and opportunities, but the landscape is changing rapidly and the college is very much a work in progress.”</p> <p>The college has other initiatives in the planning stages, such as the Center for Advanced Studies of Computing that will host fellows from inside and outside of MIT on semester- or year-long project-oriented programs in focused topic areas that could seed new research, scholarly, educational, or policy work. 
In addition, Huttenlocher is planning to launch a search for an assistant or associate dean of equity and inclusion, once the Institute Community and Equity Officer is in place, to focus on improving and creating programs and activities that will help broaden participation in computing classes and degree programs, increase the&nbsp;diversity&nbsp;of top faculty candidates in computing fields, and ensure that faculty search and graduate admissions processes have diverse slates of candidates and interviews.</p> <p>“The typical academic approach would be to wait until it’s clear what to do, but that would be a mistake. The way we’re going to learn is by trying and by being more flexible. That may be a more general attribute of the new era we’re living in,” he says. “We don’t know what it’s going to look like years from now, but it’s going to be pretty different, and MIT is going to be shaping it.”</p> <p>The MIT Schwarzman College of Computing will be hosting a community forum on Wednesday, Feb. 12 at 2 p.m. in Room 10-250.
Members from the MIT community are welcome to attend to learn more about the initial organizational structure of the college.</p> MIT Schwarzman College of Computing leadership team (left to right) David Kaiser, Daniela Rus, Dan Huttenlocher, Julie Shah, and Asu Ozdaglar Photo: Sarah BastilleMIT Schwarzman College of Computing, School of Engineering, Computer Science and Artificial Intelligence Laboratory (CSAIL), Laboratory for Information and Decision Systems (LIDS), Quest for Intelligence, Philosophy, Brain and cognitive sciences, Digital humanities, School of Humanities Arts and Social Sciences, Artificial intelligence, Operations research, Aeronautical and astronautical engineering, Electrical Engineering & Computer Science (eecs), IDSS, Ethics, Administration, Classes and programs At halfway point, SuperUROP scholars share their research results In a lively poster session, more than 100 undergraduates discuss their yearlong research projects on everything from machine learning to political geography. Wed, 29 Jan 2020 14:25:01 -0500 Kathryn O'Neill | Department of Electrical Engineering and Computer Science <p>MIT undergraduates are rolling up their sleeves to address major problems in the world, conducting research on topics ranging from nursing care to money laundering to the spread of misinformation about climate change — work highlighted at the most recent SuperUROP Showcase.</p> <p>The event, which took place on the Charles M. Vest Student Street in the Stata Center in December 2019, marked the halfway point in the Advanced Undergraduate Research Opportunities Program (better known as “SuperUROP”). The yearlong program gives MIT students firsthand experience in conducting research with close faculty mentorship. 
Many participants receive scholar titles recognizing the program’s industry sponsors, individual donors, and other contributors.</p> <p>This year, 102 students participated in SuperUROP, with many of their projects focused on applying computer science technologies, such as machine learning, to challenges in fields ranging from robotics to health care. Almost all presented posters of their work at the December showcase, explaining research to fellow students, faculty members, alumni, sponsors, and other guests.</p> <p>“Every year, this program gets more and more impressive,” says Anantha P. Chandrakasan, dean of the School of Engineering and Vannevar Bush Professor of Electrical Engineering and Computer Science. “What’s especially noteworthy is the incredible breadth of projects and how articulate students are in talking about their work. Their presentation skills seem pretty remarkable.”</p> <p>SuperUROP, administered by the Department of Electrical Engineering and Computer Science (EECS), includes a two-term course, 6.UAR (Undergraduate Advanced Research), designed to teach students research skills, including how to design an experiment and communicate results.</p> <p>“What’s different about SuperUROP [compared to other research opportunities offered to undergraduates] is the companion class that guides you through the necessary writing and speaking,” says Anis Ehsani, a senior majoring in EECS and mathematics, whose project centered on the geometry of drawing political districts. 
“If I want to pursue a research career, it’s nice to have those skills,” adds Ehsani, an MIT EECS/Nutanix SuperUROP scholar.</p> <p><strong>Beyond the lab and classroom</strong></p> <p>Participants present their work at showcases in the fall and spring, and they are expected to produce prototypes or publication-worthy results by the end of the year.</p> <p>“All these presentations help keep us on track with our projects,” says Weitung Chen, an EECS junior whose project focuses on automating excavation for mining applications. He explains that the inspiration for his SuperUROP work was a real-world problem he faced when trying to build a startup in automated food preparation. Scooping tofu, it turns out, is surprisingly difficult to automate. At the showcase, Chen — an MIT EECS/Angle SuperUROP scholar — explained that he is trying to create a simulation that can be used to train machines to scoop materials autonomously. “I feel really accomplished having this poster and presentation,” he said.</p> <p>Launched by EECS in 2012, SuperUROP has expanded across the Institute over the past several years.</p> <p>Adam Berinsky, the Mitsui Professor of Political Science, is working with SuperUROP students for the first time this year, an experience he’s enjoying. “What’s really cool is being able to give undergraduates firsthand experience in real research,” he says. He’s been able to tap students for the computer science skills he needs for his work, while providing them with a deep dive into the social sciences.</p> <p>Madeline Abrahams, an MIT/Tang Family FinTech SuperUROP scholar, says she especially appreciates the program’s flexibility: “I could explore my interdisciplinary interests,” she says.
A computer science and engineering major who is also passionate about political science, Abrahams is working with Berinsky to investigate the spread of misinformation related to climate change via algorithmic aggregation platforms.</p> <p>Nicholas Bonaker also enjoyed the freedom of pursuing his SuperUROP project. “I’ve been able to take the research in the direction I want,” says Bonaker, a junior in EECS, who has developed a new algorithm he hopes will improve an assistive technology developed by his advisor, EECS Associate Professor Tamara Broderick.</p> <p><strong>Exploring new directions in health care</strong></p> <p>Bonaker said he particularly values the health-care focus of his project, which centers on creating better communications software for people living with severe motor impairments. “It feels like I’m doing something that can help people — using things I learned in class,” says Bonaker. He is among this year’s MIT EECS/CS+HASS SuperUROP scholars, whose projects combine computer science with the humanities, arts, or social sciences. &nbsp;</p> <p>Many of this year’s SuperUROP students are working on health-care applications. For example, Fatima Gunter-Rahman, a junior in EECS and biology, is examining Alzheimer’s data, and Sabrina Liu, an EECS junior and MIT EECS/Takeda SuperUROP scholar, is investigating noninvasive ways to monitor the heart rates of dental patients. Justin Lim, a senior math major, is using data analytics to try to determine the optimal treatment for chronic diseases like diabetes. “I like the feeling that my work would have real-world impact,” says Lim, an MIT EECS/Hewlett Foundation SuperUROP scholar. “It’s been very satisfying.”</p> <p>Dhamanpreet Kaur, a junior majoring in math and computer science and molecular biology, is using machine learning to determine the characteristics of patients who are readmitted to hospitals following their discharge to skilled nursing facilities.
The work aims to predict who might benefit most from expensive telehealth systems that enable clinicians to monitor patients remotely. The project has given Kaur the chance to work with a multidisciplinary team of professors and doctors. “I find that aspect fascinating,” says Kaur, also an MIT EECS/Takeda SuperUROP scholar.</p> <p>As attendees bustled through the two-hour December showcase, some of the most enthusiastic visitors were industry sponsors, including Larry Bair ’84, SM ’86, a director at Advanced Micro Devices. “I’m always amazed at what undergraduates are doing,” he says, noting that his company has been sponsoring SuperUROPs for the last few years.</p> <p>“It’s always interesting to see what’s going on at MIT,” says Tom O’Dwyer, an MIT research affiliate and the former director of technology at Analog Devices, another industry sponsor. O’Dwyer notes that supporting SuperUROP can help companies with recruitment. “The whole high-tech business runs on smart people,” he says. “SuperUROPs can lead to internships and employment.”</p> <p>SuperUROP also exposes students to the work of academia, which can underscore a key difference between classwork and research: Research results are unpredictable.</p> <p>Junior math major Lior Hirschfeld, for example, compared the effectiveness of different machine learning methods used to test molecules for potential in drug development. “None of them performed exceptionally well,” he says.</p> <p>That might appear to be a poor result, but Hirschfeld notes that it’s important information for those who are using and trusting those tests today. “It shows you may not always know where you are going when you start a project,” says Hirschfeld, also an MIT EECS/Takeda SuperUROP scholar.</p> <p>EECS senior Kenneth Acquah had a similar experience with his SuperUROP project, which focuses on finding a technological way to combat money laundering with Bitcoin. 
“We’ve tried a bunch of things but mostly found out what doesn’t work,” he says.</p> <p>Still, Acquah says, he values the SuperUROP experience, including the chance to work in MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL). "I get a lot more supervision, more one-on-one time with my mentor," the MIT/EECS Tang Family FinTech SuperUROP scholar says. "And working in CSAIL has given me access to state-of-the-art materials."</p> Madeline Abrahams, an EECS senior and MIT/Tang Family FinTech SuperUROP scholar, presents her work investigating the spread of misinformation related to climate change via algorithmic aggregation platforms at the SuperUROP Showcase. Photo: Gretchen ErtlElectrical engineering and computer science (EECS), School of Engineering, SuperUROP, Political science, School of Humanities Arts and Social Sciences, Computer Science and Artificial Intelligence Laboratory (CSAIL), Aeronautical and astronautical engineering, Chemical engineering, Civil and environmental engineering, Urban studies and planning, School of Architecture and Planning, Students, Research, Undergraduate, Classes and programs, Special events and guest speakers Accelerating the pace of engineering The 2019-20 School of Engineering MathWorks Fellows are using MATLAB and Simulink to advance discovery and innovation across disciplines. Tue, 28 Jan 2020 17:00:01 -0500 Lori LoTurco | School of Engineering <p>Founded in 1984 by Jack Little ’78 and Cleve Moler, MathWorks was built on the premise of providing engineers and scientists with more powerful and productive computation environments. 
In 1985, the company sold its very first order&nbsp;— 10 copies of its first product, MATLAB — to MIT.</p> <p>Decades later, engineers across MIT and around the world consistently rely on MathWorks products to accelerate the pace of discovery, innovation, and development in automotive, aerospace, electronics, biotech-pharmaceutical, and other industries.&nbsp;MathWorks’ products and support have had a significant impact on <em>MITx,</em> OpenCourseWare, and MIT’s digital learning efforts across campus, including the Department of Mathematics, one of the School of Engineering’s closest collaborators in the use of digital learning tools and educational technologies.</p> <p>“We have a strong belief in the importance of engineers and scientists,” says Little. “They act to increase human knowledge and profoundly improve our standard of living. We create products like MATLAB and Simulink to help them do their best work.”</p> <p>As the language of technical computing, MATLAB is a programming environment for algorithm development, data analysis, visualization, and numeric computation. It is used extensively by faculty, students, and researchers across MIT and by over 4 million users in industry, government, and academia in 185 countries.</p> <p>Simulink is a block diagram environment for simulation and model-based design of multidomain and embedded engineering systems, including automatic code generation, verification, and validation. It is used heavily in automotive, aerospace, and other applications that design complex real-time systems.</p> <p>This past summer, MathWorks celebrated 35 years of accelerating the pace of engineering and science. Shortly following this milestone, MathWorks awarded 11 engineering fellowships to graduate students within the School of Engineering who are active users of MATLAB or Simulink. 
The fellows are using the programs to advance discovery and innovation across disciplines.</p> <p>“PhD fellowships are an investment in the world’s long-term future, and there are few investments more valuable than that,” says Little.</p> <p>The 2019-20 MathWorks fellows are:</p> <p><a href="">Pasquale Antonante</a> is a PhD student in the Department of Aeronautics and Astronautics. He uses MATLAB and Simulink to build tools that make robots more accurate.</p> <p><a href="">Alireza Fallah</a> is a PhD student in the Department of Electrical Engineering and Computer Science. He uses MATLAB and Symbolic Math Toolbox to develop better machine-learning algorithms.</p> <p><a href="">James Gabbard</a> is an SM/PhD student in the Department of Mechanical Engineering. He uses MATLAB to model fluids and materials.</p> <p><a href="">Nicolas Meirhaeghe</a> is a PhD student in medical engineering and medical physics in the Bioastronautics Training Program at Harvard-MIT Division of Health Sciences and Technology. He uses MATLAB to visualize activity in the brain and understand how it is related to an individual’s behavior.</p> <p><a href="">Caroline Nielsen</a> is a PhD student in the Department of Chemical Engineering. She uses MATLAB to implement and test new applications of non-smooth analysis. She also intends to use MATLAB in the next phase of her research, developing methods to simultaneously optimize for minimal resource use and operating costs.</p> <p><a href="">Bauyrzhan Primkulov</a> is a PhD student in the Department of Civil and Environmental Engineering. He uses MATLAB to build computational models and explore how fluids interact in porous materials.</p> <p><a href="">Kate Reidy</a> is a PhD student in the Department of Materials Science and Engineering.
She studies how 2D materials — only a single atom thick — can be combined with 3D materials, and uses MATLAB to analyze the properties of different materials.</p> <p><a href="">Isabelle Su</a><strong> </strong>is a PhD student in civil and environmental engineering. She builds computational models with MATLAB to understand the mechanical properties of spider webs.</p> <p><a href="">Joy Zeng</a><strong> </strong>is a PhD student in chemical engineering. Her research is focused on the electrochemical transformation of carbon dioxide to fuels and commodity chemicals. She uses MATLAB to model chemical reactions.</p> <p><a href="">Benjamin "Jiahong" Zhang</a><strong> </strong>is a PhD student in computational science and engineering. He uses MATLAB to prototype new methods for rare event simulation, finding new methods by leveraging mathematical principles used in proofs and re-purposing them for computation.</p> <p><a href="">Paul Zhang</a><strong> </strong>is a PhD student in electrical engineering and computer science. He uses MATLAB to develop algorithms with applications in meshing — the use of simple shapes to study complex ones.</p> <p>For MathWorks, fostering engineering education is a priority, so when deciding where to focus philanthropic support, MIT — its very first customer — was an obvious choice.</p> <p>“We are so humbled by MathWorks' generosity, and their continued support of our engineering students through these fellowships,” says Anantha Chandrakasan, dean of the School of Engineering. “Our relationship with MathWorks is one that we revere — they have developed products that foster research and advancement across many disciplines, and through their support our students launch discoveries and innovation that align with MathWorks’ mission.”</p> MathWorks fellows with Anantha Chandrakasan (back row, center), dean of the MIT School of Engineering. 
Not pictured: Fellows Pasquale Antonante, Alireza Fallah, and Kate Reidy.Photo: David DegnerSchool of Engineering, MITx, OpenCourseWare, Mathematics, Electrical engineering and computer science (EECS), Mechanical engineering, Chemical engineering, Civil and environmental engineering, Awards, honors and fellowships, Harvard-MIT Health Sciences and Technology, Alumni/ae, Startups, Aeronautical and astronautical engineering, DMSE, Computer science and technology, School of Science Three from MIT graduate from NASA astronaut training Chari, Hoburg, and Moghbeli, all with ties to the Department of Aeronautics and Astronautics, among the first class to graduate under agency’s Artemis program. Wed, 22 Jan 2020 17:00:01 -0500 Sara Cody | Department of Aeronautics and Astronautics <p>On Friday, Jan. 10, the newest class of astronauts graduated from basic training at NASA’s Johnson Space Center in Houston. During the graduation ceremony, Warren “Woody” Hoburg ’08 couldn’t help but reflect on his gratitude toward everyone who supported his dream of becoming an astronaut, including his parents, who were in the audience.</p> <p>“Whether I was skydiving or building 20-foot tall model rockets in their garage, my parents have always given me the freedom to explore my interests,” said Hoburg, a research affiliate in the Department of Aeronautics and Astronautics (AeroAstro) at MIT.
“I was so happy to have them by my side celebrating graduation along with my NASA classmates, instructors, trainers and mentors, who were all instrumental in getting me here.”</p> <p>At the ceremony, Hoburg — along with fellow MIT AeroAstro alumni Raja Chari SM '01 and Jasmin Moghbeli '05, eight other NASA candidates, and two Canadian Space Agency (CSA) candidates — was among the first class of astronauts under the Artemis program to receive an astronaut pin, marking their graduation from basic training and their eligibility to be selected for upcoming missions to space.</p> <p>“Artemis is a bold new vision in space exploration uniting the international community. In addition to expeditions to the International Space Station, these astronauts could one day walk on the moon as a part of the Artemis program and perhaps one of them could be among the first humans to walk on Mars,” said NASA Administrator Jim Bridenstine in his opening remarks. “Their trailblazing triumphs will transform humanity’s presence in our solar system and forever change life here on Earth. In short, they represent the best of humanity and our most fervent hopes for the future — no pressure.”</p> <div class="cms-placeholder-content-video"></div> <p>The graduation ceremony is a culmination of two years of rigorous training in spacewalking, robotics, International Space Station systems, T-38 jet proficiency, and Russian language. The Neutral Buoyancy Laboratory features a full-scale mockup of modules from the International Space Station (ISS) and other space agency vehicles in an indoor pool that is 40 feet deep. Astronaut candidates, dressed in space suits, perform tasks underwater to simulate working and maneuvering in a weightless environment.</p> <p>“Everyone comes to basic training with experiences that are relevant to being an astronaut, like pilots, engineers, and scientists, but no one has ever worn a space suit before, which is a significant part of our training,” said Hoburg. 
“I still remember my first run in the [Neutral] Buoyancy Lab, getting lowered into the pool in the suit and seeing my first view of the [International] Space Station through my helmet visor. It was an amazing and surreal experience.”</p> <p>Hoburg and his classmates, who were <a href="" target="_self">selected in 2017</a> from a record-setting pool of more than 18,000 applicants, will help develop spacecraft, support the teams currently in space and ultimately join the ranks of only about 500 people who have gone into space. The Artemis program is a particularly exciting new initiative that aims to send the first woman and next man to the moon by 2024.</p> <p>“The Artemis program will take a different approach than Apollo by setting up a sustainable established presence on the moon that is similar to that of the ISS,” said Hoburg. “From my perspective as an engineer, I am most excited about building and fixing hardware in space. We need to develop a range of technologies to get us to Mars, and the moon is a perfect training ground for that. If we need to develop architectures and systems that enable us to use the resources of another planetary body to support the mission in a way that our lives depend on, then why not do it first in a place where we have the option to come home?”</p> <p>Post-graduation, Hoburg will complete flight training in Corpus Christi, Texas, and will return to Johnson Space Center to support current and future exploration missions. Hoburg, along with classmates Chari and Moghbeli, brings the total number of MIT astronaut alumni to 41, with 17 of them coming from MIT AeroAstro.</p> NASA's newest astronauts participate in a graduation ceremony at the Johnson Space Center in Houston.
Left to right: NASA astronaut Jonny Kim, Canadian Space Agency (CSA) astronaut Joshua Kutryk, NASA astronaut Jessica Watkins, CSA astronaut Jennifer Sidey-Gibbons, NASA astronauts Frank Rubio, Kayla Barron, Jasmin Moghbeli, Loral O'Hara, Zena Cardman, Raja Chari, Matthew Dominick, Bob Hines, and Warren Hoburg. Photo: James Blair/NASAAeronautical and astronautical engineering, NASA, Alumni/ae, Staff, Space exploration, Moon, School of Engineering, Space, astronomy and planetary science A new approach to making airplane parts, minus the massive infrastructure Carbon nanotube film produces aerospace-grade composites with no need for huge ovens or autoclaves. Mon, 13 Jan 2020 00:00:00 -0500 Jennifer Chu | MIT News Office <p>A modern airplane’s fuselage is made from multiple sheets of different composite materials, like so many layers in a phyllo-dough pastry. Once these layers are stacked and molded into the shape of a fuselage, the structures are wheeled into warehouse-sized ovens and autoclaves, where the layers fuse together to form a resilient, aerodynamic shell.</p> <p>Now MIT engineers have developed a method to produce aerospace-grade composites without the enormous ovens and pressure vessels. The technique may help to speed up the manufacturing of airplanes and other large, high-performance composite structures, such as blades for wind turbines.</p> <p>The researchers detail their new method in a paper published today in the journal <em>Advanced Materials Interfaces. </em></p> <p>“If you’re making a primary structure like a fuselage or wing, you need to build a pressure vessel, or autoclave, the size of a two- or three-story building, which itself requires time and money to pressurize,” says Brian Wardle, professor of aeronautics and astronautics at MIT. “These things are massive pieces of infrastructure. 
Now we can make primary structure materials without autoclave pressure, so we can get rid of all that infrastructure.”</p> <p>Wardle’s co-authors on the paper are lead author and MIT postdoc Jeonyoon Lee, and Seth Kessler of Metis Design Corporation, an aerospace structural health monitoring company based in Boston.</p> <p><strong>Out of the oven, into a blanket</strong></p> <p>In 2015, Lee led the team, along with another member of Wardle’s lab, in creating a method to make aerospace-grade composites without requiring an oven to fuse the materials together. Instead of placing layers of material inside an oven to cure, the researchers essentially wrapped them in an ultrathin film of carbon nanotubes (CNTs). When they applied an electric current to the film, the CNTs, like a nanoscale electric blanket, quickly generated heat, causing the materials within to cure and fuse together.</p> <p>With this out-of-oven, or OoO, technique, the team was able to produce composites as strong as the materials made in conventional airplane manufacturing ovens, using only 1 percent of the energy.</p> <p>The researchers next looked for ways to make high-performance composites without the use of large, high-pressure autoclaves — building-sized vessels that generate high enough pressures to press materials together, squeezing out any voids, or air pockets, at their interface.</p> <p>“There’s microscopic surface roughness on each ply of a material, and when you put two plies together, air gets trapped between the rough areas, which is the primary source of voids and weakness in a composite,” Wardle says. “An autoclave can push those voids to the edges and get rid of them.”</p> <p>Researchers including Wardle’s group have explored “out-of-autoclave,” or OoA, techniques to manufacture composites without using the huge machines.
But most of these techniques have produced composites where nearly 1 percent of the material contains voids, which can compromise a material’s strength and lifetime. In comparison, aerospace-grade composites made in autoclaves are of such high quality that any voids they contain are negligible and not easily measured.</p> <p>“The problem with these OoA approaches is also that the materials have been specially formulated, and none are qualified for primary structures such as wings and fuselages,” Wardle says. “They’re making some inroads in secondary structures, such as flaps and doors, but they still get voids.”</p> <p><strong>Straw pressure</strong></p> <p>Part of Wardle’s work focuses on developing nanoporous networks — ultrathin films made from aligned, microscopic material such as carbon nanotubes, that can be engineered with exceptional properties, including color, strength, and electrical capacity. The researchers wondered whether these nanoporous films could be used in place of giant autoclaves to squeeze out voids between two material layers, as unlikely as that may seem.</p> <p>A thin film of carbon nanotubes is somewhat like a dense forest of trees, and the spaces between the trees can function like thin nanoscale tubes, or capillaries. A capillary such as a straw can generate pressure based on its geometry and its surface energy, or the material’s ability to attract liquids or other materials.&nbsp;</p> <p>The researchers proposed that if a thin film of carbon nanotubes were sandwiched between two materials, then, as the materials were heated and softened, the capillaries between the carbon nanotubes should have a surface energy and geometry such that they would draw the materials in toward each other, rather than leaving a void between them.
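For intuition, the capillary-pressure idea can be sketched with a back-of-envelope Young-Laplace estimate. All numerical values below (resin surface tension, contact angle, effective capillary radius, and the reference autoclave pressure) are assumptions chosen only to illustrate the scaling; the article does not report the actual figures from the team's calculation.

```python
import math

# Back-of-envelope sketch: Young-Laplace pressure in a cylindrical capillary,
# P = 2 * gamma * cos(theta) / r. All values are illustrative assumptions,
# not numbers from the study.

gamma = 0.035   # N/m, assumed surface tension of softened epoxy resin
theta = 0.0     # rad, assumed near-perfect wetting of the nanotube walls
r = 40e-9       # m, assumed effective radius of the gaps between nanotubes

capillary_pressure = 2 * gamma * math.cos(theta) / r  # Pa

autoclave_pressure = 7e5  # Pa (~7 bar), a typical aerospace autoclave setting

print(f"capillary pressure: {capillary_pressure / 1e6:.2f} MPa")  # 1.75 MPa
print(f"autoclave pressure: {autoclave_pressure / 1e6:.2f} MPa")  # 0.70 MPa
```

With these assumed numbers the nanoscale capillaries develop roughly 1.75 MPa, a few times a typical autoclave pressure, which is the qualitative point: because pressure scales inversely with capillary radius, nanometer-scale gaps can out-squeeze a building-sized pressure vessel.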
Lee calculated that the capillary pressure should be larger than the pressure applied by the autoclaves.</p> <p>The researchers tested their idea in the lab by growing films of vertically aligned carbon nanotubes using a technique they previously developed, then laying the films between layers of materials that are typically used in the autoclave-based manufacturing of primary aircraft structures. They wrapped the layers in a second film of carbon nanotubes, to which they applied an electric current to heat it up. They observed that as the materials heated and softened in response, they were pulled into the capillaries of the intermediate CNT film.</p> <p>The resulting composite lacked voids, similar to aerospace-grade composites that are produced in an autoclave. The researchers subjected the composites to strength tests, attempting to push the layers apart, the idea being that voids, if present, would allow the layers to separate more easily.</p> <p>“In these tests, we found that our out-of-autoclave composite was just as strong as the gold-standard autoclave process composite used for primary aerospace structures,” Wardle says.</p> <p>The team will next look for ways to scale up the pressure-generating CNT film. In their experiments, they worked with samples measuring several centimeters wide — large enough to demonstrate that nanoporous networks can pressurize materials and prevent voids from forming.
To make this process viable for manufacturing entire wings and fuselages, researchers will have to find ways to manufacture CNT and other nanoporous films at a much larger scale.</p> <p>“There are ways to make really large blankets of this stuff, and there’s continuous production of sheets, yarns, and rolls of material that can be incorporated in the process,” Wardle says.</p> <p>He also plans to explore different formulations of nanoporous films, engineering capillaries of varying surface energies and geometries, to be able to pressurize and bond other high-performance materials.</p> <p>“Now we have this new material solution that can provide on-demand pressure where you need it,” Wardle says. “Beyond airplanes, most of the composite production in the world is composite pipes, for water, gas, oil, all the things that go in and out of our lives. This could make all those things, without the oven and autoclave infrastructure.”</p> <p>This research was supported, in part, by Airbus, ANSYS, Embraer, Lockheed Martin, Saab AB, Saertex, and Teijin Carbon America through MIT’s Nano-Engineered Composite aerospace Structures (NECST) Consortium.</p> MIT postdoc Jeonyoon LeeImage: Melanie Gonick, MITAeronautical and astronautical engineering, Carbon nanotubes, Manufacturing, Materials Science and Engineering, Research, School of Engineering, Nanoscience and nanotechnology There’s excitement in the air for Humberto Caldelas The AeroAstro major’s childhood love of airplanes and space travel has led to lofty career ambitions. Wed, 04 Dec 2019 00:00:00 -0500 Shafaq Patel | MIT correspondent <p>When Humberto Caldelas II was growing up, his dad took him to all the nearest air shows so he could see all the planes.&nbsp;And when he learned to drive, he joked with his parents that he shouldn’t drive near the airport because he would get distracted.
He always looks up at the sky when he hears airplanes pass.&nbsp;</p> <p>“I can't even tell you the first time I got interested in airplanes,” he says. “I think I just was born with it.”</p> <p>Caldelas is an MIT senior majoring in aeronautics and astronautics, but he came into the university thinking he’d go into nuclear science and engineering.&nbsp;He used to think of his love of flying as a hobby but not a profession — that is, until his friends convinced him to take a tour of MIT's Department of Aeronautics and Astronautics (AeroAstro). During his tour, he learned of a semiserious requirement for every professor candidate. As the rumor goes, after the technical interviews, the candidate is taken outside; if a plane flies overhead and the candidate doesn’t look up, they don’t get the job.</p> <p>As soon as Caldelas heard this, he knew AeroAstro would be his home.&nbsp;</p> <p>“I was like, ‘If that's the passion here in the department, then that's where I should be.’ And I haven't regretted that decision since,” he says. “It's really been so much fun. It feels like a home just because I can nerd out with people about all the airplane and space things.”</p> <p>Through his major, Caldelas has focused on both air and space travel, and hopes his career will go in both directions. Caldelas has been involved with the Reserve Officers’ Training Corps (ROTC) during his four years at MIT and after graduation will join the Navy as a naval aviator. After serving his country and working with airplanes, he then hopes to become an astronaut.</p> <p><strong>The flying bug</strong></p> <p>Caldelas is the kind of person to arrive at the airport well before his flight, just so he can see planes take off. And when he’s on the airplane, he loves sitting in a seat where he can look out the window and watch the engine function.&nbsp;&nbsp;</p> <p>“Every time I fly, I get the chills,” he says.
“There's a quote that goes ‘with understanding comes appreciation, and with appreciation comes respect.’ So after studying how a jet engine works, how hard it is to design it, how hard it is to build it, it makes [an airplane] even more incredible.”&nbsp;</p> <p>The aeronautics part of his MIT education gave Caldelas a background on the theory and mechanics of airplane flight. Through his classes, he’s learned about the physics of flying, experimented by making foam airplanes, and tested equipment in wind tunnels.&nbsp;</p> <p>Over the past two summers, Caldelas interned at Boeing, gaining hands-on experience with the 737 and P-8A Poseidon aircraft. He also got to see how understanding the mechanics of an airplane will help him when he is a pilot.&nbsp;</p> <p>For example, when Boeing was testing some iterations of the new 777X, one of the test pilots — who had flying experience and understood what was going on inside the plane — easily identified an issue with the plane because she was in tune with how an airplane is constructed. Caldelas aspires to do exactly that.</p> <p>After graduating, he wants to commission as an officer in the Navy and be a fighter pilot. During his first year of high school, Caldelas enrolled in the Civil Air Patrol, which is affiliated with the U.S. Air Force. He flew an airplane for the first time and has never gotten over that thrill. Throughout his time at MIT, he’s been involved with Naval ROTC and often wears the classic “summer whites” uniform with the gold buttons; this semester, he is the company commander of his unit.</p> <p>After Navy training post-college, he hopes to go to U.S. Naval Test Pilot School.
Caldelas says test pilots know how to fly and have a technical understanding of airplanes, which helps them communicate with the engineers on what they need to tweak.</p> <p><strong>From white uniform to white space suit</strong></p> <p>The AeroAstro hallway displays photos of many illustrious alumni of the department, including a number of astronauts — a group Caldelas ultimately hopes to join.</p> <p>His fascination with astronauts began early: When he was 4 years old, his family went to NASA’s Kennedy Space Center.&nbsp;</p> <p>“I was just barely walking, and this astronaut comes up, and I was like wow, ‘I want to be him,’” he says.&nbsp;</p> <p>His admiration for astronauts skyrocketed as he grew up. When MIT was celebrating the 50th anniversary of the Apollo 11 mission, Caldelas received an email from the department asking for students to help escort astronauts around the events. Immediately, he filled out the form — if there is an opportunity to meet an astronaut, Caldelas is there.&nbsp;</p> <p>Caldelas was assigned to Mark Lee, a former Air Force colonel and <a href="" title="NASA">NASA</a> <a href="" title="Astronaut">astronaut</a> who flew on four <a href="" title="Space Shuttle">Space Shuttle</a> missions. When Caldelas was showing Lee around, Lee stopped in the middle of the hallway of photographs and nonchalantly said “that’s me,” pointing to a large photograph of a man in a white space suit with Earth in the background. Starstruck, Caldelas looked at the frame and saw the name “Mark Lee” on it. He immediately asked for a photograph of the two of them with the historic image in the background.&nbsp;</p> <p>“I walk past this photo every day. Who else can say they met the astronaut in a famous photograph?” Caldelas says. “Only at MIT does that happen.”</p> <p>Throughout the tour of the department, Caldelas kept saying how he can’t believe he is in the same space as so many MIT legends.
A national Hispanic Scholarship Fund recipient, Caldelas is also a first-generation American, one of the first Hispanic students to be accepted into the engineering program at his high school, and the first person to get into MIT from his New Jersey high school.</p> <p>He’s constantly grateful for his opportunities and hopes to inspire the next generation, just as the MIT astronauts and their photographs inspired him.&nbsp;</p> <p>“You don’t have to be perfect to go to this school, you just have to have the passion, and that motivates people,” he says. “It’s really humbling for me to live out my dreams and come to MIT. And I want to honor this opportunity by inspiring others to keep going and reach for their dreams.”</p> “I can't even tell you the first time I got interested in airplanes,” says AeroAstro major Humberto Caldelas. “I think I just was born with it.”Image: Bryce VickmarkUndergraduate, Profile, Aeronautical and astronautical engineering, School of Engineering, ROTC, Students Designing humanity’s future in space The Space Exploration Initiative’s latest research flight explores work and play in microgravity. Tue, 26 Nov 2019 15:20:01 -0500 Janine Liberty | MIT Media Lab <p>How will dancers perform in space? How will scientists do lab experiments without work tables? How will artists pursue crafting in microgravity? How can exercise, gastronomy, research, and other uniquely human endeavors be reimagined for the unique environment of space? These are the questions that drove the <a href="">14 projects</a> aboard the MIT Media Lab Space Exploration Initiative’s second parabolic research flight.</p> <p>Just past the 50th anniversary of the Apollo moon landing, humanity’s life in space isn’t so very far away.
Virgin Galactic just opened its spaceport with the goal of launching space tourists within months, not years; Blue Origin’s New Shepard rocket is gearing up to carry its first human cargo to the edge of space, with New Glenn and a moon mission not far behind. We are nearing a future where trained, professional astronauts aren’t the only people who will regularly leave Earth. The new Space Age will reach beyond the technical and scientific achievements of getting people into space and keeping them alive there; the next frontier is bringing our creativity, our values, our personal pursuits and hobbies with us, and letting them evolve into a new culture unique to off-planet life.&nbsp;</p> <p>But unlike the world of Star Trek, there’s no artificial gravity capability in sight. Any time spent in space will, for the foreseeable future, mean life without weight, and without the rules of gravity that govern every aspect of life on the ground. Through its annual parabolic flight charter with the ZERO-G Research Program, the Space Exploration Initiative (SEI) is actively anticipating and solving for the challenges of microgravity.</p> <p><strong>Space for everyone</strong></p> <p>SEI’s first zero-gravity flight, in 2017, set a high bar for the <a href="">caliber of the projects</a>, but it was also a learning experience in doing research in 20-second bursts of microgravity.
In preparation for an annual research flight, SEI founder and lead Ariel Ekblaw organized MIT's first graduate course for parabolic flights (<a href="">Prototyping Our Sci-Fi Space Future: Zero Gravity Flight Class</a>) with the goal of preparing researchers for the realities of parabolic flights, from the rigors of the preflight test readiness review inspections to project hardware considerations and mid-flight adjustments.</p> <p>The class also served to take some of the intimidation factor out of the prospect of space research and focused on democratizing access to microgravity testbed environments.&nbsp;</p> <p>“The addition of the course helped us build bridges across other departments at MIT and take the time to document and open-source our mentorship process for robust, creative, and rigorous experiments,” says Ekblaw.</p> <p>SEI’s mission of democratizing access to space is broad: It extends to actively recruiting researchers, artists, and designers, whose work isn’t usually associated with space, as well as ensuring that the traditional engineering and hard sciences of space research are open to people of all genders, nationalities, and identities. This proactive openness was manifest in every aspect of this year’s microgravity flight.&nbsp;</p> <p>While incubated in the Media Lab, the Space Exploration Initiative now supports research across MIT. Paula do Vale Pereira, a grad student in MIT's Department of Aeronautics and Astronautics (AeroAstro), was on board to test out automated actuators for <a href="">CubeSats</a>. Tim McGrath and Jeremy Stroming, also from AeroAstro, built an <a href="">erg machine</a> specially designed for exercise in microgravity.
Chris Carr and Maria Zuber, of the Department of Earth, Atmospheric and Planetary Sciences, flew to test out the latest iteration of their <a href="">Electronic Life-detection Instrument</a> (ELI) research.</p> <p>Research specialist Maggie Coblentz is pursuing her fascination with food in space — including the world’s first <a href="">molecular gastronomy experiment</a> in microgravity. She also custom-made an astronaut’s helmet specially designed to accommodate a multi-course tasting menu, allowing her to experiment with different textures and techniques to make both food and eating more enjoyable on long space flights.&nbsp;</p> <p>“The function of food is not simply to provide nourishment — it’s a key creature comfort in spaceflight and will play an even more significant role on long-duration space travel and future life in space habitats. I hope to uncover new food cultures and food preparation techniques by evoking the imagination and sense of play in space, Willy Wonka style,” says Coblentz.</p> <p>With <a href="">Sensory Synchrony</a>, a project supported by NASA's <span class="st">Translational Research Institute for Space Health</span>, Abhi Jain and fellow researchers in the Media Lab's Fluid Interfaces group investigated vestibular neuromodulation techniques for mitigating the effects of motion sickness caused by the sensory mismatch in microgravity. The team will iterate on the data from this flight to consider possibilities for novel experiences using augmented and virtual reality in microgravity environments.</p> <p>The Space Enabled research group is testing how paraffin wax behaves as a liquid in microgravity, exploring it as an affordable, accessible alternative satellite fuel. Their microgravity experiment, run by Juliet Wanyiri, aimed to determine the speed threshold, and corresponding voltage, needed for the wax to form into a shape called an annulus, which is one of the preferred geometric shapes to store satellite fuel. 
“This will help us understand what design might be appropriate to use wax as a satellite fuel for an on-orbit mission in the future,” explains Wanyiri.</p> <p>Xin Liu flew for the second time this year, with a new project that continues her explorations into the relationship between <a href="">couture</a>, <a href="">movement</a>, and self-expression when an artist is released from the constraints of gravity. This year’s project, <a href="">Mollastica</a>, is a mollusk-inspired costume designed to swell and float in microgravity. Liu also motion-captured a body performance to be rendered later for a “deep-sea-to-deep-space” video work.</p> <p><strong>The human experience</strong></p> <p>The extraordinary range of fields, goals, projects, and people represented on this year’s microgravity flight speaks to the unique role the Space Exploration Initiative is already starting to play in the future of space.&nbsp;</p> <p>For designer and researcher Alexis Hope, the flight offered the opportunity to discover how weightlessness affects the creative process — how it changes not only the art, but also the artist. Her project, <a href="">Space/Craft</a>, was an experiment in zero-g sculpture: exploring the artistic processes and possibilities enabled by microgravity by using a hot glue gun to "draw in 3D."</p> <p>Like all of the researchers aboard the flight, Hope found the experience both challenging and inspiring. Her key takeaway, she says, is excitement for all the unexplored possibilities of art, crafting, and creativity in space.</p> <p>“Humans always find a way to express themselves creatively, and I expect no different in a zero-gravity environment,” she says. 
“I’m excited for new materials that will behave in interesting ways in a zero-gravity environment, and curious about how those new materials might inspire future artists to create novel structures, forms, and physical expressions.”</p> <p>Ekblaw herself spent the flight testing out the latest iteration of <a href="">TESSERAE</a>, her self-assembling space architecture prototype. The research has matured extensively over the last year and a half, including a recent <a href="">suborbital test flight</a> with Blue Origin and an upcoming International Space Station mission to take place in early 2020.&nbsp;</p> <p>All of the research projects from this year’s flight — as well as some early results, the projects from the Blue Origin flight, and the early prototypes for the ISS mission — were on display at a recent SEI open house at the Media Lab.&nbsp;</p> <p>For Ekblaw, the great challenge and the great opportunity in these recurring research flights is helping researchers to keep their projects and goals realistic in the moment, while keeping SEI’s gaze firmly fixed on the future.&nbsp;</p> <p>“While parabolic flights are already a remarkable experience, this year was particularly meaningful for us. We had the immense privilege of finalizing our pre-flight testing over the exact days when Neil Armstrong, Buzz Aldrin, and Mike Collins were in microgravity on their way to the moon,” she says. “This 50th anniversary of Apollo 11 reminds us that the next 50 years of interplanetary civilization beckons. 
We are all now part of this — designing, building, and testing artifacts for our human, lived experience of space.”</p> <div></div> Chris Carr and Maria Zuber of the MIT Department of Earth, Atmospheric and Planetary Sciences have a little fun while monitoring their life-detection data experiment in microgravity.Photo: Steve Boxall/ZERO-GMedia Lab, EAPS, School of Architecture and Planning, Space, astronomy and planetary science, Aeronautical and astronautical engineering, Comparative Media Studies/Writing, School of Science, School of Engineering, Arts, Technology and society, School of Humanities Arts and Social Sciences Interdisciplinary team takes top prize in Mars colony design competition MIT PhD student George Lordos and his brother Alexandros led the project; goal of the Mars Society competition was to establish a colony on Mars for 1,000 residents. Mon, 25 Nov 2019 12:15:01 -0500 Sara Cody | Department of Aeronautics and Astronautics <p>Every 75 years, Halley’s Comet makes a triumphant return to the inner solar system, becoming visible to the naked eye from the Earth’s surface as it streaks across the night sky. In 1986, brothers George and Alexandros Lordos, who helped found the astronomy club at their high school in Cyprus, decided they were not going to miss this once-in-a-lifetime opportunity despite the cloudy weather.</p> <p>“Together with friends, we borrowed camping supplies from the hiking club and hiked up familiar terrain on Troodos Mountain to a cloudless spot that was 5,000 feet above sea level, miles away from city lights,” says George Lordos, MBA ’00, SM ’18. “When we unzipped our tent at 3 o’clock in the morning, Halley’s comet was right in front of us, in all its glory. It was like seeing a ghost ship floating on a sea of stars.”</p> <p>Recently, the brothers again combined their shared passion with their professional expertise to team up and develop <a href="" target="_blank">Star City</a>, a concept for a human city on Mars. 
Their design won first place at the Mars Colony Prize Design contest, which was hosted by the Mars Society and judged by a panel that included experts from NASA and SpaceX.</p> <p>Today, Lordos is a PhD candidate in the Engineering Systems Laboratory at MIT’s Department of Aeronautics and Astronautics and the head teaching assistant at MIT’s System Design and Management Program, researching sustainable human space settlement architectures with professors Olivier de Weck and Jeffrey Hoffman. His brother, Alexandros Lordos, is currently the director of the Center for the Study of Life Skills and Resilience at the Department of Psychology at the University of Cyprus, and head of learning and innovation at the Center for Sustainable Peace and Democratic Development, researching the development of integrated systems to foster mental health and social cohesion in countries facing conflict-related adversities.</p> <p>“In addition to addressing the engineering requirements to put humans on Mars, the overall philosophy of our approach was to provide the residents with a diverse array of capabilities, rather than ready-made solutions, relying on the human capacity to be resourceful and resilient in addressing the many unknown challenges that will arise,” says Lordos. “This ensures not only their survival, but also that their well-being, agency, and capacity to grow will be duly considered so they may thrive there as well.”</p> <p>The goal of the competition was to establish a successful colony on Mars for 1,000 residents. One hundred entrants from around the world submitted proposals, which were eventually narrowed to 10 finalists who presented their proposals at the 22nd Annual Mars Society Convention in October. 
The criteria for the judges’ consideration included technical merit, economic viability, social and political organization, and aesthetics.</p> <p>Using abundant energy supplies and heavy equipment, Star City’s residents will first focus on carving out habitats by tunneling inside a crater rim to create networks of living and work spaces. By working with the natural topography of Mars, the residents will be able to develop large habitable spaces that will be safe from radiation and other dangers. At the same time, the excavated material will be mined for water and useful minerals that can then support local industry and the growth of self-sustaining crops through hydroponics. From there, they would continue to build around the crater rim to create residential and commercial areas that contain shops, restaurants, and libraries, eventually pooling their resources to develop the city’s central hub, which will house Mars University and other shared facilities.</p> <p>“The idea is to start with five distinct villages that will be constructed around the crater rim, each aiming for a population of 200 residents within a decade of the first landing, and originating from different Earth continents,” says Lordos. “The five villages will interconnect their tunnel networks and focus on continuous growth of their habitats, capabilities, stocks of resources, and quality of life.”</p> <p>According to Alexandros, the wheel-like physical layout is one of the key mechanisms to build an organic sense of community among Star City residents, which is essential to their well-being as they navigate the challenges of living together on a distant planet. Proximity will enable each village to have access to the other four for material and social support, inspiration, leisure, new ideas, different solutions to common challenges, and socialization. 
By teaming up to address survival challenges and achieve aspirational goals, they will establish a support network completely unique to Star City so residents can better navigate through times of difficulty.</p> <p>“Drawing on cumulative insights from the social sciences and our own experience in developing systems to support societies facing extreme adversities, we have identified core aspects of the human condition that will be relevant for socio-economic development on Mars,” says Alexandros. “Specifically, we considered the pivotal role that individual as well as community resilience will be expected to play on Mars, sought to ensure a balance between survival-orientation and self-expression in everyone’s daily life, while making room for Star City residents to develop multi-layered identities as members of their more intimate village communities and, at the same time, as citizens of a vibrant and forward-looking technological civilization.”</p> <p>In addition to building community by nurturing the well-being of its human residents, Star City will also build a viable economy and political system to ensure that commerce and governance provide stability for its residents. To pay for importing much-needed supplies from Earth in the short term, Star City residents will leverage their local know-how, infrastructure, and heavy equipment to provide construction services to others who may wish to build a city on Mars. 
In the long term, Star City could establish itself as a central hub for innovation, entrepreneurship, and tourism as humanity travels farther and farther into the reaches of space.&nbsp;&nbsp;</p> <p>“Our vision is not to simply send human explorers to Mars in order to set up these scientific outposts where we can perform useful experiments, though that is an important and valuable component,” says Robert Zubrin, president of Pioneer Astronautics and the founder and president of the Mars Society, who organized the contest and served on the panel of judges. “The fundamental question we are asking is if we can expand human civilization into other worlds. Of course, you have to have the correct technical analysis, but there are all of these other human dimensions to make a colony on Mars work, and Star City addressed those in the most successful way.”</p> <p>The Star City sociotechnical concept and urban plan was created by George and Alexandros Lordos, with architectural support for the creation of design studies, drawings, and renderings by lead architects Nikos Papapanousis and Tatiana Kouppa, and their team members Efi Koutsaftaki, Aliki Noula, and Aris Michailidis of Delta Architects, Athens, Greece.</p> Star City, a concept for a human city on Mars, won first place at the Mars Colony Prize Design contest. 
The design, led by MIT PhD student George Lordos and his brother Alexandros, features five villages constructed around a crater rim.Image: Star City Team/Delta ArchitectsAeronautical and astronautical engineering, School of Engineering, Mars, Design, Arts, space, Space, astronomy and planetary science, System Design and Management, Contests and academic competitions, NASA, Alumni/ae 3 Questions: When the student becomes the teacher Grad student Brandon Leshchinskiy created EarthDNA Ambassadors, an outreach program “for the Earth, for future generations.” Tue, 05 Nov 2019 12:15:01 -0500 Sara Cody | Aeronautics and Astronautics <p><em>As a master’s student in the <a href="">Department of Aeronautics and Astronautics</a> and the <a href="">Technology and Policy Program</a> at MIT, Brandon Leshchinskiy has an ultimate goal: to “build AI tools to adapt to climate change and the educational tools to stop it.” As part of his graduate thesis, in collaboration with <a href="">MIT Portugal</a> and EarthDNA, both led by <a href="">Dava Newman</a>, the Apollo Professor of Aeronautics and Astronautics, Leshchinskiy created <a href="">EarthDNA’s Ambassadors</a>, an outreach program “for the Earth, for future generations.”</em></p> <p><em>The program aims to empower high school students to speak loudly and often about climate change, by leveraging the energy of college students and recent graduates who are passionate about infusing these conversations into their local communities. EarthDNA Ambassadors provides resources, including a Climate 101 presentation, email templates, surveys, and other materials to support these outreach efforts in local communities. 
Leshchinskiy spoke about the program in a recent interview.</em></p> <p><strong>Q: </strong>Why are you targeting college students and recent grads to participate in educating local high schoolers?</p> <p><strong>A:</strong> As an undergraduate, I participated in a lot of STEM outreach activities, and so I know firsthand that college students have a lot of energy to give back, and there are a lot of institutional resources available for these efforts. College students have this intrinsic capacity and desire for this type of work, so we feel that college students and recent graduates would be great emissaries in our effort.</p> <p>Climate change is an issue that has become more and more of a cultural priority, especially among the younger generations. Recent UN/IPCC reports show we have roughly 10 years before climate impacts could start to spiral out of control, and I think this younger generation is much more attuned to this because we have grown up experiencing the realities of climate change. Because of this, I think young people feel disenfranchised by the status quo and are therefore much more motivated towards activism.</p> <p>To that end, I think there is a much greater sense of trust between peers due to our similar shared experiences. We all understand how high the stakes are here, and I think college students or recent graduates are better able to appeal to high school students in a way that’s meaningful.</p> <p><strong>Q: </strong>What does the process look like to get involved with EarthDNA Ambassadors and what sort of activities and other resources does the program entail?<strong> </strong></p> <p><strong>A:</strong> Our first goal is to foster a sense of community among people who are passionate about climate change, so first and foremost we encourage interested parties to join our Slack community to start the conversation. 
We share resources that follow the three key steps of our program: Reach out, where volunteers connect with local high schools where they want to present; Present, where they prepare and present the information in their classroom of choice; and Follow up, where volunteers follow up with teachers a day after, a week after, and a month after their presentation, collecting survey data to help us measure the impact of our program.</p> <p>On our website, we offer training resources and other material for our volunteers, such as email templates for contacting teachers, presentation tips and guidelines, recordings of sample presentations, and step-by-step instructions about our “Climate 101” presentation and interactive activity.</p> <p>The goal of our educational program is to present a cohesive narrative that tells the full story of climate change. Solving this problem will require interdisciplinary effort, and right now I think there is this huge misconception that climate education only belongs in science class. Don’t get me wrong — climate-change education absolutely belongs in a science classroom. But history teachers can provide valuable perspective on past interactions between humans and our home planet; visual arts teachers can foster a community dialogue that captures our intimate relationship with Earth’s climate; and since a key component of climate monitoring involves working with data, climate-change education belongs in computer science and math classrooms as well. In the social sciences, climate change is a big economics problem. Economic models assume continuous growth — but they are competing against physics, which sets a clear limit due to finite resources. Physics will win. Still, if we are going to solve climate change, we have to tell the whole story by reconciling all of these perspectives.</p> <p><strong>Q: </strong>What do you hope to accomplish with this program?</p> <p><strong>A:</strong> Our goal is to broaden access to climate literacy. 
People tend to filter information through their values, ideologies, and experiences, but in order to make the systemic changes required, we’ll need some level of government intervention, which not all citizens are comfortable with. In general, parents do trust their kids, so if we can get adolescents to talk about the impacts of climate change on their lives, we can at least help start the conversation where there isn’t necessarily one happening right now. One of the questions we ask in the follow-up survey is how many times respondents speak about climate change per week. Eventually, we hope to see that we have helped foster thousands of conversations about climate change that may not have happened otherwise.</p> Brandon Leshchinskiy Photo: Sara Cody/Department of Aeronautics and AstronauticsMIT Portugal, Technology and policy, Climate change, Students, Graduate, postdoctoral, Sustainability, STEM education, Science communications, Aeronautical and astronautical engineering, School of Engineering, 3 Questions, K-12 education, Artificial intelligence, Volunteering, outreach, public service, Policy Autonomous system improves environmental sampling at sea Robotic boats could more rapidly locate the most valuable sampling spots in uncharted waters. Mon, 04 Nov 2019 14:54:51 -0500 Rob Matheson | MIT News Office <p>An autonomous robotic system invented by researchers at MIT and the Woods Hole Oceanographic Institution (WHOI) efficiently sniffs out the most scientifically interesting — but hard-to-find —&nbsp;sampling spots in vast, unexplored waters.</p> <p>Environmental scientists are often interested in gathering samples at the most interesting locations, or “maxima,” in an environment. One example could be a source of leaking chemicals, where the concentration is the highest and mostly unspoiled by external factors. 
But a maximum can be any quantifiable value that researchers want to measure, such as water depth or parts of coral reef most exposed to air.</p> <p>Efforts to deploy maximum-seeking robots suffer from efficiency and accuracy issues. Commonly, robots will move back and forth like lawnmowers to cover an area, which is time-consuming and collects many uninteresting samples. Some robots sense and follow high-concentration trails to their leak source. But they can be misled. For example, chemicals can get trapped and accumulate in crevices far from a source. Robots may identify those high-concentration spots as the source yet be nowhere close.</p> <p>In a paper being presented at the International Conference on Intelligent Robots and Systems (IROS), the researchers describe “PLUMES,” a system that enables autonomous mobile robots to zero in on a maximum far faster and more efficiently. PLUMES leverages probabilistic techniques to predict which paths are likely to lead to the maximum, while navigating obstacles, shifting currents, and other variables. As it collects samples, it weighs what it’s learned to determine whether to continue down a promising path or search the unknown — which may harbor more valuable samples.</p> <p>Importantly, PLUMES reaches its destination without ever getting trapped in those tricky high-concentration spots. “That’s important, because it’s easy to think you’ve found gold, but really you’ve found fool’s gold,” says co-first author Victoria Preston, a PhD student in the Computer Science and Artificial Intelligence Laboratory (CSAIL) and in the MIT-WHOI Joint Program.</p> <p>The researchers built a PLUMES-powered robotic boat that successfully detected the most exposed coral head in the Bellairs Fringing Reef in Barbados —&nbsp;meaning, it was located in the shallowest spot —&nbsp;which is useful for studying how sun exposure impacts coral organisms. 
In 100 simulated trials in diverse underwater environments, a virtual PLUMES robot also consistently collected seven to eight times more samples of maxima than traditional coverage methods in allotted time frames.</p> <p>“PLUMES does the minimal amount of exploration necessary to find the maximum and then concentrates quickly on collecting valuable samples there,” says co-first author Genevieve Flaspohler, a PhD student in CSAIL and the MIT-WHOI Joint Program.</p> <p>Joining Preston and Flaspohler on the paper are Anna P.M. Michel and Yogesh Girdhar, both scientists in the Department of Applied Ocean Physics and Engineering at the WHOI; and Nicholas Roy, a professor in CSAIL and in the Department of Aeronautics and Astronautics. &nbsp;</p> <p><strong>Navigating an exploit-explore tradeoff</strong></p> <p>A key insight behind PLUMES was to use probabilistic techniques to reason about the notoriously complex tradeoff between exploiting what’s learned about the environment and exploring unknown areas that may be more valuable.</p> <p>“The major challenge in maximum-seeking is allowing the robot to balance exploiting information from places it already knows to have high concentrations and exploring places it doesn’t know much about,” Flaspohler says. “If the robot explores too much, it won’t collect enough valuable samples at the maximum. If it doesn’t explore enough, it may miss the maximum entirely.”</p> <p>Dropped into a new environment, a PLUMES-powered robot uses a probabilistic statistical model called a Gaussian process to make predictions about environmental variables, such as chemical concentrations, and estimate sensing uncertainties. PLUMES then generates a distribution of possible paths the robot can take, and uses the estimated values and uncertainties to rank each path by how well it allows the robot to explore and exploit.</p> <p>At first, PLUMES will choose paths that randomly explore the environment. 
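The kind of explore-exploit path ranking described above can be sketched with a toy one-dimensional Gaussian process and an upper-confidence-bound style score. This is an illustrative sketch only: the kernel, the scoring rule, and the path representation here are assumptions, not the authors' actual objective function.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0):
    """Squared-exponential kernel between two sets of 1-D locations."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_obs, y_obs, x_query, noise=1e-4):
    """Posterior mean and variance of a zero-mean Gaussian process."""
    K = rbf_kernel(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = rbf_kernel(x_query, x_obs)
    mean = Ks @ np.linalg.solve(K, y_obs)
    cov = rbf_kernel(x_query, x_query) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.clip(np.diag(cov), 0.0, None)

def score_path(path, x_obs, y_obs, beta=2.0):
    """UCB-style score: predicted value (exploit) plus an uncertainty bonus (explore)."""
    mean, var = gp_posterior(x_obs, y_obs, np.asarray(path, dtype=float))
    return float(np.mean(mean + beta * np.sqrt(var)))

# Samples collected so far: the measured concentration peaks near x = 2.
x_obs = np.array([0.0, 1.0, 2.0, 3.0])
y_obs = np.array([0.1, 0.5, 1.0, 0.4])

# Candidate paths: one revisits the known peak, one heads into unsampled territory.
exploit_path = [1.8, 2.0, 2.2]
explore_path = [8.0, 9.0, 10.0]
print(score_path(exploit_path, x_obs, y_obs))
print(score_path(explore_path, x_obs, y_obs))  # higher: uncertainty dominates early
```

With only a few samples, the faraway path wins on its uncertainty bonus, so the robot explores; as samples accumulate, posterior variance shrinks and high predicted values dominate the score.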
Each sample, however, provides new information about the targeted values in the surrounding environment — such as spots with highest concentrations of chemicals or shallowest depths. The Gaussian process model exploits that data to narrow down possible paths the robot can follow from its given position to sample from locations with even higher value. PLUMES uses a novel objective function — of the reward-maximizing kind common in machine learning — to decide whether the robot should exploit past knowledge or explore the new area.</p> <p><strong>“Hallucinating” paths</strong></p> <p>The decision of where to collect the next sample relies on the system’s ability to “hallucinate” all possible future actions from its current location. To do so, it leverages a modified version of Monte Carlo Tree Search (MCTS), a path-planning technique popularized for powering artificial-intelligence systems that master complex games, such as Go and chess.</p> <p>MCTS uses a decision tree — a map of connected nodes and lines — to simulate a path, or sequence of moves, needed to reach a final winning action. But in games, the space for possible paths is finite. In unknown environments, with real-time changing dynamics, the space is effectively infinite, making planning extremely difficult. The researchers designed “continuous-observation MCTS,” which leverages the Gaussian process and the novel objective function to search over this unwieldy space of possible real paths.</p> <p>The root of this MCTS decision tree starts with a “belief” node, which is the next immediate step the robot can take. This node contains the entire history of the robot’s actions and observations up until that point. 
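Concretely, a belief node of this kind might be represented as follows. This is a minimal sketch; the class and field names are assumptions for illustration, not the paper's actual data structures.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class BeliefNode:
    """One node in the search tree: a candidate next step plus the full
    history of (action, observation) pairs that led to it."""
    history: List[Tuple[str, float]]
    children: List["BeliefNode"] = field(default_factory=list)
    visits: int = 0
    total_reward: float = 0.0

    def expand(self, actions: List[str]) -> None:
        """Add one child per candidate action; each child inherits the parent's
        history. Observations are filled in later, when the planner simulates
        taking a sample there."""
        for a in actions:
            self.children.append(BeliefNode(history=self.history + [(a, 0.0)]))

    def mean_reward(self) -> float:
        return self.total_reward / self.visits if self.visits else 0.0

# The root's history holds everything the robot has done and seen so far.
root = BeliefNode(history=[("move_north", 0.42), ("move_east", 0.57)])
root.expand(["move_north", "move_east", "sample_here"])
print(len(root.children))             # 3 candidate next steps
print(len(root.children[0].history))  # 3: inherited history plus the new action
```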
Then, the system expands the tree from the root into new lines and nodes, looking over several steps of future actions that lead to explored and unexplored areas.</p> <p>Then, the system simulates what would happen if it took a sample from each of those newly generated nodes, based on some patterns it has learned from previous observations. Depending on the value of the final simulated node, the entire path receives a reward score, with higher values equaling more promising actions. Reward scores from all paths are rolled back to the root node. The robot selects the highest-scoring path, takes a step, and collects a real sample. Then, it uses the real data to update its Gaussian process model and repeats the “hallucination” process.</p> <p>“As long as the system continues to hallucinate that there may be a higher value in unseen parts of the world, it must keep exploring,” Flaspohler says. “When it finally converges on a spot it estimates to be the maximum, because it can’t hallucinate a higher value along the path, it then stops exploring.”</p> <p>Now, the researchers are collaborating with scientists at WHOI to use PLUMES-powered robots to localize chemical plumes at volcanic sites and study methane releases in melting coastal estuaries in the Arctic. Scientists are interested in the source of chemical gases released into the atmosphere, but these test sites can span hundreds of square miles.</p> <p>“They can [use PLUMES to] spend less time exploring that huge area and really concentrate on collecting scientifically valuable samples,” Preston says.</p> Even in unexplored waters, an MIT-developed robotic system can efficiently sniff out valuable, hard-to-find spots to collect samples from. 
When implemented in autonomous boats deployed off the coast of Barbados (pictured), the system quickly found the most exposed coral head — meaning it was located in the shallowest spot — which is useful for studying how sun exposure impacts coral organisms.Image courtesy of the researchersResearch, Computer science and technology, Algorithms, Computer Science and Artificial Intelligence Laboratory (CSAIL), Autonomous vehicles, Machine learning, Artificial intelligence, Environment, Robots, Robotics, Oceanography and ocean engineering, Aeronautical and astronautical engineering, School of Engineering Technique helps robots find the front door Navigation method may speed up autonomous last-mile delivery. Mon, 04 Nov 2019 00:00:00 -0500 Jennifer Chu | MIT News Office <p>In the not-too-distant future, robots may be dispatched as last-mile delivery vehicles to drop your takeout order, package, or meal-kit subscription at your doorstep — if they can find the door.</p> <p>Standard approaches for robotic navigation involve mapping an area ahead of time, then using algorithms to guide a robot toward a specific goal or GPS coordinate on the map. While this approach might make sense for exploring specific environments, such as the layout of a particular building or planned obstacle course, it can become unwieldy in the context of last-mile delivery.</p> <p>Imagine, for instance, having to map in advance every single neighborhood within a robot’s delivery zone, including the configuration of each house within that neighborhood along with the specific coordinates of each house’s front door. Such a task can be difficult to scale to an entire city, particularly as the exteriors of houses often change with the seasons. Mapping every single house could also run into issues of security and privacy.</p> <p>Now MIT engineers have developed a navigation method that doesn’t require mapping an area in advance. 
Instead, their approach enables a robot to use clues in its environment to plan out a route to its destination, which can be described in general semantic terms, such as “front door” or “garage,” rather than as coordinates on a map. For example, if a robot is instructed to deliver a package to someone's front door, it might start on the road and see a driveway, which it has been trained to recognize as likely to lead toward a sidewalk, which in turn is likely to lead to the front door.</p> <div class="cms-placeholder-content-video"></div> <p>The new technique can greatly reduce the time a robot spends exploring a property before identifying its target, and it doesn’t rely on maps of specific residences.&nbsp;</p> <p>“We wouldn’t want to have to make a map of every building that we’d need to visit,” says Michael Everett, a graduate student in MIT’s Department of Mechanical Engineering. “With this technique, we hope to drop a robot at the end of any driveway and have it find a door.”</p> <p>Everett will present the group’s results this week at the International Conference on Intelligent Robots and Systems. 
The paper, which is co-authored by Jonathan How, professor of aeronautics and astronautics at MIT, and Justin Miller of the Ford Motor Company, is a finalist for “Best Paper for Cognitive Robots.”</p> <p><strong>“A sense of what things are”</strong></p> <p>In recent years, researchers have worked on introducing natural, semantic language to robotic systems, training robots to recognize objects by their semantic labels, so they can visually process a door as a door, for example, and not simply as a solid, rectangular obstacle.</p> <p>“Now we have an ability to give robots a sense of what things are, in real-time,” Everett says.</p> <p>Everett, How, and Miller are using similar semantic techniques as a springboard for their new navigation approach, which leverages pre-existing algorithms that extract features from visual data to generate a new map of the same scene, represented as semantic clues, or context.</p> <p>In their case, the researchers used an algorithm to build up a map of the environment as the robot moved around, using the semantic labels of each object and a depth image. This algorithm is called semantic SLAM (Simultaneous Localization and Mapping).</p> <p>While other semantic algorithms have enabled robots to recognize and map objects in their environment for what they are, they haven’t allowed a robot to make decisions in the moment, while navigating a new environment, about the most efficient path to take to a semantic destination such as a “front door.”</p> <p>“Before, exploring was just, plop a robot down and say ‘go,’ and it will move around and eventually get there, but it will be slow,” How says.</p> <p><strong>The cost to go</strong></p> <p>The researchers looked to speed up a robot’s path-planning through a semantic, context-colored world. 
They developed a new “cost-to-go estimator,” an algorithm that converts a semantic map created by preexisting SLAM algorithms into a second map, representing the likelihood of any given location being close to the goal.</p> <p>“This was inspired by image-to-image translation, where you take a picture of a cat and make it look like a dog,” Everett says. “The same type of idea happens here where you take one image that looks like a map of the world, and turn it into this other image that looks like the map of the world but now is colored based on how close different points of the map are to the end goal.”</p> <p>The cost-to-go map is rendered in grayscale: darker regions represent locations far from the goal, and lighter regions areas close to it. For instance, the sidewalk, coded in yellow in a semantic map, might be translated by the cost-to-go algorithm as a darker region in the new map, compared with a driveway, which is progressively lighter as it approaches the front door — the lightest region in the new map.</p> <p>The researchers trained this new algorithm on satellite images from Bing Maps containing 77 houses from one urban and three suburban neighborhoods. The system converted a semantic map into a cost-to-go map, and mapped out the most efficient path, following lighter regions in the map, to the end goal. For each satellite image, Everett assigned semantic labels and colors to context features in a typical front yard, such as gray for a front door, blue for a driveway, and green for a hedge.</p> <p>During this training process, the team also applied masks to each image to mimic the partial view that a robot’s camera would likely have as it traverses a yard.</p> <p>“Part of the trick to our approach was [giving the system] lots of partial images,” How explains. “So it really had to figure out how all this stuff was interrelated.
That’s part of what makes this work robustly.”</p> <p>The researchers then tested their approach in simulation on an image of an entirely new house, outside of the training dataset, first using the preexisting SLAM algorithm to generate a semantic map, then applying their new cost-to-go estimator to generate a second map, and a path to a goal, in this case the front door.</p> <p>The group’s new cost-to-go technique found the front door 189 percent faster than classical navigation algorithms, which do not take context or semantics into account, and instead spend excessive steps exploring areas that are unlikely to be near their goal.</p> <p>Everett says the results illustrate how robots can use context to efficiently locate a goal, even in unfamiliar, unmapped environments.</p> <p>“Even if a robot is delivering a package to an environment it’s never been to, there might be clues that will be the same as other places it’s seen,” Everett says. “So the world may be laid out a little differently, but there’s probably some things in common.”</p> <p>This research is supported, in part, by the Ford Motor Company.</p> For last-mile delivery, robots of the future may use a new MIT algorithm to find the front door, using clues in their environment. Image: MIT News. Research, Aeronautical and astronautical engineering, School of Engineering, Algorithms, Artificial intelligence, Machine learning, Autonomous vehicles, Robots, Robotics, Software Helping autonomous vehicles see around corners By sensing tiny changes in shadows, a new system identifies approaching objects that may cause a collision. Sun, 27 Oct 2019 23:59:59 -0400 Rob Matheson | MIT News Office <p>To improve the safety of autonomous systems, MIT engineers have developed a system that can sense tiny changes in shadows on the ground to determine if there’s a moving object coming around the corner.
&nbsp;</p> <p>Autonomous cars could one day use the system to quickly avoid a potential collision with another car or pedestrian emerging from around a building’s corner or from in between parked cars. In the future, robots that may navigate hospital hallways to make medication or supply deliveries could use the system to avoid hitting people.</p> <p>In a paper being presented at next week’s International Conference on Intelligent Robots and Systems (IROS), the researchers describe successful experiments with an autonomous car driving around a parking garage and an autonomous wheelchair navigating hallways. When sensing and stopping for an approaching vehicle, the car-based system beats traditional LiDAR — which can only detect visible objects — by more than half a second.</p> <p>That may not seem like much, but fractions of a second matter when it comes to fast-moving autonomous vehicles, the researchers say.</p> <p>“For applications where robots are moving around environments with other moving objects or people, our method can give the robot an early warning that somebody is coming around the corner, so the vehicle can slow down, adapt its path, and prepare in advance to avoid a collision,” adds co-author Daniela Rus, director of the Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science. “The big dream is to provide ‘X-ray vision’ of sorts to vehicles moving fast on the streets.”</p> <p>Currently, the system has only been tested in indoor settings. 
Robotic speeds are much lower indoors, and lighting conditions are more consistent, making it easier for the system to sense and analyze shadows.</p> <p>Joining Rus on the paper are: first author Felix Naser SM ’19, a former CSAIL researcher; Alexander Amini, a CSAIL graduate student; Igor Gilitschenski, a CSAIL postdoc; recent graduate Christina Liao ’19; Guy Rosman of the Toyota Research Institute; and Sertac Karaman, an associate professor of aeronautics and astronautics at MIT.</p> <p><strong>Extending ShadowCam</strong></p> <p>For their work, the researchers built on their system, called “ShadowCam,” that uses computer-vision techniques to detect and classify changes to shadows on the ground. MIT professors William Freeman and Antonio Torralba, who are not co-authors on the IROS paper, collaborated on the earlier versions of the system, which were presented at conferences in 2017 and 2018.</p> <p>For input, ShadowCam uses sequences of video frames from a camera targeting a specific area, such as the floor in front of a corner. It detects changes in light intensity over time, from image to image, that may indicate something moving away or coming closer. Some of those changes may be difficult to detect or invisible to the naked eye, and can be determined by various properties of the object and environment. ShadowCam computes that information and classifies each image as containing a stationary object or a dynamic, moving one. If it classifies an image as dynamic, it reacts accordingly.</p> <p>Adapting ShadowCam for autonomous vehicles required a few advances. The early version, for instance, relied on lining an area with augmented reality labels called “AprilTags,” which resemble simplified QR codes. Robots scan AprilTags to detect and compute their precise 3D position and orientation relative to the tag. ShadowCam used the tags as features of the environment to zero in on specific patches of pixels that may contain shadows.
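The core classification step can be sketched as follows. This is a simplified stand-in for ShadowCam’s pipeline, assuming frames have already been registered to a common viewpoint; the patch size, noise level, and threshold are all hypothetical:

```python
import numpy as np

def classify_patch(frames, threshold=2.0):
    """frames: (T, H, W) grayscale crops of the same floor patch over time.
    Measures the mean absolute frame-to-frame intensity change: a moving
    object's shadow perturbs the patch, a static scene barely does."""
    frames = frames.astype(np.float64)
    change = np.abs(np.diff(frames, axis=0)).mean()
    return "dynamic" if change > threshold else "static"

rng = np.random.default_rng(0)
# A static patch: constant brightness plus small sensor noise.
static = np.full((10, 8, 8), 100.0) + rng.normal(0, 0.5, (10, 8, 8))
# A dynamic patch: a shadow darkens the floor halfway through the sequence.
moving = static.copy()
moving[5:] -= 20.0
print(classify_patch(static), classify_patch(moving))  # static dynamic
```

The real system adds registration, region-of-interest selection, and signal amplification before a threshold test like this one.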
But modifying real-world environments with AprilTags is not practical.</p> <p>The researchers developed a novel process that combines image registration and a new visual-odometry technique. Often used in computer vision, image registration essentially overlays multiple images to reveal variations in the images. Medical image registration, for instance, overlaps medical scans to compare and analyze anatomical differences.</p> <p>Visual odometry, used for Mars rovers, estimates the motion of a camera in real-time by analyzing pose and geometry in sequences of images. The researchers specifically employ “Direct Sparse Odometry” (DSO), which can compute feature points in environments similar to those captured by AprilTags. Essentially, DSO plots features of an environment on a 3D point cloud, and then a computer-vision pipeline selects only the features located in a region of interest, such as the floor near a corner. (Regions of interest were annotated manually beforehand.)</p> <p>As ShadowCam takes input image sequences of a region of interest, it uses the DSO-image-registration method to overlay all the images from the same viewpoint of the robot. Even as a robot is moving, it’s able to zero in on the exact same patch of pixels where a shadow is located to help it detect any subtle deviations between images.</p> <p>Next is signal amplification, a technique introduced in the first paper. Pixels that may contain shadows get a boost in color that increases the signal-to-noise ratio. This makes extremely weak signals from shadow changes far more detectable. If the boosted signal reaches a certain threshold — based partly on how much it deviates from other nearby shadows —&nbsp;ShadowCam classifies the image as “dynamic.” Depending on the strength of that signal, the system may tell the robot to slow down or stop.</p> <p>“By detecting that signal, you can then be careful.
It may be a shadow of some person running from behind the corner or a parked car, so the autonomous car can slow down or stop completely,” Naser says.</p> <p><strong>Tag-free testing</strong></p> <p>In one test, the researchers evaluated the system’s performance in classifying moving or stationary objects using AprilTags and the new DSO-based method. An autonomous wheelchair steered toward various hallway corners while humans turned the corner into the wheelchair’s path. Both methods achieved the same 70-percent classification accuracy, indicating AprilTags are no longer needed.</p> <p>In a separate test, the researchers implemented ShadowCam in an autonomous car in a parking garage, where the headlights were turned off, mimicking nighttime driving conditions. They compared car-detection times versus LiDAR. In an example scenario, ShadowCam detected the car turning around pillars about 0.72 seconds faster than LiDAR. Moreover, because the researchers had tuned ShadowCam specifically to the garage’s lighting conditions, the system achieved a classification accuracy of around 86 percent.</p> <p>Next, the researchers are developing the system further to work in different indoor and outdoor lighting conditions. 
In the future, there could also be ways to speed up the system’s shadow detection and automate the process of annotating targeted areas for shadow sensing.</p> <p>This work was funded by the Toyota Research Institute.</p> MIT engineers have developed a system for autonomous vehicles that senses tiny changes in shadows on the ground to determine if there’s a moving object coming around the corner, such as when another car is approaching from behind a pillar in a parking garage. Research, Computer Science and Artificial Intelligence Laboratory (CSAIL), Electrical Engineering & Computer Science (eecs), Aeronautical and astronautical engineering, School of Engineering, Computer science and technology, Algorithms, Robotics, Robots, Autonomous vehicles, Automobiles, Artificial intelligence, Machine learning, Transportation, Technology and society System prevents speedy drones from crashing in unfamiliar areas Drones can fly at high speeds to a destination while keeping safe “backup” plans if things go awry. Fri, 25 Oct 2019 09:21:43 -0400 Rob Matheson | MIT News Office <p>Autonomous drones are cautious when navigating the unknown. They creep forward, frequently mapping unfamiliar areas before proceeding lest they crash into undetected objects. But this slowdown isn’t ideal for drones carrying out time-sensitive tasks, such as flying search-and-rescue missions through dense forests. &nbsp;</p> <p>Now MIT researchers have developed a trajectory-planning model that helps drones fly at high speeds through previously unexplored areas, while staying safe.</p> <p>The model — aptly named “FASTER” — estimates the quickest possible path from a starting point to a destination point across all areas the drone can and can’t see, with no regard for safety. But, as the drone flies, the model continuously logs collision-free “back-up” paths that slightly deviate from that fast flight path.
When the drone is unsure about a particular area, it detours down the back-up path and replans its route. The drone can thus cruise at high speeds along the quickest trajectory while occasionally slowing down slightly to ensure safety.</p> <p>“We always want to execute the fastest path, but we don’t always know it’s safe. If, as we move along this fastest path, we discover there’s a problem, we need to have a backup plan,” says Jesus Tordesillas, a graduate student in the Department of Aeronautics and Astronautics (AeroAstro) and first author on a paper describing the model being presented at next month’s International Conference on Intelligent Robots and Systems. “We obtain a higher velocity trajectory that may not be safe and a slow-velocity trajectory that’s completely safe. The two paths are stitched together at first, but then one deviates for performance and the other for safety.”</p> <p>In forest simulations, where a virtual drone navigates around cylinders representing trees, FASTER-powered drones safely completed flight paths about two times quicker than traditional models. In real-life tests, FASTER-powered drones maneuvering around cardboard boxes in a large room achieved speeds of 7.8 meters per second. That’s pushing limits for how fast the drones can fly, based on weight and reaction times, the researchers say.</p> <p>“That’s about as fast as you can go,” says co-author Jonathan How, the Richard Cockburn Maclaurin Professor of Aeronautics and Astronautics. “If you were standing in a room with a drone flying 7 to 8 meters per second in it, you’d probably take a step back."</p> <p>The paper’s other co-author is Brett T. Lopez, a former PhD student in AeroAstro and now a postdoc at NASA’s Jet Propulsion Laboratory.</p> <p><strong>Splitting paths </strong></p> <p>Drones use cameras to capture the environment as voxels, 3D cubes generated from depth information.
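The voxelization step can be sketched as back-projecting a depth image into 3D points and snapping them to a cube grid. The camera intrinsics and voxel size below are hypothetical, and the ray-tracing that a real mapper uses to mark free space between the camera and each hit is omitted:

```python
import numpy as np

FX = FY = 50.0        # hypothetical focal lengths (pixels)
CX, CY = 32.0, 24.0   # hypothetical principal point
VOXEL = 0.25          # cube edge length in meters

def depth_to_voxels(depth):
    """depth: (H, W) array of range values in meters; 0 = no return.
    Back-projects each valid pixel to a 3D point with the pinhole model,
    then floors the coordinates onto a voxel grid (occupied-known cells)."""
    h, w = depth.shape
    v, u = np.mgrid[0:h, 0:w]
    z = depth
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts = np.stack([x, y, z], axis=-1)[depth > 0]        # valid hits only
    occupied = np.unique(np.floor(pts / VOXEL).astype(int), axis=0)
    return {tuple(p) for p in occupied}

depth = np.zeros((48, 64))
depth[20:28, 30:34] = 2.0    # a small obstacle 2 m ahead of the camera
voxels = depth_to_voxels(depth)
```

Cells never hit by a valid return stay unknown; a full pipeline would additionally mark the traversed cells between camera and obstacle as free-known.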
As the drone flies, each detected voxel gets labeled either as “free-known space,” unoccupied by objects, or as “occupied-known space,” which contains objects. The rest of the environment is “unknown space.”&nbsp;</p> <p>FASTER utilizes all of those areas to plan three types of trajectories — “whole,” “safe,” and “committed.” The whole trajectory is the entire path from starting point A to goal location B, through known and unknown areas. To plan it, “convex decomposition,” a technique that breaks down complex models into discrete components, generates overlapping polyhedrons that model those three areas in an environment. Using some geometric techniques and mathematical constraints, the model uses these polyhedrons to compute an optimal whole trajectory.</p> <p>Simultaneously, the model plans a safe trajectory. Somewhere along the whole trajectory, it plots a “rescue” point that indicates the last moment a drone can detour to unobstructed free-known space, based on its speed and other factors. To find a safe destination, it computes new polyhedrons that cover the free-known space. Then, it locates a spot inside these new polyhedrons. Basically, the drone stops in a spot that’s safe but as close as possible to unknown space, enabling a very quick and efficient detour.</p> <p><strong>Committed trajectory</strong></p> <p>The committed trajectory consists of the first interval of the whole trajectory, as well as the entire safe trajectory. But this first interval is independent of the safe trajectory, and therefore it is not affected by the braking needed for the safe trajectory.</p> <p>The drone computes one whole trajectory at a time, while always keeping track of the safe trajectory. But it’s given a time limit: When it reaches the rescue point, it must have successfully computed the next whole trajectory through known or unknown space. If it does, it will continue following the whole trajectory. Otherwise, it diverts to the safe trajectory.
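The replanning contract can be sketched schematically. This is not the paper’s optimization code, only the deadline logic, with hypothetical waypoint names standing in for real trajectory segments:

```python
import time

def fly(plan_whole, committed, deadline_s):
    """plan_whole: callable returning the next whole trajectory, or None
    if planning failed. committed: dict with 'interval' (the first interval,
    flown regardless) and 'safe' (the braking fallback into free-known space).
    deadline_s: time budget before the rescue point is reached."""
    start = time.monotonic()
    next_whole = plan_whole()                 # runs while flying 'interval'
    on_time = (time.monotonic() - start) <= deadline_s
    if next_whole is not None and on_time:
        return committed["interval"] + next_whole        # keep cruising
    return committed["interval"] + committed["safe"]     # brake to safety

committed = {"interval": ["wp1", "wp2"], "safe": ["stop_in_free_space"]}
route = fly(lambda: ["wp3", "wp4"], committed, deadline_s=1.0)
print(route)  # ['wp1', 'wp2', 'wp3', 'wp4']
```

If the planner returns nothing by the deadline, the same call yields `['wp1', 'wp2', 'stop_in_free_space']`: the drone always has a collision-free option in hand.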
This approach enables the drone to maintain high velocities along the committed trajectories, which is key to achieving high overall speeds.</p> <p>For all of this to work, the researchers designed ways for the drones to process all the planning data very quickly, which was challenging. Because the maps are so varied, for instance, the time limit given to each committed trajectory initially varied dramatically. That was computationally expensive and slowed down the drone’s planning, so the researchers developed a method to quickly compute fixed times for all the intervals along the trajectories, which simplified computations. The researchers also designed methods to reduce how many polyhedrons the drone must process to map its surroundings. Both of those methods dramatically reduced planning times.</p> <p>“How to increase the flight speed and maintain safety is one of the hardest problems for drones’ motion planning,” says Sikang Liu, a software engineer at Waymo, formerly Google’s self-driving car project, and an expert in trajectory-planning algorithms. “This work showed a great solution to this problem by enhancing the existing trajectory generation framework. In the trajectory optimization pipeline, the time allocation is always a tricky problem that could lead to convergence issues and undesired behavior. This paper addressed this problem through a novel approach … which could be an insightful contribution to this field."</p> <p>The researchers are currently building larger FASTER-powered drones with propellers designed to enable steady horizontal flight. Traditionally, drones need to roll and pitch as they fly. But this custom drone would stay completely flat for various applications.</p> <p>A potential application for FASTER, which has been developed with support from the U.S. Department of Defense,&nbsp;could be improving search-and-rescue missions in forest environments, which present many planning and navigational challenges for autonomous drones.
“But the unknown area doesn’t have to be forest,” How says. “It could be any area where you don’t know what’s coming, and it matters how quickly you acquire that knowledge. The main motivation is building more agile drones.”</p> MIT researchers have developed a trajectory-planning model that helps drones fly more safely at high speeds through previously unexplored areas, which could aid search-and-rescue missions through dense forests. Research, Aeronautical and astronautical engineering, School of Engineering, Computer science and technology, Algorithms, Autonomous vehicles, Drones, Robots, Robotics, Computer vision, Technology and society System helps smart devices find their position Connected devices can now share position information, even in noisy, GPS-denied areas. Wed, 02 Oct 2019 23:59:59 -0400 Rob Matheson | MIT News Office <p>A new system developed by researchers at MIT and elsewhere helps networks of smart devices cooperate to find their positions in environments where GPS usually fails.</p> <p>Today, the “internet of things” concept is fairly well-known: Billions of interconnected sensors around the world — embedded in everyday objects, equipment, and vehicles, or worn by humans or animals — collect and share data for a range of applications.</p> <p>An emerging concept, the “localization of things,” enables those devices to sense and communicate their position. This capability could be helpful in supply chain monitoring, autonomous navigation, highly connected smart cities, and even forming a real-time “living map” of the world. Experts project that the localization-of-things market will grow to $128 billion by 2027.</p> <p>The concept hinges on precise localization techniques. Traditional methods leverage GPS satellites or wireless signals shared between devices to establish their relative distances and positions from each other.
But there’s a snag: Accuracy suffers greatly in places with reflective surfaces, obstructions, or other interfering signals, such as inside buildings, in underground tunnels, or in “urban canyons” where tall buildings flank both sides of a street.</p> <p>Researchers from MIT, the University of Ferrara, the Basque Center of Applied Mathematics (BCAM), and the University of Southern California have developed a system that captures location information even in these noisy, GPS-denied areas. A paper describing the system appears in the <em>Proceedings of the IEEE</em>.</p> <p>When devices in a network, called “nodes,” communicate wirelessly in a signal-obstructing, or “harsh,” environment, the system fuses various types of positional information from dodgy wireless signals exchanged between the nodes, as well as digital maps and inertial data. In doing so, each node considers information associated with all possible locations — called “soft information” — in relation to those of all other nodes. The system leverages machine-learning techniques and techniques that reduce the dimensions of processed data to determine possible positions from measurements and contextual data. Using that information, it then pinpoints the node’s position.</p> <p>In simulations of harsh scenarios, the system operates significantly better than traditional methods. Notably, it consistently performed near the theoretical limit for localization accuracy. Moreover, as the wireless environment got increasingly worse, traditional systems’ accuracy dipped dramatically while the new soft information-based system held steady.</p> <p>“When the tough gets tougher, our system keeps localization accurate,” says Moe Win, a professor in the Department of Aeronautics and Astronautics and the Laboratory for Information and Decision Systems (LIDS), and head of the Wireless Information and Network&nbsp;Sciences Laboratory. 
“In harsh wireless environments, you have reflections and echoes that make it far more difficult to get accurate location information. Places like the Stata Center [on the MIT campus] are particularly challenging, because there are surfaces reflecting signals everywhere. Our soft information method is particularly robust in such harsh wireless environments.”</p> <p>Joining Win on the paper are: Andrea Conti of the University of Ferrara; Santiago Mazuelas of BCAM; Stefania Bartoletti of the University of Ferrara; and William C. Lindsey of the University of Southern California.</p> <p><strong>Capturing “soft information”</strong></p> <p>In network localization, nodes are generally referred to as anchors or agents. Anchors are nodes with known positions, such as GPS satellites or wireless base stations. Agents are nodes that have unknown positions — such as autonomous cars, smartphones, or wearables.</p> <p>To localize, agents can use anchors as reference points, or they can share information with other agents to orient themselves. That involves transmitting wireless signals, which arrive at the receiver carrying positional information. The power, angle, and time-of-arrival of the received waveform, for instance, correlate to the distance and orientation between nodes.</p> <p>Traditional localization methods extract one feature of the signal to estimate a single value for, say, the distance or angle between two nodes. Localization accuracy relies entirely on the accuracy of those inflexible (or “hard”) values, and accuracy has been shown to decrease drastically as environments get harsher.</p> <p>Say a node transmits a signal to another node that’s 10 meters away in a building with many reflective surfaces. The signal may bounce around and reach the receiving node at a time corresponding to 13 meters away. 
Traditional methods would likely assign that incorrect distance as a value.</p> <p>For the new work, the researchers decided to try using soft information for localization. The method leverages many signal features and contextual information to create a probability distribution of all possible distances, angles, and other metrics. “It’s called ‘soft information’ because we don’t make any hard choices about the values,” Conti says.</p> <p>The system takes many sample measurements of signal features, including its power, angle, and time of flight. Contextual data come from external sources, such as digital maps and models that capture and predict how the node moves.</p> <p>Back to the previous example: Based on the initial measurement of the signal’s time of arrival, the system still assigns a high probability that the nodes are 13 meters apart. But it assigns a small possibility that they’re 10 meters apart, based on some delay or power loss of the signal. As the system fuses all other information from surrounding nodes, it updates the likelihood for each possible value. For instance, it could ping a map and see that the room’s layout shows it’s highly unlikely both nodes are 13 meters apart. Combining all the updated information, it decides the node is far more likely to be in the position that is 10 meters away.</p> <p>“In the end, keeping that low-probability value matters,” Win says. “Instead of giving a definite value, I’m telling you I’m really confident that you’re 13 meters away, but there’s a smaller possibility you’re also closer. This gives additional information that benefits significantly in determining the positions of the nodes.”</p> <p><strong>Reducing complexity</strong></p> <p>Extracting many features from signals, however, leads to data with large dimensions that can be too complex and inefficient for the system. 
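The fusion in the 10-versus-13-meter example can be illustrated with a toy probability table. The distances and probabilities below are hypothetical, and the real system fuses many more signal features, contextual sources, and nodes:

```python
import numpy as np

# Candidate inter-node distances (meters) and two independent sources of
# "soft information" about them. All numbers are illustrative.
distances = np.array([8.0, 10.0, 13.0, 15.0])

# Time-of-arrival evidence: the delayed echo makes 13 m look most likely,
# but a direct 10 m path with some delay or power loss remains possible.
p_toa = np.array([0.05, 0.25, 0.60, 0.10])

# Map evidence: the room's layout makes a 13 m separation highly unlikely.
p_map = np.array([0.30, 0.50, 0.05, 0.15])

# Fuse by multiplying the likelihoods and renormalizing (Bayes-style update).
posterior = p_toa * p_map
posterior /= posterior.sum()

best = distances[np.argmax(posterior)]
print(best)  # 10.0 -- the fused estimate, despite the 13 m echo
```

A hard method would have committed to 13 meters at the first step; keeping the low-probability alternatives alive is what lets the map evidence correct the echo.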
To improve efficiency, the researchers condensed all signal data into a low-dimensional, easily computable space.</p> <p>To do so, they used “principal component analysis,” a technique that keeps the most useful aspects of multidimensional datasets and discards the rest, to identify which aspects of the received waveforms are most and least useful for pinpointing location, creating a dataset with reduced dimensions. If received waveforms contain 100 sample measurements each, the technique might reduce that number to, say, eight.</p> <p>A final innovation was using machine-learning techniques to learn a statistical model describing possible positions from measurements and contextual data. That model runs in the background to measure how that signal-bouncing may affect measurements, helping to further refine the system’s accuracy.</p> <p>The researchers are now designing ways to use less computation power to work with resource-strapped nodes that can’t transmit or compute all necessary information. They’re also working on bringing the system to “device-free” localization, where some of the nodes can’t or won’t share information.
This will use information about how the signals are backscattered off these nodes, so other nodes know they exist and where they are located.</p> A system designed by researchers at MIT and elsewhere enables interconnected smart devices to cooperatively pinpoint their positions in noisy environments where GPS usually fails, which is useful for emerging “localization-of-things” applications. Image: Christine Daniloff, MIT. Research, Computer science and technology, Algorithms, Machine learning, Mobile devices, Networks, Wireless, Internet, Laboratory for Information and Decision Systems (LIDS), Aeronautical and astronautical engineering, School of Engineering, internet of things An interdisciplinary approach to accelerating human-machine collaboration Professor’s startup brings millimeter-scale location tracking to factories, ports, and other industrial environments. Wed, 02 Oct 2019 00:00:01 -0400 Zach Winn | MIT News Office <p>David Mindell has spent his career defying traditional distinctions between disciplines. His work has explored the ways humans interact with machines, drive innovation, and maintain societal well-being as technology transforms our economy.</p> <p>And, Mindell says, he couldn’t have done it anywhere but MIT. He joined MIT’s faculty 23 years ago after completing his PhD in the Program in Science, Technology, and Society, and he currently holds a dual appointment in engineering and humanities as the Frances and David Dibner Professor of the History of Engineering and Manufacturing in the School of Humanities, Arts, and Social Sciences and professor of aeronautics and astronautics.</p> <p>Mindell’s experience combining fields of study has shaped his ideas about the relationship between humans and machines.
Those ideas are what led him to found Humatics — a startup named from the merger of “human” and “robotics.”</p> <p>Humatics is trying to change the way humans work alongside machines, by enabling location tracking and navigation indoors, underground, and in other areas where technologies like GPS are limited. It accomplishes this by using radio frequencies to track things at the millimeter scale — unlocking what Mindell calls microlocation technology.</p> <p>The company’s solution is already being used in places like shipping ports and factories, where humans work alongside cranes, industrial tools, automated guided vehicles (AGVs), and other machines. These businesses often lack consistent location data for their machines and are forced to adopt inflexible routes for their mobile robots.</p> <p>“One of the holy grails is to have humans and robots share the same space and collaborate, and we’re enabling mobile robots to work in human environments safely and on a large scale,” Mindell says. “Safety is a critical first form of collaboration, but beyond that, we’re just beginning to learn how to work [in settings] where robots and people are exquisitely aware of where they are.”</p> <p><strong>A company decades in the making</strong></p> <p>MIT has a long history of transcending research fields to improve our understanding of the world. Take, for example, Norbert Wiener, who served on MIT’s faculty in the Department of Mathematics between 1919 and his death in 1964.</p> <p>Wiener is credited with formalizing the field of cybernetics, which is an approach to understanding feedback systems he defined as “the scientific study of control and communication in the animal and the machine." 
Cybernetics can be applied to mechanical, biological, cognitive, and social systems, among others, and it sparked a frenzy of interdisciplinary study and scientific collaboration.</p> <p>In 2002, Mindell wrote a book exploring the history of cybernetics before Wiener and its emergence at the intersection of a range of disciplines during World War II. It is one of several books Mindell has written that deal with interdisciplinary responses to complex problems, particularly in extreme environments like lunar landings and the deep sea.</p> <p>The interdisciplinary perspective Mindell forged at MIT has helped him identify the limitations of technology that prevent machines and humans from working together seamlessly.</p> <p>One particular shortcoming that Mindell has thought about for years is the lack of precise location data in places like warehouses, subway systems, and shipping ports.</p> <p>“In five years, we’ll look back at 2019 and say, ‘I can’t believe we didn’t know where anything was,’” Mindell says. “We’ve got so much data floating around, but the link between the actual physical world we all inhabit and move around in and the digital world that’s exploding is really still very poor.”</p> <p>In 2014, Mindell partnered with Humatics co-founder Gary Cohen, who has worked as an intellectual property strategist for biotech companies in the Kendall Square area, to solve the problem.</p> <p>In the beginning of 2015, Mindell collaborated with Lincoln Laboratory alumnus and radar expert Greg Charvat; the two built a prototype navigation system and started the company two weeks later. Charvat became Humatics’ CTO and first employee.</p> <p>“It was clear there was about to be this huge flowering of robotics and autonomous systems and AI, and I thought the things we learned in extreme environments, notably under sea and in aviation, had an enormous amount of application to industrial environments,” Mindell says. 
“The company is about bringing insights from years of experience with remote and autonomous systems in extreme environments into transit, logistics, e-commerce, and manufacturing.”</p> <p><strong>Bringing microlocation to industry</strong></p> <p>Factories, ports, and other locations where GPS data is unworkable or insufficient adopt a variety of solutions to meet their tracking and navigation needs. But each workaround has its drawbacks.</p> <p>RFID and Bluetooth technologies, for instance, can track assets but have short ranges and are expensive to deploy across large areas.</p> <p>Cameras and sensing methods like LIDAR can be used to help machines see their environment, but they struggle with things like rain and different lighting conditions. Floor tape embedded with wires or magnets is also often used to guide machines through fixed routes, but it isn’t well-suited for today’s increasingly dynamic warehouses and production lines.</p> <p>Humatics has focused on making its microlocation system as easy to leverage as possible. The location and tracking data it collects can be integrated into whatever warehouse management system or “internet of things” (IoT) platforms customers are already using.</p> <p>Its radio frequency beacons have a range of up to 500 meters and, when installed as part of a constellation, can pinpoint three-dimensional locations to within 2 centimeters, creating a virtual grid of the surrounding environment.</p> <p>The beacons can be combined with an onboard navigation hub that helps mobile robots move around dynamic environments. Humatics’ system also gathers location data from multiple points at once, monitoring the speed of a forklift, helping a crane operator place a shipping crate, and guiding a robot around obstacles simultaneously.</p> <p>The data Humatics collects don’t just help customers improve their processes; they can also transform the way workers and machines share space and work together.
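</p> <p>How a beacon constellation can pin down a position is easy to sketch with classical multilateration: given known beacon locations and measured ranges, the receiver’s coordinates fall out of a small linear system. The snippet below is a minimal, hypothetical 2D illustration of that geometry only; it is not Humatics’ actual algorithm, which fuses radio-frequency range measurements in three dimensions.</p>

```python
import math

def trilaterate(beacons, ranges):
    """Recover (x, y) from three known beacon positions and measured ranges.

    Subtracting the first range equation from the other two cancels the
    quadratic terms, leaving a 2x2 linear system solved by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges
    a1, b1 = 2 * (x1 - x2), 2 * (y1 - y2)
    c1 = r2**2 - r1**2 + x1**2 - x2**2 + y1**2 - y2**2
    a2, b2 = 2 * (x1 - x3), 2 * (y1 - y3)
    c2 = r3**2 - r1**2 + x1**2 - x3**2 + y1**2 - y3**2
    det = a1 * b2 - a2 * b1  # nonzero when the beacons are not collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Hypothetical setup: beacons at three corners of a workspace,
# ranges measured to a vehicle at (3, 4).
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
truth = (3.0, 4.0)
ranges = [math.dist(b, truth) for b in beacons]
print(trilaterate(beacons, ranges))  # ~ (3.0, 4.0)
```

<p>In practice range measurements are noisy, so real systems use more than the minimum number of beacons and a least-squares fit; the 2-centimeter figure quoted above reflects the precision of the underlying radio-frequency ranging.</p> <p>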
Indeed, with a new chip just emerging from its labs, Mindell says Humatics is moving industries such as manufacturing and logistics into “the world of ubiquitous, millimeter-accurate positioning.”</p> <p>It’s all possible because of the company’s holistic approach to the age-old problem of human-machine interaction.</p> <p>“Humatics is an example of what can happen when we think about technology in a unique, broader context,” Mindell says. “It’s an example of what MIT can accomplish when it pays serious attention to these two ways [from humanities and engineering] of looking at the world.”</p> Humatics co-founder and CEO David Mindell at Humatics headquarters in Waltham, MA.Image: Allegra BovermanInnovation and Entrepreneurship (I&E), Startups, human-robot interaction, Robotics, Robots, Manufacturing, Future of Manufacturing, Autonomous vehicles, Faculty, Program in STS, Aeronautical and astronautical engineering, School of Engineering, School of Humanities Arts and Social Sciences Computing and the search for new planets MIT planetary scientists partner with computer scientists to find exoplanets. Mon, 23 Sep 2019 09:00:00 -0400 Brittany Flaherty | School of Science <p>When MIT launched the <a href="">MIT Stephen A. Schwarzman College of Computing</a> this fall, one of the goals was to drive further innovation in computing across all of MIT’s schools. Researchers are already expanding beyond traditional applications of computer science and using these techniques to advance a range of scientific fields, from cancer medicine to anthropology to design — and to the discovery of new planets.</p> <p>Computation has already proven useful for the Transiting Exoplanet Survey Satellite (TESS), a NASA-funded mission led by MIT. Launched from Cape Canaveral in April 2018, TESS is a satellite that takes images of the sky as it orbits the Earth. These images can help researchers find planets orbiting stars beyond our sun, called exoplanets.
This work, which is now halfway complete, will reveal more about the other planets within what NASA calls our “solar neighborhood.”&nbsp;</p> <p>“TESS just completed the first year of its two-year prime mission, surveying the southern night sky,” says Sara Seager, an astrophysicist and planetary scientist at MIT and deputy director of science for TESS. “TESS found over 1,000 planet candidates and about 20 confirmed planets, some in multiple-planet systems.”</p> <p>While TESS has enabled some impressive discoveries so far, finding these exoplanets is no simple task. TESS is collecting images of more than 200,000 distant stars, saving an image of these stars every two minutes, as well as saving an image of a large swath of sky every 30 minutes. Seager says every two weeks, which is how long it takes the satellite to orbit the Earth, TESS sends about 350 gigabytes of data (once uncompressed) to Earth. While Seager says this is not as much data as people might expect (a 2019 MacBook Pro has up to 512 gigabytes of storage), analyzing the data involves taking many complex factors into consideration.</p> <p>Seager, who says she has long been interested in how computation can be used as a tool for science, began discussing the project with Victor Pankratius, a former principal research scientist in MIT’s Kavli Institute for Astrophysics and Space Research, who is now the director and head of global software engineering at Bosch Sensortec. A trained computer scientist, Pankratius says that after arriving at MIT in 2013, he started thinking about scientific fields that produce big data, but that have not yet fully benefited from computing techniques. After speaking with astronomers like Seager, he learned more about the data their instruments collect and became interested in applying computer-aided discovery techniques to the search for exoplanets.</p> <p>“The universe is a big place,” Pankratius says.
“So I think leveraging what we have on the computer science side is a great thing.”&nbsp;</p> <p>The basic idea underlying TESS’ mission is that like our own solar system, in which the Earth and other planets revolve around a central star (the sun), there are other planets beyond our solar system revolving around different stars. The images TESS collects produce light curves — data that show how the brightness of the star changes over time. Researchers are analyzing these light curves to find drops in brightness, which could indicate that a planet is passing in front of the star and temporarily blocking some of its light.&nbsp;</p> <p>“Every time a planet orbits, you would see this brightness go down,” Pankratius says. “It's almost like a heartbeat.”&nbsp;</p> <p>The trouble is that not every dip in brightness is necessarily caused by a passing planet. Seager says machine learning currently comes into play during the “triage” phase of their TESS data analysis, helping them distinguish between potential planets and other things that could cause dips in brightness, like variable stars, which naturally vary in their brightness, or instrument noise.</p> <p>Analysis on planets that pass through triage is still done by scientists who have learned how to “read” light curves. But the team is now using thousands of light curves that have been classified by eye to teach neural networks how to identify exoplanet transits. Computation is helping them narrow down which light curves they should examine in more detail. Liang Yu PhD ’19, a recent physics graduate, built upon an existing code to write the machine learning tool that the team is now using.</p> <p>While helpful for homing in on the most relevant data, Seager says machine learning cannot yet be used to simply find exoplanets. “We still have a lot of work to do,” she says.</p> <p>Pankratius agrees. 
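</p> <p>The transit signal at the heart of the search (a small, repeating dip in a star’s brightness, the “heartbeat” described above) is straightforward to illustrate. The sketch below generates a toy light curve and flags its dips with a naive threshold. It is purely illustrative, with synthetic numbers and no noise; it is not the TESS pipeline, whose triage relies on neural networks trained on hand-classified light curves.</p>

```python
def make_light_curve(n=2000, period=200, width=10, depth=0.01):
    """Constant flux of 1.0 with a box-shaped transit every `period` samples."""
    return [1.0 - depth if (t % period) < width else 1.0 for t in range(n)]

def find_dips(flux, threshold=0.995):
    """Return (start, end) sample ranges where flux stays below the threshold."""
    dips, start = [], None
    for i, f in enumerate(flux):
        if f < threshold and start is None:
            start = i                      # entering a dip
        elif f >= threshold and start is not None:
            dips.append((start, i))        # leaving a dip
            start = None
    if start is not None:                  # dip running off the end of the data
        dips.append((start, len(flux)))
    return dips

flux = make_light_curve()
dips = find_dips(flux)
print(len(dips), dips[0])  # 10 dips, the first spanning samples 0 to 10
```

<p>Real light curves add stellar variability and instrument noise, which is exactly why, as Seager notes, the team uses machine learning to triage candidates rather than a simple threshold.</p> <p>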
“What we want to do is basically create computer-aided discovery systems that do this for all [stars] all the time,” he says. “You want to just press a button and say, show me everything. But right now it’s still people with some automation vetting all of these light curves.”</p> <p>Seager and Pankratius also co-taught a course that focused on various aspects of computation and artificial intelligence (AI) development in planetary science. Seager says inspiration for the course arose from students’ growing interest in learning about AI and its applications to cutting-edge data science.</p> <p>In 2018, the course allowed students to use actual data collected by TESS to explore machine learning applications for this data. Modeled after another course Seager and Pankratius taught, the class let students choose a scientific problem and learn the computational skills needed to solve it. In this case, students learned about AI techniques and applications to TESS. Seager says students had a great response to the unique class.&nbsp;</p> <p>“As a student, you could actually make a discovery,” Pankratius says. “You can build a machine learning algorithm, run it on this data, and who knows, maybe you will find something new.”</p> <p>Much of the data TESS collects is also readily available as part of a larger citizen science project. Pankratius says anyone with the right tools could start making discoveries of their own. Thanks to cloud connectivity, this is even possible on a cell phone.&nbsp;</p> <p>“If you get bored on your bus ride home, why not search for planets?” he says.</p> <p>Pankratius says this type of collaborative work allows experts in each domain to share their knowledge and learn from each other, rather than each trying to catch up on the other’s field.&nbsp;&nbsp;</p> <p>“Over time, science has become more specialized, so we need ways to integrate the specialists better,” Pankratius says.
The College of Computing could help forge more such collaborations, he adds. Pankratius also says it could attract researchers who work at the intersection of these disciplines, who can bridge gaps in understanding between experts.</p> <p>This type of work integrating computer science is already becoming increasingly common across scientific fields, Seager notes. “Machine learning is ‘in vogue’ right now,” she says.&nbsp;</p> <p>Pankratius says that is in part because there is more evidence that leveraging computer science techniques is an effective way to address various types of problems and growing data sets.</p> <p>“We now have demonstrations in different areas that the computer-aided discovery approach doesn’t just work,” Pankratius says. “It actually leads to new discoveries.”</p> Worlds orbiting stars other than our sun are “exoplanets,” and they come in many sizes, from gas giants larger than Jupiter to small, rocky planets. This illustration of a "super-Earth" represents the type of planet that the TESS mission aims to find outside our solar system.Image: M. Kornmesser/ESOMIT Schwarzman College of Computing, Kavli Institute, Astronomy, Astrophysics, EAPS, Exoplanets, NASA, Planetary science, Research, Satellites, School of Engineering, School of Science, Space, astronomy and planetary science, TESS, Artificial intelligence, Physics, Aeronautical and astronautical engineering, Computer science and technology An immersive experience in industry Through the MechE Alliance’s Industry Immersion Program, graduate students get hands-on experience working on projects across a range of industries. Thu, 19 Sep 2019 13:00:01 -0400 Mary Beth Gallagher | Department of Mechanical Engineering <p>This summer, four mechanical engineering graduate students had the opportunity to gain hands-on experience working in industry.
Through the recently launched <a href="">Industry Immersion Project Program (I2P)</a>, students were paired with a company and tasked with tackling a short-term project. Projects in this inaugural year for the program came from a diverse range of industries, including manufacturing, robotics, and aerospace engineering.</p> <p>A flagship program of the <a href="">MechE Alliance</a>, the I2P Program matches students with a company and project that best fit their own academic experience at MIT. Projects are designed to be short term, lasting three to six months. Building upon programs such as the <a href="">Master of Engineering in Advanced Manufacturing and Design</a> and <a href="">Leaders for Global Operations</a>, which foster collaborations between students and the manufacturing industry, the I2P Program offers graduate students real-world experiences across industries.</p> <p>“For some students, this could be their first experience working in industry before graduating,” says Brian W. Anthony, program faculty director of the I2P Program. “Having that industry experience arms them with knowledge to help make career choices, may inform their further research, and provides skills they will utilize throughout their careers — whether they end up working in academia or industry.”</p> <p>Throughout the course of the projects, students are supported by both a supervisor at the company they’re working for and an academic supervisor from MIT’s mechanical engineering faculty. They are also enrolled in the class 2.992 (Professional Industry Immersion Project), produce a report on their experience, and receive academic credit for their industry projects.</p> <p>“It’s been great hearing just how rich the experience has been from the students who participated this summer,” adds Theresa Werth, program manager for the MechE Alliance.
“Not only have they spent the summer working on a project that’s relevant to their own research or thesis, they have honed some of the softer skills of professional development.”</p> <p>The four students participating in this year’s I2P Program have shared highlights and takeaways from their experiences:</p> <p><strong>Sara Nagelberg&nbsp;— 3M</strong></p> <p>A PhD candidate working with Associate Professor Mathias Kolle in the Bio-Inspired Photonic Engineering research group, Sara Nagelberg studies optical engineering. Through the I2P Program, she spent this summer working at 3M on a project that seeks to automate surface finish analysis in manufacturing by understanding visual perception.</p> <p>While much of manufacturing involves automation, automating quality inspection for the surface finish on appliances or cars poses some technical challenges. The project Nagelberg worked on at 3M aims to define what makes a surface “good,” then develop algorithms so that a computer can determine whether a surface finish is good quality or flawed.</p> <p>“The long-term goal of the project is to automate surface-quality inspection,” Nagelberg explains. She and her team identified parameters that could be used to judge the visual appearance of surfaces — things like color, glossiness, shape, and texture.</p> <p>“By working on this project, I learned about a variety of instruments and metrics that can be used to quantify visual surface finish parameters,” she adds.</p> <p>In addition to gaining experience on an interdisciplinary team at 3M, Nagelberg learned about computer vision, machine learning, and how to relate human perception to measurable parameters.</p> <p><strong>Katie Hahm — Amazon Robotics</strong></p> <p>This summer was one of transition for Katie Hahm. Having graduated with her master’s degree in June, Hahm is now a PhD candidate working in the Device Realization Lab with program director Anthony.
As a master’s student, Hahm previously worked with Professor Harry Asada on designing robotic limbs to help manufacturing workers maintain positions for extended periods of time.</p> <p>Through the I2P Program, Hahm worked on a project at Amazon Robotics to improve efficiencies in the robotic process. “Working on this project was a great academic experience,” says Hahm. “I gained insights into the many facets and complexities of robotics.”</p> <p>Hahm also got a firsthand sense of what it’s like to work at a company like Amazon. She visited a local fulfillment center to gain a deeper understanding of their operations and visited Seattle to attend a company conference. At the conference, she and her fellow interns met with company leadership and teams from other Amazon sectors.</p> <p>One of the biggest takeaways from her experience at Amazon, according to Hahm, was how to approach research projects moving forward. “I learned not only valuable information from working with other professionals, but also the skills and approaches to asking more effective questions for research-oriented work,” she adds.</p> <p><strong>Sai Nithin Reddy Kantareddy — Amazon Robotics</strong></p> <p>A junior PhD candidate, Sai Nithin Reddy Kantareddy spends much of his time using radio frequency identification (RFID) tags to sense activity and gather data about the surrounding environment. These RFID tags can then be used to connect objects to the internet of things.</p> <p>“Going into this summer, I knew I wanted to work on something related to sensors because of my research interest in environmental sensing,” explains Kantareddy. Through the I2P Program, Kantareddy was assigned to a project about material identification and sensing in robotics at Amazon Robotics.</p> <p>“Material identification for robotic applications really aligns with my own research interests,” he adds. While at Amazon Robotics, he gained hands-on experience working with sensors, cameras, and robots.
He also built machine learning models on experimental data.</p> <p>While his background isn’t in robotics research, Kantareddy quickly learned about how robots are designed and what some of the challenges are in field implementation and warehouse automation. In addition to this in-depth technical knowledge, he also gained firsthand experience working in a team setting.</p> <p>“I enjoyed being part of a very resourceful and talented R&amp;D team,” he recalls. “I hope to take back these real-world insights and technical learnings and put them to practice in my PhD work.”</p> <p><strong>Abhishek Patkar — Systems Technology Inc.</strong></p> <p>A second-year master’s student, Abhishek Patkar works in the flight controls group of the Active Adaptive Control Laboratory, led by Senior Research Scientist Anuradha Annaswamy. Working at Systems Technology Inc. (STI) was a natural fit, since much of STI’s work focuses on aerospace engineering.</p> <p>For his internship, Patkar was matched with Aditya Kotikalpudi, a senior research engineer at STI and the principal investigator for NASA’s project entitled Performance Adaptive Aeroelastic Wing. “I primarily worked on system identification and model parameter update for an aeroelastic vehicle,” says Patkar.</p> <p>While his internship was based in Los Angeles, California, Patkar had the opportunity to visit the University of Minnesota and witness the actual process of flight testing. He worked with the real data taken from these flight tests. Patkar also used STI software to identify aeroelastic mode shapes and obtain transfer function estimates from control surfaces to measured quantities like center body pitch rate.&nbsp;</p> <p>“Through this internship, I was able to learn a lot about aircraft dynamics, aeroelasticity, and the process of performing system identification on an aircraft,” Patkar adds.
He expects to use this knowledge back in the flight controls group in the Active Adaptive Control Laboratory.</p> MIT PhD candidates Katie Hahm (left) and Nithin Reddy (right) hiked Skyline Trail in Mount Rainier National Park, Washington, this summer with a friend, Steven Viola. They both spent the summer interning at Amazon Robotics through the MechE Alliance's I2P Program and traveled to Amazon headquarters in Seattle. Photo: Madox SummermatterMechanical engineering, School of Engineering, Classes and programs, Manufacturing, Robotics, Aeronautical and astronautical engineering, STEM education, Industry, Students, Undergraduate, Graduate, postdoctoral, teaching, academics MIT engineers develop “blackest black” material to date Made from carbon nanotubes, the new coating is 10 times darker than other very black materials. Thu, 12 Sep 2019 23:59:59 -0400 Jennifer Chu | MIT News Office <p>With apologies to “Spinal Tap<em>,</em>” it appears that black can, indeed, get more black.</p> <p>MIT engineers report today that they have cooked up a material that is 10 times blacker than anything that has previously been reported. The material is made from vertically aligned carbon nanotubes, or CNTs — microscopic filaments of carbon, like a fuzzy forest of tiny trees, that the team grew on a surface of chlorine-etched aluminum foil. The foil captures at least 99.995 percent* of any incoming light, making it the blackest material on record.</p> <p>The researchers have published their findings today in the journal <em>ACS-Applied Materials and Interfaces.
</em>They are also showcasing the cloak-like material as part of a <a href="" target="_blank">new exhibit</a> today at the New York Stock Exchange, titled <a href="" target="_blank">“The Redemption of Vanity.”</a></p> <p>The artwork, conceived by Diemut Strebe, an artist-in-residence at the MIT Center for Art, Science, and Technology, in&nbsp;collaboration with Brian Wardle, professor of aeronautics and astronautics at MIT, and his group, features a 16.78-carat natural yellow diamond from LJ West Diamonds, estimated to be worth $2 million, which the team coated with the new, ultrablack CNT material. The effect is arresting: The gem, normally brilliantly faceted, appears as a flat, black void.</p> <p>Wardle says the CNT material, aside from making an artistic statement, may also be of practical use, for instance in optical blinders that reduce unwanted glare, to help space telescopes spot orbiting exoplanets.</p> <p>“There are optical and space science applications for very black materials, and of course, artists have been interested in black, going back well before the Renaissance,” Wardle says. “Our material is 10 times blacker than anything that’s ever been reported, but I think the blackest black is a constantly moving target. Someone will find a blacker material, and eventually we’ll understand all the underlying mechanisms, and will be able to properly engineer the ultimate black.”</p> <p>Wardle’s co-author on the paper is former MIT postdoc Kehang Cui, now a professor at Shanghai Jiao Tong University.</p> <p><strong>Into the void</strong></p> <p>Wardle and Cui didn’t intend to engineer an ultrablack material.
Instead, they were experimenting with ways to grow carbon nanotubes on electrically conducting materials such as aluminum, to boost their electrical and thermal properties.</p> <p>But in attempting to grow CNTs on aluminum, Cui ran up against a barrier, literally: an ever-present layer of oxide that coats aluminum when it is exposed to air. This oxide layer acts as an insulator, blocking rather than conducting electricity and heat. As he cast about for ways to remove aluminum’s oxide layer, Cui found a solution in salt, or sodium chloride.</p> <p>At the time, Wardle’s group was using salt and other pantry products, such as baking soda and detergent, to <a href="">grow carbon nanotubes</a>. In their tests with salt, Cui noticed that chloride ions were eating away at aluminum’s surface and dissolving its oxide layer.</p> <p>“This etching process is common for many metals,” Cui says. “For instance, ships suffer corrosion from the chloride in ocean water. Now we’re using this process to our advantage.”</p> <p>Cui found that if he soaked aluminum foil in saltwater, he could remove the oxide layer. He then transferred the foil to an oxygen-free environment to prevent reoxidation, and finally, placed the etched aluminum in an oven, where the group carried out techniques to grow carbon nanotubes via a process called chemical vapor deposition.</p> <p>By removing the oxide layer, the researchers were able to grow carbon nanotubes on aluminum, at much lower temperatures than they otherwise would, by about 100 degrees Celsius. They also saw that the combination of CNTs on aluminum significantly enhanced the material’s thermal and electrical properties — a finding that they expected.</p> <p>What surprised them was the material’s color.</p> <p>“I remember noticing how black it was before growing carbon nanotubes on it, and then after growth, it looked even darker,” Cui recalls.
“So I thought I should measure the optical reflectance of the sample.”</p> <p>“Our group does not usually focus on optical properties of materials, but this work was going on at the same time as our art-science collaborations with Diemut, so art influenced science in this case,” says Wardle.</p> <p>Wardle and Cui, who have applied for a patent on the technology, are making the new CNT process freely available to any artist to use for a noncommercial art project.</p> <p><strong>“Built to take abuse”</strong></p> <p>Cui measured the amount of light reflected by the material, not just from directly overhead, but also from every other possible angle. The results showed that the material absorbed at least 99.995 percent of incoming light, from every angle. In other words, it reflected 10 times less light than all other superblack materials, including Vantablack. If the material contained bumps or ridges, or features of any kind, no matter what angle it was viewed from, these features would be invisible, obscured in a void of black. &nbsp;</p> <p>The researchers aren’t entirely sure of the mechanism contributing to the material’s opacity, but they suspect that it may have something to do with the combination of etched aluminum, which is somewhat blackened, with the carbon nanotubes. Scientists believe that forests of carbon nanotubes can trap and convert most incoming light to heat, reflecting very little of it back out as light, thereby giving CNTs a particularly black shade.</p> <p>“CNT forests of different varieties are known to be extremely black, but there is a lack of mechanistic understanding as to why this material is the blackest. That needs further study,” Wardle says.</p> <p>The material is already gaining interest in the aerospace community.
Astrophysicist and Nobel laureate John Mather, who was not involved in the research, is exploring the possibility of using Wardle’s material as the basis for a star shade — a massive black shade that would shield a space telescope from stray light.</p> <p>“Optical instruments like cameras and telescopes have to get rid of unwanted glare, so you can see what you want to see,” Mather says. “Would you like to see an Earth orbiting another star? We need something very black. … And this black has to be tough to withstand a rocket launch. Old versions were fragile forests of fur, but these are more like pot scrubbers — built to take abuse.”</p> <p><em>*An earlier version of this story stated that the new material captures more than 99.96 percent of incoming light. That number has been updated to be more precise; the material absorbs at least 99.995 percent of incoming light.</em></p> The Redemption of Vanity is a work of art by MIT artist in residence Diemut Strebe that has been realized together with Brian L. Wardle, Professor of Aeronautics and Astronautics and Director of necstlab and Nano- Engineered Composite aerospace STructures (NECST) Consortium and his team Drs. Luiz Acauan and Estelle Cohen. Strebe’s residency at MIT is supported by the Center for Art, Science and Technology (CAST).Image: Diemut StrebeAeronautical and astronautical engineering, Arts, Carbon nanotubes, Exhibits, Research, School of Engineering, Visual arts, MIT Center for Art, Science & Technology (CAST), Technology and society TESS team is awarded NASA&#039;s Silver Achievement Medal The honor recognizes the &quot;stellar achievement&quot; of the people behind the exoplanet-seeking satellite. Thu, 12 Sep 2019 13:25:01 -0400 Kylie Foy | Lincoln Laboratory <p>On Sept. 5, NASA awarded a Silver Achievement Medal to the Transiting Exoplanet Survey Satellite (TESS) team.
The award was presented during a ceremony at the NASA Goddard Space Flight Center as part of NASA's 2019 Agency Honor Awards.</p> <p>The Silver Achievement Medal is given by NASA center directors in recognition of government and non-government individuals or teams for "a stellar achievement that supports one or more of NASA's core values, when it is deemed to be extraordinarily important and appropriate to recognize such achievement in a timely and personalized manner."</p> <p>TESS was launched in April 2018 as the next step in NASA's search for exoplanets, which are planets outside of Earth's solar system. The four cameras aboard TESS were conceived, designed, and built by the MIT Kavli Institute for Astrophysics and Space Research and by MIT Lincoln Laboratory. Together, the cameras will gaze at 85 percent of the sky over the course of the mission, looking for discrete dips in light that signify that a planet is passing in front of a star.</p> <p>George Ricker, TESS principal investigator and senior research scientist at the Kavli Institute, says that "the NASA Silver Achievement Medal recognizes the revolutionary impact that TESS is now having on the emerging field of exoplanets, as well as TESS’ revealing of exciting new insights in stellar and extragalactic astrophysics. The members of the TESS science and engineering teams can rightly be proud of the marvelous instrument which they have brought into operation in just five years from our mission’s selection by NASA.
I am personally gratified to have been part of this impressive MIT-led team."</p> <p>Lincoln Laboratory group leader Gregory Berthiaume, who served as the instrument manager for the TESS mission, adds that he is proud and honored to be a part of the TESS team, and congratulated the "incredibly talented and dedicated staff who provided key technologies and capabilities that enabled the TESS instrument."</p> <p>"The team is continuing to work hard to find planet candidates in the TESS data, and new planet candidates that may help us answer some of the intriguing questions in exoplanets today," says Sara Seager, the deputy science director of TESS, who also holds the Class of 1941 Professor Chair in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS) with appointments in the departments of Physics and Aeronautics and Astronautics.</p> <p>Approximately 250 MIT scientists and engineers are included as recipients of the Silver Achievement Medal for their role on the TESS team.&nbsp;</p> <p>Since its launch, TESS has discovered two dozen new planets and 850 more potential worlds that have yet to be confirmed. Scientists expect TESS to discover thousands of new exoplanets by the time its mission is over.</p> <p>TESS is a NASA Astrophysics Explorer mission led and operated by MIT in Cambridge, Massachusetts, and managed by NASA’s Goddard Space Flight Center in Greenbelt, Maryland. George Ricker of MIT’s Kavli Institute serves as principal investigator for the mission.&nbsp;Additional partners include Orbital ATK, NASA’s Ames Research Center, the Harvard-Smithsonian Center for Astrophysics, and the Space Telescope Science Institute. More than a dozen universities, research institutes, and observatories worldwide are participants in the mission.</p> The Transiting Exoplanet Survey Satellite is inspected before its launch in April 2018.
Approximately 250 MIT scientists and engineers are included as recipients of the NASA Silver Achievement Medal for their role in the TESS mission.Photo: Goddard Space Flight CenterLincoln Laboratory, Kavli Institute, Exoplanets, TESS, Astronomy, Astrophysics, Awards, honors and fellowships, Planetary science, Physics, School of Science, EAPS, Space, astronomy and planetary science, Aeronautical and astronautical engineering Taking the next giant leaps Fifty years after the first moon landing with Apollo 11, the Department of Aeronautics and Astronautics looks to the future of space exploration at MIT. Thu, 05 Sep 2019 15:00:01 -0400 Sara Cody | Department of Aeronautics and Astronautics <p>In July, the world celebrated the 50th anniversary of the historic Apollo 11 moon landing. MIT played an enormous role in that accomplishment, helping to usher in a new age of space exploration. Now MIT faculty, staff, and students are working toward the next great advances — ones that could propel humans back to the moon, and to parts still unknown.&nbsp;&nbsp;</p> <p>“I am hard-pressed to think of another event that brought the world together in such a collective way as the Apollo moon landing,” says Daniel Hastings, the Cecil and Ida Green Education Professor and head of the Department of Aeronautics and Astronautics (AeroAstro). “Since the spring, we have been <a href="" target="_self">celebrating</a> the role <a href="" target="_self">MIT played</a> in getting us there and reflecting on how far technology has come in the past five decades.”&nbsp;</p> <p>“Our community continues to build on the incredible legacy of Apollo,” Hastings adds. Some aspects of future of space exploration, he notes, will follow from lessons learned. Others will come from newly developed technologies that were unimaginable in the 1960s. 
And still others will arise from novel collaborations that will fuel the next phases of research and discovery.&nbsp;</p> <p>“This is a tremendously exciting time to think about the future of space exploration,” Hastings says. “And MIT is leading the way.”</p> <p><strong>Sticking the landing</strong></p> <p>Making a safe landing — anywhere — can be a life-or-death situation. On Earth, thanks to a network of global positioning satellites and a range of ground-based systems, pilots have instantaneous access to real-time data on every aspect of a landing environment. The moon, however, is not home to any of this precision navigation technology, making it rife with potential danger.&nbsp;</p> <p>NASA’s recent decision to return to the moon has made this a more pressing challenge — and one that MIT has risen to before. The former MIT Instrumentation Lab (now the independent Draper) developed the guidance systems that enabled Neil Armstrong and Buzz Aldrin to land safely on the moon, and that were used on all Apollo spacecraft. This system relied on inertial navigation, which integrates acceleration and velocity measurements from electronic sensors on the vehicle and a digital computer to determine the spacecraft’s location. It was a remarkable achievement — the first time that humans traveled in a vehicle controlled by a computer.&nbsp;&nbsp;</p> <p>Today, working in MIT’s Aerospace Controls Lab with Jonathan How, the Richard Cockburn Maclaurin Professor of Aeronautics and Astronautics, graduate student Lena Downes — who is also co-advised by Ted Steiner at Draper — is developing a camera-based navigation system that can sense the terrain beneath the landing vehicle and use that information to update the vehicle’s location estimate. “If we want to explore a crater to determine its age or origin,” Downes explains, “we will need to avoid landing on the more highly-sloped rim of the crater. 
Since lunar landings can have errors as high as several kilometers, we can’t plan to land too closely to the edge.”&nbsp;</p> <p>Downes’s research on crater detection involves processing images using convolutional neural networks and traditional computer vision methods. The images are combined with other data, such as previous measurements and known crater locations, enabling more precise estimation of the vehicle’s location.</p> <p>“When we return to the moon, we want to visit more interesting locations, but the problem is that more interesting can often mean more hazardous,” says Downes. “Terrain-relative navigation will allow us to explore these locations more safely.”</p> <p><strong>“Make it, don’t take it”</strong></p> <p>NASA also has its sights set on Mars — and with that objective comes a very different challenge: What if something breaks? Given that the estimated travel time to Mars is between 150 and 300 days, there is a relatively high chance that something will break or malfunction during flight. (Just ask Jim Lovell or Fred Haise, whose spacecraft needed serious repairs only 55 hours and 54 minutes into the Apollo 13 mission.)</p> <p>Matthew Moraguez, a graduate student in Professor Olivier L. de Weck’s Engineering Systems Lab, wants to empower astronauts to manufacture whatever they need, whenever they need it. (“On the fly,” you could say).</p> <p>“In-space manufacturing (ISM) — where astronauts can carry out the fabrication, assembly, and integration of components — could revolutionize this paradigm,” says Moraguez. “Since components wouldn’t be limited by launch-related design constraints, ISM could reduce the cost and improve the performance of existing space systems while also enabling entirely new capabilities.”</p> <p>Historically, a key challenge facing ISM is correctly pairing the components with the manufacturing processes needed to produce them. 
Moraguez approached this problem by first defining the constraints created by a stressful launch environment, which can limit the size and weight of a payload. He then itemized the challenges that could potentially be alleviated by ISM and developed cost-estimating relationships and performance models to determine the exact break-even point at which ISM surpasses the current approach.&nbsp;</p> <p>Moraguez points to Made in Space’s additive manufacturing facility, which is currently in use on the International Space Station. The facility produces tools and other materials as needed, reducing both the cost and the wait time of replenishing supplies from Earth. Moraguez is now developing physics-based manufacturing models that will determine the size, weight, and power required for the next generation of ISM equipment.</p> <p>“We have been able to evaluate the commercial viability of ISM across a wide range of application areas,” says Moraguez. “Armed with this framework, we aim to determine the best components to produce with ISM and their appropriate manufacturing processes. We want to develop the technology to a point where it truly revolutionizes the future of spaceflight. Ultimately, it could allow humans to travel further into deep space for longer durations than ever before,” he says.&nbsp;</p> <p><strong>Partnering with industry</strong></p> <p>The MIT Instrumentation Lab was awarded the first contract for the Apollo program in 1961. In one brief paragraph on a <a href="" target="_blank">Western Union telegram</a>, the lab was charged with developing the program’s guidance and control system. 
Today the future of space exploration depends as much as ever on deep collaborations.&nbsp;</p> <p>Boeing is a longstanding corporate partner of MIT, supporting such efforts as the <a href="">Wright Brothers Wind Tunnel</a> renovation and the New Engineering Education Transformation <a href="">(NEET) program</a>, which focuses on modern industry and real-world projects in support of MIT’s educational mission. In 2020, Boeing is slated to open the <a href="">Aerospace and Autonomy Center</a> in Kendall Square, which will focus on advancing enabling technologies for autonomous aircraft.</p> <p>Just last spring the Institute announced a new relationship with <a href="">Blue Origin</a> in which MIT will begin planning and <a href="" target="_self">developing new payloads</a> for missions to the moon. These new science experiments, rovers, power systems, and more will hitch a ride to the moon via Blue Moon, Blue Origin’s flexible lunar lander.&nbsp;</p> <p>Working with IBM, MIT researchers are exploring the potential uses of artificial intelligence in space research. This year, IBM’s AI Research Week (Sept. 16-20) will feature an event, co-hosted with AeroAstro, in which researchers will pitch ideas for projects related to <a href="">AI and the International Space Station</a>.</p> <p>“We are currently in an exciting new era marked by the development and growth of entrepreneurial private enterprises driving space exploration,” says Hastings. 
“This will lead to new and transformative ways for human beings to travel to space, to create new profit-making ventures in space for the world’s economy, and, of course, lowering the barrier of access to space so many other countries can join this exciting new enterprise.”</p> Building on the legacy of the Apollo program, MIT faculty, staff, and students are working toward the next great advances in space flight — ones that could propel humans back to the moon, and to parts still unknown.Aeronautical and astronautical engineering, Spaceflight, Space, astronomy and planetary science, History of MIT, History, Industry, Collaboration, Additive manufacturing, 3-D printing, NASA MIT report examines how to make technology work for society Task force calls for bold public and private action to harness technology for shared prosperity. Wed, 04 Sep 2019 08:59:59 -0400 Peter Dizikes | MIT News Office <p>Automation is not likely to eliminate millions of jobs any time soon — but the U.S. still needs vastly improved policies if Americans are to build better careers and share prosperity as technological changes occur, according to a new MIT report about the workplace.</p> <p><a href="">The report</a>, which represents the initial findings of MIT’s Task Force on the Work of the Future, punctures some conventional wisdom and builds a nuanced picture of the evolution of technology and jobs, the subject of much fraught public discussion.</p> <p>The likelihood of robots, automation, and artificial intelligence (AI) wiping out huge sectors of the workforce in the near future is exaggerated, the task force concludes — but there is reason for concern about the impact of new technology on the labor market. 
In recent decades, technology has contributed to the polarization of employment, disproportionately helping high-skilled professionals while reducing opportunities for many other workers, and new technologies could exacerbate this trend.</p> <p>Moreover, the report emphasizes, at a time of historic income inequality, a critical challenge is not necessarily a lack of jobs, but the low quality of many jobs and the resulting lack of viable careers for many people, particularly workers without college degrees. With this in mind, the work of the future can be shaped beneficially by new policies, renewed support for labor, and reformed institutions, not just new technologies. Broadly, the task force concludes, capitalism in the U.S. must address the interests of workers as well as shareholders.</p> <p>“At MIT, we are inspired by the idea that technology can be a force for good. But if as a nation we want to make sure that today’s new technologies evolve in ways that help build a healthier, more equitable society, we need to move quickly to develop and implement strong, enlightened policy responses,” says MIT President L. Rafael Reif, who called for the creation of the Task Force on the Work of the Future in 2017.</p> <p>“Fortunately, the harsh societal consequences that concern us all are not inevitable,” Reif adds. “Technologies embody the values of those who make them, and the policies we build around them can profoundly shape their impact. Whether the outcome is inclusive or exclusive, fair or laissez-faire, is therefore up to all of us. I am deeply grateful to the task force members for their latest findings and their ongoing efforts to pave an upward path.”</p> <p>“There is a lot of alarmist rhetoric about how the robots are coming,” adds Elisabeth Beck Reynolds, executive director of the task force, as well as executive director of the MIT Industrial Performance Center. 
“MIT’s job is to cut through some of this hype and bring some perspective to this discussion.”</p> <p>Reynolds also calls the task force’s interest in new policy directions “classically American in its willingness to consider innovation and experimentation.”</p> <p><strong>Anxiety and inequality</strong></p> <p>The core of the task force consists of a group of MIT scholars. Its research has drawn upon new data, expert knowledge of many technology sectors, and a close analysis of both technology-centered firms and economic data spanning the postwar era.</p> <p>The report addresses several workplace complexities. Unemployment in the U.S. is low, yet workers have considerable anxiety, from multiple sources. One is technology: A 2018 survey by the Pew Research Center found that 65 to 90 percent of respondents in industrialized countries think computers and robots will take over many jobs done by humans, while less than a third think better-paying jobs will result from these technologies.</p> <p>Another concern for workers is income stagnation: Adjusted for inflation, 92 percent of Americans born in 1940 earned more money than their parents, but only about half of people born in 1980 can say that.</p> <p>“The persistent growth in the quantity of jobs has not been matched by an equivalent growth in job quality,” the task force report states.</p> <p>Applications of technology have fed inequality in recent decades. High-tech innovations have displaced “middle-skilled” workers who perform routine tasks, from office assistants to assembly-line workers, but these innovations have complemented the activities of many white-collar workers in medicine, science and engineering, finance, and other fields. Technology has also not displaced lower-skilled service workers, leading to a polarized workforce. 
Higher-skill and lower-skill jobs have grown, middle-skill jobs have shrunk, and increased earnings have been concentrated among white-collar workers.</p> <p>“Technological advances did deliver productivity growth over the last four decades,” the report states. “But productivity growth did not translate into shared prosperity.”</p> <p>Indeed, says David Autor, who is the Ford Professor of Economics at MIT, associate head of MIT’s Department of Economics, and a co-chair of the task force, “We think people are pessimistic because they’re on to something. Although there’s no shortage of jobs, the gains have been so unequally distributed that most people have not benefited much. If the next four decades of automation are going to look like the last four decades, people have reason to worry.”</p> <p><strong>Productive innovations versus “so-so technology”</strong></p> <p>A big question, then, is what the next decades of automation have in store. As the report explains, some technological innovations are broadly productive, while others are merely “so-so technologies” — a term coined by economists Daron Acemoglu of MIT and Pascual Restrepo of Boston University to describe technologies that replace workers without markedly improving services or increasing productivity.</p> <p>For instance, electricity and light bulbs were broadly productive, allowing the expansion of other types of work. But automated technology allowing for self-check-out at pharmacies or supermarkets merely replaces workers without notably increasing efficiency for the customer or productivity.</p> <p>“That’s a strong labor-displacing technology, but it has very modest productivity value,” Autor says of these automated systems. “That’s a ‘so-so technology.’ The digital era has had fabulous technologies for skill complementarity [for white-collar workers], but so-so technologies for everybody else. 
Not all innovations that raise productivity displace workers, and not all innovations that displace workers do much for productivity.”</p> <p>Several forces have contributed to this skew, according to the report. “Computers and the internet enabled a digitalization of work that made highly educated workers more productive and made less-educated workers easier to replace with machinery,” the authors write.</p> <p>Given the mixed record of the last four decades, does the advent of robotics and AI herald a brighter future, or a darker one? The task force suggests the answer depends on how humans shape that future. New and emerging technologies will raise aggregate economic output and boost wealth, and offer people the potential for higher living standards, better working conditions, greater economic security, and improved health and longevity. But whether society realizes this potential, the report notes, depends critically on the institutions that transform aggregate wealth into greater shared prosperity instead of rising inequality.</p> <p>One thing the task force does not foresee is a future where human expertise, judgment, and creativity are less essential than they are today. &nbsp;</p> <p>“Recent history shows that key advances in workplace robotics — those that radically increase productivity — depend on breakthroughs in work design that often take years or even decades to achieve,” the report states.</p> <p>As robots gain flexibility and situational adaptability, they will certainly take over a larger set of tasks in warehouses, hospitals, and retail stores — such as lifting, stocking, transporting, cleaning, as well as awkward physical tasks that require picking, harvesting, stooping, or crouching.</p> <p>The task force members believe such advances in robotics will displace relatively low-paid human tasks and boost the productivity of workers, whose attention will be freed to focus on higher-value-added work. 
The pace at which these tasks are delegated to machines will be hastened by slowing growth, tight labor markets, and the rapid aging of workforces in most industrialized countries, including the U.S.</p> <p>And while machine learning — image classification, real-time analytics, data forecasting, and more — has improved, it may just alter jobs, not eliminate them: Radiologists do much more than interpret X-rays, for instance. The task force also observes that developers of autonomous vehicles, another hot media topic, have been “ratcheting back” their timelines and ambitions over the last year.</p> <p>“The recent reset of expectations on driverless cars is a leading indicator for other types of AI-enabled systems as well,” says David A. Mindell, co-chair of the task force, professor of aeronautics and astronautics, and the Dibner Professor of the History of Engineering and Manufacturing at MIT. “These technologies hold great promise, but it takes time to understand the optimal combination of people and machines. And the timing of adoption is crucial for understanding the impact on workers.”</p> <p><strong>Policy proposals for the future</strong></p> <p>Still, if the worst-case scenario of a “job apocalypse” is unlikely, the continued deployment of so-so technologies could make the future of work worse for many people.</p> <p>If people are worried that technologies could limit opportunity, social mobility, and shared prosperity, the report states, “Economic history confirms that this sentiment is neither ill-informed nor misguided. There is ample reason for concern about whether technological advances will improve or erode employment and earnings prospects for the bulk of the workforce.”</p> <p>At the same time, the task force report finds reason for “tempered optimism,” asserting that better policies can significantly improve tomorrow’s work.</p> <p>“Technology is a human product,” Mindell says. 
“We shape technological change through our choices of investments, incentives, cultural values, and political objectives.”</p> <p>To this end, the task force focuses on a few key policy areas. One is renewed investment in postsecondary workforce education outside of the four-year college system — and not just in the STEM skills (science, technology, engineering, math) but reading, writing, and the “social skills” of teamwork and judgment.</p> <p>Community colleges are the biggest training providers in the country, with 12 million for-credit and non-credit students, and are a natural location for bolstering workforce education. A wide range of new models for gaining educational credentials is also emerging, the task force notes. The report also emphasizes the value of multiple types of on-the-job training programs for workers.</p> <p>However, the report cautions, investments in education may be necessary but not sufficient for workers: “Hoping that ‘if we skill them, jobs will come,’ is an inadequate foundation for constructing a more productive and economically secure labor market.”</p> <p>More broadly, therefore, the report argues that the interests of capital and labor need to be rebalanced. The U.S., it notes, “is unique among market economies in venerating pure shareholder capitalism,” even though workers and communities are business stakeholders too.</p> <p>“Within this paradigm [of pure shareholder capitalism], the personal, social, and public costs of layoffs and plant closings should not play a critical role in firm decision-making,” the report states.</p> <p>The task force recommends greater recognition of workers as stakeholders in corporate decision making. Redressing the decades-long erosion of worker bargaining power will require new institutions that bend the arc of innovation toward making workers more productive rather than less necessary. The report holds that the adversarial system of collective bargaining, enshrined in U.S. 
labor law adopted during the Great Depression, is overdue for reform.</p> <p>The U.S. tax code can be altered to help workers as well. Right now, it favors investments in capital rather than labor — for instance, capital depreciation can be written off, and R&amp;D investment receives a tax credit, whereas investments in workers produce no such equivalent benefits. The task force recommends new tax policy that would also incentivize investments in human capital, through training programs, for instance.</p> <p>Additionally, the task force recommends restoring support for R&amp;D to past levels and rebuilding U.S. leadership in the development of new AI-related technologies, “not merely to win but to lead innovation in directions that will benefit the nation: complementing workers, boosting productivity, and strengthening the economic foundation for shared prosperity.”</p> <p>Ultimately the task force’s goal is to encourage investment in technologies that improve productivity, and to ensure that workers share in the prosperity that could result.</p> <p>“There’s no question technological progress that raises productivity creates opportunity,” Autor says. “It expands the set of possibilities that you can realize. But it doesn’t guarantee that you will make good choices.”</p> <p>Reynolds adds: “The question for firms going forward is: How are they going to improve their productivity in ways that can lead to greater quality and efficiency, and aren’t just about cutting costs and bringing in marginally better technology?”</p> <p><strong>Further research and analyses</strong></p> <p>In addition to Reynolds, Autor, and Mindell, the central group within MIT’s Task Force on the Work of the Future consists of 18 MIT professors representing all five Institute schools. Additionally, the project has a 22-person advisory board drawn from the ranks of industry leaders, former government officials, and academia; a 14-person research board of scholars; and eight graduate students. 
The task force also consulted with business executives, labor leaders, and community college leaders, among others.</p> <p>The task force follows other influential MIT projects such as the Commission on Industrial Productivity, an intensive multiyear study of U.S. industry in the 1980s. That effort resulted in the widely read book, “Made in America,” as well as the creation of MIT’s Industrial Performance Center.</p> <p>The current task force taps into MIT’s depth of knowledge across a full range of technologies, as well as its strengths in the social sciences.</p> <p>“MIT is engaged in developing frontier technology,” Reynolds says. “Not necessarily what will be introduced tomorrow, but five, 10, or 25 years from now. We do see what’s on the horizon, and our researchers want to bring realism and context to the public discourse.”</p> <p>The current report is an interim finding from the task force; the group plans to conduct additional research over the next year, and then will issue a final version of the report.</p> <p>“What we’re trying to do with this work,” Reynolds concludes, “is to provide a holistic perspective, which is not just about the labor market and not just about technology, but brings it all together, for a more rational and productive discussion in the public sphere.”</p> MIT’s Task Force on the Work of the Future has released a report that punctures some conventional wisdom and builds a nuanced picture of the evolution of technology and jobs.School of Engineering, School of Architecture and Planning, School of Humanities Arts and Social Sciences, School of Science, Sloan School of Management, Jobs, Economics, Aeronautical and astronautical engineering, Urban studies and planning, Program in STS, Industrial Performance Center, employment, Artificial intelligence, Industry, President L. 
Rafael Reif, Policy, Machine learning, Faculty, Technology and society, Innovation and Entrepreneurship (I&E), Poverty, Business and management, Manufacturing, Careers, STEM education, MIT Schwarzman College of Computing Overcoming obstacles with an electric hovercraft MIT team places first among U.S. universities at 2019 SpaceX Hyperloop Pod Competition. Wed, 28 Aug 2019 12:00:01 -0400 Sarah Jensen | School of Engineering <p>Through dedication and a willingness to face challenges both expected and unforeseen, an MIT team recently brought the air-powered hovercraft from the world of Saturday-morning cartoons to reality, at the 2019 SpaceX Hyperloop Pod Competition.</p> <p>But that’s only part of the story.</p> <p><strong>What’s past is prologue</strong></p> <p>In a 2013 white paper, Elon Musk, technology entrepreneur, investor, and engineer, detailed a high-speed frictionless train — the Hyperloop. When drag and atmosphere were removed from a tunnel, he posited, trains could float within a vacuum tube at up to 700 miles per hour.</p> <p>Musk wasn’t the first to imagine an air-powered train. In the 1860s, Alfred Ely Beach, inventor, publisher, and patent lawyer, envisioned a subway under the streets of New York City. In 1870, his experiments in pneumatic power resulted in a demonstration run of the Beach Pneumatic Transit, a 10-passenger car propelled by a 100-horsepower fan, baffles, and blowers, through a tunnel beneath Broadway. His efforts were thwarted by Tammany Hall politics and the Panic of 1873.</p> <p>It took the MIT team, dubbed Hyperloop II, to once more embrace Beach’s concept. “We took Beach’s vision and accomplished a much more efficient pneumatic vehicle,” explains Vik Parthiban, team captain.</p> <p><strong>Lofty goals</strong></p> <p>Parthiban, a graduate researcher at the MIT Media Lab, was part of the 2017 SpaceX Hyperloop Pod Competition during his undergraduate years at the University of Texas. 
He came to MIT determined to further the technology and, in the fall of 2018, recruited nearly 30 undergraduate and graduate students to develop an autonomous electric hovercraft.</p> <p>“Imagine an air hockey puck,” explains Parthiban. “Instead of air coming out of a table, it comes out of pucks under the vehicle. A regulation system pumps air into these air castors, which then levitate the vehicle.” Four castors beneath the vehicle are operated by a pneumatic system controlled by a central computer. The propulsion system takes the 200-kilogram vehicle from zero to 200 miles per hour in 20 seconds with the push of a finger.</p> <p>High-speed passenger trains in China and Japan use magnetic levitation to create a gap between the train and the track to remove the drag, but Parthiban took a different approach. “Putting magnetic levitation in a hyperloop is expensive,” he says. “Our goal was to invent a new technology that would cost less and be more efficient than magnetic levitation, and to develop an electric hovercraft that would work even without a vacuum tunnel. The only thing needed is a flat surface.”</p> <p><strong>The process</strong></p> <p>With support from the Edgerton Center and industry sponsors including Arrow Electronics, Silicon Expert, and Texas Guadaloop, the group joined forces to contribute individual skills. “We worked together to figure out the best way to integrate the components. Every person brought their own knowledge,” says Nick Dowmon, software engineering lead and a System Design and Management (SDM) graduate student. “It was an awesome learning opportunity and a chance to collaborate and learn from each other.”</p> <p>Over the winter, the team met in the Edgerton Center’s build space to create a machine no one had ever built before. They brainstormed, designed, and redesigned. They machined parts, outsourcing the more complex components. 
They collaborated with the University of Texas on pneumatics and conducted analyses to determine the type of sensors needed to levitate and propel the pod at the required speed, adjusting here, fine-tuning there. They fashioned the 70-component wiring harness and constructed a test track in a 200-foot-long corridor beneath MIT’s Great Dome.</p> <p>On May 22, the completed pod was presented at the MIT Museum to an overflow crowd eager to view the world’s first electric hovercraft.</p> <p><strong>A minor setback</strong></p> <p>In early summer, the production schedule was on target. Team members were confident the pod would meet its delivery deadline and reach California by July 7. On June 18, Parthiban and two teammates bent over the pod in the build space, intent upon working out last-minute details.</p> <p>Then Parthiban saw flames. A tear in the battery insulation had caused a short, and he reached for a fire extinguisher. But the blaze quickly escalated, he recalls, and he reached for the fire alarm instead.</p> <p>“The battery insulation fire burned down most of the vehicle,” he says. “It was the saddest thing.”</p> <p>Parthiban called an emergency team meeting that evening, and within two hours, every team member had arrived — including those who’d left the project to focus on research and internships. Parthiban explained that rebuilding the pod in three weeks was virtually impossible.</p> <p>But in true MIT style, every team member came together in a resounding “Let’s do this!”</p> <p>“Everyone agreed we had to make it happen,” says Bowen Zeng, levitation lead and a graduate student in mechanical engineering. “There was no choice.”</p> <p>“We had to drop everything to rebuild the pod before we went to California,” says Dowmon. “Many times, I was still in the build space at midnight with someone that I didn’t normally work with toiling on a part of the pod, but we helped each other. 
We worked through it together.”</p> <p>Three days after the meeting, the pneumatic panel was rebuilt. In a week, the new chassis was finished. The electronic systems were recreated. Sponsors fast-tracked the delivery of replacement components. And a week before the shipping deadline, the pod was finished (again).</p> <p>“I don’t think I’ve ever worked with a team that was so dedicated, so able to keep on going after something so discouraging,” adds Jessica Harsono, braking team lead and graduate student in mechanical engineering.</p> <p>MIT’s entry was the only fully-functioning levitating pod in the competition at SpaceX headquarters in July. “Competition week was truly where our collaboration paid off,” says Parthiban. “With only a few people in California, we had to split the tasks and get parts and do the machining in a short amount of time, under deadline. But we made it.”</p> <p>The MIT team emerged as the No. 1 U.S. university at the annual competition and placed fifth worldwide. They also earned a SpaceX Innovation award.</p> <p><strong>Only at MIT</strong></p> <p>Beach laid the groundwork and Musk provided the opportunity, but in the end, it was the spirit of camaraderie and teamwork that made the MIT team’s hyperloop a reality.</p> <p>“My motivation wasn’t that I wanted to achieve this for myself,” says Harsono. “So many other people worked so hard, and I didn’t want to let them down. I was motivated out of respect for what they’d done and how much effort and care they put in.”</p> <p>“This only can happen at MIT,” Parthiban says. 
“We all have that same mindset, the same hard-work attitude.”</p> Hyperloop II holding their Innovation Award at the 2019 SpaceX Hyperloop Pod Competition.Photo: Hyperloop II teamSchool of Engineering, Media Lab, Aeronautics and Astronautics, Aeronautical and astronautical engineering, Contests and academic competitions, Students, Transportation, Mechanical engineering, Electrical engineering and computer science (EECS), School of Architecture and Planning, Sloan School of Management, Clubs and activities The music of the spheres MIT hosts &quot;Songs from Extrasolar Spaces,&quot; a musical melding of art and science inspired by the Transiting Exoplanet Survey Satellite (TESS). Fri, 09 Aug 2019 13:25:01 -0400 Ken Shulman | Arts at MIT <p>Space has long fascinated poets, physicists, astronomers, and science fiction writers. Musicians, too, have often found beauty and meaning in the skies above. At MIT’s Kresge Auditorium, a group of composers and musicians manifested their fascination with space in a concert titled “Songs from Extrasolar Spaces.” Featuring the Lorelei Ensemble — a Boston, Massachusetts-based women’s choir — the concert included premieres by MIT composers John Harbison and Elena Ruehr, along with compositions by Meredith Monk and Molly Herron. All the music was inspired by discoveries in astronomy.</p> <p>“Songs from Extrasolar Spaces” was part of an MIT conference on TESS — the Transiting Exoplanet Survey Satellite, launched in April 2018. TESS is an MIT-led NASA mission that scans the skies for evidence of exoplanets: bodies ranging from dwarf planets to giant planets that orbit stars other than our sun. 
During its two-year mission, TESS and its four highly sensitive cameras will survey 85 percent of the sky, monitoring more than 200,000 stars for the temporary dips in brightness that might signal a transit — the passage of a planetary body across its star.</p> <p>“There is a feeling you get when you look at these images from TESS,” says Ruehr, an award-winning MIT lecturer in the Music and Theater Arts Section and former Guggenheim Fellow. “A sense of vastness, of infinity. This is the sensation I tried to capture and transpose into vocal music.”</p> <p>Supported by the MIT Center for Art, Science and Technology’s Fay Chandler Creativity Grant; MIT Music and Theater Arts; and aerospace and technology giant Northrop Grumman, which also built the TESS satellite, the July 30 concert was conceived by MIT Research Associate Natalia Guerrero. Both the conference and concert marked the 50th anniversary of the Apollo 11 moon landing — another milestone in the quest to chart the universe and Earth’s place in it.</p> <p>A 2014 MIT graduate, Guerrero manages the team finding planet candidates in the TESS images at the MIT Kavli Institute for Astrophysics and Space Research and is also the lead for the MIT branch of the mission’s communications team. “I wanted to include an event that could make the TESS mission accessible to people who aren’t astronomers or physicists,” says Guerrero. “But I also wanted that same event to inspire astronomers and physicists to look at their work in a new way.”</p> <p>Guerrero majored in physics and creative writing at MIT, and after graduating she deejayed a radio show called “Voice Box” on the MIT radio station WMBR. The show showcased contemporary vocal music and exposed her to composers including Harbison and Ruehr. Last year, in early summer, Guerrero contacted Ruehr to gauge her interest in composing music for a still-hypothetical concert that might complement the 2019 TESS conference.</p> <p>Ruehr was keen on the idea.
She was also a perfect fit for the project. The composer had often drawn inspiration from visual images and other art forms for her music. “Sky Above Clouds,” an orchestral piece she composed in 1989, was inspired by the Georgia O’Keeffe paintings she viewed as a child at the Art Institute of Chicago. Ruehr had also created music inspired by David Mitchell’s visionary novel “Cloud Atlas” and Ann Patchett’s “Bel Canto.” “It’s a question of reinterpreting language, capturing its rhythms and volumes and channeling them into music,” says Ruehr. “The source language can be fiction, or painting, or in this case these dazzling images of the universe.”</p> <p>In addition, Ruehr had long been fascinated by space and stars. “My father was a mathematician who studied fast Fourier transform analysis,” says Ruehr, who is currently composing an opera set in space. “As a young girl, I’d listen to him talking about infinity with his colleagues on the telephone. I would imagine my father existing in infinity, on the edge of space.”</p> <p>Drawing inspiration from the images TESS beams back to Earth, Ruehr composed two pieces for “Songs from Extrasolar Spaces.” The first, titled “Not from the Stars,” takes its name and lyrics from a Shakespeare sonnet. For the second, “Exoplanets,” Ruehr used a text that Guerrero extrapolated from the titles of the first group of scientific papers published from TESS data. “I’m used to working from images,” explains Ruehr. “First, I study them. Then, I sit down at the piano and try to create a single sound that captures their essence and resonance. Then, I start playing with that sound.”</p> <p>Ruehr was particularly pleased to compose music about space for the Lorelei Ensemble. “There’s a certain quality in a women’s choir, especially the Lorelei Ensemble, that is perfectly suited for this project,” says Ruehr.
“They have an ethereal sound and wonderful harmonic structures that make us feel as if we’re perceiving a small dab of brightness in an envelope of darkness.”</p> <p>At the 2019 MIT TESS conference, experts from across the globe shared results from the first year of observation in the sky above the Southern Hemisphere, and discussed plans for the second-year trek above the Northern Hemisphere. The composers and musicians hope “Songs from Extrasolar Spaces” brought attention to the TESS mission, offered a new perspective on space exploration, and will perhaps spark further collaborations between scientists and artists. George Ricker, TESS principal investigator; Sara Seager, TESS deputy director of science; and Guerrero presented a pre-concert lecture. “Music has the power to generate incredibly powerful emotions,” says Ruehr. “So do these images from TESS. In many ways, they are more beautiful than any stars we might ever imagine.”</p> <p>TESS is a NASA Astrophysics Explorer mission led and operated by MIT in Cambridge, Massachusetts, and managed by NASA’s Goddard Space Flight Center. Additional partners include Northrop Grumman, based in Falls Church, Virginia; NASA’s Ames Research Center in California’s Silicon Valley; the Harvard-Smithsonian Center for Astrophysics in Cambridge; MIT Lincoln Laboratory; and the Space Telescope Science Institute in Baltimore, Maryland.
More than a dozen universities, research institutes, and observatories worldwide are participants in the mission.</p> The Lorelei Ensemble performs in "Songs from Extrasolar Spaces: Music Inspired by TESS" on July 30 in MIT's Kresge Auditorium.Photo: Danny GoldfieldArts, Center for Art, Science and Technology, School of Humanities Arts and Social Sciences, Kavli Institute, Astronomy, NASA, TESS, Music, Faculty, School of Engineering, Satellites, Exoplanets, Theater, Special events and guest speakers, Technology and society, Space, astronomy and planetary science, Aeronautical and astronautical engineering, Alumni/ae Optimus Ride’s autonomous system makes self-driving vehicles a reality MIT startup’s unique approach to improving human mobility is helping it gain traction in a competitive landscape. Fri, 09 Aug 2019 11:36:01 -0400 Zach Winn | MIT News Office <p>Some of the biggest companies in the world are spending billions in the race to develop self-driving vehicles that can go anywhere. Meanwhile, Optimus Ride, a startup out of MIT, is already helping people get around by taking a different approach.</p> <p>The company’s autonomous vehicles only drive in areas it comprehensively maps, or geofences. Self-driving vehicles can safely move through these areas at about 25 miles per hour with today’s technology.</p> <p>“It’s important to realize there are multiple approaches, and multiple markets, to self-driving,” says Optimus Ride CEO Ryan Chin MA ’00, SM ’04, PhD ’12. “There’s no monolithic George Jetson kind of self-driving vehicle. 
You have robot trucks, you have self-driving taxis, self-driving pizza delivery machines, and each of these will have different time frames of technological development and different markets.”</p> <p>By partnering with developers, the Optimus team is currently focused on deploying its vehicles in communities with residential and commercial buildings, retirement communities, corporate and university campuses, airports, resorts, and smart cities. The founders estimate the combined value of transportation services in those markets to be over $600 billion.</p> <p>“We believe this is an important, huge business, but we also believe this is the first addressable market in the sense that we believe the first autonomous vehicles that will generate profits and make business sense will appear in these environments, because you can build the tech much more quickly,” says Chin, who co-founded the company with Albert Huang SM ’05, PhD ’10, Jenny Larios Berlin MCP ’14, MBA ’15, Ramiro Almeida, and Class of 1948 Career Development Professor of Aeronautics and Astronautics Sertac Karaman.</p> <p>Optimus Ride currently runs fleets of self-driving vehicles in the Seaport area of Boston, in a mixed-use development in South Weymouth, Massachusetts, and, as of this week, in the Brooklyn Navy Yard, a 300-acre industrial park that now hosts the first self-driving vehicle program in the state.</p> <p>Later this year, the company will also deploy its autonomous vehicles in a private community of Fairfield, California, and in a mixed-use development in Reston, Virginia.</p> <p>The early progress — and the valuable data that come with it — is the result of the company taking a holistic view of transportation. That perspective can be traced back to the founders’ diverse areas of focus at MIT.</p> <p><strong>A multidisciplinary team</strong></p> <p>Optimus Ride’s founders have worked across a wide array of departments, labs, and centers across MIT. 
The technical validation for the company began when Karaman participated in the Defense Advanced Research Projects Agency (DARPA) Urban Challenge with a team including Huang in 2007. Both researchers had also worked together in the Computer Science and Artificial Intelligence Laboratory.</p> <p>For the event, DARPA challenged 89 teams with creating a fully autonomous vehicle that could traverse a 60-mile course in under six hours. The vehicle from MIT was one of only six to complete the journey.</p> <p>Chin, who led a Media Lab project that developed a retractable electric vehicle in the Smart Cities group, met Karaman when both were PhD candidates in 2012. Almeida began working in the Media Lab as a visiting scholar a year later.</p> <p>As members of the group combined their expertise on both self-driving technology and the way people move around communities, they realized they needed help developing business models around their unique approach to improving transportation. Jenny Larios Berlin was introduced to the founders in 2015 after earning joint degrees from the Department of Urban Studies and Planning and the Sloan School of Management. The team started Optimus Ride in August of that year.</p> <p>“The company is really a melting pot of ideas from all of these schools and departments,” Karaman says. “When we met each other, there was the technology angle, but we also realized there’s an important business angle, and there’s also an interesting urban planning/media arts and sciences angle around thinking of the system as a whole.
So when we formed the company we thought, not just how can we build fully autonomous vehicles, but also how can we make transportation in general more affordable, sustainable, equitable, accessible, and so on.”</p> <p>Karaman says the company’s approach could only have originated in a highly collaborative environment like MIT, and believes it gives the company a big advantage in the self-driving sector.</p> <p>“I knew how to build autonomous systems, but in interacting with Ryan and Ramiro and Jenny, I really got a better understanding of what the systems would look like, what the smart cities that utilize the systems would look like, what some of the business models would look like,” Karaman says. “That has a feedback on the technology. It allows you to build the right kind of technology very efficiently in order to go to these markets.”</p> <p><em><span style="font-size:10px;">Optimus Ride's self-driving vehicles already travel on many public roads. Courtesy of Optimus Ride</span></em></p> <p><strong>First mover advantage</strong></p> <p>Optimus Ride’s vehicles have a suite of cameras, lasers, and sensors similar to what other companies use to help autonomous vehicles navigate their environments. But Karaman says the company’s key technical differentiators are its machine vision system, which rapidly identifies objects, and its ability to fuse all those data sources together to make predictions, such as where an object is going and when it will get there.</p> <p><em><span style="font-size:10px;">Optimus Ride's vehicles feature a range of cameras and sensors to help them&nbsp;navigate their&nbsp;environment. Courtesy of Optimus Ride</span></em></p> <p>The strictly defined areas where the vehicles drive help them learn what Karaman calls the “culture of driving” on different roads.
Human drivers might subconsciously take a little longer at certain intersections. Commuters might drive much faster than the speed limit. Those and other location-specific details, like the turn radius of the Silver Line bus in the Seaport, are learned by the system through experience.</p> <p>“A lot of the well-funded autonomous driving projects out there try to capture everything at the same time and tackle every problem,” Karaman says. “But we operate the vehicle in places where it can learn very rapidly. If you go around, say, 10,000 miles in a small community, you end up seeing a certain intersection a hundred or a thousand times, so you learn the culture of driving through that intersection. But if you go 10,000 miles around the country, you’ll only see places once.”</p> <p>Safety drivers are still required to be behind the wheels of autonomous vehicles in the states Optimus Ride operates in, but the founders hope to soon be monitoring fleets with fewer people in a manner similar to an air traffic controller.</p> <p>For now, though, they’re focused on scaling their current model. The contract in Reston, Virginia is part of a strategic partnership with one of the largest real estate managers in the world, Brookfield Properties. Chin says Brookfield owns over 100 locations where Optimus Ride could deploy its system, and the company is aiming to be operating 10 or more fleets by the end of 2020.</p> <p>“Collectively, [the founders] probably have around three decades of experience in building self-driving vehicles, electric vehicles, shared vehicles, mobility transportation, on demand systems, and in looking at how you integrate new transportation systems into cities,” Chin says. 
“So that’s been the idea of the company: to marry together technical expertise with the right kind of policymaking, the right kind of business models, and to bring autonomy to the world as fast as possible.”</p> Optimus Ride has already deployed its autonomous transportation systems in the Seaport area of Boston, in a mixed-use development in South Weymouth, Massachusetts, and in the Brooklyn Navy Yard, a 300-acre industrial park.Image: Courtesy of Optimus RideInnovation and Entrepreneurship (I&E), Startups, Autonomous vehicles, Electric vehicles, Transportation, cars, Media Lab, Urban studies and planning, Computer Science and Artificial Intelligence Laboratory (CSAIL), Aeronautical and astronautical engineering, Sloan School of Management, Cambridge, Boston and region, School of Architecture and Planning Jack Kerrebrock, professor emeritus of aeronautics and astronautics, dies at 91 Former department head and associate dean of engineering was an international expert in the development of propulsion systems for aircraft and spacecraft. Wed, 31 Jul 2019 10:48:10 -0400 Department of Aeronautics and Astronautics <p>Jack L. Kerrebrock, professor emeritus of aeronautics and astronautics at MIT, died at home on July 19. He was 91.</p> <p>Born in Los Angeles in 1928, Kerrebrock received his BS in 1950 from Oregon State University, his MS in 1951 from Yale University, and his PhD in 1956 from Caltech. With a passion for aerospace, he held positions with the National Advisory Committee for Aeronautics, Caltech, and Oak Ridge National Laboratory before joining the faculty of MIT as an assistant professor in 1960.</p> <p>Promoted to associate professor in 1962 and to full professor in 1965, Kerrebrock founded and directed the Department of Aeronautics and Astronautics’ Space Propulsion Laboratory from 1962 until 1976, when it merged with the department’s Gas Turbine Laboratory, of which he had become director in 1968. 
In 1978, he accepted the role of head of the Department of Aeronautics and Astronautics (AeroAstro).</p> <p>Kerrebrock enjoyed an international reputation as an expert in the development of propulsion systems for aircraft and spacecraft. Over the years, he served as chair or member of multiple advisory committees — both government and professional — and as NASA associate administrator of aeronautics and space technology.</p> <p>As associate dean of engineering, Kerrebrock was the faculty leader of the Daedalus Project in AeroAstro. Daedalus was a human-powered aircraft that, on April 23, 1988, flew a distance of 72.4 miles (115.11 kilometers) in three hours and 54 minutes, from Heraklion on the island of Crete to the island of Santorini. Daedalus still holds the world record for human-powered flight. This flight was the culmination of a decade of work by MIT students and alumni and made a major contribution to the understanding of the science and engineering of human-powered flight.</p> <p>Elected to the National Academy of Engineering in 1978, Kerrebrock was the recipient of numerous accolades, including election to the status of honorary fellow of the&nbsp;American Institute of Aeronautics and Astronautics, as well as the Explorers Club and the American Academy of Arts and Sciences. A member of the&nbsp;American Association for the Advancement of Science, Sigma Xi, Tau Beta Pi, and Phi Kappa Phi, he received NASA’s Distinguished Service Medal&nbsp;in 1983.&nbsp;He was also a contributor to the Intergovernmental Panel on Climate Change, which shared the 2007 Nobel Peace Prize with Al Gore.</p> <p>Although a luminary in his field, Kerrebrock — an enthusiastic outdoorsman — was perhaps never happier than when climbing a mountain, hiking a wilderness trail, or leading a group of young people through ice and snow to teach them independence and survival skills.
He ran his first Boston Marathon in his early 50s on a whim, with no training, following that with several more marathons, including the Marine Corps Marathon in Washington.</p> <p>Kerrebrock and his wife Crickett traveled widely, to destinations including South Africa, Scotland, Tuscany, and Paris, and made a very special trip to Cape Canaveral for one of the last Space Shuttle launches, where he was able to introduce his wife to his friend Neil Armstrong, who was one of her heroes.</p> <p>Kerrebrock was married to Rosemary “Crickett” Redmond (Keough) Kerrebrock for the last 12 years of his life. He was previously married for 50 years to the late Bernice “Vickie” (Veverka) Kerrebrock, who died in 2003. In addition to his wife, Kerrebrock leaves behind two children, Nancy Kerrebrock (Clint Cummins) of Palo Alto, California, and Peter Kerrebrock (Anne) of Hingham, Massachusetts; and five grandchildren, Lewis Kerrebrock, Gale Kerrebrock, Renata Cummins, Skyler Cummins, and Lance Cummins. He was preceded in death by his son Christopher Kerrebrock, brother Glenn, and sister Ann. He is also remembered fondly by the Redmond children, Paul J. Redmond Jr. and his partner Joe Palombo, Kelly Redmond and her husband Philip Davis, Maura Redmond, and Meaghan Winokur and James Winokur and their children, Laine and Alicia.</p> <p>A public memorial service is being planned at MIT and will be announced soon. In lieu of flowers, contributions in his memory may be made to the Jack and Vickie Kerrebrock Fellowship Fund, Massachusetts Institute of Technology, 600 Memorial Drive, Cambridge MA 02139.</p> Jack Kerrebrock's accomplishments included serving as faculty leader of the Daedalus Project, which developed an aircraft that still holds the world record for human-powered flight.
Photo courtesy of Crickett KerrebrockObituaries, Faculty, Aeronautical and astronautical engineering, School of Engineering, Administration School of Engineering second quarter 2019 awards Faculty members recognized for excellence via a diverse array of honors, grants, and prizes over the past quarter. Tue, 30 Jul 2019 13:40:01 -0400 School of Engineering <p>Members of the MIT engineering faculty receive many&nbsp;awards in recognition of their scholarship, service, and overall excellence. Every quarter, the School of Engineering publicly recognizes&nbsp;their achievements by highlighting the&nbsp;honors, prizes, and medals won by faculty working in their academic departments, labs, and centers.</p> <p>Antoine Allanore, of the Department of Materials Science and Engineering, won the <a href="">Elsevier Atlas Award</a> on May 15; he also won <a href="">third place for best conference proceedings manuscript</a> at the TMS Annual Meeting and Exhibition on March 14.</p> <p>Dimitri Antoniadis, of the Department of Electrical Engineering and Computer Science, was elected to the <a href="">American Academy of Arts and Sciences</a> on April 18.</p> <p>Martin Bazant, of the Department of Chemical Engineering, was named a <a href="">fellow of the American Physical Society</a> on Oct. 17, 2018.</p> <p>Sangeeta Bhatia, of the Department of Electrical Engineering and Computer Science, was awarded an honorary degree of doctor of science from the University of London on July 4; she was also awarded the <a href="">Othmer Gold Medal</a> from the Science History Institute on March 8.</p> <p>Richard Braatz, of the Department of Chemical Engineering, was <a href="">elected to the&nbsp;National Academy of Engineering&nbsp;</a>on Feb.
11.</p> <p>Tamara Broderick, of the Department of Electrical Engineering and Computer Science, won the <a href="">Notable Paper Award</a> at the International Conference on Artificial Intelligence and Statistics on April 18.</p> <p>Fikile Brushett, of the Department of Chemical Engineering, won the <a href="">Electrochemical Society’s 2019 Supramaniam Srinivasan Young Investigator Award</a> on Oct. 9, 2018; he was also named to the annual <a href="">Talented Twelve</a> list by <em>Chemical &amp; Engineering News</em> on Aug. 22, 2017.</p> <p>Vincent W.S. Chan, of the Department of Electrical Engineering and Computer Science, received the <a href="">Best Paper Award</a> at the IEEE International Conference on Communications on May 10.</p> <p>Arup Chakraborty, of the Department of Chemical Engineering, won a <a href="">Guggenheim Fellowship</a> on March 4, 2018.</p> <p>Anantha Chandrakasan, of the Department of Electrical Engineering and Computer Science, was elected to the <a href="">American Academy of Arts and Sciences</a> on April 18.</p> <p>Kwanghun Chung, of the Department of Chemical Engineering, was awarded a <a href="">Presidential Early Career Award for Scientists and Engineers</a> on July 10.</p> <p>Constantinos Daskalakis, of the Department of Electrical Engineering and Computer Science, won the <a href="">Grace Murray Hopper Award for Outstanding Computer Scientist</a> from the Association for Computing Machinery on May 8.</p> <p>Jesús del Alamo, of the Department of Electrical Engineering and Computer Science, was named a <a href="">Fellow of the Materials Research Society</a> on May 2.</p> <p>Elazer R. Edelman, of the Institute for Medical Engineering and Science, won the Excellence in Mentoring Award from the <a href="">Corrigan Minehan Heart Center</a> at Massachusetts General Hospital on June 18.</p> <p>Karen K. Gleason, of the Department of Chemical Engineering, was honored with the <a href="">John M.
Prausnitz Institute AIChE Lecturer&nbsp;Award</a> by the American Institute of Chemical Engineers on April 3.</p> <p>Bill Green, of the Department of Chemical Engineering, won the <a href="">R.H. Wilhelm Award in Chemical Reaction Engineering</a> from the American Institute of Chemical Engineers on July 19.</p> <p>Paula Hammond, of the Department of Chemical Engineering, was honored with the <a href="">Margaret H. Rousseau Pioneer Award for Lifetime Achievement by a Woman Chemical Engineer</a> from the American Institute of Chemical Engineers on June 1; she also received the <a href="">American Chemical Society Award in Applied Polymer Science</a> on Jan. 8, 2018.</p> <p>Ruonan Han, of the Department of Electrical Engineering and Computer Science, won the <a href="">Outstanding Researcher Award</a> from Intel Corporation on April 1.</p> <p>Song Han, of the Department of Electrical Engineering and Computer Science, was named to the annual list of <a href="">Innovators Under 35</a> by <em>MIT Technology Review</em> on June 25.</p> <p>Klavs Jensen, of the Department of Chemical Engineering, was honored with the <a href="">John M. Prausnitz Institute AIChE Lecturer&nbsp;Award</a> by the American Institute of Chemical Engineers on Aug. 21, 2018; he was also recognized with the <a href="">Corning International Prize for Outstanding Work in Continuous Flow Reactors</a> on May 1, 2018.</p> <p>David R.
Karger, of the Department of Electrical Engineering and Computer Science, was <a href="">elected to the American Academy of Arts and Sciences</a> on April 18.</p> <p>Dina Katabi, of the Department of Electrical Engineering and Computer Science, was <a href="">named a Great Immigrant by the Carnegie Corporation of New York</a> on June 27.</p> <p>Manolis Kellis, of the Department of Electrical Engineering and Computer Science, was honored as a speaker by the <a href="">Mendel Lectures Committee</a> on May 2.</p> <p>Jeehwan Kim, of the Department of Mechanical Engineering, was awarded the <a href="">Young Faculty Award</a> from the Defense Advanced Research Projects Agency on May 28.</p> <p>Heather Kulik, of the Department of Chemical Engineering, was awarded a&nbsp;<a href=";HistoricalAwards=false">CAREER award from the National Science Foundation</a> on Feb. 7; she won the <a href="">Journal of Physical Chemistry and PHYS Division Lectureship Award</a> from the <em>Journal of Physical Chemistry</em> and the Physical Chemistry Division of the American Chemical Society on July 1; she was honored with the <a href="">Marion Milligan Mason Award</a> on Oct. 26, 2018; she earned the <a href="">DARPA Young Faculty Award</a> on June 20, 2018; she also won the <a href="">Young Investigator Award from the Office of Naval Research</a> on Feb. 21, 2018.</p> <p>Robert Langer, of the Department of Chemical Engineering, won the <a href="">Dreyfus Prize for Chemistry in Support of Human Health</a> from the Camille and Henry Dreyfus Foundation on May 14; he was also named to the <a href="">2018 Medicine Maker’s Power List</a> on May 8, 2018; he was also named <a href="">U.S. Science Envoy</a> on June 18, 2018.</p> <p>John Lienhard, of the Department of Mechanical Engineering, received the <a href="">Edward F.
Obert Award</a> from the American Society of Mechanical Engineers on May 28.</p> <p>Nancy Lynch, of the Department of Electrical Engineering and Computer Science, won the <a href="">TDCP Outstanding Technical Achievement Award</a> from the Institute of Electrical and Electronics Engineers on April 18.</p> <p>Karthish Manthiram, of the Department of Chemical Engineering, received a <a href="">Petroleum Research Fund</a> grant from the American Chemical Society on June 28.</p> <p>Benedetto Marelli, of the Department of Civil and Environmental Engineering, won a <a href="">Presidential Early Career Award for Scientists and Engineers</a> on July 10.</p> <p>Robert T. Morris, of the Department of Electrical Engineering and Computer Science, was <a href="">elected to the National Academy of Engineering</a> on Feb. 11.</p> <p>Heidi Nepf, of the Department of Civil and Environmental Engineering, won the <a href=";all_recipients=1">Hunter Rouse Hydraulic Engineering Award</a> from the American Society of Civil Engineers on May 20.</p> <p>Dava Newman, of the Department of Aeronautics and Astronautics, was named <a href="">co-chair of the Committee on Biological and Physical Sciences in Space</a> by the National Academies of Sciences, Engineering, and Medicine on April 8.</p> <p>Kristala Prather, of the Department of Chemical Engineering, was elected a <a href="">fellow of the American Association for the Advancement of Science</a> on Nov. 27, 2018.</p> <p>Ellen Roche, of the Department of Mechanical Engineering, won the <a href="">Child Health Research Award</a> from the Charles H. Hood Foundation on June 13; she was also awarded a <a href=";HistoricalAwards=false">CAREER award from the National Science Foundation</a> on Feb. 20.</p> <p>Yuriy Román, of the Department of Chemical Engineering, received the&nbsp;<a href="">Early Career in Catalysis Award</a> from the American Chemical Society Catalysis Science and Technology Division on Feb.
28; he also received the&nbsp;<a href="">Rutherford Aris Award</a>&nbsp;from the North American Symposium on Chemical Reaction Engineering on March 10.</p> <p>Julian Shun, of the Department of Electrical Engineering and Computer Science, was awarded a <a href="">CAREER award from the National Science Foundation</a> on Feb. 26.</p> <p>Hadley Sikes, of the Department of Chemical Engineering, was honored with the <a href="">Best of BIOT</a> award from the ACS Division of Biochemical Technology on Sept. 9, 2018.</p> <p>Zachary Smith, of the Department of Chemical Engineering, was awarded the <a href="">Doctoral New Investigator Grant</a> from the American Chemical Society on May 22.</p> <p>Michael Strano, of the Department of Chemical Engineering, won the <a href="">Andreas Acrivos Award for Professional Progress in Chemical Engineering</a> from the American Institute of Chemical Engineers on July 1.</p> <p>Greg Stephanopoulos, of the Department of Chemical Engineering, was honored with the <a href="">Gaden Award for Biotechnology and Bioengineering</a> on March 31.</p> <p>Harry Tuller, of the Department of Materials Science and Engineering, received the <a href="">Thomas Egleston Medal for Distinguished Engineering Achievement</a> from Columbia University on May 3.</p> <p>Caroline Uhler, of the Department of Electrical Engineering and Computer Science, won the <a href="">Simons Investigator Award in the Mathematical Modeling of Living Systems</a> from the Simons Foundation on June 19.</p> Jaume Plensa sculpture, "The Alchemist" on MIT's campusPhoto: Lillie Paquette/School of EngineeringSchool of Engineering, Materials Science and Engineering, Electrical Engineering & Computer Science (eecs), Chemical engineering, Institute for Medical Engineering and Science (IMES), Mechanical engineering, Civil and environmental engineering, Awards, honors and fellowships, DMSE, Aeronautical and astronautical engineering, Faculty New leadership for Bernard M.
Gordon-MIT Engineering Leadership Program Faculty and industry co-directors will focus on implementing new directions for the program. Mon, 22 Jul 2019 12:20:02 -0400 School of Engineering <p>Olivier de Weck, professor of aeronautics and astronautics and of engineering systems at MIT, has been named the new faculty co-director of the Bernard M. Gordon-MIT Engineering Leadership Program (GEL). He joins Reza Rahaman, who was appointed the Bernard M. Gordon-MIT Engineering Leadership Program industry co-director and senior lecturer on July 1, 2018.</p> <p>“Professor de Weck has a longstanding commitment to engineering leadership, both as an educator and a researcher. I look forward to working with him and the GEL team as they continue to strengthen their outstanding undergraduate program and develop the new program for graduate students,” says Anantha Chandrakasan, dean of the MIT School of Engineering and the Vannevar Bush Professor of Electrical Engineering and Computer Science.</p> <p>A leader in systems engineering, de Weck researches how complex human-made systems such as aircraft, spacecraft, automobiles, and infrastructures are designed, manufactured, and operated. By investigating their lifecycle properties, de Weck and members of his research group have developed a range of novel techniques broadly adopted by industry to maximize the value of these systems over time.</p> <p>A fellow of the International Council on Systems Engineering (INCOSE), de Weck was honored with its Outstanding Service Award in 2018 for his work as editor-in-chief of <em>Systems Engineering</em>. He is also an associate fellow of the American Institute of Aeronautics and Astronautics (AIAA), where he previously served as associate editor for the&nbsp;<em>Journal of Spacecraft and Rockets</em>&nbsp;and chair of the AIAA Space Logistics Technical Committee.
De Weck is a past recipient of the Capers and Marion McDonald Award for Excellence in Mentoring and Advising from the MIT School of Engineering, and the Teaching with Digital Technology Award from the MIT Office of Open Learning.</p> <p>A member of the MIT faculty since 2001, de Weck earned a BS in industrial engineering at ETH Zurich in 1993 and an MS and PhD in aerospace systems at MIT in 1999 and 2001. He previously served as associate head of the engineering systems division and as executive director of the Production in the Innovation Economy (PIE) commission at MIT. He recently returned to campus after a two-year leave of absence at Airbus in Toulouse, France, where he served as senior vice president and was responsible for planning and roadmapping the group’s $1 billion research and technology portfolio.</p> <p>Since the launch of GEL in 2007, de Weck has taught 16.669/6.914 (Project Engineering) — a popular bootcamp-style class offered during Independent Activities Period. Besides teaching students how to better plan and execute engineering projects, the class has helped cohorts of students build a sense of community and belonging.</p> <p>De Weck succeeds Joel Schindall, co-director for GEL since 2007 and the Bernard M. Gordon Professor of the Practice in electrical engineering and computer science. “Drawing on his many years of experience and success in industry, Joel has been an exceptional leader for the GEL program,” Chandrakasan says. “He has instilled the character and the skills that will enable our students to be both the thought leaders and the ‘do leaders’ of the future.”</p> <p>Reza Rahaman earned a BEng in chemical engineering at Imperial College London in 1984 and an MS in chemical engineering practice and PhD in chemical engineering at MIT in 1985 and 1989.</p> <p>Rahaman’s career in industry spanned nearly three decades across consumer packaged goods, pharmaceuticals, and agricultural chemicals.
Before returning to MIT, he was the vice president of research, development, and innovation at the Clorox Company, where he guided new innovation strategies and coordinated technology roadmaps for 45 percent of the company’s portfolio. Rahaman also serves as vice chair of the board of directors for Out &amp; Equal Workplace Advocates, the largest nonprofit dedicated to LGBTQ workplace equality in the world.</p> <p>“Reza has deep expertise in leading large, highly matrixed organizations and spearheading complex technical projects to produce category-changing innovation,” says Chandrakasan. “His experience in industry, as well as his technical depth and inclusive leadership style, are a wonderful asset to our students. By imparting his knowledge, and guiding our students’ attitudes and thought processes, he is helping to create the next generation of exemplary leaders.”</p> Olivier de Weck (left) and Reza Rahaman are co-directors of the Bernard M. Gordon-MIT Engineering Leadership Program. Images: Courtesy of de Weck; Lillie Paquette/MIT School of EngineeringSchool of Engineering, GEL Program, Leadership, Faculty, Industry, Administration, Classes and programs, graduate, Graduate, postdoctoral, Staff, Aeronautical and astronautical engineering Behind the scenes of the Apollo mission at MIT From making the lunar landings possible to interpreting the meaning of the moon rocks, the Institute was a vital part of history. Thu, 18 Jul 2019 09:23:27 -0400 David L. Chandler | MIT News Office <p>Fifty years ago this week, humanity made its first expedition to another world, when Apollo 11 touched down on the moon and two astronauts walked on its surface.
That moment changed the world in ways that still reverberate today.</p> <p>MIT’s deep and varied connections to that epochal event — <a href="">many</a> of <a href="">which</a> have been <a href="">described</a> on <em>MIT News</em> — began years before the actual landing, when the MIT Instrumentation Laboratory (now Draper) signed the very first contract to be awarded for the Apollo program after its announcement by President John F. Kennedy in 1961. The Institute’s involvement continued throughout the program — and is still ongoing today.</p> <p>MIT’s role in creating the navigation and guidance system that got the mission to the moon and back has been widely recognized in books, movies, and television series. But many other aspects of the Institute’s involvement in the Apollo program and its legacy, including advances in mechanical and computational engineering, simulation technology, biomedical studies, and the geophysics of planet formation, have remained less celebrated.</p> <p>Amid the growing chorus of recollections in various media that have been appearing around this 50th anniversary, here is a small collection of bits and pieces about some of the unsung heroes and lesser-known facts from the Apollo program and MIT’s central role in it.</p> <p><strong>A new age in electronics</strong></p> <p>The computer system and its software that controlled the spacecraft — called the Apollo Guidance Computer and designed by the MIT Instrumentation Lab team under the leadership of Eldon Hall — were remarkable achievements that helped push technology forward in many ways.</p> <p>The AGC’s programs were written in one of the first-ever compiler languages, called MAC, which was developed by Instrumentation Lab engineer <a href="" target="_blank">Hal Laning</a>. 
The computer itself, the 1-cubic-foot Apollo Guidance Computer, was the first significant use of silicon integrated circuit chips and greatly accelerated the development of the microchip technology that has gone on to change virtually every consumer product.</p> <p>In an age when most computers took up entire climate-controlled rooms, the compact AGC was uniquely small and lightweight. But most of its “software” was actually hard-wired: The programs were woven, with tiny donut-shaped metal “cores” strung like beads along a set of wires, with a given wire passing outside the donut to represent a zero, or through the hole for a 1. These so-called rope memories were made in the Boston suburbs at Raytheon, mostly by women who had been hired because they had experience in the weaving industry. Once made, there was no way to change individual bits within the rope, so any change to the software required weaving a whole new rope, and last-minute changes were impossible.</p> <p>As David Mindell, the <em>Frances and David Dibner Professor of the History of Engineering and Manufacturing,</em> points out in his book “Digital Apollo,” that system represented the first time a computer of any kind had been used to control, in real-time, many functions of a vehicle carrying human beings — a trend that continues to accelerate as the world moves toward self-driving vehicles. Right after the Apollo successes, the AGC was directly adapted to an F-8 fighter jet, to create the first-ever fly-by-wire system for aircraft, where the plane’s control surfaces are moved via a computer rather than direct cables and hydraulic systems. This approach is now widespread in the aerospace industry, says John Tylko, who teaches MIT’s class <a href="" target="_blank">16.895J</a> (Engineering Apollo: The Moon Project as a Complex System), which is taught every other year.</p> <p>As sophisticated as the computer was for its time, computer users today would barely recognize it as such. 
Its keyboard and display screen looked more like those on a microwave oven than a computer: a simple numeric keypad and a few lines of five-digit luminous displays. Even the big mainframe computer used to test the code as it was being developed had no keyboard or monitor that the programmers ever saw. Programmers wrote their code by hand, then typed it onto punch cards — one card per line — and handed the deck of cards to a computer operator. The next day, the cards would be returned with a printout of the program’s output. And in this time long before email, communications among the team often relied on handwritten paper notes.</p> <p><strong>Priceless rocks</strong></p> <p>MIT’s involvement in the geophysical side of the Apollo program also extends back to the early planning stages — and continues today. For example, Professor Nafi Toksöz, an expert in seismology, helped to develop a seismic monitoring station that the astronauts placed on the moon, where it helped lead to a greater understanding of the moon’s structure and formation. “It was the hardest work I have ever done, but definitely the most exciting,” <a href="">he has said</a>.</p> <p>Toksöz says that the data from the Apollo seismometers “changed our understanding of the moon completely.” The seismic waves, which on Earth continue for a few minutes, lasted for two hours, which turned out to be the result of the moon’s extreme lack of water. “That was something we never expected, and had never seen,” he recalls.</p> <p>The first seismometer was placed on the moon’s surface very shortly after the astronauts landed, and seismologists including Toksöz started seeing the data right away — including every footstep the astronauts took on the surface. 
Even when the astronauts returned to the lander to sleep before the morning takeoff, the team could see that Buzz Aldrin&nbsp;ScD ’63 and Neil Armstrong were having a sleepless night, with every toss and turn dutifully recorded on the seismic traces.</p> <p>MIT Professor Gene Simmons was among the first group of scientists to gain access to the lunar samples as soon as NASA released them from quarantine, and he and others in what is now the Department of Earth, Planetary and Atmospheric Sciences (EAPS) have continued to work on these samples ever since. As part of a conference on campus, he exhibited some samples of lunar rock and soil in their first close-up display to the public, where some people may even have had a chance to touch the samples.</p> <p>Others in EAPS have also been studying those Apollo samples almost from the beginning. Timothy Grove, the Robert R. Shrock Professor of Earth and Planetary Sciences, started studying the Apollo samples in 1971 as a graduate student at Harvard University, and has been doing research on them ever since. Grove says that these samples have led to major new understandings of planetary formation processes that have helped us understand the Earth and other planets better as well.</p> <p>Among other findings, the rocks showed that ratios of the isotopes of oxygen and other elements in the moon rocks were identical to those in terrestrial rocks but completely different than those of any meteorites, proving that the Earth and the moon had a common origin and leading to the hypothesis that the moon was created through a giant impact from a planet-sized body. The rocks also showed that the entire surface of the moon had likely been molten at one time. The idea that a planetary body could be covered by an ocean of magma was a major surprise to geologists, Grove says.</p> <p>Many puzzles remain to this day, and the analysis of the rock and soil samples goes on. 
“There’s still a lot of exciting stuff” being found in these samples, Grove says.</p> <p><strong>Sorting out the facts</strong></p> <p>In the spate of publicity and new books, articles, and programs about Apollo, inevitably some of the facts — some trivial, some substantive — have been scrambled along the way. “There are some myths being advanced,” says Tylko, some of which he addresses in his “Engineering Apollo” class. “People tend to oversimplify” many aspects of the mission, he says.</p> <p>For example, many accounts have described the sequence of alarms that came from the guidance computer during the last four minutes of the descent, forcing mission controllers to make the daring decision to go ahead despite the unknown nature of the problem. But Don Eyles, one of the Instrumentation Lab’s programmers who had written the landing software for the AGC, says that he can’t think of a single account he’s read about that sequence of events that gets it entirely right. According to Eyles, many have claimed the problem was caused by the fact that the rendezvous radar switch had been left on, so that its data were overloading the computer and causing it to reboot.</p> <p>But Eyles says the actual reason was a much more complex sequence of events, including a crucial mismatch between two circuits that would only occur in rare circumstances and thus would have been hard to detect in testing, and a probable last-minute decision to put a vital switch in a position that allowed it to happen. Eyles has described these details in a memoir about the Apollo years and in a technical paper <a href="" target="_blank">available online</a>, but he says they are difficult to summarize simply.
But he thinks the author Norman Mailer may have come closest, capturing the essence of it in his book “Of a Fire on the Moon,” where he describes the issue as caused by a “sneak circuit” and an “undetectable” error in the onboard checklist.</p> <p>Some accounts have described the AGC as a very limited and primitive computer compared to today’s average smartphone, and Tylko acknowledges that it had a tiny fraction of the power of today’s smart devices — but, he says, “that doesn’t mean they were unsophisticated.” While the AGC only had about 36 kilobytes of read-only memory and 2 kilobytes of random-access memory, “it was exceptionally sophisticated and made the best use of the resources available at the time,” he says.</p> <p>In some ways it was even ahead of its time, Tylko says. For example, the compiler language developed by Laning along with Ramon Alonso at the Instrumentation Lab used an architecture that he says was relatively intuitive and easy to interact with. Based on a system of “verbs” (actions to be performed) and “nouns” (data to be worked on), “it could probably have made its way into the architecture of PCs,” he says. “It’s an elegant interface based on the way humans think.”</p> <p>Some accounts go so far as to claim that the computer failed during the descent and astronaut Neil Armstrong had to take over the controls and land manually, but in fact partial manual control was always part of the plan, and the computer remained in ultimate control throughout the mission. 
None of the onboard computers ever malfunctioned through the entire Apollo program, according to astronaut David Scott SM ’62, who used the computer on two Apollo missions: “We never had a failure, and I think that is a remarkable achievement.”</p> <p><strong>Behind the scenes</strong></p> <p>At the peak of the program, a total of about 1,700 people at MIT’s Instrumentation Lab were working on the Apollo program’s software and hardware, according to Draper, the Instrumentation Lab’s successor, which spun off from MIT in 1973. A few of those, such as the near-legendary <a href="">“Doc” Draper</a> himself — Charles Stark Draper ’26, SM ’28, ScD ’38, former head of the Department of Aeronautics and Astronautics (AeroAstro) — have become widely known for their roles in the mission, but most did their work in near-anonymity, and many went on to entirely different kinds of work after the Apollo program’s end.</p> <p><a href="">Margaret Hamilton</a>, who directed the Instrumentation Lab’s Software Engineering Division, was little known outside of the program itself until an <a href="" target="_self">iconic photo</a> of her next to the original stacks of AGC code began making the rounds on social media in the mid 2010s. In 2016, when she was awarded the Presidential Medal of Freedom by President Barack Obama, MIT Professor Jaime Peraire, then head of AeroAstro, said of Hamilton that “She was a true software engineering pioneer, and it’s not hyperbole to say that she, and the Instrumentation Lab’s Software Engineering Division that she led, put us on the moon.” After Apollo, Hamilton went on to found a software services company, which she still leads.</p> <p>Many others who played major roles in that software and hardware development have also had their roles little recognized over the years. 
For example, Hal Laning ’40, PhD ’47, who developed the programming language for the AGC, also devised its executive operating system, which employed what was at the time a new way of handling multiple programs at once, by assigning each one a priority level so that the most important tasks, such as controlling the lunar module’s thrusters, would always be taken care of. “Hal was the most brilliant person we ever had the chance to work with,” Instrumentation Lab engineer Dan Lickly <em><a href="" target="_blank">told MIT Technology Review</a></em>. And that priority-driven operating system proved crucial in allowing the Apollo 11 landing to proceed safely in spite of the 1202 alarms going off during the lunar descent.</p> <p>While the majority of the team working on the project was male, software engineer Dana Densmore recalls that compared to the heavily male-dominated workforce at NASA at the time, the MIT lab was relatively welcoming to women. Densmore, who was a control supervisor for the lunar landing software, told <em>The Wall Street Journal</em> that “NASA had a few women, and they kept them hidden. At the lab it was very different,” and there were opportunities for women there to take on significant roles in the project.</p> <p>Hamilton recalls the atmosphere at the Instrumentation Lab in those days as one of real dedication and meritocracy. As she told <a href="">MIT News in 2009</a>, “Coming up with solutions and new ideas was an adventure. Dedication and commitment were a given. Mutual respect was across the board. Because software was a mystery, a black box, upper management gave us total freedom and trust. We had to find a way and we did. Looking back, we were the luckiest people in the world; there was no choice but to be pioneers.”</p> The computer system and software that controlled the Apollo 11 spacecraft — called the Apollo Guidance Computer and designed by the MIT Instrumentation Lab team — helped push technology forward in many ways. 
The computer itself was the first significant use of silicon integrated circuit chips.Image courtesy of the MIT MuseumAeronautical and astronautical engineering, EAPS, Research, Satellites, School of Engineering, History of MIT, NASA, History of science, Moon, Space, astronomy and planetary science, Women in STEM, School of Science Spotting objects amid clutter New approach quickly finds hidden objects in dense point clouds, for use in driverless cars or work spaces with robotic assistants. Wed, 19 Jun 2019 23:59:59 -0400 Jennifer Chu | MIT News Office <p>A new MIT-developed technique enables robots to quickly identify objects hidden in a three-dimensional cloud of data, reminiscent of how some people can make sense of a densely patterned “Magic Eye” image if they observe it in just the right way.</p> <p>Robots typically “see” their environment through sensors that collect and translate a visual scene into a matrix of dots. Think of the world of, well, “The Matrix,” except that the 1s and 0s seen by the fictional character Neo are replaced by dots — lots of dots — whose patterns and densities outline the objects in a particular scene.</p> <p>Conventional techniques that try to pick out objects from such clouds of dots, or point clouds, can do so with either speed or accuracy, but not both.</p> <p>With their new technique, the researchers say a robot can accurately pick out an object, such as a small animal, that is otherwise obscured within a dense cloud of dots, within seconds of receiving the visual data. 
The team says the technique can be used to improve a host of situations in which machine perception must be both speedy and accurate, including driverless cars and robotic assistants in the factory and the home.</p> <p>“The surprising thing about this work is, if I ask you to find a bunny in this cloud of thousands of points, there’s no way you could do that,” says Luca Carlone, assistant professor of aeronautics and astronautics and a member of MIT’s Laboratory for Information and Decision Systems (LIDS). “But our algorithm is able to see the object through all this clutter. So we’re getting to a level of superhuman performance in localizing objects.”</p> <p>Carlone and graduate student Heng Yang will present details of the technique later this month at the Robotics: Science and Systems conference in Germany.</p> <p><strong>“Failing without knowing”</strong></p> <p>Robots currently attempt to identify objects in a point cloud by comparing a template object — a 3-D dot representation of an object, such as a rabbit — with a point cloud representation of the real world that may contain that object. The template image includes “features,” or collections of dots that indicate characteristic curvatures or angles of that object, such as the bunny’s ear or tail. Existing algorithms first extract similar features from the real-life point cloud, then attempt to match those features with the template’s features, and ultimately rotate and align the features to the template to determine if the point cloud contains the object in question.</p> <p>But the point cloud data that streams into a robot’s sensor invariably includes errors, in the form of dots that are in the wrong position or incorrectly spaced, which can significantly confuse the process of feature extraction and matching.
As a consequence, robots can make a huge number of wrong associations, or what researchers call “outliers,” between point clouds, and ultimately misidentify objects or miss them entirely.</p> <p>Carlone says state-of-the-art algorithms are able to sift the bad associations from the good once features have been matched, but they do so in “exponential time,” meaning that even a cluster of processing-heavy computers, sifting through dense point cloud data with existing algorithms, would not be able to solve the problem in a reasonable time. Such techniques, while accurate, are impractical for analyzing larger, real-life datasets containing dense point clouds.</p> <p>Other algorithms that can quickly identify features and associations do so hastily, creating a huge number of outliers or misdetections in the process, without being aware of these errors.</p> <p>“That’s terrible if this is running on a self-driving car, or any safety-critical application,” Carlone says. “Failing without knowing you’re failing is the worst thing an algorithm can do.”</p> <p><strong>A relaxed view</strong></p> <p>Yang and Carlone instead devised a technique that prunes away outliers in “polynomial time,” meaning that it can do so quickly, even for increasingly dense clouds of dots. The technique can thus quickly and accurately identify objects hidden in cluttered scenes.</p> <p><em>The MIT-developed technique quickly and smoothly matches objects to those hidden in dense point clouds (left), versus existing techniques (right) that produce incorrect, disjointed matches. Gif: Courtesy of the researchers</em></p> <p>The researchers first used conventional techniques to extract features of a template object from a point cloud.
They then developed a three-step process to match the size, position, and orientation of the object in a point cloud with the template object, while simultaneously identifying good from bad feature associations.</p> <p>The team developed an “adaptive voting scheme” algorithm to prune outliers and match an object’s size and position. For size, the algorithm makes associations between template and point cloud features, then compares the relative distance between features in a template and corresponding features in the point cloud. If, say, the distance between two features in the point cloud is five times that of the corresponding points in the template, the algorithm assigns a “vote” to the hypothesis that the object is five times larger than the template object.</p> <p>The algorithm does this for every feature association. Then, the algorithm selects those associations that fall under the size hypothesis with the most votes, and identifies those as the correct associations, while pruning away the others. &nbsp;In this way, the technique simultaneously reveals the correct associations and the relative size of the object represented by those associations. The same process is used to determine the object’s position. &nbsp;</p> <p>The researchers developed a separate algorithm for rotation, which finds the orientation of the template object in three-dimensional space.</p> <p>To do this is an incredibly tricky computational task. Imagine holding a mug and trying to tilt it just so, to match a blurry image of something that might be that same mug. There are any number of angles you could tilt that mug, and each of those angles has a certain likelihood of matching the blurry image.</p> <p>Existing techniques handle this problem by considering each possible tilt or rotation of the object as a “cost” — the lower the cost, the more likely that that rotation creates an accurate match between features. 
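To make the cost idea concrete, here is a minimal sketch of the brute-force approach that existing techniques take: sample rotation hypotheses, score each one by how well it aligns already-associated features, and keep the lowest cost. Everything here is hypothetical for illustration (2-D points, a single rotation angle about one axis, a simple sum-of-squared-distances cost); it is not the researchers' implementation, whose convex-relaxation method avoids exactly this exhaustive sampling.

```python
import math

def rotation_cost(angle, template, cloud):
    """Sum of squared distances between template features rotated by
    `angle` and their associated cloud features. Lower cost means this
    rotation hypothesis aligns the features better."""
    c, s = math.cos(angle), math.sin(angle)
    cost = 0.0
    for (tx, ty), (px, py) in zip(template, cloud):
        # Rotate the template feature, then compare to the cloud feature.
        rx, ry = c * tx - s * ty, s * tx + c * ty
        cost += (rx - px) ** 2 + (ry - py) ** 2
    return cost

def brute_force_rotation(template, cloud, steps=3600):
    """Exhaustively sample rotation hypotheses and keep the lowest-cost
    one -- the 'topographic map' search the article describes."""
    best_angle, best_cost = 0.0, float("inf")
    for k in range(steps):
        angle = 2 * math.pi * k / steps
        cost = rotation_cost(angle, template, cloud)
        if cost < best_cost:
            best_angle, best_cost = angle, cost
    return best_angle, best_cost
```

With a template rotated by 90 degrees, the sampled cost landscape bottoms out near pi/2; with noisy or wrongly associated features, it develops the multiple valleys that make this search unreliable as well as slow.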
Each rotation and associated cost is represented in a topographic map of sorts, made up of multiple hills and valleys, with lower elevations associated with lower cost.</p> <p>But Carlone says this can easily confuse an algorithm, especially if there are multiple valleys and no discernible lowest point representing the true, exact match between a particular rotation of an object and the object in a point cloud. Instead, the team developed a “convex relaxation” algorithm that simplifies the topographic map, with a single valley representing the optimal rotation. In this way, the algorithm is able to quickly identify the rotation that defines the orientation of the object in the point cloud.</p> <p>With their approach, the team was able to quickly and accurately identify three different objects — a bunny, a dragon, and a Buddha — hidden in point clouds of increasing density. They were also able to identify objects in real-life scenes, including a living room, in which the algorithm was quickly able to spot a cereal box and a baseball hat.</p> <p>Carlone says that because the approach is able to work in “polynomial time,” it can be easily scaled up to analyze even denser point clouds, resembling the complexity of sensor data for driverless cars, for example.</p> <p>“Navigation, collaborative manufacturing, domestic robots, search and rescue, and self-driving cars is where we hope to make an impact,” Carlone says.</p> <p>This research was supported in part by the Army Research Laboratory, the Office of Naval Research, and the Google Daydream Research Program.</p> Robots currently attempt to identify objects in a point cloud by comparing a template object — a 3-D dot representation of an object, such as a rabbit — with a point cloud representation of the real world that may contain that object. Image: Christine Daniloff, MITAeronautical and astronautical engineering, Algorithms, Autonomous vehicles, Laboratory for Information and Decision Systems (LIDS), Robots,
Robotics, Research, School of Engineering QS ranks MIT the world’s No. 1 university for 2019-20 Ranked at the top for the eighth straight year, the Institute also places first in 11 of 48 disciplines. Tue, 18 Jun 2019 20:01:00 -0400 MIT News Office <p>MIT has again been named the world’s top university by the QS World University Rankings, which were announced today. This is the eighth year in a row MIT has received this distinction.</p> <p>The full 2019-20 rankings — published by Quacquarelli Symonds, an organization specializing in education and study abroad — can be found at <a href=""></a>. The QS rankings were based on academic reputation, employer reputation, citations per faculty, student-to-faculty ratio, proportion of international faculty, and proportion of international students. MIT earned a perfect overall score of 100.</p> <p>MIT was also ranked the world’s top university in <a href="">11 of 48 disciplines ranked by QS</a>, as announced in February of this year.</p> <p>MIT received a No. 
1 ranking in the following QS subject areas: Chemistry; Computer Science and Information Systems; Chemical Engineering; Civil and Structural Engineering; Electrical and Electronic Engineering; Mechanical, Aeronautical and Manufacturing Engineering; Linguistics; Materials Science; Mathematics; Physics and Astronomy; and Statistics and Operational Research.</p> <p>MIT also placed second in six subject areas: Accounting and Finance; Architecture/Built Environment; Biological Sciences; Earth and Marine Sciences; Economics and Econometrics; and Environmental Sciences.</p> Image: Christopher HartingRankings, Architecture, Chemical engineering, Chemistry, Civil and environmental engineering, Electrical Engineering & Computer Science (eecs), Economics, Linguistics, Materials Science and Engineering, DMSE, Mechanical engineering, Aeronautical and astronautical engineering, Physics, Business and management, Accounting, Finance, Arts, Design, Mathematics, EAPS, School of Architecture and Planning, School of Humanities Arts and Social Sciences, School of Science, School of Engineering, Sloan School of Management 3Q: David Mindell on his vision for human-centered robotics Engineer and historian discusses how the MIT Schwarzman College of Computing might integrate technical and humanistic research and education. Tue, 18 Jun 2019 14:35:01 -0400 School of Humanities, Arts, and Social Sciences <p><em>David Mindell, Frances and David Dibner Professor of the History of Engineering and Manufacturing in the School of Humanities, Arts, and Social Sciences and professor of aeronautics and astronautics, researches the intersections of human behavior, technological innovation, and automation. Mindell is the author of five acclaimed books, most recently "Our Robots, Ourselves: Robotics and the Myths of Autonomy" (Viking, 2015) as well as the co-founder of the Humatics Corporation, which develops technologies for human-centered automation. 
SHASS Communications spoke with Mindell recently on how his vision for human-centered robotics is developing and his thoughts about the new MIT Stephen A. Schwarzman College of Computing, which aims to integrate technical and humanistic research and education.&nbsp;&nbsp;</em><br /> &nbsp;<br /> <strong>Q:</strong> Interdisciplinary programs have proved challenging to sustain, given the differing methodologies and vocabularies of the fields being brought together. How might the MIT Schwarzman College of Computing design the curriculum to educate "bilinguals" — students who are adept in both advanced computation and one or more of the humanities, arts, and social science fields?<br /> &nbsp;<br /> <strong>A:</strong> Some technology leaders today are naive and uneducated in humanistic and social thinking. They still think that technology evolves on its own and “impacts” society, instead of understanding technology as a human and cultural expression, as part of society.<br /> <br /> As a historian and an engineer, and MIT’s only faculty member with a dual appointment in engineering and the humanities, I’ve been “bilingual” my entire career (long before we began using that term for fluency in both humanities and technology fields). My education started with firm grounding in two fields — electrical engineering and history — that I continue to study.<br /> <br /> Dual competence is a good model for undergraduates at MIT today as well. Pick two: not necessarily the two that I chose, but any two disciplines that capture the core of technology and the core of the humanities. Disciplines at the undergraduate level provide structure, conventions, and professional identity (although my appointment is in Aero/Astro, I still identify as an electrical engineer).
I prefer the term “dual disciplinary” to “interdisciplinary.”&nbsp;<br /> <br /> The College of Computing curriculum should focus on fundamentals, not just engineering plus some dabbling in social implications.<br /> <br /> It sends the wrong message to students that “the technical stuff is core, and then we need to add all this wrapper humanities and social sciences around the engineering.” Rather, we need to say: “master two fundamental ways of thinking about the world, one technical and one humanistic or social.” Sometimes these two modes will be at odds with each other, which raises critical questions. Other times they will be synergistic and energizing. For example, my historical work on the Apollo guidance computer inspired a great deal of my current engineering work on precision navigation.<br /> <br /> <strong>Q:</strong> In naming the company you founded Humatics, you’ve combined “human” and “robotics,” highlighting the synergy between human beings and our advanced technologies. What projects underway at Humatics define and demonstrate how you envision people working collaboratively with machines?&nbsp;<br /> <br /> <strong>A:</strong> Humatics builds on the synthesis that has defined my career — the name is the first four letters of “human" and the last four letters of “robotics.” Our mission is to build technologies that weave robotics into the human world, rather than shape human behavior to the limitations of the robots. We do very technical stuff: We build our own radar chips, our own signal processing algorithms, our own AI-based navigation systems. But we also craft our technologies to be human-centered, to give users and workers information that enables them to make their own decisions and work safer and more efficiently.<br /> <br /> We’re currently working to incorporate our ultra-wideband navigation systems into subway and mass transit systems. 
Humatics' technologies will enable modern signaling systems to be installed more quickly and less expensively. It's gritty, dirty work down in the tunnels, but it is a “smart city” application that can improve the daily lives of millions of people. By enabling the trains to navigate themselves with centimeter precision, we enable greater rush-hour throughput, fewer interruptions, even improved access for people with disabilities, at a minimal cost compared to laying new track.<br /> <br /> A great deal of this work focuses on reliability, robustness, and safety. These are large technological systems that MIT used to focus on in the Engineering Systems Division. They are legacy infrastructure running at full capacity, with a variety of stakeholders, and technical issues hashed out in political debate. As an opportunity to improve people's lives with our technology, this project is very motivating for the Humatics team.<br /> <br /> We see a subway system as a giant robot that collaborates with millions of people every day. Indeed, for all its flaws, it does so today in beautifully fluid ways.&nbsp;Disruption is not an option. Similarly, we see factories, e-commerce fulfillment centers, even entire supply chains as giant human-machine systems that combine three key elements: people, robots (vehicles), and infrastructure. Humatics builds the technological glue that ties these systems together.<br /> <br /> <strong>Q:</strong> Autonomous cars were touted to be available soon, but their design has run into issues and ethical questions. 
Is there a different approach to the design of artificially intelligent vehicles, one that does not attempt to create&nbsp;fully autonomous vehicles?&nbsp;If so, what are the barriers or resistance to human-centered approaches?<br /> <br /> <strong>A:</strong> Too many engineers still imagine autonomy as meaning “alone in the world.” This approach derives from a specific historical imagination of autonomy, rooted in Defense Advanced Research Projects Agency sponsorship and elsewhere, that a robot should be independent of all infrastructure. While that’s potentially appropriate for military operations, the promise of autonomy on our roads must be the promise of autonomy in the human world, in myriad&nbsp;exquisite relationships.<br /> <br /> Autonomous vehicle companies are learning, at great expense, that they already depend heavily on infrastructure (including roads and traffic signs) and that the sooner they learn to embrace it, the sooner they can deploy at scale. Decades of experience have taught us that, to function in the human world, autonomy must be connected, relational, and situated. Human-centered autonomy in automobiles must be more than a fancy FitBit on a driver; it must factor into the fundamental design of the systems: What do we wish to control? Whom do we trust? Who owns our data? How are our systems trained? How do they handle failure? Who gets to decide?<br /> <br /> The current crisis over the Boeing 737 MAX control systems shows these questions are hard to get right, even in aviation. There we have a great deal of regulation, formalism, training, and procedure, not to mention a safety culture that evolved over a century. For autonomous cars, with radically different regulatory settings and operating environments, not to mention non-deterministic software, we still have a great deal to learn. 
Sometimes I think it could take the better part of this century to really learn how to build robust autonomy into safety-critical systems at scale.<br /> &nbsp;</p> <h5><em>Interview prepared by MIT SHASS Communications<br /> Editorial and Design Director: Emily Hiestand<br /> Interview conducted by writer Maria Iacobo</em><br /> &nbsp;</h5> "As an engineer and historian, I’ve been 'bilingual' my entire career,” says David Mindell, the Dibner Professor of the History of Engineering and Manufacturing; professor of aeronautics and astronautics; and co-founder and CEO of Humatics Corporation. “Dual competence is a good model for undergraduates at MIT as well."Photo: Len Rosenstein Ribbon cutting launches auxiliary Beaver Works space The 4,000-square-foot addition will facilitate further collaboration between MIT and Lincoln Laboratory researchers. Tue, 11 Jun 2019 14:00:09 -0400 Nathan Parde | MIT Lincoln Laboratory <p><a href="">MIT Lincoln Laboratory Beaver Works</a> is opening a new space in the recently renovated Department of Aeronautics and Astronautics (AeroAstro) Building 31. This new facility builds on the successful partnership between Lincoln Laboratory and the School of Engineering by providing another location for innovation, collaboration, and hands-on development. 
The new site will also strengthen connections between AeroAstro researchers and practicing engineers at the laboratory while supporting collaboration on projects such as the Transiting Exoplanet Survey Satellite and cutting-edge research on autonomous drone systems.</p> <p>To celebrate the opening of the new space, a ribbon-cutting ceremony was held on May 24. Speakers included Eric Evans, director of Lincoln Laboratory; <a href="">Professor Daniel Hastings</a>, MIT Aeronautics and Astronautics department head; and <a href="">Professor Jaime Peraire</a>, the H.N. Slater Professor of Aeronautics and Astronautics.</p> <p>“It was the generosity and enthusiasm of our extended MIT family that made this vision a reality. Generations of researchers and students will use this greatly improved space to conduct research that will benefit the world,” says Peraire.</p> <p>Beaver Works has a history of bringing Lincoln Laboratory and AeroAstro together to generate innovative solutions and expose students to opportunities in engineering, research, and service to the nation. Beaver Works pursues this mission through a broad range of research and educational activities that include capstone courses, joint research projects, the <a href="">Undergraduate Research Opportunities Program</a>, undergraduate internships, and STEM (science, technology, engineering, and mathematics) outreach for local schools. The new facility will also support multiple <a href="">Independent Activities Period courses</a> and community outreach classes for middle and high school students, including coding classes and the four-week <a href="">Beaver Works Summer Institute</a>.</p> <p>“This facility will enable great students, staff, and faculty to work together on complex system prototypes,” said Evans. “We are looking forward to the creative, new technologies that will be developed here.”</p> <p>The renovation added a second, 4,000-square-foot facility for Beaver Works researchers to use. 
With this space, laboratory and MIT affiliates will continue to enable research and development in autonomous air systems, bold air vehicle designs, small satellite designs, and new drone research areas to face coming challenges in subjects ranging from transportation to self-driving drone races.</p> <p>Of the newly renovated space, Hastings said: “This facility will enable us to undertake real-world projects with our students in a manner that exemplifies ‘mens et manus.’” The Latin motto, adopted by MIT, translates to “mind and hand.” This motto reflects the ideal of cooperation between knowledge and practice — a partnership that the new Beaver Works space exemplifies.</p> Left to right: Professor Daniel Hastings, Aeronautics and Astronautics department head; Professor Anantha Chandrakasan, dean of the School of Engineering; Eric Evans, director of Lincoln Laboratory; and Robert Shin, head of the Intelligence, Surveillance, and Reconnaissance and Tactical Systems Division at Lincoln Laboratory, cut the ribbon at the unveiling of the new Beaver Works space in MIT’s Building 31.Photo: Glen Cooper Algorithm tells robots where nearby humans are headed A new tool for predicting a person’s movement trajectory may help humans and robots work together in close proximity. Mon, 10 Jun 2019 23:59:59 -0400 Jennifer Chu | MIT News Office <p>In 2018, researchers at MIT and the auto manufacturer BMW were testing ways in which humans and robots might work in close proximity to assemble car parts. In a replica of a factory floor setting, the team rigged up a robot on rails, designed to deliver parts between work stations. 
Meanwhile, human workers crossed its path every so often to work at nearby stations.&nbsp;</p> <p>The robot was programmed to stop momentarily if a person passed by. But the researchers noticed that the robot would often freeze in place, overly cautious, long before a person had crossed its path. If this took place in a real manufacturing setting, such unnecessary pauses could accumulate into significant inefficiencies.</p> <p>The team traced the problem to a limitation in the trajectory alignment algorithms used by the robot’s motion-predicting software. While the algorithms could reasonably predict where a person was headed, poor time alignment meant they couldn’t anticipate how long that person would spend at any point along their predicted path — and in this case, how long it would take for a person to stop, then double back and cross the robot’s path again.</p> <p>Now, members of that same MIT team have come up with a solution: an algorithm that accurately aligns partial trajectories in real time, allowing motion predictors to anticipate the timing of a person’s motion. When they applied the new algorithm to the BMW factory floor experiments, they found that, instead of freezing in place, the robot simply rolled on and was safely out of the way by the time the person walked by again.</p> <p>“This algorithm builds in components that help a robot understand and monitor stops and overlaps in movement, which are a core part of human motion,” says Julie Shah, associate professor of aeronautics and astronautics at MIT. 
“This technique is one of the many ways we’re working on robots better understanding people.”</p> <p>Shah and her colleagues, including project lead and graduate student Przemyslaw “Pem” Lasota, will present their results this month at the Robotics: Science and Systems conference in Germany.</p> <p><strong>Clustered up</strong></p> <p>To enable robots to predict human movements, researchers typically borrow algorithms from music and speech processing. These algorithms are designed to align two complete time series, or sets of related data, such as an audio track of a musical performance and a scrolling video of that piece’s musical notation.</p> <p>Researchers have used similar alignment algorithms to sync up real-time and previously recorded measurements of human motion, to predict where a person will be, say, five seconds from now. But unlike music or speech, human motion can be messy and highly variable. Even for repetitive movements, such as reaching across a table to screw in a bolt, one person may move slightly differently each time.</p> <p>Existing algorithms typically take in streaming motion data, in the form of dots representing the position of a person over time, and compare the trajectory of those dots to a library of common trajectories for the given scenario. An algorithm maps a trajectory in terms of the relative distance between dots.</p> <p>But Lasota says algorithms that predict trajectories based on distance alone can get easily confused in certain common situations, such as temporary stops, in which a person pauses before continuing on their path. While paused, dots representing the person’s position can bunch up in the same spot.</p> <p>“When you look at the data, you have a whole bunch of points clustered together when a person is stopped,” Lasota says. 
“If you’re only looking at the distance between points as your alignment metric, that can be confusing, because they’re all close together, and you don’t have a good idea of which point you have to align to.”</p> <p>The same goes for overlapping trajectories — instances when a person moves back and forth along a similar path. Lasota says that while a person’s current position may line up with a dot on a reference trajectory, existing algorithms can’t determine whether that position is part of a trajectory heading away, or coming back along the same path.</p> <p>“You may have points close together in terms of distance, but in terms of time, a person’s position may actually be far from a reference point,” Lasota says.</p> <p><strong>It’s all in the timing</strong></p> <p>As a solution, Lasota and Shah devised a “partial trajectory” algorithm that aligns segments of a person’s trajectory in real time with a library of previously collected reference trajectories. Importantly, the new algorithm aligns trajectories in both distance and timing, and in so doing, is able to accurately anticipate stops and overlaps in a person’s path.</p> <p>“Say you’ve executed this much of a motion,” Lasota explains. 
“Old techniques will say, ‘this is the closest point on this representative trajectory for that motion.’ But since you only completed this much of it in a short amount of time, the timing part of the algorithm will say, ‘based on the timing, it’s unlikely that you’re already on your way back, because you just started your motion.’”</p> <p>The team tested the algorithm on two human motion datasets: one in which a person intermittently crossed a robot’s path in a factory setting (these data were obtained from the team’s experiments with BMW), and another in which the group previously recorded hand movements of participants reaching across a table to install a bolt that a robot would then secure by brushing sealant on the bolt.</p> <p>For both datasets, the team’s algorithm was able to make better estimates of a person’s progress through a trajectory, compared with two commonly used partial trajectory alignment algorithms. Furthermore, the team found that when they integrated the alignment algorithm with their motion predictors, the robot could more accurately anticipate the timing of a person’s motion. In the factory floor scenario, for example, they found the robot was less prone to freezing in place, and instead smoothly resumed its task shortly after a person crossed its path.</p> <p>While the algorithm was evaluated in the context of motion prediction, it can also be used as a preprocessing step for other techniques in the field of human-robot interaction, such as action recognition and gesture detection. Shah says the algorithm will be a key tool in enabling robots to recognize and respond to patterns of human movements and behaviors. Ultimately, this can help humans and robots work together in structured environments, such as factory settings and even, in some cases, the home.</p> <p>“This technique could apply to any environment where humans exhibit typical patterns of behavior,” Shah says. 
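The distance-plus-timing idea described above can be illustrated with a short sketch. This is not the team's published algorithm, only a minimal dynamic-time-warping-style alignment over hypothetical (t, x, y) samples (the function name and data format are assumptions for illustration). The matching cost penalizes both spatial distance and the gap in elapsed time, so a pause (points bunched in space but spread out in time) is not mistaken for progress along the path.

```python
import numpy as np

def align_partial(partial, reference, time_weight=1.0):
    """Align a partial trajectory against one reference trajectory.

    Each trajectory is a sequence of (t, x, y) samples. The matching cost
    combines spatial distance with the difference in elapsed time, so
    stopped points (close in space, far apart in time) are not confused
    with points from a later pass along the same path. Returns the index
    of the reference sample the last partial sample aligns to, plus the
    total alignment cost.
    """
    p = np.asarray(partial, dtype=float)
    r = np.asarray(reference, dtype=float)
    n, m = len(p), len(r)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0  # both trajectories start at the beginning of the motion
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            spatial = np.linalg.norm(p[i - 1, 1:] - r[j - 1, 1:])
            temporal = abs(p[i - 1, 0] - r[j - 1, 0])
            cost = spatial + time_weight * temporal
            D[i, j] = cost + min(D[i - 1, j],      # skip a partial sample
                                 D[i, j - 1],      # skip a reference sample
                                 D[i - 1, j - 1])  # match both samples
    # Open-ended: the partial may cover any prefix of the reference.
    j_end = int(np.argmin(D[n, 1:])) + 1
    return j_end - 1, float(D[n, j_end])

# A straight-line reference, and a partial covering its first four samples:
reference = [(t, float(t), 0.0) for t in range(10)]
partial = reference[:4]
idx, cost = align_partial(partial, reference)
```

Run on this toy data, the aligner maps the partial's last sample to reference index 3, i.e. roughly 30 percent of the way through the motion. A real system would hold a library of reference trajectories, rerun the alignment as each new sample streams in, and tune the weight between the spatial and temporal terms.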
“The key is that the [robotic] system can observe patterns that occur over and over, so that it can learn something about human behavior. This is all in the vein of work on the robot better understanding aspects of human motion, to be able to collaborate with us better.”</p> <p>This research was funded, in part, by a NASA Space Technology Research Fellowship and the National Science Foundation.</p> A new algorithm helps robots predict the paths people take in structured environments like the factory floor, which may further enable close collaboration between humans and machines. The tenured engineers of 2019 Seventeen appointments have been made in eight departments within the School of Engineering. Tue, 04 Jun 2019 10:30:01 -0400 School of Engineering <p>The School of Engineering has announced that 17 members of its faculty have been granted tenure by MIT.</p> <p>“The tenured faculty in this year’s cohort are a true inspiration,” said Anantha Chandrakasan, dean of the School of Engineering. 
“They have shown exceptional dedication to research and teaching, and their innovative work has greatly advanced their fields.”</p> <p>This year’s newly tenured associate professors are:</p> <p><a href="" target="_blank">Antoine Allanore</a>, in the Department of Materials Science and Engineering, develops more sustainable technologies and strategies for mining, metal extraction, and manufacturing, including novel methods of fertilizer production.</p> <p><a href="" target="_blank">Saurabh Amin</a>, in the Department of Civil and Environmental Engineering, focuses on the design and implementation of network inspection and control algorithms for improving the resilience of large-scale critical infrastructures, such as transportation systems and water and energy distribution networks, against cyber-physical security attacks and natural events.</p> <p><a href="" target="_blank">Emilio Baglietto</a>, in the Department of Nuclear Science and Engineering, uses computational modeling to characterize and predict the underlying heat-transfer processes in nuclear reactors, including turbulence modeling, unsteady flow phenomena, multiphase flow, and boiling.</p> <p><a href="" target="_blank">Paul Blainey</a>, the Karl Van Tassel (1925) Career Development Professor in the Department of Biological Engineering, integrates microfluidic, optical, and molecular tools for application in biology and medicine across a range of scales.</p> <p><a href="" target="_blank">Kerri Cahoy</a>, the Rockwell International Career Development Professor in the Department of Aeronautics and Astronautics, develops nanosatellites that demonstrate weather sensing using microwave radiometers and GPS radio occultation receivers, high data-rate laser communications with precision time transfer, and active optical imaging systems using MEMS deformable mirrors for exoplanet exploration applications.&nbsp;</p> <p><a href="" target="_blank">Juejun Hu</a>, in the Department of Materials Science and Engineering, 
focuses on novel materials and devices to exploit interactions of light with matter, with applications in on-chip sensing and spectroscopy, flexible and polymer photonics, and optics for solar energy.</p> <p><a href="" target="_blank">Sertac Karaman</a>, the Class of 1948 Career Development Professor in the Department of Aeronautics and Astronautics, studies robotics, control theory, and the application of probability theory, stochastic processes, and optimization for cyber-physical systems such as driverless cars and drones.</p> <p><a href="" target="_blank">R. Scott Kemp</a>, the Class of 1943 Career Development Professor in the Department of Nuclear Science and Engineering, combines physics, politics, and history to identify options for addressing nuclear weapons and energy. He investigates technical threats to nuclear-deterrence stability and the information theory of treaty verification; he is also developing technical tools for reconstructing the histories of secret nuclear-weapon programs.</p> <p><a href="" target="_blank">Aleksander Mądry</a>, in the Department of Electrical Engineering and Computer Science, investigates topics ranging from developing new algorithms using continuous optimization, to combining theoretical and empirical insights, to building a more principled and thorough understanding of key machine learning tools. A major theme of his research is rethinking machine learning from the perspective of security and robustness.</p> <p><a href="" target="_blank">Frances Ross</a>, the Ellen Swallow Richards Professor in the Department of Materials Science and Engineering, performs research on nanostructures using transmission electron microscopes that allow researchers to see, in real-time, how structures form and develop in response to changes in temperature, environment, and other variables. 
Understanding crystal growth at the nanoscale is helpful in creating precisely controlled materials for applications in microelectronics and energy conversion and storage.</p> <p><a href="" target="_blank">Daniel Sanchez</a>, in the Department of Electrical Engineering and Computer Science, works on computer architecture and computer systems, with an emphasis on large-scale multi-core processors, scalable and efficient memory hierarchies, architectures with quality-of-service guarantees, and scalable runtimes and schedulers.</p> <p><a href="" target="_blank">Themistoklis Sapsis</a>, the Doherty Career Development Professor in the Department of Mechanical Engineering, develops analytical, computational, and data-driven methods for the probabilistic prediction and quantification of extreme events in high-dimensional nonlinear systems such as turbulent fluid flows and nonlinear mechanical systems.</p> <p><a href="" target="_blank">Julie Shah</a>, the Boeing Career Development Professor in the Department of Aeronautics and Astronautics, develops innovative computational models and algorithms expanding the use of human cognitive models for artificial intelligence. Her research has produced novel forms of human-machine teaming in manufacturing assembly lines, healthcare applications, transportation, and defense.</p> <p><a href="">Hadley Sikes</a>, the Esther and Harold E. 
Edgerton Career Development Professor in the Department of Chemical Engineering, employs biomolecular engineering and knowledge of reaction networks to detect epigenetic modifications that can guide cancer treatment, induce oxidant-specific perturbations in tumors for therapeutic benefit, and improve signaling reactions and assay formats used in medical diagnostics.</p> <p><a href="" target="_blank">William Tisdale</a>, the ARCO Career Development Professor in the Department of Chemical Engineering, works on energy transport in nanomaterials, nonlinear spectroscopy, and spectroscopic imaging to better understand and control the mechanisms by which excitons, free charges, heat, and reactive chemical species are converted to more useful forms of energy, and on leveraging this understanding to guide materials design and process optimization.</p> <p><a href="" target="_blank">Virginia Vassilevska Williams</a>, the Steven and Renee Finn Career Development Professor in the Department of Electrical Engineering and Computer Science, applies combinatorial and graph theoretic tools to develop efficient algorithms for matrix multiplication, shortest paths, and a variety of other fundamental problems. Her recent research is centered on proving tight relationships between seemingly different computational problems. 
She is also interested in computational social choice issues, such as making elections computationally resistant to manipulation.</p> <p><a href="" target="_blank">Amos Winter</a>, the Tata Career Development Professor in the Department of Mechanical Engineering, focuses on connections between mechanical design theory and user-centered product design to create simple, elegant technological solutions for applications in medical devices, water purification, agriculture, automotive, and other technologies used in highly constrained environments.</p> The MIT School of Engineering’s newly tenured faculty are: (first row, left to right) Amos Winter, Kerri Cahoy, Antoine Allanore, R. Scott Kemp, Juejun Hu, Emilio Baglietto, Virginia Vassilevska Williams, Aleksander Mądry, and Julie Shah; (second row, left to right) William Tisdale, Paul Blainey, Themistoklis Sapsis, Frances Ross, Sertac Karaman, Hadley Sikes, Saurabh Amin, and Daniel Sanchez. CSAIL hosts first-ever TEDxMIT Speakers — all women — discuss everything from gravitational waves to robot nurses. Fri, 31 May 2019 15:00:01 -0400 Adam Conner-Simons | Rachel Gordon | MIT CSAIL <p>On Tuesday, May 28, MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) hosted a special TEDx event featuring an all-female line-up of MIT scientists and researchers who discussed cutting-edge ideas in science and technology.</p> <p>TEDxMIT speakers included roboticists, engineers, astronomers, and policy experts, including former White House chief technology officer Megan Smith ’86 SM ’88 and MIT Institute Professor Emerita Barbara Liskov, <a href="" target="_blank">winner of the A.M. 
Turing Award</a>, often considered the “Nobel Prize for computing.”</p> <p>From Professor Nergis Mavalvala’s <a href="">work on gravitational waves</a> to Associate Provost Krystyn Van Vliet’s <a href="">efforts to improve cell therapy</a>, the afternoon was filled with energizing and historic success stories of women in STEM.</p> <p>In an early talk, MIT Associate Professor Julie Shah touched on the much-discussed narrative of artificial intelligence and job displacement, and how that relates to her own work creating systems that she described as “being intentional about augmenting human capabilities.” She spoke about her efforts developing robots to help reduce the cognitive burden of overwhelmed workers, like the nurses on labor wards who have to make <a href="" target="_self">hundreds of split-second decisions</a> for scheduling deliveries and C-sections.</p> <p>“We can create a future where we don’t have robots who replace humans, but that help us accomplish what neither group can do alone,” said Shah.</p> <p>CSAIL Director Daniela Rus, a professor of electrical engineering and computer science, spoke of how computer scientists can inspire the next generation of programmers by emphasizing the many possibilities that coding opens up.</p> <p>“I like to say that those of us who know how to ... 
breathe life into things through programming have superpowers,” said Rus.</p> <p>Throughout the day scientists showed off technologies that could fundamentally transform many industries, from Professor Dava Newman’s <a href="" target="_self">prototype Mars spacesuit</a> to Associate Professor Vivienne Sze’s <a href="" target="_self">low-power processors for machine learning</a>.</p> <p>Judy Brewer, director of the World Wide Web Consortium’s <a href="" target="_blank">Web Accessibility Initiative</a>, discussed the ways in which the web has made the world a more connected place for those with disabilities — and yet, how important it is for the people who design digital technologies to be better about making them accessible.</p> <p>“When the web became available, I could go and travel anywhere,” Brewer said. “There’s a general history of excluding people with disabilities, and then we go and design tech that perpetuates that exclusion. In my vision of the future everything is accessible, including the digital world.”</p> <p>Liskov captivated the audience with her tales of the early days of computer programming. She was asked to learn Fortran on her first day of work in 1961 — having never written a line of code before.</p> <p>“I didn’t have any training,” she said. “But then again, nobody did.”</p> <p>In 1971 Liskov joined MIT, where she created the programming language CLU, which established the notion of “abstract data types” and laid the groundwork for languages like Java and C#. Many coders now take so-called “object-oriented programming” (OOP) for granted: She wryly reflected on how, after she won the Turing Award, one internet commenter looked at her contributions to data abstraction and pointed out that “everybody knows that, anyway.”</p> <p>“It was a statement to how much the world has changed,” she said with a smile. 
“When I was doing that work decades earlier, nobody knew anything about [OOP].”</p> <p>Other researchers built off of Liskov’s remarks in discussing the birth of big data and machine learning. Professor Ronitt Rubinfeld spoke about how computer scientists’ work in sublinear time algorithms has allowed them to better make sense of large amounts of data, while Hamsa Balakrishnan spoke about the ways in which algorithms can help systems engineers make air travel more efficient.</p> <p>The event’s overarching theme was highlighting examples of female role models in fields where they’ve often been overlooked. Paula Hammond, head of MIT’s Department of Chemical Engineering, touted the fact that more than half of undergraduates in her department this year were women. Rus urged the women in the audience, many of whom were MIT students, to think about what role they might want to play in continuing to advance science in the coming years.</p> <p>“To paraphrase our hometown hero, President John F. 
Kennedy, we need to prepare [women] to see both what technology can do for them — and what they can do for technology,” Rus said.</p> <p>Rus led the planning of the TEDxMIT event alongside MIT research affiliate John Werner and student directors Stephanie Fu and Rucha Kelkar, both first-years.</p> MIT Professor Daniela Rus, director of the Computer Science and Artificial Intelligence Laboratory, kicked off an all-female lineup of speakers at TEDxMIT.Photo: Jason Dorfman/MIT CSAIL Pantry ingredients can help grow carbon nanotubes Study finds baking soda, detergent, and table salt — all rich in sodium — are effective catalysts. Tue, 28 May 2019 23:59:59 -0400 Jennifer Chu | MIT News Office <p>Baking soda, table salt, and detergent are surprisingly effective ingredients for cooking up carbon nanotubes, researchers at MIT have found.</p> <p>In a study published this week in the journal <em>Angewandte Chemie</em>, the team reports that sodium-containing compounds found in common household ingredients are able to catalyze the growth of carbon nanotubes, or CNTs, at much lower temperatures than traditional catalysts require.</p> <p>The researchers say that sodium may make it possible for carbon nanotubes to be grown on a host of lower-temperature materials, such as polymers, which normally melt under the high temperatures needed for traditional CNT growth.</p> <p>“In aerospace composites, there are a lot of polymers that hold carbon fibers together, and now we may be able to directly grow CNTs on polymer materials, to make stronger, tougher, stiffer composites,” says Richard Li, the study’s lead author and a 
graduate student in MIT’s Department of Aeronautics and Astronautics. “Using sodium as a catalyst really unlocks the kinds of surfaces you can grow nanotubes on.”</p> <p>Li’s MIT co-authors are postdocs Erica Antunes, Estelle Kalfon-Cohen, Luiz Acauan, and Kehang Cui; alumni Akira Kudo PhD ’16, Andrew Liotta ’16, and Ananth Govind Rajan SM ’16, PhD ’19; professor of chemical engineering Michael Strano; and professor of aeronautics and astronautics Brian Wardle, along with collaborators at the National Institute of Standards and Technology and Harvard University.</p> <p><strong>Peeling onions</strong></p> <p>Under a microscope, carbon nanotubes resemble hollow cylinders of chicken wire. Each tube is made from a rolled-up lattice of hexagonally arranged carbon atoms. The bond between carbon atoms is extraordinarily strong, and when patterned into a lattice, as in graphene, or rolled into a tube, as in a CNT, these structures can have exceptional stiffness and strength, as well as unique electrical and chemical properties. As such, researchers have explored coating various surfaces with CNTs to produce stronger, stiffer, tougher materials.</p> <p>Researchers typically grow CNTs on various materials through a process called chemical vapor deposition. A material of interest, such as carbon fibers, is coated in a catalyst — usually an iron-based compound — and placed in a furnace, through which carbon dioxide and other carbon-containing gases flow. At temperatures of up to 800 degrees Celsius, the iron starts to draw carbon atoms out of the gas, which glom onto the iron atoms and to each other, eventually forming vertical tubes of carbon atoms around individual carbon fibers.
Researchers then use various techniques to dissolve the catalyst, leaving behind pure carbon nanotubes.</p> <p>Li and his colleagues were experimenting with ways to grow CNTs on various surfaces by coating them with different solutions of iron-containing compounds, when the team noticed the resulting carbon nanotubes looked different from what they expected.</p> <p>“The tubes looked a little funny, and Rich and the team carefully peeled the onion back, as it were, and it turns out a small quantity of sodium, which we suspected was inactive, was actually causing all the growth,” Wardle says.</p> <p><strong>Tuning sodium’s knobs</strong></p> <p>For the most part, iron has been the traditional catalyst for growing CNTs. Wardle says this is the first time that researchers have seen sodium have a similar effect.</p> <p>“Sodium and other alkali metals have not been explored for CNT catalysis,” Wardle says. “This work has led us to a different part of the periodic table.”</p> <p>To make sure their initial observation wasn’t just a fluke, the team tested a range of sodium-containing compounds. They initially experimented with commercial-grade sodium, in the form of baking soda, table salt, and detergent pellets, which they obtained from the campus convenience store. Eventually, however, they upgraded to purified versions of those compounds, which they dissolved in water. They then immersed a carbon fiber in each compound’s solution, coating the entire surface in sodium. Finally, they placed the material in a furnace and carried out the typical steps involved in the chemical vapor deposition process to grow CNTs.</p> <p>In general, they found that, while iron catalysts form carbon nanotubes at around 800 degrees Celsius, the sodium catalysts were able to form short, dense forests of CNTs at much lower temperatures, of around 480 C. 
What’s more, after surfaces spent about 15 to 30 minutes in the furnace, the sodium simply vaporized away, leaving behind hollow carbon nanotubes.</p> <p>“A large part of CNT research is not on growing them, but on cleaning them — getting the different metals used to grow them out of the product,” Wardle says. “The neat thing with sodium is, we can just heat it and get rid of it, and get pure CNT as product, which you can’t do with traditional catalysts.”</p> <p>Li says future work may focus on improving the quality of CNTs that are grown using sodium catalysts. The researchers observed that while sodium was able to generate forests of carbon nanotubes, the carbon atoms in the tube walls were not arranged in perfectly hexagonal patterns — the crystal-like configurations that give CNTs their characteristic strength. Li plans to “tune various knobs” in the CVD process, changing the timing, temperature, and environmental conditions, to improve the quality of sodium-grown CNTs.</p> <p>“There are so many variables you can still play with, and sodium can still compete pretty well with traditional catalysts,” Li says. “We anticipate with sodium, it is possible to get high quality tubes in the future. And we have pretty high confidence that, even if you were to use regular Arm &amp; Hammer baking soda, it should work.”</p> <p>For Shigeo Maruyama, professor of mechanical engineering at the University of Tokyo, the ability to cook up CNTs from such a commonplace ingredient as sodium should reveal new insights into the way the exceptionally strong materials grow.</p> <p>“It is a surprise that we can grow carbon nanotubes from table salt!” says Maruyama, who was not involved in the research. “Even though chemical vapor deposition (CVD) growth of carbon nanotubes has been studied for more than 20 years, nobody has tried to use alkali group metal as catalyst.
This will be a great hint for the fully new understanding of growth mechanism of carbon nanotubes.”</p> <p>This research was supported, in part, by Airbus, Boeing, Embraer, Lockheed Martin, Saab AB, ANSYS, Saertex, and TohoTenax through MIT’s Nano-Engineered Composite aerospace STructures (NECST) Consortium.</p> Sodium-containing compounds, such as those found in common household ingredients like detergent, baking soda, and table salt, are surprisingly effective ingredients for cooking up carbon nanotubes, new MIT study finds.Image: Christine Daniloff, MITAeronautical and astronautical engineering, Carbon nanotubes, Research, School of Engineering, Nanoscience and nanotechnology, Materials Science and Engineering Bringing human-like reasoning to driverless car navigation Autonomous control system “learns” to use simple maps and image data to navigate new, complex routes. Wed, 22 May 2019 23:59:59 -0400 Rob Matheson | MIT News Office <p>With aims of bringing more human-like reasoning to autonomous vehicles, MIT researchers have created a system that uses only simple maps and visual data to enable driverless cars to navigate routes in new, complex environments.</p> <p>Human drivers are exceptionally good at navigating roads they haven’t driven on before, using observation and simple tools. We simply match what we see around us to what we see on our GPS devices to determine where we are and where we need to go. Driverless cars, however, struggle with this basic reasoning. In every new area, the cars must first map and analyze all the new roads, which is very time consuming. 
The systems also rely on complex maps —&nbsp;usually generated by 3-D scans —&nbsp;which are computationally intensive to generate and process on the fly.</p> <p>In a paper being presented at this week’s International Conference on Robotics and Automation, MIT researchers describe an autonomous control system that “learns” the steering patterns of human drivers as they navigate roads in a small area, using only data from video camera feeds and a simple GPS-like map. Then, the trained system can control a driverless car along a planned route in a brand-new area, by imitating the human driver.</p> <p>Similarly to human drivers, the system also detects any mismatches between its map and features of the road. This helps the system determine if its position, sensors, or mapping are incorrect, in order to correct the car’s course.</p> <p>To train the system initially, a human operator controlled an automated&nbsp;Toyota Prius — equipped with several cameras and a basic GPS navigation system —&nbsp;to collect data from local suburban streets including various road structures and obstacles. When deployed autonomously, the system successfully navigated the car along a preplanned path in a different forested area, designated for autonomous vehicle tests.</p> <p>“With our system, you don’t need to train on every road beforehand,” says first author Alexander Amini, an MIT graduate student. “You can download a new map for the car to navigate through roads it has never seen before.”</p> <p>“Our objective is to achieve autonomous navigation that is robust for driving in new environments,” adds co-author Daniela Rus, director of the Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science. 
“For example, if we train an autonomous vehicle to drive in an urban setting such as the streets of Cambridge, the system should also be able to drive smoothly in the woods, even if that is an environment it has never seen before.”</p> <p>Joining Rus and Amini on the paper are Guy Rosman, a researcher at the Toyota Research Institute, and Sertac Karaman, an associate professor of aeronautics and astronautics at MIT.</p> <div class="cms-placeholder-content-video"></div> <p><strong>Point-to-point navigation</strong></p> <p>Traditional navigation systems process data from sensors through multiple modules customized for tasks such as localization, mapping, object detection, motion planning, and steering control. For years, Rus’s group has been developing “end-to-end” navigation systems, which process inputted sensory data and output steering commands, without a need for any specialized modules.</p> <p>Until now, however, these models were strictly designed to safely follow the road, without any real destination in mind. In the new paper, the researchers advanced their end-to-end system to drive from goal to destination, in a previously unseen environment. To do so, the researchers trained their system to predict a full probability distribution over all possible steering commands at any given instant while driving.</p> <p>The system uses a machine learning model called a convolutional neural network (CNN), commonly used for image recognition. During training, the system watches and learns how to steer from a human driver. The CNN correlates steering wheel rotations to road curvatures it observes through cameras and an inputted map. Eventually, it learns the most likely steering command for various driving situations, such as straight roads, four-way or T-shaped intersections, forks, and rotaries.</p> <p>“Initially, at a T-shaped intersection, there are many different directions the car could turn,” Rus says. 
“The model starts by thinking about all those directions, but as it sees more and more data about what people do, it will see that some people turn left and some turn right, but nobody goes straight. Straight ahead is ruled out as a possible direction, and the model learns that, at T-shaped intersections, it can only move left or right.”</p> <p><strong>What does the map say?</strong></p> <p>In testing, the researchers fed the system a map with a randomly chosen route. When driving, the system extracts visual features from the camera, which enables it to predict road structures. For instance, it identifies a distant stop sign or line breaks on the side of the road as signs of an upcoming intersection. At each moment, it uses its predicted probability distribution of steering commands to choose the most likely one to follow its route.</p> <p>Importantly, the researchers say, the system uses maps that are easy to store and process. Autonomous control systems typically use LIDAR scans to create massive, complex maps that take roughly 4,000 gigabytes (4 terabytes) of data to store just the city of San Francisco. For every new destination, the car must create new maps, which amounts to tons of data processing. The maps used by the researchers’ system, however, capture the entire world using just 40 gigabytes of data.</p> <p>During autonomous driving, the system also continuously matches its visual data to the map data and notes any mismatches. Doing so helps the autonomous vehicle better determine where it is located on the road. And it ensures the car stays on the safest path if it’s being fed contradictory input information: If, say, the car is cruising on a straight road with no turns, and the GPS indicates the car must turn right, the car will know to keep driving straight or to stop.</p> <p>“In the real world, sensors do fail,” Amini says.
“We want to make sure that the system is robust to different failures of different sensors by building a system that can accept these noisy inputs and still navigate and localize itself correctly on the road.”</p> To bring more human-like reasoning to autonomous vehicle navigation, MIT researchers have created a system that enables driverless cars to check a simple map and use visual data to follow routes in new, complex environments.Image: Chelsea TurnerResearch, Computer science and technology, Algorithms, Robotics, Autonomous vehicles, Automobiles, Artificial intelligence, Machine learning, Transportation, Technology and society, Computer Science and Artificial Intelligence Laboratory (CSAIL), Electrical Engineering & Computer Science (eecs), School of Engineering, Aeronautical and astronautical engineering MIT team places second in 2019 NASA BIG Idea Challenge Multilevel Mars greenhouse could provide food to sustain astronauts for several years. Mon, 20 May 2019 13:00:01 -0400 Sarah Jensen | Department of Aeronautics and Astronautics <p>An MIT student team took second place for its design of a multilevel greenhouse to be used on Mars in NASA’s 2019 Breakthrough, Innovative and Game-changing (BIG) Idea Challenge last month.&nbsp;</p> <p>Each year, NASA holds the BIG Idea competition in its search for innovative and futuristic ideas. This year’s challenge invited universities across the United States to submit designs for a sustainable, cost-effective, and efficient method of supplying food to astronauts during future crewed explorations of Mars. Dartmouth College was awarded first place in this year’s closely contested challenge.</p> <p>“This was definitely a full-team success,” says team leader Eric Hinterman, a graduate student in MIT’s Department of Aeronautics and Astronautics (AeroAstro). The team had contributions from 10 undergraduates and graduate students from across MIT departments. 
Support and assistance were provided by four architects and designers in Italy. This project was completely voluntary; all 14 contributors share a similar passion for space exploration and enjoyed working on the challenge in their spare time.</p> <p>The MIT team dubbed its design “BEAVER” (Biosphere Engineered Architecture for Viable Extraterrestrial Residence). “We designed our greenhouse to provide 100 percent of the food requirements for four active astronauts every day for two years,” explains Hinterman.</p> <p>The ecologists and agriculture specialists on the MIT team identified eight types of crops to provide the calories, protein, carbohydrates, and oils and fats that astronauts would need; these included potatoes, rice, wheat, oats, and peanuts. The flexible menu suggested substitutes, depending on astronauts’ specific dietary requirements.</p> <p>“Most space systems are metallic and very robotic,” Hinterman says. “It was fun working on something involving plants.”</p> <p>Parameters provided by NASA — a power budget, dimensions necessary for transporting by rocket, the capacity to provide adequate sustenance — drove the shape and the overall design of the greenhouse.</p> <p>Last October, the team held an initial brainstorming session and pitched project ideas. The iterative process continued until they reached their final design: a cylindrical growing space 11.2 meters in diameter and 13.4 meters tall after deployment.</p> <p><strong>An innovative design</strong></p> <p>The greenhouse would be packaged inside a rocket bound for Mars and, after landing, a waiting robot would move it to its site. 
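As a rough sense of scale (a back-of-the-envelope check, not a figure from the team), the stated deployed dimensions imply a growing volume of about 1,300 cubic meters:

```python
import math

# Deployed dimensions reported for the BEAVER greenhouse
diameter_m = 11.2
height_m = 13.4

# Volume of a cylinder: V = pi * r^2 * h
volume_m3 = math.pi * (diameter_m / 2) ** 2 * height_m
print(round(volume_m3))  # 1320 cubic meters
```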
Programmed with folding mechanisms, it would then expand horizontally and vertically and begin forming an ice shield around its exterior to protect plants and humans from the intense radiation on the Martian surface.</p> <p>Two years later, when the orbits of Earth and Mars were again optimally aligned for launching and landing, a crew would arrive on Mars, where they would complete the greenhouse setup and begin growing crops. “About every two years, the crew would leave and a new crew of four would arrive and continue to use the greenhouse,” explains Hinterman.</p> <p>To maximize space, BEAVER employs a large spiral that moves around a central core within the cylinder. Seedlings are planted at the top and flow down the spiral as they grow. By the time they reach the bottom, the plants are ready for harvesting, and the crew enters at the ground floor to harvest the potatoes, peanuts, and grains. The planting trays are then moved to the top of the spiral, and the process begins again.</p> <p>“A lot of engineering went into the spiral,” says Hinterman. “Most of it is done without any moving parts or mechanical systems, which makes it ideal for space applications. You don’t want a lot of moving parts or things that can break.”</p> <p><strong>The human factor</strong></p> <p>“One of the big issues with sending humans into space is that they will be confined to seeing the same people every day for a couple of years,” Hinterman explains. “They’ll be living in an enclosed environment with very little personal space.”</p> <p>The greenhouse provides a pleasant area to ensure astronauts’ psychological well-being. On the top floor, just above the spiral, a windowed “mental relaxation area” overlooks the greenery. The ice shield admits natural light, and the crew can lounge on couches and enjoy the view of the Mars landscape.
And rather than running pipes from the water tank at the top level down to the crops, Hinterman and his team designed a cascading waterfall at the area’s periphery, further adding to the ambiance.</p> <p>Sophomore Sheila Baber, an Earth, atmospheric, and planetary sciences (EAPS) major and the team’s ecology lead, was eager to take part in the project. “My grandmother used to farm in the mountains in Korea, and I remember going there and picking the crops,” she says. “Coming to MIT, I felt like I was distanced from my roots. I am interested in life sciences and physics and all things space, and this gave me the opportunity to combine all those.”</p> <p>Her work on BEAVER earned Baber one of five NASA internships at Langley Research Center in Hampton, Virginia, this summer. She expects to continue exploring the greenhouse project and its applications on Earth, such as in urban settings where space for growing food is constrained.</p> <p>“Some of the agricultural decisions that we made about hydroponics and aquaponics could potentially be used in environments on Earth to raise food,” she says.</p> <p>“The MIT team was great to work with,” says Hinterman. “They were very enthusiastic and hardworking, and we came up with a great design as a result.”</p> <p>In addition to Baber and Hinterman, team members included Siranush Babakhanova (Physics), Joe Kusters (AeroAstro), Hans Nowak (Leaders for Global Operations), Tajana Schneiderman (EAPS), Sam Seaman (Architecture), Tommy Smith (System Design and Management), Natasha Stamler (Mechanical Engineering and Urban Studies and Planning), and Zhuchang Zhan (EAPS). Assistance was provided by Italian designers and architects Jana Lukic, Fabio Maffia, Aldo Moccia, and Samuele Sciarretta.
The team’s advisors were Jeff Hoffman, Sara Seager, Matt Silver, Vladimir Airapetian, Valentina Sumini, and George Lordos.</p> <p>The BIG Idea Challenge is sponsored by NASA’s Space Technology Mission Directorate’s Game Changing Development program and managed by the National Institute of Aerospace.</p> MIT team that designed BEAVER (Biosphere Engineered Architecture for Viable Extraterrestrial Residence), a proposed Martian greenhouse that could provide 100 percent of the food required by four astronauts for up to two yearsPhoto: William LitantNASA, Mars, Aeronautical and astronautical engineering, Urban studies and planning, EAPS, Physics, Architecture, Mechanical engineering, Leaders for Global Operations (LGO), System Design and Management, School of Engineering, School of Science, School of Architecture and Planning, Contests and academic competitions, Agriculture, Food, Space exploration Solution for remotely monitoring oil wells wins MIT $100K MIT startup Acoustic Wells earned the grand prize at the annual entrepreneurship competition. Fri, 17 May 2019 10:14:24 -0400 Zach Winn | MIT News Office <p>The winner of Wednesday’s MIT $100K Entrepreneurship Competition was a startup helping oil well owners remotely monitor and control the pumping of their wells, increasing production while reducing equipment failures and cutting methane emissions.</p> <p>Acoustic Wells, a team including two MIT postdocs, was awarded the grand prize after eight finalist teams pitched their projects to judges and hundreds of attendees at Kresge Auditorium.
T-var EdTech, a company developing phonics-based devices that help children learn to read, earned the $10,000 audience choice prize.</p> <p>The MIT $100K, MIT’s largest entrepreneurship competition, celebrated its 30th anniversary this year and featured talks from $100K co-founder Peter Mui ’82 and MassChallenge founder John Harthorne MBA ’07, who won the $100K grand prize in 2007.</p> <p>Mui reflected on how much the program has grown since he and his classmates first had the idea in 1989, back when it was the MIT $10K. Harthorne talked about the inspiration he got as an MBA candidate from his MIT classmates tackling some of the world’s biggest problems.</p> <p>“MIT taught me to dream big, and that’s what this event is all about,” Harthorne said to the crowd at the sold out auditorium. “Every one of the teams competing tonight could go on to do great things.”</p> <p><strong>Improving oil well pumping</strong></p> <p>The majority of North America’s 1.4 million oil and gas wells are run by independent owners operating batches of hundreds or thousands of aging wells. Working with thin profit margins and older equipment, the owners rely on small teams of workers to manually inspect each well in a yearlong, labor-intensive, daily process.</p> <p>When setting up their pumping equipment, each owner must strike a balance: If they set up the wells to pump too slowly, they risk leaving oil in the ground and losing much-needed revenue. If they pump too fast, they risk breaking their equipment and causing pollution.</p> <p>“The result [of pumping too fast] is similar to when you’re drinking with a straw from a cup and there’s nothing left, so you hear that bubble sound,” Acoustic Wells founder and CEO Sebastien Mannai SM ’14 PhD ’18 told the audience. 
“The same thing happens with oil wells, but on a much bigger scale.”</p> <p>In the case of oil wells, those “bubbles” are pockets of methane that enter the pump and cause it to fail, unleashing unnecessary greenhouse gases in the process.</p> <p>To address this problem, Acoustic Wells is developing an “internet of things” device based on a novel sensor and an online cloud solution to help well owners control their equipment using real-time pumping data.</p> <p>Mannai, a postdoc in the Department of Aeronautics and Astronautics, compared the device’s sensor to a stethoscope. It works via a sensor similar to a microphone connected to the wellhead at the surface. The sensor records the sound of the pump, and a field computer processes the data at the edge before sending the results to a cloud-based system for real-time analysis. Owners can view the processed data on a dashboard and remotely send orders to the well to change its pump settings, simplifying the inspection and control processes.</p> <p>The company has already conducted field tests with an early version of its solution on 30 wells across Oklahoma, Texas, and Louisiana. In those tests, the solution was able to detect key issues, and the wells were adjusted to increase their efficiency and reduce their emissions, Mannai says.</p> <p>The team, which also includes Charles-Henri Clerget, a postdoctoral associate in the Department of Mathematics who is also affiliated with the Earth Resources Laboratory (ERL), and Louis Creteur, the IoT and cloud architect of the company Leanbox, will use the winnings from the $100K competition to hire its first employees and continue to scale its user base.</p> <p>The company is initially targeting independent well owners in North America.
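The edge-processing step described above can be illustrated with a toy sketch. Everything here (the RMS feature, the threshold, and the status and action labels) is a hypothetical stand-in for illustration, not Acoustic Wells' actual algorithm:

```python
import math

def rms(samples):
    """Root-mean-square amplitude of an acoustic frame."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def classify_frame(samples, fluid_threshold=0.5):
    """Toy edge-side classifier: a sharp drop in acoustic energy is
    treated as a sign the pump is drawing gas instead of fluid,
    so the controller should slow the pump. The threshold and labels
    are illustrative, not from the real product."""
    level = rms(samples)
    if level < fluid_threshold:
        return {"status": "gas_interference", "action": "slow_pump", "rms": level}
    return {"status": "normal", "action": "hold", "rms": level}

# Simulated frames: strong signal while pumping fluid, weak when drawing gas.
fluid_frame = [math.sin(0.1 * i) for i in range(1000)]        # RMS ~0.70
dry_frame = [0.1 * math.sin(0.1 * i) for i in range(1000)]    # RMS ~0.07

print(classify_frame(fluid_frame)["status"])  # normal
print(classify_frame(dry_frame)["status"])    # gas_interference
```

In the real system, only the compact classification result (not the raw audio) would need to travel to the cloud dashboard, which is what makes remote monitoring of thousands of wells tractable.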
It plans to commercialize its product as a Software as a Service platform (SaaS).</p> <p>Overall, Acoustic Wells believes its solution could save independent well owners $6 billion annually while preventing the methane equivalent of 240 million tons of carbon dioxide.</p> <p><strong>Teaching kids to read</strong></p> <p>T-var EdTech has developed a product called The Read Read that acts as a sound board when blocks of letters are placed on it, in order to mimic phonics, a proven method for teaching children to read.</p> <p>Phonics has historically required adults to sound out letters and words with children as they read. The process is time-consuming and best done one-on-one. The Read Read allows children to use phonics on their own.</p> <p>The letters that come with the device represent the major English speech sounds. Children place the letters on the device, and when a letter is touched, the device sounds it out. In the first version of its device, the company has placed braille underneath large, black letters that contrast with the white block to help children who are blind or visually impaired learn braille.</p> <p>In a pilot with the Perkins School for the Blind, students previously classified as nonreaders learned to read using the company’s device, according to founder Alex Tavares, a graduate of Harvard University’s Graduate School of Education.</p> <p>The company has begun preselling to parents and schools and has partnered with LC Industries, one of the largest employers of adults with visual impairments in the U.S.</p> <p>“Phonics works, but it’s not scalable in its current implementation,” Tavares told the audience. “The Read Read scales phonics by allowing kids to practice independently.
Finally, phonics is accessible to all kids.”</p> <p>Wednesday night’s competition was the culmination of a process that began in the winter for semifinalist teams, who received funding and mentoring to develop comprehensive business plans around their ideas.</p> <p>The event was run by students and supported by the Martin Trust Center for MIT Entrepreneurship and the MIT Sloan School of Management.</p> <p>This year’s judges were TJ Parker, co-founder and CEO of PillPack; Mira Wilczek ’04 MBA ’09, president and CEO of Cogo Labs; Thomas Collet PhD ’91, president and CEO of Phrixus Pharmaceuticals; Tanguy Chau SM ’10 PhD ’10 MBA ’11, an investor and co-founder of Corvium; and Katie Rae, the CEO and managing general partner of The Engine.</p> Acoustic Wells chief operating officer Charles-Henri Clerget (sixth from left), chief executive officer Sebastien Mannai SM ’14 PhD ’18 (sixth from right), and chief technology officer Louis Creteur (fifth from right) pose with judges and organizers after winning Wednesday’s $100K Entrepreneurship Competition.Research, Industry, Innovation and Entrepreneurship (I&E), Oil and gas, Greenhouse gases, Sustainability, Climate change, Startups, Contests and academic competitions, MIT $100K competition, Business and management, Energy, Aeronautical and astronautical engineering, School of Engineering, Alumni/ae MIT, Blue Origin to cooperate on sending research experiment to the moon MIT community input will help determine the focus of the study to be carried aboard Blue Moon lander. Fri, 10 May 2019 17:15:01 -0400 Department of Aeronautics and Astronautics <p>MIT and Blue Origin have signed a memorandum outlining plans to pursue mutual interests in space exploration.
MIT will develop one or more payload experiments to be launched aboard Blue Origin’s Blue Moon, a flexible lander delivering a wide variety of small, medium, and large payloads to the lunar surface.</p> <p>MIT Apollo Professor of Astronautics and former NASA Deputy Administrator Dava Newman, who developed the agreement with Blue Origin, says that over the coming months, MIT researchers will invite input from the MIT community to help determine the nature of the flight opportunity experiment. “Some potential areas include smart habitats, rovers, life support and autonomous systems, human-machine interaction, science of the moon, lunar poles, sample return, and future astronaut performance and suit technologies,” Newman says.</p> <p>Blue Origin’s business development director, A.C. Charania, has said the company’s lunar transportation program is its “first step to developing a lunar landing capability for the country, for other customers internationally, to be able to land multimetric tons on the lunar surface.” Blue Moon payloads could include science experiments, rovers, power systems, and sample return stages.</p> <p>MIT has a long history of aerospace engineering development and lunar science related to space exploration, including receiving the first major contract of the Apollo program, which involved the design and development of the lunar missions’ guidance and navigation computers. MIT experiments have flown on Space Shuttle missions and been conducted aboard Skylab, Mir, and the International Space Station.
MIT also led the GRAIL (Gravity Recovery And Interior Laboratory) mission to explore the moon’s gravity field and geophysical structure.</p> Conceptual image of Blue Origin’s Blue Moon, the proposed robotic space cargo carrier and lander for making cargo deliveries to the MoonIllustration courtesy of Blue Origin.Space, astronomy and planetary science, Aeronautical and astronautical engineering, School of Engineering, Planetary science, Research, Moon A sustainability rating for space debris Teams from the MIT Media Lab, European Space Agency, and World Economic Forum will create a rating system and global standards for space waste mitigation. Mon, 06 May 2019 14:00:01 -0400 Helen Knight | MIT Media Lab <p>Space is becoming increasingly congested, even as our societal dependence on space technology is greater than ever before.</p> <p>With over 20,000 pieces of debris larger than 10 centimeters, including inactive satellites and discarded rocket parts hurtling around in Earth’s orbit, the risk of damaging collisions increases every year.</p> <p>In a bid to address this issue, and to foster global standards in waste mitigation, the World Economic Forum has chosen a team led by the <a href="">Space Enabled Research Group</a> at the MIT Media Lab, together with a team from the European Space Agency (ESA), to launch the Space Sustainability Rating (SSR), a concept developed by the Forum’s <a href="">Global Future Council on Space Technologies</a>.</p> <p>Similar to rating systems such as the LEED certification used by the construction industry, the SSR is designed to ensure long-term sustainability by encouraging more responsible behavior among countries and companies participating in space.</p> <p>The team, announced on May 6 at the <a href="">Satellite 2019</a> conference in Washington, also includes collaborators from Bryce Space and Technology, and the University of Texas at Austin.</p> <p>The MIT portion of the team will be led by Danielle Wood, the Benesse 
Corporation Career Development Assistant Professor of Research in Education within MIT’s Program in Media Arts and Sciences, and jointly appointed in the Department of Aeronautics and Astronautics. She will be working alongside Minoo Rathnasabapathy, a research engineer within the Space Enabled group. Professor Moriba Jah and Adjunct Professor Diane Howard contribute from the <a href="">University of Texas at Austin</a>, building on Professor Jah’s in-depth research on tracking and visualizing space objects and Professor Howard’s legal knowledge, while Mike French and Aschley Schiller bring expertise about space industry dynamics from <a href="">Bryce</a>. The MIT-led team joins the efforts of Nikolai Khlystov and Maksim Soshkin in the World Economic Forum Aerospace Industry Team as well as Stijn Lemmens and Francesca Letizia in the <a href="">Space Debris Office</a> of the European Space Agency.</p> <p>Working with the World Economic Forum and the other collaborators to create the SSR is directly in line with the mission of the Media Lab's Space Enabled research group, of which Wood is also the founder and head, to advance justice in Earth's complex systems using designs enabled by space.</p> <p>“One element of justice is ensuring that every country has the opportunity to participate in using space technology as a form of infrastructure to provide vital services in our society such as communication, navigation, and environmental monitoring,” Wood says.</p> <p>Many aspects of modern society depend on satellite services. Weather reports, for example, depend on a global network of weather satellites operated primarily by governments.</p> <p>In addition, cars, trains, ships, and airplanes routinely use satellite positioning services.
These same positioning satellites also offer a highly accurate timing signal used by the global banking system to precisely time financial transactions.</p> <p>“Our global economy depends on our ability to operate satellites safely in order to fly in planes, prepare for severe weather, broadcast television and study our changing climate,” Wood says. “To continue using satellites in orbit around Earth for years to come, we need to ensure that the environment around Earth is as free as possible from trash left over from previous missions.”</p> <p>When satellites are retired from useful service, many will remain in orbit for decades longer, adding to the problem of space debris.</p> <p>In the best-case scenario, satellites will gradually drift down to lower orbits and burn up in Earth's atmosphere. However, the higher the orbit a satellite is operating in, the longer it takes to move down and burn up.</p> <p>When satellite operators design their satellites, they are able to choose which altitude to use, and for how long their spacecraft will operate. They therefore have a responsibility to design their satellites to produce as little waste as possible in Earth's orbit.</p> <p>“The Space Sustainability Rating will create an incentive for companies and governments operating satellites to take all the steps they can to reduce the creation of space debris,” Wood says. “This will create a more equitable opportunity for new countries to participate in space with less risk of collision with older satellites.”</p> <p>Many governments already provide guidelines to companies operating within their borders, to help reduce the amount of space debris produced. The space community is also engaged in an ongoing discussion about new ways to reduce the creation of debris.</p> <p>But in the meantime, multiple companies are planning to launch large constellations of satellites that will quickly increase the number of spacecraft in orbit.
These satellite constellations will eventually be decommissioned, adding to the growing space junk problem.</p> <p>To address this issue, the World Economic Forum Global Future Council on Space Technologies, which is composed of leaders from government, academia and industry, has developed the concept of a voluntary system, the SSR, to encourage those who operate satellites to create as little debris as possible.</p> <p>The newly announced team will draw up the rules and processes by which the SSR will operate, including determining what information should be collected from satellite operators to assess their impact on space sustainability.</p> <p>“Countries in every region are starting new space programs to participate in applying space to their national development,” Wood says. “Creating the Space Sustainability Rating with our collaborators is one key step to ensure that all countries continue to increase the benefits we receive from space technology.”</p> <p>With a lack of diversity in existing strategies to tackle the orbital debris challenge, the Global Future Council felt it important to develop an industry-wide approach, according to Nikolai Khlystov, lead for aerospace industry at the World Economic Forum.</p> <p>“We are very glad to partner with leading industry entities such as the European Space Agency, MIT's Space Enabled research group, the University of Texas at Austin and Bryce Space and Technology to build and launch the Space Sustainability Rating,” Khlystov says.</p> <p>The envisaged SSR has a clear goal: to promote mission designs and operational concepts that avoid unhampered growth in space debris and the resulting detrimental effects, says Stijn Lemmens, senior space debris mitigation analyst in the Space Debris Office at ESA.</p> <p>“Together with our collaborators, we aim to put in place a system that has the flexibility to stimulate and drive innovative sustainable design solutions, and spotlight those missions that contribute
positively to the space environment,” Lemmens says.</p> In orbit around Earth today there are over 20,000 objects larger than 10 centimeters. Of these, only about 2,000 are operational satellites. The remainder are space debris that create a hazard for the useful satellites that provide our global weather, navigation, and communication services. Image: European Space Agency. Aeronautical and astronautical engineering, School of Architecture and Planning, Media Lab, Satellites, Space, astronomy and planetary science, Technology and society, Policy School of Engineering first quarter 2019 awards Faculty members recognized for excellence via a diverse array of honors, grants, and prizes over the last quarter. Fri, 03 May 2019 15:30:01 -0400 School of Engineering <p>Members of the MIT engineering faculty receive many awards in recognition of their scholarship, service, and overall excellence. Every quarter, the School of Engineering publicly recognizes their achievements by highlighting the honors, prizes, and medals won by faculty working in our academic departments, labs, and centers.</p> <p>Regina Barzilay, of the Department of Electrical Engineering and Computer Science, was named among the “<a href="">Top 100 AI Leaders in Drug Discovery and Advanced Healthcare</a>” by Deep Knowledge Analytics on Feb. 1.</p> <p>Sir Tim Berners-Lee, of the Department of Electrical Engineering and Computer Science, was named <a href="">Person of the Year</a> by the <em>Financial Times</em> on Mar. 14.</p> <p>Ed Boyden, of the Department of Biological Engineering and the MIT Media Lab, was awarded the <a href="">Rumford Prize</a> on Jan. 30.</p> <p>Emery N.
Brown, of the Department of Brain and Cognitive Sciences and the Institute for Medical Engineering and Science, was awarded an <a href="">honorary degree</a> from the University of Southern California on April 9.</p> <p>Areg Danagoulian, of the Department of Nuclear Science and Engineering, was named to the <a href="">Consortium of Monitoring, Technology, and Verification</a> by the Department of Energy’s National Nuclear Security Administration on Jan. 17.</p> <p>Luca Daniel, of the Department of Electrical Engineering and Computer Science, was awarded a <a href="">Thornton Family Faculty Research Innovation Fellowship</a> on Feb. 8.</p> <p>Constantinos Daskalakis, of the Department of Electrical Engineering and Computer Science, was awarded a <a href="">Frank Quick Faculty Research Innovation Fellowship</a> on Feb. 8.</p> <p>Srini Devadas, of the Department of Electrical Engineering and Computer Science, won the <a href="">Distinguished Alumnus Award</a> from the Indian Institute of Technology Madras on Feb. 1.</p> <p>Carmen Guerra-Garcia, of the Department of Aeronautics and Astronautics, was named an <a href="">AIAA Senior Member</a> on April 5.</p> <p>Thomas Heldt, of the Department of Electrical Engineering and Computer Science and the Institute for Medical Engineering and Science, was named <a href="">Distinguished Lecturer</a> by the IEEE Engineering in Medicine and Biology Society on Dec. 20, 2018.</p> <p>Tommi Jaakkola, of the Department of Electrical Engineering and Computer Science, was named among “<a href="">Top 100 AI Leaders in Drug Discovery and Advanced Healthcare</a>” by Deep Knowledge Analytics on Feb. 1.</p> <p>Manolis Kellis, of the Department of Electrical Engineering and Computer Science, was named among “<a href="">Top 100 AI Leaders in Drug Discovery and Advanced Healthcare</a>” by Deep Knowledge Analytics on Feb.
1.</p> <p>Sangbae Kim, of the Department of Mechanical Engineering, was named a <a href="">Defense Science Study Group</a> member on Mar. 20.</p> <p>Angela Koehler, of the Department of Biological Engineering, won the Junior Bose Award for Teaching Excellence on Mar. 11.</p> <p>Jing Kong, of the Department of Electrical Engineering and Computer Science, was awarded a <a href="">Thornton Family Faculty Research Innovation Fellowship</a> on Feb. 8.</p> <p>Luqiao Liu, of the Department of Electrical Engineering and Computer Science, received a <a href="">Young Investigator Research Program</a> grant from the U.S. Air Force Office of Scientific Research on Sept. 26, 2018.</p> <p>Gareth McKinley, of the Department of Mechanical Engineering, was elected to the <a href="">National Academy of Engineering</a> on Feb. 2.</p> <p>Muriel Médard, of the Department of Electrical Engineering and Computer Science, was named a fellow by the <a href="">National Academy of Inventors</a> on Dec. 11, 2018.</p> <p>Stefanie Mueller, of the Department of Electrical Engineering and Computer Science, won an <a href="">NSF CAREER award</a> on Feb. 22.</p> <p>Julia Ortony, of the Department of Materials Science and Engineering, was awarded a Professor Amar G. Bose Research Grant on Feb. 14.</p> <p>Ellen Roche, of the Department of Mechanical Engineering and the Institute for Medical Engineering and Science, won an <a href="">NSF CAREER award</a> on Feb. 20.</p> <p>Christopher Schuh, of the Department of Materials Science and Engineering, was elected to the <a href="">National Academy of Engineering</a> on Feb. 7.</p> <p>Suvrit Sra, of the Department of Electrical Engineering and Computer Science, won an <a href="">NSF CAREER award</a> on Mar. 11.</p> <p>Leia Stirling, of the Department of Aeronautics and Astronautics, was named an <a href="">Alan I. Leshner Leadership Institute fellow</a> on Feb.
11.</p> <p>Peter Szolovits, of the Department of Electrical Engineering and Computer Science, was named among “<a href="">Top 100 AI Leaders in Drug Discovery and Advanced Healthcare</a>” by Deep Knowledge Analytics on Feb. 1.</p> Photo: Lillie Paquette / MIT School of Engineering. Awards, honors and fellowships, Biological engineering, Aeronautical and astronautical engineering, Chemical engineering, Electrical Engineering & Computer Science (eecs), Mechanical engineering, Civil and environmental engineering, DMSE, Nuclear science and engineering, IDSS, Institute for Medical Engineering and Science (IMES) Six suborbital research payloads from MIT fly to space and back Space Exploration Initiative research aboard Blue Origin’s New Shepard experiment capsule crossed the Karman line for three minutes of sustained microgravity. Fri, 03 May 2019 14:50:40 -0400 Stephanie Strom | MIT Media Lab <p>Blast off! MIT made its latest foray into research in space on May 2 via six payloads from the Media Lab Space Exploration Initiative, tucked into Blue Origin’s New Shepard reusable space vehicle that took off from a launchpad in West Texas.</p> <p>It was also the first time in the history of the Media Lab that in-house research projects were launched into space, for several minutes of sustained microgravity. The results of that research may have big implications for semiconductor manufacturing, art and telepresence, architecture and farming, among other things.</p> <p>“The projects we’re testing operate fundamentally differently in Earth’s gravity compared to how they would operate in microgravity,” explained Ariel Ekblaw, the founder and lead of the Media Lab’s Space Exploration Initiative.</p> <p>Previously, the Media Lab sent projects into microgravity aboard the plane used by NASA to train astronauts, lovingly nicknamed “the vomit comet.” These parabolic flights provide repeated 15- to 30-second intervals of near weightlessness.
The New Shepard experiment capsule will coast in microgravity for significantly longer and cross the Karman line (the formal boundary of “space”) in the process. While that may not seem like much time, it’s enough to get a lot accomplished.</p> <p>“The capsule where the research takes place arcs through space for three minutes, which gives us precious moments of sustained, high-quality microgravity,” Ekblaw said. “This provides an opportunity to expand our experiments from prior parabolic flight protocols, and test entirely new research as well.”</p> <p>Depending on the results of the experiments done during New Shepard’s flight, some of the projects will undergo further, long-term research aboard the International Space Station, Ekblaw said.</p> <p>On this trip, she sent Tessellated Electromagnetic Space Structures for the Exploration of Reconfigurable, Adaptive Environments, otherwise known as TESSERAE, into space. The ultimate goal for these sensor-augmented hexagonal and pentagonal “tiles” is to autonomously self-assemble into space structures. These flexible, reconfigurable modules can then be used for habitat construction, in-space assembly of satellites, or even as infrastructure for parabolic mirrors. Ekblaw hopes TESSERAE will one day support in-orbit staging bases for human exploration of the surface of the moon or Mars, or enable low Earth orbit space tourism.</p> <p>An earlier prototype, flown on a parabolic flight in November 2017, validated the research concept’s mechanical structure, the polarity arrangement of its bonding magnets, and the physical self-assembly protocol. On the Blue Origin flight, Ekblaw is testing a new embedded sensor network in the tiles, as well as the communication architecture and guidance-control aspects of their self-assembly capabilities.
“We’re testing whether they’ll autonomously circulate, find correct neighbors, and bond together magnetically in microgravity for robust self-assembly,” Ekblaw said.</p> <p>Another experiment aboard New Shepard combined art with the test of a tool for future space exploration — traversing microgravity with augmented mobility. Living Distance, an artwork conceived by the Space Exploration Initiative’s art curator, Xin Liu, explores freedom of movement via a wisdom tooth — yes, you read that correctly!</p> <p>The tooth traveled to space carried by a robotic device named EBIFA and encased in a crystalline container. Once New Shepard entered space, the container burst open and EBIFA swung into action, shooting cords out with magnetic tips to latch onto a metal surface. The tooth then floated through space with minimal interference in the virtually zero-gravity environment.</p> <p>“In this journey, the tooth became a newborn entity in space, its crystalline, sculptural body and life supported by an electromechanical system,” Xin Liu wrote. “Each of its weightless movements was carefully calculated on paper and modeled in simulation software, as there can never be a true test like this on Earth.”</p> <p>The piece builds on a performance art work called Orbit Weaver that Liu performed last year during a parabolic flight, where she was physically tethered to a nylon cord that floated freely and attached to nearby surfaces. Orbit Weaver and Living Distance may offer insights to future human space explorers about how best to navigate weightlessness.</p> <p>A piece of charcoal also made the trip to space inside a chamber lined with drawing paper, part of a project designed by Ani Liu, a Media Lab alumna. 
In microgravity, the charcoal will chart its own course inside the chamber, marking the paper as it floats through an arc far above the Earth.</p> <p>When the chamber returns to the Media Lab, the charcoal will join forces with a KUKA robot that will mimic the charcoal’s trajectory during the three-ish minutes of coasting in microgravity. Together, the charcoal and the robot will become a museum exhibit that provides a demonstration of motion in microgravity to a broad audience and illustrates the Space Exploration Initiative’s aim to democratize access to space and invite the public to engage in space exploration.</p> <p>Harpreet Sareen, another Media Lab alum, tested how crystals form in microgravity, research that may eventually lead to manufacturing semiconductors in space.</p> <p>Semiconductors used in today’s technology require crystals with extremely high levels of purity and perfect shapes, but gravity interferes with crystal growth on Earth, resulting in faults, contact stresses, and other flaws. Sareen and his collaborator, Anna Garbier, created a nano-sized lab in a box a little smaller than a half-gallon milk carton. Onboard rocket commands from Blue Origin triggered the electric current that kicked off crystal growth during the three minutes the New Shepard capsule was suborbital.</p> <p>The crystals will be evaluated for potential industrial applications, and they also have a future as an art installation: Floral Cosmonauts.</p> <p>And then there are the 40 or so bees (one might say “apionauts”) that made the trip into space on behalf of the Mediated Matter group at the Media Lab, which is interested in seeing the impact space travel has on a queen bee and her retinue. Two queen bees that were inseminated at a U.S.
Department of Agriculture facility in Louisiana went to space, each with roughly 20 attendant bees whose job it was to feed her and help control her body temperature.</p> <p>The bees traveled via two small containers — metabolic support capsules — into which they previously built honeycomb structures. This unique design gives them a familiar environment for their trip. A modified GoPro camera, pointed into the specially designed container housing the bees, was fitted into the top of the case to film the insects and create a record of their behavior during flight.</p> <p>Everything inside the case was designed to make the journey as comfortable as possible for the bees, right down to a tiny golden heating pad that was to kick into action if the temperature dropped too low for a queen bee’s comfort.</p> <p>Researchers in the Mediated Matter group will study the behavior of the bees when they return to Earth and are reintroduced to a colony at the Media Lab. Will the queens lay their eggs? Will those eggs hatch? And can bees who’ve been to space continue making pollen and honey once they’ve returned to Earth? Those are among the many questions the team will be asking.</p> <p>“We currently have no robotic alternative to bees for pollination of many crops,” Ekblaw said. “If we want to grow crops on Mars, we may need to bring bees with us. Knowing if they can survive a mission, reintegrate into the hive, and thrive afterwards is critical.”</p> <p>As these projects show, the Space Exploration Initiative unites engineers, scientists, artists, and designers across a multifaceted research portfolio. 
The team looks forward to a regular launch cadence and progressing through microgravity research milestones — from parabolic flights, to further launch opportunities with Blue Origin, to the International Space Station and even lunar landings.</p> MIT Media Lab researchers (l-r) Xin Liu, Felix Kraemer, Ariel Ekblaw, Pete Dilworth, Rachel Smith, and Harpreet Sareen stand in front of the Blue Origin capsule holding their six payloads. Media Lab, Space, astronomy and planetary science, Research, Industry, School of Architecture and Planning, Arts, Materials Science and Engineering, Architecture, Agriculture, Manufacturing, Aeronautical and astronautical engineering MIT podcast breaks down the facts on climate change TILclimate (Today I Learned: Climate) podcast demystifies the science, technology, and policy surrounding climate change in 10-minute bites. Thu, 02 May 2019 17:10:01 -0400 Environmental Solutions Initiative <p>Climate change is confusing.</p> <p>At least, that’s the impression the average American might get if they tried to learn about the subject from the flurry of journal articles, policy papers, and action plans that dominate the conversation among scientists and environmental advocates. To help demystify the science, solutions, and policies behind climate change, the MIT Environmental Solutions Initiative (ESI) has launched a podcast series called TILclimate, airing eight episodes in its first season over the 2019 spring semester.</p> <p>“There’s a lot of information out there about why climate change is happening, how it will affect human life, and the solutions that are on the table. But it’s hard to find sources that you trust,” says Laur Hesse Fisher, program director for ESI and host of the new series.
“And even then, there’s still a lot of jargon and technicalities that you have to wade through.</p> <p>“We’re trying to solve that problem.”</p> <p>In each 10-minute episode, Hesse Fisher speaks to an expert from the MIT community to break down a clear, focused question related to climate change. In the first batch of episodes, these questions have included: What do clouds have to do with climate change? Why are different parts of the world experiencing different climate impacts? How does carbon pricing work?</p> <p>The podcast is part of a broader ESI project called MIT Climate, a community-building effort built around a common web portal where users can share climate change-related projects, news stories, and learning resources at MIT and beyond. MIT Climate is intended to draw individuals and groups working on climate issues at MIT closer together, and eventually become a platform for worldwide, science-based learning and engagement on climate change. You can see a prototype of the portal at <a href=""></a>.</p> <p>“We named the podcast TILclimate after the popular Reddit hashtag TIL, which stands for Today I Learned,” says Hesse Fisher. “We hope to signify that these episodes are accessible. Even if you have no prior knowledge of climate science or policy, after 10 minutes you know enough to start being a part of the conversation.”</p> <p>To hear this approach in action, you can listen to the podcast’s first episode, “TIL about planes,” where Hesse Fisher interviews MIT professor of aeronautics and astronautics Steven Barrett, head of MIT's Laboratory for Aviation and the Environment. Together, they walk listeners through the two primary ways that air travel impacts Earth’s climate: by releasing carbon dioxide high into the atmosphere and by producing heat-trapping condensation trails.</p> <p>“Most of the CO<sub>2</sub> that aviation's ever emitted is still in the atmosphere because it lasts so long,” Barrett says in the interview. 
To help illustrate his point, Hesse Fisher adds: “Think about fighter planes circling Europe in World War I, or Charles Lindbergh flying across the Atlantic Ocean in 1927. The CO<sub>2</sub> from those flights is still in the atmosphere.”</p> <p>“We steer clear of jargon whenever possible and make a real attempt to define the terms and concepts that we use,” says Hesse Fisher. “The point is, we hope the podcast will appeal to the ‘climate curious’ — people who are just interested enough in climate change that they’d listen to something around 10 minutes.”</p> <p>Those who do want to dig deeper into the content can head to TILclimate’s profile on the <a href="">MIT Climate website</a>, where each episode posting includes a “More Info” tab with links to external resources.</p> <p>Season 1 concluded on May 1, comprising eight episodes about planes, clouds, materials, hurricanes, uncertainty, climate impacts, carbon pricing, and geoengineering. You can listen to TILclimate on iTunes, Spotify, Google Podcasts, or wherever you get your podcasts.</p> <p>MIT Climate is MIT’s central online portal for all things related to anthropogenic climate change.</p> Each 10-minute episode of TILclimate explains a different climate change topic. Image: Aaron Krol/MIT ESI. Climate change, Earth and atmospheric sciences, School of Science, EAPS, ESI, Climate, Sustainability, Science communications, Aeronautical and astronautical engineering, School of Engineering