    Project-based, collaborative, algorithmic robotics for high school students: Programming self-driving race cars at MIT

    We describe the pedagogy behind the MIT Beaver Works Summer Institute Robotics Program, a new high-school STEM program in robotics. The program utilizes state-of-the-art sensors and embedded computers for mobile robotics, carried on an exciting 1/10-scale race-car platform. The program has three salient, distinguishing features: (i) it focuses on robotics software systems: the students design and build robotics software towards real-world applications, without being distracted by hardware issues; (ii) it champions project-based learning: the students learn through weekly project assignments and a final course challenge; (iii) the learning is implemented in a collaborative fashion: the students learn the basics of collaboration and technical communication in lectures, and they work in teams to design and implement their software systems. The program was offered as a four-week residential program at MIT in the summer of 2016. In this paper, we provide the details of this new program, its teaching objectives, and its results. We also briefly discuss future directions and opportunities.

    Software and Hardware Infrastructure for Visual-Inertial SLAM

    One of the challenges faced by researchers in the field of robot localization and mapping is finding a reliable infrastructure to test their ideas. That infrastructure could be a simulation platform, suitable hardware, or a sensor interface. A useful simulation platform needs to capture the dynamics and the sensor modalities that meet the researchers' needs. Suitable hardware needs the capability to navigate, sense the environment, and run the target software on onboard computers. A sensor interface allows adapting and testing algorithms on novel sensors. In this research, we develop an essential hardware and software infrastructure for aiding the development and testing of visual-inertial Simultaneous Localization and Mapping (SLAM) systems. SLAM is a fundamental problem in robot navigation: it constructs or updates a representation (map) of an environment using sensors on board a robot while concurrently using that representation to localize the robot itself. In visual-inertial SLAM, the onboard sensors are cameras (monocular or stereo) and an inertial measurement unit (IMU). The contribution of this thesis is threefold. First, we develop a hardware platform consisting of a real drone capable of running state-of-the-art metric-semantic SLAM; this infrastructure allows us to test advanced SLAM algorithms using real sensors and real robot dynamics. Second, we develop a multi-robot simulation platform that includes dynamically accurate, photo-realistic drones; this platform allows extending our tests to multi-robot SLAM systems. Finally, we develop a new sensor interface; in particular, we integrate and test an omnidirectional stereo frontend in Kimera, an open-source visual-inertial SLAM pipeline. The thesis presents the design, implementation, and testing of each contribution. (M.Eng thesis)
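
    The "map while concurrently localizing" loop that defines SLAM can be caricatured in one dimension: odometry predicts the pose, and each landmark observation both refines the landmark's map position and corrects the pose. This is a minimal conceptual sketch only, not the thesis's method; the function name and the blending `gain` are hypothetical.

```python
def slam_step(pose, landmarks, odom, observations, gain=0.5):
    """One predict/update cycle of a toy 1-D SLAM loop.

    pose         : estimated robot position (float)
    landmarks    : dict landmark_id -> estimated landmark position
    odom         : odometry increment since the last step
    observations : list of (landmark_id, measured_range) pairs,
                   where measured_range = landmark_pos - robot_pos
    """
    pose = pose + odom  # prediction: integrate odometry (accumulates drift)
    for lid, rng in observations:
        if lid not in landmarks:
            # First sighting: initialize the landmark in the map.
            landmarks[lid] = pose + rng
        else:
            # Innovation: disagreement between map prediction and measurement.
            innovation = (landmarks[lid] - pose) - rng
            # Split the correction between the pose and the landmark estimate.
            pose += gain * innovation
            landmarks[lid] -= gain * innovation
    return pose, landmarks


# Usage: a drifting odometry step is partly corrected by re-observing
# a previously mapped landmark.
pose, lm = slam_step(0.0, {}, odom=1.0, observations=[("a", 2.0)])
pose, lm = slam_step(pose, lm, odom=1.2, observations=[("a", 1.0)])
```

    Real visual-inertial pipelines such as Kimera replace this scalar blend with IMU preintegration and nonlinear factor-graph optimization over 6-DoF poses and visual features, but the structure (predict from inertial odometry, correct by re-observing mapped landmarks) is the same.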