17 research outputs found

    An Algorithm for Group Formation and Maximal Independent Set in an Amorphous Computer

    Amorphous computing is the study of programming ultra-scale computing environments of smart sensors and actuators [white-paper]. The individual elements are identical, asynchronous, randomly placed, embedded, and communicate locally via wireless broadcast. Aggregating the processors into groups is a useful paradigm for programming an amorphous computer because groups can be used for specialization, increased robustness, and efficient resource allocation. This paper presents a new algorithm, called the clubs algorithm, for efficiently aggregating processors into groups in an amorphous computer, in time proportional to the local density of processors. The clubs algorithm is well suited to the unique characteristics of an amorphous computer. In addition, the algorithm derives two properties from the physical embedding of the amorphous computer: an upper bound on the number of groups formed and a constant upper bound on the density of groups. The clubs algorithm can also be extended to find a maximal independent set (MIS) and a (Δ + 1)-vertex coloring in an amorphous computer in O(log N) rounds, where N is the total number of elements and Δ is the maximum degree.
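    The abstract does not reproduce the clubs algorithm itself, but the MIS extension it mentions follows the classic randomized-local pattern. A minimal Luby-style sketch (names and structure are illustrative, not the paper's algorithm): in each round every surviving node draws a random priority, local minima join the MIS, and winners plus their neighbors withdraw, giving the expected O(log N) rounds cited above.

    ```python
    import random

    def luby_mis(adj):
        """Maximal independent set of an undirected graph via
        Luby-style randomized rounds.

        adj: dict mapping each node to the set of its neighbors.
        Returns a set of nodes that is independent and maximal.
        """
        mis = set()
        live = set(adj)
        while live:
            # Each surviving node draws a random priority.
            r = {v: random.random() for v in live}
            # A node joins the MIS if it beats every live neighbor.
            winners = {v for v in live
                       if all(r[v] < r[u] for u in adj[v] if u in live)}
            mis |= winners
            # Winners and their live neighbors leave the contest.
            removed = set(winners)
            for v in winners:
                removed |= adj[v] & live
            live -= removed
        return mis
    ```

    Adjacent nodes can never both win a round (their priorities cannot each be smaller than the other's), so the result is independent; every removed node is in the MIS or adjacent to it, so the result is maximal.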

    Paradigms for Structure in an Amorphous Computer

    Recent developments in microfabrication and nanotechnology will enable the inexpensive manufacturing of massive numbers of tiny computing elements with sensors and actuators. New programming paradigms are required for obtaining organized and coherent behavior from the cooperation of large numbers of unreliable processing elements that are interconnected in unknown, irregular, and possibly time-varying ways. Amorphous computing is the study of developing and programming such ultra-scale computing environments. This paper presents an approach to programming an amorphous computer by spontaneously organizing an unstructured collection of processing elements into cooperative groups and hierarchies. This paper introduces a structure called an AC hierarchy, which logically organizes processors into groups at different levels of granularity. The AC hierarchy simplifies programming of an amorphous computer through new language abstractions, facilitates the design of efficient and robust algorithms, and simplifies the analysis of their performance. Several example applications are presented that greatly benefit from the AC hierarchy. This paper introduces three algorithms for constructing multiple levels of the hierarchy from an unstructured collection of processors.
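    The paper's three construction algorithms are not given in the abstract. As a rough illustration of the idea of multi-level grouping, the following sketch (entirely hypothetical names and leader-election rule, not the paper's method) repeatedly elects random leaders among the current group representatives, assigns each remaining representative to an adjacent leader, and contracts the groups into super-nodes for the next level:

    ```python
    import random

    def build_hierarchy(adj, levels):
        """Sketch of multi-level group formation.

        adj: dict mapping node -> set of neighbors.
        Returns a list of levels; each level maps a leader id to the
        set of original nodes in its group. Each level partitions the
        original node set.
        """
        current = {v: {v} for v in adj}           # group id -> members
        edges = {v: set(adj[v]) for v in adj}     # group adjacency
        hierarchy = []
        for _ in range(levels):
            # Randomly elect leaders among current group ids.
            leaders = {v for v in current if random.random() < 0.5}
            assignment = {}
            for v in current:
                if v in leaders:
                    assignment[v] = v
                else:
                    adj_leaders = edges[v] & leaders
                    if adj_leaders:
                        assignment[v] = min(adj_leaders)
                    else:
                        # No adjacent leader: become one.
                        leaders.add(v)
                        assignment[v] = v
            # Contract each group into its leader's super-node.
            new_groups = {}
            for v, lead in assignment.items():
                new_groups.setdefault(lead, set()).update(current[v])
            new_edges = {lead: set() for lead in new_groups}
            for v in current:
                for u in edges[v]:
                    lu, lv = assignment[u], assignment[v]
                    if lu != lv:
                        new_edges[lv].add(lu)
                        new_edges[lu].add(lv)
            hierarchy.append(new_groups)
            current, edges = new_groups, new_edges
        return hierarchy
    ```

    Because every group is assigned to exactly one leader, each level is a partition of the original nodes, and successive levels are strictly coarser or equal in granularity.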

    Automatic profiler-driven probabilistic compiler optimization

    Thesis (M.S.) -- Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1994. Includes bibliographical references (p. 80-81). By Daniel Coore. M.S.

    Botanical computing : a developmental approach to generating interconnect topologies on an amorphous computer

    Thesis (Ph.D.) -- Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1999. Includes bibliographical references (p. 292-295). By Daniel N. Coore. Ph.D.

    Amorphous Computing

    Amorphous computing is the development of organizational principles and programming languages for obtaining coherent behaviors from the cooperation of myriads of unreliable parts that are interconnected in unknown, irregular, and time-varying ways. The impetus for amorphous computing comes from developments in microfabrication and fundamental biology, each of which is the basis of a kernel technology that makes it possible to build or grow huge numbers of almost-identical information-processing units at almost no cost. This paper sets out a research agenda for realizing the potential of amorphous computing and surveys some initial progress, both in programming and in fabrication. We describe some approaches to programming amorphous systems, which are inspired by metaphors from biology and physics. We also present the basic ideas of cellular computing, an approach to constructing digital-logic circuits within living cells by representing logic levels by concentrations of DNA-binding proteins.

    Experience With the Cardiac Surgery Simulation Curriculum: Results of the Resident and Faculty Survey

    BACKGROUND: The Cardiac Surgery Simulation Curriculum was developed at 8 institutions from 2010 to 2013. A total of 27 residents were trained by 18 faculty members. A survey was conducted to gain insight into the initial experience. METHODS: Residents and faculty were sent a 72- and a 68-question survey, respectively. In addition to demographic information, participants reported their view of the overall impact of the curriculum. Focused investigation into each of the 6 modules was obtained. Participants evaluated the value of the specific simulators used. Institutional biases regarding implementation of the curriculum were evaluated. RESULTS: Twenty (74%) residents and 14 (78%) faculty responded. The majority (70%) of residents completed this training in their first and second year of traditional-track programs. The modules were well regarded, with no respondents having an unfavorable view. Both residents and faculty found low-, moderate-, and high-fidelity simulators to be extremely useful, with particular emphasis on the utility of high-fidelity components. The vast majority of residents (85%) and faculty (100%) felt more comfortable in the resident skill set and performance in the operating room. Simulation of rare adverse events allowed for development of multidisciplinary teams to address them. At most institutions, the conduct of this curriculum took precedence over clinical obligations (64%). CONCLUSIONS: The Cardiac Surgery Simulation Curriculum was implemented with robust adoption among the investigating centers. Both residents and faculty viewed the modules favorably. Using this curriculum, participants indicated an improvement in resident technical skills and were enthusiastic about training in adverse events and crisis management.

    Simulation-Based Training in Cardiac Surgery

    BACKGROUND: Operating room surgical training has significant limitations. This study hypothesized that some skills could be learned efficiently and safely by using simulation with component task training, deliberate practice, progressive complexity, and experienced coaching to produce safer cardiac surgeons. METHODS: Training modules included cardiopulmonary bypass, coronary artery bypass grafting, aortic valve replacement, massive air embolism, acute intraoperative aortic dissection, and sudden deterioration in cardiac function. Using deliberate practice, first-year cardiothoracic surgical residents at eight institutions were trained and evaluated on component tasks for each module and later on full cardiac operations. Evaluations were based on five-point Likert-scale tools indexed by module, session, task items, and repetitions. Statistical analyses relied on generalized linear model estimation and corresponding confidence intervals. RESULTS: The 27 residents who participated demonstrated improvement with practice repetitions, resulting in excellent final scores per module (mean ± two SEs): cardiopulmonary bypass, 4.80 ± 0.12; coronary artery bypass grafting, 4.41 ± 0.19; aortic valve replacement, 4.51 ± 0.20; massive air embolism, 0.68 ± 0.14; acute intraoperative aortic dissection, 4.52 ± 0.17; and sudden deterioration in cardiac function, 4.76 ± 0.16. The transient detrimental effect of time away from training was also evident. CONCLUSIONS: Overall performance in component tasks and complete cardiac surgical procedures improved during simulation-based training. Simulation-based training imparts skill sets for management of adverse events and can help produce safer surgeons.
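    The "mean ± two SEs" reporting above is a simple computation: the sample mean plus or minus twice the standard error (sample standard deviation divided by the square root of the sample size). A minimal sketch with hypothetical Likert scores, not data from the study:

    ```python
    import math

    def mean_two_se(scores):
        """Return (mean, two standard errors) for a list of scores.

        Uses the sample (n-1) variance; SE = s / sqrt(n).
        """
        n = len(scores)
        mean = sum(scores) / n
        var = sum((x - mean) ** 2 for x in scores) / (n - 1)
        se = math.sqrt(var / n)
        return mean, 2 * se
    ```

    For example, hypothetical ratings [4, 5, 5, 4, 5] give a mean of 4.60 with two SEs of about 0.49, which would be reported as 4.60 ± 0.49.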