
    Introducing distributed dynamic data-intensive (D3) science: Understanding applications and infrastructure

    A common feature across many science and engineering applications is the amount and diversity of data and computation that must be integrated to yield insights. Data sets are growing larger and becoming distributed, and their location, availability, and properties are often time-dependent. Collectively, these characteristics give rise to dynamic distributed data-intensive applications. While "static" data applications have received significant attention, the characteristics, requirements, and software systems for the analysis of large volumes of dynamic, distributed data, and of data-intensive applications, have received relatively little attention. This paper surveys several representative dynamic distributed data-intensive application scenarios, provides a common conceptual framework to understand them, and examines the infrastructure used in support of these applications. Comment: 38 pages, 2 figures
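    As a purely illustrative aside (not taken from the paper), the sketch below models the time-dependent location and availability of a distributed data set, the kind of metadata a scheduler for dynamic distributed data-intensive applications would have to track. All class and field names are hypothetical.

```python
# A minimal sketch (not from the paper) of metadata for a dynamic,
# distributed data set: replicas whose location and availability
# change over time. All class and field names are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class Replica:
    site: str                  # e.g. a storage endpoint URL
    available_from: datetime   # replica becomes readable at this time
    available_until: datetime  # replica is retired after this time

    def is_available(self, when: datetime) -> bool:
        return self.available_from <= when <= self.available_until


@dataclass
class DynamicDataset:
    name: str
    size_bytes: int
    replicas: list = field(default_factory=list)

    def sites_available(self, when: datetime) -> list:
        """Sites holding a live replica at time `when` --
        the time-dependent property a scheduler must query."""
        return [r.site for r in self.replicas if r.is_available(when)]


if __name__ == "__main__":
    ds = DynamicDataset("detector-run-42", 5 * 10**12)
    ds.replicas.append(Replica("https://storage.siteA.example",
                               datetime(2024, 1, 1), datetime(2024, 6, 1)))
    ds.replicas.append(Replica("https://storage.siteB.example",
                               datetime(2024, 3, 1), datetime(2024, 12, 31)))
    print(ds.sites_available(datetime(2024, 4, 15)))  # both sites are live here
```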

    NASA-HBCU Space Science and Engineering Research Forum Proceedings

    The proceedings of the Historically Black Colleges and Universities (HBCU) forum are presented. A wide range of research topics from plant science to space science and related academic areas was covered. The sessions were divided into the following subject areas: Life science; Mathematical modeling, image processing, pattern recognition, and algorithms; Microgravity processing, space utilization and application; Physical science and chemistry; Research and training programs; Space science (astronomy, planetary science, asteroids, moon); Space technology (engineering, structures and systems for application in space); Space technology (physics of materials and systems for space applications); and Technology (materials, techniques, measurements)

    PolyVR - A Virtual Reality Authoring Framework for Engineering Applications

    Virtual reality is a fantastic place, free of constraints and full of possibilities. For engineers it is the perfect place to experience science and technology, yet the infrastructure to make virtual reality accessible, especially for engineering applications, is missing. This thesis describes the development of a software environment that simplifies the creation of virtual reality applications and their deployment on immersive hardware setups. Virtual engineering, the use of virtual environments for design reviews during the product development process, is used only rarely, especially by small and medium-sized enterprises. The main reasons are no longer the high cost of professional virtual reality hardware, but the lack of automated virtualization workflows and the high cost of maintenance and software development. An important aspect of automating virtualization is the integration of intelligence into artificial environments. Ontologies are the foundation of human understanding and intelligence: categorizing our universe into concepts, properties, and rules is a fundamental step in processes such as observation, learning, and knowing. This thesis aims to take a step towards a broader use of virtual reality applications in all areas of science and engineering. The approach is to build a virtual reality authoring tool, a software package that simplifies the creation of virtual worlds and their deployment on advanced immersive hardware environments such as distributed visualization systems. A further goal of this work is to enable the intuitive authoring of semantic elements in virtual worlds, which should revolutionize the creation of virtual content and the possibilities for interaction. Intelligent immersive environments are key to fostering learning and training in virtual worlds, to planning and monitoring processes, and to paving the way for entirely new interaction paradigms
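    To make the ontology idea concrete, the following is a minimal sketch of how concepts, properties, and simple rules could be attached to objects of a virtual scene. It is not PolyVR's actual API; all names are hypothetical, and the example only illustrates the categorization into concepts, properties, and rules described above.

```python
# Minimal sketch of an ontology layer over a VR scene graph.
# This is NOT PolyVR's API; all names are hypothetical.

class Concept:
    """A category such as 'Machine' or 'ConveyorBelt', with typed properties."""
    def __init__(self, name, properties=None, parent=None):
        self.name = name
        self.properties = dict(properties or {})   # property name -> type
        self.parent = parent

    def all_properties(self):
        """Own properties plus those inherited from the parent concept."""
        props = dict(self.properties)
        if self.parent:
            props.update(self.parent.all_properties())
        return props


class Entity:
    """A scene object annotated with a concept and concrete property values."""
    def __init__(self, scene_node, concept, **values):
        self.scene_node = scene_node   # handle to the 3D object
        self.concept = concept
        self.values = values


def check_rule(entity, prop, predicate):
    """A 'rule' in the simplest sense: a constraint on a property value."""
    return predicate(entity.values.get(prop))


# Example: tag a 3D node as a conveyor belt and check a speed-limit rule.
machine = Concept("Machine", {"powerKW": float})
conveyor = Concept("ConveyorBelt", {"speedMS": float}, parent=machine)
belt = Entity("node_017", conveyor, powerKW=2.5, speedMS=1.2)
print(check_rule(belt, "speedMS", lambda v: v is not None and v <= 2.0))  # True
```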

    Urban Informatics

    This open access book is the first to systematically introduce the principles of urban informatics and its application to every aspect of the city that involves its functioning, control, management, and future planning. It introduces new models and tools being developed to understand and implement these technologies that enable cities to function more efficiently – to become ‘smart’ and ‘sustainable’. The smart city has quickly emerged as computers have become ever smaller to the point where they can be embedded into the very fabric of the city, as well as being central to new ways in which the population can communicate and act. When cities are wired in this way, they have the potential to become sentient and responsive, generating massive streams of ‘big’ data in real time as well as providing immense opportunities for extracting new forms of urban data through crowdsourcing. This book offers a comprehensive review of the methods that form the core of urban informatics from various kinds of urban remote sensing to new approaches to machine learning and statistical modelling. It provides a detailed technical introduction to the wide array of tools information scientists need to develop the key urban analytics that are fundamental to learning about the smart city, and it outlines ways in which these tools can be used to inform design and policy so that cities can become more efficient with a greater concern for environment and equity
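    As a small, hedged illustration of the kind of analytics the book surveys, the sketch below aggregates crowdsourced sensor readings into per-district summary statistics; the data and field names are invented for the example and are not drawn from the book.

```python
# Minimal sketch (not from the book) of one core urban-informatics task:
# aggregating crowdsourced sensor readings into per-district statistics.
# Data and field names are hypothetical.
from collections import defaultdict
from statistics import mean

readings = [
    # (district, pollutant_ug_m3) as reported by a crowd-sensing app
    ("Centre", 41.0), ("Centre", 35.0), ("North", 58.0),
    ("North", 62.0), ("Harbour", 22.0),
]

by_district = defaultdict(list)
for district, value in readings:
    by_district[district].append(value)

# Per-district mean concentration: the kind of derived indicator that
# feeds dashboards or downstream statistical / machine-learning models.
summary = {d: mean(vs) for d, vs in by_district.items()}
print(summary)  # {'Centre': 38.0, 'North': 60.0, 'Harbour': 22.0}
```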

    Technology 2003: The Fourth National Technology Transfer Conference and Exposition, volume 2

    Proceedings from symposia of the Technology 2003 Conference and Exposition, Dec. 7-9, 1993, Anaheim, CA, are presented. Volume 2 features papers on artificial intelligence, CAD&E, computer hardware, computer software, information management, photonics, robotics, test and measurement, video and imaging, and virtual reality/simulation

    Automatic Spatiotemporal Analysis of Cardiac Image Series

    Cardiovascular disease continues to be the leading cause of death in North America. In adult and, alarmingly, ever younger populations, the so-called obesity epidemic, largely driven by lifestyle factors that include poor diet, lack of exercise, and smoking, places enormous stress on the healthcare system. The primary cause of serious morbidity and mortality for these patients is atherosclerosis, the build-up of plaque inside high-pressure vessels such as the coronary arteries. These lesions can lead to ischemic disease and may progress to severe blood-flow blockage or thrombosis, often with infarction or other serious consequences. Beyond the stenosis-related outcomes, the arterial walls of plaque-ridden regions show increased stiffness, which may worsen patient prognosis. In pediatric populations, the most prevalent acquired cardiovascular pathology is Kawasaki disease. This acute vasculitis may affect the structural integrity of coronary artery walls and progress to aneurysmal lesions. These can hinder hemodynamics, leading to inadequate downstream perfusion, and may trigger thrombus formation, which can further worsen prognosis. These two prominent coronary artery diseases are traditionally diagnosed using fluoroscopic angiography. Several hundred serial x-ray projections are acquired during selective arterial infusion of a radiodense contrast agent, revealing the vessels’ luminal area and possible pathological lesions. The acquired series contain highly dynamic information on voluntary and involuntary patient movement: respiration, organ displacement, and heartbeat, for example. Current clinical analysis is largely limited to a single angiographic image on which geometric measures are performed manually or semi-automatically by a radiological technician. Although widely used around the world and generally considered the gold-standard diagnostic tool for many vascular diseases, this imaging modality’s two-dimensional nature limits the geometric characterization of pathological regions. Indeed, the 3D structure of stenotic or aneurysmal lesions may not be fully appreciated in 2D because the observable features depend on the angular configuration of the imaging gantry. Furthermore, the presence of lesions in the coronary arteries may not reflect the true health of the myocardium, as natural compensatory mechanisms may obviate the need for further intervention. In light of this, cardiac magnetic resonance perfusion imaging is gaining increasing attention and clinical adoption, as it offers a direct assessment of myocardial tissue viability following infarction or suspected coronary artery disease. This modality is, however, affected by motion similar to that present in fluoroscopic imaging, which forces clinicians into laborious manual intervention to align anatomical structures in sequential perfusion frames, thus hindering automation
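    As a hedged illustration of the frame-alignment problem described above (and not the method developed in the thesis), the sketch below estimates the translation between two sequential frames with phase correlation, a standard rigid-registration building block. It uses synthetic data and requires only NumPy.

```python
# Minimal sketch (not the thesis's method) of one building block for aligning
# sequential image frames: estimating a rigid translation between two frames
# with phase correlation. Requires only NumPy.
import numpy as np


def estimate_shift(frame_a: np.ndarray, frame_b: np.ndarray):
    """Return the (row, col) shift that, applied to frame_b, best reproduces frame_a."""
    fa = np.fft.fft2(frame_a)
    fb = np.fft.fft2(frame_b)
    cross = fa * np.conj(fb)
    cross /= np.abs(cross) + 1e-12          # keep only the phase information
    corr = np.fft.ifft2(cross).real         # correlation surface; peak = shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peak positions beyond half the image size correspond to negative shifts.
    shift = [int(p) if p <= s // 2 else int(p) - s
             for p, s in zip(peak, corr.shape)]
    return tuple(shift)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    base = rng.random((128, 128))
    moved = np.roll(base, shift=(5, -3), axis=(0, 1))   # simulate inter-frame motion
    print(estimate_shift(moved, base))                  # prints (5, -3)
```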
