2,414 research outputs found

    GIER: A Danish computer from 1961 with a role in the modern revolution of astronomy

    A Danish computer, GIER, from 1961 played a vital role in the development of a new method for astrometric measurement. This method, photon-counting astrometry, ultimately led to two satellites with a significant role in the modern revolution of astronomy. A GIER was installed at the Hamburg Observatory in 1964, where it was used to implement the entirely new method for measuring stellar positions by means of a meridian circle, then the fundamental instrument of astrometry. An expedition to Perth in Western Australia with the instrument and the computer was a success. The method was also implemented in space in the first-ever astrometric satellite, Hipparcos, launched by ESA in 1989. The Hipparcos results, published in 1997, revolutionized astrometry, with an impact on all branches of astronomy from the solar system and stellar structure to cosmic distances and the dynamics of the Milky Way. In turn, the results paved the way for a successor, the one million times more powerful Gaia astrometry satellite launched by ESA in 2013. Preparations for a Gaia successor in twenty years are making progress.
    Comment: 19 pages, 8 figures. Accepted for publication in Nuncius Hamburgensis, Volume 2

    Per Aspera ad Astra: On the Way to Parallel Processing

    Computational Science and Engineering is being established as a third category of scientific methodology; this innovative discipline supports and supplements the traditional categories, theory and experiment, in order to solve the problems arising from the complex systems challenging science and technology. While the successes of the past two decades in scientific computing have been achieved essentially through the technical breakthrough of the vector supercomputers, today the discussion about the future of supercomputing is focused on massively parallel computers. The discrepancy, however, between peak performance and the sustained performance achievable with algorithmic kernels, software packages, and real applications is still disappointingly high. An important issue is programming models. While Message Passing on parallel computers with distributed memory is the only efficient programming paradigm available today, from a user's point of view it is hard to imagine that this programming model, rather than Shared Virtual Memory, will be capable of serving as the central basis for bringing computing on massively parallel systems from a sheer computer-science trend to the technological breakthrough needed to deal with the large applications of the future; this is especially true for commercial applications, where explicitly programming the data communication via Message Passing may turn out to be a huge software-technological barrier which nobody might be willing to surmount.
    KFA JĂŒlich is one of the largest big-science research centres in Europe; its scientific and engineering activities range from fundamental research to applied science and technology. KFA's Central Institute for Applied Mathematics (ZAM) runs the large-scale computing facilities and network systems at KFA and provides communication services as well as general-purpose and supercomputer capacity, also serving the HLRZ ("Höchstleistungsrechenzentrum") established in 1987 in order to further enhance and promote computational science in Germany. Thus, at KFA - and in particular driven by ZAM - supercomputing has received high priority for more than ten years. What particle accelerators mean to experimental physics, supercomputers mean to Computational Science and Engineering: supercomputers are the accelerators of theory
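    The contrast drawn above between explicit Message Passing and Shared Virtual Memory can be made concrete with a small sketch. The following example is not taken from the report; it is a minimal illustration, assuming a standard MPI installation, of how a distributed-memory program must spell out every data transfer explicitly: the root process scatters the input across the processes, each process computes a local partial sum, and an explicit reduction collects the result.

        // Minimal sketch (not from the report) of the explicit Message Passing style,
        // using the standard MPI C API from C++. All data movement between the
        // distributed memories is programmed explicitly by the user.
        #include <mpi.h>
        #include <cstdio>
        #include <numeric>
        #include <vector>

        int main(int argc, char** argv) {
            MPI_Init(&argc, &argv);

            int rank = 0, size = 1;
            MPI_Comm_rank(MPI_COMM_WORLD, &rank);
            MPI_Comm_size(MPI_COMM_WORLD, &size);

            const int elements_per_rank = 1 << 20;   // illustrative block size per process
            std::vector<double> block(elements_per_rank);

            std::vector<double> all;                 // only the root holds the full array
            if (rank == 0) {
                all.assign(static_cast<size_t>(elements_per_rank) * size, 1.0);
            }

            // Explicit communication: distribute the data over the distributed memories.
            MPI_Scatter(all.data(), elements_per_rank, MPI_DOUBLE,
                        block.data(), elements_per_rank, MPI_DOUBLE,
                        0, MPI_COMM_WORLD);

            // Purely local computation on each process's private memory.
            double local_sum = std::accumulate(block.begin(), block.end(), 0.0);

            // Explicit communication again: combine the partial results on rank 0.
            double global_sum = 0.0;
            MPI_Reduce(&local_sum, &global_sum, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

            if (rank == 0) {
                std::printf("global sum = %f\n", global_sum);
            }

            MPI_Finalize();
            return 0;
        }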

    Jahresbericht Forschung und Entwicklung 2004

    Annual research report 2004 of the Fachhochschule Konstanz

    Tagungsband Dagstuhl-Workshop MBEES: Modellbasierte Entwicklung eingebetteter Systeme 2005


    Algorithm Engineering: Aktuelles Schlagwort


    Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen: MBMV 2015 - Tagungsband, Chemnitz, 03. - 04. MĂ€rz 2015

    The workshop Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen (MBMV 2015) is taking place for the 18th time. This year it is hosted by the Chair of Circuit and System Design at the Technische UniversitĂ€t Chemnitz and the Steinbeis-Forschungszentrum Systementwurf und Test. The workshop aims to discuss the latest trends, results, and open problems in the field of methods for modelling and verification as well as description languages for digital, analogue, and mixed-signal circuits, and thus to serve as a forum for the exchange of ideas. It also offers a platform for exchange between research and industry and for maintaining existing contacts and establishing new ones. It allows young researchers to present their ideas and approaches to a broad audience from academia and industry and to discuss them in depth at the event. Its long history has made it a fixture in many event calendars. Traditionally, the meetings of the ITG special interest groups are attached to the workshop. This year, two projects funded by the Bundesministerium fĂŒr Bildung und Forschung within the InnoProfile-Transfer initiative use the workshop to present their research results to a broad audience in two dedicated tracks. Representatives of the projects Generische Plattform fĂŒr SystemzuverlĂ€ssigkeit und Verifikation (GPZV) and GINKO - Generische Infrastruktur zur nahtlosen energetischen Kopplung von Elektrofahrzeugen present parts of their current work. This enriches the workshop with additional focus topics and provides a valuable complement to the authors' contributions. [... from the preface]

    Algorithm Engineering


    Clean by Nature. Lively Surfaces and the Holistic-Systemic Heritage of Contemporary Bionik.

    This paper addresses questions regarding the prospering field of Bionik in Germany. Its starting point is the widespread assumption that universal functional principles exist in nature and that these ‘solutions’ can be transferred into technological objects. Accordingly, advocates of Bionik herald the advent of a better world with more sustainable and efficient products of engineering. The so-called ‘functional surfaces’ occupy a special place within this contemporary version of biomimesis. Shark-skin-inspired swimsuits, self-cleaning façade paints with lotus effect, or drag-reducing Dolphin-Skins for aircraft wings are expected to improve the quality of life for everyone. It seems that skin and shell of living systems return as revenants to our technological world and live their afterlives as lively surfaces of everyday objects. This paper argues, however, that to understand this attention to ‘natural engineering solutions’ in contemporary Bionik, one needs to focus on a different kind of afterlife. Laying bare the historical-epistemological roots reveals direct connections to two widely influential concepts in the history of science in the 20th century: Biotechnik, a very popular bio-philosophical concept from the Weimar Republic of the 1920s, and Bionics, an in many ways similar endeavor that emerged during the second wave of Cybernetics in the USA around 1960. Both historical concepts share a certain proximity to a distinct holistic-systemic style of thinking that emerged during the 20th century and still resonates with the movement of Bionik in contemporary Germany. Based on the example of the lotus effect, I want to address three aspects of the afterlife of this holistic-systemic heritage in contemporary Bionik. First, the assumption that the best engineering solutions can be found in nature conceals the specific discursive and non-discursive complexity that forms the basis of all technological objects. Second, the holistic-systemic heritage of Bionik directly correlates with its epistemological bias towards visual evidence and its enthusiasm for ‘functional surfaces’. Third, the rhetoric of Bionik paradoxically oscillates between a counter-modern demotion of human creativity and autonomy and a fascination for modern scientific instruments and practices.

    Engineering Aggregation Operators for Relational In-Memory Database Systems

    In this thesis we study the design and implementation of aggregation operators in the context of relational in-memory database systems. In particular, we identify and address the following challenges: cache efficiency, CPU friendliness, parallelism within and across processors, robust handling of skewed data, adaptive processing, processing with constrained memory, and integration with modern database architectures. Our resulting algorithm outperforms the state of the art by up to 3.7x.
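    The thesis text itself is not reproduced here, but the flavour of the challenges it names (cache efficiency, parallelism, skew) can be illustrated with a small sketch. The following is a simplified, hypothetical example, not the algorithm from the thesis: a grouped sum with thread-local pre-aggregation, where each thread aggregates its chunk of the input into a private hash table and the partial tables are merged at the end, so the threads never contend on a shared structure. The names Row, GroupTable, and aggregate_sum are illustrative only.

        // Sketch of hash aggregation with thread-local pre-aggregation (illustrative,
        // not the thesis algorithm): each thread builds a private partial result,
        // and the partial tables are merged after all threads have joined.
        #include <algorithm>
        #include <cstdint>
        #include <thread>
        #include <unordered_map>
        #include <vector>

        struct Row {
            int32_t group_key;   // grouping column
            int64_t value;       // column to be summed
        };

        using GroupTable = std::unordered_map<int32_t, int64_t>;

        GroupTable aggregate_sum(const std::vector<Row>& rows, unsigned num_threads) {
            std::vector<GroupTable> partials(num_threads);
            std::vector<std::thread> workers;

            const size_t chunk = (rows.size() + num_threads - 1) / num_threads;
            for (unsigned t = 0; t < num_threads; ++t) {
                workers.emplace_back([&, t] {
                    const size_t begin = t * chunk;
                    const size_t end = std::min(rows.size(), begin + chunk);
                    GroupTable& local = partials[t];   // private table: no sharing, no locks
                    for (size_t i = begin; i < end; ++i) {
                        local[rows[i].group_key] += rows[i].value;
                    }
                });
            }
            for (auto& w : workers) w.join();

            // Merge the thread-local partial results into a single table.
            GroupTable result;
            for (const auto& partial : partials) {
                for (const auto& [key, sum] : partial) {
                    result[key] += sum;
                }
            }
            return result;
        }

    A real in-memory operator would go further, e.g. partitioning by hash to keep working sets cache-resident and falling back to alternative strategies under heavy skew or memory pressure, which is exactly the design space the abstract refers to.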