15 research outputs found

    Implementation Concept of the Universities of the State of Baden-Württemberg for High Performance Computing (HPC), Data Intensive Computing (DIC) and Large Scale Scientific Data Management (LS²DM)

    Get PDF
    The computational sciences [1], and with them HPC systems as their technical foundation, are steadily gaining importance, as the German Council of Science and Humanities (Wissenschaftsrat) emphasises in its recent recommendations on the "Funding of National High-Performance Computing in Germany" [2]. The ongoing digitisation of science generates research data across a range of research infrastructures, and with it requirements that span fast storage during data acquisition, processing in HPC and cloud systems, and the curation of the data demanded by "good scientific practice". The analysis of these large data volumes to gain new insights is called Data Intensive Computing (DIC); alongside theory, experiment and simulation, it is now regarded as the fourth pillar of science [3]. Added to this are the technical and organisational measures required for the sustainable use of the data, guaranteeing long-term storage and, where possible, public accessibility. Recognising that these new requirements can no longer reasonably be met by individual universities or research institutions, the scientific computing centres of the state of Baden-Württemberg are coordinating their activities in this area. At the same time, the state universities intend to follow the recommendations of the German Council for Scientific Information Infrastructures (RfII) and to interlink their infrastructure development with the establishment of an infrastructure for research data management based on their HPC and DATA concepts. Cooperative solutions help to master the challenges described and promise added value across institutions and disciplines. For the period from 2018 to 2024, it is the goal of all participating actors to continue along the established path of cooperation in line with the state HPC strategy [4].
In doing so, the state of Baden-Württemberg strengthens a key distinguishing feature in its support of the sciences and explicitly declares its interest in, and readiness for, contributing at an early stage to the establishment and development of the National Research Data Infrastructure (NFDI) [5]. Following an integrated approach, the existing concepts for HPC, DIC and LS²DM will be developed further and merged into a joint strategy. At the same time, the foundations for early participation in building an NFDI will be laid and the required infrastructures will be provided.

    Fine-Grained Transclusions of Multimedia Documents in HTML

    No full text
    Transclusions are a technique for virtually including existing content into new documents by reference to the original documents rather than by copying. In principle, transclusions are used in HTML for the inclusion of entire text documents, images, movies and similar media. The HTML specification only takes transclusions of entire documents into account, though. Hence it is not possible, for instance, to include a part of an existing image into an HTML document. In this paper, fine-grained transclusions of multimedia documents on the Web are proposed, presenting a logical realisation of the concept of transclusions in HTML. The proposal makes it possible, for instance, to include sections of existing images or small portions of entire movies into HTML documents. Two different approaches to implementing the functionality presented are detailed. The first architecture is based on a transparent extension module to conventional HTTP servers, whereas the alternative design makes use of a CGI program. Both approaches are fully self-contained, reside on an HTTP server and do not require browser plug-ins or any other special software components to be installed on client computers. An amendment to the HTTP specification is not required either. A prototype implementation demonstrates the proposal for a number of document types.
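The server-side idea described above can be illustrated with a minimal sketch, assuming a CGI-style endpoint that serves only a requested fragment of a stored document so that a client can transclude part of it by URL alone. The URL shape, parameter names and in-memory document store are illustrative, not taken from the paper.

```python
# Hypothetical sketch of a CGI-style fine-grained transclusion endpoint:
# a URL names a source document plus a character range, and the handler
# returns only that fragment. Real deployments would crop images or media
# analogously; plain text keeps the sketch self-contained.
from urllib.parse import urlparse, parse_qs

def extract_fragment(document: str, start: int, end: int) -> str:
    """Return the requested character range of the source document."""
    start = max(0, start)
    end = min(len(document), end)
    return document[start:end] if start < end else ""

def handle_transclusion_url(url: str, documents: dict) -> str:
    """Resolve a URL like /transclude?doc=intro.txt&start=0&end=5
    against an in-memory document store (stand-in for server files)."""
    query = parse_qs(urlparse(url).query)
    doc = documents[query["doc"][0]]
    return extract_fragment(doc, int(query["start"][0]), int(query["end"][0]))
```

Because the fragment is computed on the server, the client needs no plug-in: an ordinary HTML element can reference the fragment URL directly, which is the property both architectures in the paper rely on.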


    The Transformation of the Web: How Emerging Communities Shape the Information we Consume

    No full text
    To date, one of the main aims of the World Wide Web has been to provide users with information. In addition to private homepages, large professional information providers, including news services, companies, and other organisations, have set up websites. With the development and advance of recent technologies such as wikis, blogs, podcasting and file sharing, this model is challenged and community-driven services are gaining influence rapidly. These new paradigms obliterate the clear distinction between information providers and consumers. The lines between producers and consumers are blurred even more by services such as Wikipedia, where every reader can become an author, instantly. This paper presents an overview of a broad selection of current technologies and services: blogs, wikis including Wikipedia and Wikinews, social networks such as Friendster and Orkut as well as related social services like del.icio.us, file sharing tools such as Flickr, and podcasting. These services enable user participation on the Web and manage to recruit a large number of users as authors of new content. It is argued that the transformations the Web is subject to are not driven by new technologies but by a fundamental mind shift that encourages individuals to take part in developing new structures and content. The evolving services and technologies encourage ordinary users to make their knowledge explicit and help a collective intelligence to develop.

    Transclusions in an HTML-Based Environment

    Get PDF
    Transclusions are an advanced technique for the inclusion of existing content into new documents without the need to duplicate it. Although originally described in the early 1960s, transclusions have still not been made available to users and authors on the world wide web. This paper describes the prototype implementation of a system that allows users to write articles that may contain transclusions. The system offers a simple web-based interface where users can compose new articles. With a simple button the user has the ability to insert a transclusion from any HTML page available on the world wide web. While other approaches introduce new markup for the HTML specification, make use of technologies such as XML and XLink or employ authoring systems that internally support transclusions and can generate web pages as output, this implementation solely relies on the techniques provided by an HTML-based environment. Therefore HTML, JavaScript, the Document Object Model, CGI scripts and HTTP are the core technologies utilised in the prototype.
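A core step such a prototype needs is pulling one element out of an existing HTML page so it can be inserted into a new article. The following is a minimal sketch of that step using only the Python standard library; the class and function names are invented for illustration, and a real system would first fetch the page over HTTP (e.g. via a CGI script, as in the paper).

```python
# Hypothetical sketch: extract the inner text of the element with a given
# id from an HTML page, as a server-side helper for HTML-based transclusion.
from html.parser import HTMLParser

class FragmentExtractor(HTMLParser):
    """Collect the inner text of the element with the requested id."""
    def __init__(self, target_id):
        super().__init__()
        self.target_id = target_id
        self.depth = 0          # > 0 while inside the target element
        self.collected = []

    def handle_starttag(self, tag, attrs):
        if self.depth:
            self.depth += 1     # nested element inside the target
        elif dict(attrs).get("id") == self.target_id:
            self.depth = 1      # entering the target element

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth:
            self.collected.append(data)

def transclude(html: str, element_id: str) -> str:
    """Return the text content of the identified element, if present."""
    parser = FragmentExtractor(element_id)
    parser.feed(html)
    return "".join(parser.collected).strip()
```

The original prototype performs the client-side insertion with JavaScript and the DOM; this sketch only mirrors the fragment-selection logic, and assumes well-formed HTML with balanced tags.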

    Community Building around Encyclopaedic Knowledge

    Get PDF
    This paper gives a brief overview of current technologies in systems handling encyclopaedic knowledge. Since most of the electronic encyclopaedias currently available are rather static and inflexible, greatly enhanced functionality is introduced that enables users to work more effectively and collaboratively. Users have the ability, for instance, to add annotations to every kind of object and can have private and shared workspaces. The techniques described employ user profiles in order to adapt to different users and involve statistical analysis to improve search results. Moreover, a tracking and navigation mechanism based on trails is presented. The second part of the paper details community building around encyclopaedic knowledge with the aim of involving "plain" users and experts in environments with largely editorial content. The foundations for building a user community are specified along with significant facets such as retaining the high quality of content, rating mechanisms and social aspects. A system that implements large portions of the community-related concepts in a heterogeneous environment of several largely independent data sources is proposed. Apart from online and DVD-based encyclopaedias, potential application areas are e-Learning, corporate documentation and knowledge management systems.

    Dynamic Adaptation of Content and Structure in Electronic Encyclopaedias

    No full text
    Adaptive functionality has been applied successfully in many areas ranging from user interfaces to hypermedia systems. Digital libraries and electronic encyclopaedias, however, have rarely made use of the power of adaptation. In this paper, an approach to include adaptation into encyclopaedic environments is presented. The proposal covers a set of adaptation techniques. They enable the system to explain technical terms and replace domain-specific expressions with "plain" words automatically. Moreover, specific terms can be automatically linked to further articles. Blacklisting, whitelisting and general link alteration are employed in order to assure quality standards and to provide users with more appropriate hyperlinks. With navigation support based on the automatic insertion of trails and suggestions of potentially interesting articles, the users' navigation in encyclopaedias can be facilitated. A first version has been implemented in project "Alexander" and has been made available to a limited public. The system is based on a traditional client-server architecture, where the server-side components perform the actual adaptation. Details of this pilot project are provided.
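Two of the adaptation techniques mentioned above, replacing domain-specific expressions with plain-language equivalents and restricting hyperlinks via a whitelist, can be sketched as follows. The glossary entries and whitelist contents are invented examples, not data from the project.

```python
# Hypothetical sketch of two server-side adaptation steps: dictionary-based
# term simplification and whitelist-based link filtering. The glossary and
# whitelist below are illustrative stand-ins.
import re

GLOSSARY = {
    "myocardial infarction": "heart attack",   # technical term -> plain word
    "renal": "kidney-related",
}
LINK_WHITELIST = {"anatomy", "physiology"}

def simplify_terms(text: str, glossary: dict) -> str:
    """Replace each known technical term with its plain-language equivalent."""
    for term, plain in glossary.items():
        text = re.sub(re.escape(term), plain, text, flags=re.IGNORECASE)
    return text

def filter_links(links: list, whitelist: set) -> list:
    """Keep only hyperlinks whose target article is on the whitelist."""
    return [link for link in links if link in whitelist]
```

In a client-server setup like the one the abstract describes, these transformations would run on the server before the adapted article is delivered to the browser.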

    The incidence of lumbar ligamentum flavum midline gaps

    No full text
    Lumbar epidural anesthesia and analgesia have gained increasing importance in perioperative pain therapy for abdominal and lower limb surgery. The loss-of-resistance technique, used to identify the epidural space, is thought to rely on the penetration of the ligamentum flavum. However, the exact morphology of the ligamentum flavum at different vertebral levels remains controversial. Therefore, in this study, we directly investigated the incidence of lumbar ligamentum flavum midline gaps in embalmed cadavers. Vertebral column specimens were obtained from 45 human cadavers. On each dissected level, ligamentum flavum midline gaps were recorded. The incidence of midline gaps per number of viable specimens at the following levels was: L1-2 = 10 of 45 (22.2%), L2-3 = 5 of 44 (11.4%), L3-4 = 5 of 45 (11.1%), L4-5 = 4 of 43 (9.3%), L5-S1 = 0 of 33 (0%). In conclusion, the present study determined the frequency of lumbar ligamentum flavum midline gaps. Gaps in the lumbar ligamentum flavum are most frequent between L1 and L2 but are rarer below this level. When using the midline approach, the ligamentum flavum may not impede entering the epidural space in all patients. IMPLICATIONS: The ligamentum flavum is a crucial anatomical landmark for the safe performance of epidural anesthesia. However, the present study demonstrates some failure of the lumbar ligamentum flavum as a landmark. This may mean that, using a midline approach, one cannot always rely on the ligamentum flavum as a perceptible barrier to epidural needle advancement.

    There and here: patterns of content transclusion in Wikipedia

    No full text
    As large, collaboratively authored hypertexts such as Wikipedia grow, so does the requirement both for organisational principles and for methods to provide sustainable consistency and to ease the task of contributing editors. Large numbers of (potential) editors are not necessarily a sufficient bulwark against loss of coherence amongst a corpus of many discrete articles. The longitudinal task of curation may benefit from deliberate curatorial roles and techniques. A potentially beneficial technique for the development and maintenance of hypertext content at scale is hypertext transclusion, which offers controllable re-use of a canonical source. In considering issues of longitudinal support of collaborative web hypertexts, we investigated the current degree and manner of adoption of transclusion facilities by editors of Wikipedia articles. We sampled 20 million articles from ten discrete language wikis within Wikipedia to analyse behaviour both within and across the individual Wikipedia communities. We show that Wikipedia makes limited and inconsistent use of transclusion (as of February 2016). Use is localised to subject areas, which differ between sampled languages. A limited number of patterns were observed, including: Lists from transclusion, Lists of Lists, Episodic Media Listings, Tangles, Articles as Macros, and Self-Transclusion. We find little indication of deliberate structural maintenance of the hypertext.
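The kind of measurement the study performs can be sketched in a few lines: in MediaWiki wikitext, a transclusion is written as `{{Target|...}}`, so scanning an article for that markup and tallying the targets gives a per-article adoption count that can be compared across language wikis. The regular expression below is a simplification that ignores nested and multi-line constructs.

```python
# Hypothetical sketch of counting transclusion targets in one article's
# wikitext. "{{Name|params}}" transcludes page "Name"; the regex captures
# the target name up to the first pipe or closing brace. Nested templates
# and edge cases are deliberately ignored in this simplified version.
import re
from collections import Counter

TRANSCLUSION = re.compile(r"\{\{([^{}|]+)")  # target before any | parameters

def count_transclusions(wikitext: str) -> Counter:
    """Count transclusion targets used in one article's wikitext."""
    return Counter(name.strip() for name in TRANSCLUSION.findall(wikitext))
```

Aggregating such counters over a sample of articles, and grouping targets by subject area, yields the sort of per-wiki adoption profile the study reports.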