
    Information management throughout the life cycle of buildings - Basics and new approaches such as blockchain

    Ensuring sustainability for real estate depends, among other aspects, on building-related information. This information needs to be stored and updated continuously throughout the life cycle of a building, and delivery to buyers, tenants, consultants or other actors must be possible at any time. In most cases, however, transactions cause significant loss of information, while the issues associated with the "building passport" approach remain unsolved to date. Considering the long service life of buildings, various questions arise: (1) How can data generation and storage be supported within the life cycle, and how can actors be encouraged to compete? (2) How can high data quality be assured, and how can the data be stored over a long period of time? (3) How can it be assured that all data users can track down the data owners at any point in time to manage compliance and legal issues? (4) Are there new business models or new scopes for designers or other service providers? Information needs of actors along the life cycle are analysed, new information technologies (e.g. blockchain) are discussed, and their relation to Building Information Modeling (BIM) is shown. Potentials for enhancing existing approaches regarding documentation, retracing and accessibility of building and life-cycle-related information by using new technologies and IT are discussed; the benefits of a blockchain-based system are pointed out by reference to existing pilot projects and first examples. Solution approaches for building passports are shown.
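    As a toy illustration of the tamper-evidence property that motivates blockchain-based building documentation (a minimal sketch, not one of the pilot projects mentioned in the abstract; all record names are invented), each documentation event can be chained to its predecessor by a hash, so any later alteration of the history is detectable:

        # Toy hash chain for building documentation events (illustration only;
        # not one of the pilot projects mentioned in the abstract).
        import hashlib, json

        def add_record(chain, payload):
            prev_hash = chain[-1]["hash"] if chain else "0" * 64
            body = {"payload": payload, "prev": prev_hash}
            body["hash"] = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            chain.append(body)

        def verify(chain):
            prev = "0" * 64
            for rec in chain:
                body = {"payload": rec["payload"], "prev": rec["prev"]}
                recomputed = hashlib.sha256(
                    json.dumps(body, sort_keys=True).encode()).hexdigest()
                if rec["prev"] != prev or rec["hash"] != recomputed:
                    return False
                prev = rec["hash"]
            return True

        chain = []
        add_record(chain, {"doc": "energy certificate", "owner": "actor A"})
        add_record(chain, {"doc": "renovation report", "owner": "actor B"})
        print(verify(chain))                      # True
        chain[0]["payload"]["owner"] = "actor C"  # tamper with history
        print(verify(chain))                      # False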

    Pre-operative gastric ultrasound in patients at risk of pulmonary aspiration: a prospective observational cohort study.

    Point-of-care gastric sonography offers an objective approach to assessing individual pulmonary aspiration risk before induction of general anaesthesia. We aimed to evaluate the potential impact of routine pre-operative gastric ultrasound on peri-operative management in a cohort of adult patients undergoing elective or emergency surgery at a single centre. According to pre-operative gastric ultrasound results, patients were classified as at low risk (empty stomach, gastric fluid volume ≤ 1.5 ml.kg-1 body weight) or high risk (solid or mixed contents, or gastric fluid volume > 1.5 ml.kg-1 body weight) of aspiration. After sonography, examiners were asked to indicate changes in aspiration risk management (none; more conservative; more liberal) relative to their pre-defined anaesthetic plan and to adapt it if patient safety was at risk. We included 2003 patients, of whom 1246 (62%) underwent elective and 757 (38%) emergency surgery. Among patients who underwent elective surgery, 1046/1246 (84%) had a low-risk and 178/1246 (14%) a high-risk stomach; the corresponding figures were 587/757 (78%) and 158/757 (21%) among patients undergoing emergency surgery. Routine pre-operative gastric sonography enabled changes in anaesthetic management in 379/2003 (19%) of patients, including a more liberal approach in 303/2003 (15%). In patients undergoing elective surgery, pre-operative gastric sonography would have allowed a more liberal approach in 170/1246 (14%) and indicated a more conservative approach in 52/1246 (4%), whereas in patients undergoing emergency surgery, 133/757 (18%) would have been managed more liberally and 24/757 (3%) more conservatively. We showed that pre-operative gastric ultrasound helps to identify high- and low-risk situations in patients at risk of aspiration and adds useful information to peri-operative management. Our data suggest that routine use of pre-operative gastric ultrasound may improve individualised care and potentially impact patient safety.
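    The classification rule stated above is simple enough to spell out directly. The sketch below is a hedged restatement of that rule only; the function name and inputs are illustrative and do not come from the study protocol:

        # Minimal sketch of the risk-classification rule described above.
        # Names and inputs are illustrative, not taken from the study protocol.
        def aspiration_risk(contents: str, fluid_volume_ml: float,
                            body_weight_kg: float) -> str:
            """Classify aspiration risk from gastric ultrasound findings.

            contents: 'empty', 'fluid', 'solid' or 'mixed'
            """
            if contents in ("solid", "mixed"):
                return "high"
            fluid_per_kg = fluid_volume_ml / body_weight_kg
            # Threshold from the abstract: 1.5 ml per kg body weight
            return "low" if fluid_per_kg <= 1.5 else "high"

        # Example: 100 ml of gastric fluid in a 70 kg patient -> ~1.43 ml/kg -> low
        print(aspiration_risk("fluid", 100, 70))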

    Discovering Implicational Knowledge in Wikidata

    Knowledge graphs have recently become the state-of-the-art tool for representing the diverse and complex knowledge of the world. Examples include the proprietary knowledge graphs of companies such as Google, Facebook, IBM, or Microsoft, but also freely available ones such as YAGO, DBpedia, and Wikidata. A distinguishing feature of Wikidata is that the knowledge is collaboratively edited and curated. While this greatly enhances the scope of Wikidata, it also makes it impossible for a single individual to grasp complex connections between properties or understand the global impact of edits in the graph. We apply Formal Concept Analysis to efficiently identify comprehensible implications that are implicitly present in the data. Although the complex structure of data modelling in Wikidata is not amenable to a direct approach, we overcome this limitation by extracting contextual representations of parts of Wikidata in a systematic fashion. We demonstrate the practical feasibility of our approach through several experiments and show that the results may lead to the discovery of interesting implicational knowledge. Besides providing a method for obtaining large real-world data sets for FCA, we sketch potential applications in offering semantic assistance for editing and curating Wikidata.
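    To make concrete what kind of statement an "implication" is in FCA terms, the sketch below checks whether an implication A → B holds in a small formal context: every object possessing all attributes in the premise A must also possess all attributes in the conclusion B. The context, items and property names here are invented for illustration and are not the paper's tooling or data:

        # Minimal FCA-style implication check on a toy formal context.
        # Objects mimic Wikidata items, attributes mimic properties; all names
        # are invented for illustration.
        context = {
            "Q_city_berlin": {"P_is_city", "P_has_population", "P_has_mayor"},
            "Q_city_paris":  {"P_is_city", "P_has_population", "P_has_mayor"},
            "Q_river_rhine": {"P_is_river", "P_has_length"},
        }

        def implication_holds(premise: set, conclusion: set) -> bool:
            """A -> B holds iff every object with all attributes in A has all in B."""
            return all(conclusion <= attrs
                       for attrs in context.values()
                       if premise <= attrs)

        # Every object that is a city with a population also has a mayor: True
        print(implication_holds({"P_is_city", "P_has_population"}, {"P_has_mayor"}))
        print(implication_holds({"P_is_city"}, {"P_has_length"}))  # False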

    Space Efficient Breadth-First and Level Traversals of Consistent Global States of Parallel Programs

    Enumerating consistent global states of a computation is a fundamental problem in parallel computing with applications to debugging, testing and runtime verification of parallel programs. Breadth-first search (BFS) enumeration is especially useful for these applications as it finds an erroneous consistent global state with the least number of events possible. The total number of executed events in a global state is called its rank. BFS also allows enumeration of all global states of a given rank or within a range of ranks. If a computation on n processes has m events per process on average, then the traditional BFS (Cooper-Marzullo and its variants) requires $\mathcal{O}(\frac{m^{n-1}}{n})$ space in the worst case, whereas our algorithm performs the BFS in $\mathcal{O}(m^2 n^2)$ space. Thus, we reduce the space complexity for BFS enumeration of consistent global states exponentially and give the first polynomial-space algorithm for this task. In our experimental evaluation of seven benchmarks, traditional BFS fails in many cases by exhausting the 2 GB heap space allowed to the JVM. In contrast, our implementation uses less than 60 MB of memory and is also faster in many cases.
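    To make the object of the enumeration concrete, here is a plain BFS over consistent global states of a toy computation (a sketch of what is being enumerated, not the paper's space-efficient algorithm): a global state is a vector of per-process event counts, and it is consistent when every included receive event's matching send is also included. The dependency encoding is invented for illustration:

        # Plain BFS over consistent global states (cuts) of a toy computation.
        # Illustrates the search space; NOT the paper's space-efficient algorithm.
        from collections import deque

        n = 2                      # number of processes
        events = [2, 2]            # events per process
        # deps[(p, i)] = [(q, j), ...]: event i (1-based) of process p requires
        # at least j events of process q in the cut (send happens before receive).
        deps = {(1, 1): [(0, 1)]}  # P1's first event receives P0's first message

        def consistent(cut):
            return all(cut[q] >= j
                       for p in range(n)
                       for i in range(1, cut[p] + 1)
                       for (q, j) in deps.get((p, i), []))

        start = (0,) * n
        seen, queue = {start}, deque([start])
        while queue:                   # BFS visits cuts in increasing rank order
            cut = queue.popleft()
            print(sum(cut), cut)       # rank, then the global state itself
            for p in range(n):
                nxt = list(cut)
                nxt[p] += 1
                t = tuple(nxt)
                if nxt[p] <= events[p] and t not in seen and consistent(t):
                    seen.add(t)
                    queue.append(t)

    Keeping the full visited set is exactly what makes the naive approach exponential in space; the paper's contribution is avoiding that blow-up.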

    Empirical comparison of high gradient achievement for different metals in DC and pulsed mode

    For the SwissFEL project, an advanced high-gradient, low-emittance gun is under development. Reliable operation with an electric field, preferably above 125 MV/m at a 4 mm gap, in the presence of a UV laser beam, has to be achieved in a diode configuration in order to minimize the emittance dilution due to space-charge effects. In the first phase, a DC breakdown test stand was used to test different metals with different preparation methods at voltages up to 100 kV. In addition, high-gradient stability tests were carried out over several days in order to prove reliable spark-free operation with minimum dark current. In the second phase, electrodes made of selected materials were installed in the 250 ns FWHM, 500 kV electron gun and tested for high-gradient breakdown and for quantum efficiency using an ultraviolet laser.
    Comment: 25 pages, 13 figures, 5 tables. Follow-up from the FEL 2008 conference (Gyeongju, Korea, 2008). New title in JVST A (2010): Vacuum breakdown limit and quantum efficiency obtained for various technical metals using DC and pulsed voltage source.

    DEvIANT: Discovering Significant Exceptional (Dis-)Agreement Within Groups

    We strive to find contexts (i.e., subgroups of entities) under which exceptional (dis-)agreement occurs among a group of individuals, in any type of data featuring individuals (e.g., parliamentarians, customers) performing observable actions (e.g., votes, ratings) on entities (e.g., legislative procedures, movies). To this end, we introduce the problem of discovering statistically significant exceptional contextual intra-group agreement patterns. To handle the sparsity inherent to voting and rating data, we use Krippendorff's Alpha measure for assessing the agreement among individuals. We devise a branch-and-bound algorithm, named DEvIANT, to discover such patterns. DEvIANT exploits both closure operators and tight optimistic estimates. We derive analytic approximations for the confidence intervals (CIs) associated with patterns for a computationally efficient significance assessment. We prove that these approximate CIs are nested along specialization of patterns. This allows us to incorporate pruning properties in DEvIANT to quickly discard non-significant patterns. An empirical study on several datasets demonstrates the efficiency and the usefulness of DEvIANT. Technical report associated with the ECML/PKDD 2019 paper entitled "DEvIANT: Discovering Significant Exceptional (Dis-)Agreement Within Groups".
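    Since the method hinges on Krippendorff's Alpha, a minimal sketch of the nominal-data variant of that measure may help (this is the standard textbook definition, not DEvIANT's implementation; the data layout is invented). Alpha is 1 minus the ratio of observed to expected disagreement, computed from a coincidence matrix that naturally tolerates missing ratings:

        # Minimal Krippendorff's alpha for nominal data (standard definition;
        # not DEvIANT's implementation). units: one list of ratings per entity,
        # lists may have different lengths (missing data). Assumes at least
        # two distinct categories occur overall.
        from collections import Counter
        from itertools import permutations

        def krippendorff_alpha_nominal(units):
            o = Counter()                      # coincidence matrix o[(c, k)]
            for ratings in units:
                m = len(ratings)
                if m < 2:
                    continue                   # a single rating yields no pairs
                for c, k in permutations(ratings, 2):
                    o[(c, k)] += 1.0 / (m - 1)
            n_c = Counter()
            for (c, k), v in o.items():
                n_c[c] += v
            n = sum(n_c.values())
            d_obs = sum(v for (c, k), v in o.items() if c != k)
            d_exp = sum(n_c[c] * n_c[k] for c in n_c for k in n_c if c != k) / (n - 1)
            return 1.0 - d_obs / d_exp

        # Two raters agreeing on every unit -> alpha = 1.0
        print(krippendorff_alpha_nominal([["yes", "yes"], ["no", "no"], ["yes", "yes"]]))
        # Systematic disagreement -> negative alpha (-0.5 here)
        print(krippendorff_alpha_nominal([["yes", "no"], ["yes", "no"]]))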

    Elimination of Herpes Simplex Virus-2 and Epstein-Barr Virus With Seraph 100 Microbind Affinity Blood Filter and Therapeutic Plasma Exchange: An Explorative Study in a Patient With Acute Liver Failure

    OBJECTIVES Herpes simplex virus (HSV)-2 is a rare cause of hepatitis that can lead to acute liver failure (ALF) and often death. The earlier the initiation of acyclovir treatment, the better the survival. With regard to ALF, controlled randomized data support the use of therapeutic plasma exchange (TPE) both as a bridge to recovery or transplantation, possibly by modulating the systemic inflammatory response and by replacing coagulation factors. The Seraph 100 Microbind Affinity Blood Filter (Seraph; ExThera Medical, Martinez, CA), a novel extracorporeal adsorption device that removes living pathogens by binding them to a heparin-coated surface, was shown to efficiently clear HSV-2 particles in vitro. Here, we tested the combination of Seraph with TPE to reduce a massive HSV-2 viral load in order to reach a situation in which liver transplantation would be feasible. DESIGN Explorative study. SETTING Academic tertiary care transplant center. PATIENT Single patient with HSV-2-induced ALF. INTERVENTIONS TPE + Seraph 100 Microbind Affinity Blood Filter. MEASUREMENTS AND MAIN RESULTS We report Seraph clearance data for HSV-2 and Epstein-Barr virus (EBV) in vivo as well as total viral elimination by TPE. Genome copies/mL of HSV-2 and EBV in EDTA plasma were measured by polymerase chain reaction every 60 minutes over 6 hours after starting Seraph, both systemically and post adsorber. HSV-2 and EBV were also quantified before and after TPE and in the removed apheresis plasma. We found a total elimination of 1.81 × 10^11 HSV-2 copies and 2.11 × 10^6 EBV copies with a single TPE (exchange volume of 5 L; 1.5× calculated plasma volume). Whole-blood clearance of HSV-2 in the first 6 hours of treatment was 6.64 mL/min (4.98-12.92 mL/min). Despite much lower baseline viremia, clearance of EBV was higher at 36.62 mL/min (22.67-53.48 mL/min). CONCLUSIONS TPE was able to remove 25% of circulating HSV-2 copies and 40% of EBV copies from the blood. On the other hand, clearance of HSV-2 by Seraph was clinically irrelevant, whereas Seraph seemed to be far more effective at removing EBV, implicating a possible use in EBV-associated pathologies; this, however, requires further study.
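    Clearance figures such as those above are conventionally obtained from the standard extracorporeal extraction relation CL = Q × (C_in − C_out) / C_in; the sketch below shows only that textbook relation as an assumption (the study's exact computation is not given in the abstract, and all numbers are illustrative):

        # Standard extracorporeal clearance relation (an assumption here; the
        # study's exact method is not stated in the abstract).
        # CL = Q * (C_in - C_out) / C_in, with Q the flow through the adsorber
        # and C_in / C_out the pre- and post-adsorber concentrations.
        def clearance_ml_min(flow_ml_min: float, c_pre: float, c_post: float) -> float:
            return flow_ml_min * (c_pre - c_post) / c_pre

        # Illustrative numbers only: 200 ml/min flow, 5% concentration drop
        print(clearance_ml_min(200.0, 1.0e6, 0.95e6))  # -> 10.0 ml/min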

    Constraint Programming for Multi-criteria Conceptual Clustering

    A conceptual clustering is a set of formal concepts (i.e., closed itemsets) that defines a partition of a set of transactions. Finding a conceptual clustering is an NP-complete problem for which Constraint Programming (CP) and Integer Linear Programming (ILP) approaches have recently been proposed. We introduce new CP models to solve this problem: a pure CP model that uses set constraints, and a hybrid model that uses a data mining tool to extract formal concepts in a preprocessing step and then uses CP to select a subset of formal concepts that defines a partition. We compare our new models with recent CP and ILP approaches on classical machine learning instances. We also introduce a new set of instances coming from a real application case, which aims at extracting setting concepts from an Enterprise Resource Planning (ERP) software. We consider two classic criteria to optimize, i.e., the frequency and the size. We show that these criteria lead to extreme solutions with either very few small formal concepts or many large formal concepts, and that compromise clusterings may be obtained by computing the Pareto front of non-dominated clusterings.
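    The selection step of the hybrid model amounts to an exact-cover problem: choose a subset of pre-extracted formal concepts whose extents partition the transactions. The sketch below illustrates that combinatorial core with plain backtracking (invented toy data; not the paper's CP model, which handles this with set constraints and optimization criteria):

        # Sketch of the hybrid model's selection step: pick a subset of
        # pre-extracted formal concepts whose extents partition the transactions
        # (exact cover). Plain backtracking; NOT the paper's CP model.
        transactions = {0, 1, 2, 3}
        # Extents of pre-extracted formal concepts (invented toy data).
        extents = [frozenset({0, 1}), frozenset({2, 3}), frozenset({0}),
                   frozenset({1, 2}), frozenset({3})]

        def exact_covers(uncovered, start=0):
            if not uncovered:
                yield []
                return
            for i in range(start, len(extents)):
                e = extents[i]
                if e <= uncovered:          # disjoint from everything chosen so far
                    for rest in exact_covers(uncovered - e, i + 1):
                        yield [e] + rest

        for clustering in exact_covers(frozenset(transactions)):
            print([sorted(e) for e in clustering])
        # -> [[0, 1], [2, 3]] and [[0], [1, 2], [3]]

    Ranking such partitions by frequency and size of their concepts is what produces the extreme and Pareto-compromise solutions discussed above.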

    On the high-density expansion for Euclidean Random Matrices

    Diagrammatic techniques to compute perturbatively the spectral properties of Euclidean Random Matrices in the high-density regime are introduced and discussed in detail. Such techniques are developed in two alternative and very different formulations of the mathematical problem and are shown to give identical results up to second order in the perturbative expansion. One method, based on writing the so-called resolvent function as a Taylor series, makes it possible to group the diagrams into a small number of topological classes, providing a simple way to determine the infrared (small momenta) behaviour of the theory up to third order, which is of interest for comparison with experiments. The other method, which reformulates the problem as a field theory, can instead be used to study the infrared behaviour at any perturbative order.
    Comment: 29 pages.
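    For orientation, the resolvent in question is, in standard random-matrix notation (a textbook definition, not a formula quoted from this paper, and the normalization convention may differ from the authors'):

        g(z) = \frac{1}{N}\left\langle \operatorname{Tr} \frac{1}{z - M} \right\rangle
             = \sum_{k=0}^{\infty} \frac{\left\langle \operatorname{Tr} M^k \right\rangle}{N\, z^{k+1}},

    where M is the N×N Euclidean random matrix and the average is taken over the random point positions; the series in powers of 1/z is the expansion whose terms the first method sorts into topological classes of diagrams.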