44,773 research outputs found
Early Warning Software for Emergency Department Crowding
Emergency department (ED) crowding is a well-recognized threat to patient
safety and it has been repeatedly associated with increased mortality. Accurate
forecasts of future service demand could lead to better resource management and
have the potential to improve treatment outcomes. This logic has motivated a
growing number of research articles, but there has been little effort
to move these findings from theory to practice. In this article, we present
first results from prospective crowding early-warning software that was
integrated with hospital databases to create real-time predictions every hour
over the course of 5 months in a Nordic combined ED using Holt-Winters'
seasonal methods. We showed that the software could predict next-hour crowding
with a nominal AUC of 0.98 and 24-hour crowding with an AUC of 0.79 using
simple statistical models. Moreover, we suggest that afternoon crowding can be
predicted at 1 p.m. with an AUC of 0.84.
Comment: 15 pages, 6 figures
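The Holt-Winters seasonal method named above can be sketched in a few lines. The following is a minimal additive triple exponential smoothing implementation; the smoothing parameters, seasonal period, and initialization scheme are illustrative assumptions, not the paper's actual configuration:

```python
def holt_winters_forecast(y, m, alpha=0.3, beta=0.05, gamma=0.2, h=1):
    """Additive Holt-Winters: fit the series y and return h-step-ahead forecasts.

    y: list of observations (needs at least two full seasons)
    m: season length, e.g. 24 for hourly data with a daily cycle
    """
    # Initial level: mean of the first season; initial trend: average
    # per-step change between the first and second season.
    level = sum(y[:m]) / m
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / (m * m)
    season = [y[i] - level for i in range(m)]

    for t in range(m, len(y)):
        last_level = level
        s = season[t % m]
        level = alpha * (y[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * (y[t] - level) + (1 - gamma) * s

    # Forecast: extrapolate level and trend, re-use the matching seasonal term.
    return [level + k * trend + season[(len(y) + k - 1) % m]
            for k in range(1, h + 1)]

# Synthetic hourly-style series with period 4 and no trend.
hourly = [0.0, 10.0, 20.0, 10.0] * 6
forecasts = holt_winters_forecast(hourly, m=4, h=4)  # ≈ [0.0, 10.0, 20.0, 10.0]
```

In a deployment like the one described, the forecasts would then be thresholded into a crowded/not-crowded alarm, which is what the reported AUC values evaluate.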
Slow growth and revolutionary change. The Norwegian IT industry enters the global age, 1970-2005
The article concludes that although the Norwegian IT industry has lacked export success over the last 30 years, it has been important for the development of the Norwegian economy. Several IT companies have been on the verge of international breakthroughs, but have been stopped by rising costs and guided by national opportunities. The rise of the important oil sector has been both a hindrance and an opportunity for the Norwegian IT industry. Specialised products for national markets, rather than general mass-market products, have become the norm. This development has, to a remarkable degree, been associated with continuity in terms of organisations and people. The firms these people and organisations have been attached to, however, have experienced turbulence, bankruptcy and change, making the whole development from 1970 until today a seemingly messy and problematic affair. Yet this has really been a period of IT industry growth, and in the end the national development has been reasonably successful.
OPTICON: EC Optical Infrared Coordination Network for Astronomy
OPTICON, the ICN Optical Infrared Coordination Network for Astronomy, brings
together for the first time the operators of all Europe's medium to large
optical-infrared telescopes, the largest corresponding data archives, and
several user representatives. The OPTICON partners work with their communities
to identify those major challenges for the future development of European
optical-infrared astronomy which require Europe-wide collaboration. OPTICON
sponsors and coordinates developments towards these goals, involving the entire
astronomical community through workshops and meetings targeted towards these
agreed common goals of general importance.
Comment: to appear in Organizations and Strategies in Astronomy II, Ed. A. Heck, Kluwer Acad. Pub.
ATLAS Data Challenge 1
In 2002 the ATLAS experiment started a series of Data Challenges (DC) whose
goals are to validate the Computing Model, the complete software suite, and the
data model, and to ensure the correctness of the
technical choices to be made. A major feature of the first Data Challenge (DC1)
was the preparation and the deployment of the software required for the
production of large event samples for the High Level Trigger (HLT) and physics
communities, and the production of those samples as a world-wide distributed
activity. The first phase of DC1 was run during summer 2002, and involved 39
institutes in 18 countries. More than 10 million physics events and 30 million
single particle events were fully simulated. Over a period of about 40 calendar
days 71000 CPU-days were used producing 30 Tbytes of data in about 35000
partitions. In the second phase the next processing step was performed with the
participation of 56 institutes in 21 countries (~ 4000 processors used in
parallel). The basic elements of the ATLAS Monte Carlo production system are
described. We also present how the software suite was validated and the
participating sites were certified. These productions were already partly
performed using different flavours of Grid middleware at ~20 sites.
Comment: 10 pages; 3 figures; CHEP03 Conference, San Diego; Reference MOCT00
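The production figures quoted above imply an average degree of parallelism that is easy to check. A quick back-of-the-envelope calculation, using only the numbers stated in the abstract:

```python
# DC1 phase 1: 71,000 CPU-days of compute in about 40 calendar days.
cpu_days = 71_000
calendar_days = 40
avg_cpus_busy = cpu_days / calendar_days  # average CPUs busy over the run

# 30 TB of output spread over about 35,000 partitions.
data_tb = 30
partitions = 35_000
avg_gb_per_partition = data_tb * 1024 / partitions

print(avg_cpus_busy)                      # 1775.0
print(round(avg_gb_per_partition, 2))     # 0.88
```

So the first phase kept roughly 1,800 CPUs busy on average, writing partitions a little under 1 GB each, consistent with the ~4,000 processors reported for the second phase.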
Overview of the Nordic Seas CARINA data and salinity measurements
Water column data of carbon and carbon-relevant hydrographic and hydrochemical parameters from 188 previously non-publicly available cruises in the Arctic, Atlantic, and Southern Ocean have been retrieved and merged into a new database: CARINA (CARbon IN the Atlantic). The data have been subject to rigorous quality control (QC) in order to ensure the highest possible quality and consistency. The data for most of the parameters included were examined in order to quantify systematic biases in the reported values, i.e. secondary quality control. Significant biases have been corrected for in the data products, i.e. the three merged files with measured, calculated and interpolated values for each of the three CARINA regions: the Arctic Mediterranean Seas (AMS), the Atlantic (ATL) and the Southern Ocean (SO). With the adjustments, the CARINA database is consistent both internally and with GLODAP (Key et al., 2004), and is suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates, and for model validation. The Arctic Mediterranean Seas include the Arctic Ocean and the Nordic Seas, and the quality control was carried out separately in these two areas. This contribution provides an overview of the CARINA data from the Nordic Seas and summarises the findings of the QC of the salinity data. One cruise had salinity data of questionable quality, and these have been removed from the data product. An evaluation of the consistency of the quality-controlled salinity data suggests that they are consistent to at least ±0.005.
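A crossover-style consistency check of the kind used in such secondary QC can be sketched as follows. The function name, depth cutoff, and flagging logic here are illustrative assumptions, not the actual CARINA procedure:

```python
def deep_salinity_offset(cruise_a, cruise_b, min_depth=1500.0):
    """Mean deep-water salinity difference between two cruises at a crossover.

    cruise_a, cruise_b: lists of (depth_m, salinity) samples. Only deep values
    are compared, where natural variability is small enough that a persistent
    offset indicates a measurement bias rather than a real oceanic signal.
    """
    deep_a = [s for d, s in cruise_a if d >= min_depth]
    deep_b = [s for d, s in cruise_b if d >= min_depth]
    if not deep_a or not deep_b:
        return None  # no overlapping deep data: crossover not usable
    return sum(deep_a) / len(deep_a) - sum(deep_b) / len(deep_b)

# Flag a cruise pair whose offset exceeds the reported consistency level.
a = [(200.0, 35.10), (1800.0, 34.912), (2400.0, 34.910)]
b = [(1700.0, 34.903), (2500.0, 34.901)]
offset = deep_salinity_offset(a, b)
flagged = abs(offset) > 0.005  # True for this synthetic 0.009 offset
```

In practice such offsets are accumulated over many crossovers and inverted for per-cruise adjustments; a single pairwise difference, as here, only illustrates the basic comparison.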
Technological Life Cycles: Regional Clusters Facing Disruption
The phenomenon of technological life cycles is argued to be of great importance in the development of regional clusters. New 'disruptive' technologies may initiate the emergence of new regional industrial clusters and/or create new opportunities for further development of existing ones. However, they may also result in stagnation and decline of the latter. The term disruptive refers to changes in the basic technologies significant enough to reshape the industrial landscape, even in the short run. The paper examines the key features of a regional cluster where the economic development patterns are quite closely related to the emergence of new key technologies.
Keywords: technological life cycles, regional clusters, communication technology
Salford postgraduate annual research conference (SPARC) 2012 proceedings
These proceedings bring together a selection of papers from the 2012 Salford Postgraduate Annual Research Conference (SPARC). They reflect the breadth and diversity of research interests showcased at the conference, at which over 130 researchers from Salford, the North West and other UK universities presented their work. Twenty-one papers are collated here from the humanities, arts, social sciences, health, engineering, environment and life sciences, built environment and business.
DepAnn - An Annotation Tool for Dependency Treebanks
DepAnn is an interactive annotation tool for dependency treebanks, providing
both graphical and text-based annotation interfaces. The tool is aimed at
semi-automatic creation of treebanks. It aids the manual inspection and
correction of automatically created parses, making the annotation process
faster and less error-prone. A novel feature of the tool is that it enables the
user to view outputs from several parsers as the basis for creating the final
tree to be saved to the treebank. DepAnn uses TIGER-XML, an XML-based general
encoding format, both for representing the parser outputs and for saving the
annotated treebank. The tool includes an automatic consistency checker for
sentence structures. In addition, the tool enables users to build structures
manually, add comments on the annotations, modify the tagsets, and mark
sentences for further revision.
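As an illustration of the kind of TIGER-XML content such a tool reads and writes, the following sketch parses a schematic sentence fragment with Python's standard library. The element and attribute set is simplified for illustration; the real TIGER-XML schema also defines a corpus header, feature declarations, and nonterminal/edge structure:

```python
import xml.etree.ElementTree as ET

# Schematic TIGER-XML-style fragment (attribute set simplified for illustration).
SAMPLE = """\
<corpus id="demo">
  <body>
    <s id="s1">
      <graph>
        <terminals>
          <t id="s1_1" word="Dogs" pos="NOUN"/>
          <t id="s1_2" word="bark" pos="VERB"/>
        </terminals>
      </graph>
    </s>
  </body>
</corpus>
"""

def read_sentences(xml_text):
    """Return {sentence_id: [(word, pos), ...]} from a TIGER-XML-style string."""
    root = ET.fromstring(xml_text)
    return {
        s.get("id"): [(t.get("word"), t.get("pos")) for t in s.iter("t")]
        for s in root.iter("s")
    }

sentences = read_sentences(SAMPLE)
print(sentences)  # {'s1': [('Dogs', 'NOUN'), ('bark', 'VERB')]}
```

Using one common interchange format for both parser input and saved annotations is what lets a tool like this compare several parsers' outputs side by side before committing a final tree.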
Developing a National Design Scoreboard
Recognising the growing importance of design, this paper reports on the development of an approach to measuring design at a national level. A series of measures is proposed, based on a simplified model of design as a system at a national level. This model was developed through insights from literature and a workshop with government, industry and design sector representatives. Detailed data on design in the UK is presented to highlight the difficulties in collecting reliable and robust data. Evidence is compared with four countries (Spain, Canada, Korea and Sweden). This comparison highlights the inherent difficulties in comparing performance, and a revised set of measures is proposed. Finally, an approach to capturing design spend at a firm level is proposed, based on insights from literature and case studies.
Keywords: National Design System, Design Performance