Designing a novel virtual collaborative environment to support collaboration in design review meetings
Project review meetings are part of the project management process and are organised to assess progress and resolve any design conflicts in order to avoid delays in construction. One of the key challenges during a project review meeting is to bring the stakeholders together and use this time effectively to address design issues as quickly as possible. Current technology solutions based on BIM or CAD are information-centric and do not allow project teams to collectively explore the design from a range of perspectives and brainstorm ideas when design conflicts are encountered. This paper presents a system architecture that can be used to support multi-functional team collaboration more effectively during such design review meetings. The proposed architecture illustrates how information-centric BIM or CAD systems can be made human- and team-centric to enhance team communication and problem solving. An implementation of the proposed system architecture has been tested for its utility, likability and usefulness during design review meetings. The evaluation results suggest that the collaboration platform has the potential to enhance collaboration among multi-functional teams.
Europe’s New Post-Trade Infrastructure Rules. ECMI Policy Brief No. 20, 8 November 2012
After more than a decade of indecision, the EU is finally set to implement a consistent regulatory architecture for clearing and settlement. Following the agreement on the European Market Infrastructure Regulation (EMIR), the European Commission has proposed harmonised rules for central securities depositories (CSDs), while the European Central Bank is moving forward with its plans for a central eurozone settlement engine. This paper analyses three components of the new post-trade infrastructure measures: 1) the regulatory framework for and supervision of central counterparties under the new EMIR legislation, 2) the authorisation requirements for trade repositories and 3) the draft CSD Regulation and the progress of the ECB's TARGET2-Securities project. It then discusses the impact of the new rules and argues that, analogous to the unexpected impact of MiFID on trading infrastructures, a similar EMIR revolution may be on its way.
Context Aware Computing for The Internet of Things: A Survey
As we are moving towards the Internet of Things (IoT), the number of sensors deployed around the world is growing at a rapid pace. Market research has shown significant growth in sensor deployments over the past decade and has predicted that the growth rate will increase further in the future. These sensors continuously generate enormous amounts of data. However, in order to add value to raw sensor data we need to understand it. The collection, modelling, reasoning, and distribution of context in relation to sensor data play a critical role in this challenge. Context-aware computing has proven to be successful in understanding sensor data. In this paper, we survey context awareness from an IoT perspective. We begin by presenting the necessary background, introducing the IoT paradigm and context-aware fundamentals. We then provide an in-depth analysis of the context life cycle. Based on our own taxonomy, we evaluate a subset of 50 projects that represent the majority of research and commercial solutions proposed in the field of context-aware computing over the last decade (2001-2011). Finally, based on our evaluation, we highlight the lessons to be learnt from the past and some possible directions for future research. The survey addresses a broad range of techniques, methods, models, functionalities, systems, applications, and middleware solutions related to context awareness and the IoT. Our goal is not only to analyse, compare and consolidate past research work but also to appreciate their findings and discuss their applicability towards the IoT.
Comment: IEEE Communications Surveys & Tutorials Journal, 201
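The context life cycle the survey analyses (collection, modelling, reasoning, distribution) can be illustrated with a minimal sketch. All names, thresholds, and data here are hypothetical, chosen only to show the four phases; they are not from the survey itself.

```python
# Illustrative sketch of a four-phase context life cycle:
# acquire raw sensor data, model it as context items, reason over
# them to derive higher-level context, and distribute the result.
from dataclasses import dataclass

@dataclass
class ContextItem:
    sensor_id: str
    attribute: str   # e.g. "temperature"
    value: float
    unit: str

def acquire(raw):
    """Acquisition/modelling: wrap raw readings as context items."""
    return [ContextItem(r["id"], r["attr"], r["val"], r["unit"]) for r in raw]

def reason(items, threshold=30.0):
    """Reasoning: derive higher-level context from low-level values."""
    return {i.sensor_id: ("hot" if i.value > threshold else "normal")
            for i in items if i.attribute == "temperature"}

def distribute(context, subscribers):
    """Distribution: push derived context to interested consumers."""
    for callback in subscribers:
        callback(context)

readings = [{"id": "s1", "attr": "temperature", "val": 34.2, "unit": "C"},
            {"id": "s2", "attr": "temperature", "val": 21.5, "unit": "C"}]
derived = reason(acquire(readings))
distribute(derived, [print])
```

Real middleware adds persistence, ontologies, and subscription management around these phases, but the data flow follows the same shape.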
Data standardization
With data rapidly becoming the lifeblood of the global economy, the ability to improve its use significantly affects both social and private welfare. Data standardization is key to facilitating and improving the use of data when data portability and interoperability are needed. Absent data standardization, a "Tower of Babel" of different databases may be created, limiting synergetic knowledge production. Based on interviews with data scientists, this Article identifies three main technological obstacles to data portability and interoperability: metadata uncertainties, data transfer obstacles, and missing data. It then explains how data standardization can remove at least some of these obstacles and lead to smoother data flows and better machine learning. The Article then identifies and analyzes additional effects of data standardization. As shown, data standardization has the potential to support a competitive and distributed data collection ecosystem and lead to easier policing in cases where rights are infringed or unjustified harms are created by data-fed algorithms. At the same time, increasing the scale and scope of data analysis can create negative externalities in the form of better profiling, increased harms to privacy, and cybersecurity harms. Standardization also has implications for investment and innovation, especially if lock-in to an inefficient standard occurs. The Article then explores whether market-led standardization initiatives can be relied upon to increase welfare, and what role government-facilitated data standardization should play, if at all.
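The "metadata uncertainty" obstacle the Article describes can be made concrete with a small sketch: two providers record the same quantity under different field names and units, and a shared standard schema makes their records interoperable. All field names, units, and values below are invented for illustration.

```python
# Hypothetical standard schema that two heterogeneous databases
# are mapped onto, removing a metadata/interoperability obstacle.
STANDARD_SCHEMA = {"field": "temperature_c", "unit": "celsius"}

def to_standard(record, field, unit):
    """Map a provider-specific record onto the standard schema."""
    value = record[field]
    if unit == "fahrenheit":
        value = (value - 32) * 5.0 / 9.0
    return {STANDARD_SCHEMA["field"]: round(value, 2)}

db_a = {"temp_f": 86.0}          # provider A: Fahrenheit, own field name
db_b = {"temperature": 30.0}     # provider B: Celsius, own field name

merged = [to_standard(db_a, "temp_f", "fahrenheit"),
          to_standard(db_b, "temperature", "celsius")]
# Both records now share one schema and can be pooled for analysis.
```

Without the agreed schema, each pairwise integration needs its own mapping; with it, every provider writes one mapping to the standard.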
An analysis of existing production frameworks for statistical and geographic information: Synergies, gaps and integration
The production of official statistical and geospatial data is often in the hands of highly specialized public agencies that have traditionally followed their own paths and established their own production frameworks. In this article, we present the main frameworks of these two areas and focus on the possibility and need to achieve better integration between them through the interoperability of systems, processes, and data. The statistical area is well led and has well-defined frameworks. The geospatial area does not have clear leadership, and its large number of standards establishes a framework that is not always obvious. The lack of a general and common legal framework is also highlighted. Additionally, three examples are offered: the first is the application of the spatial data quality model to the case of statistical data, the second is the application of the statistical process model to the geospatial case, and the third is the use of linked geospatial and statistical data. These examples demonstrate the possibility of transferring experiences and advances from one area to another. In this way, we emphasize the conceptual proximity of these two areas, highlighting synergies, gaps, and potential integration. © 2021 by the authors. Licensee MDPI, Basel, Switzerland.
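The third example in the abstract, linking statistical and geospatial data, rests on the two datasets sharing a common identifier. A minimal sketch, with invented region codes and values, shows how a shared code lets either dataset enrich the other:

```python
# Hypothetical linkage of statistical observations to geospatial
# features through a shared region identifier.
stats = [  # statistical agency output: population per region code
    {"region": "REG-01", "population": 120000},
    {"region": "REG-02", "population": 54000},
]
geo = {    # geospatial agency output: area per region code (km^2)
    "REG-01": {"area_km2": 300.0},
    "REG-02": {"area_km2": 1200.0},
}

# Join on the shared code and derive a new indicator neither
# agency could produce alone: population density.
linked = [
    {**s, "density": s["population"] / geo[s["region"]]["area_km2"]}
    for s in stats if s["region"] in geo
]
```

The interoperability question in the article is precisely whether such shared identifiers and compatible process models exist across the two production frameworks.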
Cancer Informatics for Cancer Centers (CI4CC): Building a Community Focused on Sharing Ideas and Best Practices to Improve Cancer Care and Patient Outcomes.
Cancer Informatics for Cancer Centers (CI4CC) is a grassroots, nonprofit 501(c)(3) organization intended to provide a focused national forum for engagement of senior cancer informatics leaders, primarily aimed at academic cancer centers anywhere in the world but with a special emphasis on the 70 National Cancer Institute-funded cancer centers. Although each of the participating cancer centers is structured differently, and leaders' titles vary, we know firsthand there are similarities in both the issues we face and the solutions we achieve. As a consortium, we have initiated a dedicated listserv, an open-initiatives program, and targeted biannual face-to-face meetings. These meetings are a place to review our priorities and initiatives, providing a forum for discussion of the strategic and pragmatic issues we, as informatics leaders, individually face at our respective institutions and cancer centers. Here we provide a brief history of the CI4CC organization and meeting highlights from the latest CI4CC meeting, which took place in Napa, California, from October 14 to 16, 2019. The focus of this meeting was "intersections between informatics, data science, and population science." We conclude with a discussion of "hot topics" on the horizon for cancer informatics.
ERIGrid Holistic Test Description for Validating Cyber-Physical Energy Systems
Smart energy solutions aim to modify and optimise the operation of existing energy infrastructure. Such cyber-physical technology must be mature before deployment to the actual infrastructure, and competitive solutions will have to be compliant with standards still under development. Achieving this technology readiness and harmonisation requires reproducible experiments and appropriately realistic testing environments. Such testbeds for multi-domain cyber-physical experiments are complex in and of themselves. This work addresses a method for the scoping and design of experiments where both the testbed and the solution each require detailed expertise. This empirical work first revisited present test description approaches, developed a new description method for cyber-physical energy systems testing, and matured it by means of user involvement. The new Holistic Test Description (HTD) method facilitates the conception, deconstruction and reproduction of complex experimental designs in the domain of cyber-physical energy systems. This work develops the background and motivation, offers a guideline and examples for the proposed approach, and summarises experience from three years of its application.
This work received funding from the European Community's Horizon 2020 Programme (H2020/2014–2020) under project "ERIGrid" (Grant Agreement No. 654113).
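To make the idea of a structured test description concrete, the sketch below shows what a minimal machine-readable description of a multi-domain experiment might look like. The field names are illustrative only and are not a faithful reproduction of the HTD template.

```python
# Hypothetical, minimal test-description record for a
# cyber-physical energy systems experiment.
from dataclasses import dataclass

@dataclass
class TestDescription:
    objective: str            # what the experiment is meant to validate
    system_under_test: str    # the configuration being exercised
    domains: list             # e.g. electrical, ICT, thermal
    test_criteria: list       # measurable pass/fail conditions

    def is_multi_domain(self):
        """Multi-domain experiments span more than one domain."""
        return len(self.domains) > 1

htd = TestDescription(
    objective="Validate voltage control under communication delay",
    system_under_test="LV feeder with PV inverters",
    domains=["electrical", "ICT"],
    test_criteria=["voltage stays within +/-10% of nominal"],
)
```

Writing the description down in a shared structure like this is what makes an experiment reproducible across different testbeds, which is the stated aim of the HTD method.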