A requirements engineering framework for integrated systems development for the construction industry
Computer Integrated Construction (CIC) systems are computer environments through which collaborative working can be undertaken. Although many CIC systems have been developed to demonstrate communication and collaboration within construction projects, uptake of these systems by the industry remains inadequate. This is mainly because the research methodologies of CIC development projects are too incomplete to bridge the technology transfer gap. Defining comprehensive methodologies for the development of these systems, and for their effective implementation on real construction projects, is therefore vital. Requirements Engineering (RE) can contribute to the effective uptake of these systems because it drives systems development for the targeted audience. This paper proposes a requirements engineering approach for industry-driven CIC systems development. While several CIC systems are examined to build broad and deep contextual knowledge of the area, the EU-funded research project DIVERCITY (Distributed Virtual Workspace for Enhancing Communication within the Construction Industry) is analysed as the main case study, because its requirements engineering approach has the potential to define a framework for adapting requirements engineering so as to contribute towards the uptake of CIC systems.
Dropout Model Evaluation in MOOCs
The field of learning analytics needs to adopt a more rigorous approach for
predictive model evaluation that matches the complex practice of
model-building. In this work, we present a procedure to statistically test
hypotheses about model performance which goes beyond the state-of-the-practice
in the community to analyze both algorithms and feature extraction methods from
raw data. We apply this method to a series of algorithms and feature sets
derived from a large sample of Massive Open Online Courses (MOOCs). While a
complete comparison of all potential modeling approaches is beyond the scope of
this paper, we show that this approach reveals a large gap in dropout
prediction performance between forum-, assignment-, and clickstream-based
feature extraction methods, where the latter is significantly better than the
former two, which are in turn indistinguishable from one another. This work has
methodological implications for evaluating predictive or AI-based models of
student success, and practical implications for the design and targeting of
at-risk student models and interventions.
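The abstract does not spell out the testing procedure itself; a minimal, hypothetical sketch of the kind of paired comparison it describes, testing whether one feature extraction method outperforms another across courses rather than eyeballing mean scores, might look like the following. The AUC values below are invented placeholders, not the paper's data.

```python
# Hypothetical sketch: paired t-test comparing two dropout-prediction
# feature sets across courses. Scores are illustrative, not from the paper.
import math
import statistics

# One AUC score per course for each (invented) feature extraction method.
clickstream_auc = [0.81, 0.79, 0.84, 0.80, 0.83, 0.78, 0.82, 0.85]
forum_auc       = [0.71, 0.70, 0.74, 0.69, 0.73, 0.68, 0.72, 0.75]

# Paired t statistic: each course contributes one score per method,
# so we test the per-course differences against zero.
diffs = [c - f for c, f in zip(clickstream_auc, forum_auc)]
n = len(diffs)
t_stat = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

# Two-tailed critical value for alpha = 0.05 with n - 1 = 7 degrees of
# freedom, taken from a standard t table.
T_CRIT = 2.365
print(f"t = {t_stat:.2f}, reject H0: {abs(t_stat) > T_CRIT}")
```

A paired design matters here: per-course scores are correlated, so testing the differences is far more sensitive than comparing the two unpaired means.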
Conceptual modelling and the quality of ontologies: A comparison between object-role modelling and the object paradigm
Ontologies are key enablers for sharing precise and machine-understandable semantics among different applications and parties. Yet, for ontologies to meet these expectations, their quality must be of a good standard. The quality of an ontology is strongly based on the design method employed. This paper addresses the design problems related to the modelling of ontologies, with specific concentration on the issues related to the quality of the conceptualisations produced. The paper aims
to demonstrate the impact of the modelling paradigm adopted on the quality of ontological models and, consequently, the potential impact that such a decision can have in relation to the development of
software applications. To this end, an ontology conceptualised using the Object-Role Modelling (ORM) approach is re-engineered into one modelled on the basis of the Object Paradigm (OP). The two ontologies are then analytically compared against the specified criteria. The comparison highlights that using the OP for ontology conceptualisation can provide more expressive, reusable, objective and temporal ontologies than those conceptualised on the basis of the ORM approach.
Connecting the Dots: Linking Sustainable Wild Capture Fisheries Initiatives and Impact Investors
Wilderness Markets undertook a series of fishery value chain assessments to better understand the opportunities and constraints for private impact capital to flow into wild capture fisheries markets. Given the investments in developing sustainable fisheries pilots, Wilderness Markets expected to identify a range of investment opportunities in each of the fisheries assessed. However, they did not find investment opportunities that could address the suite of challenges associated with improving financial and social outcomes while also contributing to conservation outcomes, particularly in developing-country fisheries. Wilderness Markets' research indicates that the lack of triple-bottom-line (TBL) investment opportunities is due to six main constraints on an economically sustainable fisheries value chain: data, management, market differentiation, infrastructure, finance and the lack of investable entities.
Beyond Volume: The Impact of Complex Healthcare Data on the Machine Learning Pipeline
From medical charts to national census, healthcare has traditionally operated
under a paper-based paradigm. However, the past decade has marked a long and
arduous transformation bringing healthcare into the digital age. Ranging from
electronic health records, to digitized imaging and laboratory reports, to
public health datasets, healthcare today generates an incredible amount of
digital information. Such a wealth of data presents an exciting opportunity for
integrated machine learning solutions to address problems across multiple
facets of healthcare practice and administration. Unfortunately, the ability to
derive accurate and informative insights requires more than the ability to
execute machine learning models. Rather, a deeper understanding of the data on
which the models are run is imperative for their success. While a significant
effort has been undertaken to develop models able to process the volume of data
obtained during the analysis of millions of digitized patient records, it is
important to remember that volume represents only one aspect of the data. In
fact, drawing on data from an increasingly diverse set of sources, healthcare
data presents an incredibly complex set of attributes that must be accounted
for throughout the machine learning pipeline. This chapter focuses on
highlighting such challenges, and is broken down into three distinct
components, each representing a phase of the pipeline. We begin with attributes
of the data accounted for during preprocessing, then move to considerations
during model building, and end with challenges to the interpretation of model
output. For each component, we present a discussion around data as it relates
to the healthcare domain and offer insight into the challenges each may impose
on the efficiency of machine learning techniques.
Comment: Healthcare Informatics, Machine Learning, Knowledge Discovery; 20 pages, 1 figure
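As a small, entirely hypothetical illustration of the preprocessing phase the chapter discusses: records merged from different sources rarely share a complete schema, so missing fields must be handled before any model sees the data. The field names and values below are invented, and median imputation stands in for whatever strategy a real pipeline would choose.

```python
# Hypothetical sketch of one healthcare-data preprocessing challenge:
# reconciling records with missing fields via per-feature median imputation.
from statistics import median

# Records merged from two imaginary sources; note the missing and
# None-valued fields.
records = [
    {"age": 64, "systolic_bp": 142, "glucose": 110},
    {"age": 58, "systolic_bp": None, "glucose": 98},
    {"age": 71, "glucose": None},   # no blood-pressure field at all
]

FEATURES = ["age", "systolic_bp", "glucose"]

def impute(records, features):
    """Fill absent or None values with the per-feature median."""
    medians = {
        f: median(r[f] for r in records if r.get(f) is not None)
        for f in features
    }
    return [
        {f: r[f] if r.get(f) is not None else medians[f] for f in features}
        for r in records
    ]

clean = impute(records, FEATURES)
print(clean)
```

Even this toy version shows why volume is only one axis of difficulty: the imputation choice itself (median vs. model-based, per-cohort vs. global) is a modelling decision that can bias downstream predictions.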
Industry-driven innovative system development for the construction industry: The DIVERCITY project
Collaborative working has become possible using innovative integrated systems in construction, as many activities are performed globally with stakeholders situated in various locations. Integrated VR-based information systems can bridge this fragmentation and provide communication and collaboration between the distributed stakeholders. The development of these technologies is vital for the uptake of such systems by the construction industry.
This paper starts by emphasising the importance of construction IT research and reviews some future research directions in this area. In particular, the paper explores how virtual prototyping can improve the productivity and effectiveness of construction projects, and presents DIVERCITY as a case study of research in virtual prototyping.
The paper also explores the requirements engineering of the DIVERCITY project. DIVERCITY has large and evolving requirements, which consider the perspectives of multiple stakeholders, such as clients, architects and contractors. However, practitioners are often unsure of the detail of how virtual environments would support the construction process, and of how to overcome some barriers to the introduction of new technologies. This complicates the requirements engineering process.
Principles for aerospace manufacturing engineering in integrated new product introduction
This article investigates the value-adding practices of Manufacturing Engineering for integrated New Product Introduction. A model representing how current practices align to support lean integration in Manufacturing Engineering has been defined. The results are used to identify a novel set of guiding principles for integrated Manufacturing Engineering. These are as follows: (1) use a data-driven process, (2) build from core capabilities, (3) develop the standard, (4) deliver through responsive processes and (5) align cross-functional and customer requirements. The investigation used a mixed-method approach, comprising case studies to identify current practice and a survey to understand implementation in a sample of component development projects within a major aerospace manufacturer. The research contribution is an illustration of aerospace Manufacturing Engineering practices for New Product Introduction. The conclusions will be used to indicate new priorities for New Product Introduction and the cross-functional interactions that support flawless and innovative New Product Introduction. The final principles have been validated through a series of consultations with experts in the sponsoring company to ensure that correct and relevant content has been defined.
Conceptual Modelling and The Quality of Ontologies: Endurantism Vs. Perdurantism
Ontologies are key enablers for sharing precise and machine-understandable
semantics among different applications and parties. Yet, for ontologies to meet
these expectations, their quality must be of a good standard. The quality of an
ontology is strongly based on the design method employed. This paper addresses
the design problems related to the modelling of ontologies, with specific
concentration on the issues related to the quality of the conceptualisations
produced. The paper aims to demonstrate the impact of the modelling paradigm
adopted on the quality of ontological models and, consequently, the potential
impact that such a decision can have in relation to the development of software
applications. To this aim, an ontology that is conceptualised based on the
Object-Role Modelling (ORM) approach (a representative of endurantism) is
re-engineered into one modelled on the basis of the Object Paradigm (OP) (a
representative of perdurantism). Next, the two ontologies are analytically
compared using the specified criteria. The conducted comparison highlights that
using the OP for ontology conceptualisation can provide more expressive,
reusable, objective and temporal ontologies than those conceptualised on the
basis of the ORM approach.
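The endurantist/perdurantist distinction the abstract contrasts can be sketched informally in code. The sketch below is a hypothetical illustration, not taken from the paper: the class names and the employee example are invented. An endurantist-style model treats the entity as wholly present, so updating a property destroys its previous value; a perdurantist (OP-style) model records time-indexed states, so the entity's history stays queryable.

```python
# Invented illustration of endurantist vs. perdurantist modelling.
from dataclasses import dataclass, field

@dataclass
class EndurantEmployee:
    """Endurantist view: one mutable property; updates lose history."""
    name: str
    role: str

@dataclass
class PerdurantEmployee:
    """Perdurantist view: the entity is the sum of its temporal parts."""
    name: str
    states: list = field(default_factory=list)  # (start_year, role) pairs

    def assign(self, year, role):
        self.states.append((year, role))

    def role_at(self, year):
        # Latest assignment that started on or before `year`.
        past = [(y, r) for y, r in self.states if y <= year]
        return max(past)[1] if past else None

e = EndurantEmployee("Ada", "engineer")
e.role = "manager"                       # the earlier role is simply gone

p = PerdurantEmployee("Ada")
p.assign(2010, "engineer")
p.assign(2016, "manager")
print(p.role_at(2015), p.role_at(2020))  # engineer manager
```

This is the intuition behind the abstract's claim that OP-style conceptualisations yield more "temporal" ontologies: temporal parts make statements about an entity relative to a time interval, rather than overwriting a single present state.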