
    Unexpected Events in Nigerian Construction Projects: A Case of Four Construction Companies

    In Nigeria, 50% to 70% of construction projects are delayed due to unexpected events linked to lapses in performance, near misses, and surprises. While researchers have theorized about the impact of mindfulness and information systems management (ISM) on unexpected events, information is lacking on how project teams can combine ISM and mindfulness in response to unexpected events in construction projects. The purpose of this case study was to examine how project teams can combine mindfulness with ISM in response to unexpected events during the execution phase of Nigerian construction projects. The framework of High Reliability Theory holds that unexpected events can be minimized by mindfulness, defined by five cognitive processes: preoccupation with failure, reluctance to simplify, sensitivity to operations, commitment to resilience, and deference to expertise. In-depth semi-structured interviews elicited the views of 24 project experts on team behaviors, tactics, and processes for combining mindfulness with ISM. Data analysis used open coding to identify and reduce the data into themes, and axial coding to identify and isolate categories. The findings were that project teams could combine mindfulness with ISM in response to unexpected events by integrating effective risk, team, and communication management with appropriate training and technology infrastructure. If policymakers, project clients, and practitioners adopt the practices suggested in this study, the implications for social change are that project management practices, organizational learning, and the performance of construction projects may improve, construction waste may be reduced, and taxpayers may derive optimum benefits from public funds committed to construction projects.

    Dynamic Optimization of Network Routing Problem through Ant Colony Optimization (ACO)

    Search-Based Software Engineering (SBSE) is a new paradigm of software engineering that treats software engineering problems as search problems and seeks optimal solutions among the available candidate solutions using metaheuristic techniques such as hill climbing, simulated annealing, evolutionary programming, and tabu search. AI techniques such as particle swarm optimization and ant colony optimization (ACO), on the other hand, are used to solve dynamic problems; SBSE has not yet been applied to dynamic problems. In this study, ACO techniques are applied to an SBSE problem, taking the network routing problem, which is dynamic in nature, as a case study. Keywords: SBSE, ACO, metaheuristic search techniques, dynamic optimization
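
    As a rough illustration of the kind of ACO routing the abstract describes, the sketch below builds routes over a small weighted graph using pheromone-biased edge choices, evaporation, and quality-proportional deposits. The topology, parameter values, and iteration counts are invented for the example and are not taken from the paper.

        import random

        # Hedged sketch of ACO applied to shortest-path routing on a small weighted
        # graph. The topology, parameters, and iteration counts below are invented
        # for illustration; they are not taken from the paper.
        GRAPH = {            # adjacency list: node -> {neighbour: link cost}
            "A": {"B": 2, "C": 5},
            "B": {"C": 1, "D": 4},
            "C": {"D": 1},
            "D": {},
        }
        ALPHA, BETA, RHO, Q = 1.0, 2.0, 0.5, 100.0   # pheromone weight, heuristic weight, evaporation, deposit
        pheromone = {(u, v): 1.0 for u in GRAPH for v in GRAPH[u]}

        def build_route(src, dst):
            """One ant walks from src to dst, picking edges with pheromone-biased probabilities."""
            node, route = src, [src]
            while node != dst:
                choices = [v for v in GRAPH[node] if v not in route]
                if not choices:
                    return None                       # dead end: discard this ant
                weights = [pheromone[(node, v)] ** ALPHA * (1.0 / GRAPH[node][v]) ** BETA
                           for v in choices]
                node = random.choices(choices, weights)[0]
                route.append(node)
            return route

        def cost(route):
            return sum(GRAPH[u][v] for u, v in zip(route, route[1:]))

        best = None
        for _ in range(50):                           # colony iterations
            routes = [r for r in (build_route("A", "D") for _ in range(10)) if r]
            for edge in pheromone:                    # evaporation keeps the colony responsive to change
                pheromone[edge] *= (1 - RHO)
            for r in routes:
                for edge in zip(r, r[1:]):
                    pheromone[edge] += Q / cost(r)    # deposit proportional to route quality
                if best is None or cost(r) < cost(best):
                    best = r
        print("best route:", best, "cost:", cost(best))

    Because pheromone evaporates every iteration, the colony can re-converge if link costs change, which is what makes the approach attractive for dynamic routing.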

    Search-based techniques applied to optimization of project planning for a massive maintenance project

    This paper evaluates the use of three different search-based techniques, namely genetic algorithms, hill climbing, and simulated annealing, and two problem representations, for planning resource allocation in large massive maintenance projects. In particular, the search-based approach aims to find an optimal or near-optimal order in which to allocate work packages to programming teams, in order to minimize the project duration. The approach is validated by an empirical study of a large, commercial Y2K massive maintenance project, which compares these techniques with each other and with a random search (to provide baseline comparison data). Results show that an ordering-based genome encoding (with a tailored crossover operator) and the genetic algorithm appear to provide the most robust solution, though the hill climbing approach also performs well. The best search technique results reduce the project duration by as much as 50%.
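
    A minimal sketch of the ordering-based encoding described above: each genome is a permutation of work packages, fitness allocates packages in that order to the next-free team and measures the resulting makespan, and an order-preserving crossover plus swap mutation evolves the population. The package efforts, team count, and GA settings are illustrative assumptions, not figures from the study.

        import random

        # Hedged sketch of an ordering-based GA for work-package allocation.
        # Effort values, team count, and GA parameters are made up for illustration.
        EFFORT = [8, 3, 5, 2, 7, 4, 6, 1, 9, 2]       # person-days per work package (assumed)
        TEAMS = 3

        def makespan(order):
            """Project duration when packages are allocated, in the given order, to the next-free team."""
            finish = [0] * TEAMS
            for wp in order:
                team = finish.index(min(finish))
                finish[team] += EFFORT[wp]
            return max(finish)

        def order_crossover(p1, p2):
            """Tailored crossover for permutations: keep a slice of p1, fill the rest in p2's order."""
            a, b = sorted(random.sample(range(len(p1)), 2))
            kept = p1[a:b]
            rest = [g for g in p2 if g not in kept]
            return rest[:a] + kept + rest[a:]

        def mutate(order):
            i, j = random.sample(range(len(order)), 2)
            order[i], order[j] = order[j], order[i]

        population = [random.sample(range(len(EFFORT)), len(EFFORT)) for _ in range(30)]
        for _ in range(100):                          # generations
            population.sort(key=makespan)
            parents = population[:10]                 # truncation selection
            children = []
            while len(children) < len(population) - len(parents):
                child = order_crossover(*random.sample(parents, 2))
                if random.random() < 0.2:
                    mutate(child)
                children.append(child)
            population = parents + children
        print("best makespan:", makespan(min(population, key=makespan)))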

    Cost and time overruns in public investment projects: an exogenous determinants model, theory and practice

    The original contribution to knowledge of this research is overcoming the absence, in the existing literature, of an exogenous-variables model for the analysis of cost and time deviations in public projects. This is achieved through the construction of a model that includes exogenous (political, governance, and economic) and endogenous (project-related) determinants. The model aims to help public decision makers develop public policies that minimise cost and time overruns in public infrastructure projects. Cost and time overruns are often perceived as a sign of project failure, and several past studies have identified potential causes and explanatory factors for the occurrence of such deviations. Governments devote significant resources to public projects, which makes cost and time overruns a critical issue for public management. The research presents a theoretical underpinning based on Opportunistic Behaviour, Institutional, Economic Cycles, and Incomplete Contracts theories and provides an empirical analysis of 4,305 public projects developed in Portugal between 1980 and 2014. The dependent variables are the cost/time deviation (the percentage difference between the final and initial cost/time) and the cost/time overrun (taking the value one if the cost/time deviation is positive and zero if it is zero or negative). The analysis suggests that these exogenous determinants have been undervalued in the existing literature and that they do, indeed, play a relevant role in understanding cost and time deviations.
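
    For concreteness, the two dependent variables defined above can be computed as follows; the project figures are hypothetical and only illustrate the definitions.

        # Tiny worked example of the two dependent variables defined above; the
        # project figures are hypothetical and only illustrate the definitions.

        def deviation(initial, final):
            """Percentage deviation between final and initial cost (or duration)."""
            return 100.0 * (final - initial) / initial

        def overrun(initial, final):
            """1 if the deviation is positive, 0 if it is zero or negative."""
            return 1 if final > initial else 0

        initial_cost, final_cost = 1_000_000, 1_250_000        # hypothetical budget vs. outturn
        print(deviation(initial_cost, final_cost))             # 25.0  -> +25% cost deviation
        print(overrun(initial_cost, final_cost))               # 1     -> cost overrun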

    Development of a cost-predicting model for construction projects in Ghana

    One of the foremost challenges faced by the construction industry is the issue of cost overruns. Cost overruns affect construction projects across nations and continents; they vary in magnitude and occur irrespective of project size and location. Over the years, numerous attempts have been made to estimate the cost of construction projects correctly and to improve the accuracy of cost estimating using different statistical methods. This research investigated the factors that contribute to cost overruns and developed a predictive cost-estimating model for public sector building projects. The primary aim was to extract factors from historical data on completed projects and use these predictive factors to develop a predictive model. Two models were built from the predictive variables in the historical data, one using multiple linear regression and the other an extreme learning machine, and their predictive accuracy was compared. The results show that predictive variables from historical data can be used to predict the completion cost of construction projects at the contract award stage, and that the extreme learning machine outperforms the multiple linear regression model. The study highlights the use of extreme learning machines for developing cost-estimating models built on historical data from completed projects, an approach that is rare in the construction industry, and further substantiates the superior performance of the extreme learning machine over multiple linear regression on large datasets. The developed model can also be converted into desktop software for predicting completion cost in industry. Ph.D. (Engineering Management)
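
    The sketch below contrasts the two model families named in the abstract on synthetic data: multiple linear regression fitted by least squares, and a basic extreme learning machine (random hidden layer with closed-form output weights). The synthetic features, hidden-layer size, and error metric are assumptions for illustration, not the study's actual model or data.

        import numpy as np

        # Hedged comparison of multiple linear regression vs. a basic extreme
        # learning machine on synthetic data; features and sizes are assumed.
        rng = np.random.default_rng(0)
        X = rng.uniform(0.0, 1.0, size=(200, 4))       # e.g. floor area, storeys, duration, location index (assumed)
        y = 2.0 * X[:, 0] + np.sin(3.0 * X[:, 1]) + 0.5 * X[:, 2] + rng.normal(0.0, 0.05, 200)

        # Multiple linear regression via least squares on [X | 1]
        Xb = np.hstack([X, np.ones((len(X), 1))])
        beta, *_ = np.linalg.lstsq(Xb, y, rcond=None)
        mlr_pred = Xb @ beta

        # Extreme learning machine: fixed random input weights, solve only the output layer
        n_hidden = 50
        W = rng.normal(size=(X.shape[1], n_hidden))
        b = rng.normal(size=n_hidden)
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))         # sigmoid activations of the hidden layer
        beta_out = np.linalg.pinv(H) @ y               # Moore-Penrose solution for output weights
        elm_pred = H @ beta_out

        rmse = lambda pred: float(np.sqrt(np.mean((pred - y) ** 2)))
        print("MLR RMSE:", rmse(mlr_pred), "ELM RMSE:", rmse(elm_pred))

    Because the hidden layer is random and only the output weights are solved, the ELM trains in a single linear-algebra step yet can capture the nonlinear term that the linear model misses.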

    Search based software engineering: Trends, techniques and applications

    In the past five years there has been a dramatic increase in work on Search-Based Software Engineering (SBSE), an approach to Software Engineering (SE) in which Search-Based Optimization (SBO) algorithms are used to address problems in SE. SBSE has been applied to problems throughout the SE lifecycle, from requirements and project planning to maintenance and reengineering. The approach is attractive because it offers a suite of adaptive automated and semi-automated solutions in situations typified by large, complex problem spaces with multiple competing and conflicting objectives. This article provides a review and classification of the literature on SBSE. The work identifies research trends and relationships between the techniques applied and the applications to which they have been applied, and highlights gaps in the literature and avenues for further research.

    Towards a hybrid simulation model of software production in a multi-project environment

    The simulation of the software project life cycle, or of parts of it, is an active field of research in software engineering. This report reviews the literature on simulation models that may be useful for studying software development organizations. The aim of the work is to build a simulation model in which various resource allocation policies, based on the project management techniques of operations research, can be implemented in a process improvement context that takes the multi-project setting into account. Ministerio de Educación y Ciencia TIN2007-67843-C06-0

    Effecting Data Quality Through Data Governance: a Case Study in the Financial Services Industry

    One of the most significant challenges faced by senior management today is implementing a data governance program to ensure that data is an asset to an organization's mission. New legislation, continual regulatory oversight, growing data volumes, and the desire to improve data quality for decision making are the driving forces behind data governance initiatives. Data governance involves reshaping existing processes and the way people view data, along with the information technology required to create consistent, secure, and well-defined processes for handling the quality of an organization's data. In examining attempts to make data an asset in organizations, the term data governance helps to conceptualize the break with existing ad hoc, siloed, and improper data management practices. This research considers a case study of a large financial services company to examine data governance policies and procedures. It seeks to bring information to bear on the drivers of data governance, the processes that ensure data quality, the technologies and people involved in supporting those processes, and the use of data governance in decision making. The research also addresses some core questions surrounding data governance, such as the viability of a golden source record, ownership of and responsibility for data, and the optimum placement of a data governance department. The findings provide a model for financial services companies hoping to take the initial steps towards better data quality and, ultimately, a data governance capability.