Unexpected Events in Nigerian Construction Projects: A Case of Four Construction Companies
In Nigeria, 50% to 70% of construction projects are delayed by unexpected events linked to lapses in performance, near misses, and surprises. While researchers have theorized about the impact of mindfulness and information systems management (ISM) on unexpected events, information is lacking on how project teams can combine ISM and mindfulness in response to unexpected events in construction projects. The purpose of this case study was to examine how project teams can combine mindfulness with ISM in response to unexpected events during the execution phase of Nigerian construction projects. The framework of High Reliability Theory revealed that unexpected events can be minimized by mindfulness, defined by five cognitive processes: preoccupation with failure, reluctance to simplify, sensitivity to operations, commitment to resilience, and deference to expertise. In-depth semi-structured interviews elicited the views of 24 project experts on team behaviors, tactics, and processes for combining mindfulness with ISM. Data analysis was conducted by open coding to identify and reduce the data into themes, and axial coding was used to identify and isolate categories. The findings were that project teams can combine mindfulness with ISM in response to unexpected events by integrating effective risk, team, and communication management with appropriate training and technology infrastructure. If policymakers, project clients, and practitioners adopt the practices suggested in this study, the implications for social change are that project management practices, organizational learning, and the performance of construction projects may improve, construction waste may be reduced, and taxpayers may derive optimum benefits from public funds committed to construction projects.
Dynamic Optimization of Network Routing Problem through Ant Colony Optimization (ACO)
Search Based Software Engineering (SBSE) is a new paradigm of software engineering that treats software engineering problems as search problems and emphasizes finding optimal solutions within the set of available solutions using metaheuristic techniques such as hill climbing, simulated annealing, evolutionary programming, and tabu search. AI techniques such as Particle Swarm Optimization and Ant Colony Optimization (ACO), on the other hand, are used to find solutions to dynamic problems. SBSE has not yet been applied to dynamic problems. In this study, ACO techniques are applied to an SBSE problem, taking the network routing problem, whose nature is dynamic, as a case study. Keywords: SBSE, ACO, metaheuristic search techniques, dynamic optimization
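The abstract does not include the study's implementation, but the core ACO loop it applies to routing can be sketched as follows. This is a minimal generic illustration on a static weighted graph; the function and parameter names are my own, and a dynamic-routing variant would re-run the colony (retaining pheromone) whenever link costs change:

```python
import random

def aco_route(graph, src, dst, n_ants=20, n_iters=50, alpha=1.0, beta=2.0,
              evaporation=0.5, q=1.0, seed=0):
    """Find a short src->dst path with a basic Ant Colony Optimization loop.

    graph: dict mapping node -> {neighbor: edge_cost}.
    alpha/beta weight pheromone vs. inverse edge cost when ants pick edges.
    """
    rng = random.Random(seed)
    pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}
    best_path, best_cost = None, float("inf")

    for _ in range(n_iters):
        paths = []
        for _ in range(n_ants):
            node, path, visited = src, [src], {src}
            while node != dst:
                choices = [v for v in graph[node] if v not in visited]
                if not choices:              # dead end: abandon this ant
                    path = None
                    break
                weights = [pheromone[(node, v)] ** alpha *
                           (1.0 / graph[node][v]) ** beta for v in choices]
                node = rng.choices(choices, weights=weights)[0]
                path.append(node)
                visited.add(node)
            if path:
                cost = sum(graph[u][v] for u, v in zip(path, path[1:]))
                paths.append((path, cost))
                if cost < best_cost:
                    best_path, best_cost = path, cost
        # Evaporate everywhere, then deposit pheromone inversely
        # proportional to each completed path's cost.
        for edge in pheromone:
            pheromone[edge] *= (1.0 - evaporation)
        for path, cost in paths:
            for u, v in zip(path, path[1:]):
                pheromone[(u, v)] += q / cost
    return best_path, best_cost
```

On a small test network the colony quickly concentrates pheromone on the cheapest route; the evaporation term is what lets the colony forget stale routes, which is the property that makes ACO attractive for the dynamic setting the paper targets.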
Search-based techniques applied to optimization of project planning for a massive maintenance project
This paper evaluates the use of three different search-based techniques, namely genetic algorithms, hill climbing and simulated annealing, and two problem representations, for planning resource allocation in large massive maintenance projects. In particular, the search-based approach aims to find an optimal or near-optimal order in which to allocate work packages to programming teams, in order to minimize the project duration. The approach is validated by an empirical study of a large, commercial Y2K massive maintenance project, which compares these techniques with each other and with a random search (to provide baseline comparison data). Results show that an ordering-based genome encoding (with a tailored crossover operator) and the genetic algorithm appear to provide the most robust solution, though the hill climbing approach also performs well. The best search technique results reduce the project duration by as much as 50%.
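An ordering-based genome encoding can be illustrated with a small sketch. This is not the authors' code: the standard order crossover (OX) operator stands in for their tailored operator, and `makespan` uses a simple earliest-free-team assignment to turn a work-package ordering into a project duration:

```python
import random

def makespan(order, durations, n_teams):
    """Assign work packages, in the given order, to whichever team
    frees up first; project duration is the latest team finish time."""
    finish = [0.0] * n_teams
    for wp in order:
        t = finish.index(min(finish))
        finish[t] += durations[wp]
    return max(finish)

def order_crossover(p1, p2, rng):
    """OX: copy a random slice from parent 1, fill the remaining
    positions with the missing genes in parent 2's order."""
    a, b = sorted(rng.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    rest = [g for g in p2 if g not in child[a:b]]
    i = 0
    for j in range(len(child)):
        if child[j] is None:
            child[j] = rest[i]
            i += 1
    return child

def ga_schedule(durations, n_teams, pop_size=30, gens=100, seed=0):
    """Evolve an ordering of work packages that minimizes the makespan."""
    rng = random.Random(seed)
    n = len(durations)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda o: makespan(o, durations, n_teams))
        elite = pop[: pop_size // 2]          # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            c = order_crossover(*rng.sample(elite, 2), rng)
            if rng.random() < 0.2:            # occasional swap mutation
                i, j = rng.sample(range(n), 2)
                c[i], c[j] = c[j], c[i]
            children.append(c)
        pop = elite + children
    best = min(pop, key=lambda o: makespan(o, durations, n_teams))
    return best, makespan(best, durations, n_teams)
```

The key point of the ordering representation, as in the paper, is that crossover must preserve permutations: OX guarantees every child is still a valid ordering of all work packages, which plain one-point crossover would not.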
Cost and time overruns in public investment projects: an exogenous determinants model, theory and practice
The original contribution to knowledge of this research is addressing the absence, in the existing literature, of an exogenous-variables model in the analysis of cost and time deviations in public projects. This is achieved through the construction of a model that includes exogenous (political, governance, and economic) and endogenous (project-related) determinants. This model aims to help public decision makers develop public policies that seek to minimise cost and time overruns in public infrastructure projects. Cost and time overruns are often perceived to be a sign of project failure, and several past studies have identified potential causes and explanatory factors for the occurrence of such deviations. Governments devote significant resources to public projects, which makes cost and time overruns a critical issue for public management.
The research presents a theoretical underpinning based on Opportunistic Behaviour, Institutional, Economic Cycles, and Incomplete Contracts theories and provides an empirical analysis of 4,305 public projects developed in Portugal between 1980 and 2014. We used as dependent variables the cost/time deviation (the percentage difference between the final and initial cost/time) and the cost/time overrun (a binary variable equal to one if the cost/time deviation is positive and zero if it is zero or negative).
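The two dependent variables described above reduce to simple definitions, which can be stated directly (illustrative code; the function names are mine, not the study's):

```python
def deviation_pct(initial, final):
    """Cost or time deviation: percentage difference between the final
    and initial value. Positive values indicate an overrun."""
    return (final - initial) / initial * 100.0

def overrun(initial, final):
    """Binary overrun indicator: 1 if the deviation is positive,
    0 if it is zero or negative."""
    return 1 if final > initial else 0
```

For example, a project budgeted at 10 M that closes at 12.5 M has a cost deviation of 25% and an overrun indicator of 1; one that closes on or under budget has an indicator of 0.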
The analysis suggests that these exogenous determinants have been under-valued in the existing literature and that they do, indeed, play a relevant role in understanding cost and time deviations.
Development of a cost-predicting model for construction projects in Ghana
Abstract: One of the foremost challenges faced by the construction industry is the issue of cost overruns. Cost overruns cut across the construction projects of nations and continents alike. They vary in magnitude and occur irrespective of project size and location. Over the years, numerous attempts have been made to estimate the cost of construction projects correctly and to improve the accuracy of cost estimating using different statistical methods. This research investigated the factors that contribute to cost overruns and developed a predictive cost-estimating model for public sector building projects. The aim was primarily to extract factors from historical data on completed projects and use these predictive factors to develop a predictive model. Two models were developed using the predictive variables from historical data, one by multiple linear regression and one by an extreme learning machine, and the two were compared for accuracy of performance. The results reveal that predictive variables from historical data can be used to predict the completion cost of construction projects at the contract award stage, and that the extreme learning machine performs better than multiple linear regression. The study brought to light the use of extreme learning machines for developing predictive cost-estimating models built on historical data from completed projects, which rarely exists in the construction industry. It further substantiates the superior performance of the extreme learning machine over multiple linear regression on big data. The developed model can also be converted to desktop software for predicting completion cost by industry...Ph.D. (Engineering Management
Search based software engineering: Trends, techniques and applications
© ACM, 2012. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version is available from the link below. In the past five years there has been a dramatic increase in work on Search-Based Software Engineering (SBSE), an approach to Software Engineering (SE) in which Search-Based Optimization (SBO) algorithms are used to address problems in SE. SBSE has been applied to problems throughout the SE lifecycle, from requirements and project planning to maintenance and reengineering. The approach is attractive because it offers a suite of adaptive, automated and semi-automated solutions in situations typified by large, complex problem spaces with multiple competing and conflicting objectives.
This article provides a review and classification of the literature on SBSE. The work identifies research trends and relationships between the techniques applied and the applications to which they have been applied, and highlights gaps in the literature and avenues for further research. EPSRC and E
Towards a hybrid simulation model of software production in a multi-project environment
Simulating the life cycle of software projects, or parts of it, is an active field of research in software engineering. This report reviews the literature on simulation models that may be useful for studying software development organizations. The aim of the work is to build a simulation model in which various resource allocation policies, based on the project management techniques of operations research, can be implemented in a process improvement context that takes the multi-project setting into account. Ministerio de Educación y Ciencia TIN2007-67843-C06-0
Lean Product Development process structure and its effect on the performance of NPD projects
New product development (NPD) plays a pivotal role in industrial competition and forms a basis for the long‐term prosperity of companies. To survive in today's fast‐changing market environment, companies are always trying to improve the performance of their NPD projects by implementing new approaches, such as Lean Product Development (LPD). Nevertheless, applying such approaches is not straightforward, mainly due to the high level of interdependency between development activities and the role of dynamic effects in project performance. Understanding the combined effects of dynamic features, including feedback loops, time delays and nonlinear causal relationships, is a key step towards achieving higher project performance.
In this thesis, the dynamics of LPD process structure is investigated to find the ways it can affect the time, cost and quality performance of a development project. As there is no consensus about the definition of LPD among researchers in this field, different approaches to LPD are first studied through a comprehensive literature review. Two major approaches to LPD are identified: adapting lean manufacturing tools and techniques to optimize NPD processes, or extracting LPD-specific tools and techniques from the Toyota Product Development System (TPDS). The second approach proves to be more applicable, mainly due to the fundamental differences between manufacturing and NPD environments, and the LPD process design based on TPDS is selected as the focal point of this research.
The combination of the set‐based approach to design and concurrent engineering, in the form of SBCE, is identified as the unique feature of LPD process structure, and it has been the topic of several research studies in this field during the past decade. The set‐based design approach calls for a higher number of iteration cycles at the front end of a project and is responsible for higher project effectiveness, while increasing the time and effort invested. Concurrent engineering, on the other hand, targets the project duration and is an efficiency factor, but if not structured properly it can have the opposite effect by increasing the number of rework cycles. Although the performance of TPDS, the best available benchmark for LPD, shows the positive effect of SBCE on project performance, the reasons behind it, and the way the two approaches should be structured to achieve favourable results, are not yet clear. In addition, while companies define and execute different types of new product development projects, based on their levels of complexity and innovation, it is not clear whether the SBCE approach has the same impact on all project types.
To investigate the reasons behind the superiority of SBCE and its effects in different types of development projects, the systems thinking approach is selected as the main research methodology, providing a holistic view of development projects by looking at the interdependencies between performance measures and process structure. System dynamics modelling is used as the research method, due to its capacity for modelling feedback loops and iteration and rework cycles, the underlying factors that determine the time, cost and quality performance of projects. The model is built on verified structures for the rework cycle and resource allocation as its platform, and is made specific to this research by adding structures for the iteration cycles, the number of initial concepts, and the effect of project type. After passing the standard system dynamics validation tests, the model is calibrated using historical project data from different projects in a major car manufacturing company. The calibrated and verified model is then used for policy analysis by defining different scenarios based on the number of iteration cycles during the conceptual design phase, the number of initial concepts, and the type of project. All types of projects show improved performance metrics when moving towards the SBCE approach by increasing the number of iteration cycles; however, the degree of improvement is more pronounced for projects with higher levels of complexity. In addition, for highly complex projects, increasing the number of initial concepts has a positive effect on all project performance measures.
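The rework cycle at the heart of such system dynamics models can be illustrated with a minimal stock-and-flow simulation. This is the generic textbook rework-cycle structure, not the calibrated thesis model, and all parameter values are illustrative:

```python
def simulate_rework(total_work=100.0, productivity=4.0, quality=0.8,
                    discovery_delay=5.0, dt=0.25, horizon=200.0):
    """Minimal rework-cycle stock-and-flow model (Euler integration).

    Stocks: work_to_do, undiscovered rework, work done.
    Flows: completion (split by quality into done vs. hidden rework)
    and rework discovery (a first-order delay that feeds work back).
    Returns the time at which 99% of the work is truly done.
    """
    work_to_do, undiscovered, done = total_work, 0.0, 0.0
    t = 0.0
    while done < 0.99 * total_work and t < horizon:
        completion = min(productivity, work_to_do / dt)  # can't do more than remains
        discovery = undiscovered / discovery_delay       # first-order delay
        work_to_do += (discovery - completion) * dt      # rediscovered work returns
        undiscovered += (completion * (1 - quality) - discovery) * dt
        done += completion * quality * dt                # only flawless work is done
        t += dt
    return t, done
```

The feedback loop is visible directly in the state updates: lower quality routes more of each completion into the undiscovered-rework stock, which later flows back into work to do, stretching the project duration well beyond the naive `total_work / productivity` estimate. This is the kind of dynamic the thesis exploits when comparing iteration-cycle policies.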
The results of this research make a methodological contribution by providing a method for rigorously representing, through simulation, the impact of LPD process structure on project performance. From a practical point of view, the developed model can be used by project managers as a guide for making informed decisions that support the long‐term success of development projects.
Effecting Data Quality Through Data Governance: a Case Study in the Financial Services Industry
One of the most significant challenges faced by senior management today is implementing a data governance program to ensure that data is an asset to an organization's mission. New legislation combined with continual regulatory oversight, growing data volumes, and the desire to improve data quality for decision making are the driving forces behind data governance initiatives. Data governance involves reshaping existing processes and the way people view data, along with the information technology required to create consistent, secure and well-defined processes for handling the quality of an organization's data. In examining attempts to make data an asset in organizations, the term data governance helps conceptualize the break with existing ad hoc, siloed and improper data management practices. This research considers a case study of a large financial services company to examine data governance policies and procedures. It seeks to bring some information to bear on the drivers of data governance, the processes that ensure data quality, the technologies and people involved in those processes, and the use of data governance in decision making. This research also addresses some core questions surrounding data governance, such as the viability of a golden source record, ownership of and responsibility for data, and the optimum placement of a data governance department. The findings provide a model for financial services companies hoping to take the initial steps towards better data quality and, ultimately, a data governance capability.