Hindsight, Organizational Routines and Media Risk Coverage
Professor Stallings describes how journalists explain catastrophes by coupling them with flaws in human organizations
Understanding construction delay analysis and the role of pre-construction programming
Copyright © 2013, American Society of Civil Engineers. This is the author's accepted manuscript. The final published article is available from the link below.

Modern construction projects commonly suffer delays in their completion. Resolving the time and cost claims that flow from such delays remains a difficult undertaking for all project parties. A common approach relied on by contractors and their employers (or their representatives) is to apply various delay analysis techniques, all of which are based on the construction programs originally developed for managing the project. However, evidence from the literature suggests that the reliability of these techniques in ensuring successful claims resolution is often undermined by the nature and quality of the underlying program used. As part of wider research on delay and disruption analysis in practice, this paper reports on an aspect of the study aimed at exploring preconstruction-stage programming issues that affect delay claims resolution. This aspect is based on in-depth interviews with experienced construction planning engineers in the United Kingdom, conducted after an initial large-scale survey on the usage of delay and disruption techniques. Key findings and conclusions include: (1) most contractors prefer a linked bar chart format for their baseline programs over conventional critical path method (CPM) networks; (2) baseline programs are developed using planning software packages, some of which pose difficulties when employed for most delay analysis techniques, except the simpler ones; (3) manpower loading graphs are not commonly developed as part of the main deliverables during preconstruction-stage planning, so most programs are not subjected to resource loading and leveling and therefore do not accurately reflect planned resource usage on site. This practice has detrimental effects on the reliability of baseline programs when they are used to resolve delay claims; and (4) baseline program development involves many different experts within construction organizations, as expected, but with very little involvement of the employer or its representative. Active client involvement is, however, quite important, as it would facilitate quick program approval/acceptance before construction, a necessary requirement for early delay claims settlement; such claims are otherwise often left unresolved long after the delaying events, with the potential of degenerating into expensive disputes. The study results provide a better understanding of the key issues that need attention if improvements are to be made in delay claims resolution. Additional research that tests these results using a much larger sample and rigorous statistical analysis, for generalization purposes, would help advance the limited knowledge of this subject.
Sequential anomaly detection in the presence of noise and limited feedback
This paper describes a methodology for detecting anomalies from sequentially
observed and potentially noisy data. The proposed approach consists of two main
elements: (1) filtering, or assigning a belief or likelihood to each
successive measurement based upon our ability to predict it from previous noisy
observations, and (2) hedging, or flagging potential anomalies by
comparing the current belief against a time-varying and data-adaptive
threshold. The threshold is adjusted based on the available feedback from an
end user. Our algorithms, which combine universal prediction with recent work
on online convex programming, do not require computing posterior distributions
given all current observations and involve simple primal-dual parameter
updates. At the heart of the proposed approach lie exponential-family models
which can be used in a wide variety of contexts and applications, and which
yield methods that achieve sublinear per-round regret against both static and
slowly varying product distributions with marginals drawn from the same
exponential family. Moreover, the regret against static distributions coincides
with the minimax value of the corresponding online strongly convex game. We
also prove bounds on the number of mistakes made during the hedging step
relative to the best offline choice of the threshold with access to all
estimated beliefs and feedback signals. We validate the theory on synthetic
data drawn from a time-varying distribution over binary vectors of high
dimensionality, as well as on the Enron email dataset.
Comment: 19 pages, 12 pdf figures; final version to be published in IEEE Transactions on Information Theory.
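The filter-then-hedge loop the abstract describes can be sketched in a few lines. This is a minimal illustration, not the paper's algorithm: it assumes a single-parameter Bernoulli model (a member of the exponential family), plain online gradient steps in place of the paper's primal-dual updates, and 0/1 end-user feedback (1 = true anomaly) driving the threshold adjustment. The function name and step sizes are hypothetical.

```python
import math

def sequential_anomaly_detector(observations, feedback, eta_theta=0.1, eta_tau=0.1):
    """Sketch of sequential anomaly detection via filtering + hedging.

    Assumptions (illustrative, not from the paper): Bernoulli model with one
    natural parameter theta, online gradient updates, and binary feedback
    labels used to adapt the flagging threshold tau.
    """
    theta = 0.0   # natural parameter of the Bernoulli model
    tau = 0.5     # data-adaptive threshold on the belief
    flags = []
    for x, y in zip(observations, feedback):
        # Filtering: belief (likelihood) of x under the current model.
        p = 1.0 / (1.0 + math.exp(-theta))   # mean parameter (P[x = 1])
        belief = p if x == 1 else 1.0 - p
        # Hedging: flag an anomaly when the belief falls below the threshold.
        flag = belief < tau
        flags.append(flag)
        # Model update: gradient step on the Bernoulli log-loss.
        theta += eta_theta * (x - p)
        # Threshold update from feedback: raise tau after a missed anomaly,
        # lower it after a false alarm, then clip to [0, 1].
        if y == 1 and not flag:
            tau += eta_tau
        elif y == 0 and flag:
            tau -= eta_tau
        tau = min(max(tau, 0.0), 1.0)
    return flags
```

Fed a long run of identical observations followed by a surprise, the model's belief in the outlier drops below the threshold and it is flagged; the feedback loop then nudges the threshold to trade off misses against false alarms over time.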