19 research outputs found

On de-bunking “Fake News” in the post-truth era: how to reduce statistical error in research

The authors note with alarm that statistical noise caused by statistical incompetence is beginning to creep into research on cost overrun in public investment projects, contaminating the literature with work that does not meet basic standards of validity and reliability. The paper gives examples of such work and proposes three heuristics to root out the problem. First, researchers who are not statisticians, or who do not have a strong background in statistics, should abstain from doing statistical analysis themselves and instead rely on more experienced colleagues, preferably professional statisticians. Second, journal referees should clearly state their level of statistical proficiency to journal editors, so that editors can assemble the right referee team. Finally, journal editors should make sure that at least one referee is capable of reviewing the statistical and methodological aspects of a paper. The work under review would have benefited from observing these simple heuristics, as would any work based on statistical analysis.

    Modelling a Historic Oil-Tank Fire Allows an Estimation of the Sensitivity of the Infrared Receptors in Pyrophilous Melanophila Beetles

Pyrophilous jewel beetles of the genus Melanophila approach forest fires, and there is considerable evidence that these beetles can detect fires from distances of more than 60 km. Because Melanophila beetles are equipped with infrared receptors and are also attracted by hot surfaces, it can be concluded that these infrared receptors are used for fire detection.
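To give a feel for the orders of magnitude involved, the back-of-the-envelope sketch below estimates the infrared irradiance reaching a beetle at the detection distance quoted above. Only the >60 km distance comes from the abstract; the fire's radiated power and the hemispheric point-source assumption are illustrative assumptions, not values from the paper.

```python
import math

# Illustrative estimate of the IR irradiance at a beetle's receptors.
# Only the >60 km detection distance comes from the abstract; the fire's
# radiated power is an assumed, order-of-magnitude value.
P_fire = 2.0e9      # radiated power of a large oil-tank fire in W (assumption)
distance = 60.0e3   # detection distance in m (from the abstract)

# Point source radiating uniformly into a hemisphere; atmospheric
# absorption is neglected, so this is an upper bound on the irradiance.
irradiance = P_fire / (2.0 * math.pi * distance**2)
print(f"Irradiance at {distance / 1e3:.0f} km: {irradiance:.1e} W/m^2")
```

Under these assumptions the irradiance is on the order of 10^-1 W/m^2 before atmospheric attenuation; a sensitivity bound of this kind is what the paper's far more careful fire modelling derives for the receptors.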

    Early Diagnosis of Vegetation Health From High-Resolution Hyperspectral and Thermal Imagery: Lessons Learned From Empirical Relationships and Radiative Transfer Modelling

[Purpose of Review] We provide a comprehensive review of the empirical and modelling approaches used to quantify the radiation–vegetation interactions related to vegetation temperature, leaf optical properties linked to pigment absorption and chlorophyll fluorescence emission, and of their capability to monitor vegetation health. Part 1 provides an overview of the main physiological indicators (PIs) applied in remote sensing to detect alterations in plant functioning linked to vegetation diseases and decline processes. Part 2 reviews the recent advances in the development of quantitative methods to assess PIs through hyperspectral and thermal images.

    [Recent Findings] In recent years, the availability of high-resolution hyperspectral and thermal images has increased owing to the extraordinary progress made in sensor technology, including the miniaturization of advanced cameras designed for unmanned aerial vehicle (UAV) systems and lightweight aircraft. This technological revolution has contributed to the wider use of hyperspectral imaging sensors by the scientific community and industry; it has led to better modelling and understanding of the sensitivity of different ranges of the electromagnetic spectrum for detecting the biophysical alterations used as early-warning indicators of vegetation health.

    [Summary] The review deals with the capability of PIs such as vegetation temperature, chlorophyll fluorescence, photosynthetic energy downregulation and photosynthetic pigments, detected through remote sensing, to monitor the early responses of plants to different stressors. Various methods for the detection of PI alterations have recently been proposed and validated to monitor vegetation health. The greatest challenges for the remote sensing community today are (i) the availability of high spatial, spectral and temporal resolution image data; (ii) the empirical validation of radiation–vegetation interactions; (iii) the upscaling of physiological alterations from the leaf to the canopy, mainly in complex heterogeneous vegetation landscapes; and (iv) the temporal dynamics of the PIs and the interactions between physiological changes.

    The authors received funding from the FluorFLIGHT (GGR801) Marie Curie Fellowship, the QUERCUSAT and ESPECTRAMED projects (Spanish Ministry of Economy and Competitiveness), the Academy of Finland (grants 266152, 317387) and the European Research Council Synergy grant ERC-2013-SyG-610028 IMBALANCE-P.
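As a concrete illustration of how such physiological indicators are computed from hyperspectral reflectance, the minimal sketch below derives two standard ones. The band positions (531/570 nm for PRI, 670/800 nm for NDVI) are conventional in the literature, while the wavelength grid, array layout and function names are assumptions for illustration.

```python
import numpy as np

def band(reflectance, wavelengths, target_nm):
    """Return the reflectance of the band closest to target_nm."""
    return reflectance[..., np.argmin(np.abs(wavelengths - target_nm))]

def pri(reflectance, wavelengths):
    """Photochemical Reflectance Index: sensitive to xanthophyll-cycle
    downregulation of photosynthesis (531 nm vs. 570 nm reference band)."""
    r531 = band(reflectance, wavelengths, 531)
    r570 = band(reflectance, wavelengths, 570)
    return (r531 - r570) / (r531 + r570)

def ndvi(reflectance, wavelengths):
    """Normalized Difference Vegetation Index from red/NIR reflectance."""
    red = band(reflectance, wavelengths, 670)
    nir = band(reflectance, wavelengths, 800)
    return (nir - red) / (nir + red)

# Toy example: one pixel sampled on a 400-1000 nm grid (assumed layout).
wl = np.arange(400, 1001, 1.0)
pixel = np.random.default_rng(0).uniform(0.05, 0.5, size=wl.size)
print(pri(pixel, wl), ndvi(pixel, wl))
```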

    Regression to the tail: Why the Olympics blow up

The Olympic Games are the largest, highest-profile, and most expensive megaevent hosted by cities and nations. Average sports-related costs of hosting are $12.0 billion. Non-sports-related costs are typically several times that. Every Olympics since 1960 has run over budget, at an average of 172 percent in real terms, the highest overrun on record for any type of megaproject. The paper tests theoretical statistical distributions against empirical data for the costs of the Games in order to explain the cost risks faced by host cities and nations. It is documented, for the first time, that cost and cost overrun for the Games follow a power-law distribution. Olympic costs are subject to infinite mean and variance, with dire consequences for predictability and planning. We name this phenomenon "regression to the tail": it is only a matter of time until a new extreme event occurs, with an overrun larger than the largest so far, and thus more disruptive and less plannable. The generative mechanism for the Olympic power law is identified as strong convexity prompted by six causal drivers: irreversibility, fixed deadlines, the Blank Check Syndrome, tight coupling, long planning horizons, and an Eternal Beginner Syndrome. The power law explains why the Games are so difficult to plan and manage successfully, and why cities and nations should think twice before bidding to host. Based on the power law, two heuristics are identified for better decision making on hosting. Finally, the paper develops measures for good practice in planning and managing the Games, including how to mitigate the extreme risks of the Olympic power law.
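The statistical claim at the heart of the paper, that a tail exponent at or below one implies an infinite mean and hence "regression to the tail", is easy to illustrate by simulation. The sketch below uses an illustrative tail exponent and sample sizes, not the paper's estimates from Olympic data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Pareto distribution with tail exponent alpha <= 1: the theoretical mean
# is infinite (alpha here is an illustrative choice, not the paper's estimate).
alpha, n = 0.9, 100_000
overruns = rng.uniform(size=n) ** (-1.0 / alpha)  # inverse-transform sampling

running_mean = np.cumsum(overruns) / np.arange(1, n + 1)
running_max = np.maximum.accumulate(overruns)

# The sample mean drifts upward instead of converging, and new records
# ("larger than the largest so far") keep appearing as n grows.
for k in (100, 1_000, 10_000, 100_000):
    print(f"n={k:>7,}: mean={running_mean[k - 1]:12.1f}  max={running_max[k - 1]:14.1f}")
```

With a finite mean (alpha > 1) the running mean would settle; at or below one, averages and any forecasts built on them never stabilise, which is exactly the planning problem the paper describes.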

    Robust next release problem


    Improving the shutter-less compensation method for TEC-less microbolometer-based infrared cameras

Shutter-less infrared cameras based on microbolometer focal plane arrays (FPAs) are the most widely used cameras in thermography, in particular in handheld devices and small distributed sensors. To achieve acceptable measurement uncertainty, the disturbing influences of changing ambient thermal conditions have to be compensated on the basis of temperature measurements taken inside the camera. We propose a compensation approach based on calibration measurements in which changing external conditions are simulated and all correction parameters are determined. This makes it possible to process the raw infrared data while taking all disturbing influences into account. The effects on pixel responsivity and offset voltage are treated separately; the responsivity correction requires two different, alternating radiation sources. This paper presents the details of the compensation procedure and discusses the aspects relevant to achieving low temperature-measurement uncertainty.
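The abstract describes separate corrections for pixel offset and responsivity as functions of the camera-internal temperature. The sketch below shows one plausible shape for such a correction, with per-pixel polynomial models whose coefficients would come from the calibration measurements; the function name, polynomial form and array layout are assumptions, not the authors' implementation.

```python
import numpy as np

def correct_frame(raw, t_internal, offset_coeffs, resp_coeffs):
    """Shutter-less correction sketch for a microbolometer FPA.

    raw           : (H, W) raw pixel signals
    t_internal    : camera-internal temperature reading (scalar)
    offset_coeffs : (D+1, H, W) per-pixel offset polynomial coefficients
    resp_coeffs   : (D+1, H, W) per-pixel responsivity polynomial coefficients
    """
    # Evaluate the per-pixel polynomials at the current internal temperature.
    powers = t_internal ** np.arange(offset_coeffs.shape[0])
    offset = np.tensordot(powers, offset_coeffs, axes=1)        # (H, W)
    responsivity = np.tensordot(powers, resp_coeffs, axes=1)    # (H, W)
    # Remove the ambient-dependent offset, then normalise by responsivity
    # to obtain a signal proportional to the scene radiance.
    return (raw - offset) / responsivity
```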

    The empirical reality of IT project cost overruns: discovering a power-law distribution

If managers assume a normal or near-normal distribution of Information Technology (IT) project cost overruns, as is common, and cost overruns can be shown to follow a power-law distribution, managers may be unwittingly exposing their organizations to extreme risk by severely underestimating the probability of large cost overruns. In this research, we collect and analyze a large sample of 5,392 IT projects to empirically examine the probability distribution of IT project cost overruns. Further, we propose and examine a mechanism that can explain such a distribution. Our results reveal that IT projects are far riskier in terms of cost than normally assumed by decision makers and scholars. Specifically, we found that IT project cost overruns follow a power-law distribution in which there are a large number of projects with relatively small overruns and a fat tail that includes a smaller number of projects with extreme overruns. A possible generative mechanism for the identified power-law distribution is found in interdependencies among technological components in IT systems. We propose and demonstrate, through computer simulation, that a problem in a single technological component can lead to chain reactions in which other interdependent components are affected, causing substantial overruns. What the power law tells us is that extreme IT project cost overruns will occur and that their prevalence will be grossly underestimated if managers assume that overruns follow a normal or near-normal distribution. This underscores the importance of realistically assessing and mitigating the cost risk of new IT projects up front.
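The proposed generative mechanism, chain reactions through interdependent components, can be sketched as a propagation process on a random dependency graph. The graph model, propagation probability and cost scaling below are illustrative assumptions, not the authors' simulation.

```python
import random
from collections import deque

def simulate_overrun(n_components=200, p_dep=0.02, p_propagate=0.25, rng=None):
    """One project: a fault in a random component cascades along dependency
    links; the overrun is taken as proportional to the components affected."""
    rng = rng or random.Random()
    # Random dependency graph: edge i -> j means a fault in i can affect j.
    dependents = {i: [j for j in range(n_components)
                      if j != i and rng.random() < p_dep]
                  for i in range(n_components)}
    seed = rng.randrange(n_components)
    affected, queue = {seed}, deque([seed])
    while queue:
        for nxt in dependents[queue.popleft()]:
            if nxt not in affected and rng.random() < p_propagate:
                affected.add(nxt)
                queue.append(nxt)
    return len(affected)  # overrun in units of one component's cost

rng = random.Random(0)
sizes = [simulate_overrun(rng=rng) for _ in range(2_000)]
# Near-critical propagation (mean out-degree * p_propagate close to 1) yields
# many small cascades and a fat tail of very large ones.
print(f"median={sorted(sizes)[len(sizes) // 2]}, max={max(sizes)}")
```

The design choice that matters here is the branching factor: when the expected number of newly affected components per fault is near one, cascade sizes become heavy-tailed, mirroring the many-small-overruns-plus-extreme-tail pattern the study reports.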

    Road work ahead: the emerging revolution in the road construction industry

Road networks, on which governments around the world spend significant shares of their civil engineering budgets, are rightly considered the lifeline of modern and successful economies. While those economies have been transformed by numerous innovations and (especially digital) disruptions, the processes and materials used in building roads, as well as roads' key parameters and functionalities, have remained remarkably unchanged over the past decades. However, this seemingly natural continuity should not lead to the assumption that there will be no major changes in the road construction industry. It is already becoming apparent that four megatrends – autonomous driving, automated production, digitization, and advances in road construction materials – together with a new process flow for road construction are bound not only to make the roads of the future look significantly different from those of today, but also to make road construction much faster and cheaper. Our aspiration in publishing this white paper is to provide objective insights into the various aspects of the emerging revolution in the road construction industry, its implications, and the pressing question of how to prepare for the shake-up of the industry landscape. The ideas and information in this article are the result of many months of work by numerous experts from McKinsey & Company and Oxford Global Projects. The paper offers the latest and most relevant know-how on the status of road construction in Europe, current challenges, and an assessment of the potential of novel technologies and processes.