
    Essays on Panel Data Prediction Models

    Forward-looking analysis is valuable for policymakers, who need effective strategies to mitigate imminent risks and potential challenges. Panel data sets contain time series information on a number of cross-sectional units and are known to have superior predictive abilities compared to time-series-only models. This PhD thesis develops novel panel data methods to advance the short-term forecasting and nowcasting of macroeconomic and environmental variables. The two most important highlights of this thesis are the use of cross-sectional dependence in panel data forecasting and the delivery of timely predictions and ‘nowcasts’.

    Although panel data models have been found to provide better predictions in many empirical scenarios, forecasting applications so far have not included cross-sectional dependence. On the other hand, cross-sectional dependence is well recognised in large panels and has been explicitly modelled in previous causal studies. A substantial portion of this thesis is devoted to developing cross-sectional dependence in panel models suited to diverse empirical scenarios. The second important aspect of this work is to integrate the asynchronous release schedules of data within and across panel units into the panel models. Most of the thesis emphasises pseudo-real-time predictions, estimating each model only on the data that had been released at the time of prediction, thus replicating the realistic circumstances of delayed data releases.

    Linear, quantile and non-linear panel models are developed to predict a range of targets, varying both in meaning and method of measurement. Linear models include panel mixed-frequency vector autoregression and bridge equation set-ups, which predict GDP growth, inflation and CO2 emissions. Panel quantile regressions and latent variable discrete choice models predict growth-at-risk and extreme episodes of cross-border capital flows, respectively. The datasets include both international cross-country panels and regional subnational panels. Depending on the nature of the model and the prediction targets, different precision criteria evaluate the accuracy of the models in out-of-sample settings. The generated predictions beat their respective standard benchmarks while being available in a more timely fashion.
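
    A feel for the bridge-equation and pseudo-real-time ideas can be given in a few lines. The following Python sketch nowcasts quarterly GDP growth for a single panel unit from one monthly indicator, re-estimating the model at each nowcast date using only the data released by then; the release delays, the simulated series and the single-indicator set-up are illustrative assumptions, not the thesis's actual specification.

```python
# Pseudo-real-time bridge-equation nowcast: a minimal, hypothetical sketch.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Monthly indicator for one unit, assumed published with a one-month delay.
months = pd.period_range("2010-01", "2019-12", freq="M")
indicator = pd.Series(rng.normal(size=len(months)), index=months)

# Quarterly GDP growth (the target), assumed released with a one-quarter delay.
x_q_full = indicator.groupby(indicator.index.asfreq("Q")).mean()
gdp = pd.Series(0.5 * x_q_full.values + rng.normal(scale=0.2, size=len(x_q_full)),
                index=x_q_full.index)

def bridge_nowcast(now: pd.Period) -> float:
    """Nowcast current-quarter GDP growth using only data released by `now`."""
    released = indicator[indicator.index <= now - 1]           # one-month delay
    x_q = released.groupby(released.index.asfreq("Q")).mean()  # bridge: M -> Q
    target_q = now.asfreq("Q")
    y = gdp[gdp.index <= target_q - 1]                         # one-quarter delay
    train = pd.concat([y, x_q], axis=1, keys=["y", "x"]).dropna()
    fit = sm.OLS(train["y"], sm.add_constant(train["x"])).fit()
    return float(fit.params["const"] + fit.params["x"] * x_q[target_q])

# Mid-November 2019: only data released by then feeds the 2019Q4 nowcast.
print(bridge_nowcast(pd.Period("2019-11", freq="M")))
```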

    Strategy Tripod Perspective on the Determinants of Airline Efficiency in a Global Context: An Application of DEA and Tobit Analysis

    The airline industry is vital to contemporary civilization since it is a key player in the globalization process: linking regions, fostering global commerce, promoting tourism and aiding economic and social progress. However, there has been little study of the link between the operational environment and airline efficiency. Investigating the amalgamation of institutions, organisations and strategic decisions is critical to understanding how airlines operate efficiently. This research employs the strategy tripod perspective to investigate the efficiency of a global airline sample using a non-parametric linear programming method (data envelopment analysis [DEA]). The bootstrapped DEA efficiency change scores are then regressed, using a Tobit model, to determine the drivers of efficiency. The strategy tripod is employed to assess the impact of institutions, industry and resources on airline efficiency. Institutions are measured by global indices of destination attractiveness; industry by competition, jet fuel and business model; and resources by the number of full-time employees, alliances, ownership and connectivity. The first part of the study uses panel data from 35 major airlines, collected from their annual reports for the period 2011 to 2018, together with country attractiveness indices from global indicators. The second part of the research involves a qualitative data collection approach and semi-structured interviews with experts in the field to evaluate the impact of COVID-19 on the first part’s significant findings. The main findings reveal that airlines operate at a highly competitive level regardless of their competition intensity or origin. Furthermore, the unpredictability of the environment complicates airline operations. The efficiency drivers of an airline are partially determined by its type of business model, its degree of cooperation and how fuel cost is managed. Trade openness has a negative influence on airline efficiency. COVID-19 has upended the airline industry, forcing airlines to reconsider their business models and continuously increase cooperation. Human resources, sustainability and alternative fuel sources are critical to airline survival. Finally, this study provides some evidence for the practicality of the strategy tripod and hints at the need for a broader approach in the study of international strategies.
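
    The two-stage logic (DEA efficiency scores, then a censored regression on candidate drivers) can be sketched compactly. Below is a minimal, hypothetical Python illustration: a standard input-oriented CCR DEA solved by linear programming, followed by a Tobit regression fitted by direct maximum likelihood. The toy data, the single driver variable and the censoring point at 1 are assumptions for illustration; the study itself uses bootstrapped efficiency change scores, which this sketch omits.

```python
# Two-stage DEA/Tobit sketch with invented data (not the study's specification).
import numpy as np
from scipy.optimize import linprog, minimize
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 20                                   # decision-making units (airlines)
X = rng.uniform(1, 10, size=(n, 2))      # inputs, e.g. employees, fuel cost
Y = rng.uniform(1, 10, size=(n, 1))      # outputs, e.g. revenue passenger-km

def dea_ccr_input(X, Y, o):
    """Input-oriented CCR efficiency of DMU o (envelopment form)."""
    c = np.r_[1.0, np.zeros(n)]                      # minimise theta
    A_in = np.c_[-X[o], X.T]                         # X'lambda <= theta * x_o
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]   # Y'lambda >= y_o
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]

scores = np.array([dea_ccr_input(X, Y, o) for o in range(n)])

# Second stage: Tobit (censored at 1) of efficiency on one driver, by MLE.
z = rng.normal(size=n)                   # hypothetical driver, e.g. openness
def neg_ll(p):
    b0, b1, s = p[0], p[1], np.exp(p[2])             # log-sigma keeps s > 0
    mu = b0 + b1 * z
    cens = scores >= 1 - 1e-9                        # efficient units pile at 1
    ll = norm.logpdf((scores[~cens] - mu[~cens]) / s).sum() - (~cens).sum() * np.log(s)
    ll += norm.logsf((1 - mu[cens]) / s).sum()       # censored contribution
    return -ll

fit = minimize(neg_ll, x0=[0.5, 0.0, np.log(0.2)])
print("driver coefficient:", fit.x[1])
```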

    Advances and Applications of DSmT for Information Fusion. Collected Works, Volume 5

    This fifth volume on Advances and Applications of DSmT for Information Fusion collects theoretical and applied contributions of researchers working in different fields of application and in mathematics, and is available in open access. The collected contributions of this volume have either been published or presented in international conferences, seminars, workshops and journals after the dissemination of the fourth volume in 2015, or they are new. The contributions in each part of this volume are chronologically ordered. The first part of this book presents some theoretical advances on DSmT, dealing mainly with modified Proportional Conflict Redistribution (PCR) rules of combination with degree of intersection, coarsening techniques, interval calculus for PCR thanks to set inversion via interval analysis (SIVIA), rough set classifiers, canonical decomposition of dichotomous belief functions, fast PCR fusion, fast inter-criteria analysis with PCR, and improved PCR5 and PCR6 rules preserving the (quasi-)neutrality of (quasi-)vacuous belief assignments in the fusion of sources of evidence, with their Matlab codes. Because more applications of DSmT have emerged in the years since the appearance of the fourth book in 2015, the second part of this volume covers selected applications of DSmT, mainly in building change detection, object recognition, quality of data association in tracking, perception in robotics, risk assessment for torrent protection and multi-criteria decision-making, multi-modal image fusion, coarsening techniques, recommender systems, levee characterization and assessment, human heading perception, trust assessment, robotics, biometrics, failure detection, GPS systems, inter-criteria analysis, group decision-making, human activity recognition, storm prediction, data association for autonomous vehicles, identification of maritime vessels, fusion of support vector machines (SVM), the Silx-Furtif RUST code library for information fusion including PCR rules, and networks for ship classification. Finally, the third part presents interesting contributions related to belief functions in general, published or presented over the years since 2015. These contributions concern decision-making under uncertainty, belief approximations, probability transformations, new distances between belief functions, non-classical multi-criteria decision-making problems with belief functions, generalization of Bayes' theorem, image processing, data association, entropy and cross-entropy measures, fuzzy evidence numbers, the negator of a belief mass, human activity recognition, information fusion for breast cancer therapy, imbalanced data classification, and hybrid techniques mixing deep learning with belief functions.
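
    Since the PCR5 rule recurs throughout the volume, a compact sketch may help fix ideas: under PCR5, each partial conflict m1(X)·m2(Y) with X∩Y = ∅ is redistributed back to X and Y in proportion to the masses that generated it. The frame and mass values below are invented for illustration, and only the two-source case is shown.

```python
# PCR5 combination of two belief mass functions: a minimal sketch.
from itertools import product

def pcr5(m1: dict, m2: dict) -> dict:
    """Combine two mass functions whose focal elements are frozensets."""
    out = {}
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:                                # conjunctive consensus part
            out[inter] = out.get(inter, 0.0) + wa * wb
        else:                                    # redistribute partial conflict
            out[a] = out.get(a, 0.0) + wa ** 2 * wb / (wa + wb)
            out[b] = out.get(b, 0.0) + wb ** 2 * wa / (wa + wb)
    return out

A, B = frozenset("A"), frozenset("B")
combined = pcr5({A: 0.6, B: 0.4}, {A: 0.2, B: 0.8})
print({"".join(sorted(k)): round(v, 4) for k, v in combined.items()})
# Masses sum to 1 with no normalisation step, unlike Dempster's rule.
```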

    On factor models for high-dimensional time series

    The aim of this thesis is to develop statistical methods for use with factor models for high-dimensional time series. We consider three broad areas: estimation, changepoint detection, and determination of the number of factors. In Chapter 1, we sketch the backdrop for our thesis and review key aspects of the literature. In Chapter 2, we develop a method to estimate the factors and parameters in an approximate dynamic factor model. Specifically, we present a spectral expectation-maximisation (or “spectral EM”) algorithm, whereby we derive the E and M step equations in the frequency domain. Our E step relies on the Wiener-Kolmogorov smoother, the frequency domain counterpart of the Kalman smoother, and our M step is based on maximisation of the Whittle likelihood with respect to the parameters of the model. We initialise our procedure using dynamic principal components analysis (or “dynamic PCA”), and by leveraging results on lag-window estimators of spectral density by Wu and Zaffaroni (2018), we establish consistency-with-rates of our spectral EM estimator of the parameters and factors as both the dimension (N) and the sample size (T) go to infinity. We find rates commensurate with the literature. Finally, we conduct a simulation study to numerically validate our theoretical results. In Chapter 3, we develop a sequential procedure to detect changepoints in an approximate static factor model. Specifically, we define a ratio of eigenvalues of the covariance matrix of the N observed variables. We compute this ratio each period using a rolling window of size m over time, and declare a changepoint when its value breaches an alarm threshold. We investigate the asymptotic behaviour (as N, m → ∞) of our ratio, and prove that, for specific eigenvalues, the ratio will spike upwards when a changepoint is encountered but not otherwise. We use a block bootstrap to obtain alarm thresholds. We present simulation results and an empirical application based on Financial Times Stock Exchange 100 Index (or “FTSE 100”) data. In Chapter 4, we conduct an exploratory analysis which aims to extend the randomised sequential procedure of Trapani (2018) into the frequency domain. Specifically, we aim to estimate the number of dynamically loaded factors by applying the test of Trapani (2018) to the eigenvalues of the estimated spectral density matrix (as opposed to the covariance matrix) of the data.
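
    The Chapter 3 monitoring scheme lends itself to a short illustration: slide a window over the data, compute an eigenvalue ratio of the windowed covariance matrix, and raise an alarm when the ratio breaches a threshold. The simulation, window size, monitored eigenvalue pair and fixed threshold below are illustrative assumptions; in the thesis the thresholds come from a block bootstrap.

```python
# Rolling eigenvalue-ratio changepoint monitor: a minimal, hypothetical sketch.
import numpy as np

rng = np.random.default_rng(2)
N, T, m, k = 50, 400, 60, 2                  # series, sample, window, factors

# Static factor model whose loadings change at t = 250.
F = rng.normal(size=(T, k))
L1, L2 = rng.normal(size=(N, k)), rng.normal(size=(N, k))
X = np.vstack([F[:250] @ L1.T, F[250:] @ L2.T]) + 0.5 * rng.normal(size=(T, N))

def eig_ratio(window: np.ndarray, j: int) -> float:
    """Ratio of the (j+1)-th to (j+2)-th largest covariance eigenvalue."""
    ev = np.linalg.eigvalsh(np.cov(window.T))[::-1]   # descending order
    return ev[j] / ev[j + 1]

# With k factors before and after the break, a window straddling the break
# behaves as if it had 2k factors, so the ratio at position 2k spikes upward.
ratios = np.array([eig_ratio(X[t - m:t], 2 * k - 1) for t in range(m, T)])
threshold = 3.0 * np.median(ratios[:100])    # crude stand-in for the bootstrap
alarms = np.nonzero(ratios > threshold)[0] + m
print("first alarm at t =", int(alarms[0]) if alarms.size else None)
```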

    Measuring the impact of COVID-19 on hospital care pathways

    Care pathways in hospitals around the world reported significant disruption during the recent COVID-19 pandemic, but measuring the actual impact is more problematic. Process mining can be useful for hospital management to measure the conformance of real-life care to what might be considered normal operations. In this study, we aim to demonstrate that process mining can be used to investigate process changes associated with complex disruptive events. We studied perturbations to accident and emergency (A&E) and maternity pathways in a UK public hospital during the COVID-19 pandemic. Coincidentally, the hospital had implemented a Command Centre approach for patient-flow management, affording an opportunity to study both the planned improvement and the disruption due to the pandemic. Our study proposes and demonstrates a method for measuring and investigating the impact of such planned and unplanned disruptions affecting hospital care pathways. We found that during the pandemic, both A&E and maternity pathways had measurable reductions in the mean length of stay and a measurable drop in the percentage of pathways conforming to normative models. There were no distinctive patterns in the monthly mean values of length of stay or conformance throughout the phases of the installation of the hospital’s new Command Centre approach. Due to a deficit in the available A&E data, the findings for A&E pathways could not be interpreted.
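
    At its simplest, the conformance measure used here is the share of observed pathways whose activity sequence fits a normative model. The following sketch, with an invented normative A&E pathway and invented traces, illustrates the calculation; the actual study uses process mining tooling and richer normative models.

```python
# Share of traces conforming to a normative pathway: a minimal sketch.
import re

# Hypothetical normative model: arrive, triage, optional treatment, discharge.
NORMATIVE = re.compile(r"^arrive,triage(,treat)?,discharge$")

traces = [
    ["arrive", "triage", "treat", "discharge"],
    ["arrive", "triage", "discharge"],
    ["arrive", "treat", "discharge"],        # skips triage: non-conforming
]

conforming = sum(bool(NORMATIVE.match(",".join(t))) for t in traces)
print(f"conformance: {conforming / len(traces):.0%}")    # -> 67%
```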

    Introduction to Competition Economics

    The book is intended as a reference on competition economics for economists, consultants and practitioners. It is a modern review of demand and supply estimation, market structure, merger analysis, damage estimation, welfare loss, abuse of dominance and network effects, together with a math and statistics review. Faced with potential multibillion fines and thousands of damage claims, firms are hiring experts and paying high fees to comply, defend or claim in antitrust cases. Complex economic and statistical issues appear in most cases, and all the parties involved are expected to have a good knowledge of them. This book aims to meet practitioners' demand for a compact, introductory-level book on this field.

    Heterogeneity in flood risk valuation and estimation from county to continental scales

    Flood risk management in the U.S. has contributed to overdevelopment in at-risk areas, increases in flood losses over time, significant deficits in federal emergency programs, and inequitable outcomes for households and communities. Addressing these issues in a cost-effective and socially equitable manner relies on the ability of policy analysts to identify and understand the complex interactions that characterize coupled natural-human systems, and on tools for accurate estimates of the risks that arise from these interactions. This dissertation addresses this need by developing and investigating a flood risk analysis system that integrates data on property locations, assessments and transactions, high-resolution flood hazard models, and flood risk policy and impacts across the coterminous United States. We focus on the degree to which markets accurately value their exposure to flooding and its impacts, and on the accuracy of procedures and tools to estimate flood losses. In the first chapter, we identify heterogeneous valuation of storm risk in the Florida Keys that depends on the presence of structural defenses and proximity to damaged homes after Hurricane Irma. This result suggests that stranded assets (properties with increasing vulnerability to storms but unable to rebuild structures and recover wealth) and overvalued assets at risk (which raise disaster costs) can occur simultaneously. This runs counter to the common framing of competing drivers of observed market valuation. In the second, we show that conventional methods employed in flood loss assessments to achieve large spatial scales introduce large aggregation bias by sacrificing spatial resolution in inputs. This investigation adds important context to published risk assessments and highlights opportunities to improve uncertainty quantification in flood loss estimation, which can support more cost-effective and equitable management. In the final chapter, we conduct a nationwide study to contrast the predictive accuracy of the predominantly used U.S. agency flood damage prediction models and empirical alternatives, using data on 846,000 observed flood losses to single-family homes from 446 flood events. We find that U.S. agency models mischaracterize the relationships of losses at low and high inundation depths, for high-valued structures, and for structures with basements. The evaluated alternatives improve mean accuracy on these dimensions. Extrapolated to 72.4 million single-family homes in the U.S., these differences translate into markedly different predictions of U.S.-wide flood damages to single-family homes. The results from this dissertation provide an improved empirical foundation for flood risk management that relies on the valuation and estimation of flood risk from county to continental scales.
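
    The agency models contrasted in the final chapter are, at their core, depth-damage functions: curves mapping inundation depth to a damaged fraction of structure value. The sketch below shows the mechanics with an invented curve; the points do not reproduce any agency's published table.

```python
# Applying a depth-damage curve to predict a flood loss: a minimal sketch.
import numpy as np

depths = np.array([0.0, 0.5, 1.0, 2.0, 4.0])           # inundation depth (m)
damage_frac = np.array([0.0, 0.15, 0.30, 0.55, 0.85])  # share of value lost

def predicted_loss(depth_m: float, structure_value: float) -> float:
    """Linearly interpolate the curve and scale by structure value."""
    return float(np.interp(depth_m, depths, damage_frac)) * structure_value

print(predicted_loss(1.5, 250_000))   # 0.425 * 250000 = 106250.0
```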

    Water scarcity and user behavior: Economics of Cooperation under extraction caps

    Fresh water is a scarce and depletable resource that has been studied by analyzing declines in groundwater tables in various regions throughout the world. The effects of climate change on water resources are pressing all types of water users to implement adaptation measures. So far, the management of groundwater has been studied mainly from the supply-side and engineering perspectives. This is necessary, but not sufficient to solve the problem of overexploitation of the groundwater resource; there has been less research on the demand side of the problem, on how to induce cooperation among users to conserve water resources. Water scarcity in a location results when the extraction rates of users exceed the available water stock and the recharge capacity of the aquifer. Adaptation to water scarcity therefore depends on how water users adjust their extraction, over time, to the recharge capacity of the aquifer. This requires water users to have knowledge of the extraction volumes of all users of the aquifer and of its recharge capacity. Based on this information, water users might consider the connection between the inflow, outflow and stock determinants of the water balance as a key concept for the sustainability of groundwater resource management. This research focused on the demand side of water scarcity in three Colombian municipalities, Corozal (Sucre), Guamal (Magdalena) and Riohacha (La Guajira), with the objective of better understanding the nature of cooperation among water users. It analyzed drivers of cooperation, behavior and institutional mechanisms using the complementary lenses of common pool resource theory, behavioral economics and institutional economics. The general research question structuring this research was: 1. How does information on water scarcity affect the extraction behavior of water users, and how can current information provision strategies be improved? The subquestions were: 2. What are the main drivers and inhibitors of cooperation among water users in water management systems in dry regions? 3. How do social rules coexist with legal rules in the overexploitation of aquifers in dry regions? 4. How do egoistic behavior and free riding by neighboring users affect collective action in the adaptation to climate variability? The research strategy to collect empirical data involved field experiments, a review of historical documents on institutional developments in water management in Sucre and La Guajira, and interviews with water users. Experimental sessions were designed to understand the decision-making processes of farmers by providing them with information on competing extraction sources and on well capacity. The effect of information on decision-making was measured as part of the experiments. Two experimental groups were organized, one for each type of information: (i) information on water extraction quantities was provided to all participants and free communication was allowed; and (ii) information was provided on the time remaining before aquifer exhaustion. In the two control groups, communication among participants was limited, which also allowed testing the effects of the possibility to design agreed-upon extraction decisions. The field experiments were implemented as games in which players were asked to allocate water caps under diverse depletion scenarios, including suggestions to extract a balanced volume of water or to take into account the time remaining for sustainable aquifer management. Participants were asked to allocate water resources for their current and future use, for themselves and their neighbors. Collaborative behavior was tested by measuring compliance with the suggested water extraction caps. In total, 62 farmers representing 10 communities took part in 668 rounds of the field experiments, yielding 2,670 observations used for the empirical data analysis. The qualitative analyses included 40 semi-structured interviews with selected participants. Together, the quantitative analyses of the data from the field experiments and the qualitative data from the semi-structured interviews provided the evidence for answering the research questions.
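
    The water-balance logic at the heart of the games is simple enough to simulate: each round the stock changes by recharge minus total extraction, and free riding above the suggested cap hastens exhaustion. All parameters below (cap, recharge, compliance rate) are invented for illustration and are not the experiment's design values.

```python
# Aquifer stock under capped extraction with free riders: a minimal sketch.
import numpy as np

rng = np.random.default_rng(3)
stock, recharge, cap, users, rounds = 100.0, 8.0, 2.0, 5, 30

for t in range(rounds):
    # Assume 70% of users comply with the cap; the rest extract double.
    extraction = np.where(rng.random(users) < 0.7, cap, 2 * cap)
    stock = max(stock + recharge - extraction.sum(), 0.0)
    if stock == 0.0:
        print(f"aquifer exhausted in round {t + 1}")
        break
else:
    print(f"stock after {rounds} rounds: {stock:.1f}")
```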