188 research outputs found

    Fortune Favors the Bold: Evidence from an Emerging Market Bank Merger

    This study analyses one of the largest and most controversial mergers in the Philippine banking sector and links it to M&A theories. First, it surveys the prior M&A literature to identify merger motivations, synergies and factors affecting merger outcomes. Second, it conducts a case study that connects this literature to a merger in an emerging economy. The merger provides an ideal setting for a case study, for links to M&A theories and for generalizable lessons for future bank mergers in emerging markets. The study also identifies the key factors and steps taken by the acquiring bank's management to achieve success, such as doubling net income and assets and becoming the number one bank in the Philippines post-merger

    Ultrasound Nerve Segmentation Using Deep Probabilistic Programming

    Deep probabilistic programming combines the strengths of deep learning with probabilistic modeling for efficient and flexible computation in practice. Because the field is still evolving, only a few expressive programming languages exist for uncertainty management. This paper discusses an application to the analysis of ultrasound nerve segmentation in biomedical images. Our method uses the probabilistic programming language Edward with the U-Net model and generative adversarial networks under different optimizers. The segmentation process showed the lowest Dice loss (0.54) and the highest accuracy (0.99) with the Adam optimizer in the U-Net model, with the least time consumption compared to other optimizers. The smallest generative network loss obtained in the generative adversarial network model was 0.69, again with the Adam optimizer. The Dice loss, accuracy, time consumption and output image quality of the results demonstrate the applicability of deep probabilistic programming in the long run. Thus, we further propose a neuroscience decision support system based on the proposed approach
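
    To make the training objective concrete, the following is a minimal sketch of a soft Dice loss minimised with the Adam optimizer over a U-Net-style segmentation model. It is written in plain PyTorch rather than the Edward probabilistic programming language used in the paper, and the UNet class, the data tensors and the learning rate are assumptions for illustration only.

    import torch

    def soft_dice_loss(pred_logits, target, eps=1e-6):
        # Soft Dice loss: 1 - Dice coefficient, computed from predicted probabilities.
        probs = torch.sigmoid(pred_logits)
        intersection = (probs * target).sum()
        union = probs.sum() + target.sum()
        return 1.0 - (2.0 * intersection + eps) / (union + eps)

    def train_step(model, optimizer, images, masks):
        # One optimisation step: forward pass, Dice loss, backward pass, parameter update.
        optimizer.zero_grad()
        logits = model(images)
        loss = soft_dice_loss(logits, masks)
        loss.backward()
        optimizer.step()
        return loss.item()

    # Usage, assuming a UNet module is defined elsewhere:
    # model = UNet(in_channels=1, out_channels=1)
    # optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    # loss = train_step(model, optimizer, images, masks)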

    Big Data Analytics using Small Datasets: Machine Learning for Early Breast Cancer Detection

    In the US, breast cancer has the highest death rate apart from lung cancer. As of 2019, on average, 1 in 8 US women (approx. 12%) will develop invasive breast cancer at some point during her life. These statistics highlight the importance of early detection for reducing patient mortality. In recent years, machine learning (ML) techniques have begun to play a key role in healthcare, especially as diagnostic aids. In the case of breast cancer, ML techniques can be used to distinguish between malignant and benign tumours, enabling early detection. Moreover, accurate classification can help physicians guide patients and prescribe relevant treatment. Given this background, the objective of this paper is to apply ML algorithms to classify breast cancer outcomes. In this study, we build a platform using Ridge, AdaBoost, Gradient Boost, Random Forest, Principal Component Analysis (PCA) plus Ridge, and neural network ML algorithms for early breast cancer outcome detection. As a traditional benchmark technique, we use a logistic regression model to compare against our chosen ML algorithms. We utilise the Wisconsin Breast Cancer Database (WBCD) dataset (Dua and Graff 2019). Although ML is generally deployed with large datasets, we highlight its usefulness and feasibility for small datasets in this study of only 30 features. We contribute to the literature by providing a platform that enables (a) big data analytics using small datasets and (b) high-accuracy breast cancer outcome classification. Specifically, we identify the most important features in breast cancer outcome classification from a wide range of ML algorithms with a small dataset. This would enable health practitioners and patients to focus on these key features in their decision making for future breast cancer tests and subsequent early detection, thus reducing analysis and decision latencies. In our ML-based breast cancer classification platform, the user makes three function calls: a data pre-processor, a model generator and a single test. The pre-processor cleans the raw dataset from the user by removing 'NaN' and empty values, and it follows further instructions from a configuration file. After pre-processing, the platform trains ML models via the model generator based on two inputs, a cleaned dataset and a configuration file. The model generator creates different models from the ML algorithms specified in the study and generates corresponding evaluations. The user can then call the single test to use the generated models for making predictions
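
    As a hedged illustration of the three-call workflow described above (pre-processor, model generator, single test), the sketch below uses scikit-learn and its copy of the 30-feature WBCD data. The function names and signatures are illustrative rather than the authors' actual API, and the PCA-plus-Ridge and neural network models are omitted for brevity.

    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier, RandomForestClassifier
    from sklearn.linear_model import LogisticRegression, RidgeClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    def pre_process(df):
        # Clean the raw dataset: treat empty strings as missing and drop incomplete rows.
        return df.replace("", np.nan).dropna()

    def generate_models(X_train, y_train):
        # Fit each candidate algorithm and return the trained models keyed by name.
        models = {
            "ridge": RidgeClassifier(),
            "adaboost": AdaBoostClassifier(),
            "gradient_boost": GradientBoostingClassifier(),
            "random_forest": RandomForestClassifier(),
            "logistic_regression": LogisticRegression(max_iter=5000),  # traditional benchmark
        }
        return {name: model.fit(X_train, y_train) for name, model in models.items()}

    def single_test(models, X_test, y_test):
        # Evaluate every generated model on a held-out test set.
        return {name: accuracy_score(y_test, model.predict(X_test)) for name, model in models.items()}

    data = load_breast_cancer(as_frame=True)
    df = pre_process(data.frame)
    X, y = df.drop(columns="target"), df["target"]
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
    print(single_test(generate_models(X_train, y_train), X_test, y_test))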

    The impacts of commercial lease structures on landlord and tenant leasing behaviours and experiences

    The commercial property market in New Zealand is characterised by two standard but distinct lease environments. In Auckland, the commercial core of the economy, net leases dominate, whereas in Wellington, the political capital, gross leases are dominant. These different lease environments have the potential to strongly influence the nature of landlord and tenant relationships in these markets. Using in-depth interviews with key industry personnel, this study examines the perceptions, behaviours, experiences and key issues confronting landlords and tenants under net and gross leases. It considers how different lease structures affect the behavioural and attitudinal characteristics of landlords and tenants, including landlord and tenant perceptions of the lease, operation and maintenance procedures, the landlord-tenant relationship and, ultimately, overall satisfaction

    Africa's growth dividend? Lived poverty drops across much of the continent

    Though Africa has recorded high levels of economic growth over the past decade, previous Afrobarometer surveys of citizens found little evidence that this growth had reduced levels of poverty in any consistent way (Dulani, Mattes, & Logan, 2013). However, new data from Afrobarometer Round 6, collected across 35 African countries, suggest a very different picture. While “lived poverty” remains pervasive across much of the continent, especially in Central and West Africa, we now see evidence that the decade of economic growth seems to have finally delivered broad-based reductions in poverty. “Lived poverty” (an index that measures the frequency with which people experience shortages of basic necessities) retreated across a broad range of countries. In the roughly three-year period between Round 5 (2011/2013) and Round 6 (2014/2015) surveys, our data suggest that “lived poverty” fell in 22 of 33 countries surveyed in both rounds. However, these changes show no systematic relation to recent rates of economic growth. While growing economies are undoubtedly important, what appears to be more important in improving the lives of ordinary people is the extent to which national governments and their donor partners put in place the type of development infrastructure that enables people to build better lives
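
    As a rough illustration of how an index of this kind is computed, the sketch below averages respondents' frequency-of-shortage scores across a set of basic-necessity items; the item names, the 0-4 coding and the sample values are assumptions made for illustration, not figures from the Afrobarometer data.

    import pandas as pd

    # Hypothetical respondent-level scores: 0 = never went without, 4 = always went without.
    responses = pd.DataFrame({
        "food":         [0, 2, 1],
        "clean_water":  [0, 1, 0],
        "medical_care": [1, 3, 0],
        "cooking_fuel": [0, 2, 1],
        "cash_income":  [2, 4, 1],
    })

    # Lived-poverty score per respondent: mean shortage frequency across the items.
    responses["lived_poverty"] = responses.mean(axis=1)

    # Country-level (or survey-round) score: mean of the respondent scores.
    print(round(responses["lived_poverty"].mean(), 2))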

    Black Hole Paradoxes: A Unified Framework for Information Loss

    The black hole information loss paradox is a catch-all term for a family of puzzles related to black hole evaporation. For almost 50 years, the quest to elucidate the implications of black hole evaporation has not only sustained momentum, but has also become increasingly populated with proposals that seem to generate more questions than they purport to answer. Scholars often neglect to acknowledge ongoing discussions within black hole thermodynamics and statistical mechanics when analyzing the paradox, including the interpretation of Bekenstein-Hawking entropy, which is far from settled. To remedy the dialectical gridlock, I have formulated an overarching, unified framework, which I call "Black Hole Paradoxes", that integrates the debates and taxonomizes the relevant 'camps' or philosophical positions. I demonstrate that black hole evaporation within Hawking's semi-classical framework insinuates how late-time Hawking radiation is an entangled global system, a contradiction in terms. The relevant forms of information loss are associated with a decrease in maximal Boltzmann entropy and an increase in global von Neumann entropy respectively, which engender what I've branded the "paradox of phantom entanglement". Prospective solutions are then tasked with demonstrating how late-time Hawking radiation is either exclusively an entangled subsystem, in which a black hole remnant lingers as an information safehouse, or exclusively an unentangled global system, in which information is evacuated to the exterior. The disagreement between safehouse and evacuation solutions boils down to the statistical interpretation of thermodynamic black hole entropy, i.e., Bekenstein-Hawking entropy. Safehouse solutions attribute Bekenstein-Hawking entropy to a minority of black hole degrees of freedom, those that are associated with the horizon. Evacuation solutions, in contrast, attribute Bekenstein-Hawking entropy to all black hole degrees of freedom. I argue that the interpretation of Bekenstein-Hawking entropy is the litmus test to vet the overpopulated proposal space. So long as any proposal rejecting Hawking's original calculation independently derives black hole evaporation, globally conserves degrees of freedom and entanglement, preserves a version of semi-classical gravity at sub-Planckian scales, and describes black hole thermodynamics in statistical terms, then it counts as a genuine solution to the paradox
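
    For reference, the standard quantities at issue in this discussion are the Bekenstein-Hawking entropy of the horizon, the Boltzmann entropy of a macrostate, and the von Neumann entropy of a quantum state; these are textbook definitions rather than results specific to this work:

    S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 G \hbar}, \qquad S_{\mathrm{B}} = k_B \ln \Omega, \qquad S_{\mathrm{vN}}(\rho) = -\operatorname{Tr}(\rho \ln \rho)

    where A is the horizon area, \Omega the number of accessible microstates, and \rho the (reduced) density matrix of, for example, the Hawking radiation.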


    EDUCATION AND JOB MATCH: REVISITED

    To study changes in the effect of degree field on mismatch and changes in the effect of mismatch on wages over time, we revisit a study by Robst (2006), who found that mismatched workers earn less than adequately matched workers with the same amount of schooling. Using recent data from the 2015 National Survey of College Graduates (NSCG), we also find a negative relationship between mismatch and workers' wages, although the degree of mismatch does not seem to matter as much
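
    Studies of this kind typically estimate a Mincer-style wage equation augmented with a mismatch indicator; the specification below is an illustrative sketch of that setup, not the authors' exact model:

    \ln w_i = \beta_0 + \beta_1\,\mathrm{Mismatch}_i + \beta_2\,\mathrm{Schooling}_i + X_i'\gamma + \varepsilon_i

    where \mathrm{Mismatch}_i indicates that worker i's job lies outside his or her degree field, X_i collects other controls, and a negative estimate of \beta_1 corresponds to the wage penalty for mismatched workers reported above.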