
    Towards A Practical High-Assurance Systems Programming Language

    Writing correct and performant low-level systems code is a notoriously demanding job, even for experienced developers. To make matters worse, formally reasoning about its correctness properties introduces yet another level of complexity, requiring considerable expertise in both systems programming and formal verification. Without appropriate tools that provide abstraction and automation, development can be extremely costly due to the sheer complexity of these systems and the nuances within them. Cogent is designed to alleviate the burden on developers when writing and verifying systems code. It is a high-level functional language with a certifying compiler, which automatically proves the correctness of the compiled code and also provides a purely functional abstraction of the low-level program to the developer. Equational reasoning techniques can then be used to prove functional correctness properties of the program on top of this abstract semantics, which is notably less laborious than directly verifying the C code. To make Cogent a more approachable and effective tool for developing real-world systems, we further strengthen the framework by extending the core language and its ecosystem. Specifically, we enrich the language to allow users to control the memory representation of algebraic data types, while retaining the automatic proof via a data layout refinement calculus. We repurpose existing tools in a novel way and develop an intuitive foreign function interface, which provides users with a seamless experience when using Cogent in conjunction with native C. We augment the Cogent ecosystem with a property-based testing framework, which helps developers better understand the impact formal verification has on their programs and enables a progressive approach to producing high-assurance systems. Finally, we explore refinement type systems, which we plan to incorporate into Cogent for more expressiveness and better integration of systems programmers into the verification process.
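    The refinement relationship at the heart of this workflow can be illustrated with ordinary property-based testing. The sketch below is a hypothetical analogue in Python using the `hypothesis` library, not Cogent's actual framework or API: it checks that a "low-level" in-place implementation agrees with a purely functional specification, the same correspondence a certifying compiler proves.

```python
# Illustrative analogue of specification-vs-implementation testing,
# using the `hypothesis` property-based testing library.
# All names here are hypothetical, not part of Cogent's ecosystem.
from hypothesis import given, strategies as st

def spec_insert_sorted(xs: list, x: int) -> list:
    """Purely functional specification: insertion into a sorted list."""
    return sorted(xs + [x])

def impl_insert_sorted(xs: list, x: int) -> list:
    """'Low-level' implementation: shift-and-insert on a working copy."""
    out = xs[:]
    i = 0
    while i < len(out) and out[i] <= x:
        i += 1
    out.insert(i, x)
    return out

@given(st.lists(st.integers()).map(sorted), st.integers())
def test_impl_refines_spec(xs, x):
    # The implementation must agree with its functional abstraction,
    # mirroring the refinement theorem a certifying compiler emits.
    assert impl_insert_sorted(xs, x) == spec_insert_sorted(xs, x)
```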

    Boundary Spanner Corruption in Business Relationships

    Boundary spanner corruption—voluntary collaborative behaviour between individuals representing different organisations that violates their organisations’ norms—is a serious problem in business relationships. Drawing on insights from the literatures on general corruption perspectives, the dark side of business relationships, and deviance in sales and service organisations, this dissertation identifies boundary spanner corruption as a potential dark-side complication inherent in close business relationships. It builds research questions from these literature streams and proposes a research structure, based upon methods commonly used in corruption research, to address this new concept. In the first study, using an exploratory survey of boundary spanner practitioners, the dissertation finds that the nature of boundary spanner corruption is broad, encompassing both severe and non-severe types. The survey also finds that these deviance types are prevalent across a wide range of geographies and industries. This prevalence is particularly noticeable for less-severe corruption types, which may be an under-researched phenomenon in general corruption research. The consequences of boundary spanner corruption can be serious for both individuals and organisations; indeed, even less-severe types can generate long-term negative consequences. A second, interview-based study found that multi-level trust factors can also motivate the emergence of boundary spanner corruption. This finding was integrated into a theoretical model that illustrates how trust at the interpersonal, intraorganisational, and interorganisational levels enables corrupt behaviours by allowing deviance-inducing factors, stemming from the task environment or from the individual boundary spanner, to manifest in boundary spanner corruption. Interpersonal trust between representatives of different organisations, interorganisational trust between these organisations, and intraorganisational agency trust of management in their representatives foster the development of a boundary-spanning social cocoon—a mechanism that can inculcate deviant norms leading to corrupt behaviour. This conceptualisation and model of boundary spanner corruption highlights intriguing directions for future research to support practitioners engaged with a difficult problem in business relationships.

    Raising Critical Consciousness in Engineering Education: A Critical Exploration of Transformative Possibilities in Engineering Education and Research

    This thesis represents a critical exploration of the opportunities, challenges, and barriers to enacting social justice via the engineering curriculum. Through an ethnographic case study of a British engineering for sustainable development course, I illuminate tensions and contradictions in attempts to “do good” while “doing engineering” in a higher education setting. This work is couched within critical and anti-colonial theoretical frames. Through critical and reflexive analysis, I illustrate participants’ attempts to innovate in engineering education toward a counter-hegemonic engineering practice, and I highlight transformative possibilities as well as barriers. This case illustrates how the structures that formed modern engineering continue to shape engineering higher education, restraining attempts to transform engineering training for social good. A central question that has driven this work has been: Is it possible to cultivate a more socially just form of engineering practice through engineering higher education? The function of asking this question has been to interrogate a core assumption in engineering education research: that with the right blend of educational interventions, we can make strides towards social justice. My intent in interrogating this assumption is not to be nihilistic per se; I believe it is entirely possible that engineering could be wielded for just cause and consequence. However, if we do not critically examine this core assumption, we risk overlooking the possibility that socially just engineering is not achievable, at least in the way we are currently approaching it or in the current context within which it exists. An examination of this topic is already underway in the US context, but it remains under-explored in the British context. Given the different historical trajectories of engineering and engineering higher education between these two contexts, a closer look at the British context is warranted.

    Generalizations for Cell Biological Explanations: Distinguishing between Principles and Laws

    Laws have figured in the development of modern biology (e.g. the Mendelian laws of inheritance), but there is a tacit assumption, particularly in contemporary cell and molecular biology, that laws are only of the 'strict' kind (e.g. the laws of motion or universal gravitation), which cell biology appears to lack. Moreover, the cell-biology-specific non-universal laws that do exist (e.g. scaling laws in biochemical networks within single cells) are few and far between. As discussed elsewhere (and not further argued for in this paper), mechanistic explanations, which are the dominant kind of explanation in cell biology, face significant challenges, and their utility has been checkered in different biomedical areas. Just as laws and mechanisms figure in explanations in organic chemistry and ecology, fields that deal respectively with lower- and higher-scale phenomena than cell biology, it should not be assumed that cell biology is somehow in a unique position where few or no laws could be discovered and used in its explanations. An impediment to discovering lawlike generalizations in cell biology is that the understanding of many cellular phenomena is still quite qualitative and imprecise. This paper is motivated by the premise that mechanisms and laws can both be in the foreground of explanations in cell biology, and that a framework should be developed to encourage and facilitate the discovery of laws specific to and operative at the individual cell level. To that end, in the domain of scientifically relevant non-universal (i.e. non-exceptionless) generalizations, which some philosophers equate with the notion of ceteris paribus laws (henceforth, 'cp-laws'), I propose that a cp-law might have one or more corresponding 'principles'. Using a running example of generalizations of oscillatory movements from physics with direct relevance to cell biology, I argue that while a cp-law and its paired principle(s) might have the same explanatory theme (e.g. explain the same phenomenon), a principle is broader in scope than its paired cp-law but less expectable or reliable in its predictions. This is because principles appear to be more qualitative and less numerically precise than cp-laws, reflecting our lack of precise understanding of the systems to which the generalizations apply. The principles–laws concept makes for a more lenient approach to what could count as a lawlike generalization and can encourage the discovery of novel generalizations in areas of cell biology where no specific generalizations typically figure in explanations. A principle can be thought of as providing a program for explanation, whereas its paired law provides explanations for specific instances. Newly posited principles could augment mechanistic explanations and also potentially lead to the discovery of corresponding cp-laws.
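    The abstract does not reproduce the paper's running oscillatory example, but the law/principle contrast can be illustrated with a standard case from physics (our illustration, not necessarily the paper's): the small-angle pendulum law is numerically precise yet holds only ceteris paribus, while its paired principle is broader in scope but only qualitative.

```latex
% cp-law: numerically precise, but holds only ceteris paribus
% (small angles, negligible friction, rigid rod of length L):
T = 2\pi \sqrt{\frac{L}{g}}
% Paired principle, broader in scope but qualitative: a system
% displaced from stable equilibrium and subject to a restoring
% force oscillates about that equilibrium.
```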

    Evaluating central bank asset purchases in a term structure model with a forward-looking supply factor

    The theoretical literature on term structure models emphasises the importance of the expected absorption of duration risk during the residual life of term bonds in order to understand the yield curve effect of central banks’ government bond purchases. Motivated by this, we develop a forward-looking, long-horizon measure of euro area government bond supply net of Eurosystem holdings, and use it to estimate the impact of the ECB’s asset purchase programmes in the context of a no-arbitrage affine term structure model. We find that an asset purchase shock equivalent to 10% of euro area GDP lowers the 10-year average yield of the euro area big four by 59 basis points (bp) and the associated term premium by 50 bp. Applying the model to the risk-free (OIS) yield curve, the same shock lowers the 10-year rate and term premium by 35 bp and 26 bp, respectively.
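    As a hedged sketch of how a supply factor typically enters a no-arbitrage affine model (the paper's exact specification is not given in the abstract): yields are affine in a state vector augmented with the expected supply measure, so purchase shocks propagate to yields and term premia through the factor loadings.

```latex
% Generic no-arbitrage affine term structure with a supply factor
% (illustrative sketch; not the paper's exact specification):
y_t^{(n)} = a_n + b_n^{\top} x_t, \qquad
x_t = \bigl(\text{rate factors},\; s_t\bigr)^{\top}
% where s_t is expected bond supply net of Eurosystem holdings and
% the loadings (a_n, b_n) satisfy the usual no-arbitrage recursions.
```

    Under such linearity, the reported 59 bp yield impact of a 10%-of-GDP purchase shock scales to roughly 5.9 bp per percentage point of GDP.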

    Corporate Social Responsibility: the institutionalization of ESG

    Understanding the impact of Corporate Social Responsibility (CSR) on firm performance, as it relates to industries reliant on technological innovation, is a complex and perpetually evolving challenge. To investigate this topic thoroughly, this dissertation adopts an economics-based structure organised around three primary hypotheses. This structure allows each hypothesis to stand as an essentially self-contained empirical paper, unified by an overall analysis of the nature of the impact that ESG has on firm performance. The first hypothesis holds that the evolution of CSR into its modern, quantified iteration, ESG, has led to the institutionalization and standardization of the CSR concept. The second hypothesis fills gaps in the existing literature on the relationship between firm performance and ESG: testing shows that the relationship is significantly positive in long-term, strategic metrics (ROA and ROIC) and that there is no correlation in short-term metrics (ROE and ROS). Finally, the third hypothesis states that if a firm has a long-term strategic ESG plan, as proxied by the publication of CSR reports, then it is more resilient to damage from controversies. This is supported by the finding that pro-ESG firms consistently fared better than their counterparts in both financial and ESG performance, even in the event of a controversy. However, firms with consistent reporting are also held to a higher standard than their non-reporting peers, suggesting a higher-risk, higher-reward dynamic. These findings support the theory of good management, in that long-term strategic planning is both immediately economically beneficial and serves as a means of risk management and social impact mitigation. Overall, this dissertation contributes to the literature by filling gaps in our understanding of the nature of the impact that ESG has on firm performance, particularly from a management perspective.
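    The hypothesis tests relating ESG to long- and short-term performance metrics follow a familiar panel-regression pattern. A minimal sketch using statsmodels, with hypothetical column names (the dissertation's actual data, controls, and specification are not given in the abstract):

```python
# Minimal sketch of an ESG-vs-performance regression with statsmodels.
# Column names (roa, esg_score, log_assets, leverage, firm_id, year)
# are hypothetical placeholders, not the dissertation's actual data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("firm_panel.csv")  # hypothetical firm-year panel

# Long-term metric (ROA) regressed on the ESG score with simple
# controls and year fixed effects; errors clustered by firm.
model = smf.ols("roa ~ esg_score + log_assets + leverage + C(year)", data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["firm_id"]})
print(result.summary())  # a positive esg_score coefficient would align
                         # with the long-term findings reported above
```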

    On the Mechanism of Building Core Competencies: a Study of Chinese Multinational Port Enterprises

    This study aims to explore how Chinese multinational port enterprises (MNPEs) build their core competencies. Core competencies are a firm’s special capabilities and sources of sustainable competitive advantage (SCA) in the marketplace, and the concept has led to extensive research and debate. However, few studies have inquired into the mechanisms of building core competencies in the context of Chinese MNPEs. Accordingly, answers were sought to three research questions: 1. What are the core competencies of the Chinese MNPEs? 2. What are the mechanisms that the Chinese MNPEs use to build their core competencies? 3. What are the paths that the Chinese MNPEs pursue to build their resource bases? The study adopted a multiple-case study design focused on the mechanism of building core competencies, grounded in the resource-based view (RBV). It purposively selected five leading Chinese MNPEs and three industry associations as case companies. The study revealed three main findings. First, it identified three generic core competencies possessed by the case companies: innovation in business models and operations, utilisation of technologies, and acquisition of strategic resources. Second, it developed the conceptual framework of the Mechanism of Building Core Competencies (MBCC), a process of change in collective learning about the effective and efficient utilisation of a firm’s resources in response to critical events. Third, it proposed three paths to build core competencies: enhancing collective learning, selecting sustainable processes, and building the resource base. The study contributes to the knowledge of core competencies and the RBV in three ways: (1) presenting three generic core competencies of Chinese MNPEs, (2) proposing a new conceptual framework to explain how Chinese MNPEs build their core competencies, and (3) suggesting a solid anchor point (the MBCC) to explain the links among resources, core competencies, and SCA. The findings set benchmarks for the Chinese logistics industry and provide guidelines for building core competencies.

    Developing novel measures and treatments for gambling disorder

    Background: While gambling is an activity that seems to have entertained humanity for millennia, it is less clear, from a research and clinical perspective, why problematic gambling behavior may persist despite obvious negative consequences. With the introduction of the 5th edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), gambling was equated with alcohol and drug use and labeled an addictive disorder, Gambling Disorder (GD). Problem gambling is associated with destroyed careers, broken marriages, financial ruin, and psychiatric comorbidities. Still, gambling research can be described as a field in its infancy, with a need for further work on measurement and treatment procedures.

    Aims: The overall aim of the thesis was to develop and evaluate measures and treatments for Gambling Disorder.
    • The aims of Study I were to reach a consensus regarding a specific set of potential new measurement items, to yield a testable draft version of a new gambling measure, and to establish preliminary construct and face validity for this novel gambling measure, the Gambling Disorder Identification Test (GDIT).
    • The aim of Study II was to evaluate the psychometric properties (e.g., internal consistency and test-retest reliability, factor structure, convergent and discriminant validity, and diagnostic accuracy) of the GDIT among treatment- and support-seeking samples (n = 79 and n = 185), self-help groups (n = 47), and a population sample (n = 292).
    • The aim of Study III was to formulate hypotheses on the maintenance of GD by identifying clinically relevant behaviors at an individual level among six treatment-seeking participants with GD. This qualitative study was conducted as a preparatory step in developing the iCBTG (see Study IV).
    • The aim of Study IV was to evaluate the acceptability and clinical effectiveness of the newly developed iCBTG among treatment-seeking patients with GD (n = 23) in routine care. A further aim was to evaluate the research feasibility of using existing healthcare infrastructure to deliver the iCBTG program.

    Methods: In Study I, gambling experts from ten countries rated 30 items proposed for inclusion in the GDIT in a two-round Delphi (n = 61; n = 30). Three subsequent consensus meetings of gambling researchers and clinicians (n = 10; n = 4; n = 3) were held to resolve item-related issues and establish a GDIT draft version. To evaluate face validity, the GDIT draft was presented to individuals with experience of problem gambling (n = 12) and to treatment-seeking participants with Gambling Disorder (n = 8). In Study II, the psychometric properties of the GDIT were evaluated among gamblers (N = 603) recruited from treatment- and support-seeking contexts (n = 79; n = 185), self-help groups (n = 47), and a population sample (n = 292). The participants completed self-report measures, a GDIT retest (n = 499), and a diagnostic semi-structured interview assessing GD (n = 203). In Study III, treatment-seeking patients with GD and various additional psychiatric symptom profiles (n = 6) were interviewed using an in-depth, semi-structured functional interview. Participants also completed self-report measures assessing gambling behavior. A qualitative thematic analysis was performed using functional analysis as a theoretical framework. Following completion of Study III, the results were synthesized with existing experimental evidence on gambling behavior and used to develop the novel treatment model and internet-delivered treatment evaluated in Study IV, i.e., the iCBTG. In Study IV, a non-randomized preliminary evaluation of the novel iCBTG was conducted in parallel with its implementation in routine addiction care through the Support and Treatment platform (Stöd och behandlingsplattformen; ST platform). Feasibility was evaluated among a sample of treatment-seeking patients (N = 23) in terms of iCBTG adherence, acceptability, and clinical effectiveness, as well as the feasibility of using existing healthcare infrastructure for clinical delivery and research purposes.

    Results: Study I established preliminary face validity for the GDIT, as well as construct validity in relation to a 2006 researcher agreement on measuring problem gambling known as the Banff consensus. Study II showed excellent internal consistency reliability (α = .94) and test-retest reliability (6-16 days; intraclass correlation coefficient = 0.93) for the GDIT. Confirmatory factor analysis yielded factor loadings supporting the three proposed GDIT domains of gambling behavior, gambling symptoms, and negative consequences. Receiver operating characteristic (ROC) curves and clinical significance estimates were used to establish GDIT cut-off scores for recreational gambling (<15), problem gambling (15-19), and GD (any ≥20; mild 20-24; moderate 25-29; severe ≥30). Study III yielded several functional categories for gambling behavior, as well as four main processes potentially important for treatment: access to money, anticipation, selective attention (focus), and chasing behaviors. Study IV showed that patient engagement in the iCBTG modules was comparable to previous internet-delivered cognitive behavioral treatment trials in the general population. The iCBTG was rated satisfactory in treatment credibility, expectancy, and satisfaction. Mixed effects modeling revealed a significant decrease in gambling symptoms during treatment (within-group effect size d = 1.05 at follow-up), which correlated with changes in loss of control (in the expected direction of increased control). However, measurement issues related to the ST platform were also identified, which led to significant attrition on several measures.

    Conclusions: The GDIT is a reliable and valid measure for assessing GD and problem gambling. In addition, the GDIT demonstrates high content validity in relation to the Banff consensus. The iCBTG was developed to provide a theoretically grounded and meaningful treatment model for GD. Preliminary estimates support its acceptability and clinical effectiveness in “real world” settings, but further randomized controlled studies are warranted to establish treatment efficacy.
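    The Study II cut-offs define a simple scoring rule, sketched below in Python. The function name is ours for illustration; the thresholds are exactly those reported above.

```python
# GDIT score bands as reported in Study II: <15 recreational gambling,
# 15-19 problem gambling, >=20 Gambling Disorder (mild 20-24,
# moderate 25-29, severe >=30). The function name is illustrative.
def classify_gdit(score: int) -> str:
    if score < 15:
        return "recreational gambling"
    if score < 20:
        return "problem gambling"
    if score < 25:
        return "GD, mild"
    if score < 30:
        return "GD, moderate"
    return "GD, severe"

assert classify_gdit(14) == "recreational gambling"
assert classify_gdit(27) == "GD, moderate"
```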

    Deciphering Regulation in Escherichia coli: From Genes to Genomes

    Advances in DNA sequencing have revolutionized our ability to read genomes. However, even in the most well-studied of organisms, the bacterium Escherichia coli, we remain ignorant of the regulation of ≈65% of promoters. Until we crack this regulatory Rosetta Stone, efforts to read and write genomes will remain haphazard. We introduce a new method, Reg-Seq, that links massively parallel reporter assays with mass spectrometry to produce a base-pair-resolution dissection of more than 100 E. coli promoters in 12 growth conditions. We demonstrate that the method recapitulates known regulatory information. Then, we examine regulatory architectures for more than 80 promoters that previously had no known regulatory information. In many cases, we also identify which transcription factors mediate their regulation. This method clears a path for highly multiplexed investigations of the regulatory genome of model organisms, with the potential to move to an array of microbes of ecological and medical relevance.
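    At the heart of such massively parallel reporter analyses is the question of which promoter positions carry regulatory information. A minimal sketch of that idea in Python (our illustration, not the authors' pipeline): estimate, at each position, the mutual information between base identity and a binarised expression readout.

```python
# Illustrative per-position information calculation in the spirit of
# massively parallel reporter assay analysis (not the Reg-Seq codebase).
import numpy as np
from collections import Counter

def position_information(seqs, high_expression):
    """seqs: equal-length promoter variants (strings over ACGT);
    high_expression: parallel list of booleans (e.g., expression above
    the library median). Returns per-position mutual information
    (bits) between base identity and expression class."""
    n = len(seqs)
    p_e = sum(high_expression) / n  # P(high expression)
    mi = np.zeros(len(seqs[0]))
    for i in range(len(seqs[0])):
        base_counts = Counter(s[i] for s in seqs)
        joint = Counter((s[i], e) for s, e in zip(seqs, high_expression))
        for (b, e), c in joint.items():
            p_be = c / n
            p_b = base_counts[b] / n
            p_class = p_e if e else 1.0 - p_e
            mi[i] += p_be * np.log2(p_be / (p_b * p_class))
    return mi  # peaks flag positions whose identity predicts expression
```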

    The Anthropocene Hypothesis
