
    Molecularly Resolved Electronic Landscapes of Differing Acceptor-Donor Interface Geometries

    Organic semiconductors are a promising class of materials for numerous electronic and optoelectronic applications, including solar cells. However, these materials tend to be extremely sensitive to the local environment and surrounding molecular geometry, causing the energy levels near boundaries and interfaces essential to device function to differ from those of the bulk. Scanning Tunneling Microscopy and Spectroscopy (STM/STS) can examine both the structural and electronic properties of these interfaces on the molecular and submolecular scale. Here we investigate the prototypical acceptor/donor system PTCDA/CuPc using sub-molecularly resolved pixel-by-pixel STS to demonstrate the importance of subtle changes in interface geometry in solar cell materials. PTCDA and CuPc were sequentially deposited on NaCl bilayers to create lateral heterojunctions that were decoupled from the underlying substrate. Donor and acceptor states were observed to shift in opposite directions, suggesting an equilibrium charge transfer between the two. Narrowing of the gap energy compared with that of isolated molecules on the same surface is indicative of the influence of the local dielectric environment. Further, we find that the electronic state energies of both acceptor and donor are strongly dependent on the ratio and positioning of both molecules in larger clusters. This molecular-scale structural dependence of the electronic states of both interfacial acceptor and donor has significant implications for device design, where level alignment strongly correlates with device performance.
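    As background on the measurement principle assumed above (standard tunnelling-theory relation, not taken from this paper): in the usual Tersoff-Hamann picture, the differential conductance recorded in STS is, to first approximation, proportional to the sample's local density of states, which is why pixel-by-pixel dI/dV maps can resolve the orbital energies of individual molecules:

        % Tunnelling current as a convolution of sample and tip DOS,
        % with rho_t the tip DOS and T the transmission factor:
        I(V) \propto \int_{0}^{eV} \rho_s(E)\,\rho_t(E - eV)\,T(E,V)\,dE
        % Assuming a flat tip DOS and slowly varying T, differentiating gives
        \left.\frac{dI}{dV}\right|_{V} \propto \rho_s(eV)

    Under these assumptions, peaks in dI/dV spectra taken at each pixel map directly onto molecular state energies, which is what allows the level shifts described above to be tracked molecule by molecule.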

    Impact of student choice on academic performance: cross-sectional and longitudinal observations of a student cohort

    BACKGROUND: Student choice plays a prominent role in the undergraduate curriculum in many contemporary medical schools. A key unanswered question relates to its impact on academic performance. METHODS: We studied 301 students who were in years 2 and 3 of their medical studies in 2005/06. We investigated the relationship between student-selected component (SSC) grade and allocated preference. Separately, we examined the impact of ‘self-proposing’ (students designing and completing their own SSC) on academic performance in other, standard-set, summative assessments throughout the curriculum. The chi-squared test was used to compare academic performance in SSCs according to allocated preference. Generalised estimating equations were used to investigate the effect of self-proposing on performance in standard-set examinations. RESULTS: (1) Performance in staff-designed SSCs was not related to allocated preference. (2) Performance in the year 1 main examination was one of the key predictors of performance in written and OSCE examinations in years 2, 3 and 4 (p<0.001). (3) The higher the score in the year 1 examination, the more likely a student was to self-propose in subsequent years (OR [CI] 1.07 [1.03-1.11], p<0.001). (4) Academic performance of students who self-proposed at least once in years 2 and/or 3 varied according to gender and year of course. CONCLUSION: In this study, no association was observed between allocated preference and SSC grade. The effect of self-proposing on academic performance in standard-set examinations was small. Our findings suggest instead that academically brighter students are more likely to design their own modules. Although student choice may have educational benefits, this report does not provide convincing evidence that it improves academic performance.
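    For readers unfamiliar with generalised estimating equations: they extend regression to repeated measures by specifying a working correlation among observations from the same subject. A minimal sketch of this kind of analysis in Python's statsmodels (the file name and column names are hypothetical; this is not the authors' actual code or data):

        # Sketch of a GEE analysis of repeated exam scores, one row per
        # student-examination; all column names below are hypothetical.
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        df = pd.read_csv("exam_scores.csv")  # hypothetical input file

        # An exchangeable working correlation accounts for repeated
        # measures (several examinations) on the same student.
        model = smf.gee(
            "score ~ self_proposed + gender + year + year1_score",
            groups="student_id",
            data=df,
            family=sm.families.Gaussian(),
            cov_struct=sm.cov_struct.Exchangeable(),
        )
        result = model.fit()
        print(result.summary())

    The coefficient on self_proposed then estimates the association between self-proposing and exam score while respecting the within-student correlation, which is the kind of effect the study reports as small.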

    Detecting and defeating advanced man-in-the-middle attacks against TLS

    TLS is an essential building block for virtual private networks. A critical aspect of TLS security is authentication and key exchange, usually performed by means of certificates; an insecure key exchange can lead to a man-in-the-middle (MITM) attack. Trust in certificates is generally achieved using Public Key Infrastructures (PKIs), which employ trusted certificate authorities (CAs) to establish certificate validity chains. In recent years, a number of security concerns regarding PKI usage have arisen: certificates can be issued for any entity on the Internet, regardless of the issuing CA's position in the hierarchy tree. This means that a successful attack on a CA has the potential to generate valid certificates enabling man-in-the-middle attacks. The possibility of malicious use of intermediate CAs to perform targeted attacks through ad-hoc certificates cannot be neglected, and such attacks are extremely difficult to detect. The current PKI infrastructure for TLS is prone to MITM attacks, and new mechanisms for detecting and defeating those attacks are needed. The IETF and other standardization bodies have launched several initiatives to enable the detection of “forged” certificates. Most of these initiatives attempt to solve the existing problems by maintaining the current PKI model and adding certificate pinning, which associates specific certificates with specific servers. These techniques have significant limitations, such as the need for a secure bootstrap procedure, or the need to establish each pin on a host-by-host basis. This study proposes an evolution from pinning-in-the-host to pinning-in-the-net, by enabling mechanisms to validate certificates as they travel through a given network. Certificates would be classified as trusted or not trusted by cross-checking information obtained from different sources. This would result in early detection of suspicious certificates and would trigger mechanisms to defeat the attack, minimize its impact, and gather information on the attackers. Additionally, a more detailed and thorough analysis could then be performed.
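    To make the pinning idea concrete, here is a minimal, hypothetical sketch of the host-based pinning baseline that the study generalizes to the network level: connect over TLS, hash the server's DER-encoded leaf certificate, and compare the digest against a stored pin. The host name and pin value are placeholders, and the network-level scheme would perform an analogous check on certificates observed in transit rather than on the endpoint.

        # Minimal sketch of pinning-in-the-host (illustrative only).
        import hashlib
        import socket
        import ssl

        HOST = "example.com"        # hypothetical server
        PINNED_SHA256 = "ab12..."   # hex SHA-256 of the expected DER cert

        def cert_matches_pin(host: str, port: int = 443) -> bool:
            # Retrieve the leaf certificate the server presents; the
            # default context also performs normal PKI chain validation.
            context = ssl.create_default_context()
            with socket.create_connection((host, port)) as sock:
                with context.wrap_socket(sock, server_hostname=host) as tls:
                    der_cert = tls.getpeercert(binary_form=True)
            # A mismatch can indicate a MITM presenting a different
            # certificate that is nonetheless CA-valid.
            return hashlib.sha256(der_cert).hexdigest() == PINNED_SHA256

        if __name__ == "__main__":
            ok = cert_matches_pin(HOST)
            print("pin ok" if ok else "PIN MISMATCH - possible MITM")

    The limitations the abstract mentions are visible even in this sketch: the pin must be obtained securely in the first place (bootstrap), and one pin must be maintained per host.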

    Review of economic bubbles

    This paper investigates the history of economic bubbles and attempts to identify whether there are direct correlations between different bubbles. To support this research, literature has been consulted on historical and recent bubbles, theories surrounding speculation, the market for venture capital, and bubbles in the technology sector. By analysing a range of bubbles, rather than just those in the technology sector, general bubble principles are also identified. All of the economic bubbles examined are classified as instances of "uncontrolled risk", and a method for detecting and analysing the full impact of uncontrolled risk is recommended, together with a discussion of future directions.

    How roadway composition matters in analyzing police data on racial profiling

    This article argues that roadway composition data is essential to the analysis of police behavior when studying racial profiling of motorists. Police data alone show only the number and proportion of stops of African American and White drivers. They do not show how these numbers relate to the number of African American and White drivers using the roads. Proxy measures, drawn from the number of African American residents or license holders, assume that all roads in the community carry the same proportion of African American drivers, an assumption that roadway observation can test directly.
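    As an illustration of the benchmarking point (all numbers below are invented, not the study's data): once roadway observation yields the actual share of African American drivers on the roads, stop counts can be tested against that benchmark rather than against residential population shares, for example with a chi-squared goodness-of-fit test:

        # Hypothetical illustration: compare observed stops against a
        # roadway-usage benchmark instead of residential shares.
        from scipy.stats import chisquare

        total_stops = 1000
        observed_stops = [320, 680]   # [African American, White] stops

        roadway_share = [0.22, 0.78]  # shares from roadway observation
        expected_stops = [p * total_stops for p in roadway_share]

        stat, p_value = chisquare(f_obs=observed_stops, f_exp=expected_stops)
        print(f"chi2 = {stat:.1f}, p = {p_value:.3g}")
        # A small p-value means stop proportions deviate from what roadway
        # composition alone would predict; substituting residential shares
        # for roadway_share can bias this comparison in either direction.

    The methodological point is that the choice of expected_stops, not the stop counts themselves, drives the conclusion, which is why the benchmark must reflect who is actually driving.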

    A Ranking System for Reference Libraries of DNA Barcodes: Application to Marine Fish Species from Portugal

    BACKGROUND: The increasing availability of reference libraries of DNA barcodes (RLDB) offers the opportunity to screen the level of consistency in DNA barcode data among libraries, in order to detect possible disagreements generated from taxonomic uncertainty or operational shortcomings. We propose a ranking system to attribute a confidence level to species identifications associated with DNA barcode records from a RLDB. Here we apply the proposed ranking system to a newly generated RLDB for marine fish of Portugal. METHODOLOGY/PRINCIPAL FINDINGS: Specimens (n = 659) representing 102 marine fish species were collected along the continental shelf of Portugal, morphologically identified and archived in a museum collection. Samples were sequenced at the barcode region of the cytochrome oxidase subunit I gene (COI-5P). Resultant DNA barcodes had average intra-specific and inter-specific Kimura-2-parameter distances (0.32% and 8.84%, respectively) within the range usually observed for marine fishes. All specimens were ranked in five different levels (A-E), according to the reliability of the match between their species identification and the respective diagnostic DNA barcodes. Grades A to E were attributed upon submission of individual specimen sequences to BOLD-IDS and inspection of the clustering pattern in the NJ tree generated. Overall, our study resulted in 73.5% unambiguous species IDs (grade A), 7.8% taxonomically congruent barcode clusters within our dataset but awaiting external confirmation (grade B), and 18.7% species identifications with lower levels of reliability (grades C to E). CONCLUSION/SIGNIFICANCE: We highlight the importance of implementing a system to rank barcode records in RLDB, in order to flag taxa in need of taxonomic revision, or to reduce ambiguities of discordant data. With increasing DNA barcode records publicly available, this cross-validation system would provide a metric of relative accuracy of barcodes, while enabling the continuous revision and annotation required in taxonomic work.
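    For context on the distances quoted above: the Kimura two-parameter (K2P) model corrects raw sequence divergence by weighting transitions and transversions separately. A minimal sketch of the standard computation for two aligned COI sequences (this is the textbook formula, not the authors' pipeline; the example sequences are invented):

        # Minimal Kimura two-parameter (K2P) distance for aligned sequences.
        import math

        PURINES = {"A", "G"}
        PYRIMIDINES = {"C", "T"}

        def k2p_distance(seq1: str, seq2: str) -> float:
            # Keep only unambiguous, ungapped aligned sites.
            pairs = [(a, b) for a, b in zip(seq1.upper(), seq2.upper())
                     if a in "ACGT" and b in "ACGT"]
            n = len(pairs)
            # Transitions: purine<->purine or pyrimidine<->pyrimidine.
            transitions = sum(1 for a, b in pairs if a != b and
                              ({a, b} <= PURINES or {a, b} <= PYRIMIDINES))
            transversions = sum(1 for a, b in pairs if a != b) - transitions
            p, q = transitions / n, transversions / n
            # K2P: d = -1/2 ln(1 - 2P - Q) - 1/4 ln(1 - 2Q)
            return (-0.5 * math.log(1 - 2 * p - q)
                    - 0.25 * math.log(1 - 2 * q))

        # One A->G transition across ten sites gives d of roughly 0.11.
        print(k2p_distance("ACGTACGTAC", "ACGTACGTGC"))

    Averaging such pairwise distances within and between species yields the 0.32% and 8.84% figures reported, and the gap between the two is what makes NJ clusters diagnostic for identification.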

    Effects of antiplatelet therapy on stroke risk by brain imaging features of intracerebral haemorrhage and cerebral small vessel diseases: subgroup analyses of the RESTART randomised, open-label trial

    Background: Findings from the RESTART trial suggest that starting antiplatelet therapy might reduce the risk of recurrent symptomatic intracerebral haemorrhage compared with avoiding antiplatelet therapy. Brain imaging features of intracerebral haemorrhage and cerebral small vessel diseases (such as cerebral microbleeds) are associated with greater risks of recurrent intracerebral haemorrhage. We did subgroup analyses of the RESTART trial to explore whether these brain imaging features modify the effects of antiplatelet therapy.
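    For readers less familiar with effect-modification analyses: whether an imaging feature modifies a treatment effect is typically assessed with a treatment-by-subgroup interaction term in a time-to-event model. A minimal, hypothetical sketch (invented file and column names; not the trial's actual analysis code) using the lifelines library in Python:

        # Hypothetical sketch of an interaction (effect-modification) test
        # in a time-to-event subgroup analysis.
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.read_csv("trial_like_data.csv")  # hypothetical input
        # Assumed columns: time, event, antiplatelet (0/1), microbleeds (0/1)
        df["interaction"] = df["antiplatelet"] * df["microbleeds"]

        cph = CoxPHFitter()
        cph.fit(df[["time", "event", "antiplatelet",
                    "microbleeds", "interaction"]],
                duration_col="time", event_col="event")
        cph.print_summary()
        # The interaction coefficient estimates how much the treatment
        # hazard ratio differs between imaging-defined subgroups.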

    Increasing value and reducing waste in stroke research

    Stroke represents a major burden to patients and society, and resources spent on stroke research must be used efficiently and produce good value in terms of improvements in human health. However, there are many examples of poor value from stroke research funding, which result from the way in which stroke research has been chosen, designed, conducted, analysed, regulated, managed, disseminated, or reported. In a project including a survey and a symposium, and involving stroke researchers in the European Stroke Organisation, we have sought to identify sources of inefficiency and waste, to recommend approaches to increase value, and to highlight examples of best practice in stroke research. Recent evidence suggests that progress has been made, but there is room for much improvement, and stroke researchers, funders and other stakeholders might consider our recommendations when planning new research.

    Why High Leverage is Optimal for Banks

    Abstract: Liquidity production is a central role of banks. High leverage is optimal for banks in a capital structure model in which there is a market premium for (socially valuable) liquid financial claims and no deviations from