
    Origins of the Quantum Efficiency Duality in the Primary Photochemical Event of Bacteriorhodopsin

    Experimental and theoretical evidence is presented which suggests that two distinct forms of light-adapted bacteriorhodopsin may exist. We propose that these two forms have characteristic photocycles with significantly different primary quantum yields. INDO-PSDCI molecular orbital procedures and semiempirical molecular dynamics simulations predict that one ground-state geometry of bR undergoes photochemistry with a primary quantum yield, Φ1, of ~0.27, and that a second ground-state geometry, with a slightly displaced counterion, yields Φ1 ~ 0.74. This theoretical model is supported by the observation that literature measurements of Φ1 tend to fall into one of two categories: those that observe Φ1 ~ 0.33 or below, and those that observe Φ1 ~ 0.6 or above. The observation that all photostationary-state measurements of the primary quantum yield give values near 0.3, while all direct measurements of the quantum yield give values near 0.6, suggests that photochemical back reactions may select the bacteriorhodopsin conformation with the lower quantum yield. The two photocycles may have developed as a natural biological requirement that the bacterium have the capacity to adjust the efficiency of the photocycle in response to the intensity of light and/or the membrane electrochemical gradient.

    Hierarchical Organization in Complex Networks

    Many real networks in nature and society share two generic properties: they are scale-free and they display a high degree of clustering. We show that these two features are the consequence of a hierarchical organization, implying that small groups of nodes organize in a hierarchical manner into increasingly large groups, while maintaining a scale-free topology. In hierarchical networks the degree of clustering characterizing the different groups follows a strict scaling law, which can be used to identify the presence of a hierarchical organization in real networks. We find that several real networks, such as the World Wide Web, the actor network, the Internet at the domain level, and the semantic web, obey this scaling law, indicating that hierarchy is a fundamental characteristic of many complex systems.
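The scaling-law test described in this abstract — computing the average clustering coefficient C(k) as a function of node degree k — can be sketched in pure Python on a toy graph (illustrative only; the paper's analysis applies this to large real-world networks, where a hierarchical topology shows up as C(k) decaying roughly as a power of k):

```python
from collections import defaultdict

def local_clustering(adj, v):
    """Fraction of pairs of v's neighbours that are themselves linked."""
    nbrs = adj[v]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for u in nbrs for w in nbrs if u < w and w in adj[u])
    return 2.0 * links / (k * (k - 1))

def clustering_by_degree(adj):
    """Average clustering coefficient C(k) for each degree k."""
    buckets = defaultdict(list)
    for v in adj:
        buckets[len(adj[v])].append(local_clustering(adj, v))
    return {k: sum(cs) / len(cs) for k, cs in buckets.items()}

# Toy graph: a triangle (0, 1, 2) with a pendant node 3 attached to 0.
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1}, 3: {0}}
cbd = clustering_by_degree(adj)
```

Plotting C(k) against k on log-log axes and checking for a straight line is the hierarchy diagnostic the abstract refers to.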

    Extending the Canada-France Brown Dwarfs Survey to the near-infrared: first ultracool brown dwarfs from CFBDSIR

    We present the first results of the ongoing Canada-France Brown Dwarfs Survey-InfraRed, hereafter CFBDSIR, a near-infrared extension to the optical wide-field survey CFBDS. Our final objectives are to constrain ultracool atmosphere physics by finding a statistically significant sample of objects cooler than 650 K and to explore the ultracool brown dwarf mass function, building on a well-defined sample of such objects. Candidates are identified in CFHT/WIRCam J and CFHT/MegaCam z' images using optimised PSF fitting, and we follow them up with pointed near-infrared imaging with SOFI at the NTT. We finally obtain low-resolution spectroscopy of the coolest candidates to characterise their atmospheric physics. We have so far analysed and followed up all candidates in the first 66 square degrees of the 335-square-degree survey. We identified 55 T-dwarf candidates with z'-J > 3.5 and have confirmed six of them as T dwarfs, including three that are strong later-than-T8 candidates based on their far-red and NIR colours. We also present here the NIR spectrum of one of these ultracool dwarfs, CFBDSIR1458+1013, which confirms it as one of the coolest brown dwarfs known, possibly in the 550-600 K temperature range. From the completed survey we expect to discover 10 to 15 dwarfs later than T8, more than doubling the known number of such objects. This will enable detailed studies of their extreme atmospheric properties and provide a stronger statistical base for studies of their luminosity function. Comment: Accepted by A&A.
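The colour cut used for candidate selection (sources redder than z'-J = 3.5) amounts to a simple catalogue filter. A minimal sketch, with hypothetical field names ('zp' and 'J' for the z' and J magnitudes — the survey's actual catalogue format is not specified in the abstract):

```python
def select_candidates(catalog, colour_cut=3.5):
    """Keep sources with z'-J colour redder than the T-dwarf cut.

    Requires both bands to be measured; magnitudes grow fainter as they
    increase, so a large z'-J means the source is much fainter in z'
    than in J, i.e. very red."""
    return [s for s in catalog
            if s.get("zp") is not None and s.get("J") is not None
            and s["zp"] - s["J"] > colour_cut]

catalog = [
    {"id": "A", "zp": 23.9, "J": 20.1},  # z'-J = 3.8 -> candidate
    {"id": "B", "zp": 22.0, "J": 19.5},  # z'-J = 2.5 -> rejected
]
candidates = select_candidates(catalog)
```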

    From white elephant to Nobel Prize: Dennis Gabor’s wavefront reconstruction

    Dennis Gabor devised a new concept for optical imaging in 1947 that went by a variety of names over the following decade: holoscopy, wavefront reconstruction, interference microscopy, diffraction microscopy and Gaboroscopy. A well-connected and creative research engineer, Gabor worked actively to publicize and exploit his concept, but the scheme failed to capture the interest of many researchers. Gabor’s theory was repeatedly deemed unintuitive and baffling; the technique was appraised by his contemporaries to be of dubious practicality and, at best, constrained to a narrow branch of science. By the late 1950s, Gabor’s subject had been assessed by its handful of practitioners to be a white elephant. Nevertheless, the concept was later rehabilitated by the research of Emmett Leith and Juris Upatnieks at the University of Michigan, and Yury Denisyuk at the Vavilov Institute in Leningrad. What had been judged a failure was recast as a success: evaluations of Gabor’s work were transformed during the 1960s, when it was represented as the foundation on which to construct the new and distinctly different subject of holography, a re-evaluation that gained the Nobel Prize for Physics for Gabor alone in 1971. This paper focuses on the difficulties experienced in constructing a meaningful subject, a practical application and a viable technical community from Gabor’s ideas during the decade 1947-1957.

    Good practices for a literature survey are not followed by authors while preparing scientific manuscripts

    The number of citations received by authors in scientific journals has become a major parameter for assessing individual researchers, and the journals themselves through the impact factor. A fair assessment therefore requires that the criteria for selecting references in a given manuscript be unbiased with respect to the authors or the journals cited. In this paper, we advocate that authors should follow two mandatory principles when selecting papers (later reflected in the list of references) during the literature survey for a given piece of research: i) consider similarity of content with the topics investigated, so that closely related work is neither reproduced nor ignored; ii) perform a systematic search over the network of citations, including seminal or very related papers. We use complex-network formalisms on two datasets of papers from the arXiv repository to show that neither of these two criteria is fulfilled in practice.
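The second principle — a systematic search over the network of citations — can be sketched as a bounded breadth-first traversal of a citation graph. A toy illustration only (the paper's own analysis works on arXiv citation data; identifiers here are made up):

```python
from collections import deque

def snowball(cites, seeds, max_hops=2):
    """Breadth-first search over a citation graph.

    `cites` maps a paper id to the ids it cites; starting from the seed
    papers, collect everything reachable within `max_hops` citation
    steps, so that seminal papers a few hops away are not missed."""
    seen = set(seeds)
    frontier = deque((p, 0) for p in seeds)
    while frontier:
        paper, hops = frontier.popleft()
        if hops == max_hops:
            continue
        for ref in cites.get(paper, ()):
            if ref not in seen:
                seen.add(ref)
                frontier.append((ref, hops + 1))
    return seen

# Toy citation graph: seminal paper "S" sits two hops from the seed.
cites = {"seed": ["A", "B"], "A": ["S"], "S": ["old"]}
found = snowball(cites, ["seed"], max_hops=2)
```

Bounding the hop count keeps the search tractable while still reaching the seminal papers that a purely local (one-hop) reading of the literature would miss.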

    Statistical mechanics of complex networks

    Complex networks describe a wide range of systems in nature and society; much-quoted examples include the cell, a network of chemicals linked by chemical reactions, and the Internet, a network of routers and computers connected by physical links. While traditionally these systems were modeled as random graphs, it is increasingly recognized that the topology and evolution of real networks are governed by robust organizing principles. Here we review the recent advances in the field of complex networks, focusing on the statistical mechanics of network topology and dynamics. After reviewing the empirical data that motivated the recent interest in networks, we discuss the main models and analytical tools, covering random graphs, small-world and scale-free networks, as well as the interplay between topology and the network's robustness against failures and attacks. Comment: 54 pages; submitted to Reviews of Modern Physics.
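One of the scale-free models this review covers, preferential-attachment (Barabási-Albert) growth, can be sketched in a few lines of pure Python: each new node links to m existing nodes chosen with probability proportional to their current degree. Parameters below are illustrative only:

```python
import random

def preferential_attachment(n, m, seed=42):
    """Grow an n-node graph where each new node attaches to m existing
    nodes, picked with probability proportional to current degree."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    repeated = []              # each node appears here once per edge endpoint,
                               # so uniform choice from it is degree-biased
    targets = list(range(m))   # the first new node links to the m seed nodes
    for source in range(m, n):
        for t in targets:
            adj[source].add(t)
            adj[t].add(source)
        repeated.extend(targets)
        repeated.extend([source] * m)
        chosen = set()
        while len(chosen) < m:         # m distinct, degree-biased targets
            chosen.add(rng.choice(repeated))
        targets = list(chosen)
    return adj

g = preferential_attachment(200, 2)
degrees = sorted((len(nb) for nb in g.values()), reverse=True)
```

Because every node beyond the initial m contributes exactly m edges, the graph has m(n-m) edges in total, and the degree-biased sampling produces the heavy-tailed degree distribution characteristic of scale-free networks.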