
    On Probabilistic Applicative Bisimulation and Call-by-Value λ-Calculi (Long Version)

    Probabilistic applicative bisimulation is a recently introduced coinductive methodology for program equivalence in a probabilistic, higher-order setting. In this paper, the technique is applied to a typed, call-by-value lambda-calculus. Surprisingly, the obtained relation coincides with context equivalence, contrary to what happens when call-by-name evaluation is considered. Even more surprisingly, full abstraction only holds in a symmetric setting.

    UBRI Photometry of Globular Clusters in the Leo Group Galaxy NGC 3379

    We present wide-area UBRI photometry for globular clusters around the Leo group galaxy NGC 3379. Globular cluster candidates are selected from their B-band magnitudes and their (U-B)o vs (B-I)o colours. A colour-colour selection region was defined from photometry of the Milky Way and M31 globular cluster systems. We detect 133 globular cluster candidates, which supports previous claims of a low specific frequency for NGC 3379. The Milky Way and M31 reveal blue and red subpopulations, with (U-B)o and (B-I)o colours indicating mean metallicities similar to those expected from previous spectroscopic work. The stellar population models of Maraston (2003) and Brocato et al. (2000) are consistent with both subpopulations being old, with metallicities of [Fe/H] ~ -1.5 and -0.6 for the blue and red subpopulations respectively. The models of Worthey (1994) do not reproduce the (U-B)o colours of the red (metal-rich) subpopulation for any modelled age. For NGC 3379 we detect a blue subpopulation with colours, and presumably age/metallicity, similar to those of the Milky Way and M31 globular cluster systems. The red subpopulation is less well defined, perhaps due to increased photometric errors, but indicates a mean metallicity of [Fe/H] ~ -0.6.
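
    As a rough illustration of this kind of colour-colour candidate selection, the sketch below applies a magnitude cut and a rectangular selection box to catalogue arrays. The box limits, magnitude range and data are placeholder assumptions, since the paper defines its region empirically from the Milky Way and M31 systems.

        import numpy as np

        # Hypothetical catalogue columns: B magnitudes and dereddened colours.
        B  = np.array([20.1, 21.5, 22.8, 19.9])
        UB = np.array([0.10, -0.50, 0.45, 0.30])   # (U-B)o
        BI = np.array([1.60, 1.20, 2.10, 1.90])    # (B-I)o

        # Placeholder magnitude cut and colour-colour box; the paper derives
        # its region from the Milky Way and M31 globular cluster systems.
        in_mag = (B > 19.0) & (B < 23.0)
        in_box = (UB > -0.2) & (UB < 0.6) & (BI > 1.0) & (BI < 2.3)

        candidates = np.flatnonzero(in_mag & in_box)
        print(candidates)   # indices of the colour-selected candidates -> [0 2 3]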

    Integrating Building Information Modeling and Health and Safety for Onsite Construction

    Background: Health and safety (H&S) on a construction site can make or break a contractor if not properly managed. The use of Building Information Modeling (BIM) for H&S during construction execution has the potential to augment practitioners' understanding of their sites and, in so doing, reduce the probability of accidents. This research explores BIM usage within the construction industry in relation to H&S communication. Methods: In addition to an extensive literature review, a questionnaire survey was conducted to gather information on the embedding of H&S planning within the BIM environment for site practitioners. Results: The analysis of responses indicated that BIM will enhance the current approach to H&S planning for construction site personnel. Conclusion: The survey suggests that toolbox talks should be integrated with the BIM environment, because they are the predominant procedure for communicating H&S issues on construction sites. The advantage is that personnel can visually understand H&S issues as work progresses during onsite toolbox talks.

    Timber gridshells: beyond the drawing board

    In March 2011, a week-long workshop inviting participation from all architecture and architectural technology students at Sheffield Hallam University, UK was organised with the objective of enhancing students' thinking and experience through construction thinking. It aimed to create a sense of realness by realising a design project collectively. Timber was set as the material of exploration. The students had to make use of bending to design and create a timber gridshell structure, exploiting a quality traditionally regarded as a structural weakness of the material. To do this, students form-found non-mathematically and non-digitally using paper gridmats. This paper describes the aims, activity and outcome of the timber gridshell workshop as a way of preparing the architects and technologists of the future and introducing the challenges of architectural design in terms of economics and construction process, aesthetics, effective communication and structural intuition by working with a given material – all important aspects in achieving effective architecture.

    Dietary nitrate supplementation enhances high-intensity running performance in moderate normobaric hypoxia, independent of aerobic fitness.

    Nitrate-rich beetroot juice (BRJ) increases plasma nitrite concentrations, lowers the oxygen cost (V̇O2) of steady-state exercise and improves exercise performance in sedentary and moderately trained individuals, but rarely in well-trained individuals exercising at sea level. BRJ supplementation may be more effective in a hypoxic environment, where the reduction of nitrite to nitric oxide (NO) is potentiated, such that well-trained and less well-trained individuals may derive a similar ergogenic effect. We conducted a randomised, counterbalanced, double-blind, placebo-controlled trial to determine the effects of BRJ on treadmill running performance in moderate normobaric hypoxia (equivalent to 2500 m altitude) in participants with a range of aerobic fitness levels. Twelve healthy males (V̇O2max ranging from 47.1 to 76.8 ml·kg(-1)·min(-1)) ingested 138 ml concentrated BRJ (∼15.2 mmol nitrate) or a nitrate-deplete placebo (PLA) (∼0.2 mmol nitrate). Three hours later, participants completed steady-state moderate-intensity running and a 1500 m time-trial (TT) in a normobaric hypoxic chamber (FIO2 ∼15%). Plasma nitrite concentrations were significantly greater following BRJ versus PLA 1 h post supplementation, and remained higher in BRJ throughout the testing session (p < 0.05). These findings suggest that a high nitrate dose in the form of a BRJ supplement may improve running performance in individuals with a range of aerobic fitness levels performing moderate- and high-intensity exercise in a normobaric hypoxic environment.

    Modelos de Sucesso S.I., 25 Anos de Evolução (IS Success Models, 25 Years of Evolution)

    This study reports a literature review of the evolution of information systems success evaluation models, specifically the DeLone & McLean (1992) model, over the last twenty-five years. It also covers the main criticisms of the model from the various researchers who contributed to its updating, making it today one of the most widely used models for measuring the success of information systems.

    The Wide-field Infrared Survey Explorer (WISE): Mission Description and Initial On-orbit Performance

    The all-sky surveys done by the Palomar Observatory Schmidt, the European Southern Observatory Schmidt and the United Kingdom Schmidt telescopes, the InfraRed Astronomical Satellite and the Two Micron All Sky Survey have proven to be extremely useful tools for astronomy, with value that lasts for decades. The Wide-field Infrared Survey Explorer (WISE) is mapping the whole sky following its launch on 14 December 2009. WISE began surveying the sky on 14 January 2010 and completed its first full coverage of the sky on 17 July 2010. The survey will continue to cover the sky a second time until the cryogen is exhausted (anticipated in November 2010). WISE is achieving 5 sigma point source sensitivities better than 0.08, 0.11, 1 and 6 mJy in unconfused regions on the ecliptic in bands centered at wavelengths of 3.4, 4.6, 12 and 22 microns. Sensitivity improves toward the ecliptic poles due to denser coverage and lower zodiacal background. The angular resolution is 6.1, 6.4, 6.5 and 12.0 arcseconds at 3.4, 4.6, 12 and 22 microns, and the astrometric precision for high-SNR sources is better than 0.15 arcseconds.
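
    For readers who think in magnitudes rather than flux densities, the sketch below converts the quoted 5-sigma limits with the generic AB-magnitude formula m_AB = -2.5 log10(F / 3631 Jy). Note that WISE photometry is conventionally reported in the Vega system, which differs from AB by band-dependent offsets, so these values are purely illustrative.

        import math

        # Generic AB-magnitude conversion applied to the quoted 5-sigma
        # limits; the bands and fluxes are from the abstract, the formula
        # is standard textbook astronomy (not from the paper).
        bands_um   = [3.4, 4.6, 12, 22]
        limits_mjy = [0.08, 0.11, 1, 6]

        for band, flux in zip(bands_um, limits_mjy):
            m_ab = -2.5 * math.log10(flux * 1e-3 / 3631.0)
            print(f"{band} um: {flux} mJy -> m_AB ~ {m_ab:.1f}")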

    Combining Experiments and Simulations Using the Maximum Entropy Principle

    A key component of computational biology is to compare the results of computer modelling with experimental measurements. Despite substantial progress in the models and algorithms used in many areas of computational biology, such comparisons sometimes reveal that the computations are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method, in an attempt to encourage its further adoption. The general procedure is explained in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are not in complete quantitative accordance with experiments. A common solution to this problem is to explicitly ensure agreement between the two by perturbing the potential energy function towards the experimental data. So far, a general consensus for how such perturbations should be implemented has been lacking. Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights into the problem. We highlight each of these contributions in turn and conclude with a discussion of remaining challenges.
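
    To make the reweighting idea concrete, here is a minimal sketch of maximum entropy reweighting of simulation frames against a single experimental average. The data and function names are illustrative assumptions, not taken from the papers discussed, but the exponential form of the weights is the standard maximum entropy solution under a single average constraint.

        import numpy as np
        from scipy.optimize import brentq

        # Hypothetical input: an observable f evaluated on N simulation
        # frames, and one experimental average to reproduce.
        rng = np.random.default_rng(0)
        f = rng.normal(loc=1.0, scale=0.5, size=1000)   # f(x_i) per frame
        f_exp = 1.2                                      # experimental target

        def maxent_weights(f, f_exp):
            """Weights w_i proportional to exp(-lam*f_i), with lam chosen so
            the weighted average of f equals f_exp. This is the distribution
            closest in relative entropy to the original ensemble that
            satisfies the constraint (the maximum entropy solution)."""
            fc = f - f.mean()                 # centre for numerical stability
            def gap(lam):
                w = np.exp(-lam * fc)
                return np.dot(w / w.sum(), f) - f_exp
            lam = brentq(gap, -50.0, 50.0)    # bracket assumes f_exp is attainable
            w = np.exp(-lam * fc)
            return w / w.sum(), lam

        w, lam = maxent_weights(f, f_exp)
        print(f"lambda = {lam:.3f}, reweighted <f> = {np.dot(w, f):.3f}")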

    Evaluating the use of HILIC in large-scale, multi-dimensional proteomics: horses for courses?

    Despite many recent advances in instrumentation, the sheer complexity of biological samples remains a major challenge in large-scale proteomics experiments, reflecting both the large number of protein isoforms and the wide dynamic range of their expression levels. However, while the dynamic range of expression levels for different components of the proteome is estimated to be ∼10^7–10^8, the equivalent dynamic range of LC–MS is currently limited to ∼10^6. Sample pre-fractionation has therefore become routinely used in large-scale proteomics to reduce sample complexity during MS analysis and thus alleviate the problems of ion suppression and undersampling. There is currently a wide range of chromatographic techniques that can be applied as a first-dimension separation. Here, we systematically evaluated the use of hydrophilic interaction liquid chromatography (HILIC), in comparison with hSAX, as a first dimension for peptide fractionation in a bottom-up proteomics workflow. The data indicate that, in addition to its role as a useful pre-enrichment method for PTM analysis, HILIC can provide a robust, orthogonal and high-resolution method for increasing the depth of proteome coverage in large-scale proteomics experiments. The data also indicate that the choice between HILIC, hSAX and other methods is best made taking into account the specific types of biological analyses being performed.

    Contribution of commuting to total daily moderate-to-vigorous physical activity

    Background: Actively commuting to and from work can increase moderate-to-vigorous physical activity (MVPA) and increase adherence to physical activity (PA) guidelines; however, there is a lack of evidence on the contribution of mixed-mode commutes and continuous stepping bouts to PA. Many commuting studies rely on self-reported PA measures. This study objectively determined the contribution of MVPA during commuting to total MVPA, using cadence to define MVPA, and explored how the length of stepping bouts affects adherence to PA guidelines. Methods: Twenty-seven university staff wore an activPAL™ activity monitor for seven days and kept an activity diary. The activPAL™ quantified MVPA and bout duration, and the activity diary collected information about commute times and modes of commute. Twenty-three participants with at least four days of data were included in the final analysis. Results: The median total time per day spent in MVPA was 49.6 (IQR: 27.4–75.8) minutes, and 31% of this total was accumulated during commuting (median = 15.2 minutes; IQR: 4.11–26.9). Walking and mixed-mode commuters spent more time in MVPA (37.6 and 26.9 minutes, respectively) than car commuters (5.8 minutes). Seventeen of the 23 participants achieved more than 30 minutes of MVPA per day, with five achieving this in their commute alone. A significant positive association was found between commute time spent in MVPA and total MVPA (p < .001). Conclusion: Commuting can be a major contributor to total MVPA, with the mode of commute playing a significant role in the level of this contribution.
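
    As a sketch of cadence-based MVPA scoring of the kind described here, the function below counts minutes at or above a cadence threshold, keeping only those that fall within sufficiently long continuous stepping bouts. The 100 steps/min threshold and one-minute epochs are common conventions assumed for illustration, not values taken from the study.

        import numpy as np

        # Assumed conventions (not from the study): MVPA when cadence is
        # >= 100 steps/min, counted in 1-minute epochs, and kept only when
        # part of a continuous stepping bout of >= min_bout minutes.
        def mvpa_minutes(cadence_per_min, threshold=100, min_bout=1):
            """Total MVPA minutes in a per-minute cadence series."""
            active = np.asarray(cadence_per_min) >= threshold
            total = run = 0
            for is_active in active:
                if is_active:
                    run += 1
                else:
                    if run >= min_bout:
                        total += run
                    run = 0
            if run >= min_bout:       # close a bout that runs to the end
                total += run
            return total

        # Example: a 10-minute walking commute in an otherwise sedentary hour.
        cadence = [0] * 25 + [110] * 10 + [0] * 25
        print(mvpa_minutes(cadence))   # -> 10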