
    Low-mass X-ray binaries from black-hole retaining globular clusters

    Recent studies suggest that globular clusters (GCs) may retain a substantial population of stellar-mass black holes (BHs), in contrast to the long-held belief that they harbor few to zero BHs. We model the population of BH low-mass X-ray binaries (BH-LMXBs), an ideal observable proxy for elusive single BHs, produced from a representative group of Milky Way GCs with variable BH populations. We simulate the formation of BH binaries in GCs through exchange interactions between binary and single stars in the company of tens to hundreds of BHs. Additionally, we consider the impact of the BH population on the rate of compact binaries undergoing gravitational-wave-driven mergers. The characteristics of the BH-LMXB population and binary properties are sensitive to the GC's structural parameters as well as its unobservable BH population. We find that GCs retaining ∼1000 BHs produce a galactic population of ∼150 ejected BH-LMXBs, whereas GCs retaining only ∼20 BHs produce zero ejected BH-LMXBs. Moreover, we explore the possibility that some of the presently known BH-LMXBs might have originated in GCs and identify five candidate systems. Comment: 27 pages, 18 figures, 7 tables, submitted to MNRA

    Shiftable Context: Addressing Training-Inference Context Mismatch in Simultaneous Speech Translation

    Transformer models using segment-based processing have been an effective architecture for simultaneous speech translation. However, such models create a context mismatch between training and inference environments, hindering potential translation accuracy. We solve this issue by proposing Shiftable Context, a simple yet effective scheme to ensure that consistent segment and context sizes are maintained throughout training and inference, even in the presence of partially filled segments due to the streaming nature of simultaneous translation. Shiftable Context is also broadly applicable to segment-based transformers for streaming tasks. Our experiments on the English-German, English-French, and English-Spanish language pairs from the MuST-C dataset demonstrate that when applied to the Augmented Memory Transformer, a state-of-the-art model for simultaneous speech translation, the proposed scheme achieves average increases of 2.09, 1.83, and 1.95 BLEU points across wait-k values for the three language pairs, respectively, with minimal impact on computation-aware Average Lagging. Comment: Accepted at ICML 202
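One way to picture the scheme (a hypothetical toy sketch of the idea as described in the abstract, not the authors' implementation): when a streaming segment arrives only partially filled, the left context is extended so that context plus segment always spans the same number of frames the model saw in training.

```python
def shiftable_windows(stream, seg_size, ctx_size):
    """Yield (context, segment) pairs with a constant combined span.

    When a segment is partially filled (as at the end of a stream),
    the left context is extended ("shifted") to compensate, so training
    and inference see the same context-plus-segment size.
    """
    windows = []
    for start in range(0, len(stream), seg_size):
        segment = stream[start:start + seg_size]
        # Extend the context by however many frames the segment is short.
        want_ctx = ctx_size + (seg_size - len(segment))
        context = stream[max(0, start - want_ctx):start]
        windows.append((context, segment))
    return windows
```

With seg_size=4 and ctx_size=2, a 10-frame stream ends in a 2-frame partial segment whose context grows to 4 frames, keeping the combined span at 6 whenever enough history exists.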

    InQuiry: A Participatory Approach for Understanding Stakeholder Perceptions

    This article addresses two important and elusive issues for funded projects: quantifiable measures and deep understandings of participant perceptions. It describes the development of the InQuiry evaluation tool, which combines Q methodology (factor analysis process to quantify perceptions) with a qualitative participatory approach. InQuiry generates both quantified metrics of what participants believe about a given topic and also a rich narrative of why participants think the way they do. These data yield metrics for understanding fidelity, outcomes, and impacts. Beginning with the history of a program funded by the W.K. Kellogg Foundation, this article also illustrates the tool’s usefulness. The Seattle Community Learning Exchange, an example of InQuiry in action from beginning to end, explored how members of a diverse community perceived peacemaking and healing within the community and implemented peacemaking circles by building capacity and shifting perceptions
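The Q-methodology core described above — correlating participants' Q-sorts and factoring the correlation matrix so that shared viewpoints emerge as factors — can be sketched numerically (an illustrative sketch of generic Q-method arithmetic, not the InQuiry tool itself):

```python
import numpy as np

def q_factor_loadings(sorts):
    """First-factor loadings from a participants-by-statements Q-sort matrix.

    Q methodology correlates *people* (not items), then factors that
    correlation matrix; participants with similar loadings share a viewpoint.
    """
    corr = np.corrcoef(sorts)                 # participant-by-participant correlations
    vals, vecs = np.linalg.eigh(corr)         # symmetric matrix, ascending eigenvalues
    first = vecs[:, -1] * np.sqrt(vals[-1])   # principal-factor loadings
    return first if first.sum() >= 0 else -first  # fix arbitrary eigenvector sign
```

Participants whose loadings share a sign cluster into one perspective; the narrative "why" then comes from the participatory follow-up, which no factor analysis can supply.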

    Do childhood socioeconomic circumstances moderate the association between childhood cognitive ability and all-cause mortality across the life course? Prospective observational study of the 36-day sample of the Scottish Mental Survey 1947

    Background There is growing evidence that higher childhood cognitive ability predicts lower all-cause mortality risk across the life course. Whereas this association does not appear to be mediated by childhood socioeconomic circumstances, it is unclear whether socioeconomic circumstances moderate this association. Methods The moderating role of childhood socioeconomic circumstances was assessed in 5318 members of the 36-day sample of the Scottish Mental Survey 1947. Univariate, sex-adjusted and age-adjusted, and mutually adjusted Cox models predicting all-cause mortality risk up to age 79 years were created using childhood IQ scores and childhood social class as predictors. Moderation was assessed by adding an interaction term between IQ scores and social class and comparing model fit. Results An SD advantage in childhood IQ scores (HR=0.83, 95% CI 0.79 to 0.86, p<0.001) and a single-class advantage in childhood social class (HR=0.92, 95% CI 0.88 to 0.97, p<0.001) independently predicted lower mortality risk. Adding the IQ–social class interaction effect did not improve model fit (Δχ²=1.36, p=0.24), and the interaction effect did not predict mortality risk (HR=1.03, 95% CI 0.98 to 1.07, p=0.25). Conclusions The present study demonstrated that the association between higher childhood cognitive ability and lower all-cause mortality risk is not conditional on childhood social class. Whereas other measures of socioeconomic circumstances may play a moderating role, these findings suggest that the benefits of higher childhood cognitive ability for longevity apply regardless of the material socioeconomic circumstances experienced in childhood
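The moderation test above is a likelihood-ratio comparison of nested Cox models: adding the one-degree-of-freedom interaction term changes the chi-square statistic by 1.36. That arithmetic can be checked directly (generic likelihood-ratio machinery, not the authors' code):

```python
from math import erfc, sqrt

def lrt_pvalue(chisq_delta, df=1):
    """P-value for a likelihood-ratio test statistic.

    For df=1 the chi-square survival function has the closed form
    erfc(sqrt(x/2)); adding a single interaction term costs one df.
    """
    assert df == 1, "closed form implemented for df=1 only"
    return erfc(sqrt(chisq_delta / 2.0))

# The abstract's interaction test: a chi-square change of 1.36 on 1 df.
p = lrt_pvalue(1.36)
```

lrt_pvalue(1.36) ≈ 0.24, reproducing the reported non-significant improvement in model fit.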

    The role of fully coupled ice sheet basal processes in Quaternary glacial cycles

    Bed conditions such as meltwater pressurization and unconsolidated sediment cover (soft versus hard bedded) strongly impact ice sheet sliding velocities. How the dynamical processes governing these conditions affect glacial-cycle-scale ice sheet evolution has been little studied, so the influence of subglacial hydrology and of glacial sediment production and transport is largely unknown. Here I present a glaciological model, the Glacial Systems Model (GSM), with the most complete representations to date of fully coupled subglacial hydrology and sediment production and transport for the continental, glacial-cycle-scale context. I compare the influence of several types of subglacial hydrology drainage systems on millennial-scale variability and examine the role that dynamical sediment processes potentially played in the mid-Pleistocene Transition (MPT) from 41 to 100 kyr glacial cycles. Subglacial hydrology has long been inferred to play a role in glacial dynamics at decadal and shorter scales. However, it remains unclear whether subglacial hydrology has a critical role in ice sheet evolution on greater-than-centennial time-scales. It is also unclear which drainage system is most appropriate for the continental/glacial-cycle scale. Here I compare the dynamical role of the three subglacial hydrology systems most prevalent in the literature in the context of surge behaviour for an idealized Hudson Strait scale ice stream. I find that subglacial hydrology is an important system inductance for realistic ice stream surging and that the three formulations all exhibit similar surge behaviour. Even a detail as fundamental as mass-conserving transport of subglacial water is not necessary for simulating a full range of surge frequency and amplitude. However, one difference is apparent: the combined positive and negative feedbacks of the linked-cavity system yield longer-duration surges and a broader range of effective pressures than its poro-elastic and leaky-bucket counterparts.
The MPT from 41 kyr to 100 kyr glacial cycles was one of the largest changes in the Earth system over the past million years. A change from a low- to high-friction base under the North American Ice Complex through the removal of pre-glacial regolith has been hypothesized to play a critical role in the transition to longer and stronger glaciations. However, this hypothesis requires constraints on pre-glacial regolith cover, as well as mechanistic constraints on whether the appropriate amount of regolith can be removed from the required regions in time for the MPT. This is the first study to test the regolith hypothesis for a realistic 3D North American ice sheet that treats regolith removal as a system-internal process instead of a forced soft-to-hard transition. The fully coupled climate, ice, subglacial hydrology, and sediment physics capture the progression of Pleistocene glacial cycles within parametric and observational uncertainty. Incorporating constraints from estimates of the present-day sediment distribution, Quaternary erosion, and Atlantic Quaternary sediment volume suggests the mean Pliocene regolith thickness was 40 m or less. Given this constraint, I compare the simulated soft-to-hard bed transitions with the timing inferred for the MPT. The combined constraints from bedrock erosion and sediment transport pose a challenge to the regolith hypothesis: denudation occurs well in advance of the MPT, and the hard-bedded area is largely constant by 1.5 Ma. Furthermore, I examine the sensitivity of glacial cycle evolution to the initial thickness of the regolith in the absence of erosion. Surprisingly, thicker regolith does not delay the transition but instead produces large glacial cycles in the early Pleistocene, even extending the length of some. This is due to the effect of the higher topography on ice sheet mass balance.
Therefore, I suggest that the regolith removal mechanism is not singularly responsible for the MPT, but that the MPT results from changes in many aspects of the system. One such aspect, which remains under-studied in the literature, is the long-term evolution of glacierized beds over the Pleistocene
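As an illustration of the leaky-bucket drainage end-member mentioned above, here is a minimal toy water-budget step (a deliberately simplified sketch with invented parameter names, nothing like the GSM's coupled treatment): stored water grows with melt supply, leaks at a rate proportional to storage, and raises water pressure, lowering the effective pressure that controls basal drag.

```python
def leaky_bucket_step(h, melt, leak_rate, dt):
    """One explicit Euler step of a leaky-bucket water budget:
    dh/dt = melt - leak_rate * h  (outflow proportional to storage)."""
    return max(h + dt * (melt - leak_rate * h), 0.0)

def effective_pressure(h, overburden, k):
    """Effective pressure N = ice overburden minus water pressure;
    here water pressure is taken to grow linearly with storage h."""
    return max(overburden - k * h, 0.0)
```

The budget relaxes toward a steady storage h* = melt / leak_rate; in surge-cycle terms, the interesting behaviour comes from coupling N back into sliding, which this fragment deliberately omits.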

    Generational differences in loneliness and its psychological and sociodemographic predictors: An exploratory and confirmatory machine learning study

    BACKGROUND: Loneliness is a growing public health issue in the developed world. Among older adults, loneliness is a particular challenge, as the older segment of the population is growing and loneliness is comorbid with many mental as well as physical health issues. Comorbidity and common-cause factors make identifying the antecedents of loneliness difficult; however, contemporary machine learning techniques are positioned to tackle this problem. METHODS: This study analyzed four cohorts of older individuals, split into two age groups – 45–69 and 70–79 – to examine which common psychological and sociodemographic factors are associated with loneliness at different ages. Gradient boosted modeling, a machine learning technique, and regression models were used to identify and replicate associations with loneliness. RESULTS: In all cohorts, higher emotional stability was associated with lower loneliness. In the older group, social circumstances such as living alone were also associated with higher loneliness. In the younger group, extraversion's association with lower loneliness was the only other confirmed relationship. CONCLUSIONS: Different individual and social factors might underlie loneliness differences in distinct age groups. Machine learning methods have the potential to unveil novel associations between psychological and social variables, particularly interactions, and mental health outcomes
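For readers unfamiliar with the technique, gradient boosted modeling fits a sequence of small trees to the residuals of the current prediction, and a feature's contribution can be scored by its total squared-error reduction. A minimal from-scratch sketch with depth-one trees (illustrative only, not the study's analysis pipeline):

```python
import numpy as np

def best_stump(X, r):
    """Best single-feature threshold split of residuals r by squared-error gain."""
    base = ((r - r.mean()) ** 2).sum()
    best_gain, best_split = 0.0, None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[:-1]:  # every threshold keeps both sides nonempty
            m = X[:, j] <= t
            err = ((r[m] - r[m].mean()) ** 2).sum() + ((r[~m] - r[~m].mean()) ** 2).sum()
            if base - err > best_gain:
                best_gain, best_split = base - err, (j, t, r[m].mean(), r[~m].mean())
    return best_gain, best_split

def gbm_fit(X, y, rounds=50, lr=0.1):
    """Squared-error gradient boosting on stumps; returns predictions and
    per-feature importance (cumulative squared-error reduction)."""
    pred = np.full(len(y), y.mean())
    importance = np.zeros(X.shape[1])
    for _ in range(rounds):
        gain, split = best_stump(X, y - pred)
        if split is None:  # residuals already constant
            break
        j, t, left, right = split
        pred = pred + lr * np.where(X[:, j] <= t, left, right)
        importance[j] += gain
    return pred, importance
```

On synthetic data where the outcome depends on only one predictor, the importance score concentrates on that feature, which is the exploratory use the study describes; confirmation then falls to conventional regression.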

    Human-Machine Interfacing via Epidermal Electronic Systems

    Surface electromyography (EMG) is rapidly becoming a viable control source for interfacing with machines. By measuring the electric potential generated by the contractions of skeletal muscles, systems can be controlled with a mere flick of the wrist, giving the user intuitive and versatile control. As sensors and classification algorithms become more sophisticated, EMG control has increasing potential to revolutionize the way we interact with and utilize technology. Prosthetics in particular have benefited the most from these recent advances, with one research team successfully returning ambulation to a leg amputee last year. However, this technology is not yet suitable for practical use, as implementations often require bulky hardware and are limited by the complexity of the software. To amend these issues and facilitate further research in this field, we propose a consolidated solution that will handle the acquisition and classification of an EMG input while providing protocols to interface with an external system. Where most setups are cumbersome and impractical, usually requiring a piece of dedicated hardware for each step in the signal chain, we have made our system as small and cost-effective as possible. By consolidating our solution onto a single circuit board with Bluetooth integration, we will maximize portability and afford researchers flexibility when working with our system. This portability will allow our device to be placed in close proximity to the EMG sensors and transmit the signal wirelessly to a central hub, which will process it further. The central hub will classify the waveform and map it to a definitive command that can be used to interface with an external system. This will abstract the classification aspect away from developers, simplifying the process and allowing them to focus on what they are trying to accomplish. 
Our system will also allow for further extension by being robust enough to handle multiple EMG inputs and by allowing researchers to easily configure the device for their purposes. To accommodate future advances in classification algorithms or future improvements to the system itself, we will also provide frameworks that allow researchers and developers to program the device themselves. By giving researchers the tools to quickly implement this technology, we let them focus on other aspects of what they are trying to build instead of worrying about the technicalities that go into designing such a system. Further development in this field will give us unprecedented ways to interact with the world around us and change how we utilize technology. Given this technology’s particular relevance to people with disabilities, our project also has the potential to drastically improve quality of life.
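As an illustration of the central hub's classify-and-map step, one common approach extracts simple features from each EMG window and matches them to calibrated command templates. This is a hypothetical sketch: the feature pair (RMS amplitude and zero-crossing count) is a classic choice, but the command names and template values below are invented for illustration.

```python
import math

def emg_features(window):
    """Two classic surface-EMG features: RMS amplitude and zero-crossing count."""
    rms = math.sqrt(sum(x * x for x in window) / len(window))
    zero_crossings = sum(1 for a, b in zip(window, window[1:]) if a * b < 0)
    return (rms, zero_crossings)

def classify(window, templates):
    """Map a raw EMG window to the command whose feature template is nearest."""
    f = emg_features(window)
    return min(templates, key=lambda cmd: math.dist(f, templates[cmd]))

# Hypothetical per-user calibration: (RMS, zero-crossing) templates per command.
templates = {"rest": (0.0, 10), "flex": (0.6, 10)}
```

The developer then sees only the resulting command string, which is exactly the abstraction the proposal describes.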

    Liquitronics Final Project Report

    This final project report details the design evaluation and the tests the Liquitronics team conducted on its 96-well-plate robotic liquid handler. The team was able to create a prototype that reflects the most important aspects the team set out to accomplish. The main focus of the semester was completing a functioning chassis and movement system along with the pipette mechanism. The following tests were completed: z-axis positional accuracy, x/y-axis positional accuracy, tip discard test, plunger actuator test, fluid volume test, sustained power draw test, and a size and weight test. Both positional accuracy tests passed without significant issues. The z-axis needed to be within 0.5 millimeters of the target location for every trial, and the trial with the largest error had an error of 0.1 millimeters. Similarly, the x/y test needed each trial to be within 1 millimeter, and the greatest error measured was only 0.6 millimeters. The tip discard test proved that the prototype could eject a pipette tip without fail. This test also yielded a relationship between the voltage supplied to the linear actuator and the speed at which it moved; these results will aid in determining the working voltage for the prototype’s actuators and electronics. Unfortunately, two tests did not meet their acceptance criteria. The final design is limited to a four-foot-wide by two-foot-deep space, and the current prototype measures 2.23 feet in both directions. However, after speaking with the project sponsor, it was agreed that the size limit was more flexible than originally stated, and thus the current dimensions do not present any practical issues. Additionally, the prototype is well under the 500-pound weight limit, measuring 34 pounds. The second unsuccessful test was the sustained power draw test. This test is meant to prove that the circuitry of the prototype can run for extended periods of time without any components failing. 
Without any of the motors running, the prototype was drawing just over 300 milliamps, which was lower than expected. However, the voltage regulator began to burn out, and 2 of the 9 stepper motor drivers stopped working. The reason for these failures is not yet known, but the team is currently brainstorming ways to pinpoint the cause and ensure it is fixed. Future improvements will focus on a fully automated prototype: assembly of the mechanical parts must be completed, the full control code must be written, and the power draw problems must be addressed
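The pass/fail logic of the positional-accuracy tests reduces to comparing the worst trial error against a tolerance. A small helper makes the criterion explicit (the tolerances and worst-case errors are the report's; the individual trial values are illustrative):

```python
def acceptance_pass(trial_errors, tolerance):
    """A test passes when every trial's absolute error is within tolerance."""
    return all(abs(e) <= tolerance for e in trial_errors)

# Report figures: worst z-axis error 0.1 mm against a 0.5 mm tolerance,
# worst x/y error 0.6 mm against a 1 mm tolerance (trial lists illustrative).
z_ok = acceptance_pass([0.05, 0.1, 0.08], 0.5)
xy_ok = acceptance_pass([0.3, 0.6, 0.45], 1.0)
```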

    Drag Reduction of a Modern Straight Truck

    A wind tunnel test program was conducted at the Langley Full Scale Tunnel (LFST) to evaluate the performance of five passive drag reduction configurations on a modern straight truck at full scale. Configurations were tested in a build-up fashion with results representing a cumulative effect. Tested configurations include a front valance, a front box fairing, a boat-tail, an ideal side-skirt, and a practical side-skirt. Configurations were evaluated over a nominal 9 degree yaw sweep to establish wind averaged drag coefficients using SAE J1252. Genuine replicate yaw sweeps were used in an uncertainty analysis. Results show up to 28% improvement in wind-averaged drag coefficient and that significant gains can be made in straight truck fuel economy, even at non-highway speeds. © 2011 SAE International
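SAE J1252 prescribes its own wind-averaging procedure; as a simplified illustration of the concept (not the standard's exact weighting), a wind-averaged drag coefficient can be computed by averaging CD over uniformly distributed wind directions, weighting each by the resultant dynamic pressure and interpolating CD from the measured yaw sweep. All speeds and CD values below are invented for illustration.

```python
import math

def wind_averaged_cd(yaw_deg, cd, v_truck, v_wind, n=360):
    """Simplified wind-averaged drag coefficient (illustrative; SAE J1252
    defines the official weighting): average CD over uniformly distributed
    wind directions, weighted by resultant dynamic pressure.

    yaw_deg/cd: measured yaw sweep (degrees; CD assumed symmetric in yaw).
    """
    def interp(psi):
        # Linear interpolation of CD at |psi| from the sweep data.
        psi = abs(psi)
        for (a, ca), (b, cb) in zip(zip(yaw_deg, cd), zip(yaw_deg[1:], cd[1:])):
            if a <= psi <= b:
                return ca + (cb - ca) * (psi - a) / (b - a)
        return cd[-1]  # beyond the sweep: hold the last value
    total = 0.0
    for k in range(n):
        theta = 2.0 * math.pi * k / n            # wind direction, uniform
        u = v_truck + v_wind * math.cos(theta)   # along-track component
        w = v_wind * math.sin(theta)             # cross-track component
        psi = math.degrees(math.atan2(w, u))     # resultant yaw angle
        q_ratio = (u * u + w * w) / (v_truck * v_truck)  # dynamic-pressure weight
        total += interp(psi) * q_ratio
    return total / n
```

With a flat CD curve the result reduces to CD · (1 + (v_wind/v_truck)²), a useful sanity check; with a curve that rises with yaw, the wind-averaged value exceeds the zero-yaw value, which is why yaw sweeps matter for fuel-economy estimates.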