    Multiplicity and Optical Excess Across the Substellar Boundary in Taurus

    We present the results of a high-resolution imaging survey of 22 brown dwarfs and very low mass stars in the nearby (~145 pc), young (~1-2 Myr), low-density star-forming region Taurus-Auriga. We obtained images with the Advanced Camera for Surveys/High Resolution Channel on HST through the F555W (V), F775W (i'), and F850LP (z') filters. This survey confirmed the binarity of MHO-Tau-8 and discovered a new candidate binary system, V410-Xray3, resulting in a binary fraction of 9+/-5% at separations >4 AU. Both binary systems are tight (<10 AU), with mass ratios of 0.75 and 0.46, respectively. The binary frequency and separations are consistent with low-mass binary properties in the field, but the mass ratio of V410-Xray3 is among the lowest known. We find that the binary frequency is higher for very low mass stars and high-mass brown dwarfs than for lower-mass brown dwarfs, implying either a decline in frequency or a shift to smaller separations for the lowest mass binaries. Combining these results with multiplicity statistics for higher-mass Taurus members suggests a gradual decline in binary frequency and separation toward lower masses. The implication is that the distinct binary properties of very low-mass systems are set during formation and that the formation process is similar to the one that creates higher-mass stellar binaries, but occurs on a smaller scale. We show that there are no planets or very low-mass brown dwarfs with mass >3 M_J at projected separations >40 AU orbiting any of the Taurus members in our sample. We identify several brown dwarfs with significant (>1 mag) V-band excesses. The excesses appear to be correlated with signatures of accretion and, if attributed to accretion luminosity, may imply mass accretion rates several orders of magnitude above those inferred from line-profile analyses. (abridged) Comment: Accepted for publication in ApJ; 15 pages, 8 figures in emulateapj format
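
    The quoted binary fraction follows from simple counting statistics (two systems among 22 targets). As a hedged illustration only, and not the authors' exact statistical treatment, a binomial estimate of the fraction and its 1-sigma uncertainty can be sketched in a few lines of Python:

```python
# Illustrative sketch: binomial estimate of a binary fraction and its
# 1-sigma uncertainty. Sample sizes are taken from the abstract (2 binaries
# among 22 targets); the paper's own statistical treatment may differ.
import math

n_targets = 22   # brown dwarfs and very low mass stars imaged
n_binaries = 2   # MHO-Tau-8 plus the candidate V410-Xray3

fraction = n_binaries / n_targets
# Simple binomial (Wald) standard error; a Bayesian or exact binomial
# interval would be more appropriate for such a small sample.
sigma = math.sqrt(fraction * (1.0 - fraction) / n_targets)

print(f"binary fraction = {fraction:.0%} +/- {sigma:.0%}")
# Roughly 9% +/- 6%, consistent with the quoted 9 +/- 5%.
```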

    Translational and Regulatory Challenges for Exon Skipping Therapies

    Several translational challenges are currently impeding the therapeutic development of antisense-mediated exon skipping approaches for rare diseases. Some of these are inherent to developing therapies for rare diseases, such as small patient numbers and limited information on natural history and on the interpretation of appropriate clinical outcome measures. Others are inherent to the antisense oligonucleotide (AON)-mediated exon skipping approach, which employs small modified DNA or RNA molecules to manipulate the splicing process. This is a new approach, and only limited information is available on long-term safety and toxicity for most AON chemistries. Furthermore, AONs often act in a mutation-specific manner, in which case multiple AONs have to be developed for a single disease. A workshop focusing on preclinical development, trial design, outcome measures, and different forms of marketing authorization was organized by the regulatory models and biochemical outcome measures working groups of the Cooperation of Science and Technology Action "Networking towards clinical application of antisense-mediated exon skipping for rare diseases." The workshop included participants from patient organizations and academia, as well as staff members of the European Medicines Agency and the Medicines Evaluation Board (the Netherlands). This statement article contains the key outcomes of this meeting.

    Simplicial Complex based Point Correspondence between Images warped onto Manifolds

    The recent increase in the availability of warped images projected onto a manifold (e.g., omnidirectional spherical images), coupled with the success of higher-order assignment methods, has sparked interest in improved higher-order matching algorithms for images warped by projection. Several existing methods "flatten" such 3D images so that planar graph / hypergraph matching methods can be applied, but the resulting matches suffer from severe distortions and other undesired artifacts and are therefore inaccurate. Conversely, current planar methods cannot be trivially extended to effectively match points on images warped onto manifolds. Matching on these warped images therefore remains a formidable challenge. In this paper, we pose the assignment problem as finding a bijective map between two graph-induced simplicial complexes, which are higher-order analogues of graphs. We propose a constrained quadratic assignment problem (QAP) that matches each p-skeleton of the simplicial complexes, iterating from the highest to the lowest dimension. The accuracy and robustness of our approach are illustrated on both synthetic and real-world spherical / warped (projected) images with known ground-truth correspondences. We significantly outperform existing state-of-the-art spherical matching methods on a diverse set of datasets. Comment: Accepted at ECCV 202
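
    At the core of the approach is a quadratic assignment problem over simplicial complexes. As a minimal, hedged sketch of only the 1-skeleton (plain graph) case, the snippet below matches two adjacency matrices with SciPy's approximate QAP solver; the paper's constrained, dimension-by-dimension matching over higher-order p-skeletons is not reproduced here, and the toy graphs are assumptions for illustration:

```python
# Minimal sketch: approximate graph matching via the quadratic assignment
# problem (QAP), using SciPy's FAQ solver. This covers only the 1-skeleton
# (ordinary graph) case, not the paper's constrained matching over
# higher-order p-skeletons of graph-induced simplicial complexes.
import numpy as np
from scipy.optimize import quadratic_assignment

rng = np.random.default_rng(0)

# Toy source graph and a randomly relabelled copy acting as the target.
n = 8
A = np.triu((rng.random((n, n)) < 0.4).astype(float), 1)
A = A + A.T                                # symmetric, no self-loops

perm = rng.permutation(n)                  # ground truth: node i -> perm[i]
B = np.zeros_like(A)
B[np.ix_(perm, perm)] = A                  # B is A with nodes relabelled

# Maximize edge agreement trace(A P B P^T) over permutations P (graph matching).
res = quadratic_assignment(A, B, method="faq", options={"maximize": True})

print("true permutation     :", perm)
print("recovered assignment :", res.col_ind)   # approximate; may differ under automorphisms
print("objective value      :", res.fun)
```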

    Graph similarity through entropic manifold alignment

    In this paper we decouple the problem of measuring graph similarity into two sequential steps. The first step is the linearization of the quadratic assignment problem (QAP) in a low-dimensional space, given by the embedding trick. The second step is the evaluation of an information-theoretic distributional measure, which relies on deformable manifold alignment. The proposed measure is a normalized conditional entropy, which induces a positive definite kernel when symmetrized. We use bypass entropy estimation methods to compute an approximation of the normalized conditional entropy. Our approach, which is purely topological (i.e., it does not rely on node or edge attributes, although it can potentially accommodate them as additional sources of information), is competitive with state-of-the-art graph matching algorithms as a source of correspondence-based graph similarity, but its complexity is linear instead of cubic (although the complexity of the similarity measure itself is quadratic). We also determine that the best embedding strategy for graph similarity is provided by commute time embedding, and we conjecture that this is related to its invertibility, since the inverse of the embeddings obtained with our method can be used as a generative sampler of graph structure. The work of the first and third authors was supported by the projects TIN2012-32839 and TIN2015-69077-P of the Spanish Government. The work of the second author was supported by a Royal Society Wolfson Research Merit Award.
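
    The abstract singles out commute time embedding as the best-performing embedding strategy. As a hedged illustration of that one ingredient (not of the paper's linearized QAP, manifold alignment, or entropy estimation), a minimal commute-time embedding from the graph Laplacian can be sketched as follows, assuming a small connected undirected graph:

```python
# Minimal sketch: commute-time embedding of a connected, undirected graph.
# Squared Euclidean distances between embedded nodes equal commute times
# vol(G) * (L+_uu + L+_vv - 2 L+_uv). The paper's alignment and entropy
# estimation steps are not reproduced here.
import numpy as np

def commute_time_embedding(A):
    d = A.sum(axis=1)
    vol = d.sum()                           # graph volume (sum of degrees)
    L = np.diag(d) - A                      # combinatorial Laplacian
    evals, evecs = np.linalg.eigh(L)        # eigenvalues in ascending order
    evals, evecs = evals[1:], evecs[:, 1:]  # drop the single zero eigenvalue
    return np.sqrt(vol) * evecs / np.sqrt(evals)  # rows = node coordinates

# Toy example: a 4-cycle; adjacent nodes have commute time vol * R_eff = 8 * 3/4 = 6.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
Y = commute_time_embedding(A)
print(np.sum((Y[0] - Y[1]) ** 2))   # ~6.0
```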

    Macroevolution of the plant–hummingbird pollination system

    Plant–hummingbird interactions are considered a classic example of coevolution, a process in which mutually dependent species influence each other's evolution. Plants depend on hummingbirds for pollination, whereas hummingbirds rely on nectar for food. As a step towards understanding coevolution, this review focuses on the macroevolutionary consequences of plant–hummingbird interactions, a relatively underexplored area in the current literature. We synthesize prior studies, illustrating the origins and dynamics of hummingbird pollination across different angiosperm clades previously pollinated by insects (mostly bees), bats, and passerine birds. In some cases, the crown age of hummingbirds pre-dates the plants they pollinate. In other cases, plant groups transitioned to hummingbird pollination early in the establishment of this bird group in the Americas, with the build-up of both diversities coinciding temporally, and hence suggesting co-diversification. Determining what triggers shifts to and away from hummingbird pollination remains a major open challenge. The impact of hummingbirds on plant diversification is complex, with many tropical plant lineages experiencing increased diversification after acquiring flowers that attract hummingbirds, and others experiencing no change or even a decrease in diversification rates. This mixed evidence suggests that other extrinsic or intrinsic factors, such as local climate and isolation, are important covariables driving the diversification of plants adapted to hummingbird pollination. To guide future studies, we discuss the mechanisms and contexts under which hummingbirds, as a clade and as individual species (e.g. traits, foraging behaviour, degree of specialization), could influence plant evolution. We conclude by commenting on how macroevolutionary signals of the mutualism could relate to coevolution, highlighting the unbalanced focus on the plant side of the interaction, and advocating for the use of species-level interaction data in macroevolutionary studies.

    This other atmosphere: against human resources, Emoji, and devices

    Humans are frequently invited to engage with modern visual forms: emoji, emoticons, pictograms. Some of these forms are finding their way into the workplace, understood as augmentations to workplace atmospheres. What has been called the ‘quantified workplace’ requires its workers to log their rates of stress, their wellbeing, and their subjective sense of productivity on a scale of 1-5 or by emoji, in a context in which HR professionals develop a vocabulary of Workforce Analytics, People Analytics, Human Capital Analytics or Talent Analytics, all in the context of managing the work environment or its atmosphere. Atmosphere is mood, a compote of emotions. Emotions are part of a human package characterised as ‘the quantified self’, a self intertwined with (subject to but also compliant with) tracking and archiving. The logical step for managing atmospheres is to track emotions at a granular and large-scale level. Through the concept of the digital crowd, rated and self-rating, as well as emotion-tracking strategies, the human resource (as worker and consumer) engages in a new politics of the crowd, organised around what political philosopher Jodi Dean calls, affirmatively, ‘secondary visuality’: high-circulation communication fusing together speech, writing and image as a new form. This is the visuality of communicative, or social media, capitalism. But to the extent that it is captured by HR, is it an exposure less to crowdsourced democracy and more a stage in turning the employee into an on-the-shelf item in a digital economy warehouse, assessed by Likert scales? While HR works on new atmospheres of work, what other atmospheres pervade the context of labour, and can these be deployed in the generation of other types of affect, ones that work towards the free association of labour and life?

    ROBITT: a tool for assessing the risk-of-bias in studies of temporal trends in ecology

    1. Aggregated species occurrence and abundance data from disparate sources are increasingly accessible to ecologists for the analysis of temporal trends in biodiversity. However, sampling biases relevant to any given research question are often poorly explored and infrequently reported; this can undermine statistical inference. In other disciplines, it is common for researchers to complete ‘risk-of-bias’ assessments to expose and document the potential for biases to undermine conclusions. The huge growth in available data, and recent controversies surrounding their use to infer temporal trends, indicate that similar assessments are urgently needed in ecology.
    2. We introduce ROBITT, a structured tool for assessing the ‘Risk-Of-Bias In studies of Temporal Trends in ecology’. ROBITT has a similar format to its counterparts in other disciplines: it comprises signalling questions designed to elicit information on the potential for bias in key study domains. In answering these, users define the study's inferential goal(s) and the relevant statistical target populations. This information is used to assess potential sampling biases across domains relevant to the research question (e.g. geography, taxonomy, environment), and how these vary through time. If assessments indicate biases, then users must clearly describe them and/or explain what mitigating action will be taken.
    3. Everything that users need to complete a ROBITT assessment is provided: the tool, a guidance document and a worked example. Following other disciplines, the tool and guidance document were developed through a consensus-forming process across experts working in relevant areas of ecology and evidence synthesis.
    4. We propose that researchers should be strongly encouraged to include a ROBITT assessment when publishing studies of biodiversity trends, especially when using aggregated data. This will help researchers to structure their thinking, clearly acknowledge potential sampling issues, highlight where expert consultation is required, and provide an opportunity to describe data checks that might otherwise go unreported. ROBITT will also enable reviewers, editors and readers to establish how well research conclusions are supported, given a particular dataset and analytical approach. In turn, it should strengthen evidence-based policy and practice, reduce differing interpretations of data, and provide a clearer picture of the uncertainties associated with our understanding of reality.

    Simultaneous Optimization of Both Node and Edge Conservation in Network Alignment via WAVE

    Network alignment can be used to transfer functional knowledge between conserved regions of different networks. Typically, existing methods use a node cost function (NCF) to compute similarity between nodes in different networks and an alignment strategy (AS) to find high-scoring alignments with respect to the total NCF over all aligned nodes (i.e., node conservation). But they then evaluate the quality of their alignments via some other measure, different from the node conservation measure used to guide the alignment construction process. Typically, one measures the amount of conserved edges, but only after alignments are produced. Hence, a recent attempt aimed to directly maximize the amount of conserved edges while constructing alignments, which improved alignment accuracy. Here, we aim to directly maximize both node and edge conservation during alignment construction to further improve alignment accuracy. For this, we design a novel measure of edge conservation that (unlike existing measures, which treat each conserved edge the same) weighs each conserved edge so that edges with highly NCF-similar end nodes are favored. As a result, we introduce a novel AS, Weighted Alignment VotEr (WAVE), which can optimize any measures of node and edge conservation and which can be used with any NCF or combination of multiple NCFs. Using WAVE on top of established state-of-the-art NCFs leads to superior alignments compared to existing methods that optimize only node conservation or only edge conservation, or that treat each conserved edge the same. And while we evaluate WAVE in the computational biology domain, it is easily applicable in any domain. Comment: 12 pages, 4 figures
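
    The key idea is that a conserved edge should count more when its aligned end nodes are highly similar under the NCF. As a hedged, illustrative sketch only (the exact weighting and WAVE's voting-based optimization are defined in the paper), the snippet below scores a fixed alignment by summing conserved edges weighted by the mean NCF similarity of their endpoints; the toy graphs and similarity matrix are assumptions:

```python
# Illustrative scorer (not WAVE itself): given a node alignment between two
# networks and a node similarity matrix S (the "NCF"), sum the conserved
# edges, weighting each by the mean similarity of its aligned endpoints so
# that edges between highly NCF-similar node pairs count more.
import networkx as nx
import numpy as np

def weighted_edge_conservation(G1, G2, mapping, S, idx1, idx2):
    """mapping: G1 node -> G2 node; S[i, j]: similarity of the G1 node with
    row index i and the G2 node with column index j; idx1/idx2: node -> index."""
    score = 0.0
    for u, v in G1.edges():
        fu, fv = mapping[u], mapping[v]
        if G2.has_edge(fu, fv):                          # edge is conserved
            score += 0.5 * (S[idx1[u], idx2[fu]] + S[idx1[v], idx2[fv]])
    return score

# Toy example with an assumed (hypothetical) similarity matrix.
G1 = nx.path_graph(4)                  # edges 0-1, 1-2, 2-3
G2 = nx.path_graph(4)
mapping = {i: i for i in G1}           # identity alignment
idx = {i: i for i in G1}
S = np.full((4, 4), 0.5) + 0.5 * np.eye(4)   # matched pairs most similar
print(weighted_edge_conservation(G1, G2, mapping, S, idx, idx))   # -> 3.0
```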

    Progress in muscular dystrophy research with special emphasis on gene therapy

    Duchenne muscular dystrophy (DMD) is an X-linked, progressive muscle-wasting disease caused by mutations in the DMD gene. Since the disease was first described by physicians in the 19th century, a great deal of information about it has accumulated. One of the authors (Sugita) was among those who first reported that the serum creatine kinase (CK) level is elevated in patients with progressive muscular dystrophy. Even 50 years after that first report, an elevated serum CK level is still the most useful marker in the diagnosis of DMD and a sensitive index of the state of skeletal muscle, and it remains useful for evaluating therapeutic effects. In the latter half of this article, we describe recent progress in the therapy of DMD, with an emphasis on gene therapies, particularly exon skipping.

    Shape description and matching using integral invariants on eccentricity transformed images

    Matching occluded and noisy shapes is a problem frequently encountered in medical image analysis and, more generally, in computer vision. To keep track of changes inside the breast, for example, it is important for a computer-aided detection system to establish correspondences between regions of interest. Shape transformations, computed both with integral invariants (II) and with geodesic distance, yield signatures that are invariant to isometric deformations such as bending and articulation. Integral invariants describe the boundaries of planar shapes, but they provide no information about where a particular feature lies on the boundary with respect to the overall shape structure. Conversely, eccentricity transforms (Ecc) can match shapes by signatures of geodesic distance histograms based on information from inside the shape, but they ignore the boundary information. We describe a method that combines the boundary signature of a shape obtained from II with structural information from the Ecc to yield results that improve on either used separately.
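
    As a hedged sketch of just the eccentricity transform ingredient (the integral-invariant boundary signature and the combined matching scheme are not reproduced, and the tiny mask is an assumed toy example), the snippet below computes, for each foreground pixel of a binary mask, the geodesic distance to the farthest other foreground pixel over a 4-connected pixel graph:

```python
# Minimal sketch of the eccentricity transform (Ecc) on a small binary mask:
# for every foreground pixel, the geodesic (within-shape) distance to the
# farthest other foreground pixel, computed on a 4-connected pixel graph.
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.csgraph import shortest_path

mask = np.array([[1, 1, 1, 0],
                 [0, 1, 0, 0],
                 [0, 1, 1, 1]], dtype=bool)

coords = np.argwhere(mask)                      # foreground pixel coordinates
index = {tuple(p): k for k, p in enumerate(coords)}

# Build a unit-weight graph over 4-connected foreground pixels.
n = len(coords)
adj = lil_matrix((n, n))
for (r, c), k in index.items():
    for nb in ((r + 1, c), (r, c + 1)):
        if nb in index:
            adj[k, index[nb]] = 1
            adj[index[nb], k] = 1

geodesic = shortest_path(adj.tocsr(), method="D", directed=False)
ecc = np.zeros(mask.shape)
ecc[tuple(coords.T)] = geodesic.max(axis=1)     # eccentricity per pixel
print(ecc)
```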