
    Instrument Packages for the Cold, Dark, High Radiation Environments

    We are developing a small cold-temperature instrument package concept that integrates a cold-temperature power system and rad-hard, ultra-low-temperature, ultra-low-power electronics components and power supplies now under development into a cold-temperature, surface-operational version of a planetary surface instrument package. We are already in the process of developing a lower-power, lower-temperature version of an instrument of mutual interest to SMD and ESMD to support the search for volatiles (the mass spectrometer VAPoR, Volatile Analysis by Pyrolysis of Regolith), both as a stand-alone instrument and as part of an environmental monitoring package.

    Flexibility in Animal Signals Facilitates Adaptation to Rapidly Changing Environments

    Charles Darwin posited that secondary sexual characteristics result from competition to attract mates. In male songbirds, specialized vocalizations represent secondary sexual characteristics of particular importance because females prefer songs at specific frequencies, amplitudes, and durations. For birds living in human-dominated landscapes, historic selection for song characteristics that convey fitness may compete with novel selective pressures from anthropogenic noise. Here we show that black-capped chickadees (Poecile atricapillus) use shorter, higher-frequency songs when traffic noise is high, and longer, lower-frequency songs when noise abates. We suggest that chickadees balance opposing selective pressures by using low-frequency songs to preserve vocal characteristics of dominance that repel competitors and attract females, and high-frequency songs to increase song transmission when their environment is noisy. The remarkable vocal flexibility exhibited by chickadees may be one reason that they thrive in urban environments, and such flexibility may also support subsequent genetic adaptation to an increasingly urbanized world.

    Tropical Dominating Sets in Vertex-Coloured Graphs

    Given a vertex-coloured graph, a dominating set is said to be tropical if every colour of the graph appears at least once in the set. Here, we study minimum tropical dominating sets from structural and algorithmic points of view. First, we prove that the tropical dominating set problem is NP-complete even when restricted to a simple path. Then, we establish upper bounds related to various parameters of the graph such as minimum degree and number of edges. We also give upper bounds for random graphs. Last, we give approximability and inapproximability results for general and restricted classes of graphs, and establish an FPT algorithm for interval graphs. Comment: 19 pages, 4 figures
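
    To make the definition above concrete, here is a minimal sketch (not from the paper) that checks whether a candidate vertex set is a tropical dominating set; the adjacency-list and colour-map representations are assumptions made for this example.

        # Check whether a candidate set is a tropical dominating set:
        # (1) every vertex is in the set or adjacent to a vertex in it, and
        # (2) every colour used in the graph appears on some vertex of the set.
        def is_tropical_dominating_set(adj, colour, candidate):
            """adj: vertex -> set of neighbours; colour: vertex -> colour."""
            candidate = set(candidate)
            dominated = all(v in candidate or adj[v] & candidate for v in adj)
            tropical = set(colour.values()) <= {colour[v] for v in candidate}
            return dominated and tropical

        # Example: a path a-b-c-d coloured red, blue, red, green.
        adj = {'a': {'b'}, 'b': {'a', 'c'}, 'c': {'b', 'd'}, 'd': {'c'}}
        colour = {'a': 'red', 'b': 'blue', 'c': 'red', 'd': 'green'}
        print(is_tropical_dominating_set(adj, colour, {'a', 'b', 'd'}))  # True
        print(is_tropical_dominating_set(adj, colour, {'a', 'd'}))       # False: no blue vertex in the set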

    On the Metric Dimension of Cartesian Products of Graphs

    A set S of vertices in a graph G resolves G if every vertex is uniquely determined by its vector of distances to the vertices in S. The metric dimension of G is the minimum cardinality of a resolving set of G. This paper studies the metric dimension of Cartesian products G*H. We prove that the metric dimension of G*G is tied in a strong sense to the minimum order of a so-called doubly resolving set in G. Using bounds on the order of doubly resolving sets, we establish bounds on the metric dimension of G*H for many examples of G and H. One of our main results is a family of graphs G with bounded metric dimension for which the metric dimension of G*G is unbounded.
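
    As an illustration of the resolving-set definition (a sketch, not from the paper), the following checks whether a set of landmark vertices resolves an unweighted graph by comparing BFS distance vectors; the adjacency-list representation is an assumption made for this example.

        from collections import deque

        def bfs_distances(adj, source):
            """Unweighted shortest-path distances from source to every vertex."""
            dist = {source: 0}
            queue = deque([source])
            while queue:
                u = queue.popleft()
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        queue.append(v)
            return dist

        def resolves(adj, landmarks):
            """True if every vertex has a distinct vector of distances to the landmarks."""
            dist_to = {s: bfs_distances(adj, s) for s in landmarks}
            vectors = [tuple(dist_to[s][v] for s in landmarks) for v in adj]
            return len(set(vectors)) == len(vectors)

        # Example: in the 4-cycle a-b-c-d-a, one landmark is not enough but two are.
        cycle = {'a': ['b', 'd'], 'b': ['a', 'c'], 'c': ['b', 'd'], 'd': ['c', 'a']}
        print(resolves(cycle, ['a']))       # False: b and d get the same distance vector
        print(resolves(cycle, ['a', 'b']))  # True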

    Hierarchical search strategy for the detection of gravitational waves from coalescing binaries: Extension to post-Newtonian wave forms

    The detection of gravitational waves from coalescing compact binaries would be a computationally intensive process if a single bank of template wave forms (i.e., a one-step search) is used. In an earlier paper we presented a detection strategy, called a two-step search, that utilizes a hierarchy of template banks. It was shown that in the simple case of a family of Newtonian signals, an on-line two-step search was about 8 times faster than an on-line one-step search (for initial LIGO). In this paper we extend the two-step search to the more realistic case of zero-spin 1.5 post-Newtonian wave forms. We also present formulas for detection and false alarm probabilities which take statistical correlations into account. We find that for the case of a 1.5 post-Newtonian family of templates and signals, an on-line two-step search requires about 1/21 the computing power that would be required for the corresponding on-line one-step search. This reduction is achieved when signals having strength S = 10.34 are required to be detected with a probability of 0.95, at an average of one false event per year, and the noise power spectral density used is that of advanced LIGO. For initial LIGO, the reduction achieved in computing power is about 1/27 for S = 9.98 and the same probabilities for detection and false alarm as above. Comment: 30-page RevTeX file and 17 figures (postscript). Submitted to PRD Feb 21, 199
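
    The hierarchical idea can be sketched schematically: a sparse (coarse) template bank is searched first with a lowered threshold, and only the fine templates associated with coarse-bank triggers are then correlated with the data. The sketch below is an illustrative toy, not the paper's implementation; the banks, thresholds, and the simplified matched-filter statistic are assumptions.

        import numpy as np

        def matched_filter_stat(data, template):
            """Toy matched-filter statistic: peak normalised correlation over time shifts."""
            template = template / np.linalg.norm(template)
            return np.max(np.abs(np.correlate(data, template, mode='valid')))

        def two_step_search(data, coarse_bank, fine_banks, coarse_threshold, final_threshold):
            """Stage 1: sparse bank with a lowered threshold. Stage 2: only the fine
            templates belonging to coarse templates that produced a trigger."""
            detections = []
            for i, coarse in enumerate(coarse_bank):
                if matched_filter_stat(data, coarse) > coarse_threshold:   # first-stage trigger
                    for j, fine in enumerate(fine_banks[i]):
                        stat = matched_filter_stat(data, fine)
                        if stat > final_threshold:
                            detections.append((i, j, stat))
            return detections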

    Pliocene to Pleistocene climate and environmental history of Lake El'gygytgyn, Far East Russian Arctic, based on high-resolution inorganic geochemistry data

    The 3.6 Ma sediment record of Lake El'gygytgyn (NE Russia), Far East Russian Arctic, represents the longest continuous climate archive of the terrestrial Arctic. Its elemental composition as determined by X-ray fluorescence scanning exhibits significant changes since the mid-Pliocene caused by climate-driven variations in primary production, postdepositional diagenetic processes, and lake circulation as well as weathering processes in its catchment. During the mid- to late Pliocene, warmer and wetter climatic conditions are reflected by elevated Si / Ti ratios, indicating enhanced diatom production in the lake. Prior to 3.3 Ma, this signal is overprinted by intensified detrital input from the catchment, visible in maxima of clastic-related proxies, such as K. In addition, calcite formation in the early lake history points to enhanced Ca flux into the lake caused by intensified weathering in the catchment. A lack of calcite deposition after ca. 3.3 Ma is linked to the development of permafrost in the region triggered by cooling in the mid-Pliocene. After ca. 3.0 Ma the elemental data suggest a gradual transition to Pleistocene-style glacial-interglacial cyclicity. In the early Pleistocene, the cyclicity was first dominated by variations on the 41 kyr obliquity band but experienced a change to a 100 kyr eccentricity dominance during the middle Pleistocene transition (MPT) at ca. 1.2-0.6 Ma. This clearly demonstrates the sensitivity of the Lake El'gygytgyn record to orbital forcing. A successive decrease of the baseline levels of the redox-sensitive Mn / Fe ratio and magnetic susceptibility between 2.3 and 1.8 Ma reflects an overall change in bottom-water oxygenation due to an intensified occurrence of pervasive glacial episodes in the early Pleistocene. The coincidence with major changes in the North Pacific and Bering Sea paleoceanography at ca. 1.8 Ma implies that the change in lake hydrology was caused by a regional cooling in the North Pacific and the western Beringian landmass and/or changes in continentality. Further increases in total organic carbon and total nitrogen content after ca. 1.6 Ma are attributed to reduced organic matter decay in the sediment during prolonged anoxic periods. This points to more extensive periods of perennial ice coverage and, thus, to a progressive shift towards more intense peak glacial periods. In the course of the Pleistocene glacial-interglacial sequence, eight so-called super-interglacials occur. Their exceptionally warm conditions are reflected by extreme Si / Ti peaks accompanied by lows in Ti, K, and Fe, thus indicating extraordinarily high lake productivity.

    On the Tractability of (k, i)-Coloring

    In an undirected graph, a proper (k, i)-coloring is an assignment of a set of k colors to each vertex such that any two adjacent vertices have at most i common colors. The (k, i)-coloring problem is to compute the minimum number of colors required for a proper (k, i)-coloring. This is a generalization of the classic graph coloring problem. Majumdar et al. [CALDAM 2017] studied this problem and showed that the decision version of the (k, i)-coloring problem is fixed parameter tractable (FPT) with tree-width as the parameter. They asked if there exists an FPT algorithm with the size of the feedback vertex set (FVS) as the parameter without using tree-width machinery. We answer this in the affirmative by giving a parameterized algorithm with the size of the FVS as the parameter. We also give a faster and simpler exact algorithm for (k, k − 1)-coloring, and make progress on the NP-completeness of specific cases of (k, i)-coloring.
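
    A minimal verifier for this definition (a sketch, not from the paper) is given below; the adjacency dict and colour-set assignment are assumed representations chosen for the example.

        def is_proper_ki_coloring(adj, assignment, k, i):
            """Check a proper (k, i)-coloring: each vertex receives exactly k colours,
            and any two adjacent vertices share at most i of them."""
            if any(len(colours) != k for colours in assignment.values()):
                return False
            return all(len(assignment[u] & assignment[v]) <= i
                       for u in adj for v in adj[u] if u < v)

        # Example: a triangle with k = 2, i = 1 using colours drawn from {1, 2, 3}.
        triangle = {'a': {'b', 'c'}, 'b': {'a', 'c'}, 'c': {'a', 'b'}}
        assignment = {'a': {1, 2}, 'b': {2, 3}, 'c': {3, 1}}
        print(is_proper_ki_coloring(triangle, assignment, k=2, i=1))  # True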

    Climate change and health: rethinking public health messaging for wildfire smoke and extreme heat co-exposures

    With the growing climate change crisis, public health agencies and practitioners must increasingly develop guidance documents addressing the public health risks and protective measures associated with multi-hazard events. Our Policy and Practice Review aims to assess current public health guidance and related messaging about co-exposure to wildfire smoke and extreme heat and to recommend strengthened messaging to better protect people from these climate-sensitive hazards. We reviewed public health messaging published by governmental agencies between January 2013 and May 2023 in Canada and the United States. Publicly available resources were eligible if they discussed the co-occurrence of wildfire smoke and extreme heat and mentioned personal interventions (protective measures) to prevent exposure to either hazard. We reviewed local, regional, and national governmental agency messaging resources, such as online fact sheets and guidance documents. We assessed these resources according to four public health messaging themes: (1) discussions of vulnerable groups and risk factors, (2) symptoms associated with these exposures, (3) health risks of each exposure individually, and (4) health risks from combined exposure. Additionally, we conducted a detailed assessment of current messaging about measures to mitigate exposure. We found 15 online public-facing resources that provided health messaging about co-exposure; however, only one discussed all four themes. We identified 21 distinct protective measures mentioned across the 15 resources, with considerable variability and inconsistency in the types and level of detail of the described measures. Of the 21 identified protective measures, nine may protect against both hazards simultaneously, suggesting opportunities to emphasize these particular messages to address both hazards together. More precise, complete, and coordinated public health messaging would protect against climate-sensitive health outcomes attributable to wildfire smoke and extreme heat co-exposures.

    Enabling Next Generation Dark Energy and Epoch of Reionization Radio Observatories with the MOFF Correlator

    Proposed 21 cm cosmology observatories for studying the epoch of reionization (z ~ 6-15) and dark energy (z ~ 0-6) envision compact arrays with tens of thousands of antenna elements. Fully correlating this many elements is computationally expensive using traditional XF or FX correlators, and has led some groups to reconsider direct imaging/FFT correlators. In this paper we develop a variation of the direct imaging correlator that we call the MOFF correlator. The MOFF correlator shares the computational advantages of a direct imaging correlator while avoiding a number of its shortcomings. In particular, the MOFF correlator places no constraints on the antenna arrangement or type, provides a fully calibrated output image including widefield polarimetry and non-coplanar baseline effects, and can be orders of magnitude more efficient than XF or FX correlators for compact radio cosmology arrays. Comment: Version accepted for publication in PASP (delay due to author's distraction). Includes a number of advancements and refinements, including the feedback calibration technique and a clearer development. If you downloaded a previous version, please upgrade to this one.
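
    To illustrate the basic direct imaging/FFT correlation idea that the MOFF correlator builds on (this toy is not the MOFF algorithm itself, which additionally handles calibration, widefield polarimetry, and non-coplanar baselines), the sketch below grids antenna voltages by position, FFTs the gridded electric field, and accumulates the squared magnitude, so the cost scales with the grid size rather than with the number of antenna pairs. Antennas sitting on integer grid cells is a simplifying assumption.

        import numpy as np

        def fft_image_snapshot(voltages, positions, grid_size):
            """Direct-imaging correlation for one time sample: grid the complex
            antenna voltages, FFT the gridded E-field, and return the squared image."""
            efield = np.zeros((grid_size, grid_size), dtype=complex)
            for v, (x, y) in zip(voltages, positions):
                efield[x, y] += v                 # antennas assumed on integer grid cells
            return np.abs(np.fft.fft2(efield)) ** 2

        # Toy usage: 64 antennas at random cells of a 32 x 32 grid, one sample each;
        # a real correlator would accumulate these snapshot images over many samples.
        rng = np.random.default_rng(0)
        positions = [tuple(p) for p in rng.integers(0, 32, size=(64, 2))]
        voltages = rng.normal(size=64) + 1j * rng.normal(size=64)
        image = fft_image_snapshot(voltages, positions, grid_size=32)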

    On State Fusers Over Long-Haul Sensor Networks

    We consider a network of sensors wherein state estimates are sent from the sensors to a fusion center to generate a global state estimate. The underlying fusion algorithm affects the performance measure Q_CC(τ) (with the subscripts CC indicating the effects of the communications and computing quality) of the global state estimate computed within the allocated time τ. We present a probabilistic performance bound on Q_CC(τ) as a function of the distributions of the state estimates, the communications parameters, and the fusion algorithm. We present simulations of simplified scenarios to illustrate the qualitative effects of different fusers, and system-level simulations to complement the analytical results.
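
    As a concrete point of reference for what a fuser at the fusion center can look like, here is a minimal sketch of inverse-covariance-weighted linear fusion under an independence assumption; it is an illustrative fuser, not the specific fusers or the Q_CC(τ) analysis of the paper.

        import numpy as np

        def fuse_estimates(states, covariances):
            """Fuse independent state estimates x_i with covariances P_i by
            inverse-covariance weighting: P = (sum P_i^-1)^-1, x = P @ sum P_i^-1 x_i."""
            infos = [np.linalg.inv(P) for P in covariances]
            fused_cov = np.linalg.inv(sum(infos))
            fused_state = fused_cov @ sum(info @ x for info, x in zip(infos, states))
            return fused_state, fused_cov

        # Toy usage: two sensors report the same 2-D state with different confidence;
        # the fused estimate leans towards the lower-covariance (more confident) sensor.
        x1, P1 = np.array([1.0, 0.0]), np.diag([0.5, 0.5])
        x2, P2 = np.array([1.2, -0.1]), np.diag([2.0, 2.0])
        fused_state, fused_cov = fuse_estimates([x1, x2], [P1, P2])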