    A Multi-hop Topology Control Based on Inter-node Range Measurement for Wireless Sensor Networks Node Localization

    In centralized range-based localization techniques, the sufficiency of inter-node range information received by the base station strongly affects node position estimation results. Successful data aggregation is influenced by the link stability of each connection along a route, especially in a multi-hop topology model. In general, inter-node range measurement is performed only for position determination. This research introduces the use of inter-node range measurement information for link selection in multi-hop route composition in order to increase the rate of data aggregation. Due to the irregularity of wireless media, two node communication areas are considered. The regular communication area is the area in which other nodes are able to perform symmetrical communication with the node without failure. The irregular area is the area in which other nodes are seldom able to communicate. Because of its instability, some existing methods try to avoid the irregular area completely. The proposed method, named Virtual Boundaries (VBs), instead prioritizes these areas. Nodes in the regular communication area have high priority to be selected as link vertices; however, when there is no link candidate inside this area, nodes within the irregular area are selected with respect to their range to the parent node. This technique results in a more robust multi-hop topology that reduces the number of isolated nodes and accordingly increases the percentage of data collected by the base station.
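    The two-tier selection rule described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the regular-area radius, the candidate list, and the tie-breaking choice (closest candidate) are all assumptions.

```python
# Illustrative sketch of the Virtual Boundaries (VB) link-selection rule:
# prefer candidates inside the regular communication area; only when none
# exist, fall back to an irregular-area candidate prioritized by its
# measured range to the parent node.

def select_link(parent, candidates, regular_radius):
    """Return the node chosen as a link vertex for `parent`, or None.

    `candidates` is a list of (node_id, measured_range) pairs, where
    measured_range is the inter-node range to the parent.
    """
    regular = [(n, r) for n, r in candidates if r <= regular_radius]
    if regular:
        # Any regular-area node qualifies; pick the closest for stability.
        return min(regular, key=lambda nr: nr[1])[0]
    if candidates:
        # No regular-area candidate: select from the irregular area,
        # prioritized by range to the parent node.
        return min(candidates, key=lambda nr: nr[1])[0]
    return None  # no candidates at all: the node would be isolated
```

    Selecting from the irregular area instead of rejecting it outright is what reduces the isolated-node count in the resulting topology.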

    GoSam-2.0: a tool for automated one-loop calculations within the Standard Model and beyond

    We present version 2.0 of the program package GoSam for the automated calculation of one-loop amplitudes. GoSam is devised to compute one-loop QCD and/or electroweak corrections to multi-particle processes within and beyond the Standard Model. The new code contains improvements in the generation and in the reduction of the amplitudes, performs better in computing time and numerical accuracy, and has an extended range of applicability. The extended version of the "Binoth-Les-Houches-Accord" interface to Monte Carlo programs is also implemented. We give a detailed description of the installation and usage of the code, and illustrate the new features in dedicated examples. Comment: replaced by published version; references added.

    'How To' Guide for Synthesizing NERRs Marsh Monitoring Data

    The purpose of this guide is to provide a user-friendly and informative guide on ‘How to’ synthesize salt marsh data from the National Estuarine Research Reserve System (NERRs). In this guide, we outline and detail the steps taken from requesting/cataloguing data to summarizing these data through visual and statistical analysis. These methods can be used at a single site or at multiple sites, as well as over multiple years. Though this guide is specific to NERRs and focuses on plant community data, it may also be useful for other monitoring parameters and programs to guide protocol design and analyses. Here, we conduct a synthesis of New England salt marshes using NERRs data collected over the past decade.
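    The catalogue-then-summarize workflow the guide describes can be sketched in pandas. The column names ("site", "year", "percent_cover") and the sample values are hypothetical stand-ins, not the NERRs schema:

```python
# Hypothetical sketch of summarizing plant community monitoring records
# across sites and years: group records by site and year, then compute
# simple summary statistics for plotting or further analysis.
import pandas as pd

records = pd.DataFrame({
    "site": ["GRB", "GRB", "WEL", "WEL"],
    "year": [2012, 2013, 2012, 2013],
    "percent_cover": [62.0, 58.5, 71.0, 69.5],
})

# One row per (site, year) with the mean cover and the record count.
summary = (records
           .groupby(["site", "year"])["percent_cover"]
           .agg(["mean", "count"])
           .reset_index())
```

    The same grouped table feeds both the visual summaries (e.g. cover trends per site) and the statistical comparisons the guide walks through.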

    Scattering AMplitudes from Unitarity-based Reduction Algorithm at the Integrand-level

    SAMURAI is a tool for the automated numerical evaluation of one-loop corrections to any scattering amplitude within the dimensional-regularization scheme. It is based on the decomposition of the integrand according to the OPP approach, extended to accommodate an implementation of the generalized d-dimensional unitarity-cut technique, and uses a polynomial interpolation exploiting the Discrete Fourier Transform. SAMURAI can process integrands written either as numerators of Feynman diagrams or as products of tree-level amplitudes. We discuss some applications, among which are 6- and 8-photon scattering in QED and 6-quark scattering in QCD. SAMURAI has been implemented as a publicly available Fortran90 library, and it could be a useful module for the systematic evaluation of virtual corrections oriented towards automating next-to-leading-order calculations relevant for LHC phenomenology. Comment: 35 pages, 7 figures.
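    The "polynomial interpolation exploiting the Discrete Fourier Transform" mentioned above is a general technique that can be illustrated independently of SAMURAI: sampling a degree-d polynomial at the (d+1)-th roots of unity turns coefficient extraction into an inverse DFT. This is a generic sketch, not SAMURAI's Fortran90 implementation:

```python
# Generic illustration of DFT-based polynomial interpolation: recover the
# coefficients c_0..c_d of a polynomial from its values at roots of unity.
import cmath

def dft_coefficients(p, degree):
    """Recover the coefficients of a polynomial callable `p` of known degree."""
    n = degree + 1
    # Sample p at the n-th roots of unity: x_k = exp(2*pi*i*k/n).
    samples = [p(cmath.exp(2j * cmath.pi * k / n)) for k in range(n)]
    # Inverse DFT: c_j = (1/n) * sum_k samples[k] * exp(-2*pi*i*j*k/n).
    return [sum(samples[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                for k in range(n)) / n
            for j in range(n)]
```

    In an integrand-level reduction, the residues at the cuts are polynomials in the free loop-momentum parameters, so the same idea lets their coefficients be fixed from a finite set of numerical samples.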

    Liquidity, Volatility, and Equity Trading Costs Across Countries and Over Time

    Actual investment performance reflects the underlying strategy of the portfolio manager and the execution costs incurred in realizing those objectives. Execution costs, especially in illiquid markets, can dramatically reduce the notional return to an investment strategy. This paper examines the interactions between cost, liquidity, and volatility, and analyzes their determinants using panel data for 42 countries from September 1996 to December 1998. We document wide variation in trading costs across countries; emerging markets in particular have significantly higher trading costs even after correcting for factors affecting costs such as market capitalization and volatility. We analyze the inter-relationships between turnover, equity trading costs, and volatility, and investigate the impact of these variables on equity returns. In particular, we show that increased volatility, acting through costs, reduces a portfolio's expected return. However, higher volatility also reduces turnover, mitigating the actual impact of higher costs on returns. Further, turnover is inversely related to trading costs, providing a possible explanation for the increase in turnover in recent years. The results demonstrate that the composition of global efficient portfolios can change dramatically when cost and turnover are taken into account.
    Full text: http://deepblue.lib.umich.edu/bitstream/2027.42/39706/3/wp322.pd
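    The cost channel in the abstract (costs scale with turnover, so lower turnover offsets higher per-trade costs) can be made concrete with a back-of-the-envelope calculation. The formula and all the numbers below are illustrative assumptions, not the paper's estimates:

```python
# Hypothetical numerical sketch of the cost/turnover trade-off: net return
# equals gross return minus execution costs, where costs scale with annual
# turnover (a round trip is two one-way trades).

def net_return(gross_return, annual_turnover, one_way_cost):
    """Net annual return after execution costs."""
    return gross_return - 2 * annual_turnover * one_way_cost

# Baseline market: 10% gross return, 100% turnover, 50 bp one-way cost.
base = net_return(0.10, 1.0, 0.005)
# Higher-cost market (120 bp), but turnover falls to 60% in response.
high_cost = net_return(0.10, 0.6, 0.012)
```

    Here costs more than double, yet the net-return gap is smaller than that would suggest because turnover contracts, which is the mitigation effect the paper describes.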

    Industry 4.0 And Short-Term Outlook for AEC Industry Workforce

    Technology is uniquely transforming our society to a significant degree. This transformation has been described as Industry 4.0 and encompasses machine learning, computerization, automation, artificial intelligence, and robotics. Industry 4.0 is currently impacting the United States’ workplace and is projected to continue uniquely changing our society over the next twenty years or so. Looking specifically at the AEC industry, this paper researches how the AEC industry workplace could be impacted by Industry 4.0 over the next several years. The hypothesis that jobs more at risk for automation should see low or negative growth and lower wages over the next several years was tested by using U.S. Bureau of Labor Statistics (BLS) occupational wage data and growth projections to create an opportunity value for each occupation, and then evaluating the relationship between the opportunity value and the probability of automation. A statistically significant relationship was found between the two variables. The hypothesis that certain skills are particularly associated with high-growth/high-wage jobs versus low-growth/low-wage jobs was tested by scraping important skills/qualities from the individual occupational webpages hosted by the U.S. Bureau of Labor Statistics, and then comparing approximately the top 80% of skills scraped between the two groups. Certain skills/qualities were found to be particularly associated with each group. Finally, the occupations associated with the AEC industry were compared with the findings from the first two hypotheses. The finding was that the AEC industry is potentially more susceptible to Industry 4.0 than other industries. This research is of significance because research into how the AEC industry workplace will be impacted by Industry 4.0 over the next several years was not found in the research background, and it has implications for potential career choices, skill requirements, and areas of research and development.
    Recommendations for future work include utilizing new data sources, Monte Carlo simulations, cohort analysis, and cluster analysis to make more specific forecasts on Industry 4.0’s impact on the AEC industry.