
    Complexity-Aware Scheduling for an LDPC Encoded C-RAN Uplink

    Full text link
    Centralized Radio Access Network (C-RAN) is a new paradigm for wireless networks that centralizes the signal processing in a computing cloud, allowing commodity computational resources to be pooled. While C-RAN improves utilization and efficiency, the computational load occasionally exceeds the available resources, creating a computational outage. This paper provides a mathematical characterization of the computational outage probability for low-density parity check (LDPC) codes, a common class of error-correcting codes. For tractability, a binary erasure channel is assumed. Using the concept of density evolution, the computational demand is determined for a given ensemble of codes as a function of the erasure probability. The analysis reveals a trade-off: aggressively signaling at a high rate stresses the computing pool, while conservatively backing off the rate can avoid computational outages. Motivated by this trade-off, an effective computationally aware scheduling algorithm is developed that balances demands for high throughput and low outage rates. Comment: Conference on Information Sciences and Systems (CISS) 2017, to appear

    Biomedical Resource Ontology

    Get PDF

    NCBO Overview and Biositemaps

    Get PDF

    No Way FDA, Let States Lead the Way on Expanding the Prescriptive Authority of Pharmacists

    Get PDF

    Complexity aware C-RAN scheduling for LDPC codes over BEC

    Get PDF
    Effective transmission of data over a noisy wireless channel is a vital part of today's high speed technology driven society. In a wireless cell network, information is sent from mobile users to base stations. The information being transmitted is protected by error-control codes. In a conventional architecture the signal processing, including error-control decoding, is performed locally at each base station. Recently, a new architecture has emerged called Centralized Radio Access Network (C-RAN), which involves the centralized processing of the signals in a computing cloud. Using a computing cloud allows computational resources to be pooled, which improves utilization and efficiency. When the computational resources are finite and when the computational load varies over time, then there is a chance that the load exceeds the available resources. This situation creates a so-called computational outage, which has characteristics that are similar to outages caused by channel fading or interference. In this report, the computational complexity is quantified for a common class of error-correcting codes known as low-density parity check (LDPC) codes. To make the analysis tractable, a binary erasure channel is assumed. The concept of density evolution is used to obtain the complexity as a function of the code design parameters and the signal-to-interference-plus-noise ratio (SINR) of the channel. The analysis shows that there is a trade-off in that aggressively signaling at a high data rate causes high computational demands, while conservatively backing off on the rate can dramatically reduce the computational demand. Motivated by this trade-off, a scheduling algorithm is developed that balances the demands for high throughput and low computational outage rates.
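    The density-evolution recursion for the BEC described above can be sketched in a few lines. This is a minimal illustration, not the report's exact model: it assumes a regular (dv, dc) LDPC ensemble (the (3, 6)-regular ensemble by default) and uses the iteration count as a simple proxy for decoding complexity.

    ```python
    def density_evolution_iters(eps, dv=3, dc=6, tol=1e-6, max_iters=10_000):
        """Iterate the BEC density-evolution recursion for a regular (dv, dc)
        LDPC ensemble: x_{t+1} = eps * (1 - (1 - x_t)^(dc-1))^(dv-1).
        Returns the number of iterations until the message erasure probability
        falls below tol (success), or max_iters if it stalls above tol (failure)."""
        x = eps
        for t in range(1, max_iters + 1):
            x_next = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
            if x_next < tol:
                return t          # decoder converges after t iterations
            if abs(x_next - x) < tol * tol:
                return max_iters  # stalled at a fixed point above tol: failure
            x = x_next
        return max_iters
    ```

    For the (3, 6)-regular ensemble the BEC threshold is roughly 0.4294, so the iteration count (and hence computational demand) grows sharply as the erasure probability approaches it from below, which is the trade-off the abstract describes.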

    From Flames to Forage: How Wildfire Affects Elk Behavior and Abundance

    Get PDF
    The Rocky Mountain elk (Cervus elaphus nelsoni) is an ecologically and culturally important wildlife species in the Intermountain West, but it is facing habitat changes caused by increasing fire activity. Wildfire frequency is projected to continue to change into the future, yet increases in annual area burned and increases in area burned at high severity may actually represent opportunities for some species. Large herbivores like elk may benefit from increased access to regenerating areas where forage abundance and quality are often elevated. Therefore, effective management of wildlife populations may depend on quantifying how large ungulates, like elk, alter their behavior in the context of rapidly shifting fire regimes. In order to evaluate elk foraging activity in previously burned areas, my research examined differences in severity and habitat types. I used two sampling methods to understand elk behavior and habitat selection post-fire. First, I ran a Hidden Markov Model (HMM) on GPS collar data to assign one of three behavioral states (‘resting’, ‘foraging’, or ‘commuting’) to each of the approximately 730,000 elk positions located in a previously burned fire perimeter. I statistically tested whether the probability of an elk position being assigned a ‘foraging’ state depended on fire severity and time since fire, while controlling for other potential behavioral drivers (remote-sensed vegetation type, cover, and productivity). I then used camera data from 40 camera traps, stratified by fire severity (unburned, low, moderate, and high severity), to monitor elk use of burned areas. Results suggest that elk probability of foraging in burned areas peaks 3-4 years post-fire in conifers, but peaks between 7-9 years in aspen. Also, elk have higher probabilities of being in a foraging state in areas where aspen is burned at high severity. 
From camera data, I found that the post-fire abundance of herbaceous biomass is the strongest driver of elk abundance, and that abundance is highest at higher burn severities. Combined, this research provides information on wildfire’s influence on elk behavior and abundance and can help inform management decisions for elk on increasingly fire-prone landscapes in the western United States.
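    The state-assignment step described above can be illustrated with a toy three-state HMM decoded by the Viterbi algorithm. The transition and emission probabilities below are illustrative placeholders (as is the discretized step-length observation), not the fitted values from the study, which modeled continuous movement metrics.

    ```python
    import numpy as np

    # Toy 3-state HMM ('resting', 'foraging', 'commuting'). All probabilities
    # here are hypothetical, chosen only to make the decoding behavior visible.
    STATES = ["resting", "foraging", "commuting"]
    A = np.array([[0.80, 0.15, 0.05],   # state transition matrix
                  [0.20, 0.70, 0.10],
                  [0.10, 0.20, 0.70]])
    B = np.array([[0.70, 0.25, 0.05],   # emission: P(obs class | state)
                  [0.20, 0.60, 0.20],
                  [0.05, 0.15, 0.80]])
    pi = np.array([1/3, 1/3, 1/3])      # uniform initial distribution

    def viterbi(obs):
        """Most likely state sequence for discretized step-length classes
        (0 = short step, 1 = medium, 2 = long)."""
        T, N = len(obs), len(STATES)
        logd = np.log(pi) + np.log(B[:, obs[0]])
        back = np.zeros((T, N), dtype=int)
        for t in range(1, T):
            scores = logd[:, None] + np.log(A)   # scores[from, to]
            back[t] = scores.argmax(axis=0)      # best predecessor per state
            logd = scores.max(axis=0) + np.log(B[:, obs[t]])
        path = [int(logd.argmax())]
        for t in range(T - 1, 0, -1):            # backtrace
            path.append(back[t, path[-1]])
        return [STATES[s] for s in reversed(path)]
    ```

    With these placeholder parameters, a run of long steps followed by short steps decodes to 'commuting' then 'resting'; in the actual analysis, fitted emission distributions over step length and turning angle play the role of B.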

    George Francis Atkinson

    Full text link

    Measuring DHEA-S in saliva: time of day differences and positive correlations between two different types of collection methods

    Get PDF
    <p>Abstract</p> <p>Background</p> <p>The anabolic steroid, dehydroepiandosterone sulfate (DHEA-S), is secreted from the adrenal cortex. It plays a significant role in the body as a precursor to sex steroids as well as a lesser known role in the hypothalamic pituitary adrenal axis (HPA) response to stress. DHEA-S can be measured reliably in saliva, making saliva collection a valuable tool for health research because it minimizes the need for invasive sampling procedures (e.g., blood draws). Typical saliva collection methods include the use of plain cotton swab collection devices (e.g., Salivette<sup>®</sup>) or passive drool. There has been some speculation that the plain saliva cotton collection device may interfere with determination of DHEA-S by enzyme immunoassay (EIA) bringing this saliva collection method into question. Because of the increasing popularity of salivary biomarker research, we sought to determine whether the cotton swab interferes with DHEA-S determination through EIA techniques.</p> <p>Findings</p> <p>Fifty-six healthy young adult men and women aged 18-30 years came to the lab in the morning (0800 hrs; 14 men, 14 women) or late afternoon (1600 hrs; 14 men, 14 women) and provided saliva samples via cotton Salivette and passive drool. Passive drool collection was taken first to minimize particle cross contamination from the cotton swab. Samples were assayed for DHEA-S in duplicate using a commercially available kit (DSL, Inc., Webster, TX). DHEA-S levels collected via Salivette and passive drool were positively correlated (r = + 0.83, p < 0.05). Mean DHEA-S levels were not significantly different between collection methods. 
Salivary DHEA-S levels were significantly higher in males than in females, regardless of saliva collection method (p < 0.05), and morning DHEA-S values were higher than evening levels (p < 0.05).</p> <p>Conclusions</p> <p>Results suggest that DHEA-S can be measured accurately using passive drool or cotton Salivette collection methods. Results also suggest that DHEA-S levels change across the day and that future studies need to take this time of day difference into account when measuring DHEA-S.</p>
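    The correlation statistic reported above (r = +0.83 between the two collection methods) is a Pearson product-moment correlation, which can be computed directly. A minimal sketch with hypothetical paired measurements:

    ```python
    import math

    def pearson_r(x, y):
        """Pearson product-moment correlation between paired measurements,
        e.g. DHEA-S values from Salivette vs. passive-drool collection."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)
    ```

    A value near +1 indicates the two collection methods rank and scale samples almost identically, which is the basis for the conclusion that either method can be used.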