TTL implementation of a CAMB tree network switch
Packet collisions and their resolution create a performance bottleneck in random-access LANs. A hardware solution to this problem is to use a collision avoidance switch. These switches allow the implementation of random access protocols without the penalty of collisions among packets. An architecture based on collision avoidance is the CAMB (Collision Avoidance Multiple Broadcast) tree network, where concurrent broadcasts are possible. The purpose of this paper is to present two implementations of a CAMB tree switch. First, a general outline of the CAMB switch is provided. Then, a description of the two implementations is given.
Implementation of a station/network interface for a CAMB tree network
Packet collisions and their resolution create a performance bottleneck in random-access LANs. A hardware solution to this problem is to use collision avoidance switches. These switches allow the implementation of random access protocols without the penalty of collisions among packets. An architecture based on collision avoidance is the CAMB (Collision Avoidance Multiple Broadcast) tree network, where concurrent broadcasts are possible. This paper is a companion to an earlier report, "TTL Implementation of a CAMB Tree Switch," in which a tree network architecture was described along with two different implementations of a CAMB tree switch. In the pages that follow, a hardware implementation of the interface between the network stations and the packet switches is proposed. This implementation is based on the first switch design in the companion paper.
Empirical Bayesian Selection for Value Maximization
We study the common problem of selecting the best m units from a set of n in the asymptotic regime m/n → α, where noisy, heteroskedastic measurements of the units' true values are available and the decision-maker wishes to maximize the average true value of the units selected. Given a parametric prior distribution, the empirical Bayesian decision rule incurs regret of order 1/n relative to the Bayesian oracle that knows the true prior. More generally, if the error in the estimated prior is of order r_n, regret is of order r_n². In this sense selecting the best units is easier than estimating their values. We show this regret bound is sharp by giving an example in which it is attained. Using priors calibrated from a dataset of over four thousand internet experiments, we find that empirical Bayes methods perform well in practice for detecting the best treatments given only a modest number of experiments.
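The selection rule described above can be sketched under an assumed normal-normal model. Every parameter below (the prior, the noise levels, the selection fraction) is illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: n units with true values mu_i ~ N(0, tau^2) (the prior),
# observed through noisy, heteroskedastic measurements x_i ~ N(mu_i, s_i^2).
n, tau = 2_000, 1.0
mu = rng.normal(0.0, tau, size=n)
s = rng.uniform(0.5, 3.0, size=n)        # known, unit-specific noise levels
x = mu + rng.normal(size=n) * s

# Empirical Bayes: estimate the prior variance tau^2 by moment matching
# (E[x_i^2] = tau^2 + s_i^2), then rank units by posterior means, which
# shrink the noisiest measurements most aggressively.
tau2_hat = max(np.mean(x**2 - s**2), 1e-8)
posterior_mean = x * tau2_hat / (tau2_hat + s**2)

m = n // 10                              # select the top 10% of units
top_naive = np.argsort(x)[-m:]           # rank by raw measurements
top_eb = np.argsort(posterior_mean)[-m:]  # rank by shrunken estimates

# The EB selection should attain a higher average true value, because the
# naive ranking is dominated by high-noise units with inflated measurements.
print(mu[top_eb].mean(), mu[top_naive].mean())
```

With heteroskedastic noise, shrinkage changes the ranking itself, not just the estimated values, which is why the average true value of the selected set improves.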
Improved Tissue Caliper
My honors thesis covers the work I am doing for my senior design project, which involves building a tissue caliper for the medical device company Covidien. During medical research and development it is often necessary to measure and document the thickness of tissue samples and media (simulated tissue foam). A device known as a tissue caliper is currently used to perform such measurements. Our project requires us to design, build, calibrate, and validate an instrument capable of measuring sample thicknesses ranging from 0.10 to 1.00 inches with an accuracy of ±0.005 inches. Additionally, this device should include advanced features to improve the quality of the measurements taken. The thesis will document our work throughout the year and will include the design, fabrication, and construction process.
On the Limits of Regression Adjustment
Regression adjustment, sometimes known as Controlled-experiment Using
Pre-Experiment Data (CUPED), is an important technique in internet
experimentation. It decreases the variance of effect size estimates, often
cutting confidence interval widths in half or more while never making them
worse. It does so by carefully regressing the goal metric against
pre-experiment features to reduce the variance. The tremendous gains of
regression adjustment raise the question: How much better can we do by
engineering better features from pre-experiment data, for example by using
machine learning techniques or synthetic controls? Could we even reduce the
variance in our effect sizes arbitrarily close to zero with the right
predictors? Unfortunately, our answer is negative. A simple form of regression
adjustment, which uses just the pre-experiment values of the goal metric,
captures most of the benefit. Specifically, under a mild assumption that
observations closer in time are easier to predict than ones further away in
time, we upper bound the potential gains of more sophisticated feature
engineering, with respect to the gains of this simple form of regression
adjustment. The maximum reduction in variance is 50% in Theorem 1, or
equivalently, the confidence interval width can be reduced by at most an
additional 29%.
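The simple form of regression adjustment referred to above (CUPED using only the pre-experiment value of the goal metric) can be sketched on simulated data. The data-generating process and effect size here are illustrative assumptions, not figures from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated experiment: pre-experiment metric X predicts the goal metric Y.
n = 10_000
x = rng.normal(size=n)                    # pre-experiment values of the goal metric
treat = rng.integers(0, 2, size=n)        # random assignment
y = 0.8 * x + 0.1 * treat + rng.normal(scale=0.5, size=n)  # hypothetical effect 0.1

# CUPED adjustment: Y_adj = Y - theta * (X - mean(X)),
# with theta = cov(X, Y) / var(X) estimated from the pooled sample.
theta = np.cov(x, y)[0, 1] / np.var(x)
y_adj = y - theta * (x - x.mean())

def effect_and_var(metric):
    """Difference-in-means estimate and its variance."""
    t, c = metric[treat == 1], metric[treat == 0]
    return t.mean() - c.mean(), t.var() / len(t) + c.var() / len(c)

est_raw, var_raw = effect_and_var(y)
est_adj, var_adj = effect_and_var(y_adj)
print(var_adj / var_raw)  # well below 1: the adjustment shrinks the variance
```

Subtracting theta * (X - mean(X)) leaves the expected treatment effect unchanged under random assignment, since X is measured before the experiment, while removing the part of Y's variance that X explains.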
The Financial Implications of the Chinese Healthcare System
In 1949 one of the world’s most powerful and influential countries was born: the People’s Republic of China. Perhaps the greatest challenge the country has consistently faced since its inception has been ensuring a fiscally sound healthcare system. Today, China has the world’s largest population and a rapidly aging society, with 330 million citizens over the age of 65 projected by 2050, nearly the same size as the total U.S. population. Living standards across China have risen drastically in recent decades, and the Chinese people desire better, higher-quality healthcare to complement their new lifestyles. With this desire comes China’s unique challenge of scale: operating the world’s largest healthcare system for 1.4 billion people. This paper introduces the historical perspective and background of China’s healthcare system and the major phases of its reforms, assesses how successful those reforms have been, and finally explores the financial impact China’s healthcare system has had around the globe.
Supplement to: Air concentrations of polybrominated diphenyl ethers (PBDEs) in 2002-2004 at a rural site in the Great Lakes
Atmospheric PBDEs were measured on a monthly basis in 2002-2004 at Point Petre, a rural site in the Great Lakes. Average air concentrations were 7.0 ± 13 pg m⁻³ for the sum of 14 BDE congeners (excluding BDE-209) and 1.8 ± 1.5 pg m⁻³ for BDE-209. Concentrations of the 3 dominant congeners (i.e., BDE-47, 99, and 209) were comparable to previous measurements at remote/rural sites around the Great Lakes, but much lower than those at urban areas. Weak temperature dependence and strong linear correlations between relatively volatile congeners suggest the importance of advective inputs of gaseous species. The significant correlation between BDE-209 and BDE-183 implies transport inputs associated with particles. Particle-bound percentages were found to be greater for highly brominated congeners than for less brominated ones, and these percentages increase with decreasing ambient temperatures. The observed gas/particle partitioning is consistent with laboratory measurements and fits well to the Junge-Pankow model. Using air mass back-trajectories, atmospheric transport to Point Petre from the west-northwest and southwest directions was estimated as 76% for BDE-47, 67% for BDE-99, and 70% for BDE-209. During the same time period, similar congener profiles and concentration levels were found at Alert in the Canadian High Arctic. Different inter-annual variations between Point Petre and Alert indicate that emissions from regions other than North America could also contribute PBDEs to the Arctic. In contrast to the weak temperature effect at Point Petre, significant temperature dependence in the summertime implies volatilization emissions of PBDEs at Alert. Meanwhile, episodic observations in the wintertime were likely associated with enhanced inputs through long-range transport during the Arctic Haze period.
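The Junge-Pankow model mentioned above predicts the particle-bound fraction φ of a semivolatile compound from its subcooled-liquid vapour pressure. A minimal sketch, using the standard literature constant c ≈ 17.2 Pa·cm and an assumed background aerosol surface area; neither value, nor the example vapour pressures, is taken from this study:

```python
# Junge-Pankow adsorption model:
#     phi = c * theta / (p_L + c * theta)
# where p_L is the subcooled-liquid vapour pressure (Pa), theta is the aerosol
# surface area per volume of air (cm^2/cm^3), and c ~ 17.2 Pa cm is Junge's
# constant. theta = 1.5e-6 is a typical clean continental background value
# (an assumption, not a measurement from this study).

def junge_pankow(p_L, theta=1.5e-6, c=17.2):
    """Particle-bound fraction for vapour pressure p_L in Pa."""
    return c * theta / (p_L + c * theta)

# Illustrative (hypothetical) vapour pressures: heavier congeners have much
# lower p_L, so the model predicts a larger particle-bound percentage,
# matching the trend reported above for highly brominated congeners.
for name, p_L in [("lightly brominated", 1e-3), ("heavily brominated", 1e-7)]:
    print(name, round(junge_pankow(p_L), 3))
```

Lower temperatures lower p_L, so the same formula also reproduces the observed increase in particle-bound percentages as ambient temperature decreases.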