
    Modeling and Design of Millimeter-Wave Networks for Highway Vehicular Communication

    Connected and autonomous vehicles will play a pivotal role in future Intelligent Transportation Systems (ITSs) and smart cities in general. High-speed, low-latency wireless communication links will allow municipalities to warn vehicles of safety hazards, as well as support cloud-driving solutions to drastically reduce traffic jams and air pollution. To achieve these goals, vehicles need to be equipped with a wide range of sensors generating and exchanging high-rate data streams. Recently, millimeter-wave (mmWave) techniques have been introduced as a means of fulfilling such high data rate requirements. In this paper, we model a highway communication network and characterize its fundamental link budget metrics. In particular, we consider a network where vehicles are served by mmWave Base Stations (BSs) deployed alongside the road. To evaluate our highway network, we develop a new theoretical model that accounts for a typical scenario where heavy vehicles (such as buses and lorries) in slow lanes obstruct the Line-of-Sight (LOS) paths of vehicles in fast lanes and hence act as blockages. Using tools from stochastic geometry, we derive approximations for the Signal-to-Interference-plus-Noise Ratio (SINR) outage probability, as well as the probability that a user achieves a target communication rate (rate coverage probability). Our analysis provides new design insights for mmWave highway communication networks. In the considered highway scenarios, we show that reducing the horizontal beamwidth from 90° to 30° yields only a minimal reduction in the SINR outage probability (at most 4 × 10^-2). Also, unlike bi-dimensional mmWave cellular networks, for small BS densities (namely, one BS every 500 m) it is still possible to achieve an SINR outage probability smaller than 0.2. Comment: Accepted for publication in IEEE Transactions on Vehicular Technology -- Connected Vehicles Series.
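As a rough illustration of the kind of stochastic-geometry evaluation this abstract describes (not the paper's actual model), the sketch below drops BSs along a one-dimensional road as a Poisson point process and estimates the SINR outage probability by Monte Carlo. All parameter values (path-loss exponent, noise power, beam-alignment fraction) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters only -- not the paper's actual values.
LAMBDA = 1 / 500.0      # BS density: one BS per 500 m on average
ROAD_LEN = 10_000.0     # length of simulated road segment [m]
ALPHA = 2.8             # assumed LOS path-loss exponent
NOISE = 1e-9            # noise power, arbitrary units
SINR_TH = 1.0           # SINR threshold (0 dB)
BEAM_FRAC = 30 / 360    # chance a 30-degree interfering beam covers the user

def sinr_outage(n_trials=20_000):
    """Monte Carlo estimate of P(SINR < threshold) for a user at x = 0."""
    outages = 0
    for _ in range(n_trials):
        n_bs = rng.poisson(LAMBDA * ROAD_LEN)
        if n_bs == 0:                           # no BS on segment: outage
            outages += 1
            continue
        x = rng.uniform(-ROAD_LEN / 2, ROAD_LEN / 2, n_bs)
        d = np.maximum(np.abs(x), 1.0)          # clip to 1 m to avoid d = 0
        p_rx = d ** -ALPHA                      # unit-power path-loss model
        serve = np.argmax(p_rx)                 # nearest BS serves the user
        aligned = rng.random(n_bs) < BEAM_FRAC  # interferers hitting the user
        aligned[serve] = False
        sinr = p_rx[serve] / (NOISE + p_rx[aligned].sum())
        outages += sinr < SINR_TH
    return outages / n_trials

print(f"estimated SINR outage probability: {sinr_outage():.3f}")
```

Narrowing `BEAM_FRAC` in this toy model thins out the interference field, which is the qualitative effect whose magnitude the paper quantifies analytically.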

    Nonresidents and Jurisdiction: A Modern Dilemma in Civil and Criminal Procedure


    Recursive and Viterbi Estimation for Semi-Markov Chains


    Randomization in substance abuse clinical trials

    BACKGROUND: A well-designed randomized clinical trial rates as the highest level of evidence for a particular intervention's efficacy. Randomization, a fundamental feature of clinical trial design, is a process that uses probability to assign treatment interventions to patients. In general, randomization techniques aim to provide objectivity in the assignment of treatments while balancing treatment assignment totals and covariate distributions. Numerous randomization techniques, each with varying properties of randomness and balance, are suggested in the statistical literature. This paper reviews common randomization techniques used in substance abuse research, and an application from a National Institute on Drug Abuse (NIDA)-funded clinical trial in substance abuse illustrates several choices an investigator faces when designing a clinical trial. RESULTS: Comparisons and contrasts of randomization schemes are provided with respect to deterministic and balancing properties. Specifically, Monte Carlo simulation is used to explore the balancing behavior of randomization techniques for moderately sized clinical trials. Results demonstrate large treatment imbalance for complete randomization, with less imbalance for the urn or adaptive schemes. The urn and adaptive randomization methods display smaller treatment imbalance, as demonstrated by the low variability of treatment allocation imbalance. For all randomization schemes, covariate imbalance between treatment arms was small, with little variation between adaptive, stratified and unstratified schemes, given moderate to large sample sizes. CONCLUSION: We develop this paper with the goal of reminding substance abuse researchers of the broad array of randomization options available for clinical trial designs.
Substance abuse researchers may be too quick to implement fashionable urn randomization schemes and other highly adaptive designs. In many instances, simple or blocked randomization with stratification on one or two major covariates will accomplish the same objectives as an urn or adaptive design, with more simply implemented schedules and without the dangers of overmatching. Furthermore, the proper analysis, fully accounting for the stratified design, can be conducted.
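The kind of Monte Carlo comparison the abstract describes can be sketched in a few lines. This is an illustrative simulation, not the paper's; it contrasts complete randomization (independent coin flips) with permuted-block randomization, measuring the mean absolute difference in arm sizes.

```python
import numpy as np

rng = np.random.default_rng(1)

def complete_randomization(n):
    """Each patient assigned to arm 0 or 1 by an independent fair coin."""
    return rng.integers(0, 2, n)

def blocked_randomization(n):
    """Permuted blocks of four: every block holds exactly two of each arm."""
    out = []
    while len(out) < n:
        b = [0, 0, 1, 1]
        rng.shuffle(b)
        out.extend(b)
    return np.array(out[:n])

def imbalance(scheme, n=100, trials=5_000):
    """Mean absolute difference between arm sizes across simulated trials."""
    return np.mean([abs(2 * int(scheme(n).sum()) - n) for _ in range(trials)])

print("complete randomization:", imbalance(complete_randomization))
print("blocked randomization: ", imbalance(blocked_randomization))
```

With n = 100 patients, complete randomization leaves a mean arm-size gap of roughly eight patients, while blocked randomization (n a multiple of the block size) is perfectly balanced, which mirrors the pattern the abstract reports.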

    Profitability and risk evaluation of novel perennial pasture systems for livestock producers in the high rainfall zone: Context, Approach and Preliminary Results

    The decision to invest in pasture improvement raises various questions for the livestock grazier, the most pertinent being about potential returns and risks. In the high rainfall zone of south-west Victoria, researchers have trialled novel perennial pasture systems with the aim of substantially increasing on-farm profits while simultaneously improving environmental outcomes. Results from the Hamilton EverGraze® proof site have shown potential to greatly improve livestock production. Promotion of the pasture technology is the next step. Key to this process is developing information about the profitability and risk of the decision to invest in the new pasture. To help meet this need, a model of a representative mixed livestock farm system for the region has been developed to generate information about profit, cash wealth and risk to aid extension and help inform decisions. The farm comprises a wool- and meat-producing sheep system and a beef enterprise. Using the model, the performance of two of the novel pasture systems can be evaluated against current practice, and compared to determine which of the two is the more beneficial EverGraze® option for the future. The risk associated with the pasture decision is assessed by considering different price structures and seasonal outcomes and evaluating their effects on net benefits. Discounted cash flows, net present values and internal rates of return are estimated for the alternative systems, incorporating the effects of this price and seasonal variability. Preliminary results have been calculated; however, further work is needed to confirm them. The method and results of the analysis provide information that is valuable for farm decisions about investing in a new pasture system and provide a basis for future economic analyses at the case study site and elsewhere.
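The discounted-cash-flow machinery the abstract mentions (NPV and IRR) is standard and easy to sketch. The cash flows below are entirely hypothetical; the paper's actual figures are not given in this summary.

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the year-0 outlay (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-6):
    """Internal rate of return by bisection on the NPV sign change."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical: $100k pasture establishment cost, then 10 years of
# net annual benefits (invented numbers, for illustration only).
system_a = [-100_000] + [18_000] * 10   # novel perennial system
system_b = [-100_000] + [15_000] * 10   # smaller improvement over current

for name, cf in [("A", system_a), ("B", system_b)]:
    print(f"system {name}: NPV@7% = {npv(0.07, cf):,.0f}, IRR = {irr(cf):.1%}")
```

Running the two systems through the same price and seasonal scenarios, as the paper does, then amounts to re-evaluating these functions over a distribution of cash-flow vectors.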

    Unifying inflation and dark matter with the Peccei-Quinn field: observable axions and observable tensors

    A model of high-scale inflation is presented where the radial part of the Peccei-Quinn (PQ) field, with a non-minimal coupling to gravity, plays the role of the inflaton, and the QCD axion is the dark matter. A quantum fluctuation of O(H/2π) in the axion field will result in a smaller angular fluctuation if the PQ field is sitting at a larger radius during inflation than in the vacuum. This changes the effective axion decay constant, f_a, during inflation and dramatically reduces the production of isocurvature modes. This mechanism opens up a new window in parameter space where an axion decay constant in the range 10^12 GeV ≲ f_a ≲ 10^15 GeV is compatible with observably large r. The exact range allowed for f_a depends on the efficiency of reheating. This model also predicts a minimum possible value of r = 10^-3. The new window can be explored by a measurement of r possible with Spider and the proposed CASPEr experiment's search for high-f_a axions. Comment: 7 pages, 4 figures.
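The suppression mechanism can be written in one line. The notation below (χ_inf for the PQ radius during inflation, f_a for the vacuum decay constant) is generic and assumed here, not necessarily the paper's:

```latex
% Axion field fluctuation during inflation vs. resulting angular fluctuation:
\delta a \sim \frac{H}{2\pi}
\qquad\Longrightarrow\qquad
\delta\theta = \frac{\delta a}{\chi_{\rm inf}}
\sim \frac{H}{2\pi\,\chi_{\rm inf}}
\ll \frac{H}{2\pi f_a}
\quad \text{when } \chi_{\rm inf} \gg f_a .
```

Since the isocurvature perturbation scales with the angular fluctuation, its power is suppressed by a factor of order (f_a/χ_inf)² relative to the standard case where the PQ field sits at f_a throughout inflation.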

    Influence of Atmospheric Turbulence on Optical Communications using Orbital Angular Momentum for Encoding

    We describe an experimental implementation of a free-space 11-dimensional communication system using orbital angular momentum (OAM) modes. This system has a maximum measured OAM channel capacity of 2.12 bits/photon. The effects of Kolmogorov thin-phase turbulence on the OAM channel capacity are quantified. We find that increasing the turbulence leads to a degradation of the channel capacity. We are able to mitigate the effects of turbulence by increasing the spacing between detected OAM modes. This study has implications for high-dimensional quantum key distribution (QKD) systems. We describe the sort of QKD system that could be built using our current technology. Comment: 6 pages, 5 figures.
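A channel capacity in bits/photon of the kind quoted here is typically computed as the mutual information of the measured mode-crosstalk matrix. The sketch below assumes a uniform input distribution and an invented symmetric crosstalk model; it is not the paper's measured data.

```python
import numpy as np

def mutual_information(P_cond):
    """Mutual information in bits of a discrete channel with uniform input.

    P_cond[i, j] = probability of detecting mode j given that mode i was sent.
    """
    d = P_cond.shape[0]
    p_joint = P_cond / d                        # uniform prior over sent modes
    p_out = p_joint.sum(axis=0)                 # marginal over detected modes
    p_prod = np.outer(np.full(d, 1 / d), p_out)
    mask = p_joint > 0
    terms = np.zeros_like(p_joint)
    terms[mask] = p_joint[mask] * np.log2(p_joint[mask] / p_prod[mask])
    return terms.sum()

d = 11                      # 11 OAM modes, as in the abstract
p_correct = 0.7             # invented crosstalk level, not a measured value
P = np.full((d, d), (1 - p_correct) / (d - 1))
np.fill_diagonal(P, p_correct)

print(f"capacity estimate: {mutual_information(P):.2f} bits/photon "
      f"(ideal: {np.log2(d):.2f})")
```

As turbulence strengthens, the off-diagonal crosstalk grows and the mutual information falls below the log2(11) ≈ 3.46 bits/photon ideal, which is the degradation the paper quantifies; spacing out the detected modes reduces the off-diagonal leakage.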

    Shale oil: potential economies of large-scale production, preliminary phase

    Producing shale oil on a large scale is one of the possible alternatives for reducing the dependence of the United States on imported petroleum. Industry is not producing shale oil on a commercial scale now because costs are too high, even though industry dissatisfaction is most frequently expressed about "non-economic" barriers: innumerable permits, changing environmental regulations, lease limitations, water rights conflicts, legal challenges, and so on. The overall purpose of this study is to estimate whether improved technology might significantly reduce unit costs for production of shale oil in a planned large-scale industry, as contrasted with the case usually contemplated: a small industry evolving slowly on a project-by-project basis. In this preliminary phase of the study, we collected published data on the costs of present shale oil technology and adjusted them to common conditions; these data were assembled to help identify the best targets for cost reduction through improved large-scale technology. They show that the total cost of producing upgraded shale oil (i.e., shale oil acceptable as a feed to a petroleum refinery) by surface retorting ranges from about $18 to $28/barrel in late-1978 dollars, with a 20% chance that costs would fall below that range and a 20% chance they would fall above it. The probability distribution reflects our assumptions about ranges of shale richness, process performance, rate of return, and other factors that seem likely in a total industry portfolio of projects. About 40% of the total median cost is attributable to retorting, 20% to upgrading, and the remaining 40% to resource acquisition, mining, crushing, and spent shale disposal and revegetation. Capital charges account for about 70% of the median total cost and operating costs for the other 30%.
There is a reasonable chance that modified in-situ processes (like Occidental's) may be able to produce shale oil more cheaply than surface retorting, but no reliable cost data have been published; in 1978, DOE estimated a saving of roughly $5/B for in-situ. Because the total costs of shale oil are spread over many steps in the production process, improvements in most or all of those steps are required if we seek a significant reduction in total cost. A June 1979 workshop of industry experts was held to help us identify possible cost-reduction technologies. Examples of the improved large-scale technologies proposed (for further evaluation) to the workshop were:
- Instead of hydrotreating raw shale oil to make syncrude capable of being refined conventionally, rebalance all of a refinery's processes (or develop new catalysts/processes less sensitive to feed nitrogen) to accommodate shale oil feed -- a change analogous to a shift from sweet crude to sour crude.
- Instead of refining at or near the retort site, use heated pipelines to move raw shale oil to existing major refining areas.
- Instead of operating individual mines, open-pit mine all or much of the Piceance Creek Basin.
- Instead of building individual retorts, develop new methods for mass production of hundreds of retorts.
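The cost shares quoted in the abstract can be turned into rough per-barrel figures. The $23/barrel median below is simply the midpoint of the stated $18-28 range, assumed here for illustration; the study's exact median is not given in this summary.

```python
# Back-of-the-envelope breakdown of the abstract's cost shares,
# assuming a median total of $23/barrel (late-1978 dollars).
median_total = 23.0

shares = {
    "retorting": 0.40,
    "upgrading": 0.20,
    "resource, mining, crushing, shale disposal": 0.40,
}
for step, s in shares.items():
    print(f"{step:45s} ${s * median_total:5.2f}/bbl")

capital = 0.70 * median_total    # capital charges, ~70% of median total
operating = 0.30 * median_total  # operating costs, ~30%
print(f"capital ${capital:.2f}/bbl, operating ${operating:.2f}/bbl")
```

The dominance of capital charges is why the workshop proposals above focus on mass production and scale rather than on operating efficiency.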