3,065 research outputs found

    Energy-Efficient Joint Estimation in Sensor Networks: Analog vs. Digital

    Sensor networks in which energy is a limited resource, so that energy consumption must be minimized for the intended application, are considered. In this context, an energy-efficient method for the joint estimation of an unknown analog source under a given distortion constraint is proposed. The approach is purely analog: each sensor simply amplifies and forwards its noise-corrupted analog observation to the fusion center for joint estimation. The total transmission power across all the sensor nodes is minimized while satisfying a distortion requirement on the joint estimate. The energy efficiency of this analog approach is compared with previously proposed digital approaches with and without coding. Our simulations show that the analog approach is more energy-efficient than the digital system without coding, and in some cases outperforms the digital system with optimal coding.
    Comment: To appear in Proceedings of the 2005 IEEE International Conference on Acoustics, Speech and Signal Processing, Philadelphia, PA, March 19 - 23, 2005
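    The amplify-and-forward scheme this abstract describes can be illustrated with a small Monte-Carlo sketch. This is my own illustration, not the paper's code: it assumes a Gaussian source, identical sensors sharing a common gain, and AWGN channels, whereas the paper optimizes per-sensor transmit powers. The point it shows is the trade-off the paper exploits: raising the gain (spending more transmit power) suppresses channel noise and lowers the distortion of the joint estimate.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def analog_af_mse(num_sensors, source_var, obs_noise_var,
                      chan_noise_var, gain, trials=20000):
        """Monte-Carlo MSE of a fusion-center estimate when each sensor
        amplifies-and-forwards its noisy observation over an AWGN channel."""
        theta = rng.normal(0.0, np.sqrt(source_var), trials)  # unknown analog source
        obs = theta[:, None] + rng.normal(0.0, np.sqrt(obs_noise_var),
                                          (trials, num_sensors))
        rx = gain * obs + rng.normal(0.0, np.sqrt(chan_noise_var),
                                     (trials, num_sensors))
        # With identical sensors, the best linear unbiased estimate reduces
        # to averaging the gain-scaled receptions.
        est = (rx / gain).mean(axis=1)
        return float(np.mean((est - theta) ** 2))

    mse_low = analog_af_mse(10, 1.0, 0.1, 0.5, gain=0.5)   # low transmit power
    mse_high = analog_af_mse(10, 1.0, 0.1, 0.5, gain=2.0)  # 16x the transmit power
    ```

    The effective per-sensor noise after scaling back is `obs_noise_var + chan_noise_var / gain**2`, so the higher-gain run should exhibit markedly lower distortion for the same sensor count.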

    An initial access optimization algorithm for millimetre wave 5G NR networks

    Abstract. The fifth generation (5G) of cellular technology is expected to address the ever-increasing traffic requirements of the digital society. To deliver these higher data rates, higher bandwidth is required; thus, moving to the higher-frequency millimetre wave (mmWave) spectrum is needed. However, to overcome the high isotropic propagation loss experienced at these frequencies, the base station (BS) and the user equipment (UE) need highly directional antennas. Therefore, the BS and UE are required to find the correct transmission (Tx) and reception (Rx) beam pair aligned with each other. Achieving this fine alignment of beams at the initial access phase is quite challenging because no location information about the BS and UE is available. In mmWave small cells, signals are blocked by obstacles, so transmissions may not reach users. Also, some directions may have higher user density while others have lower or no user density. Therefore, an intelligent cell search is needed for initial access, one that can steer its beams toward areas known to be populated with UEs instead of wasting time and resources transmitting toward obstacles or unpopulated directions. In this thesis, we provide a dynamic weight-based beam sweeping direction and synchronization signal block (SSB) allocation algorithm to optimize the cell search in mmWave initial access. The order of the beam sweeping directions and the number of SSBs transmitted in each direction depend on previously learned experience, namely the number of detected UEs per SSB for each sweeping direction. Numerical simulations show that the proposed algorithm detects more users with a lower misdetection probability. Furthermore, it achieves the same performance with fewer dynamically allocated resources (i.e., SSBs) than constant resource allocation, yielding both better performance and more efficient resource usage.
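    The weight-based allocation idea in this abstract can be sketched as follows. This is a hypothetical illustration, not the thesis's algorithm: the function name, the `min_ssbs` exploration floor, and the proportional rounding rule are my assumptions; only the learned weights (detected UEs per SSB per direction) and the weight-ordered sweep come from the abstract.

    ```python
    def allocate_ssbs(detections_per_ssb, total_ssbs, min_ssbs=1):
        """Split an SSB budget across beam-sweeping directions in proportion
        to previously observed UE detections per SSB (the learned weights).
        Every direction keeps at least `min_ssbs` so that newly populated
        areas can still be discovered."""
        weights = {d: max(w, 0.0) for d, w in detections_per_ssb.items()}
        total_w = sum(weights.values())
        n = len(weights)
        spare = total_ssbs - min_ssbs * n  # budget left after the exploration floor
        alloc = {}
        for d, w in weights.items():
            share = (w / total_w) if total_w > 0 else 1.0 / n
            alloc[d] = min_ssbs + int(round(spare * share))
        # Absorb rounding drift into the busiest direction so the budget is exact
        drift = total_ssbs - sum(alloc.values())
        alloc[max(alloc, key=lambda d: weights[d])] += drift
        # Sweep order: most-populated directions first
        order = sorted(alloc, key=lambda d: weights[d], reverse=True)
        return alloc, order
    ```

    For example, with weights `{"N": 5.0, "E": 0.5, "S": 0.0, "W": 2.0}` and a budget of 16 SSBs, the populated "N" direction receives the largest share and is swept first, while the empty "S" direction keeps only the exploration floor.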

    Pulse vs. Optimal Stationary Fishing: The Northern Stock of Hake

    Pulse fishing may be a globally optimal strategy in multicohort fisheries. In this article we compare the pulse fishing solutions obtained using global numerical methods with the analytical stationary optimal solution. This allows us to quantify the potential benefits associated with the use of periodic fishing in the Northern Stock of hake. Results show that: first, management plans based exclusively on traditional reference targets such as Fmsy may drive the fishery's economic results far from the optimum; second, globally optimal solutions would imply, in a cyclical manner, the closure of the fishery for some periods; and third, second-best stationary policies with stable employment reduce the optimal present value of discounted profit by only 2%.
    Keywords: optimal fisheries, management optimization in age-structured models, pulse fishing

    Optimal Accomplice-Witnesses Regulation under Asymmetric Information

    We study the problem of a Legislator designing immunity for privately informed cooperating accomplices. Our objective is to highlight the positive (vertical) externality between expected returns from crime and the information rent that must be granted by the Legislator to whistleblowers in order to break their code of silence (omertà) and elicit truthful information revelation. We identify the accomplices' incentives to release distorted information and characterize the second-best policy limiting this behavior. The central finding is that this externality leads to a second-best policy that purposefully allows whistleblowers not to disclose part of their private information. We also show that accomplices must fulfill minimal information requirements to be admitted into the program (rationing), that a bonus must be awarded to accomplices providing more reliable information, and that, under some conditions, rewarding a self-reporting `boss' can increase efficiency. These results are consistent with a number of widespread legislative provisions.
    Keywords: Accomplice-witnesses, Adverse Selection, Leniency, Organized Crime

    LASER: Light, Accurate Sharing dEtection and Repair

    Contention for shared memory, in the forms of true sharing and false sharing, is a challenging performance bug to discover and to repair. Understanding cache contention requires global knowledge of the program's actual sharing behavior, and can even arise invisibly in the program due to the opaque decisions of the memory allocator. Previous schemes have focused only on false sharing, and impose significant performance penalties or require non-trivial alterations to the operating system or runtime system environment. This paper presents the Light, Accurate Sharing dEtection and Repair (LASER) system, which leverages new performance counter capabilities available on Intel's Haswell architecture that identify the source of expensive cache coherence events. Using records of these events generated by the hardware, we build a system for online contention detection and repair that operates with low performance overhead and does not require any invasive program, compiler or operating system changes. Our experiments show that LASER imposes just 2% average runtime overhead on the Phoenix, Parsec and Splash2x benchmarks. LASER can automatically improve the performance of programs by up to 19% on commodity hardware.

    Assessing Risk and Uncertainty in Fisheries Rebuilding Plans

    This paper deals with the risks and uncertainties that are an inherent part of designing and implementing fisheries rebuilding plans. Such risks and uncertainties stem from a variety of sources (biological, economic and/or political factors) and are influenced by external factors such as changing environmental conditions. The aim of this paper is to characterize these risks and uncertainties, to assess their importance for the performance of fisheries rebuilding plans, to give some examples where uncertainties have negatively affected the ability of rebuilding plans to reach their intended targets, and to give some guidelines on how to deal with risk and uncertainty. The conclusion is that when designing fisheries rebuilding plans, the availability of relevant information should be taken into account, so that progress is indisputably measurable and causes of potential failure can be clarified. Rebuilding plans need to consider biological, economic and distributional consequences in order to reduce uncertainties and to ensure successful implementation of the plan. Risk communication is also valuable in the process, since it makes the objectives, and the means to meet them, transparent; elicits crucial information from stakeholders; and legitimizes the whole process of designing and implementing the rebuilding plans, which is essential for their success. To that end the plans should be as simple and realistic as possible. It is recommended to apply risk analysis, and to invoke the precautionary principle only in cases where large uncertainties that cannot be resolved exist and/or the potential costs of ignoring them are high. Two fisheries rebuilding plans are analysed, and how they address risk and uncertainties is evaluated. This study was done under contract with the OECD. The authors are grateful to Gunnar Haraldsson and Saba Khwaja for comments and advice.

    Template-dependent multiple displacement amplification for profiling human circulating RNA

    Multiple displacement amplification (MDA) is widely used in whole-genome/transcriptome amplification. However, template-independent amplification (TIA) in MDA is a commonly observed phenomenon, particularly when using high concentrations of random hexamer primers and extended incubation times. Here, we demonstrate that the use of random pentamer primers with 5′ ends blocked by a C18 spacer results in MDA solely in a template-dependent manner, a technique we have named tdMDA. Together with an optimized procedure for the removal of residual genomic DNA during RNA extraction, tdMDA was used to profile circulating RNA from 0.2 mL of patient sera. In comparison to regular MDA, tdMDA demonstrated a lack of quantifiable DNA amplification in the negative control, a remarkable reduction of unmapped reads from Illumina sequencing (7 ± 10.9% versus 58.6 ± 39%, P = 0.006), and increased mapping rates of the serum transcriptome (26.9 ± 7.9% versus 5.8 ± 8.2%, P = 3.8 × 10-4). Transcriptome profiles could be used to separate patients with chronic hepatitis C virus (HCV) infection from those with HCV-associated hepatocellular carcinoma (HCC). We conclude that tdMDA should facilitate RNA-based liquid biopsy, as well as other genome studies with biological specimens having ultralow amounts of genetic material.