
    A Method for 21cm Power Spectrum Estimation in the Presence of Foregrounds

    21cm tomography promises to be a powerful tool for estimating cosmological parameters, constraining the epoch of reionization, and probing the so-called dark ages. However, realizing this promise will require the extraction of a cosmological power spectrum from beneath overwhelmingly large sources of foreground contamination. In this paper, we develop a unified matrix-based framework for foreground subtraction and power spectrum estimation, which allows us to quantify the errors and biases that arise in the power spectrum as a result of foreground subtraction. We find that existing line-of-sight foreground subtraction proposals can lead to substantial mode-mixing as well as residual noise and foreground biases, whereas our proposed inverse variance foreground subtraction eliminates noise and foreground biases, gives smaller error bars, and produces less correlated measurements of the power spectrum. We also numerically confirm the intuitive belief in the literature that 21cm foreground subtraction is best done using frequency rather than angular information. Comment: 24 pages, 11 figures; replaced with accepted PRD version (minor editorial changes to text; methods, results, and conclusions unchanged).
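    The matrix framework described here is closely related to standard inverse-variance quadratic power spectrum estimators. The following is a minimal sketch in the notation common to that literature; the symbols and normalization are assumptions from the general formalism, not this paper's exact conventions.

```latex
% Sketch of an inverse-variance quadratic band power estimator;
% notation assumed from the standard literature, not verbatim from the paper.
\begin{align*}
  \hat{q}_\alpha &= \tfrac{1}{2}\,\mathbf{x}^{\dagger}\mathbf{C}^{-1}
      \mathbf{C}_{,\alpha}\mathbf{C}^{-1}\mathbf{x},
  &
  \mathbf{C} &\equiv \langle\mathbf{x}\mathbf{x}^{\dagger}\rangle
      = \mathbf{C}^{\mathrm{fg}} + \mathbf{C}^{\mathrm{noise}}
      + \sum_\beta p_\beta\,\mathbf{C}_{,\beta},
\end{align*}
```

    where $\mathbf{x}$ is the data vector, $\mathbf{C}_{,\alpha} \equiv \partial\mathbf{C}/\partial p_\alpha$, and foregrounds are downweighted through $\mathbf{C}^{-1}$; normalized band powers then follow from $\hat{\mathbf{p}} = \mathbf{M}\hat{\mathbf{q}} - \mathbf{b}$ for a suitable normalization matrix $\mathbf{M}$ and noise-bias term $\mathbf{b}$.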

    Jitter model and signal processing techniques for pulse width modulation optical recording

    A jitter model and signal processing techniques are discussed for data recovery in Pulse Width Modulation (PWM) optical recording. In PWM, information is stored by modulating the sizes of sequential marks alternating in magnetic polarization or in material structure. Jitter, defined as the deviation from the original mark size in the time domain, will result in detection errors if it is excessively large. A new approach is taken in data recovery: a high-speed counter clock is first used to convert time marks into amplitude marks, and signal processing techniques are then used to minimize jitter according to the jitter model. These signal processing techniques include motor speed and intersymbol interference equalization, differential and additive detection, and differential and additive modulation.
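    To make the counting step concrete, here is a minimal Python sketch of converting time marks into amplitude (count) marks with a high-speed counter clock, followed by a simple differential detection that cancels jitter components shared by neighboring marks. The clock rate, mark lengths, and jitter magnitude are invented toy values, not parameters from the paper.

```python
import numpy as np

def marks_to_counts(transition_times, clock_hz=400e6):
    """Quantize mark durations with a high-speed counter clock,
    turning time marks into integer 'amplitude' samples."""
    durations = np.diff(transition_times)          # mark lengths in seconds
    return np.round(durations * clock_hz).astype(int)

def differential_detect(counts):
    """Difference adjacent counts so that slowly varying jitter
    (e.g. motor-speed drift) common to neighboring marks cancels."""
    return np.diff(counts)

# toy example: nominal 30/40/30-tick marks with small random edge jitter
rng = np.random.default_rng(0)
edges = np.cumsum([0, 30, 40, 30]) / 400e6 + rng.normal(0, 0.2e-9, 4)
counts = marks_to_counts(edges)
print(counts)                       # ~[30 40 30]
print(differential_detect(counts))  # ~[10 -10]
```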

    Precision Calibration of Radio Interferometers Using Redundant Baselines

    Growing interest in 21 cm tomography has led to the design and construction of broadband radio interferometers with low noise, moderate angular resolution, high spectral resolution, and wide fields of view. With characteristics somewhat different from traditional radio instruments, these interferometers may require new calibration techniques in order to reach their design sensitivities. Self-calibration or redundant calibration techniques that allow an instrument to be calibrated off complicated sky emission structures are ideal. In particular, the large number of redundant baselines possessed by these new instruments makes redundant calibration an especially attractive option. In this paper, we explore the errors and biases in existing redundant calibration schemes through simulations, and show how statistical biases can be eliminated. We also develop a general calibration formalism that includes both redundant baseline methods and basic point source calibration methods as special cases, and show how slight deviations from perfect redundancy and coplanarity can be taken into account. Comment: 18 pages, 13 figures; replaced to match accepted MNRAS version.
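    As an illustration of why redundancy helps, here is a minimal Python sketch of the common log-amplitude linearization of redundant calibration ("logcal"); the array layout, noise level, and variable names are toy assumptions, not this paper's formalism. For exactly redundant baselines the measurement is v_ij = g_i conj(g_j) y_u + noise, so ln|v_ij| = eta_i + eta_j + ln|y_u| is linear in the per-antenna log gain amplitudes eta and the per-group sky amplitudes ln|y_u|.

```python
import numpy as np

# toy 1D array of 5 antennas: baselines with equal separation are redundant
n_ant = 5
baselines = [(i, j) for i in range(n_ant) for j in range(i + 1, n_ant)]
group = {bl: bl[1] - bl[0] for bl in baselines}   # redundancy group = separation
n_grp = max(group.values())

# design matrix for ln|v_ij| = eta_i + eta_j + ln|y_group|
A = np.zeros((len(baselines), n_ant + n_grp))
for row, (i, j) in enumerate(baselines):
    A[row, i] = 1.0
    A[row, j] = 1.0
    A[row, n_ant + group[(i, j)] - 1] = 1.0

# simulate log-amplitude data from known gains and sky, then solve
rng = np.random.default_rng(1)
truth = np.concatenate([rng.normal(0, 0.05, n_ant),   # eta (log gains)
                        rng.normal(0, 0.3, n_grp)])   # ln|y| per group
data = A @ truth + rng.normal(0, 1e-3, len(baselines))
sol, *_ = np.linalg.lstsq(A, data, rcond=None)
# A has a one-dimensional null space (the overall amplitude degeneracy:
# eta -> eta + c, ln|y| -> ln|y| - 2c), so sol matches truth only after
# fixing that convention; lstsq returns the minimum-norm solution.
```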

    PABO: Mitigating Congestion via Packet Bounce in Data Center Networks

    In today's data centers, a diverse mix of throughput-sensitive long flows and delay-sensitive short flows is commonly present in shallow-buffered switches. Long flows can potentially block the transmission of delay-sensitive short flows, leading to degraded performance. Congestion can also be caused by the synchronization of multiple TCP connections for short flows, as typically seen in the partition/aggregate traffic pattern. While multiple end-to-end transport-layer solutions have been proposed, none of them has tackled the real challenge: reliable transmission in the network. In this paper, we fill this gap by presenting PABO -- a novel link-layer design that can mitigate congestion by temporarily bouncing packets to upstream switches. PABO's design fulfills the following goals: i) providing per-flow flow control on the link layer, ii) handling transient congestion without the intervention of end devices, and iii) gradually back-propagating the congestion signal to the source when the network is unable to handle the congestion. Experimental results show that PABO provides a prominent advantage in mitigating transient congestion and achieves a significant gain in end-to-end delay.
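    As a loose, hedged illustration of the bounce-on-congestion idea (not PABO's actual link-layer design; the topology, queue sizes, and class names below are invented for the toy), a congested switch can push a packet back to its upstream neighbor instead of dropping it:

```python
from collections import deque

class Switch:
    """Toy switch whose full queue bounces packets upstream."""
    def __init__(self, name, capacity):
        self.name, self.capacity, self.queue = name, capacity, deque()

    def enqueue(self, pkt, upstream=None):
        if len(self.queue) < self.capacity:
            self.queue.append(pkt)
            return True
        # transient congestion: bounce to the upstream switch, which
        # buffers the packet and back-propagates pressure toward the source
        if upstream is not None and upstream.enqueue(pkt):
            print(f"{self.name}: bounced {pkt} to {upstream.name}")
            return True
        print(f"{self.name}: dropped {pkt}")  # last resort
        return False

s1, s2 = Switch("S1", capacity=8), Switch("S2", capacity=2)
for i in range(5):
    s2.enqueue(f"pkt{i}", upstream=s1)   # pkt2..pkt4 bounce to S1
```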

    The Big Ban(g) Theory

    The term “Big Tech” refers to Amazon, Apple, Facebook (Meta), Google, and Microsoft. These companies are the five largest multinational online service or computer hardware and software companies, and they hold the top positions in the stock market by market share. Data indicate that these five firms made over 700 acquisitions from 1987 to 2019 (Google 32%, Microsoft 31%, Apple 15%, Amazon 11%, and Facebook 11%). After 2001, the DOJ and FTC began to use NAICS codes to report HSR (Hart-Scott-Rodino) transactions; NAICS 518 covers data processing, hosting, and related services (mainly including Google, Amazon, and Facebook). Over 200 transactions were reportable between 2001 and 2017, only one of which was challenged by the DOJ in federal district court: the Google/ITA case. This rate, as a percentage of transactions cleared to the agencies over the period, is about 3%, significantly lower than the 13% rate across all sectors. All of this data raises controversy over the effects of Big Tech’s dominance on innovation and market entry, its incentives to compete on price and nonprice dimensions, and the potential for AI-driven biased pricing and other theories of harm. Recognizing this growing power of Big Tech and the underenforcement of regulation, US Senator Josh Hawley proposed the “Bust Up Big Tech Act” on April 19th, 2021, which, in his words, will “crack down on mergers and acquisitions by mega-corporations and strengthen antitrust enforcement to pursue the breakup of dominant, anticompetitive firms.” In section 2, this article examines the US regulations on both horizontal and non-horizontal mergers and the evolution of the law over the past 60 years. In section 3, the article looks at how the law interacts with Big Tech merger and acquisition activity and introduces the shortcomings of the existing system. In section 4, the article analyses in depth the theories of harm and what would happen if an authority banned all mergers and acquisitions by Big Tech. In section 5, the article briefly expresses the authors’ view on the extent to which they agree with “The Big Ban(g) Theory.”