
    Estimation in the group action channel

    We analyze the problem of estimating a signal from multiple measurements on a group action channel, which linearly transforms a signal by a random group action followed by a fixed projection and additive Gaussian noise. This channel is motivated by applications such as multi-reference alignment and cryo-electron microscopy. We focus on the large-noise regime prevalent in these applications. We give a lower bound on the mean square error (MSE) of any asymptotically unbiased estimator of the signal's orbit in terms of the signal's moment tensors, which implies that the MSE is bounded away from 0 when N/σ^{2d} is bounded from above, where N is the number of observations, σ is the noise standard deviation, and d is the so-called moment order cutoff. In contrast, the maximum likelihood estimator is shown to be consistent if N/σ^{2d} diverges. Comment: 5 pages, conference
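
    As a concrete illustration of the measurement model (a minimal sketch, not taken from the paper), the Python snippet below simulates a multi-reference alignment instance of the group action channel: each observation is a random cyclic shift of an unknown signal plus additive Gaussian noise of standard deviation sigma. The projection is taken to be the identity for simplicity, and all names and parameter values are hypothetical.

import numpy as np

def simulate_mra(x, N, sigma, seed=None):
    """Draw N observations: a random cyclic shift of x plus sigma * Gaussian noise,
    i.e., a cyclic multi-reference alignment instance of the group action channel."""
    rng = np.random.default_rng(seed)
    L = len(x)
    shifts = rng.integers(0, L, size=N)                  # random group elements (cyclic shifts)
    obs = np.stack([np.roll(x, s) for s in shifts])
    obs += sigma * rng.standard_normal(obs.shape)        # additive Gaussian noise
    return obs

# In the large-noise regime, shift-invariant empirical moments of the observations
# (mean, power spectrum, bispectrum, ...) carry the usable information about the orbit of x.
x = np.array([1.0, 0.5, -0.3, 0.0, 0.8])
y = simulate_mra(x, N=10_000, sigma=3.0, seed=0)
first_moment = y.mean()                                   # order-1 moment: estimates mean(x)
power_spectrum = (np.abs(np.fft.fft(y, axis=1)) ** 2).mean(axis=0)   # order-2 invariant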

    Putting the wood back into our rivers: an experiment in river rehabilitation

    This paper presents an overview of a project established to assess the effectiveness of woody debris (WD) reintroduction as a river rehabilitation tool. An outline is presented of an experiment that aims to develop and assess the effectiveness of engineered log jams (ELJs) under Australian conditions, and to demonstrate the potential for using a range of ELJs to stabilise a previously de-snagged, high-energy gravel-bed channel. Furthermore, the experiment will test the effectiveness of a reach-based rehabilitation strategy to increase geomorphic variability and hence habitat diversity. While the project focuses primarily on the geomorphic and engineering aspects of the rehabilitation strategy, fish and freshwater mussel populations are also being monitored. The project is located within an 1100 m reach of the Williams River, NSW. Twenty separate ELJ structures were constructed, incorporating a total of 430 logs placed without any artificial anchoring (e.g., no cabling or imported ballast). A geomorphic control reach was established 3.1 km upstream of the project reach. In the 6 months since the structures were built, the study site has experienced 6 flows that have overtopped most structures; 3 of these flows were in excess of the mean annual flood, inundating 19 of the ELJs by 2-3 m and one by 0.5 m. Early results indicate that, with the exception of LS4 and LS5, all structures are performing as intended and that the geomorphic variability of the reach has substantially increased.

    The Congressional Bureaucracy

    Congress has a bureaucracy. This Article introduces the concept of the “congressional bureaucracy,” and theorizes what it means for Congress to have an internal workforce of more than 4,000 nonpartisan, highly specialized, and long-serving experts, without which the modern Congress could not function. These experts—not elected Members or their political staffs—write the text of the laws, audit implementation, research policy, estimate bills’ economic effects, decide which committees control legislation and which amendments can be made, edit and rearrange already-enacted (!) legislation into the law as we see it in the U.S. Code, and much more. The congressional bureaucracy furthers internal and external separation of powers, revives theories of Congress as a rational actor, and supplies key insight for statutory interpretation. But courts, lawyers, and legal scholars have almost entirely ignored its existence. This project is based on two years of confidential interviews with high-level staffers in Congress’s nine nonpartisan legislative institutions—the Office of the Law Revision Counsel; the Offices of the Legislative Counsels; the Congressional Research Service; the Government Accountability Office; the Parliamentarians; the Congressional Budget Office; the Joint Committee on Taxation; MedPAC and MACPAC—and additional interviews with partisan staff. The project furthers a new line of legislation scholarship about the value to theory and doctrine of understanding how Congress actually works. Courts cannot claim that the doctrines of statutory interpretation are democratically linked to Congress, as virtually all judges do, without understanding how Congress writes legislation. Our research reveals that the congressional bureaucracy serves purposes previously unimagined by legal scholarship. Classic bureaucracy literature posits that Congress loses power when it delegates. But the congressional bureaucracy was explicitly founded so that Congress could reclaim and safeguard its own powers against an executive branch that was encroaching on the legislative process. The bureaucracy also safeguards Congress’s own internal separation of powers, the salutary decentralization of law-producing responsibilities among a collection of nonpartisan actors, preventing any one aspect of the lawmaking process from coming under undue political or centralized control. Understanding the congressional bureaucracy’s work also provocatively deconstructs the concept of a “statutory text.” The words Congress enacts are the result of a highly dialogic process that is triggered by and includes assumptions about critical inputs from the bureaucracy. Members and partisan staff focus on the substance of legislation at the macro level, not the specific words chosen at the micro level—that is the bureaucracy’s job. What we see when we open the statute books often is not even what Congress enacted or how Congress arranged it, because OLRC reorganizes and edits the laws after passage. So conceived, the concept of a “statute” is much more capacious than merely the “text” at the moment of the vote. None of this is illegitimate; Congress has set itself up this way. All of these inputs are part of the “text” as Congress intends it to be understood. Together, these institutions paint a picture of a Congress that is not as irrational as the public considers it to be. They also offer on-the-ground lessons for statutory interpretation, highlighting critical inputs that courts miss and numerous statutory cues—from code placement to consistency of language to CBO scores—some of which courts dramatically overread, others of which should be attractive even to textualists because they result from formalist, objective, collectively congressional action. The field is now engaged in emerging debates about whether doctrine can absorb this kind of detail about legislative process; understanding the congressional bureaucracy is a critical new piece of this account.

    Local Algorithms for Block Models with Side Information

    There has been a recent interest in understanding the power of local algorithms for optimization and inference problems on sparse graphs. Gamarnik and Sudan (2014) showed that local algorithms are weaker than global algorithms for finding large independent sets in sparse random regular graphs. Montanari (2015) showed that local algorithms are suboptimal for finding a community with high connectivity in sparse Erdős-Rényi random graphs. For the symmetric planted partition problem (also called community detection for the block models) on sparse graphs, a simple observation is that local algorithms cannot have non-trivial performance. In this work we consider the effect of side information on local algorithms for community detection under the binary symmetric stochastic block model. In the block model with side information, each of the n vertices is labeled + or − independently and uniformly at random; each pair of vertices is connected independently with probability a/n if both of them have the same label or b/n otherwise. The goal is to estimate the underlying vertex labeling given 1) the graph structure and 2) side information in the form of a vertex labeling positively correlated with the true one. Assuming that the ratio between in- and out-degree a/b is Θ(1) and the average degree (a+b)/2 = n^{o(1)}, we characterize three different regimes under which a local algorithm, namely belief propagation run on the local neighborhoods, maximizes the expected fraction of vertices labeled correctly. Thus, in contrast to the case of symmetric block models without side information, we show that local algorithms can achieve optimal performance for the block model with side information. Comment: Due to the limitation "The abstract field cannot be longer than 1,920 characters", the abstract here is shorter than that in the PDF file.
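
    As a rough illustration of this setting (a minimal sketch under simplifying assumptions, not the authors' code), the Python snippet below generates a binary symmetric stochastic block model with noisy side-information labels and runs loopy belief propagation in log-likelihood-ratio form; the O(1/n) non-edge terms are dropped, as is standard in the sparse regime, and all names and parameter values are hypothetical.

import numpy as np
from collections import defaultdict

def sbm_with_side_info(n, a, b, alpha, rng):
    """Labels are +/-1 uniformly at random; an edge appears with probability a/n
    (same label) or b/n (different label); side info flips each label w.p. alpha."""
    labels = rng.choice([-1, 1], size=n)
    edges = [(u, v) for u in range(n) for v in range(u + 1, n)
             if rng.random() < (a / n if labels[u] == labels[v] else b / n)]
    noisy = labels * np.where(rng.random(n) < alpha, -1, 1)
    return labels, edges, noisy

def belief_propagation(n, edges, noisy, a, b, alpha, iters=10):
    """Loopy BP on the sparse graph; messages are log-likelihood ratios for label +1."""
    nbrs = defaultdict(list)
    for u, v in edges:
        nbrs[u].append(v)
        nbrs[v].append(u)
    h = np.log((1 - alpha) / alpha) * noisy                       # side-information LLR per vertex
    msg = {(u, v): 0.0 for u, v in edges} | {(v, u): 0.0 for u, v in edges}
    def edge_fn(m):                                               # BP edge nonlinearity
        return np.log((a * np.exp(m) + b) / (b * np.exp(m) + a))
    for _ in range(iters):
        msg = {(u, v): h[u] + sum(edge_fn(msg[(w, u)]) for w in nbrs[u] if w != v)
               for (u, v) in msg}
    beliefs = [h[u] + sum(edge_fn(msg[(w, u)]) for w in nbrs[u]) for u in range(n)]
    return np.where(np.array(beliefs) >= 0, 1, -1)

rng = np.random.default_rng(0)
labels, edges, noisy = sbm_with_side_info(n=400, a=8.0, b=2.0, alpha=0.3, rng=rng)
estimate = belief_propagation(400, edges, noisy, a=8.0, b=2.0, alpha=0.3)
print("fraction labeled correctly:", (estimate == labels).mean())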

    Weekly group tummy time classes are feasible and acceptable to mothers with infants: a pilot cluster randomized controlled trial

    © 2020, The Author(s). Background: The World Health Organization recommends 30 min of tummy time daily for improved motor development and reduced likelihood of plagiocephaly. As only 30% of infants meet this recommendation, parents require strategies and support to increase this proportion. Methods: The aim of this study was to determine the feasibility, acceptability, and potential efficacy of a group intervention to promote tummy time. The design is a cluster randomized controlled trial with concealed allocation, assessor blinding, and intention-to-treat analysis. Five groups of healthy infants (N = 35, baseline mean (SD) age 5.9 (2.8) weeks) and their mothers attending local mothers' groups (Australia) were randomly allocated to the intervention or control group. The intervention group received group tummy time classes in addition to usual care. The control group received usual care with their child and family health nurse. Primary outcomes were intervention feasibility and acceptability. Secondary outcomes were tummy time duration (accelerometry), adherence to physical activity guidelines, head shape, and motor development. Measures were taken at baseline, post-intervention, and when infants were 6 months of age. Analyses were by linear mixed models and Cohen's d statistic. Results: Recruitment, retention, and collection of objective data met feasibility targets. Acceptability was also met, with intervention mothers rating the information, goal planning, and handouts as significantly more useful and relevant than control group mothers did (p < 0.01). Moderate effect sizes favoring the intervention group were also found at post-intervention for tummy time duration, adherence to physical activity guidelines, and infant ability in prone and supine: intervention infants averaged 30 min of tummy time and 30% adherence to guidelines (95% CI 0 to 60.6 min), compared with control infants who averaged 16.6 min and 13% adherence (95% CI 0 to 42.1 min; Cohen's d = 0.5). Limitations were the small sample size, the 4-week intervention, limited accelerometer use, and a homogeneous sample of participants. Conclusion: Group tummy time classes delivered in a mothers' group setting were shown to be feasible and acceptable. A larger randomized controlled trial is warranted. Trial registration: ANZCTR, ACTRN12617001298303p. Registered 11 September 2017.
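
    For reference only (an illustrative sketch, not the trial's analysis code): Cohen's d for a two-group comparison is the difference in group means divided by the pooled standard deviation. The Python snippet below shows the calculation; the group sizes and standard deviations used here are invented for illustration and only the two means (30 and 16.6 min) come from the abstract.

import math

def cohens_d(mean_1, sd_1, n_1, mean_2, sd_2, n_2):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_1 - 1) * sd_1 ** 2 + (n_2 - 1) * sd_2 ** 2) / (n_1 + n_2 - 2))
    return (mean_1 - mean_2) / pooled_sd

# Hypothetical group sizes and SDs, chosen purely to illustrate the formula.
print(round(cohens_d(30.0, 28.0, 18, 16.6, 26.0, 17), 2))   # prints ~0.5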

    Effect of β-glucan and black tea in a functional bread on short chain fatty acid production by the gut microbiota in a gut digestion/fermentation model

    β-Glucan and black tea are fermented by the colonic microbiota, producing short chain fatty acids (SCFA) and phenolic acids (PA). We hypothesized that the addition of β-glucan, a dietary fiber, and tea polyphenols to a food matrix such as bread would also affect starch digestion in the upper gut and thus further influence colonic fermentation and SCFA production. This study investigated SCFA and PA production from locally developed breads: white bread (WB), black tea bread (BT), β-glucan bread (βG), and β-glucan plus black tea bread (βGBT). Each bread was incubated in an in vitro system mimicking human digestion and colonic fermentation. Digestion with α-amylase significantly (p = 0.0001) increased total polyphenols and polyphenolic metabolites from BT bread compared with WB, βG, and βGBT. Total polyphenols in βGBT remained higher (p = 0.016; 1.3-fold) after digestion with pepsin and pancreatin compared with WB. Fermentations containing βG and βGBT produced similar propionate concentrations, ranging from 17.5 to 18.6 mmol/L, and total SCFA from 46.0 to 48.9 mmol/L, compared with the control WB (14.0 and 37.4 mmol/L, respectively). This study suggests that the combination of black tea with β-glucan in this functional bread did not affect SCFA production. A higher dose of black tea and β-glucan, or a combination with other fibers, may be needed to increase SCFA production.

    Super-resolution provided by the arbitrarily strong superlinearity of the blackbody radiation

    Blackbody radiation is a fundamental phenomenon in nature, and its explanation by Planck marks a cornerstone in the history of physics. In this theoretical work, we show that the spectral radiance given by Planck's law is strongly superlinear with temperature, with an arbitrarily large local exponent for decreasing wavelengths. From that scaling analysis, we propose a new concept of super-resolved detection and imaging: if a focused beam of energy is scanned over an object that absorbs and linearly converts that energy into heat, a highly nonlinear thermal radiation response is generated, and its point spread function can be made arbitrarily smaller than the excitation beam focus. Based on a few practical scenarios, we propose to extend the notion of super-resolution beyond its current niche in microscopy to various kinds of excitation beams, a wide range of spatial scales, and a broader diversity of target objects.
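
    To make the scaling claim concrete (an illustrative sketch, not the authors' code): for Planck's law, the local exponent of the spectral radiance with respect to temperature is d ln B_λ / d ln T = x e^x / (e^x − 1), where x = hc/(λ k_B T), and this exponent grows without bound as the wavelength (or the temperature) decreases. The Python snippet below evaluates the exponent for a few wavelengths at an assumed 3000 K.

import numpy as np
from scipy.constants import h, c, k    # Planck constant, speed of light, Boltzmann constant

def local_exponent(wavelength_m, temperature_k):
    """d ln(B_lambda) / d ln(T) for Planck's law: x e^x / (e^x - 1) with x = hc/(lambda k T)."""
    x = h * c / (wavelength_m * k * temperature_k)
    return x / (1.0 - np.exp(-x))       # numerically stable rewriting of x e^x / (e^x - 1)

# At 3000 K the radiance scales roughly as T^2.6 at 2000 nm but as T^19 at 250 nm,
# i.e., the effective nonlinearity becomes arbitrarily strong at shorter wavelengths.
for lam_nm in (2000, 1000, 500, 250):
    print(f"{lam_nm} nm -> local exponent {local_exponent(lam_nm * 1e-9, 3000.0):.1f}")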

    Channel and terminal description of the ACTS mobile terminal

    The Advanced Communications Technology Satellite (ACTS) Mobile Terminal (AMT) is a proof-of-concept K/Ka-band mobile satellite communications terminal under development by NASA at JPL. Currently the AMT is undergoing system integration and test in preparation for a July 1993 ACTS launch and the subsequent commencement of mobile experiments in the fall of 1993. The AMT objectives are presented, followed by a discussion of the AMT communications channel and the mobile terminal design and performance.

    Identification of and Molecular Basis for SIRT6 Loss-of-Function Point Mutations in Cancer

    Chromatin factors have emerged as the most frequently dysregulated family of proteins in cancer. We have previously identified the histone deacetylase SIRT6 as a key tumor suppressor, yet whether point mutations in SIRT6 are selected for in cancer remains unclear. In this manuscript, we characterized naturally occurring patient-derived SIRT6 mutations. Strikingly, all the mutations significantly affected either the stability or the catalytic activity of SIRT6, indicating that these mutations were selected for in these tumors. Further, the mutant proteins failed to rescue SIRT6 knockout (SIRT6 KO) cells, as measured by the levels of histone acetylation at glycolytic genes and by their inability to rescue the tumorigenic potential of these cells. Notably, the main activity affected in the mutants was histone deacetylation rather than demyristoylation, pointing to the former as the main tumor-suppressive function of SIRT6. Our results identified cancer-associated point mutations in SIRT6, cementing its function as a tumor suppressor in human cancer.