
    Early formation of massive, compact, spheroidal galaxies with classical profiles by violent disc instability or mergers

    We address the formation of massive stellar spheroids between redshifts z=4 and z=1 using a suite of AMR hydro-cosmological simulations. The spheroids form as bulges, and the spheroid mass growth is partly driven by violent disc instability (VDI) and partly by mergers. A kinematic decomposition into disc and spheroid yields a spheroid mass fraction between 50% and 90%, roughly constant in time, consistent with a cosmological steady state of VDI discs that are continuously fed from the cosmic web. The density profile of the spheroid is typically "classical", with a Sersic index n = 4.5 ± 1, independent of whether it grew by mergers or VDI and independent of the feedback strength. The disc is characterized by n = 1.5 ± 0.5, and the whole galaxy by n = 3 ± 1. The high-redshift spheroids are compact due to the dissipative inflow of gas and the high universal density. The stellar surface density within the effective radius of each galaxy, as it evolves, remains roughly constant in time after its first growth. For galaxies of a fixed stellar mass, the surface density is higher at higher redshifts. Comment: 22 pages, 15 figures, accepted in MNRAS.
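    For reference (the profile itself is not written out in the abstract), the Sersic surface-brightness law behind these index values is, in standard notation,

        I(r) = I_e \exp\{ -b_n [ (r/r_e)^{1/n} - 1 ] \},

    where r_e is the effective (half-light) radius, I_e the surface brightness at r_e, and b_n a constant fixed so that half the total light lies within r_e; n ≈ 4 corresponds to the classical de Vaucouleurs bulge profile and n ≈ 1 to an exponential disc.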

    A bulk inflaton from large volume extra dimensions

    The universe may have extra spatial dimensions with large volume that we cannot perceive because the energy required to excite modes in the extra directions is too high. Many examples are known of such manifolds with a large volume and a large mass gap. These compactifications can help explain the weakness of four-dimensional gravity and, as we show here, they also have the capacity to produce reasonable potentials for an inflaton field. When the inflaton is modeled as a bulk scalar field, it becomes very weakly coupled in four dimensions, and this enables us to build phenomenologically acceptable inflationary models with tunings at the few per mil level. We speculate on dark matter candidates and the possibility of braneless models in this setting. Comment: 21 pages, LaTeX, 2 PDF figures. v2: additional references. v3: added comments on moduli stabilization.
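    For context (not stated in the abstract), the simplest compactification on a circle of radius R gives Kaluza-Klein excitations with masses

        m_n ≈ n / R,  n = 1, 2, ...,

    so a larger extra-dimensional volume normally means a smaller mass gap; manifolds that combine large volume with a large gap evade this naive scaling, which is what makes them useful for hiding the extra directions while keeping a bulk scalar light in four dimensions.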

    Risk Management for Nonprofits

    Our research, based on the first comprehensive financial analysis of New York's nonprofit sector, found that 10% of the city's nonprofits were insolvent and 40% had virtually no cash reserves. Less than 30% were financially strong. If anything, things are getting harder, given market volatility, the move to value-based payments in health care, and increased costs for real estate and labor. Fortunately, we also discovered that nonprofits can take a few concrete steps to reduce their risk of failure and sustain vital programs:
    - Make risk management an explicit responsibility of the audit and/or finance committee.
    - Develop a risk-tolerance statement, indicating the limits for risk-taking and the willingness to trade short-term impact for longer-term sustainability.
    - Keep a running list of major risks and the likelihood and expected loss for each.
    - Put in place plans for how to maintain service in the event of a financial disaster, or even a "living will" that specifies how programs will be transferred to other providers (or wound down in an orderly fashion) in the event that recovery is not possible.
    - Brief trustees regularly about longer-term trends in the operating environment.
    - Periodically explore the potential benefits of various forms of organizational redesign, such as mergers, acquisitions, joint ventures, partnerships, outsourcing, managed dissolutions, and divestments.
    - Compare financial performance to peers on an annual basis.
    - Develop explicit targets for operating results (margins, months of cash, etc.) and contingency plans if minimum targets are not met.
    - Redouble efforts to build and safeguard a financial cushion or "rainy-day fund," even if doing so forces consideration of difficult programmatic trade-offs.
    Doing any of these will depend on a functioning partnership between capable management and a critical mass of experienced, educated, and engaged board members. Therefore, organizations serious about risk management must work hard to recruit board members with a wide range of experience. They need to ensure ongoing education for both new and existing board members and to empower high-functioning committees. Many organizations, particularly large and complex ones, would also benefit from having an experienced nonprofit executive on their board.

    Optimizing a One DOF Robot Without a Mathematical Model Using a Genetic Algorithm

    The purpose of the present study was to design a flight control system that uses no pre-determined mathematical model, instead relying on a genetic algorithm to maintain the optimal altitude. The study was conducted using a quantitative empirical research method. In the process of conducting the research, we found that programming a genetic algorithm from scratch was cumbersome for novice users. For this reason, we created and released an open-source Python package called EasyGA. An initial population of 10 chromosomes was evolved over 5 generations during the trial. The throttle value of the device had an associated gene value of 1 second. When the trial of five generations was completed, the total increase was 171 percent. Preliminary results showed that optimizing a one-DOF device in real time is possible without using a pre-determined mathematical model.
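    As a rough illustration of the approach (this is not the authors' code; the target altitude, gene range, and the measure_altitude stand-in below are hypothetical), a model-free genetic-algorithm tuning loop of this kind can be sketched in plain Python:

        import random

        POP_SIZE = 10          # population size used in the trial
        GENERATIONS = 5        # number of generations used in the trial
        TARGET_ALTITUDE = 1.0  # hypothetical target altitude

        def measure_altitude(throttle):
            """Hypothetical stand-in for applying a throttle for 1 second
            and reading the resulting altitude from the one-DOF rig."""
            return 2.0 * throttle  # placeholder response curve

        def fitness(chromosome):
            """Higher is better: negative absolute error from the target."""
            throttle = chromosome[0]
            return -abs(measure_altitude(throttle) - TARGET_ALTITUDE)

        def crossover(a, b):
            # Pick each gene from one of the two parents at random.
            return [random.choice(pair) for pair in zip(a, b)]

        def mutate(chromosome, rate=0.2):
            # Small random perturbation of each gene with probability `rate`.
            return [g + random.uniform(-0.05, 0.05) if random.random() < rate else g
                    for g in chromosome]

        # Each chromosome holds a single gene: a throttle value in [0, 1].
        population = [[random.uniform(0.0, 1.0)] for _ in range(POP_SIZE)]

        for gen in range(GENERATIONS):
            population.sort(key=fitness, reverse=True)
            parents = population[:POP_SIZE // 2]  # truncation selection
            children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                        for _ in range(POP_SIZE - len(parents))]
            population = parents + children

        best = max(population, key=fitness)
        print(f"Best throttle after {GENERATIONS} generations: {best[0]:.3f}")

    On the real hardware each fitness evaluation would block for the 1-second gene evaluation, and a package such as EasyGA wraps this kind of selection/crossover/mutation loop for the user.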

    Relaxation dynamics of the toric code in contact with a thermal reservoir: Finite-size scaling in a low temperature regime

    We present an analysis of the relaxation dynamics of finite-size topological qubits in contact with a thermal bath. Using a continuous-time Monte Carlo method, we explicitly compute the low-temperature nonequilibrium dynamics of the toric code on finite lattices. In contrast to the size-independent bound predicted for the toric code in the thermodynamic limit, we identify a low-temperature regime on finite lattices below a size-dependent crossover temperature with nontrivial finite-size and temperature scaling of the relaxation time. We demonstrate how this nontrivial finite-size scaling is governed by the scaling of topologically nontrivial two-dimensional classical random walks. The transition out of this low-temperature regime defines a dynamical finite-size crossover temperature that scales inversely with the log of the system size, in agreement with a crossover temperature defined from equilibrium properties. We find that both the finite-size and finite-temperature scaling are stronger in the low-temperature regime than above the crossover temperature. Since this finite-temperature scaling competes with the scaling of the robustness to unitary perturbations, this analysis may elucidate the scaling of memory lifetimes of possible physical realizations of topological qubits. Comment: 14 pages, 13 figures.
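    In symbols (our notation, not the paper's), the stated crossover scaling is

        T^*(L) \propto 1 / \ln L,

    where L is the system size, so the crossover temperature below which the nontrivial finite-size scaling of the relaxation time applies falls to zero only logarithmically slowly as the lattice grows.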

    Transient LTRE analysis reveals the demographic and trait-mediated processes that buffer population growth.

    Temporal variation in environmental conditions affects population growth directly via its impact on vital rates, and indirectly through induced variation in demographic structure and phenotypic trait distributions. We currently know very little about how these processes jointly mediate population responses to their environment. To address this gap, we develop a general transient life table response experiment (LTRE), which partitions the contributions to population growth arising from variation in (1) survival and reproduction, (2) demographic structure, (3) trait values, and (4) climatic drivers. We apply the LTRE to a population of yellow-bellied marmots (Marmota flaviventer) to demonstrate the impact of demographic and trait-mediated processes. Our analysis provides a new perspective on demographic buffering, which may be a more subtle phenomenon than is currently assumed. The new LTRE framework presents opportunities to improve our understanding of how trait variation influences population dynamics and adaptation in stochastic environments.
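    In generic LTRE notation (ours, not the paper's), such a partition writes the deviation of realized population growth in year t from a reference value as an approximate sum of contributions,

        \lambda_t - \bar{\lambda} \approx C_t^{(\text{vital rates})} + C_t^{(\text{structure})} + C_t^{(\text{traits})} + C_t^{(\text{climate})},

    where each C term collects the first-order effect of that source of variation on growth; a transient LTRE evaluates these contributions along the observed, non-stationary sequence of population structures rather than assuming a stable stage distribution.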

    Measuring What Matters Most

    An argument that choice-based, process-oriented educational assessments are more effective than static assessments of fact retrieval. If a fundamental goal of education is to prepare students to act independently in the world—in other words, to make good choices—an ideal educational assessment would measure how well we are preparing students to do so. Current assessments, however, focus almost exclusively on how much knowledge students have accrued and can retrieve. In Measuring What Matters Most, Daniel Schwartz and Dylan Arena argue that choice should be the interpretive framework within which learning assessments are organized. Digital technologies, they suggest, make this possible; interactive assessments can evaluate students in a context of choosing whether, what, how, and when to learn. Schwartz and Arena view choice not as an instructional ingredient to improve learning but as the outcome of learning. Because assessments shape public perception about what is useful and valued in education, choice-based assessments would provide a powerful lever for this reorientation in how people think about learning. Schwartz and Arena consider both theoretical and practical matters. They provide an anchoring example of a computerized, choice-based assessment, argue that knowledge-based assessments are a mismatch for our educational aims, offer concrete examples of choice-based assessments that reveal what knowledge-based assessments cannot, and analyze the practice of designing assessments. Because high variability leads to innovation, they suggest democratizing assessment design to generate as many instances as possible. Finally, they consider the most difficult aspect of assessment: fairness. Choice-based assessments, they argue, shed helpful light on fairness considerations.