
    Criticality, factorization and long-range correlations in the anisotropic XY-model

    We study the long-range quantum correlations in the anisotropic XY-model. By first examining the thermodynamic limit we show that employing the quantum discord as a figure of merit allows one to capture the main features of the model at zero temperature. Further, by considering suitably large site separations we find that these correlations obey a simple scaling behavior for finite temperatures, allowing for efficient estimation of the critical point. We also address ground-state factorization of this model by explicitly considering finite size systems, showing its relation to the energy spectrum and explaining the persistence of the phenomenon at finite temperatures. Finally, we compute the fidelity between finite and infinite systems in order to show that remarkably small system sizes can closely approximate the thermodynamic limit. Comment: 8 pages, 8 figures. Close to published version.

    How Did SCHIP Affect the Insurance Coverage of Immigrant Children?

    The State Children's Health Insurance Program (SCHIP) significantly expanded public insurance eligibility and coverage for children in "working poor" families. Despite this success, it is estimated that over 6 million children who are eligible for public insurance remain uninsured. An important first step for designing strategies to increase enrollment of eligible but uninsured children is to determine how the take-up of public coverage varies within the population. Because of their low rates of insurance coverage and unique enrollment barriers, children of immigrants are an especially important group to consider. We compare the effect of SCHIP eligibility on the insurance coverage of children of foreign-born and native-born parents. In contrast to research on the earlier Medicaid expansions, we find similar take-up rates for the two groups. This suggests that state outreach strategies were not only effective at increasing take-up overall, but were successful in reducing disparities in access to coverage.

    Unifying approach to the quantification of bipartite correlations by Bures distance

    The notion of distance defined on the set of states of a composite quantum system can be used to quantify total, quantum and classical correlations in a unifying way. We provide new closed formulae for classical and total correlations of two-qubit Bell-diagonal states by considering the Bures distance. Complementing the known corresponding expressions for entanglement and more general quantum correlations, we thus complete the quantitative hierarchy of Bures correlations for Bell-diagonal states. We then explicitly calculate Bures correlations for two relevant families of states: Werner states and rank-2 Bell-diagonal states, highlighting the subadditivity which holds for total correlations with respect to the sum of classical and quantum ones when using Bures distance. Finally, we analyse a dynamical model of two independent qubits locally exposed to non-dissipative decoherence channels, where both quantum and classical correlations measured by Bures distance exhibit freezing phenomena, in analogy with other known quantifiers of correlations. Comment: 18 pages, 4 figures; published version.
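    As a numerical illustration of the quantities involved (a sketch of our own, not code from the paper), the Bures distance can be evaluated from the Uhlmann fidelity as D_B(ρ,σ) = sqrt(2(1 − sqrt(F))). The example below assumes NumPy/SciPy and the standard Werner-state parameterization w|Ψ⁻⟩⟨Ψ⁻| + (1−w)I/4:

    ```python
    import numpy as np
    from scipy.linalg import sqrtm

    def fidelity(rho, sigma):
        """Uhlmann fidelity F = (Tr sqrt(sqrt(rho) sigma sqrt(rho)))^2."""
        srho = sqrtm(rho)
        return np.real(np.trace(sqrtm(srho @ sigma @ srho))) ** 2

    def bures_distance(rho, sigma):
        """Bures distance D_B = sqrt(2 (1 - sqrt(F)))."""
        return np.sqrt(2.0 * (1.0 - np.sqrt(fidelity(rho, sigma))))

    def werner(w):
        """Two-qubit Werner state: w |psi-><psi-| + (1 - w) I/4."""
        psi_minus = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2.0)
        return w * np.outer(psi_minus, psi_minus) + (1.0 - w) * np.eye(4) / 4.0

    # Distance vanishes for identical states and grows as the states separate.
    print(bures_distance(werner(0.3), werner(0.3)))
    print(bures_distance(werner(0.2), werner(0.8)))
    ```

    Since Werner states are diagonal in the Bell basis, the fidelity here reduces to the classical Bhattacharyya overlap of their eigenvalue distributions, which makes the computed values easy to check by hand.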

    Dosimetric significance of manual density overrides in oropharyngeal cancer

    Kilovoltage computed tomography plays a crucial role in radiotherapy planning. However, the presence of high-density metallic objects can introduce streaking artifacts in CT scans, resulting in inaccurate dose calculations by the treatment planning software. Previous studies have explored manual density overrides and artifact reduction algorithms individually to enhance dose calculation accuracy, but their combined application on patient plans within a treatment planning system remains unexplored. This research aims to assess the necessity of manual density overrides when an artifact reduction algorithm is already employed to address dental artifacts in oropharyngeal cancer treatment plans. A total of 20 previously treated volumetric modulated arc therapy plans were collected, and manual density overrides were removed followed by plan recalculation. Dosimetric parameters were then compared between the original and modified plans. Statistical analysis revealed several dosimetric parameters for the planning target volume (PTV), clinical target volume (CTV), and oral cavity that exhibited statistically significant differences upon removing the manual density override. However, these differences were found to be small in absolute terms. No other organs evaluated demonstrated statistically significant differences in dose. The largest disparity observed was an 8.26 cGy increase in mean dose to the CTV, which represents only 0.12% of the prescription dose. Based on these findings, it can be concluded that manual density overrides are likely unnecessary when an artifact reduction algorithm is employed in oropharyngeal cancer cases.
    Keywords: Metal artifact reduction algorithm, manual density override, oropharyngeal cancer, head and neck cancer, dental artifact

    Planning Research Ethics

    This chapter will discuss why planning researchers of all kinds – faculty, students, consultants – need to be ethically sensitive. The chapter is written against a background of what has been termed an '"ethical" turn' in many disciplines, and an increasing regulation (and bureaucratisation) of planning (and other) research conducted within universities. Ladd (1980) argues that there are no ethical principles which are specific to any occupation. In this chapter we argue that the circumstances of planning research, at least, raise distinctive ethical issues. The chapter begins by considering the ethical dimensions of research practice, which will differ according to the way research practice is understood. It then asks why regulation of researchers' conduct is now such a preoccupation of universities and researchers. Finally, it examines the nature of codes of ethics and when they are likely to be most effective.

    Do Labyrinthine Legal Limits on Leverage Lessen the Likelihood of Losses? An Analytical Framework

    A common theme in the regulation of financial institutions and transactions is leverage constraints. Although such constraints are implemented in various ways—from minimum net capital rules to margin requirements to credit limits—the basic motivation is the same: to limit the potential losses of certain counterparties. However, the emergence of dynamic trading strategies, derivative securities, and other financial innovations poses new challenges to these constraints. We propose a simple analytical framework for specifying leverage constraints that addresses this challenge by explicitly linking the likelihood of financial loss to the behavior of the financial entity under supervision and prevailing market conditions. An immediate implication of this framework is that not all leverage is created equal, and any fixed numerical limit can lead to dramatically different loss probabilities over time and across assets and investment styles. This framework can also be used to investigate the macroprudential policy implications of microprudential regulations through the general-equilibrium impact of leverage constraints on market parameters such as volatility and tail probabilities. (Massachusetts Institute of Technology, Laboratory for Financial Engineering; Northwestern University School of Law, Faculty Research Program.)
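    The claim that "not all leverage is created equal" can be made concrete with a toy calculation of our own (assuming one-period normal asset returns; this is an illustration, not the authors' framework): with leverage L, equity is exhausted when the asset return falls below −1/L, so the same fixed leverage limit implies very different loss probabilities as volatility changes.

    ```python
    from math import erf, sqrt

    def wipeout_prob(leverage, mu, sigma):
        """P(asset return < -1/leverage) for a one-period normal(mu, sigma) return."""
        z = (-1.0 / leverage - mu) / sigma
        return 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF at z

    # The same 10x leverage limit in calm versus volatile markets:
    for sigma in (0.02, 0.05, 0.10):
        print(f"sigma={sigma:.2f}: P(wipeout) = {wipeout_prob(10, 0.0, sigma):.4f}")
    ```

    Moving volatility from 2% to 10% takes the wipeout probability from effectively zero to roughly one in six, even though the numerical leverage limit never changed.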

    An Evolutionary Model of Bounded Rationality and Intelligence

    Background: Most economic theories are based on the premise that individuals maximize their own self-interest and correctly incorporate the structure of their environment into all decisions, thanks to human intelligence. The influence of this paradigm goes far beyond academia: it underlies current macroeconomic and monetary policies, and is also an integral part of existing financial regulations. However, there is mounting empirical and experimental evidence, including the recent financial crisis, suggesting that humans do not always behave rationally, but often make seemingly random and suboptimal decisions.
    Methods and Findings: Here we propose to reconcile these contradictory perspectives by developing a simple binary-choice model that takes evolutionary consequences of decisions into account as well as the role of intelligence, which we define as any ability of an individual to increase its genetic success. If no intelligence is present, our model produces results consistent with prior literature and shows that risks that are independent across individuals in a generation generally lead to risk-neutral behaviors, but that risks that are correlated across a generation can lead to behaviors such as risk aversion, loss aversion, probability matching, and randomization. When intelligence is present the nature of risk also matters, and we show that even when risks are independent, either risk-neutral behavior or probability matching will occur depending upon the cost of intelligence in terms of reproductive success. In the case of correlated risks, we derive an implicit formula that shows how intelligence can emerge via selection, why it may be bounded, and how such bounds typically imply the coexistence of multiple levels and types of intelligence as a reflection of varying environmental conditions.
    Conclusions: Rational economic behavior in which individuals maximize their own self-interest is only one of many possible types of behavior that arise from natural selection. The key to understanding which types of behavior are more likely to survive is how behavior affects reproductive success in a given population's environment. From this perspective, intelligence is naturally defined as behavior that increases the probability of reproductive success, and bounds on rationality are determined by physiological and environmental constraints. (Massachusetts Institute of Technology, Laboratory for Financial Engineering.)
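    The correlated-risk result can be illustrated with a minimal simulation of our own (a sketch, not the paper's model): when the environment outcome is common to an entire generation and favours action A with probability p, the long-run log growth rate p·log(f) + (1−p)·log(1−f) of a lineage choosing A with frequency f is maximized at f = p, i.e. probability matching, rather than by always choosing the more likely outcome.

    ```python
    import numpy as np

    def mean_log_growth(f, p, generations=100_000, seed=0):
        """Average log growth when a fraction f of offspring choose action A
        and the systematic (generation-wide) environment favours A with prob p."""
        rng = np.random.default_rng(seed)
        favours_a = rng.random(generations) < p        # common shock each generation
        growth = np.where(favours_a, f, 1.0 - f)       # surviving fraction per generation
        return np.log(growth).mean()

    p = 0.7
    for f in (0.5, 0.7, 0.9, 0.99):
        print(f"f={f:.2f}: mean log growth = {mean_log_growth(f, p):+.4f}")
    ```

    The "rational" deterministic choice f close to 1 is heavily penalized here, because a single unfavourable generation wipes out almost the whole lineage; randomization hedges the systematic risk.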

    Impossible Frontiers

    A key result of the capital asset pricing model (CAPM) is that the market portfolio—the portfolio of all assets in which each asset's weight is proportional to its total market capitalization—lies on the mean-variance-efficient frontier, the set of portfolios having mean-variance characteristics that cannot be improved upon. Therefore, the CAPM cannot be consistent with efficient frontiers for which every frontier portfolio has at least one negative weight or short position. We call such efficient frontiers "impossible," and show that impossible frontiers are difficult to avoid. In particular, as the number of assets, n, grows, we prove that the probability that a generically chosen frontier is impossible tends to one at a geometric rate. In fact, for one natural class of distributions, nearly one-eighth of all assets on a frontier are expected to have negative weights for every portfolio on the frontier. We also show that the expected minimum amount of short selling across frontier portfolios grows linearly with n, and even when short sales are constrained to some finite level, an impossible frontier remains impossible. Using daily and monthly U.S. stock returns, we document the impossibility of efficient frontiers in the data. (AlphaSimplex Group, LLC; Massachusetts Institute of Technology, Laboratory for Financial Engineering.)
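    The prevalence of negative frontier weights is easy to see numerically. The sketch below is our own illustration with arbitrary return and covariance distributions (not the specific class of distributions analysed in the paper): it draws random parameters, computes one representative frontier portfolio with weights w ∝ Σ⁻¹μ, and counts how often at least one weight is negative as n grows.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def frontier_weights(mu, Sigma):
        """A tangency-style frontier portfolio, with weights proportional to Sigma^{-1} mu."""
        w = np.linalg.solve(Sigma, mu)
        return w / w.sum()

    def frac_impossible(n, trials=200):
        """Fraction of randomly drawn markets whose frontier portfolio shorts some asset."""
        count = 0
        for _ in range(trials):
            mu = rng.normal(0.05, 0.02, n)           # random expected returns
            A = rng.normal(size=(n, n))
            Sigma = A @ A.T / n + 0.01 * np.eye(n)   # random positive-definite covariance
            if (frontier_weights(mu, Sigma) < 0).any():
                count += 1
        return count / trials

    for n in (2, 5, 10, 25):
        print(n, frac_impossible(n))
    ```

    Even in this crude Monte Carlo, the fraction of frontiers containing a short position climbs rapidly toward one as n increases, consistent with the geometric-rate result stated in the abstract.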

    Dynamic Loss Probabilities and Implications for Financial Regulation

    Much of financial regulation and supervision is devoted to ensuring the safety and soundness of financial institutions. Such micro- and macro-prudential policies are almost always formulated as capital requirements, leverage constraints, and other statutory restrictions designed to limit the probability of extreme financial loss to some small but acceptable threshold. However, if the risks of a financial institution's assets vary over time and across circumstances, then the efficacy of financial regulations necessarily varies in lockstep unless the regulations are adaptive. We illustrate this principle with empirical examples drawn from the financial industry, and show how the interaction of certain regulations with dynamic loss probabilities can have the unintended consequence of amplifying financial losses. We propose an ambitious research agenda in which legal scholars and financial economists collaborate to develop optimally adaptive regulations that anticipate the endogeneity of risk-taking behavior.

    Machinery and China's nexus of foreign trade and economic growth
