The high energy limit of the trajectory representation of quantum mechanics
The trajectory representation in the high energy limit (Bohr correspondence
principle) manifests a residual indeterminacy. This indeterminacy is compared
to the indeterminacy found in the classical limit (Planck's constant to 0)
[Int. J. Mod. Phys. A 15, 1363 (2000)] for particles in the classically allowed
region, the classically forbidden region, and near the WKB turning point. The
differences between Bohr's and Planck's principles for the trajectory
representation are compared with the differences between these correspondence
principles for the wave representation. The trajectory representation in the
high energy limit is shown to go to neither classical nor statistical
mechanics. The residual indeterminacy is contrasted to Heisenberg uncertainty.
The relationship between indeterminacy and 't Hooft's information loss and
equivalence classes is investigated.
Comment: 12 pages of LaTeX. No figures. Incorporated into the "Proceedings of
the Seventh International Wigner Symposium" (ed. M. E. Noz), 24-29 August
2001, U. of Maryland. Proceedings available at
http://www.physics.umd.edu/robo
‘Jugglers’, ‘copers’ and ‘strugglers’: academics’ perceptions of being a head of department in a post-1992 UK university and how it influences their future careers
This study investigates the experiences of academics who became department heads in a post-1992 UK university and explores the influence that being in the position has on their planned future academic career. Drawing on life history interviews undertaken with 17 male and female heads of department, the paper constitutes an in-depth study of their careers in the same university. The findings suggest that academics who become department heads not only need the capacity to assume a range of personal and professional identities, but also the flexibility to adopt and switch between them regularly. Whether individuals can successfully balance and manage such multiple identities, or whether they experience major conflicts within or between them, greatly affects their experiences of being a head of department and seems to influence their subsequent career decisions. The paper concludes by proposing a conceptual framework and typology to interpret the career trajectories of academics who became department heads in the case university.
Applications of artificial intelligence to mission planning
The scheduling problem facing NASA-Marshall mission planning is extremely difficult for several reasons. The most critical factor is the computational complexity involved in developing a schedule. The size of the search space is large along some dimensions and infinite along others. Because of this and other difficulties, many conventional operations research techniques are infeasible or inadequate to solve these problems on their own. The purpose here is therefore to examine various artificial intelligence (AI) techniques to assist conventional techniques or to replace them. The specific tasks performed were as follows: (1) to identify mission planning applications for object-oriented and rule-based programming; (2) to investigate interfacing AI-dedicated hardware (Lisp machines) to VAX hardware; (3) to demonstrate how Lisp may be called from within FORTRAN programs; (4) to investigate and report on programming techniques used in some commercial AI shells, such as Knowledge Engineering Environment (KEE); and (5) to study and report on algorithmic methods to reduce complexity as related to AI techniques.
Unexpected cell type-dependent effects of autophagy on polyglutamine aggregation revealed by natural genetic variation in C. elegans.
BACKGROUND: Monogenic protein aggregation diseases, in addition to cell selectivity, exhibit clinical variation in the age of onset and progression, driven in part by inter-individual genetic variation. While natural genetic variants may pinpoint plastic networks amenable to intervention, the mechanisms by which they impact individual susceptibility to proteotoxicity are still largely unknown.
RESULTS: We have previously shown that natural variation modifies polyglutamine (polyQ) aggregation phenotypes in C. elegans muscle cells. Here, we find that a genomic locus from the C. elegans wild isolate DR1350 causes two genetically separable aggregation phenotypes, without changing the basal activity of muscle proteostasis pathways known to affect polyQ aggregation. We find that the increased aggregation phenotype was due to regulatory variants in the gene encoding the conserved autophagy protein ATG-5. The atg-5 gene itself conferred dosage-dependent enhancement of aggregation, with the DR1350-derived allele behaving as a hypermorph. Surprisingly, increased aggregation in animals carrying the modifier locus was accompanied by enhanced autophagy activation in response to activating treatment. Because autophagy is expected to clear, not increase, protein aggregates, we activated autophagy in three different polyQ models and found a striking tissue-dependent effect: activation of autophagy decreased polyQ aggregation in neurons and intestine, but increased it in the muscle cells.
CONCLUSIONS: Our data show that cryptic natural variants in genes encoding proteostasis components, although not causing detectable phenotypes in wild-type individuals, can have profound effects on aggregation-prone proteins. Clinical applications of autophagy activators for aggregation diseases may need to consider the unexpected divergent effects of autophagy in different cell types.
The Equivalence Postulate of Quantum Mechanics
The Equivalence Principle (EP), stating that all physical systems are
connected by a coordinate transformation to the free one with vanishing energy,
univocally leads to the Quantum Stationary HJ Equation (QSHJE). Trajectories
depend on the Planck length through hidden variables which arise as initial
conditions. The formulation has manifest p-q duality, a consequence of the
involutive nature of the Legendre transform and of its recently observed
relation with second-order linear differential equations. This is reflected in an
intrinsic psi^D-psi duality between linearly independent solutions of the
Schroedinger equation. Unlike in Bohm's theory, there is a non-trivial action even
for bound states. No use of any axiomatic interpretation of the wave-function
is made. Tunnelling is a direct consequence of the quantum potential which
differs from the usual one and plays the role of the particle's self-energy. The
QSHJE is defined only if the ratio psi^D/psi is a local self-homeomorphism of
the extended real line. This is an important feature as the L^2 condition,
which in the usual formulation is a consequence of the axiomatic interpretation
of the wave-function, directly follows as a basic theorem which only uses the
geometrical gluing conditions of psi^D/psi at q=\pm\infty as implied by the EP.
As a result, the EP itself implies a dynamical equation that does not require
any further assumption and reproduces both tunnelling and energy quantization.
Several features of the formulation show how the Copenhagen interpretation
hides the underlying nature of QM. Finally, the non-stationary higher
dimensional quantum HJ equation and the relativistic extension are derived.
Comment: 1+3+140 pages, LaTeX. Invariance of the wave-function under the
action of SL(2,R) subgroups acting on the reduced action explicitly reveals
that the wave-function describes only equivalence classes of Planck length
deterministic physics. New derivation of the Schwarzian derivative from the
cocycle condition. "Legendre brackets" introduced to further make "Legendre
duality" manifest. Introduction now contains examples and provides a short
pedagogical review. Clarifications, conclusions, acknowledgements and references added.
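For readers who want the explicit form of the QSHJE mentioned above, the display below is a minimal sketch of its one-dimensional stationary form as it is commonly written in this line of work; the notation (S_0 for the reduced action, W(q) = V(q) - E, and the Schwarzian derivative {f; q}) is a standard convention assumed here, not quoted from the paper itself.

% QSHJE (1D, stationary case): the quantum correction enters through the
% Schwarzian derivative of the reduced action S_0 with respect to q.
\[
  \frac{1}{2m}\left(\frac{\partial S_0}{\partial q}\right)^{2} + W(q)
  + \frac{\hbar^{2}}{4m}\,\{S_0;\,q\} = 0,
  \qquad
  \{f;\,q\} \equiv \frac{f'''}{f'} - \frac{3}{2}\left(\frac{f''}{f'}\right)^{2}.
\]
% The last term is the quantum potential; it is this Schwarzian form, rather
% than Bohm's -\hbar^2 R''/(2mR), that the abstract credits with producing
% tunnelling and with playing the role of the particle's self-energy.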
Blocked All-Pairs Shortest Paths Algorithm on Intel Xeon Phi KNL Processor: A Case Study
Manycore processors are consolidating in the HPC community as a way of improving
performance while maintaining power efficiency. Knights Landing is the recently
released second generation of the Intel Xeon Phi architecture. While optimizing
applications on CPUs, GPUs and first-generation Xeon Phis has been widely studied in
recent years, the new features of Knights Landing processors require a revision
of programming and optimization techniques for these devices. In this work, we
selected the Floyd-Warshall algorithm as a representative case study of graph
and memory-bound applications. Starting from the default serial version, we
show how data, thread and compiler level optimizations help the parallel
implementation to reach 338 GFLOPS.
Comment: Computer Science - CACIC 2017. Springer Communications in Computer
and Information Science, vol 79
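To make the case study concrete, here is a minimal blocked (tiled) Floyd-Warshall sketch in C with OpenMP. It illustrates the standard three-phase tiling scheme used by blocked all-pairs shortest-paths codes; the matrix size, tile size, data layout and parallelisation choices are illustrative assumptions, not the paper's actual Knights Landing implementation or its optimizations.

/* Minimal blocked Floyd-Warshall sketch (C11 + OpenMP).
 * Illustrative assumptions: dense float matrix, N a multiple of B,
 * simple parallel-for scheduling. Compile e.g.: cc -O3 -fopenmp fw_blocked.c */
#include <stdio.h>

#define N   1024          /* number of vertices (assumed multiple of B) */
#define B   64            /* tile size, an illustrative choice          */
#define INF 1.0e9f        /* "no edge" marker                           */

static float D[N][N];     /* D[i][j] = length of shortest known path i -> j */

/* Relax all paths i->k->j with i in [ib,ib+B), j in [jb,jb+B), k in [kb,kb+B). */
static void fw_tile(int ib, int jb, int kb)
{
    for (int k = kb; k < kb + B; k++)
        for (int i = ib; i < ib + B; i++) {
            float dik = D[i][k];
            for (int j = jb; j < jb + B; j++) {
                float via = dik + D[k][j];
                if (via < D[i][j])
                    D[i][j] = via;
            }
        }
}

static void fw_blocked(void)
{
    for (int kb = 0; kb < N; kb += B) {
        /* Phase 1: the diagonal tile depends only on itself. */
        fw_tile(kb, kb, kb);

        /* Phase 2: tiles in the same tile-row or tile-column as the diagonal. */
        #pragma omp parallel for
        for (int b = 0; b < N; b += B)
            if (b != kb) {
                fw_tile(kb, b, kb);   /* tile-row of the pivot    */
                fw_tile(b, kb, kb);   /* tile-column of the pivot */
            }

        /* Phase 3: all remaining tiles are mutually independent. */
        #pragma omp parallel for collapse(2)
        for (int ib = 0; ib < N; ib += B)
            for (int jb = 0; jb < N; jb += B)
                if (ib != kb && jb != kb)
                    fw_tile(ib, jb, kb);
    }
}

int main(void)
{
    /* Toy initialisation: zero-weight self-loops, no other edges. */
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            D[i][j] = (i == j) ? 0.0f : INF;

    fw_blocked();
    printf("D[0][0] = %.1f\n", D[0][0]);
    return 0;
}

In a tuned version the inner kernel would typically be vectorised and the tile size matched to the cache hierarchy; the data, thread and compiler-level optimizations the abstract refers to act on exactly those points.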
Child mortality in rural Malawi: HIV closes the survival gap between the socio-economic strata
As HIV-related deaths increase in a population, the usual association between low socio-economic status and child mortality may change, particularly as death rates from other causes decline.
METHODS/PRINCIPAL FINDINGS: As part of a demographic surveillance system in northern Malawi in 2002-6, covering a population of 32,000, information was collected on the socio-economic status of households. Deaths were classified as HIV/AIDS-related or not by verbal autopsy. Poisson regression models were used to assess the association of socio-economic indicators with all-cause mortality, AIDS mortality and non-AIDS mortality among children. There were 195 deaths in infants, 109 in children aged 1-4 years, and 38 in children aged 5-15. All-cause child mortality in infants and 1-4 year olds was similar in households with higher and lower socio-economic status. In infants, 13% of deaths were attributed to AIDS, and there were no clear trends with socio-economic status for AIDS or non-AIDS causes. For 1-4 year olds, 27% of deaths were attributed to AIDS. AIDS mortality was higher among those with better-built houses, and lowest in those with income from farming and fishing, whereas non-AIDS mortality was higher in those with worse-built houses, lowest in those with income from employment, and decreased with increasing household assets.
CONCLUSIONS/SIGNIFICANCE: In this population, since HIV infection among adults was initially more common among the less poor, childhood mortality patterns have changed. The usual gap in survival between the poor and the less poor has been lost, but this is because the less poor have been disproportionately affected by HIV, rather than because of a relative improvement in the survival of the poorest.
The PAPAGENO Parallel-Parser Generator
The increasing use of multicore processors has deeply transformed computing paradigms and applications. The wide availability of multicore systems has also had an impact on the field of compiler technology, although research on deterministic parsing has not proved effective at exploiting these architectural advantages, the main impediment being the inherently sequential nature of traditional LL and LR algorithms. We present PAPAGENO, an automated parser generator relying on operator precedence grammars. We complemented the PAPAGENO-generated parallel parsers with parallel lexing techniques, obtaining near-linear speedups on multicore machines and the same speed as Bison parsers in sequential execution.
Auditing the accessibility of MOOCs: a four-component approach
This paper reports the design of a four-component audit to evaluate the accessibility of Massive Open Online Courses (MOOCs). The MOOC accessibility audit was designed as part of a research programme at The Open University (UK) that aimed to assess the current state of accessibility of MOOC platforms and resources, to uncover accessibility barriers, and to derive recommendations on how the barriers could be addressed. The audit is composed of four evaluation components: technical accessibility, user experience (UX), quality and learning design. It consists of four processes, supported by checklists corresponding to each of the four components, implemented via a heuristic evaluation approach, an evaluation technique from the Human-Computer Interaction literature.