45,818 research outputs found
Design study of a low cost civil aviation GPS receiver system
A low-cost Navstar receiver system for civil aviation applications was defined. User objectives and constraints were established. Alternative navigation processing design trades were evaluated. Receiver hardware was synthesized by comparing technology projections with various candidate system designs. A control display unit design was recommended as the result of field test experience with Phase I GPS sets and a review of special human factors for general aviation users. Areas requiring technology development to ensure a low-cost Navstar set in the 1985 timeframe were identified.
Has the deregulation of deposit interest rates raised mortgage rates?
Interest rates; Mortgages
Predicting protein function by machine learning on amino acid sequences – a critical evaluation
Copyright © 2007 Al-Shahib et al; licensee BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Background: Predicting the function of newly discovered proteins simply by inspecting their amino acid sequences is one of the major challenges of post-genomic computational biology, especially when done without recourse to experimentation or homology information. Machine learning classifiers are able to discriminate between proteins belonging to different functional classes. Until now, however, it has been unclear whether this ability would transfer to proteins of unknown function, which may show distinct biases compared with experimentally more tractable proteins.
Results: Here we show that proteins with known and unknown function do indeed differ significantly. We then show that proteins from different bacterial species differ to an even larger and very surprising extent, but that functional classifiers nonetheless generalize successfully across species boundaries. We also show that, in the case of highly specialized proteomes, classifiers from a different but more conventional species may in fact outperform the endogenous species-specific classifier.
Conclusion: We conclude that there is a very good prospect of successfully predicting the function of yet-uncharacterized proteins using machine learning classifiers trained on proteins of known function.
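The feature-then-classify pipeline this abstract evaluates can be sketched minimally. The snippet below is a toy stand-in, not the paper's method: it uses amino acid composition as the feature vector and a nearest-centroid rule in place of the classifiers actually benchmarked, and the sequences and class labels are invented for illustration.

```python
from collections import Counter

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard residues

def composition(seq):
    """Fraction of each standard amino acid in a sequence."""
    counts = Counter(seq)
    return [counts.get(aa, 0) / len(seq) for aa in AMINO_ACIDS]

def train_centroids(labelled_seqs):
    """Mean composition vector per functional class -- a minimal stand-in
    for the machine learning classifiers evaluated in the paper."""
    sums, ns = {}, {}
    for seq, label in labelled_seqs:
        vec = composition(seq)
        acc = sums.setdefault(label, [0.0] * len(vec))
        for k, v in enumerate(vec):
            acc[k] += v
        ns[label] = ns.get(label, 0) + 1
    return {lab: [v / ns[lab] for v in acc] for lab, acc in sums.items()}

def predict(centroids, seq):
    """Assign the class whose centroid is nearest in squared distance."""
    vec = composition(seq)
    def dist(lab):
        return sum((a - b) ** 2 for a, b in zip(vec, centroids[lab]))
    return min(centroids, key=dist)
```

A real evaluation along the paper's lines would swap in richer sequence features and a stronger classifier, and crucially test transfer across species and across the known/unknown-function divide.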
Endoscopic inspection using a panoramic annular lens
The objective of this one-year study was to design, build, and demonstrate a prototype system for cavity inspection. A cylindrical view of the cavity interior was captured in real time through a compound lens system consisting of a unique panoramic annular lens and a collector lens. Images, acquired with a digitizing camera and stored in a desktop computer, were manipulated using image processing software to aid in visual inspection and qualitative analysis. A detailed description of the lens and its applications is given.
Endoscopic measurements using a panoramic annular lens
The objective of this project was to design, build, demonstrate, and deliver a prototype system for making measurements within cavities. The system was to utilize structured lighting as the means for making measurements and was to rely on a stationary probe, equipped with a unique panoramic annular lens, to capture a cylindrical view of the illuminated cavity. Panoramic images, acquired with a digitizing camera and stored in a desktop computer, were to be linearized and analyzed by mouse-driven interactive software.
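The "linearization" step mentioned above, unwrapping the annular image formed by the panoramic annular lens into a flat panoramic strip, can be sketched as a polar-to-rectangular resampling. The function name, geometry parameters, and nearest-neighbour sampling below are illustrative assumptions, not details taken from the project.

```python
import math

def unwrap_annulus(img, cx, cy, r_in, r_out, width, height):
    """Nearest-neighbour linearization of an annular image into a
    rectangular panorama: columns sweep the angle 0..2*pi, rows sweep
    radius from the outer ring (row 0) to the inner ring.
    img is a row-major 2D list of grayscale values."""
    pano = [[0] * width for _ in range(height)]
    for row in range(height):
        r = r_out - (r_out - r_in) * row / max(height - 1, 1)
        for col in range(width):
            theta = 2 * math.pi * col / width
            x = int(round(cx + r * math.cos(theta)))
            y = int(round(cy + r * math.sin(theta)))
            if 0 <= y < len(img) and 0 <= x < len(img[0]):
                pano[row][col] = img[y][x]
    return pano
```

A production version would use bilinear interpolation and calibrated center/radius values, but the coordinate mapping is the same.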
Algorithmic linear dimension reduction in the l_1 norm for sparse vectors
This paper develops a new method for recovering m-sparse signals that is simultaneously uniform and quick. We present a reconstruction algorithm whose run time, O(m log^2(m) log^2(d)), is sublinear in the length d of the signal. The reconstruction error is within a logarithmic factor (in m) of the optimal m-term approximation error in l_1. In particular, the algorithm recovers m-sparse signals perfectly, and noisy signals are recovered with polylogarithmic distortion. Our algorithm makes O(m log^2(d)) measurements, which is within a logarithmic factor of optimal. We also present a small-space implementation of the algorithm. These sketching techniques and the corresponding reconstruction algorithms provide an algorithmic dimension reduction in the l_1 norm. In particular, vectors of support m in dimension d can be linearly embedded into O(m log^2 d) dimensions with polylogarithmic distortion. We can reconstruct a vector from its low-dimensional sketch in time O(m log^2(m) log^2(d)). Furthermore, this reconstruction is stable and robust under small perturbations.
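The flavor of such sublinear sketch-and-recover schemes can be illustrated in the simplest special case: a 1-sparse vector of length d can be measured with O(log d) nonadaptive "bit-test" sums and reconstructed exactly, without ever scanning all d entries at recovery time. This toy is only an analogue; the paper's algorithm handles m-sparse signals and noise via considerably more machinery (hashing into buckets, repetition, and error correction).

```python
def measure(x, d):
    """O(log d) nonadaptive linear measurements of a 1-sparse vector x:
    the total sum, plus one sum per bit position over the indices whose
    binary expansion has that bit set."""
    nbits = (d - 1).bit_length()
    total = sum(x)
    bit_tests = [sum(x[i] for i in range(d) if (i >> j) & 1)
                 for j in range(nbits)]
    return total, bit_tests

def recover(total, bit_tests, d):
    """Reconstruct the position and value of the single nonzero entry:
    bit j of the position is 1 exactly when bit test j is nonzero."""
    idx = 0
    for j, b in enumerate(bit_tests):
        if b != 0:
            idx |= 1 << j
    x = [0] * d
    x[idx] = total
    return x
```

Recovery reads only the log d + 1 measurements, which is the sense in which such decoders run in time sublinear in d.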
Playing Quantum Physics Jeopardy with zero-energy eigenstates
We describe an example of an exact, quantitative Jeopardy-type quantum mechanics problem. This problem type is based on the conditions in one-dimensional quantum systems that allow an energy eigenstate for the infinite square well to have zero curvature and zero energy when suitable Dirac delta functions are added. This condition and its solution are not often discussed in quantum mechanics texts and have interesting pedagogical consequences.
Comment: 8 pages, 3 figures, requires graphicx and epsfig packages. Additional information, including individual files containing the Worksheet and a Worksheet template, is available at http://webphysics.davidson.edu/mjb/jeopardy
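The zero-curvature condition the abstract refers to can be sketched concretely. The single-delta placement and the notation below are our assumptions for illustration, not the paper's worked example.

```latex
% Infinite square well on [0,L] with one delta potential at x = a:
\[
-\frac{\hbar^{2}}{2m}\,\psi''(x) + \lambda\,\delta(x-a)\,\psi(x) = E\,\psi(x),
\qquad \psi(0)=\psi(L)=0 .
\]
% Setting E = 0 forces zero curvature away from the delta, so psi is
% piecewise linear, subject to the standard derivative jump at x = a:
\[
\psi'(a^{+}) - \psi'(a^{-}) = \frac{2m\lambda}{\hbar^{2}}\,\psi(a).
\]
% The tent state psi(x) = x on [0,a] and psi(x) = a(L-x)/(L-a) on [a,L]
% satisfies both boundary conditions and the jump condition exactly when
\[
\lambda = -\frac{\hbar^{2}\,L}{2m\,a\,(L-a)},
\]
% i.e. a single attractive delta of this strength admits a normalizable
% zero-energy, zero-curvature eigenstate of the well.
```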
Analysis of self-oscillating DC-DC resonant power converters using a hysteretic relay
The paper presents a technique for exciting resonant DC-DC converters in a self-oscillating manner. The analysis necessary to predict the behaviour of such converters is also given. The oscillation is based on the behaviour of a hysteretic relay with a negative hysteresis transition. Self-oscillating converters benefit from higher efficiency and higher power density than their non-self-oscillating counterparts, as they can be operated closer to the tank resonant frequency. The self-oscillating mechanism presented here is also simple and cost-effective to implement. A prototype converter is presented in order to verify the theoretical claims.
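The self-oscillating idea can be illustrated with a minimal time-domain sketch: a hysteretic relay closing the loop around a series resonant tank, with the oscillation settling near the tank's resonant frequency. The relay polarity, component values, and integration scheme below are illustrative assumptions, not the paper's design; in particular, the paper's relay has a negative hysteresis transition, which this simple conventional-hysteresis sketch does not reproduce.

```python
import math

def simulate(L=100e-6, C=100e-9, R=1.0, V=10.0, h=0.1,
             dt=10e-9, steps=200_000):
    """Semi-implicit Euler simulation of a series RLC tank driven by a
    hysteretic relay.  The relay keeps the square-wave drive roughly in
    phase with the tank current, so the loop self-oscillates close to
    the tank resonance.  All values are illustrative assumptions."""
    i = v = 0.0          # inductor current, capacitor voltage
    u = V                # relay output, +V or -V
    flips = []           # times of -V -> +V relay transitions
    for n in range(steps):
        # hysteretic relay: track the sign of i with a band of width 2h
        if u < 0 and i > h:
            u = V
            flips.append(n * dt)
        elif u > 0 and i < -h:
            u = -V
        i += (u - R * i - v) / L * dt   # L di/dt = u - R*i - v_C
        v += i / C * dt                 # C dv_C/dt = i
    f_res = 1.0 / (2 * math.pi * math.sqrt(L * C))
    if len(flips) < 2:
        return None, f_res
    period = (flips[-1] - flips[0]) / (len(flips) - 1)
    return 1.0 / period, f_res
```

Because the hysteresis band h is small relative to the steady-state current amplitude (roughly 4V/(pi*R) for a high-Q tank), the relay switches near the current zero crossings and the measured oscillation frequency tracks the resonant frequency, which is the efficiency argument the abstract makes.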
Did Neoliberalizing West African Forests Produce a New Niche for Ebola?
A recent study introduced a vaccine that controls Ebola Makona, the Zaire ebolavirus variant that has infected 28,000 people in West Africa. We propose that even such successful advances are insufficient for many emergent diseases. We review work hypothesizing that Makona, phenotypically similar to variants from much smaller outbreaks, emerged out of shifts in land use brought about by neoliberal economics. The epidemiological consequences demand a new science that explicitly addresses the foundational processes underlying multispecies health, including the deep-time histories, cultural infrastructure, and global economic geographies driving disease emergence. The approach, for instance, reverses the standard public health practice of segregating emergency responses from the structural context in which outbreaks originate. In Ebola's case, regional neoliberalism may affect the stochastic "friction" of ecological relationships imposed by the forest across populations, which, when above a threshold, keeps the virus from lining up transmission above replacement. Export-led logging, mining, and intensive agriculture may depress such functional noise, permitting novel spillovers with larger forces of infection. Mature outbreaks, meanwhile, can continue to circulate even in the face of efficient vaccines. More research on these integral explanations is required, but the narrow albeit welcome success of the vaccine may be used to limit support for such a program.
Predictions of the causal entropic principle for environmental conditions of the universe
The causal entropic principle has been proposed as a superior alternative to the anthropic principle for understanding the magnitude of the cosmological constant. In this approach, the probability to create observers is assumed to be proportional to the entropy production \Delta S in a maximal causally connected region -- the causal diamond. We improve on the original treatment by better quantifying the entropy production due to stars, using an analytic model for the star formation history which accurately accounts for changes in cosmological parameters. We calculate the dependence of \Delta S on the density contrast Q=\delta\rho/\rho, and find that our universe is much closer to the most probable value of Q than in the usual anthropic approach and that probabilities are relatively weakly dependent on this amplitude. In addition, we make first estimates of the dependence of \Delta S on the baryon fraction and overall matter abundance. Finally, we also explore the possibility that decays of dark matter, suggested by various observed gamma ray excesses, might produce a comparable amount of entropy to stars.
Comment: RevTeX4, 13pp, 10 figures; v2. clarified introduction, added ref
