Secondary Structures in Long Compact Polymers
Compact polymers are self-avoiding random walks which visit every site on a
lattice. This polymer model is used widely for studying statistical problems
inspired by protein folding. One difficulty with using compact polymers to
perform numerical calculations is generating a sufficiently large number of
randomly sampled configurations. We present a Monte-Carlo algorithm which
uniformly samples compact polymer configurations in an efficient manner
allowing investigations of chains much longer than previously studied. Chain
configurations generated by the algorithm are used to compute statistics of
secondary structures in compact polymers. We determine the fraction of monomers
participating in secondary structures, and show that it is self-averaging in
the long chain limit and strictly less than one. Comparison with results for
lattice models of open polymer chains shows that compact chains are
significantly more likely to form secondary structure. (Comment: 14 pages, 14 figures)
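The compact configurations this abstract refers to are site-filling (Hamiltonian) self-avoiding walks on a lattice. As a minimal illustration of the object being sampled, the sketch below enumerates them by brute force on a tiny grid; this is not the paper's Monte Carlo sampler, which is needed precisely because exhaustive enumeration like this becomes intractable for long chains.

```python
from itertools import product

def hamiltonian_paths(w, h):
    """Enumerate compact self-avoiding walks (Hamiltonian paths)
    on a w x h square lattice, counting each direction separately."""
    n = w * h
    paths = []

    def dfs(x, y, visited, path):
        if len(path) == n:                      # walk has filled the lattice
            paths.append(list(path))
            return
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < w and 0 <= ny < h and (nx, ny) not in visited:
                visited.add((nx, ny))
                path.append((nx, ny))
                dfs(nx, ny, visited, path)
                path.pop()
                visited.remove((nx, ny))

    for sx, sy in product(range(w), range(h)):  # try every starting site
        dfs(sx, sy, {(sx, sy)}, [(sx, sy)])
    return paths

paths = hamiltonian_paths(3, 3)
print(len(paths))  # 40 directed compact walks on the 3x3 lattice
```

The count grows super-exponentially with lattice size, which is why uniform Monte Carlo sampling, rather than enumeration, is required for the long chains studied in the paper.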
The Single-Angle Plane-Wave Spectral Response of One-Dimensional Photonic Crystal Structures
The multiple-incident-angle transmittances or reflectances of fabricated 1-D photonic crystal (PC) structures are measured. Regularization methods are applied to these measurements to determine the single-angle plane-wave spectral response of the structure.
TECHNICAL BASIS DOCUMENT FOR CRITERIA AND PROCESSES FOR THE CERTIFICATION OF NON-RADIOACTIVE HAZARDOUS AND NON-HAZARDOUS WASTES
This Technical Basis Document (TBD) identifies how the values presented in the "Criteria and Processes for the Certification of Non-Radioactive Hazardous and Non-Hazardous Wastes" were derived. The original moratorium document (UCRL-AR-109662) applied only to hazardous wastes generated in Radioactive Materials Management Areas (RMMAs) that were destined for off-site Treatment, Storage, and Disposal Facilities (TSDFs) that did not possess a radioactive materials license. Since its inception, the original moratorium document has become the de facto free-release procedure for potentially volumetrically contaminated materials of all varieties. This was promulgated in a February 4, 1992 memo from Jill Lytle, Deputy Assistant Secretary for Waste Management, entitled "Update: Moratorium on Shipment of Potentially Radioactive Hazardous and Toxic Wastes". In this memo, Ms. Lytle states, "While the moratorium does not apply to non-hazardous/non-TSCA solid wastes and non-waste materials, the same release criteria apply". Over the past few years, a considerable quantity of data and operating experience has been developed, which has shown the limitations of UCRL-AR-109662. The original Moratorium is out of date, and many of the organizations and procedures that it references are no longer in existence. In addition, the original document lacked sufficient detail to be used as an LLNL-wide procedure for free release, as it only addressed hazardous wastes. The original moratorium document also used highly optimistic "action limits", which were based on theoretically achievable minimum detectable activity (MDA) levels for various matrices.
Years of operating experience have shown that these action limits are simply not achievable for certain analyses in certain matrices, either due to limitations in sample size or underestimates of the contribution of naturally occurring radioactive materials, resulting in the mischaracterization of samples of these matrices as radioactive when no radioactivity was added by LLNL operations. The new moratorium document updates the organizations involved in Moratorium Declarations, specifically addresses non-hazardous waste matrices, and allows for alternative types of analysis. The new moratorium document formalizes the process of release of potentially volumetrically contaminated waste materials from radiological controls at LLNL.
Spatial distributions of perchloroethylene reactive transport parameters in the Borden Aquifer
We determined the descriptive statistical and spatial geostatistical properties of the perchloroethene ln Kd and the ln k of a 1.5 m thick by 10 m horizontal transect of the Borden aquifer near the location of the Stanford-Waterloo (SW) tracer experiment. The ln Kd distribution is not normal and is right-skewed because of a few high values that are localized in two regions of the transect. In contrast, the ln k data can be characterized by a normal distribution. A linear regression of ln Kd on ln k yields a statistically significant positive correlation, also shown at small lags in the cross-correlogram. No significant vertical or horizontal trend in the ln Kd data was detected. The semivariogram ranges of ln k and ln Kd differ from one another in the vertical direction (0.33 ± 0.06 m and 0.20 ± 0.04 m, respectively) and are much less than the horizontal ranges (a few meters). Despite significant effort, the horizontal range of ln Kd remains poorly characterized because of limitations of the sample locations. Many of the characteristics described above do not match those assumed in prior theoretical studies that examined the importance of various aquifer characteristics on SW tracer transport. We suggest that there is knowledge to be gained by revisiting the conclusions of these prior studies in light of the new information presented here.
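The semivariogram ranges quoted above are obtained by fitting models to empirical semivariograms of the ln Kd and ln k fields. As a sketch of the standard method-of-moments estimator (the paper's actual binning and fitting procedure is not specified here, so the distance-binning scheme below is an assumption):

```python
import numpy as np

def empirical_semivariogram(coords, values, lag_edges):
    """Method-of-moments estimator: gamma(h) = (1 / 2N(h)) * sum over pairs
    separated by roughly h of (z_i - z_j)^2, using simple distance bins."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    n = len(values)
    # all pairwise separation distances and squared increments
    dist = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    sqdiff = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(n, k=1)              # count each pair once
    dist, sqdiff = dist[iu], sqdiff[iu]
    centers, gamma = [], []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        in_bin = (dist >= lo) & (dist < hi)
        if in_bin.any():
            centers.append(0.5 * (lo + hi))
            gamma.append(0.5 * sqdiff[in_bin].mean())
    return np.array(centers), np.array(gamma)

# tiny worked example: a perfect linear trend along one coordinate
centers, gamma = empirical_semivariogram(
    [[0.0], [1.0], [2.0]], [0.0, 1.0, 2.0], [0.5, 1.5, 2.5])
```

For the linear-trend example the semivariance grows with lag (0.5 at lag 1, 2.0 at lag 2) and never levels off; a field with a finite range, like the ln k and ln Kd data above, instead plateaus at the sill beyond the range.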
Antireflection coatings from analogy between electron scattering and spin precession
We use the analogy between scattering of a wave from a potential and the precession of a spin-half particle in a magnetic field to gain insight into the design of an antireflection coating for electrons in a semiconductor superlattice. It is shown that the classic recipes derived for optics are generally not applicable due to the different dispersion law for electrons. Using the stability conditions, we show that a Poisson distribution of impedance steps is a better approximation than a Gaussian distribution. Examples are given of filters with average transmissivity exceeding 95% over an allowed band.
Tube Models for Rubber-Elastic Systems
In the first part of the paper we show that the constraining potentials
introduced to mimic entanglement effects in Edwards' tube model and Flory's
constrained junction model are diagonal in the generalized Rouse modes of the
corresponding phantom network. As a consequence, both models can formally be
solved exactly for arbitrary connectivity using the recently introduced
constrained mode model. In the second part, we solve a double tube model for
the confinement of long paths in polymer networks which is partially due to
crosslinking and partially due to entanglements. Our model describes a
non-trivial crossover between the Warner-Edwards and the Heinrich-Straube tube
models. We present results for the macroscopic elastic properties as well as
for the microscopic deformations including structure factors. (Comment: 15 pages, 8 figures; Macromolecules, in press)
Frustrated H-Induced Instability of Mo(110)
Using helium atom scattering, Hulpke and Lüdecke recently observed a giant
phonon anomaly for the hydrogen covered W(110) and Mo(110) surfaces. An
explanation which is able to account for this and other experiments is still
lacking. Below we present density-functional theory calculations of the atomic
and electronic structure of the clean and hydrogen-covered Mo(110) surfaces.
For the full adsorbate monolayer the calculations provide evidence for a strong
Fermi surface nesting instability. This explains the observed anomalies and
resolves the apparent inconsistencies of different experiments. (Comment: 4 pages, 2 figures, submitted to PR)
Massively parallel computing on an organic molecular layer
Current computers operate at enormous speeds of ~10^13 bits/s, but their
principle of sequential logic operation has remained unchanged since the 1950s.
Though our brain is much slower on a per-neuron basis (~10^3 firings/s), it is
capable of remarkable decision-making based on the collective operations of
millions of neurons at a time in ever-evolving neural circuitry. Here we use
molecular switches to build an assembly where each molecule communicates, like
neurons, with many neighbors simultaneously. The assembly's ability to
reconfigure itself spontaneously for a new problem allows us to realize
conventional computing constructs like logic gates and Voronoi decompositions,
as well as to reproduce two natural phenomena: heat diffusion and the mutation
of normal cells to cancer cells. This is a shift from the current static
computing paradigm of serial bit-processing to a regime in which a large number
of bits are processed in parallel in dynamically changing hardware. (Comment: 25 pages, 6 figures)
Expected Limits on the Ocean Acidification Buffering Potential of a Temperate Seagrass Meadow
Ocean acidification threatens many marine organisms, especially marine calcifiers. The only global-scale solution to ocean acidification remains rapid reduction in CO2 emissions. Nevertheless, interest in localized mitigation strategies has grown rapidly because of the recognized threat ocean acidification poses to natural communities, including ones important to humans. Protection of seagrass meadows has been considered as a possible approach for localized mitigation of ocean acidification due to their large standing stocks of organic carbon and high productivity. Yet much work remains to constrain the magnitudes and timescales of potential buffering effects from seagrasses. We developed a biogeochemical box model to better understand the potential for a temperate seagrass meadow to locally mitigate the effects of ocean acidification. We then parameterized the model using data from Tomales Bay, an inlet on the coast of California, USA, which supports a major oyster farming industry. We conducted a series of month-long model simulations to characterize processes that occur during summer and winter. We found that average pH in the seagrass meadows was typically within 0.04 units of the pH of the primary source waters into the meadow, although we did find occasional periods (hours) when seagrass metabolism may modify the pH by up to ±0.2 units. Tidal phasing relative to the diel cycle modulates localized pH buffering within the seagrass meadow such that maximum buffering occurs during periods of the year with midday low tides. Our model results suggest that seagrass metabolism in Tomales Bay would not provide long-term ocean acidification mitigation. However, we emphasize that our model results may not hold in meadows where assumptions about depth-averaged net production and seawater residence time within the seagrass meadow differ from our model assumptions.
Our modeling approach provides a framework that is easily adaptable to other seagrass meadows in order to evaluate the extent of their individual buffering capacities. Regardless of their ability to buffer ocean acidification, seagrass meadows maintain many critically important ecosystem goods and services that will be increasingly important as human impacts on coastal ecosystems grow.
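The flushing argument above can be caricatured in a few lines: a pH anomaly generated by diel metabolism inside the meadow relaxes back toward the source water on the residence timescale. The sketch below is a deliberately simplified linear stand-in, not the authors' biogeochemical box model, and the parameter values (tau_h, amp) are hypothetical.

```python
import math

def ph_anomaly_series(tau_h=12.0, amp=0.05, days=5, dt=0.1):
    """Toy box model: dA/dt = -A/tau + m(t), where A is the pH anomaly
    relative to source water, tau_h is the flushing (residence) time in
    hours, and m(t) is a diel sinusoid standing in for net community
    metabolism. Parameter values are illustrative, not from the study."""
    steps = int(days * 24 / dt)
    A, series = 0.0, []
    for k in range(steps):
        t = k * dt                                   # hours
        m = amp * math.sin(2 * math.pi * t / 24.0)   # diel forcing
        A += dt * (-A / tau_h + m)                   # explicit Euler step
        series.append(A)
    return series

anomaly = ph_anomaly_series()
# steady-state amplitude is amp * tau / sqrt(1 + (2*pi*tau/24)^2):
# flushing damps the metabolic signal rather than letting it accumulate
```

Because the response amplitude is bounded by the forcing times a factor set by the residence time, any sustained buffering is limited by tidal exchange, which is the qualitative behavior the Tomales Bay simulations illustrate in full biogeochemical detail.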
A Pilot Study with a Novel Setup for Collaborative Play of the Humanoid Robot KASPAR with children with autism
This article describes a pilot study in which a novel experimental setup, involving an autonomous humanoid robot, KASPAR, participating in a collaborative, dyadic video game, was implemented and tested with children with autism, all of whom had impairments in playing socially and communicating with others. The children alternated between playing the collaborative video game with a neurotypical adult and playing the same game with the humanoid robot, being exposed to each condition twice. The equipment and experimental setup were designed to observe whether the children would engage in more collaborative behaviours while playing the video game and interacting with the adult than while performing the same activities with the humanoid robot. The article describes the development of the experimental setup and its first evaluation in a small-scale exploratory pilot study. The purpose of the study was to gain experience with the operational limits of the robot as well as the dyadic video game, to determine what changes should be made to the systems, and to gain experience with analyzing the data from this study in order to conduct a more extensive evaluation in the future. Based on our observations of the children's experiences in playing the cooperative game, we determined that while the children enjoyed both playing the game and interacting with the robot, the game should be made simpler to play as well as more explicitly collaborative in its mechanics. Also, the robot should be more explicit in its speech as well as more structured in its interactions.
Results show that the children found the activity to be more entertaining, appeared more engaged in playing, and displayed better collaborative behaviours with their partners (for the purposes of this article, 'partner' refers to the human or robotic agent that interacts with the children with autism; we are not using the term's other meanings, which refer to specific relationships or emotional involvement between two individuals) in the second sessions of playing with human adults than during their first sessions. One way of explaining these findings is that the children's intermediary play session with the humanoid robot impacted their subsequent play session with the human adult. However, another, longer and more thorough study would have to be conducted in order to better interpret these findings. Furthermore, although the children with autism were more interested in and entertained by the robotic partner, the children showed more examples of collaborative play and cooperation while playing with the human adult.