Statistically Preserved Structures and Anomalous Scaling in Turbulent Active Scalar Advection
The anomalous scaling of correlation functions in the turbulent statistics of
active scalars (like temperature in turbulent convection) is understood in
terms of an auxiliary passive scalar which is advected by the same turbulent
velocity field. While the odd-order correlation functions of the active and
passive fields differ, we propose that the even-order correlation functions are
the same to leading order (up to a trivial multiplicative factor). The leading
correlation functions are statistically preserved structures of the passive
scalar decaying problem, and therefore universality of the scaling exponents of
the even-order correlations of the active scalar is demonstrated.
Comment: 4 pages, 5 figures, submitted to Phys. Rev. Lett.
On control of singleton attractors in multiple Boolean networks: integer programming-based method
The Twelfth Asia Pacific Bioinformatics Conference (APBC 2014), Shanghai, China, 17-19 January 2014. In BMC Systems Biology, 2014, v. 8, Suppl. 1, article no. S
An efficient method of computing impact degrees for multiple reactions in metabolic networks with cycles
The impact degree is a measure of the robustness of a metabolic network against deletion of single or multiple reactions. Although such a measure is useful for mining important enzymes/genes, it was previously defined only for networks without cycles. In this paper, we extend the impact degree to metabolic networks containing cycles and develop a simple algorithm to calculate it. Furthermore, we improve this algorithm to reduce the computation time of the impact degree for deletions of multiple reactions. We applied our method to the metabolic network of E. coli, which includes reference pathways and consists of 3281 reaction nodes and 2444 compound nodes downloaded from the KEGG database, and calculated the distribution of the impact degree. The results of our computational experiments show that the improved algorithm is 18.4 times faster than the simple algorithm for deletion of reaction pairs and 11.4 times faster for deletion of reaction triplets. We also enumerate genes with high impact degrees for single and multiple reaction deletions. © 2011 The Institute of Electronics, Information and Communication Engineers.
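The cascade idea behind the impact degree can be illustrated with a toy propagation rule: a compound that has producers in the network becomes unavailable once all of its producers are inactivated, and a reaction is inactivated once any of its substrates becomes unavailable. This is a simplified sketch, not the paper's full definition (which also handles consumption-side effects and uses a more careful treatment of cycles); the reaction names and the graph encoding below are hypothetical.

```python
from collections import defaultdict

def impact_degree(reactions, deleted):
    """Number of additional reactions inactivated by deleting `deleted`.

    `reactions` maps a reaction name to a (substrates, products) pair.
    Toy rule: a compound with at least one producer in the network becomes
    unavailable when all of its producers are inactive; compounds with no
    producer are treated as external inputs and stay available.
    """
    producers = defaultdict(set)
    for name, (subs, prods) in reactions.items():
        for c in prods:
            producers[c].add(name)
    inactive = set(deleted)
    changed = True
    while changed:  # propagate to a fixed point (handles chains of knock-ons)
        changed = False
        for name, (subs, _) in reactions.items():
            if name in inactive:
                continue
            # Inactivate if any substrate has lost all of its producers.
            if any(producers[c] and not (producers[c] - inactive) for c in subs):
                inactive.add(name)
                changed = True
    return len(inactive) - len(deleted)
```

For a linear pathway R1 -> R2 -> R3, deleting R1 knocks out both downstream reactions; adding an alternative producer of the intermediate compound reduces the impact accordingly.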
Probability Density Function of Longitudinal Velocity Increment in Homogeneous Turbulence
Two conditional averages for the longitudinal velocity increment u_r of the
simulated turbulence are calculated: h(u_r) is the average of the increment of
the longitudinal Laplacian velocity field with u_r fixed, while g(u_r) is the
corresponding one of the square of the difference of the gradient of the
velocity field. Based on a physical argument, we suggest formulae for h
and g, which fit the 512^3 DNS data quite satisfactorily. The
predicted PDF is characterized by
(1) a Gaussian distribution for small amplitudes,
(2) an exponential distribution for large ones, and
(3) a prefactor before the exponential function for intermediate ones.
Comment: 4 pages, 4 figures, using RevTeX3.
Real-time Loss Estimation for Instrumented Buildings
Motivation. A growing number of buildings have been instrumented to measure and record
earthquake motions and to transmit these records to seismic-network data centers to be archived and
disseminated for research purposes. At the same time, sensors are growing smaller, less expensive to
install, and capable of sensing and transmitting other environmental parameters in addition to
acceleration. Finally, recently developed performance-based earthquake engineering methodologies
employ structural-response information to estimate probabilistic repair costs, repair durations, and
other metrics of seismic performance. The opportunity presents itself therefore to combine these
developments into the capability to estimate automatically in near-real-time the probabilistic seismic
performance of an instrumented building, shortly after the cessation of strong motion. We refer to
this opportunity as (near-) real-time loss estimation (RTLE).
Methodology. This report presents a methodology for RTLE for instrumented buildings. Seismic
performance is to be measured in terms of probabilistic repair cost, precise location of likely physical
damage, operability, and life-safety. The methodology uses the instrument recordings and a Bayesian
state-estimation algorithm called a particle filter to estimate the probabilistic structural response of
the system, in terms of member forces and deformations. The structural response estimate is then
used as input to component fragility functions to estimate the probabilistic damage state of structural
and nonstructural components. The probabilistic damage state can be used to direct structural
engineers to likely locations of physical damage, even if they are concealed behind architectural
finishes. The damage state is used with construction cost-estimation principles to estimate
probabilistic repair cost. It is also used as input to a quantified, fuzzy-set version of the FEMA-356
performance-level descriptions to estimate probabilistic safety and operability levels.
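The Bayesian state-estimation step named above is a particle filter. As an illustration of the generic predict/weight/resample cycle only, and not the report's structural implementation, here is a minimal bootstrap particle filter tracking a 1-D random-walk state; the noise levels and particle count are assumptions chosen for the toy example.

```python
import numpy as np

def particle_filter(observations, n_particles=500, process_std=0.1,
                    obs_std=0.5, seed=0):
    """Bootstrap particle filter for a 1-D random-walk state observed in noise.

    Toy stand-in for the report's estimator: the state here is a scalar,
    not member forces and deformations, and all noise levels are assumed.
    """
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)   # draw from a broad prior
    estimates = []
    for y in observations:
        # Predict: push each particle through the process model.
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # Weight: score particles by the Gaussian observation likelihood.
        w = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)
        w /= w.sum()
        # Resample: keep particles in proportion to their weights.
        particles = rng.choice(particles, size=n_particles, p=w)
        estimates.append(float(particles.mean()))
    return estimates
```

Fed a stream of noisy sensor readings, the filter's particle cloud concentrates on the state consistent with both the process model and the data, which is the same mechanism used here to infer member forces and deformations from a limited set of accelerometer channels.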
CUREE demonstration building. The procedure for estimating damage locations, repair costs, and
post-earthquake safety and operability is illustrated in parallel demonstrations by CUREE and
Kajima research teams. The CUREE demonstration is performed using a real 1960s-era, 7-story, nonductile
reinforced-concrete moment-frame building located in Van Nuys, California. The building is
instrumented with 16 channels at five levels: ground level, floors 2, 3, 6, and the roof. We used the
records obtained after the 1994 Northridge earthquake to hindcast performance in that earthquake.
The building is analyzed in its condition prior to the 1994 Northridge Earthquake. It is found that,
while hindcasting of the overall system performance level was excellent, prediction of detailed damage
locations was poor, implying that either actual conditions differed substantially from those shown on
the structural drawings, or inappropriate fragility functions were employed, or both. We also found
that Bayesian updating of the structural model using observed structural response above the base of
the building adds little information to the performance prediction. The reason is probably that
structural uncertainties have only secondary effect on performance uncertainty, compared with the
uncertainty in assembly damageability as quantified by their fragility functions. The implication is
that real-time loss estimation is not sensitive to structural uncertainties (saving costly multiple
simulations of structural response), and that real-time loss estimation does not benefit significantly
from installing measuring instruments other than those at the base of the building.
Kajima demonstration building. The Kajima demonstration is performed using a real 1960s-era
office building in Kobe, Japan. The building, a 7-story reinforced-concrete shearwall building, was not
instrumented in the 1995 Kobe earthquake, so instrument recordings are simulated. The building is
analyzed in its condition prior to the earthquake. It is found that, while hindcasting of the overall
repair cost was excellent, prediction of detailed damage locations was poor, again implying either that
as-built conditions differ substantially from those shown on structural drawings, or that
inappropriate fragility functions were used, or both. We find that the parameters of the detailed
particle filter needed significant tuning, which would be impractical in actual application. Work is
needed to prescribe values of these parameters in general.
Opportunities for implementation and further research. Because much of the cost of applying
this RTLE algorithm results from the cost of instrumentation and the effort of setting up a structural
model, the readiest application would be to instrumented buildings whose structural models are
already available, and to apply the methodology to important facilities. It would be useful to study
under what conditions RTLE would be economically justified. Two other interesting possibilities for
further study are (1) to update performance using readily observable damage; and (2) to quantify the
value of information for expensive inspections, e.g., if one inspects a connection with a modeled 50%
failure probability and finds that the connection is undamaged, is it necessary to examine one with 10%
failure probability?
Opportunistic Uses of the Traditional School Day Through Student Examination of Fitbit Activity Tracker Data
In large part due to the highly prescribed nature of the typical school day for children, efforts to design new interactions with technology have often focused on less-structured after-school clubs and other out-of-school environments. We argue that while the school day imposes serious restrictions, school routines can and should be opportunistically leveraged by designers and by youth. Specifically, wearable activity tracking devices open some new avenues for opportunistic collection of and reflection on data from the school day. To demonstrate this, we present two cases from an elementary statistics classroom unit we designed that intentionally integrated wearable activity trackers and child-created data visualizations. The first case involves a group of students comparing favored recess activities to determine which was more physically demanding. The second case is of a student who took advantage of her knowledge of teachers’ school day routines to test the reliability of a Fitbit activity tracker against a commercial mobile app.
Microscopic Description of Band Structure at Very Extended Shapes in the A ~ 110 Mass Region
Recent experiments have confirmed the existence of rotational bands in the A
~110 mass region with very extended shapes lying between super- and
hyper-deformation. Using the projected shell model, we make a first attempt to
describe quantitatively such a band structure in 108Cd. Excellent agreement is
achieved in the dynamic moment of inertia J(2) calculation. This allows us to
suggest the spin values for the energy levels, which are experimentally
unknown. It is found that at this large deformation, the sharply down-sloping
orbitals in the proton i_{13/2} subshell are responsible for the irregularity
in the experimental J(2), and the wave functions of the observed states have a
dominant component of two-quasiparticles from these orbitals. Measurement of
transition quadrupole moments and g-factors will test these findings, and thus
can provide a deeper understanding of the band structure at very extended
shapes.
Comment: 4 pages, 3 eps figures, final version accepted by Phys. Rev. C as a Rapid Communication.
Dynamic deferred data structuring
Let S be a set of n reals. We show how to process on-line r membership queries, insertions, and deletions in time O(r log(n + r) + (n + r) log r). This is optimal in the binary comparison model.
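The "deferred" idea can be sketched as a search tree built lazily: each query partitions only the buckets it actually touches, so the structuring work is charged to the queries that need it. Below is a minimal membership-only sketch under stated assumptions: insertions and deletions are omitted, and the pivot is chosen randomly, whereas median pivots would give the deterministic worst-case bound quoted above.

```python
import random

class Node:
    """A bucket of not-yet-structured keys, expanded lazily into a BST node."""
    def __init__(self, items):
        self.items = list(items)   # unsorted; present only while unexpanded
        self.pivot = None
        self.left = self.right = None

def member(node, x):
    # Walk the part of the tree that earlier queries have already built.
    while node.pivot is not None:
        if x == node.pivot:
            return True
        node = node.left if x < node.pivot else node.right
    if not node.items:
        return False
    # Expand this bucket one level: pick a pivot and partition once.
    # (Linear-time median selection here would yield the stated bounds.)
    node.pivot = random.choice(node.items)
    node.left = Node(y for y in node.items if y < node.pivot)
    node.right = Node(y for y in node.items if y > node.pivot)
    node.items = None
    if x == node.pivot:
        return True
    return member(node.left if x < node.pivot else node.right, x)
```

Repeated queries to nearby keys reuse the partitions built by earlier queries, so a batch of r queries never sorts more of S than the query paths require.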
Quasinormal Modes of Dirty Black Holes
Quasinormal mode (QNM) gravitational radiation from black holes is expected
to be observed in a few years. A perturbative formula is derived for the shifts
in both the real and the imaginary part of the QNM frequencies away from those
of an idealized isolated black hole. The formulation provides a tool for
understanding how the astrophysical environment surrounding a black hole, e.g.,
a massive accretion disk, affects the QNM spectrum of gravitational waves. We
show, in a simple model, that the perturbed QNM spectrum can have interesting
features.
Comment: 4 pages. Published in PR
Perturbative Approach to the Quasinormal Modes of Dirty Black Holes
Using a recently developed perturbation theory for quasinormal modes (QNM's),
we evaluate the shifts in the real and imaginary parts of the QNM frequencies
due to a quasi-static perturbation of the black hole spacetime. We show the
perturbed QNM spectrum of a black hole can have interesting features using a
simple model based on the scalar wave equation.
Comment: Published in PR