2,493 research outputs found
Helicoverpa armigera nucleopolyhedrovirus occlusion-derived virus-associated protein, HA100, affects oral infectivity in vivo but not virus replication in vitro
ORF100 (ha100) of Helicoverpa armigera nucleopolyhedrovirus (HearNPV) has been reported as one of the unique genes of group II alphabaculoviruses, encoding a protein located in the occlusion-derived virus (ODV) envelope and nucleocapsid. The protein consists of 510 aa with a predicted mass of 58.1 kDa and is a homologue of poly(ADP-ribose) glycohydrolase in eukaryotes. Western blot analysis detected a 60 kDa band in HearNPV-infected HzAM1 cells starting at 18 h post-infection. Transient expression of GFP-fused HA100 in HzAM1 cells resulted in cytoplasmic localization of the protein, but after superinfection with HearNPV, GFP-fused HA100 localized in the nucleus. To study the function of HA100 further, an ha100-null virus was constructed using bacmid technology. Viral one-step growth curve analyses showed that the ha100-null virus had budded virus production kinetics similar to those of the parental virus. Electron microscopy revealed that deletion of ha100 did not alter the morphology of ODVs or occlusion bodies (OBs). However, bioassays in larvae showed that the 50% lethal concentration (LC50) of ha100-null OBs was significantly higher than that of parental OBs, and the median lethal time (LT50) of ha100-null OBs was about 24 h longer than that of the control virus. These results indicate that HA100 is not essential for virus replication in vitro but significantly affects the oral infectivity of OBs in host insects, suggesting that the association of HA100 with the ODV contributes to the infectivity of OBs in vivo.
Violent demonstrations
Abstract. An automatic human shape-motion analysis method based on a fusion architecture is proposed for human action recognition in videos. Robust shape-motion features are extracted from the detection and tracking of human points. The features are combined within the Transferable Belief Model (TBM) framework for action recognition. The TBM-based modelling and fusion process makes it possible to take into account the imprecision, uncertainty and conflict inherent in the features. Action recognition is performed by a multilevel analysis. The action sequencing is exploited to extract feedback information in order to improve tracking results. The system is tested on real videos of athletics meetings to recognize four types of jumps: high jump, pole vault, triple jump and long jump.
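As an illustration of the TBM fusion step, two basic belief assignments can be combined with the unnormalized conjunctive rule, in which mass assigned to the empty set quantifies the conflict between sources. The hypothesis names and mass values below are invented for the example, not taken from the paper:

```python
from itertools import product

def conjunctive_combine(m1, m2):
    """Unnormalized conjunctive rule of the Transferable Belief Model.
    m1, m2 map frozensets of hypotheses to masses; mass landing on the
    empty set measures conflict between the two sources."""
    m12 = {}
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        c = a & b
        m12[c] = m12.get(c, 0.0) + wa * wb
    return m12

# Hypothetical action hypotheses (illustrative only).
HIGH, POLE = frozenset({"high_jump"}), frozenset({"pole_vault"})
BOTH = HIGH | POLE

m_shape = {HIGH: 0.6, BOTH: 0.4}   # shape feature: leans toward high jump
m_motion = {POLE: 0.3, BOTH: 0.7}  # motion feature: mostly uncommitted

m = conjunctive_combine(m_shape, m_motion)
# m[frozenset()] = 0.6 * 0.3 = 0.18 is the conflict between the features
```

Keeping the conflict mass explicit, rather than renormalizing it away as in Dempster's rule, is what lets a TBM-based system flag and exploit disagreement between features.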
Monte Carlo Hamiltonian - From Statistical Physics to Quantum Theory
Monte Carlo techniques have been widely employed in statistical physics as
well as in quantum theory in the Lagrangian formulation. However, in some areas
of application to quantum theories computational progress has been slow. Here
we present a recently developed approach: the Monte Carlo Hamiltonian method,
designed to overcome the difficulties of the conventional approach.
Comment: StatPhys-Taiwan-1999, 6 pages, LaTeX using elsart.cl
Adjustment of the electric current in pulsar magnetospheres and origin of subpulse modulation
The subpulse modulation of pulsar radio emission demonstrates that the plasma
flow in the open field line tube breaks into isolated narrow streams. I
propose a model that attributes the formation of streams to the process of
electric current adjustment in the magnetosphere. A mismatch between the
magnetospheric current distribution and the current injected by the polar cap
accelerator gives rise to reverse plasma flows in the magnetosphere. The
reverse flow shields the electric field in the polar gap and thus shuts down
the plasma production process. I assume that a circulating system of streams
is formed such that the upward streams are produced in narrow gaps separated
by downward streams. The electric drift is small in this model because the
potential drop in narrow gaps is small. The gaps have to drift: by the time a
downward stream reaches the stellar surface and shields the electric field,
the corresponding gap must have shifted. The transverse size of the streams
is determined by the condition that the potential drop in the gaps is
sufficient for pair production. This yields a stream radius of roughly 10% of
the polar cap radius, which makes it possible to fit the observed
morphological features, such as the "carousel" with 10-20 subbeams and the
system of a core and two nested cone beams.
Comment: 8 pages, 1 figure
(In)finite extent of stationary perfect fluids in Newtonian theory
For stationary, barotropic fluids in Newtonian gravity we give simple
criteria on the equation of state and the "law of motion" which guarantee
finite or infinite extent of the fluid region (providing a priori estimates for
the corresponding stationary Newton-Euler system). Under more restrictive
conditions, we can also exclude the presence of "hollow" configurations. Our
main result, which does not assume axial symmetry, uses the virial theorem as
the key ingredient and generalises a known result in the static case. In the
axially symmetric case stronger results are obtained and examples are
discussed.
Comment: Corrections according to the version accepted by Ann. Henri Poincaré
Strong duality in conic linear programming: facial reduction and extended duals
The facial reduction algorithm of Borwein and Wolkowicz and the extended dual
of Ramana provide a strong dual for the conic linear program (P) in the absence of any constraint qualification. The facial
reduction algorithm solves a sequence of auxiliary optimization problems to
obtain such a dual. Ramana's dual is applicable when (P) is a semidefinite
program (SDP) and is an explicit SDP itself. Ramana, Tuncel, and Wolkowicz
showed that these approaches are closely related; in particular, they proved
the correctness of Ramana's dual using certificates from a facial reduction
algorithm.
Here we give a clear and self-contained exposition of facial reduction and of
extended duals, and generalize Ramana's dual:
-- we state a simple facial reduction algorithm and prove its correctness;
and
-- building on this algorithm we construct a family of extended duals when
the underlying cone is a {\em nice} cone. This class of cones includes the semidefinite cone
and other important cones.
Comment: A previous version of this paper appeared as "A simple derivation of a facial reduction algorithm and extended dual systems", technical report, Columbia University, 2000, available from http://www.unc.edu/~pataki/papers/fr.pdf Jonfest, a conference in honor of Jonathan Borwein's 60th birthday, 201
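As background, one step of facial reduction for a conic program in standard primal form can be sketched as follows (generic notation, not quoted from the paper):

```latex
% Conic linear program in standard primal form:
\[
  (P)\quad \inf \; \langle c, x \rangle
  \quad \text{s.t.} \quad A x = b, \; x \in K.
\]
% A facial reduction step finds a reducing certificate $y$ with
\[
  A^{*} y \in K^{*}, \qquad \langle b, y \rangle = 0, \qquad A^{*} y \neq 0.
\]
% Every feasible $x$ then satisfies
% $\langle x, A^{*} y \rangle = \langle A x, y \rangle = \langle b, y \rangle = 0$,
% so $x$ lies in the smaller face $K \cap (A^{*} y)^{\perp}$, which may
% replace $K$. Iterating until no such $y$ exists yields an equivalent
% program for which strong duality holds without a constraint qualification.
```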
Multifractal characterisation of length sequences of coding and noncoding segments in a complete genome
The coding and noncoding length sequences constructed from a complete genome
are characterised by multifractal analysis. The dimension spectrum D_q and
its derivative, the 'analogous' specific heat C_q, are calculated for the
coding and noncoding length sequences of bacteria, where q is the moment
order of the partition sum of the sequences. From the shape of the D_q
and C_q curves, it is seen that there exists a clear difference between the
coding/noncoding length sequences of all organisms considered and a completely
random sequence. The complexity of noncoding length sequences is higher than
that of coding length sequences for bacteria. Almost all C_q curves for
coding length sequences are flat, so their multifractality is small, whereas
almost all C_q curves for noncoding length sequences are multifractal-like.
We propose to characterise the bacteria according to the types of the C_q
curves of their noncoding length sequences.
Comment: 15 pages with 5 figures, Latex, Accepted for publication in Physica
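For reference, the standard quantities of multifractal analysis that this abstract refers to can be written as follows (textbook definitions, assumed rather than quoted from the paper):

```latex
% Partition sum of the normalised length sequence at box size \epsilon,
% with q the moment order:
\[
  Z_q(\epsilon) \;=\; \sum_i p_i(\epsilon)^{\,q},
  \qquad
  \tau(q) \;=\; \lim_{\epsilon \to 0} \frac{\ln Z_q(\epsilon)}{\ln \epsilon}.
\]
% Dimension spectrum:
\[
  D_q \;=\; \frac{\tau(q)}{q - 1} \qquad (q \neq 1).
\]
% "Analogous" specific heat (second derivative of \tau, in discrete form):
\[
  C_q \;=\; -\frac{\partial^{2} \tau(q)}{\partial q^{2}}
        \;\approx\; 2\,\tau(q) - \tau(q+1) - \tau(q-1).
\]
```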
proovframe: frameshift-correction for long-read (meta)genomics
Long-read sequencing technologies hold great promise for the genomic analysis of complex samples such as microbial communities. Yet, despite improving accuracy, basic gene prediction on long-read data is still often impaired by frameshifts resulting from small indels. Consensus polishing, using either complementary short reads or, to a lesser extent, the long reads themselves, can mitigate this effect but requires universally high sequencing depth, which is difficult to achieve in complex samples where the majority of community members are rare. Here we present proovframe, a software tool implementing an alternative approach to overcoming frameshift errors in long-read assemblies and raw long reads. We utilize protein-to-nucleotide alignments against reference databases to pinpoint indels in contigs or reads and correct them by deleting or inserting 1-2 bases, thereby conservatively restoring reading-frame fidelity in aligned regions. Using simulated and real-world benchmark data, we show that proovframe performs comparably to short-read-based polishing on assembled data, works well with remote protein homologs, and can even be applied to raw reads directly. Together, our results demonstrate that protein-guided frameshift correction significantly improves the analyzability of long-read data, both in combination with and as an alternative to common polishing strategies. Proovframe is available from https://github.com/thackl/proovframe
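The correction step described above, once indel positions have been identified from a protein-to-nucleotide alignment, amounts to small edits that restore the reading frame. A minimal sketch of that idea (the edit format and function name are invented for illustration and are not proovframe's actual interface):

```python
def correct_frameshifts(seq, edits):
    """Apply frameshift corrections to a nucleotide sequence.

    `edits` is a list of (position, op, n) tuples derived from a
    protein-to-nucleotide alignment (hypothetical format):
      op == "ins": insert n placeholder bases ("N") to restore the frame
      op == "del": delete n bases
    with 1 <= n <= 2. Edits are applied right-to-left so that earlier
    positions stay valid as the sequence length changes.
    """
    s = list(seq)
    for pos, op, n in sorted(edits, reverse=True):
        if op == "ins":
            s[pos:pos] = ["N"] * n
        elif op == "del":
            del s[pos:pos + n]
    return "".join(s)

# A read with a 1-base deletion at index 6 relative to the aligned protein:
read = "ATGGCAGGTTGA"
fixed = correct_frameshifts(read, [(6, "ins", 1)])  # "ATGGCANGGTTGA"
```

Inserting "N" rather than guessing a base keeps the correction conservative: the frame is restored for downstream gene prediction without fabricating sequence content.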
Hybrid Monte Carlo with Fat Link Fermion Actions
The use of APE smearing or other blocking techniques in lattice fermion
actions can provide many advantages. Many variants of these fat link
actions are currently used in lattice QCD, such as FLIC fermions. The FLIC fermion
formalism makes use of the APE blocking technique in combination with a
projection of the blocked links back into the special unitary group. This
reunitarisation is often performed using an iterative maximisation of a gauge
invariant measure. This technique is not differentiable with respect to the
gauge field and thus prevents the use of standard Hybrid Monte Carlo simulation
algorithms. The use of an alternative projection technique circumvents this
difficulty and allows the simulation of dynamical fat link fermions with
standard HMC and its variants. The necessary equations of motion for FLIC
fermions are derived, and some initial simulation results are presented. The
technique is more general, however, and is straightforwardly applicable to other
smearing techniques or fat link actions.
Law of Genome Evolution Direction: Coding Information Quantity Grows
The problem of the directionality of genome evolution is studied. Based on
an analysis of the C-value paradox and the evolution of genome size, we
propose that the function-coding information quantity of a genome always
grows in the course of evolution, through sequence duplication, expansion of
the code, and gene transfer from outside. The function-coding information
quantity of a genome consists of two parts: the p-coding information
quantity, which encodes functional proteins, and the n-coding information
quantity, which encodes functional elements other than amino acid sequences.
The evidence for this evolutionary law of function-coding information
quantity is presented. The need for new functions is the motive force behind
the expansion of coding information quantity, and this expansion is the way
a species achieves functional innovation and extension. Thus, the increase
of the coding information quantity of a genome is a measure of acquired new
function, and it determines the directionality of genome evolution.
Comment: 16 pages