UVA Assisted 4-Thiothymidine for Cancer Treatment
This article reviews the development of UVA-assisted 4-thiothymidine analogues as a novel cancer therapy. First, the key points of the synthetic chemistry, photochemistry, and cellular toxicity of 4-thiothymidine are summarized. Because the chemical structure of 4-thiothymidine closely resembles that of its parent nucleoside thymidine, 4-thiothymidine can be readily incorporated into cellular DNA and, with the help of thymidine kinase, preferentially into cancerous DNA. Unlike thymidine, 4-thiothymidine absorbs strongly in the UVA region (the longer wavelengths of UV light), so UVA-assisted 4-thiothymidine offers an effective cancer treatment. The underlying mechanisms of action of 4-thiothymidine/UVA are discussed, and this approach is compared with the commonly used photodynamic therapy. The interactions between 4-thiothymidine and human serum albumin are then introduced. Finally, a short conclusion on past efforts and a brief outlook on future work in this exciting research field are given.
A General Theory for Direct Quantitative Analysis of Antigen
A theory for the direct quantitative analysis of an antigen is proposed. It is
based on a potential homogeneous immunoreaction system and establishes an
equation describing the concentration change of the antigen-antibody complex.
A maximum point is found in the concentration profile of the complex,
which can be used to calculate the concentration of the antigen. An
experimental scheme was designed for a commercial time-resolved
fluoroimmunoassay kit for HBsAg, which is based on a heterogeneous
immunoreaction. The results showed that the theory is practically
applicable. Comment: 7 pages, 2 figures
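The core idea, a complex concentration profile with an interior maximum, can be illustrated with a toy kinetic model. The rate law and the constants k_on and k_decay below are assumptions for illustration only, not taken from the paper:

```python
# Hypothetical sketch of the homogeneous scheme: the antigen-antibody
# complex forms by association and is then lost irreversibly, so its
# concentration rises to a maximum and falls again.
def simulate(ag0, ab0, k_on=1.0, k_decay=0.3, dt=0.01, steps=2000):
    ag, ab, c = ag0, ab0, 0.0
    profile = []
    for _ in range(steps):
        formed = k_on * ag * ab * dt   # association step
        lost = k_decay * c * dt        # irreversible loss of complex
        ag -= formed
        ab -= formed
        c += formed - lost
        profile.append(c)
    return profile

profile = simulate(ag0=1.0, ab0=0.5)
peak = max(range(len(profile)), key=lambda i: profile[i])
# An interior maximum like this is the kind of feature the theory
# reads off to recover the antigen concentration.
```

With any such rise-and-fall kinetics, the location and height of the maximum depend on the initial antigen concentration, which is what makes the maximum usable for quantitation.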
Lepton-Jet Correlations in Deep Inelastic Scattering at the Electron-Ion Collider.
We propose the lepton-jet correlation in deep inelastic scattering as a unique tool for the tomography of nucleons and nuclei at the electron-ion collider (EIC). The azimuthal angular correlation between the final-state lepton and jet depends on the transverse-momentum-dependent quark distributions. We take the example of single transverse spin asymmetries to show the sensitivity to the quark Sivers function. When the correlation is studied in lepton-nucleus collisions, transverse momentum broadening effects can be used to explore cold nuclear matter effects. These features make lepton-jet correlations an important new hard probe at the EIC.
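The observable at the heart of this proposal is just the azimuthal angle difference between the outgoing lepton and the jet. A minimal sketch (the momenta below are invented for illustration): at Born level the pair is back-to-back, so the difference peaks near pi, and transverse-momentum effects shift events away from that peak.

```python
import math

def delta_phi(phi_lepton, phi_jet):
    """Fold the azimuthal angle difference into [0, pi]."""
    dphi = abs(phi_lepton - phi_jet) % (2 * math.pi)
    return dphi if dphi <= math.pi else 2 * math.pi - dphi

def azimuth(px, py):
    """Azimuthal angle of a transverse momentum (px, py)."""
    return math.atan2(py, px)

# A nearly back-to-back lepton-jet pair (toy momenta in GeV):
lep = azimuth(30.0, 5.0)
jet = azimuth(-29.0, -6.0)
print(delta_phi(lep, jet))  # close to pi
```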
Double-Real-Virtual and Double-Virtual-Real Corrections to the Three-Loop Thrust Soft Function
We compute the double-real-virtual (RRV) and
double-virtual-real (VVR) soft contributions to the thrust/zero-jettiness event
shape. The result clears up one of the most stubborn obstacles toward the
complete three-loop thrust soft function. The results presented
here serve as the key input to realize the next-to-next-to-next-to-leading
logarithmic prime (N3LL') and even the
next-to-next-to-next-to-next-to-leading logarithmic (N4LL) resummation of
the thrust event shape. The obtained results also constitute important
ingredients of the N-jettiness-subtraction scheme at
next-to-next-to-next-to-leading order (N3LO). Comment: Updated version to simplify the results. Now the complete VRR results
are presented by including the previously missing channel. Full agreement was found
with a recent calculation in arXiv:2401.0524
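For context, the thrust event shape whose soft function these corrections feed into has the standard definition (a reminder, not part of this abstract):

```latex
T \;=\; \max_{\hat n}\,
    \frac{\sum_i \lvert \vec p_i \cdot \hat n \rvert}
         {\sum_i \lvert \vec p_i \rvert},
\qquad
\tau \;\equiv\; 1 - T ,
```

where the sum runs over final-state momenta and $\hat n$ is the thrust axis. In the dijet limit $\tau \to 0$, large logarithms of $\tau$ appear at every order, which is why the soft function is needed for the resummation discussed above.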
Semantic Object Parsing with Local-Global Long Short-Term Memory
Semantic object parsing is a fundamental task for understanding objects in
detail in the computer vision community, where incorporating multi-level
contextual information is critical for achieving fine-grained pixel-level
recognition. Prior methods often leverage the contextual information through
post-processing predicted confidence maps. In this work, we propose a novel
deep Local-Global Long Short-Term Memory (LG-LSTM) architecture to seamlessly
incorporate short-distance and long-distance spatial dependencies into the
feature learning over all pixel positions. In each LG-LSTM layer, local
guidance from neighboring positions and global guidance from the whole image
are imposed on each position to better exploit complex local and global
contextual information. Individual LSTMs for distinct spatial dimensions are
also utilized to intrinsically capture various spatial layouts of semantic
parts in the images, yielding distinct hidden and memory cells of each position
for each dimension. In our parsing approach, several LG-LSTM layers are stacked
and appended to the intermediate convolutional layers to directly enhance
visual features, allowing network parameters to be learned in an end-to-end
way. The long chains of sequential computation by stacked LG-LSTM layers also
enable each pixel to sense a much larger region for inference benefiting from
the memorization of previous dependencies in all positions along all
dimensions. Comprehensive evaluations on three public datasets demonstrate
the significant superiority of our LG-LSTM over other state-of-the-art methods. Comment: 10 pages
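The local-global guidance idea can be sketched in a few lines. The dimensions and the plain-numpy LSTM cell below are illustrative assumptions, not the paper's architecture: the input at each pixel concatenates its visual feature, the hidden states of its 4 neighbours (local guidance), and a pooled summary of all positions (global guidance).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell(x, h, c, W):
    """One LSTM step; W maps the concatenated [x; h] to the four gates."""
    z = W @ np.concatenate([x, h])
    i, f, o, g = np.split(z, 4)
    c_new = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
    h_new = sigmoid(o) * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
H, W_, D, Dh = 4, 4, 8, 16                    # toy grid and feature sizes
feats = rng.normal(size=(H, W_, D))           # per-pixel visual features
hidden = rng.normal(size=(H, W_, Dh)) * 0.1   # states from previous layer
cell = np.zeros((H, W_, Dh))

in_dim = D + 4 * Dh + Dh                      # feature + local + global
Wmat = rng.normal(scale=0.1, size=(4 * Dh, in_dim + Dh))

def neighbor_states(i, j):
    """Hidden states of the 4-neighbourhood, zero-padded at borders."""
    out = []
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ni, nj = i + di, j + dj
        inside = 0 <= ni < H and 0 <= nj < W_
        out.append(hidden[ni, nj] if inside else np.zeros(Dh))
    return np.concatenate(out)

global_h = hidden.mean(axis=(0, 1))           # global guidance
new_hidden = np.empty_like(hidden)
new_cell = np.empty_like(cell)
for i in range(H):
    for j in range(W_):
        x = np.concatenate([feats[i, j], neighbor_states(i, j), global_h])
        new_hidden[i, j], new_cell[i, j] = lstm_cell(
            x, hidden[i, j], cell[i, j], Wmat)
```

Stacking several such layers is what lets each position's state accumulate dependencies from progressively larger regions of the image.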
Interpretable Structure-Evolving LSTM
This paper develops a general framework for learning interpretable data
representations via Long Short-Term Memory (LSTM) recurrent neural networks over
hierarchical graph structures. Instead of learning LSTM models over pre-fixed
structures, we propose to further learn the intermediate interpretable
multi-level graph structures in a progressive and stochastic way from data
during the LSTM network optimization. We thus call this model the
structure-evolving LSTM. In particular, starting with an initial element-level
graph representation where each node is a small data element, the
structure-evolving LSTM gradually evolves the multi-level graph representations
by stochastically merging the graph nodes with high compatibilities along the
stacked LSTM layers. In each LSTM layer, we estimate the compatibility of two
connected nodes from their corresponding LSTM gate outputs, which is used to
generate a merging probability. The candidate graph structures are accordingly
generated where the nodes are grouped into cliques with their merging
probabilities. We then produce the new graph structure with a
Metropolis-Hastings algorithm, which alleviates the risk of getting stuck in
local optima through stochastic sampling with an acceptance probability. Once a
graph structure is accepted, a higher-level graph is constructed by taking
the partitioned cliques as its nodes. During the evolving process, the
representation becomes more abstract at higher levels, where redundant
information is filtered out, allowing more efficient propagation of long-range
data dependencies. We evaluate the effectiveness of structure-evolving LSTM in
the application of semantic object parsing and demonstrate its advantage over
state-of-the-art LSTM models on standard benchmarks. Comment: To appear in CVPR 2017 as a spotlight paper
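The merge-and-accept loop can be sketched as follows. The scoring and proposal rules here are invented for the sketch, not the paper's exact formulation: neighbouring nodes merge with a probability given by their compatibility, and the proposed structure is accepted or rejected with a Metropolis-Hastings step, which lets the search escape local optima.

```python
import math
import random

def propose_merges(edges, compat):
    """Each edge is merged with probability equal to its compatibility."""
    return [e for e in edges if random.random() < compat[e]]

def structure_score(merged, compat):
    """Hypothetical quality of a candidate structure."""
    return sum(compat[e] for e in merged)

def mh_accept(old_score, new_score, temperature=1.0):
    """Metropolis-Hastings: always accept improvements, sometimes worse moves."""
    ratio = math.exp(min(0.0, (new_score - old_score) / temperature))
    return random.random() < ratio

random.seed(0)
edges = [(0, 1), (1, 2), (2, 3), (3, 4)]     # toy chain graph
compat = {(0, 1): 0.9, (1, 2): 0.2, (2, 3): 0.8, (3, 4): 0.1}

current, current_score = [], 0.0
for _ in range(50):                          # evolve the graph structure
    candidate = propose_merges(edges, compat)
    score = structure_score(candidate, compat)
    if mh_accept(current_score, score):
        current, current_score = candidate, score
```

In the full model the accepted merges define the cliques that become the nodes of the next, coarser graph level.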