Analysis of Controlling Genes for Tiller Growth of Psathyrostachys juncea Based on Transcriptome Sequencing Technology
Tillering is an important trait of bunch grasses that affects biomass and seed yield. Psathyrostachys juncea is a typical perennial bunch grass, and unraveling the regulatory mechanisms of tillering in P. juncea could help improve the yield of perennial gramineous forages. Hence, we selected the tiller node of P. juncea for transcriptome sequencing to determine the differentially expressed genes (DEGs) between high- and low-tillering materials. Metabolic pathways were studied, candidate genes were screened, and reference gene stability was evaluated. The results showed that approximately 5466 DEGs were identified between two P. juncea genotypes that differed significantly in tiller number. Pathway enrichment analysis indicated that DEGs related to the biosynthesis of three classes of phytohormones, i.e., strigolactones (SLs), auxin (IAA), and cytokinin (CTK), as well as "nitrogen metabolism" and "biosynthesis of lignin", dominated the differences between the dense- and sparse-tillering genotypes. Meanwhile, the reference gene Actin1, which had the best stability, was screened from twelve of the most highly expressed genes and was used to verify ten tillering candidate genes. The candidate genes revealed in our research are involved in the regulation of tillering in perennial grasses and provide new breeding resources for establishing high-yield perennial grasses.
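The DEG screening step described in this abstract is conventionally a fold-change plus adjusted-p-value filter; a minimal illustrative sketch follows (the thresholds and input values are hypothetical defaults, not taken from the paper):

```python
import numpy as np

def screen_degs(log2_fc, padj, fc_cut=1.0, p_cut=0.05):
    """Flag differentially expressed genes: |log2 fold change| >= fc_cut
    and adjusted p-value < p_cut. These cutoffs are common defaults and
    purely illustrative, not the thresholds used in the paper."""
    log2_fc = np.asarray(log2_fc)
    padj = np.asarray(padj)
    return (np.abs(log2_fc) >= fc_cut) & (padj < p_cut)

# Hypothetical values for three genes (high- vs low-tillering genotype)
flags = screen_degs([2.3, 0.4, -1.8], [0.001, 0.2, 0.01])
# flags -> [True, False, True]
```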
Order-preserving Consistency Regularization for Domain Adaptation and Generalization
Deep learning models fail on cross-domain challenges if the model is oversensitive to domain-specific attributes, e.g., lighting, background, camera angle, etc. To alleviate this problem, data augmentation coupled with consistency regularization is commonly adopted to make the model less sensitive to domain-specific attributes. Consistency regularization forces the model to output the same representation or prediction for two views of one image. These constraints, however, are either too strict or not order-preserving for the classification probabilities. In this work, we propose Order-preserving Consistency Regularization (OCR) for cross-domain tasks. The order-preserving property of the prediction makes the model robust to task-irrelevant transformations. As a result, the model becomes less sensitive to domain-specific attributes. Comprehensive experiments show that our method achieves clear advantages on five different cross-domain tasks.
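The order-preserving property can be made concrete with a small check: two views of an image should rank the classes identically even if the probability values differ. The boolean check below stands in for the paper's differentiable regularizer, which is an assumption on our part:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def order_preserved(p1, p2):
    """True when two class-probability vectors rank the classes in the
    same order -- the property OCR enforces between two views. (The actual
    OCR loss is differentiable; this hard check is only illustrative.)"""
    return np.array_equal(np.argsort(p1), np.argsort(p2))

# Two augmented views: different probabilities, same class ranking
p_view1 = softmax(np.array([2.0, 0.5, -1.0]))
p_view2 = softmax(np.array([1.5, 0.2, -0.8]))
```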
Variational Multi-Task Learning with Gumbel-Softmax Priors
Multi-task learning aims to explore task relatedness to improve individual
tasks, which is of particular significance in the challenging scenario that
only limited data is available for each task. To tackle this challenge, we
propose variational multi-task learning (VMTL), a general probabilistic
inference framework for learning multiple related tasks. We cast multi-task
learning as a variational Bayesian inference problem, in which task relatedness
is explored in a unified manner by specifying priors. To incorporate shared
knowledge into each task, we design the prior of a task to be a learnable
mixture of the variational posteriors of other related tasks, which is learned
by the Gumbel-Softmax technique. In contrast to previous methods, our VMTL can
exploit task relatedness for both representations and classifiers in a
principled way by jointly inferring their posteriors. This enables individual
tasks to fully leverage inductive biases provided by related tasks, thereby
improving the overall performance of all tasks. Experimental results
demonstrate that the proposed VMTL is able to effectively tackle a variety of
challenging multi-task learning settings with limited training data for both
classification and regression. Our method consistently surpasses previous
methods, including strong Bayesian approaches, and achieves state-of-the-art
performance on five benchmark datasets. Comment: 19 pages, 6 figures, accepted by NeurIPS 202
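The Gumbel-Softmax technique the abstract refers to draws differentiable, approximately one-hot samples from a categorical distribution; in VMTL these would act as learnable mixture weights over the other tasks' variational posteriors (our reading). A generic NumPy sketch of the sampling step, not the authors' code:

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Gumbel-Softmax (concrete) relaxation of categorical sampling:
    add Gumbel(0, 1) noise to the logits, then apply a temperature-scaled
    softmax. Lower tau makes samples closer to one-hot."""
    if rng is None:
        rng = np.random.default_rng(0)
    u = rng.uniform(1e-10, 1.0, size=len(logits))
    g = -np.log(-np.log(u))        # Gumbel(0, 1) noise
    y = (logits + g) / tau
    e = np.exp(y - y.max())
    return e / e.sum()             # soft mixture weights, sum to 1

# Hypothetical logits over three related tasks' posteriors
w = gumbel_softmax(np.array([1.0, 0.2, -0.5]), tau=0.5)
```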
Probabilistic Integration of Object Level Annotations in Chest X-ray Classification
Medical image datasets and their annotations are not growing as fast as their equivalents in the general domain. This makes translating the newest, more data-intensive methods that have made a large impact on the vision field increasingly difficult and less efficient. In this paper, we propose a new probabilistic latent variable model for disease classification in chest X-ray images. Specifically, we consider chest X-ray datasets that contain global disease labels and, for a smaller subset, object-level expert annotations in the form of eye-gaze patterns and disease bounding boxes. We propose a two-stage optimization algorithm that handles these different label granularities through a single training pipeline. In our pipeline, global dataset features are learned in the lower layers of the model. The specific details and nuances in the fine-grained expert object-level annotations are learned in the final layers of the model using a knowledge distillation method inspired by conditional variational inference. Subsequently, model weights are frozen to guide this learning process and prevent overfitting on the smaller, richly annotated data subsets. The proposed method yields consistent classification improvements across different backbones on the common benchmark datasets Chest X-ray14 and MIMIC-CXR. This shows that two-stage learning of labels from coarse to fine-grained, in particular with object-level annotations, is an effective method for more optimal annotation usage.
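The coarse-to-fine, freeze-then-fine-tune idea can be illustrated with a toy linear model: stage 1 trains all parameters on the large coarsely labelled set, stage 2 freezes the "backbone" parameter and updates only the "head" on the small richly annotated subset. This is a minimal analogue of the pipeline, not the paper's model:

```python
import numpy as np

def train_two_stage(X, y, X_fine, y_fine, lr=0.1, steps=200):
    """Toy two-stage training for y ~ w_backbone * x + w_head.
    Stage 1: gradient descent on all weights (large coarse set).
    Stage 2: w_backbone frozen, only w_head updated (small fine set).
    Purely illustrative; the paper uses deep networks, not this model."""
    w_backbone, w_head = 0.0, 0.0
    for _ in range(steps):                       # stage 1: both trainable
        err = (w_backbone * X + w_head) - y
        w_backbone -= lr * np.mean(err * X)
        w_head -= lr * np.mean(err)
    for _ in range(steps):                       # stage 2: backbone frozen
        err = (w_backbone * X_fine + w_head) - y_fine
        w_head -= lr * np.mean(err)              # only the head moves
    return w_backbone, w_head

X = np.array([0.0, 1.0, 2.0, 3.0])
X_fine = np.array([0.0, 1.0, 2.0])
# Coarse data follows y = 2x; fine data shifts to y = 2x + 1.
wb, wh = train_two_stage(X, 2 * X, X_fine, 2 * X_fine + 1.0)
```

After stage 2, the backbone slope stays near 2 while only the head absorbs the offset of the fine-grained data, mirroring how freezing prevents the small subset from overwriting globally learned features.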
Probing the lightest new gauge boson in the littlest Higgs model via the processes at the ILC
The neutral gauge boson, with a mass of hundreds of GeV, is the lightest
particle predicted by the littlest Higgs (LH) model, and such a particle
should be the first signal of the LH model at the planned ILC if it indeed
exists. In this paper, we study some processes of its production in
association with a fermion pair at the ILC. The studies show that the most
promising of these processes can produce sufficient signals in most of the
parameter space preferred by the electroweak precision data at the ILC. On
the other hand, the signal produced via certain decay modes is distinctive,
and such a signal can be easily identified from the SM background.
Therefore, the lightest gauge boson in the LH model would be detectable at
the photon collider realized at the ILC. Comment: 12 pages, 4 figures