
    Exact generator and its high order expansions in the time-convolutionless generalized master equation: Applications to the spin-boson model and excitation energy transfer

    The time-convolutionless (TCL) quantum master equation provides a powerful tool to simulate the reduced dynamics of a quantum system coupled to a bath. The key quantity in the TCL master equation is the so-called kernel or generator, which describes the effects of the bath degrees of freedom. Since the exact TCL generator is usually hard to calculate analytically, most applications of the TCL generalized master equation have relied on approximate generators obtained from second- and fourth-order perturbative expansions. Using the hierarchical equation of motion (HEOM) and extended HEOM methods, we present a new approach to calculate the exact TCL generator and its high-order perturbative expansions. The new approach is applied to the spin-boson model with different sets of parameters to investigate the convergence of the high-order expansions of the TCL generator. We also discuss circumstances in which the exact TCL generator becomes singular, for both the spin-boson model and a model of excitation energy transfer in the Fenna-Matthews-Olson complex.
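    For orientation, the time-convolutionless structure referred to above is time-local: the reduced density matrix evolves under a generator that acts only at the current time, and in practice this generator is commonly truncated at second or fourth order in the system-bath coupling. The lines below are only a sketch of that generic form; the exact generator in the paper is extracted numerically via HEOM rather than from the perturbative series.

    % Generic TCL structure, shown for illustration only; \mathcal{K}_2 and
    % \mathcal{K}_4 denote the usual second- and fourth-order contributions
    % in the system-bath coupling.
    \frac{\mathrm{d}}{\mathrm{d}t}\,\rho_S(t) = \mathcal{K}(t)\,\rho_S(t),
    \qquad
    \mathcal{K}(t) \approx \mathcal{K}_2(t) + \mathcal{K}_4(t) + \cdots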

    Exciting Changes are Coming to The Christian Librarian

    Back in 1996 I came on board the TCL team with a dream. My hope was to make TCL a peer-reviewed publication. Now, many years later, I am excited to say this dream will soon become a reality! Beginning in 2009, TCL will carry peer-reviewed content.

    Successful Libraries Survey


    Flow cytometric characterization and clinical outcome of CD4+ T-cell lymphoma in dogs: 67 cases.

    Background: Canine T-cell lymphoma (TCL) is conventionally considered an aggressive disease, but some forms are histologically and clinically indolent. CD4 TCL is reported to be the most common subtype of TCL. We assessed flow cytometric characteristics, histologic features when available, and clinical outcomes of CD4+ TCL to determine if flow cytometry can be used to subclassify this group of lymphomas.
    Objective: To test the hypothesis that canine CD4+ T-cell lymphoma (TCL) is a homogeneous group of lymphomas with an aggressive clinical course.
    Animals: Sixty-seven dogs diagnosed with CD4+ TCL by flow cytometry and treated at 1 of 3 oncology referral clinics.
    Methods: Retrospective multivariable analysis of outcome in canine CD4+ TCL, including patient characteristics, treatment, and flow cytometric features.
    Results: The majority of CD4+ TCL were CD45+, expressed low class II MHC, and exhibited an aggressive clinical course independent of treatment regimen (median survival, 159 days). Histologically, CD4+ TCL were classified as lymphoblastic or peripheral T cell. Size of the neoplastic lymphocytes had a modest effect on both PFI and survival in this group. A small number of CD4+ TCL were CD45- and class II MHC high, and exhibited an apparently more indolent clinical course (median survival not yet reached).
    Conclusions and Clinical Importance: Although the majority of CD4+ TCL in dogs had uniform clinical and flow cytometric features and an aggressive clinical course, a subset had a unique immunophenotype that predicts significantly longer survival. This finding strengthens the utility of flow cytometry to aid in the stratification of canine lymphoma.
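    The outcome analysis described above is a retrospective multivariable survival analysis. As a minimal sketch of how such an analysis is commonly set up, assuming a case-level table with survival time, event status, and flow-cytometric covariates (the file and column names below are hypothetical, not taken from the study):

    # Minimal sketch of a multivariable survival analysis of the kind described
    # above, using the lifelines library. The file name and the columns
    # (survival_days, died, cd45_negative, class_ii_mhc_high, cell_size) are
    # hypothetical placeholders, not variables from the study itself.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("cd4_tcl_cases.csv")  # hypothetical case-level data

    cph = CoxPHFitter()
    cph.fit(
        df[["survival_days", "died", "cd45_negative", "class_ii_mhc_high", "cell_size"]],
        duration_col="survival_days",  # time to death or censoring, in days
        event_col="died",              # 1 = death observed, 0 = censored
    )
    cph.print_summary()                # hazard ratios for each covariate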

    Magnetism and effect of anisotropy with one dimensional monatomic chain of cobalt by a Monte Carlo simulation

    The magnetic properties of the one-dimensional (1D) monatomic chain of Co reported in a previous experimental work are investigated by a classical Monte Carlo simulation based on the anisotropic Heisenberg model. In our simulation, the effect of the on-site uniaxial anisotropy, K_u, on each individual Co atom and the nearest-neighbour exchange interaction, J, are accounted for. The normalized coercivity H_C(T)/H_C(T_CL) is found to show a universal behaviour, H_C(T)/H_C(T_CL) = h_0 (e^{T_B/T} - e), in the temperature interval T_CL < T < T_B^Cal, arising from the thermal activation effect. In the above expression, h_0 is a constant, T_B^Cal is the blocking temperature determined by the calculation, and T_CL is the temperature above which the classical Monte Carlo simulation gives a good description of the investigated system. The present simulation has reproduced the experimental features, including the temperature-dependent coercivity, H_C(T), and the angular dependence of the remanent magnetization, M_R(phi,theta), upon the relative orientation (phi,theta) of the applied field H. In addition, the calculation reveals that the ferromagnetic-like open hysteresis loop is a result of a slow dynamical process at T < T_B^Cal. The dependence of the dynamical T_B^Cal on the field sweeping rate R, the on-site anisotropy constant K_u, and the number of atoms in the atomic chain, N, has been investigated in detail. Comment: 20 pages, 7 figures included, J Phys Condens Matter, in press
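    As a rough illustration of the kind of simulation described above, a single Metropolis sweep for a classical 1D Heisenberg chain with nearest-neighbour exchange J and on-site uniaxial anisotropy K_u might look as follows. The parameter values, the zero-field energy (no applied-field or hysteresis-sweep terms), and the single-spin update are assumptions made for this sketch, not details taken from the paper.

    # Illustrative Metropolis sweep for a classical 1D anisotropic Heisenberg chain:
    # E = -J * sum_i S_i . S_{i+1}  -  Ku * sum_i (S_i . e_z)^2
    # All parameter values and the update scheme are assumptions for this sketch.
    import numpy as np

    rng = np.random.default_rng(0)
    N, J, Ku, T = 80, 1.0, 0.2, 0.5        # chain length, exchange, anisotropy, temperature

    def random_unit_vectors(n):
        v = rng.normal(size=(n, 3))
        return v / np.linalg.norm(v, axis=1, keepdims=True)

    spins = random_unit_vectors(N)

    def site_energy(spins, i):
        e = -Ku * spins[i, 2] ** 2          # uniaxial anisotropy along z
        if i > 0:
            e -= J * spins[i] @ spins[i - 1]  # open-chain nearest-neighbour exchange
        if i < N - 1:
            e -= J * spins[i] @ spins[i + 1]
        return e

    def metropolis_sweep(spins):
        for i in rng.permutation(N):
            old, e_old = spins[i].copy(), site_energy(spins, i)
            spins[i] = random_unit_vectors(1)[0]          # propose a new random direction
            d_e = site_energy(spins, i) - e_old
            if rng.random() >= np.exp(min(0.0, -d_e / T)):
                spins[i] = old                            # reject: restore old spin
        return spins

    for _ in range(1000):
        spins = metropolis_sweep(spins)
    print("magnetization per spin:", spins.mean(axis=0))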

    Editorial

    The first TCL issue of the new year is now under my belt. As the new Design Editor, I found the January 2000 TCL a labor of love. It was also one of trial and error. I learned how to logically arrange the pages and articles within the issue. I experimented with various graphical changes but tried to keep a recognizable style in the grand TCL tradition.

    Time-Contrastive Learning Based Deep Bottleneck Features for Text-Dependent Speaker Verification

    There are a number of studies on the extraction of bottleneck (BN) features from deep neural networks (DNNs) trained to discriminate speakers, pass-phrases, and triphone states for improving the performance of text-dependent speaker verification (TD-SV). However, only moderate success has been achieved. A recent study [1] presented a time-contrastive learning (TCL) concept to explore the non-stationarity of brain signals for classification of brain states. Speech signals have a similar non-stationarity property, and TCL further has the advantage of requiring no labeled data. We therefore present a TCL-based BN feature extraction method. The method uniformly partitions each speech utterance in a training dataset into a predefined number of multi-frame segments. Each segment in an utterance corresponds to one class, and class labels are shared across utterances. DNNs are then trained to discriminate all speech frames among the classes to exploit the temporal structure of speech. In addition, we propose a segment-based unsupervised clustering algorithm to re-assign class labels to the segments. TD-SV experiments were conducted on the RedDots challenge database. The TCL-DNNs were trained using speech data of fixed pass-phrases that were excluded from the TD-SV evaluation set, so the learned features can be considered phrase-independent. We compare the performance of the proposed TCL bottleneck (BN) feature with those of short-time cepstral features and BN features extracted from DNNs discriminating speakers, pass-phrases, speaker+pass-phrase, as well as monophones whose labels and boundaries are generated by three different automatic speech recognition (ASR) systems. Experimental results show that the proposed TCL-BN outperforms cepstral features and speaker+pass-phrase discriminant BN features, and its performance is on par with those of ASR-derived BN features. Moreover, ... Comment: Copyright (c) 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
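    The segment-labelling idea described above is straightforward to sketch: each utterance's frame sequence is split into a fixed number of contiguous segments, and every frame inherits its segment index as a class label shared across utterances. The helper below is a minimal illustration of that labelling step only; the function and variable names are assumptions, not the authors' code.

    # Minimal sketch of time-contrastive label generation: split each utterance's
    # frames into n_segments contiguous chunks and label frames by chunk index.
    # Names and shapes are illustrative, not taken from the paper.
    import numpy as np

    def tcl_labels(num_frames, n_segments=10):
        """Return a per-frame label array of shape (num_frames,) with values 0..n_segments-1."""
        # np.array_split handles utterances whose length is not divisible by n_segments
        chunks = np.array_split(np.arange(num_frames), n_segments)
        labels = np.empty(num_frames, dtype=np.int64)
        for seg_idx, frame_idx in enumerate(chunks):
            labels[frame_idx] = seg_idx
        return labels

    # Example: a 23-frame utterance with 5 segments; labels are shared across
    # utterances, so a DNN trained on (frame_features, labels) learns to
    # discriminate temporal positions rather than speakers or phrases.
    print(tcl_labels(23, n_segments=5))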

    Tensor Contraction Layers for Parsimonious Deep Nets

    Tensors offer a natural representation for many kinds of data frequently encountered in machine learning. Images, for example, are naturally represented as third-order tensors, where the modes correspond to height, width, and channels. Tensor methods are noted for their ability to discover multi-dimensional dependencies, and tensor decompositions in particular have been used to produce compact low-rank approximations of data. In this paper, we explore the use of tensor contractions as neural network layers and investigate several ways to apply them to activation tensors. Specifically, we propose the Tensor Contraction Layer (TCL), the first attempt to incorporate tensor contractions as end-to-end trainable neural network layers. Applied to existing networks, TCLs reduce the dimensionality of the activation tensors and thus the number of model parameters. We evaluate the TCL on the task of image recognition, augmenting two popular networks (AlexNet, VGG); the resulting models are trainable end-to-end. Applying the TCL to image recognition on the CIFAR100 and ImageNet datasets, we evaluate the effect of parameter reduction via tensor contraction on performance, and demonstrate significant model compression without significant impact on accuracy and, in some cases, improved performance.
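    As a sketch of the layer described above: a tensor contraction layer contracts each non-batch mode of an activation tensor with a small factor matrix, shrinking the activation before any flattening. The forward pass below illustrates the operation only; the shapes are arbitrary assumptions, and the random (untrained) factor matrices stand in for the paper's end-to-end trainable parameters.

    # Illustrative forward pass of a tensor contraction layer (TCL): each non-batch
    # mode of the activation tensor is contracted with its own factor matrix,
    # reducing (H, W, C) to (h, w, c). Shapes here are arbitrary assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    batch, H, W, C = 32, 28, 28, 64       # incoming activation shape
    h, w, c = 10, 10, 16                  # reduced output shape

    X  = rng.normal(size=(batch, H, W, C))
    U1 = rng.normal(size=(h, H))          # factor matrix for the height mode
    U2 = rng.normal(size=(w, W))          # factor matrix for the width mode
    U3 = rng.normal(size=(c, C))          # factor matrix for the channel mode

    # Mode-wise contraction:
    # Y[b, i, j, k] = sum_{p,q,r} X[b, p, q, r] * U1[i, p] * U2[j, q] * U3[k, r]
    Y = np.einsum("bpqr,ip,jq,kr->bijk", X, U1, U2, U3)

    print(X.shape, "->", Y.shape)         # (32, 28, 28, 64) -> (32, 10, 10, 16)
    # In the paper the factor matrices are trained end-to-end; here they are random
    # solely to demonstrate the dimensionality/parameter reduction.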

    A novel representation of energy and signal transformation in measurement systems

    This work presents a novel representation of energy and signal transformation in a measurement system, which is essentially a transducer conversion logic or language (TCL). Using two-port and three-port transducers as basic building blocks, it can be used to model any measurement system. It is object-oriented and consists only of text with a very simple syntax, so it can be easily handled and processed by computers. This paper demonstrates its use in the description, classification, and computer-aided analysis and design of measuring instruments, with some preliminary test results. It will find wide application in modeling, analysis, design, and education in measurement, control, and information processing.
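    The abstract does not reproduce the language's actual syntax, so the sketch below is only a hypothetical illustration of the underlying idea: transducers as object-oriented building blocks that declare which energy or signal form they accept and emit, chained into a measurement system that a computer can check and analyse. The class, field, and function names are invented for this sketch.

    # Hypothetical illustration of transducers as object-oriented building blocks
    # chained into a measurement system; names are invented, and this is not the
    # paper's TCL syntax.
    from dataclasses import dataclass

    @dataclass
    class Transducer:
        name: str
        input_form: str    # energy/signal form accepted, e.g. "pressure"
        output_form: str   # energy/signal form produced, e.g. "voltage"

    def check_chain(chain):
        """Verify that each stage's output form matches the next stage's input form."""
        for a, b in zip(chain, chain[1:]):
            if a.output_form != b.input_form:
                raise ValueError(f"{a.name} outputs {a.output_form}, "
                                 f"but {b.name} expects {b.input_form}")
        return f"{chain[0].input_form} -> {chain[-1].output_form}"

    # Example: a simple pressure-measurement chain.
    system = [
        Transducer("diaphragm", "pressure", "displacement"),
        Transducer("strain gauge", "displacement", "resistance"),
        Transducer("bridge amplifier", "resistance", "voltage"),
    ]
    print(check_chain(system))   # pressure -> voltage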