Effect of transients in nuclear fission on multiplicity of prescission neutrons
Transients in the fission of highly excited nuclei are studied in the
framework of the Langevin equation. Time-dependent fission widths are
calculated which show that after the initial transients, a steady flow towards
the scission point is established not only for nuclei which have fission
barriers but also for nuclei which have no fission barrier. It is shown from a
comparison of the transient time and the fission lifetime that fission changes
from a diffusive to a transient-dominated process over a certain transition
region as a function of the spin of the fissioning nucleus. Multiplicities of
prescission neutrons are calculated in a statistical model both with and
without a single-swoop description of fission, and they are found to differ in
the transition region. We find, however, that the difference is marginal;
hence a single-swoop picture of fission, though not strictly valid in the
transition region, can still be used in statistical model calculations.
Comment: 15 pages including 7 figures, to appear in The European Physical
Journal
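The central object of the abstract, a time-dependent fission width obtained from Langevin dynamics, can be illustrated with a minimal one-dimensional sketch. The cubic potential, friction, temperature and scission point below are schematic choices in arbitrary units, not the shape-dependent transport coefficients of a realistic calculation; the sketch only shows how first-passage times towards scission are accumulated from Brownian trajectories.

```python
import numpy as np

rng = np.random.default_rng(1)

# Schematic overdamped one-dimensional Langevin dynamics over a fission
# barrier.  All parameters are illustrative, arbitrary-unit choices.
B, T, gamma = 3.0, 1.0, 5.0        # barrier height, temperature, friction
dt, n_steps, n_traj = 1e-3, 40000, 2000
q_scission = 1.5                   # schematic scission point

def dVdq(q):
    # V(q) = B * (3 q^2 - 2 q^3): minimum at q = 0, barrier top V(1) = B
    return 6.0 * B * q * (1.0 - q)

q = np.zeros(n_traj)               # all trajectories start in the well
alive = np.ones(n_traj, dtype=bool)
first_passage = np.full(n_traj, np.inf)

for step in range(n_steps):
    kick = rng.normal(0.0, 1.0, n_traj)
    q[alive] += (-dVdq(q[alive]) / gamma) * dt \
                + np.sqrt(2.0 * T / gamma * dt) * kick[alive]
    crossed = alive & (q >= q_scission)
    first_passage[crossed] = step * dt
    alive &= ~crossed

# The distribution of first-passage times carries the transient:
# few crossings at early times, then a steady flow towards scission.
fissioned = np.isfinite(first_passage)
print(f"fissioned fraction: {fissioned.mean():.2f}, "
      f"mean first-passage time: {first_passage[fissioned].mean():.2f}")
```

Binning the first-passage times and converting the survival curve into a hazard rate would yield the time-dependent fission width whose saturation after the initial transient the abstract describes.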
Quest for consistent modelling of statistical decay of the compound nucleus
A statistical model description of heavy ion induced fusion-fission reactions
is presented where shell effects, collective enhancement of level density,
tilting away effect of compound nuclear spin and dissipation are included. It
is shown that the inclusion of all these effects provides a consistent picture
of fission where fission hindrance is required to explain the experimental
values of both pre-scission neutron multiplicities and evaporation residue
cross-sections in contrast to some of the earlier works where a fission
hindrance is required for pre-scission neutrons but a fission enhancement for
evaporation residue cross-sections.
Comment: 14 pages, 2 figures
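The competition the abstract describes, fission hindered by dissipation versus particle evaporation, can be caricatured in a few lines. The Kramers reduction factor is the standard expression; the width formulas and all numbers below are schematic, order-of-magnitude choices, not the fitted parameters of the paper.

```python
import math

def fission_width_bw(T, Bf, hbar_omega=1.0):
    # Schematic Bohr-Wheeler transition-state fission width (energies in MeV)
    return (hbar_omega / (2.0 * math.pi)) * math.exp(-Bf / T)

def kramers_factor(gamma):
    # Standard Kramers reduction of the fission width by nuclear
    # dissipation; gamma is the dimensionless reduced friction
    return math.sqrt(1.0 + gamma ** 2) - gamma

def neutron_width(T, Bn, C=0.3):
    # Schematic evaporation width with neutron separation energy Bn (MeV)
    return C * T ** 2 * math.exp(-Bn / T)

T, Bf, Bn, gamma = 1.5, 5.0, 8.0, 3.0     # illustrative values only
Gf = kramers_factor(gamma) * fission_width_bw(T, Bf)
Gn = neutron_width(T, Bn)
print(f"Kramers factor {kramers_factor(gamma):.3f}; "
      f"P(fission) = {Gf / (Gf + Gn):.3f}")
```

With dissipation switched off (gamma = 0) the factor is 1 and fission is unhindered; increasing gamma suppresses the fission branch, which is the single hindrance the abstract invokes to fit neutron multiplicities and evaporation residue cross-sections simultaneously.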
Alleviating the inconsistencies in modelling decay of fissile compound nuclei
This work attempts to overcome the existing inconsistencies in modelling the
decay of fissile nuclei by including important physical effects in the
model and through a systematic analysis of a large set of data over a wide
range of CN mass (ACN). The model includes shell effect in the level density
(LD) parameter, shell correction in the fission barrier, effect of the
orientation degree of freedom of the CN spin (Kor), collective enhancement of
level density (CELD) and dissipation in fission. Input parameters are not tuned
to reproduce observables from specific reaction(s) and the reduced dissipation
coefficient is treated as the only adjustable parameter. Calculated evaporation
residue (ER) cross sections, fission cross sections and particle, i.e. neutron,
proton and alpha-particle, multiplicities are compared with data covering ACN =
156-248. The model produces reasonable fits to ER and fission excitation
functions for all the reactions considered in this work. Pre-scission neutron
multiplicities are underestimated by the calculation beyond ACN ~ 200, and an
increasingly large value of the pre-saddle dissipation strength is required to
reproduce the data as ACN increases. Proton and alpha-particle
multiplicities, measured in coincidence with both ERs and fission fragments,
are in qualitative agreement with model predictions. The present work mitigates
the existing inconsistencies in modelling statistical decay of the fissile CN
to a large extent.
Comment: 15 pages, 9 figures
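The shell effect in the level density parameter that the model includes is conventionally described by an Ignatyuk-type damping of the ground-state shell correction with excitation energy. The asymptotic value A/9 and the damping energy below are illustrative choices, not the paper's parametrization.

```python
import math

def level_density_parameter(U, A, dW, E_d=18.5):
    """Ignatyuk-type shell-damped level-density parameter (MeV^-1):
    a(U) = a_tilde * (1 + dW * (1 - exp(-U/E_d)) / U),
    with ground-state shell correction dW (MeV) and damping energy E_d."""
    a_tilde = A / 9.0            # illustrative asymptotic value
    if U <= 0.0:
        return a_tilde
    return a_tilde * (1.0 + dW * (1.0 - math.exp(-U / E_d)) / U)

# Shell effects wash out with excitation energy: a(U) -> a_tilde
A, dW = 208, -12.0               # large negative shell correction (magic region)
for U in (10.0, 50.0, 200.0):
    print(f"U = {U:6.1f} MeV -> a = {level_density_parameter(U, A, dW):6.2f} MeV^-1")
```

A strongly shell-stabilized nucleus thus has a suppressed level density at low excitation, and the suppression disappears at the high excitation energies typical of fusion-fission reactions.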
Complete absence of localization in a family of disordered lattices
We present analytically exact results showing that certain quasi
one-dimensional lattices, in which the building blocks are arranged in a
random fashion, can have an absolutely continuous part in the energy spectrum
when special correlations are introduced among some of the parameters describing the
corresponding Hamiltonians. We explicitly work out two prototype cases, one
being a disordered array of a simple diamond network and isolated dots, and the
other an array of triangular plaquettes and dots. In the latter case, a
magnetic flux threading each plaquette plays a crucial role in converting the
energy spectrum into an absolutely continuous one. A flux controlled
enhancement in the electronic transport is an interesting observation in the
triangle-dot system that may be useful while considering prospective devices.
The analytical findings are comprehensively supported by extensive numerical
calculations of the density of states and transmission coefficient in each
case.
Comment: 6 pages, 6 figures, epl draft
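The transmission coefficients mentioned at the end are typically obtained from transfer matrices once the diamond or triangular plaquettes are decimated into an effective one-dimensional chain. The following is a generic sketch of that final step for a plain tight-binding chain with unit hopping and perfect leads; the site energies stand in for the renormalized, energy-dependent parameters of the actual networks, which are not spelled out in the abstract.

```python
import numpy as np

rng = np.random.default_rng(7)

def transmission(onsite, E):
    """Transmission coefficient of a 1D tight-binding chain (hopping t = 1)
    with site energies `onsite`, attached to perfect leads (eps = 0)."""
    k = np.arccos(E / 2.0)                 # lead dispersion E = 2 cos k, |E| < 2
    M = np.eye(2)
    for eps in onsite:                     # total transfer matrix P_N ... P_1
        M = np.array([[E - eps, -1.0], [1.0, 0.0]]) @ M
    # Match plane waves: incoming + reflected wave on the left,
    # transmitted wave on the right; solve the 2x2 system for (tau, r).
    N = len(onsite)
    A = np.array([
        [np.exp(1j * k * (N + 1)), -(M[0, 0] * np.exp(-1j * k) + M[0, 1])],
        [np.exp(1j * k * N),       -(M[1, 0] * np.exp(-1j * k) + M[1, 1])],
    ])
    b = np.array([M[0, 0] * np.exp(1j * k) + M[0, 1],
                  M[1, 0] * np.exp(1j * k) + M[1, 1]])
    tau, _r = np.linalg.solve(A, b)
    return abs(tau) ** 2

E = 0.7
perfect = transmission(np.zeros(400), E)                # ordered: ballistic
random_ = transmission(rng.uniform(-1.0, 1.0, 400), E)  # uncorrelated disorder
print(f"T(ordered) = {perfect:.6f}, T(disordered) = {random_:.3e}")
```

An uncorrelated random chain localizes and its transmission decays exponentially with length, while the paper's point is that suitably correlated building blocks (with, for the triangles, a threading flux) keep an extended, fully transmitting part of the spectrum.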
Exploring the role of therapeutic drug monitoring and regular supervision in optimizing quality of life in patients of bipolar affective disorder receiving lithium therapy in a tertiary care teaching hospital: a prospective observational study
Background: Lithium is considered a first-line drug, effective in treating manic and mixed episodes of bipolar affective disorder throughout the globe. However, the chronic and heterogeneous nature of the disease, along with the toxicity of lithium, often makes patients non-adherent to medication and diminishes their health-related quality of life. The present study was conducted to assess the prospect of regular supervision and follow-up with therapeutic drug monitoring in optimizing lithium therapy, based on health-related quality-of-life outcomes.
Methods: This was a prospective, non-randomized, observational study of a cohort of subjects suffering from bipolar affective disorder and on lithium therapy. Patients were regularly followed up with therapeutic drug monitoring and personalized interviews using questionnaires such as the WHO Quality of Life scale (WHOQOL-BREF), the Montgomery-Asberg Depression Rating Scale (MADRS) and the Medication Adherence Rating Scale (MARS).
Results: There was significant improvement in the health-related quality of life of patients who were monitored with therapeutic drug monitoring while on prescribed lithium therapy.
Conclusions: To keep patients' quality of life improved throughout the cycle of the bipolar disorder spectrum, regular follow-up visits with monitoring of serum lithium levels are needed, so that adverse effects are minimal and adherence to medication becomes optimal. Such optimal dosing, and the resulting benefit to patients, can be achieved with the involvement of clinical pharmacological consultation.
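The monitoring workflow the study describes hinges on interpreting each serum lithium measurement against a reference range. The cut-offs below are commonly cited approximate values (roughly 0.6-1.2 mmol/L therapeutic, above about 1.5 mmol/L a risk of toxicity); they are illustrative, are not taken from the study, and never replace clinical judgement.

```python
# Schematic helper for triaging a serum lithium level during maintenance
# therapy.  Thresholds are commonly cited approximations, used here only
# to illustrate how therapeutic drug monitoring feeds back into dosing.

def interpret_serum_lithium(level_mmol_per_l: float) -> str:
    if level_mmol_per_l < 0.6:
        return "subtherapeutic: review adherence and dose"
    if level_mmol_per_l <= 1.2:
        return "within the commonly cited therapeutic range"
    if level_mmol_per_l <= 1.5:
        return "above target: consider dose reduction, recheck level"
    return "potentially toxic: urgent clinical review"

for level in (0.4, 0.8, 1.3, 1.8):
    print(f"{level:.1f} mmol/L -> {interpret_serum_lithium(level)}")
```

Such a triage step, repeated at every follow-up visit, is what lets adverse effects be kept minimal while adherence and quality of life are tracked with the questionnaires named above.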
A Hybrid Machine Translation Framework for an Improved Translation Workflow
Over the past few decades, due to a continuing surge in the amount of content being translated and ever increasing pressure to deliver high quality and high throughput translation, translation industries are focusing their interest on adopting advanced technologies such as machine translation (MT) and automatic post-editing (APE) in their translation workflows. Despite the progress of the technology, the roles of humans and machines essentially remain intact as MT/APE are moving from the peripheries of the translation field closer towards collaborative human-machine based MT/APE in modern translation workflows. Professional translators increasingly become post-editors correcting raw MT/APE output instead of translating from scratch, which in turn increases productivity in terms of translation speed. The last decade has seen substantial growth in research and development activities on improving MT; usually concentrating on selected aspects of workflows starting from training data pre-processing techniques to core MT processes to post-editing methods. To date, however, complete MT workflows are less investigated than the core MT processes. In the research presented in this thesis, we investigate avenues towards achieving improved MT workflows. We study how different MT paradigms can be utilized and integrated to best effect. We also investigate how different upstream and downstream component technologies can be hybridized to achieve overall improved MT. Finally we include an investigation into human-machine collaborative MT by putting humans in the loop. In many (but not all) of the experiments presented in this thesis we focus on data scenarios provided by low resource language settings.
[German abstract, translated:] Due to the steadily rising translation volume of recent decades and the simultaneously growing pressure to deliver high quality within very short timeframes, translation service providers depend on integrating modern technologies such as machine translation (MT) and automatic post-editing (APE) into the translation workflow. Despite considerable progress in these technologies, the roles of human and machine have hardly changed. However, MT/APE is no longer a peripheral phenomenon; in the modern translation workflow it is increasingly used in collaboration between human and machine. Professional translators are increasingly becoming post-editors, correcting MT/APE output instead of producing translations from scratch as before, which increases productivity in terms of translation speed. In the last decade there has been a great deal of research and development on improving MT: integration of the complete translation workflow, from the preparation of the training data through the core MT process to post-editing methods. The complete translation workflow, however, receives far less attention than the core MT process. This dissertation investigates paths towards an ideal, or at least improved, MT workflow. The experiments pay particular attention to the specific needs of low-resource languages. It is investigated how different MT paradigms can be used and optimally integrated. It is further shown how different upstream and downstream technology components can be adapted to generate better overall MT output. Finally, it is shown how humans can be integrated into the MT workflow. The goal of this work is to integrate different technology components into the MT workflow in order to create an improved overall workflow, chiefly through hybridization approaches. The work also investigates ways of involving humans effectively as post-editors.
DCU@FIRE-2014: fuzzy queries with rule-based normalization for mixed script information retrieval
We describe the participation of Dublin City University (DCU) in the FIRE-2014 transliteration search task (TST). The TST involves an ad-hoc search over a collection of Hindi film song lyrics. The Hindi language content of each document in the collection is either written in the native Devanagari script, transliterated in Roman script, or a combination of both. The queries can be in mixed script as well. The task is challenging primarily because of the vocabulary mismatch which may arise due to the multiple transliteration alternatives. We attempt to address the vocabulary mismatch problem during both the indexing and retrieval stages. During indexing, we apply a rule-based normalization to some character sequences of the transliterated words in order to have a single representation in the index for the multiple transliteration alternatives. During the retrieval phase, we make use of prefix-matched fuzzy query terms to account for the morphological variations of the transliterated words. The results show significant improvement over a standard baseline query likelihood language modelling (LM) approach. Additionally, we apply statistical machine transliteration to train a transliteration model in order to predict the transliteration of out-of-vocabulary words. Surprisingly, even with satisfactory transliteration accuracy, we found that automatic transliteration of query terms degraded retrieval effectiveness.
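The two techniques of the abstract, rule-based index-time normalization and prefix-matched fuzzy query terms, can be sketched together. The specific rewrite rules below are hypothetical examples of the kind used for romanized Hindi (the paper's actual rule set is not given in the abstract), and the prefix length is an arbitrary illustrative choice.

```python
import re

# Hypothetical normalization rules for romanized Hindi: several spellings
# transliterate the same sound, so variants are mapped to one index-time
# representation.  These examples are NOT the paper's actual rule set.
RULES = [
    (r"ee", "i"), (r"oo", "u"), (r"aa", "a"),
    (r"w", "v"), (r"ph", "f"),
    (r"(.)\1", r"\1"),            # collapse remaining doubled letters
]

def normalize(token: str) -> str:
    token = token.lower()
    for pattern, repl in RULES:   # order matters: digraphs before collapsing
        token = re.sub(pattern, repl, token)
    return token

def prefix_fuzzy_match(query: str, doc_token: str, prefix_len: int = 4) -> bool:
    """Prefix-matched fuzzy query term: the normalized forms only need to
    agree on a prefix, absorbing morphological variation at the word end."""
    q, d = normalize(query), normalize(doc_token)
    return len(q) >= prefix_len and q[:prefix_len] == d[:prefix_len]

for query, token in [("pyaar", "pyar"), ("kabhee", "kabhi"),
                     ("deewana", "deewane")]:
    print(query, token, "->", prefix_fuzzy_match(query, token))
```

Normalization conflates spelling variants of the same word at indexing time, while the prefix match additionally tolerates differing inflectional endings at retrieval time, which is the morphological variation the abstract targets.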