Felix Alexandrovich Berezin and his work
This is a survey of Berezin's work focused on three topics: representation theory, the general concept of quantization, and supermathematics.
Intraoperative changes in blood coagulation and thrombelastographic monitoring in liver transplantation
The blood coagulation system of 66 consecutive patients undergoing liver transplantation was monitored by thrombelastography and an analytic coagulation profile. A poor preoperative coagulation state, a decrease in levels of coagulation factors, progressive fibrinolysis, and whole blood clot lysis were observed during the preanhepatic and anhepatic stages of surgery. A further general decrease in coagulation factors and platelets, activation of fibrinolysis, and an abrupt decrease in levels of factors V and VIII occurred before and with reperfusion of the homograft. Recovery of blood coagulability began 30–60 min after reperfusion of the graft liver, and coagulability had returned toward baseline values 2 hr after reperfusion. A positive correlation was shown between the variables of thrombelastography and those of the coagulation profile. Thrombelastography was shown to be a reliable and rapid monitoring system. Its use was associated with a 33% reduction in blood and fluid infusion volume, while blood coagulability was maintained without an increase in the number of blood product donors.
Realizing value from project implementation under uncertainty: an exploratory study using system dynamics
Project implementation is not a trivial task, even after careful planning and scheduling. One of the reasons is the occurrence of unexpected events at strategic and operational levels during the project execution process. This paper presents a system dynamics model of a project monitoring and control system. Embedding both strategic and tactical uncertainties, the model experiments with typical remedial actions to disturbances during the implementation of a project under a behavioral paradigm. Simple proportional adjustment seems to work well under low levels of unexpected disturbances, but prospect theory-based behavior works better under extreme situations. Our findings indicate that over-reacting behavior, influenced by biases and reporting errors, can generate project escalation. Thus, thresholds for remedial actions should be implemented in project control and monitoring systems to avoid over-reacting behavior that leads to escalation and waste of resources.
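The control behaviors described above lend themselves to a compact illustration. The following Python sketch is not the authors' system dynamics model: it is a minimal, hypothetical work-backlog simulation in which management adjusts the work rate in proportion to perceived schedule slippage, with an optional dead-band threshold that suppresses small corrections, mirroring the recommendation to threshold remedial actions.

```python
# Minimal sketch (not the paper's model): a project backlog drained at a work
# rate that management adjusts in proportion to perceived schedule slippage.
# A dead-band threshold suppresses small corrections to avoid over-reaction.
import random

def simulate(threshold=0.0, weeks=60, scope=1000.0, seed=1):
    random.seed(seed)                           # same disturbance stream for fair comparison
    remaining = scope
    planned_rate = scope / 50.0                 # plan to finish in 50 weeks
    rate = planned_rate
    for week in range(1, weeks + 1):
        productivity = random.uniform(0.7, 1.0)            # operational disturbances
        remaining -= rate * productivity
        if remaining <= 0:
            return week, rate / planned_rate
        planned_remaining = max(scope - planned_rate * week, 0.0)
        slippage = (remaining - planned_remaining) / scope  # > 0 means behind plan
        if abs(slippage) > threshold:                       # dead band: only react to large gaps
            rate += 0.5 * slippage * planned_rate           # proportional adjustment
    return weeks, rate / planned_rate

for th in (0.0, 0.05, 0.10):
    finish, effort = simulate(threshold=th)
    print(f"threshold={th:.2f}: finished week {finish}, final effort x{effort:.2f}")
```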
Effective transmission conditions for domain decomposition methods applied to the time-harmonic curl-curl Maxwell's equations
The time-harmonic Maxwell equations describe the propagation of electromagnetic waves and are therefore fundamental for the simulation of many modern devices we have become used to in everyday life. The numerical solution of these equations is hampered by two fundamental problems: first, in the high-frequency regime, very fine meshes need to be used in order to avoid the pollution effect well known for the Helmholtz equation, and second, the large-scale systems obtained from the vector-valued equations in three spatial dimensions need to be solved by iterative methods, since direct factorizations are no longer feasible at that scale. As for the Helmholtz equation, classical iterative methods applied to discretized Maxwell equations have severe convergence problems. We explain in this paper a family of domain decomposition methods based on well-chosen transmission conditions. We show that all transmission conditions proposed so far in the literature, both for the first- and second-order formulation of Maxwell's equations, can be written and optimized in the common framework of optimized Schwarz methods, independently of which formulation one uses, and that the performance of the corresponding algorithms is identical. We use a decomposition into transverse electric and transverse magnetic fields to describe these algorithms, which greatly simplifies the convergence analysis of the methods. We illustrate the performance of our algorithms with large-scale numerical simulations.
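As a hedged illustration of why optimized transmission conditions matter, the sketch below evaluates the classical Fourier-analysis convergence factor of a non-overlapping Schwarz method with Robin transmission conditions for the positive-definite model operator eta − Laplacian on two half-planes. This toy model (all parameter values are assumed for illustration) is not the Maxwell analysis of the paper, but it shows how an optimized Robin parameter shrinks the worst-case contraction over the resolved interface frequencies.

```python
# Toy model only: contraction factor of non-overlapping optimized Schwarz with
# Robin transmission conditions (du/dn + p*u) for the operator (eta - Laplacian)
# on two half-planes, from standard Fourier analysis.  Not the Maxwell system.
import math

eta = 1.0                        # shift parameter of the model operator (assumed)
xi_min, xi_max = math.pi, 100.0  # range of interface Fourier frequencies (assumed)

def rho(xi, p):
    """Contraction factor per double sweep with Robin parameter p."""
    s = math.sqrt(xi * xi + eta)              # symbol of the transparent operator
    return ((s - p) / (s + p)) ** 2

# Optimized Robin parameter: geometric mean of the extreme symbol values, which
# equioscillates the contraction factor at xi_min and xi_max.
p_opt = (math.sqrt(xi_min ** 2 + eta) * math.sqrt(xi_max ** 2 + eta)) ** 0.5

frequencies = [xi_min + i * (xi_max - xi_min) / 400 for i in range(401)]
for p in (1.0, p_opt):
    worst = max(rho(xi, p) for xi in frequencies)
    print(f"p = {p:8.3f}  worst-case contraction = {worst:.3f}")
```

With these assumed parameters the naive choice p = 1 contracts the worst frequency by only a few percent per sweep, while the optimized parameter roughly halves the error each sweep, which is the qualitative effect the paper's optimized transmission conditions exploit.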
Pleistocene climate change promoted rapid diversification of aquatic invertebrates in Southeast Australia
Background:
The Pleistocene Ice Ages were the most recent geohistorical event of major global impact, but their consequences for most parts of the Southern Hemisphere remain poorly known. We investigate a radiation of ten species of Sternopriscus, the most species-rich genus of epigean Australian diving beetles. These species are distinct based on genital morphology but cannot be distinguished readily by mtDNA and nDNA because of genotype sharing caused by incomplete lineage sorting. Their genetic similarity suggests a Pleistocene origin.
Results:
We use a dataset of 3858 bp of mitochondrial and nuclear DNA to reconstruct a phylogeny of Sternopriscus using gene and species trees. Diversification analyses support the finding of a recent rapid speciation event, with estimated speciation rates of up to 2.40 species per MY, considerably higher than the proposed average rate of 0.16 species per MY for insects. Additionally, we use ecological niche modeling and analyze data on habitat preferences to test for niche divergence between species of the recent Sternopriscus radiation. These analyses show that the species can be characterized by a set of ecological variables referring to habitat, climate and altitude.
Conclusions:
Our results suggest that the repeated isolation of populations in glacial refugia might have led to divergent ecological adaptations and the fixation of morphological traits supporting reproductive isolation, and therefore may have promoted speciation. The recent Sternopriscus radiation fulfills many characteristics of a species flock and would be the first described example of an aquatic insect species flock. We argue that the species of this group may represent a stage in speciation past the species flock condition because of their mostly broad and often non-overlapping ranges and preferences for different habitat types.
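The quoted speciation rate can be given a rough plausibility check. The snippet below is only an illustration under an assumed pure-birth (Yule) model and a hypothetical crown age of about 1 Myr, neither of which is stated in the abstract; with those assumptions, lambda = ln(n)/t for n = 10 species lands close to the reported 2.40 species per MY.

```python
# Rough plausibility check (assumptions labeled, not from the paper): under a
# pure-birth (Yule) model, the speciation rate needed to reach n extant species
# in t Myr is lambda = ln(n) / t.
import math

n_species = 10          # size of the Sternopriscus radiation
crown_age_myr = 1.0     # hypothetical Pleistocene crown age, assumed for illustration
rate = math.log(n_species) / crown_age_myr
print(f"Yule estimate: {rate:.2f} species per Myr")   # ~2.30, close to the reported 2.40
```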
Wild meat hunting and use by sedentarised Baka Pygmies in southeastern Cameroon
As a result of sedentarisation, many Baka Pygmies have changed their mobility patterns away from nomadic lifestyles to living in roadside villages. These settled groups are increasingly dependent on cultivated foods but still rely on forest resources. The level of dependence on hunting of wild animals for food and cash, as well as the hunting profiles of sedentarised Pygmy groups, is little known. In this study we describe the use of wild meat in 10 Baka villages along the Djoum-Mintom road in southeastern Cameroon. From data collected on 1,946 hunting trips by 121 hunters, we show that trips last around 13 hours on average, with a median of eight hours. A mean ± SD of 1.15 ± 1.11 animal carcasses is taken per trip, and there was a positive correlation between trip duration and the number of carcasses taken. A total of 2,245 carcasses of 49 species from 24 animal families were taken in the study; species diversity was similar in all villages except one. Most hunted animals were mammals, with ungulates contributing the highest proportion. By species, just over half of the animal biomass extracted by all hunters in the studied villages was provided by four mammal species. Most animals were trapped (65.77% ± 16.63), followed by those shot with guns (22.56% ± 17.72), taken by other methods (8.69% ± 6.96) or hunted with dogs (2.96% ± 4.49). A mean of 7,569.7 ± 6,103.4 kg yr−1 (range 2,080.8–19,351.4) was extracted per village, giving 75,697 kg yr−1 in total, equivalent to 123 UK dairy cattle. Across all villages, 48.07% ± 17.58 of hunted animals were consumed by the hunter and his family, 32.73% ± 12.55 were sold, and a smaller share were partially sold and partially consumed (19.21% ± 17.02). Between 60% and 80% of carcasses belonged to the "least concern" category, followed by "near threatened", "vulnerable" and, rarely, "endangered"; the only endangered species hunted was the chimpanzee (Pan troglodytes). We suggest that hunting is a critical activity that provides a vital source of food for our study communities. Measured wild meat extraction levels are likely to be sustainable if hunter densities do not increase.
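The extraction figures above can be checked with simple arithmetic; the short sketch below reproduces the per-village total and the implied per-animal mass behind the cattle comparison (an inference, not a figure from the study).

```python
# Arithmetic check of the quoted extraction figures (illustrative only).
per_village_kg_yr = 7569.7                 # mean extraction per village, from the abstract
n_villages = 10
total_kg_yr = per_village_kg_yr * n_villages
print(f"total extraction: {total_kg_yr:,.0f} kg/yr")        # ~75,697 kg/yr

# The quoted equivalence of 123 UK dairy cattle implies the comparison assumes
# roughly 600 kg of biomass per animal.
print(f"implied mass per cow: {total_kg_yr / 123:,.0f} kg")  # ~615 kg
```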
Using gamma+jets Production to Calibrate the Standard Model Z(nunu)+jets Background to New Physics Processes at the LHC
The irreducible background from Z(nunu)+jets to beyond-the-Standard-Model searches at the LHC can be calibrated using gamma+jets data. The method utilises the fact that at high vector boson pT the event kinematics are the same for the two processes, and the cross sections differ mainly due to the boson-quark couplings. The method relies on a precise prediction from theory of the Z/gamma cross section ratio at high pT, which should be insensitive to effects from full event simulation. We study the Z/gamma ratio for final states involving 1, 2 and 3 hadronic jets, using both the leading-order parton shower Monte Carlo program Pythia8 and a leading-order matrix element program Gambos. This enables us to understand the underlying parton dynamics in both processes and to quantify the theoretical systematic uncertainties in the ratio predictions. Using a typical set of experimental cuts, we estimate the net theoretical uncertainty in the ratio to be of order 7% when obtained from a Monte Carlo program using multiparton matrix elements for the hard process. Uncertainties associated with full event simulation are found to be small. The results indicate that an overall accuracy of the method, excluding statistical errors, of order 10% should be possible.
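To make the calibration idea concrete, the sketch below assembles a data-driven Z(nunu)+jets estimate from hypothetical gamma+jets counts scaled by an assumed theoretical Z/gamma ratio per pT bin, combining the quoted ~7% theory uncertainty with photon-sample statistics. All numbers are illustrative placeholders, not results from the paper.

```python
# Sketch of a data-driven background estimate in the spirit described above:
# N_Z(est) in each boson-pT bin = N_gamma(data) * R_theory(Z/gamma), with the
# ~7% theory uncertainty on the ratio combined with photon-sample statistics.
import math

gamma_counts = {200: 5200, 300: 1450, 400: 430}    # hypothetical events per pT bin (GeV)
z_over_gamma = {200: 0.45, 300: 0.55, 400: 0.60}   # hypothetical theory ratio R(pT)
theory_unc = 0.07                                  # quoted ~7% theory uncertainty

for pt, n_gamma in gamma_counts.items():
    n_z = n_gamma * z_over_gamma[pt]               # predicted Z(nunu)+jets yield
    stat = 1.0 / math.sqrt(n_gamma)                # relative statistical error of photon sample
    total = math.hypot(stat, theory_unc)           # combine in quadrature
    print(f"pT > {pt} GeV: N_Z(est) = {n_z:7.1f}  +/- {100 * total:.1f}%")
```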
Pooling breast cancer datasets has a synergetic effect on classification performance and improves signature stability
Background:
Michiels et al. (Lancet 2005; 365: 488-92) employed a resampling strategy to show that the genes identified as predictors of prognosis from resamplings of a single gene expression dataset are highly variable. The genes most frequently identified in the separate resamplings were put forward as a 'gold standard'. On a higher level, breast cancer datasets collected by different institutions can be considered as resamplings from the underlying breast cancer population. The limited overlap between published prognostic signatures confirms the trend of signature instability identified by the resampling strategy. Six breast cancer datasets, totaling 947 samples, all measured on the Affymetrix platform, are currently available. This provides a unique opportunity to employ a substantial dataset to investigate the effects of pooling datasets on classifier accuracy, signature stability and enrichment of functional categories.
Results:
We show that the resampling strategy produces a suboptimal ranking of genes, which cannot be considered a 'gold standard'. When pooling breast cancer datasets, we observed a synergetic effect on classification performance in 73% of the cases. We also observe a significant positive correlation between the number of datasets pooled, the validation performance, the number of genes selected, and the enrichment of specific functional categories. In addition, we have evaluated the support for five explanations that have been postulated for the limited overlap of signatures.
Conclusion:
The limited overlap of current signature genes can be attributed to small sample size. Pooling datasets results in more accurate classification and a convergence of signature genes. We therefore advocate the analysis of new data within the context of a compendium, rather than analysis in isolation.
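The link between sample size, signature stability and overlap can be illustrated with synthetic data. The sketch below is not the authors' pipeline: it ranks genes by a simple t-statistic in repeated simulated "datasets" of increasing size and reports how strongly the resulting top-20 signatures overlap, mimicking the convergence effect of pooling.

```python
# Toy illustration (not the authors' pipeline): signatures ranked from small
# simulated datasets overlap poorly, while larger pooled samples converge on
# the truly informative genes.
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_informative = 1000, 20

def simulate(n_samples):
    """Synthetic expression matrix with a class difference in the first 20 genes."""
    labels = rng.integers(0, 2, n_samples)
    x = rng.normal(size=(n_samples, n_genes))
    x[labels == 1, :n_informative] += 0.8
    return x, labels

def top_genes(x, labels, k=20):
    """Indices of the k genes with the largest absolute two-sample t-statistic."""
    a, b = x[labels == 0], x[labels == 1]
    t = (a.mean(0) - b.mean(0)) / np.sqrt(a.var(0) / len(a) + b.var(0) / len(b) + 1e-9)
    return set(np.argsort(-np.abs(t))[:k])

for n in (50, 200, 800):                        # "single dataset" vs increasingly pooled data
    sigs = [top_genes(*simulate(n)) for _ in range(10)]
    overlap = np.mean([len(sigs[i] & sigs[j]) for i in range(10) for j in range(i + 1, 10)])
    print(f"n = {n:4d}: mean pairwise overlap of top-20 signatures = {overlap:.1f}")
```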
Renal impairment in a rural African antiretroviral programme
Background:
There is little knowledge regarding the prevalence and nature of renal impairment in African populations initiating antiretroviral treatment, nor evidence to inform the most cost-effective methods of screening for renal impairment. With the increasing availability of the potentially nephrotoxic drug tenofovir, such information is important for the planning of antiretroviral programmes.
Methods:
(i) Retrospective review of the prevalence and risk factors for impaired renal function in 2189 individuals initiating antiretroviral treatment in a rural African setting between 2004 and 2007. (ii) A prospective study of 149 consecutive patients initiating antiretrovirals to assess the utility of urine analysis for the detection of impaired renal function. Severe and moderate renal impairment were defined as an estimated GFR of ≤30 ml/min/1.73 m2 and 30–60 ml/min/1.73 m2 respectively. Logistic regression was used to determine the odds ratio (OR) of significantly impaired renal function (combining severe and moderate impairment). Covariates for analysis were age, sex and CD4 count at initiation.
Results:
(i) There was a low prevalence of severe renal impairment (29/2189, 1.3%, 95% C.I. 0.8–1.8), whereas moderate renal impairment was more frequent (287/2189, 13.1%, 95% C.I. 11.6–14.5), with many patients having advanced immunosuppression at treatment initiation (median CD4 120 cells/μl). In multivariable logistic regression, age over 40 (aOR 4.65, 95% C.I. 3.54–6.1), male gender (aOR 1.89, 95% C.I. 1.39–2.56) and CD4 <100 cells/μl (aOR 1.4, 95% C.I. 1.07–1.82) were associated with risk of significant renal impairment. (ii) In 149 consecutive patients, urine analysis had poor sensitivity and specificity for detecting impaired renal function.
Conclusion:
In this rural African setting, significant renal impairment is uncommon in patients initiating antiretrovirals. Urine analysis alone may be inadequate for identification of those with impaired renal function where resources for biochemistry are limited.
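For readers unfamiliar with the screening thresholds used above, the sketch below classifies a hypothetical patient by estimated GFR. The abstract does not state which eGFR equation was used, so the 4-variable MDRD formula is assumed purely for illustration.

```python
# Illustrative sketch only: the study does not specify its eGFR equation; the
# 4-variable MDRD formula is assumed here, with the study's impairment bands
# (severe: eGFR <= 30, moderate: 30-60 ml/min/1.73 m2).
def egfr_mdrd(creatinine_mg_dl, age, female, black=False):
    """Estimated GFR in ml/min/1.73 m2 using the 4-variable MDRD equation."""
    gfr = 175.0 * creatinine_mg_dl ** -1.154 * age ** -0.203
    if female:
        gfr *= 0.742
    if black:
        gfr *= 1.212
    return gfr

def renal_category(gfr):
    if gfr <= 30:
        return "severe impairment"
    if gfr <= 60:
        return "moderate impairment"
    return "no significant impairment"

# Hypothetical patient for illustration: 45-year-old woman, creatinine 1.4 mg/dL.
gfr = egfr_mdrd(1.4, 45, female=True)
print(f"eGFR = {gfr:.0f} ml/min/1.73 m2 -> {renal_category(gfr)}")
```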
Galilean quantum gravity with cosmological constant and the extended q-Heisenberg algebra
We define a theory of Galilean gravity in 2+1 dimensions with cosmological constant as a Chern-Simons gauge theory of the doubly-extended Newton-Hooke group, extending our previous study of classical and quantum gravity in 2+1 dimensions in the Galilean limit. We exhibit an r-matrix which is compatible with our Chern-Simons action (in a sense to be defined) and show that the associated bi-algebra structure of the Newton-Hooke Lie algebra is that of the classical double of the extended Heisenberg algebra. We deduce that, in the quantisation of the theory according to the combinatorial quantisation programme, much of the quantum theory is determined by the quantum double of the extended q-deformed Heisenberg algebra.