
    First-line treatment of persistent and long-standing persistent atrial fibrillation with single-stage hybrid ablation: a 2-year follow-up study

    AIMS: This study evaluates the efficacy and safety of first-line single-stage hybrid ablation of (long-standing) persistent atrial fibrillation (AF) over a follow-up period of 2 years, and provides additional information on arrhythmia recurrences and electrophysiological findings at repeat ablation. METHODS AND RESULTS: This prospective cohort study included 49 patients (65% persistent AF; 35% long-standing persistent AF) who underwent hybrid ablation as first-line ablation treatment (no previous endocardial ablation). Patients were relatively young (57.0 ± 8.5 years) and predominantly male (89.8%). The median CHA2DS2-VASc score was 1.0 (0.5; 2.0), and the mean left atrial volume index was 43.7 ± 10.9 mL/m². Efficacy was assessed by 12-lead electrocardiography and 72-h Holter monitoring after 3, 6, 12, and 24 months. Recurrence was defined as AF/atrial flutter (AFL)/atrial tachycardia (AT) recorded by electrocardiography or Holter monitoring and lasting >30 s during the 2-year follow-up. At 2-year follow-up, single- and multiple-procedure success rates were 67% and 82%, respectively. Two (4%) patients experienced a major complication (bleeding) requiring intervention following hybrid ablation. Among the 16 (33%) patients with a recurrence, the recurrent arrhythmia was AT/AFL in 13 (81%) and AF in only 3 (19%). Repeat ablation was performed in 10 (20%) patients and resulted in sinus rhythm in 7 (70%) at 2-year follow-up. CONCLUSION: First-line single-stage hybrid AF ablation is an effective treatment strategy for patients with persistent and long-standing persistent AF, with an acceptable rate of major complications. Recurrences are predominantly AFL/AT and can be successfully ablated percutaneously. Hybrid ablation seems to be a feasible approach for first-line ablation of (long-standing) persistent AF.

    Evaluating Surveillance Strategies for the Early Detection of Low Pathogenicity Avian Influenza Infections

    In recent years, the early detection of low pathogenicity avian influenza (LPAI) viruses in poultry has become increasingly important, given their potential to mutate into highly pathogenic viruses. However, evaluations of LPAI surveillance have mainly focused on prevalence and not on the ability to act as an early warning system. We used a simulation model based on data from Italian LPAI epidemics in turkeys to evaluate different surveillance strategies in terms of their performance as early warning systems. The strategies differed in sample size, sampling frequency, diagnostic tests, and whether or not active surveillance (i.e., routine laboratory testing of farms) was performed, and were also tested under different epidemiological scenarios. We compared surveillance strategies by simulating within-farm outbreaks. The output measures were the proportion of infected farms that are detected and the farm reproduction number (Rh). The first provides an indication of the sensitivity of the surveillance system for detecting within-farm infections, whereas Rh reflects the effectiveness of outbreak detection (i.e., whether detection occurs soon enough to bring an epidemic under control). Increasing the sampling frequency was the most effective means of improving the timeliness of detection (i.e., detection occurs earlier), whereas increasing the sample size increased the likelihood of detection. Surveillance was only effective in preventing an epidemic if actions were taken within two days of sampling. The strategies were not affected by the quality of the diagnostic test, although performing both serological and virological assays increased the sensitivity of active surveillance. Early detection of LPAI outbreaks in turkeys can be achieved by increasing the sampling frequency for active surveillance, though very frequent sampling may not be sustainable in the long term. We suggest that, when no LPAI virus is yet circulating and the risk of virus introduction is low, a less frequent sampling regime may be acceptable, provided that surveillance is intensified as soon as the first outbreak is detected.
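The frequency-versus-size trade-off described above can be illustrated with a toy within-farm model (a deliberately simplified sketch, not the authors' simulation; the growth rate, starting prevalence, and test sensitivity below are arbitrary illustrative values):

```python
import random

def simulate_detection(sample_size, interval_days, growth_rate=0.4,
                       p0=0.001, test_sensitivity=0.9, horizon=60, seed=1):
    """Toy within-farm outbreak: prevalence grows exponentially (capped
    at 1); every `interval_days` we test `sample_size` birds and return
    the first day a positive sample is found (None if never detected)."""
    rng = random.Random(seed)
    prev = p0
    for day in range(1, horizon + 1):
        prev = min(1.0, prev * (1 + growth_rate))  # crude daily growth
        if day % interval_days == 0:
            # probability that at least one sampled bird tests positive
            p_detect = 1 - (1 - prev * test_sensitivity) ** sample_size
            if rng.random() < p_detect:
                return day
    return None
```

Comparing, say, 10 birds every 7 days against 40 birds every 28 days in this toy model shows the frequent small samples detecting earlier on average, echoing the finding that sampling frequency drives timeliness while sample size drives the likelihood of detection.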

    Spatial distribution of micrometre‐scale porosity and permeability across the damage zone of a reverse‐reactivated normal fault in a tight sandstone: Insights from the Otway Basin, SE Australia

    This research forms part of a PhD project supported by the Australian Research Council [Discovery Project DP160101158] and through an Australian Government Research Training Program Scholarship. Dave Healy acknowledges the support of the Natural Environment Research Council (NERC, UK) through the award NE/N003063/1 ‘Quantifying the Anisotropy of Permeability in Stressed Rock’. This study was also funded by scholarships from the Petroleum Exploration Society of Australia and the Australian Petroleum Production and Exploration Association. We thank Gordon Holm for preparing thin sections and Colin Taylor for carrying out particle size measurements and mercury injection capillary pressure analyses. Aoife McFadden and David Kelsey from Adelaide Microscopy, Braden Morgan, and Sophie Harland are acknowledged for their assistance with laboratory work. Field assistants James Hall, Rowan Hansberry, and Lachlan Furness are also gratefully acknowledged for their assistance with sample collection. Discussions with Ian Duddy on the mineralogy of the Eumeralla Formation are also greatly appreciated. This forms TRaX record 416. Peer reviewed. Publisher PDF.

    First GIS analysis of modern stone tools used by wild chimpanzees (Pan troglodytes verus) in Bossou, Guinea, West Africa

    Stone tool use by wild chimpanzees of West Africa offers a unique opportunity to explore the evolutionary roots of technology during human evolution. However, detailed analyses of chimpanzee stone artifacts are still lacking, thus precluding a comparison with the earliest archaeological record. This paper presents the first systematic study of stone tools used by wild chimpanzees to crack open nuts in Bossou (Guinea-Conakry), and applies pioneering analytical techniques to such artifacts. Automatic morphometric GIS classification enabled the creation of maps of use wear over the stone tools (anvils, hammers, and hammers/anvils), which were blind tested against GIS spatial analysis of damage patterns identified visually. Our analysis shows that chimpanzee stone tool use wear can be systematized and specific damage patterns discerned, allowing us to discriminate between active and passive pounders in lithic assemblages. In summary, our results demonstrate the heuristic potential of combined suites of GIS techniques for the analysis of battered artifacts, and have enabled the creation of a referential framework of analysis in which wild chimpanzee battered tools can, for the first time, be directly compared to the early archaeological record. Funding: Leverhulme Trust [IN-052]; MEXT [20002001, 24000001]; JSPS-U04-PWS; FCT-Portugal [SFRH/BD/36169/2007]; Wenner-Gren Foundation for Anthropological Research.

    Machine Learning in Automated Text Categorization

    The automated categorization (or classification) of texts into predefined categories has witnessed a booming interest in the last ten years, due to the increased availability of documents in digital form and the ensuing need to organize them. In the research community the dominant approach to this problem is based on machine learning techniques: a general inductive process automatically builds a classifier by learning, from a set of preclassified documents, the characteristics of the categories. The advantages of this approach over the knowledge engineering approach (consisting of the manual definition of a classifier by domain experts) are very good effectiveness, considerable savings in terms of expert manpower, and straightforward portability to different domains. This survey discusses the main approaches to text categorization that fall within the machine learning paradigm. We will discuss in detail issues pertaining to three different problems, namely document representation, classifier construction, and classifier evaluation. Comment: Accepted for publication in ACM Computing Surveys.
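The inductive process described above, learning category characteristics from preclassified documents, can be sketched with a minimal multinomial Naive Bayes classifier (an illustrative toy, not any specific system from the survey):

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """docs: list of (text, label) pairs. Returns per-class document
    counts and per-class word counts for multinomial Naive Bayes."""
    class_docs = Counter()
    word_counts = defaultdict(Counter)
    for text, label in docs:
        class_docs[label] += 1
        word_counts[label].update(text.lower().split())
    return class_docs, word_counts

def classify(text, class_docs, word_counts):
    """Pick the label maximizing log P(label) + sum of log P(word|label),
    with Laplace smoothing over the shared vocabulary."""
    total_docs = sum(class_docs.values())
    vocab = {w for counts in word_counts.values() for w in counts}
    best, best_lp = None, -math.inf
    for label in class_docs:
        lp = math.log(class_docs[label] / total_docs)
        n = sum(word_counts[label].values())
        for w in text.lower().split():
            lp += math.log((word_counts[label][w] + 1) / (n + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```

Trained on a handful of labelled documents, the classifier assigns an unseen text to the category whose word distribution makes it most probable, which is the essence of the inductive approach the survey contrasts with hand-built rules.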

    Conditions of malaria transmission in Dakar from 2007 to 2010

    Background: Previous studies in Dakar have highlighted the spatial and temporal heterogeneity of Anopheles gambiae s.l. biting rates. In order to improve the knowledge of the determinants of malaria transmission in this city, the present study reports the results of an extensive entomological survey that was conducted in 45 areas in Dakar from 2007 to 2010. Methods: Water collections were monitored for the presence of anopheline larvae. Adult mosquitoes were sampled by human landing collection. Plasmodium falciparum circumsporozoite protein (CSP) indexes were measured by ELISA (enzyme-linked immunosorbent assay), and the entomological inoculation rates (EIR) were calculated. Results: The presence of anopheline larvae was recorded in 1,015 out of 2,683 observations made from 325 water collections. A water pH equal to or above 8.0, a water temperature equal to or above 30 degrees C, the absence of larvivorous fishes, the wet season, the presence of surface vegetation, the persistence of water, and location in a slightly urbanised area were significantly associated with the presence and/or a higher density of anopheline larvae. Most of the larval habitats were observed in public, i.e., freely accessible, areas. A total of 496,310 adult mosquitoes were caught during 3,096 person-nights, and 44,967 of these specimens were identified as An. gambiae s.l. The mean An. gambiae s.l. human-biting rate ranged from 0.1 to 248.9 bites per person per night during the rainy season. Anopheles arabiensis (93.14%), Anopheles melas (6.83%) and An. gambiae s.s. M form (0.03%) were the three members of the An. gambiae complex. Fifty-two An. arabiensis and two An. melas specimens were CSP-positive, and the annual CSP index was 0.64% in 2007, 0.09% in 2008-2009 and 0.12% in 2009-2010. In the studied areas, the average EIR ranged from 0 to 17.6 infected bites per person during the entire transmission season.
Conclusion: The spatial and temporal heterogeneity of An. gambiae s.l. larval density, adult human-biting rate (HBR) and malaria transmission in Dakar has been confirmed, and the environmental factors associated with this heterogeneity have been identified. These results pave the way for the creation of malaria risk maps and for a focused anti-vectorial control strategy.
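The entomological inoculation rate used in such surveys is conventionally the product of the human-biting rate and the sporozoite (CSP) index accumulated over the exposure period; a minimal sketch (the numbers in the example are illustrative, not values from this study):

```python
def entomological_inoculation_rate(hbr, csp_index, nights):
    """EIR: expected number of infected bites received per person,
    given a human-biting rate (bites/person/night), the fraction of
    CSP-positive mosquitoes (CSP index), and the nights of exposure."""
    return hbr * csp_index * nights

# Illustrative values only: 10 bites/person/night, a CSP index of
# 0.64%, and a 120-night transmission season.
eir = entomological_inoculation_rate(10, 0.0064, 120)  # 7.68 infected bites
```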

    Equilibrium (Zipf) and Dynamic (Grassberger-Procaccia) method based analyses of human texts. A comparison of natural (English) and artificial (Esperanto) languages

    A comparison of two English texts by Lewis Carroll, one (Alice in Wonderland) also translated into Esperanto, the other (Through the Looking-Glass), is discussed in order to observe whether natural and artificial languages significantly differ from each other. One-dimensional time-series-like signals are constructed using only word frequencies (FTS) or word lengths (LTS). The data is studied through (i) a Zipf method for sorting out correlations in the FTS and (ii) a Grassberger-Procaccia (GP) technique based method for finding correlations in the LTS. Features are compared: different power laws are observed with characteristic exponents for the ranking properties and the phase-space attractor dimensionality. The Zipf exponent can take values much less than unity (ca. 0.50 or 0.30) depending on how a sentence is defined. This non-universality is conjectured to be a measure of the author's style. Moreover, the attractor dimension r is a simple function of the so-called phase-space dimension n, i.e., r = n^λ, with λ = 0.79. Such an exponent is also conjectured to be a measure of the author's creativity. However, even though there are quantitative differences between the original English text and its Esperanto translation, the qualitative differences are very minute, indicating in this case a translation that, along our analysis lines, relatively well respects the content of the author's writing. Comment: 22 pages, 87 references, 5 tables, 8 figures
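The Zipf part of such an analysis amounts to ranking words by frequency and fitting a power law f(r) ∝ r^(−s); a minimal least-squares sketch in log-log space (an illustration of the general method, not the authors' exact procedure, which also depends on how sentences are delimited):

```python
import math
from collections import Counter

def zipf_exponent(text):
    """Rank words by frequency and fit log f ≈ c − s·log r by ordinary
    least squares; returns the Zipf exponent s."""
    freqs = sorted(Counter(text.lower().split()).values(), reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope
```

On a text whose word frequencies follow an exact 1/r law the fit returns s = 1; real texts, as the abstract notes, can give markedly smaller exponents depending on how the signal is constructed.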

    Self-Interest versus Group-Interest in Antiviral Control

    Antiviral agents have been hailed as holding considerable promise for the treatment and prevention of emerging viral diseases like H5N1 avian influenza and SARS. However, antiviral drugs are not completely harmless, and the conditions under which individuals are willing to participate in a large-scale antiviral drug treatment program are as yet unknown. We provide population dynamical and game theoretical analyses of large-scale prophylactic antiviral treatment programs. Throughout, we compare the antiviral control strategy that is optimal from the public health perspective with the control strategy that would evolve if individuals make their own, rational decisions. To this end we investigate the conditions under which a large-scale antiviral control program can prevent an epidemic, and we analyze at what point in an unfolding epidemic the risk of infection starts to outweigh the cost of antiviral treatment. This enables investigation of how the optimal control strategy is moulded by the efficacy of antiviral drugs, the risk of mortality from antiviral prophylaxis, and the transmissibility of the pathogen. Our analyses show that there can be a strong incentive for an individual to take fewer antiviral drugs than is optimal from the public health perspective. In particular, when public health calls for early and aggressive control to prevent or curb an emerging pathogen, antiviral drug treatment is attractive to the individual only when the risk of infection has become non-negligible. It is even possible that, from a public health perspective, a situation in which everybody takes antiviral drugs is optimal, while the process of individual choice leads to a situation where nobody is willing to take antiviral drugs.
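The tension between the two perspectives can be sketched numerically (a stylized illustration under assumed parameter values, not the paper's model): public health cares about the prophylaxis coverage needed to push the reproduction number below one, while the self-interested individual weighs expected infection cost against the certain cost of the drug.

```python
def critical_coverage(r0, efficacy):
    """Fraction of the population that must take prophylaxis so that
    R0 * (1 - efficacy * c) < 1; returns None if unattainable even
    at full coverage."""
    c = (1 - 1 / r0) / efficacy
    return c if c <= 1 else None

def individual_accepts(p_infection, cost_infection, cost_prophylaxis):
    """Self-interested rule: take the drug only when the expected cost
    of infection exceeds the certain cost/risk of the drug itself."""
    return p_infection * cost_infection > cost_prophylaxis
```

With, say, R0 = 2 and 80% drug efficacy the public-health optimum requires 62.5% coverage; yet early in an epidemic, when the perceived infection risk is still small, the individual rule returns "no", which is exactly the free-riding incentive the authors analyze.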

    Retention and diffusion of radioactive and toxic species on cementitious systems: Main outcome of the CEBAMA project

    Cement-based materials are key components in radioactive waste repository barrier systems. To improve the available knowledge base, the European CEBAMA (Cement-based materials) project aimed to provide insight into general processes and phenomena that can be readily transferred to different applications. A bottom-up approach was used to study radionuclide retention by cementitious materials, encompassing both individual cement mineral phases and hardened cement pastes. Solubility experiments were conducted with Be, Mo and Se under high-pH conditions to provide realistic solubility limits and radionuclide speciation schemes as a prerequisite for meaningful adsorption studies. A number of retention mechanisms were addressed, including adsorption, solid solution formation, and precipitation of radionuclides within new solid phases formed during cement hydration and evolution. Sorption/desorption experiments were carried out on several anionic radionuclides and/or toxic elements which have received less attention to date, namely Be, Mo, Tc, I, Se, Cl, Ra and 14C. Solid solution formation on AFm phases between radionuclides in a range of oxidation states (Se, I and Mo) and the main aqueous components (OH−, SO4²⁻, Cl−) of cementitious systems was also investigated.
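Retention in batch sorption experiments of the kind described is conventionally quantified by a distribution ratio Rd; a minimal sketch of the standard calculation (the numbers in the example are illustrative, not CEBAMA results):

```python
def distribution_ratio(c0, c_eq, volume_l, mass_kg):
    """Rd (L/kg) from a batch sorption experiment: the amount of
    radionuclide removed from solution per kg of solid, divided by
    the equilibrium solution concentration:
        Rd = (c0 - c_eq) / c_eq * V / m"""
    return (c0 - c_eq) / c_eq * volume_l / mass_kg

# Illustrative: an initial 100 Bq/L drops to 20 Bq/L when 20 mL of
# solution is contacted with 1 g of hardened cement paste.
rd = distribution_ratio(100.0, 20.0, 0.02, 0.001)  # 80.0 L/kg
```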