144 research outputs found

    Floating with seeds: understanding hydrochorous mangrove propagule dispersal: a field and modeling approach

    Presentation with poster. Young Marine Scientists' Day, Vlaams Instituut voor de Zee (VLIZ), 24 February, Bruges, Belgium.

    Definición de una arquitectura de referencia para plataformas de servicios de datos

    Big Data refers to data sets whose volume, velocity, and variety make them difficult to capture, manage, and process using conventional technologies and tools. This concept has generated new needs in organizations to enable the capture, storage, and analysis of data with these characteristics and thus obtain relevant information for decision-making. A challenge for organizations is the implementation of an architecture that covers these needs, since they must consider the different existing technologies and establish policies for the governance of data that are in the hands of users. A reference architecture for a data analytics platform that is decoupled from specific technological tools serves as a guide that allows organizations to chart a path toward managing large volumes of data and thereby obtain effective tools for business decision-making. The reference architecture is general enough to be implemented with different technologies, computing paradigms, and analytical software, depending on the requirements and purposes of each organization. In the project, the architecture was implemented with data from emergency care in hospitals in the city of Medellín. One result of the research is that the proposed architecture considers different types of users and data sources, does not create dependence on the particular technological tools used, and establishes a layer for data governance.
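A tool-agnostic layered design like the one described can be illustrated with a minimal sketch. The layer names and responsibilities below are hypothetical stand-ins, not taken from the thesis; they only show how such an architecture can be expressed independently of any specific technology:

```python
from dataclasses import dataclass

@dataclass
class Layer:
    """One layer of a hypothetical, tool-agnostic reference architecture."""
    name: str
    responsibilities: list

# Illustrative layering only; the thesis defines its own layers.
reference_architecture = [
    Layer("ingestion",   ["capture batch and streaming sources from multiple user types"]),
    Layer("storage",     ["raw and curated zones for high-volume data"]),
    Layer("analytics",   ["batch processing", "interactive queries"]),
    Layer("consumption", ["dashboards and APIs for decision-makers"]),
    Layer("governance",  ["access policies", "data quality", "lineage"]),
]

def describe(architecture):
    """Render the stack top-down, one line per layer."""
    return "\n".join(f"{layer.name}: {'; '.join(layer.responsibilities)}"
                     for layer in architecture)
```

Because each layer is named by responsibility rather than by product, any layer can be realized with whichever technology fits the organization, which is the decoupling the abstract emphasizes.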

    Enhancing Federated Cloud Management with an Integrated Service Monitoring Approach

    Cloud Computing enables the construction and the provisioning of virtualized service-based applications through simple and cost-effective outsourcing to dynamic service environments. Cloud Federations envisage a distributed, heterogeneous environment consisting of various cloud infrastructures by aggregating different IaaS provider capabilities coming from both the commercial and the academic area. In this paper, we introduce a federated cloud management solution that operates the federation through utilizing cloud-brokers for various IaaS providers. In order to enable an enhanced provider selection and inter-cloud service executions, an integrated monitoring approach is proposed which is capable of measuring the availability and reliability of the provisioned services in different providers. To this end, a minimal metric monitoring service has been designed and used together with a service monitoring solution to measure cloud performance. The transparent and cost-effective operation on commercial clouds and the capability to simultaneously monitor both private and public clouds were the major design goals of this integrated cloud monitoring approach. Finally, the evaluation of our proposed solution is presented on different private IaaS systems participating in federations. © 2013 Springer Science+Business Media Dordrecht

    High intensity neutrino oscillation facilities in Europe

    The EUROnu project has studied three possible options for future, high intensity neutrino oscillation facilities in Europe. The first is a Super Beam, in which the neutrinos come from the decay of pions created by bombarding targets with a 4 MW proton beam from the CERN High Power Superconducting Proton Linac. The far detector for this facility is the 500 kt MEMPHYS water Cherenkov detector, located in the Fréjus tunnel. The second facility is the Neutrino Factory, in which the neutrinos come from the decay of μ+ and μ− beams in a storage ring. The far detector in this case is a 100 kt magnetized iron neutrino detector at a baseline of 2000 km. The third option is a Beta Beam, in which the neutrinos come from the decay of beta emitting isotopes, in particular He6 and Ne18, also stored in a ring. The far detector is also the MEMPHYS detector in the Fréjus tunnel. EUROnu has undertaken conceptual designs of these facilities and studied the performance of the detectors. Based on this, it has determined the physics reach of each facility, in particular for the measurement of CP violation in the lepton sector, and estimated the cost of construction. These have demonstrated that the best facility to build is the Neutrino Factory. However, if a powerful proton driver is constructed for another purpose or if the MEMPHYS detector is built for astroparticle physics, the Super Beam also becomes very attractive.

    Alpha-Photon Coincidence Spectroscopy Along Element 115 Decay Chains

    Produced in the reaction 48Ca+243Am, thirty correlated α-decay chains were observed in an experiment conducted at the GSI Helmholtzzentrum für Schwerionenforschung, Darmstadt, Germany. The decay chains are basically consistent with previous findings and are considered to originate from isotopes of element 115 with mass numbers 287, 288, and 289. A set-up aiming specifically at high-resolution charged particle and photon coincidence spectroscopy was placed behind the gas-filled separator TASCA. For the first time, γ rays as well as X-ray candidates were observed in prompt coincidence with the α-decay chains of element 115.

    NEDA—NEutron Detector Array

    The NEutron Detector Array, NEDA, will form the next generation neutron detection system, designed to be operated in conjunction with γ-ray arrays, such as the tracking array AGATA, to aid nuclear spectroscopy studies. NEDA has been designed to be a versatile device, with high detection efficiency, excellent neutron-γ discrimination, and high rate capabilities. It will be employed in physics campaigns in order to maximise the scientific output, making use of the different stable and radioactive ion beams available in Europe. The first implementation of the neutron detector array NEDA with AGATA 1π was realised at GANIL. This manuscript reviews the various aspects of NEDA.

    Pulse pile-up identification and reconstruction for liquid scintillator based neutron detectors

    The issue of pulse pile-up is frequently encountered in nuclear experiments involving high counting rates, where it distorts the pulse shapes and the energy spectra. A digital method for off-line processing of pile-up pulses is presented. The pile-up pulses were first identified by detecting the downward-going zero-crossings in the first-order derivative of the original signal, and then the constituent pulses were reconstructed by comparing the pile-up pulse with four models generated by combining pairs of neutron and γ standard pulses with a controllable time interval. The accuracy of this method in resolving pile-up events was investigated as a function of the time interval between the two pulses constituting a pile-up event. The obtained results show that the method is capable of disentangling two pulses separated by as little as 20 ns, as well as classifying them as neutrons or γ rays. Furthermore, the error in reconstructing pile-up pulses could be kept below 6% when successive peaks were separated by more than 50 ns. By applying the method to a high-counting-rate measurement with the NEutron Detector Array (NEDA), it was empirically found that the method reconstructs pile-up pulses and performs neutron-γ discrimination quite accurately. It can also significantly correct the pulse height spectrum distorted by pile-up events.
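The identification step described above, locating downward-going zero-crossings in the first-order derivative, can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function and parameter names are invented, and the synthetic exponential-tail pulses stand in for real digitized detector signals.

```python
import numpy as np

def find_pileup_candidates(signal, threshold=0.0):
    """Return sample indices of downward-going zero-crossings in the
    first-order derivative of a digitized pulse. Each crossing marks a
    local maximum, so more than one crossing suggests pile-up.
    (Hypothetical sketch; names are illustrative, not from the paper.)"""
    deriv = np.diff(signal)
    # Derivative goes from positive to non-positive: a peak in the signal.
    return np.where((deriv[:-1] > threshold) & (deriv[1:] <= threshold))[0] + 1

# Synthetic test signal: two overlapping pulses with exponential tails,
# the second arriving 30 samples after the first.
t = np.arange(200, dtype=float)

def pulse(t0, decay=25.0):
    return np.where(t >= t0, np.exp(-(t - t0) / decay), 0.0)

piled = pulse(40) + 0.7 * pulse(70)

peaks = find_pileup_candidates(piled)
is_pileup = len(peaks) > 1  # two peaks found -> flag as a pile-up event
```

With clean signals the two onsets at samples 40 and 70 are both recovered; on real, noisy traces the derivative would typically be smoothed first and the threshold raised so that noise-induced crossings are not flagged as peaks.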

    Recoil-α-fission and recoil-α-α-fission events observed in the reaction 48Ca + 243Am

    Products of the fusion-evaporation reaction 48Ca + 243Am were studied with the TASISpec set-up at the gas-filled separator TASCA at the GSI Helmholtzzentrum für Schwerionenforschung, Darmstadt, Germany. Amongst the thirty detected correlated α-decay chains associated with the production of element Z=115, two recoil-α-fission and five recoil-α-α-fission events were observed. The latter five chains are similar to four such events reported from experiments performed at the Dubna gas-filled separator, and three such events reported from an experiment at the Berkeley gas-filled separator. The four chains observed at the Dubna gas-filled separator were assigned to start from the 2n-evaporation channel 289115 because these recoil-α-α-fission events were observed only at low excitation energies. Contrary to this interpretation, we suggest that some of these recoil-α-α-fission decay chains, as well as some of the recoil-α-α-fission and recoil-α-fission decay chains reported from Berkeley and in this article, start from the 3n-evaporation channel 288115.

    Future response of global coastal wetlands to sea-level rise.

    The response of coastal wetlands to sea-level rise during the twenty-first century remains uncertain. Global-scale projections suggest that between 20 and 90 per cent (for low and high sea-level rise scenarios, respectively) of the present-day coastal wetland area will be lost, which will in turn result in the loss of biodiversity and highly valued ecosystem services [1-3]. These projections do not necessarily take into account all essential geomorphological [4-7] and socio-economic system feedbacks [8]. Here we present an integrated global modelling approach that considers both the ability of coastal wetlands to build up vertically by sediment accretion, and the accommodation space, namely, the vertical and lateral space available for fine sediments to accumulate and be colonized by wetland vegetation. We use this approach to assess global-scale changes in coastal wetland area in response to global sea-level rise and anthropogenic coastal occupation during the twenty-first century. On the basis of our simulations, we find that, globally, rather than losses, wetland gains of up to 60 per cent of the current area are possible, if more than 37 per cent (our upper estimate for current accommodation space) of coastal wetlands have sufficient accommodation space, and sediment supply remains at present levels. In contrast to previous studies [1-3], we project that until 2100, the loss of global coastal wetland area will range between 0 and 30 per cent, assuming no further accommodation space in addition to current levels. Our simulations suggest that the resilience of global wetlands is primarily driven by the availability of accommodation space, which is strongly influenced by the building of anthropogenic infrastructure in the coastal zone, and such infrastructure is expected to change over the twenty-first century.
Rather than being an inevitable consequence of global sea-level rise, our findings indicate that large-scale loss of coastal wetlands might be avoidable, if sufficient additional accommodation space can be created through careful nature-based adaptation solutions to coastal management. This work was supported by a personal research fellowship of Mark Schuerch (Project Number 272052902), by the Cambridge Coastal Research Unit (Visiting Scholar Programme), and in part by the EU research project RISES-AM- (FP7-ENV-693396).