1,017 research outputs found
Environmental geophysics: non-destructive techniques for the reconnaissance of zones contaminated by waste dumping [Geofísica ambiental: técnicas no destructivas para el reconocimiento de zonas contaminadas por vertidos]
Industrial countries face the consequences of decades of inappropriate handling of hazardous waste. The dumping of all types of hazardous materials has been ongoing in most industrialised countries for hundreds of years, and large quantities of industrial and other waste material have been buried in landfill sites. A relatively large number of these lack reliable man-made or natural geological barriers, so toxic fluids are escaping and polluting the groundwater. The problem is greatly aggravated when a soil covering is placed over the waste and there is no information about the dumping practices used in the past. One of the first tasks in any remedial action is to delineate the physical extent of the site and its encroachment into the surrounding area. Test borings and limited excavations are very valuable, but the information obtained is not continuous, and their destructive nature makes it possible that waste could inadvertently be released during the probing phase. In this regard, both borehole drilling and excavation are dangerous to workers and the environment, as well as expensive and tedious to conduct. Many of these problems may be alleviated by using a geophysics-assisted system approach to determine where the pollutants will go in the subsurface, gain a more complete understanding of site conditions, and assess the optimal placement of exploration drills and monitoring wells. At hazardous waste sites, the main objectives commonly include:
- determining the presence, location, distribution, depth and composition of possible buried wastes;
- determining the presence and extent of contaminant and leachate plumes within the unsaturated and saturated zones;
- characterising and assessing the local (and regional) geohydrologic regime for groundwater flow patterns, recharge areas and localised permeable pathways.
Bio-based synthesis of oxidation resistant copper nanowires using an aqueous plant extract
Copper nanowires have recently emerged as promising nanomaterials for transparent conducting electrode applications; however, their production commonly involves the use of harmful reagents. In this study, we describe for the first time a simple and cost-effective bio-based synthesis of copper nanowires using an aqueous plant extract (Eucalyptus globulus) as the reducing/stabilizing agent and oleic acid and oleylamine as surfactants. Well-dispersed crystalline copper nanowires (λmáx = 584–613 nm) were obtained, with average diameters in the nanometric range (44 and 145 nm) and lengths in the micrometric range (from 5 to dozens of micrometres), using extract concentrations between 10 and 50 mg mL⁻¹. Moreover, the aspect ratio of these nanowires can be adjusted (from around 14–20 to 160–400) by changing the experimental conditions, namely the use of oleic acid. Phenolic compounds were found to play a key role in this bioreduction process, allowing practically only nanowires (without other morphologies) to be obtained. Nevertheless, the use of oleic acid/oleylamine is essential to manipulate their size and aspect ratio. Most importantly, these bio-based copper nanowires were found to be resistant both under storage in ethanol and when exposed to air, in each case for 2 weeks, most likely due to the adsorption of antioxidant (phenolic) biomolecules at their surface, thus avoiding the use of other polymeric protective layers. The conductivity of the CuNWs was found to be 0.009 S cm⁻¹. As a result, this study opens a new standpoint in this field, "closing the door" on the use of hazardous reagents and synthetic polymeric protective layers in the production of stable copper nanowires with potential application as conductive materials.
The role of natural regeneration to ecosystem services provision and habitat availability: a case study in the Brazilian Atlantic Forest
Natural regeneration provides multiple benefits to nature and human societies, and can play a major role in global and national restoration targets. However, these benefits are context specific and impacted by both biophysical and socioeconomic heterogeneity across landscapes. Here we investigate the benefits of natural regeneration for climate change mitigation, sediment retention and biodiversity conservation in a spatially explicit way at very high resolution for a region within the global biodiversity hotspot of the Atlantic Forest. We classified current land-use cover in the region and simulated a natural regeneration scenario in abandoned pasturelands, areas where potential conflicts with agricultural production would be minimized and where some early-stage regeneration is already occurring. We then modelled changes in biophysical functions for climate change mitigation and sediment retention, and performed an economic valuation of both ecosystem services. We also modelled how land-use changes affect habitat availability for species. We found that natural regeneration can provide significant ecological and social benefits. Economic values of climate change mitigation and sediment retention alone could completely compensate for the opportunity costs of agricultural production over 20 years. Habitat availability is improved for three species with different dispersal abilities, although by different magnitudes. Improving the understanding of how the costs and benefits of natural regeneration are distributed can be useful for designing incentive structures that bring farmers' decision making more in line with societal benefits. This alignment is crucial for natural regeneration to fulfil its potential as a large-scale solution for pressing local and global environmental challenges.
Antioxidant and antimicrobial films based on brewers' spent grain arabinoxylans, nanocellulose and feruloylated compounds for active packaging
In this study, brewers' spent grain (BSG) arabinoxylan-based nanocomposite films were prepared by solvent casting of arabinoxylan (AX) suspensions containing different amounts of nanofibrillated cellulose (NFC; 5, 10, 25, 50 and 75% mass fraction). The obtained nanocomposite films were homogeneous, thermally stable up to 230 °C, and showed good mechanical properties (Young's modulus up to 7.5 GPa). Additionally, the films with 50% NFC were loaded with ferulic acid or a feruloylated arabinoxylo-oligosaccharide-enriched fraction from BSG (75 mg per g of film). This combination enhanced the UV–Vis barrier properties and imparted additional functionalities to the films, namely (i) antioxidant activity up to 90% (DPPH scavenging activity), (ii) antibacterial activity against Gram-positive (Staphylococcus aureus) and Gram-negative (Escherichia coli) bacteria, and (iii) antifungal activity towards the polymorphic fungus Candida albicans. Therefore, these fully biobased nanocomposite films show potential for application as active food packaging systems.
Project manager-to-project allocations in practice: an empirical study of the decision-making practices of a multi-project based organization
Empirical studies that examine how managers make project manager-to-project (PM2P) allocation decisions in multi-project settings are currently limited, yet such decisions are crucial to organizational success. An earlier empirical study of the PM2P practice, conducted in the context of Botswana, revealed ineffective processes in terms of optimality in decision-making, and a conceptual model to guide effective PM2P practices was developed. The focus of this study is on deploying that model as a lens to examine the PM2P practices of a large organization, with a view to identifying and illustrating strengths and weaknesses. A case study was undertaken in the mining industry, where the core project activities are underground mineral explorations at identified geographical regions. A semi-structured interview protocol was used to collect data from 15 informants, selected by enumeration. Integrated analysis of both data types (univariate descriptive analysis for the quantitative data; content and thematic analysis for the qualitative data) revealed strengths in PM2P practices, demonstrated by informants' recognition of some important criteria to be considered. The key weaknesses were exemplified by a lack of effective management tools and techniques for matching project managers to projects. The findings provide a novel perspective through which improvements in working practices can be made.
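The abstract above notes the lack of effective tools for matching project managers to projects. One minimal decision-support sketch, with entirely hypothetical managers, projects, and suitability scores (none of which appear in the study), is to treat PM2P allocation as a small one-to-one assignment problem solved by exhaustive search:

```python
from itertools import permutations

# Hypothetical criteria-weighted suitability scores (0-10); illustrative only,
# not data from the Botswana study.
managers = ["PM_A", "PM_B", "PM_C"]
projects = ["Exploration_1", "Exploration_2", "Exploration_3"]
score = {
    "PM_A": {"Exploration_1": 8, "Exploration_2": 5, "Exploration_3": 6},
    "PM_B": {"Exploration_1": 6, "Exploration_2": 9, "Exploration_3": 4},
    "PM_C": {"Exploration_1": 5, "Exploration_2": 6, "Exploration_3": 9},
}

def best_allocation(managers, projects, score):
    """Exhaustively search one-to-one PM-to-project allocations (fine for small n)."""
    best, best_total = None, float("-inf")
    for perm in permutations(projects):
        total = sum(score[m][p] for m, p in zip(managers, perm))
        if total > best_total:
            best, best_total = dict(zip(managers, perm)), total
    return best, best_total

allocation, total = best_allocation(managers, projects, score)
print(allocation, total)  # -> each PM on their highest-scoring project, total 26
```

For realistic numbers of managers and projects, the same objective would be handed to a dedicated assignment solver (e.g. the Hungarian algorithm) rather than enumeration.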
Deriving the mass of particles from Extended Theories of Gravity in LHC era
We derive a geometrical approach to produce the mass of particles that could be suitably tested at the LHC. Starting from a 5D unification scheme, we show that all the known interactions can be deduced as an induced symmetry breaking of the non-unitary GL(4) group of diffeomorphisms. The deformations inducing such a breaking act as vector bosons that, depending on the gravitational mass states, can assume the role of interaction bosons such as gluons, electroweak bosons or the photon. The further gravitational degrees of freedom, emerging from the reduction mechanism in 4D, eliminate the hierarchy problem, since they generate a cut-off comparable with the electroweak one at TeV scales. In this "economic" scheme, gravity should induce the other interactions in a non-perturbative way.
Comment: 30 pages, 1 figure
Origins of the Ambient Solar Wind: Implications for Space Weather
The Sun's outer atmosphere is heated to temperatures of millions of degrees,
and solar plasma flows out into interplanetary space at supersonic speeds. This
paper reviews our current understanding of these interrelated problems: coronal
heating and the acceleration of the ambient solar wind. We also discuss where
the community stands in its ability to forecast how variations in the solar
wind (i.e., fast and slow wind streams) impact the Earth. Although the last few
decades have seen significant progress in observations and modeling, we still
do not have a complete understanding of the relevant physical processes, nor do
we have a quantitatively precise census of which coronal structures contribute
to specific types of solar wind. Fast streams are known to be connected to the
central regions of large coronal holes. Slow streams, however, appear to come
from a wide range of sources, including streamers, pseudostreamers, coronal
loops, active regions, and coronal hole boundaries. Complicating our
understanding even more is the fact that processes such as turbulence,
stream-stream interactions, and Coulomb collisions can make it difficult to
unambiguously map a parcel measured at 1 AU back down to its coronal source. We
also review recent progress -- in theoretical modeling, observational data
analysis, and forecasting techniques that sit at the interface between data and
theory -- that gives us hope that the above problems are indeed solvable.
Comment: Accepted for publication in Space Science Reviews. Special issue connected with a 2016 ISSI workshop on "The Scientific Foundations of Space Weather." 44 pages, 9 figures
LSST: from Science Drivers to Reference Design and Anticipated Data Products
(Abridged) We describe here the most ambitious survey currently planned in
the optical, the Large Synoptic Survey Telescope (LSST). A vast array of
science will be enabled by a single wide-deep-fast sky survey, and LSST will
have unique survey capability in the faint time domain. The LSST design is
driven by four main science themes: probing dark energy and dark matter, taking
an inventory of the Solar System, exploring the transient optical sky, and
mapping the Milky Way. LSST will be a wide-field ground-based system sited at
Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m
effective) primary mirror, a 9.6 deg^2 field of view, and a 3.2 Gigapixel
camera. The standard observing sequence will consist of pairs of 15-second
exposures in a given field, with two such visits in each pointing in a given
night. With these repeats, the LSST system is capable of imaging about 10,000
square degrees of sky in a single filter in three nights. The typical 5σ
point-source depth in a single visit in r will be ~24.5 (AB). The
project is in the construction phase and will begin regular survey operations
by 2022. The survey area will be contained within 30,000 deg^2 with
δ < +34.5°, and will be imaged multiple times in six bands, ugrizy,
covering the wavelength range 320--1050 nm. About 90% of the observing time
will be devoted to a deep-wide-fast survey mode which will uniformly observe
an 18,000 deg^2 region about 800 times (summed over all six bands) during the
anticipated 10 years of operations, and yield a coadded map to r ~ 27.5. The
remaining 10% of the observing time will be allocated to projects such as a
Very Deep and Fast time domain survey. The goal is to make LSST data products,
including a relational database of about 32 trillion observations of 40 billion
objects, available to the public and scientists around the world.
Comment: 57 pages, 32 color figures, version with high-resolution figures available from https://www.lsst.org/overview
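As a sanity check, the cadence figures quoted in the LSST abstract above are mutually consistent; a short overhead-free, back-of-the-envelope script (the real scheduler of course includes slew, readout and weather losses) recovers the "10,000 square degrees in a single filter in three nights" figure from the field of view, survey area and visit count:

```python
# All input numbers are taken from the abstract; the arithmetic is illustrative.
FOV_DEG2 = 9.6            # field of view per pointing (deg^2)
MAIN_SURVEY_DEG2 = 18_000 # deep-wide-fast survey area (deg^2)
VISITS_PER_FIELD = 800    # visits per field, summed over all six bands
YEARS = 10                # anticipated operations

pointings = MAIN_SURVEY_DEG2 / FOV_DEG2            # ~1875 fields tile the survey
total_visits = pointings * VISITS_PER_FIELD        # ~1.5 million visits in total
visits_per_night = total_visits / (YEARS * 365)    # ~411 visits per night
area_per_night = visits_per_night * FOV_DEG2       # ~3900 deg^2 covered per night

# Nights needed to cover 10,000 deg^2 once in a single filter:
nights_for_10k = 10_000 / area_per_night
print(f"{pointings:.0f} fields, {total_visits/1e6:.2f} M visits")
print(f"~{area_per_night:.0f} deg^2/night -> 10,000 deg^2 in {nights_for_10k:.1f} nights")
```

The result, roughly two and a half nights per single-filter pass of 10,000 deg^2, matches the cadence stated in the abstract.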