Identifying success factors in crowdsourced geographic information use in government
Crowdsourcing geographic information in government focuses on projects that engage people who are not government officials or employees in collecting, editing, and sharing information with governmental bodies. Projects of this type emerged in the past decade as a result of technological and societal changes, such as the increased use of smartphones combined with citizens' growing levels of education and technical ability to use them. They have also flourished because of the need for up-to-date data at relatively short notice when financial resources are scarce. They range from recording the experience of feeling an earthquake to recording the locations of businesses during the summer. Fifty cases in which crowdsourced geographic information was used by governmental bodies across the world are analysed. About 60% of the cases were examined both in 2014 and in 2017, to allow for comparison and the identification of success and failure. The analysis looked at different aspects and their relationship to success: the drivers to start a project; scope and aims; stakeholders and relationships; inputs into the project; technical and organisational aspects; and problems encountered. The key factors of the case studies were analysed using Qualitative Comparative Analysis (QCA), an analytical method from sociological research that combines quantitative and qualitative tools. From the analysis, we can conclude that there is no "magic bullet" or perfect methodology for a successful crowdsourcing-in-government project. Unless the organisation has reached maturity in the area of crowdsourcing, identifying a champion and starting with a project that does not address authoritative datasets directly is a good way to ensure early success and to begin the process of organisational learning on how to run such projects. The importance of governmental support and trust is undisputed. If the choice is to use new technologies, this should be accompanied by an investment of appropriate resources within the organisation to ensure that the investment bears fruit. Alternatively, using an existing technology that has been successful elsewhere and investing in training and capacity building is another path to success. We also identified the importance of partnering with intermediary Non-Governmental Organizations (NGOs) that have experience and knowledge of working with crowdsourcing. These organisations have the knowledge and skills to implement projects at the boundary between government and the crowd, and can therefore offer the experience needed to ensure better implementation. Changes and improvements to public services, or a focus on environmental monitoring, can be a good basis for a project. Capturing base mapping is a good starting point, too. The recommendations of the report address organisational issues, resources, and legal aspects.
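As an illustration of the QCA step described above, the following is a minimal crisp-set QCA sketch in Python. The condition names (champion, new_tech, ngo_partner) and the toy case data are hypothetical, chosen only to show the truth-table construction, and are not taken from the report.

    from collections import defaultdict

    # Hypothetical binary-coded cases:
    # (champion, new_tech, ngo_partner) -> success (1) or failure (0)
    cases = [
        ((1, 0, 1), 1),
        ((1, 0, 1), 1),
        ((0, 1, 0), 0),
        ((1, 1, 1), 1),
        ((0, 0, 0), 0),
    ]

    # Group cases into a truth table: one row per condition combination.
    rows = defaultdict(list)
    for conditions, outcome in cases:
        rows[conditions].append(outcome)

    # Consistency of each row: share of its cases with a positive outcome.
    for conditions, outcomes in sorted(rows.items()):
        consistency = sum(outcomes) / len(outcomes)
        print(conditions, "n =", len(outcomes), "consistency =", round(consistency, 2))

Rows with high consistency are the candidate combinations of conditions associated with success; full QCA would then minimise these rows into simpler causal expressions.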
Assessing the consistency of satellite-derived upper tropospheric humidity measurements
Four upper tropospheric humidity (UTH) datasets derived from satellite sounders are evaluated to assess their consistency as part of the activities for the Global Energy and Water Exchanges (GEWEX) water vapor assessment project. The datasets include UTH computed from brightness temperature measurements of the 183.31±1 GHz channel of the Special Sensor Microwave – Humidity (SSM/T-2), Advanced Microwave Sounding Unit-B (AMSU-B), and Microwave Humidity Sounder (MHS) and from channel 12 of the High-resolution Infrared Radiation Sounder (HIRS). The four datasets are generally consistent in the interannual temporal and spatial variability of the tropics. Large positive anomalies peaked over the central equatorial Pacific region during El Niño events, in phase with the increase in sea surface temperature (SST). Conversely, large negative anomalies were obtained during El Niño events when the tropical-domain average is taken. The weakened ascending branch of the Pacific Walker circulation in the western Pacific and the enhanced descending branches of the local Hadley circulation along the Pacific subtropics largely contributed to widespread drying areas, and thus negative anomalies, in the upper troposphere during El Niño events, as shown in all four datasets. During a major El Niño event, UTH had higher correlations with the coincident precipitation (0.60 to 0.75) and with 200 hPa velocity potential (−0.42 to −0.64) than with SST (0.37 to 0.49). Due to differences in retrieval definitions and gridding procedures, there can be a difference of 3 %–5 % UTH between datasets on average, and larger magnitudes of anomaly values are usually observed in spatial maps of microwave UTH data. Nevertheless, the tropical-domain averaged anomalies of the datasets are close to each other, with their differences being mostly less than 0.5 %, and, more importantly, the phases of the time series are generally consistent for variability studies.
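The tropical-domain averaged anomalies and correlations described above can be computed along the following lines. This is a minimal sketch with synthetic, randomly generated fields standing in for the gridded UTH and SST data; the grid dimensions and variable names are assumptions, not details from the assessment.

    import numpy as np

    # Hypothetical monthly UTH field (% RH): (months, lat, lon), 10 years
    # on a 2-degree tropical grid; real data would come from the sounders.
    rng = np.random.default_rng(0)
    uth = 30 + 5 * rng.standard_normal((120, 30, 144))

    # Remove the mean annual cycle to obtain monthly anomalies.
    climatology = uth.reshape(10, 12, 30, 144).mean(axis=0)
    anomalies = (uth.reshape(10, 12, 30, 144) - climatology).reshape(120, 30, 144)

    # Area-weight by cos(latitude) for the tropical-domain average.
    lats = np.linspace(-29, 29, 30)
    weights = np.cos(np.deg2rad(lats))[None, :, None]
    domain_mean = (anomalies * weights).sum(axis=(1, 2)) / (weights.sum() * 144)

    # Correlate with a (hypothetical) coincident SST anomaly series.
    sst_anom = rng.standard_normal(120)
    r = np.corrcoef(domain_mean, sst_anom)[0, 1]
    print("tropical-mean UTH anomaly vs SST anomaly: r =", round(r, 2))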
Ceramic Paste for Patching High-Temperature Insulation
A ceramic paste that can be applied relatively easily, either by itself or in combination with one or more layer(s) of high-temperature ceramic fabrics, such as silicon carbide or zirconia, has been invented as a means of patching cracks or holes in the reinforced carbon-carbon forward surfaces of a space shuttle in orbit before returning to Earth. The paste or the paste/fabric combination could also be used to repair rocket-motor combustion chambers, and could be used on Earth to patch similar high-temperature structures. The specified chemical composition of the paste admits of a number of variations, and the exact proportions of its constituents are proprietary. In general, the paste consists of (1) silicon carbide, possibly with addition of (2) hafnium carbide, zirconium carbide, zirconium boride, silicon tetraboride, silicon hexaboride, or other metal carbides or oxides blended with (3) a silazane-based polymer. Because the paste is viscous and sticky at normal terrestrial and outer-space ambient temperatures, high-temperature ceramic fabrics such as silicon carbide or zirconia fabric impregnated with the paste (or the paste alone) sticks to the damaged surface to which it is applied. Once the patch has been applied, it is smoothed to minimize edge steps as required [forward-facing edge steps must be ≤ 0.030 in. (≤ 0.76 mm) in the original intended space-shuttle application]. The patch is then heated to a curing temperature, thereby converting it from a flexible material to a hard, tough material. The curing temperature is 375 to 450 °F (approx. 190 to 230 °C). In torch tests and arc-jet tests, the cured paste was found to be capable of withstanding a temperature of 3,500 °F (approx. 1,900 °C) for 15 minutes. As such, the material appears to satisfy the requirement, in the original space-shuttle application, to withstand re-entry temperatures of approx. 3,000 °F (approx. 1,600 °C).
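As a quick check of the quoted temperature conversions, here is a minimal sketch; the helper function name is ours, not part of the source.

    def fahrenheit_to_celsius(f):
        """Convert degrees Fahrenheit to degrees Celsius."""
        return (f - 32) * 5 / 9

    # Curing range and test temperatures quoted in the abstract.
    for f in (375, 450, 3500, 3000):
        print(f"{f} F = {fahrenheit_to_celsius(f):.0f} C")
    # 375 F = 191 C, 450 F = 232 C, 3500 F = 1927 C, 3000 F = 1649 C

This confirms that the rounded Celsius values in the abstract are consistent with the Fahrenheit figures.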
Radiative forcing of climate: the historical evolution of the radiative forcing concept, the forcing agents and their quantification, and applications
We describe the historical evolution of the conceptualization, formulation, quantification, application and utilization of “radiative forcing (RF, see e.g., IPCC, 1990)” of Earth’s climate.
Basic theories of shortwave and longwave radiation were developed through the 19th and 20th centuries, and established the analytical framework for defining and quantifying the perturbations to the Earth’s radiative energy balance by natural and anthropogenic influences. The insight that the Earth’s climate could be radiatively forced by changes in carbon dioxide, first introduced in the 19th century, gained empirical support with sustained observations of the atmospheric concentrations of the gas beginning in 1957. Advances in laboratory and field measurements, theory, instrumentation, computational technology, data, and analysis of well-mixed greenhouse gases and the global climate system through the 20th century enabled the development and formalization of RF; this allowed RF to be related to changes in global-mean surface temperature with the aid of increasingly sophisticated models. This in turn led to RF becoming firmly established as a principal concept in climate science by 1990.
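The linkage between forcing and global-mean surface temperature referred to above is conventionally written with a climate sensitivity parameter; the following is the standard textbook form (the symbols are the commonly used ones, not quoted from this paper):

    \[ \Delta T_s = \lambda \, \mathrm{RF} \]

where \( \Delta T_s \) is the equilibrium change in global-mean surface temperature, RF is the radiative forcing in W m⁻², and \( \lambda \) is the climate sensitivity parameter in K (W m⁻²)⁻¹.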
The linkage with surface temperature has proven to be the most important application of the RF concept, enabling a simple metric to evaluate the relative climate impacts of different agents. The late 1970s and 1980s saw accelerated developments in quantification, including the first assessment of the effect of the forcing due to doubling of carbon dioxide on climate (the “Charney” report, National Research Council, 1979). The concept was subsequently extended from well-mixed greenhouse gases (WMGHGs: carbon dioxide, methane, nitrous oxide, and halocarbons) to a wide variety of other agents, including short-lived species such as ozone. The WMO (1986) and IPCC (1990) international assessments began the important sequence of periodic evaluations and quantifications of the forcings by natural agents (solar irradiance changes and stratospheric aerosols resulting from volcanic eruptions) and a growing set of anthropogenic agents (WMGHGs, ozone, aerosols, land surface changes, contrails). From the 1990s to the present, knowledge of, and scientific confidence in, the radiative agents acting on the climate system have grown markedly. The conceptual basis of RF has also evolved as both our understanding of the way radiative forcing drives climate change and the diversity of the forcing mechanisms have grown. This has led to the current situation in which “Effective Radiative Forcing (ERF, e.g., IPCC, 2013)” is regarded as the preferred practical definition of radiative forcing, in order to better capture the link between forcing and global-mean surface temperature change. The use of ERF, however, comes with its own attendant issues, including challenges in its diagnosis from climate models, its application to small forcings, and blurring of the distinction between rapid climate adjustments (fast responses) and climate feedbacks; this will necessitate further elaboration of its utility in the future. Global climate model simulations of radiative perturbations by various agents have established how the forcings affect other climate variables besides temperature, e.g., precipitation. The forcing-response linkage as simulated by models, including the diversity in the spatial distribution of forcings by the different agents, has provided a practical demonstration of the effectiveness of agents in perturbing the radiative energy balance and causing climate changes.
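In the IPCC (2013) usage summarized above, ERF differs from the instantaneous forcing by the rapid adjustments; schematically (our notation, a sketch of the standard decomposition rather than an equation from this paper):

    \[ \mathrm{ERF} = \mathrm{IRF} + \sum_i A_i \]

where IRF is the instantaneous radiative forcing and the \( A_i \) are the rapid adjustments (e.g., of stratospheric temperature, clouds, and atmospheric water vapour) that occur before the global-mean surface temperature responds.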
The significant advances over the past half-century have established, with very high confidence, that the global-mean ERF due to human activity since preindustrial times is positive (the 2013 IPCC assessment gives a best estimate of 2.3 W m⁻², with a range from 1.1 to 3.3 W m⁻²; 90% confidence interval). Further, except in the immediate aftermath of climatically significant volcanic eruptions, the net anthropogenic forcing dominates over natural radiative forcing mechanisms. Nevertheless, the substantial remaining uncertainty in the net anthropogenic ERF leads to large uncertainties in estimates of climate sensitivity from observations and in predicting future climate impacts. The uncertainty in the ERF arises principally from the incorporation of the rapid climate adjustments in the formulation, the well-recognized difficulties in characterizing the preindustrial state of the atmosphere, and the incomplete knowledge of the interactions of aerosols with clouds. This uncertainty impairs the quantitative evaluation of climate adaptation and mitigation pathways in the future. A grand challenge in Earth System science lies in continuing to sustain the relatively simple essence of the radiative forcing concept in a form similar to that originally devised, and at the same time improving the quantification of the forcing. This, in turn, demands an accurate, yet increasingly complex and comprehensive, accounting of the relevant processes in the climate system.
Crowdsourced Geographic Information Use in Government
This report is based on a six-month study of the use of volunteered geographic information (VGI) by government. It focuses on government use of information relating to a location that was produced through what is known as “crowdsourcing”: the process of obtaining information from many contributors amongst the general public, regardless of their background and skill level. The aim of this report is to provide a guide for the successful implementation of VGI in government.
Effect of thermal residual stresses on matrix failure under transverse tension at micromechanical level: A numerical and experimental analysis
In the present work, the influence at the micromechanical scale of thermal residual stresses, which originate during the cool-down associated with the curing process of fibrous composites, on inter-fibre failure under transverse tension is studied. In particular, the effect of the presence of thermal residual stresses on the appearance of the first debonds is discussed analytically, whereas the later stages of the damage mechanism, i.e. the growth of interface cracks and their kinking towards the matrix, are analysed by means of a single-fibre model using the Boundary Element Method. The results are evaluated by applying Interfacial Fracture Mechanics concepts. The conclusions obtained predict, at least in the case of dilute fibre packing, a protective effect of thermal residual stresses against failure initiation, with the morphology of the damage not significantly affected in comparison with the case in which these stresses are not considered. Experimental tests were carried out, and their results agree with the conclusions of the numerical analysis.
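For reference, the Interfacial Fracture Mechanics criterion invoked in such analyses is typically of the form below; the notation is the standard one for interface cracks, not quoted from the paper:

    \[ G(\psi) \ge G_c(\psi) \]

where \( G \) is the energy release rate of the interface crack, \( \psi \) the mode mixity, and \( G_c(\psi) \) the mode-dependent interface fracture toughness. Thermal residual stresses enter the evaluation of \( G \) through superposition of the thermal and mechanical stress fields around the fibre.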
Increased insolation threshold for runaway greenhouse processes on Earth-like planets
Because the solar luminosity increases over geological timescales, Earth's climate is expected to warm, increasing water evaporation which, in turn, enhances the atmospheric greenhouse effect. Above a certain critical insolation, this destabilizing greenhouse feedback can "run away" until all the oceans have evaporated. Through increases in stratospheric humidity, warming may also cause ocean water to escape to space before the runaway greenhouse occurs. The critical insolation thresholds for these processes, however, remain uncertain because they have so far been evaluated with one-dimensional models that cannot account for the dynamical and cloud feedback effects that are key stabilizing features of Earth's climate. Here we use a 3D global climate model to show that the threshold for the runaway greenhouse is about 375 W/m², significantly higher than previously thought. Our model is specifically developed to quantify the climate response of Earth-like planets to increased insolation in hot and extremely moist atmospheres. In contrast with previous studies, we find that clouds have a destabilizing feedback on the long-term warming. However, subsident, unsaturated regions created by the Hadley circulation have a stabilizing effect that is strong enough to defer the runaway greenhouse limit to higher insolation than inferred from 1D models. Furthermore, because of wavelength-dependent radiative effects, the stratosphere remains cold and dry enough to hamper atmospheric water escape, even at large fluxes. This has strong implications for Venus' early water history and extends the size of the habitable zone around other stars.
Comment: Published in Nature. Online publication date: December 12, 2013. Accepted version before journal editing and with Supplementary Information.
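The runaway condition the abstract refers to can be stated compactly. The following is the standard textbook form (our notation, a sketch rather than an equation from the paper): a runaway ensues when the absorbed stellar flux exceeds the maximum thermal flux a moist atmosphere can radiate to space (the Simpson-Nakajima limit),

    \[ \frac{S}{4}\,(1-\alpha) > F_{\mathrm{max}} \]

where \( S \) is the insolation at the top of the atmosphere, \( \alpha \) the planetary albedo, and \( F_{\mathrm{max}} \) the asymptotic outgoing longwave radiation limit of a water-vapour-saturated atmosphere. The paper's contribution is to show, with a 3D model, that dynamical effects push the critical insolation above the value this simple 1D balance would suggest.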