The 27-28 October 1986 FIRE IFO Cirrus Case Study: Cloud Optical Properties Determined by High Spectral Resolution Lidar
During the First ISCCP Regional Experiment (FIRE) cirrus intensive field observation (IFO), the High Spectral Resolution Lidar (HSRL) was operated from a rooftop site on the University of Wisconsin-Madison campus. Because the HSRL technique separately measures the molecular and cloud particle backscatter components of the lidar return, the optical thickness is determined independently of particle backscatter. This is accomplished by comparing the known molecular density distribution to the observed decrease in molecular backscatter signal with altitude. The particle-to-molecular backscatter ratio yields calibrated measurements of backscatter cross sections that can be plotted to reveal cloud morphology without distortion due to attenuation. Changes in cloud particle size, shape, and phase affect the backscatter-to-extinction ratio (backscatter phase function), which the HSRL also measures independently. This paper presents a quantitative analysis of the HSRL cirrus cloud data acquired over an approximately 33-hour period of continuous near-zenith observations. Correlations between small-scale wind structure and cirrus cloud morphology have been observed. These correlations can bias the range averaging inherent in wind-profiling lidars of modest vertical resolution, leading to increased measurement errors at cirrus altitudes. Extended periods of low-intensity backscatter were noted between more strongly organized cirrus cloud activity. Optical thicknesses ranging from 0.01-1.4, backscatter phase functions between 0.02-0.065 sr^-1, and backscatter cross sections spanning 4 orders of magnitude were observed. The altitude relationship between the cloud top and bottom boundaries and the cloud optical center altitude depended on the type of formation observed. Cirrus features were observed with wind-drift-estimated characteristic horizontal sizes of 5-400 km. The clouds frequently exhibited cellular structure with vertical-to-horizontal dimension ratios of 1:5-1:1.
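The optical-thickness retrieval described above can be sketched in a few lines. This is an illustrative reconstruction of the standard HSRL principle, not the paper's actual processing code, and all function and variable names are assumed: the cloud's two-way transmission is the observed decrease in the molecular-channel signal divided by the decrease expected from molecular density alone.

```python
import numpy as np

def cloud_optical_thickness(s_mol_below, s_mol_above,
                            density_below, density_above):
    """Estimate cloud optical thickness from HSRL molecular-channel returns.

    The molecular backscatter signal is proportional to molecular number
    density times the two-way transmission, so the cloud's two-way
    attenuation is the observed above/below signal ratio divided by the
    ratio expected from the known density profile alone.
    """
    observed_ratio = s_mol_above / s_mol_below
    expected_ratio = density_above / density_below
    two_way_transmission = observed_ratio / expected_ratio
    # tau = -0.5 * ln(T_two_way): one-way cloud optical thickness
    return -0.5 * np.log(two_way_transmission)
```

Because the molecular density profile is known independently (e.g. from a radiosonde), this estimate does not depend on the cloud particle backscatter at all, which is the key advantage of the HSRL technique over standard backscatter lidar.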
Evaluating LAB@FUTURE, a collaborative e-learning Laboratory experiments platform
This paper presents Lab@Future, an advanced e-learning platform that uses novel Information and Communication Technologies to support and expand laboratory teaching practices. For this purpose, Lab@Future uses real and computer-generated objects that are interfaced using mechatronic systems, augmented reality, mobile technologies and 3D multi-user environments. The main aim is to develop and demonstrate technological support for practical experiments in the following disciplines: Fluid Dynamics (a science subject, in Germany), Geometry (a mathematics subject, in Austria), and History and Environmental Awareness (arts and humanities subjects, in Greece and Slovenia). In order to pedagogically enhance the design and functional aspects of this e-learning technology, we are investigating the dialogical operationalisation of learning theories so as to leverage our understanding of teaching and learning practices in the targeted context of deployment. To be able to evaluate the Lab@Future system in its entire complexity, an evaluation methodology comprising several phases has been developed, performing formative as well as summative evaluations.
Home Manufacture of Drugs: An Online Investigation and a Toxicological Reality Check of Online Discussions on Drug Chemistry
Emerging trends in market dynamics and the use of new psychoactive substances are both a public health concern and a complex regulatory issue. One novel area of investigation is the availability of homemade opioids, amphetamines and dissociatives, and the potential fueling of interest in clandestine home manufacture of drugs via the Internet. We illustrate here how online communal folk pharmacology of homemade drugs on drug website forums may actually inform home manufacture practices or contribute to the reduction of harms associated with this practice. Discrepancies exist between online information on purification and making homemade drugs safer, and the synthesis of the same substances in a proper laboratory environment. Moderation and shutdown of synthesis queries and discussions online are grounded in drug websites adhering to harm-reduction principles by facilitating discussions around purification of homemade drugs only. Drug discussion forums should consider reevaluating their policies on chemistry discussions, aiming to reach people who cannot or will not refrain from cooking their own drugs with credible information that may contribute to reductions in the harms associated with this practice. © 2017 Taylor & Francis Group, LLC
Feasibility of tropospheric water vapor profiling using infrared heterodyne differential absorption lidar
Continuous, high quality profiles of water vapor, free of systematic bias, of moderate temporal and spatial resolution, and acquired over long periods at low operational and maintenance cost, are fundamental to the success of the ARM CART program. The development and verification of realistic climate model parameterizations for clouds and net radiation balance, and the correction of other CART site sensor observations for interferences due to the presence of water vapor, are critically dependent on water vapor profile measurements. Applications of profiles acquired with current techniques have, to date, been limited by vertical resolution and uniqueness of solution [e.g. high resolution infrared (IR) Fourier transform radiometry], poor spatial and temporal coverage and high operating cost (e.g. radiosondes), or diminished daytime performance, lack of eye-safety, and high maintenance cost (e.g. Raman lidar). Recent developments in infrared laser and detector technology make possible compact IR differential absorption lidar (DIAL) systems at eye-safe wavelengths. In the study reported here, we develop DIAL system performance models and examine the potential of the DIAL approach to solve some of the shortcomings of previous methods, using parameterizations representative of current technologies. These models are also applied to diagnose and evaluate other strengths and weaknesses unique to the DIAL method for this application. This work is to continue in the direction of evaluating yet smaller and lower-cost laser diode-based systems for routine monitoring of the lower altitudes using photon counting detection methods. We regard the present report as interim in nature and will update and extend it as a final report at the end of the term of the contract.
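The DIAL retrieval underlying such performance models follows the standard differential-absorption equation: the absorber number density over a range interval is the range derivative of the log ratio of the on-line and off-line returns, divided by twice the differential absorption cross section. The sketch below illustrates that textbook equation only (names and the noiseless setting are assumptions, not the paper's model).

```python
import numpy as np

def dial_number_density(p_on, p_off, z, delta_sigma):
    """Retrieve absorber number density from DIAL returns (standard equation).

    p_on, p_off : lidar return power at the on-line and off-line
                  wavelengths, sampled at range gates z (metres).
    delta_sigma : differential absorption cross section (m^2),
                  sigma_on - sigma_off.

    Returns the mean number density (molecules / m^3) over each interval
    between adjacent range gates.
    """
    p_on = np.asarray(p_on, dtype=float)
    p_off = np.asarray(p_off, dtype=float)
    z = np.asarray(z, dtype=float)
    # log power ratio; non-absorber terms cancel between the two wavelengths
    ratio = np.log(p_on / p_off)
    # n_i over [z_i, z_{i+1}]: minus the range derivative over 2 * delta_sigma
    return -np.diff(ratio) / (2.0 * delta_sigma * np.diff(z))
```

In practice the retrieval's precision is set by shot noise in the two returns and by how well the on-line wavelength is locked to the water vapor absorption feature, which is what a system performance model of the kind described above has to parameterize.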
Visual and personalized quality of life assessment app for people with severe mental health problems: Qualitative evaluation
Background: QoL-ME is a digital visual personalized quality of life assessment app for people with severe mental health problems. Research reveals that e-mental health apps frequently suffer from low engagement and fall short of expectations regarding their impact on patients' daily lives. Studies often indicate that e-mental health apps ought to respect the needs and preferences of end users to achieve optimal user engagement. Objective: The aim of this study was to explore the experiences of users regarding the usability and functionality of QoL-ME and whether the app is actionable and beneficial for patients. Methods: End users (n=8) of QoL-ME contributed to semistructured interviews. An interview guide was used to direct the interviews. All interviews were audio-recorded and transcribed verbatim. Transcriptions were analyzed and coded thematically. Results: Analysis revealed 3 main themes: (1) benefit, (2) actionability, and (3) characteristics of the QoL-ME. The first theme reveals that the QoL-ME app was beneficial for the majority of respondents, primarily by prompting them to reflect on their quality of life. The current version is not yet actionable; the actionability of the QoL-ME app may be improved by enabling users to view their scores over time and by supplying practical advice for quality of life improvements. Overall, participants had positive experiences with the usability, design, and content of the app. Conclusions: The QoL-ME app can be beneficial to users as it provides them with insight into their quality of life and elicits reflection. Incorporating more functionalities that facilitate self-management, such as advice and strategies for improving areas that are lacking, will likely make the app actionable. Patients positively regarded the usability, design, and contents of the QoL-ME app.
Fault Reactivation Analysis Using Microearthquake Clustering Based on Signal-to-Noise Weighted Waveform Similarity
The cluster formation of about 2000 induced microearthquakes (mostly local magnitude ML < 2) is studied using a waveform similarity technique based on cross-correlation and a subsequent equivalence class approach. All events were detected within two separate but neighbouring seismic volumes close to the geothermal power plants near Landau and Insheim in the Upper Rhine Graben, SW Germany, between 2006 and 2013. Besides different sensors, sampling rates and individual data gaps, the mainly low signal-to-noise ratios (SNR) of the recordings at most station sites complicate a precise waveform similarity analysis of the microseismic events in this area. To include a large number of events in such an analysis, a newly developed weighting approach was implemented in the waveform similarity analysis which directly considers the individual SNRs across the whole seismic network. The application to both seismic volumes leads to event clusters with high waveform similarities within short (seconds to hours) and long (months to years) time periods, covering two magnitude ranges. The estimated relative hypocenter locations are spatially concentrated for each single cluster and mirror the orientations of mapped faults as well as interpreted rupture planes determined from fault plane solutions. Depending on the waveform cross-correlation coefficient threshold, clusters can be resolved in space to as little as one dominant wavelength. The interpretation of these observations implies recurring fault reactivations by fluid injection, with very similar faulting mechanisms during different time periods between 2006 and 2013.
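The core of such an analysis (pairwise waveform cross-correlation, an SNR-dependent weighting, and grouping of linked events into equivalence classes) can be sketched as follows. This is a minimal illustration under assumed conventions, not the authors' algorithm: the choice of weighting each station by the smaller of the two events' SNRs, and the fixed threshold, are both assumptions made here for concreteness.

```python
import numpy as np

def max_norm_xcorr(a, b):
    """Maximum of the normalized cross-correlation between two traces."""
    a = (a - a.mean()) / (np.std(a) * len(a))
    b = (b - b.mean()) / np.std(b)
    return np.correlate(a, b, mode="full").max()

def cluster_events(waveforms, snrs, threshold=0.8):
    """Group events into clusters by SNR-weighted waveform similarity.

    waveforms : list of dicts {station: trace (1-D array)} per event.
    snrs      : list of matching dicts {station: SNR} per event.

    The pairwise similarity is the SNR-weighted mean of per-station
    correlation maxima; event pairs exceeding `threshold` are linked,
    and clusters are the connected components (equivalence classes),
    tracked here with a simple union-find structure.
    """
    n = len(waveforms)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            common = waveforms[i].keys() & waveforms[j].keys()
            if not common:
                continue
            # weight each station by the smaller of the two events' SNRs
            w = np.array([min(snrs[i][s], snrs[j][s]) for s in common])
            c = np.array([max_norm_xcorr(waveforms[i][s], waveforms[j][s])
                          for s in common])
            if np.sum(w * c) / np.sum(w) >= threshold:
                parent[find(i)] = find(j)  # union: link the two events

    clusters = {}
    for i in range(n):
        clusters.setdefault(find(i), []).append(i)
    return list(clusters.values())
```

The equivalence-class step matters because two events may each correlate well with a common master event without exceeding the threshold with each other; transitive linking places all three in one cluster.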
Leading-effect vs. Risk-taking in Dynamic Tournaments: Evidence from a Real-life Randomized Experiment
Two 'order effects' may emerge in dynamic tournaments with information feedback. First, participants adjust effort across stages, which could advantage the leading participant, who faces a larger 'effective prize' after an initial victory (leading-effect). Second, participants lagging behind may increase risk at the final stage as they have 'nothing to lose' (risk-taking). We use a randomized natural experiment in professional two-game soccer tournaments in which the treatment (the order of a stage-specific advantage) and team characteristics, e.g. ability, are independent. We develop an identification strategy to test for leading-effects while controlling for risk-taking. We find no evidence of leading-effects and only negligible risk-taking effects.
LDM: Lineage-Aware Data Management in Multi-tier Storage Systems
We design and develop LDM, a novel data management solution that caters to the needs of applications exhibiting the lineage property, i.e. applications in which the current writes are future reads. In such a class of applications, slow writes significantly hurt the overall performance of jobs: current writes determine the fate of the next reads. We believe that in a large-scale shared production cluster, the issues associated with data management can be mitigated at a much higher layer in the hierarchy of the I/O path, even before requests for data access are made. Contrary to current data management solutions, which are mostly reactive and/or based on heuristics, LDM is both deterministic and pro-active. We develop block-graphs, which enable LDM to capture the complete time-based data-task dependency associations, and use them to perform life-cycle management through tiering of data blocks. LDM amalgamates information from the entire data center ecosystem, from the application code to file system mappings and the compute and storage device topology, to make oracle-like deterministic data management decisions. With trace-driven experiments, LDM achieves a 29-52% reduction in overall data center workload execution time. Moreover, deploying LDM with extensive pre-processing creates efficient data consumption pipelines, which also reduces write and read delays significantly.
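The lineage idea (current writes are future reads, so placement can be decided before the reads arrive) can be illustrated with a toy pro-active tiering policy over a block-graph. This is a hypothetical sketch of the general principle only; the block-graph shape, the horizon-based rule, and all names are assumptions made here, and the real LDM policy described in the paper is considerably richer.

```python
def blocks_to_promote(block_graph, now, horizon, fast_tier_capacity):
    """Pick data blocks to place on the fast tier ahead of their readers.

    block_graph : dict mapping block_id -> list of (task_id, read_time)
                  pairs, i.e. which tasks will read the block and when
                  (known in advance for lineage-property workloads).
    Blocks whose next read falls within `horizon` of `now` are promoted,
    earliest reader first, until the fast tier is full (one slot per block).
    """
    next_read = {}
    for block, readers in block_graph.items():
        upcoming = [t for _, t in readers if now <= t <= now + horizon]
        if upcoming:
            next_read[block] = min(upcoming)
    # deterministic, pro-active choice: no reactive cache-miss heuristics
    ranked = sorted(next_read, key=next_read.get)
    return ranked[:fast_tier_capacity]
```

The contrast with a reactive cache is that nothing here depends on observed access patterns: because the dependency graph is known up front, the decision is deterministic and can be made before the first read request is issued.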
Comparison of surface-derived and ISCCP cloud optical properties
One objective of the FIRE Project is to validate the cloud parameters given on ISCCP tapes. ISCCP first determines whether a region is clear or cloudy based on two threshold algorithms. If the region is cloudy, a cloud optical depth is given as well as a cloud height. Special high-resolution ISCCP CX tapes were created for the time period of the Wisconsin FIRE experiment. These tapes did not include the cloud height product; however, the other parameters used to make up the standard ISCCP C1 products were available. The ISCCP cloud/no-cloud and cloud optical depth parameters are compared with surface-derived values for the Wisconsin FIRE region during the October 27 and 28 case study days.