Predator Island: Washington State Department of Social and Health Services Internship
STEP Category: Internships
In the summer of 2019, I completed an internship with the Washington State Department of Social and Health Services. The internship was designed to attract students from a wide variety of backgrounds to work in and explore the state government's human services department, and to gain experience in a health-service-related area. It is intended for people who have an interest in public service and helping others, and who are also able to work in these kinds of sensitive situations. Over the course of 11 weeks, I worked with a group of interns and explored different fields of state government and the health-related professions. Specifically, I worked with sexually violent predators at the Special Commitment Center on McNeil Island. The internship provided a great opportunity to gain experience in the field of health services and in working with government.
The Ohio State University Second-year Transformational Experience Program (STEP)
Academic Major: Psychology
Subsampled Blind Deconvolution via Nuclear Norm Minimization
Many phenomena can be modeled as systems that perform convolution, including degradations of data
such as translation and motion blur. Blind Deconvolution (BD) is a process used to reverse the negative effects
of a system by effectively undoing the convolution. Not only can the signal be recovered, but the impulse
response can as well. "Blind" signifies that there is incomplete knowledge of the impulse responses of an
LTI system. Solutions exist for performing BD, but they assume the data is fully sampled. In this project we
start from an existing method [1] for BD then extend to the subsampled case. We show that this new
formulation works under similar assumptions. Results so far are empirical; ongoing and future work
focuses on providing theoretical guarantees for this algorithm.
No embargo
Academic Major: Electrical and Computer Engineering
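The nuclear norm in the title is the standard convex surrogate for matrix rank, and singular value thresholding (SVT) is its proximal operator, the core step in many nuclear-norm minimization solvers. The following is a minimal sketch of those two operations only, not the authors' algorithm from [1]; the rank-3 test matrix and threshold value are illustrative assumptions:

```python
import numpy as np

def nuclear_norm(X):
    # Nuclear norm = sum of singular values, the convex surrogate for rank.
    return np.linalg.svd(X, compute_uv=False).sum()

def svt(X, tau):
    # Singular value thresholding: shrink each singular value by tau and
    # discard those that fall below zero. This is the proximal operator of
    # tau * ||X||_* and promotes low-rank solutions.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# Illustrative use: denoise a noisy observation of a rank-3 matrix.
rng = np.random.default_rng(0)
L = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 20))  # rank 3
noisy = L + 0.01 * rng.standard_normal((20, 20))
denoised = svt(noisy, tau=0.5)  # small singular values (mostly noise) removed
```

Since SVT only ever shrinks singular values, the nuclear norm of its output never exceeds that of its input, which is what makes it useful inside iterative low-rank recovery schemes.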
Promoting flood risk reduction: the role of insurance in Germany and England
Improving society's ability to prepare for, respond to and recover from flooding requires integrated, anticipatory flood risk management (FRM). However, most countries still focus their efforts on responding to flooding events if and when they occur, rather than addressing their current and future vulnerability to flooding. Flood insurance is one mechanism that could enable a more ex-ante approach to risk by supporting risk reduction activities. This paper uses an adapted version of Easton's System Theory to investigate the role of insurance for FRM in Germany and England. We introduce an anticipatory FRM framework, which allows us to consider flood insurance as part of a broader policy field. We analyse if and how flood insurance can catalyse a change towards a more anticipatory approach to FRM. In particular, we consider insurance's role in influencing five key components of anticipatory FRM: risk knowledge, prevention through better planning, property-level protection measures, structural protection and preparedness (for response). We find that in both countries FRM is still a reactive, event-driven process, while anticipatory FRM remains underdeveloped. However, collaboration between insurers and FRM decision-makers has already been successful, for example in improving risk knowledge and awareness, while in other areas insurance acts as a disincentive to further risk reduction action. In both countries there is evidence that insurance can play a significant role in encouraging anticipatory FRM, but this potential remains underutilized. Effective collaboration between insurers and government should not be seen as a cost, but as an investment to secure future insurability through flood resilience.
New models for estimating flood damage
The flood losses of recent years have brought questions of flood preparedness back into public focus. Methods for estimating flood damage across all of Germany are needed in order to quantify flood risks, plan adequate protection measures, and evaluate their efficiency.
Trivial Similarity-Based Biases and Efforts to Avoid Bias in Courtroom Judgments
Previous research on jury trials has focused mainly on the effects of group similarities (the similarity-leniency hypothesis; Kerr et al., 1995) and on ways to combat those biases to ensure fair deliberation. Previous research on the effects of trivial similarities shows increased liking of and compliance toward individuals who share similarities, but a similarity can also lead participants to distance themselves from an individual who displays negative characteristics (i.e., being rude). The current study investigated the effects of shared trivial similarities between juror and defendant and possible ways to reduce or eliminate these effects using bias correction. The main hypothesis was that trivial similarities between juror and defendant influence the juror's ratings of the defendant, and that when asked to correct for this, jurors will do so. The study used a single-session design in which participants were randomly assigned to see either a similar or a non-similar defendant and provided ratings of guilt, fault, and responsibility. Bias-correction instructions immediately followed the initial decisions in the same sitting, and theories of bias were measured as a possible predictor of the shift in ratings from pre- to post-correction instructions. Participants were 150 undergraduate psychology students at The Ohio State University, aged 18 to 37, participating for course credit. The effects of similarity, correction instructions, and their interaction all failed to reach significance. Theories of bias also did not significantly interact with condition to affect the difference in perceptions, but results trended in the predicted direction: post-correction guilt ratings were higher for participants in the similar condition.
While this study had limitations, including the possibility that the similarities were too subtle, significant results would have had implications for jury instructions in the courtroom and for jury selection.
No embargo
Academic Major: Psychology
Residential Thermal Mass Construction
The southwest has long known the value of building homes with high-mass materials. The ancient Pueblo Indians found that by using "adobe" they could capture the energy necessary to survive the harsh desert climate. Our ancestors knew that a heavy, dense wall (internal or external) or floor could store collected heat or coolness, retain it for long periods of time, and then slowly transfer it to its surroundings.
Due to rising construction costs and increased competition, modern homebuilders have largely shied away from high-mass construction practices. In an attempt to revitalize the use of high mass in residential construction, we have designed a special "Thermal Mass Block." This new block combines modern construction techniques with the value of high mass.
This paper describes the environment surrounding the development of this high-mass block. It examines the research foundation used to validate the benefits of high-mass construction.
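The storage-and-slow-release behaviour of a heavy wall is commonly approximated with a lumped-capacitance model, in which the wall's temperature relaxes toward ambient with time constant tau = m·c/(U·A). The sketch below illustrates that idea only; the wall mass, specific heat, and loss coefficient are illustrative assumptions, not values from the paper:

```python
import math

def wall_temperature(t_hours, T_init, T_amb, mass_kg, c_j_per_kgk, ua_w_per_k):
    # Lumped-capacitance model: a wall starting at T_init decays
    # exponentially toward the ambient temperature T_amb with time
    # constant tau = m * c / (U * A), in seconds.
    tau_s = mass_kg * c_j_per_kgk / ua_w_per_k
    return T_amb + (T_init - T_amb) * math.exp(-t_hours * 3600.0 / tau_s)

# Hypothetical example: a 2000 kg adobe wall (c ≈ 840 J/kg·K) losing heat
# through UA = 20 W/K has tau ≈ 23 hours, so heat collected during the day
# is released gradually over an entire night.
```

The long time constant is the whole point of high-mass construction: the wall smooths out the daily temperature swing instead of tracking it.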
Flood risk assessment and associated uncertainty
Flood disaster mitigation strategies should be based on a comprehensive assessment of the flood risk combined with a thorough investigation of the uncertainties associated with the risk assessment procedure. Within the "German Research Network of Natural Disasters" (DFNK) the working group "Flood Risk Analysis" investigated the flood process chain from precipitation, runoff generation and concentration in the catchment, flood routing in the river network, possible failure of flood protection measures, and inundation to economic damage. The working group represented each of these processes by deterministic, spatially distributed models at different scales. While these models provide the necessary understanding of the flood process chain, they are not suitable for risk and uncertainty analyses due to their complex nature and high CPU-time demand. We have therefore developed a stochastic flood risk model consisting of simplified model components associated with the components of the process chain. We parameterised these model components based on the results of the complex deterministic models and used them for the risk and uncertainty analysis in a Monte Carlo framework. The Monte Carlo framework is hierarchically structured in two layers representing two different sources of uncertainty: aleatory uncertainty (due to natural and anthropogenic variability) and epistemic uncertainty (due to incomplete knowledge of the system). The model allows us to calculate probabilities of occurrence for events of different magnitudes along with the expected economic damage in a target area in the first layer of the Monte Carlo framework, i.e. to assess the economic risks, and to derive uncertainty bounds associated with these risks in the second layer. It is also possible to identify the contributions of individual sources of uncertainty to the overall uncertainty.
We show that the uncertainty caused by epistemic sources significantly alters the results obtained with aleatory uncertainty alone. The model was applied to reaches of the river Rhine downstream of Cologne.
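The two-layer structure described in this abstract amounts to nested sampling loops: an outer loop draws uncertain model parameters (epistemic), and an inner loop draws flood events (aleatory) to estimate the expected damage for each parameter set. The sketch below illustrates that structure only; the distributions, coefficients, and bounds are hypothetical placeholders, not the study's calibrated models:

```python
import random

def expected_damage(depth_scale, vuln_coeff, n_events=2000, rng=None):
    # Inner (aleatory) layer: sample flood depths from natural variability
    # and average the resulting damage under a capped stage-damage curve.
    rng = rng or random
    total = 0.0
    for _ in range(n_events):
        depth = rng.expovariate(1.0 / depth_scale)  # random flood depth [m]
        total += vuln_coeff * min(depth, 5.0)       # damage saturates at 5 m
    return total / n_events

def two_layer_mc(n_outer=200, seed=42):
    # Outer (epistemic) layer: sample uncertain model parameters, yielding
    # a distribution over the risk estimate itself. The spread of that
    # distribution gives the uncertainty bounds on the expected damage.
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_outer):
        depth_scale = rng.uniform(0.8, 1.2)  # hydrological-model uncertainty
        vuln_coeff = rng.uniform(0.5, 1.5)   # damage-model uncertainty
        estimates.append(expected_damage(depth_scale, vuln_coeff, rng=rng))
    estimates.sort()
    # Median risk estimate and rough 90% epistemic uncertainty bounds.
    return estimates[len(estimates) // 2], (estimates[10], estimates[-11])
```

Separating the two loops is what makes it possible to attribute the width of the final uncertainty bounds to epistemic sources, as the abstract describes.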
Contaminant patterns in the regional groundwater contamination of the central German industrial and mining region Bitterfeld-Wolfen
To extract new information about substances and their distribution patterns in a regional groundwater contamination from large existing data holdings, a three-stage investigation strategy was developed and applied to the regional groundwater contamination in Bitterfeld-Wolfen (Saxony-Anhalt, Germany). In the first stage, data from various groundwater monitoring programmes were merged and checked for quality. In the second stage, substance-specific generalized contamination criteria (emission detection frequency and mean emission concentration) were determined for the organic contaminants. These two contamination criteria were processed with ranking methods (contamination profiles, the Hasse diagram technique, cluster analyses), identifying the regional and local relevance of all investigated substances. In addition, regional indicator parameters for groundwater monitoring could be derived (including tetrachloroethene, vinyl chloride, monochlorobenzene, and alpha-HCH). Based on the ranking results, three regionally relevant groups of variables were selected for a statistical structure analysis, the third stage of the investigation, and examined with correlation, principal component, and cluster analyses. For all three groups of variables, principal component analysis identified distinct contamination factors in the regional groundwater contamination. In addition, cluster analysis made it possible to visualize spatial distribution patterns.
Thesis
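The principal component analysis used in the third stage reduces correlated concentration measurements to a few underlying contamination factors. A minimal sketch of that computation, assuming standardized concentration data in a samples-by-substances matrix (none of the variable names refer to the actual monitoring data):

```python
import numpy as np

def pca(X, k):
    # Principal component analysis via eigendecomposition of the
    # covariance matrix of standardized data. Each column of X is one
    # measured substance; each row is one monitoring sample.
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each substance
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]              # sort by descending variance
    vals, vecs = vals[order], vecs[:, order]
    scores = Xc @ vecs[:, :k]                   # sample scores on top-k factors
    explained = vals[:k] / vals.sum()           # variance share per factor
    return scores, vecs[:, :k], explained
```

Substances that load strongly on the same component behave as one contamination factor, which is how PCA separates, for example, distinct source-related patterns in a mixed regional plume.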