57 research outputs found

    Childhood in Foster Families: The Particular Needs of the Children, the Requirements of the Foster Family, and Professional Support from Social Work [original title: "Kindheit in Pflegefamilien: Besondere Bedürfnisse der Kinder, Erfordernisse der Pflegefamilie und die professionelle Unterstützung der Sozialen Arbeit"]

    Get PDF
    This thesis offers a comprehensive overview of the triangular constellation of foster child, foster family, and social work, the three parties involved in a foster placement process. The framework conditions that apply in Switzerland today are also taken into account. The thesis shows how foster care has professionalized over time and how the child's welfare now stands at the centre of attention. Because the child comes first in any foster placement process, various child-specific factors that must be considered are set out, and several theories are drawn on to illustrate the needs and development of foster children. The thesis also addresses the widespread lack of public knowledge about the tasks and demands placed on a foster family, showing which functions foster families perform and what it takes to be able to take in a foster child at all. From the perspective of social work, it describes the tasks and roles that professionals assume in the placement process and makes clear why this field of activity belongs to social work. Since there is much to attend to and accompany in this complex triangular constellation, the authors also outline the demands placed on the professionals themselves. In summary, this bachelor's thesis examines the foster placement process from three relevant perspectives, yielding a comprehensive and important body of knowledge.

    Persistence of engineered nanoparticles in a municipal solid-waste incineration plant

    Get PDF
    More than 100 million tonnes of municipal solid waste are incinerated worldwide every year [1]. However, little is known about the fate of nanomaterials during incineration, even though the presence of engineered nanoparticles in waste is expected to grow [2]. Here, we show that cerium oxide nanoparticles introduced into a full-scale waste incineration plant bind loosely to solid residues from the combustion process and can be efficiently removed from flue gas using current filter technology. The nanoparticles were introduced either directly onto the waste before incineration or into the gas stream exiting the furnace of an incinerator that processes 200,000 tonnes of waste per year. Nanoparticles that attached to the surface of the solid residues did not become a fixed part of those residues and showed no physical or chemical changes. Our observations show that although it is possible to incinerate waste without releasing nanoparticles into the atmosphere, the residues to which they bind eventually end up in landfills or recovered raw materials, confirming a clear environmental need to develop degradable nanoparticles.

    Long-term risk of adverse outcomes according to atrial fibrillation type

    Full text link
    Sustained forms of atrial fibrillation (AF) may be associated with a higher risk of adverse outcomes, but few, if any, long-term studies have taken into account changes of AF type and comorbidities over time. We prospectively followed 3843 AF patients and collected information on AF type and comorbidities during yearly follow-ups. The primary outcome was a composite of stroke or systemic embolism (SE). Secondary outcomes included myocardial infarction, hospitalization for congestive heart failure (CHF), bleeding and all-cause mortality. Multivariable adjusted Cox proportional hazards models with time-varying covariates were used to compare hazard ratios (HR) according to AF type. At baseline, 1895 (49%), 1046 (27%) and 902 (24%) patients had paroxysmal, persistent and permanent AF, respectively, and 3234 (84%) were anticoagulated. After a median (IQR) follow-up of 3.0 (1.9; 4.2) years, the incidence of stroke/SE was 1.0 per 100 patient-years. The incidences of myocardial infarction, CHF, bleeding and all-cause mortality were 0.7, 3.0, 2.9 and 2.7 per 100 patient-years, respectively. The multivariable adjusted (a) HRs (95% confidence interval) for stroke/SE were 1.13 (0.69; 1.85) and 1.27 (0.83; 1.95) for time-updated persistent and permanent AF, respectively. The corresponding aHRs were 1.23 (0.89; 1.69) and 1.45 (1.12; 1.87) for all-cause mortality, 1.34 (1.00; 1.80) and 1.30 (1.01; 1.67) for CHF, 0.91 (0.48; 1.72) and 0.95 (0.56; 1.59) for myocardial infarction, and 0.89 (0.70; 1.14) and 1.00 (0.81; 1.24) for bleeding. In this large prospective cohort of AF patients, time-updated AF type was not associated with incident stroke/SE.
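    As a minimal sketch of how such a Cox model with time-updated covariates can be fit in practice (not the study's own code; the Python lifelines library, the column names, and the toy values below are all illustrative assumptions):

```python
# Sketch: Cox proportional hazards with time-varying covariates, in the
# spirit of the analysis above. Data layout and values are hypothetical.
# Requires: pip install lifelines pandas
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Long format: one row per patient per follow-up interval. 'af_persistent'
# and 'af_permanent' are time-updated AF-type indicators (paroxysmal AF is
# the reference category); 'event' flags stroke/SE at the interval's end.
df = pd.DataFrame({
    "id":             [1,   1,   2,   2,   3],
    "start":          [0.0, 1.0, 0.0, 1.0, 0.0],  # interval start (years)
    "stop":           [1.0, 3.2, 1.0, 2.5, 2.0],  # interval end (years)
    "af_persistent":  [0,   1,   0,   0,   0],
    "af_permanent":   [0,   0,   0,   1,   0],
    "anticoagulated": [1,   1,   0,   0,   1],
    "event":          [0,   1,   0,   0,   0],
})

# A small ridge penalty keeps the fit stable on this tiny toy dataset.
ctv = CoxTimeVaryingFitter(penalizer=0.1)
ctv.fit(df, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()  # hazard ratios with 95% confidence intervals
```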

    Combining Evidence

    No full text
    The goal of the present thesis is to establish the normative foundations for reasoning about combined evidence. Unlike the interpretation of single items of evidence, little is known about inference tasks involving multiple items of evidence. In forensic practice, however, experts are regularly confronted with a collection of evidence rather than isolated evidence items. This necessarily raises the question of how to interpret evidence holistically. The study of the relationships between the different evidence items in a collection, and between a collection and a common cause (represented as hypotheses), is the central concern of this thesis. Such relationships and causes are almost always unobservable in judicial contexts and are therefore inherently uncertain. Indeed, uncertainty is a fundamental feature of reasoning about evidence. The framework for handling uncertainty is defined by probability theory; evidential reasoning is consequently a form of probabilistic reasoning. The present thesis locates itself in this probabilistic framework and puts a strong emphasis on graphical probabilistic modeling. The thesis is composed of four cornerstones, for each of which a paper was produced; the ordering of the cornerstones is thematic, not chronological. The first paper examines the different types of evidence and their combinations, their generic inference structures, and the relationships between these inference structures, thereby establishing a probabilistic ontology of evidence. The following study illustrates the application of generic inference structures in two real forensic cases. One case involves the combination of two features of a single footwear mark; the other involves fingermarks and a footwear mark, thus two distinct marks. The study shows that even apparently simple forms of combination involve evidential subtleties that require careful analysis. The third study provides novel analysis methods for evidential phenomena that occur exclusively in combined evidence. To date, there are only a few methods for assessing the inferential interactions between items of evidence in a holistic setting; this study addresses that problem. The final project consists of a complex case analysis involving four different DNA specimens collected in a rape case that led to the wrongful conviction of a young man. The model treats each specimen as a mixture profile and includes considerations on the relevance of each specimen, the possible number of contributors to each specimen, the inferential relationships between the specimens, and the relationships between the specimens and the hypothesis about the authorship of the crime. As it turned out, the different specimens were subject to strong inferential interactions, a fact that was completely missed by the expert in the case. This thesis shows that the problems pervading the subject of combined evidence are not academic phantoms; they are measurable, real, and can affect the lives of people for better or worse.
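    To make the combination problem concrete, here is the standard likelihood-ratio formulation (a textbook identity, not the thesis's own notation). For two items of evidence $E_1, E_2$ and a hypothesis pair $H, \bar{H}$:

$$
\mathrm{LR}(E_1, E_2) \;=\; \frac{P(E_1, E_2 \mid H)}{P(E_1, E_2 \mid \bar{H})} \;=\; \frac{P(E_1 \mid H)}{P(E_1 \mid \bar{H})} \cdot \frac{P(E_2 \mid E_1, H)}{P(E_2 \mid E_1, \bar{H})}
$$

    Only when $E_1$ and $E_2$ are conditionally independent given each hypothesis does the second factor reduce to $\mathrm{LR}(E_2)$, so that the combined value factorizes as $\mathrm{LR}(E_1) \cdot \mathrm{LR}(E_2)$; the inferential interactions studied in the thesis are precisely the cases where this factorization fails.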

    Limit order placement optimization with Deep Reinforcement Learning: Learning from patterns in cryptocurrency market data

    No full text
    For various reasons, financial institutions often make use of high-level trading strategies when buying and selling assets. Many individuals, irrespective of their level of prior trading knowledge, have recently entered the field of trading due to the increasing popularity of cryptocurrencies, which offer a low entry barrier. Regardless of the intention or trading strategy of these traders, the invariable outcome is their attempt to buy or sell assets. However, in such a competitive field, experienced market participants seek to exploit any advantage over those who are less experienced, for financial gain. Therefore, this work aims to contribute to the important issue of how to optimize the process of buying and selling assets on exchanges, and to do so in a form that is accessible to other traders. This research concerns the optimization of limit order placement within a given time horizon of 100 seconds and how to transpose this process into an end-to-end learning pipeline in the context of reinforcement learning. Features were constructed from raw market event data relating to movements of the Bitcoin/USD trading pair on the Bittrex cryptocurrency exchange. These features were then used by deep reinforcement learning agents in order to learn a limit order placement policy. To facilitate this process, a fully functioning reinforcement learning environment that emulates a local broker was developed as part of this work, and in building it we identified which components are essential. Furthermore, we defined an evaluation procedure that can determine the capabilities and limitations of the policies learned by the reinforcement learning agents and ultimately provides a means to quantify the optimization achieved with our approach. Our analysis of the results includes the identification of patterns in cryptocurrency trading formed by market participants who posted orders, and a conceptual framework for constructing data features containing these patterns. Using this environment, we were able to train and test multiple reinforcement learning agents whose aim was to optimize the placement of buy and sell limit orders. During the evaluation, we were able to improve the parameter settings of the environment and thereby improve the policy learned by the agents. Ultimately, we achieved a significant improvement in limit order placement with a state-of-the-art deep Q-network agent and were able to simulate purchases and sales of 1.0 BTC at a price up to $33.89 better than the market price. We made use of the OpenAI Gym library and contributed our work to the community to enable further investigations. The work done in this thesis can be used as a framework to (1) build a component that acts as an intermediary between trader and exchange and (2) enable exchanges to provide a new order type to be used by traders.
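    A minimal sketch of the shape such a Gym environment might take (the class name, the discrete price-offset action space, the feature vector, and the reward definition below are illustrative assumptions, not the thesis's implementation):

```python
# Illustrative Gym-style environment for limit order placement over a
# 100-second horizon. All names and the reward are assumptions.
import gym
import numpy as np
from gym import spaces

class LimitOrderEnv(gym.Env):
    """Place a 1.0 BTC buy limit order within a 100-second horizon.

    Each action chooses a price offset (in levels) relative to the best ask;
    the episode ends when the order fills or the horizon expires."""

    HORIZON = 100  # seconds

    def __init__(self, market_data):
        self.market_data = market_data            # replayed order book events
        self.action_space = spaces.Discrete(21)   # offsets -10..+10 levels
        self.observation_space = spaces.Box(
            low=-np.inf, high=np.inf, shape=(8,), dtype=np.float32)
        self.t = 0

    def reset(self):
        self.t = 0
        return self._features()

    def step(self, action):
        offset = action - 10                      # map to -10..+10
        filled, fill_price = self._simulate(offset)
        self.t += 1
        done = filled or self.t >= self.HORIZON
        # Reward: price improvement over an immediate market order.
        reward = (self._market_price() - fill_price) if filled else 0.0
        return self._features(), reward, done, {}

    # The stubs below stand in for the local-broker emulation described above.
    def _features(self):
        return np.zeros(8, dtype=np.float32)      # order-book features

    def _market_price(self):
        return 0.0                                # current best-ask price

    def _simulate(self, offset):
        return False, 0.0                         # (filled?, fill price)
```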

    Ausbildungsinstitut für Klientenzentrierte Gesprächs- und Körperpsychotherapie (GFK)

    No full text
    Presentation of the institution; view of the human being and worldview; understanding of health and illness; understanding of therapy.

    Effectiveness of Massage Including Proximal Trigger Point Release for Plantar Fasciitis: a Case Report

    No full text
    Background: Plantar fasciitis (PF) is a common degenerative condition of the plantar fascia. Symptoms include tenderness on the plantar surface of the foot, pain on walking after inactivity, and difficulty with daily activities. Rest, non-steroidal anti-inflammatories, and manual therapies are frequently used treatments for PF. Trigger point release (TrPR) has been found to be a viable treatment option for PF. Objective: To determine the effects of massage, including proximal TrPR, on pain and functional limitations in a patient with PF. Method: A student massage therapist from MacEwan University administered five massage treatments, plus an initial and a final assessment, over five weeks to a 46-year-old female with diagnosed PF. She complained of unilateral plantar heel pain (PHP) and a deep pulling sensation from the mid-glutes to the distal lower limb bilaterally. Evaluation involved active and passive range of motion, myotomes, dermatomes, reflexes, and orthopedic tests. The treatment aim was to decrease PHP by releasing active trigger points (TrPs) along the posterior lower extremity down to the plantar surface of the foot, lengthening the associated muscles and the plantar fascia. Hydrotherapy, Swedish massage, TrPR, myofascial release, and stretches were implemented. Pain was measured using the numerical rating scale pre- and post-treatment, and the Foot Function Index was administered at the first, middle, and last appointments to assess function and the effectiveness of massage including proximal TrPR for PF. Results: PHP and functional impairments decreased throughout the five-week period. Conclusion: The results indicate that massage, including proximal TrPR, may decrease pain and functional impairments in patients with PF. Further research is necessary to measure its efficacy and confirm TrPR as a treatment option.

    Functional Kafka

    No full text
    The aim of this thesis is to provide a summary of the current state of message-oriented middleware and, ultimately, to build a message broker in Haskell, adapted from the concepts of Apache Kafka, which was originally built at LinkedIn. The implementation provides basic functionality such as producing and consuming messages, with the aim of approximating the performance of Apache Kafka in a non-clustered setup. The Apache Kafka protocol is used as the underlying wire protocol and is implemented in a standalone library. On top of the protocol library, a separate client library is provided. The Haskell Message Broker (HMB), as well as its producer and consumer clients, has thus been successfully proven compatible with Apache Kafka. This thesis first examines the fundamental concepts behind messaging and explains the need for message brokers. In a second stage of this technology research, the purpose of event streaming is described, including a comparison of batch and stream processing that explains the differences in their nature. Finally, the concept and features of Apache Kafka are presented. Insights into the HMB implementation are provided in the technical report, split into two stages: first, the protocol and client libraries are introduced; subsequently, the broker implementation is explained, including its capabilities and the provided set of features. Finally, HMB is benchmarked against Apache Kafka. The results of this proof of concept show that Haskell is well suited to building messaging applications as well as to implementing protocols based on context-free grammars. In the benchmark, the performance of HMB matched that of Apache Kafka for the transmission of larger message sizes. For most of the tested scenarios, performance suffers because HMB is not yet sufficiently optimized. Nevertheless, the Haskell Message Broker is a solid basis for a state-of-the-art message broker implementation. The authors recommend applying further optimization techniques and extending the feature set before any other use.
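    Because HMB speaks the Apache Kafka wire protocol, any standard Kafka client should be able to talk to it. The sketch below exercises such a broker with the Python kafka-python library; the broker address and topic name are assumptions, and this is not code from the thesis:

```python
# Sketch: producing to and consuming from a Kafka-wire-compatible broker
# (such as HMB). Broker address and topic name are hypothetical.
# Requires: pip install kafka-python
from kafka import KafkaProducer, KafkaConsumer

BROKER = "localhost:9092"  # hypothetical HMB endpoint

# Produce a few messages to the topic.
producer = KafkaProducer(bootstrap_servers=BROKER)
for i in range(3):
    producer.send("test-topic", f"message {i}".encode("utf-8"))
producer.flush()

# Consume them back from the beginning of the topic.
consumer = KafkaConsumer(
    "test-topic",
    bootstrap_servers=BROKER,
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating once no more messages arrive
)
for record in consumer:
    print(record.offset, record.value.decode("utf-8"))
```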