
    BlueCollar: Optimizing Worker Paths on Factory Shop Floors with Visual Analytics

    The optimization of a factory's productivity regarding quality and efficiency is an important task in the manufacturing domain. To optimize productivity, production lines are designed to have short transportation paths and short processing times at the stations that process intermediate components or the final product. A factory's layout is a key factor in this optimization. The optimization mostly concerns the machine tools' positions with respect to places where supply goods are delivered and other tools are stationed, often neglecting the paths that workers need to take on the shop floor. This impairs a factory's productivity, as machines may need to wait for workers who have just operated another machine and are still on their way due to the long distance between the machines. In this work, we present BlueCollar, a visual analytics approach that supports layout planners in exploring and optimizing existing factory layouts with respect to the paths taken by workers. Planners can visually inspect the paths that workers need to take based on their work schedule and the factory's layout. An estimation of distribution algorithm supports them in choosing which layout elements, e.g., shared tool caches, to relocate. Its intermediate and final results are used to provide visual cues for suitable relocation areas and to suggest new layouts automatically. We demonstrate our approach through an application scenario based on a realistic prototype layout provided by an external company.
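    As a rough illustration of the optimization step, the sketch below shows how an estimation of distribution algorithm can iteratively refit a Gaussian search distribution over candidate 2D positions for a relocatable layout element (e.g., a shared tool cache) so that the summed walking distance to the workstations that use it shrinks. The workstation coordinates and all parameters are assumptions for illustration; this is not the BlueCollar implementation.

        # Hypothetical sketch of an estimation of distribution algorithm (EDA)
        # proposing a 2D position for a relocatable layout element so that the
        # summed walking distance to the workstations that use it is minimised.
        import numpy as np

        rng = np.random.default_rng(0)
        stations = np.array([[2.0, 1.0], [8.0, 3.0], [5.0, 9.0]])  # assumed workstation positions

        def cost(pos):
            # total Euclidean walking distance from a candidate position to all stations
            return np.linalg.norm(stations - pos, axis=1).sum()

        mean, cov = np.array([5.0, 5.0]), np.eye(2) * 9.0  # initial search distribution
        for generation in range(30):
            samples = rng.multivariate_normal(mean, cov, size=200)        # sample candidate positions
            elite = samples[np.argsort([cost(s) for s in samples])[:20]]  # keep the best 10 %
            mean = elite.mean(axis=0)                                     # refit the distribution
            cov = np.cov(elite.T) + 1e-6 * np.eye(2)

        print("suggested relocation position:", mean.round(2), "cost:", round(cost(mean), 2))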

    Three Essays on Estimating and Forecasting Residential Markets

    This dissertation covers three different aspects of estimating and forecasting residential real estate markets. Chapter 1: The first chapter examines whether there are differences between the long- and short-term relationship of house prices and interest rates. From a theoretical perspective, the long-run elasticity of house prices to monetary policy changes, e.g. via interest rates, is negative. However, house prices adapt dynamically in the short run to economic, financial, institutional and demographic factors. In this chapter, we confirm the aforementioned negative relationship for the Nordic housing markets but provide evidence of drastic deviations in the short run, especially in the context of the financial crisis. Chapter 2: The second chapter tests the prediction accuracy of two innovative methods proposed in the hedonic debate: the Geographically Weighted Regression (GWR) and the Generalized Additive Model (GAM). We compare the predictions of linear, spatial and non-linear hedonic models based on a very large dataset in Germany. The results provide evidence for a clear disadvantage of the GWR model in out-of-sample forecasts. However, the simplicity of the OLS approach is not substantially outperformed by the semi-parametric approach. Since sample size is essential when estimating and forecasting hedonic models, this study covers more than 570,000 observations, which is – to the authors' knowledge – one of the largest datasets used for spatial real estate analysis. Chapter 3: Google Trends offers virtually unlimited, instantaneously available, spatially and textually adjustable and, in addition, free data. Real estate markets appear to be particularly well suited for search-volume-related studies, as the “products” of this market involve a large financial commitment, which demands an extensive information-gathering process. Although Google Trends data have been accessible since 2008, many misunderstandings about their interpretation and usage can be found in the literature. The third chapter therefore focuses on two main objectives: firstly, I give an overview of what Google data actually is and what the potential pitfalls are; secondly, I conduct an empirical analysis to find out whether the results remain in line with the literature after accounting for those difficulties.
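    To make the out-of-sample comparison in Chapter 2 concrete, here is a hedged sketch that contrasts a linear hedonic model (OLS) with an additive spline model standing in for a GAM on synthetic data. The feature names, sample size, data-generating process and error metric are placeholders, not the chapter's dataset or specification.

        # Hedged sketch: linear hedonic model vs. an additive spline model
        # (GAM-like) compared on held-out data with synthetic house prices.
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.metrics import mean_absolute_percentage_error
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import SplineTransformer

        rng = np.random.default_rng(1)
        n = 5000
        X = np.column_stack([rng.uniform(30, 200, n),   # living area in sqm (assumed feature)
                             rng.uniform(0, 30, n)])    # distance to city centre in km (assumed feature)
        price = (50_000 + 3_000 * X[:, 0]
                 - 15_000 * np.log1p(X[:, 1])
                 + rng.normal(0, 10_000, n))            # synthetic, mildly non-linear prices

        X_tr, X_te, y_tr, y_te = train_test_split(X, price, test_size=0.2, random_state=0)

        ols = LinearRegression().fit(X_tr, y_tr)
        gam_like = make_pipeline(SplineTransformer(n_knots=8), LinearRegression()).fit(X_tr, y_tr)

        for name, model in [("OLS", ols), ("additive splines", gam_like)]:
            print(name, "MAPE:", round(mean_absolute_percentage_error(y_te, model.predict(X_te)), 4))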

    The Geriatric-Oncologic Conference, a new approach in decision-making

    Oncologists decide daily, on the basis of age and clinical impression, which patients should receive which chemotherapy. For elderly patients, owing to a lack of evidence, these decisions depend on the personal experience and "treatment philosophy" of the individual treating oncologist. A comprehensive geriatric assessment has been proposed so that this decision can be made in a data-based and reproducible way. Both strategies have advantages and disadvantages. The present work describes and further investigates a synthesis of the two approaches. In this combined approach, a joint decision is made within a geriatric-oncologic conference on the basis of a comprehensive geriatric assessment, additionally involving the treating physician, an experienced oncologist, a geriatrician, and the professional groups that carried out the assessment. Using comprehensive geriatric assessment data from more than 400 patients, collected routinely at this clinic in 2014 and 2015 for all inpatients aged 65 years or older before initiation of a specific therapy, it was shown that the decision of a geriatric-oncologic conference deviates substantially both from a classification based on the assessment alone and from the personal judgement of the treating physician. A statistical model was built that predicts, from the assessment data, how patients would be classified by a geriatric-oncologic conference. Until the conceivable decision strategies have been validated and evaluated further in prospective studies, a geriatric-oncologic conference remains best suited to judging the treatment fitness of geriatric oncology patients, since it can take all available information into account and thus ensure maximum patient safety.

    Background: A comprehensive geriatric assessment (CGA) is recommended before treating elderly cancer patients. However, it is not proven that additional information from the CGA will change the treatment decision. Methods: 421 cancer patients, 65 years or older, were judged by their treating oncologist regarding fitness for chemotherapy. In addition, a CGA was performed and each patient was discussed in a multidisciplinary board (MB) including a geriatrician. The differences between the judgements of the treating oncologist, the MB, and a classification based solely on the CGA were examined. Additionally, a statistical model of the decision-making process within the MB, based on the findings of the CGA, was established and evaluated. Results: The treating oncologist and the MB judged 12% and 15% of the patients as frail, 41% and 38% as vulnerable, and 46% and 47% as fit, respectively; congruence of 83% was observed. Based on the proposal of Balducci, 55% of the patients were classified as frail, 30% as vulnerable and 15% as fit; congruence of 34% with the treating oncologist's judgement was observed. In the 2-stage logistic model, the activities of daily living and the mini mental state examination (MMSE) discriminated between frail and vulnerable or fit. Tinetti test, age, Charlson comorbidity index, living alone, MMSE and mini nutritional assessment discriminated between vulnerable and fit. The statistical models were able to differentiate with an accuracy of 95% between frail and vulnerable or fit, and of 83% between vulnerable and fit. Conclusions: In our experience, the judgement of an experienced oncologist is well comparable with that of an MB. Nevertheless, for some patients the discussion of CGA data in the MB may essentially change treatment decisions. A logistic regression model of the decision-making process within the MB may replace the elaborate team discussion if a conference is not feasible.
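    A minimal, purely illustrative sketch of such a 2-stage logistic classification (frail vs. vulnerable/fit in the first stage, vulnerable vs. fit in the second) is given below. The variable names follow the abstract, but the scales, synthetic data and decision rules are assumptions, not the published model.

        # Illustrative 2-stage logistic classification on synthetic CGA-like data.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(2)
        n = 400
        X = np.column_stack([
            rng.integers(0, 7, n),      # ADL score (assumed scale)
            rng.integers(10, 31, n),    # MMSE
            rng.integers(10, 29, n),    # Tinetti test
            rng.integers(65, 95, n),    # age
        ]).astype(float)
        frail = (X[:, 0] < 3).astype(int)                        # synthetic labels
        fit = ((X[:, 2] > 20) & (frail == 0)).astype(int)

        stage1 = LogisticRegression(max_iter=1000).fit(X[:, :2], frail)   # frail vs. vulnerable/fit
        not_frail = frail == 0
        stage2 = LogisticRegression(max_iter=1000).fit(X[not_frail], fit[not_frail])  # vulnerable vs. fit

        def classify(x):
            if stage1.predict(x[None, :2])[0] == 1:
                return "frail"
            return "fit" if stage2.predict(x[None, :])[0] == 1 else "vulnerable"

        print(classify(np.array([6.0, 29.0, 26.0, 70.0])))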

    Development of a proposal for a uniform basic documentation across sectors and professional groups for child, adolescent and family psychosomatics – development and evaluation of a quality assurance instrument from a multidisciplinary perspective, with particular consideration of capturing the social dimension of mental illness

    Background: In psychiatry, psychosomatics, psychotherapy and related medical specialties (the "P disciplines"), basic documentation systems (BaDos) are used as instruments for quality assurance (QA). With the Psy-BaDo-PTM, a BaDo spanning institutions, sectors and professional groups has become established for the psychosomatic treatment of adult patients. A comparable instrument for child, adolescent and family psychosomatics has so far been lacking. Objective: The aim of the present work was to develop a proposal for a BaDo for child, adolescent and family psychosomatics that spans institutions, sectors and professional groups. Because of their relevance for therapy evaluation, the psychometric properties of the SDQ and the KINDL-R were evaluated. The potential of the developed BaDo to capture the social dimension of mental illness and socio-therapeutic interventions was examined. Methods: Following a systematic literature search, a qualitative document analysis identified the overlap between BaDos for children and adolescents and those for adults. Aspects specific to children and adolescents were identified. Based on descriptive statistical analyses and conceptual considerations, a catalogue of items for child, adolescent and family psychosomatics was compiled. The psychometric properties of the German-language versions of the SDQ and the KINDL-R were evaluated by meta-analytic aggregation of existing findings. An appraisal was carried out on the basis of a criteria catalogue of the Deutsches Kollegium für Psychosomatische Medizin (DKPM). The factorial structure and known-groups validity of the Kid-KINDL were examined by secondary analysis of a dataset of mentally ill children aged 8 to 11 years who were in inpatient psychosomatic treatment at the time of data collection. Based on a theoretical derivation of psychosocial context factors, BaDo items were identified through which mental illness can be contextualized and socio-therapeutic interventions can be evaluated. Results: A BaDo for child and adolescent psychosomatics was developed. Specificity for children and adolescents was achieved by recording biographical and family-history data. The collection of sociodemographic characteristics was conceptually adapted. Despite some limitations, the summary appraisal according to DKPM criteria supported the use of the SDQ and KINDL-R in QA. In the evaluation of the Kid-KINDL, a model with cross-loadings of illness-associated items on the HRQoL dimensions and two psychopathology factors uncorrelated with the HRQoL factors showed the best comparative model fit. The Psy-BaDo-PTM-KiJu permits a relatively broad recording of psychosocial context factors. From the perspective of clinical social work, however, the evaluation of socio-therapeutic interventions appears insufficient. Discussion: The Psy-BaDo-PTM-KiJu is currently being tested in practice. The BaDo should be revised on the basis of this experience. In the longer term, consensus approval of the item catalogue by professional and specialist societies would be desirable. With the SDQ and the KINDL-R, two internationally widespread instruments could be integrated into the BaDo.
    However, the study also showed that systematic reviews of the psychometric properties of generic psychodiagnostic instruments for children and adolescents are lacking. The analysis of the Kid-KINDL gave clear indications that, owing to item cross-loadings, some HRQoL dimensions are not validly captured by the instrument. Despite these limitations, the Kid-KINDL appears suitable for capturing the HRQoL dimensions self-esteem, family and friends. It could be shown that BaDo data are indeed of interest for analysing psychosocial factors. Adequate consideration of socio-therapeutic services in BaDos would help to represent the multidisciplinary psychosomatic treatment spectrum in QA.
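    As a rough, hypothetical illustration of what a meta-analytic aggregation of psychometric coefficients can look like, the sketch below pools per-study correlation-type coefficients via Fisher's z transformation with inverse-variance (fixed-effect) weighting. The coefficients and sample sizes are made up; this is not the thesis' actual analysis or data.

        # Illustrative fixed-effect pooling of correlation-type coefficients.
        import numpy as np

        r = np.array([0.72, 0.65, 0.80, 0.70])   # hypothetical per-study coefficients
        n = np.array([250, 180, 320, 210])       # hypothetical sample sizes

        z = np.arctanh(r)                        # Fisher z transform
        w = n - 3                                # inverse of var(z) = 1/(n - 3)
        z_pooled = np.sum(w * z) / np.sum(w)
        se = 1 / np.sqrt(np.sum(w))
        r_pooled = np.tanh(z_pooled)
        ci = np.tanh([z_pooled - 1.96 * se, z_pooled + 1.96 * se])

        print(f"pooled r = {r_pooled:.3f}, 95% CI = [{ci[0]:.3f}, {ci[1]:.3f}]")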

    WHAT WE NEED FOR ENCODING OF MEMORY AND EMOTIONAL RECONSOLIDATION

    Background: It is known that an interactive design and strong participant involvement strengthen the motivation to engage in learning processes. Previous research suggests attitude-behaviour consistency, with subjective meaning and interest in learning being relevant. This observational study aims to measure the attitudes of medical students. Methods: The connotative meaning and perception of e-learning were explored. A semantic differential scale was given to all students (N=328) of a case-based blended-learning (CBBL) course; 296 medical students were included in this study. Results: The online-survey completion rate was 100%. An exploratory principal components analysis with varimax rotation was performed. Five components could be extracted that explained 47.21% of the total variance. The five components are best described by the following adjectives taken from the item pool: “soft, emotional, playful”, “clear and organised”, “vigorous and serious”, “vivid and outgoing”, “economical and introverted”. An additional qualitative analysis revealed relevant positive connotations ascribed to e-learning by the students: freedom in time and space for learning, an interdisciplinary approach and communication, playfulness, and a clear, structured procedure. Conclusion: Our study demonstrates that a specific set of aspects is essential for students to feel comfortable and affectively and cognitively engaged to learn and achieve the best exam grades.
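    The sketch below illustrates, on placeholder data, the kind of analysis described in the Results: an exploratory principal components analysis followed by a varimax rotation of the loading matrix. The number of items, the rating scale and the random ratings are assumptions made only to keep the example runnable.

        # Illustrative PCA with a standard varimax rotation of the loadings.
        import numpy as np
        from sklearn.decomposition import PCA

        def varimax(loadings, gamma=1.0, iters=100, tol=1e-6):
            # standard varimax rotation of a loading matrix (items x components)
            p, k = loadings.shape
            R = np.eye(k)
            var = 0.0
            for _ in range(iters):
                L = loadings @ R
                u, s, vt = np.linalg.svd(
                    loadings.T @ (L ** 3 - (gamma / p) * L @ np.diag((L ** 2).sum(axis=0))))
                R = u @ vt
                new_var = s.sum()
                if new_var < var * (1 + tol):
                    break
                var = new_var
            return loadings @ R

        rng = np.random.default_rng(3)
        ratings = rng.integers(1, 8, size=(296, 24)).astype(float)  # 296 students x 24 adjective pairs (assumed)
        ratings -= ratings.mean(axis=0)

        pca = PCA(n_components=5).fit(ratings)
        loadings = pca.components_.T * np.sqrt(pca.explained_variance_)  # item loadings
        rotated = varimax(loadings)
        print("explained variance:", round(pca.explained_variance_ratio_.sum() * 100, 1), "%")
        print("rotated loadings shape:", rotated.shape)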

    Numerical Investigation of the Effects of Post-Combustion due to Fuel Outflow in Bleed Engine Cycles of a Retro Propulsion-Assisted Launch Vehicle

    Reusable launch vehicles (RLV) have the potential to be a resource- and cost-efficient alternative to conventional space transport systems. Several first stages of RLVs are in the maturing process, and the European long-term strategy aims at the development and characterization of RLV-relevant technologies for the next generation of launchers. We base our studies on the EU-funded Retro Propulsion Assisted Landing Technologies (RETALT) project, which was formed with the goal of investigating Vertical Take-off Vertical Landing (VTVL) launch vehicles. In this paper, the first stage of the VTVL Two Stage to Orbit (TSTO) RETALT1 configuration is used for the assessment of thermal loads along the flight trajectory. The mission plan for the first stage of the RETALT1 vehicle is to return either to the launch pad or to a drone ship via a re-entry burn and a retro propulsion maneuver. During this retro propulsion phase, high thermal loads act on the rocket structure, especially on the landing legs, the base plate and the aerodynamic control surfaces. These thermal loads due to the main engine exhaust of the RLV have been characterized in previous studies by Laureti et al. Little research has been devoted to post-combustion due to the outflow of gas generators and air vents of cryogenic fuel tanks in VTVL configurations in general. Owing to these secondary exhaust jets, unburned hydrogen is ejected near the high-temperature outflow of the main engines, which could lead to significant post-combustion with the surrounding atmospheric oxygen and to deviating thermal loads along the vital parts of the rocket structure. In order to assess the additional influence of the post-combustion on the thermal loads, Computational Fluid Dynamics (CFD) simulations are carried out using the DLR-TAU code with the Reynolds-Averaged Navier-Stokes (RANS) method. As the post-combustion of the nozzle and gas-generator outflow is to be observed, a reduced Jachimowsky mechanism for a species mixture of the liquid hydrogen/liquid oxygen combustion mixture and ambient air is applied as the chemistry model. In this publication, the validity of the computational mesh is examined by means of a GCI study, and the influence of different turbulence modeling approaches is investigated. Based on these results, first observations of the flow field characteristics and of the thermal loads acting on the RLV are made in order to identify suitable simulation parameters, including crucial points along the flight trajectory, for investigations of the heat flux distribution across the surface structures of the vehicle.
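    As a brief worked example of the mesh-validity check mentioned above, the following sketch computes a grid convergence index (GCI) in the usual three-grid fashion (observed order of convergence, relative error on the fine grid, safety factor). The solution values and refinement ratio are placeholders, not results from this study.

        # Illustrative grid convergence index (GCI) estimate for three
        # systematically refined meshes (fine f1, medium f2, coarse f3).
        import math

        f1, f2, f3 = 412.0, 420.5, 447.0   # assumed solution values, e.g. a peak heat flux
        r = 2.0                            # constant grid refinement ratio (assumed)
        Fs = 1.25                          # safety factor for three-grid studies

        p = math.log(abs(f3 - f2) / abs(f2 - f1)) / math.log(r)   # observed order of convergence
        e21 = abs((f2 - f1) / f1)                                  # relative error, fine vs. medium
        gci_fine = Fs * e21 / (r ** p - 1)                         # GCI on the fine grid

        print(f"observed order p = {p:.2f}, GCI_fine = {gci_fine * 100:.2f} %")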

    Accelerating galaxy dynamical modeling using a neural network for joint lensing and kinematics analyses

    Strong gravitational lensing is a powerful tool to provide constraints on galaxy mass distributions and cosmological parameters, such as the Hubble constant H0. Nevertheless, inference of such parameters from images of lensing systems is not trivial as parameter degeneracies can limit the precision in the measured lens mass and cosmological results. External information on the mass of the lens, in the form of kinematic measurements, is needed to ensure a precise and unbiased inference. Traditionally, such kinematic information has been included in the inference after the image modeling, using spherical Jeans approximations to match the measured velocity dispersion integrated within an aperture. However, as spatially resolved kinematic measurements become available via IFU data, more sophisticated dynamical modeling is necessary. Such kinematic modeling is expensive, and constitutes a computational bottleneck which we aim to overcome with our Stellar Kinematics Neural Network (SKiNN). SKiNN emulates axisymmetric modeling using a neural network, quickly synthesizing from a given mass model a kinematic map which can be compared to the observations to evaluate a likelihood. With a joint lensing plus kinematic framework, this likelihood constrains the mass model at the same time as the imaging data. We show that SKiNN's emulation of a kinematic map is accurate to considerably better precision than can be measured (better than 1% in almost all cases). Using SKiNN speeds up the likelihood evaluation by a factor of ~200. This speedup makes dynamical modeling economical, and enables lens modelers to make effective use of modern data quality in the JWST era. Comment: 13 pages, 9 figures, submitted to Astronomy & Astrophysics.
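    The following toy sketch conveys the emulator idea on a made-up problem: a neural network is trained on precomputed (here, fake and deliberately cheap) kinematic maps as a function of a few mass-model parameters, and the trained network is then called inside a Gaussian likelihood. None of the parameter names, map sizes or functions correspond to SKiNN's actual architecture or data.

        # Toy emulator: train a network on precomputed maps, then use it in a likelihood.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(4)
        n_train, n_pix = 2000, 10 * 10
        theta = rng.uniform([0.5, 0.1, 0.2], [2.5, 0.9, 1.0], size=(n_train, 3))  # assumed parameters

        def expensive_dynamical_model(t):
            # stand-in for a costly axisymmetric kinematic computation
            x = np.linspace(-1, 1, 10)
            rr = np.hypot(*np.meshgrid(x, x))
            return (200 * t[0] * np.exp(-t[1] * rr) + 30 * t[2]).ravel()

        maps = np.array([expensive_dynamical_model(t) for t in theta])
        emulator = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500).fit(theta, maps)

        def log_likelihood(t, observed_map, sigma=5.0):
            predicted = emulator.predict(t[None, :])[0]   # fast emulator call instead of the full model
            return -0.5 * np.sum(((observed_map - predicted) / sigma) ** 2)

        obs = expensive_dynamical_model(np.array([1.2, 0.5, 0.6])) + rng.normal(0, 5, n_pix)
        print("log L near the true parameters:", round(log_likelihood(np.array([1.2, 0.5, 0.6]), obs), 1))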

    Differences in Fabry Cardiomyopathy Between Female and Male Patients: Consequences for Diagnostic Assessment

    Objectives: We hypothesized that Fabry cardiomyopathy in female patients might differ substantially from that in male patients and sought to prove this hypothesis in a large cohort consisting of 104 patients with Fabry disease. Background: Fabry cardiomyopathy in male patients is characterized by left ventricular (LV) hypertrophy, impaired myocardial function, and subsequent progressive myocardial fibrosis. In contrast, the occurrence of these 3 cardiomyopathic hallmarks in female patients remains unknown. Methods: In 104 patients (58 females, age 42 ± 16 years; 46 males, age 42 ± 13 years) with genetically proven Fabry disease, LV hypertrophy, regional myocardial deformation and myocardial fibrosis were assessed by standard echocardiography, strain rate imaging, and cardiac magnetic resonance (CMR) imaging–guided late enhancement (LE). Results: In men, end-diastolic left ventricular wall thickness (LVWT) ranged from 6 to 19.5 mm (LV mass by CMR 55 to 200 g/m2), and LE was never seen with LVWT <12 mm (LV mass <99 g/m2). In contrast, in female patients LVWT ranged from 5 to 15.5 mm, LV mass ranged from 39 to 146 g/m2, and LE was already detectable with an LVWT of 9 mm (LV mass 56 g/m2). When LV mass was examined by CMR, LE was detected in 23% of the female patients without hypertrophy (n = 9), whereas LE was never seen in male patients with normal LV mass. LE was always associated with a low systolic strain rate, but the severity of impairment was independent of LVWT in female patients (lateral strain rate in patients with LV hypertrophy with LE −0.7 ± 0.2 s−1; patients without LV hypertrophy with LE −0.8 ± 0.2 s−1; p = 0.45). Conclusions: In contrast to male patients, the loss of myocardial function and the development of fibrosis do not necessarily require myocardial hypertrophy in female patients with Fabry disease. Thus, in contrast to current recommendations, initial cardiac staging and monitoring in female patients with Fabry disease should be based not only on LV hypertrophy but also on replacement fibrosis.

    Laser-based 3D printing of hydrogel barrier models for microfluidic applications

    The placenta secures the survival and development of the fetus. Placental tissue connects the fetus with the mother and is responsible for endogenous and exogenous material transfer. The maternal and fetal blood are separated by the so-called placental barrier, which is made up of the trophoblastic syncytium and the fetal capillary wall. Research in the field of placenta biology is challenging, as current approaches are difficult to perform, time-consuming and often carry the risk of harming the fetus. The establishment of a reproducible in-vitro model simulating placental transport is necessary to study fetal development and to identify underlying causes of maldevelopment. In this study, a photosensitive hydrogel material, in combination with two-photon polymerisation, was used to produce high-resolution structures with nanometre-precision geometries. Gelatine modified with methacrylamide and amino-ethyl-methacrylate (GelMOD AEMA) was crosslinked within a customised microfluidic device under the addition of a photoinitiator, separating the chip into two compartments (Figure 1). The fetal compartment contains HUVEC cells, which are cultivated in EGM2, while BeWo B30 cells are supplied with DMEM Ham-F12 to mimic the maternal compartment. This microfluidic approach, in combination with native flow profiles, can be used to precisely remodel the microenvironment of placental tissue. The establishment of a functional placenta-on-a-chip model allows the modulation of different clinical and biological scenarios in the future. A potential application is the simulation of altered sugar transport across the placental membrane and the evaluation of the effects of an altered nutrient balance in utero.
