42 research outputs found

    Concept for support and cleavage of brittle crystals

    Get PDF
    Tröger L, Schütte J, Ostendorf F, Kühnle A, Reichling M. Concept for support and cleavage of brittle crystals. Review of Scientific Instruments. 2009;80(6):063703. We report on sample holders for crystals to be cleaved for the preparation of surfaces with large atomically flat terraces. The concept for mounting sample crystals is based on a vice-like clamping mechanism that securely holds the crystal in position while reducing the risk of fragmentation. Sample holders based on this concept and made of suitable materials allow preparation and cleavage of crystals in ultrahigh vacuum at high or low temperatures. To cleave the crystal, we employ a scalpel blade mounted on a wobble stick to generate a highly localized stress field initiating the cleavage process. The sample holders are used for highest-resolution scanning force microscopy experiments; however, the concept can be transferred to any other system where cleavage faces of crystals are of interest. As an example, scanning force microscopy results demonstrate that (111) cleavage faces of CaF₂ crystals can be prepared with steps only a few F-Ca-F triple layers high and atomically flat terraces extending over areas of several µm².

    Clear Signature of the (2 × 1) Reconstruction of Calcite (101̄4)

    Get PDF
    Schütte J, Rahe P, Tröger L, et al. Clear Signature of the (2 × 1) Reconstruction of Calcite (101̄4). Langmuir. 2010;26(11):8295-8300. Calcite is a mineral of fundamental importance that plays a crucial role in many fields of research such as biomineralization, biomolecule adsorption, and reactivity, as well as in industrial and daily-life applications. Consequently, the most stable cleavage plane of calcite has been studied extensively using both direct imaging techniques such as atomic force microscopy as well as spectroscopic and diffraction techniques. Several surface structures differing from the simple bulk-truncated structure have been reported for the (101̄4) cleavage plane of calcite, and an ongoing controversy exists in the literature as to whether the cleavage plane exhibits a (2 × 1) reconstruction or not. We study the (101̄4) cleavage plane using high-resolution noncontact atomic force microscopy (NC-AFM) under ultrahigh vacuum conditions and obtain a clear signature of the (2 × 1) reconstruction. This reconstruction is observed in very narrow tip-surface distance ranges only, explaining why the reconstruction has been observed in some experiments and not in others. Moreover, as all sample preparation is performed in ultrahigh vacuum, the possibility of the (2 × 1) reconstruction being adsorbate-induced appears rather unlikely. Additionally, tip-induced surface changes are ruled out as the origin of the observed reconstruction. In conclusion, our study suggests that the (2 × 1) reconstruction is a true surface property of the (101̄4) cleavage plane of calcite.

    Modification of a commercial atomic force microscope for low-noise, high-resolution frequency-modulation imaging in liquid environment

    Get PDF
    Rode S, Stark R, Lübbe J, et al. Modification of a commercial atomic force microscope for low-noise, high-resolution frequency-modulation imaging in liquid environment. Review of Scientific Instruments. 2011;82(7):073703. A key issue for high-resolution frequency-modulation atomic force microscopy imaging in liquids is minimizing the frequency noise, which requires a detailed analysis of the corresponding noise contributions. In this paper, we present a detailed description for modifying a commercial atomic force microscope (Bruker MultiMode V with Nanoscope V controller), aiming at atomic-resolution frequency-modulation imaging in ambient and in liquid environment. Care was taken to maintain the AFM's original stability and ease of operation. The new system builds upon an optimized light source, a new photodiode, and an entirely new amplifier. Moreover, we introduce a home-built liquid cell and sample holder as well as a temperature-stabilized isolation chamber dedicated to low-noise imaging in liquids. The success of these modifications is measured by the reduction in the deflection sensor noise density from initially 100 fm/√Hz to around 10 fm/√Hz after modification. The performance of our instrument is demonstrated by atomically resolved images of calcite taken under liquid conditions. (C) 2011 American Institute of Physics. [doi: 10.1063/1.3606399]
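    The noise analysis the abstract refers to rests on a standard fact: uncorrelated noise sources add in quadrature, so the measured deflection sensor noise density is dominated by its largest contribution. A minimal sketch of that bookkeeping follows; the individual contribution values are illustrative assumptions, not figures from the paper.

    ```python
    import math

    def total_density(*densities_fm_per_rtHz):
        # Uncorrelated noise densities add in quadrature (root-sum-of-squares)
        return math.sqrt(sum(d ** 2 for d in densities_fm_per_rtHz))

    # Hypothetical example: an 8 fm/√Hz detection contribution combined with
    # a 6 fm/√Hz amplifier contribution yields 10 fm/√Hz in total, showing
    # why reducing the dominant source matters most.
    total = total_density(8.0, 6.0)
    ```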

    Effectiveness and safety of opicapone in Parkinson’s disease patients with motor fluctuations: the OPTIPARK open-label study

    Get PDF
    Background The efficacy and safety of opicapone, a once-daily catechol-O-methyltransferase inhibitor, have been established in two large randomized, placebo-controlled, multinational pivotal trials. Still, clinical evidence from routine practice is needed to complement the data from the pivotal trials. Methods OPTIPARK (NCT02847442) was a prospective, open-label, single-arm trial conducted in Germany and the UK under clinical practice conditions. Patients with Parkinson’s disease and motor fluctuations were treated with opicapone 50 mg for 3 (Germany) or 6 (UK) months in addition to their current levodopa and other antiparkinsonian treatments. The primary endpoint was the Clinician’s Global Impression of Change (CGI-C) after 3 months. Secondary assessments included Patient Global Impressions of Change (PGI-C), the Unified Parkinson’s Disease Rating Scale (UPDRS), Parkinson’s Disease Questionnaire (PDQ-8), and the Non-Motor Symptoms Scale (NMSS). Safety assessments included evaluation of treatment-emergent adverse events (TEAEs) and serious adverse events (SAEs). Results Of the 506 patients enrolled, 495 (97.8%) took at least one dose of opicapone. Of these, 393 (79.4%) patients completed 3 months of treatment. Overall, 71.3% and 76.9% of patients experienced any improvement on CGI-C and PGI-C after 3 months, respectively (full analysis set). At 6 months, for the UK subgroup only (n = 95), 85.3% of patients were judged by investigators as improved since commencing treatment. UPDRS scores at 3 months showed statistically significant improvements in activities of daily living during OFF (mean ± SD change from baseline: −3.0 ± 4.6, p < 0.0001) and motor scores during ON (−4.6 ± 8.1, p < 0.0001). The mean ± SD improvements of −3.4 ± 12.8 points for PDQ-8 and −6.8 ± 19.7 points for NMSS were statistically significant versus baseline (both p < 0.0001). Most TEAEs (94.8% of events) were of mild or moderate intensity. TEAEs considered to be at least possibly related to opicapone were reported for 45.1% of patients, with dyskinesia (11.5%) and dry mouth (6.5%) being the most frequently reported. Serious TEAEs considered at least possibly related to opicapone were reported for 1.4% of patients. Conclusions Opicapone 50 mg was effective and generally well tolerated in PD patients with motor fluctuations treated in clinical practice. Trial registration Registered in July 2016 at clinicaltrials.gov (NCT02847442).

    The ALICE experiment at the CERN LHC

    Get PDF
    ALICE (A Large Ion Collider Experiment) is a general-purpose, heavy-ion detector at the CERN LHC which focuses on QCD, the strong-interaction sector of the Standard Model. It is designed to address the physics of strongly interacting matter and the quark-gluon plasma at extreme values of energy density and temperature in nucleus-nucleus collisions. Besides running with Pb ions, the physics programme includes collisions with lighter ions, lower-energy running, and dedicated proton-nucleus runs. ALICE will also take data with proton beams at the top LHC energy to collect reference data for the heavy-ion programme and to address several QCD topics for which ALICE is complementary to the other LHC detectors. The ALICE detector has been built by a collaboration currently comprising over 1000 physicists and engineers from 105 institutes in 30 countries. Its overall dimensions are 16 × 16 × 26 m³ with a total weight of approximately 10 000 t. The experiment consists of 18 different detector systems, each with its own specific technology choice and design constraints, driven both by the physics requirements and the experimental conditions expected at LHC. The most stringent design constraint is to cope with the extreme particle multiplicity anticipated in central Pb-Pb collisions. The different subsystems were optimized to provide high momentum resolution as well as excellent Particle Identification (PID) over a broad range in momentum, up to the highest multiplicities predicted for LHC. This will allow for comprehensive studies of hadrons, electrons, muons, and photons produced in the collision of heavy nuclei. Most detector systems are scheduled to be installed and ready for data taking by mid-2008, when the LHC is scheduled to start operation, with the exception of parts of the Photon Spectrometer (PHOS), Transition Radiation Detector (TRD), and Electromagnetic Calorimeter (EMCal). These detectors will be completed for the high-luminosity ion run expected in 2010. This paper describes in detail the detector components as installed for the first data taking in the summer of 2008.

    Specification-driven dynamic binary translation

    Get PDF
    Machine emulation allows for the simulation of a real or virtual machine, the source machine, on various host computers. A machine emulator interprets programs that are compiled for the emulated machine, but normally at a much reduced speed. Therefore, in order to increase the execution speed of such interpreted programs, a machine emulator may apply different dynamic optimization techniques. In our research we focus on emulators for real machines, i.e. existing computer architectures, and in particular on dynamic binary translation as the optimization technique. With dynamic binary translation, the machine instructions of the interpreted source program are translated into machine instructions for the host machine during the interpretation of the program. Both the machine emulator and its dynamic binary translator are source and host machine specific, respectively, and are therefore traditionally hand-written. In this thesis we introduce the Walkabout/Yirr-Ma framework. Walkabout, initially developed by Sun Microsystems, allows among other things for the generation of instrumented machine emulators from a certain type of machine specification files. We extended Walkabout with our generic dynamic optimization framework ‘Yirr-Ma’, which defines an interface for the implementation of various dynamic optimizers: by instrumenting a Walkabout emulator’s instruction interpretation functions, Yirr-Ma observes and intercepts the interpretation of a source machine program, and applies dynamic optimizations to selected traces of interpreted instructions on demand. One instance of Yirr-Ma’s interface for dynamic optimizers implements our specification-driven dynamic binary translator, the major contribution of this thesis. At first we establish two things: a formal framework that describes the process of machine emulation by abstracting from real machines, and different classes of applicable dynamic optimizations. We define dynamic optimizations by a set of functions over the abstracted machine, and dynamic binary translation as one particular optimization function. Using this formalism, we then derive the upper bound for the quality of dynamically translated machine instructions. Yirr-Ma’s dynamic binary translator implements the optimization functions of our formal framework by modules which are either generated from, or parameterized by, machine specification files. They thus allow for the adaptation of the dynamic binary translator to different source and host machines without hand-writing machine-dependent code.
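    The interpret-then-translate flow described in the abstract can be illustrated with a toy sketch: an interpreter counts how often a trace of source instructions is executed and, past a threshold, swaps in a "translated" version. The two-instruction source machine, the hot-trace threshold, and the translation step (here just compiling the trace into a host closure rather than emitting real host machine code) are all hypothetical stand-ins for Yirr-Ma's actual machinery.

    ```python
    HOT_THRESHOLD = 2  # promote a trace after this many interpreted runs

    def interpret(op, arg, state):
        # Instruction interpretation functions for a toy source machine
        if op == "add":
            state["acc"] += arg
        elif op == "mul":
            state["acc"] *= arg

    def translate(trace):
        # "Translate" a trace into a single host callable; a real DBT
        # would emit host machine instructions here instead
        def compiled(state):
            for op, arg in trace:
                interpret(op, arg, state)
        return compiled

    def run(trace, state, iterations):
        counts, cache = {}, {}
        trace = tuple(trace)
        for _ in range(iterations):
            if trace in cache:
                cache[trace](state)            # fast path: translated trace
            else:
                for op, arg in trace:
                    interpret(op, arg, state)  # slow path: interpretation
                counts[trace] = counts.get(trace, 0) + 1
                if counts[trace] >= HOT_THRESHOLD:
                    cache[trace] = translate(trace)
        return state

    # Running the trace [acc += 3; acc *= 2] four times from acc = 0:
    state = run([("add", 3), ("mul", 2)], {"acc": 0}, 4)  # acc -> 90
    ```

    The essential point the sketch captures is the on-demand nature of the optimization: only traces that prove hot during interpretation pay the translation cost.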


    Do people make sufficiency-oriented mobile phone choices based on dynamic norms? The perception and effectiveness of sufficiency-promoting messages in online media

    No full text
    Excessive consumption poses a significant threat to the environment. Therefore, overall consumption reduction is necessary. The sufficiency strategy aims to achieve this reduction by enabling individual behavioral changes. In the context of mobile phone choices, one way to promote sufficiency is by encouraging consumers to refrain from purchasing new devices and instead prolong the lifespan of their existing ones. As companies play a major role in shaping products and market conditions that influence consumption, they have a responsibility to support consumers in adopting sufficiency-oriented mobile phone use. However, previous research on the effectiveness of sufficiency-promoting communication and the use of dynamic norms in behavior-change interventions has yielded mixed results. We conducted an online experiment with two specific objectives. First, we examined whether messages emphasizing a dynamic norm have positive impacts on individuals' intentions and behavioral choices regarding sufficiency-oriented mobile phone use. The results indicated no significant effects of a sufficiency-promoting message emphasizing a dynamic norm compared to two other conditions. Nevertheless, regardless of the message people received, one third of the individuals in all groups chose a sufficiency-oriented voucher. Materialism was found to positively predict both the intention to buy a new mobile phone and sufficiency-oriented behavior, partially contradicting our hypothesis. This suggests that sufficiency-oriented consumption may also be motivated by aspirations for status, albeit in the context of sufficiency. In contrast, personal norms for sufficiency consistently predicted intentions or behaviors aligned with sufficiency, although not always in the direction we hypothesized. Second, we investigated the motives consumers attribute to online media by comparing a search engine, as an unbiased source of information, with an online store driven by commercial interests. Our analysis revealed significant differences in consumers' perceptions, with altruistic motives attributed to the search engine and exploitative motives attributed to the online store. Our results contribute to the ongoing discussion about the requirements for effective communication strategies that promote sufficiency-oriented choices. We confirm the limited effectiveness of interventions based on dynamic norms in online settings. Nevertheless, our study offers valuable insights for designing future communication initiatives aimed at fostering sufficiency-oriented behaviors.

    4-D Scalable Multi-View Video Coding Using Disparity Compensated View Filtering and Motion

    No full text
    Abstract — In this paper, a novel framework for scalable multi-view video coding is described. A well-known wavelet-based scalable coding scheme for single-view video sequences has been adopted and extended to match the specific needs of scalable multi-view video coding. Motion-compensated temporal filtering (MCTF) is applied to each video sequence of each camera. The use of a wavelet lifting structure guarantees perfect invertibility of this step, and as a consequence of its open-loop architecture, SNR and temporal scalability are attained. Correlations between the temporal subbands of adjacent cameras are reduced by a novel disparity-compensated view filtering (DCVF) method, which is also lifting-based and open-loop to enable view scalability. Spatial scalability and entropy coding are achieved by the JPEG2000 spatial wavelet transform and EBCOT coding, respectively. Rate allocation along the temporal-view-filtered subbands is done by means of an RD-optimal algorithm. Experimental results show the high scaling capability in terms of SNR, temporal, and view scalability.
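    The perfect invertibility the abstract attributes to the lifting structure holds regardless of the predict/update filters used, which is what makes the open-loop MCTF/DCVF design possible. A minimal sketch with a simple Haar-like predict/update pair follows; the filters here are hypothetical and far simpler than the motion- and disparity-compensated filters of the paper, but the round-trip property is the same.

    ```python
    def lift_forward(signal):
        # Split into even/odd samples, then predict and update (Haar-like)
        even = signal[0::2]
        odd = signal[1::2]
        # Predict: high-pass detail = odd sample minus even-neighbor prediction
        high = [o - e for e, o in zip(even, odd)]
        # Update: low-pass = even sample plus half the detail (preserves mean)
        low = [e + h / 2 for e, h in zip(even, high)]
        return low, high

    def lift_inverse(low, high):
        # Undo update, undo predict, re-interleave: exact reconstruction
        even = [l - h / 2 for l, h in zip(low, high)]
        odd = [h + e for e, h in zip(even, high)]
        out = []
        for e, o in zip(even, odd):
            out.extend([e, o])
        return out

    samples = [4, 6, 10, 12, 8, 8, 2, 0]
    low, high = lift_forward(samples)
    assert lift_inverse(low, high) == samples  # invertible by construction
    ```

    Because the inverse simply re-applies the same predict/update steps with opposite signs, invertibility does not depend on any closed prediction loop, which is what yields the SNR and temporal scalability described above.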