2,188 research outputs found

    A Bayesian time-to-event pharmacokinetic model for sequential phase I dose-escalation trials with multiple schedules

    Full text link
    Phase I dose-escalation trials constitute the first step in investigating the safety of potentially promising drugs in humans. Conventional methods for phase I dose-escalation trials are based on a single treatment schedule only. More recently, however, multiple schedules have increasingly been investigated within the same trial. Here, we consider sequential phase I trials, where the trial proceeds with a new schedule (e.g. daily or weekly dosing) once the dose escalation with another schedule has been completed. The aim is to utilize the information from both the completed and the ongoing dose-escalation trials to inform decisions on the dose level for the next dose cohort. For this purpose, we adapted the time-to-event pharmacokinetic (TITE-PK) model, which was originally developed for the simultaneous investigation of multiple schedules. TITE-PK integrates information from multiple schedules using a pharmacokinetic (PK) model. In a simulation study, the developed approach is compared to the bridging continual reassessment method and the Bayesian logistic regression model using a meta-analytic prior. TITE-PK performs better than the comparators in terms of recommending acceptable doses and avoiding overly toxic doses for sequential phase I trials in most of the scenarios considered. Furthermore, this better performance is achieved while requiring a similar number of patients in the simulated trials. For the scenarios involving one schedule, TITE-PK displays performance similar to the alternatives in terms of acceptable dose recommendations. The \texttt{R} and \texttt{Stan} code for the implementation of an illustrative sequential phase I trial example is publicly available at https://github.com/gunhanb/TITEPK_sequential.
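    The following is a minimal sketch of the core idea described in the abstract, not the authors' released \texttt{R}/\texttt{Stan} implementation (which is at the linked repository): a simple one-compartment PK model translates different dosing schedules into a common exposure scale, and an illustrative exposure-toxicity link turns that exposure into a probability of a dose-limiting toxicity. All rate constants, doses, and the toxicity slope below are invented for illustration.

```python
import numpy as np

# Illustrative one-compartment PK model with first-order absorption.
# KA, KE and the exposure-toxicity slope beta are made-up values.
KA, KE = 1.0, 0.1  # absorption / elimination rate constants (per hour)

def concentration(t, doses, dose_times):
    """Drug concentration at time t (hours) after a sequence of doses."""
    c = 0.0
    for dose, t_dose in zip(doses, dose_times):
        dt = t - t_dose
        if dt > 0:
            c += dose * KA / (KA - KE) * (np.exp(-KE * dt) - np.exp(-KA * dt))
    return c

def peak_exposure(doses, dose_times, horizon=7 * 24):
    """Peak concentration over one week -- a common exposure scale on which
    daily and weekly schedules can be compared."""
    grid = np.arange(0.0, horizon, 0.5)
    return max(concentration(t, doses, dose_times) for t in grid)

def p_dlt(exposure, beta=0.02):
    """Illustrative exposure-toxicity link: P(dose-limiting toxicity)."""
    return 1.0 - np.exp(-beta * exposure)

# Same total weekly dose, two schedules: 10 mg daily vs. 70 mg once weekly.
daily = peak_exposure([10] * 7, [24 * i for i in range(7)])
weekly = peak_exposure([70], [0])
print(f"daily  schedule: peak {daily:.1f}, P(DLT) ~ {p_dlt(daily):.2f}")
print(f"weekly schedule: peak {weekly:.1f}, P(DLT) ~ {p_dlt(weekly):.2f}")
```

    Putting both schedules on the same exposure scale is what lets data from a completed schedule inform dose decisions for the next one; the actual TITE-PK model does this within a Bayesian time-to-event framework rather than the point estimates shown here.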

    Rock, Rap, or Reggaeton?: Assessing Mexican Immigrants' Cultural Assimilation Using Facebook Data

    Full text link
    The degree to which Mexican immigrants in the U.S. are assimilating culturally has been widely debated. To examine this question, we focus on musical taste, a key symbolic resource that signals the social positions of individuals. We adapt an assimilation metric from earlier work to analyze self-reported musical interests among immigrants on Facebook. We use relative levels of interest in musical genres, where similarity to the host population in musical preferences is treated as evidence of cultural assimilation. Contrary to skeptics of Mexican assimilation, we find significant cultural convergence even among first-generation immigrants, which problematizes their use as assimilative "benchmarks" in the literature. Further, second-generation Mexican Americans show high cultural convergence vis-à-vis both Anglos and African-Americans, with the exception of those who speak Spanish. Rather than conforming to a single assimilation path, our findings reveal how Mexican immigrants defy simple unilinear theoretical expectations and illuminate their uniquely heterogeneous character.
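    A hedged sketch of the kind of comparison the abstract describes: represent each group by its distribution of relative interest across musical genres and score assimilation as the similarity between an immigrant cohort's distribution and the host population's. The genre labels, numbers, and the use of cosine similarity below are illustrative assumptions; the paper's metric is adapted from earlier work and may differ.

```python
import numpy as np

# Invented relative-interest shares across genres; each row sums to 1.
GENRES = ["rock", "rap", "reggaeton", "country", "banda"]
host_population   = np.array([0.35, 0.25, 0.05, 0.30, 0.05])
first_generation  = np.array([0.15, 0.10, 0.35, 0.05, 0.35])
second_generation = np.array([0.30, 0.25, 0.15, 0.20, 0.10])

def cultural_similarity(group, reference):
    """Cosine similarity between genre-interest distributions; 1.0 means
    identical relative preferences (an illustrative stand-in for the
    paper's assimilation metric)."""
    return float(group @ reference /
                 (np.linalg.norm(group) * np.linalg.norm(reference)))

for name, dist in [("1st gen", first_generation), ("2nd gen", second_generation)]:
    print(name, round(cultural_similarity(dist, host_population), 3))
```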

    Design and Implementation of a Documentation Tool for Interactive Commandline Sessions

    Full text link
    In digital investigations it is important to document the examination of a computer system in as much detail as possible. Although it was never designed for digital investigations, many experts use the software script to record their whole terminal session while analyzing a target system. We analyze script's deficiencies and present the design and implementation of forscript (forensic script), a tool that provides additional capabilities and mechanisms for digital investigations.
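    script and forscript themselves are C programs; as a rough sketch of the underlying mechanism (capturing everything that crosses a pseudo-terminal, here with timestamps added as one example of extra forensic metadata), the following uses Python's pty module. The log file name and line format are illustrative assumptions, not forscript's actual output format.

```python
import os
import pty
import time

LOG_PATH = "session.log"  # illustrative output file, not forscript's format

def make_logger(log_file):
    """Return a read callback for pty.spawn that timestamps terminal output."""
    def read_and_log(fd):
        data = os.read(fd, 1024)
        if data:
            log_file.write(f"[{time.time():.3f}] ".encode() + data)
            log_file.flush()
        return data  # still forwarded to the real terminal
    return read_and_log

if __name__ == "__main__":
    # Unix-only: run an interactive shell inside a pseudo-terminal; everything
    # the shell writes is logged with a timestamp and echoed to the user.
    shell = os.environ.get("SHELL", "/bin/sh")
    with open(LOG_PATH, "ab") as log_file:
        pty.spawn([shell], make_logger(log_file))
```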

    Keep Them out of It!: How Information Externalities Affect the Willingness to Sell Personal Data Online

    Get PDF
    When individuals provide their personal data online, they often disregard that this also allows third parties to learn about others. Our large-scale online experiment reveals that individuals are less willing to provide personal data when sharing can compromise others’ privacy by enabling personalized price discrimination. Compared to a benchmark without data compromise, individuals’ willingness to sell data decreases when others’ data is compromised with 50% or 100% probability. By applying two well-studied interventions – peer effects and a social norm focus – we explore ways to mitigate excessive data sharing, laying the groundwork for policy design. While peer effects, on average, increase individuals’ willingness to provide personal data, making people reflect on the appropriate behavior appears to be a promising social nudge to reduce excessive data sharing.

    The Impact of a Millisecond: Measuring Latency Effects in Securities Trading

    Get PDF
    In the course of technological evolution, securities markets offer low-latency access to their customers. Although latency figures are used as marketing instruments, little research sheds light on the meaning of those figures. This paper provides a performance measure of the effect of latency in the context of the competitive advantage of IT. Based on a historical dataset from Deutsche Börse’s electronic trading system Xetra, an empirical analysis is conducted. In this way we quantify and qualify the impact of latency from a customer’s point of view.
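    The abstract does not spell out the measure itself; one hedged way to make a latency figure concrete from an event-level dataset such as Xetra's is to ask how often the best quote changes within a given latency window, i.e. how often a trader with that latency would act on an already-stale quote. The file name, column name, and this particular proxy are assumptions for illustration, not the paper's method.

```python
import numpy as np
import pandas as pd

def stale_quote_rate(quote_times_ms: np.ndarray, latency_ms: float) -> float:
    """Fraction of best-quote updates followed by another update within
    `latency_ms` -- a rough proxy for how often a trader with that latency
    observes a quote that is already stale on arrival."""
    gaps = np.diff(np.sort(quote_times_ms))
    return float(np.mean(gaps < latency_ms))

# Assumed input: one row per best-quote update with a millisecond timestamp.
quotes = pd.read_csv("xetra_quotes.csv")  # hypothetical file and schema
times = quotes["timestamp_ms"].to_numpy()
for latency in (1, 10, 50):
    print(f"{latency:>3} ms latency: {stale_quote_rate(times, latency):.1%} stale")
```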