42 research outputs found

    Impact of food, alcohol and pH on modified-release hydrocortisone developed to treat congenital adrenal hyperplasia.

    BACKGROUND: We developed a modified-release hydrocortisone, Chronocort®, to replace the cortisol rhythm in patients with congenital adrenal hyperplasia. Food, alcohol and pH affect drug absorption, and it is important to assess their impact when replicating a physiological rhythm. SUBJECTS AND METHODS: In vitro dissolution studies examined the impact of alcohol and pH on Chronocort®. A Phase 1, three-period, crossover study in 18 volunteers assessed the impact of food on Chronocort® and compared its bioavailability to immediate-release hydrocortisone. RESULTS: In vitro dissolution of Chronocort® was not affected by gastrointestinal pH up to 6.0 nor by an alcohol content up to 20% v/v. Food delayed and reduced the rate of absorption of Chronocort®, as reflected by a longer Tmax (fed vs fasted: 6.75 h vs 4.5 h, p = 0.005) and lower Cmax (549.49 vs 708.46 nmol/L; ratio 77%, CI 71-85%). Cortisol exposure was similar in the fed and fasted states: geometric LS mean ratio (CI) of AUC0-t for fed/fasted was 108.33% (102.30-114.72%). Cortisol exposure was higher for Chronocort® than for immediate-release hydrocortisone: geometric LS mean ratio (CI) 118.83% (111.58-126.54%); however, for derived free cortisol the exposure CIs were within 80.0-125.0%: geometric LS mean ratio (CI) for AUC0-t 112.73% (105.33-120.65%). CONCLUSIONS: Gastric pH ≤ 6.0 and alcohol do not affect hydrocortisone release from Chronocort®. Food delays Chronocort® absorption, but cortisol exposure is similar in the fasted and fed states, and exposure as assessed by free cortisol is similar between Chronocort® and immediate-release hydrocortisone.
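    The 80.0-125.0% interval cited above is the standard bioequivalence acceptance range for the confidence interval of a geometric mean ratio. As a rough illustration only (the study itself used a geometric LS mean from an ANOVA-type model, not this simplified paired analysis), the sketch below computes a geometric mean ratio and its 90% CI from hypothetical paired AUC0-t values; all data and function names are invented.

```python
import numpy as np
from scipy import stats

def geo_mean_ratio_ci(test_auc, ref_auc, alpha=0.10):
    """Geometric mean ratio (test/reference) with a (1 - alpha) CI,
    from paired exposure data analysed on the log scale."""
    d = np.log(np.asarray(test_auc)) - np.log(np.asarray(ref_auc))
    n = d.size
    se = d.std(ddof=1) / np.sqrt(n)          # SE of the mean log-ratio
    t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)
    lo, hi = np.exp(d.mean() - t_crit * se), np.exp(d.mean() + t_crit * se)
    return np.exp(d.mean()), (lo, hi)

# Hypothetical paired AUC0-t values (nmol*h/L), for illustration only
test = [2800, 3100, 2950, 3300, 2700, 3050]
ref  = [2600, 2900, 2700, 3000, 2500, 2850]
ratio, (lo, hi) = geo_mean_ratio_ci(test, ref)
print(f"GMR = {ratio:.1%}, 90% CI = ({lo:.1%}, {hi:.1%})")
print("Within 80.0-125.0% bounds:", lo >= 0.80 and hi <= 1.25)
```

    Bioequivalence is declared only when the entire CI, not just the point estimate, falls inside the 80.0-125.0% bounds, which is why the total-cortisol comparison above (upper CI 126.54%) narrowly misses while the free-cortisol comparison passes.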

    Measure Transformer Semantics for Bayesian Machine Learning

    The Bayesian approach to machine learning amounts to computing posterior distributions of random variables from a probabilistic model of how the variables are related (that is, a prior distribution) and a set of observations of variables. There is a trend in machine learning towards expressing Bayesian models as probabilistic programs. As a foundation for this kind of programming, we propose a core functional calculus with primitives for sampling prior distributions and observing variables. We define measure-transformer combinators inspired by theorems in measure theory, and use these to give a rigorous semantics to our core calculus. The original features of our semantics include its support for discrete, continuous, and hybrid measures, and, in particular, for observations of zero-probability events. We compile our core language to a small imperative language that is processed by an existing inference engine for factor graphs, which are data structures that enable many efficient inference algorithms. This allows efficient approximate inference of posterior marginal distributions, treating thousands of observations per second for large instances of realistic models. (An abridged version of this paper appears in the proceedings of the 20th European Symposium on Programming (ESOP'11), part of ETAPS 2011.)
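    The observe primitive is the heart of such a semantics: observing data transforms the prior measure by reweighting it, and normalisation then yields the posterior. The paper's measure transformers handle discrete, continuous, and hybrid measures, including observations of zero-probability events; the toy Python sketch below illustrates only the discrete case, with invented names, and is not the paper's actual calculus.

```python
# Toy discrete analogue of measure-transformer semantics: a "measure"
# is a list of (value, weight) pairs; observing data rescales weights
# (conditioning), and normalising yields the posterior.

def sample(prior):
    """Lift a prior distribution into the weighted-outcome representation."""
    return [(v, w) for v, w in prior]

def observe(measure, likelihood):
    """Transform a measure by weighting each outcome with the
    likelihood of the observed data under that outcome."""
    return [(v, w * likelihood(v)) for v, w in measure]

def normalise(measure):
    total = sum(w for _, w in measure)
    return [(v, w / total) for v, w in measure]

# Uniform prior over a coin's bias, then observe a single 'heads'.
prior = [(0.2, 1/3), (0.5, 1/3), (0.8, 1/3)]
posterior = normalise(observe(sample(prior), lambda bias: bias))
print(posterior)  # mass shifts toward the higher bias values
```

    In the discrete case each combinator is a simple weight manipulation; the paper's contribution is extending this picture rigorously to continuous and hybrid measures, where a point observation has probability zero and reweighting must be defined via density functions.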

    The Physicochemical and Rheological Characterisation of Drug-Polymer Systems Prepared by Hot-Melt Extrusion

    No full text
    EThOS - Electronic Theses Online Service, United Kingdom
