
    A risk-based framework to manage single-use systems over lifecycle: Design, cleaning, operation, ongoing process verification

    Using a formal risk management approach, single-use systems (SUSs) and classical stainless steel-based systems (CSSs) are examined in detail. The risk assessment comprises identification, analysis, mitigation and re-evaluation/verification of risks over the lifecycle, with appropriate risk tools used at each step. Failure modes specific to each type of system are considered. The scientific background described in BPOG's guidelines on extractables (2014) and leachables (Jan. 2017) for SUSs is built in, together with other authoritative sources for both types of systems. Risk quantitation, profiling and holistic risk visualization for both technology platforms allow companies with stainless-steel-only fermentation capacity, as well as those with a mix of both technologies, to consider how best to evolve their capacity when potential impacts on product quality and patient safety are taken into account. Besides benchmarking the two technology platforms against each other, our approach makes comparisons of a specific platform over its lifecycle possible, enabling companies to retain a complete picture of all their risk-based decisions and detailed knowledge of their assets' performance. The proposed knowledge management component is unique and supports several business processes, as well as decisions a company needs to justify before authorities. Our talk provides an insightful discussion of SUSs using state-of-the-art risk-based and knowledge management tools, illustrated by examples on design, cleaning, operation and ongoing process verification, focusing on risks to quality and safety.
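    The risk quantitation and profiling step can be sketched with a minimal FMEA-style example in which each failure mode gets a Risk Priority Number (RPN = severity × occurrence × detectability). The failure modes and 1-10 scores below are hypothetical illustrations, not values from the framework itself:

```python
# Minimal FMEA-style risk quantitation sketch.
# Failure modes and 1-10 scores are hypothetical examples, not values
# from the framework described in the abstract.

def rpn(severity, occurrence, detectability):
    """Risk Priority Number = S x O x D."""
    return severity * occurrence * detectability

# Hypothetical failure modes for the two technology platforms.
failure_modes = [
    ("SUS: leachable migration into product", 8, 4, 5),
    ("SUS: bag film pinhole / integrity loss", 9, 3, 6),
    ("CSS: cleaning residue carry-over", 7, 3, 4),
    ("CSS: gasket degradation", 6, 4, 5),
]

# Rank failure modes by RPN to build a cross-platform risk profile.
profile = sorted(
    ((name, rpn(s, o, d)) for name, s, o, d in failure_modes),
    key=lambda item: item[1], reverse=True,
)
for name, score in profile:
    print(f"{score:4d}  {name}")
```

    Re-scoring the same list after a mitigation is applied supports the re-evaluation/verification step, since before/after profiles can be compared over the lifecycle.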

    Finite-Fault Rupture Detector (FinDer): Going Real-Time in Californian ShakeAlert Warning System

    Rapid detection of local and regional earthquakes and the issuance of fast alerts for impending shaking are considered beneficial for saving lives, reducing losses, and shortening recovery times after destructive events (Allen et al., 2009). Over the last two decades, several countries have built operational earthquake early warning (EEW) systems, including Japan (Hoshiba et al., 2008), Mexico (Espinosa-Aranda et al., 1995), Romania (Mărmureanu et al., 2011), Turkey (Erdik et al., 2003), Taiwan (Hsiao et al., 2011), and China (Peng et al., 2011). Other countries, such as the United States (Böse, Allen, et al., 2013), Italy (Satriano et al., 2011), and Switzerland (Behr et al., 2015), are currently developing systems or evaluating algorithms in their seismic real-time networks.

    Comparability and similarity protocols for biotechnology products

    Over the lifecycle, from development through all industrialization stages and then across the manufacturing history at multiple sites, a biotechnology product must show very high levels of quality consistency, as that is a prerequisite for safety and efficacy for patients (cf. ICH Q5E, 2004). Recently, FDA defined different levels of similarity (Draft Guidance, May 2014, "Clinical Pharmacology Data to Support a Demonstration of Biosimilarity to a Reference Product"), namely the concepts of (1) "highly similar with fingerprint-like similarity" and (2) "residual uncertainty" with regard to similarity. Here we present a new approach for combining whole analytical domains from different techniques used in comparability protocols (to support change management for the same product and process) that is also applicable to assessing similarity in biocomparability investigations. Our approach ensures much higher levels of confidence in comparability protocols by not overlooking differences that might otherwise go unnoticed due to incomplete prior knowledge of the methods used and the attributes considered. It can (a) detect very small differences, (b) thereby establish the exact level of comparability/similarity, and (c) provide a sound statistical foundation for assessing residual uncertainties. We illustrate the approach on reference products over their lifecycle and use it to compare reference products with putatively similar products. The outcomes of assessments (a) through (c) can then be linked to the pharmacological performance of both types of biotechnology products, and support regulatory and other decisions related to managing filings.
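    The idea of combining whole analytical domains can be illustrated with a generic sketch: features from several techniques are autoscaled against the reference-batch distribution and concatenated, so a small, localized difference in any one domain is not diluted. The synthetic data, the two-domain split and the max-|z| metric below are illustrative assumptions; the abstract does not disclose the actual statistical method:

```python
# Generic sketch of combining analytical domains for comparability.
# Data and the comparison metric are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical reference batches: 20 batches, two analytical domains
# (e.g., a 50-channel UV domain and an 80-channel MS domain).
uv_ref = rng.normal(1.0, 0.05, size=(20, 50))
ms_ref = rng.normal(5.0, 0.20, size=(20, 80))
ref = np.hstack([uv_ref, ms_ref])      # combined domain: 130 features

mu = ref.mean(axis=0)
sd = ref.std(axis=0, ddof=1)

def max_abs_z(batch):
    """Largest standardized deviation of a test batch across all features."""
    return float(np.max(np.abs((batch - mu) / sd)))

# A batch well inside the reference distribution ...
comparable = mu + 0.5 * sd
# ... and one with a single small, localized difference in the MS domain.
shifted = mu.copy()
shifted[60] += 6.0 * sd[60]

print(max_abs_z(comparable))   # ~0.5: comparable
print(max_abs_z(shifted))      # ~6.0: the localized difference is flagged
```

    Because the metric is evaluated feature by feature over the concatenated domains, a difference confined to one technique still surfaces, which is the failure mode of incomplete prior knowledge that the abstract warns about.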

    Integrating analysis with process control for continuous bioprocessing: Extending the lifecycle concept to process analytical technologies

    The most notable trends in the use of PAT tools in bioprocessing today are (1) the combined application of multiple tools to one product and process, and (2) the recognition that information fusion provides a better foundation for process estimation and product quality knowledge than the classical use of a single analytical method. The lifecycle management aspects of validation launched by FDA in its 2011 Process Validation Guidance have arrived, through ICH Q11 (2012) and the forthcoming ICH Q12 (2017), at all aspects of development, qualification and commercial manufacturing of large-molecule drug substances. For continuous bioprocessing the challenges are significantly different and more complex than for batch processing of small molecules. The homeostatic state targeted for most continuous bioprocesses puts specific requirements on each individual PAT tool chosen and on their combined use over the lifecycle. Here, using case studies, we discuss specific challenges at each of the three lifecycle stages: how and when the shift is made from understanding (acquiring data and information) to enhanced bioprocess control (knowledge-based decisions) to realizing product quality by design (predictive product quality). Finally, we show how platform knowledge can be managed across multiple products for the benefit of a company's own portfolio, through aggregation and high-level visualization of multiple PAT projects.
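    The information-fusion point can be made concrete with a standard inverse-variance weighting sketch: two independent estimates of the same state variable are combined into one estimate whose variance is lower than either input's. The instruments and numbers are hypothetical, and real implementations combine far richer models:

```python
# Sketch of PAT information fusion: two independent estimates of the
# same state variable (here a hypothetical titer, g/L) are combined by
# inverse-variance weighting. Instruments and numbers are illustrative.

def fuse(x1, var1, x2, var2):
    """Inverse-variance weighted fusion of two independent estimates."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * x1 + w2 * x2) / (w1 + w2)
    return fused, 1.0 / (w1 + w2)

raman = (2.10, 0.04)   # noisier in-line estimate (value, variance)
hplc = (2.00, 0.01)    # more precise at-line estimate (value, variance)

x, var = fuse(*raman, *hplc)
print(f"fused titer = {x:.3f} g/L, variance = {var:.4f}")
# fused titer = 2.020 g/L, variance = 0.0080
```

    The fused variance (0.008) is below both input variances, which is the quantitative sense in which fusing multiple PAT signals beats any single analytical method.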

    Pneumonia organizativa – Experiência da consulta de um hospital central (Organising pneumonia: experience of the outpatient clinic of a central hospital)

    Abstract
    Aim: to characterise outpatients of a Portuguese central hospital diagnosed with organising pneumonia (OP) and compare the results with the current literature.
    Methods: medical records with a diagnosis of OP were retrospectively studied with regard to demographics, aetiology, clinical and radiological features, average time until diagnosis and date of diagnosis, laboratory and histological changes, treatment and relapse.
    Results: thirteen patients with a mean follow-up of 171.6 weeks (max 334, min 28 weeks) were evaluated. Nine of these patients (70%) had cryptogenic OP (COP), while 30% had secondary OP (SOP): two with rheumatoid arthritis, one with dermatomyositis and another undergoing radiotherapy for breast cancer. Mean age was 55.6 (±15.3) years; 92% were female and 77% were non-smokers. Average time until diagnosis was 77.2 weeks (min 3, max 432 weeks). Symptoms at presentation were tiredness (92%), cough (85%), fever (65%), shortness of breath (54%), thoracic pain (23%) and weight loss (23%). At the time of diagnosis the mean erythrocyte sedimentation rate was 70 mm (max 170 mm, min 16 mm). The C-reactive protein level was increased in eight patients. Significant leucocytosis was absent. Chest X-ray and chest CT scan showed bilateral distribution in 12 patients (92%). Consolidation with an air bronchogram was present in 12 patients, and in four (31%) the consolidation was migratory. Four patients (30%) underwent transbronchial pulmonary biopsy, all uncharacteristic, and eight patients underwent surgical pulmonary biopsy, four of which showed histological confirmation of OP. Corticosteroids were started in 11 patients, with an average treatment duration of 61.6 weeks (16-288 weeks). Two of 13 patients (15%) had spontaneous resolution. Four patients (31%) relapsed, one of them five times. Two patients are dependent on a low dose of corticosteroids, one due to underlying disease and another due to multiple relapses. Therapy of relapse was corticosteroids alone, at the minimum effective dose, or in association with azathioprine or ciclosporin.
    Discussion and conclusion: such a high proportion of females (92%) may be explained by the limited patient sample. In 70% of the patients the diagnosis was established by clinical and radiological criteria. The mean time to diagnosis was very variable, which suggests that in some cases the disease was not diagnosed and was treated as another interstitial lung disease or as recurrent pneumonia. Most patients (53.8%) had a favourable clinical course after treatment with corticosteroids, with a relapse rate (30.8%) much lower than that described by other authors (60%). Only in experienced centres should the diagnosis of OP be established by clinical and radiological criteria alone.
    Rev Port Pneumol 2010; XVI (3): 369-38

    Keck Interferometer nuller update

    The Keck Interferometer combines the two 10 m Keck telescopes as a long-baseline interferometer; it is funded by NASA and operated as a joint development among the Jet Propulsion Laboratory, the W. M. Keck Observatory, and the Michelson Science Center. Since 2004, it has offered an H- and K-band fringe-visibility mode through the Keck TAC process. This mode has recently been upgraded with the addition of a grism for higher spectral resolution. The 10 μm nulling mode, for which the first nulling data were collected in 2005, completed the bulk of its engineering development in 2007. At the end of 2007, three teams were chosen in response to a nuller key science call to perform a survey of nearby stars for exozodiacal dust; this key science observation program began in February 2008. Under NSF funding, Keck Observatory is leading the development of ASTRA, a project to add dual-star capability for high-sensitivity observations and dual-star astrometry. We review recent activity at the Keck Interferometer, with an emphasis on the nuller development.

    FinDer v.2: Improved real-time ground-motion predictions for M2–M9 with seismic finite-source characterization

    Recent studies suggest that small and large earthquakes nucleate similarly, and that they often have indistinguishable seismic waveform onsets. The characterization of earthquakes in real time, such as for earthquake early warning, therefore requires a flexible modeling approach that allows a small earthquake to become large as the fault rupture evolves over time. Here, we present a modeling approach that generates a set of output parameters and uncertainty estimates consistent with both small-to-moderate (≤ M 6.5) and large earthquakes (> M 6.5), as required for a robust parameter interpretation and shaking forecast. Our approach treats earthquakes over the entire range of magnitudes (> M 2) as finite line-source ruptures, with the dimensions of small earthquakes being very small (< 100 m) and those of large earthquakes exceeding several tens to hundreds of kilometres in length. The extent of the assumed line source is estimated from the level and distribution of high-frequency peak acceleration amplitudes observed in a local seismic network. High-frequency motions are well suited for this approach because they are mainly controlled by the distance to the rupturing fault. Observed ground-motion patterns are compared with theoretical templates modeled from empirical ground-motion prediction equations to determine the best line source and its uncertainties. Our algorithm extends earlier work by Böse et al. for large finite-fault ruptures. This paper gives a detailed summary of the new algorithm and its offline performance for the 2016 M7.0 Kumamoto, Japan, and 2014 M6.0 South Napa, California, earthquakes, as well as its performance for about 100 local earthquakes (2.2 ≤ M ≤ 5.1) detected in real time in California. For most events, both the rupture length and the strike are well constrained within a few seconds (< 10 s) of the event origin. In large earthquakes, this could allow warnings of up to several tens of seconds. The algorithm could also be useful for resolving fault-plane ambiguities of focal mechanisms and for identifying rupturing faults for earthquakes as small as M2.5.
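    The template-matching idea can be illustrated with a toy sketch: candidate line sources are scored against high-frequency log-amplitudes observed at stations, and the best-fitting rupture length and strike are the ones minimizing the misfit. The attenuation relation, station layout and noise-free synthetic data below are illustrative stand-ins for the empirical GMPEs and real seismic network used by the actual algorithm:

```python
# Toy illustration of line-source template matching for ground motion.
# All relations and data are illustrative stand-ins, not FinDer itself.
import math

def dist_to_segment(px, py, x1, y1, x2, y2):
    """Shortest distance (km) from a station to the rupture trace."""
    dx, dy = x2 - x1, y2 - y1
    L2 = dx * dx + dy * dy
    t = 0.0 if L2 == 0 else max(0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / L2))
    return math.hypot(px - (x1 + t * dx), py - (y1 + t * dy))

def predicted_log_amp(r_km):
    """Toy attenuation law: log amplitude decays with rupture distance."""
    return 2.0 - 1.5 * math.log10(r_km + 1.0)

def segment(length_km, strike_deg):
    """Line source of given length/strike centred on the epicentre (0, 0)."""
    s = math.radians(strike_deg)
    hx, hy = 0.5 * length_km * math.sin(s), 0.5 * length_km * math.cos(s)
    return -hx, -hy, hx, hy

def misfit(stations, length_km, strike_deg):
    """Sum of squared residuals between observed and template log-amps."""
    seg = segment(length_km, strike_deg)
    return sum((obs - predicted_log_amp(dist_to_segment(px, py, *seg))) ** 2
               for px, py, obs in stations)

# Noise-free synthetic observations from a 40 km rupture striking 30 deg.
sites = [(-60, 40), (50, 60), (70, -30), (-40, -70), (10, 90), (-90, -10)]
true_seg = segment(40.0, 30.0)
stations = [(px, py, predicted_log_amp(dist_to_segment(px, py, *true_seg)))
            for px, py in sites]

# Grid search over candidate templates; the minimum recovers the rupture.
best = min(((L, S) for L in (5.0, 10.0, 20.0, 40.0, 80.0)
            for S in range(0, 180, 10)),
           key=lambda cand: misfit(stations, *cand))
print(best)  # (40.0, 30)
```

    Because a line segment is unchanged when its strike is rotated by 180°, the search only needs strikes in [0°, 180°); the real algorithm additionally reports uncertainties on length and strike rather than a single best template.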

    A CF3I-based SDD Prototype for Spin-independent Dark Matter Searches

    The application of superheated droplet detectors (SDDs) to dark matter searches has so far been confined to the light-nuclei refrigerants C2ClF5 and C4F10 (SIMPLE and PICASSO, respectively), with a principal sensitivity to spin-dependent interactions. Given the competitive results of these devices, which stem from their intrinsic insensitivity to backgrounds, we have developed a prototype trifluoroiodomethane (CF3I)-loaded SDD with increased sensitivity to spin-independent interactions as well. A low-exposure (0.102 kg·d) test operation of two high-concentration, 1 liter devices is described, and the results are compared with leading experiments in both the spin-dependent and spin-independent sectors. Although competitive in both sectors once the difference in exposures is accounted for, a problem with fracturing of the detector gel must be addressed before significantly larger exposures can be envisioned.
    Comment: revised and updated; accepted Astrop. Phy

    Final Analysis and Results of the Phase II SIMPLE Dark Matter Search

    We report the final results of the Phase II SIMPLE measurements, comprising two run stages of 15 superheated droplet detectors each, with the second stage including improved neutron shielding. The analysis includes a refined signal analysis and a revised nucleation efficiency based on reanalysis of previously reported monochromatic neutron irradiations. The combined results yield a contour minimum of σ_p = 4.2 × 10^-3 pb at 35 GeV/c^2 in the spin-dependent sector of WIMP-proton interactions, the most restrictive to date from a direct search experiment and overlapping for the first time results previously obtained only indirectly. In the spin-independent sector, a minimum of 3.6 × 10^-6 pb at 35 GeV/c^2 is achieved, with the exclusion contour challenging the recent CoGeNT region of interest.
    Comment: revised, PRL-accepted version with slightly weakened limit contour