
    ITL Monitor: Compositional Runtime Analysis with Interval Temporal Logic

    Runtime verification has gained significant interest in recent years. It is a process in which the execution trace of a program is analysed while the program is running. A popular language for specifying temporal requirements in runtime verification is Linear Temporal Logic (LTL), which is excellent for expressing properties such as safety and liveness. Another formalism in use is Interval Temporal Logic (ITL). This logic has constructs for specifying the behaviour of programs that can be decomposed into subintervals of activity. Traditionally, only a restricted subset of ITL has been used for runtime verification because of the limitations imposed by making the subset executable. In this thesis an alternative restriction of ITL is considered as the basis for constructing a library of runtime verification monitors (ITL-Monitor). The thesis introduces a new first-occurrence operator (|>) into ITL and explores its properties. This operator is the basis of the translation from runtime monitors to their corresponding ITL formulae. ITL-Monitor is then introduced formally, and the algebraic properties of its operators are analysed. An implementation of ITL-Monitor is given, based upon the construction of a domain-specific language in Scala. The architecture of the underlying system comprises a network of concurrent actors built on top of Akka, an industrial-strength distributed actor framework. A number of example systems are constructed to evaluate ITL-Monitor's performance against alternative verification tools. ITL-Monitor is also subjected to a simulation that generates a very large quantity of state data. The monitors were observed to deliver consistent performance across execution traces of up to a million states, and to verify subintervals of up to 300 states against ITL formulae with evaluation complexity of O(n^3).
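    The thesis's DSL is not reproduced here, but the flavour of compositional monitoring it describes can be sketched in a few lines of plain Scala. Everything below (Monitor, Verdict, first, andThen) is a hypothetical illustration of the general idea, not the ITL-Monitor API, and the Akka actor layer is omitted; the first combinator loosely mirrors the role of the first-occurrence operator (|>) in fixing the shortest prefix on which a formula holds.

        // Verdicts a monitor can report after consuming a state.
        sealed trait Verdict
        case object More extends Verdict // need more states to decide
        case object Pass extends Verdict // interval satisfied the property
        case object Fail extends Verdict // interval violated the property

        // A monitor consumes one state at a time and reports the verdict so far.
        trait Monitor[S] { self =>
          def step(s: S): (Verdict, Monitor[S])

          // Sequential composition: once this monitor passes on some prefix,
          // hand the remaining states to `next`.
          def andThen(next: Monitor[S]): Monitor[S] = new Monitor[S] {
            def step(s: S): (Verdict, Monitor[S]) = self.step(s) match {
              case (Pass, _) => (More, next) // subinterval closed; continue
              case (Fail, _) => (Fail, this)
              case (More, m) => (More, m.andThen(next))
            }
          }
        }

        object Monitor {
          // Pass at the first state satisfying p (first-occurrence flavour).
          def first[S](p: S => Boolean): Monitor[S] = new Monitor[S] {
            def step(s: S): (Verdict, Monitor[S]) =
              if (p(s)) (Pass, this) else (More, this)
          }
        }

        object Demo {
          def main(args: Array[String]): Unit = {
            val m = Monitor.first[Int](_ > 3).andThen(Monitor.first[Int](_ == 0))
            val trace = List(1, 5, 2, 0, 7)
            val verdict = trace.foldLeft((More: Verdict, m)) {
              case ((More, mon), s) => mon.step(s) // undecided: feed next state
              case (done, _)        => done        // verdict fixed; ignore rest
            }._1
            println(verdict) // Pass: 5 closes the first subinterval, 0 the second
          }
        }

    The real ITL-Monitor runs monitors concurrently as Akka actors and supports a richer algebra of combinators; this sketch only shows why a first-occurrence operator makes sequential composition of subinterval checks natural.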

    Batch and continuous blending of particulate material studied by near-infrared spectroscopy

    Background: Pharmaceutical manufacturing is moving towards real-time release of products. This objective can only be achieved by clearly understanding the process and by implementing suitable technologies for manufacturing and for process control. Near-infrared (NIR) spectroscopy is one technology that has attracted a lot of attention from the pharmaceutical industry, since it can analyze bulk solids without any pretreatment, thereby reducing or eliminating wet-chemistry analysis. NIR spectroscopy is a powerful tool for monitoring unit operations where bulk material is involved, e.g. blending of powders. Blending of powders is a complex and poorly understood unit operation. In the pharmaceutical industry, blending has traditionally been performed batchwise and controlled by thief sampling. Thief sampling is an invasive process which is tedious and tends to introduce bias; an alternative sampling method was therefore highly needed. Here NIR found a perfect match with blend uniformity monitoring: implementing NIR avoids thief sampling, monitors the process continuously, detects the blend end-point, and identifies process deviations quickly. NIR spectral data need to be correlated with the parameter of interest (physical or chemical). These computations are done by multivariate data analysis (MVDA). MVDA and NIR are a powerful combination for in-process control, and their use has been promoted by the health authorities through the FDA's Process Analytical Technology (PAT) initiative.

    Purpose: This thesis is focused on the study of powder blending, an essential unit operation in the manufacture of solid dosage forms. The aim was to develop two quantitative methods for monitoring the active-ingredient concentration: one for blend uniformity monitoring of a batch mixing process, and a second for a continuous mixing process. The study also tackles the relevance of the physical presentation of the powder to the final blend quality, by studying the influence of particle size and the effect of the preceding manufacturing steps on the NIR spectral data.

    Methods: Particle size was studied by NIR in diffuse reflectance mode, using the Kubelka-Munk function and the transformation of reflectance to absorbance values, in order to focus the analysis on the physical properties. Furthermore, an off-line NIR model was developed for quantifying the mean particle size. Segregation tendencies due to particle-size incompatibilities were studied. Blend uniformity monitoring of a batch pharmaceutical mixing process was achieved through an off-line NIR calibration method, which was used for in-line drug quantification in a production-scale mixing process. NIR in diffuse reflectance mode was used in the study of a continuous blending system. The effect of the process parameters, i.e. flow rate and stirring rate, was analyzed, and a NIR method for in-line drug quantification was developed. NIR was implemented in a powder stream, and the mass of powder measured by NIR was estimated.

    Results and discussion: Regarding particle size, incompatibilities due to different particle-size ranges between the formulation ingredients led to severe segregation. Particle size and cohesion determined the quality of the powder blend; slight cohesion and a broader particle-size distribution improved the robustness of the final blend. NIR showed high sensitivity to particle-size variations, so it was possible to develop a quantitative model for determining the mean particle size with a prediction error of 16 micrometers. Concerning batch mixing, an off-line calibration was generated for the quantification of the two active ingredients in the formulation; the prediction errors ranged from 0.4 to 2.3% m/m for the two drugs respectively. Special emphasis was placed on proper wavelength selection so that the analysis focused on quantification of the active ingredients. In relation to continuous blending of particulate material, a quantitative NIR model was developed for in-line prediction of the active-ingredient concentration and tested under different process conditions of feed rate and stirring rate. High stirring rates produced greater scatter in the NIR predictions; this was directly associated with the acceleration of the particles at the outlet of the blender, which affects the dwell time of the particles in front of the NIR probe. The NIR model proved robust to moderate feed-rate increases; however, it under-predicted the drug concentration under moderate feed-rate reductions of 30 kg/h. Furthermore, the continuous blending phases were clearly identified by principal component analysis, moving block of standard deviation, and relative standard deviation, all of which gave consistent results. NIR measurement in a powder stream involved scanning powder flowing down a chute. The flow of bulk solids is a complex phenomenon in which powder moves at a certain velocity, and the motion of particles produces changes in the density and distribution of voids. In this study, the velocity of the powder sliding down an inclined chute was measured and used to estimate the mass measured by NIR; the mass observed during one NIR measurement was estimated to be less than one tablet.

    Conclusions: This study proved the feasibility of applying NIR spectroscopy for blend uniformity monitoring of batch and continuous powder mixing. Understanding the critical parameters of powder mixing leads to a robust process and reliable analytical methods. NIR proved to be a valuable and versatile analytical tool for the measurement of bulk solids.
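    For context, the Kubelka-Munk function mentioned above is the standard transform relating diffuse reflectance to the sample's absorption and scattering behaviour. With R_inf the reflectance of an effectively infinitely thick sample, K the absorption coefficient and S the scattering coefficient,

        F(R_\infty) = \frac{(1 - R_\infty)^2}{2 R_\infty} = \frac{K}{S}.

    Because S depends strongly on particle size, working in Kubelka-Munk units (rather than in log(1/R) absorbance) emphasizes the physical, scattering-related contribution that a particle-size model exploits. Likewise, the moving block of standard deviation used for end-point detection is conceptually simple; a minimal sketch in Scala follows (names and block size are illustrative, not taken from the thesis):

        // Moving block of standard deviation (MBSD): over each window of
        // consecutive spectra, take the standard deviation at every wavelength
        // and average across wavelengths. A steadily low MBSD indicates that
        // successive spectra have stopped changing, i.e. the blend is homogeneous.
        def mbsd(spectra: Vector[Vector[Double]], block: Int = 5): Vector[Double] =
          spectra.sliding(block).map { win =>
            val nWl = win.head.length
            val sdPerWavelength = (0 until nWl).map { j =>
              val col  = win.map(_(j))
              val mean = col.sum / col.length
              math.sqrt(col.map(v => (v - mean) * (v - mean)).sum / (col.length - 1))
            }
            sdPerWavelength.sum / nWl
          }.toVector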

    Hybridizable compatible finite element discretizations for numerical weather prediction: implementation and analysis

    There is a current explosion of interest in new numerical methods for atmospheric modeling. A driving force behind this is the need to simulate large-scale geophysical flows with high efficiency on increasingly parallel computer systems. Many current operational models, including that of the UK Met Office, depend on orthogonal meshes, such as the latitude-longitude grid. This facilitates the development of finite difference discretizations with favorable numerical properties. However, such methods suffer from the "pole problem," which prevents the model from making efficient use of a large number of processors due to the excessive concentration of grid points at the poles. Recently developed finite element discretizations, known as "compatible" finite elements, avoid this issue while maintaining the key numerical properties essential for accurate geophysical simulations. Moreover, these properties can be obtained on arbitrary, non-orthogonal meshes. However, the efficient solution of the resulting discrete systems depends on transforming the mixed velocity-pressure (or velocity-pressure-buoyancy) system into an elliptic problem for the pressure. This is not straightforward within the compatible finite element framework because of inter-element coupling. This thesis supports the proposition that systems arising from compatible finite element discretizations can be solved efficiently using a technique known as "hybridization." Hybridization removes inter-element coupling while maintaining the desired numerical properties. This permits the construction, using localized algebra, of sparse elliptic problems for which fast solver algorithms are known. We first introduce the technique for compatible finite element discretizations of simplified atmospheric models. We then develop a general software abstraction for the rapid implementation and composition of hybridization methods, with an emphasis on preconditioning. Finally, we extend the technique to a new compatible method for the full, compressible atmospheric equations used in operational models.
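    Schematically (this is the standard hybridization of a mixed system, not the thesis's specific formulation): after breaking inter-element continuity of the velocity space and enforcing it weakly with a Lagrange multiplier \lambda living only on mesh facets, the discrete system takes the block form

        \begin{pmatrix} A & B^{T} & C^{T} \\ B & 0 & 0 \\ C & 0 & 0 \end{pmatrix}
        \begin{pmatrix} u \\ p \\ \lambda \end{pmatrix}
        =
        \begin{pmatrix} f \\ g \\ 0 \end{pmatrix},

    where A and B now couple degrees of freedom only within each element. Because the (u, p) block is block-diagonal over elements, u and p can be eliminated element-by-element (static condensation), leaving a single sparse, elliptic system S\lambda = r for the facet unknowns, with S assembled from purely element-local inverses. Fast solvers attack this condensed system, and u and p are recovered locally afterwards.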

    The impact of AI on radiographic image reporting – perspectives of the UK reporting radiographer population

    Background: It is predicted that medical imaging services will be greatly impacted by AI in the future. Developments in computer vision have allowed AI to be used for assisted reporting. Studies have investigated radiologists' opinions of AI for image interpretation (Huisman et al., 2019a/b), but there remains a paucity of information on reporting radiographers' opinions on this topic.

    Method: A survey was developed by AI expert radiographers and promoted via LinkedIn/Twitter and professional networks to radiographers from all specialities in the UK. A sub-analysis was performed for reporting radiographers only.

    Results: 411 responses to the full survey were gathered (Rainey et al., 2021), with 86 responses from reporting radiographers included in the data analysis. 10.5% of respondents were using AI tools as part of their reporting role. 59.3% and 57% would not be confident in explaining an AI decision to other healthcare practitioners and to patients and carers, respectively. 57% felt that an affirmation from AI would increase confidence in their diagnosis. Only 3.5% would not seek a second opinion following disagreement from AI. A moderate level of trust in AI was reported (mean score = 5.28; 0 = no trust, 10 = absolute trust). 'Overall performance/accuracy of the system', 'visual explanation (heatmap/ROI)', and 'indication of the confidence of the system in its diagnosis' were suggested as measures to increase trust.

    Conclusion: AI may impact reporting professionals' confidence in their diagnoses. Respondents are not confident in explaining an AI decision to key stakeholders. UK radiographers do not yet fully trust AI, and improvements are suggested.

    An evaluation of a training tool and study day in chest image interpretation

    Background: Using expert consensus, the research team developed a digital tool which proved useful when teaching radiographers how to interpret chest images. The training tool included (a) a search-strategy training tool and (b) an educational tool that communicates the search strategies using eye-tracking technology. This training tool has the potential to improve interpretation skills for other healthcare professionals.

    Methods: To investigate this, 31 healthcare professionals, i.e. nurses and physiotherapists, were recruited, and participants were randomised either to receive access to the training tool (intervention group) or not (control group) for a period of 4-6 weeks. Participants were asked to interpret different sets of 20 chest images before and after the intervention period. A study day was then provided to all participants, after which participants were again asked to interpret a different set of 20 chest images (n=1860 interpretations in total). Each participant was asked to complete a questionnaire on their perceptions of the training provided.

    Results: Data analysis is in progress. 50% of participants did not have experience in image interpretation prior to the study. The study day and training tool were useful in improving image interpretation skills. Participants' perceptions of the usefulness of the tool for aiding image interpretation varied among respondents.

    Conclusion: This training tool has the potential to improve patient diagnosis and reduce healthcare costs.

    Medical Informatics

    Information technology has been revolutionizing the everyday life of the common man, while medical science has been making rapid strides in understanding disease mechanisms, developing diagnostic techniques, and effecting successful treatment regimens, even for cases which would have been classified as having a poor prognosis a decade earlier. The confluence of information technology and biomedicine has brought into its ambit the additional dimension of computerized databases of patient conditions, revolutionizing the way health care and patient information is recorded, processed, interpreted, and utilized for improving the quality of life. This book consists of seven chapters dealing with three primary issues: medical information acquisition from the patient's and health care professional's perspectives, translational approaches from a researcher's point of view, and finally the application potential as required by clinicians and physicians. The book covers modern issues in information technology, bioinformatics methods, and clinical applications. The chapters describe the basic process of acquiring information in a health system, recent technological developments in biomedicine, and the realistic evaluation of medical informatics.

    Deep learning applications in the prostate cancer diagnostic pathway

    Prostate cancer (PCa) is the second most frequently diagnosed cancer in men worldwide and the fifth leading cause of cancer death in men, with an estimated 1.4 million new cases in 2020 and 375,000 deaths. The risk factors most strongly associated with PCa are advancing age, family history, race, and mutations of the BRCA genes. Since these risk factors are not preventable, early and accurate diagnosis is a key objective of the PCa diagnostic pathway. In the UK, clinical guidelines recommend multiparametric magnetic resonance imaging (mpMRI) of the prostate for use by radiologists to detect, score, and stage lesions that may correspond to clinically significant PCa (CSPCa), prior to confirmatory biopsy and histopathological grading. Computer-aided diagnosis (CAD) of PCa using artificial intelligence algorithms holds a currently unrealized potential to improve upon the diagnostic accuracy achievable by radiologist assessment of mpMRI, improve reporting consistency between radiologists, and reduce reporting time. In this thesis, we build and evaluate deep learning-based CAD systems for the PCa diagnostic pathway which address gaps identified in the literature. First, we introduce a novel patient-level classification framework, PCF, which uses a stacked ensemble of convolutional neural networks (CNNs) and support vector machines (SVMs) to assign each patient a probability of having CSPCa, using mpMRI and clinical features. Second, we introduce AutoProstate, a deep learning-powered framework for automated PCa assessment and reporting; AutoProstate utilizes biparametric MRI and clinical data to populate an automatic diagnostic report containing segmentations of the whole prostate, prostatic zones, and candidate CSPCa lesions, as well as several derived characteristics that are clinically valuable. Finally, as automatic segmentation algorithms have not yet reached the robustness desired for clinical use, we introduce interactive click-based segmentation applications for the whole prostate and prostatic lesions, with potential uses in diagnosis, active-surveillance progression monitoring, and treatment planning.
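    The patient-level stacking idea can be illustrated with a small, self-contained sketch: base models emit probabilities, and a simple logistic meta-learner combines them into one patient-level probability. All names below are hypothetical stand-ins; PCF's actual CNN and SVM base models, features, and training procedure are as described in the thesis, not as here.

        // Stacking sketch: base models' probabilities become the features of a
        // logistic-regression meta-learner trained by batch gradient descent.
        object StackingSketch {
          // A base model maps a patient's features to a probability (stand-in
          // for a trained CNN or SVM emitting P(clinically significant PCa)).
          type BaseModel = Vector[Double] => Double

          private def sigmoid(z: Double): Double = 1.0 / (1.0 + math.exp(-z))

          final case class Meta(w: Vector[Double], b: Double) {
            def predict(baseProbs: Vector[Double]): Double =
              sigmoid(w.zip(baseProbs).map { case (wi, pi) => wi * pi }.sum + b)
          }

          // Fit the meta-learner on base-model outputs (xs) and labels (ys).
          def trainMeta(xs: Vector[Vector[Double]], ys: Vector[Int],
                        lr: Double = 0.5, epochs: Int = 2000): Meta = {
            var w = Vector.fill(xs.head.length)(0.0)
            var b = 0.0
            val n = xs.length.toDouble
            for (_ <- 1 to epochs) {
              // Per-example prediction error drives the gradient of the
              // cross-entropy loss with respect to w and b.
              val errs = xs.zip(ys).map { case (x, y) =>
                sigmoid(w.zip(x).map { case (wi, xi) => wi * xi }.sum + b) - y
              }
              val gradW = xs.zip(errs).map { case (x, e) => x.map(_ * e) }
                            .transpose.map(_.sum / n)
              w = w.zip(gradW).map { case (wi, g) => wi - lr * g }
              b -= lr * errs.sum / n
            }
            Meta(w, b)
          }

          def main(args: Array[String]): Unit = {
            // Toy data: one feature per "patient"; two weak base models.
            val features = Vector(Vector(0.1), Vector(0.4), Vector(0.6), Vector(0.9))
            val labels   = Vector(0, 0, 1, 1)
            val base: Vector[BaseModel] =
              Vector(x => x.head, x => 1.0 - math.abs(x.head - 0.7))
            val stacked = features.map(x => base.map(m => m(x)))
            val meta    = trainMeta(stacked, labels)
            println(meta.predict(base.map(m => m(Vector(0.8))))) // well above 0.5
          }
        }

    In practice the meta-learner is trained on out-of-fold base-model predictions to avoid leaking training labels through the base models.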

    Technology Assessment for Sustainability in Water Use: Operationalization of a Responsible Governance Based on Responsible Research and Innovation (Anticipation and Inclusiveness)

    The management of sustainability in water resources has underscored the critical importance of determining appropriate decision-making processes and establishing effective governance structures. Gaining comprehensive insight into the decision-making mechanisms and actors involved is pivotal for tackling present as well as prospective water-related issues efficiently. This research evaluates the interplay among water scarcity, responsible technologies for water use, and systems of governance for sustainability amid swift technological progress. Furthermore, it examines the congruity of these endeavors with the Sustainable Development Goals (SDGs), with other water sustainability frameworks, and with the social and political ecosystem. In this context, the active engagement and participation of societal actors, and not only stakeholders, assumes a pivotal role, as it significantly impacts decision-making processes and molds the results of sustainability initiatives. An innovative approach to the concepts of responsibility and sustainability is predicated on the quality of the relationships within the network of societal actors as its key point. This work underscores the importance of establishing strong and comprehensive relationships to address the challenges of water management and to promote the adoption of sustainable approaches, in co-creation not only of knowledge but of the epistemic subject in the process. The work sheds light on the interrelated domains of water management, sustainability, and regulation. A novel proposal is presented via a simulation exercise that uses a socio-technical framework to foster responsible water use. The comprehension and use of responsible technology and innovation in the realm of water use management are enhanced by operationalizing open anticipatory governance and executing a simulated experiment. By using a digital deliberation space and establishing a systematic approach to technology assessment and sustainability, with the relational quality of the network of actors as the key element for the co-production of knowledge, science, and technology, the present study has produced and materialized an innovative framework.