
    Global overview of terrorist financing typologies to mitigate financial risks in developing countries

    In this study, we reviewed the laws and regulations that mandate banks and financial services organizations to implement anti-money laundering programs responsible for detecting and mitigating the risks of money laundering and terrorist financing. We examined money laundering and terrorist financing in greater depth to understand the risk factors associated with each financial crime; understanding the aspects of each crime is necessary to comprehend predicate offense typologies. We continued with a review and synthesis of the literature on money laundering and terrorist financing typologies, and concluded with an analysis of Gary Becker's economic theory of criminal behavior and the neoclassical approach to criminal behavior. As the key concepts reviewed in this literature suggest, predicate offenses evolve as the prevailing conditions of society change. A major recent global challenge is the COVID-19 pandemic, which has increased financial risks worldwide (Klimczak et al., 2021). Understanding the different types of predicate offenses and typologies provides a holistic picture of how criminals launder money or finance terrorist acts. A review of the existing literature demonstrated intensive research on financial crime, but revealed a gap in the current legislative and financial risk management framework: detecting economic uncertainties and risk factors requires a reevaluation of financial risk measurement methodologies to mitigate the consequences of money laundering and terrorist financing activities. A best practice for providing a sound framework to manage financial risks is for compliance managers at U.S. banks and financial services companies to identify predicate offense typologies, and American society could benefit from the results of the study (Klimczak et al., 2021). The banking and financial industries ought to prepare for the future and continue to adapt to newly emerging threats, shifting consumer classifications, and a changing environment. It is essential for compliance leaders to implement public education initiatives and help their customers recognize their role in combating money laundering and terrorist financing. Overall, the study contributes to positive social change by identifying predicate offense typologies that can help compliance managers at U.S. banking and financial services companies reduce the risks of money laundering and terrorist financing activities (Klimczak et al., 2021).

    Estimation of valvular resistance of segmented aortic valves using computational fluid dynamics

    Aortic valve stenosis is associated with an elevated left ventricular pressure and transaortic pressure drop. Clinicians routinely use Doppler ultrasound to quantify aortic valve stenosis severity by estimating this pressure drop from blood velocity. However, this method approximates the peak pressure drop and is unable to quantify the partial pressure recovery distal to the valve. As pressure drops are flow dependent, it remains difficult to assess the true significance of a stenosis in low-flow, low-gradient patients. Recent advances in segmentation techniques enable patient-specific Computational Fluid Dynamics (CFD) simulations of flow through the aortic valve. In this work, a simulation framework is presented and used to analyze data from 18 patients. The ventricle and valve are reconstructed from 4D Computed Tomography imaging data. Ventricular motion is extracted from the medical images and used to model ventricular contraction and the corresponding blood flow through the valve. Simplifications of the framework are assessed by introducing two simplified CFD models: a truncated time-dependent model and a steady-state model. Model simplifications are justified for cases where the simulated pressure drop is above 10 mmHg. Furthermore, we propose a valve resistance index to quantify stenosis severity from simulation results. This index is compared to established metrics for clinical decision making, i.e., blood velocity and valve area. It is found that velocity measurements alone do not adequately reflect stenosis severity. This work demonstrates that combining 4D imaging data and CFD has the potential to provide a physiologically relevant diagnostic metric to quantify aortic valve stenosis severity.
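    The Doppler practice referenced above typically estimates the peak drop with the simplified Bernoulli relation (Δp ≈ 4v², with v in m/s and Δp in mmHg), while resistance-style indices normalise the mean pressure drop by the flow rate. Below is a minimal Python sketch of both quantities; the constants follow standard clinical conventions, but the paper's own valve resistance index is not defined here, so the second function is an assumption rather than the authors' formula.

        # Illustrative only: simplified Bernoulli estimate vs. a generic
        # resistance-style index. The index proposed in the paper may differ.

        def bernoulli_pressure_drop(peak_velocity_ms: float) -> float:
            """Simplified Bernoulli relation used in Doppler echo:
            dP [mmHg] ~ 4 * v^2, with v in m/s."""
            return 4.0 * peak_velocity_ms ** 2

        def valve_resistance(mean_gradient_mmhg: float, mean_flow_ml_s: float) -> float:
            """Mean pressure drop per unit flow, converted to dyn.s.cm^-5
            (1 mmHg = 1333 dyn/cm^2)."""
            return 1333.0 * mean_gradient_mmhg / mean_flow_ml_s

        # Example: a 4 m/s jet suggests a ~64 mmHg peak drop, before any
        # pressure recovery distal to the valve is accounted for.
        print(bernoulli_pressure_drop(4.0))   # 64.0 mmHg
        print(valve_resistance(40.0, 250.0))  # ~213 dyn.s/cm^5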

    Support for Taverna workflows in the VPH-Share cloud platform

    Background and objective: To address the increasing need for collaborative endeavours within the Virtual Physiological Human (VPH) community, the VPH-Share collaborative cloud platform allows researchers to expose and share sequences of complex biomedical processing tasks in the form of computational workflows. The Taverna Workflow System is a very popular tool for orchestrating complex biomedical and bioinformatics processing tasks in the VPH community. This paper describes the VPH-Share components that support the building and execution of Taverna workflows, and explains how they interact with other VPH-Share components to improve the capabilities of the VPH-Share platform. Methods: Taverna workflow support is delivered by the Atmosphere cloud management platform and the VPH-Share Taverna plugin. These components are explained in detail, along with the two main procedures that were developed to enable this seamless integration: workflow composition and execution. Results: 1) Seamless integration of VPH-Share with other components and systems. 2) An extended range of different tools for workflows. 3) Successful integration of scientific workflows from other VPH projects. 4) Execution speed improvements for medical applications. Conclusion: The presented workflow integration provides VPH-Share users with a wide range of possibilities to compose and execute workflows, such as desktop or online composition, online batch execution, multithreading, and remote execution. The specific advantages of each supported tool are presented, as are the roles of Atmosphere and the VPH-Share plugin within the VPH-Share project. The combination of the VPH-Share plugin and Atmosphere endows the VPH-Share infrastructure with far more flexible, powerful and usable capabilities for the VPH-Share community. As both components can continue to evolve and improve independently, further improvements remain to be developed and described.
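    As an illustration of the online batch execution idea described above, the sketch below uploads a Taverna workflow definition to a cloud endpoint and polls until it finishes. The base URL, routes, and payload fields are hypothetical placeholders for this sketch, not the documented VPH-Share or Atmosphere API.

        # Hypothetical sketch of batch-executing a workflow via a cloud REST API.
        # The base URL, routes and payload fields are illustrative placeholders,
        # not the actual VPH-Share/Atmosphere interface.
        import json
        import time
        import requests

        BASE = "https://vph-share.example.org/api"  # placeholder endpoint

        def run_workflow(t2flow_path: str, inputs: dict) -> dict:
            # Submit the workflow definition together with its input values.
            with open(t2flow_path, "rb") as f:
                resp = requests.post(f"{BASE}/workflows",
                                     files={"definition": f},
                                     data={"inputs": json.dumps(inputs)})
            resp.raise_for_status()
            job_id = resp.json()["id"]
            # Poll until the remote execution finishes, then return its status
            # record, which would include links to the produced outputs.
            while True:
                status = requests.get(f"{BASE}/workflows/{job_id}").json()
                if status["state"] in ("FINISHED", "FAILED"):
                    return status
                time.sleep(10)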

    A multiscale orchestrated computational framework to reveal emergent phenomena in neuroblastoma

    Neuroblastoma is a complex and aggressive type of cancer that affects children. Current treatments involve a combination of surgery, chemotherapy, radiotherapy, and stem cell transplantation. However, treatment outcomes vary due to the heterogeneous nature of the disease. Computational models have been used to analyse data, simulate biological processes, and predict disease progression and treatment outcomes. While continuum cancer models capture the overall behaviour of tumours, and agent-based models represent the complex behaviour of individual cells, multiscale models represent interactions at different organisational levels, providing a more comprehensive understanding of the system. In 2018, the PRIMAGE consortium was formed to build a cloud-based decision support system for neuroblastoma, including a multiscale model for patient-specific simulations of disease progression. In this work, we developed this multiscale model, which incorporates data such as the patient's tumour geometry, cellularity, vascularization, genetics, and type of chemotherapy treatment, and integrated it into an online platform that runs the simulations on a high-performance computing cluster using Onedata and Kubernetes technologies. This infrastructure will allow clinicians to optimise treatment regimens and reduce the number of costly and time-consuming clinical trials. This manuscript outlines the framework's model architecture, data workflow, hypotheses, and the resources employed in its development.
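    To illustrate how a patient-specific simulation could be dispatched to such a Kubernetes cluster, the sketch below creates a batch job with the official Kubernetes Python client. The container image, namespace, and command are invented placeholders; the platform's actual Onedata/Kubernetes integration is certainly more involved than this.

        # Hypothetical sketch: dispatching one simulation as a Kubernetes Job.
        # Image, namespace and command are illustrative, not the PRIMAGE deployment.
        from kubernetes import client, config

        def submit_simulation(patient_id: str) -> None:
            config.load_kube_config()  # use load_incluster_config() inside the cluster
            container = client.V1Container(
                name="nb-sim",
                image="registry.example.org/neuroblastoma-sim:latest",  # placeholder
                command=["simulate", "--patient", patient_id],
            )
            job = client.V1Job(
                metadata=client.V1ObjectMeta(name=f"nb-sim-{patient_id}"),
                spec=client.V1JobSpec(
                    backoff_limit=0,
                    template=client.V1PodTemplateSpec(
                        spec=client.V1PodSpec(restart_policy="Never",
                                              containers=[container])
                    ),
                ),
            )
            # Hand the job to the cluster; Kubernetes schedules and runs it.
            client.BatchV1Api().create_namespaced_job(namespace="simulations", body=job)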

    PRIMAGE project : predictive in silico multiscale analytics to support childhood cancer personalised evaluation empowered by imaging biomarkers

    PRIMAGE is one of the largest and most ambitious research projects dealing with medical imaging, artificial intelligence and cancer treatment in children. It is a 4-year European Commission-financed project with 16 European partners in the consortium, including the European Society for Paediatric Oncology, two imaging biobanks, and three prominent European paediatric oncology units. The project is constructed as an observational in silico study involving high-quality anonymised datasets (imaging, clinical, molecular, and genetic) for the training and validation of machine learning and multiscale algorithms. The open cloud-based platform will offer precise clinical assistance for phenotyping (diagnosis), treatment allocation (prediction), and patient endpoints (prognosis), based on the use of imaging biomarkers, tumour growth simulation, advanced visualisation of confidence scores, and machine-learning approaches. The decision support prototype will be constructed and validated on two paediatric cancers: neuroblastoma and diffuse intrinsic pontine glioma. External validation will be performed on data recruited from independent collaborative centres. Final results will be available to the scientific community at the end of the project, ready for translation to other malignant solid tumours.

    Grid Resource Registry: unified access to computational resources

    The growing number of resources available to researchers in the e-Science domain has opened new possibilities for constructing complex scientific applications, while at the same time introducing new requirements for tools which assist developers in creating such applications. This paper discusses the problems of rapid application development, the use of distributed resources, and a uniform approach to resource registration, discovery and access. It presents the Grid Resource Registry, which delivers an abstraction layer for computational resources. The Registry is a central place where developers may search for available services and from which the execution engine receives the technical specifications of services. The Registry is used throughout the lifetime of the e-Science application, from application design, through implementation, to execution.

    The growing number of resources available to researchers has, on the one hand, opened new possibilities for constructing complex scientific applications and, on the other, brought additional requirements for the tools supporting the development and execution of such applications. The article presents the challenges of the rapid development of scientific applications which use distributed resources, together with the related difficulties of registering, discovering and invoking the resources an application uses. These considerations are illustrated with the Grid Resource Registry, a central registry which delivers an abstract description of distributed resources, thanks to which the process of developing and running scientific applications can be significantly simplified.
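    A toy sketch of the registry idea described above: developers register service descriptions, and an execution engine later looks up a technical specification by name. The data model and method names are invented for illustration and do not mirror the actual Grid Resource Registry interface.

        # Toy illustration of a central resource registry; types and names are
        # invented here and do not reflect the real Grid Resource Registry API.
        from dataclasses import dataclass, field

        @dataclass
        class ServiceSpec:
            name: str       # logical service name developers search for
            endpoint: str   # where the execution engine invokes the service
            interface: str  # e.g. a WSDL/REST description of its operations

        @dataclass
        class ResourceRegistry:
            _services: dict[str, ServiceSpec] = field(default_factory=dict)

            def register(self, spec: ServiceSpec) -> None:
                self._services[spec.name] = spec

            def find(self, name: str) -> ServiceSpec:
                """Used by the execution engine to fetch a specification."""
                return self._services[name]

        registry = ResourceRegistry()
        registry.register(ServiceSpec("blast", "https://example.org/blast", "rest"))
        print(registry.find("blast").endpoint)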

    Effect of particularisation size on the accuracy and efficiency of a multiscale tumour growth model

    Get PDF
    In silico medicine models are frequently used to represent a phenomenon across multiple space-time scales. Most of these multiscale models require impractical execution times to be solved, even using high-performance computing systems, because typically each representative volume element in the upper-scale model is coupled to an instance of the lower-scale model; this causes a combinatorial explosion of the computational cost, which increases exponentially as the number of modelled scales increases. To attenuate this problem, it is common practice to interpose between the two models a particularisation operator, which maps the upper-scale model results onto a smaller number of lower-scale model instances, and a homogenisation operator, which maps the fewer lower-scale results back onto the whole space-time homogenisation domain of the upper-scale model. The aim of this study is to explore the simplest particularisation/homogenisation scheme that can couple a model intended to predict the growth of a whole solid tumour (neuroblastoma) to a tissue-scale model of cell-tissue biology with an acceptable approximation error and a viable computational cost. Using an idealised initial dataset with spatial gradients representative of those of real neuroblastomas, but small enough to be solved without any particularisation, we determined the approximation error and the computational cost of a very simple particularisation strategy based on binning. We found that even such a simple algorithm can significantly reduce the computational cost with negligible approximation errors.
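    The binning idea can be sketched in a few lines: upper-scale elements with similar state are grouped into bins, the lower-scale model is run once per occupied bin, and the result is mapped back to every member of the bin. In the Python sketch below, the state variable, bin count, and stand-in lower-scale model are placeholders, not the study's actual implementation.

        # Minimal sketch of binning-based particularisation/homogenisation.
        # The state variable, bin edges and lower-scale model are placeholders.
        import numpy as np

        def lower_scale_model(state: float) -> float:
            """Stand-in for an expensive tissue-scale simulation."""
            return state ** 0.5  # placeholder dynamics

        def coupled_step(upper_state: np.ndarray, n_bins: int = 16) -> np.ndarray:
            # Particularisation: group elements with similar state into bins,
            # then solve the lower-scale model once per occupied bin.
            edges = np.linspace(upper_state.min(), upper_state.max(), n_bins + 1)
            bin_idx = np.clip(np.digitize(upper_state, edges) - 1, 0, n_bins - 1)
            out = np.empty_like(upper_state)
            for b in np.unique(bin_idx):
                representative = upper_state[bin_idx == b].mean()
                # Homogenisation: every member of the bin gets the same result.
                out[bin_idx == b] = lower_scale_model(representative)
            return out

        # Instead of one lower-scale run per element, at most n_bins runs occur.
        state = np.random.default_rng(0).uniform(0.0, 1.0, size=10_000)
        print(coupled_step(state).shape)  # (10000,)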