
    MITK-ModelFit: A generic open-source framework for model fits and their exploration in medical imaging -- design, implementation and application on the example of DCE-MRI

    Many medical imaging techniques utilize fitting approaches for quantitative parameter estimation and analysis. Common examples are pharmacokinetic modeling in DCE MRI/CT, ADC calculation and IVIM modeling in diffusion-weighted MRI, and Z-spectra analysis in chemical exchange saturation transfer MRI. Most available software tools are limited to a special purpose and do not allow for custom developments and extensions. Furthermore, they are mostly designed as stand-alone solutions using external frameworks and thus cannot easily be incorporated natively into an analysis workflow. We present a framework for medical image fitting tasks that is included in MITK, following a rigorous open-source, well-integrated and operating-system-independent policy. In terms of software engineering, the local models, the fitting infrastructure and the results representation are abstracted and can thus be easily adapted to any model fitting task on image data, independent of image modality or model. Several ready-to-use libraries for model fitting and use cases, including fit evaluation and visualization, were implemented. Their embedding into MITK allows for easy data loading and pre- and post-processing, and thus a natural inclusion of model fitting into an overarching workflow. As an example, we present a comprehensive set of plug-ins for the analysis of DCE MRI data, which we validated on existing and novel digital phantoms, yielding competitive deviations between fit and ground truth. Providing a very flexible environment, our software mainly addresses developers of medical imaging software that includes model fitting algorithms and tools. Additionally, the framework is of high interest to users in the domain of perfusion MRI, as it offers feature-rich, freely available, validated tools to perform pharmacokinetic analysis on DCE MRI data, with both interactive and automated batch processing workflows. Comment: 31 pages, 11 figures. URL: http://mitk.org/wiki/MITK-ModelFi
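    The ADC calculation mentioned in the abstract can be illustrated with a short sketch: a mono-exponential diffusion signal S(b) = S0 * exp(-b * ADC) is fitted by log-linear least squares. This is a generic illustration of the fitting task, not MITK-ModelFit's actual API; the b-values and signal amplitudes below are invented for the example.

```python
import numpy as np

def fit_adc(b_values, signals):
    """Estimate ADC via log-linear least squares: S(b) = S0 * exp(-b * ADC)."""
    slope, intercept = np.polyfit(b_values, np.log(signals), 1)
    return -slope, np.exp(intercept)  # ADC, S0

# Synthetic noise-free mono-exponential signal with known ground truth
b = np.array([0.0, 200.0, 500.0, 800.0, 1000.0])  # b-values in s/mm^2
true_adc, true_s0 = 1.0e-3, 1500.0                # mm^2/s, arbitrary units
s = true_s0 * np.exp(-b * true_adc)

adc, s0 = fit_adc(b, s)
print(adc, s0)  # recovers the ground-truth parameters
```

    On noisy clinical data the same fit would yield the "deviations between fit and ground truth" the abstract evaluates; the noise-free case recovers the parameters exactly.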

    Using analytical CRM system to reduce churn in the telecom sector: A machine learning approach

    Applied project submitted to the Department of Computer Science and Information Systems, Ashesi University, in partial fulfillment of the Bachelor of Science degree in Computer Science, April 2019. Customers are considered to be the most valuable assets of any business, and thus their loyalty is key to profitability as they indulge in repeat purchases and attract their colleagues through word of mouth. In competitive markets such as telecommunications, customers have a lot of flexibility due to the variety of service providers available and the introduction of mobile number portability (MNP); thus they can easily switch services and service providers. Customer churn is, therefore, a major problem among telecommunication companies, hence their quest to reduce the customer churn rate and retain existing customers. Customer relationship management systems have been used over the years to track patterns within customer data, but this could be improved notably with current technological advances. We have moved past the age of innovations around steam engines, electricity, computers, mobile and the internet to the current technology trends in artificial intelligence and big data. We are at the cusp of a new wave where enterprises have embraced the application of machine learning in streamlining different business processes. Telecom companies have the advantage of mining large customer datasets that can be leveraged for predictive analysis using data science. This project explores the use of an analytical CRM system in reducing customer churn in the telecom industry, using machine learning algorithms to predict customer behavior in order to retain customers. Its goal is to analyze all relevant customer data and develop focused customer retention programs, on the premise that if you could predict in advance which customers are at risk of leaving, you could develop focused retention programs to reduce churn.
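    The kind of churn prediction described can be sketched as a minimal logistic-regression model fitted by gradient descent on synthetic data. The features (tenure, monthly charge) and the rule generating the churn labels are invented for illustration; a real analytical CRM would train on actual customer records, likely with a library such as scikit-learn.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical customer features: tenure in months and monthly charge
n = 1000
tenure = rng.uniform(0, 72, n)
charge = rng.uniform(20, 120, n)
# Invented ground-truth rule: short tenure + high charges -> likelier churn
logits = 2.0 - 0.08 * tenure + 0.03 * charge
churn = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

# Standardize features and prepend a bias column
X = np.column_stack([np.ones(n),
                     (tenure - tenure.mean()) / tenure.std(),
                     (charge - charge.mean()) / charge.std()])

# Logistic regression via batch gradient descent
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - churn) / n

pred = (1 / (1 + np.exp(-X @ w)) > 0.5).astype(float)
accuracy = (pred == churn).mean()
print(round(accuracy, 2))
```

    The learned weights recover the direction of the invented rule (longer tenure lowers churn risk, higher charges raise it), which is the signal a retention program would act on.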

    EksPy: a new Python framework for developing graphical user interface based PyQt5

    This study introduces the EksPy Python framework, a novel framework designed for developing graphical user interface (GUI) applications in Python. The EksPy framework is built on PyQt5, a collection of Python bindings for the Qt libraries, and provides a user-friendly and intuitive interface. A comparative analysis of EksPy against existing frameworks such as Tkinter and PyQt highlights its notable features, including ease of use, rapid development, enhanced performance, effective database management, and the model-view-controller (MVC) concept. The experimental results illustrate that the EksPy framework requires less code and enhances code readability, thereby facilitating better understanding and efficient development. Additionally, the framework offers a modern and customizable appearance, surpassing Tkinter's capabilities. Furthermore, it incorporates a built-in object-relational mapping (ORM) feature to simplify database interactions and adheres to the MVC architectural pattern. In conclusion, the EksPy Python framework emerges as a powerful, user-friendly, and efficient framework for GUI application development in Python.
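    The MVC separation the abstract credits to EksPy can be sketched toolkit-free in plain Python. The class names below are illustrative only and are not EksPy's or PyQt5's actual API; in a GUI framework the view would draw widgets rather than record strings.

```python
class CounterModel:
    """Model: holds application state and notifies observers on change."""
    def __init__(self):
        self.value = 0
        self._observers = []

    def subscribe(self, callback):
        self._observers.append(callback)

    def increment(self):
        self.value += 1
        for cb in self._observers:
            cb(self.value)

class ConsoleView:
    """View: renders state; a GUI toolkit would update widgets here."""
    def __init__(self):
        self.rendered = []

    def render(self, value):
        self.rendered.append(f"count = {value}")

class CounterController:
    """Controller: translates user input into model updates."""
    def __init__(self, model):
        self.model = model

    def on_button_click(self):
        self.model.increment()

model = CounterModel()
view = ConsoleView()
model.subscribe(view.render)
controller = CounterController(model)

controller.on_button_click()
controller.on_button_click()
print(view.rendered[-1])  # count = 2
```

    The payoff of this separation is that the model can be unit-tested and the view swapped (console, Tkinter, PyQt5) without touching the controller logic.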

    Internship Portfolio

    My one-year internship program was with Mayo Clinic, Rochester. I was involved in software development as part of a work term, all of which will be outlined in this report. The report will cover some background information on the projects I was involved in, as well as details on how the projects were developed. The report also states how academic courses and projects have helped my overall internship experience so far. At the beginning of the internship, I formulated several learning goals, which I wanted to achieve: to understand the functioning and working conditions of the organization; to explore working in a professional environment; to explore the work environment for the possibility of a future career; to utilize my gained skills and knowledge; to find skills and knowledge I still need to work in a professional environment; to learn about the software development life cycle; to learn about development methodologies; to obtain fieldwork experience and collect data in an environment unknown to me; to obtain experience working in a multicultural and diverse environment; to enhance my interpersonal and technical skills; and to network with professionals in the industry. There are five major projects that I had a significant role in. The first project, Space Tools, involved gaining a good understanding of a JavaScript framework called Angular. My tasks were to study how the framework works, to learn to work with Git, and to develop wireframes from the viewpoint of building an application with that technology. As this was my first team project with Mayo Clinic, particularly at Development Shared Services (DSS), I also had ample scope to understand Agile methodology, the Scrum process in particular. The second project was BAMS, a rewrite of an existing application in Windows Presentation Foundation (WPF) with a .NET backend.
In this project my tasks were to understand using WinForms and WPF and to develop pages using the WPF MVVM framework. The third project was DSA, where I acquired knowledge of working with Angular 4 and frontend unit testing in Karma using the Mocha and Chai frameworks. The fourth project is MML Notification and Delivery, which started with an analysis phase in which we were asked to analyze the data flow and system integrations that the current Mayo Access and Mayo Link (MML Internal Operations) depend upon. We are to provide Mayo Access users with new functionality for notification and delivery of test results. The current project that I'm working on is “MML Database Analysis”. This project is in the analysis phase; we were given the task of analyzing MML databases in order to write an API to replace frontend calls to the database. I acquired many new technical skills throughout my work: front-end development using various versions of the Angular framework, and unit testing using the Mocha and Chai frameworks in Karma. I also brushed up on my HTML/HTML5, CSS/CSS3, JavaScript, Java and C# skills while working on various projects. I was also introduced to the area of research and analysis and how to approach it. Most importantly, the work included good fellowship, cooperative teamwork and accepting responsibilities. Although I spent much time on the learning curve, I found that I was well trained in certain areas that helped me substantially in my projects. Many of the programming skills and much of the software development life cycle understanding that I used in my internship, such as programming style and design, were acquired during my studies in Computer Science. This report also discusses the advantages of the Angular framework over other JavaScript frameworks. The report concludes with my overall impressions of my work experience as well as my opinion of the Industrial Internship Program in general.

    THE RANGE AND ROLE OF THEORY IN INFORMATION SYSTEMS DESIGN RESEARCH: FROM CONCEPTS TO CONSTRUCTION

    This paper reports results from a field study of cross-disciplinary design researchers in information systems, software engineering, human-computer interaction, and computer-supported cooperative work. The purpose of the study was to explore how these different disciplines conceptualize and conduct design-as-research. The focus in this paper is on how theories are used in a design research project to motivate and inform the particulars of designed artifacts and design methods. Our objective was to better understand how elements of a theory are translated into design action, and how theoretical propositions are translated and then realized in designed artifacts. The results reveal a broad diversity in the processes through which theories are translated into working artifacts. The paper contributes to our understanding of design research in information systems by providing empirical support for existing constructs and frameworks, identifying some new approaches to translating theoretical concepts into research designs, and suggesting ways in which action and artifact-oriented research can more effectively contribute to a cumulative and progressive science of design.

    Development of a decision support system through modelling of critical infrastructure interdependencies : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Emergency Management at Massey University, Wellington, New Zealand

    Critical Infrastructure (CI) networks provide functional services to support the wellbeing of a community. Although it is possible to obtain detailed information about individual CI and their components, the interdependencies between different CI networks are often implicit, hidden or not well understood by experts. In the event of a hazard, failures of one or more CI networks and their components can disrupt the functionality and consequently affect the supply of services. Understanding the extent of disruption and quantification of the resulting consequences is important to assist various stakeholders' decision-making processes to complete their tasks successfully. A comprehensive review of the literature shows that a Decision Support System (DSS) integrated with appropriate modelling and simulation techniques is a useful tool for CI network providers and relevant emergency management personnel to understand the network recovery process of a region following a hazard event. However, the majority of existing DSSs focus on risk assessment or stakeholders' involvement without addressing the overall CI interdependency modelling process. Furthermore, these DSSs are primarily developed for data visualization or CI representation but not specifically to help decision-makers by providing them with a variety of customizable decision options that are practically viable. To address these limitations, a Knowledge-centred Decision Support System (KCDSS) has been developed in this study with the following aims: 1) To develop a computer-based DSS using efficient CI network recovery modelling algorithms, 2) To create a knowledge-base of various recovery options relevant to specific CI damage scenarios so that the decision-makers can test and verify several ‘what-if’ scenarios using a variety of control variables, and 3) To bridge the gap between hazard and socio-economic modelling tools through a multidisciplinary and integrated natural hazard impact assessment. 
Driven by the design science research strategy, this study proposes an integrated impact assessment framework using an iterative design process as its first research outcome. This framework has been developed as a conceptual artefact using a topology network-based approach by adopting the shortest path tree method. The second research outcome, a computer-based KCDSS, provides a convenient and efficient platform for enhanced decision making through a knowledge-base consisting of real-life recovery strategies. These strategies have been identified from the respective decision-makers of the CI network providers through the Critical Decision Method (CDM), a Cognitive Task Analysis (CTA) method for requirement elicitation. The capabilities of the KCDSS are demonstrated through electricity, potable water, and road networks in the Wellington region of Aotearoa New Zealand. The network performance has been analysed independently and with interdependencies to generate outage of services spatially and temporally. The outcomes of this study provide a range of theoretical and practical contributions. Firstly, the topology network-based analysis of CI interdependencies will allow a group of users to build different models, make and test assumptions, and try out different damage scenarios for CI network components. Secondly, the step-by-step process of knowledge elicitation, knowledge representation and knowledge modelling of CI network recovery tasks will provide a guideline for improved interactions between researchers and decision-makers in this field. Thirdly, the KCDSS can be used to test the variations in outage and restoration time estimates of CI networks due to the potential uncertainty related to the damage modelling of CI network components. The outcomes of this study also have significant practical implications by utilizing the KCDSS as an interface to integrate and add additional capabilities to the hazard and socio-economic modelling tools. 
Finally, the variety of ‘what-if’ scenarios embedded in the KCDSS would allow the CI network providers to identify vulnerabilities in their networks and to examine various post-disaster recovery options for CI reinstatement projects.
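    The shortest path tree method the thesis adopts for its topology network-based approach can be sketched with Dijkstra's algorithm on a toy network. The node names and edge weights below are hypothetical stand-ins for CI components and restoration costs; none of the Wellington network data is reproduced here.

```python
import heapq

def shortest_path_tree(graph, source):
    """Dijkstra's algorithm: distance and predecessor for every reachable node."""
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    return dist, prev

# Illustrative CI topology: edges weighted by (hypothetical) restoration effort
grid = {
    "substation": [("pump_A", 2.0), ("pump_B", 5.0)],
    "pump_A": [("reservoir", 3.0)],
    "pump_B": [("reservoir", 1.0)],
}

dist, prev = shortest_path_tree(grid, "substation")
print(dist["reservoir"], prev["reservoir"])  # 5.0 pump_A

# 'What-if' damage scenario: pump_A fails, so its edges are removed
damaged = {k: [(v, w) for v, w in e if v != "pump_A"]
           for k, e in grid.items() if k != "pump_A"}
dist2, _ = shortest_path_tree(damaged, "substation")
print(dist2["reservoir"])  # 6.0, rerouted via pump_B
```

    Removing a node and recomputing the tree is the essence of the damage scenarios described above: the change in reachability and path cost quantifies the service outage and the rerouting options.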

    Real-Time GPS-Alternative Navigation Using Commodity Hardware

    Modern navigation systems can use the Global Positioning System (GPS) to determine position with precision in some cases bordering on millimeters. Unfortunately, GPS technology is susceptible to jamming, interception, and unavailability indoors or underground. There are several navigation techniques that can be used during periods of GPS unavailability, but very few achieve GPS-level precision. One method of achieving high-precision navigation without GPS is to fuse data obtained from multiple sensors. This thesis explores the fusion of imaging and inertial sensors and implements them in a real-time system that mimics human navigation. In addition, programmable graphics processing unit technology is leveraged to perform stream-based image processing using a computer's video card. The resulting system can perform complex mathematical computations in a fraction of the time those same operations would take on a CPU-based platform. The result is an adaptable, portable, inexpensive and self-contained software and hardware platform, which paves the way for advances in autonomous navigation, mobile cartography, and artificial intelligence.
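    The imaging/inertial fusion idea can be sketched as a one-dimensional Kalman-style filter: inertial dead reckoning predicts position but drifts, while an imaging-derived position fix corrects it. All constants below (bias, noise variances, step sizes) are invented for illustration; this is the fusion principle, not the thesis's GPU-accelerated pipeline.

```python
def kalman_fuse(x, p, inertial_step, q, z, r):
    """One predict/update cycle for a scalar position state.

    x, p: position estimate and its variance
    inertial_step: displacement integrated from the inertial sensor
    q: process noise variance; z, r: imaging fix and its variance
    """
    # Predict: integrate the inertial measurement (drifts without correction)
    x_pred = x + inertial_step
    p_pred = p + q
    # Update: blend in the imaging position fix, weighted by uncertainties
    k = p_pred / (p_pred + r)          # Kalman gain
    x_new = x_pred + k * (z - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new

x, p = 0.0, 1.0
true_pos = 0.0
for _ in range(50):
    true_pos += 1.0                    # platform moves 1 m per step
    inertial_step = 1.0 + 0.05        # inertial integration with a 5% bias
    image_fix = true_pos               # imaging sensor: unbiased position fix
    x, p = kalman_fuse(x, p, inertial_step, q=0.1, z=image_fix, r=0.5)

# Uncorrected dead reckoning would have drifted 50 * 0.05 = 2.5 m;
# the fused estimate stays within a fraction of a meter of truth.
print(round(abs(x - true_pos), 2))
```

    A full implementation would use a multi-dimensional state (position, velocity, attitude) and derive the imaging fix from feature tracking, but the predict/update structure is the same.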
