
    Synergistic Visualization And Quantitative Analysis Of Volumetric Medical Images

    The medical diagnosis process starts with an interview with the patient and continues with a physical exam. In practice, the medical professional may require additional screenings to reach a precise diagnosis. Medical imaging is one of the most frequently used non-invasive screening methods for acquiring insight into the human body. Medical imaging is not only essential for accurate diagnosis; it can also enable early prevention. Medical data visualization refers to projecting medical data into a human-understandable format on mediums such as 2D or head-mounted displays, without performing any interpretation that might lead to clinical intervention. In contrast to medical visualization, quantification refers to extracting the information in the medical scan to enable clinicians to make fast and accurate decisions. Despite the extraordinary progress in both medical visualization and quantitative radiology, efforts to improve these two complementary fields are often pursued independently, and their synergistic combination is under-studied. Existing image-based software platforms mostly fail to reach routine clinical use due to the lack of a unified strategy that guides clinicians both visually and quantitatively. Hence, there is an urgent need for a bridge connecting medical visualization and automatic quantification algorithms within the same software platform. In this thesis, we aim to fill this research gap by visualizing medical images interactively from anywhere and performing fast, accurate, and fully automatic quantification of the medical imaging data. To this end, we propose several innovative and novel methods. Specifically, we solve the following sub-problems of the ultimate goal: (1) direct web-based out-of-core volume rendering, (2) robust, accurate, and efficient learning-based algorithms to segment highly pathological medical data, (3) automatic landmarking for aiding diagnosis and surgical planning, and (4) novel artificial intelligence algorithms to determine the sufficient and necessary data for large-scale problems.

    Web Framework Points: an Effort Estimation Methodology for Web Application Development

    Software effort estimation is one of the most critical components of a successful software project: completing the project on time and within budget is the classic challenge for all project managers. However, predictions made by project managers about their projects are often inexact: software projects need, on average, 30-40% more effort than estimated. Research on software development effort and cost estimation has been abundant and diversified since the end of the Seventies, and the topic is still very much alive, as shown by the numerous works in the literature. During these three years of research activity, I had the opportunity to study in depth and to experiment with some of the main software effort estimation methodologies in the literature. In particular, I focused my research on Web effort estimation. As stated by many authors, the existing models for classic software applications are not well suited to measuring the effort of Web applications, which, like traditional software projects, are unfortunately not exempt from cost and time overruns. Initially, I compared the effectiveness of Albrecht's classic Function Points (FP) and Reifer's Web Objects (WO) metrics in estimating development effort for Web applications, in the context of an Italian software company. I tested these metrics on a dataset of 24 projects provided by the software company between 2003 and 2010. I compared the estimates with the real effort of each completed project using the MRE (Magnitude of Relative Error) method. The experimental results showed a high error in estimates when using the WO metric, which proved to be more effective than the FP metric in only two cases. In the context of this first work, it became evident that effort estimation depends not only on functional size measures: other factors had to be considered, such as model accuracy and other challenges specific to Web applications, although functional size remains the input that most influences the final results. For this reason, I revised the WO methodology, creating the RWO methodology. I applied this methodology to the same dataset of projects, comparing the results to those obtained by applying the FP and WO methods. The experimental results showed that the RWO method reached effort prediction results comparable to, and in 4 cases even better than, the FP method. Motivated by the dominant use of Content Management Frameworks (CMF) in Web application development and by the inadequacy of the RWO method when used with the latest Web application development tools, I finally chose to focus my research on a new Web effort estimation methodology for Web applications developed with a CMF. I proposed a new methodology for effort estimation: Web CMF Objects. In this methodology, new key elements for analysis and planning were identified; they make it possible to define every important step in the development of a Web application using a CMF. Following the RWO approach, the estimated effort of a Web project stems from the sum of all elements, each weighted by its own complexity. I tested the whole methodology on 9 projects provided by three different Italian software companies, comparing the value of the effort estimate to the actual, final effort of each project, in man-days.
I then compared the actual effort both with the estimates obtained from the Web CMF Objects methodology and with those obtained from the respective effort estimation methodologies of the three companies, obtaining excellent results: a Pred(0.25) value of 100% for the Web CMF Objects methodology. Recently, I completed the presentation and assessment of the Web CMF Objects methodology, upgrading the cost model used to calculate the effort estimate, and renamed it the Web Framework Points methodology. I tested the updated methodology on 19 projects provided by three software companies, obtaining good results: a Pred(0.25) value of 79%. The aim of my research is to contribute to reducing the estimation error in software projects developed with Content Management Frameworks, with the purpose of making the Web Framework Points methodology a useful tool for software companies.
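For reference, the MRE and Pred(0.25) accuracy measures cited above have standard definitions; the sketch below illustrates them with made-up numbers and is not code or data from the thesis.

```python
def mre(actual, estimated):
    """Magnitude of Relative Error for a single project."""
    return abs(actual - estimated) / actual

def pred(actuals, estimates, threshold=0.25):
    """Pred(threshold): fraction of projects whose MRE is at most the threshold."""
    errors = [mre(a, e) for a, e in zip(actuals, estimates)]
    return sum(err <= threshold for err in errors) / len(errors)

# Illustrative example: 3 of 4 projects fall within 25% of the actual effort (man-days).
actual_effort = [120, 80, 200, 45]
estimated_effort = [110, 95, 140, 47]
print(pred(actual_effort, estimated_effort))  # 0.75
```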

    Evaluation of Coordinated Ramp Metering (CRM) Implemented By Caltrans

    Coordinated ramp metering (CRM) is a critical component of smart freeway corridors, which rely on real-time traffic data from ramps and the freeway mainline to improve decision-making by motorists and Traffic Management Center (TMC) personnel. CRM uses an algorithm that considers real-time traffic volumes on the freeway mainline and ramps and then adjusts the metering rates on the ramps accordingly for optimal flow along the entire corridor. Improving capacity through smart corridors is less costly and easier to deploy than freeway widening, given the high costs associated with right-of-way acquisition and construction; nevertheless, conversion to smart corridors still represents a sizable investment for public agencies. In the U.S., however, there have been limited evaluations of smart corridors in general, and of CRM in particular, based on real operational data. This project examined the recent smart corridor implementations on Interstate 80 (I-80) in the Bay Area and State Route 99 (SR-99) in Sacramento based on travel time reliability measures, efficiency measures, and a before-and-after safety evaluation using the Empirical Bayes (EB) approach. As such, this evaluation represents the most complete before-and-after evaluation of such systems. The reliability measures include the buffer index, planning time, and measures from the literature that account for both the skew and the width of the travel time distribution. For efficiency, the study estimates the ratio of vehicle miles traveled to vehicle hours traveled. The research contextualizes the before-and-after comparisons of efficiency and reliability measures through similar measures from control corridors in the same regions (I-280 in District 4 and I-5 in District 3) that did not have CRM implemented. The results show an improvement in freeway operation based on the efficiency data; post-CRM implementation, the travel time reliability measures do not show a similar improvement. The report also provides a counterfactual estimate of expected crashes in the post-implementation period, which can be compared with the actual number of crashes in the "after" period to evaluate effectiveness.
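The reliability and efficiency measures named above have widely used textbook forms; the sketch below shows those standard definitions on synthetic travel times. It is not code from the project, and the exact variants used in the report may differ.

```python
import numpy as np

def buffer_index(travel_times):
    """(95th-percentile travel time - mean travel time) / mean travel time."""
    tt = np.asarray(travel_times, dtype=float)
    return (np.percentile(tt, 95) - tt.mean()) / tt.mean()

def planning_time_index(travel_times, free_flow_time):
    """95th-percentile travel time relative to the free-flow travel time."""
    return np.percentile(np.asarray(travel_times, dtype=float), 95) / free_flow_time

def efficiency_ratio(vehicle_miles, vehicle_hours):
    """Vehicle miles traveled per vehicle hour traveled (a corridor-level average speed)."""
    return vehicle_miles / vehicle_hours

# Illustrative corridor data: travel times in minutes for a fixed segment.
tt = [22, 24, 23, 35, 28, 26, 25, 40, 24, 23]
print(buffer_index(tt), planning_time_index(tt, free_flow_time=20))
print(efficiency_ratio(vehicle_miles=1.2e6, vehicle_hours=2.5e4))  # ~48 mph
```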

    Fluid-Structure Interaction Simulation of a Coriolis Mass Flowmeter using a Lattice Boltzmann Method

    In this paper we use a fluid-structure interaction (FSI) approach to simulate a Coriolis mass flowmeter (CMF). The fluid dynamics are calculated by the open source framework OpenLB, based on the lattice Boltzmann method (LBM). For the structural dynamics we employ the open source software Elmer, an implementation of the finite element method (FEM). A staggered coupling approach between the two software packages is presented. The finite element mesh is created by the mesh generator Gmsh to ensure a completely open source workflow. The eigenmodes of the CMF, calculated by modal analysis, are compared with measurement data. Using the estimated excitation frequency, a fully coupled, partitioned FSI simulation is applied to simulate the phase shift of the investigated CMF design. The calculated phase-shift values are in good agreement with the measurement data and verify the suitability of the model to numerically describe the working principle of a CMF.
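The quantity of interest here, the phase shift between the two sensor signals of the measuring tube, can be illustrated with a small, self-contained post-processing sketch. The sampling rate, drive frequency, and extraction method below are assumptions chosen for illustration, not values or code from the paper.

```python
import numpy as np

# Synthetic sensor signals of a Coriolis tube oscillating at its drive frequency,
# with a small phase shift between the inlet-side and outlet-side sensors.
fs = 10_000.0                  # sampling rate in Hz (assumed)
f_drive = 250.0                # excitation frequency in Hz (assumed)
t = np.arange(0.0, 1.0, 1.0 / fs)
true_shift = np.deg2rad(0.5)   # 0.5 degree phase shift (illustrative)

sensor_a = np.sin(2 * np.pi * f_drive * t)
sensor_b = np.sin(2 * np.pi * f_drive * t - true_shift)

# Project both signals onto the drive frequency and take the phase difference.
ref = np.exp(-2j * np.pi * f_drive * t)
phase_a = np.angle(np.sum(sensor_a * ref))
phase_b = np.angle(np.sum(sensor_b * ref))
print(np.rad2deg(phase_a - phase_b))   # recovers ~0.5 degrees
```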

    Service Migration in Dynamic and Resource-Constrained Networks


    PERFORMANCE OF RANDOM SURVIVAL FORESTS WITH TIME-VARYING COVARIATES IN PREDICTION OF U.S. ARMY ENLISTED ATTRITION COMPARED TO TRADITIONAL MANPOWER ANALYSIS METHODS

    The importance of identifying qualified candidates and properly forecasting future manpower strength will always be critical to military recruiting and organization. The ability to assess the cross-section of covariates of a cohort of enlistees and to forecast manpower strength would allow for improved planning and allocation decisions. We leverage an innovative method of survival analysis, random survival forests with time-varying covariates (TV-RSF), to predict Army first-term post-Initial Entry Training attrition rates. TV-RSF is an emerging survival analysis method that has not previously been used in a military manpower setting. Using the Brier score, we compare TV-RSF with three other methods. We illustrate that a single survival tree, rather than the computationally intensive TV-RSF, may suffice for predicting future-year attrition. We also illustrate that TV-RSF outperforms traditional classification methods (logistic regression, random forests) that only account for yearly changes in the time-varying covariates.
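The Brier score used for the comparison has a simple standard form for a binary outcome (attrite or not) at a fixed horizon. The snippet below shows that basic version with made-up values; it is not the thesis code and omits the censoring weights that a full time-dependent Brier score for survival data would include.

```python
import numpy as np

def brier_score(predicted_prob, observed):
    """Mean squared difference between the predicted attrition probability
    and the observed 0/1 outcome at a fixed time horizon."""
    p = np.asarray(predicted_prob, dtype=float)
    y = np.asarray(observed, dtype=float)
    return np.mean((p - y) ** 2)

# Illustrative values: lower is better; a constant 0.5 prediction scores 0.25.
print(brier_score([0.1, 0.7, 0.3, 0.9], [0, 1, 0, 1]))  # 0.05
```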

    Spectral modeling of a six-color inkjet printer

    After customizing an Epson Stylus Photo 1200 by adding a continuous-feed ink system and a cyan, magenta, yellow, black, orange, and green ink set, a series of research tasks was carried out to build a full spectral model of the printer's output. First, various forward printer models were tested using the fifteen two-color combinations of the printer. The Yule-Nielsen spectral Neugebauer (YNSN) model was selected as the forward model and its accuracy tested throughout the colorant space. It was found to be highly accurate, performing as well as a more complex local, cellular version. Next, the performance of nonlinear optimization algorithms was evaluated for their ability to efficiently invert the YNSN model. A quasi-Newton algorithm designed by Davidon, Fletcher, and Powell (DFP) was found to give the best performance when combined with starting values produced from a non-negative least-squares fit of single-constant Kubelka-Munk theory. The accuracy of the inverse model was tested and different optimization objective functions were evaluated. A multistage objective function, based on minimizing spectral RMS error and then colorimetric error, was found to give highly accurate matches with low metameric potential. Finally, the relationship between the number of printing inks and the ability to eliminate metamerism was explored.
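The YNSN forward model named above has a compact closed form: the predicted reflectance is the weighted sum of the Neugebauer primaries' reflectances raised to 1/n, with the weights given by the Demichel equations, all raised back to the power n. The sketch below shows that form for a two-ink case; the spectra, coverages, and n-value are illustrative, not data from the study.

```python
import numpy as np

def ynsn_reflectance(coverages, primaries, n):
    """Yule-Nielsen spectral Neugebauer forward model.

    coverages: dict of Demichel area weights per Neugebauer primary
    primaries: dict of measured reflectance spectra per primary
    n:         Yule-Nielsen n-factor (empirically fitted)
    """
    r = sum(w * primaries[key] ** (1.0 / n) for key, w in coverages.items())
    return r ** n

def demichel_two_inks(a1, a2):
    """Demichel weights for two inks: bare paper, each ink alone, and the overprint."""
    return {
        "paper": (1 - a1) * (1 - a2),
        "ink1": a1 * (1 - a2),
        "ink2": (1 - a1) * a2,
        "ink12": a1 * a2,
    }

# Illustrative 4-band "spectra" (a real model uses ~31 bands over 400-700 nm).
primaries = {
    "paper": np.array([0.90, 0.88, 0.87, 0.85]),
    "ink1":  np.array([0.10, 0.15, 0.60, 0.70]),
    "ink2":  np.array([0.70, 0.60, 0.20, 0.10]),
    "ink12": np.array([0.08, 0.10, 0.15, 0.12]),
}
print(ynsn_reflectance(demichel_two_inks(0.5, 0.3), primaries, n=2.0))
```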