299 research outputs found

    Reverse engineering and emotional attachments as mechanisms mediating the effects of quantification

    Alain Desrosières understood statistics as simultaneous representations of the world and interventions in it. This article examines two mechanisms that mediate how numbers do both. The first, reverse engineering, describes how working backwards from a desired number shapes organizational routines. The second, emotional attachment, describes the processes by which numbers generate a variety of emotions that sometimes stimulate collective identities. Focusing on educational rankings but including examples of other types of numbers, it argues for the importance of disclosing the effects of specific causal mechanisms in the analysis of particular performance measures.

    Method-MS, final report 2010

    Radiometric determination methods, such as alpha spectrometry, require long counting times when low activities are to be determined. Mass spectrometric techniques such as Inductively Coupled Plasma Mass Spectrometry (ICP-MS), Thermal Ionisation Mass Spectrometry (TIMS), and Accelerator Mass Spectrometry (AMS) have shown several advantages over traditional methods when measuring long-lived radionuclides. Mass spectrometric methods for the determination of very low concentrations of elemental isotopes, and thereby isotopic ratios, have been developed using a variety of ion sources. Although primarily applied to the determination of the lighter stable-element isotopes and of radioactive isotopes in geological studies, the techniques can equally well be applied to the measurement of activity concentrations of long-lived, low-level radionuclides in various samples using “isotope dilution” methods such as those applied in ICP-MS. Because of the low specific activity of long-lived radionuclides, many of these are more conveniently detected using mass spectrometric techniques. Mass spectrometry also enables the individual determination of Pu-239 and Pu-240, which cannot be resolved by alpha spectrometry. ICP-MS is a rapidly growing technique for the ultra-trace analytical determination of stable and long-lived isotopes and has wide potential within environmental science, including ecosystem tracer and radioecological studies. Such instrumentation, of course, needs good radiochemical separation to give its best performance. The objectives of the project are to identify current needs and problems in the low-level determination of long-lived radioisotopes by ICP-MS, to perform intercalibration and to develop and improve ICP-MS methods for the measurement of radionuclides and isotope ratios, and to develop new methods based on modified separation chemistry applied to new auxiliary equipment.
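    For reference, the “isotope dilution” quantification mentioned above rests on a standard mass-balance relation; the expression below is the generic textbook form (symbols chosen here for illustration), not an equation taken from the report.

        \[
          c_x \;=\; c_y \,\frac{m_y}{m_x}\,
          \frac{A_y - R_m B_y}{R_m B_x - A_x}
        \]

    where c_x and c_y are the analyte amount contents of the sample and the enriched spike, m_x and m_y the masses of sample and spike in the blend, A and B the relative abundances of the reference and spike isotopes in each material, and R_m the reference-to-spike isotope ratio measured by ICP-MS.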

    Game Theory and Prescriptive Analytics for Naval Wargaming Battle Management Aids

    NPS NRP Technical Report. The Navy is taking advantage of advances in computational technologies and data analytic methods to automate and enhance tactical decisions and support warfighters in highly complex combat environments. Novel automated techniques offer opportunities to support the tactical warfighter through enhanced situational awareness, automated reasoning and problem-solving, and faster decision timelines. This study will investigate how game theory and prescriptive analytics methods can be used to develop real-time wargaming capabilities that support warfighters in exploring and evaluating the possible consequences of different tactical courses of action (COAs) to improve tactical missions. The study will develop a conceptual design of a real-time tactical wargaming capability and will explore data analytic methods, including game theory, prescriptive analytics, and artificial intelligence (AI), to evaluate their potential to support real-time wargaming. N2/N6 - Information Warfare. This research is supported by funding from the Naval Postgraduate School, Naval Research Program (PE 0605853N/2098). https://nps.edu/nrp. Chief of Naval Operations (CNO). Approved for public release. Distribution is unlimited.
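    The abstract names game theory without detail, so the sketch below shows, purely as an assumed illustration, how a tactical COA comparison could be cast as a toy zero-sum matrix game whose optimal mixed strategy is recovered with off-the-shelf linear programming. The payoff numbers, COA labels, and the scipy-based formulation are not taken from the report.

        # Toy zero-sum matrix game: rows are defender COAs, columns are attacker COAs,
        # entries are payoffs to the defender. The defender's maximin mixed strategy and
        # the game value are found by linear programming.
        import numpy as np
        from scipy.optimize import linprog

        payoff = np.array([
            [ 3, -1,  2],   # COA 1: aggressive intercept (illustrative values only)
            [ 1,  2, -2],   # COA 2: layered defense
            [-2,  1,  1],   # COA 3: withhold and track
        ], dtype=float)

        m, n = payoff.shape
        # Variables: m mixed-strategy weights x plus the game value v. Maximize v (minimize -v).
        c = np.concatenate([np.zeros(m), [-1.0]])
        # For every attacker column j: sum_i x_i * payoff[i, j] >= v  ->  -payoff[:, j].x + v <= 0
        A_ub = np.hstack([-payoff.T, np.ones((n, 1))])
        b_ub = np.zeros(n)
        A_eq = np.concatenate([np.ones(m), [0.0]]).reshape(1, -1)  # weights sum to 1
        b_eq = [1.0]
        bounds = [(0, None)] * m + [(None, None)]                  # v is unbounded

        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        mix, value = res.x[:m], res.x[m]
        print("defender mixed strategy:", np.round(mix, 3), "game value:", round(value, 3))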

    Naval Research Program 2021 Annual Report

    NPS NRP Annual Report. The Naval Postgraduate School (NPS) Naval Research Program (NRP) is funded by the Chief of Naval Operations and supports research projects for the Navy and Marine Corps. The NPS NRP serves as a launch-point for new initiatives which posture naval forces to meet current and future operational warfighter challenges. NRP research projects are led by individual research teams that conduct research and through which NPS expertise is developed and maintained. The primary mechanism for obtaining NPS NRP support is through participation at NPS Naval Research Working Group (NRWG) meetings that bring together fleet topic sponsors, NPS faculty members, and students to discuss potential research topics and initiatives. Chief of Naval Operations (CNO). Approved for public release. Distribution is unlimited.

    New Similarity Measures for Capturing Browsing Interests of Users into Web Usage Profiles

    The essence of web personalization is the adaptability of a website to the needs and interests of individual users. The recognition of user preferences and interests can be based on the knowledge gained from previous interactions of users with the site. Typically, a set of usage profiles is mined from web log data (records of website usage), where each profile models the common browsing interests of a group of like-minded users. These profiles are later utilized to provide personalized recommendations. Clearly, the quality of usage profiles is critical to the performance of a personalization system. When using clustering for web mining, successful clustering of users is a major factor in deriving effective usage profiles, and clustering depends on the discriminatory capability of the similarity measure used. In this thesis, we first present a new weighted session similarity measure to capture the browsing interests of users into web usage profiles. We base our similarity measure on the reasonable assumption that when users spend longer times on pages or revisit pages in the same session, such pages are very likely of greater interest to the user. The proposed similarity measure combines structural similarity with session-wise page significance. The latter, representing the degree of user interest, is computed using page-access frequency and page-access duration. Web usage profiles are generated by applying a fuzzy clustering algorithm using this measure. To evaluate the effectiveness of the proposed measure, we adapt two model-based collaborative filtering algorithms for recommending pages. Experimental results show considerable improvement in the overall performance of the recommender systems compared with other known similarity measures. Lastly, we propose a modification that replaces structural similarity with concept (content) similarity, which we expect will further enhance recommendation system performance.
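    As a rough, assumed illustration of the kind of measure described above (the thesis's exact weighting scheme is not reproduced here; the equal blend of frequency and dwell time and the helper names are placeholders), one can weight each page by its in-session significance and compare sessions with a cosine over the significance-weighted page vectors, which folds the structural (common-page) component into the same score.

        # Illustrative sketch only: significance blends normalized access frequency and
        # normalized dwell time; sessions are compared by cosine over weighted page vectors.
        from collections import defaultdict

        def page_significance(session):
            """session: list of (page, duration_seconds). Returns {page: significance}."""
            freq, dur = defaultdict(int), defaultdict(float)
            for page, d in session:
                freq[page] += 1
                dur[page] += d
            total_f = sum(freq.values())
            total_d = sum(dur.values()) or 1.0
            # Equal 0.5/0.5 weighting of frequency and dwell time is an assumption.
            return {p: 0.5 * freq[p] / total_f + 0.5 * dur[p] / total_d for p in freq}

        def session_similarity(s1, s2):
            """Cosine similarity of two sessions over significance-weighted page vectors."""
            w1, w2 = page_significance(s1), page_significance(s2)
            common = set(w1) & set(w2)
            num = sum(w1[p] * w2[p] for p in common)
            den = (sum(v * v for v in w1.values()) ** 0.5) * (sum(v * v for v in w2.values()) ** 0.5)
            return num / den if den else 0.0

        # Example: two sessions given as (page, seconds-on-page) lists.
        a = [("/home", 5), ("/products", 40), ("/products", 25), ("/cart", 10)]
        b = [("/home", 8), ("/products", 60), ("/checkout", 15)]
        print(round(session_similarity(a, b), 3))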

    Evolving Clustering Algorithms And Their Application For Condition Monitoring, Diagnostics, & Prognostics

    Applications of Condition-Based Maintenance (CBM) technology require effective yet generic data-driven methods capable of carrying out diagnostics and prognostics tasks without detailed domain knowledge or human intervention. With the widespread deployment of CBM, improved system availability, operational safety, and enhanced logistics and supply-chain performance could be achieved at lower cost. This dissertation focuses on the development of a Mutual Information based Recursive Gustafson-Kessel-Like (MIRGKL) clustering algorithm, which operates recursively to identify the underlying model structure and parameters from stream-type data. Inspired by the Evolving Gustafson-Kessel-like Clustering (eGKL) algorithm, we applied the notion of mutual information to the well-known Mahalanobis distance as the governing similarity measure throughout. This is also a special case of the Kullback-Leibler (KL) divergence in which between-cluster shape information (governed by the determinant and trace of the covariance matrix) is omitted, and it is applicable only to normally distributed data. In the cluster assignment and consolidation process, we proposed the use of the Chi-square statistic with provision for different probability thresholds. Owing to the symmetry and boundedness brought in by the mutual information formulation, we have shown with real-world data that the algorithm’s performance becomes less sensitive over the same range of probability thresholds, which makes system tuning a simpler task in practice. As a result, the improvement demonstrated by the proposed algorithm has implications for generic data-driven methods for diagnostics, prognostics, function approximation, and knowledge extraction for stream-type data. The work in this dissertation demonstrates MIRGKL’s effectiveness in clustering and knowledge representation and shows promising results in diagnostics and prognostics applications.
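    As a generic illustration of the chi-square gating step mentioned above (not the MIRGKL algorithm itself; the update rules, identity-covariance initialization, and the 0.99 threshold are assumptions), an online Gaussian clusterer can assign a sample to the nearest cluster when its squared Mahalanobis distance lies inside the chi-square acceptance region and otherwise spawn a new cluster.

        # Sketch of chi-square gated, recursive cluster assignment for streaming data.
        import numpy as np
        from scipy.stats import chi2

        class OnlineGaussianClusters:
            def __init__(self, dim, prob_threshold=0.99):
                self.dim = dim
                self.gate = chi2.ppf(prob_threshold, df=dim)   # chi-square acceptance region
                self.clusters = []                             # each: dict(mean, cov, n)

            def _mahalanobis_sq(self, x, c):
                diff = x - c["mean"]
                inv = np.linalg.inv(c["cov"] + 1e-9 * np.eye(self.dim))
                return float(diff @ inv @ diff)

            def update(self, x):
                x = np.asarray(x, dtype=float)
                if self.clusters:
                    d2s = [self._mahalanobis_sq(x, c) for c in self.clusters]
                    k = int(np.argmin(d2s))
                    if d2s[k] <= self.gate:                    # inside the gate: recursive update
                        c = self.clusters[k]
                        c["n"] += 1
                        delta = x - c["mean"]
                        c["mean"] += delta / c["n"]
                        c["cov"] += (np.outer(delta, x - c["mean"]) - c["cov"]) / c["n"]
                        return k
                # Otherwise start a new cluster; identity covariance is a placeholder prior.
                self.clusters.append({"mean": x.copy(), "cov": np.eye(self.dim), "n": 1})
                return len(self.clusters) - 1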

    Computational Methods for Segmentation of Multi-Modal Multi-Dimensional Cardiac Images

    Segmentation of the heart structures helps compute the cardiac contractile function, quantified via the systolic and diastolic volumes, ejection fraction, and myocardial mass, which represent reliable diagnostic values. Similarly, quantification of the myocardial mechanics throughout the cardiac cycle and analysis of the activation patterns in the heart via electrocardiography (ECG) signals serve as good indicators for cardiac diagnosis. Furthermore, high-quality anatomical models of the heart can be used in the planning and guidance of minimally invasive, image-guided interventions. The most crucial step for the above-mentioned applications is to segment the ventricles and myocardium from the acquired cardiac image data. Although manual delineation of the heart structures is deemed the gold-standard approach, it requires significant time and effort and is highly susceptible to inter- and intra-observer variability. These limitations suggest a need for fast, robust, and accurate semi- or fully-automatic segmentation algorithms. However, the complex motion and anatomy of the heart, indistinct borders due to blood flow, the presence of trabeculations, intensity inhomogeneity, and various other imaging artifacts make the segmentation task challenging. In this work, we present and evaluate segmentation algorithms for multi-modal, multi-dimensional cardiac image datasets. Firstly, we segment the left ventricle (LV) blood-pool from a tri-plane 2D+time trans-esophageal (TEE) ultrasound acquisition using local phase based filtering and a graph-cut technique, propagate the segmentation throughout the cardiac cycle using non-rigid registration-based motion extraction, and reconstruct the 3D LV geometry. Secondly, we segment the LV blood-pool and myocardium from an open-source 4D cardiac cine Magnetic Resonance Imaging (MRI) dataset by incorporating an average-atlas-based shape constraint into the graph-cut framework with iterative segmentation refinement. The developed fast and robust framework is further extended to perform right ventricle (RV) blood-pool segmentation on a different open-source 4D cardiac cine MRI dataset. Next, we employ a convolutional neural network based multi-task learning framework to simultaneously segment the myocardium and regress its area, and show that computing the myocardial area from the segmentation is significantly better than regressing it directly from the network, while also being more interpretable. Finally, we impose a weak shape constraint via a multi-task learning framework in a fully convolutional network and show improved segmentation performance for the LV, RV, and myocardium across healthy and pathological cases, as well as in the challenging apical and basal slices, in two open-source 4D cardiac cine MRI datasets. We demonstrate the accuracy and robustness of the proposed segmentation methods by comparing the obtained results against the provided gold-standard manual segmentations, as well as against other competing segmentation methods.
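    To make the multi-task idea concrete, the hedged sketch below pairs a per-pixel segmentation head with an auxiliary area-regression head trained jointly; the toy encoder, layer sizes, the 0.1 loss weight, and all names are placeholders rather than the dissertation's actual architecture.

        # Shared encoder with two heads: cross-entropy for segmentation, L2 for area regression.
        import torch
        import torch.nn as nn

        class SegAndAreaNet(nn.Module):
            def __init__(self, n_classes=2):
                super().__init__()
                self.encoder = nn.Sequential(
                    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                )
                self.seg_head = nn.Conv2d(32, n_classes, 1)            # per-pixel class scores
                self.area_head = nn.Sequential(                        # globally pooled regression
                    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 1)
                )

            def forward(self, x):
                feats = self.encoder(x)
                return self.seg_head(feats), self.area_head(feats).squeeze(1)

        def multitask_loss(seg_logits, area_pred, seg_target, area_target, weight=0.1):
            seg_loss = nn.functional.cross_entropy(seg_logits, seg_target)
            area_loss = nn.functional.mse_loss(area_pred, area_target)
            return seg_loss + weight * area_loss

        # Example forward/backward pass on random single-channel 64x64 images.
        x = torch.randn(4, 1, 64, 64)
        seg_target = torch.randint(0, 2, (4, 64, 64))
        area_target = seg_target.float().mean(dim=(1, 2))              # fraction of myocardial pixels
        model = SegAndAreaNet()
        seg_logits, area_pred = model(x)
        loss = multitask_loss(seg_logits, area_pred, seg_target, area_target)
        loss.backward()

    The abstract's observation that area computed from the predicted mask outperforms the directly regressed value can be checked in this setup by comparing seg_logits.argmax(1).float().mean(dim=(1, 2)) against area_pred.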

    Naval Research Program 2019 Annual Report

    NPS NRP Annual Report. The Naval Postgraduate School (NPS) Naval Research Program (NRP) is funded by the Chief of Naval Operations and supports research projects for the Navy and Marine Corps. The NPS NRP serves as a launch-point for new initiatives which posture naval forces to meet current and future operational warfighter challenges. NRP research projects are led by individual research teams that conduct research and through which NPS expertise is developed and maintained. The primary mechanism for obtaining NPS NRP support is through participation at NPS Naval Research Working Group (NRWG) meetings that bring together fleet topic sponsors, NPS faculty members, and students to discuss potential research topics and initiatives. This research is supported by funding from the Naval Postgraduate School, Naval Research Program (PE 0605853N/2098). https://nps.edu/nrp. Chief of Naval Operations (CNO). Approved for public release. Distribution is unlimited.

    EVALUATING ARTIFICIAL INTELLIGENCE METHODS FOR USE IN KILL CHAIN FUNCTIONS

    Current naval operations require sailors to make time-critical and high-stakes decisions based on uncertain situational knowledge in dynamic operational environments. Recent tragic events have resulted in unnecessary casualties; they reflect the decision complexity involved in naval operations and specifically highlight challenges within the OODA loop (Observe, Orient, Decide, Act). Kill chain decisions involving the use of weapon systems are a particularly stressing category within the OODA loop, with unexpected threats that are difficult to identify with certainty, shortened decision reaction times, and lethal consequences. An effective kill chain requires the proper setup and employment of shipboard sensors; the identification and classification of unknown contacts; the analysis of contact intentions based on kinematics and intelligence; an awareness of the environment; and decision analysis and resource selection. This project explored the use of automation and artificial intelligence (AI) to improve naval kill chain decisions. The team studied naval kill chain functions and developed specific evaluation criteria for each function for determining the efficacy of specific AI methods. The team then identified and studied AI methods and applied the evaluation criteria to map specific AI methods to specific kill chain functions. Civilian, Department of the Navy; Civilian, Department of the Navy; Civilian, Department of the Navy; Captain, United States Marine Corps; Civilian, Department of the Navy; Civilian, Department of the Navy. Approved for public release. Distribution is unlimited.

    Proceedings of the 2004 ONR Decision-Support Workshop Series: Interoperability

    In August of 1998 the Collaborative Agent Design Research Center (CADRC) of the California Polytechnic State University in San Luis Obispo (Cal Poly) approached Dr. Phillip Abraham of the Office of Naval Research (ONR) with the proposal for an annual workshop focusing on emerging concepts in decision-support systems for military applications. The proposal was considered timely by the ONR Logistics Program Office for at least two reasons. First, rapid advances in information systems technology over the past decade had produced distributed collaborative computer-assistance capabilities with profound potential for providing meaningful support to military decision makers. Indeed, some systems based on these new capabilities, such as the Integrated Marine Multi-Agent Command and Control System (IMMACCS) and the Integrated Computerized Deployment System (ICODES), had already reached the field-testing and final product stages, respectively. Second, over the past two decades the US Navy and Marine Corps had been increasingly challenged by missions demanding the rapid deployment of forces into hostile or devastated territories with minimal or non-existent indigenous support capabilities. Under these conditions Marine Corps forces had to rely mostly, if not entirely, on sea-based support and sustainment operations. Particularly today, operational strategies such as Operational Maneuver From The Sea (OMFTS) and Sea To Objective Maneuver (STOM) are very much in need of intelligent, near real-time and adaptive decision-support tools to assist military commanders and their staff under conditions of rapid change and overwhelming data loads. In light of these developments the Logistics Program Office of ONR considered it timely to provide an annual forum for the interchange of ideas, needs, and concepts that would address the decision-support requirements and opportunities in combined Navy and Marine Corps sea-based warfare and humanitarian relief operations. The first ONR Workshop was held April 20-22, 1999 at the Embassy Suites Hotel in San Luis Obispo, California. It focused on advances in technology with particular emphasis on an emerging family of powerful computer-based tools, and concluded that the most able members of this family of tools appear to be computer-based agents capable of communicating within a virtual environment of the real world. From 2001 onward the venue of the Workshop moved from the West Coast to Washington, and in 2003 the sponsorship was taken over by ONR’s Littoral Combat/Power Projection (FNC) Program Office (Program Manager: Mr. Barry Blumenthal). Themes and keynote speakers of past Workshops have included:
    1999: ‘Collaborative Decision Making Tools’ - Vadm Jerry Tuttle (USN Ret.); LtGen Paul Van Riper (USMC Ret.); Radm Leland Kollmorgen (USN Ret.); and Dr. Gary Klein (Klein Associates)
    2000: ‘The Human-Computer Partnership in Decision-Support’ - Dr. Ronald DeMarco (Associate Technical Director, ONR); Radm Charles Munns; Col Robert Schmidle; and Col Ray Cole (USMC Ret.)
    2001: ‘Continuing the Revolution in Military Affairs’ - Mr. Andrew Marshall (Director, Office of Net Assessment, OSD); and Radm Jay M. Cohen (Chief of Naval Research, ONR)
    2002: ‘Transformation ...’ - Vadm Jerry Tuttle (USN Ret.); and Steve Cooper (CIO, Office of Homeland Security)
    2003: ‘Developing the New Infostructure’ - Richard P. Lee (Assistant Deputy Under Secretary, OSD); and Michael O’Neil (Boeing)
    2004: ‘Interoperability’ - MajGen Bradley M. Lott (USMC), Deputy Commanding General, Marine Corps Combat Development Command; Donald Diggs, Director, C2 Policy, OASD (NII)