
    Computer hardware and software for robotic control

    The KSC has implemented an integrated system that coordinates state-of-the-art robotic subsystems. It is a sensor-based, real-time robotic control system performing operations beyond the capability of an off-the-shelf robot. The integrated system provides real-time, closed-loop adaptive path control of the position and orientation of all six axes of a large robot; enables the implementation of a highly configurable, expandable testbed for sensor system development; and makes several smart distributed control subsystems (robot arm controller, process controller, graphics display, and vision tracking) appear as intelligent peripherals to a supervisory computer that coordinates the overall system.
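    The closed-loop adaptive path control mentioned above can be sketched roughly as follows. This is an illustrative toy, not KSC's actual controller: the sensor interface, proportional correction, and gain value are all assumptions.

```python
# Minimal sketch of closed-loop pose correction for a six-axis arm.
# Each control cycle, a tracking subsystem reports the sensed pose and the
# next command is nudged to cancel the error (proportional term only).

def closed_loop_step(commanded, sensed, gain=0.5):
    """Return a corrected 6-axis pose command (x, y, z, roll, pitch, yaw).

    'commanded' is the desired pose, 'sensed' the pose measured by the
    vision/tracking subsystem; 'gain' scales the correction per cycle.
    """
    return [c + gain * (c - s) for c, s in zip(commanded, sensed)]

# One cycle: the arm is slightly off in x, y, and yaw, so the next
# command shifts back toward the desired pose along those axes.
commanded = [1.0, 0.0, 0.5, 0.0, 0.0, 0.0]
sensed = [0.9, 0.1, 0.5, 0.0, 0.0, 0.05]
print(closed_loop_step(commanded, sensed))
```

    In a real system the correction would typically include integral and derivative terms and run at a fixed control rate; the single proportional term here is only meant to show the sense of the feedback loop.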

    Automation Script For Evaluation Of Source Codes

    This thesis focuses on the development of an automation script integrated with a Web application to extract crucial information from .NET projects. The objective was to streamline the process of retrieving the database type, database name, and .NET version from uploaded zip files, determining build status, generating comprehensive reports, and presenting key metrics on a dashboard. The automation script was implemented in Python, utilizing packages such as os, subprocess, zipfile, re, json5, shutil, and xml.etree.ElementTree. The script automated the extraction of information from the zip files, eliminating the need for manual intervention. It executed the .NET build command to determine the success of the build and captured error details, if any. The appsettings.json file was parsed to obtain the database type and name, while the csproj files provided the .NET version. The developed automation script was integrated with a Web application, allowing users to upload zip files and apply the script effortlessly. The application displayed a dashboard presenting statistical insights, including the counts of database types used, the distribution of .NET versions, and the overall success rate of the build process. Reports were generated, providing detailed breakdowns of the build process and error details. The experimental setup involved various test files, including sample files representing SQL Server and SQLite databases and files intentionally modified to include build errors. The results obtained from running the automation script on the test files demonstrated its effectiveness and efficiency in extracting information and generating accurate reports. The script showed advantages over existing methods and tools, offering simplicity, cost-effectiveness, and flexibility. The thesis concludes with a discussion of the strengths and limitations of the automation script, potential improvements, and recommendations for future automation efforts.
Overall, the developed automation script proved valuable for extracting information from zipped .NET projects and demonstrated its potential for enhancing productivity and decision-making in software development processes.
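    The pipeline the abstract describes can be sketched as below. This is a hedged reconstruction, not the thesis's actual script: the JSON keys (`ConnectionStrings`), the connection-string heuristic, and the `dotnet build` invocation are assumptions, and stdlib `json` stands in for the `json5` package the thesis mentions.

```python
# Sketch: unzip a .NET project, parse appsettings.json and *.csproj,
# and optionally run the build to record its status.
import json
import re
import subprocess
import xml.etree.ElementTree as ET
import zipfile
from pathlib import Path

def analyze_project(zip_path, workdir, run_build=True):
    """Extract a zipped .NET project and collect database info, target
    framework, and (optionally) build status into a report dict."""
    workdir = Path(workdir)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(workdir)

    report = {}

    # appsettings.json -> database type and name (connection-string heuristic).
    for settings in workdir.rglob("appsettings.json"):
        cfg = json.loads(settings.read_text())
        conn = next(iter(cfg.get("ConnectionStrings", {}).values()), "")
        m = re.search(r"Database=([^;]+)", conn)
        report["database_name"] = m.group(1) if m else None
        report["database_type"] = "SQLite" if ".db" in conn.lower() else "SQL Server"
        break

    # *.csproj -> the <TargetFramework> element holds the .NET version.
    for csproj in workdir.rglob("*.csproj"):
        tf = ET.parse(csproj).find(".//TargetFramework")
        report["dotnet_version"] = tf.text if tf is not None else None
        break

    # Run the build and record success plus any captured error output.
    if run_build:
        result = subprocess.run(["dotnet", "build", str(workdir)],
                                capture_output=True, text=True)
        report["build_ok"] = result.returncode == 0
        report["build_errors"] = result.stderr if result.returncode else ""
    return report
```

    A Web front end would then call `analyze_project` on each uploaded zip and aggregate the per-project reports into the dashboard counts the abstract describes.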

    User-centered visual analysis using a hybrid reasoning architecture for intensive care units

    One problem pertaining to Intensive Care Unit information systems is that, in some cases, a very dense display of data can result. To ensure the overview and readability of the increasing volumes of data, special features are required (e.g., data prioritization, clustering, and selection mechanisms) together with analytical methods (e.g., temporal data abstraction, principal component analysis, and detection of events). This paper addresses the problem of improving the integration of the visual and analytical methods applied to medical monitoring systems. We present a knowledge- and machine learning-based approach to support the knowledge discovery process with appropriate analytical and visual methods, and discuss its potential benefit to the development of user interfaces for intelligent monitors that can assist with the detection and explanation of new, potentially threatening medical events. The proposed hybrid reasoning architecture provides an interactive graphical user interface to adjust the parameters of the analytical methods based on the user's task at hand. The action sequences performed by the user on the graphical user interface are consolidated in a dynamic knowledge base with specific hybrid reasoning that integrates symbolic and connectionist approaches. These sequences of expert knowledge acquisition can facilitate the emergence of knowledge during a similar experience and positively impact the monitoring of critical situations. The provided graphical user interface, incorporating a user-centered visual analysis, is exploited to facilitate the natural and effective representation of clinical information for patient care.
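    Temporal data abstraction, one of the analytical methods the abstract lists, can be illustrated with a small sketch: raw monitor samples are mapped to qualitative states, and runs of the same state are merged into intervals. The heart-rate thresholds here are assumed illustrative values, not the paper's clinical parameters; in the described architecture such thresholds would be exactly the kind of parameter a user adjusts through the GUI.

```python
# Toy temporal abstraction: numeric samples -> qualitative state intervals.

def abstract_series(samples, low=60, high=100):
    """Turn heart-rate samples into (state, start_idx, end_idx) intervals."""
    def state(v):
        if v < low:
            return "bradycardia"
        if v > high:
            return "tachycardia"
        return "normal"

    intervals = []
    for i, v in enumerate(samples):
        s = state(v)
        if intervals and intervals[-1][0] == s:
            intervals[-1] = (s, intervals[-1][1], i)   # extend the current run
        else:
            intervals.append((s, i, i))                # start a new run
    return intervals

hr = [72, 75, 110, 115, 112, 80, 78]
print(abstract_series(hr))
# three intervals: normal, then a tachycardia episode, then normal again
```

    The resulting intervals are far sparser than the raw samples, which is what makes them suitable for the prioritized, decluttered displays the paper argues for.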

    Digital Image Access & Retrieval

    The 33rd Annual Clinic on Library Applications of Data Processing, held at the University of Illinois at Urbana-Champaign in March 1996, addressed the theme of "Digital Image Access & Retrieval." The papers from this conference cover a wide range of topics concerning digital imaging technology for visual resource collections. Papers covered three general areas: (1) systems, planning, and implementation; (2) automatic and semi-automatic indexing; and (3) preservation, with the bulk of the conference focusing on indexing and retrieval.

    Why (and How) Networks Should Run Themselves

    The proliferation of networked devices, systems, and applications that we depend on every day makes managing networks more important than ever. The increasing security, availability, and performance demands of these applications suggest that these increasingly difficult network management problems be solved in real time, across a complex web of interacting protocols and systems. Alas, just as the importance of network management has increased, the network has grown so complex that it is seemingly unmanageable. In this new era, network management requires a fundamentally new approach. Instead of optimizations based on closed-form analysis of individual protocols, network operators need data-driven, machine-learning-based models of end-to-end and application performance based on high-level policy goals and a holistic view of the underlying components. Instead of anomaly detection algorithms that operate on offline analysis of network traces, operators need classification and detection algorithms that can make real-time, closed-loop decisions. Networks should learn to drive themselves. This paper explores this concept, discussing how we might attain this ambitious goal by more closely coupling measurement with real-time control and by relying on learning for inference and prediction about a networked application or system, as opposed to closed-form analysis of individual protocols.
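    The measure-infer-act loop the abstract argues for can be sketched in miniature. Everything here is a toy assumption: the features (RTT, loss rate), the threshold "model", and the rate actions stand in for the learned models and real-time controllers the paper envisions.

```python
# Sketch of a closed-loop network decision: measure -> infer -> act.

def infer_congestion(rtt_ms, loss_rate):
    """Stand-in for a learned classifier: is the path congested?
    A real system would replace this rule with a trained model."""
    return rtt_ms > 100 or loss_rate > 0.02

def control_step(send_rate, rtt_ms, loss_rate):
    """Closed-loop action: cut the rate on congestion, probe upward otherwise."""
    if infer_congestion(rtt_ms, loss_rate):
        return send_rate * 0.5     # back off sharply
    return send_rate + 1.0         # probe for more bandwidth

# Four measurement cycles: two clean, one congested, one clean.
rate = 10.0
for rtt, loss in [(40, 0.0), (50, 0.0), (120, 0.03), (60, 0.0)]:
    rate = control_step(rate, rtt, loss)
print(rate)  # 7.0: probe, probe, backoff, probe
```

    The point of the sketch is the coupling: measurement feeds inference, inference drives an immediate control action, and the next measurement reflects that action, in contrast to offline trace analysis.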