17,574 research outputs found

    The Factory of the Future

    A brief history of aircraft production techniques is given. A flexible machining cell is then described. It is a computer-controlled system capable of performing 4-axis machining, part cleaning, dimensional inspection, and materials handling functions in an unmanned environment. The cell was designed to: allow processing of similar and dissimilar parts in random order without disrupting production; allow serial (one-shipset-at-a-time) manufacturing; reduce work-in-process inventory; maximize machine utilization through remote set-up; and maximize throughput while minimizing labor.
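    The random-order part routing described above can be pictured with a toy dispatcher. This is a minimal sketch, not taken from the paper; the part types, station names, and queueing policy are assumptions chosen only to show how dissimilar parts might be accepted in any order without halting the cell.

```python
from collections import deque
from dataclasses import dataclass

# Hypothetical routings: each part type visits the stations the abstract
# lists (machining, cleaning, inspection), in its own order.
ROUTINGS = {
    "spar_fitting": ["machining", "cleaning", "inspection"],
    "rib_bracket":  ["machining", "inspection", "cleaning"],
}

@dataclass
class Part:
    part_id: str
    part_type: str
    step: int = 0  # index of the next operation in this part's routing

def run_cell(parts):
    """Accept similar and dissimilar parts in arbitrary arrival order."""
    queue = deque(parts)
    while queue:
        part = queue.popleft()
        routing = ROUTINGS[part.part_type]
        print(f"{part.part_id}: {routing[part.step]}")
        part.step += 1
        if part.step < len(routing):
            queue.append(part)  # re-queue the part for its next operation

run_cell([Part("P1", "rib_bracket"), Part("P2", "spar_fitting")])
```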

    Special Libraries, Spring 1995

    Volume 86, Issue 2

    Lifecycle Study for Information Technology Systems in the Cold Rolling Plant 1

    The purpose of this Master's thesis was to identify the lifecycle status of the IT components used in production at Cold Rolling Plant 1 at the Outokumpu Tornio site. Further aims were to clarify the availability of support and spare parts, to analyze whether the components can be upgraded or replaced with newer ones, to get an overview of the equipment used on the production lines, and to study methods for lifecycle management. The study covers the basics provided by ITIL and by the global manufacturers ABB and Siemens. It included field surveys, from which spreadsheets were created. During the surveys all IT components were photographed, and over 700 photos were then analyzed against the system documentation. The study and the survey results can be used as a basis for IT system development and for developing daily routines. It gives a view of the lifecycles and equipment in the production IT systems and demonstrates a workable approach for collecting data from the field. The results are company-confidential and are separated from the published version.
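    The lifecycle assessment described above can be pictured with a small inventory structure that flags components nearing the end of vendor support. This is an illustrative sketch only; the component names, support dates, and status rules are invented and are not taken from the thesis.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ITComponent:
    name: str
    vendor: str
    end_of_support: Optional[date]  # None = support status unknown

def lifecycle_status(component: ITComponent, today: date = date.today()) -> str:
    """Classify a component by how close it is to losing vendor support."""
    if component.end_of_support is None:
        return "verify with vendor"
    days_left = (component.end_of_support - today).days
    if days_left < 0:
        return "unsupported - plan replacement"
    if days_left < 365:
        return "support ending - plan upgrade"
    return "supported"

# Hypothetical entries of the kind a field survey might produce.
inventory = [
    ITComponent("line PLC CPU", "ABB", date(2024, 12, 31)),
    ITComponent("HMI panel", "Siemens", None),
]
for c in inventory:
    print(c.name, "->", lifecycle_status(c))
```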

    Special Libraries, December 1964

    Volume 55, Issue 10

    Developing a traffic control device maintenance management system interfacing with GIS

    Roadway systems contain a wide variety of spatially distributed physical features which require installation, maintenance and replacement. These features include traffic control devices such as signs, signals, pavement markings and streetlights. Several technologies exist that can be utilized by the transportation sector to improve program management of these features. Geographic Information Systems (GIS) technology provides a powerful environment for the capture, storage, retrieval, analysis, and display of spatial (locationally defined) data. A need exists for an inventory of the transportation physical plant that interfaces with a work management system, since information on the number and condition of such features is required for planning, operating, maintaining, managing and budgeting. This thesis summarizes the development of a user-friendly, computerized process to establish a graphical interface between a roadway inventory database and GIS. An evaluation of existing technologies and a survey of the current literature provide a basis for the design of a Traffic Control Device Maintenance Management System. This system will provide a consistent form of technology transfer on a common platform and will manage resources by utilizing GIS technology to integrate a work-order system and a database reporting system. The work-order interface capabilities will include maintenance work-order management, project cost and progress tracking, and program planning and policy analysis. The key is to develop a user-friendly system useful to both field-level installation crews and planning-level management. A case study in Clark County, Nevada, will be used to evaluate alternative methods of collecting data on traffic control devices and to illustrate the development of a GIS-based management system. This system is intended to improve the efficiency and effectiveness of operational practices as well as serve as a vital decision-support tool for planning and management.
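    The link between a spatial device inventory and work orders can be sketched with two small records keyed by a device ID. This is an illustrative sketch only; the field names and the sample coordinates are assumptions, not taken from the thesis.

```python
from dataclasses import dataclass

@dataclass
class TrafficDevice:
    device_id: str
    device_type: str          # e.g. "sign", "signal", "streetlight"
    lat: float                # coordinates used by the GIS layer
    lon: float
    condition: str = "good"

@dataclass
class WorkOrder:
    order_id: str
    device_id: str            # foreign key into the device inventory
    task: str
    status: str = "open"

def open_orders_for_map(devices, orders):
    """Join open work orders to device coordinates for display on a GIS layer."""
    index = {d.device_id: d for d in devices}
    return [
        (o.order_id, o.task, index[o.device_id].lat, index[o.device_id].lon)
        for o in orders
        if o.status == "open" and o.device_id in index
    ]

devices = [TrafficDevice("SGN-014", "sign", 36.17, -115.14, "faded")]
orders = [WorkOrder("WO-201", "SGN-014", "replace faded stop sign")]
print(open_orders_for_map(devices, orders))
```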

    Information Outlook, May 1997

    Volume 1, Issue 5

    Advanced Knowledge Technologies at the Midterm: Tools and Methods for the Semantic Web

    The University of Edinburgh and research sponsors are authorised to reproduce and distribute reprints and on-line copies for their purposes notwithstanding any copyright annotation hereon. The views and conclusions contained herein are the author's and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of other parties.

    In a celebrated essay on the new electronic media, Marshall McLuhan wrote in 1962: "Our private senses are not closed systems but are endlessly translated into each other in that experience which we call consciousness. Our extended senses, tools, technologies, through the ages, have been closed systems incapable of interplay or collective awareness. Now, in the electric age, the very instantaneous nature of co-existence among our technological instruments has created a crisis quite new in human history. Our extended faculties and senses now constitute a single field of experience which demands that they become collectively conscious. Our technologies, like our private senses, now demand an interplay and ratio that makes rational co-existence possible. As long as our technologies were as slow as the wheel or the alphabet or money, the fact that they were separate, closed systems was socially and psychically supportable. This is not true now when sight and sound and movement are simultaneous and global in extent." (McLuhan 1962, p. 5, emphasis in original)

    Over forty years later, the seamless interplay that McLuhan demanded between our technologies is still barely visible. McLuhan's predictions of the spread, and increased importance, of electronic media have of course been borne out, and the worlds of business, science and knowledge storage and transfer have been revolutionised. Yet the integration of electronic systems as open systems remains in its infancy.

    Advanced Knowledge Technologies (AKT) aims to address this problem, to create a view of knowledge and its management across its lifecycle, and to research and create the services and technologies that such unification will require. Half way through its six-year span, the results are beginning to come through, and this paper will explore some of the services, technologies and methodologies that have been developed. We hope to give a sense in this paper of the potential for the next three years, to discuss the insights and lessons learnt in the first phase of the project, and to articulate the challenges and issues that remain.

    The WWW provided the original context that made the AKT approach to knowledge management (KM) possible. When AKT was initially proposed in 1999, it brought together an interdisciplinary consortium with the technological breadth and complementarity to create the conditions for a unified approach to knowledge across its lifecycle. The combination of this expertise, and the time and space afforded the consortium by the IRC structure, suggested the opportunity for a concerted effort to develop an approach to advanced knowledge technologies, based on the WWW as a basic infrastructure.

    The technological context of AKT altered for the better in the short period between the development of the proposal and the beginning of the project itself, with the development of the semantic web (SW), which foresaw much more intelligent manipulation and querying of knowledge.
    The opportunities that the SW provided for, e.g., more intelligent retrieval put AKT at the centre of information technology innovation and knowledge management services; the AKT skill set would clearly be central to the exploitation of those opportunities.

    The SW, as an extension of the WWW, provides an interesting set of constraints for the knowledge management services AKT tries to provide. As a medium for the semantically-informed coordination of information, it has suggested a number of ways in which the objectives of AKT can be achieved, most obviously through the provision of knowledge management services delivered over the web, as opposed to the creation and provision of technologies to manage knowledge.

    AKT is working on the assumption that many web services will be developed and provided for users. The KM problem in the near future will be one of deciding which services are needed and of coordinating them. Many of these services will be largely or entirely legacies of the WWW, and so the capabilities of the services will vary. As well as providing useful KM services in their own right, AKT will be aiming to exploit this opportunity by reasoning over services, brokering between them, and providing essential meta-services for SW knowledge service management.

    Ontologies will be a crucial tool for the SW. The AKT consortium brings a great deal of expertise on ontologies together, and ontologies were always going to be a key part of the strategy. All kinds of knowledge sharing and transfer activities will be mediated by ontologies, and ontology management will be an important enabling task. Different applications will need to cope with inconsistent ontologies, or with the problems that will follow the automatic creation of ontologies (e.g. the merging of pre-existing ontologies to create a third). Ontology mapping, and the elimination of conflicts of reference, will be important tasks. All of these issues are discussed along with our proposed technologies.

    Similarly, specifications of tasks will be used for the deployment of knowledge services over the SW, but in general it cannot be expected that in the medium term there will be standards for task (or service) specifications. The brokering meta-services that are envisaged will have to deal with this heterogeneity.

    The emerging picture of the SW is one of great opportunity, but it will not be a well-ordered, certain or consistent environment. It will comprise many repositories of legacy data, outdated and inconsistent stores, and requirements for common understandings across divergent formalisms. There is clearly a role for standards to play in bringing much of this context together, and AKT is playing a significant role in these efforts. But standards take time to emerge, they take political power to enforce, and they have been known to stifle innovation (in the short term). AKT is keen to understand the balance between principled inference and statistical processing of web content. Logical inference on the Web is tough: complex queries using traditional AI inference methods bring most distributed computer systems to their knees. Do we set up semantically well-behaved areas of the Web? Is any part of the Web in which semantic hygiene prevails interesting enough to reason in? These and many other questions need to be addressed if we are to provide effective knowledge technologies for our content on the web.
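    As a rough illustration of the ontology-merging and conflict-of-reference problem mentioned above (not AKT's actual tooling), the sketch below unions two tiny class hierarchies and flags terms whose parent classes disagree. All class names are invented for the example.

```python
# Two tiny "ontologies" as child -> parent maps; the names are invented
# purely to illustrate merging and conflict detection.
onto_a = {"Professor": "Academic", "Academic": "Person", "Paper": "Document"}
onto_b = {"Professor": "Staff",    "Staff": "Person",    "Report": "Document"}

def merge_ontologies(a, b):
    """Union two child->parent maps, reporting terms whose parents conflict."""
    merged, conflicts = dict(a), {}
    for term, parent in b.items():
        if term in merged and merged[term] != parent:
            conflicts[term] = (merged[term], parent)  # needs mapping/alignment
        else:
            merged[term] = parent
    return merged, conflicts

merged, conflicts = merge_ontologies(onto_a, onto_b)
print("merged terms:", sorted(merged))
print("conflicts needing alignment:", conflicts)
```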

    Context Aware Computing for The Internet of Things: A Survey

    As we are moving towards the Internet of Things (IoT), the number of sensors deployed around the world is growing at a rapid pace. Market research has shown significant growth in sensor deployments over the past decade and has predicted that the growth rate will increase further in the future. These sensors continuously generate enormous amounts of data. However, in order to add value to raw sensor data we need to understand it. Collection, modelling, reasoning, and distribution of context in relation to sensor data play a critical role in this challenge. Context-aware computing has proven to be successful in understanding sensor data. In this paper, we survey context awareness from an IoT perspective. We begin by presenting the necessary background, introducing the IoT paradigm and context-aware fundamentals. We then provide an in-depth analysis of the context life cycle. Based on our own taxonomy, we evaluate a subset of 50 projects that represent the majority of the research and commercial solutions proposed in the field of context-aware computing over the last decade (2001-2011). Finally, based on our evaluation, we highlight the lessons to be learnt from the past and some possible directions for future research. The survey addresses a broad range of techniques, methods, models, functionalities, systems, applications, and middleware solutions related to context awareness and the IoT. Our goal is not only to analyse, compare and consolidate past research work but also to appreciate their findings and discuss their applicability to the IoT. Comment: IEEE Communications Surveys & Tutorials Journal, 201
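    The context life cycle the survey analyses (acquisition, modelling, reasoning, distribution) can be pictured with a toy pipeline. This is a minimal sketch under assumed names; the sensor, threshold rule, and subscriber are invented for illustration and are not taken from the paper.

```python
# Toy walk through the four context life-cycle phases discussed in the survey:
# acquire raw sensor data, model it as context, reason over it, distribute it.
raw_reading = {"sensor": "room-42-temp", "value": 31.5}        # acquisition

context = {                                                     # modelling
    "entity": "room-42",
    "attribute": "temperature",
    "value": raw_reading["value"],
    "unit": "celsius",
}

def reason(ctx):                                                # reasoning
    """Derive higher-level context from a modelled observation."""
    ctx["situation"] = "overheating" if ctx["value"] > 30 else "normal"
    return ctx

subscribers = [lambda c: print("notify HVAC:", c["situation"])] # distribution
for notify in subscribers:
    notify(reason(context))
```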