1,642 research outputs found

    Peripatetic electronic teachers in higher education

    This paper explores the idea of information and communications technology providing a medium that enables higher education teachers to act as freelance agents. The notion of a ‘Peripatetic Electronic Teacher’ (PET) is introduced to encapsulate this idea. PETs would exist as multiple telepresences (pedagogical, professional, managerial and commercial) in PET‐worlds: global networked environments that support advanced multimedia features. The central defining rationale of a pedagogical presence is described in detail, and some implications of adopting the PET‐world paradigm are discussed. The ideas described in this paper were developed by the author during a recently completed Short‐Term British Telecom Research Fellowship based at BT's Adastral Park.

    Software Agents for Electronic Marketplaces: Current and Future Research Directions

    The promise of software agents to define the structural and operational models of the virtual marketplace of the future accounts for the increased interest in applying them in areas where they can add substantial value in terms of automation and functionality. At the heart of such a marketplace rests an ontology modeling the domain, upon which a nucleus of agent-based services can be constructed. Negotiation services hold the dominant position in terms of the research attention they have received. Complementary to them, but no less important, are advising services: support functionality that is required throughout the life cycle of a deal, from the expressed intention of the two parties to eventual maturity and closure. In this paper we focus on research trends, and on their possible future development, for ontologies and the above service categories, emphasizing the role of software agents in this context. A review and analysis of past and present work helps to formulate sets of questions that future research will seek to address.
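    As a rough illustration of the negotiation services surveyed above, the sketch below implements a simple alternating-offers exchange between two agents. The BuyerAgent and SellerAgent classes and the fixed 20% concession rule are invented for illustration; the paper surveys such services but does not prescribe this (or any single) protocol.

```python
# Illustrative alternating-offers negotiation between two software agents.
# All class names and the concession strategy are hypothetical.

class BuyerAgent:
    def __init__(self, limit: float, start: float):
        self.limit = limit      # highest price the buyer will accept
        self.offer = start      # opening bid

    def counter(self, ask: float) -> float:
        # Concede 20% of the remaining gap, never exceeding the limit.
        self.offer = min(self.limit, self.offer + 0.2 * (ask - self.offer))
        return self.offer

class SellerAgent:
    def __init__(self, floor: float, start: float):
        self.floor = floor      # lowest acceptable price
        self.ask = start        # opening ask

    def counter(self, bid: float) -> float:
        self.ask = max(self.floor, self.ask - 0.2 * (self.ask - bid))
        return self.ask

def negotiate(buyer: BuyerAgent, seller: SellerAgent,
              rounds: int = 50, tol: float = 0.01):
    bid, ask = buyer.offer, seller.ask
    for _ in range(rounds):
        if ask - bid <= tol:           # offers have met: strike the deal
            return round((bid + ask) / 2, 2)
        bid = buyer.counter(ask)
        ask = seller.counter(bid)
    return None                        # no agreement within the round limit

print(negotiate(BuyerAgent(limit=100, start=60), SellerAgent(floor=80, start=140)))
```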

    Towards implementing integrated building product libraries

    Electronic product catalogues and brochures are gaining popularity, but there is little agreement on content, format and searching methods, which limits their usability and their integration with existing construction software tools. This paper examines a product-modelling approach to delivering building product information and describes a proposed multi-tier client-server environment. ISO/STEP and IAI/IFC building product models are considered to facilitate the representation, exchange and sharing of product information. The proposed architecture achieves scalability through middleware components that would provide a single point (or a few points) of entry to integrated product information. This paper is part of a research project which builds on the results of related projects, including the ConstructIT Strategy, PROCAT-GEN, Active Catalog, COMBINE and ARROW, towards implementing the required software components.
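    The middleware idea of a single point of entry to distributed catalogues could look something like the following sketch: a façade that fans one query out to several independent catalogue back-ends and merges the results. The Catalogue interface and the sample data are hypothetical, not part of the architecture described in the paper.

```python
# Sketch of the "single point of entry" middleware idea: one façade that
# queries several independent product catalogues and merges the results.
# The Catalogue protocol and the sample data are hypothetical.

from typing import Protocol

class Catalogue(Protocol):
    def search(self, term: str) -> list[dict]: ...

class InMemoryCatalogue:
    def __init__(self, products: list[dict]):
        self.products = products

    def search(self, term: str) -> list[dict]:
        return [p for p in self.products if term.lower() in p["name"].lower()]

class ProductLibraryFacade:
    """Middleware tier: a single entry point over many catalogue back-ends."""
    def __init__(self, catalogues: list[Catalogue]):
        self.catalogues = catalogues

    def search(self, term: str) -> list[dict]:
        results = []
        for cat in self.catalogues:
            results.extend(cat.search(term))
        return results

facade = ProductLibraryFacade([
    InMemoryCatalogue([{"name": "Steel door set", "supplier": "A"}]),
    InMemoryCatalogue([{"name": "Fire door, 60 min", "supplier": "B"}]),
])
print(facade.search("door"))
```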

    Transaction Streams: Definition and Implications for Trust in Internet-Based Electronic Commerce.

    In this paper we analyze how transactions related to the exchange of goods and services are performed on the Internet. The adoption of electronic markets in an industry has a disintermediation potential, because it can create a direct link between producer and consumer without the need for the intermediation of distributors. Electronic markets lower search costs, allowing customers to choose among more providers (which ultimately reduces both the costs for the customer and the profits for the producer). In this paper we contend that electronic markets on the Internet have the opposite effect, resulting in an increase in the number of intermediaries. We introduce transaction streams, which model how transactions are conducted and help explain the types of new intermediaries appearing on the Internet. We also describe mechanisms by which companies are exploring ways of extending transaction streams. To illustrate the model and validate our findings, we analyze transaction streams in the insurance industry and review associated concepts such as trust and brands. Keywords: transactions; electronic markets.
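    One way to picture a transaction stream is as an ordered sequence of transaction stages, each of which may be served by a different intermediary. The sketch below is a loose rendering under that assumption; the stage names, the TransactionStream class and the insurance example are illustrative, not the paper's formal definition.

```python
# Hypothetical rendering of a "transaction stream": an ordered sequence of
# stages, each of which may be served by a different intermediary.

from dataclasses import dataclass, field

@dataclass
class Stage:
    name: str
    intermediary: str | None = None   # who performs this stage, if anyone

@dataclass
class TransactionStream:
    product: str
    stages: list[Stage] = field(default_factory=list)

    def intermediaries(self) -> list[str]:
        return [s.intermediary for s in self.stages if s.intermediary]

insurance = TransactionStream("car insurance", [
    Stage("search",      intermediary="comparison site"),
    Stage("negotiation", intermediary="online broker"),
    Stage("payment",     intermediary="payment gateway"),
    Stage("settlement"),               # handled directly by the insurer
])
print(insurance.intermediaries())
```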

    A Generic Network and System Management Framework

    Networks and distributed systems have formed the basis of an ongoing communications revolution that has led to the genesis of a wide variety of services. The constantly increasing size and complexity of these systems does not come without problems: in some organisations, the deployment of Information Technology has reached a state where the benefits of downsizing and rightsizing by adding new services are undermined by the effort required to keep the system running. Management of networks and distributed systems in general has a straightforward goal: to provide a productive environment in which work can be performed effectively, with the work required for management being a small fraction of the total effort. Most IT systems are still managed in an ad hoc style, without any carefully elaborated plan; in such an environment the success of management decisions depends entirely on the qualifications and knowledge of the administrator. The thesis provides an analysis of the state of the art in Network and System Management and identifies the key requirements that must be addressed for the provision of Integrated Management Services, including the integration of the different management-related aspects (i.e. the integration of heterogeneous Network, System and Service Management). The thesis then proposes a new framework, INSMware, for the provision of Management Services, which provides a fundamental basis for the realisation of a new approach to Network and System Management. It is argued that Management Systems can be derived from a set of pre-fabricated and reusable Building Blocks that break the required functionality up into a number of separate entities, rather than being developed from scratch. A high-level logical model is proposed to accommodate the range of requirements and environments applicable to Integrated Network and System Management, and this model can be used as a reference model. A development methodology is introduced that reflects the principles of the proposed approach and provides guidelines to structure the analysis, design and implementation phases of a management system. The INSMware approach can further be combined with the componentware paradigm for the implementation of the management system. Based on these principles, a prototype for the management of SNMP systems has been implemented using industry-standard middleware technologies. It is argued that developing a management system on componentware principles offers a number of benefits: INSMware components may be re-used, and system solutions become more modular and thereby easier to construct and maintain.
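    A minimal sketch of the building-block idea, assuming a hypothetical Component interface: a management function such as threshold monitoring is packaged as a reusable component and wired to its data source and alarm sink at assembly time. In a real INSMware-style deployment the fetch callback would wrap an SNMP operation rather than the stub used here.

```python
# Sketch of the building-block idea behind INSMware: a management system
# assembled from small reusable components rather than written from scratch.
# The Component interface and ThresholdMonitor are hypothetical.

from abc import ABC, abstractmethod
from typing import Callable

class Component(ABC):
    """A reusable management building block with a single entry point."""
    @abstractmethod
    def run(self) -> None: ...

class ThresholdMonitor(Component):
    def __init__(self, fetch: Callable[[], float], limit: float,
                 alarm: Callable[[str], None]):
        self.fetch, self.limit, self.alarm = fetch, limit, alarm

    def run(self) -> None:
        value = self.fetch()           # an SNMP GET in a real system
        if value > self.limit:
            self.alarm(f"value {value} exceeds limit {self.limit}")

def console_alarm(message: str) -> None:
    print("ALARM:", message)

# Assemble the system from building blocks instead of coding a monolith.
monitor = ThresholdMonitor(fetch=lambda: 0.93, limit=0.9, alarm=console_alarm)
monitor.run()
```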

    Architecture for implementing IFC-based online construction product libraries

    Construction product information providers have responded to the demand for electronic delivery by providing online access, CD-ROMs and DVDs, but these solutions have limited usability and are generally incapable of supporting prevalent and emerging industry practices. Existing product library implementations attempt to replicate the functionality of the paper versions, which serve independent specification and procurement but give little thought to the integration of teams and tools through support for automated information exchange and sharing. The IFC standard provides the common terminology, technology, syntax and semantics necessary to address present and future compatibility and integration issues; hence IFC-based implementations of product libraries have good prospects of meeting the industry's requirements. This paper reviews current product information delivery methods and examines the applicability of the IFC and other standards. The requirements for IFC-based construction product libraries are identified, and an architecture for realising these requirements is presented.
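    A toy sketch of one benefit claimed for IFC-based libraries: entries tagged with shared IFC entity names (here the real IFC class IfcDoor) can be queried uniformly across suppliers. The flat dictionaries stand in for the much richer IFC schema and are invented for illustration.

```python
# Simplified sketch of typing library entries with IFC entity names so that
# different suppliers' catalogues can be searched uniformly.

LIBRARY = [
    {"ifc_class": "IfcDoor",   "name": "FD60 fire door",  "supplier": "A"},
    {"ifc_class": "IfcWindow", "name": "Double casement", "supplier": "B"},
    {"ifc_class": "IfcDoor",   "name": "Steel door set",  "supplier": "B"},
]

def by_ifc_class(ifc_class: str) -> list[dict]:
    """Query across suppliers by shared IFC terminology, not ad hoc labels."""
    return [item for item in LIBRARY if item["ifc_class"] == ifc_class]

print(by_ifc_class("IfcDoor"))
```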

    Development of an integrated product information management system

    This thesis reports on a research project, undertaken over a four-year period, investigating and developing a software framework and application for integrating and managing building product information for construction engineering. The research involved extensive literature review, observation of industry practices, and interviews with construction industry practitioners and systems implementers to determine how best to represent and present product information to support the construction process. Applicable product models for information representation were reviewed and evaluated to determine their present suitability; the IFC product model was found to be the most applicable. Investigation of technologies supporting the product model led to the development of a software tool, the IFC Assembly Viewer, which aided further investigation into the suitability of the product model (in its current state) for the exchange and sharing of product information. A software framework, or reusable software design and application, called the PROduct Information Management System (PROMIS), was developed based on a non-standard product model, but with the flexibility to work with the IFC product model when it is sufficiently mature. The software comprises three subsystems, namely ProductWeb, ModelManager.NET and Product/Project Service (or P2Service). The key features of the system are shared project databases, parametric product specification, integration of product information sources, and application interaction and integration through interface components. PROMIS was applied to and tested with a modular construction business for the management of product information and for the integration of product and project information through the design and construction (production) process.
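    The parametric product specification feature might be sketched as follows: a product template carries default parameters that a project overrides when producing a concrete specification. The ProductTemplate class and the wall-panel example are invented for illustration and are not taken from PROMIS.

```python
# Hedged sketch of "parametric product specification": a product template
# whose final form is resolved from per-project parameter overrides.

from dataclasses import dataclass

@dataclass
class ProductTemplate:
    name: str
    parameters: dict[str, float]       # defaults, overridable per project

    def specify(self, **overrides: float) -> dict:
        unknown = set(overrides) - set(self.parameters)
        if unknown:
            raise ValueError(f"unknown parameters: {unknown}")
        return {"product": self.name, **self.parameters, **overrides}

panel = ProductTemplate("modular wall panel",
                        {"width_mm": 1200.0, "height_mm": 2700.0})
print(panel.specify(width_mm=900.0))   # project-specific instance
```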

    Advanced Knowledge Technologies at the Midterm: Tools and Methods for the Semantic Web

    The University of Edinburgh and research sponsors are authorised to reproduce and distribute reprints and on-line copies for their purposes, notwithstanding any copyright annotation hereon. The views and conclusions contained herein are the author's and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of other parties.

    In a celebrated essay on the new electronic media, Marshall McLuhan wrote in 1962: "Our private senses are not closed systems but are endlessly translated into each other in that experience which we call consciousness. Our extended senses, tools, technologies, through the ages, have been closed systems incapable of interplay or collective awareness. Now, in the electric age, the very instantaneous nature of co-existence among our technological instruments has created a crisis quite new in human history. Our extended faculties and senses now constitute a single field of experience which demands that they become collectively conscious. Our technologies, like our private senses, now demand an interplay and ratio that makes rational co-existence possible. As long as our technologies were as slow as the wheel or the alphabet or money, the fact that they were separate, closed systems was socially and psychically supportable. This is not true now when sight and sound and movement are simultaneous and global in extent." (McLuhan 1962, p.5, emphasis in original)

    Over forty years later, the seamless interplay that McLuhan demanded between our technologies is still barely visible. McLuhan's predictions of the spread, and increased importance, of electronic media have of course been borne out, and the worlds of business, science and knowledge storage and transfer have been revolutionised. Yet the integration of electronic systems as open systems remains in its infancy.

    Advanced Knowledge Technologies (AKT) aims to address this problem: to create a view of knowledge and its management across its lifecycle, and to research and create the services and technologies that such unification will require. Halfway through its six-year span, the results are beginning to come through, and this paper explores some of the services, technologies and methodologies that have been developed. We hope to give a sense of the potential for the next three years, to discuss the insights and lessons learnt in the first phase of the project, and to articulate the challenges and issues that remain.

    The WWW provided the original context that made the AKT approach to knowledge management (KM) possible. When AKT was proposed in 1999, it brought together an interdisciplinary consortium with the technological breadth and complementarity to create the conditions for a unified approach to knowledge across its lifecycle. The combination of this expertise, and the time and space afforded the consortium by the IRC structure, suggested the opportunity for a concerted effort to develop an approach to advanced knowledge technologies based on the WWW as a basic infrastructure.

    The technological context of AKT altered for the better in the short period between the development of the proposal and the beginning of the project itself, with the development of the semantic web (SW), which foresaw much more intelligent manipulation and querying of knowledge. The opportunities that the SW provided for, e.g., more intelligent retrieval put AKT at the centre of information technology innovation and knowledge management services; the AKT skill set would clearly be central to the exploitation of those opportunities.

    The SW, as an extension of the WWW, provides an interesting set of constraints on the knowledge management services AKT tries to provide. As a medium for the semantically-informed coordination of information, it has suggested a number of ways in which the objectives of AKT can be achieved, most obviously through the provision of knowledge management services delivered over the web, as opposed to the creation and provision of technologies to manage knowledge.

    AKT is working on the assumption that many web services will be developed and provided for users. The KM problem in the near future will be one of deciding which services are needed and of coordinating them. Many of these services will be largely or entirely legacies of the WWW, and so their capabilities will vary. As well as providing useful KM services in their own right, AKT aims to exploit this opportunity by reasoning over services, brokering between them, and providing essential meta-services for SW knowledge service management.

    Ontologies will be a crucial tool for the SW. The AKT consortium brings together considerable expertise on ontologies, and ontologies were always going to be a key part of the strategy. All kinds of knowledge sharing and transfer activities will be mediated by ontologies, and ontology management will be an important enabling task. Different applications will need to cope with inconsistent ontologies, or with the problems that will follow the automatic creation of ontologies (e.g. the merging of pre-existing ontologies to create a third). Ontology mapping, and the elimination of conflicts of reference, will be important tasks. All of these issues are discussed, along with our proposed technologies.

    Similarly, specifications of tasks will be used for the deployment of knowledge services over the SW, but in general it cannot be expected that there will be standards for task (or service) specifications in the medium term. The brokering meta-services that are envisaged will have to deal with this heterogeneity.

    The emerging picture of the SW is one of great opportunity, but it will not be a well-ordered, certain or consistent environment. It will comprise many repositories of legacy data, outdated and inconsistent stores, and requirements for common understandings across divergent formalisms. There is clearly a role for standards to play in bringing much of this context together, and AKT is playing a significant part in these efforts. But standards take time to emerge, they take political power to enforce, and they have been known to stifle innovation (in the short term). AKT is keen to understand the balance between principled inference and statistical processing of web content. Logical inference on the Web is tough: complex queries using traditional AI inference methods bring most distributed computer systems to their knees. Do we set up semantically well-behaved areas of the Web? Is any part of the Web in which semantic hygiene prevails interesting enough to reason in? These and many other questions need to be addressed if we are to provide effective knowledge technologies for our content on the web.
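    To make the brokering meta-service idea concrete, here is a toy sketch in which a broker matches a requested task against heterogeneous service descriptions by normalising their terms through a small ontology mapping. The mapping table, the service list and the broker function are all invented; AKT's actual services are far richer than this.

```python
# Toy sketch of a brokering meta-service: match a requested task against
# heterogeneous service descriptions by mapping divergent vocabularies to
# canonical ontology terms. All names and data are hypothetical.

ONTOLOGY_MAP = {            # divergent terms -> canonical term
    "retrieve": "search",
    "lookup":   "search",
    "find":     "search",
}

SERVICES = [
    {"name": "LegacyIndex",  "task": "lookup"},
    {"name": "NewSearchAPI", "task": "search"},
    {"name": "Summariser",   "task": "summarise"},
]

def canonical(term: str) -> str:
    return ONTOLOGY_MAP.get(term, term)

def broker(requested_task: str) -> list[str]:
    """Return every service whose advertised task maps to the requested one."""
    want = canonical(requested_task)
    return [s["name"] for s in SERVICES if canonical(s["task"]) == want]

print(broker("retrieve"))   # -> ['LegacyIndex', 'NewSearchAPI']
```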