
    Verification and synthesis of asynchronous control circuits using Petri net unfoldings

    PhD Thesis. Design of asynchronous control circuits has traditionally been associated with the application of formal methods. Event-based models, such as Petri nets, provide a compact and easy-to-understand way of specifying asynchronous behaviour. However, analysis of their behavioural properties is often hindered by the exponential growth of the reachable state space. This work proposes a new method for the analysis of asynchronous circuit models based on Petri nets. The new approach is called the PN-unfolding segment. It extends and improves existing Petri net unfolding approaches. In addition, this thesis proposes a new analysis technique for Signal Transition Graphs, along with an efficient verification technique which is also based on the Petri net unfolding. The former is called the Full State Graph, the latter the STG-unfolding segment. Boolean logic synthesis is an integral part of the asynchronous circuit design process. In many cases, even if the verification of an asynchronous circuit specification has been performed successfully, it is impossible to obtain its implementation using existing methods, because they are based on reachability analysis. A new approach is proposed here for the automated synthesis of speed-independent circuits based on the STG-unfolding segment constructed during the verification of the circuit's specification. Finally, this work presents experimental results showing the need for the new Petri net unfolding techniques and confirming the advantages of applying the partial-order approach to the analysis, verification and synthesis of asynchronous circuits. Funded by The Research Committee, Newcastle University: Overseas Research Studentship Award.
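
    The state-explosion argument above can be made concrete with a small example. Below is a minimal sketch, not taken from the thesis, of a place/transition Petri net with the standard firing rule; the class and the example net are illustrative assumptions.

```python
# Minimal place/transition Petri net with the standard firing rule.
# Illustrative sketch only; the class and example net are assumptions,
# not the data structures used in the thesis.

class PetriNet:
    def __init__(self, pre, post, marking):
        self.pre = pre                # transition -> {place: tokens consumed}
        self.post = post              # transition -> {place: tokens produced}
        self.marking = dict(marking)  # place -> current token count

    def enabled(self, t):
        return all(self.marking.get(p, 0) >= n for p, n in self.pre[t].items())

    def fire(self, t):
        if not self.enabled(t):
            raise ValueError(f"transition {t} not enabled")
        for p, n in self.pre[t].items():
            self.marking[p] -= n
        for p, n in self.post[t].items():
            self.marking[p] = self.marking.get(p, 0) + n

# A tiny two-place cycle. With many such independent components composed in
# parallel, the reachable state space grows as their product -- the blow-up
# that unfolding techniques avoid by keeping concurrency implicit.
net = PetriNet(
    pre={"t1": {"p1": 1}, "t2": {"p2": 1}},
    post={"t1": {"p2": 1}, "t2": {"p1": 1}},
    marking={"p1": 1, "p2": 0},
)
net.fire("t1")
print(net.marking)  # {'p1': 0, 'p2': 1}
```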

    The theory and practice of refinement-after-hiding

    In software or hardware development, we take an abstract view of a process or system - i.e. a specification - and proceed to render it in a more implementable form. The relationship between an implementation and its specification is characterised in the context of formal verification using a notion called refinement: this notion provides a correctness condition which must be met before we can say that a particular implementation is correct with respect to a particular specification. For a notion of refinement to be useful, it should reflect the ways in which we might want to make concrete our abstract specification. In process algebras, such as those used in [28,50,63], the notion that a process Q implements or refines a process P is based on the idea that Q is more deterministic than P: this means that every behaviour of the implementation must be possible for the specification. Consider the case that we build a (specification) network from a set of (specification) component processes, where communications or interactions between these processes are hidden. The abstract behaviour which constitutes these communications or interactions may be implemented using a particular protocol, replication of communication channels to mask possible faults, or perhaps even parallel access to data structures to increase performance. These concrete behaviours will be hidden in the construction of the final implementation network, and so the correctness of the final network may be considered using standard notions of refinement. However, we cannot directly verify the correctness of component processes in the general case, precisely because we may have done more than simply increase determinism in the move from specification to implementation component. Standard (process algebraic) refinement does not, therefore, fully reflect the ways in which we may wish to move from the abstract to the concrete at the level of such components. This has implications both in terms of the state explosion problem and also in terms of verifying in isolation the correctness of a component which may be used in a number of different contexts. We therefore introduce a more powerful notion of refinement, which we shall call refinement-after-hiding: this gives us the power to approach verification compositionally even though the behaviours of an implementation component may not be contained in those of the corresponding specification, provided that the (parts of the) behaviours which are different will be hidden in the construction of the final network. We explore both the theory and practice of this new notion and also present a means for its automatic verification. Finally, we use the notion of refinement-after-hiding, along with the means of verification, to verify the correctness of an important algorithm for asynchronous communication. The nature of the verification and the results achieved are completely new and quite significant.
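
    The baseline notion the abstract starts from - Q refines P when every behaviour of Q is possible for P - can be illustrated directly. The sketch below checks bounded trace containment between two toy labelled transition systems; it is an assumed illustration of standard trace refinement, not the refinement-after-hiding relation the thesis defines.

```python
# Bounded trace-refinement check on finite labelled transition systems.
# Illustrates the standard notion only ("every behaviour of the
# implementation is possible for the specification"); it is NOT the
# refinement-after-hiding relation introduced in the thesis.

def traces(lts, state, depth):
    """All event sequences of length <= depth from `state`.
    `lts` maps state -> list of (event, next_state)."""
    result = {()}
    if depth == 0:
        return result
    for event, nxt in lts.get(state, []):
        for t in traces(lts, nxt, depth - 1):
            result.add((event,) + t)
    return result

def refines(impl, impl_init, spec, spec_init, depth=6):
    """Trace refinement: impl's traces are contained in spec's traces."""
    return traces(impl, impl_init, depth) <= traces(spec, spec_init, depth)

# The spec allows a nondeterministic choice; the impl resolves it, i.e. it
# is more deterministic, so refinement holds.
spec = {0: [("req", 1)], 1: [("ack", 0), ("nack", 0)]}
impl = {0: [("req", 1)], 1: [("ack", 0)]}
print(refines(impl, 0, spec, 0))  # True: impl only removes behaviours
```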

    Living ontologies: collaborative knowledge structuring on the Internet

    This thesis discusses the issues involved in supporting Living Ontologies: collaborating in the construction and maintenance of ontologies using the Internet. Ontologies define the concepts used in describing a domain: they are used by knowledge engineers as reusable components of knowledge-based systems. Knowledge engineers create ontologies by eliciting information from domain experts. However, experts often have different conceptualisations of a domain, and knowledge engineers often have different ways of formalising their conceptualisations. Taking a constructivist perspective, constructing ontologies from multiple conflicting conceptualisations can be seen as a design activity, in which knowledge engineers make choices according to the context in which the representation will be used. Based on this theory, a methodology for collaboratively constructing ontologies might involve comparing differing conceptualisations and using these comparisons to initiate discussion, changes to the conceptualisations and the development of criteria against which they can be evaluated. APECKS (Adaptive Presentation Environment for Collaborative Knowledge Structuring) is designed to support this methodology. APECKS aims not only to support the collaborative construction of ontologies but also to use ontologies to present information to its users adaptively within a virtual environment. It demonstrates a number of innovations over conventional ontology servers, such as prompted knowledge elicitation from domain experts, automated comparisons between ontologies, the creation of design rationales and change tracking. A small evaluation of APECKS has shown that it is usable by domain experts and that automated comparisons between ontologies can be used to initiate alterations, investigations of others' conceptualisations and as a basis for discussion. Possible future development of APECKS includes tighter integration with a virtual environment and with other networked knowledge-based tools. Further research is also needed to develop the methodology on which APECKS is based, by investigating ways of comparing, combining and discussing ontologies.
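
    As a rough illustration of the kind of automated ontology comparison described above, the sketch below diffs two toy concept hierarchies held as parent maps; the representation and function are assumptions for exposition, not APECKS's actual algorithm.

```python
# Illustrative comparison of two toy ontologies, each given as a mapping
# concept -> parent concept. A rough stand-in for the kind of automated
# comparison APECKS performs; the real algorithm is not described here.

def compare(ont_a, ont_b):
    only_a = set(ont_a) - set(ont_b)          # concepts only expert A has
    only_b = set(ont_b) - set(ont_a)          # concepts only expert B has
    disagreements = {c: (ont_a[c], ont_b[c])  # shared concepts, different parents
                     for c in set(ont_a) & set(ont_b)
                     if ont_a[c] != ont_b[c]}
    return only_a, only_b, disagreements

expert1 = {"dolphin": "mammal", "trout": "fish", "mammal": "animal"}
expert2 = {"dolphin": "fish", "trout": "fish", "fish": "animal"}

only_1, only_2, disputed = compare(expert1, expert2)
print(disputed)  # {'dolphin': ('mammal', 'fish')} -- a prompt for discussion
```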

    Optimisation of Bluetooth wireless personal area networks

    In recent years there has been a marked growth in the use of wireless cellular telephones, PCs and the Internet. This proliferation of information technology has hastened the advent of wireless networks, which aim to increase the accessibility and reach of communications devices. Ambient Intelligence (AmI) is a vision of the future of computing in which all kinds of everyday objects will contain intelligence. To be effective, AmI requires Ubiquitous Computing and Communication, the latter being enabled by wireless networking. The IEEE's 802.11 task group has developed a series of radio-based replacements for the familiar wired Ethernet LAN. At the same time another IEEE standards task group, 802.15, together with a number of industry consortia, has introduced a new level of wireless networking based upon short-range, ad-hoc connections. Currently, the most significant of these new Wireless Personal Area Network (WPAN) standards is Bluetooth, one of the first of the enabling technologies of AmI to be commercially available. Bluetooth operates in the internationally unlicensed Industrial, Scientific and Medical (ISM) band at 2.4 GHz. Unfortunately, this spectrum is particularly crowded. It is also used by WiFi (IEEE 802.11), a new WPAN standard called ZigBee, and many types of simple devices such as garage door openers, and it is polluted by unintentional radiators. The success of a radio specification for ubiquitous wireless communications is, therefore, dependent upon a robust tolerance to high levels of electromagnetic noise. This thesis addresses the optimisation of low-power WPANs in this context, with particular reference to the physical-layer radio specification of the Bluetooth system.
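
    To illustrate the coexistence problem the thesis addresses, the toy sketch below marks which of Bluetooth's 79 channels (2402 + k MHz, k = 0..78) overlap a single Wi-Fi channel and hops over the remainder; it is a simplification for exposition, not the Bluetooth hop-selection kernel.

```python
# Toy illustration of adaptive channel exclusion in the 2.4 GHz ISM band.
# Classic Bluetooth uses 79 channels at 2402 + k MHz (k = 0..78); a Wi-Fi
# channel occupies roughly a 22 MHz slice of the same band. The real
# Bluetooth hop-selection kernel is far more involved; this only shows why
# excluding interfered channels matters for coexistence.

import random

BT_CHANNELS = [2402 + k for k in range(79)]  # channel centres, MHz
WIFI_BAND = range(2401, 2424)                # approx. Wi-Fi channel 1 occupancy

good = [f for f in BT_CHANNELS if f not in WIFI_BAND]
print(f"{len(good)} of {len(BT_CHANNELS)} channels remain usable")  # 57 of 79

# Hop pseudo-randomly over the remaining channels (classic Bluetooth hops
# at up to 1600 hops per second).
hops = [random.choice(good) for _ in range(5)]
print(hops)
```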

    Proceedings of the Air Transportation Management Workshop

    The Air Transportation Management (ATM) Workshop was held 31 Jan. - 1 Feb. 1995 at NASA Ames Research Center. The purpose of the workshop was to develop an initial understanding of user concerns and requirements for future ATM capabilities and to initiate discussions of alternative means and technologies for achieving more effective ATM capabilities. The topics for the sessions were as follows: viewpoints of future ATM capabilities, user requirements, lessons learned, and technologies for ATM. In addition, two panel sessions discussed priorities for ATM and potential contributions of NASA to ATM. The proceedings contain transcriptions of all sessions.

    Abstracts on Radio Direction Finding (1899 - 1995)

    The files on this record represent the various databases that originally composed the CD-ROM issue of the "Abstracts on Radio Direction Finding" database, which is now part of the Dudley Knox Library's Abstracts and Selected Full Text Documents on Radio Direction Finding (1899 - 1995) Collection. (See Calhoun record https://calhoun.nps.edu/handle/10945/57364 for further information on this collection and the bibliography.) Due to issues of technological obsolescence preventing current and future audiences from accessing the bibliography, DKL exported and converted the various databases contained in the CD-ROM into the three files on this record. The contents of these files are: 1) RDFA_CompleteBibliography_xls.zip [RDFA_CompleteBibliography.xls: Metadata for the complete bibliography, in Excel 97-2003 Workbook format; RDFA_Glossary.xls: Glossary of terms, in Excel 97-2003 Workbook format; RDFA_Biographies.xls: Biographies of leading figures, in Excel 97-2003 Workbook format]; 2) RDFA_CompleteBibliography_csv.zip [RDFA_CompleteBibliography.TXT: Metadata for the complete bibliography, in CSV format; RDFA_Glossary.TXT: Glossary of terms, in CSV format; RDFA_Biographies.TXT: Biographies of leading figures, in CSV format]; 3) RDFA_CompleteBibliography.pdf: A human-readable display of the bibliographic data, as a means of double-checking any possible deviations due to conversion.

    An investigation into an expert system for telecommunication network design

    Many telephone companies, especially in Eastern Europe and the 'third world', are developing new telephone networks. In such situations the network design engineer needs computer-based tools that not only supplement his own knowledge but also help him to cope with situations where not all the information necessary for the design is available. Often traditional network design tools are somewhat removed from the practical world for which they were developed. They often ignore the significant uncertain and statistical nature of the input data. They use data taken from a fixed point in time to solve a time-variable problem, and the cost formulae tend to be based on an average per line or port rather than the specific case. Indeed, data is often not available or just plainly unreliable. The engineer has to rely on rules of thumb honed over many years of experience in designing networks and be able to cope with missing data. The complexity of telecommunication networks and the rarity of specialists in this area often make the network design process very difficult for a company. It is therefore an important area for the application of expert systems. Designs resulting from the use of expert systems will have a measure of uncertainty in their solution, and adequate account must be made of the risk involved in implementing their design recommendations. The thesis reviews the status of expert systems as used for telecommunication network design. It further shows that such an expert system needs to reduce a large network problem into its component parts, use different modules to solve them and then combine these results to create a total solution. It shows how the various sub-division problems are integrated to solve the general network design problem. This thesis further presents details of such an expert system and the databases necessary for network design: three new algorithms are invented for traffic analysis, node location and network design, and these produce results that have close correlation with designs taken from BT Consultancy archives. It was initially supposed that an efficient combination of existing techniques for dealing with uncertainty within expert systems would suffice as the basis of the new system. It soon became apparent, however, that to allow for the differing attributes of facts, rules and data, and the varying degrees of importance or rank within each area, a new and radically different method would be needed. Having investigated the existing uncertainty problem, it is believed that a new, more rational method has been found. The work has involved the invention of the 'Uncertainty Window' technique and its testing on various aspects of network design, including demand forecast, network dimensioning, node and link system sizing, etc., using a selection of networks that have been designed by BT Consultancy staff. From the results of the analysis, modifications to the technique have been incorporated with the aim of optimising the heuristics and procedures, so that the structure gives an accurate solution as early as possible. The essence of the process is one of associating the uncertainty windows with their relevant rules, data and facts, which results in providing the network designer with an insight into the uncertainties that have helped produce the overall system design: it indicates which sources of uncertainty and which assumptions were critical, for further investigation to improve the confidence of the overall design. The windowing technique works by virtue of its ability to retain the composition of the uncertainty and its associated values, assumptions, etc., and allows better solutions to be attained. Funded by British Telecommunications PLC.
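
    The abstract leaves the mechanics of the Uncertainty Window open; the following is a purely hypothetical sketch of the bookkeeping it describes - derived values carrying interval bounds together with the facts and assumptions that produced them - with all names and numbers invented for illustration.

```python
# Purely hypothetical sketch of the bookkeeping idea behind "uncertainty
# windows" as the abstract describes it: every derived quantity keeps the
# interval it lies in together with the assumptions that produced it, so a
# designer can trace which sources of uncertainty dominate a result. The
# thesis's actual technique is not specified here; all labels are invented.

class UWindow:
    def __init__(self, low, high, sources):
        self.low, self.high = low, high
        self.sources = list(sources)  # facts/rules/assumptions contributing

    def __add__(self, other):
        # Interval sum; the combined window remembers both sets of sources.
        return UWindow(self.low + other.low, self.high + other.high,
                       self.sources + other.sources)

    def __repr__(self):
        return f"[{self.low}, {self.high}] via {self.sources}"

# Hypothetical demand forecast = residential + business lines, each carrying
# its own provenance.
residential = UWindow(800, 1200, ["census extrapolation"])
business = UWindow(300, 500, ["consultancy rule of thumb"])
print(residential + business)
# [1100, 1700] via ['census extrapolation', 'consultancy rule of thumb']
```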

    Copyright policies of scientific publications in institutional repositories: the case of INESC TEC

    The progressive transformation of scientific practices, driven by the development of new Information and Communication Technologies (ICT), has made it possible to increase access to information, gradually moving towards an opening of the research cycle. In the long term, this opening will help resolve a difficulty that researchers have faced: the existence of barriers, whether geographical or financial, that limit the conditions of access. Although scientific production is dominated largely by big commercial publishers, and is subject to the rules they impose, the Open Access movement, whose first public declaration, the Budapest Declaration (BOAI), dates from 2002, proposes significant changes that benefit both authors and readers. This movement has gained importance in Portugal since 2003, with the establishment of the first institutional repository at the national level. Institutional repositories emerged as tools for disseminating an institution's scientific production, with the aim of opening up research results both before publication and peer review (preprint) and after (postprint), and consequently increasing the visibility of the work carried out by a researcher and their institution. The present study, based on an analysis of the copyright policies of INESC TEC's most relevant scientific publications, showed not only that publishers increasingly adopt policies that allow the self-archiving of publications in institutional repositories, but also that considerable awareness-raising work remains to be done, not only with researchers but also with the institution and society as a whole. The production of a set of recommendations, including the implementation of an institutional policy that encourages the self-archiving in the repository of publications produced within the institution, serves as a starting point for a greater appreciation of INESC TEC's scientific production.