
    Intensional Cyberforensics

    This work focuses on the application of intensional logic to cyberforensic analysis; its benefits and difficulties are compared with those of the finite-state-automata approach. This work extends the use of the intensional programming paradigm to the modeling and implementation of a cyberforensics investigation process with backtracing of event reconstruction, in which evidence is modeled by multidimensional hierarchical contexts, and proofs or disproofs of claims are undertaken in an eductive manner of evaluation. This approach is a practical, context-aware improvement over the finite-state-automata (FSA) approach seen in previous work. As a base implementation language model, we use a new dialect of the Lucid programming language, called Forensic Lucid, and we focus on defining hierarchical contexts based on intensional logic for the distributed evaluation of cyberforensic expressions. We also augment the work with credibility factors surrounding digital evidence and witness accounts, which have not been previously modeled. The Forensic Lucid programming language, used for this intensional cyberforensic analysis, is formally presented through its syntax and operational semantics. In large part, the language is based on its predecessor and codecessor Lucid dialects, such as GIPL, Indexical Lucid, Lucx, Objective Lucid, MARFL, and JOOIP, bound by the underlying intensional programming paradigm. Comment: 412 pages, 94 figures, 18 tables, 19 algorithms and listings; PhD thesis; v2 corrects some typos and refs; also available on Spectrum at http://spectrum.library.concordia.ca/977460
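
    Two of the ideas in this abstract, evidence placed in hierarchical contexts and claims scored by credibility factors, can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not Forensic Lucid syntax: the names `Observation` and `claim_credibility`, the context-as-tuple encoding, and the product-of-factors combination rule are all invented for this sketch.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Observation:
    # One item of evidence placed in a context (nested dimension -> value
    # pairs), annotated with a credibility factor in [0, 1].
    context: tuple          # e.g. (("case", "c1"), ("host", "srv2"))
    event: str
    credibility: float

def claim_credibility(observations, required_events):
    """Score a claim that all required events occurred: the product of
    the best credibility factor supporting each event, or 0.0 if some
    required event has no supporting observation at all."""
    score = 1.0
    for ev in required_events:
        support = [o for o in observations if o.event == ev]
        if not support:
            return 0.0
        score *= max(o.credibility for o in support)
    return score

evidence = [
    Observation((("case", "c1"), ("host", "srv2")), "login", 0.9),
    Observation((("case", "c1"), ("host", "srv2")), "file_deleted", 0.6),
]
print(round(claim_credibility(evidence, ["login", "file_deleted"]), 2))  # 0.54
print(claim_credibility(evidence, ["login", "reboot"]))                  # 0.0
```

    The thesis describes an eductive (demand-driven) evaluation over such contexts; this sketch only shows the credibility-weighting idea, not the evaluation model.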

    24th International Conference on Information Modelling and Knowledge Bases

    In the last three decades, information modelling and knowledge bases have become essential subjects not only in academic communities related to information systems and computer science but also in business areas where information technology is applied. The series of European–Japanese Conferences on Information Modelling and Knowledge Bases (EJC) originally started as a co-operation initiative between Japan and Finland in 1982. The practical operations were then organised by Professor Ohsuga in Japan and Professors Hannu Kangassalo and Hannu Jaakkola in Finland (Nordic countries). The geographical scope has since expanded to cover Europe and other countries. A workshop character is typical of the conference: discussion, ample time for presentations, and a limited number of participants (50) and papers (30). Suggested topics include, but are not limited to: 1. Conceptual modelling: Modelling and specification languages; Domain-specific conceptual modelling; Concepts, concept theories and ontologies; Conceptual modelling of large and heterogeneous systems; Conceptual modelling of spatial, temporal and biological data; Methods for developing, validating and communicating conceptual models. 2. Knowledge and information modelling and discovery: Knowledge discovery, knowledge representation and knowledge management; Advanced data mining and analysis methods; Conceptions of knowledge and information; Modelling information requirements; Intelligent information systems; Information recognition and information modelling. 3. Linguistic modelling: Models of HCI; Information delivery to users; Intelligent informal querying; Linguistic foundation of information and knowledge; Fuzzy linguistic models; Philosophical and linguistic foundations of conceptual models. 4. Cross-cultural communication and social computing: Cross-cultural support systems; Integration, evolution and migration of systems; Collaborative societies; Multicultural web-based software systems; Intercultural collaboration and support systems; Social computing, behavioral modeling and prediction. 5. Environmental modelling and engineering: Environmental information systems (architecture); Spatial, temporal and observational information systems; Large-scale environmental systems; Collaborative knowledge base systems; Agent concepts and conceptualisation; Hazard prediction, prevention and steering systems. 6. Multimedia data modelling and systems: Modelling multimedia information and knowledge; Content-based multimedia data management; Content-based multimedia retrieval; Privacy and context enhancing technologies; Semantics and pragmatics of multimedia data; Metadata for multimedia information systems. Overall we received 56 submissions. After careful evaluation, 16 papers were selected as long papers, 17 as short papers, 5 as position papers, and 3 for presentation of perspective challenges. We thank all colleagues for their support of this issue of the EJC conference, especially the program committee, the organising committee, and the programme coordination team. The long and short papers presented at the conference are revised after the conference and published in the series "Frontiers in Artificial Intelligence" by IOS Press (Amsterdam). The books "Information Modelling and Knowledge Bases" are edited by the Editing Committee of the conference. We believe that the conference will be productive and fruitful in advancing research and application of information modelling and knowledge bases. Bernhard Thalheim, Hannu Jaakkola, Yasushi Kiyok

    Moving Boundaries in Translation Studies

    Translation is in motion. Both translation practice and translation studies (TS) have seen considerable innovation in recent decades, and we are currently witnessing a wealth of new approaches and concepts, some of which reflect new translation phenomena, whereas others mirror new scholarly foci. Volunteer translation, crowdsourcing, virtual translator networks, transediting, and translanguaging are only some examples of practices and notions that are emerging on the scene alongside a renewed focus on well-established concepts that have traditionally been considered peripheral to the practice and study of translation: intralingual and intersemiotic translation are cases in point. At the same time, technological innovation and global developments such as the spread of English as a lingua franca are affecting wide areas of translation and, with it, translation studies. These trends are currently pushing or even crossing our traditional understandings of translation (studies) and its boundaries. The question is: how to deal with these developments? Some areas of the translation profession seem to respond by widening their borders, adding new practices such as technical writing, localisation, transcreation, or post-editing to their job portfolios, whereas others seem to be closing ranks. The same trend can be observed in the academic discipline: some branches of translation studies are eager to embrace all new developments under the TS umbrella, whereas others tend to dismiss (some of) them as irrelevant or as merely reflecting new names for age-old practices. Translation is in motion. Technological developments, digitalisation and globalisation are among the many factors affecting and changing translation and, with it, translation studies. Moving Boundaries in Translation Studies offers a bird's-eye view of recent developments and discusses their implications for the boundaries of the discipline. With 15 chapters written by leading translation scholars from around the world, the book analyses new translation phenomena, new practices and tools, new forms of organisation, new concepts and names as well as new scholarly approaches and methods. This is key reading for scholars, researchers and advanced students of translation and interpreting studies. The Open Access version of this book, available at http://www.taylorfrancis.com, has been made available under a Creative Commons Attribution-Non Commercial-No Derivatives 4.0 license.

    The Sixth Annual Workshop on Space Operations Applications and Research (SOAR 1992)

    This document contains papers presented at the Space Operations, Applications, and Research Symposium (SOAR) hosted by the U.S. Air Force (USAF) on 4-6 Aug. 1992 and held at the JSC Gilruth Recreation Center. The symposium was cosponsored by the Air Force Materiel Command and by NASA/JSC. Key technical areas covered during the symposium were robotics and telepresence, automation and intelligent systems, human factors, life sciences, and space maintenance and servicing. The SOAR differed from most other conferences in that it was concerned with Government-sponsored research and development relevant to aerospace operations. The symposium's proceedings include papers covering various disciplines presented by experts from NASA, the USAF, universities, and industry.

    Thinking According to Finance: A Critique of Financial Temporality

    There are assumptions that undergird the discipline and practice of political economy. We address this by focusing on one contested point of tension within political economy that is of great concern post-GFC. Namely, how are the presuppositions concerning the nature of temporality within the field of political economy determinative of deficient accounts of speculation, value, and even financialization more broadly? Thus, we argue that the temporal assumptions of political economy are abstractions that demand a demystifying critique. In particular, we contest linear notions of time that presume 1) a sequence of moments and 2) the idea that the future – as such – exists. We appeal to the speculative theories of Henri Bergson and Gilles Deleuze to provide both a critique of temporality and a prescriptive theoretical apparatus for constructive engagement with financial temporality. This takes place, first, through an elaboration of Bergson's conception of duration and of Deleuze's Three Syntheses of Time. Once elaborated, we, second, use our theoretical apparatus as a heuristic to critically engage three prominent political economic persuasions: the Marxian, Keynesian, and Critical Finance traditions. Each of these opens an aperture on finance that is valuable but limited. These limitations reveal how and in what ways each is a formalist project trapped within its own schematic limitations because of the way it thinks about time and finance in extensional terms, which in turn impacts how it understands the logic(s) of finance. Therefore, in order to avoid reproducing these schematic limitations, we, third, close our project by speculatively proposing a novel conception of financial temporality: what we call the 'techno-temporal logic of finance'. This concept allows us to sidestep the limitations revealed in the Marxian, Keynesian, and Critical Finance approaches, while also constructively indicating novel ways we might be able to think according to finance.

    Applications Development for the Computational Grid


    Advances in Robotics, Automation and Control

    The book presents an excellent overview of recent developments in the different areas of Robotics, Automation and Control. Through its 24 chapters, this book presents topics related to control and robot design; it also introduces new mathematical tools and techniques devoted to improving system modeling and control. An important point is the use of rational agents and heuristic techniques to cope with the computational complexity required for controlling complex systems. Through this book, we also find navigation and vision algorithms, automatic handwriting comprehension and speech recognition systems that will be included in the next generation of productive systems developed by man.