10,709 research outputs found

    Overview of the tool-flow for the Montium Processing Tile

    This paper presents an overview of a tool chain to support a transformational design methodology. The tool can be used to compile code written in a high-level source language, such as C, to a coarse-grained reconfigurable architecture. The source code is first translated into a Control Data Flow Graph (CDFG). A CDFG contains not only dataflow operations (e.g. arithmetic or logical operations on data) but also control-flow operations (e.g. operators for loop and if-then-else constructs). The CDFG is minimized using a set of behavior-preserving transformations such as dependency analysis, common sub-expression elimination, etc. After graph clustering, scheduling and allocation transformations have been applied to this minimized graph, it can be mapped onto the target architecture.
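    As an illustration of the kind of behavior-preserving minimization mentioned above, the following minimal sketch applies common sub-expression elimination to a toy dataflow graph. The node representation, the value-number table and the example expression are invented for illustration and are not taken from the Montium tool flow.

# Hypothetical sketch of common sub-expression elimination (CSE) on a
# toy dataflow graph; the node format is invented for illustration and
# is not the Montium CDFG representation.

def eliminate_common_subexpressions(nodes):
    """nodes: list of (dest, op, operands) tuples in topological order."""
    value_number = {}   # maps (op, operands) -> canonical destination
    replacement = {}    # maps eliminated destination -> canonical one
    optimized = []
    for dest, op, operands in nodes:
        # Rewrite operands that referred to an eliminated node.
        operands = tuple(replacement.get(a, a) for a in operands)
        key = (op, operands)
        if key in value_number:
            # Same operation on the same inputs: reuse the earlier result.
            replacement[dest] = value_number[key]
        else:
            value_number[key] = dest
            optimized.append((dest, op, operands))
    return optimized

# (a + b) is computed twice; the second computation is removed.
graph = [
    ("t1", "add", ("a", "b")),
    ("t2", "mul", ("t1", "c")),
    ("t3", "add", ("a", "b")),
    ("t4", "sub", ("t3", "d")),
]
print(eliminate_common_subexpressions(graph))
# [('t1', 'add', ('a', 'b')), ('t2', 'mul', ('t1', 'c')), ('t4', 'sub', ('t1', 'd'))]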

    Aircraft Conceptual Structural Design Using the AMMIT Structural Analysis Tool

    Aircraft conceptual structural design is the process of developing and refining an idea for an aircraft into a feasible structural design. The process typically involves multiple evaluations of a single configuration and can require designers to examine thousands of concepts. Standard approaches to conducting structural analyses in this phase either rely on historical or empirical data or require significant expertise in structural analysis to perform these rapid assessments. The AMMIT structural analysis tool includes structural line models and handbook methods wrapped in a simple-to-use interface, enabling rapid, physics-based structural design without requiring extensive structural expertise. The objectives of the present paper are to introduce AMMIT, describe the methods used in AMMIT, and present the results of the validation effort. Validation of the AMMIT methodology was performed on nine aircraft to determine the accuracy of the methods, highlight features of AMMIT, and guide future development of the methodology. Results of the validation effort indicated that AMMIT predicts primary structural weight for each aircraft with an acceptable level of error for the preliminary design phase and with a minimal expenditure of computational resources.
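    By way of illustration of the kind of handbook method such a tool might wrap, the sketch below sizes an idealized two-cap wing spar with the standard beam-bending relation sigma = M*c/I. The geometry, loads, allowable stress and density are invented numbers, and the routine is not taken from AMMIT itself.

# Hypothetical handbook-style sizing of an idealized two-cap wing spar;
# numbers and allowables are invented for illustration (not AMMIT data).

def size_spar_caps(bending_moment_nm, web_height_m, allowable_stress_pa):
    """Return the required area of each spar cap, in m^2.

    The caps are idealized as lumped areas a distance h/2 above and
    below the neutral axis, so sigma = M*(h/2)/I with I = 2*A*(h/2)^2,
    which reduces to A = M / (sigma_allow * h).
    """
    return bending_moment_nm / (allowable_stress_pa * web_height_m)

def cap_mass(area_m2, length_m, density_kg_m3):
    """Mass of both caps over a constant-section segment."""
    return 2.0 * area_m2 * length_m * density_kg_m3

# Example: 250 kN*m root bending moment, 0.30 m deep spar,
# aluminium with a 300 MPa allowable, 4 m constant-section segment.
area = size_spar_caps(250e3, 0.30, 300e6)
print(f"required cap area: {area * 1e6:.0f} mm^2")
print(f"cap mass estimate: {cap_mass(area, 4.0, 2700.0):.1f} kg")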

    A computational framework for aesthetical navigation in musical search space

    Paper presented at the 3rd AISB Symposium on Computational Creativity, AISB 2016, 4-6 April, Sheffield. This article addresses aspects of an ongoing project on the generation of artificial Persian(-like) music. The Liquid Persian Music (LPM) software is a cellular-automata-based audio generator. In this paper LPM is discussed from the viewpoint of the future potential of algorithmic composition and creativity. Liquid Persian Music is a creative tool, enabling exploration of emergent audio through new dimensions of music composition. Various configurations of the system produce different voices, which resemble musical motives in many respects. Aesthetic measurements are determined by Zipf's law in an evolutionary environment. Arranging these voices together to produce a musical corpus can be considered a search problem in the space of musical possibilities spanned by LPM's outputs. On this account, the issues involved in defining the search space for LPM are studied throughout this paper.
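    As a rough illustration of a Zipf-based aesthetic measure of the kind the abstract alludes to, the sketch below fits the slope of a rank-frequency distribution on a log-log scale and scores how close it is to the idealized Zipfian slope of -1. The event encoding and the scoring formula are assumptions made for illustration, not the measure implemented in LPM.

# Hypothetical Zipf-based aesthetic score for a stream of musical events
# (e.g. pitches); not the measure used in Liquid Persian Music itself.
import math
from collections import Counter

def zipf_slope(events):
    """Least-squares slope of log(frequency) versus log(rank)."""
    freqs = sorted(Counter(events).values(), reverse=True)
    xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

def zipf_fitness(events):
    """Score in (0, 1]: closer to 1 when the slope is near the ideal -1."""
    return 1.0 / (1.0 + abs(zipf_slope(events) + 1.0))

melody = ["C", "C", "C", "C", "G", "G", "E", "E", "D", "A", "F", "B"]
print(round(zipf_fitness(melody), 3))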

    Requirements engineering: a review and research agenda

    This paper reviews the area of requirements engineering. It outlines the key concerns to which attention should be devoted both by practitioners who wish to "reengineer" their development processes and by academics seeking intellectual challenges. It presents an assessment of the state of the art and draws conclusions in the form of a research agenda.

    From Autonomous to Performative Control of Timbral Spatialisation

    Timbral spatialisation is a process that requires the independent control of potentially thousands of parameters (Torchia et al., 2003). Current research on controlling timbral spatialisation has either focussed on automated generative systems or suggested that to design trajectories in software is to write every movement line by line (Normandeau, 2009). This research proposes that Wave Terrain Synthesis may be used as an effective bridging control structure for timbral spatialisation, enabling the performative control of the large numbers of parameter sets associated with the software. This methodology also allows for compact interactive mapping possibilities for a physical controller and may also be mapped effectively to gesture.
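    To make the bridging idea concrete, the sketch below shows one common reading of Wave Terrain Synthesis used as a control structure: a 2-D terrain function is sampled along a performer-driven trajectory, and successive reads are fanned out to a bank of spatialisation parameters. The terrain, trajectory and channel count are invented for illustration and are not the mapping described in this research.

# Hypothetical wave-terrain control sketch: a 2-D terrain sampled along a
# trajectory drives many spatialisation parameters at once. The terrain,
# trajectory and channel count are invented for illustration.
import math

def terrain(x, y):
    """A smooth 2-D surface; its value is used as a control signal."""
    return math.sin(2 * math.pi * x) * math.cos(2 * math.pi * y)

def trajectory(t, rate=0.25):
    """A slowly rotating path over the terrain, parameterised by time t."""
    return 0.5 + 0.4 * math.cos(rate * t), 0.5 + 0.4 * math.sin(rate * t)

def control_frame(t, n_channels=16):
    """One frame of per-channel gains derived from a single gesture value."""
    x, y = trajectory(t)
    value = terrain(x, y)                      # in [-1, 1]
    gains = []
    for ch in range(n_channels):
        # Offset the terrain read per channel so one gesture spreads
        # differently across the loudspeaker (or band) parameters.
        gains.append(0.5 * (1.0 + terrain(x + ch / n_channels, y)))
    return value, gains

val, gains = control_frame(t=1.0)
print(round(val, 3), [round(g, 2) for g in gains[:4]])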

    Futures of shipbuilding in the 22nd century: Explorative industry foresight research of the long-range futures for commercial ship-building, using elements of OpenAI.

    The shipbuilding industry has historically shaped global trade, logistics, research, and cultural globalization. It was instrumental in exploring and colonizing new continents, thereby significantly shaping our society. Today, it is essential to consider the industry's current transformations and speculate on what shipbuilding might look like in the 22nd century. This study is dedicated to exploring the possible futures of shipbuilding over a long-range time horizon of 70-100 years. The thesis applied futures research methods to data collected using OpenAI tools and explored possible transformative pathways within the industry. The research offers potential future scenarios and delineates change pathways arising from external pressures and internal shifts within the shipbuilding system. Additionally, the study highlights the possible applications and implications of utilizing OpenAI technology in a research context. The analysis of shipbuilding incorporates the Multi-Level Perspective (MLP) concept, viewing the industry as a system involving ten groups of key actors. This structure guided the data collection process for the input of the research. The primary research process adheres to traditional futures research methods, which include horizon scanning, systems thinking, scenario building, and causal layered analysis (CLA). The methodology was further expanded to incorporate AI-assisted techniques: AI technology was used for automated data collection, and a separate pathway used ChatGPT-4 to develop computer-generated scenarios and CLA narratives. The outcomes from both methodologies are compared and supplemented with additional literature research on the applicability and implications of using AI in futures studies. The research has identified critical external drivers of change, originating from fields such as technology, energy, and social development, as well as internal drivers, including biotechnology and diversifying floating structures. The external drivers could influence the future direction of shipbuilding, while the internal factors represent potential changes originating from within the industry. The constructed scenarios are designed to stimulate discussion and provide context for future developmental trajectories of shipbuilding.

    CONTREX: Design of embedded mixed-criticality CONTRol systems under consideration of EXtra-functional properties

    The increasing processing power of today's HW/SW platforms leads to the integration of more and more functions in a single device. Additional design challenges arise when these functions share computing resources and belong to different criticality levels. This paper presents the CONTREX European project and its preliminary results. CONTREX complements current activities in the area of predictable computing platforms and segregation mechanisms with techniques to consider extra-functional properties, i.e., timing constraints, power, and temperature. CONTREX enables energy-efficient and cost-aware design through analysis and optimization of these properties with regard to application demands at different criticality levels.
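    As a toy illustration of analysing timing and power together across criticality levels (not the CONTREX methodology or tool flow), the sketch below checks a small task set against a utilisation bound and a power budget; the thresholds and task parameters are invented numbers.

# Hypothetical mixed-criticality sanity check combining a timing
# (utilisation) test with a power budget; parameters are invented and
# this is not the analysis performed in CONTREX.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    criticality: str      # e.g. "HI" or "LO"
    wcet_ms: float        # worst-case execution time
    period_ms: float
    avg_power_w: float    # average power drawn while running

def check(tasks, utilisation_bound=0.69, power_budget_w=2.0):
    """Return (timing_ok, power_ok) for the given task set."""
    utilisation = sum(t.wcet_ms / t.period_ms for t in tasks)
    # Duty-cycle weighted average power of the task set.
    avg_power = sum(t.avg_power_w * t.wcet_ms / t.period_ms for t in tasks)
    return utilisation <= utilisation_bound, avg_power <= power_budget_w

tasks = [
    Task("control", "HI", wcet_ms=2.0, period_ms=10.0, avg_power_w=1.5),
    Task("logging", "LO", wcet_ms=5.0, period_ms=50.0, avg_power_w=0.8),
]
print(check(tasks))                                         # full task set
print(check([t for t in tasks if t.criticality == "HI"]))   # HI tasks only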

    Advanced Knowledge Technologies at the Midterm: Tools and Methods for the Semantic Web

    The University of Edinburgh and research sponsors are authorised to reproduce and distribute reprints and on-line copies for their purposes notwithstanding any copyright annotation hereon. The views and conclusions contained herein are the author's and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of other parties.
    In a celebrated essay on the new electronic media, Marshall McLuhan wrote in 1962: "Our private senses are not closed systems but are endlessly translated into each other in that experience which we call consciousness. Our extended senses, tools, technologies, through the ages, have been closed systems incapable of interplay or collective awareness. Now, in the electric age, the very instantaneous nature of co-existence among our technological instruments has created a crisis quite new in human history. Our extended faculties and senses now constitute a single field of experience which demands that they become collectively conscious. Our technologies, like our private senses, now demand an interplay and ratio that makes rational co-existence possible. As long as our technologies were as slow as the wheel or the alphabet or money, the fact that they were separate, closed systems was socially and psychically supportable. This is not true now when sight and sound and movement are simultaneous and global in extent." (McLuhan 1962, p. 5, emphasis in original)
    Over forty years later, the seamless interplay that McLuhan demanded between our technologies is still barely visible. McLuhan's predictions of the spread, and increased importance, of electronic media have of course been borne out, and the worlds of business, science and knowledge storage and transfer have been revolutionised. Yet the integration of electronic systems as open systems remains in its infancy.
    Advanced Knowledge Technologies (AKT) aims to address this problem, to create a view of knowledge and its management across its lifecycle, and to research and create the services and technologies that such unification will require. Half way through its six-year span, the results are beginning to come through, and this paper will explore some of the services, technologies and methodologies that have been developed. We hope to give a sense in this paper of the potential for the next three years, to discuss the insights and lessons learnt in the first phase of the project, and to articulate the challenges and issues that remain.
    The WWW provided the original context that made the AKT approach to knowledge management (KM) possible. When AKT was initially proposed in 1999, it brought together an interdisciplinary consortium with the technological breadth and complementarity to create the conditions for a unified approach to knowledge across its lifecycle. The combination of this expertise, and the time and space afforded the consortium by the IRC structure, suggested the opportunity for a concerted effort to develop an approach to advanced knowledge technologies, based on the WWW as a basic infrastructure.
    The technological context of AKT altered for the better in the short period between the development of the proposal and the beginning of the project itself with the development of the Semantic Web (SW), which foresaw much more intelligent manipulation and querying of knowledge. The opportunities that the SW provided, e.g. for more intelligent retrieval, put AKT at the centre of information technology innovation and knowledge management services; the AKT skill set would clearly be central to the exploitation of those opportunities.
    The SW, as an extension of the WWW, provides an interesting set of constraints on the knowledge management services AKT tries to provide. As a medium for the semantically-informed coordination of information, it has suggested a number of ways in which the objectives of AKT can be achieved, most obviously through the provision of knowledge management services delivered over the web as opposed to the creation and provision of technologies to manage knowledge.
    AKT is working on the assumption that many web services will be developed and provided for users. The KM problem in the near future will be one of deciding which services are needed and of coordinating them. Many of these services will be largely or entirely legacies of the WWW, and so the capabilities of the services will vary. As well as providing useful KM services in their own right, AKT will be aiming to exploit this opportunity by reasoning over services, brokering between them, and providing essential meta-services for SW knowledge service management.
    Ontologies will be a crucial tool for the SW. The AKT consortium brings a great deal of expertise on ontologies together, and ontologies were always going to be a key part of the strategy. All kinds of knowledge sharing and transfer activities will be mediated by ontologies, and ontology management will be an important enabling task. Different applications will need to cope with inconsistent ontologies, or with the problems that will follow the automatic creation of ontologies (e.g. the merging of pre-existing ontologies to create a third). Ontology mapping, and the elimination of conflicts of reference, will be important tasks. All of these issues are discussed along with our proposed technologies.
    Similarly, specifications of tasks will be used for the deployment of knowledge services over the SW, but in general it cannot be expected that in the medium term there will be standards for task (or service) specifications. The brokering meta-services that are envisaged will have to deal with this heterogeneity.
    The emerging picture of the SW is one of great opportunity, but it will not be a well-ordered, certain or consistent environment. It will comprise many repositories of legacy data, outdated and inconsistent stores, and requirements for common understandings across divergent formalisms. There is clearly a role for standards to play in bringing much of this context together, and AKT is playing a significant role in these efforts. But standards take time to emerge, they take political power to enforce, and they have been known to stifle innovation (in the short term). AKT is keen to understand the balance between principled inference and statistical processing of web content. Logical inference on the Web is tough. Complex queries using traditional AI inference methods bring most distributed computer systems to their knees. Do we set up semantically well-behaved areas of the Web? Is any part of the Web in which semantic hygiene prevails interesting enough to reason in? These and many other questions need to be addressed if we are to provide effective knowledge technologies for our content on the web.
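    The abstract's point about merging pre-existing ontologies and eliminating conflicts of reference can be illustrated with a deliberately tiny sketch: two concept hierarchies are merged, and terms whose parents disagree are flagged for mapping. The dictionary representation and the example concepts are assumptions made for illustration, not AKT's ontology tooling.

# Hypothetical sketch of merging two tiny ontologies (concept -> parent
# mappings) and flagging conflicts of reference; illustrative only and
# not the ontology machinery developed in AKT.

def merge_ontologies(onto_a, onto_b):
    """Merge two {concept: parent} hierarchies.

    Returns (merged, conflicts) where conflicts lists concepts whose
    parent differs between the sources and therefore needs mapping.
    """
    merged = dict(onto_a)
    conflicts = []
    for concept, parent in onto_b.items():
        if concept in merged and merged[concept] != parent:
            conflicts.append((concept, merged[concept], parent))
        else:
            merged[concept] = parent
    return merged, conflicts

geo = {"City": "Place", "River": "Place", "Bank": "Place"}    # river bank
finance = {"Bank": "Institution", "Loan": "Product"}          # money bank
merged, conflicts = merge_ontologies(geo, finance)
print(conflicts)   # [('Bank', 'Place', 'Institution')] -- needs mapping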

    Applying Formal Methods to Networking: Theory, Techniques and Applications

    Despite its great importance, modern network infrastructure is remarkable for the lack of rigor in its engineering. The Internet, which began as a research experiment, was never designed to handle the users and applications it hosts today. The lack of formalization of the Internet architecture meant limited abstractions and modularity, especially for the control and management planes, thus requiring a new protocol to be built from scratch for every new need. This led to an unwieldy, ossified Internet architecture resistant to any attempts at formal verification, and an Internet culture where expediency and pragmatism are favored over formal correctness. Fortunately, recent work in the space of clean-slate Internet design---especially the software-defined networking (SDN) paradigm---offers the Internet community another chance to develop the right kind of architecture and abstractions. This has also led to a great resurgence of interest in applying formal methods to the specification, verification, and synthesis of networking protocols and applications. In this paper, we present a self-contained tutorial of the formidable amount of work that has been done in formal methods, and present a survey of its applications to networking.
    Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorials
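    As a minimal flavour of the kind of property such formal approaches check (illustrative only, and far simpler than the techniques surveyed in the paper), the sketch below verifies that a set of per-switch forwarding rules for one destination is loop-free by following the rules and detecting revisited switches. The topology and rule format are invented.

# Hypothetical check that per-switch forwarding rules for one destination
# are loop-free; a toy stand-in for the formal verification techniques
# surveyed in the paper. Topology and rules are invented.

def forwards_without_loops(rules, source, destination):
    """Follow next-hop rules from source; return True iff the destination
    is reached without revisiting any switch (i.e. no forwarding loop)."""
    visited = set()
    node = source
    while node != destination:
        if node in visited or node not in rules:
            return False          # loop detected or packet black-holed
        visited.add(node)
        node = rules[node]
    return True

# Next-hop rules for traffic destined to switch D.
rules = {"A": "B", "B": "C", "C": "D"}
print(forwards_without_loops(rules, "A", "D"))   # True

buggy = {"A": "B", "B": "A", "C": "D"}           # A <-> B loop
print(forwards_without_loops(buggy, "A", "D"))   # False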

    Points of Exchange: Spatial Strategies for the Transition Towards Sustainable Urban Mobilities
