
    Proceedings of JAC 2010. Journées Automates Cellulaires

    The second Symposium on Cellular Automata “Journées Automates Cellulaires” (JAC 2010) took place in Turku, Finland, on December 15-17, 2010. The first two conference days were held in the Educarium building of the University of Turku, while the talks of the third day were given on board passenger ferries in the beautiful Turku archipelago, along the route Turku–Mariehamn–Turku. The conference was organized by FUNDIM, the Fundamentals of Computing and Discrete Mathematics research center at the mathematics department of the University of Turku. The program of the conference included 17 submitted papers that were selected by the international program committee, based on three peer reviews of each paper. These papers form the core of these proceedings. I want to thank the members of the program committee and the external referees for the excellent work they have done in choosing the papers to be presented at the conference. In addition to the submitted papers, the program of JAC 2010 included four distinguished invited speakers: Michel Coornaert (Université de Strasbourg, France), Bruno Durand (Université de Provence, Marseille, France), Dora Giammarresi (Università di Roma Tor Vergata, Italy) and Martin Kutrib (Universität Gießen, Germany). I sincerely thank the invited speakers for accepting our invitation to come and give a plenary talk at the conference. The invited talk by Bruno Durand was eventually given by his co-author Alexander Shen, and I thank him for agreeing to give the presentation on short notice. Abstracts or extended abstracts of the invited presentations appear in the first part of this volume. The program also included several informal presentations describing very recent developments and ongoing research projects. I wish to thank all the speakers for their contribution to the success of the symposium. I would also like to thank the sponsors and our collaborators: the Finnish Academy of Science and Letters, the French National Research Agency project EMC (ANR-09-BLAN-0164), Turku Centre for Computer Science, the University of Turku, and Centro Hotel. Finally, I sincerely thank the members of the local organizing committee for making the conference possible. These proceedings are published both in an electronic format and in print. The electronic proceedings are available in the electronic repository HAL, managed by several French research agencies. The printed version is published in the general publications series of TUCS, Turku Centre for Computer Science. We thank both HAL and TUCS for agreeing to publish the proceedings.

    Structured Parallel Programming with Trees

    High-level abstractions for parallel programming are still immature. Computations on complicated data structures such as pointer structures are considered irregular algorithms. General graph structures, which irregular algorithms typically deal with, are difficult to divide and conquer. Because the divide-and-conquer paradigm is essential for load balancing in parallel algorithms and a key to parallel programming, general graphs remain genuinely hard to handle. Trees, however, lead to divide-and-conquer computations by definition and are sufficiently general and powerful as a programming tool. We therefore deal with abstractions of tree-based computations. Our study started from Matsuzaki’s work on tree skeletons. We have improved the usability of tree skeletons by enriching their implementation aspects. Specifically, we have dealt with two issues. First, we implemented a loose coupling between skeletons and data structures and developed a flexible tree skeleton library. Second, we implemented a parallelizer that transforms sequential recursive functions in C into parallel programs that use tree skeletons implicitly. This parallelizer hides the complicated API of tree skeletons and lets programmers use them without extra burden. Unfortunately, the practicality of tree skeletons has not improved accordingly. On the basis of observations from this practical experience with tree skeletons, we deal with two application domains: program analysis and neighborhood computation. In the domain of program analysis, compilers treat input programs as control-flow graphs (CFGs) and perform analysis on CFGs; program analysis is therefore difficult to divide and conquer. To resolve this problem, we have developed divide-and-conquer methods for program analysis in a syntax-directed manner on the basis of Rosen’s high-level approach. Specifically, we have dealt with data-flow analysis based on Tarjan’s formalization and value-graph construction based on a functional formalization. In the domain of neighborhood computations, a primary issue is locality: a naive parallel neighborhood computation without locality enhancement causes many cache misses. The divide-and-conquer paradigm is known to be useful for locality enhancement as well. We have therefore applied algebraic formalizations and a tree-segmenting technique derived from tree skeletons to the locality enhancement of neighborhood computations.
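
    As a rough illustration of the kind of computation tree skeletons capture, the following Python sketch (not the thesis's C library) performs a divide-and-conquer reduction over a binary tree; because the two subtrees are independent, the recursive calls could be evaluated in parallel, which is exactly what a tree skeleton implementation would exploit. The node type and combining function are illustrative assumptions.

        # A minimal sketch, in Python rather than the C library described above,
        # of the divide-and-conquer "reduce" pattern over a binary tree.
        from dataclasses import dataclass
        from typing import Callable, Optional

        @dataclass
        class Node:
            value: int
            left: Optional["Node"] = None
            right: Optional["Node"] = None

        def reduce_tree(node: Optional[Node],
                        combine: Callable[[int, int, int], int],
                        leaf: int = 0) -> int:
            """Divide-and-conquer reduction: the two recursive calls are
            independent of each other and could run in parallel."""
            if node is None:
                return leaf
            left = reduce_tree(node.left, combine, leaf)
            right = reduce_tree(node.right, combine, leaf)
            return combine(left, node.value, right)

        if __name__ == "__main__":
            t = Node(1, Node(2, Node(4)), Node(3))
            print(reduce_tree(t, lambda l, v, r: l + v + r))  # 1 + 2 + 4 + 3 = 10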

    Observations on Cortical Mechanisms for Object Recognition and Learning

    This paper sketches a hypothetical cortical architecture for visual 3D object recognition based on a recent computational model. The view-centered scheme relies on modules for learning from examples, such as HyperBF-like networks. Such models capture a class of explanations we call Memory-Based Models (MBM) that contains sparse population coding, memory-based recognition, and codebooks of prototypes. Unlike the sigmoidal units of some artificial neural networks, the units of MBMs are consistent with the description of cortical neurons. We describe how an example of MBM may be realized in terms of cortical circuitry and biophysical mechanisms, consistent with psychophysical and physiological data.
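
    For concreteness, the following is a minimal sketch, assuming only NumPy, of a Gaussian radial-basis-function unit of the HyperBF-like kind such models rely on: stored prototypes serve as a memory, and the output is a weighted sum of similarities to them. The prototypes, weights, and width below are illustrative, not values from the paper.

        # A minimal Gaussian radial-basis-function ("HyperBF-like") sketch:
        # recognition as a weighted sum of similarities to stored prototypes.
        import numpy as np

        def hyperbf(x, prototypes, weights, sigma=1.0):
            """Output = sum_i w_i * exp(-||x - t_i||^2 / (2 sigma^2))."""
            d2 = np.sum((prototypes - x) ** 2, axis=1)      # squared distances to prototypes
            activations = np.exp(-d2 / (2.0 * sigma ** 2))  # Gaussian basis activations
            return float(weights @ activations)

        if __name__ == "__main__":
            prototypes = np.array([[0.0, 0.0], [1.0, 1.0]])  # two stored "views"
            weights = np.array([1.0, -1.0])
            print(hyperbf(np.array([0.1, 0.0]), prototypes, weights))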

    Digital Color Imaging

    This paper surveys current technology and research in the area of digital color imaging. In order to establish the background and lay down terminology, fundamental concepts of color perception and measurement are first presented using vector-space notation and terminology. Present-day color recording and reproduction systems are reviewed along with the common mathematical models used for representing these devices. Algorithms for processing color images for display and communication are surveyed, and a forecast of research trends is attempted. An extensive bibliography is provided.
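
    As a small example of the vector-space treatment of color that such surveys adopt, the sketch below (an illustration, not code from the paper) maps a linear-light sRGB vector to CIE XYZ with the standard D65 matrix, treating the device model as a linear map.

        # Color values as vectors, device models as linear maps:
        # linear sRGB -> CIE XYZ with the standard D65 matrix.
        import numpy as np

        RGB_TO_XYZ = np.array([
            [0.4124, 0.3576, 0.1805],
            [0.2126, 0.7152, 0.0722],
            [0.0193, 0.1192, 0.9505],
        ])

        def linear_srgb_to_xyz(rgb):
            """Map a linear-light sRGB vector (components in [0, 1]) to CIE XYZ."""
            return RGB_TO_XYZ @ np.asarray(rgb, dtype=float)

        if __name__ == "__main__":
            print(linear_srgb_to_xyz([1.0, 1.0, 1.0]))  # ~D65 white point (0.9505, 1.0, 1.089)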

    Improving the Cybersecurity of Cyber-Physical Systems Through Behavioral Game Theory and Model Checking in Practice and in Education

    This dissertation presents automated methods based on behavioral game theory and model checking to improve the cybersecurity of cyber-physical systems (CPSs), and it advocates teaching certain foundational principles of these methods to cybersecurity students. First, it encodes behavioral game theory's concept of level-k reasoning into an integer linear program that models a newly defined security Colonel Blotto game. This approach is designed to achieve an efficient allocation of scarce protection resources by anticipating attack allocations. A human subjects experiment based on a CPS infrastructure demonstrates its effectiveness. Next, it rigorously defines the term adversarial thinking, one of cybersecurity education's most important and elusive learning objectives, for which no proper definition previously existed. It spells out what it means to think like a hacker by examining the characteristic thought processes of hackers through the lens of Sternberg's triarchic theory of intelligence. Next, a classroom experiment demonstrates that teaching basic game theory concepts to cybersecurity students significantly improves their strategic reasoning abilities. Finally, this dissertation applies the SPIN model checker to an electric power protection system and demonstrates a straightforward and effective technique for rigorously characterizing the degree of fault tolerance of complex CPSs, a key step in improving their defensive posture.
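
    The following toy sketch illustrates level-k reasoning in a small Colonel Blotto game. It is not the dissertation's integer linear program: it finds a best response by brute-force enumeration, which is only feasible for tiny games, and the battlefield count, budget, and win rule are illustrative assumptions.

        # Level-k reasoning in a toy Colonel Blotto game: level-0 allocates at
        # random; a level-k player best-responds to a level-(k-1) opponent.
        from itertools import product
        import random

        FIELDS, BUDGET = 3, 5

        def allocations(budget=BUDGET, fields=FIELDS):
            """All ways to split an integer budget across the battlefields."""
            return [a for a in product(range(budget + 1), repeat=fields) if sum(a) == budget]

        def payoff(defender, attacker):
            """The defender holds a battlefield by matching or exceeding the attack there."""
            return sum(1 for d, a in zip(defender, attacker) if d >= a)

        def level_k_allocation(k, samples=200, seed=0):
            """Return a level-k defense allocation."""
            rng = random.Random(seed)
            if k == 1:
                opponents = [rng.choice(allocations()) for _ in range(samples)]  # level-0: random attacks
            else:
                opponents = [level_k_allocation(k - 1, samples, seed)]           # level-(k-1) opponent
            return max(allocations(), key=lambda d: sum(payoff(d, a) for a in opponents))

        if __name__ == "__main__":
            print("level-1 defense:", level_k_allocation(1))
            print("level-2 defense:", level_k_allocation(2))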

    Algorithmic Techniques in Gene Expression Processing. From Imputation to Visualization

    The amount of biological data has grown exponentially in recent decades. Modern biotechnologies, such as microarrays and next-generation sequencing, are capable of producing massive amounts of biomedical data in a single experiment. As the amount of data is growing rapidly, there is an urgent need for reliable computational methods for analyzing and visualizing it. This thesis addresses this need by studying how to efficiently and reliably analyze and visualize high-dimensional data, especially data obtained from gene expression microarray experiments. First, we study ways to improve the quality of microarray data by replacing (imputing) the missing data entries with estimated values. Missing value imputation is commonly used to make the original incomplete data complete, thus making it easier to analyze with statistical and computational methods. Our novel approach was to use curated external biological information as a guide for the missing value imputation. Second, we studied the effect of missing value imputation on downstream data analysis methods such as clustering. We compared multiple recent imputation algorithms on 8 publicly available microarray data sets. We observed that missing value imputation is indeed a rational way to improve the quality of biological data. The research revealed differences between the clustering results obtained with different imputation methods. On most data sets, the simple and fast k-NN imputation was good enough, but there was also a need for more advanced imputation methods, such as the Bayesian Principal Component Analysis (BPCA) algorithm. Finally, we studied the visualization of biological network data. Biological interaction networks are examples of the outcome of multiple biological experiments, such as those using gene microarray techniques. Such networks are typically very large and highly connected, so there is a need for fast algorithms that produce visually pleasing layouts. We developed a computationally efficient way to produce layouts of large biological interaction networks. The algorithm uses multilevel optimization within a standard force-directed graph layout algorithm.
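
    As an illustration of the simple baseline mentioned above, the sketch below implements basic k-NN missing value imputation on a toy expression matrix. It is not the thesis's method and does not use the curated external biological information; the data and k are illustrative.

        # Basic k-NN imputation: fill each missing entry with the average of the
        # k complete rows that are most similar on the observed columns.
        import numpy as np

        def knn_impute(X, k=2):
            """Fill NaNs in each row using the k most similar complete rows."""
            X = X.astype(float).copy()
            complete = X[~np.isnan(X).any(axis=1)]            # rows with no missing values
            for i, row in enumerate(X):
                miss = np.isnan(row)
                if not miss.any():
                    continue
                obs = ~miss
                dists = np.sqrt(((complete[:, obs] - row[obs]) ** 2).mean(axis=1))
                nearest = complete[np.argsort(dists)[:k]]     # k nearest complete rows
                X[i, miss] = nearest[:, miss].mean(axis=0)
            return X

        if __name__ == "__main__":
            expr = np.array([[1.0, 2.0, 3.0],
                             [1.1, 2.1, 2.9],
                             [0.9, np.nan, 3.1]])
            print(knn_impute(expr, k=2))  # missing entry imputed as mean(2.0, 2.1)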

    A Rapid Prototyping Tool for Embedded, Real-Time Hierarchical Control Systems

    Laboratory Virtual Instrumentation and Engineering Workbench (LabVIEW) is a graphical programming tool based on the dataflow language G. Recently, runtime support for a hard real-time environment has become available for LabVIEW, which makes it an option for embedded systems prototyping. Due to its characteristics, the environment presents itself as an ideal tool for both the design and implementation of embedded software. In this paper, we study the design and implementation of embedded software using G as the specification language and the LabVIEW RT real-time platform. One of the main advantages of this approach is that the environment lends itself to a very smooth transition from design to implementation, allowing for powerful cosimulation strategies (e.g., hardware in the loop, runtime modeling). We characterize the semantics and formal model of computation of G. We compare it to other models of computation and develop design rules and algorithms to propose sound embedded designs in the language. We investigate the specification and mapping of hierarchical control systems in LabVIEW and G. Finally, we describe the development of a state-of-the-art embedded motion control system using LabVIEW as the specification, simulation, and implementation tool, following the proposed design principles. The solution is state-of-the-art in terms of flexibility and control performance.
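
    To make the dataflow model of computation concrete, the following small sketch (an illustrative model, not LabVIEW's actual runtime) applies the usual firing rule of G-style graphical programs: a node may execute only once all of its input tokens are available, so data dependencies rather than textual order determine execution.

        # A toy dataflow interpreter: nodes fire as soon as all inputs exist.
        def run_dataflow(nodes, tokens):
            """nodes: name -> (inputs, output, function); tokens: available values."""
            pending = dict(nodes)
            while pending:
                ready = [n for n, (ins, _, _) in pending.items() if all(i in tokens for i in ins)]
                if not ready:
                    raise RuntimeError("deadlock: no node can fire")
                for name in ready:
                    ins, out, fn = pending.pop(name)
                    tokens[out] = fn(*[tokens[i] for i in ins])  # fire the node
            return tokens

        if __name__ == "__main__":
            graph = {
                "add": (["x", "y"], "s", lambda a, b: a + b),
                "scale": (["s", "gain"], "u", lambda s, g: s * g),
            }
            print(run_dataflow(graph, {"x": 1.0, "y": 2.0, "gain": 0.5}))  # u = 1.5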

    Audio-based localization for ubiquitous sensor networks

    Thesis (S.M.) -- Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2005. Includes bibliographical references (p. 97-101). This research presents novel techniques for acoustic-source location for both actively triggered and passively detected signals using pervasive, distributed networks of devices, and investigates the combination of existing resources available in personal electronics to build a digital sensing 'commons'. By connecting personal resources with those of the people nearby, tasks can be achieved, through distributed placement and statistical improvement, that a single device could not accomplish alone. The utility and benefits of spatio-temporal acoustic sensing are presented in the context of ubiquitous computing and machine listening history. An active audio self-localisation algorithm is described which is effective in distributed sensor networks even if only coarse temporal synchronisation can be established. Pseudo-noise 'chirps' are emitted and recorded at each of the nodes. Pair-wise distances are calculated by comparing the difference in the audio delays between the peaks measured in each recording. By removing dependence on fine-grained temporal synchronisation it is hoped that this technique can be used concurrently across a wide range of devices to better leverage the existing audio sensing resources that surround us. A passive acoustic source location estimation method is then derived which is suited to the microphone resources of network-connected heterogeneous devices containing asynchronous processors and uncalibrated sensors. Under these constraints, position coordinates must be determined simultaneously for pairs of sounds recorded at each microphone, forming a chain of acoustic events. It is shown that an iterative, numerical least-squares estimator can be used. Initial position estimates of the source pair can first be found from the previous estimate in the chain and a closed-form least-squares approach, improving the convergence rate of the second step. Implementations of these methods using the Smart Architectural Surfaces development platform are described and assessed. The viability of the active ranging technique is further demonstrated in a mixed-device ad-hoc sensor network case using existing off-the-shelf technology. Finally, drawing on human-centric onset detection as a means of discovering suitable sound features to be passed between nodes for comparison, the extension of the source location algorithm beyond pseudo-noise test sounds, to enable the location of extraneous noises and acoustic streams, is discussed for further study.
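
    A simplified sketch of the active ranging idea follows: both devices emit a chirp and both record, each measures the gap between the two chirp peaks in its own recording, and the difference of the two gaps cancels the unknown clock offset and emission times, so only coarse synchronization is needed. Speaker-to-microphone offsets inside each device are ignored here, and the numbers are illustrative, not results from the thesis.

        # Pairwise ranging from chirp peak times: the difference of the two
        # locally measured gaps equals 2 * distance / speed_of_sound.
        SPEED_OF_SOUND = 343.0  # metres per second at room temperature

        def pairwise_distance(gap_a, gap_b):
            """gap_a: seconds between A's own peak and B's peak in A's recording;
            gap_b: seconds between A's peak and B's own peak in B's recording."""
            return SPEED_OF_SOUND * (gap_a - gap_b) / 2.0

        if __name__ == "__main__":
            # Two devices 3.43 m apart; B chirps 0.5 s after A. A hears B's chirp
            # 0.51 s after its own, while B hears its own 0.49 s after A's.
            print(pairwise_distance(0.51, 0.49))  # ~3.43 metres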

    The synchronous languages 12 years later
