
    Design Thinking as Heterogeneous Engineering: Emerging Design Methods in Meme Warfare

    The shift of production of material artefacts to digital and online making has been greatly disruptive to material culture. Design has typically concerned itself with studying material cultures in order to develop a better understanding of the ways people go about shaping the world around them. This thesis contributes to this space by looking at an emerging form of artefact generation in digital and online making, namely, visual communication design in online information warfare. Developing an understanding of participation in this space reveals a possible trajectory for working with material culture as it increasingly becomes digital and online. Marshall McLuhan wrote in 1970 that “World War 3 is a guerrilla information war with no division between military and civilian participation” (p. 66), anticipating the ubiquitous, symmetrical capacity of users to act as both producers and consumers of information through communication technology. This space has emerged as our digital and online environment, and prominent in this environment are images with characteristics of visual communication design. It appears that the trajectory of visual communication design from the late 19th century is moving toward ubiquitous making and exchanging of visual communication, as anyone with a smartphone can make an internet meme with worldwide reach and influence.

    A Comprehensive Review of Bio-Inspired Optimization Algorithms Including Applications in Microelectronics and Nanophotonics

    The application of artificial intelligence in everyday life is becoming all-pervasive and unavoidable. Within that vast field, a special place belongs to biomimetic/bio-inspired algorithms for multiparameter optimization, which find their use in a large number of areas. Novel methods and advances are being published at an accelerated pace. Because of this, although there are many surveys and reviews in the field, they quickly become dated, and it is important to keep pace with current developments. In this review, we first consider a possible classification of bio-inspired multiparameter optimization methods, because papers dedicated to that area are relatively scarce and often contradictory. We proceed by describing in some detail the more prominent approaches, as well as the most recently published ones. Finally, we consider the use of biomimetic algorithms in two related wide fields, namely microelectronics (including circuit design optimization) and nanophotonics (including inverse design of structures such as photonic crystals, nanoplasmonic configurations and metamaterials). We have attempted to keep this broad survey self-contained so that it can be of use not only to scholars in the related fields, but also to all those interested in the latest developments in this attractive area.
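    To make the class of methods surveyed above concrete, the following is a minimal particle swarm optimization (PSO) sketch in Python, PSO being one of the canonical bio-inspired multiparameter optimizers. The sphere objective and all hyperparameter values are illustrative assumptions, not taken from the review.

    import random

    def pso(objective, dim=2, n_particles=30, iters=200,
            w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
        lo, hi = bounds
        pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]                    # each particle's best position
        pbest_val = [objective(p) for p in pos]
        g = min(range(n_particles), key=lambda i: pbest_val[i])
        gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm-wide best

        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    # inertia + cognitive pull (own best) + social pull (swarm best)
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
                val = objective(pos[i])
                if val < pbest_val[i]:
                    pbest[i], pbest_val[i] = pos[i][:], val
                    if val < gbest_val:
                        gbest, gbest_val = pos[i][:], val
        return gbest, gbest_val

    # Example: minimize the sphere function sum(x_d^2); the optimum is 0 at the origin.
    best, best_val = pso(lambda x: sum(v * v for v in x))
    print(best, best_val)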

    VLSI Design

    This book presents recent advances in the design of nanometer VLSI chips. The selected chapters present open problems and challenges, ranging from design tools, new post-silicon devices, GPU-based parallel computing, and emerging 3D integration to antenna design. The book consists of two parts, with chapters such as: VLSI design for multi-sensor smart systems on a chip, Three-dimensional integrated circuits design for thousand-core processors, Parallel symbolic analysis of large analog circuits on GPU platforms, Algorithms for CAD tools VLSI design, A multilevel memetic algorithm for large SAT-encoded problems, etc.

    Algoritmos meméticos para la resolución de problemas combinatorios de satisfacción con restricciones y con simetrías

    This work focuses on the resolution of complex optimization problems, with particular attention to the modelling and tuning of various metaheuristic techniques for solving optimization problems with symmetries. The main motivation for this research has been to present a methodology gathering the main guidelines to follow when tackling this type of problem. To that end, we have adopted an incremental, integrative approach involving the construction or application of models suited to representing the problems under study, considering different forms of representation framed in duality theory, and attempting to employ mechanisms that reduce the search landscape (that is, symmetry breaking). A collaborative scheme is employed, using different architectural models as well as hybrid evolutionary algorithms combined with different local search methods. In addition, we consider a collaborative approach among the proposed metaheuristics through the definition of communication topologies among the components participating in the scheme. The proposed approach falls within the paradigm of memetic algorithms and has been validated empirically on two highly complex combinatorial optimization problems whose search spaces are rich in symmetric states, and which have traditionally been formulated and solved by means of integer linear programming (ILP) and constraint programming (CP). To this end, an extensive analysis of the obtained results is presented in order to validate the suitability and effectiveness of the proposed metaheuristic techniques; this analysis includes a study of different cooperative architectures employing a varied number of metaheuristic and hybrid algorithms, supported by statistical methods proposed for the evaluation of this type of algorithm.
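    To illustrate the memetic paradigm the thesis works within, here is a minimal sketch in Python of its defining pattern: an evolutionary algorithm whose offspring are refined by local search. The one-max objective and all parameter values are illustrative assumptions, not the thesis's benchmark problems (one-max is trivial for the local search alone; it merely exercises the machinery).

    import random

    def local_search(bits, fitness):
        # Greedy single-bit-flip hill climbing: the "memetic" refinement step.
        best = fitness(bits)
        improved = True
        while improved:
            improved = False
            for i in range(len(bits)):
                bits[i] ^= 1
                f = fitness(bits)
                if f > best:
                    best = f
                    improved = True
                else:
                    bits[i] ^= 1              # undo the non-improving flip
        return bits

    def memetic(fitness, n_bits=40, pop_size=20, generations=50, mut=0.02):
        pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
        pop = [local_search(ind, fitness) for ind in pop]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]                  # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, n_bits)           # one-point crossover
                child = a[:cut] + b[cut:]
                for i in range(n_bits):                     # light mutation
                    if random.random() < mut:
                        child[i] ^= 1
                children.append(local_search(child, fitness))  # refine every child
            pop = parents + children
        return max(pop, key=fitness)

    best = memetic(sum)   # one-max: fitness is simply the number of 1-bits
    print(sum(best), best)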

    Evolutionary Computation 2020

    Intelligent optimization uses the mechanisms of computational intelligence to refine a suitable feature model, design an effective optimization algorithm, and thereby obtain an optimal or satisfactory solution to a complex problem. Intelligent algorithms are key tools for ensuring global optimization quality, fast optimization efficiency and robust optimization performance. Intelligent optimization algorithms have been studied by many researchers, leading to improvements in the performance of algorithms such as the evolutionary algorithm, whale optimization algorithm, differential evolution algorithm, and particle swarm optimization. Studies in this arena have also resulted in breakthroughs in solving complex problems, including the green shop scheduling problem, the severe nonlinear problem in one-dimensional geodesic electromagnetic inversion, the error- and bug-finding problem in software, the 0-1 knapsack problem, the traveling salesman problem, and the logistics distribution center siting problem. The editors are confident that this book can open a new avenue for further improvement and discoveries in the area of intelligent algorithms. The book is a valuable resource for researchers interested in understanding the principles and design of intelligent algorithms.
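    As a concrete instance of the algorithms named above, here is a minimal differential evolution (DE/rand/1/bin) sketch in Python. The Rastrigin test function and all parameter values are illustrative assumptions, not drawn from the book.

    import math
    import random

    def rastrigin(x):
        # A standard multimodal test function; global minimum 0 at the origin.
        return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)

    def differential_evolution(f, dim=5, pop_size=30, iters=300, F=0.8, CR=0.9,
                               bounds=(-5.12, 5.12)):
        lo, hi = bounds
        pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
        vals = [f(p) for p in pop]
        for _ in range(iters):
            for i in range(pop_size):
                a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
                j_rand = random.randrange(dim)      # guarantee one mutated gene
                trial = []
                for j in range(dim):
                    if random.random() < CR or j == j_rand:
                        v = pop[a][j] + F * (pop[b][j] - pop[c][j])  # rand/1 mutation
                        trial.append(min(hi, max(lo, v)))
                    else:
                        trial.append(pop[i][j])     # binomial crossover keeps parent gene
                tv = f(trial)
                if tv <= vals[i]:                   # greedy one-to-one selection
                    pop[i], vals[i] = trial, tv
        best = min(range(pop_size), key=lambda i: vals[i])
        return pop[best], vals[best]

    x, fx = differential_evolution(rastrigin)
    print(fx)   # should approach 0, the global minimum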

    La métaheuristique CAT pour le design de réseaux logistiques déterministes et stochastiques

    Nowadays, companies here and abroad face ever fiercer global competition. To survive and develop competitive advantages, they must source and sell their products on world markets. They must also simultaneously offer their customers excellent products at competitive prices, backed by impeccable service. Procurement, production and marketing activities can therefore no longer be planned and managed independently. In this context, large manufacturing companies must continually reorganize and reconfigure their logistics networks to cope with financial and environmental pressures as well as customer requirements. Everything must be reviewed and planned in an integrated fashion: supplier selection, investment choices, transportation planning, and the preparation of a value proposition often combining products and services. At the strategic level, this problem is commonly referred to as "logistics network design". An attractive approach to these complex decision problems is to formulate and solve an integer-programming model representing the problem. Several models have recently been proposed to address different categories of logistics network design decisions. However, these models are very complex and difficult to solve, and even the best solvers sometimes fail to deliver a quality solution. The work developed in this thesis makes several contributions. First, a logistics network design model incorporating several innovations recently proposed in the literature was developed; it integrates supplier selection; the location, configuration and mission assignment of the company's facilities (plants, warehouses, etc.); strategic transportation planning; and the selection of marketing and customer value-offering policies. Innovations are proposed in the modeling of inventories and in the selection of transportation options. Second, a distributed solution method inspired by the multi-agent systems paradigm was developed to solve large optimization problems involving several categories of decisions. This approach, called CAT (collaborative agent teams), divides the problem into a set of sub-problems and assigns each of them to an agent that must solve it. The solutions to these sub-problems are then combined by other agents to obtain a quality solution to the original problem. Efficient mechanisms are designed for dividing the problem, solving the sub-problems and integrating the solutions. The resulting CAT approach is used to solve the logistics network design problem in a deterministic setting. Finally, adaptations of CAT are proposed for solving logistics network design problems under uncertainty (stochastic setting).
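    The divide/solve/integrate pattern behind CAT can be illustrated with a minimal Python sketch under strong simplifying assumptions: each "agent" improves its own block of variables by random search on a separable toy objective, and an integration step merges the proposals. Neither the objective nor the sub-solver reflects the thesis's actual network design model.

    import random

    def objective(x):
        # Stand-in for a logistics network design cost function (assumption).
        return sum((v - 3.0) ** 2 for v in x)

    def agent_solve(x, block, f, tries=200):
        # One agent improves its assigned block of variables by random search.
        best = list(x)
        for _ in range(tries):
            cand = list(best)
            for j in block:
                cand[j] = best[j] + random.gauss(0.0, 0.5)
            if f(cand) < f(best):
                best = cand
        return best

    def cat(f, dim=6, n_agents=3, rounds=10):
        x = [random.uniform(-10, 10) for _ in range(dim)]
        blocks = [list(range(i, dim, n_agents)) for i in range(n_agents)]  # split variables
        for _ in range(rounds):
            proposals = [agent_solve(x, blk, f) for blk in blocks]  # agents work independently
            merged = list(x)
            for blk, prop in zip(blocks, proposals):  # integration: adopt each agent's block
                for j in blk:
                    merged[j] = prop[j]
            if f(merged) < f(x):
                x = merged
        return x, f(x)

    x, fx = cat(objective)
    print(fx)   # approaches 0 as every coordinate converges to 3.0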

    Information recovery in the biological sciences : protein structure determination by constraint satisfaction, simulation and automated image processing

    Regardless of the field of study or particular problem, any experimental science always poses the same question: "What object or phenomenon generated the data that we see, given what is known?" In the field of 2D electron crystallography, data is collected from a series of two-dimensional images, formed either as a result of diffraction-mode imaging or TEM-mode real imaging. The resulting dataset is acquired strictly in the Fourier domain, as either coupled amplitudes and phases (in TEM mode) or amplitudes alone (in diffraction mode). In either case, data is received from the microscope as a series of CCD images or scanned negatives, which generally require a significant amount of pre-processing to be useful. Traditionally, processing the large volume of data collected from the microscope was the time-limiting factor in protein structure determination by electron microscopy. Data must be initially collected from the microscope either on film negatives, which in turn must be developed and scanned, or from CCDs of sizes typically no larger than 2096x2096 (though larger models are in operation). In either case, data are finally ready for processing as 8-bit, 16-bit or (in principle) 32-bit grey-scale images. Regardless of the data source, the foundation of all crystallographic methods is the presence of a regular Fourier lattice. Two-dimensional cryo-electron microscopy of proteins introduces special challenges, as multiple crystals may be present in the same image, producing in some cases several independent lattices. Additionally, scanned negatives typically have a rectangular region marking the film number and other details of image acquisition that must be removed prior to processing. If the edges of the images are not down-tapered, vertical and horizontal "streaks" will be present in the Fourier transform of the image, arising from the high-resolution discontinuities between the opposite edges of the image. These streaks can overlap with lattice points that fall close to the vertical and horizontal axes, disrupting both the information they contain and the ability to detect them. Lastly, SpotScanning (Downing, 1991) is a commonly used process whereby circular discs are individually scanned in an image. The large-scale regularity of the scanning pattern produces a low-frequency lattice which can interfere and overlap with any protein crystal lattices. We introduce a series of methods, packaged into 2dx (Gipson, et al., 2007), which simultaneously address these problems, automatically detecting accurate crystal lattice parameters for a majority of images. Further, a template is described for the automation of all subsequent image-processing steps on the road to a fully processed dataset. The broader picture of image processing is one of reproducibility. The lattice parameters, for instance, are only one of hundreds of parameters which must be determined or provided, and subsequently stored and accessed in a regular way, during image processing. Numerous steps, from correct CTF and tilt-geometry determination to the final stages of symmetrization and optimal image recovery, must be performed sequentially and repeatedly for hundreds of images. The goal in such a project is then to automatically process as significant a portion of the data as possible and to reduce unnecessary, repetitive data entry by the user. Here also, 2dx (Gipson, et al., 2007), the image-processing package designed to automatically process individual 2D TEM images, is introduced.
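    The edge down-tapering described above can be illustrated with a minimal NumPy sketch: multiplying the image by a window that decays to zero at the borders removes the edge discontinuities that produce the axial streaks in the Fourier transform. The synthetic image, window shape and margin width are illustrative assumptions; this is not the actual 2dx implementation.

    import numpy as np

    def taper_edges(img, margin=32):
        # Cosine (Hann-style) ramp from 0 to 1 over `margin` pixels at each edge.
        h, w = img.shape
        ramp = 0.5 - 0.5 * np.cos(np.pi * np.arange(margin) / margin)
        wy = np.ones(h); wy[:margin] = ramp; wy[-margin:] = ramp[::-1]
        wx = np.ones(w); wx[:margin] = ramp; wx[-margin:] = ramp[::-1]
        # Remove the mean first so the taper does not add its own low-frequency term.
        return (img - img.mean()) * np.outer(wy, wx)

    rng = np.random.default_rng(0)
    img = rng.normal(size=(512, 512))          # stand-in for a micrograph
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(taper_edges(img))))
    print(spectrum.shape)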
    The 2dx package focuses on reliability, ease of use and automation to produce the finished results necessary for a full three-dimensional reconstruction of the protein in question. Once individual 2D images have been processed, they contribute to a larger, project-wide 3-dimensional dataset. Several challenges exist in processing this dataset, besides simply the organization of results and project-wide parameters. In particular, though tilt-geometry, relative amplitude scaling and absolute orientation are in principle known (or obtainable from an individual image), errors, uncertainties and heterogeneous data types result in a 3D dataset with many parameters to be optimized. 2dx_merge (Gipson, et al., 2007) is the follow-up to the first release of 2dx, which had originally processed only individual images. Based on the guiding principles of the earlier release, 2dx_merge focuses on ease of use and automation. The result is a fully qualified 3D structure determination package capable of turning hundreds of electron micrograph images, nearly completely automatically, into a full 3D structure. Most of the processing performed in the 2dx package is based on the excellent suite of programs collectively termed the MRC package (Crowther, et al., 1996). Extensions to this suite and alternative algorithms continue to play an essential role in image processing as computers become faster and as advancements are made in the mathematics of signal processing. In this capacity, an alternative procedure to generate a 3D structure from processed 2D images is presented. This algorithm, entitled "Projective Constraint Optimization" (PCO), leverages prior known information, such as symmetry and the fact that the protein is bound in a membrane, to extend the normal boundaries of resolution. In particular, traditional methods (Agard, 1983) make no attempt to account for the "missing cone", a vast, un-sampled region in 3D Fourier space arising from specimen tilt limitations in the microscope. Provided sufficient data, PCO simultaneously refines the dataset, accounting for error, while attempting to fill this missing cone. Though PCO provides a near-optimal 3D reconstruction based on the data, depending on initial data quality and the amount of prior knowledge there may be a host of solutions, and more importantly pseudo-solutions, which are more or less consistent with the provided dataset. Trying to find a global best fit for known information and data can be a daunting challenge mathematically; to this end, the use of meta-heuristics is addressed. Specifically, in the case of many pseudo-solutions, so long as a suitably defined error metric can be found, quasi-evolutionary swarm algorithms can be used to search the solution space, sharing data as they go. Given sufficient computational power, such algorithms can dramatically reduce the search time for global optima for a given dataset. Once the structure of a protein has been determined, many questions often remain about its function. Questions about the dynamics of a protein, for instance, are often not readily interpretable from structure alone. To this end, an investigation into computationally optimized structural dynamics is described. Here, in order to find the most likely path a protein might take through "conformation space" between two conformations, a graphics processing unit (GPU) optimized program and set of libraries is written to speed up this calculation by 30x.
The tools and methods developed here serve as a conceptual template for how GPU computing was applied to other aspects of the work presented here, as well as to GPU programming generally. The final portion of the thesis takes an apparent step in reverse, presenting a dramatic, yet highly predictive, simplification of a complex biological process. Kinetic Monte Carlo simulations idealize thousands of proteins as agents interacting through a set of simple rules (i.e. react/dissociate), offering highly accurate insights into the large-scale cooperative behavior of proteins. This work demonstrates that, for many applications, structure, dynamics or even general knowledge of a protein may not be necessary for a meaningful biological story to emerge. Additionally, even in cases where structure and function are known, such simulations can help to answer the biological question in its entirety, from structure, to dynamics, to ultimate function.
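    A minimal Gillespie-style kinetic Monte Carlo sketch, in Python, of the kind of simulation described above: proteins are reduced to counts, and stochastic react/dissociate events fire with propensities proportional to those counts. The two-reaction system (A + B <-> AB) and the rate constants are illustrative assumptions, not the thesis's model.

    import math
    import random

    def kmc(a=1000, b=1000, ab=0, k_on=1e-3, k_off=0.1, t_end=50.0):
        t = 0.0
        while t < t_end:
            p_bind = k_on * a * b        # propensity of A + B -> AB
            p_unbind = k_off * ab        # propensity of AB -> A + B
            total = p_bind + p_unbind
            if total == 0:
                break
            # Exponentially distributed waiting time to the next event.
            t += -math.log(1.0 - random.random()) / total
            if random.random() * total < p_bind:
                a, b, ab = a - 1, b - 1, ab + 1
            else:
                a, b, ab = a + 1, b + 1, ab - 1
        return a, b, ab

    print(kmc())   # counts approach the binding/unbinding equilibrium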

    Association of Architecture Schools in Australasia

    "Techniques and Technologies: Transfer and Transformation", proceedings of the 2007 AASA Conference held September 27-29, 2007, at the School of Architecture, UTS

    Advances in Artificial Intelligence: Models, Optimization, and Machine Learning

    The present book contains all the articles accepted and published in the Special Issue “Advances in Artificial Intelligence: Models, Optimization, and Machine Learning” of the MDPI Mathematics journal, which covers a wide range of topics connected to the theory and applications of artificial intelligence and its subfields. These topics include, among others, deep learning and classic machine learning algorithms, neural modelling, architectures and learning algorithms, biologically inspired optimization algorithms, algorithms for autonomous driving, probabilistic models and Bayesian reasoning, and intelligent agents and multiagent systems. We hope that the scientific results presented in this book will serve as valuable sources of documentation and inspiration for anyone wishing to pursue research in artificial intelligence, machine learning and their widespread applications.

    Sustainability in design: now! Challenges and opportunities for design research, education and practice in the XXI century

    Copyright © 2010 Greenleaf Publications. LeNS project funded by the Asia Link Programme, EuropeAid, European Commission.