
    Developing sustainability pathways for social simulation tools and services

    The use of cloud technologies to teach agent-based modelling and simulation (ABMS) is an interesting application of a nascent technological paradigm that has received very little attention in the literature. This report fills that gap and aims to help instructors, teachers and demonstrators understand why and how cloud services are appropriate solutions to common problems they face in delivering their study programmes, as well as outlining the many cloud options available. The report first introduces social simulation and considers how social simulation is taught. Following this, factors affecting the implementation of agent-based models are explored, with attention focused primarily on the modelling and execution platforms currently available, the challenges associated with implementing agent-based models, and the technical architectures that can be used to support the modelling, simulation and teaching process. This sets the context for an extended discussion of cloud computing, including service and deployment models, accessing cloud resources, the financial implications of adopting the cloud, and an introduction to the evaluation of cloud services within the context of developing, executing and teaching agent-based models.
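    To give a concrete sense of the kind of model students build and execute (locally, on a teaching server, or on a cloud-hosted notebook or virtual machine), below is a minimal agent-based model sketch in plain Python. The wealth-exchange rule and all names are illustrative assumptions added for this listing, not an example taken from the report.

```python
# Minimal agent-based model sketch: a toy wealth-exchange economy.
# Uses only the Python standard library; illustrative only, not from the report.
import random

class Agent:
    def __init__(self, wealth=1):
        self.wealth = wealth

def step(agents):
    """One tick: every agent with wealth gives one unit to a randomly chosen agent."""
    for agent in agents:
        if agent.wealth > 0:
            other = random.choice(agents)
            agent.wealth -= 1
            other.wealth += 1

def run(num_agents=100, ticks=50, seed=42):
    random.seed(seed)
    agents = [Agent() for _ in range(num_agents)]
    for _ in range(ticks):
        step(agents)
    return sorted(agent.wealth for agent in agents)

if __name__ == "__main__":
    print(run())  # wealth distribution after 50 ticks
```

    Even a toy like this illustrates the execution question the report addresses: the same script can run on a student laptop or a cloud instance, and only the parameters (number of agents, number of ticks) change the resource requirements.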

    Scalable Architecture for Integrated Batch and Streaming Analysis of Big Data

    Thesis (Ph.D.) - Indiana University, Computer Sciences, 2015. As Big Data processing problems evolve, many modern applications demonstrate special characteristics. Data exists in the form of both large historical datasets and high-speed real-time streams, and many analysis pipelines require integrated parallel batch processing and stream processing. Despite the large size of the whole dataset, most analyses focus on specific subsets according to certain criteria. Correspondingly, integrated support for efficient queries and post-query analysis is required. To address the system-level requirements brought by such characteristics, this dissertation proposes a scalable architecture for integrated queries, batch analysis, and streaming analysis of Big Data in the cloud. We verify its effectiveness using a representative application domain - social media data analysis - and tackle related research challenges emerging from each module of the architecture by integrating and extending multiple state-of-the-art Big Data storage and processing systems. In the storage layer, we reveal that existing text indexing techniques do not work well for the unique queries of social data, which put constraints on both textual content and social context. To address this issue, we propose a flexible indexing framework over NoSQL databases to support fully customizable index structures, which can embed necessary social context information for efficient queries. The batch analysis module demonstrates that analysis workflows consist of multiple algorithms with different computation and communication patterns, which are suitable for different processing frameworks. To achieve efficient workflows, we build an integrated analysis stack based on YARN and make novel use of customized indices in developing sophisticated analysis algorithms. In the streaming analysis module, the high-dimensional data representation of social media streams poses special challenges to the problem of parallel stream clustering. Due to the sparsity of the high-dimensional data, traditional synchronization methods become expensive and severely impact the scalability of the algorithm. Therefore, we design a novel strategy that broadcasts the incremental changes rather than the whole centroids of the clusters to achieve scalable parallel stream clustering algorithms. Performance tests using real applications show that our solutions for parallel data loading/indexing, queries, analysis tasks, and stream clustering all significantly outperform implementations using current state-of-the-art technologies.
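    To make the delta-broadcast strategy concrete, here is a minimal sketch assuming sparse points represented as dimension-to-value dictionaries. The class and method names (ClusterReplica, take_delta, merge_delta) are invented for illustration; this is not the dissertation's code, only the communication pattern it describes, in which workers exchange the sparsely changed per-cluster sums since the last synchronization instead of full high-dimensional centroids.

```python
# Sketch of delta broadcasting for parallel stream clustering over sparse,
# high-dimensional points (dicts of dimension -> value). Illustrative names only.
from collections import defaultdict

class ClusterReplica:
    """Local replica of one cluster's statistics: a sparse sum vector plus a count."""

    def __init__(self):
        self.sums = defaultdict(float)      # dimension -> running sum
        self.count = 0
        self._pending = defaultdict(float)  # local changes not yet broadcast
        self._pending_count = 0

    def add_point(self, point):
        """Absorb one sparse point assigned to this cluster on this worker."""
        for dim, value in point.items():
            self.sums[dim] += value
            self._pending[dim] += value
        self.count += 1
        self._pending_count += 1

    def take_delta(self):
        """Return only the changes since the last sync, then reset them.
        This sparse delta is what gets broadcast instead of the full centroid."""
        delta, count = dict(self._pending), self._pending_count
        self._pending.clear()
        self._pending_count = 0
        return delta, count

    def merge_delta(self, delta, count):
        """Apply a sparse delta received from another worker."""
        for dim, value in delta.items():
            self.sums[dim] += value
        self.count += count

    def centroid(self):
        """Current centroid as a sparse dict (sum / count per touched dimension)."""
        return {d: s / self.count for d, s in self.sums.items()} if self.count else {}
```

    Because per-dimension sums and point counts are additive, merging deltas from different workers reproduces the same centroids as exchanging full state, while message size scales with the number of dimensions actually touched rather than the full dimensionality of the stream.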

    FinBook: literary content as digital commodity

    This short essay explains the significance of the FinBook intervention, and invites the reader to participate. We have associated each chapter within this book with a financial robot (FinBot), and created a market whereby book content will be traded with financial securities. As human labour increasingly consists of unstable and uncertain work practices and as algorithms replace people on the virtual trading floors of the world's markets, we see members of society taking advantage of FinBots to invest and make extra funds. Bots of all kinds are making financial decisions for us, searching online on our behalf to help us invest, to consume products and services. Our contribution to this compilation is to turn the collection of chapters in this book into a dynamic investment portfolio, and thereby play out what might happen to the process of buying and consuming literature in the not-so-distant future. By attaching identities (through QR codes) to each chapter, we create a market in which the chapter can ‘perform’. Our FinBots will trade based on features extracted from the authors’ words in this book: the political, ethical and cultural values embedded in the work, and the extent to which the FinBots share authors’ concerns; and the performance of chapters amongst those human and non-human actors that make up the market and readership. In short, the FinBook model turns our work and the work of our co-authors into an investment portfolio, mediated by the market and the attention of readers. By creating a digital economy specifically around the content of online texts, our chapter and the FinBook platform aim to challenge the reader to consider how their personal values align them with individual articles, and how these become contested as they perform different value judgements about the financial performance of each chapter and the book as a whole. At the same time, by introducing ‘autonomous’ trading bots, we also explore the different ‘network’ affordances of paper-based books, whose scarcity arises from their analogue form, and digital books, whose uniqueness is achieved through encryption. We thereby speak to wider questions about the conditions of an aggressive market in which algorithms subject cultural and intellectual items – books – to economic parameters, and the increasing ubiquity of data bots as actors in our social, political, economic and cultural lives. We understand that our marketization of literature may be an uncomfortable juxtaposition against the conventionally-imagined way a book is created, enjoyed and shared: it is intended to be

    Association of Architecture Schools in Australasia

    "Techniques and Technologies: Transfer and Transformation", proceedings of the 2007 AASA Conference held September 27-29, 2007, at the School of Architecture, UTS

    Optimization of bio-inspired algorithms on heterogeneous CPU-GPU systems

    The scientific challenges of the 21st century require the processing and analysis of a vast amount of information in what is known as the Big Data era. Future advances in different sectors of society, such as medicine, engineering or efficient energy production, to mention just a few examples, depend on continued growth in the computational power of modern computers. However, this computational growth, traditionally guided by the well-known "Moore's Law", has been compromised in recent decades, mainly due to the physical limitations of silicon. Computer architects have developed numerous contributions (multicore, manycore, heterogeneity, dark silicon, etc.) to try to mitigate this computational slowdown, leaving in the background other factors fundamental to problem solving, such as programmability, reliability and precision. Software development, however, has followed the opposite path, where ease of programming through abstraction models, automatic code debugging to avoid undesired effects, and rapid deployment to production are key to the economic viability and efficiency of the digital business sector. This route often compromises the performance of the applications themselves, a consequence that is entirely unacceptable in a scientific context. The starting hypothesis of this doctoral thesis is to reduce the distance between the hardware and software fields in order to contribute to solving the scientific challenges of the 21st century. Hardware development is marked by the consolidation of processors oriented towards massive data parallelism, mainly GPUs (Graphics Processing Units) and vector processors, which are combined to build heterogeneous processors or computers (HSA). Specifically, we focus on the use of GPUs to accelerate scientific applications. GPUs have established themselves as one of the platforms with the greatest potential for implementing algorithms that simulate complex scientific problems. Since their origin, the trajectory and history of graphics cards have been shaped by the world of video games, reaching very high levels of popularity as greater realism was achieved in that area. An important milestone occurred in 2006, when NVIDIA (the leading manufacturer of graphics cards) carved out a place in high-performance computing and in the research world with the development of CUDA ("Compute Unified Device Architecture"). This architecture makes it possible to use the GPU for the development of scientific applications in a versatile way. Despite the importance of the GPU, considerable improvement can be obtained by using it together with the CPU, which leads us to introduce heterogeneous systems, as the title of this work indicates. It is in heterogeneous CPU-GPU environments that performance reaches its peak, since it is not only GPUs that support researchers' scientific computing: it is in a heterogeneous system combining different types of processors that we can achieve the highest performance. In this environment the processors are not meant to compete with each other; on the contrary, each architecture specializes in the part where it can best exploit its capabilities.
    The highest performance is reached in heterogeneous clusters, where multiple interconnected nodes may differ not only in their CPU-GPU architectures but also in the computational capabilities within those architectures. With this type of scenario in mind, new challenges arise in making the chosen software run as efficiently as possible and obtain the best possible results. These new platforms require a redesign of the software to make the most of the available computational resources. Existing algorithms must therefore be redesigned and optimized so that contributions in this field are relevant, and we must find algorithms that, by their very nature, are candidates for optimal execution on such high-performance platforms. At this point we find a family of so-called bio-inspired algorithms, which use collective intelligence as the core of their problem solving. It is precisely this collective intelligence that makes them perfect candidates for implementation on these platforms under the new parallel computing paradigm, since solutions can be built from individuals that, through some form of communication, jointly construct a common solution. This thesis focuses in particular on one of these bio-inspired algorithms, which falls under the term metaheuristics within the Soft Computing paradigm: Ant Colony Optimization (ACO). The algorithm is contextualized, studied and analysed. Its most critical parts are identified and redesigned in search of optimization and parallelization, while maintaining or improving the quality of its solutions. The resulting alternatives are then implemented and tested on various high-performance platforms. The knowledge acquired in the preceding theoretical and practical study is applied to real cases, more specifically to protein folding. In this work we bring together new high-performance hardware platforms and the software redesign and implementation of a bio-inspired algorithm applied to a scientific problem of great complexity, protein folding. When implementing a solution to a real problem, it is necessary to carry out a preliminary study that allows the problem to be understood in depth, since anyone new to the field will encounter unfamiliar terminology and issues; in this case, amino acids, molecules and simulation models that are unknown to individuals without a biomedical background.
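    For readers unfamiliar with the algorithm, below is a minimal Ant Colony Optimization sketch on a toy distance matrix, written in plain NumPy. It shows the tour-construction and pheromone-update steps that a GPU implementation would parallelize; the parameter values and structure are illustrative assumptions, not the thesis's CUDA code.

```python
# Minimal Ant Colony Optimization sketch on a toy distance matrix (plain NumPy).
# Illustrative only: parameters and structure are assumptions, not the thesis's GPU code.
import numpy as np

def aco(dist, n_ants=20, n_iters=100, alpha=1.0, beta=2.0, rho=0.5, seed=0):
    rng = np.random.default_rng(seed)
    n = len(dist)
    pheromone = np.ones((n, n))
    heuristic = 1.0 / (dist + np.eye(n))    # eye avoids division by zero on the diagonal
    best_tour, best_len = None, np.inf

    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):             # each ant builds its tour independently
            tour = [int(rng.integers(n))]
            unvisited = set(range(n)) - {tour[0]}
            while unvisited:
                i = tour[-1]
                cand = np.array(sorted(unvisited))
                weights = pheromone[i, cand] ** alpha * heuristic[i, cand] ** beta
                nxt = int(rng.choice(cand, p=weights / weights.sum()))
                tour.append(nxt)
                unvisited.remove(nxt)
            tours.append(tour)

        pheromone *= (1.0 - rho)            # evaporation
        for tour in tours:                  # deposit pheromone along each tour
            length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
            if length < best_len:
                best_tour, best_len = tour, length
            for k in range(n):
                pheromone[tour[k], tour[(k + 1) % n]] += 1.0 / length
    return best_tour, best_len

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    d = rng.random((10, 10))
    d = (d + d.T) / 2.0
    np.fill_diagonal(d, 0.0)
    print(aco(d, n_iters=50))
```

    Each ant constructs its tour independently, which is the data-parallel structure that maps naturally onto GPU threads; the pheromone update at the end of each iteration is the synchronization point between them.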

    Annual Report of the University, 2001-2002, Volumes 1-4

    VITAL ACADEMIC CLIMATE* by Brian Foster, Provost/Vice President of Academic Affairs: A great university engages students and faculty fully in important ideas and issues ... not just to learn about them, but to take them apart and put them back together, to debate, deconstruct, resist, reconstruct and build upon them. Engagement of this sort takes concentration and commitment, and it produces the kind of discipline and passion that leads to student and faculty success and satisfaction in their studies, research, performance, artistic activity and service. It is also the kind of activity that creates a solid, nurturing spirit of community. This is what we mean when we talk about a vital academic climate. We are striving for an environment that will enrich the social, cultural and intellectual lives of all who come in contact with the University. Many things interconnect to make this happen: curriculum, co-curricular activities, conferences, symposia, cultural events, community service, research and social activity. Our goal is to create the highest possible level of academic commitment and excitement at UNM. This is what characterizes a truly great university. (*Strategic Direction 2) New Mexico native Andres C. Salazar, who holds a Ph.D. in electrical engineering from Michigan State University, has been named the PNM Chair in Microsystems, Commercialization and Technology. Carrying the title of professor, the PNM Chair is a joint appointment between the School of Engineering and the Anderson Schools of Management. Spring 2002 graduate John Probasco was selected a 2002 Rhodes Scholar, the second UNM student to be so honored in the past four years. The biochemistry major from Alamogordo previously had been awarded the Goldwater Scholarship and the Truman Scholarship. Biology student Sophie Peterson of Albuquerque was one of 30 students nationwide to receive a 2002-2003 Award of Excellence from Phi Kappa Phi, the oldest and largest national honor society. Regents' Professor of Communication and Journalism Everett M. Rogers was selected the University's 47th Annual Research Lecturer, the highest honor UNM bestows upon members of its faculty. New Mexico resident, author and poet Simon J. Ortiz received an Honorary Doctorate of Letters at Spring Commencement ceremonies. Child advocate Angela "Angie" Vachio, founder and executive director of Peanut Butter and Jelly Family Services, Inc., was awarded an Honorary Doctorate of Humane Letters. American Studies Assistant Professor Amanda J. Cobb won the 22nd annual American Book Award for Listening to Our Grandmothers' Stories: The Bloomfield Academy for Chickasaw Females, 1852-1949.

    The Proceedings of the Fourth International Conference of the Association of Architecture Schools of Australasia

    The Proceedings of the Fourth International Conference of the Association of Architecture Schools of Australasia. Each paper in the Proceedings has been double refereed by members of an independent panel of academic peers appointed by the Conference Committee. Papers were matched, where possible, to referees in the same field and with similar interests to the authors.

    Communicating the Unspeakable: Linguistic Phenomena in the Psychedelic Sphere

    Psychedelics can enable a broad and paradoxical spectrum of linguistic phenomena, from the unspeakability of mystical experience to the eloquence of the songs of the shaman or curandera. Interior dialogues with the Other, whether framed as the voice of the Logos, an alien download, or communion with ancestors and spirits, are relatively common. Sentient visual languages are encountered, their forms unrelated to the representation of speech in natural language writing systems. This thesis constructs a theoretical model of linguistic phenomena encountered in the psychedelic sphere for the field of altered states of consciousness research (ASCR). The model is developed from a neurophenomenological perspective, especially the work of Francisco Varela, and Michael Winkelman’s work in shamanistic ASC, which in turn builds on the biogenetic structuralism of Charles Laughlin, John McManus, and Eugene d’Aquili. Neurophenomenology relates the physical and functional organization of the brain to the subjective reports of lived experience in altered states as mutually informative, without reducing consciousness to one or the other. Consciousness is seen as a dynamic multistate process of the recursive interaction of biology and culture, thereby navigating the traditional dichotomies of objective/subjective, body/mind, and inner/outer realities that problematically characterize much of the discourse in consciousness studies. The theoretical work of Renaissance scholar Stephen Farmer on the evolution of syncretic and correlative systems and their relation to neurobiological structures provides a further framework for the exegesis of the descriptions of linguistic phenomena in first-person texts of long-term psychedelic self-exploration. After the classification of most psychedelics as Schedule I drugs, legal research came to a halt; self-experimentation as research did not. Scientists such as Timothy Leary and John Lilly became outlaw scientists, a social aspect of the “unspeakability” of these experiences. Academic ASCR has largely side-stepped examination of the extensive literature of psychedelic self-exploration. This thesis examines aspects of both form and content from these works, focusing on those that treat linguistic phenomena, and asking what these linguistic experiences can tell us about how the psychedelic landscape is constructed, how it can be navigated, interpreted, and communicated within its own experiential field, and communicated about to make the data accessible to inter-subjective comparison and validation. The methodological core of this practice-based research is a technoetic practice as defined by artist and theoretician Roy Ascott: the exploration of consciousness through interactive, artistic, and psychoactive technologies. The iterative process of psychedelic self-exploration and creation of interactive software defines my own technoetic practice and is the means by which I examine my states of consciousness employing the multidimensional visual language Glide.

    The development of computer science: a sociocultural perspective
