12 research outputs found

    Entropy of Bit-Stuffing-Induced Measures for Two-Dimensional Checkerboard Constraints

    Coding for Two Dimensional Constrained Fields

    Block Pickard Models for Two-Dimensional Constraints

    Generalized Belief Propagation for the Noiseless Capacity and Information Rates of Run-Length Limited Constraints

    The performance of the generalized belief propagation algorithm for computing the noiseless capacity and mutual information rates of finite-size two-dimensional and three-dimensional run-length limited constraints is investigated. For each constraint, a method is proposed for choosing the basic regions and constructing the region graph. Simulation results are reported for the capacity of different constraints as a function of the size of the channel and for the mutual information rates of different constraints as a function of signal-to-noise ratio. Convergence to the Shannon capacity is also discussed. Comment: 8 pages, 11 figures
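    As a point of reference for what "capacity as a function of the size of the channel" means, the finite-size noiseless capacity of a two-dimensional constraint can be computed on very small arrays by exhaustive counting rather than by generalized belief propagation. The sketch below is a minimal brute-force check, not the paper's GBP method, and assumes the hard-square (1,∞) run-length limited constraint (no two adjacent 1s) as an example.

    ```python
    import itertools
    import math

    def is_valid_hard_square(grid, rows, cols):
        """Check the hard-square / (1, inf) RLL constraint:
        no two 1s may be adjacent horizontally or vertically."""
        for r in range(rows):
            for c in range(cols):
                if grid[r][c]:
                    if c + 1 < cols and grid[r][c + 1]:
                        return False
                    if r + 1 < rows and grid[r + 1][c]:
                        return False
        return True

    def finite_capacity(rows, cols):
        """Noiseless capacity of the rows x cols hard-square constraint:
        log2(number of admissible arrays) / (rows * cols).
        Brute force, so only feasible for rows * cols up to ~20."""
        count = 0
        for bits in itertools.product((0, 1), repeat=rows * cols):
            grid = [bits[r * cols:(r + 1) * cols] for r in range(rows)]
            if is_valid_hard_square(grid, rows, cols):
                count += 1
        return math.log2(count) / (rows * cols)

    if __name__ == "__main__":
        # The finite-size values decrease toward the infinite-lattice
        # hard-square capacity of roughly 0.5879 bits per symbol.
        for n in (2, 3, 4):
            print(n, finite_capacity(n, n))
    ```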

    Slice-Level Trading of Quality and Performance in Decoding H.264 Video

    When a demanding video decoding task requires more CPU resources than are available, playback degrades ungracefully today: the decoder skips frames selected arbitrarily or by simple heuristics, which the viewer notices as jerky motion in the good case or as images breaking up completely in the bad case. The latter can happen due to missing reference frames. This thesis provides a way to schedule individual decoding tasks based on a cost-for-performance trade. To that end, I present a way to preprocess a video, generating estimates of the cost in terms of execution time and the performance in terms of perceived visual quality. The granularity of the scheduling decision is a single slice, which leads to a much more fine-grained approach than dealing with entire frames. Together with an actual scheduler implementation that uses the generated estimates, this work allows for higher perceived video quality during playback in case of CPU overload.
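    A minimal sketch of the cost-for-performance trade described above, assuming per-slice estimates of decoding time and perceived-quality contribution are already available from a preprocessing step; the Slice record and the greedy benefit-per-millisecond heuristic are illustrative stand-ins, not the scheduler actually implemented in the thesis.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Slice:
        frame: int          # frame the slice belongs to
        cost_ms: float      # estimated decoding time in milliseconds
        benefit: float      # estimated contribution to perceived quality

    def schedule_slices(slices, budget_ms):
        """Greedy selection under a CPU-time budget: decode the slices
        with the best benefit per millisecond first, skip the rest.
        Illustrative only; a real scheduler must also respect reference
        dependencies, since skipping a reference slice corrupts the
        frames predicted from it."""
        ranked = sorted(slices, key=lambda s: s.benefit / s.cost_ms, reverse=True)
        chosen, spent = [], 0.0
        for s in ranked:
            if spent + s.cost_ms <= budget_ms:
                chosen.append(s)
                spent += s.cost_ms
        return chosen

    # Example: three slices competing for a 10 ms budget.
    slices = [Slice(0, 6.0, 0.9), Slice(0, 5.0, 0.4), Slice(1, 4.0, 0.5)]
    print([s.frame for s in schedule_slices(slices, budget_ms=10.0)])
    ```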

    Law and Policy for the Quantum Age

    Law and Policy for the Quantum Age is for readers interested in the political and business strategies underlying quantum sensing, computing, and communication. This work explains how these quantum technologies work, describes the future national defense and legal landscapes for nations interested in strategic advantage, and outlines paths to profit for companies.

    Obiter Dicta

    "Stitched together over five years of journaling, Obiter Dicta is a commonplace book of freewheeling explorations representing the transcription of a dozen notebooks, since painstakingly reimagined for publication. Organized after Theodor Adorno’s Minima Moralia, this unschooled exercise in aesthetic thought—gleefully dilettantish, oftentimes dangerously close to the epigrammatic—interrogates an array of subject matter (although inescapably circling back to the curiously resemblant histories of Western visual art and instrumental music) through the lens of drive-by speculation. Erick Verran’s approach to philosophical inquiry follows the brute-force literary technique of Jacques Derrida to exhaustively favor the material grammar of a signifier over hand-me-down meaning, juxtaposing outer semblances with their buried systems and our etched-in-stone intuitions about color and illusion, shape and value, with lessons stolen from seemingly unrelatable disciplines. Interlarded with extracts of Ludwig Wittgenstein but also Wallace Stevens, Cormac McCarthy as well as Roland Barthes, this cache of incidental remarks eschews what’s granular for the biggest picture available, leaving below the hyper-specialized fields of academia for a bird’s-eye view of their crop circles. Obiter Dicta is an unapologetic experiment in intellectual dot-connecting that challenges much long-standing wisdom about everything from illuminated manuscripts to Minecraft and the evolution of European music with lyrical brevity; that is, before jumping to the next topic.

    Personality Identification from Social Media Using Deep Learning: A Review

    Social media helps in sharing ideas and information among people scattered around the world and thus helps create communities, groups, and virtual networks. Identification of personality is significant in many types of applications, such as detecting the mental state or character of a person, predicting job satisfaction or professional and personal relationship success, and in recommendation systems. Personality is also an important factor in determining individual variation in thoughts, feelings, and conduct. According to the 2018 Global social media research survey, there are approximately 3.196 billion social media users worldwide. This number is estimated to grow rapidly with the use of mobile smart devices and advances in technology. Support vector machines (SVM), Naive Bayes (NB), multilayer perceptron neural networks, and convolutional neural networks (CNN) are some of the machine learning techniques used for personality identification in the literature. This paper presents various studies conducted on identifying the personality of social media users with the help of machine learning approaches, and reviews recent studies that aimed to predict the personality of online social media (OSM) users.
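    As an illustration of the classical end of the techniques listed above, the sketch below trains a linear SVM on TF-IDF features of user posts. It assumes scikit-learn is available; the posts and the extrovert/introvert labels are invented toy data standing in for a real labelled personality corpus.

    ```python
    # A minimal sketch, assuming scikit-learn is installed; the posts and
    # labels below are invented stand-ins for a real labelled corpus.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    posts = [
        "Had an amazing night out with friends, can't wait for the next party!",
        "Spent the weekend reading alone, exactly how I like it.",
        "Organising a huge meetup, everyone is invited!",
        "Quiet evening, just me and my journal.",
    ]
    labels = ["extrovert", "introvert", "extrovert", "introvert"]  # toy trait labels

    # TF-IDF features of the raw text feed a linear SVM, one of the
    # classical baselines named in the review (SVM, NB, MLP, CNN).
    model = make_pipeline(TfidfVectorizer(), LinearSVC())
    model.fit(posts, labels)

    print(model.predict(["Another loud concert with the whole crew tonight!"]))
    ```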

    Bibliography of Lewis Research Center technical publications announced in 1984

    This compilation of abstracts describes and indexes the technical reporting that resulted from the scientific and engineering work performed and managed by the Lewis Research Center in 1984. All the publications were announced in the 1984 issues of STAR (Scientific and Technical Aerospace Reports) and/or IAA (International Aerospace Abstracts). Included are research reports, journal articles, conference presentations, patents and patent applications, and theses.

    Systematic Approaches for Telemedicine and Data Coordination for COVID-19 in Baja California, Mexico

    Conference proceedings info: ICICT 2023: 2023 The 6th International Conference on Information and Computer Technologies, Raleigh, HI, United States, March 24-26, 2023, Pages 529-542. https://doi.org/10.1007/978-981-99-3236-
    We provide a model for the systematic implementation of telemedicine within a large evaluation center for COVID-19 in the area of Baja California, Mexico. Our model is based on human-centric design factors and cross-disciplinary collaborations for scalable, data-driven enablement of smartphone, cellular, and video teleconsultation technologies to link hospitals, clinics, and emergency medical services for point-of-care assessments of COVID testing and for subsequent treatment and quarantine decisions. A multidisciplinary team was rapidly created in cooperation with different institutions, including the Autonomous University of Baja California, the Ministry of Health, the Command, Communication and Computer Control Center of the Ministry of the State of Baja California (C4), Colleges of Medicine, and the College of Psychologists. Our objective is to provide information to the public, to evaluate COVID-19 in real time, and to track regional, municipal, and state-wide data in real time that informs supply chains and resource allocation in anticipation of a surge in COVID-19 cases.