
    On The Foundations of Digital Games

    Computers have led to a revolution in the games we play and, following this, an interest in computer-based games has been sparked in research communities. However, this easily leads to the perception of a one-way direction of influence between the field of game research and computer science. This historical investigation points towards a deep and intertwined relationship between research on games and the development of computers, giving a richer picture of both fields. In doing so, an overview of early game research is presented and an argument is made that the distinction between digital games and non-digital games may be counter-productive to game research as a whole.

    Finiteness and Dual Variables for Lorentzian Spin Foam Models

    We describe here some new results concerning the Lorentzian Barrett-Crane model, a well-known spin foam formulation of quantum gravity. Generalizing an existing finiteness result, we provide a concise proof of finiteness of the partition function associated to all non-degenerate triangulations of 4-manifolds and for a class of degenerate triangulations not previously shown to be finite. This is accomplished by a suitable re-factoring and re-ordering of integration, through which a large set of variables can be eliminated. The resulting formulation can be interpreted as a "dual variables" model that uses hyperboloid variables associated to spin foam edges in place of representation variables associated to faces. We outline how this method may also be useful for numerical computations, which have so far proven to be very challenging for Lorentzian spin foam models. Comment: 15 pages, 1 figure
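    To fix ideas, a spin foam partition function over a triangulation \Delta has the generic form (conventions and the precise amplitude factors vary between presentations; the expressions below are only a schematic illustration, not the paper's definitions)

        Z(\Delta) \;=\; \int_0^{\infty} \Big( \prod_{f} \rho_f^{2}\, d\rho_f \Big) \prod_{e} A_e\big(\{\rho_f\}_{f \supset e}\big) \prod_{v} A_v\big(\{\rho_f\}_{f \supset v}\big),

    with continuous representation labels \rho_f attached to the faces f. The "dual variables" rewriting instead integrates over points x_e on the upper hyperboloid H^3, one per spin foam edge e, after which the face labels can be integrated out, schematically

        Z(\Delta) \;=\; \int_{(H^3)^{E}} \Big( \prod_{e} dx_e \Big) \prod_{f} \mathcal{K}_f\big(\{x_e\}_{e \subset \partial f}\big),

    so that one integration variable per face is traded for one per edge.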

    Ultra-pure digital sideband separation at sub-millimeter wavelengths

    Deep spectral-line surveys in the mm and sub-mm range can detect thousands of lines per band, uncovering the rich chemistry of molecular clouds, star-forming regions and circumstellar envelopes, among other objects. The ability to study the faintest features of a spectroscopic observation is, nevertheless, limited by a number of factors. The most important are the source complexity (line density), limited spectral resolution and an insufficient sideband rejection ratio (SRR). Dual Sideband (2SB) millimeter receivers separate the upper and lower sidebands, rejecting the unwanted image by about 15 dB, but they are difficult to build and, until now, only feasible up to about 500 GHz (equivalent to ALMA Band 8). For example, ALMA Bands 9 (602-720 GHz) and 10 (787-950 GHz) currently use Double Sideband (DSB) receivers. Aims: This article reports the implementation of an ALMA Band 9 2SB prototype receiver that makes use of a new technique called calibrated digital sideband separation. The new method promises to ease the manufacturing of 2SB receivers, dramatically increase sideband rejection and allow 2SB instruments at the high frequencies currently covered only by DSB or bolometric detectors. Methods: We made use of a Field Programmable Gate Array (FPGA) and fast Analog-to-Digital Converters (ADCs) to measure and calibrate the receiver's front-end phase and amplitude imbalances, achieving sideband separation beyond the possibilities of purely analog receivers. The technique could in principle allow the operation of 2SB receivers even when only imbalanced front ends can be built, particularly at very high frequencies. Results: This digital 2SB receiver shows an average sideband rejection of 45.9 dB, while small portions of the band drop below 40 dB. The performance is 27 dB (a factor of 500) better than the average performance of the proof-of-concept Band 9 purely analog 2SB prototype receiver. Comment: 5 pages
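    The calibration idea can be pictured with a short numerical sketch. The per-channel imbalance model, channel count and noise level below are illustrative assumptions for a toy two-output front end, not the receiver or firmware described in the article: a test tone in the image sideband yields, for every digital channel, the complex coefficient that nulls that sideband, and applying those coefficients recovers far better rejection than the fixed ideal-hybrid combination.

    # Toy numpy sketch of calibrated digital sideband separation (2SB).
    # Illustrative assumptions throughout: 64 channels, 5% / 5 deg imbalances,
    # a simple linear front-end model; not the authors' FPGA implementation.
    import numpy as np

    rng = np.random.default_rng(1)
    n_ch = 64  # hypothetical number of digital FFT channels across the IF band

    def imbalanced(nominal, amp_err=0.05, ph_err_deg=5.0):
        """Nominal complex gain perturbed by random per-channel amplitude/phase errors."""
        amp = 1.0 + amp_err * rng.standard_normal(n_ch)
        ph = np.deg2rad(ph_err_deg) * rng.standard_normal(n_ch)
        return nominal * amp * np.exp(1j * ph)

    # Response of the two analog IF outputs to the upper (U) and lower (L) sidebands:
    # if1 = a1*U + b1*L,  if2 = a2*U + b2*L  (ideal values a1 = b1 = 1, a2 = -j, b2 = +j).
    a1, b1 = imbalanced(1.0), imbalanced(1.0)
    a2, b2 = imbalanced(-1j), imbalanced(+1j)

    def front_end(U, L):
        return a1 * U + b1 * L, a2 * U + b2 * L

    # Calibration: inject a test tone in the LSB only; small measurement noise keeps it realistic.
    t1, t2 = front_end(U=np.zeros(n_ch), L=np.ones(n_ch))
    noise = 1e-3 * (rng.standard_normal((2, n_ch)) + 1j * rng.standard_normal((2, n_ch)))
    c_usb = -(t1 + noise[0]) / (t2 + noise[1])  # per channel, if1 + c_usb*if2 cancels the LSB

    # Evaluate rejection: response to a unit tone in the wanted (USB) and image (LSB) sidebands.
    s1, s2 = front_end(U=np.ones(n_ch), L=np.zeros(n_ch))
    i1, i2 = front_end(U=np.zeros(n_ch), L=np.ones(n_ch))

    def rejection_db(sig, img):
        return 10 * np.log10(np.mean(np.abs(sig) ** 2) / np.mean(np.abs(img) ** 2))

    print("analog (ideal hybrid) SRR : %.1f dB" % rejection_db(s1 + 1j * s2, i1 + 1j * i2))
    print("digitally calibrated SRR  : %.1f dB" % rejection_db(s1 + c_usb * s2, i1 + c_usb * i2))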

    Metastability-Containing Circuits

    In digital circuits, metastability can cause deteriorated signals that are neither logical 0 nor logical 1, breaking the abstraction of Boolean logic. Unfortunately, any way of reading a signal from an unsynchronized clock domain or performing an analog-to-digital conversion incurs the risk of a metastable upset; no digital circuit can deterministically avoid, resolve, or detect metastability (Marino, 1981). Synchronizers, the only traditional countermeasure, exponentially decrease the odds of maintained metastability over time. Trading synchronization delay for an increased probability of resolving metastability to logical 0 or 1, they do not guarantee success. We propose a fundamentally different approach: it is possible to contain metastability by fine-grained logical masking so that it cannot infect the entire circuit. This technique guarantees a limited degree of metastability in, and uncertainty about, the output. At the heart of our approach lies a time- and value-discrete model for metastability in synchronous clocked digital circuits. Metastability is propagated in a worst-case fashion, allowing deterministic guarantees to be derived without synchronizers, and of a kind that synchronizers cannot provide. The proposed model permits positive results and passes the test of reproducing Marino's impossibility results. We fully classify which functions can be computed by circuits with standard registers. Regarding masking registers, we show that they become computationally strictly more powerful with each clock cycle, resulting in a non-trivial hierarchy of computable functions.
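    The worst-case propagation model can be illustrated with a toy three-valued simulation, where M stands for a potentially metastable signal. The gate set and encoding below are a minimal sketch in the spirit of the abstract, not the paper's formal model; the point is that a stable controlling input masks M, while a naive gate-level multiplexer fails to mask a metastable select line even when both data inputs agree.

    # Toy three-valued (0, 1, M) model of worst-case metastability propagation.
    # The gate set and encoding are illustrative assumptions, not the paper's formal model.
    M = "M"  # a potentially metastable / unresolved signal value

    def nand(a, b):
        if a == 0 or b == 0:
            return 1      # a stable 0 masks metastability on the other input
        if a == M or b == M:
            return M      # worst case: metastability may propagate
        return 0          # both inputs are stable 1

    def not_(a):
        return nand(a, a)

    def and_(a, b):
        return not_(nand(a, b))

    def or_(a, b):
        return nand(not_(a), not_(b))

    def mux(sel, a, b):
        """Naive gate-level 2:1 multiplexer built from the gates above."""
        return or_(and_(not_(sel), a), and_(sel, b))

    if __name__ == "__main__":
        print(and_(0, M))    # 0: the stable 0 masks the metastable input
        print(or_(1, M))     # 1: the stable 1 masks the metastable input
        print(and_(1, M))    # M: metastability propagates in the worst case
        print(mux(M, 1, 1))  # M: the naive mux does not mask a metastable select,
                             #    even though both data inputs agree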

    Fusing Censored Dependent Data for Distributed Detection

    In this paper, we consider a distributed detection problem for a censoring sensor network where each sensor's communication rate is significantly reduced by transmitting only "informative" observations to the Fusion Center (FC) and censoring those deemed "uninformative". While the independence of data from censoring sensors is often assumed in previous research, we explore spatial dependence among observations. Our focus is on designing the fusion rule under the Neyman-Pearson (NP) framework that takes into account the spatial dependence among observations. Two transmission scenarios are considered: one where uncensored observations are transmitted directly to the FC, and a second where they are first quantized and then transmitted to further improve transmission efficiency. Copula-based Generalized Likelihood Ratio Tests (GLRTs) for censored data are proposed for both the continuous and the discrete messages received at the FC under the two transmission strategies. We address the computational issues of the copula-based GLRTs, which involve multidimensional integrals, by presenting more efficient fusion rules based on the key idea of injecting controlled noise at the FC before fusion. Although the signal-to-noise ratio (SNR) is reduced by introducing controlled noise at the receiver, simulation results demonstrate that the resulting noise-aided fusion approach performs very closely to the exact copula-based GLRTs. By exploiting the spatial dependence, the copula-based GLRTs and their noise-aided counterparts greatly improve detection performance compared with the fusion rule derived under the independence assumption.
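    As a concrete picture of how a copula couples the sensors' marginal likelihoods into a joint one, here is a minimal sketch of a Gaussian-copula GLRT for two dependent sensors. The Gaussian copula, Gaussian marginals, grid search and parameter values are illustrative assumptions, not the fusion rules derived in the paper (which also handle censored and quantized data).

    # Minimal sketch of a copula-based likelihood and GLRT for two dependent sensors.
    # The Gaussian copula, Gaussian marginals, and grid search below are illustrative
    # assumptions, not the specific fusion rules of the paper.
    import numpy as np
    from scipy import stats

    def gaussian_copula_logpdf(u, rho):
        """Log-density of a bivariate Gaussian copula with correlation rho at points u (n x 2)."""
        z = stats.norm.ppf(u)
        det = 1.0 - rho ** 2
        quad = (rho ** 2 * (z[:, 0] ** 2 + z[:, 1] ** 2) - 2 * rho * z[:, 0] * z[:, 1]) / det
        return -0.5 * np.log(det) - 0.5 * quad

    def log_likelihood(x, mean, rho):
        """Joint log-likelihood: Gaussian marginals coupled by a Gaussian copula."""
        marg = stats.norm.logpdf(x, loc=mean).sum(axis=1)
        u = stats.norm.cdf(x, loc=mean)
        return marg + gaussian_copula_logpdf(u, rho)

    def glrt(x, rho_grid=np.linspace(-0.9, 0.9, 19), mean_grid=np.linspace(0.0, 2.0, 21)):
        """GLRT statistic: maximize over the unknown mean and dependence under H1, rho under H0."""
        ll0 = max(log_likelihood(x, 0.0, r).sum() for r in rho_grid)
        ll1 = max(log_likelihood(x, m, r).sum() for r in rho_grid for m in mean_grid if m > 0)
        return ll1 - ll0

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Correlated sensor observations: shifted mean under H1, zero mean under H0.
        cov = np.array([[1.0, 0.5], [0.5, 1.0]])
        x_h1 = rng.multivariate_normal([1.0, 1.0], cov, size=200)
        x_h0 = rng.multivariate_normal([0.0, 0.0], cov, size=200)
        print("GLRT under H1: %.1f, under H0: %.1f" % (glrt(x_h1), glrt(x_h0)))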

    Spartan Daily, April 2, 1996

    Volume 106, Issue 41

    Formal Analysis of Linear Control Systems using Theorem Proving

    Control systems are an integral part of almost every engineering and physical system, and thus their accurate analysis is of utmost importance. Traditionally, control systems are analyzed using paper-and-pencil proofs and computer simulation methods; however, neither of these methods can provide accurate analysis due to their inherent limitations. Model checking has been widely used to analyze control systems, but the continuous nature of their environment and physical components cannot be truly captured by a state-transition system in this technique. To overcome these limitations, we propose to use higher-order-logic theorem proving for analyzing linear control systems based on a formalized theory of the Laplace transform method. For this purpose, we have formalized the foundations of linear control system analysis in higher-order logic so that a linear control system can be readily modeled and analyzed. The paper presents a new formalization of the Laplace transform and the formal verification of its properties that are frequently used in transfer-function-based analysis to judge the frequency response, gain margin, phase margin, and stability of a linear control system. We also formalize the active realizations of various controllers, like Proportional-Integral-Derivative (PID), Proportional-Integral (PI), Proportional-Derivative (PD), and various active and passive compensators, like lead, lag and lag-lead. For illustration, we present a formal analysis of an unmanned free-swimming submersible vehicle using the HOL Light theorem prover. Comment: International Conference on Formal Engineering Methods
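    The central definitions such a formalization revolves around are the classical ones, stated here in ordinary mathematical notation (the HOL Light definitions themselves are not reproduced):

        \mathcal{L}\{f\}(s) \;=\; \int_{0}^{\infty} f(t)\, e^{-st}\, dt,

    together with properties such as linearity, \mathcal{L}\{a f + b g\} = a\,\mathcal{L}\{f\} + b\,\mathcal{L}\{g\}, and differentiation, \mathcal{L}\{f'\}(s) = s\,\mathcal{L}\{f\}(s) - f(0). From these one obtains transfer functions such as that of a PID controller,

        C(s) \;=\; K_p + \frac{K_i}{s} + K_d\, s,

    and the closed-loop transfer function C(s)G(s) / \big(1 + C(s)G(s)\big) of a plant G(s) under unity feedback, whose poles determine stability and whose frequency response yields the gain and phase margins.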

    Fracture of disordered solids in compression as a critical phenomenon: I. Statistical mechanics formalism

    This is the first of a series of three articles that treats fracture localization as a critical phenomenon. This first article establishes a statistical mechanics based on ensemble averages when fluctuations through time play no role in defining the ensemble. Ensembles are obtained by dividing a huge rock sample into many mesoscopic volumes. Because rocks are a disordered collection of grains in cohesive contact, we expect that once shear strain is applied and cracks begin to arrive in the system, the mesoscopic volumes will have a wide distribution of different crack states. These mesoscopic volumes are the members of our ensembles. We determine the probability of observing a mesoscopic volume in a given crack state by maximizing Shannon's measure of the emergent crack disorder, subject to constraints coming from the energy balance of brittle fracture. The laws of thermodynamics, the partition function, and the quantification of temperature are obtained for such cracking systems. Comment: 11 pages, 2 figures
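    The maximum-entropy step has the standard textbook form (the constraint written here is a generic energy constraint; the article's actual constraints come from the energy balance of brittle fracture). Maximizing the Shannon measure

        S \;=\; -\sum_j p_j \ln p_j

    over the probabilities p_j of the crack states, subject to \sum_j p_j = 1 and \sum_j p_j E_j = \bar{E}, gives the Boltzmann-like distribution

        p_j \;=\; \frac{e^{-\beta E_j}}{Z}, \qquad Z \;=\; \sum_j e^{-\beta E_j},

    where the Lagrange multiplier \beta plays the role of an inverse temperature and Z is the partition function from which the thermodynamic relations follow.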

    Efficient time-domain modeling and simulation of passive bandpass systems

    In communication systems, the signals of interest are often amplitude- and/or phase-modulated. In this framework, the baseband-equivalent representation of signals and systems is usually adopted to simulate the digital parts of communication systems in an efficient manner. This contribution extends the applicability of such a representation to RF/analog devices, leading to a common and efficient modeling and simulation framework. In particular, the proposed method can build half-size models compared to existing approaches, and allows one to choose the simulation time step according to the bandwidth of the modulating signals rather than the carrier frequency, thereby significantly speeding up the simulation procedure. The proposed method is validated via a suitable application example.
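    The standard relation behind this speed-up is the lowpass-equivalent representation of a bandpass signal (stated here in its textbook form, independent of the specific modeling procedure of the paper):

        x(t) \;=\; \Re\big\{ \tilde{x}(t)\, e^{\,j 2\pi f_c t} \big\},

    where \tilde{x}(t) is the complex baseband (lowpass) equivalent of the bandpass signal x(t) and f_c is the carrier frequency. Since \tilde{x}(t) occupies only the modulation bandwidth B \ll f_c, simulating the baseband-equivalent model allows a time step on the order of 1/B (times an oversampling margin) instead of one dictated by f_c.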