
    Defining block character

    In this paper I propose a clear, efficient, and accurate method for determining whether a block of contiguous buildings has an overall character. The work is needed because most contemporary design reviews presuppose the existence of visual character, yet existing design principles are often too vague to make the required determination. Clarity is achieved by shifting from vague notions to a definite concept of block character: a design feature will be perceived as part of the overall character of a block if the frequency of the feature is greater than a critical threshold. An experiment suggested that the critical frequency was quite high: over 80%. A case history illustrates how the new concept of visual character could greatly increase the efficiency and accuracy of actual planning decisions.
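    The threshold rule described in the abstract is simple enough to sketch directly. The function below is a hypothetical illustration: the feature names, counts, and the choice to apply the 80% cut-off as a strict inequality are assumptions, not the paper's data.

```python
# Hedged sketch of the proposed rule: a design feature is perceived as
# part of a block's overall character only when its frequency among the
# block's buildings exceeds a critical threshold (roughly 80% in the
# reported experiment).

def has_character(feature_counts, n_buildings, threshold=0.8):
    """Map each feature to True if its frequency exceeds the threshold."""
    return {feature: count / n_buildings > threshold
            for feature, count in feature_counts.items()}

# Hypothetical block of 11 contiguous buildings.
block = {"pitched roof": 10, "red brick": 9, "bay window": 4}
print(has_character(block, n_buildings=11))
# → {'pitched roof': True, 'red brick': True, 'bay window': False}
```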

    COOPER-framework: A Unified Standard Process for Non-parametric Projects

    Practitioners assess the performance of entities in increasingly large and complicated datasets. Even if non-parametric models such as Data Envelopment Analysis were ever considered simple push-button technologies, this becomes impossible when many variables are available or when data have to be compiled from several sources. This paper introduces the ‘COOPER-framework’, a comprehensive model for carrying out non-parametric projects. The framework consists of six interrelated phases: Concepts and objectives, On structuring data, Operational models, Performance comparison model, Evaluation, and Result and deployment. Each phase describes the necessary steps a researcher should examine for a well-defined and repeatable analysis. The COOPER-framework offers the novice analyst guidance, structure, and advice for a sound non-parametric analysis, while the more experienced analyst benefits from a checklist that ensures important issues are not forgotten. In addition, through the use of a standardized framework, non-parametric assessments become more reliable, more repeatable, more manageable, faster, and less costly.
    Keywords: DEA, non-parametric efficiency, unified standard process, COOPER-framework

    Tropical Fourier-Motzkin elimination, with an application to real-time verification

    We introduce a generalization of tropical polyhedra able to express both strict and non-strict inequalities. Such inequalities are handled by means of a semiring of germs (encoding infinitesimal perturbations). We develop a tropical analogue of Fourier-Motzkin elimination, from which we derive geometrical properties of these polyhedra. In particular, we show that they coincide with the tropically convex union of (not necessarily closed) cells that are convex both classically and tropically. We also prove that the redundant inequalities produced by successive elimination steps can be dynamically deleted by reduction to mean payoff game problems. As a complement, we provide a coarser (polynomial-time) deletion procedure which is enough to arrive at a simply exponential bound on the total execution time. These algorithms are illustrated by an application to real-time systems (reachability analysis of timed automata).
    Comment: 29 pages, 8 figures
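    Since the paper's tropical analogue builds on the classical scheme, a minimal sketch of classical Fourier-Motzkin elimination may help fix ideas. This toy version handles only ordinary non-strict inequalities a·x ≤ b over the rationals and omits both the tropical semantics and the redundancy deletion the paper is concerned with.

```python
# Classical Fourier-Motzkin elimination: eliminate variable j from a
# system of rows (coeffs, bound) meaning  coeffs · x <= bound.

def fm_eliminate(system, j):
    """Return an equivalent system over the remaining variables."""
    pos, neg, zero = [], [], []
    for a, b in system:
        (pos if a[j] > 0 else neg if a[j] < 0 else zero).append((a, b))
    out = list(zero)
    # Combine every positive row with every negative row so that the
    # coefficient of variable j cancels (its entry becomes 0).
    for ap, bp in pos:
        for an, bn in neg:
            sp, sn = -an[j], ap[j]
            a = [sp * x + sn * y for x, y in zip(ap, an)]
            b = sp * bp + sn * bn
            out.append((a, b))
    return out

# x + y <= 4  and  -x + y <= 2;  eliminating x yields  2y <= 6.
rows = [([1, 1], 4), ([-1, 1], 2)]
print(fm_eliminate(rows, 0))   # → [([0, 2], 6)]
```

Note the well-known blow-up: each elimination step can square the number of rows, which is why the paper's dynamic deletion of redundant inequalities matters.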

    Performance evaluation using bootstrapping DEA techniques: Evidence from industry ratio analysis

    In the Data Envelopment Analysis (DEA) context, financial data/ratios have been used to produce a unified performance metric. However, several scholars have indicated that the inclusion of financial ratios creates biased efficiency estimates, with implications for the performance evaluation of firms and industries. Several DEA formulations and techniques deal with this problem, including sensitivity analysis, Prior-Ratio-Analysis, and DEA output–input ratio analysis for assessing the efficiency and ranking of the examined units. In addition to these computational approaches, this paper applies bootstrap techniques to overcome these problems. Moreover, it provides an application evaluating the performance of 23 Greek manufacturing sectors with the use of financial data. The results reveal that the efficiencies obtained in the first stage of our sensitivity analysis are biased; after applying the bootstrap techniques, however, the sensitivity analysis reveals that the efficiency scores are significantly improved.
    Keywords: Performance measurement; Data Envelopment Analysis; Financial ratios; Bootstrap; Bias correction
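    The bias-correction logic behind the bootstrap can be stated in miniature. The sketch below shows only the generic (naive) bootstrap bias correction; the actual DEA bootstrap relies on a smoothed resampling scheme, and `estimate` here is a hypothetical stand-in for a DEA efficiency estimator, not the paper's method.

```python
import random

def bias_corrected(sample, estimate, n_boot=1000, seed=0):
    """Naive bootstrap bias correction: theta_hat - (mean(boot) - theta_hat)."""
    rng = random.Random(seed)
    theta_hat = estimate(sample)
    boot = [estimate([rng.choice(sample) for _ in sample])
            for _ in range(n_boot)]
    bias = sum(boot) / n_boot - theta_hat   # estimated bias of the statistic
    return theta_hat - bias

# The sample maximum is biased downward, so the corrected value
# lies at or above the raw estimate.
scores = [0.70, 0.80, 0.90, 1.00, 0.85]
print(bias_corrected(scores, max) >= max(scores))   # → True
```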

    Communication complexity and combinatorial lattice theory

    Get PDF
    In a recent paper, Hajnal, Maass, and Turán analyzed the communication complexity of graph connectivity. Building on this work, we develop a general framework for the study of a broad class of communication problems with several interesting special cases, including the graph connectivity problem. The approach is based on the combinatorial theory of alignments and lattices.

    Two Timescale Convergent Q-learning for Sleep-Scheduling in Wireless Sensor Networks

    In this paper, we consider an intrusion detection application for Wireless Sensor Networks (WSNs). We study the problem of scheduling the sleep times of the individual sensors to maximize the network lifetime while keeping the tracking error to a minimum. We formulate this problem as a partially observable Markov decision process (POMDP) with continuous state-action spaces, in a manner similar to Fuemmeler and Veeravalli [2008]. However, unlike their formulation, we consider infinite-horizon discounted and average cost objectives as performance criteria. For each criterion, we propose a convergent on-policy Q-learning algorithm that operates on two timescales, while employing function approximation to handle the curse of dimensionality associated with the underlying POMDP. Our proposed algorithm incorporates a policy gradient update using a one-simulation simultaneous perturbation stochastic approximation (SPSA) estimate on the faster timescale, while the Q-value parameter (arising from a linear function approximation of the Q-values) is updated in an on-policy temporal difference (TD) fashion on the slower timescale. The feature selection scheme employed in each of our algorithms manages the energy and tracking components in a manner that assists the search for the optimal sleep-scheduling policy. For the sake of comparison, in both the discounted and average settings, we also develop a function approximation analogue of the Q-learning algorithm. This algorithm, unlike the two-timescale variant, does not possess theoretical convergence guarantees. Finally, we adapt our algorithms to include a stochastic iterative estimation scheme for the intruder's mobility model. Our simulation results on a 2-dimensional network setting suggest that our algorithms achieve better tracking accuracy at the cost of only a few additional sensors, in comparison to a recent prior work.
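    The two-timescale structure can be illustrated with a deterministic toy recursion: the fast iterate sees the slow one as quasi-static and tracks it, while the slow iterate drifts toward its target. This is only a stand-in for the paper's coupling of an SPSA policy-gradient update (fast) with a TD-style Q-value update (slow); the step-size schedules below are illustrative assumptions, chosen only to satisfy b_n / a_n → 0.

```python
# Minimal deterministic sketch of a two-timescale stochastic
# approximation scheme: the fast iterate x tracks the slow iterate y,
# which itself converges to a fixed target.

def two_timescale(target=1.0, iters=10_000):
    x, y = 0.0, 0.0                  # fast and slow iterates
    for n in range(1, iters + 1):
        a_n = n ** -0.6              # faster schedule (larger steps)
        b_n = n ** -1.0              # slower schedule: b_n / a_n -> 0
        x += a_n * (y - x)           # fast iterate tracks quasi-static y
        y += b_n * (target - y)      # slow iterate drifts to the target
    return x, y

x, y = two_timescale()
print(round(x, 6), round(y, 6))      # → 1.0 1.0
```

The separation of step sizes is what lets convergence proofs treat the fast recursion as if the slow one were frozen, mirroring the role it plays in the abstract's algorithm.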