9,332 research outputs found
Digital Color Imaging
This paper surveys current technology and research in the area of digital
color imaging. In order to establish the background and lay down terminology,
fundamental concepts of color perception and measurement are first presented
using vector-space notation and terminology. Present-day color recording and
reproduction systems are reviewed along with the common mathematical models
used for representing these devices. Algorithms for processing color images for
display and communication are surveyed, and a forecast of research trends is
attempted. An extensive bibliography is provided.
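The vector-space view of color that the survey opens with can be illustrated with a minimal sketch (our own, not from the paper): a color stimulus is a vector, and a linear capture or display device is a matrix. The 3x3 matrix below is the standard linear-sRGB-to-CIE-XYZ matrix; the function name is our own.

```python
import numpy as np

# Standard linear sRGB -> CIE XYZ matrix (D65 white point).
M_SRGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

def linear_rgb_to_xyz(rgb):
    """Map a linear-RGB stimulus vector to CIE XYZ tristimulus values.

    This models a linear device: the device's response is a matrix
    acting on the stimulus vector.
    """
    return M_SRGB_TO_XYZ @ np.asarray(rgb, dtype=float)

# The device's reference white: all channels at full drive.
white = linear_rgb_to_xyz([1.0, 1.0, 1.0])
```

In this formulation the rows of the matrix are the device's color-matching behavior, and characterizing a real device amounts to estimating such a matrix (or a more general nonlinear model) from measurements.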
Privacy-Preserving Public Information for Sequential Games
In settings with incomplete information, players can find it difficult to
coordinate to find states with good social welfare. For example, in financial
settings, if a collection of financial firms have limited information about
each other's strategies, some large number of them may choose the same
high-risk investment in hopes of high returns. While this might be acceptable
in some cases, the economy can be hurt badly if many firms make investments in
the same risky market segment and it fails. One reason why many firms might end
up choosing the same segment is that they do not have information about other
firms' investments (imperfect information may lead to 'bad' game states).
Directly reporting all players' investments, however, raises confidentiality
concerns for both individuals and institutions.
In this paper, we explore whether information about the game-state can be
publicly announced in a manner that maintains the privacy of the actions of the
players, and still suffices to deter players from reaching bad game-states. We
show that in many games of interest, it is possible for players to avoid these
bad states with the help of privacy-preserving, publicly-announced information.
We model the behavior of players in this imperfect-information setting in two
ways -- as greedy and as undominated strategic behavior -- and we prove
guarantees on the social welfare that certain kinds of privacy-preserving
information can help attain. Furthermore, we design a counter with improved
privacy guarantees under continual observation.
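The privacy-preserving counter mentioned at the end builds on the classical binary mechanism for counting under continual observation (Chan, Shi and Song; Dwork et al.). The sketch below illustrates that baseline idea only, not this paper's improved counter; all names and the simplified noise scale `levels / eps` are our own choices.

```python
import math
import random

def laplace(scale, rng):
    """Sample Laplace(0, scale) via inverse CDF; scale 0 returns exact
    (non-private) values, used here only for sanity checks."""
    if scale == 0:
        return 0.0
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_prefix_sums(bits, eps, rng):
    """Binary-mechanism counter: every released prefix sum is assembled
    from O(log T) noisy dyadic-interval counts, and each interval is
    noised only once, so error per release stays polylogarithmic in T."""
    T = len(bits)
    prefix = [0]
    for b in bits:
        prefix.append(prefix[-1] + b)
    levels = max(1, T.bit_length())
    scale = 0.0 if eps == float("inf") else levels / eps
    cache = {}  # dyadic interval -> its (single) noisy count

    def noisy_interval(lo, hi):
        if (lo, hi) not in cache:
            cache[(lo, hi)] = prefix[hi] - prefix[lo - 1] + laplace(scale, rng)
        return cache[(lo, hi)]

    sums = []
    for t in range(1, T + 1):
        total, hi = 0.0, t
        while hi > 0:
            size = hi & (-hi)  # largest dyadic block ending at hi
            total += noisy_interval(hi - size + 1, hi)
            hi -= size
        sums.append(total)
    return sums
```

The point relevant to this paper is that such a counter releases running aggregates (e.g. how many firms entered a market segment) without revealing any single player's action, which is exactly the kind of privacy-preserving public signal the results rely on.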
Contagion effects in the world network of economic activities
Using the new data from the OECD-WTO world network of economic activities we
construct the Google matrix of this directed network and perform its
detailed analysis. The network contains 58 countries and 37 activity sectors
for the years 1995, 2000, 2005, 2008, and 2009. The construction of the Google matrix, based on
Markov chain transitions, treats all countries on equal democratic grounds
while the contribution of activity sectors is proportional to their exchange
monetary volume. The Google matrix analysis allows us to obtain a reliable
ranking of countries and activity sectors and to determine the sensitivity of
the CheiRank-PageRank commercial balance of countries with respect to price
variations and labor costs in various countries. We demonstrate that the
developed approach takes into account the multiplicity of network links
describing economic interactions between countries and activity sectors, and is
thus more efficient than the usual export-import analysis. Our results highlight
the striking increase of the influence of German economic activity on other
countries during the period 1995 to 2009, while the influence of the Eurozone
decreases during the same period. We compare our results with the similar
analysis of the world trade network from the UN COMTRADE database. We argue
that knowledge of the network structure allows one to analyze the effects of
economic influence and contagion propagation over the world economy.
Comment: this work is linked with arXiv:1504.06773 [q-fin.ST]
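The construction described in the abstract can be illustrated with a minimal sketch (our own, not the authors' code): the Google matrix of a directed network is built from its adjacency matrix, and the PageRank vector is obtained by power iteration. The damping factor `alpha = 0.85` is the conventional choice; the actual study works with a much larger country-sector network.

```python
import numpy as np

def google_matrix(A, alpha=0.85):
    """Build the Google matrix from adjacency matrix A, where
    A[i, j] > 0 encodes a link j -> i (Markov transition j -> i).
    Columns are normalized; dangling columns become uniform."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    col = A.sum(axis=0)
    S = np.where(col > 0, A / np.where(col == 0, 1, col), 1.0 / n)
    # Damping: uniform teleportation with weight (1 - alpha).
    return alpha * S + (1 - alpha) / n

def pagerank(G, tol=1e-10):
    """Power iteration for the leading eigenvector of the
    column-stochastic Google matrix G."""
    n = G.shape[0]
    p = np.full(n, 1.0 / n)
    while True:
        q = G @ p
        if np.abs(q - p).sum() < tol:
            return q
        p = q
```

Running the same machinery on the transposed network yields CheiRank, and the difference of the two rankings gives the CheiRank-PageRank balance the abstract refers to.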
An approach for real world data modelling with the 3D terrestrial laser scanner for built environment
Capturing and modelling 3D information of the built environment is a big challenge. A number of techniques and technologies are now in use, including EDM, GPS, photogrammetric applications, remote sensing and traditional building surveying. However, these technologies are not always practical and efficient with regard to time, cost and accuracy. Furthermore, a multidisciplinary knowledge base, created from studies and research on the regeneration aspects, is fundamental: historical, architectural, archaeological, environmental, social, economic, etc. In order to reach an adequate diagnosis for regeneration, it is necessary to describe buildings and their surroundings by means of documentation and plans. At present, however, the foregoing is considerably far removed from the real situation, since more often than not it is extremely difficult to obtain full documentation and cartography of acceptable quality, and the material on constructive pathologies and systems is often insufficient or deficient (plans that simply reflect levels, isolated photographs, etc.). Sometimes the information does exist, but this fact is not known, or the information is not easily accessible, leading to the unnecessary duplication of effort and resources.
In this paper, we discuss 3D laser scanning technology, which can acquire high-density point data in an accurate, fast way. Moreover, the scanner can digitise all the 3D information concerning a real-world object such as buildings, trees and terrain down to millimetre detail. It can therefore benefit the refurbishment process in regeneration in the Built Environment and is a potential solution to the challenges above. The paper introduces an approach for scanning buildings, processing the point cloud raw data, and a modelling approach for CAD extraction and building object classification by pattern matching in IFC (Industry Foundation Classes) format. The approach presented in this paper can lead to parametric design and Building Information Modelling (BIM) for existing structures. Two case studies are introduced to demonstrate the use of laser scanner technology in the Built Environment: the Jactin House Building in East Manchester and the Peel Building on the campus of the University of Salford. Through these case studies, the use of laser scanners is explained, and their integration with various technologies and systems is also explored for professionals in the Built Environment.
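Processing the raw point-cloud data typically begins with thinning the dense scan before any CAD extraction or classification. The following is a generic voxel-grid downsampling sketch (our own illustration, not the workflow used in the case studies); `voxel` is the hypothetical cell edge length in metres.

```python
import numpy as np

def voxel_downsample(points, voxel=0.01):
    """Reduce a dense scan to one representative point per cubic cell:
    points are bucketed by their voxel index, and each occupied voxel
    is replaced by the centroid of the points that fall in it."""
    points = np.asarray(points, dtype=float)
    cells = {}
    for p in points:
        key = tuple(np.floor(p / voxel).astype(int))  # voxel index
        cells.setdefault(key, []).append(p)
    return np.array([np.mean(c, axis=0) for c in cells.values()])
```

Downsampling like this keeps the millimetre-scale geometry manageable; subsequent steps (plane fitting, pattern matching against IFC object templates) then operate on the thinned cloud.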
Proposal for a study of computer mapping of terrain using multispectral data from ERTS-A for the Yellowstone National Park test site
The author has identified the following significant results. A terrain map of Yellowstone National Park showed plant community types and other classes of ground cover in what is basically a wild land. The map comprised 12 classes, six of which were mapped with accuracies of 70 to 95%. The remaining six classes had spectral reflectances that overlapped appreciably, and hence were mapped less accurately. Techniques were devised for quantitatively comparing the recognition map of the park with control data acquired from ground inspection and from analysis of side-looking radar images, a thermal IR mosaic, and IR aerial photos at several scales. Quantitative analyses were made in ten 40 sq km test areas. The comparisons were performed by computer, with the final results displayed on line-printer output. Forested areas were mapped by computer using ERTS data for less than 1/4 the cost of the conventional forest mapping technique for topographic base maps.
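Recognition mapping of this kind can be sketched with a minimum-distance-to-means classifier, a standard technique for early multispectral data: each pixel is assigned to the class whose mean spectral signature is closest. The class names and signatures below are hypothetical, and the study's actual classifier may differ.

```python
import numpy as np

def nearest_class(pixel, signatures):
    """Assign a multispectral pixel to the class whose mean spectral
    signature (one value per band) is nearest in Euclidean distance."""
    names = list(signatures)
    means = np.array([signatures[n] for n in names], dtype=float)
    d = np.linalg.norm(means - np.asarray(pixel, dtype=float), axis=1)
    return names[int(d.argmin())]
```

Classes whose mean signatures are well separated in band space are mapped accurately under this rule, while classes with overlapping reflectances are confused, which is consistent with the accuracy pattern reported above.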
A Local Stochastic Algorithm for Separation in Heterogeneous Self-Organizing Particle Systems
We present and rigorously analyze the behavior of a distributed, stochastic algorithm for separation and integration in self-organizing particle systems, an abstraction of programmable matter. Such systems are composed of individual computational particles with limited memory, strictly local communication abilities, and modest computational power. We consider heterogeneous particle systems of two different colors and prove that these systems can collectively separate into different color classes or integrate, indifferent to color. We accomplish both behaviors with the same fully distributed, local, stochastic algorithm. Achieving separation or integration depends only on a single global parameter determining whether particles prefer to be next to other particles of the same color or not; this parameter is meant to represent external, environmental influences on the particle system. The algorithm is a generalization of a previous distributed, stochastic algorithm for compression (PODC '16) that can be viewed as a special case of separation where all particles have the same color. It is significantly more challenging to prove that the desired behavior is achieved in the heterogeneous setting, however, even in the bichromatic case we focus on. This requires combining several new techniques, including the cluster expansion from statistical physics, a new variant of the bridging argument of Miracle, Pascoe and Randall (RANDOM '11), the high-temperature expansion of the Ising model, and careful probabilistic arguments.
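The role of the single global parameter can be illustrated with a toy Metropolis chain on a ring of colored sites (our own one-dimensional sketch, far simpler than the particle-system algorithm analyzed in the paper): `lam > 1` makes same-colored neighbors more likely, mimicking separation, while `lam < 1` mimics integration.

```python
import random

def metropolis_step(colors, lam, rng):
    """One local move: propose swapping two adjacent sites on the ring
    and accept with probability min(1, lam ** (d_new - d_old)), where d
    counts monochromatic neighboring pairs. lam is the global bias
    parameter: lam > 1 rewards same-color adjacency."""
    n = len(colors)
    i = rng.randrange(n)
    j = (i + 1) % n

    def mono_pairs(c):
        return sum(c[k] == c[(k + 1) % n] for k in range(n))

    before = mono_pairs(colors)
    colors[i], colors[j] = colors[j], colors[i]
    after = mono_pairs(colors)
    if rng.random() >= min(1.0, lam ** (after - before)):
        colors[i], colors[j] = colors[j], colors[i]  # reject: undo swap
    return colors
```

Each move uses only local information plus the one global parameter, which is the structural feature the paper's algorithm shares; the actual analysis on two-dimensional particle systems is what requires the heavy machinery listed above.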