
    Study on Rough Sets and Fuzzy Sets in Constructing Intelligent Information System

    Since human beings are not omniscient and omnipotent, we actually live in an uncertain world. Uncertainty is involved in and connected to every aspect of human life, as a quotation from Albert Einstein puts it: "As far as the laws of mathematics refer to reality, they are not certain. And as far as they are certain, they do not refer to reality." The most fundamental aspect of this connection shows in human communication. Naturally, human communication is built on perception-based information rather than measurement-based information, in which perceptions play a central role in human cognition [Zadeh, 2000]. (In psychology, perception is understood as a process of translating sensory stimulation into an organized experience.) For example, in everyday communication we naturally say "My house is far from here," rather than, say, "My house is 12,355 m from here." Perception-based information is a generalization of measurement-based information: perception-based information such as "John is excellent" is hard to represent in a measurement-based form. Perceptions express a human subjective view; consequently, they tend to lead to misunderstanding. Measurements are then needed, such as defining units of length, time, etc., to provide objectivity as a means of overcoming misunderstanding. Many measuring instruments were invented, along with their methods and theories of measurement. Hence, humans cannot communicate with measuring instruments, including the computer as a product of the measurement era, unless they use measurement-based information. Perceptions are an intrinsic aspect of uncertainty-based information. In this case, information may be incomplete, imprecise, fragmentary, not fully reliable, vague, contradictory, or deficient in some other way. In general, these various information deficiencies may express different types of uncertainty. 
It is necessary to construct a computer-based information system, called an intelligent information system, that can process uncertainty-based information. In the future, computers are expected to be able to communicate with humans at the level of perception. Many theories have been proposed to express and process types of uncertainty, such as probability, possibility, fuzzy sets, rough sets, chaos theory and so on. This book extends and generalizes the existing theories of rough sets, fuzzy sets and granular computing for the purpose of constructing an intelligent information system. The structure of this book is as follows. In Chapter 2, types of uncertainty in relation to fuzziness, probability and evidence theory (belief and plausibility measures) are briefly discussed. The rough set, regarded as another generalization of the crisp set, is considered for representing a rough event in connection with probability theory. Special attention is given to the formulation of a fuzzy conditional probability relation generated by the conditional probability of a fuzzy event. The fuzzy conditional probability relation is then used to represent the similarity degree of two fuzzy labels. A generalization of rough sets induced by the fuzzy conditional probability relation, in terms of a covering of the universe, is given in Chapter 3. In relation to the fuzzy conditional probability relation, it is necessary to consider an interesting mathematical relation called the weak fuzzy similarity relation, a generalization of the fuzzy similarity relation proposed by Zadeh [1995]. Fuzzy rough sets and generalized fuzzy rough sets are proposed along with a generalization of the rough membership function, and their properties are examined. Some applications of these methods in information systems, such as α-redundancy of objects and dependency of domain attributes, are discussed. 
In addition, multi rough sets based on multiple contexts of attributes in a multi-context information system are defined and proposed in Chapter 4. In real applications, depending on the context, a given object may have different values of attributes. In other words, a set of attributes might be represented based on different contexts, where each context may provide different values for a given object. A context can be viewed as a background or situation in which it is somehow necessary to group some attributes into a subset and consider that subset as a context. Finally, Chapter 5 summarizes all that is discussed in this book and puts forward some topics for future research.
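
    The rough-set machinery the book builds on can be illustrated with the classical lower and upper approximations over an information table. This is a minimal sketch of the standard construction only; the table, attribute names and target concept below are illustrative, not taken from the book.

    ```python
    # Classical rough-set approximations over a toy information table.
    # An equivalence class groups objects that are indiscernible on the
    # chosen attributes; the lower/upper approximations bound a concept.

    def equivalence_classes(table, attrs):
        """Group objects with identical values on the chosen attributes."""
        classes = {}
        for obj, values in table.items():
            key = tuple(values[a] for a in attrs)
            classes.setdefault(key, set()).add(obj)
        return list(classes.values())

    def approximations(table, attrs, target):
        """Lower approximation: classes fully inside the target set.
        Upper approximation: classes that overlap the target set."""
        lower, upper = set(), set()
        for cls in equivalence_classes(table, attrs):
            if cls <= target:
                lower |= cls
            if cls & target:
                upper |= cls
        return lower, upper

    # Toy table: four objects described by two attributes.
    table = {
        "x1": {"colour": "red",  "size": "big"},
        "x2": {"colour": "red",  "size": "big"},
        "x3": {"colour": "red",  "size": "small"},
        "x4": {"colour": "blue", "size": "small"},
    }
    target = {"x1", "x3"}  # the concept to approximate
    lower, upper = approximations(table, ["colour", "size"], target)
    # x3 is alone in its class, so it lands in the lower approximation;
    # x1 is indiscernible from x2, so it appears only in the upper one.
    ```

    The gap between the two approximations (here, {x1, x2}) is exactly the boundary region whose graded treatment via rough membership functions the book generalizes.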

    Simulation of Water Distribution Systems

    In this paper a software package offering a means of simulating complex water distribution systems is described. It has been developed in the course of our investigations into the applicability of neural networks and fuzzy systems to the implementation of decision support systems for operational control of industrial processes, with case studies taken from the water industry. Examples of how the simulation package has been used in the design and testing of algorithms for state estimation, confidence limit analysis and fault detection are presented. Arguments for using suitable graphical visualization techniques in solving problems like meter placement or leakage diagnosis are also given and supported by a set of examples.

    A dust-parallax distance of 19 megaparsecs to the supermassive black hole in NGC 4151

    The active galaxy NGC 4151 has a crucial role as one of only two active galactic nuclei for which black hole mass measurements based on emission line reverberation mapping can be calibrated against other dynamical methods. Unfortunately, effective calibration requires an accurate distance to NGC 4151, which is currently not available. Recently reported distances range from 4 to 29 megaparsecs (Mpc). Strong peculiar motions make a redshift-based distance very uncertain, and the geometry of the galaxy and its nucleus prohibit accurate measurements using other techniques. Here we report a dust-parallax distance to NGC 4151 of $D_A = 19.0^{+2.4}_{-2.6}$ Mpc. The measurement is based on an adaptation of a geometric method proposed previously using the emission line regions of active galaxies. Since this region is too small for current imaging capabilities, we use instead the ratio of the physical-to-angular sizes of the more extended hot dust emission as determined from time-delays and infrared interferometry. This new distance leads to an approximately 1.4-fold increase in the dynamical black hole mass, implying a corresponding correction to emission line reverberation masses of black holes if they are calibrated against the two objects with additional dynamical masses.Comment: Authors' version of a letter published in Nature (27 November 2014); 8 pages, 5 figures, 1 table
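
    The geometry behind the dust-parallax method reduces to a ratio: the physical radius of the dust emission follows from the light-crossing time of the reverberation delay, and dividing by the angular radius from interferometry gives the angular-diameter distance. The sketch below illustrates only this arithmetic; the delay and angular radius used are round illustrative numbers, not the measured values from the paper.

    ```python
    # Dust-parallax distance sketch: D_A = R_phys / theta, where
    # R_phys = c * (dust reverberation delay) and theta is the angular
    # radius of the hot-dust emission from infrared interferometry.
    import math

    C_M_PER_S = 299_792_458.0
    S_PER_DAY = 86_400.0
    M_PER_PC = 3.0857e16

    def dust_parallax_distance_mpc(delay_days, angular_radius_mas):
        """Angular-diameter distance in Mpc from a light-travel-time
        radius and an angular radius in milliarcseconds."""
        r_phys_m = C_M_PER_S * delay_days * S_PER_DAY
        theta_rad = math.radians(angular_radius_mas * 1e-3 / 3600.0)
        return r_phys_m / theta_rad / M_PER_PC / 1e6

    # Illustrative inputs: a ~40-day delay and a ~0.35 mas dust radius
    # combine to a distance of roughly 20 Mpc.
    d_mpc = dust_parallax_distance_mpc(40.0, 0.35)
    ```

    The fractional distance error is simply the quadrature sum of the fractional errors on the delay and the angular size, which is why the method's precision tracks the interferometric measurement.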

    Using 21-cm absorption surveys to measure the average HI spin temperature in distant galaxies

    We present a statistical method for measuring the average HI spin temperature in distant galaxies using the expected detection yields from future wide-field 21cm absorption surveys. As a demonstrative case study we consider a simulated all-southern-sky survey of 2 h per pointing with the Australian Square Kilometre Array Pathfinder for intervening HI absorbers at intermediate cosmological redshifts between $z = 0.4$ and $1$. For example, if such a survey yielded 1000 absorbers we would infer a harmonic-mean spin temperature of $\overline{T}_\mathrm{spin} \sim 100$ K for the population of damped Lyman-$\alpha$ absorbers (DLAs) at these redshifts, indicating that more than 50 per cent of the neutral gas in these systems is in a cold neutral medium (CNM). Conversely, a lower yield of only 100 detections would imply $\overline{T}_\mathrm{spin} \sim 1000$ K and a CNM fraction of less than 10 per cent. We propose that this method can be used to provide independent verification of the spin temperature evolution reported in recent 21cm surveys of known DLAs at high redshift and for measuring the spin temperature at intermediate redshifts below $z \approx 1.7$, where the Lyman-$\alpha$ line is inaccessible using ground-based observatories. Increasingly more sensitive and larger surveys with the Square Kilometre Array should provide stronger statistical constraints on the average spin temperature. However, these will ultimately be limited by the accuracy to which we can determine the HI column density frequency distribution, the covering factor and the redshift distribution of the background radio source population.Comment: 11 pages, 9 figures, 1 table. Proof corrected version
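
    The link between detection yield and spin temperature rests on the standard optically-thin 21cm relation, in which the absorption strength at fixed column density scales inversely with $T_\mathrm{spin}$. The sketch below shows only that textbook inversion (ignoring the covering factor the abstract flags as a limitation); the column density and integrated optical depth are illustrative DLA-like numbers, not survey results.

    ```python
    # Optically thin 21cm relation: N_HI = 1.823e18 * T_spin * ∫tau dv,
    # with N_HI in cm^-2, T_spin in K and the velocity-integrated optical
    # depth in km/s. Inverting it gives the (harmonic-mean) spin
    # temperature along the sightline.

    def spin_temperature_k(n_hi_cm2, integrated_tau_km_s):
        """T_spin from HI column density and integrated 21cm optical depth."""
        return n_hi_cm2 / (1.823e18 * integrated_tau_km_s)

    # A DLA-like column of 2e20 cm^-2 with a strong absorption line
    # (∫tau dv = 1 km/s) implies cold, CNM-dominated gas (~100 K);
    # the same column with a ten-times-weaker line implies ~1000 K.
    t_cold = spin_temperature_k(2e20, 1.0)
    t_warm = spin_temperature_k(2e20, 0.1)
    ```

    This is why a high absorber yield maps to a low average spin temperature in the survey forecast: at fixed column density, colder gas produces deeper, more detectable absorption lines.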

    Jet energy calibration at the LHC

    Jets are one of the most prominent physics signatures of high energy proton-proton (p-p) collisions at the Large Hadron Collider (LHC). They are key physics objects for precision measurements and searches for new phenomena. This review provides an overview of the reconstruction and calibration of jets at the LHC during its first Run. ATLAS and CMS developed different approaches for the reconstruction of jets, but use similar methods for the energy calibration. ATLAS reconstructs jets using input signals from its calorimeters and uses charged particle tracks to refine the energy measurement and suppress the effects of multiple p-p interactions (pileup). CMS, instead, combines calorimeter and tracking information to build jets from particle flow objects. Jets are calibrated using Monte Carlo (MC) simulations, and a residual in situ calibration derived from collision data is applied to correct for the differences in jet response between data and Monte Carlo. Large samples of dijet, Z+jets, and photon+jet events at the LHC allowed the calibration of jets with high precision, leading to very small systematic uncertainties. Both ATLAS and CMS achieved a jet energy calibration uncertainty of about 1% in the central detector region and for jets with transverse momentum pT > 100 GeV. At low jet pT, the jet energy calibration uncertainty is less than 4%, with dominant contributions from pileup, differences in energy scale between quark and gluon jets, and jet flavor composition.Comment: Article submitted to the International Journal of Modern Physics A (IJMPA) as part of the special issue on the "Jet Measurements at the LHC", editor G. Dissertori
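
    The two-step scheme described above (an MC-derived response correction followed by a residual in situ data/MC scale factor) can be sketched in a few lines. The response curve and residual factor below are invented toy numbers for illustration only, not actual ATLAS or CMS calibration constants.

    ```python
    # Toy two-step jet energy calibration: divide out a simulated
    # detector response, then apply a residual in situ scale factor
    # measured from dijet / Z+jet / photon+jet balance in data.
    import math

    def mc_response(pt_reco_gev):
        """Toy response curve: jets are reconstructed low, and the
        response is worse (further from 1) at low pT."""
        return 0.90 + 0.05 * math.tanh(pt_reco_gev / 100.0)

    def calibrate_jet_pt(pt_reco_gev, residual_scale=1.002):
        """MC-based correction followed by the residual data/MC factor."""
        pt_mc_calibrated = pt_reco_gev / mc_response(pt_reco_gev)
        return pt_mc_calibrated * residual_scale

    # An 80 GeV reconstructed jet is corrected upward, since the toy
    # detector under-measures jet energy (response < 1).
    pt_calibrated = calibrate_jet_pt(80.0)
    ```

    The residual factor being very close to unity reflects the point made in the abstract: after the MC calibration, data and simulation agree at the per-cent level, which is what drives the ~1% uncertainty in the central region.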

    Tests of a proximity focusing RICH with aerogel as radiator

    Using aerogel as radiator and multianode PMTs for photon detection, a proximity focusing Cherenkov ring imaging detector has been constructed and tested in the KEK π2 beam. The aim is to experimentally study the basic parameters such as the resolution of the single photon Cherenkov angle and the number of detected photons per ring. The resolution obtained is well approximated by estimates of contributions from pixel size and emission point uncertainty. The number of detected photons per Cherenkov ring is in good agreement with estimates based on aerogel and detector characteristics. The values obtained turn out to be rather low, mainly due to Rayleigh scattering and to the relatively large dead space between the photocathodes. A light collection system or a higher fraction of photomultiplier active area, together with better quality aerogels, are expected to improve the situation. The reduction of the Cherenkov yield for charged particle impact in the vicinity of the aerogel tile side wall has also been measured.Comment: 4 pages, 8 figures
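
    The two quantities studied above follow from the basic Cherenkov relations: the ring angle from cos θ_c = 1/(nβ), and the photon count from a Frank-Tamm-style yield N ≈ N₀ L sin² θ_c. The sketch below uses typical aerogel-RICH numbers (refractive index, quality factor, radiator thickness) chosen for illustration, not the tested detector's actual parameters.

    ```python
    # Cherenkov ring angle and rough photon yield for an aerogel radiator.
    import math

    def cherenkov_angle_rad(n, beta=1.0):
        """Ring angle for refractive index n and particle velocity beta."""
        return math.acos(1.0 / (n * beta))

    def detected_photons(n, length_cm, n0_per_cm=50.0, beta=1.0):
        """Approximate photon count: detector quality factor N0 times
        radiator length times sin^2 of the Cherenkov angle."""
        return n0_per_cm * length_cm * math.sin(cherenkov_angle_rad(n, beta)) ** 2

    # Aerogel with n = 1.05 and a 2 cm radiator: a ~0.31 rad ring and
    # of order ten detected photons, before losses such as Rayleigh
    # scattering and photocathode dead space reduce the count further.
    theta = cherenkov_angle_rad(1.05)
    n_photons = detected_photons(1.05, 2.0)
    ```

    The small sin² θ_c of low-index aerogel is why the photon yield is intrinsically modest, and why the losses listed in the abstract matter so much for this detector type.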

    Investigation of Spatial and Temporal Aspects of Airborne Gamma Spectrometry: Final Report

    A study has been conducted which demonstrates the reproducibility of Airborne Gamma-ray Spectrometry (AGS) and the effects of changes in survey parameters, particularly line spacing. This has involved analysis of new data collected from estuarine salt marsh and upland areas in West Cumbria and SW Scotland during three phases of field work, in which over 150000 spectra were recorded with a 16 litre NaI(Tl) detector. The shapes and inventories of radiometric features have been examined. It has been shown that features with dimensions that are large relative to the survey line spacing are very well reproduced with all line spacings, whereas smaller features show more variability. The AGS technique has been applied to measuring changes in the radiation environment over a range of time scales from a few days to several years using data collected during this and previous surveys of the area. Changes due to sedimentation and erosion of salt marshes, and hydrological transportation of upland activity have been observed