115 research outputs found

    Aggregability is NP-hard


    Measuring the effect of node aggregation on community detection

    The nodes of a complex network are often aggregated, deliberately or not, because of technical, ethical, or legal limitations, or for privacy reasons. A common example is geographic position: one may uncover communities in a network of places, or of individuals identified by their typical geographical position, and then aggregate these places into larger entities, such as municipalities, thus obtaining another network. The communities found in the networks obtained at various levels of aggregation may exhibit varying degrees of similarity, from full alignment to complete independence. This is akin to the problem of ecological and atomistic fallacies in statistics, or to the Modifiable Areal Unit Problem in geography. We identify the class of community detection algorithms best suited to cope with node aggregation, and develop an index of aggregability, capturing the extent to which the aggregation preserves the community structure. We illustrate its relevance on real-world examples (mobile phone and Twitter reply-to networks). Our main message is that any node-partitioning analysis performed on aggregated networks should be interpreted with caution, as the outcome may be strongly influenced by the level of aggregation.
    Comment: 12 pages, 5 figures
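The notion of an aggregation preserving community structure can be illustrated with a deliberately simple stand-in score (this is not the aggregability index developed in the paper): the fraction of aggregation groups whose members all belong to a single community. A score of 1.0 means the aggregation never merges nodes across community boundaries.

```python
def aggregation_consistency(community, groups):
    # Fraction of aggregation groups whose member nodes all lie in a
    # single community; 1.0 means the grouping respects every boundary.
    pure = sum(1 for g in groups if len({community[n] for n in g}) == 1)
    return pure / len(groups)

# Toy network of six nodes in two communities "A" and "B".
community = {0: "A", 1: "A", 2: "A", 3: "B", 4: "B", 5: "B"}
aligned = [[0, 1], [2], [3, 4, 5]]   # groups respect community boundaries
mixed = [[0, 3], [1, 4], [2, 5]]     # every group straddles both communities

print(aggregation_consistency(community, aligned))  # 1.0
print(aggregation_consistency(community, mixed))    # 0.0
```

Under the mixed grouping, any community found on the aggregated network is forced to blend "A" and "B" nodes, which is exactly the caution the abstract raises about analyses on aggregated data.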

    The economic consequences of euro area modelling shortcuts

    The available empirical evidence suggests that non-negligible differences in economic structures persist among euro area countries. Because of these asymmetries, an area-wide modelling approach is arguably less reliable, from a strictly statistical viewpoint, than a multi-country one. This paper revolves around the following issue: are those (statistically detectable) asymmetries of any practical relevance when it comes to supporting monetary policy decision-making? To answer this question, we compute optimal parameter values of a Taylor-type rule, using a simple area-wide model and a multi-country model for the three largest economies in the euro area, and compare the corresponding optimized loss functions. The results suggest that the welfare underperformance of an area-wide modelling approach is likely to be far from trifling.
    Keywords: euro area, aggregation, monetary policy rules
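The exercise described above, choosing rule parameters to minimize a loss function, can be sketched in miniature. The snippet below is a toy illustration and not the paper's models: a single backward-looking inflation equation, a Taylor-type rule i = phi * pi, a quadratic loss in inflation and the interest rate, and a grid search over phi.

```python
import random

def loss(phi, a=0.8, b=0.3, lam=0.1, T=500, seed=0):
    """Average quadratic loss under a toy backward-looking process
    pi' = a*pi - b*(i - pi) + eps, with Taylor-type rule i = phi*pi.
    All parameter values are illustrative, not calibrated."""
    rng = random.Random(seed)
    pi, total = 0.0, 0.0
    for _ in range(T):
        i = phi * pi
        total += pi ** 2 + lam * i ** 2
        pi = a * pi - b * (i - pi) + rng.gauss(0, 0.5)
    return total / T

# Grid search for the loss-minimizing rule coefficient on [0, 3].
best_loss, best_phi = min((loss(phi / 10), phi / 10) for phi in range(31))
print(best_phi, best_loss)
```

Comparing the optimized losses of two such models (one aggregate, one disaggregated) is, in stylized form, the comparison the paper performs across the area-wide and multi-country specifications.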

    Nanosafety : towards safer nanoparticles by design

    Background: Nanosafety aims for a solution through the safer design (and re-design) of nanostructured materials, optimizing both performance and safety, by resolving which structural features lead to the desired properties and modifying them to avoid their detrimental effects without losing the desired nanoscale properties in the process. Starting with known toxic NPs, the final aim should be to re-design such detrimental NP characteristics and to redefine the way they are manipulated from the beginning to the end of their life cycle. Methods: The researchers reviewed literature on novel nanosafety strategies addressing the "safe-by-design" paradigm. Results: The potential hazards of engineered NPs are determined not only by the physicochemical properties of the NPs per se, but also by the interactions of these NPs with their immediate surrounding environments. The aim of promoting the timely and safe development of NPs cannot be achieved via traditional studies, as these address one material at a time. The development of a safer design strategy for engineered NPs requires an understanding of both their intrinsic (synthetic) properties and their extrinsic responses to external stimuli. Conclusions: We have summarized recent developments in novel nanosafety strategies addressing the "safe-by-design" paradigm for optimizing both performance and safety, allowing the comparison of results across studies and ultimately providing guidelines for the re-design of safer NPs. The resulting discussion is intended to provide guidelines for synthetic nanochemists on how to design NPs that are safe during their full life cycle while maintaining their desired parent properties.

    Hemodynamics

    Hemodynamics is the study of the mechanical and physiological properties controlling blood pressure and flow through the body. The factors influencing hemodynamics are complex and extensive. In addition to systemic hemodynamic alterations, microvascular alterations are frequently observed in critically ill patients. The book "Hemodynamics: New Diagnostic and Therapeutic Approaches" presents up-to-date research within the scope of hemodynamics by scientists from different backgrounds.

    PCD

    Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2010. Page 96 blank. Cataloged from PDF version of thesis. Includes bibliographical references (p. 87-95).
    The security of systems can often be expressed as ensuring that some property is maintained at every step of a distributed computation conducted by untrusted parties. Special cases include integrity of programs running on untrusted platforms, various forms of confidentiality and side-channel resilience, and domain-specific invariants. We propose a new approach, proof-carrying data (PCD), which sidesteps the threat of faults and leakage by reasoning about properties of a computation's output data, regardless of the process that produced it. In PCD, the system designer prescribes the desired properties of a computation's outputs. Corresponding proofs are attached to every message flowing through the system, and are mutually verified by the system's components. Each such proof attests that the message's data and all of its history comply with the prescribed properties. We construct a general protocol compiler that generates, propagates, and verifies such proofs of compliance, while preserving the dynamics and efficiency of the original computation. Our main technical tool is the cryptographic construction of short non-interactive arguments (computationally-sound proofs) for statements whose truth depends on "hearsay evidence": previous arguments about other statements. To this end, we attain a particularly strong proof-of-knowledge property. We realize the above, under standard cryptographic assumptions, in a model where the prover has black-box access to some simple functionality - essentially, a signature card.
    by Alessandro Chiesa. M.Eng.
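The message-flow structure of PCD, where every message carries a proof that its data and all of its history satisfy a prescribed compliance predicate, can be sketched with a deliberately non-cryptographic stand-in. In the snippet below, a MAC under a shared key merely plays the role of the proof: it provides none of the succinct-argument or untrusted-prover guarantees of the actual construction, and the compliance predicate is a hypothetical example chosen for illustration.

```python
import hashlib
import hmac

# Shared MAC key: a simplification standing in for a real proof system.
KEY = b"demo-only-key"

def compliant(value: int) -> bool:
    # Hypothetical designer-prescribed predicate: values stay non-negative.
    return value >= 0

def attest(value: int, history: bytes) -> bytes:
    # "Proof" over the current data and a digest of its entire history.
    return hmac.new(KEY, str(value).encode() + history, hashlib.sha256).digest()

def step(value, history, proof, f):
    # One computation step: verify the incoming message's proof, apply the
    # local function f, check compliance, and attach a fresh proof.
    if not hmac.compare_digest(attest(value, history), proof):
        raise ValueError("invalid incoming proof")
    out = f(value)
    if not compliant(out):
        raise ValueError("compliance predicate violated")
    new_history = hashlib.sha256(history + str(value).encode()).digest()
    return out, new_history, attest(out, new_history)

# A three-stage pipeline; each stage trusts only the proof, not the sender.
msg = (3, b"", attest(3, b""))
for f in (lambda x: x + 2, lambda x: x * 4, lambda x: x - 1):
    msg = step(*msg, f)
value, history, proof = msg
print(value)  # 19
```

Tampering with either the value or the history digest makes the final verification fail, which mirrors, in toy form, how PCD proofs attest to the full computation history rather than to a single hop.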

    Modelling consumption and constructing long-term baselines in final demand

    Modelling and projecting consumption, investment and government demand by detailed commodities in CGE models poses many data and methodological challenges. We review the state of knowledge of modelling consumption of commodities (price and income elasticities and demographics), as well as the historical trends that a model should be able to explain. We then discuss the current approaches taken in CGE models to project the trends in demand at various levels of commodity disaggregation. We examine the pros and cons of the various approaches to adjusting parameters over time, or using functions of time, and suggest a research agenda to improve modelling and projection. We compare projections out to 2050 using LES, CES and AIDADS functions in the same CGE model to illustrate the size of the differences. In addition, we briefly discuss the allocation of total investment and government demand to individual commodities.
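Of the demand systems compared above, the Linear Expenditure System (LES) has the simplest closed form and is easy to sketch: each commodity has a subsistence quantity, and supernumerary income (income above subsistence spending) is allocated by fixed marginal budget shares. The parameter values below are illustrative, not calibrated to any model.

```python
def les_demand(prices, income, gamma, beta):
    """Linear Expenditure System demands:
        x_i = gamma_i + (beta_i / p_i) * (Y - sum_j p_j * gamma_j)
    gamma: subsistence quantities; beta: marginal budget shares (sum to 1)."""
    supernumerary = income - sum(p * g for p, g in zip(prices, gamma))
    return [g + (b / p) * supernumerary
            for g, b, p in zip(gamma, beta, prices)]

prices = [1.0, 2.0]
x = les_demand(prices, income=100.0, gamma=[10.0, 5.0], beta=[0.4, 0.6])
print(x)  # [42.0, 29.0]; spending 1*42 + 2*29 = 100 exhausts income
```

Because the budget shares sum to one, total spending always equals income, one reason the LES is a common default despite its restrictive Engel curves relative to AIDADS.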