
    Decentralizing Trust with Resilient Group Signatures in Blockchains

    Blockchains have the goal of promoting the decentralization of transactions in a P2P-based internetworking model that does not depend on centralized trusted parties. Along with research on better scalability, performance, consistency control, and security guarantees in their service planes, other challenges aimed at better trust decentralization and fairness models are on the research community’s agenda today. Asymmetric cryptography and digital signatures are key components of blockchain systems. As a common flaw across different blockchains, public keys and verification of single-signed transactions are handled under the principle of trust centralization. In this dissertation, we propose a better fairness and trust decentralization model, in the form of a service plane for blockchains that supports collective digital signatures and allows transactions to be collaboratively authenticated and verified with group-based witnessed guarantees. The proposed solution is achieved by using resilient group signatures from randomly and dynamically assigned groups. In our approach we use Threshold-Byzantine Fault Tolerant Digital Signatures to improve the resilience and robustness of blockchain systems while preserving their decentralized nature. We have designed and implemented a modular and portable cryptographic provider that supports operations expressed by smart contracts. Our system is designed to be service-plane agnostic and adaptable to the base service planes of different blockchains. Therefore, we envision our solution as a portable, adaptable and reusable plugin service plane for blockchains, as a way to provide authenticated group-signed transactions with decentralized auditing, fairness, and long-term security guarantees, and to leverage a better decentralized trust model.
We conducted our experimental evaluations in a cloud-based testbench with at least sixteen blockchain nodes distributed across four different data centers, using two different blockchains and observing the proposed benefits.
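The (t, n) threshold-signature idea underlying this entry can be sketched with Shamir secret sharing, the building block of most threshold schemes. This is our own minimal illustration, not the dissertation's implementation: real Threshold-BFT signature schemes sign collaboratively without ever reconstructing the shared key, whereas this sketch reconstructs it to show the arithmetic.

```python
# Minimal (t, n) secret-sharing sketch over a prime field (illustrative only).
import random

P = 2**127 - 1  # a Mersenne prime; all polynomial arithmetic is mod P

def share_secret(secret, t, n):
    """Split `secret` into n shares; any t of them suffice to recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    total = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

shares = share_secret(123456789, t=3, n=5)
assert recover(shares[:3]) == 123456789   # any 3 of the 5 shares work
assert recover(shares[2:]) == 123456789
```

In a threshold signature scheme the same quorum logic applies to signature shares rather than key shares, which is what gives the group-signed transactions their Byzantine fault tolerance.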

    Handling imperfect information in criterion evaluation, aggregation and indexing


    A fuzzy hierarchical decision model and its application in networking datacenters and in infrastructure acquisitions and design

    According to several studies, an inordinate number of major business decisions to acquire, design, plan, and implement networking infrastructures fail. A networking infrastructure is a collaborative group of telecommunications systems providing services needed for a firm's operations and business growth. The analytical hierarchy process (AHP) is a well-established decision-making process used to analyze decisions related to networking infrastructures. AHP is concerned with decomposing complex decisions into a set of factors and solutions. However, AHP has difficulties in handling uncertainty in decision information. This study addressed the research question of solutions to AHP deficiencies. The solutions were accomplished through the development of a model capable of handling decisions with incomplete information and an uncertain decision operating environment. This model is based on the AHP framework and fuzzy sets theory. Fuzzy sets are sets whose memberships are gradual. A member of a fuzzy set may have a strong, weak, or moderate membership. The methodology for this study was based primarily on the analytical research design method, which is neither quantitative nor qualitative, but based on mathematical concepts, proofs, and logic. The model's constructs were verified by a simulated practical case study based on current literature and the input of networking experts. To further verify the research objectives, the investigator developed, tested, and validated a software platform. The results showed tangible improvements in analyzing complex networking infrastructure decisions. The ability of this model to analyze decisions with incomplete information and an uncertain economic outlook can be employed in socially important areas such as renewable energy, forest management, and environmental studies to achieve large savings
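The core fuzzy-AHP move described here, replacing crisp pairwise judgments with gradual memberships, is commonly sketched with triangular fuzzy numbers. The following is our own illustration under that common convention, not the model from the dissertation; the function names and the example criteria are hypothetical.

```python
# A pairwise judgment as a triangular fuzzy number (l, m, u): "somewhere
# between l and u, most plausibly m" instead of a single crisp AHP ratio.
def tfn_mul(a, b):
    """Approximate product of two triangular fuzzy numbers."""
    return (a[0] * b[0], a[1] * b[1], a[2] * b[2])

def centroid(t):
    """Defuzzify a triangular fuzzy number to a crisp value."""
    return sum(t) / 3.0

# Illustrative judgment: "cost is roughly 3x as important as reliability",
# stated with uncertainty rather than as an exact 3.
cost_vs_reliability = (2.0, 3.0, 4.0)
crisp_weight = centroid(tfn_mul(cost_vs_reliability, (1.0, 1.0, 1.0)))
print(crisp_weight)  # → 3.0
```

The point of the fuzzy extension is that the interval (l, u) carries the incomplete information through the aggregation, and defuzzification is deferred until a final ranking is needed.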

    Macroeconomic Modeling when Agents are Imperfectly Informed

    DSGE-models have become important tools of analysis not only in academia but increasingly in the board rooms of central banks. The success of these models has much to do with the coherence of the intellectual framework they provide. The limitations of these models come from the fact that they make very strong assumptions about the cognitive abilities of agents in understanding the underlying model. In this paper we relax this strong assumption. We develop a stylized DSGE-model in which individuals use simple rules of thumb (heuristics) to forecast the future inflation and output gap. We compare this model with the rational expectations version of the same underlying model. We find that the dynamics predicted by the heuristic model differ from the rational expectations version in some important respects, in particular in their capacity to produce endogenous economic cycles.

    Keywords: DSGE-model, imperfect information, heuristics, animal spirits
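Heuristic-switching models of this kind are often formalized with a discrete-choice rule: agents drift toward whichever rule of thumb has had the smaller recent forecast error. The sketch below is our own stylization of that general mechanism, not the paper's exact model; the two heuristics, the error values, and the intensity parameter are illustrative assumptions.

```python
# Agents choose between two forecasting heuristics via a logit rule on
# recent squared forecast errors (smaller error -> larger fraction of users).
import math

def fraction_using_a(err_a, err_b, gamma=1.0):
    """Share of agents using heuristic A; gamma is choice intensity."""
    ea, eb = math.exp(-gamma * err_a), math.exp(-gamma * err_b)
    return ea / (ea + eb)

# Heuristic A (naive): forecast = last observed inflation.
# Heuristic B (target): forecast = an assumed central-bank target of 2.0.
inflation_history = [2.0, 2.5, 3.0, 2.8]
naive_forecast = inflation_history[-1]
target_forecast = 2.0

frac_a = fraction_using_a(err_a=0.1, err_b=0.4)  # A performed better lately
consensus = frac_a * naive_forecast + (1 - frac_a) * target_forecast
```

The endogenous cycles the paper reports arise because these fractions feed back into realized inflation, which in turn reshuffles the fractions.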

    Addressing the Pension Dilemma in Canada

    The purpose of this paper is to advance understanding of defined benefit pension plans in Canada by focusing on important questions related to the funding, accounting and policy aspects of management of defined benefit pension plans. As well, the paper aims to impart a reasonable estimate of the standing of defined benefit pension plans at December 31, 2003 and to explore potential remedies for consideration by stakeholders, inclusive of legislators, regulators, standard setters, employers and members. The analysis shows that at December 31, 2003, 59% of Canadian defined benefit pension plans continued to be in deficit. That number rises to 95% if indexation of benefits is provided for. It is expected that an additional $160 billion is required to fully fund those deficits (assuming indexation of accrued benefits). Given that the future prospects of equities market performance are relatively conservative, it is unlikely that stock market returns alone will correct the situation, at least in the short term. On a solvency basis, $15 billion per year will need to be injected into defined benefit pension plans over the next five years to make up for the investment losses. Pension regulators must take a more proactive approach and monitor more closely pension plans that are in a deficit position.

    Keywords: defined benefit pension plan, funding position of pension plans, household savings, retirement savings, retirement income, household finance, pension accounting

    Bringing Order into Things: Decentralized and Scalable Ledgering for the Internet-of-Things

    The Internet-of-Things (IoT) is simultaneously the largest and the fastest growing distributed system known to date. With the expectation of 50 billion devices coming online by 2020, far surpassing the size of the human population, problems related to scale, trustability and security are anticipated. Current IoT architectures are inherently flawed as they are centralized on the cloud and rely on fragile trust-based relationships over a plethora of loosely integrated devices, leading to IoT platforms being non-robust for every party involved and unable to scale properly in the near future. The need for a new architecture that addresses these concerns is urgent as the IoT is progressively more ubiquitous, pervasive and demanding regarding the integration of devices and processing of data increasingly susceptible to reliability and security issues. In this thesis, we propose a decentralized ledgering solution for the IoT, leveraging a recent concept: blockchains. Rather than replacing the cloud, our solution presents a scalable and fault-tolerant middleware for recording transactions between peers, under verifiable and decentralized trustability assumptions and authentication guarantees for IoT devices, cloud services and users. Following on the emergent trend in modern IoT architectures, we leverage smart hubs as blockchain gateways, aggregating, pre-processing and forwarding small amounts of data and transactions in proximity conditions, that will be verified and processed as transactions in the blockchain. The proposed middleware acts as a secure ledger and establishes private channels between peers, requiring transactions in the blockchain to be signed using threshold signature schemes and group-oriented verification properties. The approach improves the decentralization and robustness characteristics under Byzantine fault-tolerance settings, while preserving the blockchain's distributed nature

    Advances and Applications of Dezert-Smarandache Theory (DSmT) for Information Fusion (Collected works), Vol. 2

    This second volume dedicated to Dezert-Smarandache Theory (DSmT) in Information Fusion brings in new quantitative fusion rules (such as PCR1-6, where PCR5 for two sources does the most mathematically exact redistribution of conflicting masses to the non-empty sets in the fusion literature), qualitative fusion rules, and the Belief Conditioning Rule (BCR), which is different from the classical conditioning rule used by the fusion community working with the Mathematical Theory of Evidence. Other fusion rules are constructed based on T-norm and T-conorm (hence using fuzzy logic and fuzzy sets in information fusion), or more general fusion rules based on N-norm and N-conorm (hence using neutrosophic logic and neutrosophic sets in information fusion), together with an attempt to unify the fusion rules and fusion theories. The known fusion rules are extended from the power set to the hyper-power set, and comparisons between rules are made on many examples. One defines the degree of intersection of two sets, degree of union of two sets, and degree of inclusion of two sets, which all help in improving all the existing fusion rules as well as the credibility, plausibility, and communality functions. The book chapters are written by Frederic Dambreville, Milan Daniel, Jean Dezert, Pascal Djiknavorian, Dominic Grenier, Xinhan Huang, Pavlina Dimitrova Konstantinova, Xinde Li, Arnaud Martin, Christophe Osswald, Andrew Schumann, Tzvetan Atanasov Semerdjiev, Florentin Smarandache, Albena Tchamova, and Min Wang

    Application of decision trees and multivariate regression trees in design and optimization

    Induction of decision trees and regression trees is a powerful technique not only for performing ordinary classification and regression analysis but also for discovering the often complex knowledge which describes the input-output behavior of a learning system in qualitative forms.

In the area of classification (discriminant analysis), a new technique called IDea is presented for performing incremental learning with decision trees. It is demonstrated that IDea's incremental learning can greatly reduce the spatial complexity of a given set of training examples. Furthermore, it is shown that this reduction in complexity can also be used as an effective tool for improving the learning efficiency of other types of inductive learners such as standard backpropagation neural networks.

In the area of regression analysis, a new methodology for performing multiobjective optimization has been developed. Specifically, we demonstrate that multiple-objective optimization through induction of multivariate regression trees is a powerful alternative to the conventional vector optimization techniques. Furthermore, in an attempt to investigate the effect of various types of splitting rules on the overall performance of the optimizing system, we present a tree partitioning algorithm which utilizes a number of techniques derived from diverse fields of statistics and fuzzy logic. These include: two multivariate statistical approaches based on dispersion matrices, an information-theoretic measure of covariance complexity which is typically used for obtaining multivariate linear models, two newly formulated fuzzy splitting rules based on Pearson's parametric and Kendall's nonparametric measures of association, Bellman and Zadeh's fuzzy decision-maximizing approach within an inductive framework, and finally, the multidimensional extension of a widely used fuzzy entropy measure.

The advantages of this new approach to optimization are highlighted by presenting three examples which respectively deal with the design of a three-bar truss, a beam, and an electric discharge machining (EDM) process
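The splitting step at the heart of any regression-tree induction can be shown in a few lines. The sketch below is a generic one-level split minimizing squared error, our own illustration of the baseline technique; the thesis's multivariate and fuzzy splitting rules are more elaborate, and the data here are made up.

```python
# Find the best binary split on one feature: the threshold that minimizes
# the total within-leaf sum of squared errors (SSE).
def sse(ys):
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys)

def best_split(xs, ys):
    """Return (threshold, total_sse) of the best binary split on xs."""
    pairs = sorted(zip(xs, ys))
    best_thr, best_cost = None, sse(ys)  # baseline: no split at all
    for k in range(1, len(pairs)):
        left = [y for _, y in pairs[:k]]
        right = [y for _, y in pairs[k:]]
        cost = sse(left) + sse(right)
        if cost < best_cost:
            best_thr = (pairs[k - 1][0] + pairs[k][0]) / 2
            best_cost = cost
    return best_thr, best_cost

xs = [1, 2, 3, 10, 11, 12]
ys = [1.0, 1.1, 0.9, 5.0, 5.2, 4.8]
thr, cost = best_split(xs, ys)  # splits between the two clusters, at 6.5
```

Multivariate regression trees generalize `sse` to a dispersion-matrix criterion over a vector of responses, which is what makes them usable for the multiobjective optimization described above.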

    Chance-constrained programming with fuzzy stochastic coefficients

    We consider fuzzy stochastic programming problems with a crisp objective function and linear constraints whose coefficients are fuzzy random variables, in particular of type L-R. To solve this type of problem, we formulate deterministic counterparts of chance-constrained programming with fuzzy stochastic coefficients, by combining constraints on the probability of satisfying constraints, as well as their possibility and necessity. We discuss the possible indices for comparing fuzzy quantities by putting together interval orders and statistical preference. We study the convexity of the set of feasible solutions under various assumptions. We also consider the case where fuzzy intervals are viewed as consonant random intervals. The particular cases of type L-R fuzzy Gaussian and discrete random variables are detailed
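The crisp special case of the deterministic counterpart idea is standard and easy to show: a chance constraint P(a·x ≤ b) ≥ α with a Gaussian coefficient becomes an ordinary inequality via the normal quantile. The sketch below shows only that classical crisp case, our illustration of the starting point on which the paper layers fuzziness; names and numbers are ours.

```python
# Deterministic equivalent of the single chance constraint
#   P(a * x <= b) >= alpha,  a ~ N(mu, sigma^2),  x >= 0
# which reduces to:  mu*x + z_alpha * sigma * x <= b.
from statistics import NormalDist

def chance_constraint_ok(x, b, mu, sigma, alpha):
    """Check the deterministic equivalent of the Gaussian chance constraint."""
    z = NormalDist().inv_cdf(alpha)   # normal quantile z_alpha
    return mu * x + z * sigma * x <= b

# With mu=1, sigma=0.5, alpha=0.95, feasibility requires (1 + 1.645*0.5)*x <= b.
assert chance_constraint_ok(1.0, 2.0, mu=1.0, sigma=0.5, alpha=0.95)
assert not chance_constraint_ok(2.0, 2.0, mu=1.0, sigma=0.5, alpha=0.95)
```

With L-R fuzzy random coefficients, the same reduction is carried out twice, once for possibility and once for necessity of constraint satisfaction, yielding a pair of deterministic constraints instead of one.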