145 research outputs found

    Disentangling Decentralized Finance (DeFi) Compositions

    We present the first study of compositions of Decentralized Finance (DeFi) protocols, which aim to disrupt traditional finance and offer financial services on top of distributed ledgers such as Ethereum. Starting from a ground truth of 23 DeFi protocols and 10,663,881 associated accounts, we study the interactions of DeFi protocols and associated smart contracts from a macroscopic perspective. We find that DEX and lending protocols have high degree centrality, that interactions among protocols primarily occur in a single strongly connected component, and that known community detection methods cannot disentangle DeFi protocols. Therefore, we propose an algorithm for extracting the building blocks and uncovering the compositions of DeFi protocols. We apply the algorithm and conduct an empirical analysis, finding that swaps are the most frequent building blocks and that DeFi aggregation protocols utilize functions of many other DeFi protocols. Overall, our results and methods contribute to a better understanding of a new family of financial products and could play an essential role in assessing systemic risks if DeFi continues to proliferate.
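    The macroscopic measurements mentioned in the abstract (degree centrality and the strongly connected component) can be illustrated on a toy protocol graph. The sketch below is not the paper's pipeline; the protocol names and edges are invented, and the real analysis runs on millions of contract interactions.

```python
# Illustrative sketch (not the paper's pipeline): degree centrality and
# strongly connected components on a hypothetical protocol call graph.
from collections import defaultdict

def degree_centrality(edges):
    """Degree centrality: degree divided by the (n - 1) possible neighbours."""
    nodes = {x for e in edges for x in e}
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    n = len(nodes)
    return {v: deg[v] / (n - 1) for v in nodes}

def strongly_connected_components(edges):
    """Kosaraju's algorithm: DFS finish order, then DFS on the reverse graph."""
    graph, rev = defaultdict(list), defaultdict(list)
    nodes = set()
    for u, v in edges:
        graph[u].append(v)
        rev[v].append(u)
        nodes.update((u, v))
    seen, order = set(), []
    def dfs(start, adj, out):
        stack = [(start, iter(adj[start]))]
        seen.add(start)
        while stack:
            node, it = stack[-1]
            nxt = next(it, None)
            if nxt is None:
                stack.pop()
                out.append(node)      # post-order = finish order
            elif nxt not in seen:
                seen.add(nxt)
                stack.append((nxt, iter(adj[nxt])))
    for u in nodes:
        if u not in seen:
            dfs(u, graph, order)
    seen.clear()
    comps = []
    for u in reversed(order):
        if u not in seen:
            comp = []
            dfs(u, rev, comp)
            comps.append(sorted(comp))
    return comps

# Invented interactions: a DEX and a lending protocol call each other,
# while an aggregator calls both.
edges = [("DEX", "Lending"), ("Lending", "DEX"),
         ("Aggregator", "DEX"), ("Aggregator", "Lending")]
print(degree_centrality(edges))
print(strongly_connected_components(edges))
```

    In this miniature, the DEX and the lending protocol form the strongly connected component while the aggregator sits outside it, loosely mirroring the paper's macroscopic findings.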

    The Governance of Decentralized Autonomous Organizations: A Study of Contributors' Influence, Networks, and Shifts in Voting Power

    We present a study analyzing the voting behavior of contributors, or vested users, in Decentralized Autonomous Organizations (DAOs). We evaluate their involvement in decision-making processes, discovering that in at least 7.54% of all DAOs, contributors, on average, held the majority necessary to control governance decisions. Furthermore, contributors have single-handedly decided at least one proposal in 20.41% of DAOs. Notably, contributors tend to be centrally positioned within the DAO governance ecosystem, suggesting the presence of inner power circles. Additionally, we observed a tendency for shifts in governance token ownership shortly before governance polls take place in 1202 (14.81%) of 8116 evaluated proposals. Our findings highlight the central role of contributors across a spectrum of DAOs, including Decentralized Finance protocols. Our research also offers important empirical insights pertinent to ongoing regulatory efforts aimed at increasing the transparency of DAO governance frameworks.
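    The majority-control check described in the abstract can be sketched as a token-weighted threshold test. All account names, balances, and the contributor set below are invented for illustration; the study itself operates on on-chain governance data.

```python
# Hypothetical sketch: do the contributors jointly hold a token-weighted
# majority? All names and balances are invented for illustration.
def contributors_hold_majority(voting_power, contributors, threshold=0.5):
    """True if the contributors' combined voting power exceeds `threshold`
    of the total token-weighted voting power."""
    total = sum(voting_power.values())
    held = sum(voting_power.get(a, 0.0) for a in contributors)
    return total > 0 and held / total > threshold

power = {"alice": 40.0, "bob": 25.0, "carol": 20.0, "dave": 15.0}
print(contributors_hold_majority(power, {"alice", "bob"}))   # 65% of supply
print(contributors_hold_majority(power, {"dave"}))           # 15% of supply
```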

    Assessing the Solvency of Virtual Asset Service Providers: Are Current Standards Sufficient?

    Entities such as centralized cryptocurrency exchanges fall under the business category of virtual asset service providers (VASPs). Like any other enterprise, they can become insolvent. VASPs enable the exchange, custody, and transfer of cryptoassets organized in wallets across distributed ledger technologies (DLTs). Despite the public availability of DLT transactions, the cryptoasset holdings of VASPs are not yet subject to systematic auditing procedures. In this paper, we propose an approach to assess the solvency of a VASP by cross-referencing data from three distinct sources: cryptoasset wallets, balance sheets from the commercial register, and data from supervisory entities. We investigate 24 VASPs registered with the Financial Market Authority in Austria and provide regulatory data insights, such as who the customers are and where they come from. Their yearly incoming and outgoing transaction volumes amount to 2 billion EUR for around 1.8 million users. We describe what financial services they provide and find that they are most similar to traditional intermediaries such as brokers, money exchanges, and funds, rather than to banks. Next, we empirically measure the DLT transaction flows of four VASPs and compare their cryptoasset holdings to balance sheet entries. The data are consistent for only two VASPs. This enables us to identify gaps in the data collection and propose strategies to address them. We remark that any entity in charge of auditing requires proof that a VASP actually controls the funds associated with its on-chain wallets. It is also important to report fiat and cryptoasset holdings and liability positions, broken down by asset type, at a reasonable frequency.
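    The cross-referencing idea can be sketched as a per-asset comparison of on-chain holdings against reported balance sheet positions. All figures and the tolerance threshold below are invented; as the abstract notes, a real audit would also require proof that the VASP controls the wallets.

```python
# Hypothetical sketch of the cross-referencing step: flag assets where
# on-chain holdings and reported balance sheet positions diverge by more
# than a tolerance. All figures and the threshold are invented.
def solvency_gaps(onchain, balance_sheet, tolerance=0.05):
    """Relative gap per asset, kept only where it exceeds `tolerance`."""
    gaps = {}
    for asset in set(onchain) | set(balance_sheet):
        held = onchain.get(asset, 0.0)
        reported = balance_sheet.get(asset, 0.0)
        base = max(abs(reported), 1e-12)   # avoid division by zero
        rel = abs(held - reported) / base
        if rel > tolerance:
            gaps[asset] = rel
    return gaps

onchain = {"BTC": 120.0, "ETH": 950.0}
reported = {"BTC": 118.0, "ETH": 1400.0, "EUR": 2.5e6}
print(solvency_gaps(onchain, reported))  # flags ETH and EUR, not BTC
```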

    Network-based indicators of Bitcoin bubbles

    The functioning of the cryptocurrency Bitcoin relies on the open availability of the entire history of its transactions. This makes it a particularly interesting socio-economic system to analyse from the point of view of network science. Here we analyse the evolution of the network of Bitcoin transactions between users, using the complete transaction history from December 5th 2011 to December 23rd 2013. This period includes three bubbles experienced by the Bitcoin price. In particular, we focus on the global and local structural properties of the user network and their variation in relation to the different periods of price surge and decline. By analysing the temporal variation of the heterogeneity of the connectivity patterns, we gain insight into the different mechanisms that take place during bubbles, and find that hubs (i.e., the most connected nodes) had a fundamental role in triggering the burst of the second bubble. Finally, we examine the local topological structures of interactions between users, discovering that the relative frequency of triadic interactions changes strongly before, during, and after a bubble, which suggests that the importance of the hubs grows during a bubble. These results provide further evidence that the behaviour of the hubs during bubbles significantly increases the systemic risk of the Bitcoin network, and we discuss the implications for public policy interventions.
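    The triadic interactions discussed above correspond to triangles in the user network. Below is a minimal, illustrative triangle counter on an invented edge list, not the paper's data or code.

```python
# Illustrative triangle (closed triad) counter on an invented undirected
# edge list; not the paper's data or code.
from collections import defaultdict
from itertools import combinations

def triangle_count(edges):
    """Count triangles: each is seen once per corner, hence the // 3."""
    adj = defaultdict(set)
    for u, v in edges:
        if u != v:
            adj[u].add(v)
            adj[v].add(u)
    tri = 0
    for u in adj:
        for v, w in combinations(sorted(adj[u]), 2):
            if w in adj[v]:
                tri += 1
    return tri // 3

edges = [("a", "b"), ("b", "c"), ("a", "c"),
         ("c", "d"), ("d", "e"), ("c", "e")]
print(triangle_count(edges))  # -> 2
```

    Here node "c" sits in both triangles, the toy analogue of a hub participating in many closed triads.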

    STAble: A novel approach to de novo assembly of RNA-seq data and its application in a metabolic model network based metatranscriptomic workflow

    Background: De novo assembly of RNA-seq data allows the study of the transcriptome in the absence of a reference genome, whether the data are obtained from a single organism or from a mixed sample, as in metatranscriptomics studies. Given the high number of sequences obtained from NGS approaches, a critical step in any analysis workflow is the assembly of reads to reconstruct transcripts, thus reducing the complexity of the analysis. Although many available tools show good sensitivity, there is a high percentage of false positives, due to the high number of assemblies considered, and the frequency of false positives is likely underestimated by currently used benchmarks. The reconstruction of non-existent transcripts may distort the biological interpretation of results, for example by overestimating the identification of "novel" transcripts. Moreover, benchmarks are usually based on RNA-seq data from annotated genomes, and assembled transcripts are compared to annotations and genomes to identify putatively good and wrong reconstructions; these tests alone may lead to accepting a particular type of false positive as true, as described below. Results: Here we present a novel de novo assembly methodology, implemented in a software tool named STAble (Short-reads Transcriptome Assembler). The novel concept of this assembler is that whole reads are used to determine possible alignments instead of smaller k-mers, with the aim of reducing the number of chimeras produced. Furthermore, we applied a new set of benchmarks based on simulated data to better define the performance of the assembly method and carefully identify true reconstructions. STAble was also used to build a prototype workflow to analyse metatranscriptomics data in connection with a steady-state metabolic modelling algorithm. This algorithm was used to produce high-quality metabolic interpretations of small gene expression sets obtained from already published RNA-seq data that we assembled with STAble. Conclusions: The presented results, albeit preliminary, clearly suggest that with this approach it is possible to identify informative reactions not directly revealed by the raw transcriptomic data.
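    The whole-read idea can be contrasted with k-mer decomposition through a toy suffix-prefix merge. This is not STAble's algorithm, only an illustration of joining reads by their longest exact overlap; the reads and the minimum-overlap parameter are invented.

```python
# Toy illustration (not STAble's algorithm): merge two reads by their
# longest exact suffix-prefix overlap, using the whole reads rather than
# k-mer decomposition. Reads and the minimum overlap are invented.
def merge_by_overlap(a, b, min_overlap=3):
    """Append read `b` to read `a` if a suffix of `a` of length >=
    `min_overlap` equals a prefix of `b`; otherwise return None."""
    for k in range(min(len(a), len(b)), min_overlap - 1, -1):
        if a.endswith(b[:k]):
            return a + b[k:]
    return None

print(merge_by_overlap("ACGTACGT", "ACGTTTT"))  # -> ACGTACGTTTT
print(merge_by_overlap("AAAA", "GGGG"))         # no overlap -> None
```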

    The Borexino detector at the Laboratori Nazionali del Gran Sasso

    Borexino, a large-volume detector for low-energy neutrino spectroscopy, is currently running underground at the Laboratori Nazionali del Gran Sasso, Italy. The main goal of the experiment is the real-time measurement of sub-MeV solar neutrinos, and particularly of the mono-energetic (862 keV) Be7 electron-capture neutrinos, via neutrino-electron scattering in an ultra-pure liquid scintillator. This paper is mostly devoted to the description of the detector structure, the photomultipliers, the electronics, and the trigger and calibration systems. The real performance of the detector, which always meets, and sometimes exceeds, design expectations, is also shown. Some important aspects of the Borexino project, i.e. the fluid-handling plants, the purification techniques, and the filling procedures, are not covered in this paper and are, or will be, published elsewhere (see Introduction and Bibliography). Comment: 37 pages, 43 figures, to be submitted to NI

    Results from the first use of low radioactivity argon in a dark matter search

    Liquid argon is a bright scintillator with potent particle-identification properties, making it an attractive target for direct-detection dark matter searches. Here the DarkSide-50 dark matter search reports the first WIMP search results obtained using a target of low-radioactivity argon. DarkSide-50 is a dark matter detector using a two-phase liquid argon time projection chamber, located at the Laboratori Nazionali del Gran Sasso. The underground argon is shown to contain Ar-39 at a level reduced by a factor of (1.4 +- 0.2) x 10^3 relative to atmospheric argon. We report a background-free null result from (2616 +- 43) kg d of data, accumulated over 70.9 live-days. When combined with our previous search using atmospheric argon, the 90% C.L. upper limit on the WIMP-nucleon spin-independent cross section, based on zero events found in the WIMP search regions, is 2.0 x 10^-44 cm^2 (8.6 x 10^-44 cm^2, 8.0 x 10^-43 cm^2) for a WIMP mass of 100 GeV/c^2 (1 TeV/c^2, 10 TeV/c^2). Comment: Accepted by Phys. Rev.
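    A quick back-of-the-envelope check of the quoted numbers: an exposure of (2616 +- 43) kg d accumulated over 70.9 live-days implies an active target mass of roughly 36.9 kg.

```python
# Back-of-the-envelope check (illustrative only): the quoted exposure of
# 2616 kg d over 70.9 live-days implies the active target mass.
exposure_kg_d = 2616.0
live_days = 70.9
target_mass_kg = exposure_kg_d / live_days
print(round(target_mass_kg, 1))  # -> 36.9
```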

    Diagnosis, treatment and prevention of pediatric obesity: consensus position statement of the Italian Society for Pediatric Endocrinology and Diabetology and the Italian Society of Pediatrics

    The Italian Consensus Position Statement on Diagnosis, Treatment and Prevention of Obesity in Children and Adolescents integrates and updates the previous guidelines to deliver an evidence-based approach to the disease. The following areas were reviewed: (1) obesity definition and causes of secondary obesity; (2) physical and psychosocial comorbidities; (3) treatment and care settings; (4) prevention. The main novelties deriving from the Italian experience lie in the definition, the screening of cardiometabolic and hepatic risk factors, and the endorsement of a staged approach to treatment. The evidence-based efficacy of behavioral intervention versus pharmacological or surgical treatments is reported. Lastly, prevention by promoting a healthful diet, physical activity, sleep patterns, and a healthy environment is strongly recommended, starting from the intrauterine phase.