
    A large study on the effect of code obfuscation on the quality of Java code

    Context: Obfuscation is a common technique used to protect software against malicious reverse engineering. Obfuscators manipulate the source code to make it harder to analyze and more difficult for an attacker to understand. Although different obfuscation algorithms and implementations are available, they have never been directly compared in a large-scale study. Aim: This paper aims to evaluate and quantify the effect of several different obfuscation implementations (both open source and commercial), to help developers and project managers decide which algorithms to use. Method: In this study we applied 44 obfuscations to 18 subject applications covering a total of 4 million lines of code. The effectiveness of these source code obfuscations was measured using 10 code metrics covering the modularity, size and complexity of the code. Results: Results show that some of the considered obfuscations are effective in making code metrics change substantially from original to obfuscated code, although this change (called the potency of the obfuscation) differs across metrics. In the paper we recommend which obfuscations to select, given the security requirements of the software to be protected.
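    The potency measure the abstract mentions is conventionally defined as the relative change of a metric from the original to the obfuscated program. A minimal sketch of that computation (the metric values below are invented for illustration):

```python
def potency(metric_original: float, metric_obfuscated: float) -> float:
    """Potency of an obfuscation with respect to one code metric:
    the relative change from original to obfuscated code.
    Positive values mean the metric grew (code became harder to analyze
    by this measure); zero means the obfuscation had no effect on it."""
    if metric_original == 0:
        raise ValueError("original metric value must be non-zero")
    return metric_obfuscated / metric_original - 1.0

# Hypothetical example: cyclomatic complexity 120 before, 300 after
print(potency(120.0, 300.0))  # → 1.5, i.e. +150%
```

Because potency is computed per metric, the same obfuscation can score high on complexity metrics and near zero on size metrics, which is exactly the metric-dependent behaviour the study reports.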

    An empirical analysis of source code metrics and smart contract resource consumption

    A smart contract (SC) is a programme stored in the Ethereum blockchain by a contract‐creation transaction. SC developers deploy an instance of the SC and attempt to execute it in exchange for a fee, paid in Ethereum coins (Ether). If the computation needed for its execution turns out to be larger than the effort proposed by the developer (i.e., the gasLimit), the client instantiation will not complete successfully. In this paper, we examine SCs from 11 Ethereum blockchain‐oriented software projects hosted on GitHub.com, and we evaluate the resources needed for their deployment (i.e., the gasUsed). For each of these contracts, we also extract a suite of object‐oriented metrics to evaluate their structural characteristics. Our results show a statistically significant correlation between some of the object‐oriented (OO) metrics and the resources consumed on the Ethereum blockchain network when deploying SCs. This result has a direct impact on how Ethereum developers engage with an SC: by evaluating its structural characteristics, they can produce a better estimate of the resources needed to deploy it. Further results show which source code metrics should be prioritised for particular application domains, when the projects are clustered around common themes.
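    The kind of correlation analysis the abstract describes — a rank correlation between an OO metric and deployment gas — can be sketched with a small self-contained Spearman implementation (the metric and gas figures below are invented; the paper's actual metric suite and statistics may differ):

```python
def ranks(xs):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # mean rank of the tied run
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: weighted-methods-per-class vs gasUsed at deployment
wmc      = [4, 9, 12, 20, 33]
gas_used = [310_000, 540_000, 700_000, 1_200_000, 2_100_000]
print(round(spearman(wmc, gas_used), 6))  # → 1.0 (monotonic toy data)
```

A rank correlation is a natural choice here because gas costs are heavily skewed across contracts, so a monotonic rather than linear association is what one would test for.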

    A framework for the simulation of structural software evolution

    This is the author's accepted manuscript. The final published article is available from the link below. Copyright © 2008 ACM. As functionality is added to an aging piece of software, its original design and structure will tend to erode. This can lead to high coupling, low cohesion and other undesirable effects associated with spaghetti architectures. The underlying forces that cause such degradation have been the subject of much research. However, progress in this field is slow, as its complexity makes it difficult to isolate the causal flows leading to these effects. This is further complicated by the difficulty of generating empirical data in sufficient quantity and attributing such data to specific points in the causal chain. This article describes a framework for simulating the structural evolution of software. A complete simulation model is built by incrementally adding modules to the framework, each of which contributes an individual evolutionary effect. These effects are then combined to form a multifaceted simulation that evolves a fictitious code base in a manner approximating real-world behavior. We describe the underlying principles and structures of our framework from a theoretical and user perspective; a validation of a simple set of evolutionary parameters is then provided, and three empirical software studies generated from open-source software (OSS) are used to support claims and generated results. The research illustrates how simulation can be used to investigate a complex and under-researched area of the development cycle. It also shows the value of incorporating certain human traits into a simulation: factors that, in real-world system development, can significantly influence evolutionary structures.
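    One individual evolutionary effect of the kind such a framework composes can be illustrated with a toy "rich get richer" rule: new modules preferentially depend on modules that are already highly coupled, so hub modules and rising coupling emerge over time. This sketch is an invented illustration of the idea of composable effects, not the framework described in the article:

```python
import random

def simulate_evolution(steps: int, seed: int = 42):
    """Toy structural-evolution effect: at each step a new module is
    added and it takes one dependency on an existing module, chosen
    with probability proportional to that module's current fan-in
    (preferential attachment). Returns the fan-in of every module."""
    rng = random.Random(seed)       # fixed seed for reproducibility
    fan_in = [1, 1]                 # start with two modules
    for _ in range(steps):
        # pick a dependency target weighted by existing fan-in
        r = rng.uniform(0, sum(fan_in))
        acc, target = 0.0, 0
        for i, w in enumerate(fan_in):
            acc += w
            if r <= acc:
                target = i
                break
        fan_in[target] += 1         # new module depends on target
        fan_in.append(1)            # the new module enters the system
    return fan_in

coupling = simulate_evolution(200)
# Hub modules accumulate coupling well above the average:
print(max(coupling) > sum(coupling) / len(coupling))  # → True
```

In a modular framework, an effect like this would be one plug-in among several (refactoring pressure, developer habits, and so on), and their combination produces the multifaceted behaviour the article studies.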

    Distributed Computing Grid Experiences in CMS

    The CMS experiment is currently developing a computing system capable of serving, processing and archiving the large number of events that will be generated when the CMS detector starts taking data. During 2004 CMS undertook a large-scale data challenge to demonstrate the ability of the CMS computing system to cope with a sustained data-taking rate equivalent to 25% of the startup rate. Its goals were: to run CMS event reconstruction at CERN for a sustained period at a 25 Hz input rate; to distribute the data to several regional centers; and to enable data access at those centers for analysis. Grid middleware was utilized to help complete all aspects of the challenge. To continue to provide scalable access from anywhere in the world to the data, CMS is developing a layer of software that uses Grid tools to gain access to data and resources, and that aims to provide physicists with a user-friendly interface for submitting their analysis jobs. This paper describes the data challenge experience with Grid infrastructure and the current development of the CMS analysis system.

    Some Findings Concerning Requirements in Agile Methodologies

    Agile methods have appeared as an attractive alternative to conventional methodologies. These methods try to reduce the time to market and, indirectly, the cost of the product through flexible development and deep customer involvement. The processes related to requirements have been extensively studied in the literature, in most cases within the frame of conventional methods. However, conclusions drawn for conventional methodologies are not necessarily valid for Agile; on some issues, conventional and Agile processes are radically different. As recent surveys report, inadequate handling of project requirements is one of the most problematic issues in Agile approaches, and a better understanding of it is needed. This paper describes some findings concerning requirements activities in a project developed under an Agile methodology. The project was intended to evolve an existing product and, therefore, some background information was available. The major difficulties encountered were related to non-functional needs and the management of requirements dependencies.

    Leading particle effect, inelasticity and the connection between average multiplicities in $e^+e^-$ and $pp$ processes

    The Regge-Mueller formalism is used to describe the inclusive spectrum of the proton in $pp$ collisions. From such a description the energy dependences of both the average inelasticity and the leading proton multiplicity are calculated. These quantities are then used to establish the connection between the average charged particle multiplicities measured in $e^+e^-$ and $pp/\bar{p}p$ processes. The description obtained for the leading proton cross section implies that Feynman scaling is strongly violated only at the extreme values of $x_F$, that is, in the central region ($x_F \approx 0$) and in the diffraction region ($x_F \approx 1$), while it is approximately observed in the intermediate region of the spectrum.
    Comment: 20 pages, 10 figures, to be published in Physical Review

    Measurement of triple gauge boson couplings from W⁺W⁻ production at LEP energies up to 189 GeV

    A measurement of triple gauge boson couplings is presented, based on W-pair data recorded by the OPAL detector at LEP during 1998 at a centre-of-mass energy of 189 GeV with an integrated luminosity of 183 pb⁻¹. After combining with our previous measurements at centre-of-mass energies of 161–183 GeV we obtain κ = 0.97_{-0.16}^{+0.20}, g_{1}^{z} = 0.991_{-0.057}^{+0.060} and λ = -0.110_{-0.055}^{+0.058}, where the errors include both statistical and systematic uncertainties and each coupling is determined by setting the other two couplings to their Standard Model values. These results are consistent with the Standard Model expectations.

    Determination of alpha_s using Jet Rates at LEP with the OPAL detector

    Hadronic events produced in e+e- collisions by the LEP collider and recorded by the OPAL detector were used to form distributions based on the number of reconstructed jets. The data were collected between 1995 and 2000 and correspond to centre-of-mass energies of 91 GeV, 130-136 GeV and 161-209 GeV. The jet rates were determined using four different jet-finding algorithms (Cone, JADE, Durham and Cambridge). The differential two-jet rate and the average jet rate with the Durham and Cambridge algorithms were used to measure alpha_s in the LEP energy range, by fitting an expression in which O(alpha_s^2) calculations were matched to an NLLA prediction. Combining the measurements at different centre-of-mass energies, the value of alpha_s(M_Z) was determined to be alpha_s(M_Z) = 0.1177 +- 0.0006 (stat.) +- 0.0012 (expt.) +- 0.0010 (had.) +- 0.0032 (theo.).
    Comment: 40 pages, 17 figures, submitted to Eur. Phys. J.
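    The Durham (kT) algorithm used in this measurement defines the distance between two particles as y_ij = 2 min(E_i, E_j)^2 (1 - cos θ_ij) / E_vis^2, and clusters the closest pair until all distances exceed a chosen y_cut. A minimal sketch of that clustering (the three-particle event below is invented; real analyses run on detector-corrected four-momenta):

```python
import math

def durham_y(p1, p2, e_vis: float) -> float:
    """Durham (kT) distance between two (E, px, py, pz) four-vectors:
    y_ij = 2 * min(E_i, E_j)^2 * (1 - cos theta_ij) / E_vis^2."""
    e1, e2 = p1[0], p2[0]
    dot = sum(a * b for a, b in zip(p1[1:], p2[1:]))
    mag1 = math.sqrt(sum(a * a for a in p1[1:]))
    mag2 = math.sqrt(sum(a * a for a in p2[1:]))
    cos_th = dot / (mag1 * mag2)
    return 2.0 * min(e1, e2) ** 2 * (1.0 - cos_th) / e_vis ** 2

def cluster(particles, y_cut: float) -> int:
    """Repeatedly merge the closest pair (E-scheme: add four-vectors)
    until every pairwise distance exceeds y_cut; return the jet count."""
    jets = [list(p) for p in particles]
    e_vis = sum(p[0] for p in particles)
    while len(jets) > 1:
        pairs = [(durham_y(jets[i], jets[j], e_vis), i, j)
                 for i in range(len(jets))
                 for j in range(i + 1, len(jets))]
        y, i, j = min(pairs)
        if y > y_cut:
            break
        jets[i] = [a + b for a, b in zip(jets[i], jets[j])]
        del jets[j]
    return len(jets)

# Two back-to-back hard particles plus a soft one collinear with the first:
event = [(45.0, 0.0, 0.0, 45.0),
         (45.0, 0.0, 0.0, -45.0),
         (2.0, 0.1, 0.0, 2.0)]
print(cluster(event, y_cut=0.01))  # → 2 jets
```

Counting jets as a function of y_cut yields the jet rates, and the differential two-jet rate mentioned in the abstract is the distribution of the y value at which an event flips from two to three jets.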