10 research outputs found

    Analysis of an inflection S-shaped software reliability model considering log-logistic testing-effort and imperfect debugging

    Get PDF
    Gokhale and Trivedi (1998) proposed the log-logistic software reliability growth model, which can capture the increasing/decreasing nature of the failure occurrence rate per fault. In this paper, we first show that a log-logistic testing-effort function (TEF) can be expressed as a software development/testing-effort expenditure curve. We investigate how to incorporate the log-logistic TEF into inflection S-shaped software reliability growth models based on a non-homogeneous Poisson process (NHPP). The model parameters are estimated by the least squares estimation (LSE) and maximum likelihood estimation (MLE) methods. Methods of data analysis and comparison criteria are presented. Experimental results from applications to actual data show a good fit. A comparative analysis evaluating the effectiveness of the proposed model against other existing models is also performed. The results show that the proposed models give fairly better predictions. Therefore, the log-logistic TEF is suitable for incorporation into inflection S-shaped NHPP growth models. In addition, the proposed models are discussed under an imperfect debugging environment.
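
The combination described in the abstract can be illustrated numerically. Below is a minimal sketch, assuming the standard log-logistic cumulative-effort form W(t) = N(λt)^κ / (1 + (λt)^κ) and the usual inflection S-shaped mean value function driven by consumed effort; all parameter names and values are illustrative assumptions, not the paper's fitted estimates.

```python
import math

def loglogistic_tef(t, N, lam, kappa):
    """Cumulative testing effort W(t) consumed by time t (log-logistic form).
    N: total available effort, lam: scale, kappa: shape."""
    x = (lam * t) ** kappa
    return N * x / (1.0 + x)

def inflection_s_mean_value(t, a, b, psi, N, lam, kappa):
    """Expected cumulative faults m(t): inflection S-shaped NHPP whose
    argument is the consumed testing effort W(t) rather than raw time.
    a: total faults, b: detection rate, psi: inflection parameter."""
    w = loglogistic_tef(t, N, lam, kappa)
    e = math.exp(-b * w)
    return a * (1.0 - e) / (1.0 + psi * e)
```

As a sanity check, m(0) = 0 and, as t grows, m(t) approaches a(1 − e^(−bN)) / (1 + ψe^(−bN)), since the log-logistic TEF saturates at N.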

    A Modified Hybrid Encryption Method

    Get PDF
    The explanatory note to the thesis "A Modified Hybrid Encryption Method" consists of 100 pages and contains 34 figures, 10 tables, and 49 literature references. The object of development is a software tool for the steganographic signing of an electronic image. The purpose of the work is to increase the efficiency of the existing contract-management process, reduce time costs, and organize the company's database of counterparties by introducing the developed software tool. The research method is the development of a program for concealing textual information in an image using modified embedding algorithms. A thorough analysis of the subject area was performed, which showed that the use of cryptography remains one of the most important aspects for further development and improvement of the program. Equally important is the development of embedding algorithms more resistant to steganalysis, and support for a wider range of formats as containers and stego-messages: not only raster images and textual information, but files of any format. The results of the thesis are a methodology and a software tool that can be used to increase the effectiveness of hiding textual information in an image using modified embedding algorithms.
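
The general idea of hiding text in an image can be sketched generically. The following is a minimal illustration of plain LSB (least-significant-bit) embedding into a flat list of pixel values; it is not the thesis' modified algorithms, whose details the abstract does not give, and the 32-bit length header is an assumption of this sketch.

```python
def embed_lsb(pixels, message):
    """Hide message bytes in the least significant bits of pixel values.
    pixels: flat list of 0-255 ints; a 32-bit big-endian length header
    precedes the payload so the extractor knows how much to read."""
    data = len(message).to_bytes(4, "big") + message
    bits = [(byte >> (7 - i)) & 1 for byte in data for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for message")
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # overwrite only the lowest bit
    return out

def extract_lsb(pixels):
    """Recover the hidden bytes by reading the LSBs back, MSB-first."""
    def read_bytes(start, n):
        value = bytearray()
        for b in range(n):
            byte = 0
            for i in range(8):
                byte = (byte << 1) | (pixels[start + b * 8 + i] & 1)
            value.append(byte)
        return bytes(value)
    length = int.from_bytes(read_bytes(0, 4), "big")
    return read_bytes(32, length)
```

Changing a pixel's lowest bit alters its value by at most 1, which is why the cover image looks unchanged; this naive scheme is, however, exactly what statistical steganalysis targets, hence the thesis' interest in more robust embedding.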

    Framework for modeling software reliability, using various testing-efforts and fault-detection rates

    No full text
    This paper proposes a new scheme for constructing software reliability growth models (SRGM) based on a nonhomogeneous Poisson process (NHPP). The main focus is to provide an efficient parametric decomposition method for software reliability modeling, which considers both testing efforts and fault detection rates (FDR). In general, the software fault detection/removal mechanisms depend on previously detected/removed faults and on how testing efforts are used. From practical field studies, it is likely that we can estimate the testing-effort consumption pattern and predict the trends of FDR. A set of time-variable, testing-effort-based FDR models was developed that has the inherent flexibility of capturing a wide range of possible fault detection trends: increasing, decreasing, and constant. This scheme has a flexible structure and can model a wide spectrum of software development environments, considering various testing efforts. The paper describes the FDR, which can be obtained from historical records of previous releases or other similar software projects, and incorporates the related testing activities into this new modeling approach. The applicability of our model and the related parametric decomposition methods are demonstrated through several real data sets from various software projects. The evaluation results show that the proposed framework to incorporate testing efforts and FDR for SRGM has a fairly accurate prediction capability and it depicts the real-life situation more faithfully. This technique can be applied to a wide range of software systems.

    Software aging and rejuvenation: 20 years (1995-2014) - overview and challenges

    Get PDF
    Although software aging and rejuvenation (SAR) is a young research field, a lot of knowledge has been produced in its first 20 years. Nowadays, important scientific journals and conferences include SAR-related topics in their scope of interest. This fast growth and the wide range of dissemination venues pose a challenge to researchers trying to keep track of new findings and trends in the area. In this work, we collected and analyzed SAR research data to detect trends, patterns, and thematic gaps, in order to provide a comprehensive view of this research field over its first 20 years. We adopted the systematic mapping approach to answer research questions such as: How have the main topics investigated in SAR evolved over time? Which are the most investigated aging effects? Which rejuvenation techniques and strategies are most frequently used? Master's dissertation; supported by CAPES (Coordenação de Aperfeiçoamento de Pessoal de Nível Superior).

    Software Reliability Growth Models from the Perspective of Learning Effects and Change-Point

    Get PDF
    Increased attention to the reliability of software systems has led to thorough analysis of the reliability growth process for predicting and assessing software reliability in the testing or debugging phase. With many frameworks available in terms of the underlying probability distributions, such as the Poisson process, the Non-Homogeneous Poisson Process (NHPP), and the Weibull distribution, many researchers have developed models using the NHPP analytical framework. The behavior of interest is usually S-shaped or exponential; S-shaped behavior relates more closely to human learning. The need to develop different models stems from the fact that the nature of the underlying environment, learning-effect acquisition during testing, resource allocation, the application, and the failure data itself all vary. There is no universal model that fits everywhere and could be called an oracle. Learning effects that stem from the experience of the testing or debugging staff have been considered for the growth of reliability. Learning varies over time, which underscores the need for further research on learning effects. Digital copy of thesis, University of Kashmir.
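
A change-point model of the kind named in the title can be illustrated with the simplest case: a Goel-Okumoto mean value function whose detection rate switches from b1 to b2 at a change point τ (for example, once the testing staff's learning plateaus). This specific two-phase form is an illustrative assumption, not necessarily the model developed in the thesis.

```python
import math

def change_point_go(t, a, b1, b2, tau):
    """Goel-Okumoto mean value function with one change point at tau:
    the fault-detection rate is b1 before tau and b2 after. The
    accumulated exponent keeps m(t) continuous at the change point."""
    if t <= tau:
        exponent = b1 * t
    else:
        exponent = b1 * tau + b2 * (t - tau)
    return a * (1.0 - math.exp(-exponent))
```

Because the exponent accumulates b1·τ before switching slope, the curve is continuous at τ but its derivative jumps, which is exactly the signature a change-point analysis looks for in failure data.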

    On the Viability of Quantitative Assessment Methods in Software Engineering and Software Services

    Get PDF
    IT help desk operations are expensive, and the associated costs present challenges to profit goals. Help desk managers need a way to plan staffing levels so that labor costs are minimized while problems are resolved efficiently; an incident prediction method is needed for such planning. A solution to this problem is valuable to an IT service provider because software failures are inevitable and their timing is difficult to predict. In this research, a cost model for help desk operations is developed. The cost model relates predicted incidents to labor costs using real help desk data. Incidents are predicted using software reliability growth models. Cluster analysis is used to group products with similar help desk incident characteristics, and Principal Components Analysis is used to select one product per cluster for predicting incidents for all members of the cluster. Incident prediction accuracy using cluster representatives is demonstrated successfully for all clusters, with accuracy comparable to making predictions for each product in the portfolio. Linear regression on incident-resolution cost data relates incident predictions to help desk labor costs. Following a series of four pilot studies, the cost model is validated by demonstrating cost prediction accuracy for one-month prediction intervals over a 22-month period.
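
The chain from reliability model to labor cost can be sketched in two steps: predict the incidents expected in an interval from an SRGM, then map them to cost with a linear relation. A minimal sketch, assuming a Goel-Okumoto form and hypothetical cost parameters; the research's actual models and regression coefficients are not given in the abstract.

```python
import math

def predicted_incidents(t1, t2, a, b):
    """Expected incidents in the interval [t1, t2] from a Goel-Okumoto
    SRGM with mean value function m(t) = a * (1 - exp(-b * t))."""
    m = lambda t: a * (1.0 - math.exp(-b * t))
    return m(t2) - m(t1)

def monthly_labor_cost(incidents, fixed_cost, cost_per_incident):
    """Illustrative linear cost model: a fixed staffing baseline plus a
    per-incident resolution cost (coefficients would come from regression)."""
    return fixed_cost + cost_per_incident * incidents
```

The planning use is the reverse direction: given next month's predicted incident count, the manager reads off the staffing budget rather than reacting after incidents arrive.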

    Software reliability modeling and analysis

    Get PDF
    Ph.D. (Doctor of Philosophy)

    Reliability models and analyses of the computing systems

    Get PDF
    Ph.D. (Doctor of Philosophy)