
    Modification of EVA in Value Based Management

    This article deals with value-based management and the metrics used in it. Nowadays, one of the basic requirements placed on a company is the creation of value for its shareholders, which is why managers have adopted value-oriented methods that can measure any change in value. In the past, managers of many companies focused on what they considered the main economic objective: the maximization of profit. But this attitude was not sufficient to satisfy shareholders, whose preference lies in the value of the company. The article describes the most common value-oriented indicator, economic value added (EVA). The EVA concept was developed by Stern Stewart & Company, who criticized traditional indicators such as ROA, ROI, PAT, and EPS for their weak explanatory power in terms of value creation. Company valuation through EVA is an appropriate means of determining the creditworthiness of a company, and the EVA calculation also provides valuable information for various areas of management. The metric quantifies the value added as a result of operational activities during the reference period. Moreover, many modifications of EVA are used in practice, such as MVA, CVA, SVA, and RONA, which are described and compared in the article.
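The EVA and MVA calculations described in the abstract can be sketched in a few lines. EVA is commonly defined as net operating profit after taxes (NOPAT) minus a capital charge (WACC times invested capital), and MVA as market value minus invested capital. The figures below are illustrative assumptions, not values from the article:

```python
def eva(nopat, wacc, invested_capital):
    """Economic Value Added: operating profit left after charging
    for the cost of all capital employed."""
    return nopat - wacc * invested_capital

def mva(market_value, invested_capital):
    """Market Value Added: market value minus capital invested."""
    return market_value - invested_capital

# Illustrative numbers (not from the article):
print(eva(nopat=1_200_000, wacc=0.10, invested_capital=8_000_000))  # 400000.0
print(mva(market_value=10_000_000, invested_capital=8_000_000))     # 2000000
```

A positive EVA means the period's operations earned more than the full cost of the capital they tied up; the other variants (CVA, SVA, RONA) adjust what counts as profit and as capital.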

    Performance Based Logistics (PBL) for the FA-18/S-3/P-3/C-2 auxiliary power unit (APU) at Honeywell: an applied analysis

    MBA Professional Report. The purpose of this MBA project is to evaluate and assess the metrics, incentives, and other terms and conditions of the Performance Based Logistics (PBL) contract between the Naval Aviation Inventory Control Point (NAVICP) and Honeywell in support of the FA-18/S-3/P-3/C-2 Auxiliary Power Unit (APU), to determine whether the contractual terms and conditions established are effective in facilitating and encouraging the full potential of PBL savings and improved performance. PBL is an acquisition reform initiative intended to improve weapon system logistics with the goals of: 1) compressing the supply chain, 2) eliminating non-value-added steps, 3) reducing total ownership costs, 4) improving weapon system readiness and reliability, and 5) reducing the logistics footprint. PBL entails buying measurable outcomes, with metrics based on war-fighter stated performance requirements. War-fighter requirements should be linked to metrics, and metrics should be linked to contract incentives. An additional element of PBL is gain sharing, which ensures the contractor's profit and the government's increased performance at reduced cost. Based on the elements of PBL, the objectives of this project include: 1) comparing the actions, activities, and accomplishments of the contract to the goals of PBL, 2) measuring and appraising the attainment of those goals, 3) providing information about the major factors causing the observed effects, and 4) identifying and analyzing the metrics and incentives for their effectiveness in achieving the desired outcomes. Our findings indicate that non-value-added steps were eliminated and there were potential reductions in the logistics footprint. On the other hand, the supply chain was not compressed; aircraft maintenance costs did not decrease; and, more importantly, APU reliability for the FA-18, S-3, and C-2 did not improve.
For the P-3, reliability improved by 7% to 19%, but not by the 300% per the contract guarantee. Our research also determined that the reliability metric was inappropriate for measuring and tracking APU reliability improvements. Additionally, disincentives were provided for not meeting contract requirements, with the only contract incentive being the award-term contract arrangement. Our report provides recommendations specific to the APU TLS contract as well as recommendations for other PBL applications, including alternative contract pricing and gain-sharing methods and appropriate metrics and incentives that reflect the true definition of PBL.
http://archive.org/details/performancebased109459987
US Marine Corps (USMC) author. Approved for public release; distribution is unlimited.
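The gain-sharing element of PBL mentioned above can be illustrated with a minimal sketch: cost savings against an agreed baseline are split between contractor and government. The split ratio and cost figures are hypothetical, not terms of the NAVICP/Honeywell contract:

```python
def gain_share(baseline_cost, actual_cost, contractor_share=0.5):
    """Split cost savings between contractor and government.
    Ratio and figures are illustrative, not from the contract."""
    savings = max(baseline_cost - actual_cost, 0.0)  # no sharing on overruns
    contractor = savings * contractor_share
    government = savings - contractor
    return contractor, government

# Illustrative: $10M baseline, $9.2M actual -> $0.8M savings split 50/50
print(gain_share(10_000_000, 9_200_000))  # (400000.0, 400000.0)
```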

    Social networks as a service in modern enterprises

    The power of social networks stems from their ability to capture real-world phenomena such as collaboration, competition, and partnerships. Social networks provide means for enterprises to capture and expose many informal connections between their stakeholders. In this paper, we discuss how social networks could sustain growth and unfold business opportunities in modern enterprises. Furthermore, we study various types of social networks and investigate metrics that measure their value-added to enterprises. In response to business-oriented social network requirements, we propose a multi-tenant architecture to develop Social-Networks-as-a-Service (SNaaS) and allow efficient use of server resources while reducing maintenance costs and providing a high degree of customization to support each enterprise's requirements. ©2009 IEEE
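A minimal sketch of the multi-tenant idea: each tenant's network lives under its own key on shared infrastructure, and a simple metric (degree, i.e., the number of ties per stakeholder) is computed per tenant. The class and method names are illustrative, not the paper's SNaaS API:

```python
from collections import defaultdict

class SNaaS:
    """Toy multi-tenant social-network store: tenants share one
    service instance, but each tenant's edges are kept isolated."""
    def __init__(self):
        self._edges = defaultdict(list)  # tenant -> list of (a, b) ties

    def add_tie(self, tenant, a, b):
        self._edges[tenant].append((a, b))

    def degree(self, tenant):
        """Number of ties per stakeholder within one tenant's network."""
        counts = defaultdict(int)
        for a, b in self._edges[tenant]:
            counts[a] += 1
            counts[b] += 1
        return dict(counts)

svc = SNaaS()
svc.add_tie("acme", "alice", "bob")
svc.add_tie("acme", "alice", "carol")
print(svc.degree("acme"))  # {'alice': 2, 'bob': 1, 'carol': 1}
```

A real SNaaS deployment would add per-tenant schema customization and access control; the point here is only the tenant-scoped data and metrics.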

    Developing a Web Analytics Strategy for the National Science Digital Library

    In August 2004, a two-day workshop was held on "Developing a Web Analytics Strategy for the National Science Digital Library (NSDL)." The workshop was sponsored by the NSDL Educational Impact and Evaluation Standing Committee (EIESC) and was jointly organized with the NSDL Technology Standing Committee (TSC). It brought together 26 representatives from government and industry, as well as some of the projects funded by the National Science Foundation (NSF) NSDL program, to discuss how web metrics could be implemented in a pilot study to identify current NSDL use and to develop strategies to support the collection of usage data across NSDL in the future. This new pilot follows a study that the EIESC conducted in 2002 to identify and collect basic web metrics data for NSDL. A bibliography on web metrics was prepared and distributed to the participants of the 2004 workshop. During the workshop, participants first reviewed the processes and technology used to gather web metrics data by two different organizations: the Association of Research Libraries E-Metrics Project and Sun Microsystems. Through a series of breakout and plenary sessions, participants identified high-level goals for the new pilot study, formulated and prioritized a list of desired effects and requirements for collecting web metrics across NSDL, and developed recommendations for implementing web metrics data collection at the project and program level. The workshop concluded with the EIESC and TSC establishing a joint taskforce to lead the pilot study in NSDL over the next year. Web analytics will be used to address two high-level goals: 1) that high-quality learning resources be accessible to a large spectrum of the US population, and 2) that there be value added to users and projects by participating in NSDL. This workshop report provides a brief history of previous evaluation activities across NSDL and discusses the importance of web analytics to NSDL.
After a review of the literature on web metrics, the report identifies cross-cutting issues that affect implementing web metrics in the upcoming pilot study (e.g., build vs. buy, data ownership and storage, organizational structure that supports ongoing data collection, user privacy); describes the goals and requirements for the pilot study; and lists near-term action items for the joint task force. Documents from the workshop, including a preliminary report entitled "Workshop on Web Metrics in NSDL," slides from the ARL and Sun Microsystems presentations, participant statements, and the web metrics bibliography can be found on the workshop website.
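As a rough illustration of the basic web metrics such a pilot study collects, the sketch below computes page views and unique visitors from simplified access-log lines. The "ip path" log format is an assumption for the example, not NSDL's actual data:

```python
def web_metrics(lines):
    """Basic usage metrics from simplified access-log lines "ip path":
    total page views and distinct visiting IPs."""
    visitors, views = set(), 0
    for line in lines:
        ip, _path = line.split(maxsplit=1)
        visitors.add(ip)
        views += 1
    return {"page_views": views, "unique_visitors": len(visitors)}

log = ["1.2.3.4 /resource/42", "1.2.3.4 /search", "5.6.7.8 /resource/42"]
print(web_metrics(log))  # {'page_views': 3, 'unique_visitors': 2}
```

Counting IPs as visitors is the crudest option; the report's cross-cutting issues (user privacy, data ownership) bear directly on what a production pipeline could actually store.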

    A framework for the simulation of structural software evolution

    This is the author's accepted manuscript. The final published article is available from the link below. Copyright @ 2008 ACM.
    As functionality is added to an aging piece of software, its original design and structure tend to erode. This can lead to high coupling, low cohesion, and other undesirable effects associated with spaghetti architectures. The underlying forces that cause such degradation have been the subject of much research. However, progress in this field is slow, as its complexity makes it difficult to isolate the causal flows leading to these effects. This is further complicated by the difficulty of generating empirical data in sufficient quantity and attributing such data to specific points in the causal chain. This article describes a framework for simulating the structural evolution of software. A complete simulation model is built by incrementally adding modules to the framework, each of which contributes an individual evolutionary effect. These effects are then combined to form a multifaceted simulation that evolves a fictitious code base in a manner approximating real-world behavior. We describe the underlying principles and structures of our framework from a theoretical and user perspective; a validation of a simple set of evolutionary parameters is then provided, and three empirical software studies generated from open-source software (OSS) are used to support claims and generated results. The research illustrates how simulation can be used to investigate a complex and under-researched area of the development cycle. It also shows the value of incorporating certain human traits into a simulation, factors that, in real-world system development, can significantly influence evolutionary structures.
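A toy version of one such evolutionary effect can convey the idea: each change to a fictitious code base has some probability of introducing a new inter-module dependency, so coupling tends to grow with accumulated changes. The probabilities and sizes are invented for illustration and are far simpler than the paper's framework:

```python
import random

def simulate(n_modules=10, n_changes=200, p_new_dep=0.3, seed=1):
    """Evolve a fictitious code base: each change may add a new
    directed dependency between two random modules. Returns the
    fraction of possible couplings in use after all changes."""
    random.seed(seed)
    deps = set()
    for _ in range(n_changes):
        if random.random() < p_new_dep:
            a, b = random.sample(range(n_modules), 2)
            deps.add((a, b))
    max_deps = n_modules * (n_modules - 1)  # ordered pairs
    return len(deps) / max_deps

# Higher change counts drive coupling toward saturation:
print(simulate(n_changes=50), simulate(n_changes=500))
```

The paper's framework composes many such modules (and human factors) into one simulation; this sketch shows only the shape of a single effect and why accumulated change erodes structure.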

    Design of multimedia processor based on metric computation

    Media-processing applications, such as signal processing, 2D and 3D graphics rendering, and image compression, are the dominant workloads in many embedded systems today. The real-time constraints of these media applications place taxing demands on today's processors, which must deliver performance at low cost and low power with reduced design delay. To meet these challenges, a fast and efficient strategy consists of upgrading a low-cost general-purpose processor core. This approach is based on the personalization of a general RISC processor core according to the requirements of the target multimedia application. Thus, if the extra cost is justified, the general-purpose processor (GPP) core can be extended with instruction-level coprocessors, coarse-grain dedicated hardware, ad hoc memories, or new GPP cores. In this way the final design solution is tailored to the application requirements. The proposed approach is based on three main steps: the first is the analysis of the targeted application using efficient metrics; the second is the selection of the appropriate architecture template according to the results and recommendations of the first step; the third is the architecture generation. The approach is evaluated on various image and video algorithms, demonstrating its feasibility.
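The first step (application analysis using metrics) can be illustrated with a sketch that computes an instruction-mix profile and flags operation classes whose share of execution might justify dedicated hardware. The profile, class names, and threshold are hypothetical, not the paper's metrics:

```python
def hotspots(profile, threshold=0.25):
    """From executed-operation counts, return each class's share of the
    total for classes at or above the threshold: candidates for an
    instruction-level coprocessor or dedicated hardware."""
    total = sum(profile.values())
    return {op: n / total for op, n in profile.items() if n / total >= threshold}

# Illustrative profile of a media kernel (counts are invented):
profile = {"mac": 550, "load_store": 300, "branch": 100, "other": 50}
print(hotspots(profile))  # {'mac': 0.55, 'load_store': 0.3}
```

Here a multiply-accumulate-heavy mix would point the second step toward an architecture template with a MAC coprocessor, if the extra cost is justified.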

    A feature-similarity model for product line engineering


    Using quality models in software package selection

    The growing importance of commercial off-the-shelf software packages requires adapting some software engineering practices, such as requirements elicitation and testing, to this emergent framework. Some specific new activities also arise, among which the selection of software packages plays a prominent role. All the methodologies that have been proposed recently for choosing software packages compare user requirements with the packages' capabilities. There are different types of requirements, such as managerial, political, and, of course, quality requirements. Quality requirements are often difficult to check. This is partly due to their nature, but there is another reason that can be mitigated, namely the lack of structured and widespread descriptions of package domains (that is, categories of software packages such as ERP systems, graphical or data structure libraries, and so on). This absence hampers the accurate description of software packages and the precise statement of quality requirements, and consequently overall package selection and confidence in the result of the process. Our methodology for building structured quality models helps solve this drawback.
Peer reviewed. Postprint (published version).
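The comparison of user quality requirements against package capabilities can be sketched as a weighted score over quality-model attributes. The attributes, weights, and capability scores below are invented for illustration, not taken from the paper's quality models:

```python
def score(package_scores, weights):
    """Weighted sum of a package's capability scores over the
    quality attributes the user cares about (missing attrs score 0)."""
    return sum(weights[attr] * package_scores.get(attr, 0.0)
               for attr in weights)

# Hypothetical quality requirements (weights sum to 1) and candidates:
weights = {"reliability": 0.5, "usability": 0.3, "maintainability": 0.2}
candidates = {
    "pkg_a": {"reliability": 0.9, "usability": 0.6, "maintainability": 0.4},
    "pkg_b": {"reliability": 0.7, "usability": 0.9, "maintainability": 0.8},
}
best = max(candidates, key=lambda p: score(candidates[p], weights))
print(best)  # pkg_b
```

A structured quality model for the package domain is what makes the attribute list and scores meaningful and comparable across candidates, which is the gap the paper's methodology addresses.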