
    Resource Management for Enhancing Predictability in Systems with Limited Processing Capabilities

    There is an increasing demand for computing systems composed of heterogeneous computers, connected by different types of networks, that allow a wide range of services to be accessed in a seamless way. Some of those computers are mobile or embedded devices with limited resources, and can be overloaded when trying to handle their users' demands. In that situation it is not possible to ensure the proper behaviour of the running applications, which can be a serious problem when dealing with critical events in healthcare, home surveillance, or forest monitoring. Resource reservation is a valid basis for handling this issue: it guarantees a certain resource share to applications that are important for the proper behaviour of a given system. This paper describes an implementation of a resource management component and its integration in the Linux kernel. This piece of software makes it possible to assign CPU budgets to standard Java threads, an important facility given the widespread use of this programming language. The implementation has been validated on a service-oriented middleware, where relevant services are executed by threads with guaranteed budgets to improve its predictability.
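The paper's kernel-level mechanism is not reproduced in the abstract; as a rough illustration of the resource-reservation idea, the following is a minimal token-bucket budget model in Python (class and task names are invented for illustration):

```python
class BudgetedTask:
    """Illustrative CPU-budget reservation: a task may consume at most
    `budget_ms` of CPU time per `period_ms` window (a token-bucket model)."""

    def __init__(self, name, budget_ms, period_ms):
        self.name = name
        self.budget_ms = budget_ms
        self.period_ms = period_ms
        self.remaining = budget_ms  # budget left in the current period

    def replenish(self):
        """Called at each period boundary: restore the full budget."""
        self.remaining = self.budget_ms

    def try_run(self, requested_ms):
        """Grant CPU time up to the remaining budget; return what was granted."""
        granted = min(requested_ms, self.remaining)
        self.remaining -= granted
        return granted

# A guaranteed task keeps its reserved share regardless of other load.
guaranteed = BudgetedTask("alarm-handler", budget_ms=30, period_ms=100)
print(guaranteed.try_run(50))   # 30: only the reserved budget is granted
print(guaranteed.try_run(10))   # 0: budget exhausted until the next period
guaranteed.replenish()
print(guaranteed.try_run(10))   # 10: granted after replenishment
```

A real implementation (such as the one the paper integrates into the Linux kernel) would enforce this accounting in the scheduler rather than cooperatively.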

    Survey of dynamic scheduling in manufacturing systems


    Conceptualisation of intellectual capital in analysts’ narratives: a performative view

    Purpose: This study tests the performativity of Intellectual Capital (IC) from the perspective of sell-side analysts, a type of actor who consumes and creates IC information and in whose practice IC information plays a significant role. Design/methodology/approach: The empirical component of the study comprises a narrative analysis of the text of a large corpus of sell-side analysts' initiation coverage reports. We adopt Mouritsen's (2006) performative and ostensive conceptualisations of IC as our theoretical framework. Findings: We find that the identities and properties of IC elements are variable, dynamic and transformative. The relevance of IC elements in the eyes of analysts is conditional on the context, temporally contingent and bestowed indirectly. IC elements are attributed to firm value both directly, in a linear manner, and indirectly, via various non-linear interrelationships established with other IC elements, tangible capital and financial capital. Research limitations/implications: This study challenges the conventional IC research paradigm and contributes towards a performativity-inspired conceptualisation of IC and a resultant situated model of IC in place of a predictive model. Originality/value: This is the first study to apply a performative lens to IC identities, roles and relationships from the perspective of a field of practice external to the organisation where the IC is hosted. Examining IC from analysts' perspective is important because it not only provides an alternative perspective on IC but also enables an understanding of analysts' field of practice.

    Quality assessment technique for ubiquitous software and middleware

    The new paradigm of computing and information systems is the ubiquitous computing system. The technology-oriented issues of ubiquitous computing systems have led researchers to pay much more attention to feasibility studies of the technologies than to building quality assurance indices or guidelines. In this context, measuring quality is the key to developing high-quality ubiquitous computing products. For this reason, various quality models have been defined, adopted and enhanced over the years; for example, the recognised standard quality model ISO/IEC 9126 is the result of a consensus on a software quality model with three levels: characteristics, sub-characteristics, and metrics. However, it is very unlikely that this scheme is directly applicable to ubiquitous computing environments, which differ considerably from conventional software; hence much attention is being given to reformulating existing methods and, especially, to elaborating new assessment techniques for ubiquitous computing environments. This paper selects appropriate quality characteristics for the ubiquitous computing environment, which can be used as the quality target for both ubiquitous computing product evaluation processes and development processes. Each of the quality characteristics has been expanded with evaluation questions and metrics, and in some cases with measures. In addition, the quality model has been applied in an industrial ubiquitous computing setting. The results show that while the approach is sound, some parts need further development.
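The three-level scheme the abstract mentions (characteristics, sub-characteristics, metrics) can be sketched as a weighted scoring structure. The characteristic names below follow ISO/IEC 9126; the weights and scores are invented purely for illustration:

```python
# Three-level quality model: characteristic -> sub-characteristics -> metric scores.
# Weights and scores are invented for illustration.
quality_model = {
    "reliability": {
        "maturity":       {"weight": 0.6, "score": 0.8},  # e.g. defect density
        "recoverability": {"weight": 0.4, "score": 0.5},  # e.g. mean time to restore
    },
    "usability": {
        "learnability":   {"weight": 0.5, "score": 0.9},
        "operability":    {"weight": 0.5, "score": 0.7},
    },
}

def characteristic_score(subchars):
    """Aggregate sub-characteristic scores into one characteristic score."""
    return sum(m["weight"] * m["score"] for m in subchars.values())

for name, subchars in quality_model.items():
    print(f"{name}: {characteristic_score(subchars):.2f}")
```

A ubiquitous-computing adaptation, as the paper argues, would replace or reweight these characteristics with ones specific to that environment.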

    Major Indian ICT firms and their approaches towards achieving quality

    Of the three basic theories of innovation (the entrepreneur theory, the technology-economics theory and the strategic theory), the third seems highly appropriate for analysing the recent growth of the information and communication technology (ICT) industry in many developing countries, including India. The central measure for achieving quality in the major Indian ICT firms is widely agreed to have been the adoption of the Six Sigma methodology, alongside approaches such as Total Quality Management (TQM), Supply Chain Management (SCM) and Customer Relationship Management (CRM). The main objective of the firms studied has been to increase the pace of their innovation activities, irrespective of their different areas of product specialisation. Their success also depends largely on overall improvements in infrastructure, besides active market interaction. To enable both, the chapter briefly highlights the establishment of interaction and learning sites (ILSs) in every regional State in India. The chapter concludes with the elements observed to be missing among the firms under consideration, thereby delineating the scope for their further improvement.
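Six Sigma quality is conventionally quantified by converting defects per million opportunities (DPMO) into a sigma level, including the customary 1.5-sigma long-term shift. A minimal sketch of that conversion, using only the Python standard library:

```python
from statistics import NormalDist

def sigma_level(dpmo):
    """Convert defects-per-million-opportunities to a sigma level,
    applying the conventional 1.5-sigma long-term shift."""
    yield_rate = 1 - dpmo / 1_000_000
    return NormalDist().inv_cdf(yield_rate) + 1.5

# 3.4 DPMO is the textbook "six sigma" quality level.
print(round(sigma_level(3.4), 2))      # 6.0
print(round(sigma_level(66_807), 2))   # 3.0, a typical "three sigma" process
```

The specific defect rates and processes of the Indian ICT firms discussed in the chapter are not given in the abstract; the figures above are standard reference points only.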

    Execution: the Critical “What’s Next?” in Strategic Human Resource Management

    The Human Resource Planning Society’s 1999 State of the Art/Practice (SOTA/P) study was conducted by a virtual team of researchers who interviewed and surveyed 232 human resource and line executives, consultants, and academics worldwide. Looking three to five years ahead, the study probed four basic topics: (1) major emerging trends in external environments, (2) essential organizational capabilities, (3) critical people issues, and (4) the evolving role of the human resource function. This article briefly reports some of the study’s major findings, along with an implied action agenda: the “gotta do’s” for the leading edge. Cutting through the complexity, the general tone is one of urgency emanating from the intersection of several underlying themes: the increasing fierceness of competition, the rapid and unrelenting pace of change, the imperatives of marketplace and thus organizational agility, and the corresponding need to buck prevailing trends by attracting and, especially, retaining and capturing the commitment of world-class talent. While it all adds up to a golden opportunity for human resource functions, there is a clear need to get on with it, to get better, faster, and smarter, or run the risk of being left in the proverbial dust. Execute or be executed.

    Malware in the Future? Forecasting of Analyst Detection of Cyber Events

    There have been extensive efforts in government, academia, and industry to anticipate, forecast, and mitigate cyber attacks. A common approach is time-series forecasting of cyber attacks based on data from network telescopes, honeypots, and automated intrusion detection/prevention systems. This research has uncovered key insights such as systematicity in cyber attacks. Here, we propose an alternative perspective on this problem by forecasting attacks that are analyst-detected and -verified occurrences of malware. We call these instances of malware cyber event data. Specifically, our dataset comprises analyst-detected incidents from a large operational Computer Security Service Provider (CSSP) for the U.S. Department of Defense, which rarely relies only on automated systems. The dataset consists of weekly counts of cyber events over approximately seven years. Since all cyber events were validated by analysts, our dataset is unlikely to contain the false positives that are often endemic in other data sources. Further, this higher-quality data could be used for a number of purposes, such as resource allocation, estimation of security resources, and the development of effective risk-management strategies. We used a Bayesian state-space model for forecasting and found that events one week ahead could be predicted. To quantify bursts, we used a Markov model. Our findings of systematicity in analyst-detected cyber attacks are consistent with previous work using other sources. The advance information provided by a forecast may help with threat awareness by providing a probable value and range for future cyber events one week ahead. Other potential applications of cyber event forecasting include proactive allocation of resources and capabilities for cyber defense (e.g., analyst staffing and sensor configuration) in CSSPs. Enhanced threat awareness may improve cybersecurity. Comment: Revised version resubmitted to journal
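The paper's Bayesian state-space model is not specified in the abstract; as a minimal sketch of the general idea, a local-level (random-walk-plus-noise) model fitted with a Kalman filter yields a one-week-ahead forecast with an uncertainty range. The counts and noise variances below are invented for illustration:

```python
def local_level_forecast(series, q=4.0, r=25.0):
    """One-step-ahead forecast from a local-level state-space model.
    q: level (state) noise variance, r: observation noise variance;
    both are invented here, whereas a real model estimates them from data."""
    level, p = series[0], 1e6           # diffuse initial state
    for y in series:
        p += q                          # predict: level persists, variance grows
        k = p / (p + r)                 # Kalman gain
        level += k * (y - level)        # update with this week's count
        p *= (1 - k)                    # posterior variance
    p += q                              # variance of next week's level
    return level, (p + r) ** 0.5        # forecast mean and its standard deviation

weekly_counts = [12, 15, 11, 14, 18, 16, 13, 17, 19, 15]  # illustrative data
forecast, sd = local_level_forecast(weekly_counts)
print(f"next week: {forecast:.1f} events, 95% range about +/- {2 * sd:.1f}")
```

This is a simpler frequentist stand-in for the paper's Bayesian treatment; the point it illustrates is that a state-space model outputs both a probable value and a range, which is what the abstract highlights for threat awareness.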

    Federated Embedded Systems – a review of the literature in related fields

    This report is concerned with the vision of smart interconnected objects, a vision that has attracted much attention lately. The focus here is on embedded, interconnected, open, and heterogeneous control systems, formally referred to as Federated Embedded Systems (FES). To place FES in context, a review of related research directions is presented, covering concepts such as systems of systems, cyber-physical systems, ubiquitous computing, the internet of things, and multi-agent systems. Interestingly, the reviewed fields seem to overlap with each other in an increasing number of ways.

    What is Strategic Competence and Does it Matter? Exposition of the Concept and a Research Agenda

    Drawing on a range of theoretical and empirical insights from strategic management and the cognitive and organizational sciences, we argue that strategic competence constitutes the ability of organizations, and of the individuals who operate within them, to work within their cognitive limitations in such a way that they maintain an appropriate level of responsiveness to the contingencies confronting them. Using the language of the resource-based view of the firm, we argue that this meta-level competence represents a confluence of individual and organizational characteristics, suitably configured to enable the detection of the weak signals indicative of the need for change and to act accordingly, thereby minimising the dangers of cognitive bias and cognitive inertia. In an era of unprecedented informational burdens and instability, we argue that this competence is central to the longer-term survival and well-being of the organization. We conclude with a consideration of the major scientific challenges that lie ahead if the ideas contained within this paper are to be validated.

    Privacy, security, and trust issues in smart environments

    Recent advances in networking, handheld computing and sensor technologies have driven research towards the realisation of Mark Weiser's dream of calm and ubiquitous computing (variously called pervasive computing, ambient computing, active spaces, the disappearing computer or context-aware computing). In turn, this has led to the emergence of smart environments as one significant facet of research in this domain. A smart environment, or space, is a region of the real world that is extensively equipped with sensors, actuators and computing components [1]. In effect the smart space becomes part of a larger information system: all actions within the space potentially affect the underlying computer applications, which may themselves affect the space through the actuators. Such smart environments have tremendous potential within many application areas to improve the utility of a space. Consider the potential offered by a smart environment that prolongs the time an elderly or infirm person can live an independent life, or one that supports vicarious learning.
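The sensor-to-application-to-actuator loop described above can be sketched in a few lines; the device names and the rules are invented for illustration:

```python
# Minimal sketch of a smart space: sensor events feed an application,
# which may act back on the space through actuators.
class SmartSpace:
    def __init__(self):
        self.actuator_log = []  # record of commands sent to actuators

    def actuate(self, command):
        self.actuator_log.append(command)

    def on_sensor_event(self, sensor, value):
        """Application logic: sensor readings may trigger actions on the space."""
        if sensor == "motion" and value:
            self.actuate("lights_on")
        elif sensor == "temperature" and value > 28:
            self.actuate("fan_on")

space = SmartSpace()
space.on_sensor_event("motion", True)
space.on_sensor_event("temperature", 30)
print(space.actuator_log)  # ['lights_on', 'fan_on']
```

The privacy, security and trust issues the title refers to arise precisely because every such sensor event and actuation is mediated by the underlying information system.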