337 research outputs found

    Efficient Auction-Based Grid Reservation using Dynamic Programming

    Abstract — Auction mechanisms have been proposed as a means to efficiently and fairly schedule jobs in high-performance computing environments. The Generalized Vickrey Auction has long been known to produce efficient allocations while exposing users to truth-revealing incentives, but the algorithms used to compute its payments can be computationally intractable. In this paper we present a novel implementation of the Generalized Vickrey Auction that uses dynamic programming to schedule jobs and compute payments in pseudo-polynomial time. Additionally, we have built a version of the PBS scheduler that uses this algorithm to schedule jobs, and we present the results of our tests using this scheduler.
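    The abstract does not reproduce the paper's actual DP formulation or its PBS integration, but the general idea can be sketched. Assuming, for illustration, that jobs compete for a fixed number of discrete time slots, a knapsack-style dynamic program maximizes total bid value in O(n × slots) time (pseudo-polynomial in the slot count), and the Generalized Vickrey (VCG) payment for a winner is the externality it imposes on the other bidders. All names below are hypothetical:

    ```python
    def max_welfare(jobs, capacity):
        """0/1-knapsack DP over discrete time slots.  jobs is a list of
        (duration, bid) pairs; capacity is the number of slots available.
        Returns (total value, frozenset of chosen job indices).
        Runs in O(len(jobs) * capacity), i.e. pseudo-polynomial time."""
        best = [(0, frozenset())] * (capacity + 1)
        for i, (duration, bid) in enumerate(jobs):
            for c in range(capacity, duration - 1, -1):  # descending: 0/1, not unbounded
                candidate = (best[c - duration][0] + bid,
                             best[c - duration][1] | {i})
                if candidate[0] > best[c][0]:
                    best[c] = candidate
        return best[capacity]

    def vcg_payment(jobs, i, capacity):
        """Generalized Vickrey payment for bidder i: the welfare the other
        bidders would obtain without i, minus the welfare they obtain when
        i participates.  Losing bidders pay nothing."""
        value, chosen = max_welfare(jobs, capacity)
        if i not in chosen:
            return 0
        others = [job for k, job in enumerate(jobs) if k != i]
        without_i, _ = max_welfare(others, capacity)
        return without_i - (value - jobs[i][1])
    ```

    With three 3-slot jobs bidding 10, 9, and 8 for 6 slots, the first two win; the first pays 17 − (19 − 10) = 8, the value its presence denied the third bidder.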

    Discovering Piecewise Linear Models of Grid Workload

    Despite extensive research focused on enabling QoS for grid users through economic and intelligent resource provisioning, no consensus has emerged on the most promising strategies. On top of intrinsically challenging problems, the complexity and size of the data have so far drastically limited the number of comparative experiments. An alternative to experimenting on real, large, and complex data is to look for well-founded and parsimonious representations. This study is based on exhaustive information about the gLite-monitored jobs from the EGEE grid, representative of a significant fraction of e-science computing activity in Europe. Our main contributions are twofold. First, we found that workload models for this grid can consistently be discovered from the real data, and that limiting the range of models to piecewise linear time series models is sufficiently powerful. Second, we present a bootstrapping strategy for building more robust models from the limited samples at hand.
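    The paper's model class and bootstrapping procedure are not detailed in the abstract; as a minimal illustration of what "piecewise linear time series model" means here, the sketch below fits a single-breakpoint model by exhaustively trying each breakpoint and fitting two independent least-squares lines. The function names are hypothetical, not the authors' code:

    ```python
    import numpy as np

    def fit_segment(t, y):
        """Ordinary least-squares line fit; returns (sse, [slope, intercept])."""
        A = np.vstack([t, np.ones_like(t)]).T
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return float(np.sum((y - A @ coef) ** 2)), coef

    def fit_one_breakpoint(t, y, min_seg=2):
        """Exhaustively search for the breakpoint minimising the total SSE
        of two independent line fits.  Returns (sse, break_index,
        left_coef, right_coef).  O(n) candidate breaks, each an O(n) fit."""
        best = (float("inf"), None, None, None)
        for k in range(min_seg, len(t) - min_seg + 1):
            sse_l, cl = fit_segment(t[:k], y[:k])
            sse_r, cr = fit_segment(t[k:], y[k:])
            if sse_l + sse_r < best[0]:
                best = (sse_l + sse_r, k, cl, cr)
        return best
    ```

    Real workload models would allow multiple breakpoints (e.g. via dynamic programming over segmentations) and, as the paper proposes, bootstrap resampling to stabilise the fitted segments.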

    Attacking and securing Network Time Protocol

    Network Time Protocol (NTP) is used to synchronize time between computer systems communicating over unreliable, variable-latency, and untrusted network paths. Time is critical for many applications; in particular it is heavily utilized by cryptographic protocols. Despite its importance, the community still lacks visibility into the robustness of the NTP ecosystem itself, the integrity of the timing information transmitted by NTP, and the impact that any error in NTP might have upon the security of other protocols that rely on timing information. In this thesis, we seek to accomplish the following broad goals: 1. Demonstrate that the current design presents a security risk, by showing that network attackers can exploit NTP and then use it to attack other core Internet protocols that rely on time. 2. Improve NTP to make it more robust, and rigorously analyze the security of the improved protocol. 3. Establish formal and precise security requirements that should be satisfied by a network time-synchronization protocol, and prove that these are sufficient for the security of other protocols that rely on time. We take the following approach to achieve our goals incrementally. 1. We begin by (a) scrutinizing NTP's core protocol (RFC 5905) and (b) statically analyzing code of its reference implementation to identify vulnerabilities in protocol design, ambiguities in specifications, and flaws in reference implementations. We then leverage these observations to show several off- and on-path denial-of-service and time-shifting attacks on NTP clients. We then show cache-flushing and cache-sticking attacks on DNS(SEC) that leverage NTP. We quantify the attack surface using Internet measurements, and suggest simple countermeasures that can improve the security of NTP and DNS(SEC). 2. Next we move beyond identifying attacks and leverage ideas from the Universal Composability (UC) security framework to develop a cryptographic model for attacks on NTP's datagram protocol.
We use this model to prove the security of a new backwards-compatible protocol that correctly synchronizes time in the face of both off- and on-path network attackers. 3. Next, we propose general security notions for network time-synchronization protocols within the UC framework and formulate ideal functionalities that capture a number of prevalent forms of time measurement within existing systems. We show how they can be realized by real-world protocols (including but not limited to NTP), and how they can be used to assert the security of time-reliant applications, specifically cryptographic certificates with revocation and expiration times. Our security framework allows for a clear and modular treatment of the use of time in security-sensitive systems. Our work makes the core NTP protocol and its implementations more robust and secure, thus improving the security of applications and protocols that rely on time.
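    For context on why the datagram protocol is the object of analysis: an NTP client estimates its clock offset from four timestamps, per the standard RFC 5905 on-wire calculation sketched below. Because the computation assumes symmetric network delay, an on-path attacker who delays packets asymmetrically can shift the computed offset by up to half the round-trip delay, which is one reason the thesis treats the protocol cryptographically rather than as a pure measurement problem.

    ```python
    def ntp_offset_delay(t1, t2, t3, t4):
        """Standard NTP on-wire estimates (RFC 5905):
        t1 = client transmit, t2 = server receive,
        t3 = server transmit, t4 = client receive.
        Returns (clock offset, round-trip delay)."""
        offset = ((t2 - t1) + (t3 - t4)) / 2  # assumes symmetric path delay
        delay = (t4 - t1) - (t3 - t2)         # total RTT minus server hold time
        return offset, delay
    ```

    For example, if the server's clock runs 100 s ahead and the path is symmetric (2 s each way, 1 s server hold time), the client correctly recovers an offset of 100 s and a delay of 4 s.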

    Technologies and Applications for Big Data Value

    This open access book explores cutting-edge solutions and best practices for big data and data-driven AI applications for the data-driven economy. It provides the reader with a basis for understanding how technical issues can be overcome to offer real-world solutions to major industrial areas. The book starts with an introductory chapter that provides an overview of the book by positioning the following chapters in terms of their contributions to technology frameworks which are key elements of the Big Data Value Public-Private Partnership and the upcoming Partnership on AI, Data and Robotics. The remainder of the book is then arranged in two parts. The first part “Technologies and Methods” contains horizontal contributions of technologies and methods that enable data value chains to be applied in any sector. The second part “Processes and Applications” details experience reports and lessons from using big data and data-driven approaches in processes and applications. Its chapters are co-authored with industry experts and cover domains including health, law, finance, retail, manufacturing, mobility, and smart cities. Contributions emanate from the Big Data Value Public-Private Partnership and the Big Data Value Association, which have acted as the European data community's nucleus to bring together businesses with leading researchers to harness the value of data to benefit society, business, science, and industry. The book is of interest to two primary audiences: first, undergraduate and postgraduate students and researchers in various fields, including big data, data science, data engineering, and machine learning and AI; second, practitioners and industry experts engaged in data-driven systems, software design and deployment projects who are interested in employing these advanced methods to address real-world problems.

    Early Detection of Online Auction Opportunistic Sellers Through the Use of Negative-Positive Feedback

    Apparently fraud is a growth industry. The monetary losses from Internet fraud have increased every year since first officially reported by the Internet Crime Complaint Center (IC3) in 2000. Prior research studies and third-party reports of fraud show rates substantially higher than eBay’s reported negative feedback rate of less than 1%. The conclusion is that most buyers are withholding reports of negative feedback. In a forensic case study of a single opportunistic eBay seller, researchers Nikitov and Stone found that buyers sometimes embedded negative comments in positive feedback as a means of avoiding retaliation from sellers and damage to their reputation. This category of positive feedback was described as “negative-positive” feedback. An example of negative-positive type feedback is “Good product, but slow shipping.” This research study investigated the concept of using negative-positive type feedback as a signature to identify potential opportunistic sellers in an online auction population. As experienced by prior researchers using data extracted from the eBay web site, the magnitude of data to be analyzed in the proposed study was massive. The nature of the analysis required (judgment of seller behavior and contextual analysis of buyer feedback comments) could not be automated. The traditional method of using multiple dedicated human raters would have taken months of labor with a correspondingly high labor cost. Instead, crowdsourcing in the form of Amazon Mechanical Turk was used to reduce the analysis time to a few days at a fraction of the traditional labor cost. The research found that the presence of subtle buyer behavior in the form of negative-positive type feedback comments is an inter-buyer signal indicating that a seller was behaving fraudulently. Sellers with negative-positive type feedback were 1.82 times more likely to be fraudulent.
A correlation exists between an increasing number of negative-positive type feedback comments and an increasing probability that a seller was acting fraudulently: for every one-unit increase in the number of negative-positive type feedback comments, a seller was 4% more likely to be fraudulent.
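    The abstract reports its findings in odds-ratio form (1.82×, and +4% per comment) without giving the model specification. A common way to read such numbers is through a logistic-regression lens, where a coefficient β multiplies the odds of the outcome by e^β per unit increase of the predictor. The sketch below is only an illustration of that reading, not the study's actual model:

    ```python
    import math

    def odds_multiplier(beta, delta=1.0):
        """In a logistic model, the odds of the outcome are multiplied by
        exp(beta * delta) when the predictor increases by delta units."""
        return math.exp(beta * delta)

    # A 4% increase in odds per comment corresponds to beta = ln(1.04);
    # under that reading, such increases compound multiplicatively, so
    # roughly fifteen comments compound to about a 1.8x increase in odds.
    beta_per_comment = math.log(1.04)
    ```

    Note this is a simplification: the 1.82× figure in the abstract compares sellers with any negative-positive feedback to sellers with none, which is a separate finding from the per-comment effect.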

    Positive health: The passport approach to improving continuity of care for low income South African chronic disease sufferers

    Research Problem: The South African health system faces numerous challenges associated with its status as a middle-income developing nation. Wasteful expenditure and poor clinical outcomes arise from inefficient inter-organizational communication of patient information and the lack of a centralized health database. Research question: How does the experience of chronic disease patients with their health information inform the development of future health records in low income population groups? Proposition: Exploration of patient and health care workers' experiences of medical records can inform their future development to enhance continuity of care. Objectives, methodology, procedures and outcome: Identification of an appropriate format, technological basis and functional design of a prototype medical record system by means of a phenomenological study conducted through in-depth interviews of patients and doctors in order to improve clinical care. Left and right hermeneutics were used to analyse the data and develop themes. Findings: Health records play a critical role in the clinic's workflow processes and document the patients' management and clinical progress. They are an important intermediary in the relationship between the patient and the facility. Inefficiencies in the paper-based system lead to ineffective consultations, loss of continuity of care and discord between practitioners and patients. Improvement of the records' format is required to provide ubiquitous access to health records and improve patient health literacy.

    CHALLENGES AND OPPORTUNITIES OF ADOPTING MANAGEMENT INFORMATION SYSTEMS (MIS) FOR PASSPORT PROCESSING: COMPARATIVE STUDY BETWEEN LESOTHO AND SOUTH AFRICA

    Thesis (M. Tech. (Business Administration)) - Central University of Technology, Free State, 2014.
    Fast and secure public service delivery is not only a necessity but a compulsory endeavour. However, it is close to impossible to achieve such objectives without the use of Information Technology (IT). It is correspondingly important to find proper sustainability frameworks for technology. Organisations do not only need technology for efficient public service; the constant upgrading of systems and cautious migration to the newest IT developments are equally indispensable in today’s dynamic technological world. Conversely, countries in Africa are always lagging behind in technological progress. Such deficiencies have been identified in the passport processing of Lesotho and South Africa, where, to unequal extents, problems related to systems of passport production have contributed to delays and have become fertile ground for corrupt practices. The study seeks to identify the main impediments in the adoption of Management Information Systems (MIS) for passport processing. Furthermore, the study explores the impact MIS might have in attempting to combat long queues and to avoid long waiting periods, from application to issuance of passports to citizens. The reasonable time frame between passport application and issuance, and specific passport management systems, have been extensively discussed along with various strategies that have been adopted by some of the world’s first movers in modern passport management technologies. In all cases and stages of this research, Lesotho and South Africa are compared. The research approach of the study was descriptive and explorative in nature. As a quantitative design, a structured questionnaire was used to solicit responses in Lesotho and South Africa. It was established that both Lesotho and South Africa have somewhat similar problems, although, to a greater extent, Lesotho needs much more urgent attention.
Although the processes of South Africa need to be improved, the Republic issues passports much faster and more efficiently than Lesotho. Economic issues are also revealed by the study as unavoidable factors that always affect technological developments in Africa. The study reveals that the latest MIS for passport processing has facilitated modern, automated border-control systems and the resultant e-passports that incorporate more biometric information of citizens, thanks to modern RFID technologies. One can anticipate that this study will provide simple, affordable and secure IT solutions for passport processing.
    Key words: Information Technology (IT); Management Information Systems (MIS); E-Government; E-Passport; Biometrics; RFID.