
    50 years of isolation

    The traditional means for isolating applications from each other is the operating system's "process" abstraction. However, as applications now consist of multiple fine-grained components, the traditional process model is proving insufficient to ensure this isolation. Statistics indicate that a high percentage of software failures occur through the propagation of component failures. These observations are further bolstered by the efforts of modern Internet browser developers, for example, to adopt multi-process architectures in order to increase robustness. A fresh look at the available options for isolating program components is therefore necessary, and this paper provides an overview of previous and current research in the area.
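
    To make the multi-process idea concrete, here is a minimal sketch (a hypothetical illustration, not code from the paper) in which each component runs in its own OS process, so a crash in one component cannot corrupt the memory or state of the others:

        import multiprocessing as mp

        def component(name, conn):
            # A faulty component raises here; the failure stays inside its own process.
            if name == "renderer":
                raise RuntimeError("simulated component failure")
            conn.send(f"{name}: ok")
            conn.close()

        if __name__ == "__main__":
            for name in ("network", "renderer", "ui"):
                parent, child = mp.Pipe()
                p = mp.Process(target=component, args=(name, child))
                p.start()
                p.join()
                # A non-zero exit code marks an isolated failure, not a whole-app crash.
                print(name, "->", "failed" if p.exitcode else parent.recv())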

    Evaluating the Impacts of Information and Communication Technology (ICT) on Trade in Fruit and Vegetables within the APEC Countries

    The global food marketing network is being constantly reshaped, providing opportunities and challenges to the use of information and communication technology (ICT) to develop international trade in food products. ICT is likely to be especially important for food products such as fresh fruit and vegetables that are differentiated and sensitive to timeliness in supply, possess varied quality dimensions, and involve considerable supply accumulation and assortment. Digital ICT (Internet and mobile phones), in particular, is expected to facilitate international trade and encourage efficiency in the fruit and vegetables marketing system in two main ways. First, it reduces communication and search costs through cheaper and more effective media. Second, it improves market information and corrects information externalities along the supply chain, by promoting greater price transparency and enabling consumer preferences and tastes to be more precisely met. We employed a gravity model of international trade to test the hypothesis that ICT positively affects bilateral international trade in fruit and vegetables between member Asia-Pacific Economic Cooperation (APEC) economies in the period from 1997 to 2006. Explanatory variables include the usage of the Internet, mobile telephones and fixed telephone lines, and a broad range of factors that might determine the value of bilateral trade, such as income per capita, population, distance between trading partners and common language. A Poisson pseudo-maximum likelihood model was estimated in order to handle zero trade observations and reduce biases caused by heteroskedasticity. Empirical results were not quite as expected, showing a relatively minor impact of digital ICT. They suggest that using digital ICT has significant positive effects on trade in fruit and vegetables between APEC countries only for the Internet in exporting countries. A stronger positive impact was discerned for the traditional form of ICT, fixed telephone lines in exporting and importing countries. Nevertheless, fostering the development of digital ICT infrastructure and its diffusion should make exporters in APEC countries more competitive in the fruit and vegetables supply chain through the Internet effect, and boost their trade values in these products.
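
    For readers unfamiliar with the estimator, the following is a minimal sketch of a Poisson pseudo-maximum-likelihood (PPML) gravity regression, with invented variable names and synthetic numbers rather than the paper's data; it keeps zero trade flows and uses heteroskedasticity-robust standard errors:

        import pandas as pd
        import statsmodels.api as sm

        # Hypothetical bilateral panel: one row per exporter-importer pair.
        df = pd.DataFrame({
            "trade_value":  [120.0, 0.0, 35.5, 410.2, 0.0, 88.1, 240.3, 15.0],  # zeros kept
            "ln_gdp_pc":    [9.1, 8.7, 10.2, 9.8, 8.2, 9.5, 10.0, 8.9],
            "ln_distance":  [7.2, 8.9, 6.5, 7.8, 9.1, 7.0, 6.8, 8.4],
            "internet_exp": [25.0, 3.0, 60.0, 41.0, 5.0, 33.0, 55.0, 12.0],  # users per 100
        })

        X = sm.add_constant(df[["ln_gdp_pc", "ln_distance", "internet_exp"]])
        # PPML = Poisson GLM on trade levels: it handles zero flows and avoids
        # the heteroskedasticity bias of log-linearised OLS gravity equations.
        ppml = sm.GLM(df["trade_value"], X, family=sm.families.Poisson())
        print(ppml.fit(cov_type="HC1").summary())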

    Search Neutrality as an Antitrust Principle

    Given the Internet's designation as "the great equalizer," it is unsurprising that nondiscrimination has emerged as a central aspiration of web governance. But, of course, bias, discrimination, and neutrality are among the slipperiest of regulatory principles. One person's bias is another person's prioritization. Fresh on the heels of its initial success in advocating a net neutrality principle, Google is in the uncomfortable position of trying to stave off a corollary principle of "search neutrality." Search neutrality has not yet coalesced into a generally understood principle, but at its heart is some idea that Internet search engines ought not to prefer their own content on adjacent websites in search results, but should instead employ neutral search algorithms that determine search result rankings based on some objective metric of relevance. Count me a skeptic. Whatever the merits of the net neutrality argument, a general principle of search neutrality would pose a serious threat to the organic growth of Internet search. Although there may be a limited case for antitrust liability on a fact-specific basis for acts of naked exclusion against rival websites, the case for a more general neutrality principle is weak. Particularly as Internet search transitions from the "ten blue links" model of just a few years ago to a model where search engines increasingly provide end information and interface with website information, a neutrality principle becomes incoherent.

    InfoFilter: Supporting Quality of Service for Fresh Information Delivery

    With the explosive growth of the Internet and World Wide Web comes a dramatic increase in the number of users that compete for the shared resources of distributed system environments. Most implementations of application servers and distributed search software do not distinguish among requests to different web pages. This has the implication that the behavior of application servers is quite unpredictable. Applications that require timely delivery of fresh information consequently suffer the most in such competitive environments. This paper presents a model of quality of service (QoS) and the design of a QoS-enabled information delivery system that implements such a QoS model. The goal of this development is two-fold. On one hand, we want to enable users or applications to specify the desired quality-of-service requirements for their requests so that application-aware QoS adaptation is supported throughout Web query and search processing. On the other hand, we want to enable an application server to customize how it should respond to external requests by setting priorities among query requests and allocating server resources using adaptive QoS control mechanisms. We introduce the Infopipe approach as the systems support architecture and underlying technology for building a QoS-enabled distributed system for fresh information delivery.
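
    As a minimal sketch of the priority idea (hypothetical API, not the paper's Infopipe implementation), a server can order pending query requests by a client-specified QoS class so that freshness-sensitive requests are served first:

        import heapq
        import itertools

        class QoSScheduler:
            """Serve requests in order of client-specified QoS priority."""
            def __init__(self):
                self._queue = []
                self._counter = itertools.count()  # tie-breaker keeps FIFO order per class

            def submit(self, priority, request):
                # Lower number = more urgent (e.g. fresh-information queries).
                heapq.heappush(self._queue, (priority, next(self._counter), request))

            def next_request(self):
                return heapq.heappop(self._queue)[2] if self._queue else None

        scheduler = QoSScheduler()
        scheduler.submit(2, "GET /archive/report-2001")
        scheduler.submit(0, "GET /live/stock-ticker")   # freshness-sensitive, served first
        scheduler.submit(1, "GET /news/front-page")
        while (req := scheduler.next_request()) is not None:
            print("serving:", req)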

    Few-shot Multi-domain Knowledge Rearming for Context-aware Defence against Advanced Persistent Threats

    Advanced persistent threats (APTs) have novel features such as multi-stage penetration, highly tailored intentions, and evasive tactics. APT defense requires fusing multi-dimensional cyber threat intelligence data to identify attack intentions and conducting efficient knowledge discovery via data-driven machine learning to recognize entity relationships. However, data-driven machine learning lacks generalization ability on fresh or unknown samples, reducing the accuracy and practicality of the defense model. Besides, the private deployment of these APT defense models in heterogeneous environments and on various network devices requires significant investment in context awareness (such as known attack entities, continuous network states, and current security strategies). In this paper, we propose a few-shot multi-domain knowledge rearming (FMKR) scheme for context-aware defense against APTs. By completing multiple small tasks generated from different network domains with meta-learning, FMKR first trains a model with good discrimination and generalization ability for fresh and unknown APT attacks. In each FMKR task, both threat intelligence and local entities are fused into the support/query sets in meta-learning to identify possible attack stages. Second, to rearm current security strategies, a fine-tuning-based deployment mechanism is proposed to transfer the learned knowledge into a student model while minimizing the defense cost. Compared to multiple model-replacement strategies, FMKR provides a faster response to attack behaviors while consuming less scheduling cost. Based on feedback from multiple real users of the Industrial Internet of Things (IIoT) over two months, we demonstrate that the proposed scheme can improve the defense satisfaction rate. (Comment: accepted by IEEE SmartNet.)
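
    The support/query structure described above follows the usual episodic meta-learning recipe; a minimal first-order sketch (hypothetical tasks and a linear classifier, not the FMKR code) looks like this:

        import numpy as np

        rng = np.random.default_rng(0)

        def loss_grad(w, X, y):
            # Gradient of the mean logistic loss for a linear attack-stage classifier.
            p = 1.0 / (1.0 + np.exp(-X @ w))
            return X.T @ (p - y) / len(y)

        def make_task():
            # Stand-in for a small task drawn from one network domain: few labelled flows.
            w_true = rng.normal(size=4)
            X = rng.normal(size=(20, 4))
            y = (X @ w_true > 0).astype(float)
            return (X[:10], y[:10]), (X[10:], y[10:])  # (support set, query set)

        w_meta = np.zeros(4)
        inner_lr, outer_lr = 0.5, 0.1
        for episode in range(200):
            (Xs, ys), (Xq, yq) = make_task()
            w_task = w_meta - inner_lr * loss_grad(w_meta, Xs, ys)  # adapt on support
            w_meta -= outer_lr * loss_grad(w_task, Xq, yq)          # meta-update on query
        print("meta-initialisation:", np.round(w_meta, 2))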

    Hybrid Ventilation System and Soft-Sensors for Maintaining Indoor Air Quality and Thermal Comfort in Buildings

    Maintaining both indoor air quality (IAQ) and thermal comfort in buildings while optimizing energy consumption is a challenging problem. This investigation presents a novel design for a hybrid ventilation system, enabled by predictive control and soft-sensors, that achieves both IAQ and thermal comfort by combining predictive control with demand-controlled ventilation (DCV). First, we show that maintaining IAQ, thermal comfort, and optimal energy use is a multi-objective optimization problem with competing objectives, and that a predictive control approach is required to control the system intelligently. This leads to many implementation challenges, which are addressed by designing a hybrid ventilation scheme supported by predictive control and soft-sensors. The main idea of the hybrid ventilation system is to achieve thermal comfort by varying the ON/OFF times of the air conditioners to keep the temperature within user-defined bands using predictive control, while IAQ is maintained using Healthbox 3.0, a DCV device. Furthermore, this study also designs soft-sensors by combining Internet of Things (IoT)-based sensors with deep-learning tools. The hardware realization of the control and IoT prototype is also discussed. The proposed hybrid ventilation system and soft-sensors are demonstrated in a real research laboratory, the Center for Research in Automatic Control Engineering (C-RACE) at Kalasalingam University, India. Our results show the perceived benefits of hybrid ventilation, predictive control, and soft-sensors.
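
    The band-based ON/OFF idea can be illustrated with a much simpler hysteresis stand-in for the predictive controller (hypothetical set-points and thermal model, not the paper's design):

        def ac_command(temp_c, ac_on, low=23.0, high=26.0):
            """Return the next ON/OFF state for the air conditioner."""
            if temp_c > high:
                return True      # above the comfort band: switch cooling on
            if temp_c < low:
                return False     # below the band: switch cooling off
            return ac_on         # inside the band: keep the current state

        temp, ac_on = 27.5, False
        for step in range(12):
            ac_on = ac_command(temp, ac_on)
            temp += -0.6 if ac_on else 0.4   # crude room-temperature dynamics
            print(f"t={step:2d}  temp={temp:5.2f} C  AC={'ON' if ac_on else 'OFF'}")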

    Enablers and barriers in German online food retailing

    This article discusses enablers and barriers in online food retailing in Germany. The German food retail sector is one of the largest in Europe; however, its online or Internet provision for customers lags well behind that of the United Kingdom and France. Prior research has considered the demand (consumer) side of this dyad, but little has been done on the online food supply (retail) side. This article addresses that gap through exploratory empirical research with three retailers, three logistics service providers, and a marketing agency. There is good potential in this market, but the costs of fulfilment and service quality currently represent major barriers.

    Basis Token Consistency: A Practical Mechanism for Strong Web Cache Consistency

    With web caching and cache-related services like CDNs and edge services playing an increasingly significant role in the modern Internet, the weak consistency and coherence provisions of current web protocols are becoming increasingly significant and drawing the attention of the standards community [LCD01]. Toward this end, we present definitions of consistency and coherence for web-like environments, that is, distributed client-server information systems where the semantics of interactions with resources are more general than the read/write operations found in memory hierarchies and distributed file systems. We then present a brief review of proposed mechanisms that strengthen the consistency of caches in the web, focusing on their conceptual contributions and their weaknesses in real-world practice. These insights motivate a new mechanism, which we call "Basis Token Consistency" (BTC); when implemented at the server, this mechanism allows any client (independent of the presence and conformity of any intermediaries) to maintain a self-consistent view of the server's state. This is accomplished by annotating responses with additional per-resource application information which allows client caches to recognize the obsolescence of currently cached entities and to identify responses from other caches that are already stale in light of what has already been seen. The mechanism requires no deviation from the existing client-server communication model and does not require servers to maintain any additional per-client state. We discuss how our mechanism could be integrated into a fragment-assembling Content Management System (CMS), and present a simulation-driven performance comparison between the BTC algorithm and the Time-To-Live (TTL) heuristic. (Supported by National Science Foundation grants ANI-9986397 and ANI-0095988.)
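
    The core BTC idea can be sketched as follows (invented token format and API, not the protocol as specified in the paper): responses carry per-resource version tokens, and a client cache discards any entry whose token is older than one it has already seen for the same resource:

        class TokenCache:
            def __init__(self):
                self.entries = {}   # url -> (resource, version, body)
                self.latest = {}    # resource -> highest version token seen anywhere

            def observe(self, url, resource, version, body):
                # Track the newest version of each underlying resource we have seen,
                # even if it arrived via a different (possibly non-conforming) cache.
                self.latest[resource] = max(self.latest.get(resource, 0), version)
                self.entries[url] = (resource, version, body)

            def get(self, url):
                entry = self.entries.get(url)
                if entry is None:
                    return None
                resource, version, body = entry
                if version < self.latest[resource]:
                    del self.entries[url]   # obsolete in light of what we've seen
                    return None
                return body

        cache = TokenCache()
        cache.observe("/page", "article-42", version=3, body="v3 of the article")
        cache.observe("/feed", "article-42", version=5, body="feed quoting v5")
        print(cache.get("/page"))   # None: /page is stale relative to token 5
        print(cache.get("/feed"))   # served: it carries the newest token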