A Framework to assess the value of web services
Large organizations often begin to adopt new software technologies before establishing appropriate value frameworks. This approach may produce sub-optimal investment decisions and adoption rates, and introduce excessive risk. In this thesis, a value-based framework is developed for assessing the impact of Web Services technology investments on business systems development. The value factors included in the framework are data management, application development and deployment, system integration, and response time to market opportunities.
Web Service infrastructure for supply chain
Managing a supply chain is one of today's most complicated tasks: erratic changes in demand must be met as quickly as possible to stay competitive, while dealing with the multitude of business partners involved in the chain. It is imperative that any change in a corporation's product or service demand be immediately communicated to its suppliers and logistics service providers. This communication task has long since moved from telephones and fax machines to computerized systems. However, the computer technologies used so far to connect two businesses are proving too rigid in today's world of mergers, acquisitions, and new business deals, all of which bring the task of tying together the disparate computer systems of different organizations. To solve this problem, the enterprise software industry has developed new standards and a new design for constructing inter-organization applications, collectively known as Web Services technology.
This paper demonstrates how this technology works and how it can be applied to the problem of supply chain management. It describes the principles of Web Services and features such as UDDI. A demonstrative supply chain infrastructure is created using Web Services, showing the ease of creating new communication links with new supply chain partners without having to invest in costly computing resources. The paper shows that adopting Web Services together with the standard business language OAGIS makes supply chain communication as easy as plug and play.
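The plug-and-play claim above rests on partners agreeing on a shared message vocabulary such as OAGIS. As a rough illustration only (the element names below are simplified assumptions, not the actual OAGIS schema and not the paper's code), a purchase-order message that one partner's web service could emit and another could consume might be built like this:

```python
# Illustrative sketch: a simplified OAGIS-style purchase-order message
# built with Python's standard library. Element names are invented
# simplifications of the real schema.
import xml.etree.ElementTree as ET

def build_purchase_order(po_id, partner, items):
    """Serialize a minimal purchase-order document that a supplier's
    web service could consume. items is a list of (sku, quantity)."""
    root = ET.Element("ProcessPurchaseOrder")
    header = ET.SubElement(root, "Header")
    ET.SubElement(header, "DocumentId").text = po_id
    ET.SubElement(header, "Partner").text = partner
    lines = ET.SubElement(root, "Lines")
    for sku, qty in items:
        line = ET.SubElement(lines, "Line")
        ET.SubElement(line, "Item").text = sku
        ET.SubElement(line, "Quantity").text = str(qty)
    return ET.tostring(root, encoding="unicode")

msg = build_purchase_order("PO-1001", "AcmeSupplier", [("SKU-7", 25)])
```

Because the message is plain XML exchanged over standard web protocols, a new supply chain partner only needs to parse the agreed schema rather than integrate with a proprietary system, which is the ease-of-connection property the paper's demonstration targets.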
Content not available: Why The United Kingdom's Proposal For A "Package Of Platform Safety Measures" Will Harm Free Speech
This article critiques key proposals of the United Kingdom's "Online Harms" White Paper; in particular, the proposal for a new digital regulator and the imposition of a "duty of care" on platforms. While acknowledging that a duty of care backed by sanctions works well in some environments, we argue it is not appropriate for policing the White Paper's identified harms, as it could result in the blocking of legal but subjectively harmful content. Furthermore, the proposed regulator lacks the necessary independence and could be subjected to political interference. We conclude that the imposition of a duty of care would have an unacceptable chilling effect on free expression, resulting in a draconian regulatory environment for platforms, with users' digital rights adversely affected.
Evaluating the use and impact of Web 2.0 technologies in local government
Second-generation web-based technologies (Web 2.0) such as social media and networking sites are increasingly used by governments for activities ranging from open policy making to communication campaigns and customer service. However, this has brought additional challenges. By their very nature, Web 2.0 technologies are more interactive than traditional models of information provision or the creation of digital services. Such technologies open up a new set of benefits, costs, and risks to those government authorities who use these social and digital media to enhance their work. This study draws on the extant literature, together with an in-depth qualitative case enquiry, to propose an emergent framework for evaluating the intra-organisational use of Web 2.0 technologies and its impact on local government. The study findings identified four additional factors (benefits: intra-marketing and informal engagement; cost: workload constraints; risk: integration with other systems) as part of the evaluation criteria that have not previously been discussed in the existing literature on Web 2.0 use in local government. The study concludes that a combined analysis of the evaluation and impact assessment factors, rather than any one particular approach, would better assist decision makers when implementing Web 2.0 technologies for use by public administration employees.
The memory space: Exploring future uses of Web 2.0 and mobile internet through design interventions.
Cooperative Development of Web-based Mass Information Systems
This paper describes a research framework which led to the development of industry-specific reference models for mass information systems (mass IS). As one of the primary means of standardized communication, these models provide an invaluable opportunity to strengthen the ties between academic research and industry practice. By avoiding structural inconsistencies and mistakes, companies can improve the quality of their systems, usually at low cost compared to acquiring the required know-how from external commercial organizations. Researchers, on the other hand, are able to test hypotheses about the success factors of mass IS in different industries. For this purpose the extended World Wide Web Design Technique (eW3DT) was developed and, in cooperation with renowned Austrian and German companies, applied to a number of Web-based prototypes.
Comparison of Quality of Internet Pages on Human Papillomavirus Immunization in Italian and in English
Purpose: Information available on the Internet about immunizations may influence parents' perceptions of human papillomavirus (HPV) immunization and their attitudes toward vaccinating their daughters. We hypothesized that the quality of information on HPV available on the Internet may vary with language and with parents' level of knowledge. To this end we compared the quality of a sample of Web pages in Italian with a sample of Web pages in English. Methods: Five reviewers assessed the quality of Web pages retrieved with popular search engines using criteria adapted from the Good Information Practice Essential Criteria for Vaccine Safety Web Sites recommended by the World Health Organization. Quality of Web pages was assessed in the domains of accessibility, credibility, content, and design. Scores in these domains were compared through nonparametric statistical tests. Results: We retrieved and reviewed 74 Web sites in Italian and 117 in English. The largest share of retrieved Web pages (33.5%) came from private agencies. Median scores were higher for Web pages in English than for those in Italian in the domains of accessibility (p < .01), credibility (p < .01), and content (p < .01). The highest credibility and content scores were those of Web pages from governmental agencies or universities. Accessibility scores were positively associated with content scores (p < .01) and with credibility scores (p < .01). A total of 16.2% of Web pages in Italian opposed HPV immunization, compared with 6.0% of those in English (p < .05). Conclusions: The quality of information and the number of Web pages opposing HPV immunization may vary with the Web site language. High-quality Web pages on HPV, especially from public health agencies and universities, should be easily accessible and retrievable with common Web search engines. (C) 2010 Society for Adolescent Medicine. All rights reserved.
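The study's core comparison, median domain scores between the two language samples, can be sketched in a few lines. The scores below are invented illustrative data, not the study's results, and the real analysis used nonparametric significance tests rather than a bare median comparison:

```python
# Illustrative sketch only: comparing per-domain median quality scores
# between two groups of reviewed web pages. The example scores are
# invented, not taken from the study.
from statistics import median

def median_by_domain(pages):
    """pages: list of dicts mapping domain name -> score for one page."""
    domains = pages[0].keys()
    return {d: median(p[d] for p in pages) for d in domains}

italian = [{"accessibility": 3, "credibility": 2, "content": 3},
           {"accessibility": 2, "credibility": 3, "content": 2}]
english = [{"accessibility": 4, "credibility": 4, "content": 3},
           {"accessibility": 3, "credibility": 3, "content": 4}]

it_scores = median_by_domain(italian)
en_scores = median_by_domain(english)
```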
Quality of English-language web sites on obsessive-compulsive disorder
Background and context:
In mental health, the Internet is increasingly a source of information for people suffering from psychiatric disorders and for their relatives.
Obsessive-compulsive disorder, moreover, is of growing importance and attracts increasing interest because of its frequency and the burden it represents for patients and for society.
Patients suffering from obsessive-compulsive disorder, whether diagnosed or not, as well as their relatives, may be led to look for good-quality information on the subject online.
This study aims to assess the quality of Internet information on English-language web sites dealing with obsessive-compulsive disorder, and to compare the results of queries using a general search engine (Google) with those obtained using a specialized search engine (Omni Medical Search).
Keywords related to obsessive-compulsive disorder were entered into Google and Omni Medical Search. The selected sites were evaluated for accountability, interactivity, readability, and content quality.
The HONcode label and the brief version of the DISCERN scale were used as possible indicators of content quality.
Of the 235 addresses retrieved, 53 selected sites were analyzed.
Results:
The content quality of the sites examined is relatively good.
Using a specialized search engine offers no advantage over the general engine used by the vast majority of Internet users.
A brief DISCERN score > 16 is associated with better content quality.
Conclusion: this study shows that the content of web sites on obsessive-compulsive disorder is acceptable. Using a specialized search engine offers no advantage over Google.
Practical implications: the Internet contains high-quality sites on obsessive-compulsive disorder. Accessing these sites does not require a specialized search engine. However, a discussion between patient and clinician about the information available online remains indispensable.
Perspectives: despite the limitations of our study, the information available on the web about obsessive-compulsive disorder can be considered acceptable. The content and presentation of this information could be improved. Internet users looking for quality information could be guided by two indicators: the HONcode label and the brief version of the DISCERN scale.
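The two indicators named above, the HONcode label and a brief DISCERN score above 16, amount to a simple screening rule. A minimal sketch, assuming the two indicators are treated as alternative signals (this combination is our illustration, not the authors' procedure):

```python
# Hypothetical screening heuristic based on the study's two indicators.
# The threshold of 16 follows the study; the rule combining the two
# signals is our own illustrative assumption.
def likely_good_quality(brief_discern, has_hon_label):
    """Flag a site as likely higher quality when it carries the HONcode
    label or its brief DISCERN score exceeds 16."""
    return has_hon_label or brief_discern > 16
```

Such a rule only screens; as the study notes, it does not replace a discussion between patient and clinician about what was found online.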
Discovering Network Control Vulnerabilities and Policies in Evolving Networks
The range and number of new applications and services are growing at an unprecedented rate. Computer networks need to provide connectivity for these services and meet their constantly changing demands. This requires not only support for new network protocols and security requirements, but often architectural redesigns for long-term improvements to efficiency, speed, throughput, cost, and security. Networks now face a drastic increase in size and are required to carry a constantly growing amount of heterogeneous traffic. Unfortunately, such dynamism greatly complicates the security not only of the end nodes in the network, but also of the nodes of the network itself. To make matters worse, just as applications are being developed at ever faster rates, attacks are becoming more pervasive and complex. Networks need to be able to understand the impact of these attacks and protect against them.
Network control devices, such as routers, firewalls, censorship devices, and base stations, are elements of the network that make decisions on how traffic is handled. Although network control devices are expected to act according to specifications, there can be various reasons why they do not in practice. Protocols could be flawed, ambiguous or incomplete, developers could introduce unintended bugs, or attackers may find vulnerabilities in the devices and exploit them. Malfunction could intentionally or unintentionally threaten the confidentiality, integrity, and availability of end nodes and the data that passes through the network. It can also impact the availability and performance of the control devices themselves and the security policies of the network. The fast-paced evolution and scalability of current and future networks create a dynamic environment for which it is difficult to develop automated tools for testing new protocols and components. At the same time, they make the function of such tools vital for discovering implementation flaws and protocol vulnerabilities as networks become larger and more complex, and as new and potentially unrefined architectures become adopted. This thesis will present the design, implementation, and evaluation of a set of tools designed for understanding implementation of network control nodes and how they react to changes in traffic characteristics as networks evolve. We will first introduce Firecycle, a test bed for analyzing the impact of large-scale attacks and Machine-to-Machine (M2M) traffic on the Long Term Evolution (LTE) network. We will then discuss Autosonda, a tool for automatically discovering rule implementation and finding triggering traffic features in censorship devices.
This thesis provides the following contributions:
1. The design, implementation, and evaluation of two tools that discover models of network control nodes in two scenarios of evolving networks: mobile networks and the censored Internet
2. First existing test bed for analysis of large-scale attacks and impact of traffic scalability on LTE mobile networks
3. First existing test bed for LTE networks that can be scaled to arbitrary size and that deploys traffic models based on real traffic traces taken from a tier-1 operator
4. An analysis of traffic models of various categories of Internet of Things (IoT) devices
5. First study demonstrating the impact of M2M scalability and signaling overload on the packet core of LTE mobile networks
6. A specification for modeling censorship device decision models
7. A means for automating the discovery of the features utilized in censorship device decision models, the comparison of these models, and the discovery of their rules
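The last contribution, automated discovery of the traffic features a censorship device keys on, can be illustrated with a toy probing loop. Everything here is a hypothetical stand-in: `device_blocks` simulates a device rule locally, whereas a tool like Autosonda probes a real device over the network, but the mutate-one-feature-at-a-time idea is the same.

```python
# Toy sketch of feature-trigger discovery, not Autosonda's actual code.
# device_blocks() is a hypothetical stand-in for probing a real network
# control device.
def device_blocks(request):
    # Hypothetical censorship rule: block requests whose Host header
    # contains a forbidden keyword.
    return "forbidden" in request.get("host", "")

def find_triggering_features(baseline, variants):
    """Flip one feature at a time and record which flips change the
    device's verdict relative to the baseline request."""
    base_verdict = device_blocks(baseline)
    triggers = []
    for feature, value in variants.items():
        probe = dict(baseline, **{feature: value})
        if device_blocks(probe) != base_verdict:
            triggers.append(feature)
    return triggers

baseline = {"host": "example.com", "port": 80, "path": "/index.html"}
variants = {"host": "forbidden.example", "port": 8080, "path": "/blocked"}
triggers = find_triggering_features(baseline, variants)
```

Against a real device the probe step is a network request and the verdict is inferred from resets, timeouts, or block pages, but the single-variable probing strategy carries over directly.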