
    Web Vulnerability Study of Online Pharmacy Sites

    Consumers are increasingly using online pharmacies, but these sites may not provide an adequate level of security for consumers’ personal data. There is a gap in the research addressing security vulnerabilities in this industry. The objective is to identify the level of web application security vulnerabilities in online pharmacies and the common types of flaws, thus expanding on prior studies. Technical, managerial and legal recommendations on how to mitigate security issues are presented. The proposed four-step method first consists of choosing an online testing tool. The next steps involve choosing a list of 60 online pharmacy sites to test, and then running the software analysis to compile a list of flaws. Finally, an in-depth analysis is performed on the types of web application vulnerabilities. The majority of sites had serious vulnerabilities, with most flaws being cross-site scripting or old versions of software that had not been updated. A method is proposed for securing online pharmacy sites, using a multi-phased approach of technical and managerial techniques together with a thorough understanding of national legal requirements for securing systems.
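    The last two steps of the four-step method — running the scan and analysing the resulting flaw list — can be sketched in a few lines. The data and the `summarise` helper below are illustrative assumptions, not the study's actual tool or dataset; a real run would feed in the output of the chosen online testing tool.

```python
from collections import Counter

# Hypothetical scanner output: (site, flaw_type) records such as a web
# testing tool might emit in step 3 of the method. Data is illustrative.
findings = [
    ("pharmacy-a.example", "cross-site scripting"),
    ("pharmacy-a.example", "outdated software"),
    ("pharmacy-b.example", "cross-site scripting"),
    ("pharmacy-c.example", "sql injection"),
]

def summarise(findings):
    """Step 4: tally flaw categories and count affected sites."""
    by_type = Counter(flaw for _site, flaw in findings)
    vulnerable_sites = {site for site, _flaw in findings}
    return by_type, vulnerable_sites

if __name__ == "__main__":
    by_type, sites = summarise(findings)
    print(by_type.most_common())   # most frequent flaw categories first
    print(len(sites), "sites with at least one finding")
```

In the study's terms, `most_common()` is what surfaces cross-site scripting and outdated software as the dominant categories.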

    Smart Change in Strategy: IBM’s Response to Challenging Times

    International Business Machines, or IBM, has been an indisputable force in the communications industry, leading the world into the modern, global technological age. IBM has now revamped its business paradigm, allocating the majority of its resources to consulting work and data analytics rather than manufacturing as it had in the past, with the primary objectives of optimizing existing assets, resolving the problems of burgeoning cities and beleaguered governments, and enhancing people’s lives in a sustainable way.

    50 years of isolation

    The traditional means of isolating applications from each other is the operating system’s “process” abstraction. However, as applications now consist of multiple fine-grained components, the traditional process abstraction model is proving insufficient to ensure this isolation. Statistics indicate that a high percentage of software failures occur due to the propagation of component failures. These observations are further bolstered by the attempts of modern Internet browser developers, for example, to adopt multi-process architectures in order to increase robustness. Therefore, a fresh look at the available options for isolating program components is necessary, and this paper provides an overview of previous and current research in the area.
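    The failure-containment property of the process abstraction can be demonstrated in a few lines: a component that crashes in its own OS process cannot corrupt or terminate the host. This is a minimal sketch (the `run_isolated` helper and the toy component are assumptions for illustration), not the architecture of any particular browser.

```python
import subprocess
import sys

def run_isolated(component_code):
    """Run a component in its own OS process, so an uncaught failure
    is confined to that process's address space (the 'process'
    abstraction discussed in the paper). Returns the child's exit code."""
    return subprocess.run([sys.executable, "-c", component_code]).returncode

# A deliberately faulty component: its crash stays in the child.
faulty_component = "raise RuntimeError('component failure')"

if __name__ == "__main__":
    code = run_isolated(faulty_component)
    print("child exit code:", code)   # non-zero: the component failed
    print("parent still running")     # the host survives the failure
```

Had `faulty_component` been executed in-process instead, the uncaught exception would have propagated into the host, which is exactly the failure-propagation problem the abstract describes.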

    Reinforcing the Surveillance of EU Borders: The Future Development of FRONTEX and EUROSUR. CEPS Challenge Paper No. 11, 19 August 2008.

    This paper assesses the implications of the European Commission Communications on the evaluation and future development of FRONTEX (European Agency for the Management of Operational Cooperation at the External Borders of the Member States of the European Union) and the establishment of EUROSUR (European border surveillance system). It emphasises that the evaluation of the activities conducted by the EU’s external borders agency over the period 2006–07 fails to address the impact of such undertakings on fundamental rights and freedoms, focusing solely on technical issues and overall efficiency. It argues, furthermore, that the prospects for the development of FRONTEX, including through the proposal for EUROSUR, do not sufficiently address this matter either, while envisaging a significant reinforcement of the modalities of surveillance aimed at the EU’s external borders. The paper discusses the proposals presented in the two Communications, showing how they raise issues from a legal, technical, budgetary and political (i.e. the political desirability of additional measures for surveillance at the EU borders) standpoint. It concludes with a set of recommendations regarding how the prospects included in the two Communications should be approached.

    Grid Infrastructure for Domain Decomposition Methods in Computational ElectroMagnetics

    The accurate and efficient solution of Maxwell's equations is the problem addressed by the scientific discipline called Computational ElectroMagnetics (CEM). Many macroscopic phenomena in a great number of fields are governed by this set of differential equations: electronics, geophysics, medical and biomedical technologies, and virtual EM prototyping, besides the traditional antenna and propagation applications. Therefore, many efforts are focused on the development of new and more efficient approaches to solving Maxwell's equations, and interest in CEM applications keeps growing. Several problems that were hard to tackle a few years ago can now be easily addressed thanks to the reliability and flexibility of new technologies, together with increased computational power. This technological evolution opens the possibility of addressing large and complex tasks. Many of these applications aim to simulate electromagnetic behavior, for example in terms of input impedance and radiation pattern in antenna problems, or Radar Cross Section in scattering applications. Problems whose solution requires high accuracy instead need full-wave analysis techniques, e.g. in the virtual prototyping context, where the objective is to obtain reliable simulations in order to minimize the number of measurements and, as a consequence, their cost. Other tasks require the analysis of complete structures (including a high number of details) by directly simulating a CAD model. This approach relieves the researcher of the burden of removing useless details, while maintaining the original complexity and taking all details into account. Unfortunately, it implies (a) a high computational effort, due to the increased number of degrees of freedom, and (b) a worsening of the spectral properties of the linear system during complex analyses. 
The above considerations underline the need to identify appropriate information technologies that ease the achievement of a solution and speed up the required computations. The authors' analysis and expertise suggest that Grid Computing techniques can be very useful for these purposes. Grids appear mainly in high-performance computing environments. In this context, hundreds of off-the-shelf nodes are linked together and work in parallel to solve problems that previously could only be addressed sequentially or by using supercomputers. Grid Computing is a technique developed to process enormous amounts of data, and it enables large-scale resource sharing to solve problems in distributed scenarios. The main advantage of the Grid comes from parallel computing: if a problem can be split into smaller tasks that can be executed independently, its solution can be computed considerably faster. To exploit this advantage, it is necessary to identify a technique able to split the original electromagnetic task into a set of smaller subproblems. The Domain Decomposition (DD) technique, based on the block generation algorithm introduced in Matekovits et al. (2007) and Francavilla et al. (2011), perfectly addresses these requirements (see Section 3.4 for details). In this chapter, a Grid Computing infrastructure is presented. This architecture allows parallel block execution by distributing tasks to the nodes that belong to the Grid. The set of nodes is composed of physical machines and virtualized ones. This feature enables great flexibility and increases the available computational power. Furthermore, the presence of virtual nodes allows full and efficient Grid usage; indeed, the presented architecture can be used by different users running different applications.
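The decompose-then-solve-in-parallel pattern described above can be sketched generically. This is not the block generation algorithm of Matekovits et al.; the `decompose` split and the toy `solve_block` computation are illustrative stand-ins for producing independent subproblems and solving one block on one Grid node.

```python
from concurrent.futures import ProcessPoolExecutor

def solve_block(block):
    """Stand-in for solving one independent subproblem; a real system
    would assemble and solve the block's portion of the linear system."""
    return sum(x * x for x in block)

def decompose(data, n_blocks):
    """Naive even split of the global problem into independent blocks
    (the role played by the DD block generation step in the chapter)."""
    size = max(1, len(data) // n_blocks)
    return [data[i:i + size] for i in range(0, len(data), size)]

if __name__ == "__main__":
    data = list(range(1000))
    blocks = decompose(data, 4)
    # Each block is executed independently, as on separate Grid nodes;
    # a process pool stands in for physical and virtualized machines.
    with ProcessPoolExecutor() as pool:
        partials = list(pool.map(solve_block, blocks))
    print(sum(partials))   # combining partials reproduces the global result
```

The key property the Grid exploits is visible here: because the blocks share no state, they can be dispatched to any mix of nodes, and only the cheap combination step is serial.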