
    An Integrated and Distributed Framework for a Malaysian Telemedicine System (MyTel)

    The overall aim of the research was to produce a validated framework for a Malaysian integrated and distributed telemedicine system. The framework was constructed so that it could retrieve and store a patient's lifetime health record continuously and seamlessly, even during computer-system downtime and when a landline telecommunication network is unavailable. A suitable research methodology was identified, including verification and validation strategies, and a case-study approach was selected to facilitate the development of the research. Empirical data on the Malaysian health system and telemedicine context were gathered through a case study carried out at the Ministry of Health Malaysia (MOHM). Telemedicine approaches in other countries were also analysed through a literature review and compared and contrasted with the Malaysian context. A critical appraisal of the collated data resulted in the proposed framework (MyTel), a flexible telemedicine framework for the continuous upkeep of patients' lifetime health records. Further data were collected through another case study (structured interviews in the outpatient clinics/departments of MOHM) to develop and propose a lifetime health record (LHR) dataset supporting the implementation of the MyTel framework. The LHR dataset was developed after a critical analysis of the clinical consultation workflow and the usage of patients' demographic and clinical records in the outpatient clinics. At the end of the analysis, the LHR components, LHR structures and LHR messages were created and proposed. A common LHR dataset may make the proposed framework more flexible and interoperable. The first draft of the framework was validated in the three divisions of MOHM directly involved in the development of the National Health ICT project: the Telehealth Division, the Public and Family Health Division and the Planning and Development Division, which manage and develop the telehealth application, the teleprimary care application and the total hospital information system respectively. The feedback and responses from the validation process were analysed; the observations, suggestions and experience gained indicated that some modifications were essential to make the MyTel framework more functional, resulting in a revised, final framework. The proposed framework may assist in achieving continual access to a patient's lifetime health record and in providing seamless and continuous care. The lifetime health record, which correlates each episode of an individual's care into a continuous health record, is the central key to delivering the Malaysian integrated telehealth application. The important consideration, however, is that the lifetime health record should contain not only longitudinal health summary information but also allow on-line retrieval of the patient's entire health history whenever required, even during computer-system downtime and when the landline telecommunication network is unavailable.
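    The abstract names LHR components, structures and messages but gives no schema. As a minimal hypothetical sketch, assuming invented field names, of the two behaviours described (correlating episodes of care into one continuous record, and buffering records through downtime), in Python:

        # Hypothetical sketch only: the thesis defines its own LHR components,
        # structures and messages; every name and field here is illustrative.
        from dataclasses import dataclass, field
        from datetime import datetime
        from typing import List

        @dataclass
        class EpisodeRecord:
            patient_id: str
            facility: str
            seen_at: datetime
            summary: str              # clinical summary for one episode of care

        @dataclass
        class LifetimeHealthRecord:
            patient_id: str
            episodes: List[EpisodeRecord] = field(default_factory=list)

            def add(self, episode: EpisodeRecord) -> None:
                """Correlate a new episode into the continuous record."""
                self.episodes.append(episode)
                self.episodes.sort(key=lambda e: e.seen_at)

        class StoreAndForward:
            """Buffer outbound LHR messages locally during downtime."""
            def __init__(self) -> None:
                self.pending: list = []

            def send(self, message: dict, link_up: bool) -> None:
                if link_up:
                    print("delivered:", message)
                else:
                    self.pending.append(message)  # kept until the link returns

            def flush(self) -> None:
                while self.pending:
                    self.send(self.pending.pop(0), link_up=True)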

    Proceedings of the International Workshop on EuroPLOT Persuasive Technology for Learning, Education and Teaching (IWEPLET 2013)

    "This book contains the proceedings of the International Workshop on EuroPLOT Persuasive Technology for Learning, Education and Teaching (IWEPLET) 2013 which was held on 16.-17.September 2013 in Paphos (Cyprus) in conjunction with the EC-TEL conference. The workshop and hence the proceedings are divided in two parts: on Day 1 the EuroPLOT project and its results are introduced, with papers about the specific case studies and their evaluation. On Day 2, peer-reviewed papers are presented which address specific topics and issues going beyond the EuroPLOT scope. This workshop is one of the deliverables (D 2.6) of the EuroPLOT project, which has been funded from November 2010 – October 2013 by the Education, Audiovisual and Culture Executive Agency (EACEA) of the European Commission through the Lifelong Learning Programme (LLL) by grant #511633. The purpose of this project was to develop and evaluate Persuasive Learning Objects and Technologies (PLOTS), based on ideas of BJ Fogg. The purpose of this workshop is to summarize the findings obtained during this project and disseminate them to an interested audience. Furthermore, it shall foster discussions about the future of persuasive technology and design in the context of learning, education and teaching. The international community working in this area of research is relatively small. Nevertheless, we have received a number of high-quality submissions which went through a peer-review process before being selected for presentation and publication. We hope that the information found in this book is useful to the reader and that more interest in this novel approach of persuasive design for teaching/education/learning is stimulated. We are very grateful to the organisers of EC-TEL 2013 for allowing to host IWEPLET 2013 within their organisational facilities which helped us a lot in preparing this event. I am also very grateful to everyone in the EuroPLOT team for collaborating so effectively in these three years towards creating excellent outputs, and for being such a nice group with a very positive spirit also beyond work. And finally I would like to thank the EACEA for providing the financial resources for the EuroPLOT project and for being very helpful when needed. This funding made it possible to organise the IWEPLET workshop without charging a fee from the participants.

    Control mechanisms for mobile terminals in heterogeneous access technology environments

    The Internet is evolving to become mobile and ubiquitous. As a consequence, there is a trend towards diversification of access technologies, as can be seen in the recent appearance of wireless technologies such as WiFi or UMTS and the future deployment of WiMAX. Following these new opportunities, multi-technology terminals able to connect to the Internet through different technologies are appearing on the market. In this scenario, users are starting to demand solutions that use these new technologies transparently from the user's point of view. Foreseeing this demand, the IEEE started developing the IEEE 802.21 specification, which enables multi-technology terminals to hand over from one technology to another transparently for the user. This specification has not yet been finished, and its deployment requires the research community to analyse how to integrate it into current networks, how to obtain the maximum benefit from its possibilities, and how to configure its parameters. In this thesis we propose control mechanisms for IP terminals to i) support efficient handovers in multi-technology environments applying the 802.21 framework and ii) allow the use of several interfaces and/or multiple providers by the terminals to improve the failure robustness of their communications. These mechanisms are focused on the terminal, although we also provide details on how to integrate IEEE 802.21 into today's operator networks. The contributions of this thesis are threefold. First, the integration of 802.21 into terminals has been studied, focusing on the configuration of the parameters required to decide when to perform a handover in the case where the handover is initiated by the terminal; this analysis also takes into account variables such as terminal speed and link delay. Second, we have studied how to introduce the network-controlled handover concept, using 802.21, into the network, including the possibility of the handover being initiated by the network; we have analysed the main benefits of this approach and proposed and validated an implementation of the concept in 802.21. Third, we have analysed REAP, a protocol under development in the IETF that allows terminals to detect and recover from failures in the links used in their communications. We have focused on the analytical characterization of the time required to detect a failure, since this parameter is crucial for application behaviour: applications should be able to cope with a failure without being disrupted by it. Through the analytical study performed, the REAP protocol can be properly configured to achieve a target recovery time. All the proposed mechanisms have been validated through simulation, using tools such as OPNET, OMNeT++ and Matlab.
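    The abstract does not reproduce the thesis's tuned parameter values. As a minimal sketch of the kind of terminal-initiated decision rule whose parameters (threshold, hysteresis, time-to-trigger) the thesis studies, assuming invented numeric values and a simplified IEEE 802.21-style link-degradation trigger, in Python:

        # Illustrative sketch, not the thesis's exact algorithm. A
        # Link_Going_Down-style event (cf. IEEE 802.21 event services)
        # fires when the serving link stays below a threshold for a
        # time-to-trigger window; the terminal hands over only if a
        # candidate link is better by a hysteresis margin. All numeric
        # values below are assumptions.

        SERVING_THRESHOLD_DBM = -85   # below this, link is "going down"
        HYSTERESIS_DB = 5             # candidate must be this much stronger
        TIME_TO_TRIGGER = 3           # consecutive degraded samples required

        def should_handover(serving_rssi, candidate_rssi, history):
            """history: recent serving-link RSSI samples in dBm."""
            going_down = len(history) >= TIME_TO_TRIGGER and all(
                s < SERVING_THRESHOLD_DBM for s in history[-TIME_TO_TRIGGER:]
            )
            return going_down and candidate_rssi >= serving_rssi + HYSTERESIS_DB

        # Example: serving WiFi link fading while a UMTS candidate holds -80 dBm.
        samples = [-80, -84, -86, -88, -90]
        print(should_handover(samples[-1], -80, samples))  # True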

    An investigation into destination management systems website evaluation theory and practice

    The main aim of this thesis is to investigate Destination Management System (DMS) website effectiveness and evaluation in the tourism domain from both academic and industry (destination management) perspectives. The thesis begins with a comprehensive review of the literature on theories, concepts and methods used for DMS website effectiveness evaluation. The future direction of DMS website evaluation in tourism is discussed, and a conceptual framework that sets contemporary theory against the practice of DMS website evaluation is elaborated. The research first employed three rounds of a Delphi study to generate an up-to-date definition and set of aims for DMS, together with an up-to-date, comprehensive set of dimensions and criteria for evaluating the effectiveness of DMS websites. The research then employed structured interviews, as well as an online survey sent to forty-six official destination websites, to review how industry evaluates its DMS websites and to explore which approaches, criteria and dimensions practitioners use when evaluating the effectiveness of their DMS websites. The thesis also reviews additional aspects related to in-destination evaluation. The findings of the Delphi study indicated the rising emergence of social media as an important new component related to DMS. The findings also suggested additional aims beyond those previously identified for DMS: supporting sustainable destination management; empowering and supporting tourism firms; enabling collaboration at the destination; increasing consumer satisfaction levels; and capturing consumer data. Further findings indicated that, compared with those established by previous researchers, there are new additions to the proposed evaluation dimensions of DMS websites: sustainability, marketing, collaboration issues, and goals of the website. The findings indicated congruence and consensus between academic experts and industry on most of the dimensions that are crucial for DMS website evaluation; however, they also indicated limited parallels between the criteria identified in the Delphi study and those used by destination management practitioners. The thesis calls for additional research to develop a support system ensuring focused engagement between academia and industry in the area of DMS website evaluation. The thesis contributes to knowledge by generating an up-to-date and comprehensive set of dimensions and criteria for evaluating the effectiveness of a DMS website, and by identifying the dimensions, criteria and evaluation approaches currently used by industry practitioners. The literature review is presented in a way that enhances understanding of DMS websites and their comprehensive evaluation in tourism. This research is one of the first studies in the tourism field to review, compare and contrast contemporary academic and industry thinking on the evaluation of DMS websites.
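    The abstract reports evaluation dimensions and criteria but prescribes no scoring formula. As a purely hypothetical illustration of how such a dimension set could be operationalised (weights and ratings invented, not taken from the thesis), in Python:

        # Hypothetical illustration only: the thesis derives dimensions via
        # a Delphi study but does not prescribe this weighted-average score.
        weights = {                 # relative importance of each dimension
            "usability": 0.25,
            "marketing": 0.20,
            "sustainability": 0.20,
            "collaboration": 0.15,
            "social_media": 0.20,
        }
        scores = {                  # expert ratings for one DMS website, 0-10
            "usability": 8, "marketing": 6, "sustainability": 5,
            "collaboration": 7, "social_media": 9,
        }
        effectiveness = sum(weights[d] * scores[d] for d in weights)
        print(f"overall effectiveness: {effectiveness:.2f} / 10")  # 7.05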

    Redefining personal information in the context of the Internet

    Completed under a cotutelle (joint supervision) agreement with the Université Panthéon-Assas (Paris II). In the late sixties, with the growing use of computers by organizations, a very broad definition of personal information as “information about an identifiable individual” was elaborated and incorporated into data protection laws (“DPLs”). More recently, with the Internet and the circulation of new types of information (IP addresses, location information, etc.), the adequacy of this definition may be challenged. This thesis proposes a new way of interpreting personal information: instead of a literal interpretation, it proposes an interpretation that takes into account the purpose behind DPLs, in order to ensure that DPLs do what they are supposed to do, namely address or avoid the risk of harm to individuals triggered by organizations handling their personal information. While the collection or disclosure of information may trigger a more subjective kind of harm (for collection, a feeling of being observed; for disclosure, embarrassment and humiliation), the use of information will trigger a more objective kind of harm (financial, physical, discrimination, etc.). Various criteria useful for evaluating this risk of harm are proposed. The thesis aims to provide a guide for determining whether certain information should qualify as personal information, and a framework under which DPLs remain effective in light of modern technologies and the Internet.

    Exploring the Transnational Neighbourhood

    Practices of community-building in a globalised context. Urban neighbourhoods have come to occupy the public imagination as a litmus test of migration, with some areas hailed as multicultural success stories while others are framed as ghettos. In an attempt to break down this dichotomy, Exploring the Transnational Neighbourhood filters these debates through the lenses of geography, anthropology, and literary and cultural studies. By establishing the interdisciplinary concept of the 'transnational neighbourhood', it presents these localities – whether Clichy-sous-Bois, Belfast, El Segundo Barrio or Williamsburg – as densely packed contact zones where disparate cultures meet in often highly asymmetrical relations, producing a constantly shifting local and cultural knowledge about identity, belonging, and familiarity. Exploring the Transnational Neighbourhood offers a pivotal response to one of the key questions of our time: how do people create a sense of community within an exceedingly globalised context? By focusing on the neighbourhood as a central space of transcultural everyday experience within three different levels of discourse (the virtual, the physical-local, and the transnational-global), the multidisciplinary contributions explore bottom-up practices of community-building alongside cultural, social, economic, and historical barriers. Contributors: Christina Horvath (University of Bath), Maria Roca Lizarazu (NUI Galway), Emilio Maceda Rodriguez (Universidad Autónoma de Tlaxcala), Naomi Wells (IMLR, University of London), Anne Fuchs (University College Dublin), Gad Schaffer (Tel-Hai Academic College), Daniela Bohórquez Sheinin (University of Michigan), Anna Marta Marini (Universidad de Alcalá), Godela Weiss-Sussex (IMLR, University of London), Britta C. Jung (Maynooth University), Emma Crowley (University of Bristol), Mary Mazzilli (University of Essex). Ebook available in Open Access. This publication is GPRC-labeled (Guaranteed Peer-Reviewed Content).

    Automating SLA enforcement in cloud computing

    Cloud computing is playing an increasingly important role, not only by facilitating digital trading platforms but also by transforming conventional client-server services into cloud services. The domain has grown because of the global economic and technological benefits it offers to both service providers and service subscribers. Digital marketplaces are no longer limited to trading tangible commodities; they also facilitate large-scale service virtualization across various industries. Software as a Service (SaaS), the largest service segment, dominates global cloud migration, while Infrastructure as a Service (IaaS) and cloud-based application development, known as Platform as a Service (PaaS), are next-generation computing platforms in strong demand from both the public and private sectors. These service segments are hosted on cloud platforms that compute, store and network an enormous number of service requests, processing data quickly and economically. Organizations also perform data analytics and similar computing tasks to manage their business without maintaining hard-to-maintain on-premise computing infrastructure. This computing capability has greatly increased the popularity of, and demand for, cloud services, to the extent that businesses worldwide are migrating their computing resources to these platforms on a large scale. Diverse cloud service providers take responsibility for provisioning such cloud-based services for subscribers; in return, subscribers are charged a periodic subscription fee that depends on the service package, availability and security. On the flip side, this intensive technology shift and reliance on outsourcing mean that any failure on the provider's part can have serious consequences for the business community at large. In recent years the technology industry has observed critical and increasingly frequent service outages at cloud service providers (CSPs) such as Amazon AWS, Microsoft and Google, which interrupt the entire supply chain and cause several well-known web services to be taken offline, whether due to human error, failed change-control implementation or, more recently, targeted cyber-attacks such as DDoS. These web-based solutions, such as compute, storage, network or other similar services, are provisioned to cloud service subscriber (CSS) platforms. Regardless of how a cloud service is deployed, a legal binding such as a Service Level Agreement (SLA) is signed between the CSP and the CSS. The SLA records the scope of the service and the guarantees that apply in case of failure. These SLAs may be violated, revoked or dishonoured by either party, most often the CSP. An SLA violation, together with an unsettled dispute, leads to financial losses for service subscribers or may cost them their business reputation. Eventually, the subscriber may request some form of compensation from the provider, such as a service credit or a refund. In either case, the burden of proof lies with the subscribers, who must capture and preserve data and forensically sound system or service logs supporting their claims. Most of the time this is processed manually, which is both expensive and time-consuming. To address this problem, this research first analyses the gaps in existing arrangements. It then proposes automating SLA enforcement within cloud environments and identifies the main properties of a solution to the problem, covering avenues associated with other operating environments as well. The research subsequently proposes architectures based on the concept of fair exchange and shows how the approach enforces cloud SLAs using various techniques. The research scope is then extended to cover two key scenarios: (a) when participants are loss-averse, and (b) when interacting participants can act maliciously. Our proposed architectures present robust schemes that are effective, efficient and, most importantly, resilient to modern-day security and privacy challenges. The uniqueness of this research is that it not only ensures fairness in digital trading but also implements a dual security layer throughout the service exchange. This approach protects business participants by securely automating dispute resolution in a resilient fashion, and shields their data privacy and security from diverse cyber challenges and other operational failures. The architectures can impose state-of-the-art defences through integrated secure modules and full encryption schemes, mitigating security gaps not previously addressed by fair-exchange protocols. The protocol supports service-exchange scenarios both with and without dispute resolution. Finally, our proposed architectures are automated and interact through hardcoded procedures and verification mechanisms using variants of trusted third parties and trusted authorities, which makes potential disagreements and misbehaviour during a cloud-based service exchange difficult and thereby enforces the SLA.
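    The abstract notes that subscribers must preserve forensically sound logs to back SLA claims; the thesis's own answer is a fair-exchange architecture whose details are not reproduced here. As a loose, hypothetical sketch of just the tamper-evident evidence-log ingredient (record fields invented), in Python:

        # Generic hash-chained evidence log, not the thesis's fair-exchange
        # protocol. Any later edit to a stored record breaks the chain, so a
        # subscriber can demonstrate the log was not altered after the fact.
        import hashlib, json, time

        def append_entry(chain, record):
            prev_hash = chain[-1]["hash"] if chain else "0" * 64
            body = json.dumps(record, sort_keys=True)
            entry_hash = hashlib.sha256((prev_hash + body).encode()).hexdigest()
            chain.append({"record": record, "prev": prev_hash, "hash": entry_hash})

        def verify(chain):
            prev = "0" * 64
            for e in chain:
                body = json.dumps(e["record"], sort_keys=True)
                if e["prev"] != prev or \
                   e["hash"] != hashlib.sha256((prev + body).encode()).hexdigest():
                    return False
                prev = e["hash"]
            return True

        log = []
        append_entry(log, {"ts": time.time(), "probe": "api.example.net", "up": True})
        append_entry(log, {"ts": time.time(), "probe": "api.example.net", "up": False})
        print(verify(log))  # True; tampering with any record makes this False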