29 research outputs found

    Development and face validation of strategies for improving consultation skills

    While formative workplace-based assessment can improve learners' skills, it often does not, because the procedures used do not facilitate feedback that is sufficiently specific to scaffold improvement. Provision of pre-formulated strategies to address predicted learning needs has the potential to improve the quality, and automate the provision, of written feedback. Our aim was to systematically develop, validate and maximise the utility of a comprehensive list of strategies for the improvement of consultation skills, through a process involving both medical students and their clinical primary and secondary care tutors. We used a modified Delphi study with tutors and a modified nominal group study with students, with moderation of outputs by consensus round-table discussion by the authors. 35 hospital and 21 GP tutors participated in the Delphi study and contributed 153 new or modified strategies. After review of these and the 205 original strategies, 265 strategies entered the nominal group study, to which 46 year-four and year-five students contributed, resulting in a final list of 249 validated strategies. We have developed a valid and comprehensive set of strategies which are considered useful by medical students. This list can be immediately applied by any school which uses the Calgary-Cambridge Framework to inform the content of formative feedback on consultation skills. We consider that the list could also be mapped to alternative skills frameworks and so be utilised by schools which do not use the Calgary-Cambridge Framework.

    Cache-aided combination networks with interference

    Centralized coded caching and delivery is studied for a radio access combination network (RACN), whereby a set of H edge nodes (ENs), connected to a cloud server via orthogonal fronthaul links with limited capacity, serve a total of K user equipments (UEs) over wireless links. The cloud server is assumed to hold a library of N files, each of size F bits; and each user, equipped with a cache of size μR·N·F bits, is connected to a distinct set of r ENs, each of which is equipped with a cache of size μT·N·F bits, where μT, μR ∈ [0, 1] are the fractional cache capacities of the ENs and the UEs, respectively. The objective is to minimize the normalized delivery time (NDT), which refers to the worst-case delivery latency when each user requests a single distinct file from the library. Three coded caching and transmission schemes are considered, namely the MDS-IA, soft-transfer and zero-forcing (ZF) schemes. MDS-IA utilizes maximum distance separable (MDS) codes in the placement phase and real interference alignment (IA) in the delivery phase. The achievable NDT for this scheme is presented for r = 2 and arbitrary fractional cache sizes μT and μR, and also for an arbitrary value of r and fractional cache size μT when the cache capacity of the UE is above a certain threshold. The soft-transfer scheme utilizes soft transfer of coded symbols to ENs that implement ZF over the edge links. The achievable NDT for this scheme is presented for arbitrary r and arbitrary fractional cache sizes μT and μR. The last scheme utilizes ZF between the ENs and the UEs without the participation of the cloud server in the delivery phase. The achievable NDT for this scheme is presented for an arbitrary value of r when the total cache size at a pair of UE and EN is sufficient to store the whole library, i.e., μT + μR ≥ 1. The results indicate that the fronthaul capacity determines which scheme achieves a better performance in terms of the NDT, and the soft-transfer scheme becomes favorable as the fronthaul capacity increases.
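The abstract uses the NDT without stating its definition. A common formulation in the cache-aided interference network literature (assumed here, since the paper's exact convention is not reproduced in this abstract) normalizes the worst-case delivery time T by F/log P, the time an ideal interference-free point-to-point link of capacity log P (at transmit power P) would need to deliver one F-bit file:

```latex
\delta(\mu_T,\mu_R)
  \;=\; \lim_{P\to\infty}\;\lim_{F\to\infty}\;
        \sup \frac{T(\mu_T,\mu_R,P,F)}{F/\log P}
```

Under this convention a smaller NDT is better, and δ = 1 corresponds to interference-free delivery; the schemes above are compared by how close their achievable δ gets to this ideal as the fronthaul capacity varies.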

    Addressing human factors in the design of cryptographic solutions: two case studies in item validation and authentication

    Advisor: Ricardo Dahab. Doctoral thesis, Universidade Estadual de Campinas, Instituto de Computação. Designing secure cryptographic solutions from a purely theoretical perspective is not enough to guarantee their success in realistic scenarios. Many times, the assumptions under which these solutions are designed could not be further from real-world necessities. One particular, often-overlooked aspect that may affect how a solution performs after deployment is how the end user interacts with it (i.e., human factors). In this work, we take a deeper look into this issue by analyzing two well-known application scenarios from Information Security research: the electronic commerce of digital items and Internet banking. Fair exchange protocols have been widely studied, but are still not implemented in most e-commerce transactions available. For several types of digital items (e-goods), the current e-commerce business model fails to provide fairness to customers. A critical step in fair exchange is item validation, which still lacks proper attention from researchers. We believe this issue should be addressed in a comprehensive and integrated fashion before fair exchange protocols can be effectively deployed in the marketplace. More generally, we also believe this to be a consequence of ongoing system-oriented security design paradigms that are data-centered, as opposed to user-centered, leading to methods and techniques that often disregard users' requirements. We contextualize how, by overlooking the subtleties of the item validation problem, the current model for buying and selling digital items fails to provide customers with guarantees of a successful transaction outcome, and is thus unfair by design. We also introduce the concept of Reversible Degradation, a method for enhancing buy-sell transactions of digital items that inherently includes the item validation step in the purchase protocol in order to tackle the discussed problems. As a proof of concept, we produce a deliverable instantiation of Reversible Degradation based on systematic error correction codes (SECCs), suitable for multimedia content. This method is also the byproduct of an attempt to include users' requirements in the cryptographic method construction process, an approach that we further develop into a so-called item-aware protocol design. From a similar perspective, we also propose a novel method for user and transaction authentication in Internet banking scenarios. The proposed method, based on Visual Cryptography, takes both technical and user requirements into account, and is suitable as a secure, yet intuitive, component for practical transaction authentication scenarios. Doctorate in Computer Science.
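The core idea of Reversible Degradation can be sketched in a few lines: the buyer receives and validates a deliberately degraded item, and paying releases the small piece of repair data that restores it. The sketch below is a toy with a single XOR parity block standing in for a real systematic error-correcting code; all names are illustrative, and none of this is the thesis's actual SECC construction.

```python
# Toy sketch of the Reversible Degradation idea (illustrative only; the
# thesis uses real systematic error-correcting codes, not one XOR parity).
from functools import reduce


def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))


def degrade(content: bytes, block: int = 4):
    """Split content into fixed blocks, withhold one, keep an XOR parity.

    Returns (degraded_blocks, withheld_index, repair_parity).
    """
    blocks = [content[i:i + block] for i in range(0, len(content), block)]
    blocks[-1] = blocks[-1].ljust(block, b"\0")   # pad so XOR is well defined
    parity = reduce(xor_bytes, blocks)
    withheld = len(blocks) // 2                   # seller blanks this block
    degraded = blocks[:withheld] + [b"\0" * block] + blocks[withheld + 1:]
    return degraded, withheld, parity


def restore(degraded, withheld, parity):
    """Recover the withheld block from the parity (systematic-code style)."""
    others = [b for i, b in enumerate(degraded) if i != withheld]
    missing = reduce(xor_bytes, others + [parity])
    return b"".join(degraded[:withheld] + [missing] + degraded[withheld + 1:])


item = b"pay-per-view movie payload!!"
deg, idx, par = degrade(item)
# The buyer can inspect `deg` (the item validation step) before paying;
# after payment the seller releases `par`, which restores the item:
assert restore(deg, idx, par) == item
```

Because the degraded copy is derived from the real item, validating it validates the purchase content itself, which is what makes the exchange fair by construction rather than by trust.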

    To Cloud or not to Cloud: A Qualitative Study on Self-Hosters’ Motivation, Operation, and Security Mindset

    Despite readily available cloud services, some people decide to self-host internal or external services for themselves or their organization. In doing so, a broad spectrum of commercial, institutional, and private self-hosters take responsibility for their data, security, and the reliability of their operations. Currently, little is known about what motivates these self-hosters, how they operate and secure their services, and which challenges they face. To improve the understanding of self-hosters' security mindsets and practices, we conducted a large-scale survey (NS = 994) with users of a popular self-hosting suite and in-depth follow-up interviews with selected commercial, non-profit, and private users (NI = 41). We found exemplary behavior in all user groups; however, we also found a significant share of self-hosters who approach security in an unstructured way, regardless of social or organizational embeddedness. Vague catch-all concepts such as firewalls and backups dominate the landscape, without proper reflection on the threats they help mitigate. At times, self-hosters engage in creative tactics to compensate for a potential lack of expertise or experience.

    The Influences of Human-Made Disasters on the State of Government to Citizens ICT Services: Users’ Perspectives

    The recent human-made disasters in the Middle East harmed governments' functionality, causing difficulties in various aspects of life for citizens. The affected government functions include government-to-citizens (G2C) ICT services. There is an absence of empirical studies clarifying the real state of G2C ICT services among citizens during a human-made disaster, even though such services could serve affected people who face difficulties and risks that hinder their access to government sites. This paper attempts to fill this gap in the literature by empirically investigating these services in Iraq as a war-torn country. A literature review was conducted, and the investigated issue was examined through a self-administered survey. Results point out a lack of flexibility when using ICT services, and a notable unawareness of the availability of G2C ICT services among internally displaced persons (IDPs).

    OPTIMAX 2014 - Radiation dose and image quality optimisation in medical imaging

    Medical imaging is a powerful diagnostic tool. Consequently, the number of medical images taken has increased vastly over the past few decades. The most common medical imaging techniques use X-radiation as the primary investigative tool. The main limitation of using X-radiation is the associated risk of developing cancers. Alongside this, technology has advanced and more centres now use CT scanners; these can incur significant radiation burdens compared with traditional X-ray imaging systems. The net effect is that the population radiation burden is rising steadily. Risk arising from X-radiation for diagnostic medical purposes needs minimising, and one way to achieve this is by reducing radiation dose whilst optimising image quality. All ages are affected by risk from X-radiation; however, the increasing population age highlights the elderly as a new group that may require consideration. Of greatest concern are paediatric patients: firstly, they are more sensitive to radiation; secondly, their younger age means that the potential detriment to this group is greater. Containment of radiation exposure falls to a number of professionals within medical fields, from those who request imaging to those who produce the image. These staff are supported in their radiation protection role by engineers, physicists and technicians. It is important to realise that radiation protection is currently a major European focus of interest, and minimum competence levels in radiation protection for radiographers have been defined through the integrated activities of the EU consortium called MEDRAPET. The outcomes of this project have been used by the European Federation of Radiographer Societies to describe the European Qualifications Framework levels for radiographers in radiation protection. 
Though variations exist between European countries, radiographers and nuclear medicine technologists are normally the professional groups responsible for exposing screening populations and patients to X-radiation. As part of their training they learn fundamental principles of radiation protection and theoretical and practical approaches to dose minimisation. However, dose minimisation is complex – it is not simply about reducing X-radiation without taking into account major contextual factors. These factors relate to the real world of clinical imaging and include the need to measure clinical image quality and lesion visibility when applying X-radiation dose reduction strategies. This requires the use of validated psychological and physics techniques to measure clinical image quality and lesion perceptibility.

    Decryption Failure Attacks on Post-Quantum Cryptography

    This dissertation mainly discusses new cryptanalytical results related to the challenge of securely implementing the next generation of asymmetric cryptography, or Public-Key Cryptography (PKC). PKC, as it has been deployed until today, depends heavily on the integer factorization and discrete logarithm problems. Unfortunately, it has been well known since the mid-90s that these mathematical problems can be solved in polynomial time on a quantum computer using Peter Shor's algorithm. The recently accelerated pace of R&D towards quantum computers, eventually of sufficient size and power to threaten cryptography, has led the crypto research community towards a major shift of focus. A project towards standardization of Post-Quantum Cryptography (PQC) was launched by the US-based standardization organization NIST. PQC is the name given to algorithms designed to run on classical hardware/software whilst being resistant to attacks from quantum computers, making PQC well suited to replace the current asymmetric schemes. A primary motivation for the project is to guide publicly available research toward the singular goal of finding weaknesses in the proposed next generation of PKC. For public-key encryption (PKE) or digital signature (DS) schemes to be considered secure, they must be shown to rely on well-known mathematical problems, with theoretical proofs of security under established models such as indistinguishability under chosen ciphertext attack (IND-CCA). They must also withstand serious attack attempts by well-renowned cryptographers, concerning both theoretical security and the actual software/hardware instantiations. It is well known that security models such as IND-CCA are not designed to capture the intricacies of inner-state leakages. Such leakages are named side-channels, currently a major topic of interest in the NIST PQC project. This dissertation focuses on two questions: 1) how does the low but non-zero probability of decryption failures affect the cryptanalysis of these new PQC candidates? And 2) how might side-channel vulnerabilities inadvertently be introduced when going from theory to the practice of software/hardware implementations? Of main concern are PQC algorithms based on lattice theory and coding theory. The primary contributions are the discovery of novel decryption-failure side-channel attacks, improvements on existing attacks, an alternative implementation of part of a PQC scheme, and some more theoretical cryptanalytical results.
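The "low but non-zero probability of decryption failures" can be made concrete with a stripped-down sketch. In LWE-style encryption a message bit is encoded as a multiple of q/2 plus noise, and decryption rounds back; the bit is recovered correctly only while the accumulated noise stays below q/4. The toy below is not any real NIST candidate (it borrows only the modulus q = 3329 from Kyber as a familiar example) and omits keys entirely to isolate the rounding step.

```python
# Toy illustration of decryption failure in LWE-style encryption: a bit
# survives decryption only while the accumulated noise is below q/4.
# Not a real scheme -- q = 3329 is borrowed from Kyber purely as an example.
q = 3329


def encrypt_bit(bit: int, noise: int) -> int:
    # ciphertext "payload": the bit scaled to q/2, plus noise, mod q
    return (bit * (q // 2) + noise) % q


def decrypt_bit(c: int) -> int:
    # round to the nearest multiple of q/2: values near q/2 decode as 1
    return 1 if q // 4 <= c < 3 * q // 4 else 0


# normal case: small noise decrypts correctly
assert decrypt_bit(encrypt_bit(1, noise=100)) == 1
# failure case: noise magnitude past q/4 flips the decoded bit
assert decrypt_bit(encrypt_bit(1, noise=q // 4 + 1)) == 0
```

In real schemes the noise depends on the secret key, so an attacker who can submit many crafted ciphertexts and observe which ones fail to decrypt learns information about the secret; that observation is the starting point of the decryption-failure attacks the dissertation studies.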

    Development of MEMS Piezoelectric Vibration Energy Harvesters with Wafer-Level Integrated Tungsten Proof-Mass for Ultra Low Power Autonomous Wireless Sensors

    Small-scale, localized power generation using vibration energy harvesting is an attractive solution to enhance the autonomy of some wireless sensor nodes used in the Internet of Things (IoT). Conversion of ambient mechanical energy into electricity is most often done through inertial resonant piezoelectric microdevices. This thesis presents an extensive study of this technology and proposes a process to fabricate MEMS microgenerators with record performances compared to the state of the art. We first present a complete review of the physical and technological limits of this technology to assess the best path of improvement. Reported approaches (geometries, architectures, materials, circuits) are evaluated, and figures of merit are proposed to compare the state of the art. These analyses show that the fundamental limit is the absorbed energy, as most proposals to date already address the other limits. For a linear resonant generator, the absorbed energy depends on the level of vibrations available, but also on the mass of the device and its quality factor. To guide the design of prototypes, we conducted a study on the potential of autonomous sensors in vehicles. A survey of sensors present on a car was carried out to estimate their compatibility with energy harvesting technologies. Vibration measurements taken on a running vehicle at relevant locations showed that the energy available for MEMS devices is mostly located in a frequency range of 30 to 150 Hz and could yield power in the range of 1-10 μW per gram from a linear resonator. To limit the size of a MEMS generator capable of producing 10 μW, a mass density higher than that of silicon is needed, which motivates the development of a process that incorporates tungsten. Although the effect of tungsten on the device sensitivity is well known, we also demonstrate that it reduces the impact of fluidic damping on the mechanical quality factor Qm. When fluidic damping is dominant, switching to tungsten can improve Qm by an order of magnitude, from 10³ to 10⁴ in ambient air. As a result, the device efficiency is improved despite the lack of a vacuum package. We then propose a fabrication process flow to integrate 500 μm thick tungsten proof masses at the wafer level. This process combines wafer bonding with a two-step wet metal etching approach. We present each of the fabrication nodes realized to demonstrate the feasibility of the process, which led to the fabrication of several prototypes. These devices were tested in the lab, with some designs demonstrating record-breaking performances in terms of normalized power density. Our best design is noteworthy for its figure of merit of around 2.5 mW·s⁻¹/(mm³·(m/s²)²), the best reported in the state of the art. With a volume of 3.5 mm³, it operates at 552.7 Hz and produces 2.7 μW at 1.6 V RMS from an acceleration of 1 m/s². These results show that tungsten integration in MEMS microgenerators is very advantageous, helping to close the gap with the needs of real applications.
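The claim that absorbed energy scales with proof mass and quality factor, which is what motivates the tungsten integration, follows from the classic linear resonant harvester model (Williams-Yates). At resonance, with electrical damping matched to mechanical damping, the maximum electrical power is P_max = m·Qm·A²/(8ω). The sketch below assumes that textbook model, not the thesis's exact device model, and the numbers in the example are hypothetical, not taken from the thesis.

```python
# Back-of-the-envelope bound on the electrical power of a linear resonant
# harvester (classic Williams-Yates model, assumed here): at resonance,
# with matched electrical damping, P_max = m * Q_m * A^2 / (8 * omega).
# Power grows linearly with proof mass m and quality factor Q_m, which is
# why a dense tungsten mass and a higher Q_m both pay off directly.
import math


def max_power(mass_kg: float, q_mech: float, accel_ms2: float,
              freq_hz: float) -> float:
    """Matched-load power bound for a linear resonant inertial harvester."""
    omega = 2 * math.pi * freq_hz
    return mass_kg * q_mech * accel_ms2 ** 2 / (8 * omega)


# Hypothetical numbers (not from the thesis): a 0.2 g proof mass, Q_m = 1000,
# driven at 1 m/s^2 at 100 Hz, inside the 30-150 Hz automotive band.
p = max_power(mass_kg=2e-4, q_mech=1000, accel_ms2=1.0, freq_hz=100.0)
print(f"{p * 1e6:.1f} uW")  # prints "39.8 uW"
```

Doubling the proof mass or the quality factor doubles this bound, which is the quantitative reason the tungsten integration and the order-of-magnitude Qm gain reported above translate directly into output power.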