1,248 research outputs found

    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    Get PDF
    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    Beam scanning by liquid-crystal biasing in a modified SIW structure

    Get PDF
    A fixed-frequency beam-scanning 1D antenna based on Liquid Crystals (LCs) is designed for use in 2D scanning with lateral alignment. The 2D array environment requires full decoupling of adjacent 1D antennas, which often conflicts with the LC requirement of DC biasing; the proposed design accommodates both. The LC medium is placed inside a Substrate Integrated Waveguide (SIW) modified to operate as a Groove Gap Waveguide, with radiating slots etched on the upper broad wall so that the structure radiates as a Leaky-Wave Antenna (LWA). This allows effective application of the DC bias voltage needed for tuning the LCs while the RF field remains laterally confined, making it possible to place several antennas in parallel and achieve 2D beam scanning. The design is validated by simulation employing the actual properties of a commercial LC medium.
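    For context, the fixed-frequency steering mechanism described above can be summarized by the standard leaky-wave pointing relation below. This is a generic textbook relation added for illustration, not a formula taken from the paper, and the symbols (beta, k_0, epsilon_r) are assumptions of this sketch.

        % Standard leaky-wave antenna pointing relation (illustrative only):
        % the main beam points at an angle \theta_m from broadside given by
        \sin\theta_m \approx \frac{\beta(\omega, \varepsilon_r)}{k_0}
        % At fixed frequency \omega, the DC bias tunes the LC permittivity \varepsilon_r,
        % which shifts the phase constant \beta of the leaky mode and hence steers \theta_m.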

    Cerebrovascular dysfunction in cerebral small vessel disease

    Get PDF
    INTRODUCTION: Cerebral small vessel disease (SVD) is the cause of a quarter of all ischaemic strokes and is postulated to have a role in up to half of all dementias. SVD pathophysiology remains unclear, but cerebrovascular dysfunction may be important. If confirmed, many licensed medications have mechanisms of action targeting vascular function, potentially enabling new treatments via drug repurposing. Knowledge is limited, however, as most studies assessing cerebrovascular dysfunction are small, single-centre, single-imaging-modality studies, owing to the complexities of measuring cerebrovascular dysfunction in humans. This thesis describes the development and application of imaging techniques measuring several cerebrovascular dysfunctions to investigate SVD pathophysiology and to trial medications that may improve small blood vessel function in SVD.
    METHODS: Participants with minor ischaemic strokes were recruited to a series of studies using advanced MRI techniques to measure cerebrovascular dysfunction. Specifically, MRI scans measured the ability of different brain tissues to change blood flow in response to breathing carbon dioxide (cerebrovascular reactivity; CVR) and the flow and pulsatility through the cerebral arteries, venous sinuses and CSF spaces. A single-centre observational study optimised the techniques, established their feasibility and tested associations of cerebrovascular dysfunctions with clinical and imaging phenotypes. A randomised pilot clinical trial then tested the ability of two medications (cilostazol and isosorbide mononitrate) to improve CVR and pulsatility over a period of eight weeks. The techniques were then expanded to include imaging of blood-brain barrier permeability and used in multi-centre studies investigating cerebrovascular dysfunction in both sporadic and monogenic SVDs.
    RESULTS: Imaging protocols were feasible, consistently being completed with usable data in over 85% of participants. After correcting for the effects of age, sex and systolic blood pressure, lower CVR was associated with higher white matter hyperintensity volume, Fazekas score and perivascular space counts. Lower CVR was also associated with higher pulsatility of blood flow in the superior sagittal sinus and lower CSF flow stroke volume at the foramen magnum. Cilostazol and isosorbide mononitrate increased CVR in white matter. The CVR, intracranial flow and pulsatility techniques, alongside blood-brain barrier permeability and microstructural integrity imaging, were successfully employed in a multi-centre observational study. A clinical trial assessing the effects of drugs targeting blood pressure variability is nearing completion.
    DISCUSSION: Cerebrovascular dysfunction in SVD has been confirmed and may play a more direct role in disease pathogenesis than previously established risk factors. Advanced imaging measures assessing cerebrovascular dysfunction are feasible in multi-centre studies and trials. Identifying drugs that improve cerebrovascular dysfunction using these techniques may be useful in selecting candidates for definitive clinical trials, which require large sample sizes and long follow-up periods to show improvement against outcomes of stroke and dementia incidence and cognitive function.
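    For reference, one common operational definition of the CVR measure used above is given below. This is a generic definition added for orientation and may not match the exact formulation used in the thesis.

        % Cerebrovascular reactivity (generic definition, illustrative only):
        \mathrm{CVR} = \frac{\Delta \mathrm{CBF} / \mathrm{CBF}_{0} \times 100\%}{\Delta \mathrm{EtCO_2}}
        % i.e. the percentage change in cerebral blood flow (or in the BOLD signal, for
        % BOLD-based protocols) per mmHg change in end-tidal CO2 during the CO2 challenge.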

    Automatic Generation of Personalized Recommendations in eCoaching

    Get PDF
    This thesis concerns eCoaching for personalized, real-time lifestyle support using information and communication technology. The challenge is to design, develop and technically evaluate a prototype of an intelligent eCoach that automatically generates personalized, evidence-based recommendations for a healthier lifestyle. The developed solution focuses on improving physical activity. The prototype uses wearable medical activity sensors. The collected data are represented semantically, and artificial-intelligence algorithms automatically generate meaningful, personalized and context-based recommendations for reducing sedentary time. The thesis uses the well-established design science research methodology to develop theoretical foundations and practical implementations. Overall, this research focuses on technological verification rather than clinical evaluation.

    (b2023 to 2014) The UNBELIEVABLE similarities between the ideas of some people (2006-2016) and my ideas (2002-2008) in physics (quantum mechanics, cosmology), cognitive neuroscience, philosophy of mind, and philosophy (this manuscript would require a REVOLUTION in international academy environment!)

    Get PDF
    (b2023 to 2014) The UNBELIEVABLE similarities between the ideas of some people (2006-2016) and my ideas (2002-2008) in physics (quantum mechanics, cosmology), cognitive neuroscience, philosophy of mind, and philosophy (this manuscript would require a REVOLUTION in international academy environment!)

    Topology, Correlation, and Information in Designer Quantum Systems

    Get PDF
    This thesis discusses the use of subgap and boundary modes for quantum engineering of novel phases, devices and response characteristics. It comprises four separate topics: quantum magnetism in Yu-Shiba-Rusinov chains, single-atom Josephson diodes, readout of Majorana qubits, and surface photogalvanic response in Weyl semimetals.
    Chains of magnetic adatoms on superconductors have been discussed as promising systems for realizing Majorana end states. Here, we show that dilute Yu-Shiba-Rusinov (YSR) chains are also a versatile platform for quantum magnetism and correlated electron dynamics, with widely adjustable spin values and couplings. Focusing on subgap excitations, we derive an extended t-J model for dilute quantum YSR chains and use it to study the phase diagram as well as tunneling spectra. We explore the implications of quantum magnetism for the formation of a topological superconducting phase, contrasting it with existing models that assume classical spin textures.
    Current-biased Josephson junctions exhibit hysteretic transitions between dissipative and superconducting states, as characterized by switching and retrapping currents. Here, we develop a theory for diode-like effects in the switching and retrapping currents of weakly damped Josephson junctions. We find that while the diode-like behavior of switching currents is rooted in asymmetric current-phase relations, nonreciprocal retrapping currents originate in asymmetric quasiparticle currents. These different origins also imply distinctly different symmetry requirements. We illustrate our results with a microscopic model for junctions involving YSR subgap states. Our theory provides significant guidance in identifying the microscopic origin of nonreciprocities in Josephson junctions.
    Schemes for topological quantum computation with Majorana bound states rely heavily on the ability to measure products of Majorana operators projectively. Here, we employ Markovian quantum measurement theory, including the readout device, to analyze such measurements. Specifically, we focus on the readout of Majorana qubits via continuous charge sensing of a tunnel-coupled quantum dot by a quantum point contact. We show that projective measurements of Majorana products can be implemented by continuous charge sensing under quite general circumstances. Essential requirements are that a combined local parity π̂, involving the quantum dot charge along with the Majorana product of interest, be conserved, and that the two eigenspaces of the combined parity π̂ generate distinguishable measurement signals.
    The photogalvanic effect requires the intrinsic symmetry of the medium to be sufficiently low, which strongly limits candidate materials for this effect. We explore how, in Weyl semimetals, the photogalvanic effect can be enabled and controlled by design of Fermi-arc states at the material surface. Specifically, we provide a theory of the ballistic photogalvanic current in a Weyl semimetal slab. We show that the confinement-induced response is tightly linked to the configuration of Fermi-arc surface states, thus inheriting the same directionality and sensitivity to boundary conditions. In principle, this enables control of the photogalvanic response through manipulation at the surface only.
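    For orientation, the generic t-J Hamiltonian underlying the "extended t-J model" mentioned above has the standard form below. This is textbook background added for illustration; the thesis' extended version for dilute YSR chains contains additional terms not reproduced here.

        % Standard t-J Hamiltonian (illustrative background, not the extended model of the thesis):
        H_{tJ} = -t \sum_{\langle ij \rangle, \sigma} \mathcal{P}\left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right)\mathcal{P}
                 + J \sum_{\langle ij \rangle} \left( \mathbf{S}_i \cdot \mathbf{S}_j - \tfrac{1}{4} n_i n_j \right)
        % \mathcal{P} projects out doubly occupied sites, c^{\dagger}_{i\sigma} creates an electron
        % with spin \sigma on site i, \mathbf{S}_i is the spin operator, n_i the number operator,
        % t the hopping amplitude and J the exchange coupling.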

    Water and Brain Function: Effects of Hydration Status on Neurostimulation and Neurorecording

    Get PDF
    Introduction: TMS and EEG are used to study normal neurophysiology and to diagnose and treat clinical neuropsychiatric conditions, but can produce variable results or fail. Both techniques depend on electrical volume conduction, and thus on brain volumes. Hydration status can affect brain volumes and functions (including cognition), but its effects on these techniques are unknown. We aimed to characterize the effects of hydration on TMS, EEG, and cognitive tasks.
    Methods: EEG and EMG were recorded during single-pulse TMS, paired-pulse TMS, and cognitive tasks from 32 human participants on dehydrated (12-hour fast/thirst) and rehydrated (1 liter of water ingested orally over 1 hour) testing days. Hydration status was confirmed with urinalysis. MEP, ERP, and network analyses were performed to examine responses at the levels of the muscle, the brain, and higher-order functioning.
    Results: Rehydration decreased motor threshold (increased excitability) and shifted the motor hotspot. Significant effects on TMS measures occurred despite re-localizing and re-dosing to these new parameters. Rehydration increased SICF of the MEP, the magnitudes of specific TEP peaks in inhibitory protocols, and specific ERP peak magnitudes and reaction time during the cognitive task. Rehydration amplified nodal inhibition around the stimulation site in inhibitory paired-pulse networks and strengthened nodes outside the stimulation site in excitatory and CSP networks. Cognitive performance was not improved by rehydration, although similar performance was achieved with generally weaker network activity.
    Discussion: Results highlight differences between mild dehydration and rehydration. The rehydrated brain was easier to stimulate with TMS and produced larger responses to external and internal stimuli. This is explainable by the known physiology of body water dynamics, which encompass macroscopic and microscopic volume changes. Rehydration can shift 3D cortical positioning, decrease scalp-cortex distance (bringing the cortex closer to the stimulator/recording electrodes), and cause astrocyte swelling-induced glutamate release.
    Conclusions: Previously unaccounted-for variables such as osmolarity, astrocyte volume and brain volume likely affect neurostimulation/neurorecording. Controlling for and carefully manipulating hydration may reduce variability and improve therapeutic outcomes of neurostimulation. Dehydration is common and produces less excitable circuits. Rehydration should offer a mechanism to macroscopically bring target cortical areas closer to an externally applied neurostimulation device, recruiting greater volumes of tissue, and to microscopically favor excitability in the stimulated circuits.

    Timely Classification of Encrypted or Protocol-Obfuscated Internet Traffic Using Statistical Methods

    Get PDF
    Internet traffic classification aims to identify the type of application or protocol that generated a particular packet or stream of packets on the network. Through traffic classification, Internet Service Providers (ISPs), governments, and network administrators can access basic functions and several solutions, including network management, advanced network monitoring, network auditing, and anomaly detection. Traffic classification is essential as it ensures the Quality of Service (QoS) of the network and allows efficient resource planning. With the increase of encrypted or protocol-obfuscated traffic on the Internet and multilayer data encapsulation, some classical classification methods have lost the interest of the scientific community. The limitations of traditional classification methods based on port numbers and payload inspection for classifying encrypted or obfuscated Internet traffic have led to significant research efforts focused on Machine Learning (ML) based classification approaches using statistical features from the transport layer. In an attempt to increase classification performance, Machine Learning strategies have gained interest from the scientific community and have shown promise for the future of traffic classification, especially for recognizing encrypted traffic. However, the ML approach also has its own limitations, as some of these methods have high computational resource consumption, which limits their application when classifying large traffic volumes or real-time flows. The limitations of ML applications have led to the investigation of alternative approaches, including feature-based procedures and statistical methods. In this sense, statistical analysis methods, such as distances and divergences, have been used to classify traffic in large flows and in real time. The main objective of a statistical distance is to differentiate flows and find patterns in traffic characteristics through statistical properties, which enable classification. Divergences are functional expressions, often related to information theory, which measure the degree of discrepancy between any two distributions. This thesis focuses on proposing a new methodological approach to classify encrypted or obfuscated Internet traffic based on statistical methods, enabling the evaluation of network traffic classification performance, including the use of computational resources in terms of CPU and memory. A set of traffic classifiers based on Kullback-Leibler and Jensen-Shannon divergences and on Euclidean, Hellinger, Bhattacharyya, and Wootters distances was proposed. The following are the four main contributions to the advancement of scientific knowledge reported in this thesis. First, an extensive literature review on the classification of encrypted and obfuscated Internet traffic was conducted. The results suggest that port-based and payload-based methods are becoming obsolete due to the increasing use of traffic encryption and multilayer data encapsulation. ML-based methods are also becoming limited due to their computational complexity. As an alternative, Support Vector Machine (SVM), which is also an ML method, and the Kolmogorov-Smirnov and Chi-squared tests can be used as references for statistical classification. In parallel, the possibility of using statistical methods for Internet traffic classification has emerged in the literature, with the potential of good classification results without the need for large computational resources.
The potential statistical methods are the Euclidean, Hellinger, Bhattacharyya, and Wootters distances, as well as the Kullback-Leibler (KL) and Jensen-Shannon divergences. Second, we present a proposal and implementation of an SVM-based classifier for P2P multimedia traffic, comparing the results with the Kolmogorov-Smirnov (KS) and Chi-squared tests. The results suggest that SVM classification with a Linear kernel leads to better classification performance than the KS and Chi-squared tests, depending on the value assigned to the Self C parameter. The SVM method with a Linear kernel and suitable values for the Self C parameter may be a good choice for identifying encrypted P2P multimedia traffic on the Internet. Third, we present a proposal and implementation of two classifiers based on the KL divergence and the Euclidean distance, compared against SVM with a Linear kernel configured with the standard Self C parameter; the SVM showed a reduced ability to classify flows based solely on packet sizes compared to the KL and Euclidean distance methods. The KL and Euclidean methods were able to classify all tested applications, particularly streaming and P2P, which in almost all cases they identified efficiently and with high accuracy, with reduced consumption of computational resources. Based on the obtained results, it can be concluded that the KL and Euclidean distance methods are an alternative to SVM, as these statistical approaches can operate in real time and do not require retraining every time a new type of traffic emerges. Fourth, we present a proposal and implementation of a set of classifiers for encrypted Internet traffic based on the Jensen-Shannon divergence and the Hellinger, Bhattacharyya, and Wootters distances, with their respective results compared to those obtained with methods based on the Euclidean distance, KL, KS, and Chi-squared. Additionally, we present a comparative qualitative analysis of the tested methods based on Kappa values and Receiver Operating Characteristic (ROC) curves. The results suggest average accuracy values above 90% for all statistical methods, classified as "almost perfect reliability" in terms of Kappa values, with the exception of KS. This result indicates that these methods are viable options for classifying encrypted Internet traffic, especially the Hellinger distance, which showed the best Kappa values compared to the other classifiers. We conclude that the considered statistical methods can be accurate and cost-effective in terms of computational resource consumption for classifying network traffic. Our approach was based on the classification of Internet network traffic, focusing on statistical distances and divergences. We have shown that it is possible to classify traffic and obtain good results with statistical methods, balancing classification performance and the use of computational resources in terms of CPU and memory. The validation of the proposal supports the argument of this thesis, which proposes the implementation of statistical methods as a viable alternative for Internet traffic classification compared to methods based on port numbers, payload inspection, and ML.
Thesis prepared at Instituto de Telecomunicações, Delegação da Covilhã, and at the Department of Computer Science of the University of Beira Interior, and submitted to the University of Beira Interior for discussion in public session to obtain the Ph.D. degree in Computer Science and Engineering.
This work has been funded by the Portuguese FCT/MCTES through national funds and, when applicable, co-funded by EU funds under project UIDB/50008/2020, and by operation Centro-01-0145-FEDER-000019 - C4 - Centro de Competências em Cloud Computing, co-funded by the European Regional Development Fund (ERDF/FEDER) through the Programa Operacional Regional do Centro (Centro 2020). This work has also been funded by CAPES (Brazilian Federal Agency for Support and Evaluation of Graduate Education) within the Ministry of Education of Brazil, under a scholarship supported by the International Cooperation Program CAPES/COFECUB, Project 9090134/2013, at the University of Beira Interior.
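    To make the distance- and divergence-based classification described above concrete, the following is a minimal Python sketch of the general idea, not the thesis implementation. The bin edges, the smoothing constant, the reference profiles, and the use of packet sizes as the only flow feature are assumptions made for this illustration.

        # Illustrative sketch: classify a flow by comparing its packet-size histogram to
        # reference histograms using statistical distances and divergences.
        import math

        BINS = [0, 64, 128, 256, 512, 1024, 65536]   # packet-size bin edges in bytes (assumed)
        EPS = 1e-9                                   # smoothing to avoid log(0) and division by zero

        def histogram(packet_sizes):
            """Return a smoothed, normalized packet-size histogram (a discrete distribution)."""
            counts = [0] * (len(BINS) - 1)
            for s in packet_sizes:
                for i in range(len(BINS) - 1):
                    if BINS[i] <= s < BINS[i + 1]:
                        counts[i] += 1
                        break
            total = sum(counts) or 1
            return [(c + EPS) / (total + EPS * len(counts)) for c in counts]

        def kl(p, q):        # Kullback-Leibler divergence D(P || Q)
            return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

        def js(p, q):        # Jensen-Shannon divergence (symmetrized and bounded)
            m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
            return 0.5 * kl(p, m) + 0.5 * kl(q, m)

        def hellinger(p, q): # Hellinger distance
            return math.sqrt(0.5 * sum((math.sqrt(pi) - math.sqrt(qi)) ** 2 for pi, qi in zip(p, q)))

        def euclidean(p, q): # Euclidean distance between the two histograms
            return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

        def classify(flow_sizes, references, measure=hellinger):
            """Assign the flow to the reference class with the smallest distance/divergence."""
            p = histogram(flow_sizes)
            return min(references, key=lambda label: measure(p, references[label]))

        # Hypothetical reference profiles built offline from labelled traffic (assumed values).
        references = {
            "streaming": histogram([1400] * 90 + [120] * 10),
            "p2p":       histogram([600] * 40 + [1400] * 30 + [80] * 30),
        }
        print(classify([1380, 1400, 1450, 100, 1400], references))  # -> "streaming"

    In this sketch a flow is assigned to whichever reference class minimizes the chosen distance or divergence; a real classifier would build the reference histograms from labelled traffic and tune the binning and decision thresholds.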

    MPC With Delayed Parties Over Star-Like Networks

    Get PDF
    While the efficiency of secure multi-party computation protocols has greatly increased in the last few years, these improvements and protocols are often based on rather unrealistic, idealised assumptions about how technology is deployed in the real world. In this work we examine multi-party computation protocols under two major constraints present in deployed systems. Firstly, we consider the situation where the parties are connected not by direct point-to-point connections, but by a star-like topology with a few central post-office-style relays. Secondly, we consider MPC protocols with a strong honest majority (n ≫ t/2) in which we have stragglers (some parties progress more slowly than others). We model stragglers by allowing the adversary to delay messages to and from some parties for a given length of time. We first show that having only a single honest relay is enough to ensure consensus of the messages sent within a protocol; secondly, we show that special care must be taken to describe multiplication protocols in the case of relays and stragglers, and that some well-known protocols do not guarantee privacy and correctness in this setting; thirdly, we present an efficient honest-majority MPC protocol which can be run on top of the relays and which provides active security with abort in the case of a strong honest majority, even when run with stragglers. We back up our protocol presentation with both experimental evaluations and simulations of the effect of the relays and delays on our protocol.
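    As background on the honest-majority setting referred to above, the sketch below shows Shamir secret sharing, the threshold primitive that honest-majority MPC builds on; it illustrates why reconstruction can proceed as soon as any t+1 shares arrive, so delayed parties need not hold up the computation. This is illustrative background only, not the paper's protocol, and the field modulus and the values of n and t are assumptions of the sketch.

        # Shamir secret sharing over a prime field (illustrative background, not the paper's protocol).
        import random

        P = 2**61 - 1  # prime field modulus (assumed for this sketch)

        def share(secret, n, t):
            """Split `secret` into n shares; any t+1 shares reconstruct it."""
            coeffs = [secret % P] + [random.randrange(P) for _ in range(t)]
            def f(x):
                acc = 0
                for c in reversed(coeffs):      # Horner evaluation of the degree-t polynomial
                    acc = (acc * x + c) % P
                return acc
            return [(i, f(i)) for i in range(1, n + 1)]

        def reconstruct(shares):
            """Lagrange interpolation at x = 0 from any t+1 (or more) distinct shares."""
            secret = 0
            for i, (xi, yi) in enumerate(shares):
                num, den = 1, 1
                for j, (xj, _) in enumerate(shares):
                    if j != i:
                        num = (num * (-xj)) % P
                        den = (den * (xi - xj)) % P
                secret = (secret + yi * num * pow(den, -1, P)) % P
            return secret

        # Example: n = 7 parties, threshold t = 2. The three fastest shares suffice, so the
        # shares of the four delayed ("straggler") parties are never awaited.
        shares = share(secret=42, n=7, t=2)
        print(reconstruct(shares[:3]))  # -> 42

    The relay and straggler handling in the paper concerns how such shared values are communicated and multiplied over the star topology, which this sketch does not attempt to model.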

    LIPIcs, Volume 261, ICALP 2023, Complete Volume

    Get PDF
    LIPIcs, Volume 261, ICALP 2023, Complete Volume