5,221 research outputs found

    Undergraduate Catalog of Studies, 2023-2024

    Computational techniques to interpret the neural code underlying complex cognitive processes

    Advances in large-scale neural recording technology have significantly improved our capacity to elucidate the neural code underlying complex cognitive processes. This thesis investigated two research questions in rodent models. First, what is the role of the hippocampus in memory, and specifically what is the underlying neural code that contributes to spatial memory and navigational decision-making? Second, how is social cognition represented in the medial prefrontal cortex at the level of individual neurons? The thesis begins by investigating memory and social cognition in healthy and diseased states using non-invasive methods (i.e. fMRI and animal behavioural studies). The main body of the thesis then shifts to developing our fundamental understanding of the neural mechanisms underpinning these cognitive processes by applying computational techniques to analyse stable, large-scale neural recordings. To achieve this, tailored computational pipelines for calcium imaging and behaviour preprocessing were developed and optimised for the analysis of social interaction and spatial navigation experiments. In parallel, a review was conducted on methods for multivariate/neural population analysis. A comparison of multiple neural manifold learning (NML) algorithms identified that non-linear algorithms such as UMAP are more adaptable across datasets of varying noise and behavioural complexity. Furthermore, the review illustrates how NML can be applied to disease states in the brain and introduces the secondary analyses that can be used to enhance or characterise a neural manifold. Lastly, the preprocessing and analytical pipelines were combined to investigate the neural mechanisms involved in social cognition and spatial memory. The social cognition study explored how neural firing in the medial prefrontal cortex changed as a function of the social dominance paradigm, the "Tube Test". The univariate analysis identified an ensemble of behaviourally tuned neurons that fire preferentially during specific behaviours, such as "pushing" or "retreating", in response to the animal's own behaviour and/or the competitor's behaviour. Furthermore, the neural population in dominant animals exhibited greater average firing than that in subordinate animals. Next, to investigate spatial memory, a spatial recency task was used in which rats learnt to navigate towards one of three reward locations and then recall the rewarded location of the session. During the task, over 1000 neurons were recorded from the hippocampal CA1 region of five rats across multiple sessions. Multivariate analysis revealed that the sequence of neurons encoding an animal's spatial position leading up to a rewarded location was also active in the decision period before the animal navigated to the rewarded location. This result suggests that prospective replay of neural sequences in the hippocampal CA1 region could provide a mechanism by which decision-making is supported.
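
    As a rough illustration of the neural manifold learning comparison described above, the following sketch applies UMAP to a firing-rate matrix (time bins by neurons) to recover a low-dimensional embedding. The array shapes, parameter values and the use of the umap-learn package are illustrative assumptions, not the thesis's actual pipeline.

        import numpy as np
        import umap  # pip install umap-learn

        # Placeholder population activity: time bins x neurons (assumed shape).
        rng = np.random.default_rng(0)
        firing_rates = rng.poisson(lam=2.0, size=(5000, 1000)).astype(float)

        # Non-linear manifold learning: embed the population activity into a
        # low-dimensional space that can be related to behaviour (e.g. position).
        reducer = umap.UMAP(n_components=3, n_neighbors=30, min_dist=0.1)
        embedding = reducer.fit_transform(firing_rates)
        print(embedding.shape)  # (5000, 3)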

    Responding to Reading Difficulties: An Exploration from Different Professional Perspectives

    The study was designed to explore educators' perspectives on reading difficulties and their choice of teaching strategies for students with reading difficulties. The study aimed to understand how educators form their professional perspectives on reading difficulties, how this relates to their understanding of the concept of ‘dyslexia’ and how this informs their teaching methods. Furthermore, the study explored the extent to which these chosen teaching strategies are inclusive and meet the needs of all students. A qualitative case study was used to generate data to address the research questions and achieve the aims of this study. Data were generated from semi-structured interviews with thirteen educators from different contexts and career stages, classroom observations in two primary schools in England, and an online dyslexia training session. Thematic analysis (Braun and Clarke, 2006) was used to interpret the data and identify themes related to the educators' understanding of reading difficulties and pedagogy for students with reading difficulties; Braun and Clarke's six steps were followed in analysing the data. Furthermore, multi-layer analysis (Robbins, 2007) was used to incorporate findings from three aspects of my theoretical framework: Rogoff's (1995) three planes of analysis, Tobin's (1999) comparative classroom ethnography, and models of disability. My study suggests that teachers' understanding of reading difficulties is influenced by different models of disability at different levels of their thinking, which in turn influences their choice of teaching strategies for responding to reading difficulties. My findings also suggest that students with reading difficulties are not given enough opportunities to voice their needs and feelings, and it is recommended that spaces be provided for individuals to reflect and for all stakeholders to talk and share their reflections. In addition, my study recommends that student teachers be prepared for working with students who have reading difficulties in their future classrooms by developing an understanding and knowledge of inclusive pedagogy and how it relates to teaching children how to read. This can also be extended to teachers currently working in schools, to develop a better understanding of how to support all children in learning to read.

    Impact of Imaging and Distance Perception in VR Immersive Visual Experience

    Virtual reality (VR) headsets have evolved to offer unprecedented viewing quality. Meanwhile, they have become lightweight, wireless, and low-cost, which has opened the door to new applications and a much wider audience. VR headsets can now provide users with a greater understanding of events and accuracy of observation, making decision-making faster and more effective. However, immersive technologies have seen slow take-up, with the adoption of virtual reality limited to a few applications, typically related to entertainment. This reluctance appears to be due to the often-necessary change of operating paradigm and some scepticism towards the "VR advantage". The need therefore arises to evaluate the contribution that a VR system can make to user performance, for example in monitoring and decision-making. This will help system designers understand when immersive technologies can be proposed to replace or complement standard display systems such as a desktop monitor. In parallel with the evolution of VR headsets, 360° cameras have also evolved and are now capable of instantly acquiring photographs and videos in stereoscopic 3D (S3D) at very high resolutions. 360° images are innately suited to VR headsets, where the captured view can be observed and explored through natural rotation of the head. Acquired views can even be experienced and navigated from the inside as they are captured. The combination of omnidirectional images and VR headsets has opened up a new way of creating immersive visual representations, which we call photo-based VR. This new methodology combines traditional model-based rendering with high-quality omnidirectional texture-mapping. Photo-based VR is particularly suitable for applications related to remote visits and realistic scene reconstruction, useful for monitoring and surveillance systems, control panels, and operator training. This PhD study investigates the potential of photo-based VR representations. It starts by evaluating the role of immersion and user performance in today's graphical visual experience, and then uses this as a reference to develop and evaluate new photo-based VR solutions. With the current literature on the photo-based VR experience and associated user performance being very limited, this study builds new knowledge from the proposed assessments. We conduct five user studies on a few representative applications, examining how visual representations can be affected by system factors (camera- and display-related) and how they can influence human factors (such as realism, presence, and emotions). Particular attention is paid to realistic depth perception, to support which we develop targeted solutions for photo-based VR. They are intended to provide users with a correct perception of space dimensions and object size, which we call true-dimensional visualization. The presented work contributes to largely unexplored fields, including photo-based VR and true-dimensional visualization, offering immersive system designers a thorough understanding of the benefits, potential, and types of applications in which these new methods can make a difference. This thesis manuscript and its findings have been partly presented in scientific publications: in particular, five conference papers in Springer and IEEE symposia proceedings [1], [2], [3], [4], [5], and one journal article in an IEEE periodical [6].
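
    To make the idea of omnidirectional texture-mapping behind photo-based VR concrete, the sketch below maps a 3D viewing direction to (u, v) coordinates in an equirectangular 360° image. The axis convention and function name are illustrative assumptions and do not reproduce the rendering pipeline developed in the thesis.

        import numpy as np

        def direction_to_equirect_uv(direction):
            """Map a view direction to (u, v) in [0, 1] within an equirectangular
            360-degree image (assumed y-up, -z-forward axis convention)."""
            x, y, z = direction / np.linalg.norm(direction)
            lon = np.arctan2(x, -z)                 # azimuth, [-pi, pi]
            lat = np.arcsin(np.clip(y, -1.0, 1.0))  # elevation, [-pi/2, pi/2]
            u = lon / (2.0 * np.pi) + 0.5
            v = 0.5 - lat / np.pi
            return u, v

        # Looking straight ahead samples the image centre.
        print(direction_to_equirect_uv(np.array([0.0, 0.0, -1.0])))  # ~(0.5, 0.5)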

    Deep Learning Techniques for Electroencephalography Analysis

    In this thesis we design deep learning techniques for training deep neural networks on electroencephalography (EEG) data, addressing two problems in particular, namely EEG-based motor imagery decoding and EEG-based affect recognition, and the challenges associated with them. Regarding the problem of motor imagery (MI) decoding, we first consider the various kinds of domain shift in EEG signals caused by inter-individual differences (e.g. brain anatomy, personality and cognitive profile). These domain shifts render multi-subject training a challenging task and impede robust cross-subject generalization. We build a two-stage model ensemble architecture and propose two objectives to train it, combining the strengths of curriculum learning and collaborative training. Our subject-independent experiments on the large Physionet and OpenBMI datasets verify the effectiveness of our approach. Next, we explore the utilization of the spatial covariance of EEG signals through alignment techniques, with the goal of learning domain-invariant representations. We introduce a Riemannian framework that concurrently performs covariance-based signal alignment and data augmentation while training a convolutional neural network (CNN) on EEG time series. Experiments on the BCI IV-2a dataset show that our method outperforms traditional alignment by inducing regularization on the weights of the CNN. We also study the problem of EEG-based affect recognition, inspired by works suggesting that emotions can be expressed in relative terms, i.e. through ordinal comparisons between different affective state levels. We propose treating data samples in a pairwise manner to infer the ordinal relation between their corresponding affective state labels, as an auxiliary training objective. We incorporate this objective in a deep network architecture which we jointly train on the tasks of sample-wise classification and pairwise ordinal ranking. We evaluate our method on the affective datasets DEAP and SEED and obtain performance improvements over deep networks trained without the additional ranking objective.
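
    As a simplified illustration of covariance-based signal alignment for cross-subject EEG, the sketch below whitens each subject's trials with the inverse square root of their mean spatial covariance. This is the common Euclidean-alignment baseline, not the Riemannian framework developed in the thesis, and the array shapes and names are illustrative assumptions.

        import numpy as np
        from scipy.linalg import fractional_matrix_power

        def align_subject_trials(trials):
            """Whiten EEG trials (n_trials, n_channels, n_samples) with the inverse
            square root of the subject's mean spatial covariance, so that aligned
            trials share an identity reference covariance."""
            covs = np.stack([x @ x.T / x.shape[1] for x in trials])
            whitener = fractional_matrix_power(covs.mean(axis=0), -0.5).real
            return np.einsum("cd,ndt->nct", whitener, trials)

        # Random data standing in for one subject's recordings (BCI IV-2a-like dims).
        rng = np.random.default_rng(1)
        trials = rng.standard_normal((40, 22, 250))
        print(align_subject_trials(trials).shape)  # (40, 22, 250)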

    Proceedings of the 10th International congress on architectural technology (ICAT 2024): architectural technology transformation.

    The profession of architectural technology is influential in the transformation of the built environment regionally, nationally, and internationally. The congress provides a platform for industry, educators, researchers, and the next generation of built environment students and professionals to showcase where their influence is transforming the built environment through novel ideas, businesses, leadership, innovation, digital transformation, research and development, and sustainable, forward-thinking technological and construction assembly design.

    Characterisation of Blast Loading from Ideal and Non-Ideal Explosives

    Explosive detonation in its simplest form can be characterised as an instantaneous release of energy at an infinitely small point in space occupied by a solid explosive material. It results from the chemical decomposition of the explosive, which reforms as high-pressure, high-temperature gases that expand radially. This supersonic expansion of detonation products compresses the surrounding medium, resulting in a shock wave discontinuity which propagates away from the explosive epicentre at high speed and has the potential to cause significant damage to anything it interacts with. Shock wave quantification work conducted from the 1940s through to the 1980s sought to understand the effects of large-scale explosive detonation, an immediate threat following the advent of the nuclear bomb. Highly skilled experimental and theoretical scientists were assigned the task of capturing the effects of large-scale detonations through innovative solutions and the development of pressure gauges. The researchers' in-depth fundamental understanding of physics, combustion and fluid dynamics resulted in the well-favoured semi-empirical blast predictions for simple free-field spherical/hemispherical blasts.
    A broad body of literature has been published on the free-air characterisation of spherical/hemispherical explosives, with the detonation process and subsequent shock wave formation mechanics being well understood. However, there is yet to be a definitive and robust understanding of how deterministic a shock wave's spatial and temporal parameters are, even for simple scenarios; some studies go as far as suggesting that semi-empirical tools are not as effective as previously assumed. Numerical simulations often provide reasonable insights into the blast loading conditions imparted on structures and scenarios of higher complexity; however, when the validation data used are themselves subject to error, the schemes can no longer be considered high-fidelity. The lack of quantified variability and confidence in the published data is a significant issue for engineers designing infrastructure that is robust enough to withstand extreme loading yet not so conservative as to incur cost and material-waste penalties. This issue is investigated thoroughly within this thesis, highlighting the sensitivity of blast parameters across the scaled distance ranges and determining their predictability with both numerical simulation and semi-empirical tools. The vast majority of free-field characterisation has been conducted using military-grade explosives which exhibit ideal detonation behaviour, meaning the detonation reaction is effectively instantaneous. Ideal explosives, by the theoretical definition, can be characterised by a simple instantaneous energy release, and in far-field regimes any explosive with ideal-like composition and behaviour should be scalable with mass. This assumption is not valid for homemade explosives (HMEs), such as ANFO (ammonium nitrate + fuel oil), whose compositions are usually homogeneous, resulting in a finite reaction zone length. These reaction zones can be long enough to cause detonation failures and to produce a variety of energy releases depending on the mass of the charge, resulting in HMEs having different TNT equivalence values depending on their scale. Early ANFO characterisation work was driven by the desire to replace TNT, assessing its capability of producing similar yields for a fraction of the manufacturing cost. This meant that the hemispherical ANFO detonations which led to its overall classification were conducted using charges of over 100 kg, for which non-ideal reaction zone effects become negligible in comparison to the overall charge size. Yields measured in this regime were consistently around 80% of an equivalent TNT detonation, and this has been incorrectly assumed in the published literature to hold for ANFO across all mass ranges.
    There is a distinct lack of characterisation of non-ideal explosives across the mass scales, which has significant implications for designing structures to withstand the threat of HMEs. Given that energy is released at a much slower rate when such compositions detonate, the assumption that large-scale trials, when scaled down, accurately capture the behaviour of small charge masses is not verified. Most HMEs will be hand-held or backpack-sized devices, meaning the threat cannot currently be predicted with confidence from validated data gathered under well-controlled conditions. Small-scale ANFO trials reported in this thesis demonstrate this to be the case, and theoretical mechanisms are proposed which offer a method for predicting the behaviour of non-ideal detonation across all mass scales. The findings of this PhD thesis offer a conclusion on whether shock waves in free-field scenarios are deterministic for both ideal and non-ideal explosives, with a particular emphasis on the far-field range. The results presented constitute developments in the accurate quantification of the shock wave loading conditions a structure is subjected to through explosive detonation, and should be used by engineers to establish robust, probabilistic, yet accurate designs.
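
    To make the scaling arguments above concrete, the following sketch computes the Hopkinson-Cranz scaled distance Z = R / W^(1/3) together with a simple mass-based TNT equivalence conversion. The charge mass, standoff and 0.8 equivalence factor are illustrative assumptions, not values measured in the trials reported in the thesis.

        def scaled_distance(standoff_m, tnt_equivalent_kg):
            """Hopkinson-Cranz scaled distance Z in m/kg^(1/3)."""
            return standoff_m / tnt_equivalent_kg ** (1.0 / 3.0)

        def tnt_equivalent_mass(charge_mass_kg, equivalence_factor):
            """Convert a charge mass to an equivalent TNT mass using a mass-based
            equivalence factor (e.g. the ~0.8 often quoted for large ANFO charges)."""
            return charge_mass_kg * equivalence_factor

        # Example: 5 kg of ANFO at a 10 m standoff, assuming 0.8 TNT equivalence.
        w_tnt = tnt_equivalent_mass(5.0, 0.8)
        print(f"Z = {scaled_distance(10.0, w_tnt):.2f} m/kg^(1/3)")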

    Multidisciplinary perspectives on Artificial Intelligence and the law

    This open access book presents an interdisciplinary, multi-authored, edited collection of chapters on Artificial Intelligence (‘AI’) and the Law. AI technology has come to play a central role in the modern data economy. Through a combination of increased computing power, the growing availability of data and the advancement of algorithms, AI has now become an umbrella term for some of the most transformational technological breakthroughs of this age. The importance of AI stems from both the opportunities that it offers and the challenges that it entails. While AI applications hold the promise of economic growth and efficiency gains, they also create significant risks and uncertainty. The potential and perils of AI have thus come to dominate modern discussions of technology and ethics – and although AI was initially allowed to largely develop without guidelines or rules, few would deny that the law is set to play a fundamental role in shaping the future of AI. As the debate over AI is far from over, the need for rigorous analysis has never been greater. This book thus brings together contributors from different fields and backgrounds to explore how the law might provide answers to some of the most pressing questions raised by AI. An outcome of the Católica Research Centre for the Future of Law and its interdisciplinary working group on Law and Artificial Intelligence, it includes contributions by leading scholars in the fields of technology, ethics and the law.

    Climate Change and Critical Agrarian Studies

    Climate change is perhaps the greatest threat to humanity today and plays out as a cruel engine of myriad forms of injustice, violence and destruction. The effects of climate change from human-made emissions of greenhouse gases are devastating and accelerating, yet they are uncertain and uneven in terms of both geography and socio-economic impacts. Emerging from the dynamics of capitalism since the industrial revolution — as well as industrialisation under state-led socialism — the consequences of climate change are especially profound for the countryside and its inhabitants. The book interrogates the narratives and strategies that frame climate change and examines the institutionalised responses in agrarian settings, highlighting what exclusions and inclusions result. It explores how different people — in relation to class and other co-constituted axes of social difference such as gender, race, ethnicity, age and occupation — are affected by climate change, as well as the climate adaptation and mitigation responses being implemented in rural areas. The book then explores how climate change, and the responses to it, affect processes of social differentiation, trajectories of accumulation and, in turn, agrarian politics. Finally, the book examines what strategies are required to confront climate change, and the underlying political-economic dynamics that cause it, reflecting on what this means for agrarian struggles across the world. The 26 chapters in this volume explore how the relationship between capitalism and climate change plays out in the rural world and, in particular, the way agrarian struggles connect with the huge challenge of climate change. Through a wide variety of case studies alongside more conceptual chapters, the book makes the often-missing connection between climate change and critical agrarian studies. The book argues that making the connection between climate and agrarian justice is crucial.