2,787 research outputs found
Analysis and monitoring of single HaCaT cells using volumetric Raman mapping and machine learning
No explorer reached a pole without a map, no chef served a meal without tasting, and no surgeon implants untested devices. Higher-accuracy maps, more sensitive taste buds, and more rigorous tests increase confidence in positive outcomes. Biomedical manufacturing necessitates rigour, whether developing drugs or creating bioengineered tissues [1]–[4]. By designing a dynamic environment that supports mammalian cells during experiments within a Raman spectroscope, this project provides a platform that more closely replicates in vivo conditions. The platform also adds the opportunity to automate adaptation of the cell culture environment, alongside spectral monitoring of cells with machine learning and three-dimensional Raman mapping, called volumetric Raman mapping (VRM). Previous research highlighted key areas for refinement, such as a structured approach to shading Raman maps [5], [6] and the collection of VRM [7]. Refining VRM shading and collection was therefore the initial focus: k-means-directed shading for vibrational spectroscopy maps was developed in Chapter 3, and depth distortion and VRM calibration were explored in Chapter 4. “Cage” scaffolds, designed using the findings from Chapter 4, were then utilised to influence cell behaviour by varying the number of cage beams to change the scaffold porosity. Altering the porosity facilitated spectroscopic investigation into previously observed changes in cell biology in response to porous scaffolds [8]. VRM visualised the altered morphology of single human keratinocyte (HaCaT) cells, providing a technique complementary to machine learning classification. This increased technical rigour justified progression, in Chapter 6, to the development of an in-situ flow chamber for Raman spectroscopy, using a psoriasis (dithranol-HaCaT) model on unfixed cells. K-means-directed shading and principal component analysis (PCA) revealed HaCaT cell adaptations aligning with previous publications [5] and earlier thesis sections.
The k-means-directed Raman maps and PCA score plots verified the drug-supplying capacity of the flow chamber, justifying future investigation into VRM and machine learning for monitoring single cells within the flow chamber
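The idea behind k-means-directed map shading can be illustrated in miniature: each pixel of a Raman map holds a spectrum, pixels are clustered by spectral similarity, and each pixel is shaded by its cluster label. The following sketch uses invented toy "spectra" and a minimal hand-rolled k-means; it is an illustration of the general technique, not the thesis's implementation.

```python
# Minimal k-means on equal-length spectra (toy data, not real Raman spectra).
def kmeans(spectra, k, iters=50):
    centroids = [list(s) for s in spectra[:k]]  # seed with first k spectra
    labels = [0] * len(spectra)
    for _ in range(iters):
        # Assignment step: nearest centroid by squared Euclidean distance.
        for i, s in enumerate(spectra):
            labels[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(s, centroids[c])),
            )
        # Update step: each centroid becomes the mean of its assigned spectra.
        for c in range(k):
            members = [s for s, lab in zip(spectra, labels) if lab == c]
            if members:
                centroids[c] = [sum(v) / len(members) for v in zip(*members)]
    return labels

# Toy 2x2 map: two "cell" pixels with a strong band, two "background" pixels.
pixels = [[10.0, 1.0, 0.5], [9.5, 1.2, 0.4], [0.3, 0.2, 0.1], [0.4, 0.1, 0.2]]
shades = kmeans(pixels, k=2)  # cluster label per pixel = map shade
```

On this toy map the two spectrally similar "cell" pixels receive one shade and the two "background" pixels the other, which is the behaviour a k-means-directed shading scheme relies on.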
Soft touchless sensors and touchless sensing for soft robots
Soft robots are characterized by their mechanical compliance, making them well-suited for various bio-inspired applications. However, the challenge of preserving their flexibility during deployment has necessitated soft sensors, which can enhance their mobility, energy efficiency, and spatial adaptability. By emulating the structure, strategies, and working principles of the human senses, soft robots can detect both tactile stimuli and, with soft touchless sensors, stimuli that involve no direct contact. This has resulted in noteworthy progress within the field of soft robotics. Moreover, soft touchless sensors offer the advantage of non-invasive sensing and gripping without the drawbacks linked to physical contact. Consequently, the popularity of soft touchless sensors has grown in recent years, as they facilitate intuitive and safe interactions with humans, other robots, and the surrounding environment. This review explores the emerging confluence of touchless sensing and soft robotics, outlining a roadmap for deployable soft robots that achieve human-level dexterity
Reliable Sensor Intelligence in Resource Constrained and Unreliable Environment
The objective of this research is to design sensor intelligence that is reliable in a resource-constrained, unreliable environment. Various sources of variation and uncertainty are involved in intelligent sensor systems, so it is critical to build reliable sensor intelligence. Many prior works seek to design reliable sensor intelligence by making the task itself more robust and reliable. This thesis suggests that, alongside improving the task itself, early warning based on task-reliability quantification can further improve sensor intelligence. A DNN-based early warning generator quantifies task reliability based on the spatiotemporal characteristics of the input, and the early warning controls sensor parameters and avoids system failure. This thesis presents an early warning generator that predicts task failure due to sensor-hardware-induced input corruption and controls the sensor operation. Moreover, a lightweight uncertainty estimator is presented to take account of DNN model uncertainty in task-reliability quantification without the prohibitive computation of a stochastic DNN. Cross-layer uncertainty estimation is also discussed to consider the effect of PIM variations
The 2023 terahertz science and technology roadmap
Terahertz (THz) radiation encompasses a wide spectral range within the electromagnetic spectrum that extends from microwaves to the far infrared (100 GHz–∼30 THz). Within its frequency boundaries exist a broad variety of scientific disciplines that have presented, and continue to present, technical challenges to researchers. During the past 50 years, for instance, the demands of the scientific community have substantially evolved, with a need for advanced instrumentation to support radio astronomy, Earth observation, weather forecasting, security imaging, telecommunications, non-destructive device testing and much more. Furthermore, applications have required an emergence of technology from the laboratory environment to production-scale supply and in-the-field deployments ranging from harsh ground-based locations to deep space. In addressing these requirements, the research and development community has advanced related technology and bridged the transition between electronics and photonics that high frequency operation demands. The multidisciplinary nature of THz work was our stimulus for creating the 2017 THz Science and Technology Roadmap (Dhillon et al 2017 J. Phys. D: Appl. Phys. 50 043001). As one might envisage, though, there remains much to explore both scientifically and technically and the field has continued to develop and expand rapidly. It is timely, therefore, to revise our previous roadmap and in this 2023 version we both provide an update on key developments in established technical areas that have important scientific and public benefit, and highlight new and emerging areas that show particular promise. The developments that we describe thus span from fundamental scientific research, such as THz astronomy and the emergent area of THz quantum optics, to highly applied and commercially and societally impactful subjects that include 6G THz communications, medical imaging, and climate monitoring and prediction.
Our Roadmap vision draws upon the expertise and perspective of multiple international specialists that together provide an overview of past developments and the likely challenges facing the field of THz science and technology in future decades. The document is written in a form that is accessible to policy makers who wish to gain an overview of the current state of the THz art, and for the non-specialist and curious who wish to understand available technology and challenges. As such, our experts deliver a 'snapshot' introduction to the current status of the field and provide suggestions for exciting future technical development directions. Ultimately, we intend the Roadmap to portray the advantages and benefits of the THz domain and to stimulate further exploration of the field in support of scientific research and commercial realisation
Development of a SQUID magnetometry system for cryogenic neutron electric dipole moment experiment
A measurement of the neutron electric dipole moment (nEDM) could hold the key to understanding why the visible universe is the way it is: why matter should predominate over antimatter. As a charge-parity violating (CPV) quantity, an nEDM could provide an insight into new mechanisms that address this baryon asymmetry. The motivation for an improved sensitivity to an nEDM is to find it to be non-zero at a level consistent with certain beyond the Standard Model theories that predict new sources of CPV, or to establish a new limit that constrains them.
CryoEDM is an experiment that sought to better the current limit of cm by an order of magnitude. It is designed to measure the nEDM via the Ramsey Method of Separated Oscillatory Fields, in which it is critical that the magnetic field remains stable throughout. A way of accurately tracking the magnetic fields, moreover at a temperature K, is crucial for CryoEDM, and for future cryogenic projects.
This thesis presents work focussing on the development of a 12-SQUID magnetometry system for CryoEDM, that enables the magnetic field to be monitored to a precision of pT. A major component of its infrastructure is the superconducting capillary shields, which screen the input lines of the SQUIDs from the pick up of spurious magnetic fields that will perturb a SQUID's measurement. These are shown to have a transverse shielding factor of , which is a few orders of magnitude greater than the calculated requirement.
Efforts to characterise the shielding of the SQUID chips themselves are also discussed. The use of Cryoperm for shields reveals a tension between improved SQUID noise and worse neutron statistics. Investigations show that without it, SQUIDs have an elevated noise when cooled in a substantial magnetic field; with it, magnetostatic simulations suggest that it is detrimental to the polarisation of neutrons in transport. The findings suggest that with proper consideration, it is possible to reach a compromise between the two behaviours.
Computational work to develop a simulation of SQUID data is detailed, which is based on the Laplace equation for the magnetic scalar potential. These data are ultimately used in the development of a linear regression technique to determine the volume-averaged magnetic field in the neutron cells. This proves highly effective in determining the fields within the pT requirement under certain conditions
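The regression idea described above can be sketched generically: fit a low-order field model to readings from a few magnetometers at known positions, then evaluate the model's average over the measurement volume. The model, sensor positions, readings, and geometry below are all invented for illustration; this is not the thesis's code, only a minimal least-squares example of the same kind of inference.

```python
# Fit Bz(x, y, z) = b0 + gx*x + gy*y + gz*z (uniform field + linear gradient,
# the lowest-order solution of the Laplace equation) to synthetic sensor data.
def solve(A, y):
    """Gauss-Jordan elimination for the square system A x = y."""
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit_field(positions, readings):
    """Least-squares fit of [b0, gx, gy, gz] via the normal equations."""
    X = [[1.0, x, y, z] for x, y, z in positions]
    A = [[sum(row[i] * row[j] for row in X) for j in range(4)] for i in range(4)]
    b = [sum(X[r][i] * readings[r] for r in range(len(X))) for i in range(4)]
    return solve(A, b)

# Synthetic truth: Bz = 2.0 + 0.1*x - 0.05*z (arbitrary units and geometry).
positions = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1), (0.5, 0.2, 0.8)]
readings = [2.0 + 0.1 * x - 0.05 * z for x, y, z in positions]
b0, gx, gy, gz = fit_field(positions, readings)

# Volume average over a unit cube centred at (0.5, 0.5, 0.5): for a linear
# field this equals the value at the cube's centre.
avg = b0 + 0.5 * (gx + gy + gz)
```

Because the synthetic field is exactly linear and the sensor geometry spans the model space, the fit recovers the coefficients exactly; with noisy data the same normal-equation machinery returns the least-squares estimate.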
Investigating the learning potential of the Second Quantum Revolution: development of an approach for secondary school students
In recent years we have witnessed important changes: the Second Quantum Revolution is in the spotlight of many countries, and it is creating a new generation of technologies.
To unlock the potential of the Second Quantum Revolution, several countries have launched strategic plans and research programs that finance and set the pace of research and development of these new technologies (like the Quantum Flagship, the National Quantum Initiative Act and so on).
The increasing pace of technological changes is also challenging science education and institutional systems, requiring them to help to prepare new generations of experts.
This work is placed within physics education research and contributes to the challenge by developing an approach and a course about the Second Quantum Revolution. The aims are to promote quantum literacy and, in particular, to value the Second Quantum Revolution from a cultural and educational perspective.
The dissertation is articulated in two parts. In the first, we unpack the Second Quantum Revolution from a cultural perspective and shed light on the main revolutionary aspects that are elevated to the rank of principles implemented in the design of a course for secondary school students, prospective and in-service teachers. The design process and the educational reconstruction of the activities are presented as well as the results of a pilot study conducted to investigate the impact of the approach on students' understanding and to gather feedback to refine and improve the instructional materials.
The second part consists of the exploration of the Second Quantum Revolution as a context to introduce some basic concepts of quantum physics. We present the results of an implementation with secondary school students to investigate if and to what extent external representations could play a role in promoting students’ understanding and acceptance of quantum physics as a personally reliable description of the world
Emerging Approaches for THz Array Imaging: A Tutorial Review and Software Tool
Accelerated by the increasing attention drawn by 5G, 6G, and Internet of
Things applications, communication and sensing technologies have rapidly
evolved from millimeter-wave (mmWave) to terahertz (THz) in recent years.
Enabled by significant advancements in electromagnetic (EM) hardware, mmWave
and THz frequency regimes spanning 30 GHz to 300 GHz and 300 GHz to 3000 GHz,
respectively, can be employed for a host of applications. The main feature of
THz systems is high-bandwidth transmission, enabling ultra-high-resolution
imaging and high-throughput communications; however, challenges in both the
hardware and algorithmic arenas remain for the ubiquitous adoption of THz
technology. Spectra comprising mmWave and THz frequencies are well-suited for
synthetic aperture radar (SAR) imaging at sub-millimeter resolutions for a wide
spectrum of tasks like material characterization and nondestructive testing
(NDT). This article provides a tutorial review of systems and algorithms for
THz SAR in the near-field with an emphasis on emerging algorithms that combine
signal processing and machine learning techniques. As part of this study, an
overview of classical and data-driven THz SAR algorithms is provided, focusing
on object detection for security applications and SAR image super-resolution.
We also discuss relevant issues, challenges, and future research directions for
emerging algorithms and THz SAR, including standardization of system and
algorithm benchmarking, adoption of state-of-the-art deep learning techniques,
signal processing-optimized machine learning, and hybrid data-driven signal
processing algorithms...Comment: Submitted to Proceedings of IEE
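The classical baseline that such data-driven THz SAR methods are compared against is delay-and-sum backprojection: simulate (or record) the round-trip phase from each antenna position, then coherently sum the echoes after removing the hypothesised phase at each image pixel. The sketch below uses a toy single-frequency, monostatic, 1-D aperture with invented geometry; it illustrates the principle only, not the article's algorithms.

```python
# Delay-and-sum backprojection for a single CW tone (toy near-field geometry).
import cmath

c = 3e8                    # speed of light, m/s
f = 300e9                  # illustrative 300 GHz tone
k = 2 * cmath.pi * f / c   # wavenumber

# Monostatic aperture: antenna positions along x at standoff z = 0, 1 mm steps.
aperture = [(x * 1e-3, 0.0) for x in range(-20, 21)]
target = (5e-3, 50e-3)     # point scatterer 50 mm away

def dist(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

# Simulated echo at each antenna: round-trip phase from the target.
echoes = [cmath.exp(-1j * 2 * k * dist(a, target)) for a in aperture]

def backproject(pixel):
    # Coherent sum after removing each hypothesised round-trip phase.
    return abs(sum(e * cmath.exp(1j * 2 * k * dist(a, pixel))
                   for a, e in zip(aperture, echoes)))

# Image a row of candidate pixels at the target depth; the brightest pixel
# coincides with the true target position, where all terms add in phase.
pixels = [(x * 1e-3, 50e-3) for x in range(-10, 11)]
image = [backproject(p) for p in pixels]
peak = pixels[image.index(max(image))]
```

At the true target pixel every summand has unit phase, so the magnitude equals the number of antennas; elsewhere the terms decohere, which is what produces the focused image.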
Advanced Computing and Related Applications Leveraging Brain-inspired Spiking Neural Networks
In the rapid evolution of next-generation brain-inspired artificial
intelligence and increasingly sophisticated electromagnetic environments, the
strong bionic characteristics and anti-interference performance of spiking
neural networks show great potential in computational speed, real-time
information processing, and spatio-temporal information processing. The
spiking neural network is one of the cores of brain-like artificial
intelligence, realizing brain-like computing by simulating the structure and
information-transfer mode of biological neural networks. This paper
summarizes the strengths, weaknesses, and applicability of five neuronal
models and analyzes the characteristics of five network topologies; it then
reviews spiking neural network algorithms, covering the unsupervised learning
algorithms based on synaptic plasticity rules and four types of supervised
learning algorithms; finally, it reviews the brain-like neuromorphic chips
under development worldwide. This paper is intended to provide learning
concepts and research orientations for peers who are new to the research
field of spiking neural networks
How to build a magnetometer with thermal atomic vapor: A tutorial
This article is designed as a step-by-step guide to optically pumped
magnetometers based on alkali atomic vapor cells. We begin with a general
introduction to atomic magneto-optical response, as well as expected
magnetometer performance merits and how they are affected by main sources of
noise. This is followed by a brief comparison of different magnetometer
realizations and an overview of current research, with the aim of helping
readers to identify the most suitable magnetometer type for specific
applications. Next, we discuss some practical considerations for experimental
implementations, using the case of an magnetometer as an example of the
design process. Finally, an interactive workbook with real magnetometer data is
provided to illustrate magnetometer-performance analysis.Comment: 52 pages, 9 figures, 3 tables. Submitted to New Journal of Physics as
an invited review/tutorial for the special issue "Focus on Hot Atomic
Vapors". Minor content and language errors corrected in v