ROS: The Research Output Service, Heriot-Watt University Edinburgh
The synthesis and derivation of tetra-substituted methylene bridge calix[4]arenes
Chapter 1 discusses the history of calix[n]arenes dating back to the initial investigation
of the phenol-formaldehyde procedure up to the seminal work by Gutsche that developed
the synthesis of these macrocycles allowing ease of access. Modification of these
macrocycles is discussed on the upper- and lower-rim followed by a detailed look into
different methods utilised to build functionalisation at the methylene bridge. Finally,
methods of introducing multiple functionalities at the methylene bridge are considered.
Chapter 2 presents the synthesis of a previously reported calix[4]arene that has been
mono-substituted at the methylene bridge with a saturated 1,4-diketone. This
compound is then used as an intermediate in the synthesis of pyrrole-appended
calix[4]arenes (C[4]s) through the Paal-Knorr synthesis with a range of different
anilines. A library of these C[4]s is presented with observations and discussion.
Chapter 3 contains a further look into the Paal-Knorr synthesis, using the saturated
1,4-diketone intermediate with a set of long-chain alkylamines. The exploration of other
heterocycles is then discussed, including the synthesis of a thiophene using the saturated
1,4-diketone intermediate, and of a 1,4-unsaturated diketone with subsequent pyridazine
ring closure. The deprotection of the pyridazine, thiophene and a long-chain alkyl
pyrrole is discussed, along with the issues encountered, how they were overcome, and the
decomposition of these compounds.
Chapter 4 discusses introducing pyridyl functionality at the methylene bridge using the
Paal-Knorr synthesis. The different methods considered and trialled are described,
followed by the synthesis of two extended anilines and the reaction of 2-, 3- and
4-(pyridyl)aniline with the saturated 1,4-diketone to form the respective pyrroles. The
products synthesised, and how these could be used to form asymmetrical calix[4]arenes,
are discussed.
Chapter 5 is a summary and overview of the work presented in this thesis with a small
section on the possible future work in this area.
Contourite cyclicity and deposition
Cyclic depositional features are commonly developed in deepwater sedimentary facies.
Stacked sequences in varied forms are the most obvious characteristic, which is related to
complex variation in depositional conditions. This study introduces several geostatistical
approaches to analyse the cyclical bi-gradational sequences of contourite deposits from IODP
Expedition 339 in the Contourite Depositional System (CDS) in the Gulf of Cadiz. It analysed
both the vertical sequences and their cyclicity and their lateral correlation, both qualitatively
and quantitatively. Additionally, similar geostatistical approaches were applied to colour
sequences in turbidite and hemipelagite deposits from the Benguela Current Upwelling System,
SW African continental margin. This allowed comparison of sequences and cyclicity between
the different deepwater facies types. It also demonstrated the general applicability of this
method to deepwater sedimentary facies.
This study systematically examined contourite bi-gradational sequences deposited at IODP
Sites U1386 and U1387 from the mid-Pleistocene to Recent. Transition probability analysis
based on the lithological logs confirmed the statistical validity of typical bi-gradational
sequences composed of coarsening-upward to fining-upward contourite divisions. These
typically ranged in thickness from 0.5-5 m. Variations caused by minor erosion and non-deposition resulted in incomplete sequences and complex sequences with multiple gradational
divisions. The cyclic patterns of contourite bi-gradational sequences varied to some extent
between each studied hole. The 3-layer-sequence (C1-C2-C4-C5) dominates in this study area,
and a few sequences with more than 4 divisions include the C3 (sandy) division.
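As a minimal illustration of the transition probability analysis mentioned above, upward transitions in a coded lithological log can be tallied into a Markov-style matrix; the short log below is an invented example, not data from Sites U1386 and U1387:

```python
from collections import Counter

# Invented lithological log coded as contourite divisions (not core data).
log = ["C1", "C2", "C4", "C5", "C1", "C2", "C4", "C5",
       "C1", "C2", "C3", "C4", "C5"]

# Count upward transitions between successive divisions.
pairs = Counter(zip(log, log[1:]))
states = sorted(set(log))
totals = Counter(log[:-1])               # outgoing transitions per division

# Row-normalised transition probability matrix P[a][b].
P = {a: {b: pairs[(a, b)] / totals[a] for b in states} for a in states}

for a in states:
    print(a, " ".join(f"{P[a][b]:.2f}" for b in states))
```

A dominant C1→C2→C4→C5 pathway shows up as near-unity entries along those transitions, which is the statistical signature of a bi-gradational sequence.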
Autocorrelation of sequence duration in all studied holes indicates the existence of long-term
cyclicity of around 350 ky. The succession can be subdivided into 4 stages (0-350 ka, 350-700
ka, 750-1000 ka and >1000 ka) with alternating occurrence of more frequent and coarser
sequences and less frequent finer-grained sequences. Cross-correlation of this long-term
sequence frequency as well as of individual sequences shows moderately good but not perfect
correlation between holes and sites. This study suggests contourite deposition at the study sites
is controlled by both bottom current strength (speed) and sediment supply, both of which can
most probably be related to a complex variation in paleoclimate evolution and orbital cycles. In hemipelagite and turbidite-hemipelagite hybrid deposits beneath the Benguela Upwelling
System off SW Africa, two colour sequence models (Hemipelagite Dominant Sequence and
Turbidite Hemipelagite Hybrid Sequence) were developed based on light-dark variation related
to organic matter content. Autocorrelation of sequence duration pointed out long-term cyclicity
through the past 4 My (0-1 Ma, 1-2.5 Ma, 2.5-4 Ma and >4 Ma), which can be correlated to
stages in the long-term evolution of the Benguela Current Upwelling System. Lateral
correlation between studied sites indicated differences of sequence frequency in time and space,
which further contributed to reconstruction of the upwelling system.
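The autocorrelation-of-duration step used in both settings can be sketched as follows; the synthetic duration series and its 35-sample cycle are assumptions for illustration only, not measurements from either study area:

```python
import numpy as np

# Hypothetical sequence-duration series (kyr per sequence); the values are
# invented to illustrate the approach, not taken from the studied cores.
rng = np.random.default_rng(0)
t = np.arange(100)
durations = 20 + 5 * np.sin(2 * np.pi * t / 35) + rng.normal(0, 1, t.size)

# Normalised autocorrelation; repeating peaks indicate cyclicity.
x = durations - durations.mean()
acf = np.correlate(x, x, mode="full")[x.size - 1:]
acf = acf / acf[0]

# Estimate the cycle length as the first major peak after the
# autocorrelation first turns negative.
first_neg = int(np.argmax(acf < 0))
peak = first_neg + int(np.argmax(acf[first_neg:60]))
print(peak)
```

The recovered peak lag approximates the imposed 35-sample cycle, analogous to the ~350 ky cyclicity identified in the contourite record.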
The geostatistical study of cyclicity in contourite bi-gradational sequences, and its
comparison with the cyclic characteristics of other deepwater facies (turbidites and
hemipelagites), demonstrate
that such geostatistical approaches can be an important technique to evaluate the basic
sedimentary character of different systems and their numerical expression. It allows for
comparison between facies types and for better correlation with other time series records, such
as orbital climate patterns. Cyclic signatures can be correlated between different sites and help
better understand the sedimentary processes involved in their deposition.
Multi qubit gates using ZZ interactions in superconducting circuits
In recent years quantum computing has shown great promise and has come on in
leaps and bounds. The promise of quantum computers is the speed-up over classical computers in specific areas and hence the ability to tackle even more complex
problems. As quantum computers evolve the need for more complex quantum gates
requiring more qubits (multi qubit gates) arises. These gates are currently broken
down into their one and two qubit gates. Multi qubit gate decompositions involve
many two qubit gates, so the fidelity of these gates needs to be much higher
in order to produce a usable multi qubit gate. A possible solution to this is to introduce a single shot method for the multi qubit gates. In this thesis we investigate the
use of dispersive shifts to create these single shot methods. We examine two scenarios: the first is a relatively simple three qubit gate (the iToffoli gate), used to demonstrate
the procedure. We then move to extend this method to a larger number of qubits
examining its uses in quantum error correction and noting the potential pitfalls of
this method.
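The entangling resource behind such schemes, the conditional phase accumulated under a ZZ interaction, can be illustrated with a minimal two-qubit sketch; the coupling strength is an arbitrary illustrative value, not a parameter from this thesis:

```python
import numpy as np

# Two-qubit ZZ coupling H = J * Z(x)Z; J is an arbitrary illustrative value.
Z = np.diag([1.0, -1.0])
J = 2 * np.pi * 1e6                      # coupling strength (rad/s)
H = J * np.kron(Z, Z)

# Free evolution for t = pi/(4J): U = exp(-iHt) is diagonal, with phase
# -pi/4 on |00>, |11> and +pi/4 on |01>, |10>.
t = np.pi / (4 * J)
U = np.diag(np.exp(-1j * np.diag(H) * t))

# The entangling (conditional) phase is phi = p00 - p01 - p10 + p11 = -pi,
# i.e. a controlled-Z up to single-qubit rotations.
p = np.angle(np.diag(U))
phi = p[0] - p[1] - p[2] + p[3]
print(phi)
```

Generalising this conditional-phase accumulation to several qubits simultaneously is, in spirit, what a single shot multi qubit gate exploits.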
This thesis is organised as follows. In Chapter 1 we shall introduce the topic of
superconducting circuits discussing some simple circuits such as the LC Oscillator
and showing how these circuits can be modified to model superconducting qubits.
We shall also introduce the topic of Quantum Computing giving an overview of
the topic, discussing some quantum gates which shall be used and finally a short
introduction to Quantum Error Correction. In Chapter 2 we shall show how we implemented a single shot multi qubit gate within superconducting circuits. We shall
introduce some of the methods and analysis procedures we use within this thesis and
show numerical evidence of this gate. In Chapter 3 we shall discuss an extension
of the gate mechanism of Chapter 2 to larger qubit clusters and show how it can
be modified to implement parity check gates and show how they can be used to
implement the stabilizer measurements used in the surface code. Finally, in Chapter
4 we shall discuss the future of this work, looking at some possible future directions for this research and suggesting some other more novel avenues which could
be explored.
Embrace concept drift: a novel solution for online continual learning
Continual learning is a critical area of research in machine learning that aims to enable
models to learn new information without forgetting the old knowledge. Online continual learning, in particular, addresses the challenges of learning from a stream of data in
real-world environments where data can be unbounded and heterogeneous. There are two
main problems to be addressed in online continual learning: the first one is catastrophic
forgetting, a phenomenon where the model forgets the previously learned knowledge
while learning new tasks; the second one is concept drift, a situation where the distribution of the data changes over time. These issues can further complicate the learning
process, compared to traditional machine learning.
In this thesis, we propose a general framework for online continual learning that leverages both regularization-based and memory-based methods to mitigate catastrophic forgetting and handle concept drift. Specifically, we introduce a novel concept drift detection
algorithm based on the confidence values of the samples. We present a novel online continual learning paradigm, which utilizes concept drift as a rehearsal signal to improve
performance by consolidating or expanding the memory center. We also apply data condensation approaches to online continual learning in order to perform memory efficient
rehearsal.
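A confidence-based drift detector of the general kind described above might be sketched as follows; the sliding window, threshold and re-anchoring rule are illustrative assumptions, not the algorithm the thesis proposes:

```python
from collections import deque

class ConfidenceDriftDetector:
    """Flag drift when mean confidence over a sliding window falls well
    below a reference level. Window size, threshold and re-anchoring are
    illustrative choices, not the thesis's algorithm."""

    def __init__(self, window=50, drop=0.2):
        self.buf = deque(maxlen=window)
        self.reference = None
        self.drop = drop

    def update(self, confidence):
        self.buf.append(confidence)
        if len(self.buf) < self.buf.maxlen:
            return False                     # still warming up
        mean = sum(self.buf) / len(self.buf)
        if self.reference is None:
            self.reference = mean            # establish the reference level
            return False
        if self.reference - mean > self.drop:
            self.reference = mean            # re-anchor after detected drift
            return True
        return False

det = ConfidenceDriftDetector()
stream = [0.9] * 100 + [0.6] * 100           # confidence collapses mid-stream
alerts = [i for i, c in enumerate(stream) if det.update(c)]
print(alerts)
```

In the framework above, such an alert would serve as the rehearsal signal that triggers consolidation or expansion of the memory center.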
Furthermore, we evaluate the accuracy of old tasks and new tasks, comparing with
many benchmark models. We present a novel evaluation metric, Stability and Plasticity
Balance, to measure the balance between old and new accuracy.
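As a sketch of what a balance metric of this kind could look like, a harmonic-mean combination of old-task and new-task accuracy penalises a model that sacrifices one for the other; the formula is an assumed stand-in for illustration, not the thesis's definition of Stability and Plasticity Balance:

```python
def sp_balance(old_acc, new_acc):
    """Illustrative stability-plasticity balance: harmonic mean of old-task
    accuracy (stability) and new-task accuracy (plasticity). The harmonic
    mean is an assumed form, not the metric defined in the thesis."""
    if old_acc + new_acc == 0:
        return 0.0
    return 2 * old_acc * new_acc / (old_acc + new_acc)

# A model that preserves old knowledge but learns little new scores low,
# while an even trade-off scores at the common accuracy level.
print(sp_balance(0.9, 0.1))
print(sp_balance(0.5, 0.5))
```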
We evaluate our proposed approach on a new benchmark dataset framework, Continual Online Learning (COnL), which consists of two scenarios of online continual learning: class-incremental learning and instance-incremental learning. In this thesis, the
benchmark dataset framework randomly selects a number of incremental classes from
3 different datasets: TinyImageNet, German Traffic Sign and Landmarks. Our primary
results demonstrate that concept drift can be a useful tool in memory rehearsal in the online continual learning setting. Our proposed approaches provide a promising direction
for future research in online continual learning and have the potential to enable models to learn continuously from unbounded and heterogeneous data streams in real-world
environments.
Developing an implementation framework for Lean Six Sigma in high-value and low-volume industries
High-value low-volume (HVLV) industries hold a strong relevance in Germany, with a
focus on complex engineering and large projects conducted at low frequencies.
Methodologies for continuous improvement (CI), such as Lean Six Sigma (LSS), have
been used in mass production industries, such as automotive, to promote operational
excellence and are now relevant in HVLV industries to survive amid growing
international competition. The implementation of LSS in the HVLV industries has not
yet been studied in much depth. Therefore, the purpose of the present study is to develop
an implementation framework for Lean Six Sigma in the wind power industry in
Germany.
To develop the conceptual framework, a systematic literature review of critical success
factors (CSFs) and critical failure factors (CFFs) was conducted. The review identified
similarities between the CSFs and CFFs which often reflected opposite conditions of the
same variable. The present study connects the success and failure factors as critical
influencing factors (CIFs), which include reasons for both failure and success. An
analysis of five relevant implementation frameworks of LSS shows that none of the
frameworks fully includes all CIFs or provides a complete answer concerning LSS
implementation in terms of what should be done to secure success and avoid failure in
the process.
The chosen research paradigm was critical realism and the research method was action
case in combination with action learning. The outcome of the study was an
implementation framework for LSS, the 3D framework house, as the essential result of
the research, which was validated by LSS experts.
The present study contributes to scholarship with the 3D framework house and a cycle
approach in 18 steps, as well as with the detailed description of the newly defined CIFs
with the focus dimensions and with the HVLV-specific focus dimensions.
It contributes to practice with the improved situation in the research organisation and with
the clear guideline for practitioners, giving answers to the questions “what needs to be
done” to improve the situation of the CIFs of LSS, “how this can be implemented” and
“who is responsible” in the HVLV industry context.
Sequential assimilation of crowdsourced social media data into a simplified flood inundation model
Flooding is the most common natural hazard worldwide. Severe floods can cause significant
damage and sometimes loss of life. During a flood event, hydraulic models play an important
role in forecasting and identifying potential inundated areas, where emergency responses
should be deployed. Nevertheless, hydraulic models are not able to capture all of the
processes in flood propagation because flood behaviour is highly dynamic and complex.
Thus, there are always uncertainties associated with model simulations. As a result, near-real-time observations need to be incorporated into hydraulic models to improve model
forecasting skills. Crowdsourced (CS) social media data presents an opportunity for
supporting urban flood management as it can provide insightful information collected by
individuals in near real-time.
In this thesis, approaches to maximise the impact of CS social media data (Twitter) to reduce
uncertainty in flood inundation modelling (LISFLOOD-FP) through data assimilation were
investigated. The developed methodologies were tested and evaluated using a real flooding
case study of Phetchaburi city, Thailand. Firstly, two approaches (binary logistic regression
and fuzzy logic) were developed based on Twitter metadata and spatiotemporal analysis to
assess the quality of CS social media data. Both methods produced good results, but the
binary logistic model was preferred as it involved less subjectivity. Next, the generalized
likelihood uncertainty estimation methodology was applied to estimate model uncertainty
and identify behavioural parameter ranges. Particle swarm optimisation was also carried out
to calibrate for an optimum model parameter set. Following this, an ensemble Kalman filter
was applied to assimilate the flood depth information extracted from the CS data into the
LISFLOOD-FP simulations using various updating strategies. The findings show that the
global state update suffers from inconsistency of predicted water levels due to overestimating
the impact of the CS data, whereas a topography based local state update provides
encouraging results as the uncertainty in model forecasts narrows, albeit for a short time
period. To extend the improvement time span, a combination of state and boundary updating
was further investigated to correct both water levels and model inputs, and was found to
produce longer lasting improvements in terms of uncertainty reduction. Overall, the results
indicate the feasibility of applying CS social media data to reduce model uncertainty in flood
forecasting.
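The core ensemble Kalman filter state update used in such assimilation can be sketched in a few lines; the ensemble, observation operator and error values below are invented for illustration and are not from the Phetchaburi case study:

```python
import numpy as np

# Minimal ensemble Kalman filter state update for water levels at 3 nodes.
# Ensemble statistics, observed depth and its error are invented values.
rng = np.random.default_rng(1)
n_ens = 20
X = 2.0 + 0.3 * rng.standard_normal((3, n_ens))   # forecast ensemble (m)

H = np.array([[0.0, 1.0, 0.0]])   # a CS report observes node 2 only
y, r = 2.5, 0.05 ** 2             # reported depth (m) and its error variance

# Kalman gain from the ensemble covariance: K = P H^T (H P H^T + R)^-1
P = np.cov(X)
K = P @ H.T / (H @ P @ H.T + r)

# Perturbed-observation update of every ensemble member.
y_pert = y + np.sqrt(r) * rng.standard_normal(n_ens)
Xa = X + K @ (y_pert - H @ X)

m = float(np.mean(Xa[1]))         # analysis mean is pulled towards the report
print(m)
```

A local or topography-based variant would restrict the gain so that a report only updates hydraulically connected nodes, which is the distinction drawn between the global and local state updates above.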
Optical ground receivers for satellite based quantum communications
Cryptography has always been a key technology in security, privacy and defence, from ancient Roman times, when messages were sent ciphered with simple encoding techniques, to modern times and the complex security protocols of the Internet.
During the last decades, security of information has been assumed, since classical
computers do not have the power to break the passwords used every day (if they are
generated properly). However, in 1994, a new threat emerged when Peter Shor presented Shor's algorithm, an algorithm that could be used on quantum computers to break many of today's secure communication protocols. Current quantum
computers are still in their early stages, with not enough qubits to perform this
algorithm in reasonable times. However, the threat is present, not future, since the
messages being sent today by important institutions can be stored, and decoded
in the future once quantum computers become available.
Quantum key distribution (QKD) is one of the solutions proposed for this threat,
and the only one mathematically proven to be secure with no assumptions about the
eavesdropper's power. This optical technology has recently attracted interest for
satellite communications, the main reason being the relative ease of deploying a
global network this way. In satellite QKD, the parameter space and available
technology to optimise are very large, so much work remains to understand the
optimal way to exploit this technology.
This dissertation investigates one of these parameters, the encoding scheme.
Most satellite QKD systems nowadays use polarisation schemes. The second chapter of
this thesis presents, for the first time, experimental work on a time-bin encoding
scheme for free-space receivers within a full QKD system. The third and fourth
chapters explore the advantages of multi-protocol free-space receivers, which can
boost interoperability between systems, and polarisation filtering techniques to
reduce background noise. Finally, the last chapter presents a new technology that can
help increase communication rates.
An assessment of the assisted seismic history matching workflow, practical innovations and solutions
In hydrocarbon reservoir monitoring, assisted seismic history matching (ASHM) remains
a large and intractable problem. Despite advances in optimisation algorithms, quantification of uncertainty, data quality, data processing, computational resources and general
subsurface knowledge, practical implementations of assisted/automated seismic history
matching remain boutique and inflexible. Consideration of recent research on
ASHM problems highlights a single-minded focus on algorithmic solutions that ignores
the broader perspective of ASHM as a multidisciplinary framework for improving subsurface models. This thesis expands the consideration of ASHM beyond the optimisation,
to propose a novel three-phase approach. ASHM is posed as a larger workflow that includes acquiring, evaluating and establishing an ASHM model (Phase 1), history matching
(Phase 2) and model evaluation and improvement (Phase 3). By taking a big picture perspective with respect to ASHM, additional value and patterns to workflows emerge that
will improve the adoption of ASHM within the subsurface industry, by offering pragmatic
and targeted guidance to development, evaluation and improvement around subsurface
models via ASHM.
Optimisation of microscopic techniques to assess isolated islet characteristics
Islets of Langerhans or pancreatic islets constitute ~2% of the mass of the human
pancreas and present on isolation as spheroids of 100-200 µm diameter. The 3D cellular
organisation of islets is specific to each species, and important for islet viability and
functionality. Isolated donor islets are used in transplantation for ameliorating Type I
diabetes in humans; however, current techniques to assess islet viability are highly
specialised and not easily accessible in a non-clinical setting. The research in this thesis
aimed to create and optimise methodologies for multiple microscopic techniques and
analysis for isolated pancreatic islets. I found that live imaging of 3D intact pancreatic
islets has multiple challenges, one of the most important being techniques to efficiently
immobilise these organoid structures while retaining high-quality imaging and
flexibility in the experimental set-up. I developed a tailor-made hydrogel for pancreatic
islets and validated its use in live intact islets. The hydrogel was combined with
experimental and commercially available chemical dyes and enabled optimisation of the
analysis. Limitations in the labelling and imaging are discussed. Alternative dyes were
tested to label different structures as steps towards automated viability assessment of
isolated islets. New applications for an experimental dye to label alpha and beta cells
were tested in human islets. In pursuit of a better understanding of the insulin metabolic
pathways for its synthesis, maturation and release, a fluorescence timer tag was
designed and validated for its use in beta cell lines and pancreatic islets. This validation
involved multiple optimisation processes, consisting of immunostaining and histology,
imaging analysis and characterisation in live beta cells. The thesis offers insight into
the complexities, opportunities and limitations offered by microscopic techniques in
islet assessment with the aim of enabling assessment of islet health before
transplantation and for research purposes.
What does this notation mean anyway? Interpreting BNF-style notation as it is used in practice
BNF (Backus Naur Form) notation, as introduced in the Algol 60 report, was followed
by numerous notational variants (EBNF ISO (1996), ABNF Crocker et al. (2008), etc.),
and later by a new metalanguage which is used for discussing structured objects in Computer Science and Mathematical Logic. We call this latter offspring of BNF MBNF
(Math BNF). MBNF is sometimes called “abstract syntax”. MBNF can express structured objects that cannot be serialised as finite strings. What MBNF and other BNF
variants share is the use of production rules, whose form is given below, which state
that “every instance of ◦ᵢ for i ∈ {1, …, n} is also an instance of •”.

• ::= ◦₁ | ⋯ | ◦ₙ
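A conventional BNF-style rule of this shape can be read as defining a set by union over its alternatives; a toy recogniser makes this concrete (the grammar below is an invented example, not one from the thesis):

```python
from itertools import product

# Toy grammar in the production-rule form above (invented example):
#   expr  ::= digit | expr "+" digit
#   digit ::= "0" | "1"
RULES = {
    "expr": [["digit"], ["expr", "+", "digit"]],
    "digit": [["0"], ["1"]],
}

def derive(symbol, depth):
    """Set of strings derivable from `symbol` in at most `depth` expansions."""
    if symbol not in RULES:              # terminal: denotes itself
        return {symbol}
    if depth == 0:
        return set()
    out = set()
    for alternative in RULES[symbol]:
        # every instance of an alternative is an instance of the left-hand side
        parts = [derive(s, depth - 1) for s in alternative]
        for combo in product(*parts):
            out.add("".join(combo))
    return out

print(sorted(derive("expr", 3)))
```

Every object defined this way is a finite string, which is exactly the property that MBNF, as discussed below, does not share.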
This thesis studies BNF and its variant forms and contrasts them with MBNF production rules. We show via a series of detailed examples and lemmas that MBNF differs
substantially from BNF and its variants in how it is written, the operations it allows,
and the sets of entities it defines. We demonstrate with an example and a proof that
MBNF has features that, when combined, could make MBNF rule sets inconsistent.
Readers do not have a document which tells them how to read MBNF and have to learn
MBNF through a process of cultural initiation. We propose a framework, MathSyn,
that handles most uses of MBNF one might encounter in the wild.