Lancaster E-Prints

Lancaster University

    109,631 research outputs found

    Planning equitable on-street residential EV charging infrastructure : Evidence from Dublin City

    No full text
    The growing adoption of electric vehicles (EVs) presents challenges to the equitable deployment of charging infrastructure. Ensuring equitable access to residential charging stations is critical for supporting widespread EV adoption across diverse communities. As such, research on applying charging equity theories to the placement of residential charging stations remains essential. Based on the Irish national household travel survey and census data, this study introduces a Monte Carlo simulation model for charging demand estimation. Grounded in utilitarianism, sufficientarianism, and vertical equity, a multi-objective optimisation model is proposed for siting and sizing charging stations, along with their connections to transformers across disadvantaged, intermediate and advantaged communities. A case study in Dublin examines the proposed models under various EV penetration scenarios. The results reveal the spatial and temporal distribution of day-of-week charging demands across 524 communities. Subsequently, 63 and 82 stations with 102 and 166 chargers, powered by 63 and 74 transformers, are planned for the 30% and 50% EV penetration scenarios, respectively. The findings highlight that neglecting charging demand heterogeneity can result in under- or over-supply of infrastructure. The study further emphasises that incorporating diverse charging equity theories into the formulation of planning objectives and constraints can significantly promote equitable charging accessibility across disadvantaged and non-disadvantaged communities. The model is primarily applicable to residential areas that mix disadvantaged and other communities, rely heavily on on-street parking, and exhibit stable EV adoption.
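The Monte Carlo charging-demand estimation described above can be sketched roughly as follows. The plug-in probability and per-session energy distributions here are illustrative placeholders, not the survey-fitted values used in the study.

```python
import random

def simulate_daily_demand(n_evs, n_draws=1000, seed=42):
    """Toy Monte Carlo estimate of total evening charging energy (kWh).

    Each EV's plug-in probability and energy need are drawn from
    illustrative distributions (NOT the paper's survey-fitted ones).
    """
    rng = random.Random(seed)
    totals = []
    for _ in range(n_draws):
        total = 0.0
        for _ in range(n_evs):
            if rng.random() < 0.3:            # chance the EV charges tonight
                total += rng.uniform(5, 25)   # energy drawn per session, kWh
        totals.append(total)
    return sum(totals) / n_draws              # mean demand across draws

mean_demand = simulate_daily_demand(100)
```

In the study this per-community demand estimate would feed the downstream siting/sizing optimisation as an input.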

    Systematic design choices for fine-tuning text-classification models : Projection space, task instructions, and label encoding

    Get PDF
    Text classification (TC) is a foundational task in natural language processing (NLP), where supervised fine-tuning (SFT) of pre-trained language models (PLMs) has become the dominant paradigm. However, systematic investigations of three key design choices within fine-tuning frameworks for TC are still lacking: (1) using a classification head projecting to the label space versus the vocabulary space, (2) augmenting input text with task instructions, and (3) integrating label text directly into training sequences. To address this gap, we introduce a principled 2 × 2 × 2 design matrix and conduct an empirical study grounded in this unified methodological framework across four core benchmark datasets (two Chinese and two English) using both encoder-only and decoder-only PLMs, and further validate our findings on multi-label and long-document benchmarks. Results indicate that classification heads projecting to the label space or the vocabulary space achieve comparable performance. Explicit task instructions, while effective in few-shot in-context learning (ICL), do not consistently improve performance in supervised fine-tuning. Notably, although decoder-only models exhibit the capability to learn from label-appended sequences, this behavior superficially resembles ICL and fundamentally arises from architectural alignment between label placement and causal next-token supervision rather than genuine reasoning over label semantics. These results provide both analytical insight into PLM supervision dynamics and actionable design guidelines for efficient TC workflows. All source code and processed datasets are publicly available at https://github.com/JianLyu07/design_choices
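The 2 × 2 × 2 design matrix above can be enumerated directly; the factor names below paraphrase the abstract and are not the paper's own identifiers.

```python
from itertools import product

# Enumerate the 2 x 2 x 2 fine-tuning design matrix: projection space,
# task instructions, and label encoding (names are illustrative).
projection = ["label_space", "vocab_space"]
instructions = ["with_instruction", "no_instruction"]
label_text = ["label_appended", "no_label"]

design_matrix = list(product(projection, instructions, label_text))
```

Each of the eight resulting configurations would be fine-tuned and evaluated on every benchmark dataset.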

    Deconstruction

    No full text
    The emergence of digital technology is transforming our culture in ways that are hard to grasp. New media art and Deconstruction can both be considered as responses to the challenges of computerization and digital technology. The writings of Jacques Derrida, Bernard Stiegler and others connected with Deconstruction offer the most profound engagement with questions of technicity that are particularly pertinent to our current technologized condition, as does art involving new forms of media, networks and technologies. ‘Deconstruction’ in this context refers to the philosophical practice of the close reading of texts to reveal their internal structures and contradictions, exemplified in much of Derrida’s work. ‘Technicity’ is the term for how there is nothing about the human that is not always already technical.

    Dual alignment : Partial negative and soft-label alignment for text-to-image person retrieval

    Get PDF
    Text-to-image person retrieval is the task of retrieving correctly matched images given a textual description of the person of interest. The main challenge lies in the inherent modal difference between texts and images. Most existing works narrow the modality gap by aligning the feature representations of text and image in a latent embedding space. However, these methods usually leverage hard labels and mine insufficient or incorrect hard negatives to achieve cross-modal alignment, generating incorrect hard negative pairs and leading to suboptimal performance. To tackle the above problems, we propose a dual alignment framework, Partial negative and Soft-label Alignment (PASA), which includes the partial negative alignment (PA) strategy and the Soft-label Alignment (SA) strategy. Specifically, PA pushes the hard negatives far away in the triplet loss by considering a certain amount of negatives within each mini-batch as hard negatives, preventing distraction from the positive text–image pairs. Based on PA, SA further aligns the similarity distributions over these hard negatives via soft labels, as well as aligning the inter-modal and intra-modal similarities. Extensive experiments on three public datasets, CUHK-PEDES, ICFG-PEDES and RSTPReid, demonstrate that our proposed PASA method can consistently improve the performance of text-to-image person retrieval, and achieve new state-of-the-art results on the above three datasets.
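The core idea of the PA strategy, restricting a margin-based triplet loss to only the hardest in-batch negatives, can be sketched as follows. This is a toy scalar version over precomputed similarities, not PASA's exact loss formulation.

```python
def partial_negative_triplet_loss(pos_sim, neg_sims, k=3, margin=0.2):
    """Toy version of partial negative alignment: only the k most similar
    negatives in the mini-batch are treated as hard negatives inside a
    margin (triplet) loss. Illustrative, not the paper's exact objective.
    """
    hard = sorted(neg_sims, reverse=True)[:k]   # top-k hardest negatives
    return sum(max(0.0, margin - pos_sim + s) for s in hard) / len(hard)
```

Easy negatives are thereby excluded from the loss, so gradient signal concentrates on the pairs most likely to confuse the model.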

    Threshold displacement energies of oxygen in YBa2Cu3O7 : A multi-physics analysis

    No full text
    Neutron bombardment of high temperature superconducting (HTS) magnets may compromise the integrity of the magnetic confinement in future fusion reactors. The amount of damage produced by a single neutron can be predicted from the threshold displacement energies (TDE) of the constituent ions in HTS materials, such as the rare-earth cuprates. Therefore, in this work a multi-physics simulation approach is adopted to determine the threshold displacement energies for oxygen in YBa2Cu3O7. Classical molecular dynamics (MD) simulations are employed to determine statistically representative TDEs for all four oxygen sites, and these results are validated using Born-Oppenheimer MD employing forces derived from Density Functional Theory (DFT). The simulations were performed at the operational temperature (25 K) and the temperature of existing neutron irradiation studies (360 K), enabling a discussion of the relevance of these data. Overall, these findings enhance our understanding of radiation-induced damage in HTS materials and provide data that can be incorporated into higher-order models, offering critical insights into shielding design and magnet longevity.
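In TDE studies of this kind, the threshold is commonly located by giving a primary knock-on atom increasing kinetic energies until a stable defect survives the cascade. A minimal sketch of that scan, where the hypothetical `creates_defect` predicate stands in for a full MD cascade simulation:

```python
def threshold_displacement_energy(creates_defect, e_min=1.0, e_max=100.0, step=1.0):
    """Scan primary knock-on atom (PKA) kinetic energies (eV) upward until
    a stable defect survives.

    `creates_defect(energy)` is a placeholder for running an MD cascade and
    inspecting the final lattice; it is assumed monotone here, and real TDE
    work averages over many knock-on directions and thermal snapshots.
    """
    e = e_min
    while e <= e_max:
        if creates_defect(e):
            return e          # first energy producing a stable defect
        e += step
    return None               # no defect found below e_max

tde = threshold_displacement_energy(lambda e: e >= 18.0)  # toy criterion
```

Repeating this scan per crystallographic site and direction is what yields the statistically representative, site-resolved TDEs reported above.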

    Late Amazonian-aged volcanic cones of explosive origin in Ceraunius fossae, Tharsis, Mars

    Get PDF
    Detailed volcanological studies continue to enhance our understanding of Martian eruptive styles and their associated volcanic products. Growing evidence points to mildly explosive eruptions as one of the eruption styles that contributed to the formation of distributed volcanic edifices in the Tharsis volcanic province. This highlights a complex and dynamic eruptive evolution during late Amazonian volcanism. Here, we report on the presence of small-scale, conical-shaped volcanic edifices located at the edge of Ceraunius Fossae in Tharsis. The association of the N-S aligned cones with a rough-surfaced lava flow enabled us to constrain the minimum age of their volcanic activity at ca. 48 Ma. Although they superficially resemble Martian scoria cones, their morphometric parameters indicate that they have a distinct and separate origin. They comprise coarser pyroclastic material such as spatter, and display an accumulation of likely volcanic bombs on the cones' slopes and at their bases, observable in the high-resolution images. Combining the sizes and distribution of the mapped individual volcanic bombs with a ballistic emplacement model enables us to calculate the exit velocity and maximum height for a given bomb density at a given launch angle. This provides a means to improve our understanding of ballistic trajectories and the distances over which pyroclastic material can be transported on Mars. Moreover, we argue that the portfolio of Martian volcanic edifices is more diverse than currently recognized. High-resolution remotely sensed volcanological mapping could provide critical information about volcanic products and, consequently, magma fragmentation, which depends on the eruptivity, controlled by magma composition and volatile contents.
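The ballistic emplacement reasoning above reduces, in its simplest drag-free form, to standard projectile formulas under Mars gravity. A minimal sketch (the paper's model also accounts for bomb density and, potentially, drag; this version neglects drag):

```python
import math

MARS_G = 3.71  # Martian surface gravity, m/s^2

def ballistic_range_and_height(v0, angle_deg, g=MARS_G):
    """Vacuum ballistic range and peak height for a volcanic bomb.

    Standard projectile motion: range = v0^2 sin(2*theta) / g and
    height = (v0 sin(theta))^2 / (2g). Drag is neglected, a rough first
    approximation in Mars' thin atmosphere, not the paper's full model.
    """
    theta = math.radians(angle_deg)
    rng = v0 ** 2 * math.sin(2 * theta) / g      # horizontal range, m
    height = (v0 * math.sin(theta)) ** 2 / (2 * g)  # maximum height, m
    return rng, height
```

Inverting these relations for a mapped bomb's landing distance is what yields the exit velocity estimates for a given launch angle.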

    Acceleration of the CASINO quantum Monte Carlo software using graphics processing units and OpenACC

    No full text
    We describe how quantum Monte Carlo calculations using the CASINO software can be accelerated using graphics processing units (GPUs) and OpenACC. In particular, we consider offloading Ewald summation, the evaluation of long-range two-body terms in the Jastrow correlation factor, and the evaluation of orbitals in a blip basis set. We present results for three- and two-dimensional homogeneous electron gases and ab initio simulations of bulk materials, showing that significant speedups of up to a factor of 2.5 can be achieved by the use of GPUs when several hundred particles are included in the simulations. The use of single-precision arithmetic can improve the speedup further without significant detriment to the accuracy of the calculations.
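A speedup bounded around 2.5× when only selected kernels are offloaded is consistent with Amdahl's law: the un-offloaded fraction of the runtime limits the overall gain. A quick illustrative estimate (the fractions below are made-up inputs, not figures from the paper):

```python
def amdahl_speedup(offload_fraction, device_speedup):
    """Amdahl's-law estimate of overall speedup when a fraction of the
    runtime (e.g. Ewald sums, Jastrow terms, blip orbitals) runs faster
    on the GPU while the remainder stays on the CPU.
    """
    return 1.0 / ((1.0 - offload_fraction) + offload_fraction / device_speedup)
```

For example, even a 10× kernel speedup applied to 80% of the runtime yields only about a 3.6× overall gain, so the serial remainder, not the GPU, becomes the bottleneck.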

    Predicting shoreline changes using deep learning techniques with Bayesian optimisation

    No full text
    Accurate prediction of shoreline change is vital for effective coastal planning and management, especially under increasing climate variability. This study explores the applicability of deep learning (DL) techniques, particularly Long Short-Term Memory (LSTM) and Convolutional Neural Network-LSTM (CNN-LSTM) models, for shoreline forecasting at monthly to inter-annual timescales, under two modelling approaches—direct input (DI) and autoregressive (AR). All models demonstrated the ability to reproduce temporal shoreline variability, with the autoregressive DL models performing better. Further, a noise impact assessment revealed that seasonal decomposition and noise filtering significantly enhanced model performance; in particular, models using 52-week data decomposition and residual noise reduction performed best. Reducing data noise also resulted in narrower ensemble prediction envelopes, indicating that the ensemble candidate models behave with low diversity. The temporal data resolution analysis showed that lower data resolutions reduce the predictive performance of the model, and at least fortnightly data are required to satisfactorily capture the trend of variability of the shoreline position at this beach. The use of ensemble predictions, derived from a selected subset of model trials based on their collective performance, proved beneficial by capturing diverse temporal behaviours, thereby offering a quasi-probabilistic forecast with minimal computational cost. Overall, the study underscores the potential of DL models, particularly with autoregressive architectures, for reliable and transferable shoreline change prediction. It also emphasizes the importance of data quality, resolution, and preprocessing in improving model robustness, laying the groundwork for future research into the use of DL in multi-scale shoreline predictions.
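The 52-week decomposition and residual noise-reduction step described above can be illustrated with a simple centred moving average that splits a weekly shoreline series into a smooth trend/seasonal component and a residual. This is a simplified stand-in for the study's actual decomposition procedure:

```python
def moving_average_detrend(series, window=52):
    """Split a weekly shoreline-position series into a smoothed component
    and a residual (noise) component using a centred moving average.

    A simplistic sketch of the 52-week decomposition idea: edges are
    handled by shrinking the window, and no explicit seasonal model is fit.
    """
    n = len(series)
    half = window // 2
    smooth = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        smooth.append(sum(series[lo:hi]) / (hi - lo))
    residual = [x - s for x, s in zip(series, smooth)]
    return smooth, residual
```

The DL models would then be trained on the smoothed (or noise-filtered) component, with the residual treated as noise to be suppressed.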

    The impact of foreign media on political mobilization during the Arab Spring

    Get PDF
    We investigate how foreign media influenced political mobilization during the Arab Spring in the Middle East and North Africa (MENA) region. Focusing on two prominent transnational networks, Al Jazeera and Al Arabiya, we use Arab Barometer survey data to track political mobilization and media use indicators in Jordan, Lebanon, and the Palestinian Territories. To address potential endogeneity, we use the frequency of lightning strikes and submarine cable seaquake shocks as instrumental variables, which help isolate exogenous variation in access to foreign media. Our results show that access to foreign media has a positive and statistically significant effect on political mobilization. A one-standard-deviation increase in foreign media access corresponds to a rise in the likelihood of participating in protests of approximately 6.5 percentage points, a gain of approximately 39% at the sample mean. We argue that this effect is primarily driven by the informational dimension of foreign media, rather than its ideological content.
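The instrumental-variables strategy above amounts to two-stage least squares: first predict media access from the instrument, then regress mobilization on the predicted values. A minimal one-instrument, no-controls sketch (the paper's specification includes survey controls and multiple instruments):

```python
def two_stage_least_squares(y, x, z):
    """Minimal 2SLS with one instrument and no controls.

    Stage 1: regress the endogenous regressor x (media access) on the
    instrument z (e.g. lightning-strike frequency).
    Stage 2: regress the outcome y (mobilization) on the fitted values,
    which carry only the exogenous variation in x.
    """
    n = len(y)

    def ols(a, b):  # slope and intercept of a ~ b
        mb, ma = sum(b) / n, sum(a) / n
        cov = sum((bi - mb) * (ai - ma) for ai, bi in zip(a, b))
        var = sum((bi - mb) ** 2 for bi in b)
        return cov / var, ma - (cov / var) * mb

    s1, c1 = ols(x, z)                   # first stage
    x_hat = [c1 + s1 * zi for zi in z]   # predicted (exogenous) media access
    s2, _ = ols(y, x_hat)                # second stage: the causal estimate
    return s2
```

With a valid instrument, the second-stage slope recovers the causal effect of media access even when x is correlated with unobserved determinants of y.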

    Optimising A Compact Dual-Particle Imager with A Novel Combination of Organic and Inorganic Scintillators

    Get PDF
    This work discusses the process of optimising the performance of a compact dual-particle imager with a fast response time of 60 s and a signal processing time of 60 s. The imaging concept relies on a radiation scattering technique with a neutron-scattering sub-system, a Compton-scattering sub-system and a thermal neutron absorption layer. The enhanced design proposed here reduces the signal processing time compared to earlier work. The time lag occurs mainly during neutron-gamma discrimination within the neutron-scattering sub-system. Hence, the optimisation here mainly targets investigating two promising dual-particle detectors, EJ-212 and EJ-276D, as possible replacements for the EJ-204 in the original device. The results indicated that EJ-212 and EJ-276D offer similar elastic scattering and escape probabilities of neutrons at energies less than 100 keV. At energies higher than 500 keV, the elastic scattering probabilities increase with thickness, reaching a maximum of around 24% in both detectors. As for gamma-ray photons, the results showed that the two detectors have similar total mass attenuation coefficients. Further investigation showed that EJ-212 exhibits a higher Compton scattering cross-section with respect to EJ-276D. Additionally, the results show that EJ-276D maintains the imager's intrinsic efficiency while offering built-in neutron-gamma discrimination abilities.
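The mass-attenuation comparison above rests on the standard exponential attenuation law, P = 1 − exp(−μρ · ρ · t). A minimal sketch; the coefficient and density values would come from tabulated data (e.g. NIST XCOM), and the numbers in the test below are placeholders:

```python
import math

def interaction_probability(mass_atten_coeff, density, thickness_cm):
    """Probability that a gamma-ray photon interacts within a scintillator
    slab, from the total mass attenuation coefficient (cm^2/g), material
    density (g/cm^3), and slab thickness (cm): P = 1 - exp(-mu * t).
    """
    mu = mass_atten_coeff * density          # linear attenuation, 1/cm
    return 1.0 - math.exp(-mu * thickness_cm)
```

Comparing this probability across candidate scintillators at the photon energies of interest is, in essence, what the total-mass-attenuation comparison of EJ-212 and EJ-276D does.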

    28,928

    full texts

    109,640

    metadata records
    Updated in last 30 days.
    Lancaster E-Prints is based in the United Kingdom.