An Efficient End-to-End Transformer with Progressive Tri-modal Attention for Multi-modal Emotion Recognition
Recent work on multi-modal emotion recognition has moved towards end-to-end
models, which can extract task-specific features supervised by the target
task, in contrast to the two-phase pipeline. However, previous methods only
model the feature interactions between the textual modality and either the
acoustic or the visual modality, failing to capture the interactions between
the acoustic and visual modalities. In this paper, we propose the multi-modal
end-to-end transformer (ME2ET), which can effectively model the tri-modal
feature interactions among the textual, acoustic, and visual modalities at
both the low level and the high level. At the low level, we propose
progressive tri-modal attention, which models the tri-modal feature
interactions with a two-pass strategy and further exploits these interactions
to significantly reduce computation and memory complexity by shortening the
input token sequence. At the high level, we introduce a tri-modal feature
fusion layer that explicitly aggregates the semantic representations of the
three modalities. Experimental results on the CMU-MOSEI and IEMOCAP datasets
show that ME2ET achieves state-of-the-art performance. Further in-depth
analysis demonstrates the effectiveness, efficiency, and interpretability of
the proposed progressive tri-modal attention, which helps our model achieve
better performance while significantly reducing computation and memory cost.
Our code will be made publicly available.
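As a rough illustration of the cross-modal interaction this abstract describes, the sketch below implements plain scaled dot-product attention between two modalities in NumPy. The function name, shapes, and the one-direction pass shown are illustrative assumptions; they are not the paper's actual progressive tri-modal attention or its token-reduction scheme.

```python
import numpy as np

def cross_modal_attention(queries, keys, values):
    """Scaled dot-product attention: tokens of one modality attend to another."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)          # (Tq, Tk) similarity
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over key tokens
    return weights @ values                         # (Tq, d) fused features

rng = np.random.default_rng(0)
acoustic = rng.normal(size=(6, 16))   # 6 acoustic tokens, dim 16
visual   = rng.normal(size=(8, 16))   # 8 visual tokens, dim 16

# One direction of a two-pass scheme: acoustic tokens attend to visual tokens.
fused = cross_modal_attention(acoustic, visual, visual)
print(fused.shape)  # (6, 16)
```

Note that the output length follows the query sequence, which is why attending with a shorter query sequence shrinks the downstream token count and hence cost.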
Accelerated Particle Swarm Optimization and Support Vector Machine for Business Optimization and Applications
Business optimization is becoming increasingly important because all business
activities aim to maximize the profit and performance of products and services,
under limited resources and appropriate constraints. Recent developments in
support vector machine and metaheuristics show many advantages of these
techniques. In particular, particle swarm optimization is now widely used in
solving tough optimization problems. In this paper, we use a combination of a
recently developed Accelerated PSO and a nonlinear support vector machine to
form a framework for solving business optimization problems. We first apply the
proposed APSO-SVM to production optimization, and then use it for income
prediction and project scheduling. We also carry out some parametric studies
and discuss the advantages of the proposed metaheuristic SVM framework.
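Accelerated PSO, in the commonly cited formulation that drops per-particle velocities and pulls every particle toward the current global best with a decaying random perturbation, can be sketched as follows. Parameter names and values here are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def apso_minimize(f, dim, n_particles=30, iters=200,
                  alpha0=0.5, gamma=0.95, beta=0.3, seed=0):
    """Accelerated PSO sketch: x <- (1-beta)*x + beta*g + alpha*noise,
    with the noise scale alpha shrinking geometrically each iteration."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, size=(n_particles, dim))
    g = x[np.argmin([f(p) for p in x])]        # current global best
    for t in range(iters):
        alpha = alpha0 * gamma**t              # decaying step size
        x = (1 - beta) * x + beta * g + alpha * rng.normal(size=x.shape)
        best = x[np.argmin([f(p) for p in x])]
        if f(best) < f(g):                     # keep g monotonically improving
            g = best
    return g

sphere = lambda p: float(np.sum(p**2))         # toy objective, minimum at 0
g = apso_minimize(sphere, dim=2)
print(sphere(g))  # small value near 0
```

In the hybrid framework described above, a loop like this would tune the SVM hyperparameters, with `f` scoring a validation metric rather than a toy function.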
Spectral sensitivity near exceptional points as a resource for hardware encryption
The spectral sensitivity near exceptional points (EPs) has recently been explored as an avenue for building sensors with enhanced sensitivity. However, to date, it is not clear whether this class of sensors does indeed outperform traditional sensors in terms of signal-to-noise ratio. In this work, we investigate the spectral sensitivity associated with EPs under a different lens and propose to utilize it as a resource for hardware security. In particular, we introduce a physically unclonable function (PUF) based on analogue electronic circuits that benefits from the drastic eigenvalue bifurcation near a divergent exceptional point to enhance the stochastic entropy caused by inherent parameter fluctuations in electronic components. This in turn yields a strong entropy source for the generation of encryption keys encoded in analogue electrical signals. This lightweight and robust analogue-PUF structure may lead to a variety of security and anti-counterfeiting applications in radio-frequency fingerprinting and wireless communications.
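The square-root eigenvalue bifurcation that such a PUF exploits can be seen in a minimal two-mode gain/loss model. The 2x2 matrix below is a textbook illustration, not the paper's circuit: at coupling kappa equal to the gain/loss rate gamma the eigenvalues coalesce (the EP), and a relative detuning eps splits them in proportion to sqrt(eps), so tiny component fluctuations produce large, easily read-out spectral shifts.

```python
import numpy as np

def eig_split(gamma, kappa):
    """Eigenvalue splitting of the gain/loss system H = [[i*g, k], [k, -i*g]].
    Eigenvalues are +/- sqrt(k**2 - g**2); they coalesce at k == g (the EP)."""
    H = np.array([[1j * gamma, kappa], [kappa, -1j * gamma]])
    lam = np.linalg.eigvals(H)
    return abs(lam[0] - lam[1])

gamma = 1.0                              # EP located at kappa == gamma
for eps in (1e-2, 1e-4, 1e-6):
    s = eig_split(gamma, gamma * (1 + eps))
    print(eps, s)                        # splitting scales like sqrt(eps)
```

Reducing eps by a factor of 100 shrinks the splitting by only about a factor of 10, which is the sensitivity enhancement (and, here, the entropy amplification) near the EP.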
Process Chart for Controlling Wafer Defects using Fuzzy Theory
Manufacturers of integrated circuits (ICs) frequently use c-charts to monitor wafer defects. The clustering of wafer defects increases with the surface area of the wafers, and this clustering causes the Poisson-based c-chart to raise many false alarms. Although a Neyman-based c-chart has been developed to reduce the number of false alarms, it has some shortcomings in practical use. This study presents a process control chart that applies fuzzy theory and engineering experience to monitor clustered defects on a wafer. The proposed method is simpler and more efficient than the Neyman-based c-chart. A case study of an IC company in Taiwan demonstrates the effectiveness of the proposed method.
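For context, the baseline Poisson c-chart that this abstract says misfires under defect clustering sets its centre line at the mean defect count and its control limits three Poisson standard deviations away. A minimal sketch with toy data (not from the study):

```python
import math

def c_chart_limits(defect_counts):
    """Classical Poisson c-chart: CL = mean count, limits at CL +/- 3*sqrt(CL)."""
    c_bar = sum(defect_counts) / len(defect_counts)
    sigma = math.sqrt(c_bar)             # Poisson: variance equals the mean
    ucl = c_bar + 3 * sigma
    lcl = max(0.0, c_bar - 3 * sigma)    # counts cannot go below zero
    return lcl, c_bar, ucl

counts = [4, 7, 5, 6, 3, 8, 5, 4, 6, 2]  # defects per wafer (toy data)
lcl, cl, ucl = c_chart_limits(counts)
print(lcl, cl, ucl)  # 0.0 5.0 ~11.71
```

When defects cluster, the true count variance exceeds the Poisson assumption of variance equal to the mean, so these limits are too tight and in-control wafers breach the UCL, which is the false-alarm problem the fuzzy chart addresses.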
Understanding the Behaviour of Silver as a Low Friction Coating in Aerospace Fasteners
Nuts and bolts used in aero-engines are manufactured from heat-resistant super-alloys. When used in a like on like couple, these materials have a high coefficient of friction, and frequently seizure occurs. In order to prevent this, a silver coating is applied to the nut threads, providing a low friction boundary at the interface. Additionally, a radial crimp is applied to the nut, in order to provide a self-locking feature preventing vibration self-loosening.
In this study, the coefficient of friction of the thread contact will be investigated both during initial joint assembly, and after thermal ageing. Additionally, a finite element model will be employed to investigate the contact mechanics as a consequence of the crimp.
The low coefficient of friction observed during initial assembly was found to be a consequence of shear flow of the silver coating, with this value approximately doubling after the coating had aged. Areas of silver removal were found to coincide with areas of high contact pressure in the joint, attributable to the crimp feature.
AutoML-GPT: Large Language Model for AutoML
With the emerging trend of GPT models, we have established a framework called
AutoML-GPT that integrates a comprehensive set of tools and libraries. This
framework grants users access to a wide range of data preprocessing techniques,
feature engineering methods, and model selection algorithms. Through a
conversational interface, users can specify their requirements, constraints,
and evaluation metrics. Throughout the process, AutoML-GPT employs advanced
techniques for hyperparameter optimization and model selection, ensuring that
the resulting model achieves optimal performance. The system effectively
manages the complexity of the machine learning pipeline, guiding users towards
the best choices without requiring deep domain knowledge. Through our
experimental results on diverse datasets, we have demonstrated that AutoML-GPT
significantly reduces the time and effort required for machine learning tasks.
Its ability to leverage the vast knowledge encoded in large language models
enables it to provide valuable insights, identify potential pitfalls, and
suggest effective solutions to common challenges faced during model training
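The model-selection step that a framework like this automates can be illustrated with a toy hold-out comparison. The candidate models and data below are hypothetical stand-ins for illustration only; they are not AutoML-GPT's actual pipeline or API.

```python
import numpy as np

def holdout_select(models, X, y, frac=0.25, seed=0):
    """Fit each candidate on a training split, keep the one with the best
    held-out accuracy -- a toy stand-in for automated model selection."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_val = int(len(X) * frac)
    val, train = idx[:n_val], idx[n_val:]
    scores = {}
    for name, fit in models.items():
        predict = fit(X[train], y[train])
        scores[name] = float(np.mean(predict(X[val]) == y[val]))
    return max(scores, key=scores.get), scores

def fit_majority(X, y):
    """Baseline: always predict the most frequent training label."""
    label = np.bincount(y).argmax()
    return lambda Xq: np.full(len(Xq), label)

def fit_centroid(X, y):
    """Nearest-centroid classifier: assign the class with the closest mean."""
    cents = np.array([X[y == c].mean(axis=0) for c in np.unique(y)])
    return lambda Xq: np.argmin(
        ((Xq[:, None, :] - cents[None]) ** 2).sum(-1), axis=1)

# Two well-separated Gaussian blobs as toy data.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(3, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
best, scores = holdout_select({"majority": fit_majority,
                               "centroid": fit_centroid}, X, y)
print(best)  # "centroid"
```

A real AutoML system layers hyperparameter search, preprocessing choices, and richer validation schemes on top of this same compare-and-keep-the-best loop.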
North Dakota Economic-Demographic Assessment Model (NEDAM): Technical Description
This report describes the logic, structure, databases, and operational procedures of the North Dakota Economic-Demographic Assessment Model (NEDAM).