Deep generative models for network data synthesis and monitoring
Measurement and monitoring are fundamental tasks in all networks, enabling the downstream management and optimization of the network. Although networks inherently generate abundant monitoring data, accessing and effectively measuring that data is another story. The challenges exist in many aspects. First, network monitoring data is inaccessible to external users, and it is hard to provide a high-fidelity dataset without leaking commercially sensitive information. Second, effective data collection covering a large-scale network system can be very expensive, given the growing size of networks, e.g., the number of cells in a radio network or the number of flows in an Internet Service Provider (ISP) network. Third, it is difficult to ensure fidelity and efficiency simultaneously in network monitoring, as the resources available in network elements to support measurement functions are too limited to implement sophisticated mechanisms. Finally, understanding and explaining the behavior of the network becomes challenging due to its size and complex structure. Various emerging optimization-based solutions (e.g., compressive sensing) and data-driven solutions (e.g., deep learning) have been proposed for these challenges. However, the fidelity and efficiency of existing methods cannot yet meet current network requirements.
The contributions made in this thesis significantly advance the state of the art in network measurement and monitoring. Throughout, we leverage a cutting-edge machine learning technology: deep generative modeling. First, we design and realize APPSHOT, an efficient city-scale network traffic sharing system based on a conditional generative model, which requires only open-source contextual data (e.g., land use information and population distribution) during inference. Second, we develop GENDT, an efficient drive testing system based on a generative model, which combines graph neural networks, conditional generation, and quantified model uncertainty to enhance the efficiency of mobile drive testing. Third, we design and implement DISTILGAN, a high-fidelity, efficient, versatile, and real-time network telemetry system built on latent GANs and spectral-temporal networks. Finally, we propose SPOTLIGHT, an accurate, explainable, and efficient anomaly detection system for the Open RAN (Radio Access Network). The lessons learned through this research are summarized, and interesting topics are discussed for future work in this domain. All proposed solutions have been evaluated on real-world datasets and applied to support different applications in real systems.
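The conditional-generation idea behind a system like APPSHOT, sampling plausible traffic volumes given only contextual features, can be sketched minimally. Everything below (the feature names, the linear conditioning, the noise model, the weights) is a hypothetical toy illustration, not the thesis's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

def conditional_traffic_sampler(context, weights, noise_scale=0.1, n_samples=5):
    """Sample plausible traffic volumes conditioned on contextual features.

    context: contextual features, e.g. [land_use_density, population] (toy)
    weights: a stand-in for a learned context-to-traffic mapping (toy)
    """
    mean = context @ weights                       # conditional mean traffic
    eps = rng.normal(0.0, noise_scale, n_samples)  # generative noise
    return mean * (1.0 + eps)                      # samples around the mean

# toy context: normalized land-use density and population for one city cell
context = np.array([0.6, 0.8])
weights = np.array([50.0, 120.0])   # hypothetical "learned" weights
samples = conditional_traffic_sampler(context, weights)
print(samples.shape)  # (5,)
```

A real conditional generative model would replace the linear map and Gaussian noise with a learned network, but the interface is the same: contextual data in, a distribution of plausible traffic out.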
AI-based design methodologies for hot form quench (HFQ®)
This thesis aims to develop advanced design methodologies that fully exploit the capabilities of the Hot Form Quench (HFQ®) stamping process in stamping complex geometric features in high-strength aluminium alloy structural components. While previous research has focused on material models for FE simulations, these simulations are not suitable for early-phase design due to their high computational cost and expertise requirements. This project has two main objectives: first, to develop design guidelines for the early-stage design phase; and second, to create a machine learning-based platform that can optimise 3D geometries under hot stamping constraints, for both early and late-stage design. With these methodologies, the aim is to facilitate the incorporation of HFQ capabilities into component geometry design, enabling the full realisation of its benefits.
To achieve the objectives of this project, two main efforts were undertaken. Firstly, the analysis of aluminium alloys for stamping deep corners was simplified by identifying the effects of corner geometry and material characteristics on post-form thinning distribution. New equation sets were proposed to model trends and design maps were created to guide component design at early stages. Secondly, a platform was developed to optimise 3D geometries for stamping, using deep learning technologies to incorporate manufacturing capabilities. This platform combined two neural networks: a geometry generator based on Signed Distance Functions (SDFs), and an image-based manufacturability surrogate model. The platform used gradient-based techniques to update the inputs to the geometry generator based on the surrogate model's manufacturability information. The effectiveness of the platform was demonstrated on two geometry classes, Corners and Bulkheads, with five case studies conducted to optimise under post-stamped thinning constraints. Results showed that the platform allowed for free morphing of complex geometries, leading to significant improvements in component quality.
The research outcomes represent a significant contribution to the field of technologically advanced manufacturing methods and offer promising avenues for future research. The developed methodologies provide practical solutions for designers to identify optimal component geometries, ensuring manufacturing feasibility and reducing design development time and costs. The potential applications of these methodologies extend to real-world industrial settings and can significantly contribute to the continued advancement of the manufacturing sector.
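The optimisation loop described above, a geometry generator whose input is updated by gradients flowing from a manufacturability surrogate, can be sketched with scalar toy stand-ins. Both functions below are invented for illustration (a real SDF generator and surrogate network would replace them), and the thinning target is arbitrary:

```python
import numpy as np

# Hypothetical stand-ins for the two networks in the platform: a geometry
# "generator" mapping a latent code to a corner radius, and a "surrogate"
# mapping that radius to predicted post-form thinning (%).
def generator(z):
    return 5.0 + 10.0 / (1.0 + np.exp(-z))   # radius constrained to (5, 15) mm

def surrogate_thinning(radius):
    return 30.0 / radius                      # more thinning at tighter radii

def optimise_latent(z, target_thinning=2.5, lr=0.5, steps=200, h=1e-4):
    """Gradient descent on the latent code via finite-difference gradients."""
    loss = lambda zz: (surrogate_thinning(generator(zz)) - target_thinning) ** 2
    for _ in range(steps):
        grad = (loss(z + h) - loss(z - h)) / (2 * h)
        z -= lr * grad
    return z

z_opt = optimise_latent(0.0)
print(round(surrogate_thinning(generator(z_opt)), 3))  # ~2.5, constraint met
```

The platform in the thesis works the same way at a much larger scale: gradients from the surrogate's manufacturability prediction morph the generator's input until the stamping constraint is satisfied.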
2023-2024 Catalog
The 2023-2024 Governors State University Undergraduate and Graduate Catalog is a comprehensive listing of current information regarding: degree requirements, course offerings, and undergraduate and graduate rules and regulations.
Unsupervised CT Metal Artifact Reduction by Plugging Diffusion Priors in Dual Domains
During the process of computed tomography (CT), metallic implants often cause
disruptive artifacts in the reconstructed images, impeding accurate diagnosis.
Several supervised deep learning-based approaches have been proposed for
metal artifact reduction (MAR). However, these methods rely heavily on training
with simulated data, as obtaining paired metal artifact CT and clean CT data in
clinical settings is challenging. This limitation can lead to decreased
performance when applying these methods in clinical practice. Existing
unsupervised MAR methods, whether based on learning or not, typically operate
within a single domain, either in the image domain or the sinogram domain. In
this paper, we propose an unsupervised MAR method based on the diffusion model,
a generative model with a high capacity to represent data distributions.
Specifically, we first train a diffusion model using CT images without metal
artifacts. Subsequently, we iteratively utilize the priors embedded within the
pre-trained diffusion model in both the sinogram and image domains to restore
the degraded portions caused by metal artifacts. This dual-domain processing
empowers our approach to outperform existing unsupervised MAR methods,
including another diffusion-model-based MAR method, as we validate
qualitatively and quantitatively on synthetic datasets. Moreover, our method
demonstrates superior visual results compared to both supervised and
unsupervised methods on clinical datasets.
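The dual-domain idea, alternating a prior step in the image domain with a data-consistency step in the sinogram domain while skipping the metal-corrupted measurements, can be sketched with toy stand-ins. Here a random linear operator replaces the Radon transform and a hand-written smoothness score replaces the pretrained diffusion prior; nothing below is the paper's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D stand-ins: a random linear operator plays the Radon transform and a
# smoothness score plays the diffusion prior. The real method plugs a learned
# diffusion model's prior into both domains instead.
n = 32
A = rng.normal(size=(96, n)) / np.sqrt(n)   # toy "sinogram" operator
x_true = np.sin(np.linspace(0, 3, n))       # clean image
sino = A @ x_true                           # ideal sinogram
metal_mask = np.ones(96)
metal_mask[10:20] = 0.0                     # metal-corrupted detector bins

def smoothness_score(x):
    """Stand-in for the diffusion prior's score: pulls x toward smoothness."""
    return np.roll(x, 1) - 2 * x + np.roll(x, -1)

x = np.zeros(n)
for _ in range(500):
    # sinogram-domain consistency, restricted to uncorrupted bins
    residual = metal_mask * (A @ x - sino)
    x -= 0.05 * (A.T @ residual)
    # image-domain prior step
    x += 0.01 * smoothness_score(x)

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {rel_err:.3f}")
```

The recoverable structure in the uncorrupted sinogram bins plus the image-domain prior is enough to restore the degraded portions, which is the mechanism the abstract describes.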
Analytical validation of innovative magneto-inertial outcomes: a controlled environment study.
Exploiting Process Algebras and BPM Techniques for Guaranteeing Success of Distributed Activities
Communications and collaborations among activities, processes, or systems in general are the basis of the complex systems known as distributed systems. Given the increasing complexity of their structure, interactions, and functionalities, many research areas are interested in providing modelling techniques and verification capabilities to guarantee their correctness and the satisfaction of properties. In particular, the formal methods community provides robust verification techniques to prove system properties. However, most approaches rely on manually designed formal models, making the analysis process challenging because it requires an expert in the field. On the other hand, the BPM community provides a widely used graphical notation (i.e., BPMN) to design the internal behaviour and interactions of complex distributed systems, which can be enhanced with additional features (e.g., privacy technologies). Furthermore, BPM uses process mining techniques to automatically discover these models from observed events. However, verifying properties and expected behaviour, especially in collaborations, still needs a solid methodology.
This thesis aims to exploit the features of the formal methods and BPM communities to provide approaches that enable formal verification over distributed systems. In this context, we propose two approaches. The modelling-based approach starts from BPMN models and produces process algebra specifications to enable formal verification of system properties, including privacy-related ones. The process mining-based approach starts from log observations to automatically generate process algebra specifications that enable verification capabilities.
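The core step of the modelling-based approach, mapping BPMN elements to process algebra terms, can be illustrated on a toy fragment. The mapping rules below (sequence flow to action prefixing, exclusive gateway to choice) are deliberate simplifications for illustration, not the thesis's actual encoding:

```python
# Toy BPMN fragment as nested tuples: ("seq", ...), ("xor", ...), or a task name.
def to_process_algebra(node):
    """Translate a toy BPMN tree into a CCS-style process term string."""
    if isinstance(node, str):                      # task -> action prefix
        return node
    kind, *children = node
    parts = [to_process_algebra(c) for c in children]
    if kind == "seq":                              # sequence flow -> prefixing
        return ".".join(parts)
    if kind == "xor":                              # exclusive gateway -> choice
        return "(" + " + ".join(parts) + ")"
    raise ValueError(f"unknown element: {kind}")

model = ("seq", "receive_order", ("xor", "approve", "reject"), "archive")
print(to_process_algebra(model))  # receive_order.(approve + reject).archive
```

Once a BPMN model (designed by hand or discovered from logs) is compiled into such terms, standard process algebra tooling can verify properties over it, which is exactly the pipeline both proposed approaches feed into.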
Conformance Checking-based Concept Drift Detection in Process Mining
One of the main challenges of process mining is to obtain
models that represent a process as simply and accurately as
possible. Both characteristics can be greatly influenced by
changes in the control flow of the process throughout its life
cycle.
In this thesis we propose the use of conformance metrics to monitor such changes in a way that allows dividing the log into sub-logs representing different versions of the process over time. The validity of this hypothesis has been formally demonstrated, showing that all kinds of changes in the process flow can be captured with these approaches, including sudden and gradual drifts in both clean and noisy environments, where differentiating between anomalous executions and real changes can be tricky.
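The monitoring idea, computing a conformance metric over sliding windows of the log and splitting where it degrades, can be sketched with a deliberately crude fitness measure. The set-membership "fitness" below is a toy stand-in for real conformance checking, and the threshold is arbitrary:

```python
# Toy log: each trace is a tuple of activities. The "model" accepts a fixed set
# of variants; window fitness is the fraction of conforming traces in it.
def window_fitness(log, model_variants, size):
    """Conformance fitness of each sliding window of `size` consecutive traces."""
    return [sum(t in model_variants for t in log[i:i + size]) / size
            for i in range(len(log) - size + 1)]

def detect_drift(fitness, threshold=0.5):
    """Index of the first window whose fitness falls below the threshold."""
    return next((i for i, f in enumerate(fitness) if f < threshold), None)

variants = {("a", "b", "c")}
log = [("a", "b", "c")] * 20 + [("a", "c", "b")] * 20   # sudden drift at trace 20
fitness = window_fitness(log, variants, size=10)
drift_at = detect_drift(fitness)
print(drift_at)  # 16: the first window dominated by the new behaviour
```

Windows straddling the change point show intermediate fitness, which is why a threshold (or a change-point test on the fitness series) is needed to separate isolated anomalous executions from a real new process version, the difficulty the abstract highlights.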