Generation Alpha: Understanding the Next Cohort of University Students
Technology is changing at a blistering pace and is reshaping the way we regard knowledge as a free commodity, along with the ability to apply skills, concepts and understandings. Technology is shaping the way the world is evolving, and its contributions to education are no exception. While technological advances will play a crucial part in future teaching-learning approaches, educators will also be challenged by the next higher-education generation, Generation Alpha. This entrepreneurial generation will embrace innovation, progressiveness and advancement, with the expectation that one in two Generation Alphas will obtain a university degree. In anticipating the educational challenges and opportunities of the future higher-education environment, this research reflected on Generation Alpha as the next cohort of university students, considering their preferred learning styles, perceptions and expectations relating to education. The research employed a theoretical analysis based on the characteristics and traits that distinguish Generation Alpha, spearheaded by technological advances. The empirical investigation considered three independent studies previously conducted by authors from Slovakia, Hungary, Australia and Turkey to understand the challenges and opportunities pertaining to Generation Alpha. The research identified the influence of social media, social connections, high levels of perception and Generation Alpha's ability to interpret information as strengths to consider in future teaching-learning approaches in the higher-education environment. The research concluded with recommendations on how universities could be transformed to ensure a better learning experience for Generation Alpha students, aligned with their characteristics, perceptions and expectations.
Nitrided Ferroalloy Production By Metallurgical SHS Process: Scientific Foundations and Technology
The main objective of this paper is to present the results of research into the development of a specialized self-propagating high-temperature synthesis (SHS) technology for ferroalloy composites, as applied to steelmaking. The problem of creating such a production cycle has been solved by developing a new approach to the practical implementation of self-propagating high-temperature synthesis, as applied to metallurgy. The metallurgical variation of SHS is based on the use of different metallurgical alloys (including waste in the form of dust from ferroalloy production) as the basic raw materials of the new process. Here, synthesis by combustion is realized through exothermic exchange reactions. The process produces a composite based on inorganic compounds with a binder of iron and/or an iron-based alloy. It has been shown that, in terms of the aggregate state of the initial reagents, metallurgical SHS processes are either gasless or gas-absorbing, and the combustion regimes differ significantly when realized in practice. To organize the metallurgical SHS process in weakly exothermic systems, different variations of the thermal trimming principle are used. In the present study, the self-propagating high-temperature synthesis of ferrovanadium nitride, ferrochromium nitride and ferrosilicon nitride, which are widely used in steel alloying, was investigated.
Keywords: self-propagating high-temperature synthesis (SHS); composite ferroalloys; nitrides; borides; filtration combustion; ferrovanadium nitride; ferrochromium nitride; ferrosilicon nitride
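The practicality of such a synthesis route is usually judged from the adiabatic combustion temperature of the charge. The short sketch below illustrates that estimate with a simple zero-loss energy balance; the heat-release and heat-capacity numbers are placeholders for illustration only, not measured data for the nitrided ferroalloy systems discussed in the paper.

```python
# Illustrative estimate of the adiabatic combustion temperature (T_ad) of an
# exothermic charge: the temperature the products would reach if all reaction
# heat stayed in the mixture (no losses, no phase changes, constant heat
# capacity). This is a common first check of whether a mixture can sustain an
# SHS combustion wave. All numbers are placeholders, not measured values.

def adiabatic_temperature(t0_k: float, reaction_heat_j_per_kg: float,
                          cp_j_per_kg_k: float) -> float:
    """Return T_ad in kelvin from the zero-loss balance Q = cp * (T_ad - T0)."""
    return t0_k + reaction_heat_j_per_kg / cp_j_per_kg_k

if __name__ == "__main__":
    T0 = 298.0        # initial charge temperature, K
    Q = 2.0e6         # placeholder exothermic heat release, J per kg of charge
    CP = 800.0        # placeholder mean heat capacity of products, J/(kg*K)

    print(f"Estimated adiabatic temperature: {adiabatic_temperature(T0, Q, CP):.0f} K")

    # An often-cited rule of thumb is that a self-sustaining SHS wave needs
    # T_ad of very roughly 1800 K or more; mixtures that fall short are the
    # "weakly exothermic systems" for which thermal trimming (preheating the
    # charge or blending in a strongly exothermic additive) is applied.
    Q_lean = 0.8e6    # placeholder heat release of a leaner mixture
    print(f"Lean mixture estimate: {adiabatic_temperature(T0, Q_lean, CP):.0f} K")
```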
Deep Learning of Atomically Resolved Scanning Transmission Electron Microscopy Images: Chemical Identification and Tracking Local Transformations
Recent advances in scanning transmission electron and scanning probe microscopies have opened exciting opportunities for probing materials' structural parameters and various functional properties in real space with angstrom-level precision. This progress has been accompanied by an exponential increase in the size and quality of datasets produced by microscopic and spectroscopic experimental techniques. These developments necessitate adequate methods for extracting relevant physical and chemical information from large datasets for which a priori information on the structures of various atomic configurations and lattice defects is limited or absent. Here we demonstrate an application of deep neural networks to extract information from atomically resolved images, including the location of atomic species and the type of defects. We develop a 'weakly supervised' approach that uses information on the coordinates of all atomic species in the image, extracted via a deep neural network, to identify a rich variety of defects that are not part of the initial training set. We further apply our approach to interpret complex atomic and defect transformations, including switching between different coordinations of silicon dopants in graphene as a function of time, the formation of a peculiar silicon dimer with mixed 3-fold and 4-fold coordination, and the motion of a molecular 'rotor'. This deep learning based approach resembles the logic of a human operator but can be scaled up, leading to a significant shift in the way information is extracted and analyzed from raw experimental data.
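As a rough illustration of the kind of pipeline described here, the sketch below (not the authors' actual network) shows a small fully convolutional model in PyTorch that maps an atomically resolved image to per-pixel class probabilities, one channel per atomic species plus background, and then reads atomic coordinates out as local maxima of those maps. The architecture, species count and threshold are illustrative assumptions.

```python
# A minimal sketch (not the authors' model) of a fully convolutional network
# that maps an atomically resolved STEM image to per-pixel class maps, one
# channel per atomic species plus background. Atom coordinates are then read
# out as local maxima of the class maps.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AtomFinder(nn.Module):
    """Small encoder-decoder that preserves the input spatial resolution."""
    def __init__(self, n_species: int = 2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(64, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, n_species + 1, 1),  # +1 for the background channel
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))  # logits, shape (B, C, H, W)

def atom_coordinates(prob_map: torch.Tensor, threshold: float = 0.5):
    """Extract (row, col) positions of local maxima above `threshold`
    from a single-species probability map of shape (H, W)."""
    pooled = F.max_pool2d(prob_map[None, None], 3, stride=1, padding=1)[0, 0]
    peaks = (prob_map == pooled) & (prob_map > threshold)
    return peaks.nonzero(as_tuple=False)  # (N, 2) integer pixel coordinates

if __name__ == "__main__":
    model = AtomFinder(n_species=2)          # e.g. host lattice + dopant
    image = torch.randn(1, 1, 128, 128)      # stand-in for a STEM frame
    probs = model(image).softmax(dim=1)[0]   # (3, 128, 128) class probabilities
    dopant_coords = atom_coordinates(probs[1])
    print(dopant_coords.shape)
```

Once per-species coordinates are available, nearest-neighbour counts around each atom give its local coordination, which is the kind of derived quantity that allows defect configurations absent from the original training set to be flagged.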
Towards Lightweight Data Integration using Multi-workflow Provenance and Data Observability
Modern large-scale scientific discovery requires multidisciplinary collaboration across diverse computing facilities, including High Performance Computing (HPC) machines and the Edge-to-Cloud continuum. Integrated data analysis plays a crucial role in scientific discovery, especially in the current AI era, by enabling Responsible AI development, FAIR, Reproducibility, and User Steering. However, the heterogeneous nature of science poses challenges such as dealing with multiple supporting tools, cross-facility environments, and efficient HPC execution. Building on data observability, adapter system design, and provenance, we propose MIDA: an approach for lightweight runtime Multi-workflow Integrated Data Analysis. MIDA defines data observability strategies and adaptability methods for various parallel systems and machine learning tools. With observability, it intercepts the dataflows in the background without requiring instrumentation while integrating domain, provenance, and telemetry data at runtime into a unified database ready for user steering queries. We conduct experiments showing end-to-end multi-workflow analysis integrating data from Dask and MLFlow in a real distributed deep learning use case for materials science that runs on multiple environments with up to 276 GPUs in parallel. We show near-zero overhead running up to 100,000 tasks on 1,680 CPU cores on the Summit supercomputer.
Comment: 10 pages, 5 figures, 2 listings, 42 references. Paper accepted at IEEE eScience'2
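To make the idea of non-intrusive data observability concrete, the sketch below (not the MIDA implementation) shows one way a background adapter could intercept a workflow's outputs without instrumenting the workflow itself: it polls an output directory and writes a provenance row plus simple telemetry for each new artifact into a single SQLite database that steering queries can join across workflows. The directory convention, schema and class names are hypothetical.

```python
# A minimal sketch (not the MIDA implementation) of a background "observability
# adapter": it watches a directory a workflow writes to and, without touching
# the workflow's code, records one provenance row per new file, plus simple
# telemetry, in a single queryable SQLite database. Schema and naming are
# hypothetical.
import os
import sqlite3
import threading
import time

DB_SCHEMA = """
CREATE TABLE IF NOT EXISTS observations (
    workflow    TEXT,     -- which workflow produced the artifact
    artifact    TEXT,     -- path of the intercepted output file
    size_bytes  INTEGER,  -- simple telemetry: artifact size
    captured_at REAL      -- wall-clock capture time (epoch seconds)
);
"""

class DirectoryObserver(threading.Thread):
    """Polls `watch_dir` in the background and logs new files to SQLite."""
    def __init__(self, workflow: str, watch_dir: str, db_path: str,
                 interval: float = 1.0):
        super().__init__(daemon=True)
        self.workflow, self.watch_dir = workflow, watch_dir
        self.db_path, self.interval = db_path, interval
        self.seen: set[str] = set()
        self.running = True

    def run(self):
        conn = sqlite3.connect(self.db_path)
        conn.executescript(DB_SCHEMA)
        while self.running:
            for name in os.listdir(self.watch_dir):
                path = os.path.join(self.watch_dir, name)
                if path not in self.seen and os.path.isfile(path):
                    self.seen.add(path)
                    conn.execute(
                        "INSERT INTO observations VALUES (?, ?, ?, ?)",
                        (self.workflow, path, os.path.getsize(path), time.time()),
                    )
                    conn.commit()
            time.sleep(self.interval)
        conn.close()

if __name__ == "__main__":
    # Hypothetical usage: one observer per workflow, all writing to the same
    # database so cross-workflow steering queries can join their records.
    os.makedirs("./outputs", exist_ok=True)
    obs = DirectoryObserver("training-run", "./outputs", "integrated.db")
    obs.start()
    time.sleep(5)        # ... the uninstrumented workflow runs meanwhile ...
    obs.running = False
    obs.join()
```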