Comparing XR And Digital Flipped Methods To Meet Learning Objectives
Digital learning has become increasingly important over the last decade as students and educators adopt new types of technology to keep up with emerging trends. The Covid-19 pandemic accelerated this rate of change in the higher education sector, making remote laboratory experiences and video conferencing increasingly normal. In the wake of this transition, the priority is to understand how these technologies can be blended into existing teaching methodologies in a complementary way that enhances the student’s pedagogical experience. The upcoming study will compare three digital learning simulations to see which has the most beneficial effect on practical student laboratory experiences. Engineering students will be exposed to one of three forms of digital “pre-lab” laboratory simulation, and their academic performance will be assessed following a physical laboratory session. The three forms are a 2D photography-based “iLabs” simulation, a web-based “low-fidelity” simulator, and a Unity-based immersive virtual reality (iVR) lab simulator. All three methods are based on the same empirically derived data. As a control, another group of students will not receive a pre-lab simulation, just a standard pre-lab quiz. The study methods will first be tested in a small-scale preliminary study with a smaller cohort of students to optimise the experience ahead of the main work. This research will build upon existing work in the field of virtual labs, which indicates that these experiences can help reinforce student learning outcomes, whilst also unpicking the complex relationship between simulation immersion, fidelity and memory recall in a learning context. In addition, the study will give an opportunity to perform a detailed cost versus pedagogical impact assessment, as each of these simulations has been designed and built from the ground up by the authors.
Synergistic carbon metabolism in a fast growing mixotrophic freshwater microalgal species Micractinium inermum
In recent years, microalgae have attracted significant interest as a potential source of sustainable biofuel. Mixotrophic microalgae are able to photosynthesise while simultaneously assimilating and metabolising organic carbon. By combining autotrophic and heterotrophic metabolic pathways, biomass productivity can be significantly increased. In this study, acetate-fed mixotrophic Micractinium inermum cultures were found to have a specific growth rate 1.74 times the sum of the autotrophic and heterotrophic growth rates. It was hypothesised that gas exchange between the two metabolic pathways within mixotrophic cultures may have prevented growth limitation and enhanced growth. To determine the extent of synergistic gas exchange and its influence on metabolic activity, dissolved inorganic carbon (DIC), dissolved oxygen (DO) and photosynthesis and respiration rates were measured under different trophic conditions. Increases of 32.7-fold in DIC and 2.4-fold in DO concentrations, relative to autotrophic and heterotrophic cultures respectively, were coupled with significant increases in the rates of photosynthesis and respiration. These data strongly support the hypothesis of mixotrophic gas exchange within M. inermum cultures. In addition to enhanced growth, this phenomenon may provide reductions in aeration and oxygen stripping costs related to microalgae production.
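Expressed as a simple relation (an illustrative restatement of the growth result above, with μ denoting specific growth rate; the symbols are not necessarily the paper's own notation):

\[ \mu_{\text{mixotrophic}} = 1.74\,\left(\mu_{\text{autotrophic}} + \mu_{\text{heterotrophic}}\right) \]

That is, the mixotrophic culture grew 74% faster than the two individual trophic modes combined, a margin the study attributes to synergistic gas exchange between the two metabolic pathways.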
Three Point Bend Experiment - Meta Quest VR .apk Program - University of Sheffield
A virtual "digital twin" experiment using Unity VR of the 3 Point Bend test delivered at the department of Multidisciplinary Engineering Education at the University of Sheffield.This .apk file can be installed via the program SideQuest to any oculus quest headset and run.In order to create a fully bespoke iVR experience it was decided that a game engine would be required in order to provide the truly immersive visual and interactive elements coupled with realistic simulations of physics. This decision was also coupled with a requirement to minimize VR hardware costs and enable an experience that is untethered (i.e. no cables linked to a PC). Based upon these considerations, the educational version of Unity 3D game engine (2020.3.34f1) was selected for use with Meta’s Quest 1 & 2 headsets. This software is free for academic use and the basic Quest headsets are low-cost consumer products.The simulation geometry was created using educational versions of 3D CAD software (Solidworks, Fusion 360) based on the dimensions of the actual experimental apparatus. As photorealistic geometry was not the objective in this simulation, in order to help actively decouple fidelity effects (i.e. the students getting distracted by the 3d models themselves). The only part of the models that were modelled in realistic detail were the 3 point bending apparatus fixtures and beam samples; this was necessary to provide visual cues on operating the experiment and also facilitate experiential learning relevant to the practical experiment itself. This approach also reduced the overall computation expense in rendering the models, so the simulation could run at higher framerates, reducing likelihood of instances of cybersickness. The modeled geometry was exported to the free 3D modeling software Blender for further geometry optimisation (reduction of mesh complexity), followed by the addition of deflection animation (beam models only) and material texture baking. The finished 3D models were then exported in .GLTF format and then imported to the Unity Engine for use in the iVR simulation.The Unity program was designed incorporating the free Oculus XR Plugin, to enable both controller and hand tracking interactions when transferred to the Quest headsets.The same empirically derived coefficients of proportionality between force and deflection that were used in the Lo-Fi simulation were also used in this simulation to calculate the deflection for a given load applied to a sample. This deflection was displayed within a text box on the 3D model, and an appropriate movement of the jaws and distortion to the mesh of the beam was applied.The user experience of the simulation is as follows once the program is loaded, the user is presented with a scale-correct simplified version of the three-point bending apparatus in an empty boundless space. Using the Oculus controllers or their hands, the users can pick up any sample to test and place it into the jaws of the test machine. It should be noted that this element was considered to be an important differentiator between the simulation types as high levels of interactivity have been previously shown to increase knowledge and skills acquisition. The force applied to the sample can be then adjusted using two large red interactable buttons and the amount of deflection read from the machine's virtual display. The beams deform according to the load placed upon them. The deflection is approximated visually, however, the deflection data given is accurate based on empirical data.</p
Three Point Bend Experiment - Unity VR Source Code - University of Sheffield
A virtual "digital twin" experiment using Unity VR of the 3 Point Bend test delivered at the department of Multidisciplinary Engineering Education at the University of Sheffield.In order to create a fully bespoke iVR experience it was decided that a game engine would be required in order to provide the truly immersive visual and interactive elements coupled with realistic simulations of physics. This decision was also coupled with a requirement to minimize VR hardware costs and enable an experience that is untethered (i.e. no cables linked to a PC). Based upon these considerations, the educational version of Unity 3D game engine (2020.3.34f1) was selected for use with Meta’s Quest 1 & 2 headsets. This software is free for academic use and the basic Quest headsets are low-cost consumer products.The simulation geometry was created using educational versions of 3D CAD software (Solidworks, Fusion 360) based on the dimensions of the actual experimental apparatus. As photorealistic geometry was not the objective in this simulation, in order to help actively decouple fidelity effects (i.e. the students getting distracted by the 3d models themselves). The only part of the models that were modelled in realistic detail were the 3 point bending apparatus fixtures and beam samples; this was necessary to provide visual cues on operating the experiment and also facilitate experiential learning relevant to the practical experiment itself. This approach also reduced the overall computation expense in rendering the models, so the simulation could run at higher framerates, reducing likelihood of instances of cybersickness. The modeled geometry was exported to the free 3D modeling software Blender for further geometry optimisation (reduction of mesh complexity), followed by the addition of deflection animation (beam models only) and material texture baking. The finished 3D models were then exported in .GLTF format and then imported to the Unity Engine for use in the iVR simulation.The Unity program was designed incorporating the free Oculus XR Plugin, to enable both controller and hand tracking interactions when transferred to the Quest headsets.The same empirically derived coefficients of proportionality between force and deflection that were used in the Lo-Fi simulation were also used in this simulation to calculate the deflection for a given load applied to a sample. This deflection was displayed within a text box on the 3D model, and an appropriate movement of the jaws and distortion to the mesh of the beam was applied.The user experience of the simulation is as follows once the program is loaded, the user is presented with a scale-correct simplified version of the three-point bending apparatus in an empty boundless space. Using the Oculus controllers or their hands, the users can pick up any sample to test and place it into the jaws of the test machine. It should be noted that this element was considered to be an important differentiator between the simulation types as high levels of interactivity have been previously shown to increase knowledge and skills acquisition. The force applied to the sample can be then adjusted using two large red interactable buttons and the amount of deflection read from the machine's virtual display. The beams deform according to the load placed upon them. The deflection is approximated visually, however, the deflection data given is accurate based on empirical data.</p
Comparing XR and Digital Flipped Methods to Meet Learning Objectives - Main Study Data 2023
The attached data and statistical analysis are used in a study that compares three digital learning simulations to see which has the most beneficial effect on practical student laboratory experiences. Engineering students were allocated to a group that exposed them to one of three forms of digital “pre-lab” laboratory simulation, and their academic performance was assessed following the physical laboratory. The three forms were a 2D photography-based “iLabs” simulation, a web-based “low-fidelity” simulator and a Unity immersive virtual reality (iVR) lab simulator. All three simulation methods were based on the same empirically derived data, taken from the same laboratory equipment that the students use in the final part of the study. As a control, another group of students did not receive a pre-lab simulation, just a pre-lab quiz.

Ethics application number: 051651