108 research outputs found
20150709: Department of Engineering, 2008
These items include materials from Engineering at Marshall University from 2008. The items were received in 2015 and include notable materials from or about the history of the Engineering Department at Marshall University, along with other supporting materials, presentations about Engineering at Marshall, and enrollment information. This is not an exhaustive list; please download the finding aid for a full list of contents.
Study of the doubly charmed tetraquark T+cc
Quantum chromodynamics, the theory of the strong force, describes interactions of coloured quarks and gluons and the formation of hadronic matter. Conventional hadronic matter consists of baryons and mesons, made of three quarks and quark-antiquark pairs, respectively. Particles with an alternative quark content are known as exotic states. Here a study is reported of an exotic narrow state in the D0D0π+ mass spectrum just below the D*+D0 mass threshold, produced in proton-proton collisions collected with the LHCb detector at the Large Hadron Collider. The state is consistent with the ground isoscalar T+cc tetraquark with a quark content of ccūd̄ and spin-parity quantum numbers J^P = 1^+. Study of the DD mass spectra disfavours interpretation of the resonance as the isovector state. The decay structure via intermediate off-shell D*+ mesons is consistent with the observed D0π+ mass distribution. To analyse the mass of the resonance and its coupling to the D*D system, a dedicated model is developed under the assumption of an isoscalar axial-vector T+cc state decaying to the D*D channel. Using this model, resonance parameters including the pole position, scattering length, effective range and compositeness are determined to reveal important information about the nature of the T+cc state. In addition, an unexpected dependence of the production rate on track multiplicity is observed.
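The abstract mentions extracting a pole position, scattering length, effective range and compositeness; these are related through the standard near-threshold effective-range expansion of the S-wave amplitude (a generic textbook form is shown here, not necessarily the exact parameterisation of the dedicated LHCb model):

% Near-threshold S-wave scattering amplitude and effective-range expansion.
% k is the D*D relative momentum; a is the scattering length and r the
% effective range (sign conventions for a differ between references).
\[
  \mathcal{A}(k) \propto \frac{1}{k\cot\delta(k) - ik},
  \qquad
  k\cot\delta(k) = -\frac{1}{a} + \frac{r}{2}\,k^{2} + \mathcal{O}(k^{4}).
\]
% The resonance pole sits where k\cot\delta(k) = ik; its distance from the
% D*+D0 threshold, together with a compositeness measure derived from a
% and r, characterises how molecular the T+cc state is.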
Met Office UKCP Local CPM precipitation ML emulator dataset
Met Office UKCP Local CPM precipitation ML emulator dataset. This is a collection of two datasets: one sourced from CPM data (bham_gcmx-4x_12em_psl-sphum4th-temp4th-vort4th_eqvt_random-season.tar.gz) and one sourced from GCM data (bham_60km-4x_12em_psl-sphum4th-temp4th-vort4th_eqvt_random-season.tar.gz). Each dataset is made up of climate model variables extracted from the Met Office's storage system, combining many variables over many years. It consists of 3 NetCDF files (train.nc, test.nc and val.nc), a YML ds-config.yml file and a README (similar to this one but tailored to the source of the data). Code used to create the dataset can be found here: https://github.com/henryaddison/mlde-data (specifically the v0.1.0 tag: https://github.com/henryaddison/mlde-data/tree/v0.1.0). The YML file contains the configuration for the creation of the dataset, including the variables, scenario, ensemble members, spatial domain and resolution, and the scheme for splitting the data across the three subsets. Each NetCDF file contains the same variables but a different subset (train, val or test) of the time dimension; otherwise the NetCDF files have the same dimensions and coordinates for ensemble_member, grid_longitude and grid_latitude.
Spatial resolution: this has two parts, the resolution of the data and the grid resolution at which it is stored in the file. Predictand variables are 2.2km variables coarsened 4 times to 8.8km (this is the target grid). Predictor variables are either 2.2km variables conservatively regridded to the GCM 60km grid, or variables from the GCM (so already on the 60km grid), then regridded (nearest neighbour) to the target grid of the predictands. In the naming convention used in config files, 60km resolution is synonymous with the GCM grid and 2.2km resolution is synonymous with the CPM grid.
Spatial domain: a 64x64 section of the 8.8km target grid covering England and Wales.
Time resolution: daily.
Time domain: 1st Dec 1980 to 30th Nov 2000; 1st Dec 2020 to 30th Nov 2040; 1st Dec 2060 to 30th Nov 2080. Uses a 360-day calendar.
Scenario: RCP8.5.
Ensemble members: 01, 04-13 & 15 (these correspond to the 12 ensemble member runs from the CPM but don't carry intrinsic meaning).
Split scheme: 70% training, 15% validation, 15% testing, split by choosing complete seasons at random, with an equal number of each season from each of the 3 time periods.
Predictor variables: psl - mean sea level pressure (hPa); temp850, temp700, temp500, temp250 - air temperature (K) at 850, 700, 500 and 250 hPa; vorticity850, vorticity700, vorticity500, vorticity250 - relative vorticity (s^-1) at 850, 700, 500 and 250 hPa; spechum850, spechum700, spechum500, spechum250 - specific humidity at 850, 700, 500 and 250 hPa.
Predictand variable: target_pr - precipitation rate (mm/day).
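Since each dataset ships as three NetCDF splits plus a YML config, a minimal sketch of inspecting a downloaded copy with xarray follows (the extraction path is hypothetical; the file, variable and coordinate names are taken from the description above):

# Minimal sketch: inspect the train/val/test NetCDF files of one dataset.
# Assumes the tar has been extracted into the directory below (path
# hypothetical) and that xarray and netCDF4 are installed.
from pathlib import Path

import xarray as xr

dataset_dir = Path("bham_gcmx-4x_12em_psl-sphum4th-temp4th-vort4th_eqvt_random-season")

for split in ("train", "val", "test"):
    ds = xr.open_dataset(dataset_dir / f"{split}.nc")
    # Each file should share dimensions/coordinates for ensemble_member,
    # grid_longitude and grid_latitude, differing only along time.
    print(split, dict(ds.sizes))
    print(sorted(ds.data_vars))  # e.g. psl, temp850, ..., target_pr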
Met Office UKCP Local CPM precipitation ML emulator dataset, version 1.0.1
Met Office UKCP Local CPM precipitation ML emulator dataset. This is a collection of two datasets: one sourced from CPM data (bham64_ccpm-4x_12em_psl-sphum4th-temp4th-vort4th_pr.tar.gz) and one sourced from GCM data (bham64_gcm-4x_12em_psl-sphum4th-temp4th-vort4th_pr.tar.gz). Each dataset is made up of climate model variables extracted from the Met Office's storage system, combining many variables over many years. It consists of 3 NetCDF files (train.nc, test.nc and val.nc), a YML ds-config.yml file and a README (similar to this one but tailored to the source of the data). Code used to create the dataset can be found here: https://github.com/henryaddison/mlde-data (specifically the james-submission tag). The YML file contains the configuration for the creation of the dataset, including the variables, scenario, ensemble members, spatial domain and resolution, and the scheme for splitting the data across the three subsets. Each NetCDF file contains the same variables but a different subset (train, val or test) of the time dimension; otherwise the NetCDF files have the same dimensions and coordinates for ensemble_member, grid_longitude and grid_latitude.
Spatial resolution: this has two parts, the resolution of the data and the grid resolution at which it is stored in the file. Predictand variables are 2.2km variables coarsened 4 times to 8.8km (this is the target grid). Predictor variables are either 2.2km variables conservatively regridded to the GCM 60km grid, or variables from the GCM (so already on the 60km grid), then regridded (nearest neighbour) to the target grid of the predictands. In the naming convention used in config files, 60km resolution is synonymous with the GCM grid and 2.2km resolution is synonymous with the CPM grid.
Spatial domain: a 64x64 section of the 8.8km target grid covering England and Wales.
Time resolution: daily.
Time domain: 1st Dec 1980 to 30th Nov 2000; 1st Dec 2020 to 30th Nov 2040; 1st Dec 2060 to 30th Nov 2080. Uses a 360-day calendar.
Scenario: RCP8.5.
Ensemble members: 01, 04-13 & 15 (these correspond to the 12 ensemble member runs from the CPM but don't carry intrinsic meaning).
Split scheme: 70% training, 15% validation, 15% testing, split by choosing complete seasons at random, with an equal number of each season from each of the 3 time periods.
Predictor variables: psl - mean sea level pressure (hPa); temp850, temp700, temp500, temp250 - air temperature (K) at 850, 700, 500 and 250 hPa; vorticity850, vorticity700, vorticity500, vorticity250 - relative vorticity (s^-1) at 850, 700, 500 and 250 hPa; spechum850, spechum700, spechum500, spechum250 - specific humidity at 850, 700, 500 and 250 hPa.
Predictand variable: target_pr - precipitation rate (mm/day).
UPDATE 2025-03-27: the dataset tars have been renamed to make their source clearer (ccpm for coarsened CPM and gcm for GCM).
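The split scheme above (complete seasons drawn at random, balanced across the three 20-year periods) can be sanity-checked from the time coordinate; a hedged sketch, assuming xarray decodes the 360-day calendar to cftime objects and using the hypothetical local path train.nc:

# Sketch: count (time period, season) pairs in one split to check the
# balanced random-season split described above.
import collections

import xarray as xr

ds = xr.open_dataset("train.nc")  # path hypothetical

seasons = ds["time"].dt.season.values  # 'DJF', 'MAM', 'JJA', 'SON'
years = ds["time"].dt.year.values

def bucket(year):
    # Map a year onto one of the three 20-year time slices.
    if year <= 2000:
        return "1980-2000"
    if year <= 2040:
        return "2020-2040"
    return "2060-2080"

counts = collections.Counter((bucket(y), s) for y, s in zip(years, seasons))
for key in sorted(counts):
    print(key, counts[key])  # expect roughly equal counts per (period, season)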
Dataset for "Metamaterial-Based LTCC Compressed Luneburg Lens Antenna at 60 GHz for Wireless Communications"
Dataset containing the data in the figures published in D. Zelenchuk, V. Kirillov, C. Kärnfelt, F. Gallée, and I. Munina, “Metamaterial-Based LTCC Compressed Luneburg Lens Antenna at 60 GHz for Wireless Communications,” Electronics, vol. 12, no. 11, p. 2354, May 2023, doi: 10.3390/electronics12112354. Note that the data files are in CSV format. Abstract for paper: In this study, a metamaterial-based LTCC compressed Luneburg lens was designed, manufactured and measured. The lens was designed at 60 GHz to utilize the unlicensed mm-wave spectrum available for short-range, high-capacity wireless communication networks. The transformation optics method was applied to compress the Luneburg lens antenna and thus maintain a low-profile structure. Two different types of unit cells, for the low- and high-permittivity regions, were considered. A parametric study of the effect of compression on lens performance is presented. The antenna is implemented with a standard high-permittivity LTCC process, and details of the manufacturing process for the metamaterial lens are discussed. The low-profile lens is thinner than 2 mm and measures 19 mm in diameter. A size reduction of 63.6% in comparison with a spherical lens was achieved. The near-field to far-field mm-wave measurement technique is presented, and the measurement results show a peak antenna gain of 16 dBi at 60 GHz and a beam-scanning capability with 1 dB scan loss within a 50° field of view.
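For context, the uncompressed Luneburg lens from which the transformation-optics compression starts has the classical graded-index profile below; the paper's compressed metamaterial lens remaps this profile, so the formula is shown only as the baseline:

% Classical Luneburg lens refractive-index profile: a plane wave incident
% on the sphere of radius R is focused to a point on the opposite surface.
\[
  n(r) = \sqrt{2 - \left(\frac{r}{R}\right)^{2}}, \qquad 0 \le r \le R,
\]
% so n = \sqrt{2} at the centre and n = 1 at the rim, which is why
% discretised low- and high-permittivity regions (here realised as LTCC
% unit cells) are a natural implementation.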
A Survey on Adoption Guidelines for the FAIR4RS Principles: Dataset
A list of 30+ online resources has been identified and curated by the FAIR4RS Subgroup 5: Adoption Guidelines. These resources are available as the supplementary materials of the report (Martinez et al., 2022) and can be downloaded and cited from this landing page. The list is open for additions by the community via comments directly to this link. We particularly encourage authors of new and existing resources to add as much detail as possible to describe their resource and its relevance to the FAIR4RS Principles. Each of the columns has a description and an indication of whether the information is optional. We plan to add tags to the added resources each semester, and when there is another set of 30 resources we can release a new version. This list reflects the wide spectrum of global contributions supporting the implementation of the FAIR Principles, particularly regarding research software. It is a snapshot of currently available resources, although we expect that new resources will become available in the future and that the contents of the current list will evolve. It is important to note that most of these resources precede the definition of the FAIR4RS Principles; however, they still support their implementation. The resources were manually collected, analyzed, and categorized according to their type: guidelines, tools, metadata schemas, and registries/repositories. For each resource, detail is also provided on which of the FAIR4RS Principles the resource supports. Data collection: This subgroup initiated a crowdsourcing effort to identify relevant resources. All members had the opportunity to provide and describe existing FAIR research software guidelines and tools. During the first two months of the subgroup's operation in 2021, subgroup participants (referred to as data providers) added resources to an online spreadsheet. Data providers were encouraged to list resources that they were aware of, authored, or that were supported by their institutions. Subsequently, the subgroup organized virtual calls to discuss the resources, their descriptions and the categorization. Over the next two months each data provider added descriptions to resources they were familiar with. This meant that some resources gained descriptions from different data providers. Before the completion of the list, the subgroup leads checked and cleaned the list (removing items that lacked information or providing complementary information). The resulting list is the first crowdsourced list of its type and it welcomes your contributions!
Martinez, P. A., Struck, A., Castro, L. J., Garijo, D., Loewe, A., Gesing, S., Barker, M., Chue Hong, N., Erdmann, C., Martinez-Ortiz, C., & Sansone, S.-A. (2022). A Survey on Adoption Guidelines for the FAIR4RS Principles: Dataset (1.0) [Data set]. Zenodo. https://doi.org/10.5281/zenodo.637554
Low- versus standard-dose alteplase in acute lacunar ischemic stroke: the ENCHANTED trial - online supplemental
Objective: To determine any differential efficacy and safety of low- versus standard-dose intravenous alteplase for lacunar versus non-lacunar acute ischemic stroke (AIS), we performed post-hoc analyses of the Enhanced Control of Hypertension and Thrombolysis Stroke Study (ENCHANTED) alteplase dose arm. Methods: In a cohort of 3297 ENCHANTED participants, we identified those with lacunar or non-lacunar AIS with different levels of confidence (definite/probable/possible) according to pre-specified definitions based on clinical and adjudicated imaging findings. Logistic regression models were used to determine associations of lacunar AIS with 90-day outcomes (primary: modified Rankin scale [mRS] scores 2-6; secondary: other mRS scores, intracerebral hemorrhage [ICH], and early neurologic deterioration [END] or death) and treatment effects of low- versus standard-dose alteplase across lacunar and non-lacunar AIS, with adjustment for baseline covariables. Results: Of 2588 participants with available imaging and clinical data, we classified cases as definite/probable lacunar (n=490) or non-lacunar AIS (n=2098) for primary analyses. Regardless of alteplase dose received, lacunar AIS participants had favorable functional (mRS 2-6, adjusted odds ratio [95% CI] 0.60 [0.47-0.77]) and other clinical or safety outcomes compared to participants with non-lacunar AIS. Low-dose alteplase (versus standard) had no differential effect on functional outcomes (mRS 2-6, 1.04 [0.87-1.24]) but reduced the risk of symptomatic ICH in all included participants. There were no differential treatment effects of low- versus standard-dose alteplase on any outcome across lacunar and non-lacunar AIS (all P for interaction ≥0.07). Conclusions: We found no evidence from the ENCHANTED trial that low-dose alteplase had any advantages over standard-dose alteplase for definite/probable lacunar AIS.
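The treatment-by-subtype comparison described above is a standard logistic-regression interaction test; a minimal illustrative sketch follows (synthetic data and hypothetical column names, not the ENCHANTED analysis code or dataset):

# Illustrative sketch of a treatment-by-subtype interaction test with
# logistic regression (synthetic data; column names are hypothetical).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "poor_outcome": rng.integers(0, 2, n),  # e.g. mRS 2-6 at 90 days
    "low_dose": rng.integers(0, 2, n),      # low- vs standard-dose alteplase
    "lacunar": rng.integers(0, 2, n),       # lacunar vs non-lacunar AIS
    "age": rng.normal(67, 12, n),           # example baseline covariable
})

# The low_dose:lacunar term tests whether the dose effect differs by subtype.
model = smf.logit("poor_outcome ~ low_dose * lacunar + age", data=df).fit()
print(np.exp(model.params))               # adjusted odds ratios
print(model.pvalues["low_dose:lacunar"])  # P for interaction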
Multi-laboratory compilation of atmospheric methane data for the year 2023-2024; obspack_ch4_1_NRT_v6.2_2024-06-27
This product is constructed using the Observation Package (ObsPack) framework [Masarie et al., 2014; www.earth-syst-sci-data.net/6/375/2014/]. The framework is designed to bring together atmospheric greenhouse gas (GHG) observations from a variety of sampling platforms, prepare them with specific applications in mind, and package and distribute them in a self-consistent and well-documented product. ObsPack products are intended to support GHG budget studies and represent a new generation of cooperative, value-added GHG data products.
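ObsPack products are typically distributed as collections of per-site NetCDF files; a hedged sketch of scanning an extracted copy follows (the directory layout, dimension name and attribute names are assumptions based on typical ObsPack releases, not guaranteed for this exact product):

# Sketch: scan the per-site NetCDF files of an extracted ObsPack product
# (directory layout and attribute names are assumptions).
from pathlib import Path

import xarray as xr

product_dir = Path("obspack_ch4_1_NRT_v6.2_2024-06-27/data/nc")

for path in sorted(product_dir.glob("*.nc"))[:5]:
    ds = xr.open_dataset(path)
    site = ds.attrs.get("site_code", path.stem)  # fall back to the file name
    n_obs = ds.sizes.get("obs", 0)               # number of observations
    print(f"{site}: {n_obs} observations")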
- …
