Dynamic Shift from Cloud Computing to Industry 4.0: Eco-Friendly Choice or Climate Change Threat
Cloud computing utilizes thousands of Cloud Data Centres (CDCs) and dynamically fulfils the demands of end-users using new technologies and paradigms such as Industry 4.0 and the Internet of Things (IoT). With the emergence of Industry 4.0, the quality of cloud services has increased; however, CDCs consume large amounts of energy and generate a substantial carbon footprint, one of the major drivers of climate change. This chapter discusses the impacts of cloud developments on climate and quantifies the carbon footprint of cloud computing in a warming world. Further, the dynamic transition from cloud computing to Industry 4.0 is discussed from an eco-friendly/climate-change-threat perspective. Finally, open research challenges and opportunities for prospective researchers are explored
START: Straggler Prediction and Mitigation for Cloud Computing Environments using Encoder LSTM Networks
A common performance problem in large-scale cloud systems is dealing with straggler tasks: slow-running instances that increase the overall response time. Such tasks impact the system's QoS and its SLA. There is a need for automatic straggler detection and mitigation mechanisms that execute jobs without violating the SLA. Prior work typically builds reactive models that focus on detection first and mitigation second, which leads to delays. Other works use prediction-based proactive mechanisms but ignore volatile task characteristics. We propose a Straggler Prediction and Mitigation Technique (START) that predicts which tasks might become stragglers and dynamically adapts scheduling to achieve lower response times. START analyzes all tasks and hosts based on compute and network resource consumption, using an Encoder LSTM network to predict and mitigate expected straggler tasks. This reduces the SLA violation rate and execution time without compromising QoS. Specifically, we use the CloudSim toolkit to simulate START and compare it with IGRU-SD, SGC, Dolly, GRASS, NearestFit and Wrangler in terms of QoS parameters. Experiments show that START reduces execution time, resource contention, energy consumption and SLA violations by 13%, 11%, 16% and 19%, respectively, compared to the state-of-the-art
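START's detector is a learned Encoder LSTM over task and host resource traces; none of that model is reproduced here. As a much simpler stand-in for the detection step only, the sketch below flags tasks whose runtime exceeds the mean by a multiple of the standard deviation (the function name, the sample runtimes and the threshold `k` are illustrative assumptions, not from the paper):

```python
import statistics

def flag_stragglers(runtimes, k=2.0):
    """Flag task indices whose runtime exceeds mean + k * stdev.

    A simple z-score rule -- an illustrative stand-in for START's
    learned Encoder-LSTM straggler predictor, not the paper's method.
    """
    mu = statistics.mean(runtimes)
    sd = statistics.stdev(runtimes)
    return [i for i, t in enumerate(runtimes) if t > mu + k * sd]

# The 40 s task stands far outside the cluster of ~10 s tasks
print(flag_stragglers([10, 11, 9, 10, 12, 40]))  # [5]
```

Unlike this after-the-fact rule, a predictive model such as START's can act before a slow task completes, which is what enables proactive rescheduling.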
iThermoFog: IoT‐Fog based Automatic Thermal Profile Creation for Cloud Data Centers using Artificial Intelligence Techniques
Preventing failures in Cloud Data Centers (CDCs) due to high temperatures is a key challenge. Such centers have so many servers that it is very difficult to keep their temperature under control efficiently. To help address this issue, we propose an artificial intelligence (AI) based automatic scheduling method, called iThermoFog, that creates a thermal profile of CDC nodes using an integrated Internet of Things (IoT) and Fog computing environment. We use a Gaussian Mixture Model to approximate the thermal characteristics of the servers, which is used to predict temperatures and schedule tasks so as to minimize the average CDC temperature. Through empirical evaluation on an iFogSim and ThermoSim based testbed and an IoT based smart home application, we show that iThermoFog outperforms the current state‐of‐the‐art thermal‐aware scheduling method. Specifically, iThermoFog reduces mean square temperature by 13.5%, while simultaneously improving energy consumption, execution time, scheduling time and bandwidth usage
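The abstract names a Gaussian Mixture Model for the servers' thermal characteristics but gives no details. As a hedged sketch of the idea only, the following fits a two-component 1-D mixture to synthetic server temperatures with plain EM (the temperatures, component count and initial parameters are all made-up assumptions, not from the paper):

```python
import math, random

random.seed(0)
# Synthetic per-server temperatures: a "cool" and a "hot" cluster (illustrative)
temps = [random.gauss(45, 2) for _ in range(100)] + \
        [random.gauss(70, 3) for _ in range(100)]

# Two-component 1-D Gaussian mixture, fitted by expectation-maximization
mu, var, w = [40.0, 80.0], [25.0, 25.0], [0.5, 0.5]

def pdf(x, m, v):
    return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

for _ in range(50):                         # EM iterations
    # E-step: responsibility of each component for each sample
    r = [[w[k] * pdf(x, mu[k], var[k]) for k in range(2)] for x in temps]
    r = [[a / (row[0] + row[1]) for a in row] for row in r]
    # M-step: re-estimate weights, means and variances
    for k in range(2):
        nk = sum(row[k] for row in r)
        w[k] = nk / len(temps)
        mu[k] = sum(row[k] * x for row, x in zip(r, temps)) / nk
        var[k] = sum(row[k] * (x - mu[k]) ** 2 for row, x in zip(r, temps)) / nk

print(sorted(round(m, 1) for m in mu))      # component means near the cluster centres
```

Given such a profile, a scheduler could prefer placing new tasks on nodes whose temperatures fall under the cooler component, which is one plausible reading of how the thermal profile feeds scheduling.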
Deuteron and antideuteron production in Au+Au collisions at sqrt(s_NN)=200 GeV
The production of deuterons and antideuterons in the transverse momentum
range 1.1 < p_T < 4.3 GeV/c at mid-rapidity in Au + Au collisions at
sqrt(s_NN)=200 GeV has been studied by the PHENIX experiment at RHIC. A
coalescence analysis comparing the deuteron and antideuteron spectra with those
of protons and antiprotons, has been performed. The coalescence probability is
equal for both deuterons and antideuterons and increases as a function of p_T,
which is consistent with an expanding collision zone. Comparing (anti)proton
yields p_bar/p = 0.73 +/- 0.01, with (anti)deuteron yields: d_bar/d = 0.47 +/-
0.03, we estimate that n_bar/n = 0.64 +/- 0.04.
Comment: 326 authors, 6 pages text, 5 figures, 1 table. Submitted to PRL. Plain text data tables for the points plotted in figures for this and previous PHENIX publications are (or will be) publicly available at http://www.phenix.bnl.gov/papers.htm
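The quoted n_bar/n estimate follows directly from the coalescence picture: if the coalescence probability is equal for deuterons and antideuterons, then d_bar/d = (p_bar/p) x (n_bar/n), so n_bar/n = (d_bar/d) / (p_bar/p). A quick check of the quoted numbers, with uncorrelated error propagation:

```python
import math

# Yield ratios quoted in the abstract
pbar_over_p, dp = 0.73, 0.01
dbar_over_d, dd = 0.47, 0.03

# Coalescence: d_bar/d = (p_bar/p) * (n_bar/n)  =>  n_bar/n = (d_bar/d) / (p_bar/p)
nbar_over_n = dbar_over_d / pbar_over_p

# Relative errors add in quadrature for a ratio of independent quantities
err = nbar_over_n * math.sqrt((dp / pbar_over_p) ** 2 + (dd / dbar_over_d) ** 2)

print(f"n_bar/n = {nbar_over_n:.2f} +/- {err:.2f}")  # 0.64 +/- 0.04, as quoted
```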
Single Electrons from Heavy Flavor Decays in p+p Collisions at sqrt(s) = 200 GeV
The invariant differential cross section for inclusive electron production in
p+p collisions at sqrt(s) = 200 GeV has been measured by the PHENIX experiment
at the Relativistic Heavy Ion Collider over the transverse momentum range $0.4
<= p_T <= 5.0 GeV/c at midrapidity (eta <= 0.35). The contribution to the
inclusive electron spectrum from semileptonic decays of hadrons carrying heavy
flavor, i.e. charm quarks or, at high p_T, bottom quarks, is determined via
three independent methods. The resulting electron spectrum from heavy flavor
decays is compared to recent leading and next-to-leading order perturbative QCD
calculations. The total cross section of charm quark-antiquark pair production
is determined as sigma_(c c^bar) = 0.92 +/- 0.15 (stat.) +/- 0.54 (sys.) mb.
Comment: 329 authors, 6 pages text, 3 figures. Submitted to Phys. Rev. Lett.
Next Generation Technologies for Smart Healthcare: Challenges, Vision, Model, Trends and Future Directions
Modern industry employs technologies for automation that may include the Internet of Things (IoT), Cloud and/or Fog Computing, 5G, as well as Artificial Intelligence (AI), Machine Learning (ML), or Blockchain. Currently, part of the research for the new industrial era is directed at improving healthcare services. This work sheds light on some of the major challenges in providing affordable, efficient, secure and reliable healthcare from the viewpoint of computer and medical sciences. We describe a vision of how a holistic model can fulfill the growing demands of the healthcare industry, and explain a conceptual model that can provide a complete solution for these increasing demands. In our model, we elucidate the components and their interaction at different levels, leveraging state‐of‐the‐art technologies in IoT, Fog computing, AI, ML and Blockchain. We finally describe current trends in this field and propose future directions for exploring emerging paradigms and technologies in the evolution of healthcare leveraging next generation computing systems
Production of phi mesons at mid-rapidity in sqrt(s_NN) = 200 GeV Au+Au collisions at RHIC
We present the first results of phi meson production in the K^+K^- decay channel
from Au+Au collisions at sqrt(s_NN) = 200 GeV as measured at mid-rapidity by
the PHENIX detector at RHIC. Precision resonance centroid and width values are
extracted as a function of collision centrality. No significant variation from
the PDG accepted values is observed. The transverse mass spectra are fitted
with a linear exponential function for which the derived inverse slope
parameter is seen to be constant as a function of centrality. These data are
also fitted by a hydrodynamic model with the result that the freeze-out
temperature and the expansion velocity values are consistent with the values
previously derived from fitting single hadron inclusive data. As a function of
transverse momentum, the collision-scaled peripheral-to-central yield ratio R_CP
for the phi is comparable to that of pions rather than that of protons. This result
lends support to theoretical models which distinguish between baryons and
mesons rather than particle mass in explaining the anomalous proton yield.
Comment: 326 authors, 24 pages text, 23 figures, 6 tables, RevTeX 4. To be submitted to Physical Review C as a regular article.
Measurement of Transverse Single-Spin Asymmetries for Mid-rapidity Production of Neutral Pions and Charged Hadrons in Polarized p+p Collisions at sqrt(s) = 200 GeV
The transverse single-spin asymmetries of neutral pions and non-identified
charged hadrons have been measured at mid-rapidity in polarized proton-proton
collisions at sqrt(s) = 200 GeV. The data cover a transverse momentum (p_T)
range 0.5-5.0 GeV/c for charged hadrons and 1.0-5.0 GeV/c for neutral pions, at
a Feynman-x (x_F) value of approximately zero. The asymmetries seen in this
previously unexplored kinematic region are consistent with zero within
statistical errors of a few percent. In addition, the inclusive charged hadron
cross section at mid-rapidity from 0.5 < p_T < 7.0 GeV/c is presented and
compared to NLO pQCD calculations. Successful description of the unpolarized
cross section above ~2 GeV/c using NLO pQCD suggests that pQCD is applicable in
the interpretation of the asymmetry results in the relevant kinematic range.
Comment: 331 authors, 6 pages text, 2 figures, 3 tables. Submitted to Phys. Rev. Lett.
A Systematic Review and Meta-analysis Protocol on Depressive Symptoms Among Medical Students in South Asia Using Patient-reported Validated Assessment Tools: Prevalence and Associated Factors
Depression among medical students in South Asia is notably higher than the global average, with prevalence rates ranging from approximately 30% to 60%. Untreated depression not only affects individual students' well-being, but also impacts academic performance and future clinical competence. This study protocol aims to synthesize evidence on the prevalence and associated factors of depressive symptoms among medical students in South Asia. The study will systematically search Medline (PubMed), Scopus, CINAHL, EMBASE, and APA PsycInfo for studies available before 1st May, 2025, following PRISMA guidelines for reporting and adhering to PRISMA-P standards for protocol development. The search will also cover grey literature and adopt citation chaining, using keyword truncation and string searches along with standard indexing terms. Observational studies of South Asian medical students, including cross-sectional, cohort, and case-control designs, that use validated patient-reported tools to measure depressive symptoms will be included. Review articles, intervention studies, case reports, case series, commentaries, pre-prints, conference abstracts, protocols, unpublished research, and correspondences will not be considered. No language limitation will be applied. Two independent reviewers will screen studies, with disagreements resolved by a third reviewer. The study aims to extract information on the prevalence and associated factors of depressive symptoms, conducting a narrative synthesis and meta-analysis using random-effects models. Forest and funnel plots will be used to visualize findings, while heterogeneity will be assessed using the I^2 statistic, with subgroup and sensitivity analyses performed to ascertain robustness. Risk of bias (RoB) will be measured using the modified Newcastle-Ottawa Scale (mNOS). Statistical analysis will be conducted using R v.4.3.2 (via RStudio) and GraphPad Prism v.9.0.
Understanding the prevalence and risk factors is essential to guide targeted interventions and evidence-based policy reforms that support the mental well-being of future healthcare professionals. By systematically synthesizing data from observational studies, this review will provide a comprehensive synthesis of the prevalence of depressive symptoms and their correlates among medical students in the South Asian region, laying the groundwork for preventive strategies and improved mental health care practices
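The protocol specifies random-effects pooling with the I^2 statistic for heterogeneity. As an illustrative sketch of that computation only, the following implements the classic DerSimonian-Laird estimator; the three study estimates and variances are made up for the example and are not results of the review:

```python
def dersimonian_laird(estimates, variances):
    """Pool study-level prevalence estimates with a DerSimonian-Laird
    random-effects model and report the I^2 heterogeneity statistic.
    Illustrative sketch, not the review's analysis code.
    """
    w = [1 / v for v in variances]                       # inverse-variance weights
    fixed = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_re = [1 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, estimates)) / sum(w_re)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # I^2 in percent
    return pooled, i2

# Hypothetical prevalences of 30%, 45% and 60% with their sampling variances
pooled, i2 = dersimonian_laird([0.30, 0.45, 0.60], [0.002, 0.003, 0.002])
print(round(pooled, 2), round(i2, 1))
```

With spread-out estimates like these, I^2 lands above 90%, a level commonly read as considerable heterogeneity and exactly the situation in which the planned subgroup and sensitivity analyses become most informative.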
ThermoSim: Deep Learning based Framework for Modeling and Simulation of Thermal-aware Resource Management for Cloud Computing Environments
Current cloud computing frameworks host millions of physical servers that provide cloud computing resources in the form of different virtual machines. Cloud Data Center (CDC) infrastructures require significant amounts of energy to deliver large-scale computational services. Moreover, computing nodes generate large volumes of heat, in turn requiring cooling units to counteract this heat. Thus, the overall energy consumption of the CDC increases tremendously, for servers as well as for cooling units. However, current workload allocation policies do not take the effect on temperature into account, and it is challenging to simulate the thermal behavior of CDCs. There is a need for a thermal-aware framework to simulate and model the behavior of nodes and to measure the important performance parameters that can be affected by their temperature. In this paper, we propose a lightweight framework, ThermoSim, for modeling and simulation of thermal-aware resource management for cloud computing environments. This work presents a Recurrent Neural Network based deep learning temperature predictor for CDCs, which is utilized by ThermoSim for lightweight resource management in constrained cloud environments. ThermoSim extends the CloudSim toolkit, helping to analyze the performance of key parameters such as energy consumption, service level agreement violation rate, number of virtual machine migrations and temperature during the management of cloud resources for the execution of workloads. Further, different energy-aware and thermal-aware resource management techniques are tested using the proposed ThermoSim framework in order to validate it against an existing framework (Thas). The experimental results demonstrate that the proposed framework is capable of modeling and simulating the thermal behavior of a CDC and that ThermoSim outperforms Thas in terms of energy consumption, cost, time, memory usage and prediction accuracy
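ThermoSim's predictor is a Recurrent Neural Network trained on CDC traces; reproducing it is out of scope here. As a trivial stand-in that shares only the interface idea (a history of node temperatures in, a one-step forecast out), the sketch below uses exponential smoothing; the function name, sample temperatures and smoothing factor are arbitrary assumptions:

```python
def predict_next_temp(history, alpha=0.5):
    """Exponentially weighted one-step temperature forecast in degrees C.

    A deliberately simple stand-in for ThermoSim's RNN-based predictor,
    for illustration only.
    """
    s = history[0]
    for t in history[1:]:
        s = alpha * t + (1 - alpha) * s   # blend newest reading into the state
    return s

print(predict_next_temp([60, 62, 61, 63, 64]))  # 63.0
```

A simulator would call such a predictor before each placement decision and steer workloads away from nodes whose forecast temperature is rising.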
