
    All you can stream: Investigating the role of user behavior for greenhouse gas intensity of video streaming

    The information and communication technology sector reportedly has a substantial impact on the environment. Within this sector, video streaming has been identified as a major driver of CO2 emissions. To make streaming more sustainable, environmentally relevant factors must be identified on both the user and the provider side. Hence, environmental assessments, such as life cycle assessments (LCA), need to broaden their perspective from a purely technological one to one that includes user decisions and behavior. However, quantitative data on user behavior (e.g. streaming duration, choice of end device and resolution) are often lacking or difficult to integrate into LCA. Additionally, identifying relevant determinants of user behavior, such as the design of streaming platforms or user motivations, may help to design streaming services that keep environmental impact at an acceptable level. Carrying out assessments in this way requires interdisciplinary collaboration. Therefore, this exploratory study combined LCA with an online survey (N = 91, 7 consecutive days of assessment). Based on this dataset, the use phase of online video streaming was modeled. Additionally, sociodemographic, motivational and contextual determinants were measured. Results show that the CO2 intensity of video streaming depends on several factors: for example, there is a factor of 10 in climate intensity between choosing a smart TV and a smartphone for video streaming. Furthermore, results show that some factors can be addressed on the provider side to reduce overall energy demand on the user side, one of which is setting a low resolution as the default.
    Comment: 7th International Conference on ICT for Sustainability (ICT4S)
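    The reported factor of 10 between device choices can be illustrated with a back-of-the-envelope estimate. Note that the device power draws and grid carbon intensity below are hypothetical illustration values, not figures taken from the study:

```python
# Back-of-the-envelope CO2e estimate for one hour of video streaming on two
# devices. All numeric values below are illustrative assumptions only.
GRID_INTENSITY_KG_PER_KWH = 0.4  # assumed grid carbon intensity (kgCO2e/kWh)

DEVICE_POWER_KW = {
    "smart_tv": 0.100,    # assumed ~100 W draw while streaming
    "smartphone": 0.010,  # assumed ~10 W draw, including charging losses
}

def co2_per_hour(device: str) -> float:
    """Estimated kgCO2e for one hour of streaming on the given device."""
    return DEVICE_POWER_KW[device] * GRID_INTENSITY_KG_PER_KWH

ratio = co2_per_hour("smart_tv") / co2_per_hour("smartphone")
print(f"smart TV vs smartphone CO2e ratio: {ratio:.0f}x")
```

    Under these assumed power draws the ratio comes out to 10x, matching the order of magnitude the study reports; the real difference depends on the actual devices and grid mix.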

    Resisting Sleep Pressure: Impact on Resting State Functional Network Connectivity

    In today's 24/7 society, sleep restriction is a common phenomenon which leads to increased levels of sleep pressure in daily life. However, the magnitude and extent of impairment of brain functioning due to increased sleep pressure is still not completely understood. Resting state network (RSN) analyses have become increasingly popular because they allow us to investigate brain activity patterns in the absence of a specific task and to identify changes under different levels of vigilance (e.g. due to increased sleep pressure). RSNs are commonly derived from BOLD fMRI signals, but studies progressively also employ cerebral blood flow (CBF) signals. To investigate the impact of sleep pressure on RSNs, we examined RSNs of participants under high (19 h awake) and normal (10 h awake) sleep pressure with three imaging modalities (arterial spin labeling, BOLD, pseudo BOLD) while providing confirmation of vigilance states in most conditions. We demonstrated that CBF and pseudo BOLD signals (measured with arterial spin labeling) are suited to derive independent component analysis based RSNs. The spatial map differences of these RSNs were rather small, suggesting a strong biological substrate underlying these networks. Interestingly, increased sleep pressure, namely longer time awake, specifically changed the functional network connectivity (FNC) between RSNs. In summary, all FNCs of the default mode network with any other network or component showed increasing effects as a function of increased 'time awake'. All other FNCs became more anti-correlated with increased 'time awake'. The sensorimotor networks were the only ones that showed a within-network change of FNC, namely decreased connectivity as a function of 'time awake'. These specific changes of FNC could reflect both compensatory mechanisms aiming to fight sleep as well as a first reduction of consciousness while becoming drowsy. We think that the specific changes observed in functional network connectivity could imply an impairment of information transfer between the affected RSNs.

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain measured on a numerical analogue scale from 0 to 100% and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10% (i.q.r. 1-30%) of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468) compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.

    Climate change implications of gaming products and services

    There is increasing concern over the climate change impact of games consoles. There is, however, little research on the life cycle carbon impact of consoles, and existing research (the majority of which is focused on usage) is outdated. This study uses life cycle assessment (LCA) methodology to compare the climate change impact of different console-based gaming methods (i.e. games played from a disc, a downloaded file, or streamed from the cloud). Console usage and Internet usage were identified as life cycle stages where data were unknown or uncertain. Two studies to improve the understanding of these areas were undertaken in this research and used to complete a cradle-to-grave carbon footprint study of gaming (compared using a functional unit of carbon equivalent emissions per hour of gameplay). Results estimated that, for average cases, download is the lowest carbon method of gaming at 0.047 kgCO2e/h, followed by disc at 0.055 kgCO2e/h. Cloud gaming has higher estimated carbon emissions at 0.149 kgCO2e/h, largely due to the additional energy consumed during use in the Internet, gaming servers, and home router equipment. These findings only represent average cases, and the size of game files and length of gameplay time were found to be key variables significantly impacting the results. For example, for games played for under 8 hours, cloud gaming was found to have lower carbon emissions than downloads (up to 24 hours when compared to disc). In order to analyse these results, a new method was developed for identifying which gaming method has the lowest carbon emissions as both file size and gameplay time vary. This has allowed for the identification of the thresholds within which each gaming method has the lowest carbon emissions, for any given range of input variables. The carbon emissions of gaming are highly dependent on consumer behaviour (which game method is used, how long games are played for, and the type and size of those games), and therefore LCA based on average assumptions for these variables has limited application.
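    The crossover logic described above can be sketched as a simple model: cloud gaming's emissions scale linearly with playtime, while downloading pays a one-off file-transfer cost on top of a lower per-hour rate. In the sketch below, only the 0.149 kgCO2e/h cloud figure comes from the abstract; the console-play rate and per-GB transfer factor are illustrative assumptions:

```python
# Sketch of the cloud-vs-download crossover. Cloud emissions grow linearly
# with playtime; downloading pays a one-off transfer cost plus a lower
# per-hour rate. CONSOLE_RATE and TRANSFER_KG_PER_GB are assumed values.
CLOUD_RATE = 0.149         # kgCO2e per hour of cloud gaming (study average)
CONSOLE_RATE = 0.040       # assumed kgCO2e/h for local console play
TRANSFER_KG_PER_GB = 0.02  # assumed kgCO2e per GB downloaded

def download_emissions(hours: float, file_size_gb: float) -> float:
    """One-off download cost plus local play emissions."""
    return TRANSFER_KG_PER_GB * file_size_gb + CONSOLE_RATE * hours

def cloud_emissions(hours: float) -> float:
    """Cloud emissions scale linearly with playtime."""
    return CLOUD_RATE * hours

def crossover_hours(file_size_gb: float) -> float:
    """Playtime beyond which downloading becomes the lower-carbon option."""
    return TRANSFER_KG_PER_GB * file_size_gb / (CLOUD_RATE - CONSOLE_RATE)

# With a hypothetical 40 GB game, cloud is lower-carbon for short playtimes.
print(f"crossover: {crossover_hours(40):.1f} h")
```

    Under these assumed parameters the crossover lands near the 8-hour threshold the study reports for average file sizes; larger files push the crossover later, smaller files earlier.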

    Rainfall and Temperature as Environmental Factors Impacting Beach Water Quality in Coastal Georgia

    Presentation given at the Georgia Southern University Research Symposium

    Electricity Intensity of Internet Data Transmission: Untangling the Estimates

    In order to understand the electricity use of Internet services, it is important to have accurate estimates for the average electricity intensity of transmitting data through the Internet (measured as kilowatt-hours per gigabyte [kWh/GB]). This study identifies representative estimates for the average electricity intensity of fixed-line Internet transmission networks over time and suggests criteria for making accurate estimates in the future. Differences in system boundary, assumptions used, and year to which the data apply significantly affect such estimates. Surprisingly, methodology used is not a major source of error, as has been suggested in the past. This article derives criteria to identify accurate estimates over time and provides a new estimate of 0.06 kWh/GB for 2015. By retroactively applying our criteria to existing studies, we were able to determine that the electricity intensity of data transmission (core and fixed-line access networks) has decreased by half approximately every 2 years since 2000 (for developed countries), a rate of change comparable to that found in the efficiency of computing more generally.
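    The halving trend can be expressed as a simple exponential-decay formula anchored at the article's 2015 estimate. Applying it outside the study period (2000 onward, developed countries) is an extrapolation of our own, not a claim from the article:

```python
# Intensity model from the article's findings: 0.06 kWh/GB in 2015, halving
# roughly every 2 years. Extrapolation beyond the study period is illustrative.
BASE_YEAR = 2015
BASE_INTENSITY = 0.06  # kWh/GB in 2015 (the article's estimate)
HALVING_PERIOD = 2.0   # years per halving (the article's observed trend)

def intensity(year: float) -> float:
    """Estimated fixed-line transmission intensity in kWh/GB for a given year."""
    return BASE_INTENSITY * 0.5 ** ((year - BASE_YEAR) / HALVING_PERIOD)

print(f"2017 estimate: {intensity(2017):.3f} kWh/GB")
```

    For example, the model gives 0.03 kWh/GB for 2017, half the 2015 value, consistent with the two-year halving rate the article reports.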

    Increased susceptibility of human endothelial cells to infections by SARS-CoV-2 variants

    Coronavirus disease 2019 (COVID-19) spawned a global health crisis in late 2019 and is caused by the novel coronavirus SARS-CoV-2. SARS-CoV-2 infection can lead to elevated markers of endothelial dysfunction associated with a higher risk of mortality. It is unclear whether endothelial dysfunction is caused by direct infection of endothelial cells or is mainly secondary to inflammation. Here, we investigate whether different types of endothelial cells are susceptible to SARS-CoV-2. Human endothelial cells from different vascular beds, including umbilical vein endothelial cells, coronary artery endothelial cells (HCAEC), cardiac and lung microvascular endothelial cells, and pulmonary arterial cells, were inoculated in vitro with SARS-CoV-2. Viral spike protein was only detected in HCAECs after SARS-CoV-2 infection but not in the other endothelial cells tested. Consistently, only HCAEC expressed the SARS-CoV-2 receptor angiotensin-converting enzyme 2 (ACE2), required for virus infection. Infection with the SARS-CoV-2 variants B.1.1.7, B.1.351, and P.2 resulted in significantly higher levels of viral spike protein. Despite this, no intracellular double-stranded viral RNA was detected and the supernatant did not contain infectious virus. Analysis of the cellular distribution of the spike protein revealed that it co-localized with endosomal calnexin. SARS-CoV-2 infection did induce the ER stress gene EDEM1, which is responsible for clearance of misfolded proteins from the ER. Whereas the wild type of SARS-CoV-2 did not induce cytotoxic or pro-inflammatory effects, the variant B.1.1.7 reduced the HCAEC cell number. Of the different endothelial cells tested, HCAECs showed the highest viral uptake but did not promote virus replication. Effects on cell number were only observed after infection with the variant B.1.1.7, suggesting that endothelial protection may be particularly important in patients infected with this variant.