5,113 research outputs found

    The relationship between difficulties in emotion regulation and dysfunctional technology use among adolescents

    Objectives For two decades, scientific research has studied excessive and dysfunctional use of new technologies and its influence on people's lives, in terms of impaired personal, relational, scholastic and work functioning. The objectives of the present study are to investigate gender differences in problematic new technology use and to examine the relationship between problematic new technology use, emotion regulation and its specific dimensions. Methods 280 Italian adolescents (51.1% males) aged 11 to 18 years (mean age = 13.31; SD = 2.33) were recruited from two Italian public secondary schools. Data were collected using the Internet Addiction Test, the Video Game Dependency Scale, the Brief Multicultural Version of the Test of Mobile-Phone Dependence and the Difficulties in Emotion Regulation Scale. Results The results indicate significant associations between emotion dysregulation and problematic internet (r = .504; p < .001), videogame (r = .372; p < .001) and mobile-phone (r = .424; p < .001) use. These results support the hypothesis that adolescents with greater emotion dysregulation are more likely to experience problematic new technology use. Additionally, stepwise multiple regression analysis showed that the lack of effective emotion regulation strategies is a risk factor common to all the problematic uses investigated, while also highlighting specific risk factors for some of the dependent behaviors examined. Conclusions The findings of this study highlight a link between problematic new technology use, emotion dysregulation and its specific dimensions. The results are discussed in light of scientific advances and the role of emotion dysregulation in determining problematic new technology use in adolescence. Further research with larger samples is needed to confirm these findings.
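
    As a concrete illustration of the reported analysis, the sketch below runs the kind of Pearson correlation test behind figures such as r = .504 (p < .001). The variable names (ders_total, iat_score) and the synthetic scores are hypothetical stand-ins, not the study's data.

        # Illustrative sketch on synthetic stand-in data, not the study's sample.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n = 280   # sample size reported in the abstract
        # Hypothetical DERS and IAT scores generated with a built-in association
        ders_total = rng.normal(80, 20, n)
        iat_score = 0.5 * ders_total + rng.normal(0, 18, n)

        # Pearson correlation, the statistic reported in the Results section
        r, p = stats.pearsonr(ders_total, iat_score)
        print(f"r = {r:.3f}, p = {p:.3g}")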

    Special Agents Hunting Down Women Silent Killer: The Emerging Role of the p38α Kinase

    Ovarian cancer is sensitive to chemotherapy with platinum compounds; however, the therapy success rate is significantly lowered by a high incidence of recurrence and by the acquisition of drug resistance. These negative outcomes mainly depend on altered apoptotic and drug resistance pathways, determining the need for the design of new therapeutic strategies to improve patient survival. This challenge has become even more critical because it has been recognized that hindering uncontrolled cell growth is not sufficient as the only curative approach. In fact, while current therapies are mostly conceived to impair the survival of highly proliferating cells, several lines of research are now focusing on cancer-specific features to target malignant cells selectively, with the aim of avoiding drug resistance and reducing adverse effects. Recently, great interest has been generated by the identification of metabolic reprogramming mechanisms occurring in cancer cells, such as the increase in glycolysis levels. In this light, pharmacologic manipulation of relevant pathways involved in cancer-specific metabolism and drug resistance could prove an effective approach to treating ovarian cancer patients.

    Office Buildings Cooling Need in the Italian Climatic Context: Assessing the Performances of Typical Envelopes

    This study assesses the cooling thermal energy need of office buildings that represent typical cases within the Italian context, defined in particular by envelope solutions belonging to three main construction ages, including new solutions that meet the current requirements for envelope thermal properties. The results show that the large-glazed, lighter solutions (including newly built ones consistent with recent standards) exhibit the worst behavior, while the buildings with the lowest cooling needs are the old conventional ones, even though they do not comply with the recent requirements for envelope components. Newly built insulated envelopes achieve the same cooling performance only if they are conventionally glazed and a strategy to release the heat stored in the massive internal surfaces is adopted.

    Smart operators: How Industry 4.0 is affecting the worker's performance in manufacturing contexts

    The fourth industrial revolution is affecting the workforce at the strategic, tactical, and operational levels, and it is leading to the development of new careers requiring precise and specific skills and competences. The implementation of enabling technologies in the industrial context involves new types of interaction between operators and machines, interactions that transform the industrial workforce and have significant implications for the nature of work. The incoming generation of Smart Operators 4.0 consists of intelligent and qualified operators who perform their work with the support of machines, interact with collaborative robots and advanced systems, and use technologies such as wearable devices and augmented and virtual reality. The correct interaction between the workforce and the various enabling technologies of the 4.0 paradigm is a crucial aspect of the success of the smart factory. However, this interaction is affected by the variability and reliability of human behaviour, which can strongly influence quality, safety, and productivity standards. For this reason, this paper aims to provide a clear and complete analysis of the different types of smart operators and of the impact of 4.0 enabling technologies on operator performance, evaluating the stakeholders involved, the type of interaction, the changes required of operators in terms of added and removed work, and the new performance achieved by workers.

    Smart operators: How augmented and virtual technologies are affecting the worker's performance in manufacturing contexts

    Purpose: The correct interaction between the workforce and augmented, virtual, and mixed reality technologies is a crucial aspect of the success of the smart factory. This interaction is affected by the variability and reliability of human behavior, which can strongly influence quality, safety, and productivity standards. For this reason, this paper aims to provide a clear and complete analysis of the impact of these technologies on operator performance. Design/methodology/approach: A Systematic Literature Review (SLR) was conducted to identify peer-reviewed papers focused on the implementation of augmented and virtual technologies in manufacturing systems and their effects on human performance. Findings: In total, 61 papers were selected and thoroughly analyzed. The findings reveal that augmented, virtual and mixed reality can be applied to several applications in manufacturing systems with different types of devices, each involving various advantages and disadvantages. The aspects of worker performance most influenced by these technologies are task completion time, error rate, and mental and physical workload. Originality/value: Augmented, virtual and mixed reality technologies in manufacturing systems have been investigated by researchers over the years, but most studies have focused on technological issues. The role of the operator, whose tasks may be influenced positively or negatively by the use of new devices, has hardly ever been analyzed, and a deep analysis of how these technologies affect human performance is missing. This study is a preliminary analysis to fill this gap. The results of the SLR allowed us to develop a conceptual framework that captures the current state-of-the-art knowledge on the topic and highlights gaps in current research.

    Data-driven predictive control in a stochastic setting: a unified framework

    Data-driven predictive control (DDPC) has recently been proposed as an effective alternative to traditional model predictive control (MPC) for its unique features of being time-efficient and unbiased with respect to the oracle solution. Nonetheless, it has also been observed that noise may strongly jeopardize the final closed-loop performance, since it affects both the data-based system representation and the control update computed from the online measurements. Recent studies have shown that regularization is potentially a successful tool to counteract the effect of noise. At the same time, regularization requires the tuning of a set of penalty terms, whose choice might be practically difficult without closed-loop experiments. In this paper, by means of subspace identification tools, we pursue a three-fold goal: (i) we set up a unified framework for the existing regularized data-driven predictive control schemes for stochastic systems; (ii) we introduce γ-DDPC, an efficient two-stage scheme that splits the optimization problem into two parts, fitting the initial conditions and optimizing the future performance, while guaranteeing constraint satisfaction; (iii) we discuss the role of regularization for data-driven predictive control, providing new insight on when and how it should be applied. A benchmark numerical case study finally illustrates the performance of γ-DDPC, showing how controller design can be simplified in terms of tuning effort and computational complexity when benefiting from the insights coming from the subspace identification realm.
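
    To ground the setting, here is a minimal sketch of the regularized data-driven predictive control formulation that the paper's framework unifies (in the DeePC style, with a plain quadratic penalty on the data-selection vector g). The toy system, horizon lengths and penalty weights are illustrative assumptions; the two-stage γ-DDPC scheme itself is not reproduced here.

        # Minimal regularized DDPC sketch on a hypothetical first-order system.
        # NOT the paper's gamma-DDPC: only the common regularized formulation.
        import numpy as np
        import cvxpy as cp

        def hankel(w, L):
            """Hankel matrix with L rows built from the 1-D signal w."""
            T = len(w)
            return np.column_stack([w[i:i + L] for i in range(T - L + 1)])

        # Collect noisy data from a toy system y+ = 0.9 y + 0.5 u
        rng = np.random.default_rng(1)
        T, T_ini, N = 200, 4, 10
        u_d = rng.uniform(-1, 1, T)
        y_d = np.zeros(T)
        for t in range(T - 1):
            y_d[t + 1] = 0.9 * y_d[t] + 0.5 * u_d[t]
        y_d += 0.01 * rng.standard_normal(T)   # measurement noise

        L = T_ini + N
        U, Y = hankel(u_d, L), hankel(y_d, L)
        U_p, U_f = U[:T_ini], U[T_ini:]        # past / future input blocks
        Y_p, Y_f = Y[:T_ini], Y[T_ini:]        # past / future output blocks

        # Regularized DDPC step: track a setpoint, penalize ||g|| against noise
        g = cp.Variable(U.shape[1])
        u_ini, y_ini = u_d[-T_ini:], y_d[-T_ini:]   # most recent measurements
        setpoint, lam = 1.0, 10.0                   # illustrative tuning values
        cost = (cp.sum_squares(Y_f @ g - setpoint)
                + 0.1 * cp.sum_squares(U_f @ g)
                + lam * cp.sum_squares(g))
        constraints = [U_p @ g == u_ini, Y_p @ g == y_ini]
        cp.Problem(cp.Minimize(cost), constraints).solve()
        u_first = (U_f @ g.value)[0]   # first move of the receding-horizon input
        print("first optimal input:", u_first)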

    A framework to measure the robustness of programs in the unpredictable environment

    Due to the diffusion of the IoT, modern software systems are often designed to control and coordinate smart devices in order to manage assets and resources and to guarantee efficient behaviours. For this class of systems, which interact extensively with humans and with their environment, it is thus crucial to guarantee correct behaviour in order to avoid unexpected and possibly dangerous situations. In this paper we present a framework that allows us to measure the robustness of systems, that is, the ability of a program to tolerate changes in environmental conditions while preserving its original behaviour. In the proposed framework, the interaction of a program with its environment is represented as a sequence of random variables describing how both evolve in time. For this reason, the considered measures are defined over probability distributions of observed data. The proposed framework is then used to define the notions of adaptability and reliability. The former indicates the ability of a program to absorb perturbations of environmental conditions after a given amount of time. The latter expresses the ability of a program to maintain its intended behaviour (up to some reasonable tolerance) despite perturbations in the environment. Moreover, an algorithm based on statistical inference is proposed to evaluate the proposed metric and the aforementioned properties. Throughout the paper, two case studies are used to describe and evaluate the proposed approach.
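
    The abstract leaves the concrete metric open, so the following sketch makes one illustrative choice: estimate, from repeated simulated runs, the Wasserstein distance between the empirical distributions of an observed program variable under nominal and perturbed environmental conditions, and call the program robust when that distance stays within a tolerance. The toy program, perturbation model and threshold are all assumptions.

        # Sketch: distribution-based robustness check with an illustrative metric.
        import numpy as np
        from scipy.stats import wasserstein_distance

        rng = np.random.default_rng(2)

        def run_program(perturbation=0.0, steps=50):
            """Toy program steering a value toward a setpoint in a noisy environment."""
            x = 0.0
            for _ in range(steps):
                env_noise = rng.normal(0.0, 0.1 + perturbation)
                x += 0.5 * (1.0 - x) + env_noise   # pull x toward the setpoint 1
            return x

        # Empirical distributions of the observed final state over many runs
        nominal = [run_program(0.0) for _ in range(500)]
        perturbed = [run_program(0.3) for _ in range(500)]

        dist = wasserstein_distance(nominal, perturbed)
        tolerance = 0.25   # application-specific threshold (assumed)
        print(f"behavioural distance = {dist:.3f}; robust: {dist <= tolerance}")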

    Model predictive control with dynamic move blocking

    Model Predictive Control (MPC) has proven to be a powerful tool for the control of systems with constraints. Nonetheless, in many applications a major challenge arises: finding the optimal solution within a single sampling instant so as to apply a receding-horizon policy. In such cases, many suboptimal solutions have been proposed, among which is the possibility of "blocking" some moves a priori. In this paper, we propose a dynamic approach to move blocking that exploits the solution already available at the previous iteration, and we show not only that such an approach preserves asymptotic stability, but also that the decrease in performance with respect to the ideal solution can be theoretically bounded.
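
    For readers unfamiliar with the blocking idea, the sketch below shows standard (static) move blocking: the N inputs over the horizon are parameterized as u = M u_b and held constant within each block, so the optimizer searches over len(blocks) variables instead of N. The block pattern is an arbitrary example; the paper's dynamic, iteration-dependent blocking is not reproduced here.

        # Static move blocking: expand a few blocked decision variables into a
        # full input sequence over the prediction horizon.
        import numpy as np

        def blocking_matrix(blocks):
            """Map blocked inputs to the horizon, e.g. blocks=[1, 2, 4] -> N=7."""
            N, m = sum(blocks), len(blocks)
            M = np.zeros((N, m))
            row = 0
            for j, b in enumerate(blocks):
                M[row:row + b, j] = 1.0   # hold the j-th input over b steps
                row += b
            return M

        M = blocking_matrix([1, 2, 4])
        u_blocked = np.array([0.8, 0.3, -0.1])   # 3 variables instead of 7
        u_full = M @ u_blocked                   # expanded horizon input sequence
        print(u_full)   # [ 0.8  0.3  0.3 -0.1 -0.1 -0.1 -0.1]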

    A practitioner's guide to noise handling strategies in data-driven predictive control

    Today's increasing availability of data is having a remarkable impact on control design. However, for data-driven control approaches to become widespread in practical applications, it is necessary to devise strategies that can effectively handle the presence of noise in the data used to design the controller. In this work, we analyse the existing approaches for dealing with noisy measurements in data-driven predictive control (DDPC) and highlight the advantages and downsides of each technique from a practitioner's perspective. Our qualitative conclusions are supported by the results obtained from two benchmark examples.