
    ChatGPT and Persuasive Technologies for the Management and Delivery of Personalized Recommendations in Hotel Hospitality

    Recommender systems have become indispensable tools in the hotel hospitality industry, enabling personalized and tailored experiences for guests. Recent advancements in large language models (LLMs), such as ChatGPT, and in persuasive technologies have opened new avenues for enhancing the effectiveness of these systems. This paper explores the potential of integrating ChatGPT and persuasive technologies to automate and improve hotel hospitality recommender systems. First, we delve into the capabilities of ChatGPT, which can understand and generate human-like text, enabling more accurate and context-aware recommendations. We discuss the integration of ChatGPT into recommender systems, highlighting the ability to analyze user preferences, extract valuable insights from online reviews, and generate personalized recommendations based on guest profiles. Second, we investigate the role of persuasive technology in influencing user behavior and enhancing the persuasive impact of hotel recommendations. By incorporating persuasive techniques, such as social proof, scarcity, and personalization, recommender systems can effectively influence user decision-making and encourage desired actions, such as booking a specific hotel or upgrading a room. To investigate the efficacy of ChatGPT and persuasive technologies, we present a pilot experiment with a case study involving a hotel recommender system. We aim to study the impact of integrating ChatGPT and persuasive techniques on user engagement, satisfaction, and conversion rates. The preliminary results demonstrate the potential of these technologies in enhancing the overall guest experience and business performance. Overall, this paper contributes to the field of hotel hospitality by exploring the synergistic relationship between LLMs and persuasive technology in recommender systems, ultimately influencing guest satisfaction and hotel revenue. Comment: 17 pages, 12 figures
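    The persuasive techniques named above (social proof, scarcity, personalization) are easy to picture as a post-processing layer over a base recommendation. A minimal, purely illustrative sketch; all function names, message templates, and figures are invented, not taken from the paper's system:

```python
def persuasive_message(guest_name, hotel, base_text,
                       recent_bookings=None, rooms_left=None):
    """Augment a plain recommendation with persuasive cues (hypothetical)."""
    parts = [f"{guest_name}, {base_text}"]  # personalization: address the guest
    if recent_bookings:
        # social proof: others chose the same hotel
        parts.append(f"{recent_bookings} guests booked {hotel} this week.")
    if rooms_left is not None:
        # scarcity: limited availability nudges a decision
        parts.append(f"Only {rooms_left} rooms left at this rate.")
    return " ".join(parts)

msg = persuasive_message("Alice", "Hotel Aegean",
                         "we think Hotel Aegean matches your preferences.",
                         recent_bookings=37, rooms_left=2)
```

    In a full system the base text would come from an LLM conditioned on the guest profile; the layering of cues would be the same.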

    Correction to: Two years later: Is the SARS-CoV-2 pandemic still having an impact on emergency surgery? An international cross-sectional survey among WSES members

    Background: The SARS-CoV-2 pandemic is still ongoing and remains a major challenge for health care services worldwide. The first WSES COVID-19 emergency surgery survey described a strong negative impact on emergency surgery (ES) early in the pandemic. However, little is known about the current effects of the pandemic on patient flow through emergency rooms, daily routine, and decision making in ES, or about how these effects have changed over the last two pandemic years. This second WSES COVID-19 emergency surgery survey investigates the impact of the SARS-CoV-2 pandemic on ES over the course of the pandemic. Methods: A web survey was distributed to medical specialists in ES during a four-week period starting in January 2022, investigating the impact of the pandemic on patients and septic diseases requiring ES, structural problems due to the pandemic, and time-to-intervention in ES routine. Results: 367 collaborators from 59 countries responded to the survey. The majority indicated that the pandemic still significantly affects the treatment and outcome of surgical emergency patients (83.1% and 78.5%, respectively). As reasons, the collaborators reported a decreased caseload in ES (44.7%), but patients presenting with more prolonged and severe diseases, especially perforated appendicitis (62.1%) and diverticulitis (57.5%). Moreover, approximately 50% of the participants still observe a delay in time-to-intervention in ES compared with the situation before the pandemic. Relevant causes of the prolonged time-to-intervention in ES during the pandemic are persistent problems with in-hospital logistics and shortages of medical staff as well as of operating room and intensive care capacities. This leads not only to the need to triage ES patients or transfer them to other hospitals, reported by 64.0% and 48.8% of the collaborators, respectively, but also to paradigm shifts toward non-operative treatment approaches, reported by 67.3% of the participants, especially in uncomplicated appendicitis, cholecystitis, and multiple-recurrent diverticulitis. Conclusions: The SARS-CoV-2 pandemic still significantly impacts the care and outcome of patients in ES. Well-known problems with in-hospital logistics have not yet been sufficiently resolved; moreover, medical staff shortages and reduced capacities have been dramatically aggravated over the last two pandemic years.

    Feature Selection with a Backtracking Search Optimization Algorithm

    Feature selection carries significance in the outcome of any classification or regression task. Applying evolutionary computation algorithms to feature selection has led to the construction of efficient discrete optimization algorithms. In this paper, a modified backtracking search algorithm is employed to perform wrapper-based feature selection, where two modifications of the standard backtracking search algorithm are adopted. The first utilizes a particle ranking operator applied to the current population. The second removes the restriction of using a single particle in the mutation process. The implementation of the above algorithm in feature selection is then carried out in terms of two general frameworks, which were originally developed for particle swarm optimization: the first is based on the binary and the second on the set-based particle swarm optimization. The experimental analysis shows that the above variants of the backtracking search algorithm perform equally well on the classification of several datasets.
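    The wrapper setting described above can be sketched in a few lines: a population of binary feature masks is evolved against a fitness function, with a historical population supplying the mutation information, in the spirit of the backtracking search algorithm. This is a simplified stand-in, not the paper's method; the ranking operator and the exact mutation scheme are omitted, and the objective is a toy function rather than a real classifier:

```python
import random

def bsa_feature_selection(n_features, fitness, pop_size=10, iters=30, seed=0):
    """Greedy binary search loop inspired by the backtracking search
    algorithm: a historical population supplies the mutation information."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    old = [row[:] for row in pop]          # historical population
    best = max(pop, key=fitness)[:]
    for _ in range(iters):
        if rng.random() < 0.5:             # occasionally refresh the history
            old = [row[:] for row in pop]
        rng.shuffle(old)
        for i, (cur, hist) in enumerate(zip(pop, old)):
            # binary stand-in for BSA mutation: adopt historical bits w.p. 0.3
            trial = [h if rng.random() < 0.3 else c
                     for c, h in zip(cur, hist)]
            if fitness(trial) >= fitness(cur):   # greedy replacement
                pop[i] = trial
        cand = max(pop, key=fitness)
        if fitness(cand) > fitness(best):
            best = cand[:]
    return best

# toy wrapper objective: reward the first three features, penalize mask size
def toy_fitness(mask):
    return sum(mask[:3]) - 0.1 * sum(mask)

best_mask = bsa_feature_selection(8, toy_fitness)
```

    In a real wrapper, `toy_fitness` would be replaced by cross-validated accuracy of a classifier trained on the selected features.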

    Structure Learning and Hyperparameter Optimization Using an Automated Machine Learning (AutoML) Pipeline

    In this paper, we built an automated machine learning (AutoML) pipeline for structure-based learning and hyperparameter optimization purposes. The pipeline consists of three main automated stages. The first carries out the collection and preprocessing of the dataset from the Kaggle database through the Kaggle API. The second utilizes the Keras Bayesian-optimization tuning library to perform hyperparameter optimization. The third focuses on the training process of the machine learning (ML) model using the hyperparameter values estimated in the previous stage; its evaluation is performed on the testing data using Neptune AI. The main technologies used to develop a stable and reusable machine learning pipeline are the popular Git version control system, a Google Cloud virtual machine, the Jenkins server, the Docker containerization technology, and the Ngrok reverse proxy tool; the latter can securely publish the local Jenkins address on the internet. As such, some parts of the proposed pipeline are taken from the thematic area of machine learning operations (MLOps), resulting in a hybrid software scheme. The pipeline was evaluated using a machine learning model, namely a multilayer perceptron (MLP) that combines typical dense layers as well as polynomial layers. The simulation results show that the proposed pipeline exhibits reliable and accurate performance while managing to boost the network's performance in classification tasks.
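    The second stage, hyperparameter optimization, can be illustrated with a generic search loop. The paper uses the Keras Bayesian-optimization tuner; the sketch below substitutes a plain random search to show the same interface (a search space in, a best configuration out), with a toy objective standing in for validation accuracy:

```python
import random

def random_search(space, objective, trials=20, seed=1):
    """space: dict mapping hyperparameter name -> list of candidate values.
    Returns the best sampled configuration and its score."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(trials):
        cfg = {name: rng.choice(values) for name, values in space.items()}
        score = objective(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

space = {"units": [16, 32, 64], "lr": [1e-3, 1e-2], "layers": [1, 2, 3]}

# toy objective: peaks at units=32, lr=1e-3, layers=2 (stand-in for val. accuracy)
def toy_objective(c):
    return -abs(c["units"] - 32) - 100 * abs(c["lr"] - 1e-3) - abs(c["layers"] - 2)

cfg, score = random_search(space, toy_objective)
```

    A Bayesian tuner replaces the blind `rng.choice` sampling with a surrogate model that proposes promising configurations, but consumes the same search-space description.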

    Ontology-Based Feature Selection: A Survey

    The Semantic Web emerged as an extension of the traditional Web, adding meaning (semantics) to a distributed Web of structured and linked information. At its core, the concept of ontology provides the means to semantically describe and structure information and expose it to software and human agents in a machine- and human-readable form. For software agents to be realized, it is crucial to develop powerful artificial intelligence and machine-learning techniques able to extract knowledge from information sources and represent it in the underlying ontology. This survey aims to provide insight into key aspects of ontology-based knowledge extraction from various sources, such as text, databases, and human expertise, realized in the realm of feature selection. First, common classification and feature selection algorithms are presented. Then, selected approaches that utilize ontologies to represent features and perform feature selection and classification are described. These selective and representative approaches span diverse application domains, such as document classification, opinion mining, manufacturing, recommendation systems, urban management, and information security systems, and demonstrate the feasibility and applicability of such methods. In addition to the criteria-based presentation of related works, this survey contributes a number of open issues and challenges related to this still-active research topic.
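    The core idea behind the surveyed approaches can be sketched as follows: each feature is linked to an ontology concept, and selection keeps only the features whose concepts fall under a target class. The tiny parent-map "ontology" and the feature names below are invented for illustration; real systems use OWL/RDF ontologies and reasoners:

```python
# Hypothetical miniature ontology: concept -> parent concept
PARENT = {
    "price": "economic", "discount": "economic",
    "sentiment": "opinion", "stars": "opinion",
    "economic": "thing", "opinion": "thing",
}

def is_a(concept, ancestor):
    """Walk up the hierarchy; True if `ancestor` subsumes `concept`."""
    while concept is not None:
        if concept == ancestor:
            return True
        concept = PARENT.get(concept)
    return False

def select_features(feature_to_concept, target):
    """Keep features whose annotated concept falls under `target`."""
    return [f for f, c in feature_to_concept.items() if is_a(c, target)]

features = {"avg_price": "price", "coupon": "discount",
            "review_tone": "sentiment"}
selected = select_features(features, "economic")  # -> ["avg_price", "coupon"]
```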

    LSTM-Based Prediction of Mediterranean Vegetation Dynamics Using NDVI Time-Series Data

    Vegetation index time-series analysis of multitemporal satellite data is widely used to study vegetation dynamics in the present climate change era. This paper proposes a systematic methodology to predict the Normalized Difference Vegetation Index (NDVI) using time-series data extracted from the Moderate Resolution Imaging Spectroradiometer (MODIS). The key idea is to obtain accurate NDVI predictions by combining the merits of two effective computational intelligence techniques, namely fuzzy clustering and long short-term memory (LSTM) neural networks, under the framework of the dynamic time warping (DTW) similarity measure. The study area is Lesvos Island, located in the Aegean Sea, Greece, an insular environment in the Mediterranean coastal region. The algorithmic steps and main contributions of the current work are as follows. (1) A data reduction mechanism was applied to obtain a set of representative time series. (2) Since DTW is a similarity measure and not a distance, a multidimensional scaling approach was applied to transform the representative time series into points in a low-dimensional space, thus enabling the use of the Euclidean distance. (3) An efficient optimal fuzzy clustering scheme was implemented to obtain the optimal number of clusters that best described the underlying distribution of the low-dimensional points. (4) The center of each cluster was mapped back to a time series, computed as the mean of all representative time series corresponding to the points belonging to that cluster. (5) Finally, the time series obtained in the last step were further processed with LSTM neural networks. In particular, the development and evaluation of the LSTM models were carried out over a one-year period, i.e., 12 monthly time steps. The results indicate that the method identified unique time-series patterns of NDVI among different CORINE land-use/land-cover (LULC) types. The LSTM networks predicted the NDVI with a root mean squared error (RMSE) ranging from 0.017 to 0.079. For the validation year of 2020, the difference between forecasted and actual NDVI was less than 0.1 in most of the study area. This study indicates that the synergy of optimal fuzzy clustering based on DTW similarity of NDVI time-series data and the use of LSTM networks with clustered data can provide useful results for monitoring vegetation dynamics in fragmented Mediterranean ecosystems.
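    Step (2) above hinges on DTW, which aligns series that share a pattern but are shifted in time. A minimal DTW implementation (the classic O(nm) dynamic program) is sketched below; the series values are illustrative, not MODIS data:

```python
def dtw(a, b):
    """Dynamic time warping cost between two numeric sequences."""
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])        # local point-to-point cost
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

# a time-shifted copy of the same seasonal pattern stays close under DTW
s1 = [0.2, 0.4, 0.8, 0.6, 0.3]
s2 = [0.2, 0.2, 0.4, 0.8, 0.6]
```

    Because warped alignments violate the triangle inequality, DTW is not a metric, which is exactly why the paper maps the series into a Euclidean space via multidimensional scaling before clustering.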

    Fuzzy Modeling via Optimal Fuzzy Clustering

    This paper introduces a new method for fuzzy modeling based on a set of input-output data pairs. The method consists of a sequence of steps aimed at developing a Sugeno-type fuzzy model of optimal structure. First, the algorithm uses the fuzzy c-means algorithm to classify all the input training data vectors into a predefined number of clusters. The centers of these clusters are further processed using optimal fuzzy clustering, which is based on the weighted fuzzy c-means algorithm. The resulting optimal fuzzy partition defines the number of fuzzy rules and provides an initial estimation for the system parameters, which are further tuned using the gradient-descent algorithm. The proposed method is successfully applied to a time series prediction problem, where its performance is compared to that of other methods found in the literature.
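    The fuzzy c-means step at the heart of the method can be sketched as follows: given fixed cluster centers, each data point receives a fuzzy membership in every cluster, with memberships summing to one per point. The toy one-dimensional data are illustrative only; the paper works with input-output vectors and a weighted variant:

```python
def fcm_memberships(points, centers, m=2.0):
    """Fuzzy c-means membership u[i][k] of point i in cluster k
    (standard update for fixed centers; each row sums to 1)."""
    p = 2.0 / (m - 1.0)          # fuzzifier exponent from the FCM update rule
    u = []
    for x in points:
        dists = [abs(x - c) for c in centers]
        if 0.0 in dists:         # point coincides with a center: crisp membership
            row = [1.0 if d == 0.0 else 0.0 for d in dists]
        else:
            row = [1.0 / sum((d / e) ** p for e in dists) for d in dists]
        u.append(row)
    return u

U = fcm_memberships([0.1, 0.9, 0.5], centers=[0.0, 1.0])
```

    Alternating this membership update with a center update (cluster centers as membership-weighted means) yields the full FCM iteration; the Sugeno model then places one rule per resulting cluster.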