    Hybridization of cognitive computing for food services

    Get PDF
    The application of data mining technology to food services and the restaurant industry has clear social value. By predicting customer traffic and needs, a restaurant can prepare a reasonable amount of food in advance, which improves the dining experience of customers, raises the quality of food preparation, and makes the restaurant itself operate more efficiently. In recent years, collaborative robots have been adopted in the fast-food industry; in Asia, and in Japan in particular, many fast-food chains have deployed robots to better serve their customers. By studying the linear regression algorithm and the random forest algorithm, this paper proposes a novel fusion approach that interweaves the two algorithms and applies the new model to restaurant data to predict customer traffic in the restaurant industry. This predictive algorithm, built on cognitive techniques, can help these newly placed robots in the food industry serve their client base better and, in doing so, make the industry more efficient. Experiments, comparisons, and analysis are reported in the paper. The error rate of the fusion solution is approximately 5.503% lower than that of the linear regression algorithm and approximately 3.719% lower than that of the random forest algorithm. The results show that the new fusion algorithm achieves better customer traffic prediction for the restaurant industry. Furthermore, we also provide a new take on the application of data mining technology in the restaurant industry itself.
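    The abstract does not spell out how the two algorithms are interwoven, so the following Python sketch shows only one plausible fusion scheme, a weighted average of the two base predictors; the weights, the scikit-learn estimators, and the function name are assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only: the abstract does not specify how the two models are
# fused, so this assumes a simple weighted average of their predictions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

def fused_traffic_prediction(X_train, y_train, X_test, w_lr=0.5, w_rf=0.5):
    """Predict customer traffic by blending linear regression and random forest."""
    lr = LinearRegression().fit(X_train, y_train)
    rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
    # Weighted average of the two base predictions (the weights are assumptions).
    return w_lr * lr.predict(X_test) + w_rf * rf.predict(X_test)
```

    A weighted average lets the linear model capture overall trends while the forest captures nonlinear effects; other fusion schemes (e.g., stacking) would also fit the abstract's description.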

    Interspecific association of brown trout (Salmo trutta) with non-native brook trout (Salvelinus fontinalis) at the fry stage

    Get PDF
    The introduction of non-native brook trout (Salvelinus fontinalis) in Europe has led to displacement and declining populations of native brown trout (Salmo trutta). Some studies have found that brown trout shift to a diet niche similar to that of brook trout when the two species live in sympatry, which conflicts with the competitive exclusion principle. A change in feeding niche may be a sign of early interspecific association and social learning, leading to behavioral changes. As a first step to address this possibility, it is essential to assess the interspecific association between the species during the early ontogenetic life stages. In this study, we therefore assess whether juvenile brown trout associate with non-native juvenile brook trout to the same extent as with conspecifics by setting up two experiments: (i) a binomial choice test allowing visual and chemical cues to estimate the species specificity of group preference, and (ii) an association test without physical barriers to estimate the degree of association of a focal brown trout with a group of either conspecifics or heterospecifics. In experiment (i), we found that focal juvenile brown trout preferred to associate with the stimulus groups and did not discriminate between conspecific and heterospecific groups. Furthermore, more active individuals showed a stronger preference for the stimulus group than less active ones, regardless of species. In experiment (ii), we found that brook trout groups had a tighter group structure than brown trout groups, and that focal brown trout showed stronger association with brook trout than with brown trout. These results indicate that brown trout may associate with brook trout at an early life stage, which would allow interspecific social learning to occur. Future studies should look more closely into the causes and consequences of interspecific association and social learning, including potential effects on phenotype selection in brown trout populations.

    Extraction of User Navigation Pattern Based on Particle Swarm Optimization

    Get PDF
    With current projections regarding the growth of Internet sales, online retailing raises many questions about how to market on the Net. A Recommender System (RS) is a set of software tools and techniques that provides suggestions for items or services likely to be of use to a user. Recommender systems are valuable in both research and commercial settings: they are a means of personalizing a site and a solution to the customer's information-overload problem, and with the advent of the internet they have achieved widespread success in e-commerce applications. This paper presents a categorical review of the field of recommender systems and describes the state of the art of recommendation methods, which are usually classified into four categories: content-based, collaborative, demographic, and hybrid systems. To build our recommender system, we use fuzzy logic and a Markov chain algorithm.
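    As a rough illustration of the Markov-chain component mentioned above (the paper's actual combination with fuzzy logic and PSO-based navigation-pattern extraction is not reproduced here), a minimal first-order Markov model over navigation sessions could recommend the most likely next page as follows; the session data and function names are hypothetical.

```python
# Minimal sketch (an assumption, not the paper's implementation): a first-order
# Markov chain over page-visit sessions used to suggest the most likely next page.
from collections import defaultdict

def build_transition_counts(sessions):
    """sessions: list of page-id sequences, e.g. [["home", "laptops", "cart"], ...]."""
    counts = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        for current_page, next_page in zip(session, session[1:]):
            counts[current_page][next_page] += 1
    return counts

def recommend_next(counts, current_page):
    """Return the most frequently followed page, or None if the page is unseen."""
    followers = counts.get(current_page)
    if not followers:
        return None
    return max(followers, key=followers.get)

# Example usage with toy navigation data.
sessions = [["home", "laptops", "cart"], ["home", "laptops", "reviews"], ["home", "phones"]]
counts = build_transition_counts(sessions)
print(recommend_next(counts, "home"))  # -> "laptops"
```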

    Building new places of the creative economy. The rise of coworking spaces

    Get PDF
    The late 2000s have seen the emergence of a new kind of workplace: the coworking space. As of February 2013, 2,500 such spaces had been identified worldwide. This paper endeavors to situate the phenomenon within the existing theory of the creative, urban economy, and to serve as a platform for discussion and further research. Coworking spaces (CS) are regarded as "serendipity accelerators", designed to host creative people and entrepreneurs who seek to break out of isolation and to find a convivial environment that favors meetings and collaboration. At the beginning of the movement, CS creations were purely private initiatives. The concept has since attracted the interest of the media, and CS have been incorporated into larger public programs aimed at making the "creative city", which often materializes in the regeneration of decayed industrial neighborhoods. CS are the outcome of the blurring of frontiers and of hybridization processes between technological, economic, and social categories. Even if their sustainability and growth potential deserve to be questioned, they are strongly anchored in the workplace landscape of major business cities.

    A Hybrid Data-Driven Web-Based UI-UX Assessment Model

    Full text link
    Today, a large proportion of end-user information systems have their Graphical User Interfaces (GUI) built with web-based technology (JavaScript, CSS, and HTML). Such web-based systems include the Internet of Things (IoT), in-vehicle infotainment, interactive display screens (digital menu boards, information kiosks, digital signage at bus stops or airports, bank ATMs, etc.), and web applications/services on smart devices. As such, a web-based UI must be evaluated in order to improve its ability to perform the technical task for which it was designed. This study develops a framework and a process for evaluating and improving the quality of a web-based user interface (UI), both as a whole and at a stratified level. The comprehensive framework combines several algorithms: the multi-criteria decision-making method of the analytical hierarchy process (AHP) for coefficient generation, sentiment analysis, K-means clustering, and explainable AI (XAI).
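    As a hedged illustration of the AHP coefficient-generation step named in the abstract, the sketch below derives criterion weights from a pairwise comparison matrix via its principal eigenvector; the criteria and matrix values are hypothetical, and the other components (sentiment analysis, K-means, XAI) are omitted.

```python
# Illustrative sketch of the AHP step mentioned in the abstract: deriving criteria
# weights from a pairwise comparison matrix via its principal eigenvector. The
# matrix values here are hypothetical; the paper's actual criteria are not listed.
import numpy as np

def ahp_weights(pairwise):
    """Return normalized priority weights from a reciprocal pairwise comparison matrix."""
    values, vectors = np.linalg.eig(np.asarray(pairwise, dtype=float))
    principal = np.real(vectors[:, np.argmax(np.real(values))])
    return principal / principal.sum()

# Hypothetical 3-criterion comparison (e.g. usability vs. aesthetics vs. responsiveness).
matrix = [[1, 3, 5],
          [1/3, 1, 2],
          [1/5, 1/2, 1]]
print(ahp_weights(matrix))  # priority weights summing to 1
```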

    SciTech News Volume 71, No. 1 (2017)

    Get PDF
    Contents: Columns and Reports: From the Editor (p. 3). Division News: Science-Technology Division (p. 5); Chemistry Division (p. 8); Engineering Division: Aerospace Section (p. 9), Architecture, Building Engineering, Construction and Design Section (p. 11). Reviews: Sci-Tech Book News Reviews (p. 12). Advertisements: IEEE.

    Airline types

    Get PDF
    acceptedVersion

    Enhancement of Metaheuristic Algorithm for Scheduling Workflows in Multi-fog Environments

    Get PDF
    Whether in computer science, engineering, or economics, optimization lies at the heart of any challenge involving decision-making. Choosing between several options is part of the decision-making process, and our desire to make the "better" decision drives it; an objective function or performance index describes how the goodness of an alternative is assessed, and the theory and methods of optimization are concerned with picking the best option. Optimization methods are of two types: deterministic and stochastic. The former are traditional approaches that work well for small, linear problems but struggle with most real-world problems, which are high-dimensional, nonlinear, and complex in nature. As an alternative, stochastic optimization algorithms are specifically designed to tackle such challenges and are more common nowadays. This study proposes two robust, stochastic, swarm-based metaheuristic optimization methods. Both are hybrid algorithms formulated by combining Particle Swarm Optimization (PSO) and the Salp Swarm Algorithm (SSA). These algorithms are then applied to an important and thought-provoking problem: scientific workflow scheduling across multiple fog environments. Many computing environments, fog computing included, are plagued by security attacks that must be handled. DDoS attacks are particularly harmful to fog computing environments because they occupy the fog's resources and keep them busy; during such attacks a fog environment generally has fewer resources available, which in turn affects the scheduling of submitted Internet of Things (IoT) workflows. Nevertheless, current systems disregard the impact of DDoS attacks in their scheduling process, increasing both the number of workflows that miss their deadlines and the number of tasks that are offloaded to the cloud. Hence, this study proposes a hybrid optimization algorithm, comprising SSA and PSO, as a solution to the workflow scheduling problem across fog computing locations. To deal with the effects of DDoS attacks on fog locations, two discrete-time Markov-chain schemes are used: one estimates the average network bandwidth available in each fog, while the other estimates the average number of virtual machines available in each fog. DDoS attacks are addressed at various levels, and the approach predicts their influence on fog environments. Based on the simulation results, the proposed method can significantly reduce the number of offloaded tasks transferred to cloud data centers and decrease the number of workflows with missed deadlines. Moreover, green fog computing is of growing significance, since energy consumption plays an essential role in determining maintenance expenses and carbon dioxide emissions. Efficient scheduling can mitigate energy usage by allocating tasks to the most appropriate resources, taking the energy efficiency of each individual resource into account. To address this, the proposed algorithm integrates the Dynamic Voltage and Frequency Scaling (DVFS) technique, which is commonly employed to improve the energy efficiency of processors. The experimental findings demonstrate that the proposed method, combined with DVFS, yields improved outcomes, including reduced energy consumption, and thus emerges as a more environmentally friendly and sustainable solution for fog computing environments.
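    The abstract does not give the exact hybridization, so the sketch below shows one plausible way PSO and SSA updates could be combined in a single swarm; the schedule encoding, the DDoS-aware Markov-chain models, and the DVFS energy model are not reproduced here, and the objective is a placeholder sphere function rather than a schedule-cost model.

```python
# Rough sketch of one way PSO and SSA updates could be hybridized (an assumption,
# not the work's exact combination). Half the swarm follows PSO velocity updates,
# the other half follows the salp leader/follower chain around the best solution.
import numpy as np

def hybrid_pso_ssa(objective, dim=10, pop=30, iters=200, lb=-5.0, ub=5.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lb, ub, (pop, dim))
    v = np.zeros((pop, dim))
    pbest, pbest_f = x.copy(), np.apply_along_axis(objective, 1, x)
    gbest = pbest[np.argmin(pbest_f)].copy()

    for t in range(iters):
        c1 = 2 * np.exp(-(4 * (t + 1) / iters) ** 2)  # SSA exploration coefficient
        half = pop // 2
        # PSO half: inertia + cognitive + social terms.
        r1, r2 = rng.random((half, dim)), rng.random((half, dim))
        v[:half] = 0.7 * v[:half] + 1.5 * r1 * (pbest[:half] - x[:half]) + 1.5 * r2 * (gbest - x[:half])
        x[:half] += v[:half]
        # SSA half: the leader moves around the best solution, followers chain behind.
        lead = half
        step = c1 * ((ub - lb) * rng.random(dim) + lb)
        x[lead] = np.where(rng.random(dim) < 0.5, gbest + step, gbest - step)
        for i in range(lead + 1, pop):
            x[i] = (x[i] + x[i - 1]) / 2.0
        x = np.clip(x, lb, ub)
        # Update personal and global bests.
        f = np.apply_along_axis(objective, 1, x)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, pbest_f.min()

# Placeholder objective (sphere function) standing in for a schedule-cost model.
best, best_cost = hybrid_pso_ssa(lambda z: float(np.sum(z ** 2)))
```

    In the actual scheduler, the position vector would instead encode an assignment of workflow tasks to fog resources, and the objective would reflect deadline misses and energy consumption under the available DVFS states.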