
    Marketing relations and communication infrastructure development in the banking sector based on big data mining

    Purpose: The article aims to study the methodological tools for applying big data mining technologies in the modern digital space, whose further implementation can become the basis for realising the relationship marketing concept in the banking sector of the Russian Federation's economy. Structure/Methodology/Approach: To develop marketing relations in the banking sector of the digital economy, it is necessary: first, to identify the opportunities and advantages of big data mining in banking marketing; second, to identify the sources and methods of processing big data; third, to study examples of the successful use of big data mining by Russian banks and to formulate recommendations on implementing big data technologies in a digital banking marketing strategy. Findings: The authors' analysis showed that processing open online and offline information sources with big data technologies significantly increases the amount of data available for intelligent analysis, as a result of which the interaction between the bank and the target client reaches a new level of partnership. Practical Implications: The conclusions and generalizations of the study can be applied in the practice of managing financial institutions. The results can be used by bank management to form a digital marketing strategy for long-term communication. Originality/Value: The main contribution of this study is that the authors have identified the main directions for using big data in relationship marketing to generate additional profit, as well as the possibility of intelligent analysis of the client base aimed at expanding market share and retaining customers in the banking sector of the economy.
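
    As a hedged illustration of the client-base mining the abstract points to, the sketch below scores clients by churn risk so retention offers can be targeted; the file name, feature columns, and model choice are assumptions for the example, not the authors' method.

    import pandas as pd
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    # Hypothetical client-base extract: one row per client.
    clients = pd.read_csv("clients.csv")  # assumed file and columns
    features = ["monthly_txn_count", "avg_balance", "products_held",
                "days_since_last_login", "support_tickets"]
    X, y = clients[features], clients["churned"]

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)
    model = GradientBoostingClassifier().fit(X_train, y_train)
    print("holdout accuracy:", model.score(X_test, y_test))

    # Rank clients by churn risk so retention campaigns (the
    # "long-term communication" above) target the riskiest first.
    clients["churn_risk"] = model.predict_proba(X)[:, 1]
    print(clients.sort_values("churn_risk", ascending=False).head())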

    A survey of online data-driven proactive 5G network optimisation using machine learning

    In fifth-generation (5G) mobile networks, proactive network optimisation plays an important role in meeting exponential traffic growth and more stringent service requirements, and in reducing capital and operational expenditure. Proactive network optimisation is widely acknowledged as one of the most promising ways to transform the 5G network based on big data analysis and cloud-fog-edge computing, but there are many challenges. Proactive algorithms require accurate forecasting of highly contextualised traffic demand and quantification of the uncertainty, to drive decision making with performance guarantees. Context in Cyber-Physical-Social Systems (CPSS) is often challenging to uncover, unfolds over time, and is even more difficult to quantify and integrate into decision making. The first part of the review focuses on mining and inferring CPSS context from heterogeneous data sources, such as online user-generated content. It examines the state-of-the-art methods currently employed to infer location, social behaviour, and traffic demand through a cloud-edge computing framework, combining them to form the input to proactive algorithms. The second part of the review focuses on exploiting and integrating this demand knowledge in a range of proactive optimisation techniques, including the key aspects of load balancing, mobile edge caching, and interference management. In both parts, appropriate state-of-the-art machine learning techniques (including probabilistic uncertainty cascades in proactive optimisation), complexity-performance trade-offs, and demonstrative examples are presented to inspire readers. This survey couples the potential of online big data analytics, cloud-edge computing, statistical machine learning, and proactive network optimisation in a common cross-layer wireless framework. The wider impact of this survey includes better cross-fertilising the academic fields of data analytics, mobile edge computing, AI, CPSS, and wireless communications, as well as informing industry of the promising potential in this area.
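
    To make the forecasting-with-uncertainty idea concrete, here is a minimal sketch that fits median and 90th-percentile traffic-demand models, so a proactive optimiser can provision against an upper quantile rather than a point estimate; the synthetic data and the quantile-regression choice are illustrative assumptions, not the survey's prescribed method.

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    # Synthetic hourly cell load (Mbps) with a daily cycle; stands in
    # for the historical demand data a real deployment would use.
    rng = np.random.default_rng(0)
    hours = np.arange(24 * 60) % 24
    load = 50 + 30 * np.sin(hours / 24 * 2 * np.pi) + rng.normal(0, 8, hours.size)
    X = hours.reshape(-1, 1).astype(float)

    # Fit the median and the 90th percentile; the gap between them
    # quantifies forecast uncertainty for the decision maker.
    q50 = GradientBoostingRegressor(loss="quantile", alpha=0.5).fit(X, load)
    q90 = GradientBoostingRegressor(loss="quantile", alpha=0.9).fit(X, load)

    peak = np.array([[20.0]])  # forecast for 20:00
    print("median demand:", q50.predict(peak)[0])
    print("90th-percentile demand (provision for this):", q90.predict(peak)[0])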

    Using Physical and Social Sensors in Real-Time Data Streaming for Natural Hazard Monitoring and Response

    Technological breakthroughs in computing over the last few decades have resulted in important advances in natural hazards analysis. In particular, integration of a wide variety of information sources, including observations from spatially-referenced physical sensors and new social media sources, enables better real-time hazard estimates. The main goal of this work is to utilize innovative streaming algorithms for improved real-time seismic hazard analysis by integrating different data sources and processing tools into cloud applications. In streaming algorithms, a sequence of items from physical and social sensors can be processed in as little as one pass, with no need to store the data locally. Massive data volumes can be analyzed in near-real time with reasonable limits on storage space, an important advantage for natural hazard analysis. Seismic hazard maps are used by policymakers to set earthquake-resistant construction standards, by insurance companies to set insurance rates, and by civil engineers to estimate stability and damage potential. This research first focuses on improving probabilistic seismic hazard map production. The result is a series of maps for different frequency bands at significantly increased resolution with much lower latency, including a range of high-resolution sensitivity tests. Second, a method is developed for real-time earthquake intensity estimation using joint streaming analysis from physical and social sensors. Automatically calculated intensity estimates from physical sensors such as seismometers use empirical relationships between ground motion and intensity, while those from social sensors employ questionnaires that evaluate ground shaking levels based on personal observations. Neither is always sufficiently precise and/or timely. Results demonstrate that joint processing can significantly reduce the response time to a damaging earthquake and estimate preliminary intensity levels during the first ten minutes after an event. The combination of social media and network sensor data, in conjunction with innovative computing algorithms, provides a new paradigm for real-time earthquake detection, facilitating rapid and inexpensive risk reduction. In particular, streaming algorithms are an efficient method that addresses three major problems in hazard estimation: improving resolution, decreasing processing latency to near-real-time standards, and providing more accurate results through the integration of multiple data sets.
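
    A minimal sketch of the one-pass streaming idea described above: fusing intensity estimates from physical and social sensors as they arrive, with no local storage of the stream. The source weights and record format are assumptions for illustration, not the study's actual fusion rule.

    def fuse_intensity(stream):
        """One-pass weighted running mean over a mixed sensor stream.

        Each item is (source, intensity); physical readings are
        weighted more heavily than individual social reports
        (the weights here are assumptions).
        """
        weights = {"physical": 3.0, "social": 1.0}
        total = weight_sum = 0.0
        for source, intensity in stream:
            w = weights[source]
            total += w * intensity
            weight_sum += w
            yield total / weight_sum  # current joint estimate

    # Mixed stream: seismometer-derived and questionnaire-derived values.
    events = [("physical", 6.1), ("social", 5.0), ("social", 7.0),
              ("physical", 6.4), ("social", 6.0)]
    for estimate in fuse_intensity(events):
        print(f"joint intensity estimate: {estimate:.2f}")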

    Analysis of the mobility of people and tourists in nightlife areas in the city of Lisbon

    The population's mobility can be better understood by looking into specific mobile phone activities. This dissertation therefore aims to analyze the mobility of people and tourists in nightlife areas in the city of Lisbon, using data from each user's mobile device provided by a mobile operator. This data is obtained through an agreement between the mobile operator and the Lisbon city council. The main purpose is to provide the Lisbon Urban Data Lab team (LxDataLab) with data-based information that it can use to better manage and allocate resources in nightlife areas with heavy public space occupation. These spaces have a great impact on the life of the city, and their management is very important in order to serve the interests of the several stakeholders, such as merchants, residents, and visitors. The research proceeded in three stages: 1) building knowledge of Lisbon's nightlife and assessing whether the data can answer the research questions; 2) understanding, extracting, cleaning, and transforming the data into a clean, useful dataset capable of meeting our needs; 3) data visualization, where a complete analysis of the data was possible, extracting the value, knowledge, and answers needed by decision makers; a sketch of stage 2 follows below. Regarding deliverables, the work also includes the Python files in which data processing was done, in addition to this document. Microsoft Power BI was used for the visualizations and dashboards. The analyses and conclusions drawn were validated by the LxDataLab team through an online presentation of results.
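
    As a rough illustration of stage 2, the Python sketch below cleans operator records and aggregates distinct devices per area and hour during nightlife hours; the file name, column names, and the 22:00-05:59 window are assumptions, not the operator's actual schema.

    import pandas as pd

    # Assumed extract of operator records: timestamp, device_id, area_id.
    records = pd.read_csv("mobile_activity.csv", parse_dates=["timestamp"])

    # Clean: drop rows with no area identifier.
    records = records.dropna(subset=["area_id"])

    # Keep only nightlife hours (22:00-05:59, an assumed definition).
    hour = records["timestamp"].dt.hour
    night = records[(hour >= 22) | (hour < 6)]

    # Aggregate: distinct devices per area and hour, the kind of series
    # handed to Power BI for the dashboards mentioned above.
    presence = (night.assign(hour=night["timestamp"].dt.floor("h"))
                .groupby(["area_id", "hour"])["device_id"].nunique()
                .rename("unique_devices").reset_index())
    print(presence.head())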

    Telecommunication data monetization

    The aim of the study was to find out which telecommunication data monetization models are interesting and have potential. The focus was on identifying existing business model trends, how telco data can be collected and monetized, how mature telco data monetization is, and how it can be advanced by adopting existing models or creating innovative ways to do business from scratch. The empirical part consisted of theme interviews and workshops on these topics. The study indicates that internal telco data monetization is quite mature and has been developed for a long time, whereas many external telco data monetization projects are still in the piloting and testing phase. Telecommunication data monetization is quite similar to other data monetization processes, so existing effective and profitable models can be adopted; no clear need for creating entirely new business models was found. Insight based on location telco data was seen as the most valuable avenue for external monetization, while IoT and sensor telco data were also seen as potentially valuable in the future. In telco data monetization projects, one of the biggest key activities is to fulfill data privacy regulations while keeping the business profitable.
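
    One common pattern consistent with the privacy point above is to release location insight only as aggregates that meet a minimum group size; the sketch below applies such a threshold before data leaves the operator. The threshold value and schema are illustrative assumptions, not a model from the study.

    import pandas as pd

    K_MIN = 10  # assumed minimum group size for release

    # Assumed extract: one row per (subscriber_id, zone, hour) event.
    visits = pd.read_csv("location_events.csv")
    counts = (visits.groupby(["zone", "hour"])["subscriber_id"].nunique()
              .rename("visitors").reset_index())

    # Suppress small groups before the insight leaves the operator.
    releasable = counts[counts["visitors"] >= K_MIN]
    print(releasable.head())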

    Optimising clients with API gateways

    This thesis investigates the benefits and complications of working with API (Application Programming Interface) gateways. By an API gateway we mean a component that proxies, and potentially enhances, the communication between servers and clients, such as browsers, by transforming the data. We do this by examining the underlying protocol, HTTP/1.1, and the general theory regarding API gateways. An API gateway framework was developed in order to further understand some of the common problems and to provide a way to rapidly develop prototype solutions to them. The framework was applied in three case studies in order to discover potentially problematic areas and solve them in real-world production systems. From the results we could see that the benefits gained from using an API gateway varied from case to case and, with the results in hand, predict in which scenarios API gateways are most beneficial. APIs over HTTP are rarely adapted to the needs of different clients, which leads to cumbersome communication and reduced performance; an API gateway can be placed between clients and APIs to remedy this.
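
    As a minimal, hedged sketch of the gateway idea (not the thesis's framework), the Python below proxies GET requests to an upstream API over HTTP/1.1 and slims the JSON response for the client; the upstream URL and the kept fields are assumptions.

    import json
    import urllib.request
    from http.server import BaseHTTPRequestHandler, HTTPServer

    UPSTREAM = "http://localhost:9000"  # assumed backend API


    class GatewayHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Forward the client's request to the upstream server.
            with urllib.request.urlopen(UPSTREAM + self.path) as resp:
                payload = json.loads(resp.read())

            # Transform: keep only the fields this client needs,
            # shrinking the payload (field names are assumptions).
            slimmed = {"id": payload.get("id"), "name": payload.get("name")}
            body = json.dumps(slimmed).encode()

            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)


    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), GatewayHandler).serve_forever()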