    Privacy-preserving data analytics in cloud computing

    The evolution of digital content and rapid expansion of data sources has raised the need for streamlined monitoring, collection, storage and analysis of massive, heterogeneous data to extract useful knowledge and support decision-making mechanisms. In this context, cloud computing offers extensive, cost-effective and on-demand computing resources that improve the quality of services for users and also help service providers (enterprises, governments and individuals). Service providers can avoid the expense of acquiring and maintaining IT resources while migrating data and remotely managing processes including aggregation, monitoring and analysis in cloud servers. However, privacy and security concerns of cloud computing services, especially in storing sensitive data (e.g. personal, healthcare and financial), are major challenges to the adoption of these services. To overcome such barriers, several privacy-preserving techniques have been developed to protect outsourced data in the cloud. Cryptography is a well-known mechanism that can ensure data confidentiality in the cloud. Traditional cryptography techniques can protect the data through encryption in cloud servers, and data owners can retrieve and decrypt data for their processing purposes. However, in this case, cloud users can use the cloud resources for data storage but they cannot take full advantage of cloud-based processing services. This raises the need to develop advanced cryptosystems that can protect data privacy both while it is stored and while it is processed in the cloud. Homomorphic Encryption (HE) has gained attention recently because it can preserve the privacy of data while it is stored and processed in cloud servers, and data owners can retrieve and decrypt their processed data on their own secure side. Therefore, HE offers an end-to-end security mechanism that is a preferable feature in cloud-based applications. In this thesis, we developed innovative privacy-preserving cloud-based models based on HE cryptosystems. This allowed us to build secure and advanced analytic models in various fields. We began by designing and implementing a secure analytic cloud-based model based on a lightweight HE cryptosystem. We used a private resident cloud entity, called a "privacy manager", as an intermediate communication server between data owners and public cloud servers. The privacy manager handles analytical tasks that cannot be accomplished by the lightweight HE cryptosystem. This model is convenient for several application domains that require real-time responses. Data owners delegate their processing tasks to the privacy manager, which then helps to automate analysis tasks without the need to interact with data owners. We then developed a comprehensive, secure analytical model based on Fully Homomorphic Encryption (FHE), which has more computational capability than the lightweight HE. Although FHE can automate analysis tasks and avoid the use of the privacy manager entity, it also leads to massive computational overhead. To overcome this issue, we took advantage of the massive cloud resources by designing a MapReduce model that massively parallelises HE analytical tasks. Our parallelisation approach significantly speeds up the performance of analysis computations based on FHE. We then considered distributed analytic models where the data is generated from distributed heterogeneous sources such as healthcare and industrial sensors that are attached to people or installed in a distributed manner.
We developed a secure distributed analytic model by re-designing several analytic algorithms (centroid-based and distribution-based clustering) to adapt them into secure distributed models based on FHE. Our distributed analytic model not only serves distributed applications but also eliminates the FHE overhead obstacle by achieving high efficiency in FHE computations. Furthermore, the distributed approach is scalable across three factors: analysis accuracy, execution time and the amount of resources used. This scalability feature enables users to consider the requirements of their analysis tasks based on these factors (e.g. users may have limited resources or time constraints to accomplish their analysis tasks). Finally, we designed and implemented two privacy-preserving real-time cloud-based applications to demonstrate the efficiency and computational capabilities of HE cryptosystems for applications that require timely and reliable delivery of services. First, we developed a secure cloud-based billing model for a sensor-enabled smart grid infrastructure by using lightweight HE. This model handled billing analysis tasks for individual users in a secure manner without the need to interact with any trusted parties. Second, we built a real-time secure health surveillance model for smarter health communities in the cloud. We developed a secure change detection model based on an exponential smoothing technique to predict future changes in health vital signs, based on FHE. Moreover, we built an innovative technique to parallelise FHE computations, which significantly reduces computational overhead.
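
    As a concrete illustration of the kind of computation the lightweight HE billing model enables, the sketch below uses a toy, insecure Paillier-style additively homomorphic scheme: the cloud sums encrypted meter readings and applies a public tariff without ever decrypting them. This is a hedged sketch under assumed parameters, not the thesis's implementation; the key sizes, readings and tariff are hypothetical and far too small for real use.

```python
# Minimal, insecure toy Paillier-style cryptosystem illustrating how a cloud
# server could compute a bill over encrypted smart-meter readings.
# Illustrative only: toy key size, hypothetical readings and tariff.
import random
from math import gcd

p, q = 293, 433                      # toy primes (real keys use >= 2048-bit moduli)
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                 # valid decryption helper since g = n + 1

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n) * mu % n

def add(c1, c2):                     # E(a) * E(b) mod n^2 = E(a + b)
    return (c1 * c2) % n2

def scalar_mul(c, k):                # E(a) ** k mod n^2 = E(k * a)
    return pow(c, k, n2)

# Data owner encrypts hourly meter readings (kWh) and uploads the ciphertexts.
readings = [12, 7, 30, 25]
ciphertexts = [encrypt(m) for m in readings]

# Cloud server: aggregate encrypted readings and apply a public tariff
# (cents/kWh) without ever seeing the individual plaintext readings.
tariff = 21
enc_total = ciphertexts[0]
for c in ciphertexts[1:]:
    enc_total = add(enc_total, c)
enc_bill = scalar_mul(enc_total, tariff)

# Data owner decrypts the final bill on their own secure side.
assert decrypt(enc_bill) == sum(readings) * tariff
print("bill (cents):", decrypt(enc_bill))
```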

    Analysing, visualising and supporting collaborative learning using interactive tabletops

    The key contribution of this thesis is a novel approach to design, implement and evaluate the conceptual and technological infrastructure that captures students’ activity at interactive tabletops and analyses these data through Interaction Data Analytics techniques to provide support to teachers by enhancing their awareness of students’ collaboration. To achieve the above, this thesis presents a series of carefully designed user studies to understand how to capture, analyse and distil indicators of collaborative learning. We perform this in three steps: the exploration of the feasibility of the approach, the construction of a novel solution and the execution of the conceptual proposal, both under controlled conditions and in the wild. A total of eight datasets were analysed for the studies that are described in this thesis. This work pioneered a number of areas, including the application of data mining techniques to study collaboration at the tabletop, a plug-in solution that adds user identification to a regular tabletop using a depth sensor, and the first multi-tabletop classroom used to run authentic collaborative activities associated with the curricula. In summary, while the mechanisms, interfaces and studies presented in this thesis were mostly explored in the context of interactive tabletops, the findings are likely to be relevant to other forms of groupware and learning scenarios that can be implemented in real classrooms. Through the mechanisms, the studies conducted and our conceptual framework, this thesis provides an important research foundation for the ways in which interactive tabletops, along with data mining and visualisation techniques, can be used to improve teachers’ understanding of students’ collaboration and learning in small groups.

    Measuring Competitiveness at NUTS3 Level and Territorial Partitioning of the Italian Provinces

    In this paper we propose a dashboard of indicators of territorial attractiveness at NUTS3 level in the framework of the EU Regional Competitiveness Index (RCI). Then, the Fuzzy C-Medoids Clustering model with multivariate data and contiguity constraints is applied for partitioning the Italian provinces (NUTS3). The novelty lies in the territorial level analysed and in the identification of the elementary indicators underlying the construction of the eleven composite competitiveness pillars. The positioning of the Italian provinces is analysed in depth. The clusters obtained with and without constraints are compared. The obtained partition may play an important role in the design of policies at the NUTS3 level, a route already considered by the Italian government. The analysis developed and the related set of indicators at NUTS3 level constitute an information base that could be effectively used for the implementation of the National Recovery and Resilience Plan (NRRP).
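
    As a rough sketch of the clustering engine behind the partitioning (without the spatial contiguity constraint the paper adds), the following is a plain fuzzy c-medoids implementation run on synthetic indicator data. It is illustrative only: the data, number of clusters and fuzzifier are hypothetical and not taken from the paper.

```python
# Sketch of plain fuzzy c-medoids clustering (no contiguity constraint),
# illustrating the core of the partitioning method on synthetic indicators.
import numpy as np

def fuzzy_c_medoids(X, c=3, m=2.0, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    n = len(X)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)  # pairwise distances
    medoids = rng.choice(n, size=c, replace=False)
    for _ in range(n_iter):
        d = D[:, medoids] + 1e-12                  # distances to current medoids
        # membership of unit i in cluster j: standard fuzzy update
        U = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
        # new medoid of cluster j minimises the membership-weighted distances
        new_medoids = np.array([np.argmin(U[:, j] ** m @ D) for j in range(c)])
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return medoids, U

# Synthetic "provinces" described by standardised competitiveness indicators.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc, 0.3, size=(20, 4)) for loc in (-1.0, 0.0, 1.0)])
medoids, U = fuzzy_c_medoids(X, c=3)
print("medoid indices:", medoids)
print("hard assignment of first 5 units:", U[:5].argmax(axis=1))
```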

    Data science for buildings, a multi-scale approach bridging occupants to smart-city energy planning

    In a context of global carbon emission reduction goals, buildings have been identified as holding valuable energy-saving potential. With the exponential increase of smart, connected building automation systems, massive amounts of data are now accessible for analysis. These, coupled with powerful data science methods and machine learning algorithms, present a unique opportunity to identify untapped energy-saving potential from field information and effectively turn buildings into active assets of the built energy infrastructure. However, the diversity of building occupants and infrastructures, and the disparities in collected information, have produced disjointed scales of analytics that make it difficult for approaches to scale and generalize over the building stock. This, coupled with the lack of standards in the sector, has hindered the broader adoption of data science practices in the field and raised the following question: how can data science facilitate the scaling of approaches and bridge disconnected spatiotemporal scales of the built environment to deliver enhanced energy-saving strategies? This thesis addresses this question by investigating data-driven, scalable, interpretable and multi-scale approaches across varying analytical classes. The work particularly explores descriptive, predictive, and prescriptive analytics to connect occupants, buildings, and urban energy planning together for improved energy performance. First, a novel multi-dimensional data-mining framework is developed, producing distinct dimensional outlines supporting systematic methodological approaches and refined knowledge discovery. Second, an automated building heat dynamics identification method is put forward, supporting large-scale thermal performance examination of buildings in a non-intrusive manner. The method produced 64% good-quality model fits, against 14% close and 22% poor ones, out of 225 Dutch residential buildings. Third, a pioneering hierarchical forecasting method was designed, bridging individual and aggregated building load predictions in a coherent, data-efficient fashion. The approach was evaluated over hierarchies of 37, 140, and 383 nodal elements and showcased improved accuracy and coherency against disjointed prediction systems. Finally, building occupants and urban energy planning strategies are investigated under the prism of uncertainty. In a neighborhood of 41 Dutch residential buildings, occupants were determined to significantly impact optimal energy community designs in the context of weather and economic uncertainties. Overall, the thesis demonstrated the added value of multi-scale approaches in all analytical classes while fostering best data-science practices in the sector through benchmarks and open-source implementations.
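
    To illustrate what coherent hierarchical forecasts mean in this context, the sketch below applies a standard least-squares (OLS) reconciliation step to a toy two-level hierarchy of building loads. It is a generic textbook step, not the thesis's own hierarchical method; the hierarchy and forecast values are made up.

```python
# Generic least-squares (OLS) forecast reconciliation on a toy hierarchy:
# one aggregate node (e.g. a neighbourhood feeder) above three buildings.
# Textbook reconciliation step, not the thesis's own algorithm.
import numpy as np

# Summing matrix S maps bottom-level series to every node in the hierarchy:
# rows = [total, building A, building B, building C]
S = np.array([
    [1, 1, 1],
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
], dtype=float)

# Incoherent base forecasts (kWh) produced independently per node:
# the buildings sum to 118, but the aggregate forecast says 110.
y_hat = np.array([110.0, 40.0, 35.0, 43.0])

# OLS reconciliation: project the base forecasts onto the coherent subspace,
# y_tilde = S (S'S)^{-1} S' y_hat.
bottom = np.linalg.solve(S.T @ S, S.T @ y_hat)  # reconciled bottom-level forecasts
y_tilde = S @ bottom

print("reconciled forecasts:", np.round(y_tilde, 2))
print("aggregate equals sum of buildings:",
      np.isclose(y_tilde[0], y_tilde[1:].sum()))
```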

    “Story of a Bank” Basel II accreditation through university-industry collaboration-case study

    This paper deals with a case study of credit risk scoring models at Industrial Bank. The aim of this research is to investigate how a Malaysian financial institution developed and integrated credit risk scoring models with current organisational needs, and to evaluate best practices for university-industry collaboration on this initiative. Attempts were made to categorise the credit risk scoring models initiative according to the variety of statistical modelling techniques used. This is an exploratory study that uses a qualitative research methodology. Analysis of documents, including company annual reports, journal articles, Bank Negara Malaysia (BNM) regulatory reports and working papers, as well as semi-structured interviews, was conducted to identify the organisational needs arising from context and task. A company-wide development system for credit risk scoring models was effectively integrated to provide direct support to the competence management endeavour. The company’s credit risk scoring models initiatives have also resulted in managerial implications such as increased effectiveness of risk management through measuring the riskiness of each customer and automating the whole process, thereby leading to significant efficiency improvements. Thus, scoring models help banks to control credit risks. Going forward, credit risk scoring models are set to become the best-practice approach to the receivables management process and are essential to effective credit risk management.
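
    The paper does not specify which statistical technique underlies the bank's scoring models; as a generic illustration of one common choice, the sketch below fits a logistic-regression scorecard to synthetic applicant data. All feature names, figures and parameters are hypothetical and not drawn from the case study bank.

```python
# Illustrative logistic-regression credit scorecard on synthetic data.
# Feature names and figures are hypothetical, not from the case study bank.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
# Hypothetical applicant features: income (RM '000), debt ratio, past-due counts.
income = rng.normal(8, 3, n).clip(1, None)
debt_ratio = rng.uniform(0, 1, n)
past_due = rng.poisson(0.5, n)
# Synthetic default probability rises with debt ratio and delinquencies.
logit = -2.0 + 3.0 * debt_ratio + 0.8 * past_due - 0.15 * income
default = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([income, debt_ratio, past_due])
X_train, X_test, y_train, y_test = train_test_split(X, default, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
pd_scores = model.predict_proba(X_test)[:, 1]      # predicted probability of default
print("mean predicted PD:", round(pd_scores.mean(), 3))
print("test accuracy:", round(model.score(X_test, y_test), 3))
```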

    Forecasting: theory and practice

    Forecasting has always been at the forefront of decision making and planning. The uncertainty that surrounds the future is both exciting and challenging, with individuals and organisations seeking to minimise risks and maximise utilities. The lack of a free-lunch theorem implies the need for a diverse set of forecasting methods to tackle an array of applications. This unique article provides a non-systematic review of the theory and the practice of forecasting. We offer a wide range of theoretical, state-of-the-art models, methods, principles, and approaches to prepare, produce, organise, and evaluate forecasts. We then demonstrate how such theoretical concepts are applied in a variety of real-life contexts, including operations, economics, finance, energy, environment, and social good. We do not claim that this review is an exhaustive list of methods and applications. The list was compiled based on the expertise and interests of the authors. However, we hope that our encyclopedic presentation will offer a point of reference for the rich work that has been undertaken over the last decades, with some key insights for the future of forecasting theory and practice.
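
    As a tiny, concrete example of the kind of method and evaluation the review surveys, the sketch below compares simple exponential smoothing against a naive benchmark on a made-up monthly series. It is a generic illustration, not drawn from the article itself.

```python
# Simple exponential smoothing vs. a naive benchmark on a made-up series.
def ses_forecast(series, alpha=0.3):
    """One-step-ahead forecast by simple exponential smoothing."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

def mae(errors):
    return sum(abs(e) for e in errors) / len(errors)

history = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 104, 118]
train, test = history[:-3], history[-3:]

ses_errors, naive_errors = [], []
for i, actual in enumerate(test):
    seen = history[: len(train) + i]          # expanding window of observed values
    ses_errors.append(actual - ses_forecast(seen))
    naive_errors.append(actual - seen[-1])    # naive: repeat the last observation

print("SES MAE:  ", round(mae(ses_errors), 2))
print("Naive MAE:", round(mae(naive_errors), 2))
```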

    Reconciliation, Restoration and Reconstruction of a Conflict Ridden Country

    Conflict has sadly been a constant part of history. Winning a conflict and making a lasting peace are often not the same thing. While a peace treaty ends a conflict and often dictates terms from the winners’ perspective, it may not create a lasting peace. Short of unconditional surrender, modern conflict ends with a negotiated cessation of hostilities. Such accords may have some initial reconstruction agreements, but Reconciliation, Restoration and Reconstruction (RRR) is a long-term process. This study maintains that to achieve a lasting peace: 1) the culture and beliefs of the conflict nation must be continuously considered, and 2) RRR is a long-term effort that will occur over years, not just in the immediate wake of signing a treaty or agreement. To assure the inclusion of all stakeholders and gain the best results in dealing with this “wicked problem”, an array of Operations Research techniques can be used to support the long-term planning and execution of an RRR effort. The final decisions will always be political, but the analysis provided by an OR support team will guide the decision makers to better execute consensus decisions that consider all stakeholder needs. The development of the value hierarchy framework in this dissertation is a keystone of building a rational, OR-supported long-term plan for a successful RRR. The primary aim of the research is to propose a framework and associated set of guidelines derived from appropriate techniques of OR, Decision Analysis and Project Management (from development of a consensus-based value hierarchy to its implementation, feedback and steering corrections) that may be applied to help RRR efforts in any conflict-ridden country across the globe. The framework is applicable to any such country after incorporating changes particular to the specific prolonged conflict.
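
    The dissertation's consensus-based value hierarchy is not reproduced here; as a generic illustration of the additive multi-attribute value calculation that such hierarchies typically reduce to, the sketch below scores one hypothetical RRR plan. The criteria, weights and scores are placeholders, not the dissertation's actual hierarchy.

```python
# Generic additive value-hierarchy scoring; criteria, weights and scores
# are hypothetical placeholders, not the dissertation's actual hierarchy.
from typing import Dict

# Top-level objectives and their sub-criteria, each with a consensus weight.
hierarchy: Dict[str, Dict[str, float]] = {
    "reconciliation": {"trust_building": 0.6, "justice_mechanisms": 0.4},
    "restoration":    {"basic_services": 0.5, "governance": 0.5},
    "reconstruction": {"infrastructure": 0.7, "economy": 0.3},
}
top_weights = {"reconciliation": 0.4, "restoration": 0.3, "reconstruction": 0.3}

# Value scores (0-1) for one candidate RRR plan on each leaf criterion.
plan_scores = {
    "trust_building": 0.7, "justice_mechanisms": 0.5,
    "basic_services": 0.8, "governance": 0.6,
    "infrastructure": 0.4, "economy": 0.5,
}

def overall_value(scores: Dict[str, float]) -> float:
    """Weighted additive value: sum over branches of the branch weight times
    the weighted sum of its leaf criterion scores."""
    total = 0.0
    for branch, leaves in hierarchy.items():
        branch_value = sum(w * scores[leaf] for leaf, w in leaves.items())
        total += top_weights[branch] * branch_value
    return total

print("overall plan value:", round(overall_value(plan_scores), 3))
```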