Big Data Reference Architecture for e-Learning Analytical Systems
Recent advancements in technology have produced big data and made it necessary for researchers to analyze these data in order to make them meaningful. Massive amounts of data are collected across social media sites, mobile communications, business environments and institutions. In order to efficiently analyze this large quantity of raw data, the concept of big data was introduced, and big data analytics is needed to provide techniques for analyzing the data. This concept is expected to help education in the near future by changing the way we approach the e-Learning process, encouraging interaction between learners and teachers, and allowing the individual requirements and goals of learners to be fulfilled. The learning environment generates massive knowledge by means of the various services provided in massive open online courses; such knowledge is produced via learning actor interactions. Data analytics can also be a valuable tool to help e-Learning organizations deliver better services to the public: it can provide important insights into consumer behavior and better predict demand for goods and services, thereby allowing for better resource management. These observations motivate solutions that bring big data usage to the educational field. This research article presents a Big Data Reference Architecture (BiDRA) for e-Learning analytical systems to enable a unified analysis of the massive data generated by learning actors; the reference architecture describes how the massive data produced in a big data e-Learning system are processed. Finally, the BiDRA for e-Learning analytical systems was evaluated based on the quality of maintainability, modularity, reusability, performance, and scalability.
Big Data Reference Architectures, a systematic literature review
Today, we live in a world that produces data at an unprecedented rate. This significant amount of data has attracted a great deal of attention, and many strive to harness the power of this new material. In the same direction, academics and practitioners have considered means through which they can incorporate data-driven functions and explore patterns that were otherwise unknown; this has led to a concept called Big Data. Big Data is a field that deals with data sets that are too large and complex for traditional approaches to handle. Technical matters are fundamentally critical, but what is even more necessary is an architecture that supports the orchestration of Big Data systems: an image of the system providing a clear understanding of its different elements and their interdependencies. Reference architectures aid in defining the body of a system and its key components, relationships, behaviors, patterns and limitations. This study provides an in-depth review of Big Data Reference Architectures by applying a systematic literature review. The study demonstrates a synthesis of high-quality research to offer indications of new trends, and it contributes to the body of knowledge on the principles of Reference Architectures, the current state of Big Data Reference Architectures, and their limitations.
Technology Selection for Big Data and Analytical Applications
The term Big Data has become pervasive in recent years, as smartphones, televisions, washing machines, refrigerators, smart meters, diverse sensors, eyeglasses, and even clothes connect to the Internet. However, the generated data are essentially worthless without appropriate data analytics that utilizes information retrieval, statistics, and various other techniques. As Big Data is commonly too big for a single person or institution to investigate, appropriate tools are being used that go far beyond a traditional data warehouse and that have been developed in recent years. Unfortunately, there is no single solution but a large variety of different tools, each with distinct functionalities, properties and characteristics. Small and medium-sized companies in particular have a hard time keeping track, as this requires time, skills, money, and specific knowledge that, in combination, result in high entrance barriers to Big Data utilization. This paper aims to reduce these barriers by explaining and structuring the different classes of technologies and the basic criteria for proper technology selection. It proposes a framework that guides especially small and mid-sized companies through a suitable selection process that can serve as a basis for further advances.
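A criteria-based selection process of the kind the abstract describes can be illustrated with a minimal weighted-scoring sketch. The criteria, weights, scores, and technology names below are hypothetical placeholders, not taken from the paper.

```python
# Hypothetical weighted-scoring sketch for big data technology selection.
# Criteria, weights, and candidate names are illustrative only.

def rank_technologies(scores, weights):
    """Rank candidate technologies by the weighted sum of their criterion scores."""
    return [
        name
        for name, _ in sorted(
            scores.items(),
            key=lambda item: sum(weights[c] * s for c, s in item[1].items()),
            reverse=True,
        )
    ]

# Weights reflect a small company's priorities (must sum to 1 here by convention).
weights = {"scalability": 0.4, "cost": 0.3, "skills_required": 0.3}

# Scores on a 1-5 scale per criterion (higher is better).
scores = {
    "warehouse_a": {"scalability": 2, "cost": 4, "skills_required": 5},
    "stream_engine_b": {"scalability": 5, "cost": 2, "skills_required": 2},
}

print(rank_technologies(scores, weights))
```

In this toy setup the conventional warehouse wins despite lower scalability, because cost and available skills carry 60% of the weight; that is exactly the kind of trade-off a structured selection framework makes explicit.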
An End-to-End Big Data Analytics Platform for IoT-enabled Smart Factories: A Case Study of Battery Module Assembly System for Electric Vehicles
Within the concept of factories of the future, big data analytics systems play a critical role in supporting decision-making at various stages across enterprise processes. However, the design and deployment of industry-ready, lightweight, modular, flexible, and low-cost big data analytics solutions remains one of the main challenges of the Industry 4.0 enabled digital transformation. This paper presents an end-to-end IoT-based big data analytics platform that consists of five interconnected layers and several components for data acquisition, integration, storage, analytics and visualisation purposes. The platform architecture benefits from state-of-the-art technologies and integrates them in a systematic and interoperable way with clear information flows. The developed platform has been deployed in an Electric Vehicle (EV) battery module smart assembly automation system designed by the Automation Systems Group (ASG) at the University of Warwick, UK. The developed proof-of-concept solution demonstrates how a wide variety of tools and methods can be orchestrated to work together to support decision-making and to improve both process and product quality in smart manufacturing environments.
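The five-layer structure described in the abstract (acquisition, integration, storage, analytics, visualisation) can be sketched as a chain of functions. The layer names follow the abstract; the battery-cell data, threshold, and internals of each layer are hypothetical stand-ins for real components.

```python
# Illustrative sketch of a five-layer analytics pipeline; only the layer
# names come from the abstract, everything else is hypothetical.

def acquire():           # acquisition layer: e.g. read IoT sensor samples
    return [{"cell": 1, "voltage": 3.71}, {"cell": 2, "voltage": 3.62}]

def integrate(records):  # integration layer: harmonise units/schemas
    return [{**r, "voltage_mv": round(r["voltage"] * 1000)} for r in records]

store = []               # storage layer stand-in (a real system would use a database)

def analyse(records):    # analytics layer: flag cells below a hypothetical threshold
    return [r for r in records if r["voltage_mv"] < 3650]

def visualise(flagged):  # visualisation layer stand-in for a dashboard
    return [f"cell {r['cell']}: {r['voltage_mv']} mV low" for r in flagged]

records = integrate(acquire())
store.extend(records)
print(visualise(analyse(records)))
```

The value of such a layered design is that each stage exposes a narrow interface, so individual layers (say, the storage backend) can be swapped without touching the rest of the pipeline.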
An Industrial Data Analysis and Supervision Framework for Predictive Manufacturing Systems
Due to advancements in the Information and Communication Technologies field in the modern interconnected world, the manufacturing industry is becoming an increasingly data-rich environment, with large volumes of data being generated on a daily basis, thus presenting a new set of opportunities to be explored towards improving the efficiency and quality of production processes. This can be done through the development of so-called Predictive Manufacturing Systems. These systems aim to improve manufacturing processes through a combination of concepts such as Cyber-Physical Production Systems, Machine Learning and real-time Data Analytics in order to predict future states and events in production. This can be used in a wide array of applications, including predictive maintenance policies, improving quality control through the early detection of faults and defects, or optimizing energy consumption, to name a few. Therefore, the research efforts presented in this document focus on the design and development of a generic framework to guide the implementation of predictive manufacturing systems through a set of common requirements and components. This approach aims to enable manufacturers to extract, analyse, interpret and transform their data into actionable knowledge that can be leveraged into a business advantage. To this end, a list of goals and functional and non-functional requirements is defined for these systems based on a thorough literature review and empirical knowledge. Subsequently, the Intelligent Data Analysis and Real-Time Supervision (IDARTS) framework is proposed, along with a detailed description of each of its main components. Finally, a pilot implementation is presented for each of these components, followed by the demonstration of the proposed framework in three different scenarios comprising several use cases in varied real-world industrial areas. In this way the proposed work aims to provide a common foundation for the full realization of Predictive Manufacturing Systems.
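The early fault detection mentioned in the abstract typically rests on some form of real-time anomaly detection over sensor streams. A minimal sketch, assuming hypothetical vibration data and a simple trailing-window z-score rule (the paper itself does not prescribe this method):

```python
# Minimal sketch of real-time fault detection as a predictive manufacturing
# system might perform it; data, window size, and threshold are hypothetical.
from statistics import mean, stdev

def detect_anomalies(readings, window=5, k=3.0):
    """Flag indices whose reading deviates more than k standard deviations
    from the mean of the preceding `window` readings."""
    flagged = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(readings[i] - mu) > k * sigma:
            flagged.append(i)
    return flagged

# A stable vibration signal with one sudden spike at index 6.
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 5.0, 1.0]
print(detect_anomalies(vibration))  # → [6]
```

In an IDARTS-style deployment this kind of rule would run close to the shop floor, with flagged events feeding the supervision and decision-support components rather than a simple print statement.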
Continuous maintenance and the future – Foundations and technological challenges
High-value and long-life products require continuous maintenance throughout their life cycle to achieve the required performance at optimum through-life cost. This paper presents the foundations and technologies required to offer such a maintenance service. Component- and system-level degradation science, assessment and modelling, along with life cycle 'big data' analytics, are the two most important knowledge and skill bases required for continuous maintenance. Advanced computing and visualisation technologies will improve the efficiency of maintenance and reduce the through-life cost of the product. The future of continuous maintenance within the Industry 4.0 context is also considered, identifying the role of IoT, standards and cyber security.
Proposing A Supply Chain Analytics Reference Model As Performance Enabler
Nowadays, firms have to react quickly to changing markets, creating a need for accurate forecasts of demand and supply. In a data-rich environment such as supply chain management, much information needs to be stored, processed, and transformed for decision making. To deal with the increasing amounts of data, firms must be aware of opportunities in supply chain management, such as supply chain analytics capabilities, in order to stay agile and flexible and to make use of (complex) data. Supply chain analytics can predict patterns and trends in real time, even in high-velocity markets, supporting decision making with data-driven tools. The benefits of successfully implementing supply chain analytics processes are enormous and result in competitive advantages for companies, such as lowering costs while increasing revenues. As many companies fail to apply supply chain analytics processes and tools, this paper examines the challenges, benefits, and factors relevant to the introduction of supply chain analytics using the input-output model.
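The demand forecasting that the abstract identifies as a core need can be illustrated with the simplest possible method. The weekly demand figures and the moving-average approach below are illustrative only; real supply chain analytics would use far richer models.

```python
# Sketch of a naive demand forecast of the kind supply chain analytics
# builds on; data and method (simple moving average) are illustrative.

def moving_average_forecast(demand, window=3):
    """Forecast next-period demand as the mean of the last `window` periods."""
    if len(demand) < window:
        raise ValueError("not enough demand history for the chosen window")
    return sum(demand[-window:]) / window

weekly_demand = [120, 130, 125, 140, 135]  # hypothetical units sold per week
print(moving_average_forecast(weekly_demand))
```

Even this naive baseline shows the input-output structure the paper's model captures: historical demand data flows in, a processed forecast flows out to purchasing and inventory decisions.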