Digital Technologies for Digital Innovation: Unlocking Data and Knowledge to Drive Organizational Value Creation
The rise of digitization has radically transformed innovation processes of today's companies and is increasingly challenging existing theories and practices. Digital innovation can describe both the use of digital technologies during the innovation process and the outcome of innovation. This thesis aims to improve the understanding of digital innovation in today's digitized world by contributing to the theoretical and practical knowledge along the four organizational activities of the digital innovation process: initiation, development, implementation, and exploitation. In doing so, the thesis pays special attention to the use of digital technologies and tools (e.g., machine learning, online crowdsourcing platforms) that unlock knowledge and data to facilitate new products, services, and other value streams.
When initiating digital innovations, organizations seek to identify, assimilate, and apply valuable knowledge from within and outside the organization. This activity is crucial for organizations as it determines how they address the increasing pressure to innovate in their industries and markets while innovation processes themselves are changing and becoming more distributed and open. Papers A and B of this thesis address this phase by examining how digital technologies are changing knowledge gathering, e.g., through new ways of crowdsourcing ideas and facilitating cooperation and collaboration among users and innovation collectives.
Paper A focuses on organizational culture as a critical backdrop of digital innovations and explores whether it influences the implementation of idea platforms and, in this way, facilitates the discovery of innovations. The paper reveals that the implementation of idea platforms is facilitated by a culture that emphasizes policies, procedures, and information management. Additionally, the paper highlights the importance of taking organizational culture into account when introducing a new technology or process that may be incompatible with the existing culture.
Paper B examines newly formed innovation collectives and initiatives for developing ventilators to address shortages during the rise of the COVID-19 pandemic. The paper focuses on digital technologies enabling a transformation in the way innovation collectives form, communicate, and collaborate - all during a period of shutdown and social distancing. The paper underlines the role of digital technologies and collaboration platforms through networking, communication, and decentralized development. The results show that through the effective use of digital technologies, even complex innovations are no longer developed only in large enterprises but also by innovation collectives that can involve dynamic sets of actors with diverse goals and capabilities. In addition, established organizations are increasingly confronted with community innovations that offer complex solutions based on a modular architecture characteristic of digital innovations.
Such modular, layered architectures are a critical concept in the development of digital innovations. This phase of the digital innovation process encompasses the design, development, and adoption of technological artifacts, which are explored in Papers C and D of this thesis.
Paper C focuses on the latter, the adoption of digital services artifacts in the plant and mechanical engineering industry. The paper presents an integrative model based on the Technology-Organization-Environment (TOE) framework that examines different contextual factors as important components of the introduction, adoption, and routinization of digital service innovations. The results provide a basis for studying the assimilation of digital service innovations and can serve as a reference model for informing managerial decisions.
Paper D, in turn, focuses on the design and development of a technology artifact. It applies cloud-based machine learning services to implement a visual inspection system in the manufacturing industry. The results show, on the one hand, the value of standardization and vendor-supplied IS architecture concepts in digital innovation and, on the other, how such innovations can facilitate further innovations in manufacturing.
The implementation of digital innovations marks the third phase of the digital innovation process, which is addressed in Paper E. It encompasses organizational changes that occur during digital innovation initiatives. This phase emphasizes change through digital innovation initiatives within the organization (e.g., strategy, structure, people, and technology) and across the organizational environment.
Paper E investigates how digital service innovations impact industrial firms, relationships between firms and their customers, and product/service offerings. The paper uses work systems theory as a theoretical foundation to structure the results and analyze them through the lens of service systems. While this analysis helps to identify the organizational changes that result from the implementation of digital innovations, the paper also provides a basis for further research and supports practitioners with systematic analyses of organizational change.
The last phase of the digital innovation process is about exploiting existing systems/data for new purposes and innovations. In this regard, it is important to better understand the improvements and effects in the domains beyond the sheer outcome of digital innovation, such as organizational learning or organizational change capabilities. Paper F of this thesis investigates the exploitation of digital innovations in the context of organizational learning. One aspect of this addresses how individuals within the organization leverage innovation to explore and exploit knowledge.
Paper F utilizes the organizational learning perspective and examines the dynamics of human learning and machine learning to understand how organizations can benefit from their respective idiosyncrasies in enabling bilateral learning. The paper demonstrates how bilateral human-machine learning can improve overall performance using a case study from the trading sector. Drawing on these findings, the paper offers new insights into the coordination of human learning and machine learning and, moreover, the collaboration between human and artificial intelligence in organizational routines.
Evolutionary computation for expensive optimization: a survey
Expensive optimization problems (EOPs) exist widely in significant real-world applications. However, an EOP incurs expensive or even unaffordable costs for evaluating candidate solutions, which makes it costly for an algorithm to find a satisfactory solution. Moreover, due to fast-growing application demands in the economy and society, such as the emergence of smart cities, the Internet of Things, and the big data era, solving EOPs more efficiently has become increasingly essential in various fields, which poses great challenges to the problem-solving ability of optimization approaches for EOPs. Among various optimization approaches, evolutionary computation (EC) is a promising global optimization tool that has been widely used for solving EOPs efficiently in the past decades. Given the fruitful advancements of EC for EOPs, it is essential to review them in order to synthesize previous research experience and provide references that aid the development of relevant research fields and real-world applications. Motivated by this, this paper aims to provide a comprehensive survey showing why and how EC can solve EOPs efficiently. To this end, the paper first analyzes the total optimization cost of EC in solving EOPs. Then, based on this analysis, three promising research directions are pointed out for solving EOPs: problem approximation and substitution, algorithm design and enhancement, and parallel and distributed computation. Note that, to the best of our knowledge, this paper is the first to outline possible directions for efficiently solving EOPs by analyzing the total expensive cost. On this basis, existing works are reviewed comprehensively via a taxonomy with four parts, comprising the above three research directions and a real-world application part. Moreover, some future research directions are also discussed in this paper.
It is believed that such a survey can attract attention, encourage discussion, and stimulate new EC research ideas for solving EOPs and related real-world applications more efficiently.
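The "problem approximation and substitution" direction named above is typically realized via surrogate-assisted EC, in which a cheap model stands in for most expensive evaluations. A minimal sketch of the idea, assuming a toy sphere function as the stand-in for the expensive evaluation (all names, the inverse-distance surrogate, and the parameter values are illustrative choices of ours, not from the survey):

```python
import random

def expensive_eval(x):
    # Stand-in for a costly simulation: the sphere function
    return sum(xi * xi for xi in x)

def surrogate_predict(x, archive):
    # Cheap surrogate: inverse-distance weighting over evaluated points
    num, den = 0.0, 0.0
    for xa, fa in archive:
        d = sum((a - b) ** 2 for a, b in zip(x, xa)) ** 0.5
        if d < 1e-12:
            return fa
        num += fa / d
        den += 1.0 / d
    return num / den

def surrogate_assisted_ea(dim=3, pop_size=10, gens=20, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    # Initial population is evaluated expensively and archived
    archive = [(x, expensive_eval(x)) for x in pop]
    for _ in range(gens):
        # Generate many offspring by Gaussian mutation of random parents
        offspring = [[xi + rng.gauss(0, 0.5) for xi in rng.choice(pop)]
                     for _ in range(pop_size * 3)]
        # Pre-screen the offspring with the cheap surrogate ...
        offspring.sort(key=lambda x: surrogate_predict(x, archive))
        # ... and spend the expensive budget only on the most promising one
        best_off = offspring[0]
        archive.append((best_off, expensive_eval(best_off)))
        # Survivor selection over truly evaluated points only
        archive.sort(key=lambda e: e[1])
        pop = [x for (x, _) in archive[:pop_size]]
    return archive[0]  # best (x, f) found

best_x, best_f = surrogate_assisted_ea()
```

The point of the sketch is the budget accounting: per generation, 30 candidates are screened by the model but only one true evaluation is paid for. Practical surrogate-assisted EC typically replaces the inverse-distance model with Kriging or RBF networks plus a model-management strategy.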
Agroforestry-Based Ecosystem Services
As a dynamic interface between agriculture and forestry, agroforestry has only recently been formally recognized as a relevant part of land use with "trees outside forest" in important parts of the world, but not everywhere yet. The Sustainable Development Goals have called attention to the need for the multifunctionality of landscapes that simultaneously contribute to multiple goals. In the UN decade of landscape restoration, as well as in response to the climate change urgency and biodiversity extinction crisis, an increase in global tree cover is widely seen as desirable, but its management by farmers or forest managers remains contested. Agroforestry research relates tree-soil-crop-livestock interactions at the plot level with landscape-level analysis of social-ecological systems and efforts to transcend the historical dichotomy between forest and agriculture as separate policy domains. An "ecosystem services" perspective quantifies land productivity, flows of water, net greenhouse gas emissions, and biodiversity conservation, and combines an "actor" perspective (farmer, landscape manager) with that of "downstream" stakeholders (in the same watershed, ecologically conscious consumers elsewhere, global citizens) and higher-level regulators designing land-use policies and spatial zoning.
Evolutionary Computation 2020
Intelligent optimization is based on the mechanisms of computational intelligence to refine a suitable feature model, design an effective optimization algorithm, and then obtain an optimal or satisfactory solution to a complex problem. Intelligent algorithms are key tools to ensure global optimization quality, fast optimization efficiency, and robust optimization performance. Intelligent optimization algorithms have been studied by many researchers, leading to improvements in the performance of algorithms such as the evolutionary algorithm, the whale optimization algorithm, the differential evolution algorithm, and particle swarm optimization. Studies in this arena have also resulted in breakthroughs in solving complex problems, including the green shop scheduling problem, the severe nonlinear problem in one-dimensional geodesic electromagnetic inversion, error and bug finding in software, the 0-1 knapsack problem, the traveling salesman problem, and the logistics distribution center siting problem. The editors are confident that this book can open a new avenue for further improvements and discoveries in the area of intelligent algorithms. The book is a valuable resource for researchers interested in understanding the principles and design of intelligent algorithms.
WCET and Priority Assignment Analysis of Real-Time Systems using Search and Machine Learning
Real-time systems have become indispensable for human life as they are used in numerous industries, such as vehicles, medical devices, and satellite systems. These systems are very sensitive to violations of their time constraints (deadlines), which can have catastrophic consequences. To verify whether the systems meet their time constraints, engineers perform schedulability analysis from early stages and throughout development. However, there are challenges in obtaining precise results from schedulability analysis due to estimating the worst-case execution times (WCETs) and assigning optimal priorities to tasks.
Estimating WCET is an important activity at early design stages of real-time systems. Based on such WCET estimates, engineers make design and implementation decisions to ensure that task executions always complete before their specified deadlines. However, in practice, engineers often cannot provide precise point estimates of WCET and prefer to provide plausible WCET ranges.
Task priority assignment is an important decision, as it determines the order of task executions and it has a substantial impact on schedulability results. It thus requires finding optimal priority assignments so that tasks not only complete their execution but also maximize the safety margins from their deadlines. Optimal priority values increase the tolerance of real-time systems to unexpected overheads in task executions so that they can still meet their deadlines. However, it is a hard problem to find optimal priority assignments because their evaluation relies on uncertain WCET values and complex engineering constraints must be accounted for.
This dissertation proposes three approaches to estimate WCET and assign optimal priorities at design stages. Combining a genetic algorithm and logistic regression, we first suggest an automatic approach to infer safe WCET ranges with a probabilistic guarantee based on worst-case scheduling scenarios. We then introduce an extended approach to account for weakly hard real-time systems with an industrial schedule simulator. We evaluate our approaches by applying them to industrial systems from different domains and to several synthetic systems. The results suggest that they can estimate probabilistic safe WCET ranges efficiently and accurately, so that deadline constraints are likely to be satisfied with a high degree of confidence. Moreover, we propose an automated technique that aims to identify the best possible priority assignments in real-time systems. The approach deals with multiple objectives regarding safety margins and engineering constraints using a coevolutionary algorithm. Evaluation with synthetic and industrial systems shows that the approach significantly outperforms both a baseline approach and solutions defined by practitioners. All the solutions in this dissertation scale to complex industrial systems for offline analysis within an acceptable time, i.e., at most 27 hours.
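The combination of sampled scheduling scenarios and logistic regression described above can be illustrated with a toy sketch, assuming a single task whose deadline misses depend noisily on its WCET. The oracle, the 5% miss-probability target, and all hyperparameters below are illustrative choices of ours, not the dissertation's tooling:

```python
import math
import random

def deadline_missed(wcet, rng):
    # Toy schedulability oracle: a task with deadline 10 plus noisy
    # interference; larger WCET values make deadline misses more likely
    return wcet + rng.gauss(0.0, 1.0) > 10.0

def fit_logistic(zs, ys, lr=0.1, epochs=3000):
    # 1-D logistic regression P(miss) = sigmoid(a*z + b),
    # fitted by plain batch gradient descent
    a, b = 0.0, 0.0
    n = len(zs)
    for _ in range(epochs):
        ga = gb = 0.0
        for z, y in zip(zs, ys):
            p = 1.0 / (1.0 + math.exp(-(a * z + b)))
            ga += (p - y) * z
            gb += (p - y)
        a -= lr * ga / n
        b -= lr * gb / n
    return a, b

rng = random.Random(1)
wcets = [rng.uniform(5.0, 15.0) for _ in range(400)]    # sampled WCET candidates
labels = [1.0 if deadline_missed(w, rng) else 0.0 for w in wcets]
a, b = fit_logistic([w - 10.0 for w in wcets], labels)  # centred feature

# Safe WCET upper bound: largest WCET whose predicted miss probability <= 5%
p_target = 0.05
wcet_max = 10.0 + (math.log(p_target / (1.0 - p_target)) - b) / a
```

Inverting the fitted model at the target probability yields a safe WCET range (here, anything up to `wcet_max`); the dissertation's approach additionally uses a genetic algorithm to search for the worst-case scheduling scenarios that generate such labeled samples.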
The diversity-accuracy duality in ensembles of classifiers
Horizontal scaling of Machine Learning algorithms has the potential to tackle concerns over the scalability and sustainability of Deep Learning methods, viz. their consumption of energy and computational resources, as well as their increasing inaccessibility to researchers. One way to enact horizontal scaling is to employ ensemble learning methods, since they enable distribution. There is a consensus that diversity between individual learners leads to better performance, which is why we have focused on it as the criterion for distributing the base models of an ensemble. However, there is no standard agreement on how diversity should be defined, and thus on how to exploit it to construct a high-performing classifier. Therefore, we have proposed different definitions of diversity and innovative algorithms which promote it in a systematic way.
We have first considered architectural diversity with an algorithm called WILDA: Wide Learning of Diverse Architectures. In a distributed fashion, this algorithm evolves a set of neural networks that are pretrained on the target task and diverse w.r.t. architectural feature descriptors. We have then generalised this notion by defining behavioural diversity on the basis of the divergence between the errors made by different models on a dataset. We have defined several diversity metrics and used them to guide a novelty search algorithm which builds an ensemble of behaviourally diverse classifiers. The algorithm promotes diversity in ensembles by explicitly searching for it, without selecting for accuracy. We have then extended this approach with a surrogate diversity model, which reduces the computational burden of this search by eliminating the need to train each network in the population with stochastic gradient descent at each step. These methods have enabled us to investigate the role that both architectural and behavioural diversity play in contributing to the performance of an ensemble.
In order to study the relationship between diversity and accuracy in classifier ensembles, we have then proposed several methods that extend the novelty search with accuracy objectives. Surprisingly, we have observed that, with the highest-performing diversity metrics, there is an equivalence between searching for diversity objectives and searching for accuracy objectives. This contradicts widespread assumptions that a trade-off must be found by balancing diversity and accuracy objectives. We therefore posit the existence of a diversity-accuracy duality in ensembles of classifiers. An implication of this is the possibility of evolving diverse ensembles without detriment to their accuracy, since it is implicitly ensured.
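A common way to make behavioural diversity of the kind discussed above concrete is mean pairwise disagreement between members' predictions on a dataset. The short sketch below is our own minimal illustration, not one of the thesis's specific metrics:

```python
from itertools import combinations

def disagreement(preds_a, preds_b):
    # Fraction of examples on which two classifiers' predictions differ
    assert len(preds_a) == len(preds_b)
    return sum(a != b for a, b in zip(preds_a, preds_b)) / len(preds_a)

def ensemble_diversity(all_preds):
    # Mean pairwise disagreement across every pair of ensemble members
    pairs = list(combinations(all_preds, 2))
    return sum(disagreement(a, b) for a, b in pairs) / len(pairs)

# Predictions of three toy classifiers on five examples
preds = [
    [0, 1, 1, 0, 1],
    [0, 1, 0, 0, 1],
    [1, 1, 1, 0, 0],
]
div = ensemble_diversity(preds)  # mean of 1/5, 2/5, 3/5 = 0.4
```

A metric of this form (diversity rises as errors diverge) can serve directly as the behavioural descriptor that a novelty search optimizes instead of, or alongside, accuracy.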
Three Risky Decades: A Time for Econophysics?
We publish this Special Issue at a turning point, the like of which we have not faced since World War II. The interconnected long-term global shocks, such as the coronavirus pandemic, the war in Ukraine, and catastrophic climate change, have imposed significant humanitarian, socio-economic, political, and environmental restrictions on the globalization process and on all aspects of economic and social life, including the existence of individual people. The planet is trapped: the current situation seems to be the prelude to an apocalypse whose long-term effects we will feel for decades. It therefore urgently requires that a concept for the planet's survival be built; only on this basis can the conditions for its development be created. The Special Issue gives evidence of the state of econophysics before the current situation. Therefore, it can provide an excellent econophysical, or inter- and cross-disciplinary, starting point for a rational approach to a new era.
Mining Explicit and Implicit Relationships in Data Using Symbolic Regression
Identification of implicit and explicit relations within observed data is a generic problem commonly encountered in several domains including science, engineering, finance, and more. It forms the core component of data analytics, a process of discovering useful information from data sets that are potentially huge and otherwise incomprehensible. In industries, such information is often instrumental for profitable decision making, whereas in science and engineering it is used to build empirical models, propose new or verify existing theories, and explain natural phenomena. In recent times, digital and internet-based technologies have proliferated, making it viable to generate and collect large amounts of data at low cost. This in turn has resulted in an ever-growing need for methods to analyse and draw interpretations from such data quickly and reliably. With this overarching goal, this thesis attempts to make contributions towards developing accurate and efficient methods for discovering such relations through evolutionary search, a method commonly referred to as Symbolic Regression (SR).
A data set of input variables x and a corresponding observed response y is given. The aim is to find an explicit function y = f(x) or an implicit function f(x, y) = 0 which represents the data set. While seemingly simple, the problem is challenging for several reasons. Some of the conventional regression methods try to "guess" a functional form such as linear/quadratic/polynomial and attempt to do a curve-fitting of the data to the equation, which may limit the possibility of discovering more complex relations, if they exist. On the other hand, there are meta-modelling techniques such as the response surface method, Kriging, etc., that model the given data accurately but provide a "black-box" predictor instead of an expression. Such approximations convey little or no insight about how the variables and responses depend on each other, or their relative contribution to the output. SR attempts to alleviate the above two extremes by providing a structure which evolves mathematical expressions instead of assuming them. Thus, it is flexible enough to represent the data, but at the same time provides useful insights instead of a black-box predictor. SR can be categorized as part of Explainable Artificial Intelligence and can contribute to Trustworthy Artificial Intelligence.
The works proposed in this thesis aim to integrate the concept of "semantics" deeper into Genetic Programming (GP) and Evolutionary Feature Synthesis, the two algorithms usually employed for conducting SR. The semantics are integrated into well-known components of the algorithms such as compactness, diversity, recombination, constant optimization, etc. The main contribution of this thesis is the proposal of two novel operators, based on Linear Programming and Mixed Integer Programming, to generate expressions with the aim of controlling the length of the discovered expressions without compromising on accuracy. In the experiments, these operators are shown to discover expressions with better accuracy and interpretability on many explicit and implicit benchmarks. Moreover, some applications of SR on real-world data sets are shown to demonstrate the practicality of the proposed approaches. In relation to practical problems, the thesis also presents how GP can be applied to effectively solve resource-constrained scheduling problems.
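The core SR loop that GP-based methods refine can be illustrated, in heavily simplified form, by a pure random search over expression trees fitted to data from y = x^2 + x. This is an illustrative stand-in for the evolutionary search, with all names and settings our own:

```python
import operator
import random

# Binary operators available to the expression grammar
OPS = [(operator.add, '+'), (operator.sub, '-'), (operator.mul, '*')]

def random_expr(rng, depth=3):
    # Grow a random expression tree over terminals {x, constant} and OPS
    if depth == 0 or rng.random() < 0.3:
        if rng.random() < 0.7:
            return ('x', None, None)
        return (rng.uniform(-2.0, 2.0), None, None)
    op = rng.choice(OPS)
    return (op, random_expr(rng, depth - 1), random_expr(rng, depth - 1))

def evaluate(tree, x):
    node, left, right = tree
    if left is None:                       # terminal: variable or constant
        return x if node == 'x' else node
    return node[0](evaluate(left, x), evaluate(right, x))

def to_str(tree):
    # Render the tree as a readable infix expression
    node, left, right = tree
    if left is None:
        return str(node)
    return f'({to_str(left)} {node[1]} {to_str(right)})'

def symbolic_regression(xs, ys, trials=5000, seed=0):
    # Random search over trees: a stripped-down stand-in for GP's
    # selection/crossover/mutation loop
    rng = random.Random(seed)
    best, best_err = None, float('inf')
    for _ in range(trials):
        t = random_expr(rng)
        err = sum((evaluate(t, x) - y) ** 2 for x, y in zip(xs, ys))
        if err < best_err:
            best, best_err = t, err
    return best, best_err

xs = [i / 2 for i in range(-6, 7)]
ys = [x * x + x for x in xs]               # target relation: y = x^2 + x
best, err = symbolic_regression(xs, ys)
```

The key property SR exploits is visible here: the search returns an inspectable expression (`to_str(best)`) rather than a black-box predictor; GP replaces the blind sampling with fitness-guided variation, and semantic operators of the kind proposed in the thesis further steer it.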
Planning and Scheduling Optimization
Although planning and scheduling optimization have been explored in the literature for many years, they remain a hot topic in current scientific research. The changing market trends, globalization, technical and technological progress, and sustainability considerations make it necessary to deal with new optimization challenges in modern manufacturing, engineering, and healthcare systems. This book provides an overview of recent advances in different areas connected with operations research models and other applications of intelligent computing techniques used for planning and scheduling optimization. The wide range of theoretical and practical research findings reported in this book confirms that the planning and scheduling problem is a complex issue present in different industrial sectors and organizations, and opens promising and dynamic perspectives of research and development.
An Approach Based on Particle Swarm Optimization for Inspection of Spacecraft Hulls by a Swarm of Miniaturized Robots
The remoteness and hazards that are inherent to the operating environments of space infrastructures create the need for automated robotic inspection. In particular, micrometeoroid and orbital debris impacts and structural fatigue are common sources of damage to spacecraft hulls. Vibration sensing has been used to detect structural damage in spacecraft hulls, as well as in structural health monitoring practices in industry, by deploying static sensors. In this paper, we propose using a swarm of miniaturized vibration-sensing mobile robots realizing a network of mobile sensors. We present a distributed inspection algorithm based on the bio-inspired particle swarm optimization and evolutionary algorithm niching techniques to deliver the task of enumeration and localization of an a priori unknown number of vibration sources on a simplified 2.5D spacecraft surface. Our algorithm is deployed on a swarm of simulated cm-scale wheeled robots. These are guided in their inspection task by sensing vibrations arising from failure points on the surface, which are detected by on-board accelerometers. We study three performance metrics: (1) proximity of the localized sources to the ground truth locations, (2) time to localize each source, and (3) time to finish the inspection task given a 75% inspection coverage threshold. We find that our swarm is able to successfully localize the present sources.
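The particle swarm optimization principle underlying the proposed inspection algorithm can be sketched as follows: a generic textbook PSO on a toy single-source localization objective, not the paper's distributed, niching, multi-source variant (all parameter values are our own illustrative choices):

```python
import random

def pso(objective, dim=2, n_particles=20, iters=100, seed=42):
    # Canonical PSO: each particle is pulled toward its personal best
    # and the swarm's global best position
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Locate a single simulated vibration source at (2, -1) by minimizing a
# distance-based attenuation proxy for the sensed vibration amplitude
source = (2.0, -1.0)
found, err = pso(lambda p: (p[0] - source[0]) ** 2 + (p[1] - source[1]) ** 2)
```

In the paper's setting the "particles" are physical robots with motion constraints, the objective is on-board accelerometer readings rather than an analytic function, and niching lets sub-swarms converge on multiple sources simultaneously.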