The correlation between values-based leadership and economic success: An empirical evaluation within selected German cooperative banks and related policy implications
We live in a society characterised by a permanently developing knowledge culture. The emergence of this megatrend, among others, in combination with the financial crisis that started in 2007, forms the basis of the discussion about values and their impact on economic success. This study explores the links between values that promote leadership, especially cooperative values, and economic success.
The thesis is based on an online survey of all cooperative banks in Germany with an individual balance sheet total of over 1,500 million euros. These banks represent the largest cooperative banks in Germany, and their employees were invited to answer a questionnaire in order to analyse whether cooperative values are part of everyday leadership and are perceived accordingly. The examined values were fairness, confidence, certainty, competence, reliability, individuality, common ground, respect, partnership, responsibility and solidarity. These values were correlated with financial figures: capital adequacy, asset quality, management efficiency, earnings quality and liquidity management. In addition, the questionnaire contained questions about performance appraisal systems, including feedback systems for executives.
The concept of values-based leadership and economic success, measured in key performance indicators, formed the conceptual framework as presented in the literature review. Beyond that, this research follows the fundamental philosophy of critical theory, because critical theory as a social theory is oriented toward critiquing society as a whole or, as in this research project, a part of our society.
The study shows first, modest indications of the relationships between cooperative management values and business key figures. Correlation analysis was one of the main statistical methods of the study, because it measures the relationship between two items, in this case values and financial figures. In addition, various regression analyses were carried out. The aim of regression analysis is to determine the relationship between a dependent variable (financial figures) on the one hand and several explanatory variables (cooperative values) on the other.
The elaborations in this study indicate that values-based leadership might have a positive influence on economic success. Organisations could be able to improve their results if they follow the concept of values-based leadership or even the cooperative values management style. The findings of this study might have important implications for those training, coaching or selecting executives, those intending to take a leadership position or who already are leaders, organisations in which values-based leadership is put into focus, and other researchers who want to build on the results. Thus, this study contributes to both practice and knowledge.
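The correlation-then-regression procedure described above can be sketched in a few lines of Python. The data here are synthetic stand-ins (the actual survey responses and bank figures are not public), and the variable names are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: ratings of two cooperative values (e.g. fairness,
# reliability) for 40 hypothetical banks, and one financial figure that
# partly depends on them plus noise.
fairness = rng.normal(3.5, 0.5, 40)
reliability = rng.normal(4.0, 0.4, 40)
earnings_quality = 0.6 * fairness + 0.3 * reliability + rng.normal(0, 0.2, 40)

# Step 1: pairwise Pearson correlation between one value and the figure.
r = np.corrcoef(fairness, earnings_quality)[0, 1]

# Step 2: multiple linear regression of the financial figure (dependent
# variable) on several values (explanatory variables) via least squares.
X = np.column_stack([np.ones(40), fairness, reliability])
coef, *_ = np.linalg.lstsq(X, earnings_quality, rcond=None)

print(f"Pearson r = {r:.2f}")
print(f"intercept = {coef[0]:.2f}, b_fairness = {coef[1]:.2f}")
```

With real survey data the same two steps apply: correlate each value with each key figure, then regress each figure on the full set of values.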
History, Features, Challenges, and Critical Success Factors of Enterprise Resource Planning (ERP) in The Era of Industry 4.0
ERP has been adopting newer features over the last several decades, shaping global businesses with the advent of newer technologies. This research article uses a state-of-the-art review method to review and synthesize the latest information on the possible integration of potential Industry 4.0 technologies into the future development of ERP. The different software systems that contributed to the development of existing ERP are found to be Material Requirement Planning (MRP), Manufacturing Resource Planning (MRP II), and Computer Integrated Manufacturing (CIM). Potential disruptive Industry 4.0 technologies expected to be integrated into future ERP are artificial intelligence, business intelligence, the internet of things, big data, blockchain technology, and the omnichannel strategy. Notable critical success factors of ERP have been reported to be top management support, project team, IT infrastructure, communication, skilled staff, training & education, and monitoring & evaluation. Moreover, cybersecurity has been found to be the most challenging issue to overcome in future versions of ERP. This review article could help future ERP researchers and respective stakeholders contribute to integrating newer features into future versions of ERP.
Atomistically-informed continuum modeling and isogeometric analysis of 2D materials over holey substrates
This work develops, discretizes, and validates a continuum model of a molybdenum disulfide (MoS2) monolayer interacting with a periodic holey silicon nitride (Si3N4) substrate via van der Waals (vdW) forces. The MoS2 layer is modeled as a geometrically nonlinear Kirchhoff–Love shell, and vdW forces are modeled by a Lennard-Jones (LJ) potential, simplified using approximations for a smooth substrate topography. Both the shell model and LJ interactions include novel extensions informed by close comparison with fully-atomistic calculations. The material parameters of the shell model are calibrated by comparing small-strain tensile and bending tests with atomistic simulations. This model is efficiently discretized using isogeometric analysis (IGA) for the shell structure and a pseudo-time continuation method for energy minimization. The IGA shell model is validated against fully-atomistic calculations for several benchmark problems with different substrate geometries. Agreement with atomistic results depends on geometric nonlinearity in some cases, but a simple isotropic St. Venant–Kirchhoff model is found to be sufficient to represent material behavior. We find that the IGA discretization of the continuum model has a much lower computational cost than atomistic simulations, and expect that it will enable efficient design space exploration in strain engineering applications. This is demonstrated by studying the dependence of strain and curvature in MoS2 over a holey substrate as a function of the hole spacing on scales inaccessible to atomistic calculations. The results show an unexpected qualitative change in the deformation pattern below a critical hole separation.
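The LJ interaction underlying the vdW model can be illustrated with the standard 12-6 form; the parameter values below are placeholders for illustration, not the calibrated MoS2–Si3N4 values from this work:

```python
import numpy as np

def lj_potential(r, epsilon=1.0, sigma=1.0):
    """Standard 12-6 Lennard-Jones pair potential.

    epsilon sets the well depth and sigma the zero-crossing distance;
    both are placeholder values, not the calibrated MoS2/Si3N4 ones.
    """
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)

# The potential minimum sits at r = 2**(1/6) * sigma with depth -epsilon;
# for r < sigma the potential is strongly repulsive.
r_min = 2.0 ** (1.0 / 6.0)
print(lj_potential(r_min))  # ≈ -1.0
```

In the continuum setting, a potential of this form is integrated over the substrate surface rather than summed over atom pairs, which is where the smooth-topography approximations enter.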
Leveraging a machine learning based predictive framework to study brain-phenotype relationships
An immense collective effort has been put towards the development of methods for quantifying brain activity and structure. In parallel, a similar effort has focused on collecting experimental data, resulting in ever-growing data banks of complex human in vivo neuroimaging data. Machine learning, a broad set of powerful and effective tools for identifying multivariate relationships in high-dimensional problem spaces, has proven to be a promising approach toward better understanding the relationships between the brain and different phenotypes of interest. However, applying machine learning within a predictive framework to the study of neuroimaging data introduces several domain-specific problems and considerations, leaving the overarching question of how best to structure and run experiments ambiguous. In this work, I cover two explicit pieces of this larger question: the relationship between data representation and predictive performance, and a case study on issues related to data collected from disparate sites and cohorts. I then present the Brain Predictability toolbox, a software package to explicitly codify, and make more broadly accessible to researchers, the recommended steps in performing a predictive experiment, everything from framing a question to reporting results. This unique perspective ultimately offers recommendations, explicit analytical strategies, and example applications for using machine learning to study the brain.
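A predictive experiment of the kind described above, fit on one portion of the subjects and evaluated on a held-out portion, can be sketched in plain NumPy. The features and phenotype here are synthetic stand-ins, and this is a generic sketch rather than the Brain Predictability toolbox's actual API:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins: 200 "subjects" with 50 imaging-derived features
# and a continuous phenotype that depends on a few of the features.
X = rng.normal(size=(200, 50))
true_w = np.zeros(50)
true_w[:5] = [0.8, -0.5, 0.6, 0.4, -0.7]
y = X @ true_w + rng.normal(scale=0.5, size=200)

# Hold out 25% of subjects: fit only on the training split, score on test.
train, test = np.arange(150), np.arange(150, 200)

# Ridge regression (closed form) to stay stable when features are many.
lam = 1.0
A = X[train].T @ X[train] + lam * np.eye(50)
w = np.linalg.solve(A, X[train].T @ y[train])

pred = X[test] @ w
r2 = 1.0 - np.sum((y[test] - pred) ** 2) / np.sum((y[test] - y[test].mean()) ** 2)
print(f"held-out R^2 = {r2:.2f}")
```

The key structural point is that the model never sees the held-out subjects during fitting; site and cohort effects complicate exactly this step, since a "held-out" split drawn from the same site can still leak shared structure.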
The Metaverse: Survey, Trends, Novel Pipeline Ecosystem & Future Directions
The Metaverse offers a second world beyond reality, where boundaries are non-existent and possibilities are endless, through engagement and immersive experiences using virtual reality (VR) technology. Many disciplines can benefit from the advancement of the Metaverse when accurately developed, including the fields of technology, gaming, education, art, and culture. Nevertheless, developing the Metaverse environment to its full potential is an ambiguous task that needs proper guidance and direction. Existing surveys on the Metaverse focus only on a specific aspect and discipline of the Metaverse and lack a holistic view of the entire process. To this end, a more holistic, multi-disciplinary, in-depth, and academia- and industry-oriented review is required to provide a thorough study of the Metaverse development pipeline. To address these issues, we present in this survey a novel multi-layered pipeline ecosystem composed of (1) the Metaverse computing, networking, communications and hardware infrastructure, (2) environment digitization, and (3) user interactions. For every layer, we discuss the components that detail the steps of its development. Also, for each of these components, we examine the impact of a set of enabling technologies and empowering domains (e.g., Artificial Intelligence, Security & Privacy, Blockchain, Business, Ethics, and Social) on its advancement. In addition, we explain the importance of these technologies in supporting decentralization, interoperability, user experiences, interactions, and monetization. Our study highlights the existing challenges for each component, followed by research directions and potential solutions. To the best of our knowledge, this survey is the most comprehensive to date and allows users, scholars, and entrepreneurs to get an in-depth understanding of the Metaverse ecosystem and find their opportunities and potential for contribution.
A Design Science Research Approach to Smart and Collaborative Urban Supply Networks
Urban supply networks face increasing demands and challenges and thus constitute a relevant field for research and practical development. Supply chain management holds enormous potential and relevance for society and everyday life, as the flows of goods and information are important economic functions. Being a heterogeneous field, the literature base of supply chain management research is difficult to manage and navigate. Disruptive digital technologies and the implementation of cross-network information analysis and sharing drive the need for new organisational and technological approaches. Practical issues are manifold and include megatrends such as digital transformation, urbanisation, and environmental awareness.
A promising approach to solving these problems is the realisation of smart and collaborative supply networks. The growth of artificial intelligence in recent years has led to a wide range of applications in a variety of domains. However, the potential of artificial intelligence in supply chain management has not yet been fully exploited. Similarly, value creation increasingly takes place in networked value creation cycles that have become continuously more collaborative, complex, and dynamic as interactions in business processes involving information technologies have become more intense.
Following a design science research approach, this cumulative thesis comprises the development and discussion of four artefacts for the analysis and advancement of smart and collaborative urban supply networks. This thesis aims to highlight the potential of artificial intelligence-based supply networks, to advance data-driven inter-organisational collaboration, and to improve last mile supply network sustainability. Based on thorough machine learning and systematic literature reviews, reference and system dynamics modelling, simulation, and qualitative empirical research, the artefacts provide a valuable contribution to research and practice.
Qluster: An easy-to-implement generic workflow for robust clustering of health data
The exploration of health data by clustering algorithms makes it possible to better describe the populations of interest by seeking the sub-profiles that compose them. This reinforces medical knowledge, whether about a disease or a targeted population in real life. Nevertheless, contrary to so-called conventional biostatistical methods, for which numerous guidelines exist, the standardization of data science approaches in clinical research remains a little-discussed subject. This results in significant variability in the execution of data science projects, whether in terms of the algorithms used or the reliability and credibility of the designed approach. Taking the path of a parsimonious and judicious choice of both algorithms and implementations at each stage, this article proposes Qluster, a practical workflow for performing clustering tasks. This workflow makes a compromise between (1) genericity of application (e.g. usable on small or big data, on continuous, categorical or mixed variables, on high-dimensional databases or not), (2) ease of implementation (few packages, few algorithms, few parameters needed, ...), and (3) robustness (e.g. use of proven algorithms and robust packages, evaluation of the stability of clusters, management of noise and multicollinearity). The workflow can be easily automated and/or routinely applied to a wide range of clustering projects. It can be useful both for data scientists with little experience in the field, to make data clustering easier and more robust, and for more experienced data scientists looking for a straightforward and reliable solution to routinely perform preliminary data mining. A synthesis of the literature on data clustering, as well as the scientific rationale supporting the proposed workflow, is also provided. Finally, a detailed application of the workflow to a concrete use case is provided, along with a practical discussion for data scientists. An implementation on the Dataiku platform is available upon request to the authors.
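One robustness step the workflow emphasises, evaluating the stability of clusters, can be sketched with a minimal k-means run repeated under different initialisations. The data and the agreement measure below are illustrative assumptions, not Qluster's actual implementation:

```python
import numpy as np

def kmeans(X, k, seed, n_iter=50):
    """Minimal Lloyd's k-means: returns a cluster label for each row of X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):  # guard against empty clusters
                centers[j] = X[labels == j].mean(axis=0)
    return labels

rng = np.random.default_rng(0)
# Two well-separated synthetic "patient" groups in 2 features.
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])

# Crude stability check: do two runs with different seeds agree, up to
# label permutation, on which pairs of points end up clustered together?
a, b = kmeans(X, 2, seed=1), kmeans(X, 2, seed=2)
same_a = a[:, None] == a[None, :]
same_b = b[:, None] == b[None, :]
stability = (same_a == same_b).mean()
print(f"pairwise co-clustering agreement: {stability:.2f}")
```

Comparing co-clustering of pairs sidesteps the arbitrary numbering of cluster labels; in practice one would also resample the data (bootstrap) rather than only vary the initialisation, and use an established index such as the adjusted Rand index.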
iDML: Incentivized Decentralized Machine Learning
With the rising emergence of decentralized and opportunistic approaches to machine learning, end devices are increasingly tasked with training deep learning models on-device using crowd-sourced data that they collect themselves. These approaches are desirable from a resource consumption perspective and also from a privacy preservation perspective. When the devices benefit directly from the trained models, the incentives are implicit: contributing devices' resources are incentivized by the availability of the higher-accuracy model that results from collaboration. However, explicit incentive mechanisms must be provided when end-user devices are asked to contribute their resources (e.g., computation, communication, and data) to a task performed primarily for the benefit of others, e.g., training a model for a task that a neighbor device needs but the device owner is uninterested in. In this project, we propose a novel blockchain-based incentive mechanism for completely decentralized and opportunistic learning architectures. We leverage a smart contract not only for providing explicit incentives to end devices to participate in decentralized learning but also to create a fully decentralized mechanism to inspect and reflect on the behavior of the learning architecture - …