Does the internet deserve everybody?
There has been a long-standing tradition among developed nations of influencing, both directly and indirectly, the activities of developing economies. Behind this lies one of a range of aims: building or improving living standards, bettering the social status of recipient communities, and so on. In some cases this has resulted in prosperous relations, yet it has often been seen as the exploitation of a position of power or as a veneer for other activities (e.g. tapping into newly emerging markets). In this paper, we explore whether initiatives to improve Internet connectivity in developing regions are always ethical. We draw up a list of issues that would aid in formulating Internet initiatives that are ethical, effective, and sustainable.
Using big data to make better decisions in the digital economy
The question this special issue addresses is how to harvest big data to help decision-makers deliver better fact-based decisions aimed at improving performance or creating better strategy. The special issue focuses on big data applications in supporting operations decisions, including advanced research on decision models and tools for the digital economy. The response to this special issue was great, and we have included many high-quality papers; we are pleased to present 13 of the best. The techniques presented include data mining, simulation, and expert systems, with applications spanning online reviews, food retail chains, and e-health.
Data Commons
Publicly available data from open sources (e.g., the United States Census Bureau (Census), the World Health Organization (WHO), and the Intergovernmental Panel on Climate Change (IPCC)) are vital resources for policy makers, students, and researchers across different disciplines. Combining data from different sources requires the user to reconcile differences in schemas, formats, assumptions, and more. This data wrangling is time consuming and tedious, and needs to be repeated by every user of the data. Our goal with Data Commons (DC) is to help make public data accessible and useful to those who want to understand this data and use it to address societal challenges and opportunities. We do the data processing and make the processed data widely available via standard schemas and Cloud APIs. Data Commons is a distributed network of sites that publish data in a common schema and interoperate using the Data Commons APIs. Data from different Data Commons can be joined easily, and the aggregate of these Data Commons can be viewed as a single Knowledge Graph. This Knowledge Graph can then be searched using natural-language questions, utilizing advances in Large Language Models. This paper describes the architecture of Data Commons and some of the major deployments, and highlights directions for future work.
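The joinability claim is easy to picture with a toy example. The sketch below mimics the idea of reconciling two sources onto a shared entity ID so that a join becomes trivial; the "dcid"-style identifier, the field names, and the numbers are illustrative assumptions, not actual Data Commons schema or data.

```python
# A toy illustration of the "common schema" idea behind Data Commons:
# two sources publish observations under their own identifiers; mapping
# both onto a shared entity ID makes the join trivial. The schema and
# values below are illustrative assumptions, not actual Data Commons data.

# Source A (Census-like): population keyed by a local code.
census = {"US": {"population": 331_900_000}}

# Source B (WHO-like): life expectancy keyed by a different local code.
who = {"USA": {"life_expectancy": 77.2}}

# Per-source mappings into one shared entity namespace (a "dcid"-style ID).
to_common = {
    "census": {"US": "country/USA"},
    "who":    {"USA": "country/USA"},
}

def normalize(source_name, source_data):
    """Re-key a source's observations onto the shared entity IDs."""
    mapping = to_common[source_name]
    return {mapping[k]: v for k, v in source_data.items()}

# Once both sources share a schema, joining is a dictionary merge per entity.
graph = {}
for name, data in (("census", census), ("who", who)):
    for entity, observations in normalize(name, data).items():
        graph.setdefault(entity, {}).update(observations)

print(graph)
# {'country/USA': {'population': 331900000, 'life_expectancy': 77.2}}
```

In the real system this reconciliation is done once by the data processing pipeline, so every downstream user queries the already-joined graph instead of repeating the wrangling.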
The Impact of Business Intelligence Systems on Profitability and Risks of Firms
The model and the planning method of volume and variety assessment of innovative products in an industrial enterprise
In the long term, the efficiency of an innovative development strategy is considered the most crucial condition for ensuring the competitiveness of an economic system under market conditions. This makes it important to justify such strategies with regard to the specific features of particular systems and the conditions of their operation. For industrial enterprises, the problem can be solved using mathematical models that support decision-making on the elements of an innovative manufacturing program. An optimization model and a planning method for the volume and variety of innovative products are suggested. The distinctive feature of the suggested model is the nonlinear nature of its objective function, which allows the law of diminishing marginal utility to be taken into account. The suggested optimization method accounts for system features and enables the effective implementation of manufacturing capabilities under modern conditions of production organization and sales in saturated markets.
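The abstract does not give the model itself; the sketch below is a minimal stand-in for this class of planning problem, assuming a concave square-root utility per product (so marginal utility diminishes with volume, as under market saturation) and a single linear capacity constraint. All coefficients are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative stand-in for the kind of model the abstract describes:
# choose volumes x_i for each innovative product to maximize total
# utility, where per-product utility a_i * sqrt(x_i) is concave, so
# each extra unit of volume adds less than the last (diminishing
# marginal utility). Coefficients are made-up, not from the paper.

a = np.array([8.0, 5.0, 3.0])        # utility coefficients per product
cost = np.array([2.0, 1.0, 0.5])     # capacity consumed per unit of volume
capacity = 100.0                     # total production capacity

def neg_utility(x):
    # Negated because scipy minimizes; clamp guards against tiny
    # negative excursions during the line search.
    return -np.sum(a * np.sqrt(np.maximum(x, 0.0)))

constraints = [{"type": "ineq", "fun": lambda x: capacity - cost @ x}]
bounds = [(0.0, None)] * len(a)      # volumes cannot be negative
x0 = np.full(len(a), 10.0)           # feasible starting point

result = minimize(neg_utility, x0, bounds=bounds, constraints=constraints)
print("optimal volumes:", np.round(result.x, 2))
print("total utility:  ", round(-result.fun, 2))
```

Because the objective is concave and the constraint linear, the optimum spreads capacity across products rather than concentrating it on the single highest-coefficient one, which is exactly the behaviour a linear objective would miss.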
Will we work in twenty-first century capitalism? A critique of the fourth industrial revolution literature
The fourth industrial revolution has become a prominent concept, and imminent technological change a major issue. Its facets are everyone's concern but currently no one's ultimate responsibility (perhaps a little like financial stability before the global financial crisis). In this paper, we argue that the future is being shaped now by the way the fourth industrial revolution is being positioned. While no one has set out to argue for or defend technological determinism, anxiety combined with passivity and complacency is being produced, and this occurs in the context of a quasi-determinism. The contingent quantification of the future with regard to the potential for job displacement provides an influential source of authority for this. A background sense of 'the future is coming, so you had better get used to it' is being disseminated. This favours a capitalism that may 'deny work to the many' rather than a more fundamental rethink encompassing change that might liberate the many from work. This, in turn, places responsibility for future employment on workers themselves (reducing the urgency of calls for wider societal preparation). Public understanding and policy are thus affected, and along with them the future of work.
Extracting Statistically Significant Behaviour from Fish Tracking Data With and Without Large Dataset Cleaning
Extracting a statistically significant result from video of natural phenomena can be difficult for two reasons: (i) there can be considerable natural variation in the observed behaviour, and (ii) computer vision algorithms applied to natural phenomena may not perform correctly on a significant number of samples. This study presents one approach to cleaning a large, noisy visual tracking dataset so that statistically sound results can be extracted from the image data. In particular, analyses of 3.6 million underwater fish trajectories, together with the water temperature at the time of acquisition, are presented. Although there are many false detections and incorrect trajectory assignments, a combination of data binning and robust estimation methods yields reliable evidence for an increase in fish speed as water temperature increases. A method for data cleaning is then proposed that removes outliers arising from false detections and incorrect trajectory assignments using a deep-learning-based clustering algorithm; the corresponding results likewise show a rise in fish speed as temperature goes up. Several statistical tests applied to both the cleaned and uncleaned data confirm that both results are statistically significant and show an increasing trend. The latter approach, however, also produces a cleaner dataset suitable for further analysis.
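As an illustration of the binning-plus-robust-estimation step (not the paper's pipeline or data), the sketch below contaminates a synthetic speed-temperature relationship with outliers that mimic false detections and shows how per-bin medians recover a trend that per-bin means distort.

```python
import numpy as np

# Sketch of the binning + robust-estimation idea from the abstract, on
# synthetic data (not the paper's 3.6M fish trajectories). A weak
# speed-temperature trend is buried under noise plus gross outliers
# mimicking false detections; per-bin medians recover the trend while
# per-bin means are dragged upward by the outliers.

rng = np.random.default_rng(0)
n = 50_000
temperature = rng.uniform(10.0, 25.0, n)              # degrees C
speed = 0.05 * temperature + rng.normal(0.0, 0.5, n)  # true weak trend

# Contaminate 5% of samples with large spurious speeds (bad tracks).
bad = rng.random(n) < 0.05
speed[bad] += rng.exponential(10.0, bad.sum())

# Bin by temperature and take a robust per-bin summary.
edges = np.linspace(10.0, 25.0, 16)                   # 1-degree bins
bin_index = np.digitize(temperature, edges) - 1
for b in range(len(edges) - 1):
    in_bin = bin_index == b
    med = np.median(speed[in_bin])                    # robust to outliers
    mean = speed[in_bin].mean()                       # inflated by outliers
    print(f"{edges[b]:4.1f}-{edges[b+1]:4.1f} C  "
          f"median={med:.3f}  mean={mean:.3f}")
```

The median tolerates a contaminated fraction well below 50%, which is why the abstract's trend can be credible even before the deep-learning-based cleaning step is applied.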
Digital Work Design
More and more academic studies and practitioner reports claim that human work is increasingly disrupted, or even determined, by information and communication technology (ICT) (Cascio and Montealegre 2016), which will make a considerable share of jobs currently performed by humans susceptible to automation (e.g., Frey and Osborne 2017; Manyika et al. 2017). These reports often sketch a picture of 'machines taking over' traditional domains like manufacturing, while ICT advances and capabilities seem to decide companies' fate. Consequently, ICT is often put at the core of innovation efforts. While this applies to nearly all areas of workplace design, a recent popular example of increasing technology centricity is 'Industry 4.0', which is often described as 'machines talking to computers'.