Special Issue on High-Level Declarative Stream Processing
Stream processing as an information processing paradigm has been investigated by various research communities within computer science and appears in many applications: real-time analytics, online machine learning, continuous computation, ETL operations, and more. The special issue on "High-Level Declarative Stream Processing" investigates the declarative aspects of stream processing, a topic undergoing intense study. It is published in the Open Journal of Web Technologies (OJWT) (www.ronpub.com/ojwt). This editorial provides an overview of the aims and scope of the special issue and of the accepted papers.
Game Theoretic Approaches to Massive Data Processing in Wireless Networks
Wireless communication networks are becoming highly virtualized with
two-layer hierarchies, in which controllers at the upper layer with tasks to
achieve can ask a large number of agents at the lower layer to help realize
computation, storage, and transmission functions. Through offloading data
processing to the agents, the controllers can accomplish otherwise prohibitive
big data processing. Incentive mechanisms are needed for the agents to perform
the controllers' tasks in order to satisfy the corresponding objectives of
controllers and agents. In this article, a hierarchical game framework with
fast convergence and scalability is proposed to meet the demand for real-time
processing in such situations. Possible future research directions in this
emerging area are also discussed.
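The two-layer controller/agent structure described above can be illustrated with a minimal Stackelberg-style incentive game. This is a hedged sketch, not the paper's framework: the quadratic agent costs, the unit-price reward, and the grid search over prices are all illustrative assumptions chosen for simplicity.

```python
# Minimal two-layer (leader/follower) incentive game sketch.
# Assumptions (not from the paper): agents have quadratic costs,
# the controller pays a uniform unit price for offloaded work.

def agent_best_response(price, cost):
    """Agent maximizes price*x - cost*x^2  ->  x* = price / (2*cost)."""
    return price / (2.0 * cost)

def controller_utility(price, costs, value_per_unit=1.0):
    """Controller earns value_per_unit per unit of work done, pays `price` per unit."""
    total_work = sum(agent_best_response(price, c) for c in costs)
    return (value_per_unit - price) * total_work

def best_price(costs, value_per_unit=1.0, grid=1000):
    """Leader anticipates the followers' best responses and picks the best price."""
    candidates = [value_per_unit * k / grid for k in range(1, grid)]
    return max(candidates, key=lambda p: controller_utility(p, costs, value_per_unit))

costs = [0.5, 1.0, 2.0]   # heterogeneous agent cost coefficients (illustrative)
p = best_price(costs)
print(round(p, 2))        # with these utilities the optimum is value_per_unit / 2
```

Because each agent's best response has a closed form, the controller's problem reduces to a one-dimensional optimization; hierarchical frameworks in the literature exploit exactly this kind of decomposition to converge quickly at scale.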
The Visual Social Distancing Problem
One of the main and most effective measures to contain the recent viral
outbreak is the maintenance of the so-called Social Distancing (SD). To comply
with this constraint, workplaces, public institutions, transports and schools
will likely adopt restrictions over the minimum inter-personal distance between
people. Given this scenario, it is crucial to measure compliance with this
physical constraint at scale, in order to understand why the distance limitations
are violated and whether a violation poses a threat in the given scene context,
all while complying with privacy policies and keeping the measurement socially
acceptable. To this end, we
introduce the Visual Social Distancing (VSD) problem, defined as the automatic
estimation of the inter-personal distance from an image, and the
characterization of the related people aggregations. VSD is pivotal for a
non-invasive analysis of whether people comply with the SD restriction, and for
providing statistics about the level of safety of specific areas whenever this
constraint is violated. We then discuss how VSD relates to previous
literature in Social Signal Processing and indicate which existing Computer
Vision methods can be used to address this problem. We conclude with future
challenges related to the effectiveness of VSD systems, ethical implications,
and future application scenarios.
Comment: 9 pages, 5 figures. All the authors contributed equally to this
manuscript and are listed in alphabetical order. Under submission.
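Once person positions have been estimated, the core VSD measurement reduces to pairwise distance checks on the ground plane. The sketch below assumes detections have already been projected to metric ground-plane coordinates (e.g. via a homography); detection and camera calibration, the hard parts the abstract points to, are outside this snippet, and the 2-metre threshold is an illustrative value.

```python
# Minimal sketch of the VSD measurement step.
# Assumption: `positions` are ground-plane coordinates in metres,
# already obtained from a person detector plus camera calibration.

from itertools import combinations
from math import hypot

def sd_violations(positions, min_distance=2.0):
    """Return index pairs of people closer than `min_distance` metres apart."""
    return [
        (i, j)
        for (i, pi), (j, pj) in combinations(enumerate(positions), 2)
        if hypot(pi[0] - pj[0], pi[1] - pj[1]) < min_distance
    ]

people = [(0.0, 0.0), (1.2, 0.5), (6.0, 6.0)]  # illustrative coordinates
print(sd_violations(people))                    # [(0, 1)] -- only the first pair is too close
```

Aggregating such violation pairs over time yields the per-area safety statistics the abstract mentions, without storing identities.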
Towards Objectives-Based Process Redesign
Continuously growing and changing multinational companies often struggle with heterogeneous degrees of standardization. Especially when redesigning business processes that have grown historically over decades, the capability of handling semi-structured processes is central. Nevertheless, to gain competitive advantages, it is essential for a company to work on the optimization of all its processes. Existing redesign techniques focus either on completely unstructured or on structured processes. The Redesign Model presented in this paper transforms processes with any level of structuredness into processes with an increased degree of standardization. Our technique consists of four main steps: (i) we extract the objectives for an efficient business process redesign from existing literature; (ii) we formulate a list of requirements an innovative redesign model has to fulfill; (iii) we present a design-science-based Business Process Redesign Framework including our Redesign Model; (iv) we evaluate our model, showing its applicability and completeness.
Sharing semi-heterogeneous single-user editors for real-time group editing
A new approach is proposed to transparently share familiar single-user editors
without modifying their source code. This approach tweaks a classic diff algorithm
to derive edit scripts between document states. Concurrent edit scripts are merged
to synchronize the states of coauthoring sites. Our proof-of-concept prototype
currently works with familiar, heterogeneous text editors such as GVim and WinEdt
that can be adapted to support two basic interfaces, GetState and SetState. The
adaptation is less expensive and more robust than recent approaches such as ICT
and CoWord, which must understand and translate editing operations at the
operating system level. Experimental data show that our approach provides
sufficient performance for near-real-time group editing.
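The GetState/SetState flow above can be sketched in a few lines: diff two document states into an edit script, ship the script, and replay it at another site. This uses Python's difflib in place of the paper's tweaked diff algorithm, and the retain/delete/insert script format is an assumed encoding; merging *concurrent* scripts (the OT-style transformation the paper addresses) is omitted here.

```python
# Sketch of diff-based state synchronization between two editor adapters.
# Assumption: scripts are sequences of ("retain", n) / ("delete", n) /
# ("insert", text) steps -- an illustrative format, not the paper's.

import difflib

def derive_edit_script(old, new):
    """Diff two document states (GetState outputs) into an edit script."""
    script = []
    for op, i1, i2, j1, j2 in difflib.SequenceMatcher(a=old, b=new).get_opcodes():
        if op == "equal":
            script.append(("retain", i2 - i1))
        if op in ("replace", "delete"):
            script.append(("delete", i2 - i1))
        if op in ("replace", "insert"):
            script.append(("insert", new[j1:j2]))
    return script

def apply_edit_script(state, script):
    """Replay an edit script against a state before handing it to SetState."""
    out, cursor = [], 0
    for op, arg in script:
        if op == "retain":
            out.append(state[cursor:cursor + arg])
            cursor += arg
        elif op == "delete":
            cursor += arg           # skip the deleted span
        else:                       # insert
            out.append(arg)
    return "".join(out) + state[cursor:]

old = "the quick fox"
new = "the quick brown fox"
print(apply_edit_script(old, derive_edit_script(old, new)))  # the quick brown fox
```

Because only full states cross the adapter boundary, the editor itself needs no knowledge of the protocol, which is what keeps the per-editor adaptation cheap.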
Evolutionary emergence of collective intelligence in large groups of students
The emergence of collective intelligence has been studied in much greater detail in small groups than in larger ones. Nevertheless, in groups of several hundreds or thousands of members, it is well known that the social environment exerts a considerable influence on individual behavior. A few recent papers have dealt with some aspects of large-group situations, but have not provided an in-depth analysis of the role of interactions among the members of a group in the creation of ideas, or of the group's overall performance. In this study, we report an experiment where a large set of individuals, i.e., 789 high-school students, cooperated online in real time to solve two different examinations on a specifically designed platform (Thinkhub). The goal of this paper is to describe the specific mechanisms of idea creation we were able to observe and to measure the group's performance as a whole. When we deal with communication networks featuring a large number of interacting entities, it seems natural to model the set as a complex system by resorting to the tools of statistical mechanics. Our experiment shows how interaction in small groups that increase in size over several phases, leading to a final phase where the students are confronted with the most popular answers of the previous phases, is capable of producing high-quality answers to all examination questions, whereby the last phase plays a crucial role. Our experiment likewise shows that a group's performance in such a task progresses linearly with the size of the group. Finally, we show that the controlled interaction and dynamics foreseen in the system can reduce the spread of "fake news" within the group.