Quantifying Information Overload in Social Media and its Impact on Social Contagions
Information overload has become a ubiquitous problem in modern society.
Social media users and microbloggers receive an endless flow of information,
often at a rate far higher than their cognitive abilities to process the
information. In this paper, we conduct a large scale quantitative study of
information overload and evaluate its impact on information dissemination in
the Twitter social media site. We model social media users as information
processing systems that queue incoming information according to some policies,
process information from the queue at some unknown rates and decide to forward
some of the incoming information to other users. We show how timestamped data
about tweets received and forwarded by users can be used to uncover key
properties of their queueing policies and estimate their information processing
rates and limits. Such an understanding of users' information processing
behaviors allows us to infer whether and to what extent users suffer from
information overload.
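The queueing view described above can be sketched in a few lines. The arrival and processing rates below are illustrative only, not values estimated in the paper, and the fixed per-step arrivals are a simplification of real tweet streams:

```python
from collections import deque

def simulate_user(arrival_rate, processing_rate, steps):
    """Model a user as a FIFO queue: `arrival_rate` tweets arrive per
    time step, and up to `processing_rate` of them are read per step.
    Returns (tweets_read, backlog); a growing backlog signals overload."""
    queue = deque()
    read = 0
    for t in range(steps):
        queue.extend([t] * arrival_rate)              # incoming tweets
        for _ in range(min(processing_rate, len(queue))):
            queue.popleft()                           # user reads one tweet
            read += 1
    return read, len(queue)

# A user receiving faster than they can process accumulates a backlog:
read, backlog = simulate_user(arrival_rate=5, processing_rate=3, steps=100)
# read == 300, backlog == 200
```

When the arrival rate exceeds the processing rate, the backlog grows without bound, which is the overload condition the paper infers from timestamped tweet data.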
Our analysis provides empirical evidence of information processing limits for
social media users and the prevalence of information overloading. The most
active and popular social media users are often the ones that are overloaded.
Moreover, we find that the rate at which users receive information impacts
their processing behavior, including how they prioritize information from
different sources, how much information they process, and how quickly they
process information. Finally, the susceptibility of a social media user to
social contagions depends crucially on the rate at which she receives
information. An exposure to a piece of information, be it an idea, a convention
or a product, is much less effective for users that receive information at
higher rates, meaning they need more exposures to adopt a particular contagion.Comment: To appear at ICSWM '1
A Priority-based Fair Queuing (PFQ) Model for Wireless Healthcare System
Healthcare is a very active research area, primarily due to the growth of the elderly population, which leads to an increasing number of emergency situations requiring urgent action. In recent years, some wireless networked medical devices have been equipped with sensors to measure and report on patients' vital signs remotely. The most important sensors are heart-beat-rate (ECG), pressure and glucose sensors. However, the strict requirements and real-time nature of medical applications dictate the extreme importance of, and need for, appropriate Quality of Service (QoS) and fast, accurate delivery of a patient's measurements in a reliable e-health ecosystem.
As the older adult population (65 years and above) grows due to advances in medicine and medical care over the last two decades, a high-QoS, reliable e-health ecosystem has become a major challenge in healthcare, especially for patients who require continuous monitoring and attention. Moreover, predictions indicate that the elderly population will reach approximately 2 billion in developing countries by 2050, where the availability of medical staff will be unable to cope with this growth and with emergency cases that need immediate intervention. On the other hand, limitations in communication network capacity, congestion, and the enormous increase in devices, applications and IoT traffic on the available communication networks add an extra layer of challenges for the e-health ecosystem, such as time constraints and the quality of measurements and signals reaching healthcare centres.
Hence, this research tackled the delay and jitter parameters in e-health M2M wireless communication and succeeded in reducing them in comparison to currently available models. The novelty of this research lies in developing a new priority queuing model, 'Priority Based-Fair Queuing' (PFQ), in which a new priority level and the concept of a 'Patient's Health Record' (PHR) were developed and integrated with the Priority Parameter (PP) values of each sensor to add a second level of priority. The results and data analysis performed on the PFQ model under different scenarios simulating a real M2M e-health environment revealed that PFQ outperformed the widely used current models such as First In First Out (FIFO) and Weighted Fair Queuing (WFQ).
The PFQ model improved the transmission of ECG sensor data, decreasing delay and jitter in emergency cases by 83.32% and 75.88% respectively in comparison to FIFO, and by 46.65% and 60.13% with respect to the WFQ model. Similarly, for the pressure sensor the improvements were 82.41% and 71.5% in comparison to FIFO, and 68.43% and 73.36% in comparison to WFQ. Data transmission was also improved for the glucose sensor, by 80.85% and 64.7% in comparison to FIFO, and by 92.1% and 83.17% in comparison to WFQ. However, non-emergency data transmission using the PFQ model was negatively impacted and scored higher delay and jitter than FIFO and WFQ, since PFQ gives higher priority to emergency cases.
Thus, a derivative of the PFQ model was developed, a new version named 'Priority Based-Fair Queuing-Tolerated Delay' (PFQ-TD), to balance data transmission between emergency and non-emergency cases by allowing a tolerated delay in emergency cases. PFQ-TD succeeded in fairly balancing this trade-off, reducing the total average delay and jitter of emergency and non-emergency cases across all sensors and keeping them within the acceptable allowable standards. PFQ-TD improved the overall average delay and jitter in emergency and non-emergency cases across all sensors by 41% and 84% respectively in comparison to the PFQ model.
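The two-level priority idea can be illustrated with a short sketch: an emergency flag (first level, in the spirit of the PHR-based priority) breaks ties before a per-sensor priority parameter (second level). The sensor ordering and tuple encoding below are assumptions for illustration, not the paper's actual PP values:

```python
import heapq

# Assumed per-sensor priority parameters: lower value = more urgent.
SENSOR_PP = {"ECG": 0, "pressure": 1, "glucose": 2}

def schedule(packets):
    """Order (sensor, emergency) packets for transmission: emergency
    traffic first, then by sensor priority parameter, then by arrival."""
    heap = []
    for seq, (sensor, emergency) in enumerate(packets):
        priority = (0 if emergency else 1, SENSOR_PP[sensor], seq)
        heapq.heappush(heap, (priority, sensor, emergency))
    order = []
    while heap:
        _, sensor, emergency = heapq.heappop(heap)
        order.append((sensor, emergency))
    return order

packets = [("glucose", False), ("ECG", True), ("pressure", True), ("ECG", False)]
# Emergency ECG and pressure packets are served before any routine traffic:
# [("ECG", True), ("pressure", True), ("ECG", False), ("glucose", False)]
```

This also makes the PFQ-TD motivation visible: with strict two-level priority, routine packets can be starved whenever emergency traffic is plentiful, hence the tolerated-delay variant.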
Information Overload: An Overview
For almost as long as there has been recorded information, there has been a perception that humanity has been overloaded by it. Concerns about 'too much to read' have been expressed for many centuries, and made more urgent since the arrival of ubiquitous digital information in the late twentieth century. The historical perspective is a necessary corrective to the often, and wrongly, held view that overload is associated solely with the modern digital information environment, and with social media in particular. However, as society fully experiences Floridi's Fourth Revolution, and moves into hyper-history (with society dependent on, and defined by, information and communication technologies) and the infosphere (an information environment distinguished by a seamless blend of online and offline information activity), individuals and societies are dependent on, and formed by, information in an unprecedented way, and information overload needs to be taken more seriously than ever. Overload has been claimed to be both the major issue of our time, and a complete non-issue. It has been cited as an important factor in, inter alia, science, medicine, education, politics, governance, business and marketing, planning for smart cities, access to news, personal data tracking, home life, use of social media, and online shopping, and has even influenced literature. The information overload phenomenon has been known by many different names, including: information overabundance, infobesity, infoglut, data smog, information pollution, information fatigue, social media fatigue, social media overload, information anxiety, library anxiety, infostress, infoxication, reading overload, communication overload, cognitive overload, information violence, and information assault.
There is no single generally accepted definition, but it can best be understood as the situation which arises when there is so much relevant and potentially useful information available that it becomes a hindrance rather than a help. Its essential nature has not changed with changing technology, though its causes and proposed solutions have changed much. The best ways of avoiding overload, individually and socially, appear to lie in a variety of coping strategies, such as filtering, withdrawing, queuing, and 'satisficing'. Better design of information systems, effective personal information management, and the promotion of digital and media literacies also have a part to play. Overload may perhaps best be overcome by seeking a mindful balance in consuming information, and in finding understanding.
Anytime Cognition: An information agent for emergency response
Planning under pressure in time-constrained environments while relying on uncertain information is a challenging task. This is particularly true for planning the response during an ongoing disaster in an urban area, be it a natural disaster or a deliberate attack on the civilian population. As the various activities pertaining to the emergency response need to be coordinated in response to multiple reports from the disaster site, a user can become cognitively overloaded. To address this issue, we designed the Anytime Cognition (ANTICO) concept to assist human users working in time-constrained environments by maintaining a manageable level of cognitive workload over time. Based on the ANTICO concept, we develop an agent framework for proactively managing a user's changing information requirements by integrating information management techniques with probabilistic plan recognition. In this paper, we describe a prototype emergency response application in the context of a subset of the attack scenarios devised by the US Department of Homeland Security.
A cognitive architecture for emergency response
Plan recognition, cognitive workload estimation and human assistance have been extensively studied in the AI and human factors communities, resulting in many techniques being applied to domains of various levels of realism. These techniques have seldom been integrated and evaluated as complete systems. In this paper, we report on the development of an assistant agent architecture that integrates plan recognition, current and future user information needs, workload estimation and adaptive information presentation to aid an emergency response manager in making high-quality decisions under time stress, while avoiding cognitive overload. We describe the main components of a full implementation of this architecture as well as a simulation developed to evaluate the system. Our evaluation consists of simulating various possible executions of the emergency response plans used in the real world and measuring the expected time taken by an unaided human user, as well as by one that receives information assistance from our system. In the experimental condition of agent assistance, we also examine the effects of different error rates in the agent's estimation of the user's state or information needs.
Performance impact of web services on Internet servers
While traditional Internet servers mainly served static, and later also dynamic, content, the popularity of Web services is increasing rapidly. Web services incorporate additional overhead compared to traditional web interaction. This overhead increases the demand on Internet servers, which is of particular importance when the request rate to the server is high. We conduct experiments that show that the imposed overhead of Web services is non-negligible during server overload. In our experiments, the response time for Web services is more than 30% higher, and the server throughput more than 25% lower, compared to traditional web interaction using dynamically created HTML pages.
Simulation of an Optimized Data Packet Transmission in a Congested Network
Computer networks and the Internet nowadays accommodate simultaneous transmission of audio, video, and data traffic, among others. Efficient and reliable data transmission is essential for achieving high performance in a networked computing environment. Thus, there is a need to optimize data packet transmission in present-day networks. This paper simulates and demonstrates the process of optimizing data packet transmission in a congested network. It uses a modified FIFO queue system to control data packet loss and uses the prototyping software methodology to develop software in the Python programming language for its implementation. From the simulation process, it was observed that the causes of packet loss during transmission depend largely on the protocol, congestion of the traffic path, the speed of the sender, and the speed of the receiver's machine. Thus, the paper takes advantage of the observations from the simulation and presents a system that simulates control of data loss during transmission in a congested network.
Keywords: Simulation, Auxiliary Queue, Departing Packets, Arrival Packets, Packet Loss
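The paper's software is written in Python. A hypothetical sketch of the general idea its keywords suggest, a modified FIFO that parks overflow in an auxiliary queue rather than dropping it (the buffer sizes and service rate are illustrative, not the paper's actual parameters):

```python
from collections import deque

def transmit(arrivals, capacity, service_per_tick, aux_capacity=0):
    """Simulate a bounded FIFO buffer over a sequence of per-tick arrival
    counts. With aux_capacity == 0 this is plain tail-drop FIFO; a larger
    value diverts overflow to an auxiliary queue to reduce packet loss."""
    main, aux = deque(), deque()
    delivered = lost = 0
    for batch in arrivals:
        for pkt in range(batch):
            if len(main) < capacity:
                main.append(pkt)
            elif len(aux) < aux_capacity:
                aux.append(pkt)        # overflow parked, not dropped
            else:
                lost += 1              # both buffers full: packet lost
        for _ in range(service_per_tick):
            if main:
                main.popleft()
                delivered += 1
            elif aux:
                aux.popleft()
                delivered += 1
        while aux and len(main) < capacity:
            main.append(aux.popleft())  # refill main buffer between ticks
    delivered += len(main) + len(aux)   # drain whatever is still queued
    return delivered, lost

# A burst of 10 packets into a 4-slot buffer served 2 per tick:
plain = transmit([10, 0, 0], capacity=4, service_per_tick=2)           # (4, 6)
modified = transmit([10, 0, 0], capacity=4, service_per_tick=2,
                    aux_capacity=4)                                     # (8, 2)
```

Under the same burst, the auxiliary queue turns most of the tail-drop losses into delayed deliveries, which matches the paper's observation that congestion, not the data itself, drives packet loss.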