The Role of Forensic Accounting in Mitigating Against Cyber Crimes During the COVID-19 Pandemic Era: Issues and Perspectives
The COVID-19 pandemic created not only health challenges but also difficulties in the conduct of economic activities across the world. Because of the restrictions imposed during the pandemic, the physical movement of individuals was limited, and business transactions were instead conducted online over the Internet. Logistics services moved goods from where they were produced to where they were consumed, and payments for those goods were settled over the Internet; this led to an astronomical increase in Internet fraud and the loss of valuable assets. This study therefore examined how forensic accounting tools were adopted to investigate, track, and recover lost assets, restore victims to their position prior to the loss, and ensure that perpetrators faced appropriate sanctions. The study used a descriptive survey design: data were obtained through questionnaires administered to personnel of the agencies responsible for the management and control of cybercrime in Nigeria and were analysed using percentages and correlation. The results show that forensic accounting mechanisms had a significant effect on the management and control of cybercrime, that the application of forensic accounting tools was effective in obtaining evidence admissible in court, and that these tools had a strong effect on the recovery of stolen assets. The study concluded that increased deployment of forensic accounting tools reduced Internet fraud, provided sufficient evidence for the prosecution of criminals, and aided the recovery of stolen assets.
The study recommends strict monitoring of Internet transactions and the adoption of forensic accounting tools as critical instruments for tracking and recovering stolen assets. Beyond expanding knowledge of forensic accounting, the study addresses the fears of those who intend to transact over the Internet, and its recommendations provide a roadmap for addressing the problem of Internet fraud.
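The abstract reports analysis by percentages and correlation. As a minimal sketch of that style of analysis, the snippet below computes agreement percentages and a Pearson correlation on hypothetical Likert-scale (1-5) questionnaire responses; all figures are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical ratings from 10 respondents (1 = strongly disagree, 5 = strongly agree).
deployment = np.array([5, 4, 4, 3, 5, 2, 4, 5, 3, 4])  # "forensic tools are deployed"
reduction  = np.array([4, 4, 5, 3, 5, 2, 3, 5, 2, 4])  # "internet fraud has reduced"

# Percentage of respondents agreeing (rating >= 4) with each statement.
pct_deploy = 100 * np.mean(deployment >= 4)
pct_reduce = 100 * np.mean(reduction >= 4)

# Pearson correlation between the two sets of responses.
r = np.corrcoef(deployment, reduction)[0, 1]
print(pct_deploy, pct_reduce, round(r, 3))
```

A strong positive `r` on such data would be read, as in the study, as forensic-tool deployment being associated with perceived fraud reduction.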
Reliable and timely event notification for publish/subscribe services over the internet
The publish/subscribe paradigm is gaining attention for the development of several applications in wide area networks (WANs) due to its intrinsic time, space, and synchronization decoupling properties, which meet the scalability and asynchrony requirements of those applications. However, while communication in a WAN may be affected by the unpredictable behavior of the network, with messages that can be dropped or delayed, existing publish/subscribe solutions pay little attention to these issues. Yet applications such as business intelligence, critical infrastructures, and financial services require delivery guarantees with strict temporal deadlines. In this paper, we propose a framework that enforces both reliability and timeliness for publish/subscribe services over a WAN. Specifically, we combine two different approaches: gossiping, to retrieve missing packets in case of incomplete information, and network coding, to reduce the number of retransmissions and, consequently, the latency. We provide an analytical model that describes the information recovery capabilities of our algorithm, together with a simulation-based study using a real workload from the Air Traffic Control domain, which shows that the proposed solution ensures reliable event notification over a WAN within a reasonable bounded time window. © 2013 IEEE
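To illustrate how coding can cut retransmissions, here is a minimal sketch of XOR-based parity coding over a "generation" of event packets; a subscriber missing one packet rebuilds it locally instead of gossiping for a retransmission. The packet contents, sizes, and single-parity scheme are illustrative assumptions, not the paper's actual protocol.

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Bytewise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

def make_parity(packets):
    """Build one XOR parity packet over a generation of equal-size packets."""
    parity = packets[0]
    for p in packets[1:]:
        parity = xor_bytes(parity, p)
    return parity

def recover_missing(received, parity):
    """Recover a single missing packet from the parity and the packets that arrived."""
    missing = parity
    for p in received:
        missing = xor_bytes(missing, p)
    return missing

generation = [b"evt1", b"evt2", b"evt3", b"evt4"]  # hypothetical event notifications
parity = make_parity(generation)
# Suppose the WAN drops the third packet: the subscriber rebuilds it from the rest.
arrived = [generation[0], generation[1], generation[3]]
print(recover_missing(arrived, parity))  # b'evt3'
```

One parity packet per generation repairs any single loss; real network-coding schemes use richer codes to tolerate multiple losses per generation.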
TCP smart framing: a segmentation algorithm to reduce TCP latency
TCP Smart Framing, or TCP-SF for short, enables the Fast Retransmit/Recovery algorithms even when the congestion window is small. Without modifying the TCP congestion control based on the additive-increase/multiplicative-decrease paradigm, TCP-SF adopts a novel segmentation algorithm: while classic TCP always tries to send full-sized segments, a TCP-SF source adopts a more flexible segmentation algorithm that tries to keep more than three segments in flight, so as to enable Fast Recovery. We motivate this choice with real traffic measurements, which indicate that today's traffic is dominated by short-lived flows, whose only means of recovering from a packet loss is triggering a Retransmission Timeout. The key idea of TCP-SF can be implemented on top of any TCP flavor, from Tahoe to SACK, requires modifications only to the server TCP stack, and can easily be coupled with recent TCP enhancements. The performance of the proposed TCP modification was studied by means of simulations, live measurements, and an analytical model. In addition, the analytical model we have devised has a general scope, making it a valid tool for TCP performance evaluation in the small-window region. Improvements are remarkable under several buffer management schemes, and are maximized by byte-oriented schemes.
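The core segmentation idea can be sketched in a few lines: when little data remains, split it into at least four smaller segments so that a single loss still produces the three duplicate ACKs needed for Fast Retransmit. The constants and the simple quartering rule below are illustrative assumptions, not the paper's exact algorithm.

```python
MSS = 1460      # full-sized segment payload in bytes (typical Ethernet value)
MIN_SEG = 256   # floor to avoid pathologically small segments (assumption)

def sf_segment_size(unsent_bytes: int) -> int:
    """Pick a segment size that keeps more than 3 segments in flight,
    enabling Fast Retransmit (3 duplicate ACKs) even for short flows.
    Classic TCP would simply send min(unsent_bytes, MSS)."""
    if unsent_bytes >= 4 * MSS:
        return MSS                      # long flow: behave like classic TCP
    # Short tail of the flow: split the remainder into at least 4 segments.
    return max(unsent_bytes // 4, MIN_SEG)

print(sf_segment_size(10 * 1460))  # 1460: plenty of data, full-sized segments
print(sf_segment_size(2000))       # 500: short flow split into 4 segments
```

The trade-off is a higher header overhead on short flows in exchange for loss recovery without waiting for a Retransmission Timeout.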
Software Defined Networks based Smart Grid Communication: A Comprehensive Survey
The current power grid is no longer a feasible solution due to the ever-increasing user demand for electricity, aging infrastructure, and reliability issues, and thus requires transformation into a better grid, a.k.a. the smart grid (SG). The key features that distinguish the SG from the conventional electrical power grid are its capability to perform two-way communication, demand-side management, and real-time pricing. Despite all the advantages that the SG will bring, there are certain issues specific to the SG communication system. For instance, network management of current SG systems is complex, time-consuming, and done manually. Moreover, the SG communication (SGC) system is built on vendor-specific devices and protocols; current SG systems are therefore not protocol independent, which leads to interoperability issues. Software-defined networking (SDN) has been proposed to monitor and manage communication networks globally. This article serves as a comprehensive survey of SDN-based SGC. We first discuss a taxonomy of the advantages of SDN-based SGC. We then discuss SDN-based SGC architectures, along with case studies. The article provides an in-depth discussion of routing schemes for SDN-based SGC, as well as a detailed survey of security and privacy schemes applied to it. We furthermore present challenges, open issues, and future research directions related to SDN-based SGC.
Comment: Accepted
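SDN's central idea, referenced throughout the abstract, is a logically centralized controller that installs vendor-independent match-action rules into switch flow tables. The toy sketch below illustrates only that idea; the class names, rule fields, and plain string matching (no real CIDR logic) are illustrative assumptions, not any SG protocol or the OpenFlow API.

```python
class Switch:
    """A forwarding device holding a prioritized list of match-action rules."""
    def __init__(self, name):
        self.name = name
        self.flow_table = []  # list of (match_dict, action), first match wins

    def handle(self, pkt):
        for match, action in self.flow_table:
            # Illustrative exact-string matching on packet fields.
            if all(pkt.get(k) == v for k, v in match.items()):
                return action
        return "send_to_controller"  # table miss: punt to the controller

class Controller:
    """Global view of the network: pushes rules to every managed switch."""
    def install(self, switch, match, action):
        switch.flow_table.append((match, action))

ctrl = Controller()
sw = Switch("substation-1")  # hypothetical smart-grid edge switch
# Forward known smart-meter traffic; everything else goes to the controller.
ctrl.install(sw, {"src": "10.0.1.0/24", "type": "meter"}, "forward:port2")
print(sw.handle({"src": "10.0.1.0/24", "type": "meter"}))   # forward:port2
print(sw.handle({"src": "192.168.9.9", "type": "meter"}))   # send_to_controller
```

Because policy lives in the controller rather than in each vendor's device firmware, this separation is what gives SDN-based SGC its protocol independence and simplified management.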
ElasTraS: An Elastic Transactional Data Store in the Cloud
Over the last couple of years, "cloud computing" or "elastic computing" has emerged as a compelling and successful paradigm for Internet-scale computing. One of the major contributing factors to this success is the elasticity of resources. In spite of the elasticity provided by the infrastructure and the scalable design of the applications, the elephant (i.e., the underlying database) that drives most of these web-based applications is neither elastic nor very scalable, and hence limits the scalability of the whole system. In this paper, we propose ElasTraS, which addresses the scalability and elasticity of the data store in a cloud computing environment, leveraging the elastic nature of the underlying infrastructure while providing scalable transactional data access. This paper presents the design of a system in progress, highlighting the major design choices, analyzing the different guarantees provided by the system, and identifying several important challenges for the research community striving for computing in the cloud.
Comment: 5 pages, In Proc. of USENIX HotCloud 200
Heterogeneous Networked Data Recovery from Compressive Measurements Using a Copula Prior
Large-scale data collection by means of wireless sensor networks and Internet-of-Things technology poses various challenges given the limited transmission, computation, and energy resources of the associated wireless devices. Compressive data gathering based on compressed sensing has proven a well-suited solution to the problem. Existing designs exploit the spatiotemporal correlations among data collected by a single sensing modality. However, many applications, such as environmental monitoring, involve collecting heterogeneous data that are intrinsically correlated. In this study, we propose to leverage the correlation among multiple heterogeneous signals when recovering the data from compressive measurements. To this end, we propose a novel recovery algorithm, built upon belief-propagation principles, that leverages correlated information from multiple heterogeneous signals. To efficiently capture the statistical dependencies among diverse sensor data, the proposed algorithm uses the statistical model of copula functions. Experiments with heterogeneous air-pollution sensor measurements show that the proposed design provides significant performance improvements over state-of-the-art compressive data gathering and recovery schemes that use classical compressed sensing, compressed sensing with side information, and distributed compressed sensing.
Comment: Accepted to IEEE Transactions on Communications
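For readers unfamiliar with the baseline being improved upon, here is a minimal classical compressed-sensing recovery sketch: a sparse signal is reconstructed from fewer measurements than its length via Orthogonal Matching Pursuit. This is not the paper's copula/belief-propagation algorithm; the dimensions, sparsity pattern, and Gaussian sensing matrix are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
n, m, k = 128, 40, 3                 # ambient dimension, measurements, sparsity
x = np.zeros(n)
x[[5, 60, 101]] = [1.5, -2.0, 1.0]   # hypothetical sparse sensor signal
A = rng.standard_normal((m, n)) / np.sqrt(m)  # compressive sensing matrix
y = A @ x                            # m << n compressive measurements

def omp(A, y, k):
    """Greedy recovery: repeatedly pick the column most correlated with the
    residual, then least-squares fit on the selected support."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(A, y, k)
print(np.flatnonzero(np.round(x_hat, 6)))  # recovered support indices
```

The paper's contribution sits on top of this kind of recovery: instead of treating each modality independently, it couples the signals through a copula model so that measurements of one sensor type help reconstruct the others.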