21st Century Simulation: Exploiting High Performance Computing and Data Analysis
This paper identifies, defines, and analyzes the limitations imposed on Modeling and Simulation by outmoded
paradigms in computer utilization and data analysis. The authors then discuss two emerging capabilities to
overcome these limitations: High Performance Parallel Computing and Advanced Data Analysis. First, parallel
computing, in supercomputers and Linux clusters, has proven effective by providing users an advantage in
computing power. This has been characterized as a ten-year lead over the use of single-processor computers.
Second, advanced data analysis techniques are both necessitated and enabled by this leap in computing power.
JFCOM's JESPP project is one of the few simulation initiatives to effectively embrace these concepts. The
challenges facing the defense analyst today have grown to include the need to consider operations among non-combatant
populations, to focus on impacts to civilian infrastructure, to differentiate combatants from non-combatants,
and to understand non-linear, asymmetric warfare. These requirements stretch both current
computational techniques and data analysis methodologies. In this paper, documented examples and potential
solutions will be advanced. The authors discuss the paths to successful implementation based on their experience.
Reviewed technologies include parallel computing, cluster computing, grid computing, data logging, OpsResearch,
database advances, data mining, evolutionary computing, genetic algorithms, and Monte Carlo sensitivity analyses.
The modeling and simulation community has significant potential to provide more opportunities for training and
analysis. Simulations must include increasingly sophisticated environments, better emulations of foes, and more
realistic civilian populations. Overcoming the implementation challenges will produce dramatically better insights
for trainees and analysts. High Performance Parallel Computing and Advanced Data Analysis promise increased
understanding of future vulnerabilities to help avoid unneeded mission failures and unacceptable personnel losses.
The authors set forth road maps for rapid prototyping and adoption of advanced capabilities. They discuss the
beneficial impact of embracing these technologies, as well as the risk mitigation required to ensure success.
Revealing the Landscape of Privacy-Enhancing Technologies in the Context of Data Markets for the IoT: A Systematic Literature Review
IoT data markets in public and private institutions have become increasingly
relevant in recent years because of their potential to improve data
availability and unlock new business models. However, exchanging data in
markets bears considerable challenges related to disclosing sensitive
information. Despite considerable research focused on different aspects of
privacy-enhancing data markets for the IoT, none of the solutions proposed so
far seems to find a practical adoption. Thus, this study aims to organize the
state-of-the-art solutions, analyze and scope the technologies that have been
suggested in this context, and structure the remaining challenges to determine
areas where future research is required. To accomplish this goal, we conducted
a systematic literature review on privacy enhancement in data markets for the
IoT, covering 50 publications dated up to July 2020, and provided updates with
24 publications dated up to May 2022. Our results indicate that most research
in this area has emerged only recently, and no IoT data market architecture has
established itself as canonical. Existing solutions frequently lack the
required combination of anonymization and secure computation technologies.
Furthermore, there is no consensus on the appropriate use of blockchain
technology for IoT data markets and a low degree of leveraging existing
libraries or reusing generic data market architectures. We also identified
significant remaining challenges, such as the copy problem and the recursive enforcement
problem, which, while solutions have been suggested to some extent, are
often not sufficiently addressed in proposed designs. We conclude that
privacy-enhancing technologies need further improvement to positively impact
data markets so that, ultimately, the value of data is preserved through data
scarcity while both users' privacy and business-critical information are protected.
Comment: 49 pages, 17 figures, 11 tables
Mining social network data for personalisation and privacy concerns: A case study of Facebook’s Beacon
This is the post-print version of the final published paper.
The popular success of online social networking sites (SNS) such as Facebook is a hugely tempting data-mining resource for businesses engaged in personalised marketing. The use of personal information, willingly shared within online friends' networks, intuitively appears to be a natural extension of current advertising strategies such as word-of-mouth and viral marketing. However, the use of SNS data for personalised marketing has provoked outrage amongst SNS users and radically highlighted the issue of privacy concern. This paper inverts the traditional approach to personalisation by conceptualising the limits of data mining in social networks using privacy concern as the guide. A qualitative investigation of 95 blogs containing 568 comments was collected during the failed launch of Beacon, a third-party marketing initiative by Facebook. Thematic analysis resulted in the development of a taxonomy of privacy concerns, which offers a concrete means for online businesses to better understand the SNS business landscape, especially with regard to the limits of the use and acceptance of personalised marketing in social networks.