
    Transcriptomic Data Analysis Using Graph-Based Out-of-Core Methods

    Biological data derived from high-throughput microarrays can be transformed into finite, simple, undirected graphs and analyzed using tools first introduced by the Langston Lab at the University of Tennessee. Transforming raw data can be broken down into three main tasks: data normalization, generation of similarity metrics, and threshold selection. The choice of methods used in each of these steps affects the final outcome of the graph with respect to size, density, and structure. A number of different algorithms are examined and analyzed to illustrate the magnitude of these effects. Graph-based tools are then used to extract putative gene networks. These tools are loosely based on the concept of a clique, which yields clusters optimized for density. Innovative additions to the paraclique algorithm, developed at the Langston Lab, are introduced to generate results that have the highest average correlation or the highest density. A new suite of algorithms is then presented that exploits a priori gene interactions. Aptly named the anchored analysis toolkit, these algorithms use known interactions as anchor points for generating subgraphs, which are then analyzed for their graph structure. This results in clusters that might otherwise have been lost in noise. A main product of this thesis is a novel collection of algorithms that generate exact solutions to the maximum clique problem for graphs too large to fit within core memory. No other algorithms are currently known to produce exact solutions to this problem for extremely large graphs. A combination of in-core and out-of-core techniques is used in conjunction with a distributed-memory programming model. These algorithms take into consideration such pitfalls as external disk I/O and hardware failure and recovery. Finally, a web-based tool is described that provides researchers access to the aforementioned algorithms.
The Graph Algorithms Pipeline for Pathway Analysis tool, GrAPPA, was previously developed by the Langston Lab and provides the software needed to take raw microarray data as input and preprocess, analyze, and post-process it in a single package. GrAPPA also provides access to high-performance computing resources via the TeraGrid.
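The pipeline described above (normalize, compute gene-gene similarities, threshold into a graph, then mine dense subgraphs) can be sketched in miniature. This is a hedged illustration, not the Langston Lab's implementation: the toy expression profiles, the choice of Pearson correlation as the similarity metric, the 0.95 threshold, and the in-memory Bron-Kerbosch enumeration are all assumptions for demonstration. The thesis's contribution is precisely an out-of-core, distributed exact solver, which this in-core sketch does not attempt.

```python
from itertools import combinations
from math import sqrt

def pearson(x, y):
    """Pearson correlation of two equal-length expression profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def correlation_graph(expr, tau):
    """Threshold |r| >= tau into a simple undirected graph (adjacency sets)."""
    adj = {g: set() for g in expr}
    for g, h in combinations(expr, 2):
        if abs(pearson(expr[g], expr[h])) >= tau:
            adj[g].add(h)
            adj[h].add(g)
    return adj

def maximum_clique(adj):
    """Exact maximum clique via Bron-Kerbosch enumeration (in-core only)."""
    best = set()

    def bk(r, p, x):
        nonlocal best
        if not p and not x:          # r is a maximal clique
            if len(r) > len(best):
                best = set(r)
            return
        for v in list(p):
            bk(r | {v}, p & adj[v], x & adj[v])
            p.remove(v)
            x.add(v)

    bk(set(), set(adj), set())
    return best

# Hypothetical expression profiles: g1-g3 are co-expressed, g4 is unrelated.
expr = {
    "g1": [1.0, 2.0, 3.0, 4.0, 5.0],
    "g2": [2.0, 4.0, 6.0, 8.0, 10.0],
    "g3": [1.1, 2.0, 3.2, 3.9, 5.1],
    "g4": [5.0, 1.0, 4.0, 2.0, 3.0],
}
clique = maximum_clique(correlation_graph(expr, tau=0.95))
print(sorted(clique))  # ['g1', 'g2', 'g3']
```

The choice of threshold tau dominates the outcome, as the abstract notes: lowering it densifies the graph and merges clusters, raising it fragments them.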

    The Cord Weekly (June 26, 2001)


    Cross-border Investigative Journalism: a critical perspective

    Focusing on power relationships in the context of cross-border investigative journalism (CBIJ), this study takes a critical approach to the emerging field. The thesis differs from other accounts of CBIJ, whether from practitioners or academics. Although studies in global media have examined new frameworks and developments, as well as emerging practices in global investigative journalism in a digitally networked society, they have usually done so from a positivist view of strengthening democracy, with an added techno-euphoria. This research presents an analysis of the power relationships in CBIJ and of its challenges in the global context. Going beyond the usual positive, tech-determinist approach, the thesis explores how journalistic practices in this field are shaped in two different CBIJ networks when their two major CBIJ projects overlap, through the study of data generated by participant observation, autoethnography, and archival research. The analysis gives special attention to both communication technology infrastructures and non-profit funding models, and shows the power inequalities and limitations of CBIJ networks as well as the implications and unintended consequences of contemporary platform investigative journalism. As such, this study provides the insight of an Eastern European journalist, long-time practitioner, and CBIJ network facilitator, and its analytic focus on the backstage of managing access control in two major cross-border investigations enables a further contribution. This thesis finds that CBIJ has been built on a (white, male) elitist identity for investigative journalists, first in the US in the 1970s and then in Europe and beyond in the context of post-Cold War globalisation. To add credibility to this identity, scientific techniques were replicated in what was called 'precision journalism', which later became data journalism and is now used in the mega-leaks CBIJ projects.
Such data-sets have grown to the point where they constitute the authoritative source many journalists would like access to (e.g. the Panama Papers or Football Leaks data). While CBIJ has shifted to rely heavily on big data-sets (leaks) and on expensive software and computing power (to process data and to share information securely across borders), statistical techniques do not reveal the main stories, and most of the data work is done by engineers who index and clean the data and make it available for the simplest search operations possible (type and click). Because of this dependency, the research shows that today's CBIJ networks incur high costs which, in the case of the largest CBIJ organisations, are not paid by those organisations' media partners but are subsidised by media assistance or philanthropy, both governmental and private. This double capture, in the technology and non-profit realms, gives the few financial donors and platform owners unusually strong leverage over the CBIJ field at a global level, without any accountability. Contrary to public claims, this thesis finds that investigative platforms can act as amplifying agents of national commercial (and non-profit) competitive interests at an international network level. Furthermore, journalists accepted as members of a given investigative platform work for free in the platform realm; such network technological infrastructures and the hosted data-sets are not co-owned (in some cases not even co-managed) by all participants in the network. Without decentralised technology design and without governance documents, such platforms are totalitarian governance systems (with surveillance and control built in) that put access control for collaborations in the hands of a few people. Modern CBIJ systems thus re-create the past pain points of the commercial news industry, leaving even fewer gatekeepers than before.
I conclude that the centralisation of socio-technical access control in CBIJ networks, bankrolled by philanthropy, is building more walls and barriers, contrary to current claims and past configurations. As such, the current combination of data journalism, network structures, non-profit and commercial models, and the contemporary 'precariat' indicates that cross-border investigative networks operate in a realm of data feudalism. Combined with the standardisation of the field to be platform-ready, CBIJ also becomes ready for its own colonisation. This research makes an original contribution to the existing literature in global media studies, and more specifically to journalism studies focused on collaborative journalistic practices from a political economy angle. Last but not least, I hope this thesis contributes to the de-Westernising process of journalism studies.

    Social Intelligence Design 2007. Proceedings Sixth Workshop on Social Intelligence Design


    The Cord Weekly (March 17, 1999)


    A Survey on Wireless Security: Technical Challenges, Recent Advances and Future Trends

    This paper examines the security vulnerabilities and threats imposed by the inherent open nature of wireless communications, and surveys efficient defense mechanisms for improving wireless network security. We first summarize the security requirements of wireless networks, including their authenticity, confidentiality, integrity, and availability issues. Next, a comprehensive overview of the security attacks encountered in wireless networks is presented from the perspective of the network protocol architecture, with the potential security threats discussed at each protocol layer. We also survey the existing security protocols and algorithms adopted in current wireless network standards, such as Bluetooth, Wi-Fi, WiMAX, and long-term evolution (LTE) systems. Then, we discuss the state of the art in physical-layer security, an emerging technique for securing the open communications environment against eavesdropping attacks at the physical layer. We also introduce the family of jamming attacks and their countermeasures, including the constant jammer, intermittent jammer, reactive jammer, adaptive jammer, and intelligent jammer. Additionally, we discuss the integration of physical-layer security into existing authentication and cryptography mechanisms for further securing wireless networks. Finally, technical challenges that remain unresolved at the time of writing are summarized and future trends in wireless security are discussed.
    Comment: 36 pages. Accepted to appear in Proceedings of the IEEE, 201
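The jammer taxonomy above can be made concrete with a toy slot-based model. This sketch is an assumption-laden illustration, not taken from the survey: it assumes transmissions occur with a fixed probability per slot, that a jammed slot always destroys the packet, and that each jammer is simply a rule deciding whether to jam a given slot. It shows the trade-off such taxonomies capture: a constant jammer is maximally destructive but runs at 100% duty cycle (energy-hungry and easy to detect), while a reactive jammer achieves the same packet loss while transmitting only when it senses traffic.

```python
import random

def simulate(jammer, slots=2000, tx_prob=0.3, seed=7):
    """Return (fraction of packets lost, jammer duty cycle) over `slots` slots."""
    rng = random.Random(seed)
    sent = lost = jam_slots = 0
    for _ in range(slots):
        tx = rng.random() < tx_prob      # does the legitimate node transmit?
        jam = jammer(tx)                 # jammer decides, possibly sensing tx
        jam_slots += jam
        if tx:
            sent += 1
            lost += jam                  # assumption: a jammed slot kills the packet
    return lost / sent, jam_slots / slots

constant_jammer = lambda sensed_tx: True        # jams every slot, blind
reactive_jammer = lambda sensed_tx: sensed_tx   # jams only on sensed traffic

for name, jammer in [("constant", constant_jammer), ("reactive", reactive_jammer)]:
    loss, duty = simulate(jammer)
    print(f"{name}: loss={loss:.0%}, duty cycle={duty:.0%}")
```

Both jammers drive packet loss to 100% in this model, but the reactive jammer's duty cycle stays near the traffic rate (here about 30%), which is why the survey's countermeasures for reactive jamming center on making transmissions harder to sense.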