6,060 research outputs found

    Harmonious Living: Sustainability, Ecology, and Eco-Islam in Wales

    This thesis is an in-depth examination of Eco-Islam in Wales. Eco-Islam refers to the conceptual intersection of Islamic principles with environmental and ecological concerns. It is not necessarily a formalised movement with a centralised structure but rather a broader concept that explores the compatibility between Islamic teachings and environmental stewardship. It emphasises the idea that Islamic values and ethics can be applied to address contemporary environmental challenges. This dissertation addresses the question of the normative influence of Islamic environmental principles and their implementation within Welsh Muslim communities and Welsh society. More generally, this thesis is embedded in the academic discourse on the normative role and agency of religions in motivating their members to engage in pro-environmental behaviour. Given the urgency of the environmental crisis facing humanity, which requires a concerted effort from all sectors of society, the research question of this thesis is particularly relevant. Furthermore, despite the growing body of literature on ecology and Islam, there has been little research on the practical implementation of Islamic teachings on nature. Therefore, whilst giving a comprehensive overview of Islamic environmental ethics based on a literature review, the thesis also provides research data on the Eco-Islam movement based on fieldwork conducted in Wales. Particular attention is paid to the social and power structures that contribute to or hinder the development of a Muslim environmental movement. The study provides practical recommendations for better cooperation between faith communities and the (still) predominantly secular environmental movement, with particular attention to the challenges faced by minority communities such as the Muslim communities in Wales.

    Improving Cross-Lingual Transfer Learning for Event Detection

    The widespread adoption of applications powered by Artificial Intelligence (AI) backbones has unquestionably changed the way we interact with the world around us. Applications such as automated personal assistants, automatic question answering, and machine translation systems have become mainstays of modern culture thanks to recent considerable advances in Natural Language Processing (NLP) research. Nonetheless, with over 7,000 spoken languages in the world, a considerable number of marginalized communities remain unable to benefit from these technological advancements, largely because of the language they speak. Cross-Lingual Learning (CLL) looks to address this issue by transferring the knowledge acquired from a popular, high-resource source language (e.g., English, Chinese, or Spanish) to a less favored, lower-resourced target language (e.g., Urdu or Swahili). This dissertation leverages the Event Detection (ED) sub-task of Information Extraction (IE) as a testbed and presents three novel approaches that improve cross-lingual transfer learning from distinct perspectives: (1) direct knowledge transfer, (2) hybrid knowledge transfer, and (3) few-shot learning.
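
    The direct-transfer setting can be illustrated with a short, hedged sketch: a shared multilingual encoder is fine-tuned on English trigger annotations only and then applied unchanged to a sentence in a lower-resourced language. The model name, event labels, and example sentence below are illustrative assumptions, not the systems proposed in the dissertation.

```python
# Sketch of "direct" cross-lingual transfer for event detection:
# a multilingual encoder with a token-classification head is trained on
# English trigger annotations, then applied unchanged to a target language.
from transformers import AutoTokenizer, AutoModelForTokenClassification
import torch

labels = ["O", "B-Attack", "B-Movement"]        # hypothetical event types
model_name = "xlm-roberta-base"                 # shared multilingual encoder

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(
    model_name, num_labels=len(labels)
)
# ... fine-tune `model` on English sentences with trigger labels here ...

# Zero-shot application to a target-language sentence (here Swahili):
sentence = "Wanajeshi walishambulia mji jana usiku."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits             # (1, seq_len, num_labels)
predictions = logits.argmax(-1).squeeze().tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, pred in zip(tokens, predictions):
    print(token, labels[pred])                  # predicted trigger label per token
```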

    Ecological and confined domain ontology construction scheme using concept clustering for knowledge management

    Knowledge management in a structured system is a complicated task that requires common, standardized methods acceptable to all actors in the system. Ontology, in this regard, is a primary element and plays a central role in knowledge management, interoperability between various departments, and better decision-making. Ontology construction for structured systems involves both logical and structural complications. Researchers have already proposed a variety of domain ontology construction schemes. However, these schemes omit some important phases of ontology construction that make ontologies more collaborative, and they do not provide details of the activities and methods involved in constructing an ontology, which may cause difficulty in implementing it. The major objectives of this research were to compare some existing ontology construction schemes and to propose an enhanced ecological and confined domain ontology construction (EC-DOC) scheme for structured knowledge management. The proposed scheme introduces five important phases for constructing an ontology, with a major focus on conceptualizing and clustering domain concepts. In the conceptualization phase, a glossary of domain-related concepts and their properties is maintained, and a Fuzzy C-Means soft clustering mechanism is used to form clusters of these concepts. In addition, the localization of concepts is performed immediately after the conceptualization phase, and a translation file of localized concepts is created. The EC-DOC scheme can provide accurate concepts for the terms of a specific domain, and these concepts can be made available in a preferred local language.
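
    The clustering step of the conceptualization phase can be sketched as follows. This is a minimal illustration assuming concepts are represented as numeric feature vectors (in practice, term embeddings); the terms, vectors, and cluster count are invented for the example, and a hand-rolled Fuzzy C-Means stands in for the scheme's actual tooling.

```python
# Soft-cluster glossary concepts with Fuzzy C-Means so that each concept
# can belong to several clusters with graded membership.
import numpy as np

def fuzzy_c_means(X, n_clusters, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], n_clusters))
    U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per concept
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-10
        U = 1.0 / (dist ** (2 / (m - 1)))        # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

concepts = ["river", "lake", "forest", "permit", "licence"]   # toy glossary terms
X = np.array([[0.9, 0.1], [0.8, 0.2], [0.7, 0.1],             # toy "ecology" features
              [0.1, 0.9], [0.2, 0.8]])                        # toy "regulation" features
centers, U = fuzzy_c_means(X, n_clusters=2)
for term, memberships in zip(concepts, U.round(2)):
    print(term, memberships)                     # graded membership per cluster
```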

    A Comprehensive Survey on Applications of Transformers for Deep Learning Tasks

    The Transformer is a deep neural network architecture that employs a self-attention mechanism to capture the contextual relationships within sequential data. Unlike conventional neural networks or updated versions of Recurrent Neural Networks (RNNs) such as Long Short-Term Memory (LSTM), transformer models excel at handling long-range dependencies between input sequence elements and enable parallel processing. As a result, transformer-based models have attracted substantial interest among researchers in the field of artificial intelligence. This can be attributed to their immense potential and remarkable achievements, not only in Natural Language Processing (NLP) tasks but also in a wide range of domains, including computer vision, audio and speech processing, healthcare, and the Internet of Things (IoT). Although several survey papers have been published highlighting the transformer's contributions in specific fields, architectural differences, or performance evaluations, there is still a significant absence of a comprehensive survey encompassing its major applications across various domains. Therefore, we undertook the task of filling this gap by conducting an extensive survey of proposed transformer models from 2017 to 2022. Our survey identifies the top five application domains for transformer-based models, namely NLP, Computer Vision, Multi-Modality, Audio and Speech Processing, and Signal Processing. We analyze the impact of highly influential transformer-based models in these domains and classify them by task using a proposed taxonomy. Our aim is to shed light on the existing potential and future possibilities of transformers for enthusiastic researchers, thus contributing to a broader understanding of this groundbreaking technology.
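
    The self-attention mechanism at the heart of the Transformer can be summarised in a few lines. The sketch below shows single-head scaled dot-product attention over toy embeddings; it omits multi-head projections, masking, and positional encodings, so it is an illustration of the idea rather than a full implementation.

```python
# Scaled dot-product self-attention: every position attends to every other
# position, which is what captures long-range dependencies and allows the
# whole sequence to be processed in parallel.
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """X: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_k) projection matrices."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])           # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over key positions
    return weights @ V                                # each output mixes all positions

rng = np.random.default_rng(0)
d_model, d_k, seq_len = 8, 4, 5
X = rng.normal(size=(seq_len, d_model))               # toy token embeddings
out = self_attention(X, *(rng.normal(size=(d_model, d_k)) for _ in range(3)))
print(out.shape)                                      # (5, 4)
```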

    Less is More: Restricted Representations for Better Interpretability and Generalizability

    Deep neural networks are prevalent in supervised learning for a wide range of tasks such as image classification, machine translation and even scientific discovery. Their success often comes at the expense of interpretability and generalizability. The increasing complexity of models and the involvement of pre-training make the lack of interpretability ever more pressing, and the combination of outstanding performance when labeled data are abundant with a tendency to overfit when labeled data are limited shows how difficult it is for deep neural networks to generalize to different datasets. This thesis aims to improve interpretability and generalizability by restricting representations. We approach interpretability through attribution analysis, to understand which features contribute to the predictions of BERT, and generalizability through methods that are effective in a low-data regime. We consider two strategies for restricting representations: (1) adding a bottleneck, and (2) introducing compression. Given input x, suppose we want to learn y with the latent representation z (i.e. x→z→y); adding a bottleneck means adding a function R such that L(R(z)) < L(z), and introducing compression means adding a function R such that L(R(y)) < L(y), where L refers to the number of bits. In other words, the restriction is added either in the middle of the pipeline or at the end of it. We first introduce how adding an information bottleneck can help attribution analysis and apply it to investigate BERT's behavior on text classification in Chapter 3. We then extend this attribution method to analyze passage reranking in Chapter 4, where we conduct a detailed analysis of cross-layer and cross-passage behavior. Adding a bottleneck can not only provide insight into deep neural networks but can also be used to increase generalizability. In Chapter 5, we demonstrate the equivalence between adding a bottleneck and doing neural compression. We then leverage this finding in a framework called Non-Parametric learning by Compression with Latent Variables (NPC-LV), and show how optimizing neural compressors can be used for non-parametric image classification with few labeled data. To further investigate how compression alone helps non-parametric learning without latent variables (NPC), we carry out experiments with the universal compressor gzip on text classification in Chapter 6. In Chapter 7, we elucidate methods that adopt the perspective of compression without the actual process of compression, using T5. Using experimental results in passage reranking, we show that our method is highly effective in a low-data regime where only one thousand query-passage pairs are available. In addition to the weakly supervised scenario, we also extend our method to large language models like GPT under almost no supervision, in one-shot and zero-shot settings. The experiments, presented in Chapter 8, show that without extra parameters or in-context learning, GPT can be used for semantic similarity, text classification, and text ranking, and outperforms strong baselines. The thesis thus tackles two big challenges in machine learning, "interpretability" and "generalizability", by restricting representations. We provide both theoretical derivations and empirical results to show the effectiveness of information-theoretic approaches, and we not only design new algorithms but also provide numerous insights into why and how "compression" is so important in understanding deep neural networks and improving generalizability.
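
    The compressor-based, non-parametric idea (NPC) mentioned above can be illustrated with a small, hedged sketch: gzip's compressed length stands in for description length, the Normalized Compression Distance compares a query to labelled training texts, and a nearest-neighbour vote assigns the class. The texts and labels are toy examples, not the datasets or exact procedure used in the thesis.

```python
# Non-parametric text classification by compression: 1-nearest-neighbour
# under the Normalized Compression Distance (NCD), with gzip as the compressor.
import gzip

def clen(s: str) -> int:
    return len(gzip.compress(s.encode("utf-8")))      # proxy for description length L(x)

def ncd(x: str, y: str) -> float:
    cx, cy, cxy = clen(x), clen(y), clen(x + " " + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

train = [("the team won the match last night", "sport"),
         ("parliament passed the new budget bill", "politics"),
         ("the striker scored two goals", "sport")]
query = "the goalkeeper saved a penalty in the final"

label = min(train, key=lambda pair: ncd(query, pair[0]))[1]   # 1-NN by NCD
print(label)                                                  # likely "sport"
```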

    A new global media order? : debates and policies on media and mass communication at UNESCO, 1960 to 1980

    Defence date: 24 June 2019. Examining Board: Professor Federico Romero, European University Institute (Supervisor); Professor Corinna Unger, European University Institute (Second Reader); Professor Iris Schröder, Universität Erfurt (External Advisor); Professor Sandrine Kott, Université de Genève. The 1970s, a UNESCO report claimed, would be the "communication decade". UNESCO had started research on new means of mass communication for development purposes in the 1960s. In the 1970s, the issue evolved into a debate on the so-called "New World Information and Communication Order" (NWICO) and the democratisation of global media. It led UNESCO itself into a major crisis in the 1980s. My project traces a dual trajectory that shaped this global debate on transnational media. The first follows communications from being seen as a tool and goal of national development in the 1960s, to communications seen as a catalyst for recalibrated international political, cultural and economic relations. The second relates to the recurrent attempts, and eventual failure, of various actors to engage UNESCO as a platform to promote a new global order. I take UNESCO as an observation post to study national ambitions intersecting with internationalist claims to universality, changing understandings of the role of media in development and international affairs, and competing visions of world order. Looking at the modes of this debate, the project also sheds light on the evolving practices of internationalism. Located in the field of a new international history, this study relates to the recent rediscovery of the "new order" discourses of the 1970s as well as to the increasingly diversified literature on internationalism. With its focus on international communications and attempts at regulating them, it also contributes to an international media history of the late twentieth century. The emphasis on the role of international organisations as well as on voices from the Global South contributes to our understanding of the historic macro-processes of decolonisation, globalisation and the Cold War.

    Text Summarisation And Translation Across Multiple Languages

    Multilingual text summarization is a vital research area within Natural Language Processing (NLP), focused on distilling the crucial information from documents written in diverse languages into concise summaries that retain the original context. This work approaches the task with Hugging Face's Transformers framework, exploring the methodologies and techniques this framework makes available for multilingual summarization and translation. By synergizing cutting-edge NLP techniques with the intricacies of language diversity, the research aims to push the boundaries of NLP in multi-language text summarization and translation and to cultivate effortless cross-cultural communication in an increasingly interconnected global landscape.
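
    A minimal sketch of the pipeline-based approach is shown below. The checkpoint name is an assumption (one publicly available mT5 model fine-tuned for multilingual summarization); any comparable multilingual summarization model can be substituted, and the input documents are illustrative.

```python
# Multilingual abstractive summarization via the Hugging Face Transformers
# pipeline API; the same model handles documents in several languages.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="csebuetnlp/mT5_multilingual_XLSum",   # assumed multilingual checkpoint
)

documents = {
    "en": "The city council approved a new recycling programme for households ...",
    "es": "El ayuntamiento aprobó un nuevo programa de reciclaje para los hogares ...",
}
for lang, text in documents.items():
    summary = summarizer(text, max_length=40, min_length=5, do_sample=False)
    print(lang, summary[0]["summary_text"])      # concise summary per language
```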

    Evaluating automated and hybrid neural disambiguation for African historical named entities

    Documents detailing South African history contain ambiguous names. Ambiguity may arise because different people share the same name or because the same person is referred to by multiple different names. Thus, when searching for or attempting to extract information about a particular person, the name used may affect the results. This problem may be alleviated by using a Named Entity Disambiguation (NED) system to disambiguate names by linking them to a knowledge base. In recent years, transformer-based language models have led to improvements in NED systems. Furthermore, multilingual language models have shown the ability to learn concepts across languages, reducing the amount of training data required for low-resource languages. Thus, a multilingual language model-based NED system was developed to disambiguate people's names within a historical South African context, using documents written in English and isiZulu from the Five Hundred Year Archive (FHYA). The multilingual language model-based system substantially improved on a probability-based baseline and achieved a micro F1-score of 0.726, while the entity linking component linked 81.9% of the mentions to the correct entity. However, the system's performance on documents written in isiZulu was significantly lower than on documents written in English, so the system was augmented with handcrafted rules. The addition of handcrafted rules resulted in a small but significant improvement in performance compared to the unaugmented NED system.
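
    The core disambiguation step can be sketched as follows: a mention in context and the descriptions of candidate knowledge-base entities are embedded with a multilingual encoder, and the mention is linked to the highest-scoring entity. The encoder checkpoint, candidate entities, and sentences below are illustrative assumptions rather than the system evaluated in the thesis.

```python
# Link a mention in context to the most similar knowledge-base entry using
# multilingual sentence embeddings and cosine similarity.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")  # assumed checkpoint

kb = {
    "Shaka kaSenzangakhona": "Zulu king who founded the Zulu Kingdom in the early 1800s.",
    "Shaka Zungu": "Hypothetical twentieth-century teacher from Durban.",
}
# Illustrative isiZulu context, roughly "Shaka was the king of the Zulu".
mention_context = "UShaka wayeyinkosi yamaZulu."

mention_vec = model.encode(mention_context, convert_to_tensor=True)
entity_vecs = model.encode(list(kb.values()), convert_to_tensor=True)
scores = util.cos_sim(mention_vec, entity_vecs)[0]
best = list(kb.keys())[int(scores.argmax())]
print(best, float(scores.max()))                 # linked entity and its score
```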