35 research outputs found

    Cascading Failures in Power Grids - Analysis and Algorithms

    This paper focuses on cascading line failures in the transmission system of the power grid. Recent large-scale power outages demonstrated the limitations of percolation- and epidemic-based tools in modeling cascades. Hence, we study cascades by using computational tools and a linearized power flow model. We first obtain results regarding the Moore-Penrose pseudo-inverse of the power grid admittance matrix. Based on these results, we study the impact of a single line failure on the flows on other lines. We also illustrate via simulation the impact of the distance and resistance distance on the flow increase following a failure, and discuss the difference from the epidemic models. We then study the cascade properties, considering metrics such as the distance between failures and the fraction of demand (load) satisfied after the cascade (yield). We use the pseudo-inverse of the admittance matrix to develop an efficient algorithm to identify the cascading failure evolution, which can be a building block for cascade mitigation. Finally, we show that finding the set of lines whose removal has the most significant impact (under various metrics) is NP-Hard and introduce a simple heuristic for the minimum yield problem. Overall, the results demonstrate that using the resistance distance and the pseudo-inverse of the admittance matrix provides important insights and can support the development of efficient algorithms.
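    The resistance distance the abstract relies on can be read directly off the Moore-Penrose pseudo-inverse of the grid's admittance matrix (a weighted graph Laplacian under the linearized DC power flow model). The sketch below uses a made-up 4-node example grid with unit susceptances, not a network from the paper:

    ```python
    import numpy as np

    # Hypothetical 4-node grid: edges with unit susceptance (not from the paper).
    edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
    n = 4

    # Build the admittance matrix (weighted Laplacian) of the grid.
    L = np.zeros((n, n))
    for u, v in edges:
        L[u, u] += 1.0
        L[v, v] += 1.0
        L[u, v] -= 1.0
        L[v, u] -= 1.0

    # Moore-Penrose pseudo-inverse of the admittance matrix.
    Lp = np.linalg.pinv(L)

    def resistance_distance(i, j):
        """Effective (resistance) distance between nodes i and j:
        r(i, j) = Lp[i, i] + Lp[j, j] - 2 * Lp[i, j]."""
        return Lp[i, i] + Lp[j, j] - 2.0 * Lp[i, j]

    # For this grid, resistance_distance(0, 1) == 0.625: the direct 1-ohm
    # edge in parallel with the 5/3-ohm detour through nodes 2 and 3.
    ```

    The same pseudo-inverse also yields the post-failure flow redistribution the paper studies, which is why precomputing it supports an efficient cascade-tracing algorithm.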

    LINGUIST: Language Model Instruction Tuning to Generate Annotated Utterances for Intent Classification and Slot Tagging

    We present LINGUIST, a method for generating annotated data for Intent Classification and Slot Tagging (IC+ST) by fine-tuning AlexaTM 5B, a 5-billion-parameter multilingual sequence-to-sequence (seq2seq) model, on a flexible instruction prompt. In a 10-shot novel intent setting for the SNIPS dataset, LINGUIST surpasses state-of-the-art approaches (Back-Translation and Example Extrapolation) by a wide margin, showing absolute improvement for the target intents of +1.9 points on IC Recall and +2.5 points on ST F1 Score. In the zero-shot cross-lingual setting of the mATIS++ dataset, LINGUIST outperforms a strong baseline of Machine Translation with Slot Alignment by +4.14 points absolute on ST F1 Score across 6 languages, while matching performance on IC. Finally, we verify our results on an internal large-scale multilingual dataset for conversational agent IC+ST and show significant improvements over a baseline which uses Back-Translation, Paraphrasing, and Slot Catalog Resampling. To our knowledge, we are the first to demonstrate instruction fine-tuning of a large-scale seq2seq model to control the outputs of multilingual intent- and slot-labeled data generation. Comment: Accepted to The 29th International Conference on Computational Linguistics (COLING 2022), October 12-17, 2022, Gyeongju, Republic of Korea. https://coling2022.org
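    The abstract does not specify LINGUIST's actual prompt format, so the snippet below is only a hypothetical illustration of what an instruction prompt for generating slot-annotated utterances might look like; the function name, bracket annotation scheme, and example intent are all assumptions for illustration:

    ```python
    # Hypothetical sketch of an instruction prompt for IC+ST data generation.
    # The real LINGUIST prompt format is not given in the abstract.

    def build_prompt(intent, seed_utterances, language="English"):
        """Compose an instruction asking a seq2seq model to generate a new
        utterance for `intent`, with slot values marked in brackets."""
        examples = "\n".join(f"- {u}" for u in seed_utterances)
        return (
            f"Generate a new {language} utterance for the intent '{intent}'.\n"
            f"Annotate each slot value as [slot_name value].\n"
            f"Examples:\n{examples}\n"
            f"Output:"
        )

    prompt = build_prompt(
        "BookRestaurant",
        ["book a table at [restaurant_name Luigi's] for [party_size four]"],
    )
    ```

    Generated outputs in this style carry their IC+ST annotations inline, so they can be parsed directly into training data for the downstream classifier and tagger.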

    Towards the Promotion of Tourism and Hotel Services for Orphanage Tourism in Egypt

    Orphanage tourism is a sub-sector of volunteer tourism in which travelers go abroad to "do good" and support other communities. Importance: Providing hospitality services for this segment matters because orphanage tourism stems from the desire for more meaningful and ethical holiday experiences and from the desire to give back to the host communities visited. The study aims to improve the promotion of hospitality services for the orphanage tourism and hospitality pattern in Egypt. Study problem: The problem lies in identifying the strengths, weaknesses, and needs involved in promoting hospitality services for orphanage tourism in Egypt. Methodology: A descriptive approach was used, surveying guests, managers, and supervisors of marketing and public relations departments who are in direct contact with customers. A total of 150 questionnaires were distributed to managers and supervisors in 15 five-star hotels and 15 tourism companies in the Red Sea region; of these, 120 were valid and suitable for statistical analysis. Results: The hospitality services provided for orphanage tourism in the Red Sea region are deficient and need more attention and care, and there is a lack of tourism awareness of the strengths, weaknesses, and needs of orphanage tourism in Egypt. Recommendations: spread tourism awareness of the strengths, weaknesses, and needs involved in promoting tourism and hospitality services for orphanage tourism in Egypt; provide an appropriate environment for services that meet the needs and desires of orphanage tourists; and establish an action plan to improve the quality of services related to this tourism sub-sector.

    AlexaTM 20B: Few-Shot Learning Using a Large-Scale Multilingual Seq2Seq Model

    In this work, we demonstrate that multilingual large-scale sequence-to-sequence (seq2seq) models, pre-trained on a mixture of denoising and Causal Language Modeling (CLM) tasks, are more efficient few-shot learners than decoder-only models on various tasks. In particular, we train a 20-billion-parameter multilingual seq2seq model called Alexa Teacher Model (AlexaTM 20B) and show that it achieves state-of-the-art (SOTA) performance on 1-shot summarization tasks, outperforming a much larger 540B PaLM decoder model. AlexaTM 20B also achieves SOTA in 1-shot machine translation, especially for low-resource languages, across almost all language pairs supported by the model (Arabic, English, French, German, Hindi, Italian, Japanese, Marathi, Portuguese, Spanish, Tamil, and Telugu) on the Flores-101 dataset. We also show that, in the zero-shot setting, AlexaTM 20B outperforms GPT3 (175B) on the SuperGLUE and SQuADv2 datasets and provides SOTA performance on multilingual tasks such as XNLI, XCOPA, Paws-X, and XWinograd. Overall, our results present a compelling case for seq2seq models as a powerful alternative to decoder-only models for Large-scale Language Model (LLM) training.