Examples of works for practicing staccato technique on the clarinet
The stages of strengthening the clarinet's staccato technique were applied through work on repertoire pieces. Rhythm and dynamics exercises intended to speed up staccato passages were included. The most important aim of the study is not staccato practice alone but also attention to the precision of simultaneous finger-tongue coordination. To make the staccato work more productive, étude practice was also incorporated into the repertoire work. Careful attention to these exercises, together with the inspiring effect of staccato practice, added a new dimension to the player's musical identity. Each stage of eight original repertoire studies is described, on the principle that every stage reinforces the subsequent performance and technique. The study reports in which areas the staccato technique is used and what results were obtained, and plans how the notes are to be shaped through finger and tongue coordination and within what kind of practice discipline this takes place. Reed, notation, diaphragm, fingers, tongue, dynamics and discipline were found to form an inseparable whole in staccato technique. A literature review was carried out to survey existing studies on staccato; it showed that repertoire-based staccato studies in clarinet technique are scarce, while a survey of method books showed that études predominate. Accordingly, exercises for speeding up and strengthening the clarinet's staccato technique are presented. It was observed that interspersing repertoire work among staccato études relaxes the mind and further increases motivation. The choice of a correct reed for staccato practice is also emphasised: a suitable reed was found to increase tonguing speed, and a correct reed choice depends on the reed producing sound easily. If the reed does not support the force of tonguing, it is stressed that a more suitable reed must be chosen. Interpreting a piece from beginning to end during staccato work can be difficult; in this respect, the study showed that observing the given musical dynamics eases tonguing performance. Passing the acquired knowledge and experience on to future generations in a way that fosters development is encouraged. How forthcoming pieces can be worked out and how the staccato technique can be mastered is explained, with the aim of resolving staccato problems in a shorter time. It is as important to commit the exercises to memory as it is to teach the fingers their places. A work that emerges as the result of such determination and patience will raise achievement to even higher levels.
Direct gasification of biomass for fuel gas production
The excessive consumption of fossil fuels to satisfy the world's energy and commodity needs has led to the emission of large amounts of greenhouse gases in recent decades, contributing significantly to the greatest environmental threat of the 21st century: climate change. The answer to this man-made disaster is not simple and can only be found if distinct stakeholders and governments cooperate and work together. This is mandatory if we want to change our economy to a more sustainable one, based on renewable materials and powered by perpetual natural energy sources (e.g., wind, solar). In this regard, biomass can play a main role
as an adjustable and renewable feedstock that allows the replacement of fossil
fuels in various applications, and the conversion by gasification allows the
necessary flexibility for that purpose. In fact, fossil fuels are just biomass that
underwent extreme pressures and heat for millions of years. Furthermore,
biomass is a resource that, if not used or managed, increases wildfire risks.
Consequently, we also have the obligation of valorizing and using this
resource.
In this work, new scientific knowledge was obtained to support the
development of direct (air) gasification of biomass in bubbling fluidized bed
reactors to obtain a fuel gas with suitable properties to replace natural gas in
industrial gas burners. This is the first step for the integration and development
of gasification-based biorefineries, which will produce a diverse number of
value-added products from biomass and compete with current petrochemical
refineries in the future. In this regard, solutions for the improvement of the raw
producer gas quality and process efficiency parameters were defined and
analyzed. First, addition of superheated steam as a primary measure allowed the
increase of H2 concentration and H2/CO molar ratio in the producer gas without
compromising the stability of the process. However, the measure mainly
showed potential for the direct (air) gasification of high-density biomass (e.g.,
pellets), due to the necessity of having char accumulation in the reactor bottom
bed for char-steam reforming reactions. Secondly, addition of refuse-derived fuel to the biomass feedstock led to enhanced gasification products, revealing
itself as a highly promising strategy in terms of economic viability and
environmental benefits of future gasification-based biorefineries, due to the
high availability and low costs of wastes. Nevertheless, integrated techno-economic and life cycle analyses must be performed to fully characterize the
process. Thirdly, application of low-cost catalysts as a primary measure revealed potential by allowing the improvement of the producer gas quality (e.g., H2 and CO concentration, lower heating value) and process efficiency parameters with distinct solid materials; in particular, the application of concrete, synthetic fayalite and wood pellet chars showed promising results. Finally, the
economic viability of the integration of direct (air) biomass gasification
processes in the pulp and paper industry was also shown, although the determined parameters are not yet attractive to potential investors. In this context, government policies and appropriate economic instruments are of major relevance to increase the
implementation of these projects.
This work was funded by The Navigator Company and by National Funds through the Fundação para a Ciência e a Tecnologia (FCT). Programa Doutoral em Engenharia da Refinação, Petroquímica e Químic
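The producer-gas quality indicators named above (H2/CO molar ratio, lower heating value) can be illustrated with a small calculation. The gas composition below is a made-up placeholder, not a result from this work, and the per-gas heating values are approximate textbook figures:

```python
# Approximate lower heating values (MJ/Nm3) of the combustible
# fractions of a producer gas; values are typical textbook figures.
LHV_MJ_PER_NM3 = {"H2": 10.8, "CO": 12.6, "CH4": 35.8}

def gas_quality(composition):
    """composition: volume (approximately molar) fractions of the dry gas.
    Returns the H2/CO molar ratio and the mixture's lower heating value."""
    h2_co = composition["H2"] / composition["CO"]
    lhv = sum(frac * LHV_MJ_PER_NM3.get(gas, 0.0)
              for gas, frac in composition.items())
    return h2_co, lhv

# Hypothetical dry-gas composition (volume fractions; N2 and CO2
# are inert and contribute nothing to the heating value).
comp = {"H2": 0.12, "CO": 0.16, "CH4": 0.04, "CO2": 0.15, "N2": 0.53}
ratio, lhv = gas_quality(comp)
print(round(ratio, 2), round(lhv, 2))  # → 0.75 4.74
```

Raising the H2 fraction (e.g., via steam addition, as studied above) increases both indicators in this simple model.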
Towards a sociology of conspiracy theories: An investigation into conspiratorial thinking on Dönmes
This thesis investigates the social and political significance of conspiracy theories, a topic that has been academically neglected despite its historical relevance. The academic literature examines the methodology, social significance and political impacts of these theories in isolation from one another and lacks empirical analyses. In response, this research provides a comprehensive theoretical framework for conspiracy theories by considering their methodology, political impacts and social significance in the light of empirical data. Theoretically, the thesis uses Adorno's semi-erudition theory along with a Girardian approach. It proposes that conspiracy theories are methodologically semi-erudite narratives, i.e. they are biased in favour of a belief and use reason only to prove it. It suggests that conspiracy theories appear in times of power vacuum and provide semi-erudite cognitive maps that relieve the alienation and ontological insecurities of people and groups. In so doing, they enforce social control over their audience through their essentialist, closed-to-interpretation narratives. In order to verify the theory, the study empirically analyses the social and political significance of conspiracy theories about the Dönme community in Turkey. The analysis comprises interviews with conspiracy theorists, conspiracy theory readers and political parties, alongside a frame analysis of the popular conspiracy theory books on Dönmes. These confirm the theoretical framework by showing that the conspiracy theories are fed by the ontological insecurities of Turkish society. Hence, conspiracy theorists, most readers and some political parties respond to their own ontological insecurities and political frustrations by scapegoating Dönmes. Consequently, this work shows that conspiracy theories are important symptoms of society which, while relieving ontological insecurities, do not provide politically fruitful narratives.
Photography and Aesthetics: a critical study on visual and textual narratives in the lifework of Sergio Larraín and its impact in 20th century Europe and Latin America
The main focus of this study is a theoretical exploration of critical approaches applicable to the work of the Chilean photographer Sergio Larraín (1931-2012). It presents analytical tools to contextualise and understand the importance and impact of his work in photographic studies and his portrayal of twentieth-century Latin American and European culture. It inspects in depth a large portion of his photographic work, which is still only partially published and mostly reduced to his "active" period as a photojournalist, leaving aside the personal photographic exploration of his early and late career (C. Mena). This extended material creates a broader scope for understanding his photographs and his standing as a canonical photographer. The study analyses the photographer's trajectory as discourses of recollection of historical memory in time (Mauad) to trace the collective memory associated with Larraín's visual production. Such analysis helps decode his visual imagery and his projection and impact on European and Latin American culture. This strategy helps solve a twofold problem: firstly, it generates an interpretive consistency for understanding the Chilean's photographic practice; secondly, it explores the power of images as an aesthetic experience in the installation of nationalist ideologies and the creation of imaginaries (B. Anderson 163).
Machine learning for managing structured and semi-structured data
As the digitalization of private, commercial, and public sectors advances rapidly, an increasing amount of data is becoming available. In order to gain insights or knowledge from these enormous amounts of raw data, a deep analysis is essential. The immense volume requires highly automated processes with minimal manual interaction. In recent years, machine learning methods have taken on a central role in this task. In addition to the individual data points, their interrelationships often play a decisive role, e.g. whether two patients are related to each other or whether they are treated by the same physician. Hence, relational learning is an important branch of research, which studies how to harness this explicitly available structural information between different data points. Recently, graph neural networks have gained importance. These can be considered an extension of convolutional neural networks from regular grids to general (irregular) graphs.
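The extension from regular grids to general graphs mentioned above can be sketched as a single graph-convolution layer: each node aggregates its neighbours' features with a normalised adjacency matrix before a linear transform. This is a generic illustration with made-up data, not code from the thesis:

```python
import numpy as np

def gcn_layer(A, X, W):
    """One graph convolution layer: aggregate neighbour features
    (including self-loops) with symmetric normalisation, then
    apply a linear transform followed by ReLU."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    deg = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(deg ** -0.5)       # D^{-1/2}
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
    return np.maximum(A_norm @ X @ W, 0.0)  # ReLU

# Toy graph: 3 nodes, edges 0-1 and 1-2
# (e.g., patient-physician relationships as in the text).
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = np.eye(3)                 # one-hot node features
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))   # weights would be learned; random here
H = gcn_layer(A, X, W)
print(H.shape)                # one 4-dimensional embedding per node
```

Stacking such layers lets information flow along multi-hop paths, which is what distinguishes relational learning from treating data points independently.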
Knowledge graphs play an essential role in representing facts about entities in a machine-readable way. While great efforts are made to store as many facts as possible in these graphs, they often remain incomplete, i.e., true facts are missing. Manual verification and expansion of the graphs is becoming increasingly difficult due to the large volume of data and must therefore be assisted or substituted by automated procedures which predict missing facts. The field of knowledge graph completion can be roughly divided into two categories: Link Prediction and Entity Alignment. In Link Prediction, machine learning models are trained to predict unknown facts between entities based on the known facts. Entity Alignment aims at identifying shared entities between graphs in order to link several such knowledge graphs based on some provided seed alignment pairs.
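As a hedged illustration of the Link Prediction task described above, the widely used TransE scoring function ranks a triple (h, r, t) by how closely the head embedding plus the relation embedding lands on the tail embedding. The entities and embeddings below are toy constructions, not data or models from this thesis:

```python
import numpy as np

def transe_score(h, r, t):
    """TransE plausibility score: negative L2 distance ||h + r - t||.
    A higher (less negative) score means the triple is more plausible."""
    return -np.linalg.norm(h + r - t)

rng = np.random.default_rng(42)
dim = 8
# Hypothetical embeddings for a toy knowledge graph.
emb = {name: rng.normal(size=dim) for name in
       ["Berlin", "Germany", "Paris", "capital_of"]}
# Construct the true triple so that h + r ≈ t, as training would.
emb["Germany"] = emb["Berlin"] + emb["capital_of"] + 0.01 * rng.normal(size=dim)

true_score = transe_score(emb["Berlin"], emb["capital_of"], emb["Germany"])
false_score = transe_score(emb["Paris"], emb["capital_of"], emb["Germany"])
print(true_score > false_score)  # → True
```

In practice a model scores every candidate tail for a query (h, r, ?) and the rank of the correct entity drives both training and evaluation.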
In this thesis, we present important advances in the field of knowledge graph completion. For Entity Alignment, we show how to reduce the number of required seed alignments while maintaining performance by novel active learning techniques. We also discuss the power of textual features and show that graph-neural-network-based methods have difficulties with noisy alignment data. For Link Prediction, we demonstrate how to improve the prediction for unknown entities at training time by exploiting additional metadata on individual statements, often available in modern graphs. Supported with results from a large-scale experimental study, we present an analysis of the effect of individual components of machine learning models, e.g., the interaction function or loss criterion, on the task of link prediction. We also introduce a software library that simplifies the implementation and study of such components and makes them accessible to a wide research community, ranging from relational learning researchers to applied fields, such as life sciences. Finally, we propose a novel metric for evaluating ranking results, as used for both completion tasks. It allows for easier interpretation and comparison, especially in cases with different numbers of ranking candidates, as encountered in the de-facto standard evaluation protocols for both tasks.
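Ranking results in knowledge graph completion are commonly summarised with metrics such as mean reciprocal rank (MRR). The sketch below shows plain MRR together with its expectation under a uniformly random ranker, which hints at why the number of ranking candidates matters when comparing settings; the thesis's own adjusted metric is not reproduced here:

```python
def mean_reciprocal_rank(ranks):
    """Mean reciprocal rank: average of 1/rank over all test queries.
    ranks are the 1-based positions of the correct answer."""
    return sum(1.0 / r for r in ranks) / len(ranks)

def expected_mrr_random(num_candidates):
    """Expected MRR of a uniformly random ranker over n candidates:
    (1/n) * sum_{r=1}^{n} 1/r.  A useful baseline: it shrinks as the
    candidate set grows, so raw MRR values are not comparable across
    tasks with different numbers of candidates."""
    return sum(1.0 / r for r in range(1, num_candidates + 1)) / num_candidates

ranks = [1, 3, 2, 10]                          # hypothetical results
print(round(mean_reciprocal_rank(ranks), 4))   # → 0.4833
```

Normalising an observed score against such a chance baseline is one way to make results with different candidate-set sizes comparable.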
Theorising Christian Anarchism A Political Commentary on the Gospel
This thesis argues that there is a tradition in political theology and in political theory that deserves to be called "Christian anarchism." The various thinkers who contribute to this tradition have never before been considered part of a theoretical movement or tradition, and the originality of this thesis is to weave these thinkers together and present a generic theory of Christian anarchism. Taken together, thinkers like Tolstoy, Ellul, Elliott and Andrews put forward a comprehensive exegesis of Jesus' teaching and example as implying a critique of the state and a vision of a stateless society. Based on this understanding of the Gospel, they accuse both the state and the church of contradicting, betraying and corrupting the essence of Christianity. Some Christian anarchists, Eller in particular, even see Romans 13 and the "render unto Caesar" passage as not discrediting but indeed confirming their interpretation, and although more activist Christian anarchists sometimes disagree on the potential role of civil disobedience, they all stress that what matters above all is obedience to God. Moreover, they all call for the "true" church to lead the Christian anarchist revolution by example, despite the very demanding sacrifices which this involves. They point to numerous examples of similar witness ever since the early church, and themselves strive to emulate such examples in their own lives, the Catholic Worker movement being perhaps the most notable example in this regard. Thus, Christian anarchist thinkers' critique of the current order and appeal to follow God's radical commandments echo the voices of the prophets of old, calling society to return to God's covenant. By weaving their scattered voices together, by theorising Christian anarchism, this thesis provides a political commentary on the Gospel which contributes as much to political theory as it does to political theology.
AIUCD 2022 - Proceedings
The eleventh edition of the national conference of AIUCD-Associazione di Informatica Umanistica is entitled Culture digitali. Intersezioni: filosofia, arti, media (Digital cultures. Intersections: philosophy, arts, media). The title explicitly calls for methodological and theoretical reflection on the interrelation between digital technologies, information sciences, philosophical disciplines, the world of the arts, and cultural studies.
COMPUTER SCIENCE STUDENTS AND LIBRARY TECHNOLOGY: EVALUATING STUDENTS’ CAREER GOALS TO CREATE STRATEGIES THAT INCREASE INTEREST IN LIBRARY EMPLOYMENT
Academic libraries in the United States often have difficulty recruiting for technology-focused positions. This mixed-methods study examines what technology skills libraries are seeking in entry-level technology positions and explores ways to increase interest in library employment. Utilizing Lent's (2013) social cognitive career theory (SCCT) framework, this study seeks to understand why students study computer science and how computer science students seek future employment, and explores how a large university in the southeastern United States can foster interest in applying for library technology positions. Quantitative data was gathered through an examination of library technology job postings to identify trends and the skills employers are seeking. Qualitative data was gathered from recorded interviews with current junior- and senior-level undergraduate computer science majors. Combined with an in-depth look at the literature and the recruitment needs of libraries, possible solutions to the problem of practice are offered in the form of practical internships, interdisciplinary collaboration, and a potential graduate certificate, with the goal of connecting computer science students to software development positions in libraries.
A Syntactical Reverse Engineering Approach to Fourth Generation Programming Languages Using Formal Methods
Fourth-generation programming languages (4GLs) feature rapid development with minimum configuration required by developers. However, 4GLs can suffer from limitations such as high maintenance cost and legacy software practices.
Reverse engineering an existing large legacy 4GL system into a currently maintainable programming language can be a cheaper and more effective solution than rewriting from scratch. Tools do not yet exist for reverse engineering proprietary XML-like and model-driven 4GLs whose full language specification is not in the public domain.
This research has developed a novel method of reverse engineering some of the syntax of such 4GLs (with Uniface as an exemplar) derived from a particular system, with a view to providing a reliable method to translate/transpile that system's code and data structures into a modern object-oriented language (such as C#).
The method was also applied, although only to a limited extent, to some other 4GLs, Informix and Apex, to show that it was in principle more broadly applicable. A novel method of testing that the syntax had been successfully translated was provided, using abstract syntax trees.
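The idea of checking a translation via abstract syntax trees can be sketched in a few lines. The thesis concerns Uniface-to-C# translation; this generic Python snippet only illustrates AST equivalence, not the author's actual tooling:

```python
import ast

def same_syntax(src_a, src_b):
    """Check whether two Python snippets have identical abstract
    syntax trees, ignoring formatting differences and comments."""
    return ast.dump(ast.parse(src_a)) == ast.dump(ast.parse(src_b))

# Formatting and whitespace do not change the AST...
print(same_syntax("x = 1 + 2", "x=1+2"))      # → True
# ...but a genuine structural difference does.
print(same_syntax("x = 1 + 2", "x = 2 + 1"))  # → False
```

Comparing source and target ASTs (after mapping language-specific node types onto a common form) gives a syntax-level check that a transpilation preserved program structure.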
The novel method took manually crafted grammar rules, together with Encapsulated Document Object Model-based data from the source language, and then used parsers to produce syntactically valid and equivalent code in the target/output language.
This proof of concept research has provided a methodology plus sample code to automate part of the process. The methodology comprised a set of manual or semi-automated steps. Further automation is left for future research.
In principle, the author's method could be extended to allow the reverse-engineering recovery of the syntax of systems developed in other proprietary 4GLs. This would reduce the time and cost of ongoing maintenance of such systems by enabling their software engineers to work with modern object-oriented languages, methodologies, tools and techniques.