10,371 research outputs found

    Examples of works to practice staccato technique on the clarinet

    Get PDF
    The stages of strengthening the clarinet's staccato technique were applied through studies of repertoire works. Rhythm and nuance exercises designed to speed up staccato passages were included. The most important aim of the study is not staccato practice alone but also attention to the precision of simultaneous finger-tongue coordination. To make the staccato work more productive, etude practice was embedded within the repertoire work. Careful attention to these exercises, together with the inspiring effect of staccato practice, added a new dimension to musical identity. Each stage of eight original repertoire studies is described, with each stage intended to reinforce the next level of performance and technique. The study reports in which areas the staccato technique is used and what results were obtained, and plans how notes take shape through finger-tongue coordination and within what kind of practice discipline this occurs. Reed, notation, diaphragm, fingers, tongue, nuance, and discipline were found to form an inseparable whole in staccato technique. A literature review of studies on staccato was conducted; it found that repertoire-based staccato studies for clarinet technique are scarce, while a survey of method books found that etude-based studies predominate. Accordingly, exercises for speeding up and strengthening the clarinet's staccato technique are presented. It was observed that interspersing repertoire work among staccato etudes relaxes the mind and increases motivation. The choice of a suitable reed for staccato practice was also emphasised: a correct reed was found to increase tongue speed, and a correct reed choice depends on the reed producing sound easily. If the reed does not support the force of tonguing, a more suitable reed should be chosen. Interpreting a piece from beginning to end during staccato practice can be difficult; in this respect, the study showed that observing the given musical nuances eases tonguing performance. Passing on the acquired knowledge and experience to future generations, in a developmental spirit, is encouraged. The study explains how forthcoming pieces can be worked through and how the staccato technique can be mastered, with the aim of resolving staccato problems in a shorter time. Committing the exercises to memory is as important as teaching the fingers their places. A work that emerges as the result of determination and patience will raise achievement to even higher levels.

    A Decision Support System for Economic Viability and Environmental Impact Assessment of Vertical Farms

    Get PDF
    Vertical farming (VF) is the practice of growing crops or animals using the vertical dimension via multi-tier racks or vertically inclined surfaces. In this thesis, I focus on the emerging industry of plant-specific VF. Vertical plant farming (VPF) is a promising and relatively novel practice that can be conducted in buildings with environmental control and artificial lighting. However, the nascent sector has experienced challenges in economic viability, standardisation, and environmental sustainability. Practitioners and academics call for a comprehensive financial analysis of VPF, but efforts are stifled by a lack of valid and available data. A review of economic estimation and horticultural software identifies a need for a decision support system (DSS) that facilitates risk-empowered business planning for vertical farmers. This thesis proposes an open-source DSS framework to evaluate business sustainability through financial risk and environmental impact assessments. Data from the literature, alongside lessons learned from industry practitioners, would be centralised in the proposed DSS using imprecise data techniques. These techniques have been applied in engineering but are seldom used in financial forecasting; they could benefit complex sectors which have only scarce data with which to predict business viability. To begin the execution of the DSS framework, VPF practitioners were interviewed using a mixed-methods approach. Learnings from over 19 shuttered and operational VPF projects provide insights into the barriers inhibiting scalability and identify risks, which were used to form a risk taxonomy. Labour was the most commonly reported top challenge, so research was conducted to explore lean principles to improve productivity. A probabilistic model representing a spectrum of variables and their associated uncertainty was built according to the DSS framework to evaluate the financial risk of VF projects. This enabled flexible computation without precise production or financial data, improving the accuracy of economic estimation. The model assessed two VPF cases (one in the UK and another in Japan), demonstrating the first risk and uncertainty quantification of VPF business models in the literature. The results highlighted measures to improve economic viability and assessed the viability of the UK and Japan cases. An environmental impact assessment model was then developed, allowing VPF operators to evaluate their carbon footprint compared to traditional agriculture using life-cycle assessment, and I explore strategies for net-zero carbon production through sensitivity analysis. Renewable energies, especially solar, geothermal, and tidal power, show promise for reducing the carbon emissions of indoor VPF. Results show that renewably-powered VPF can reduce carbon emissions compared to field-based agriculture when land-use change is considered. The drivers for DSS adoption have been researched, showing a pathway of compliance and design thinking to overcome the ‘problem of implementation’ and enable commercialisation. Further work is suggested to standardise VF equipment, collect benchmarking data, and characterise risks. This work will reduce risk and uncertainty and accelerate the sector's emergence.
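
    The probabilistic risk model described above turns uncertain production and financial inputs into a distribution of outcomes rather than a point forecast. A minimal illustrative sketch of that idea, not the thesis's actual DSS (all distributions, parameter names, and figures below are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(seed=42)
N = 100_000      # Monte Carlo draws
YEARS = 10       # planning horizon
DISCOUNT = 0.08  # annual discount rate (assumed)

# Uncertain inputs expressed as distributions rather than point estimates
# (all ranges are illustrative, not taken from the thesis).
yield_kg = rng.triangular(60_000, 80_000, 95_000, N)  # annual crop yield (kg)
price_kg = rng.normal(6.0, 0.8, N)                    # sale price per kg
energy_cost = rng.normal(250_000, 40_000, N)          # annual energy bill
other_opex = 180_000                                  # fixed operating costs
capex = 1_500_000                                     # up-front investment

# Net present value per draw: discounted annual cash flows minus capex.
annual_cash = yield_kg * price_kg - energy_cost - other_opex
discount_sum = sum(1 / (1 + DISCOUNT) ** t for t in range(1, YEARS + 1))
npv = -capex + annual_cash * discount_sum

print(f"P(NPV < 0): {np.mean(npv < 0):.1%}")
print(f"Median NPV: {np.median(npv):,.0f}")
print(f"5th/95th percentiles: {np.percentile(npv, [5, 95]).round(0)}")
```

    Reporting the probability of a negative NPV alongside the median is what distinguishes a risk assessment from a single-point cost model, which is the gap the proposed DSS aims to fill.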

    Strategies for Early Learners

    Get PDF
    Welcome to learning about how to effectively plan curriculum for young children. This textbook will address:
    • Developing curriculum through the planning cycle
    • Theories that inform what we know about how children learn and the best ways for teachers to support learning
    • The three components of developmentally appropriate practice
    • Importance and value of play and intentional teaching
    • Different models of curriculum
    • Process of lesson planning (documenting planned experiences for children)
    • Physical, temporal, and social environments that set the stage for children’s learning
    • Appropriate guidance techniques to support children’s behaviors as their self-regulation abilities mature
    • Planning for preschool-aged children in specific domains, including:
        o Physical development
        o Language and literacy
        o Math
        o Science
        o Creative (the visual and performing arts)
        o Diversity (social science and history)
        o Health and safety
    • Making children’s learning visible through documentation and assessment

    Limit theorems for non-Markovian and fractional processes

    Get PDF
    This thesis examines various non-Markovian and fractional processes---rough volatility models, stochastic Volterra equations, Wiener chaos expansions---through the prism of asymptotic analysis. Stochastic Volterra systems serve as a conducive framework encompassing most rough volatility models used in mathematical finance. In Chapter 2, we provide a unified treatment of pathwise large and moderate deviations principles for a general class of multidimensional stochastic Volterra equations with singular kernels, not necessarily of convolution form. Our methodology is based on the weak convergence approach of Budhiraja, Dupuis and Ellis. This powerful approach also enables us to investigate the pathwise large deviations of families of white noise functionals characterised by their Wiener chaos expansion $X^\varepsilon = \sum_{n=0}^{\infty} \varepsilon^n I_n\big(f_n^{\varepsilon}\big)$. In Chapter 3, we provide sufficient conditions for the large deviations principle to hold in path space, thereby refreshing a problem left open by Pérez-Abreu (1993). Hinging on analysis on Wiener space, the proof involves describing, controlling and identifying the limit of perturbed multiple stochastic integrals. In Chapter 4, we come back to mathematical finance via the route of Malliavin calculus. We present explicit small-time formulae for the at-the-money implied volatility, skew and curvature in a large class of models, including rough volatility models and their multi-factor versions. Our general setup encompasses both European options on a stock and VIX options. In particular, we develop a detailed analysis of the two-factor rough Bergomi model. Finally, in Chapter 5, we consider the large-time behaviour of affine stochastic Volterra equations, an under-developed area in the absence of Markovianity. We leverage a measure-valued Markovian lift introduced by Cuchiero and Teichmann and the associated notion of the generalised Feller property. This setting allows us to prove the existence of an invariant measure for the lift and hence of a stationary distribution for the affine Volterra process, featuring in the rough Heston model.
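
    For orientation, the pathwise large deviations principles studied in Chapters 2 and 3 follow the standard formulation below; this is the textbook definition, not a result specific to the thesis:

```latex
% A family (X^\varepsilon)_{\varepsilon > 0} with values in a Polish space E
% satisfies a large deviations principle with rate function
% I : E \to [0, \infty] if, for every closed set F and open set G in E,
\[
\limsup_{\varepsilon \to 0} \varepsilon \log \mathbb{P}\bigl(X^{\varepsilon} \in F\bigr)
  \;\le\; -\inf_{x \in F} I(x),
\qquad
\liminf_{\varepsilon \to 0} \varepsilon \log \mathbb{P}\bigl(X^{\varepsilon} \in G\bigr)
  \;\ge\; -\inf_{x \in G} I(x).
\]
```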

    Data-to-text generation with neural planning

    Get PDF
    In this thesis, we consider the task of data-to-text generation, which takes non-linguistic structures as input and produces textual output. The inputs can take the form of database tables, spreadsheets, charts, and so on. The main application of data-to-text generation is to present information in a textual format which makes it accessible to a layperson who may otherwise find it problematic to understand numerical figures. The task can also automate routine document generation jobs, thus improving human efficiency. We focus on generating long-form text, i.e., documents with multiple paragraphs. Recent approaches to data-to-text generation have adopted the very successful encoder-decoder architecture or its variants. These models generate fluent (but often imprecise) text and perform quite poorly at selecting appropriate content and ordering it coherently. This thesis focuses on overcoming these issues by integrating content planning with neural models. We hypothesize that data-to-text generation will benefit from explicit planning, which manifests itself in (a) micro planning, (b) latent entity planning, and (c) macro planning. Throughout this thesis, we assume the inputs to our generator are tables (with records) in the sports domain, and the outputs are summaries describing what happened in the game (e.g., who won/lost, ..., scored, etc.). We first describe our work on integrating fine-grained or micro plans with data-to-text generation. As part of this, we generate a micro plan highlighting which records should be mentioned and in which order, and then generate the document while taking the micro plan into account. We then show how data-to-text generation can benefit from higher-level latent entity planning. Here, we make use of entity-specific representations which are dynamically updated. The text is generated conditioned on entity representations and the records corresponding to the entities by using hierarchical attention at each time step. We then combine planning with the high-level organization of entities, events, and their interactions. Such coarse-grained macro plans are learnt from data and given as input to the generator. Finally, we present work on making macro plans latent while incrementally generating a document paragraph by paragraph. We infer latent plans sequentially with a structured variational model while interleaving the steps of planning and generation. Text is generated by conditioning on previous variational decisions and previously generated text. Overall, our results show that planning makes data-to-text generation more interpretable, improves the factuality and coherence of the generated documents, and reduces redundancy in the output document.
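
    As a toy illustration of the micro-planning idea (first select which records to mention and in what order, then verbalise them), independent of the neural models the thesis actually develops; the record schema and salience heuristic below are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Record:
    entity: str
    attribute: str
    value: int

def micro_plan(records: list[Record], k: int = 3) -> list[Record]:
    """Toy content selection: keep the k most salient records,
    ordered by a crude importance score (here, the raw value)."""
    return sorted(records, key=lambda r: r.value, reverse=True)[:k]

def realise(plan: list[Record]) -> str:
    """Template-based surface realisation; the thesis replaces this
    step with a neural decoder conditioned on the plan."""
    clauses = [f"{r.entity} recorded {r.value} {r.attribute}" for r in plan]
    return ", ".join(clauses) + "."

game = [
    Record("Smith", "points", 28),
    Record("Jones", "points", 9),
    Record("Smith", "rebounds", 12),
    Record("Lee", "assists", 11),
]
print(realise(micro_plan(game)))
# Smith recorded 28 points, Smith recorded 12 rebounds, Lee recorded 11 assists.
```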

    Annals [...].

    Get PDF
    Pedometrics: innovation in the tropics; Legacy data: how to make it useful?; Advances in soil sensing; Pedometric guidelines for systematic soil surveys. Online event. Coordinated by: Waldir de Carvalho Junior, Helena Saraiva Koenow Pinheiro, Ricardo Simão Diniz Dalmolin.

    Three Essays on Housing, Credit and Uncertainty

    Get PDF
    This thesis comprises three essays in macroeconomics. The key aim of our work is to quantify the link between housing, credit and uncertainty. In the first chapter we investigate the propagation mechanism of a temporary uncertainty shock for the UK. We adopt a factor-augmented VAR model which facilitates an examination of variables that have not been included in previous studies. Our empirical analysis establishes that while the ‘traditional’ channels generally hold, there are asymmetric responses across different sectors. For example, precautionary behaviour implies that agents cut back on consumption and increase saving in order to mitigate the risks associated with uncertainty. However, decomposing consumption, we provide evidence that for food and fuel markets the impact of an increase in uncertainty is close to zero; the expected fall in demand does not materialise because consumption of these goods is a necessity. In terms of housing and credit we propose a new housing uncertainty channel which is closely linked to growth option theory. The idea is that a second-moment uncertainty shock extends the tails of the distributions and thus increases the potential payoffs, which in turn leads to an increase in investment. For those who are able to access credit, we capture an increase in housing investment and a corresponding expansion in mortgage credit, which contributes to a reduction in the negative impacts of uncertainty shocks. The second paper extends the discussion in Chapter 1 by examining the transmission of uncertainty shocks in the US. Specifically, we quantify the linear and non-linear impacts of uncertainty. For the linear analysis, we estimate a proxy SVAR using narrative identification, net of first-moment shocks, and provide supporting evidence of the housing uncertainty channel. The interaction between housing and credit is shown to be crucial, reinforcing the findings we present in Chapter 1. The intuition for our non-linear analysis builds upon the idea that once uncertainty has reached a certain level, any additional increases in uncertainty are unlikely to have any impact on macroeconomic aggregates. In order to test this conjecture, we propose an instrument to identify uncertainty shocks, constructed by isolating the variation in the price of gold around events associated with uncertainty. We argue that the change in the price of gold accurately represents uncertainty because gold is perceived as a safe-haven asset: when faced with the additional risk associated with uncertainty, agents invest in gold, reflecting a flight to safety. We adopt a threshold VAR model which isolates responses dependent on regimes corresponding to high and low uncertainty. We show that in a low-uncertainty regime, uncertainty propagates similarly to the linear case. In contrast, there is a clear distinction in a high-uncertainty regime, driven by impatient behaviour. In our final chapter, we propose a DSGE model which is consistent with the empirical evidence we provide in the previous chapters. We order the chapters in such a way that we first establish the empirical facts of uncertainty shocks and then use these to inform our theoretical model. The key empirical takeaway following a shock to uncertainty is a co-movement between consumption and investment. We demonstrate that a vanilla housing real business cycle model is not consistent with these empirical facts. In order to match the theoretical model to the empirics, we extend the baseline model by including both banks and financial frictions. We first document a credit channel which limits access to external funds for the credit-dependent sector of the economy. Second, we find a housing demand channel, which leads to tighter constraints for households and entrepreneurs and lowers the return on capital. Together, both channels amplify precautionary saving for household borrowers. The credit channel creates a real option channel for entrepreneurs, while the housing demand channel impacts household savers by amplifying their reduction in investment. In the absence of credit constraints, the housing uncertainty channel dominates behaviour. However, this channel is reversed when agents face difficulty in accessing credit, consistent with Chapter 1.
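
    To make the threshold VAR idea concrete: the sample is split by the level of an uncertainty index and a separate VAR is fitted in each regime. The sketch below uses simulated data and a fixed 75th-percentile threshold purely for illustration; the variables, the threshold rule, and the identification strategy in the thesis differ:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
T = 400

# Simulated data: a smoothed uncertainty index and two macro aggregates.
uncertainty = pd.Series(rng.gamma(2.0, 1.0, T)).rolling(4, min_periods=1).mean()
data = pd.DataFrame({
    "consumption": rng.normal(0, 1, T).cumsum(),
    "investment": rng.normal(0, 1, T).cumsum(),
}).diff().dropna()
uncertainty = uncertainty.iloc[1:]  # align with the differenced data

# Split the sample into high- and low-uncertainty regimes.
threshold = uncertainty.quantile(0.75)
high = data[(uncertainty >= threshold).values]
low = data[(uncertainty < threshold).values]

# Fit a separate VAR per regime; impulse responses can then be compared.
for name, sample in [("low-uncertainty", low), ("high-uncertainty", high)]:
    res = VAR(sample.reset_index(drop=True)).fit(maxlags=2)
    print(f"{name}: {len(sample)} obs, AIC = {res.aic:.2f}")
```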

    Consolidation of Urban Freight Transport – Models and Algorithms

    Get PDF
    Urban freight transport is an indispensable component of economic and social life in cities. Compared to other types of transport, however, it contributes disproportionately to the negative impacts of traffic. As a result, urban freight transport is closely linked to social, environmental, and economic challenges. Managing urban freight transport and addressing these issues poses challenges not only for local city administrations but also for companies, such as logistics service providers (LSPs). Numerous policy measures and company-driven initiatives exist in the area of urban freight transport to overcome these challenges. One central approach is the consolidation of urban freight transport. This dissertation focuses on urban consolidation centers (UCCs), which are a widely studied and applied measure in urban freight transport. The fundamental idea of UCCs is to consolidate freight transport across companies in logistics facilities close to an urban area in order to increase the efficiency of vehicles delivering goods within the urban area. Although the concept has been researched and tested for several decades and has been shown to reduce the negative externalities of freight transport in cities, in practice many UCCs struggle with a lack of business participation and financial difficulties. This dissertation is primarily focused on the costs and savings associated with the use of UCCs from the perspective of LSPs. The cost-effectiveness of UCC use, also referred to as cost attractiveness, can be seen as a crucial condition for LSPs to be interested in using UCC systems. The overall objective of this dissertation is two-fold. First, it aims to develop models that provide decision support for evaluating the cost-effectiveness of using UCCs. Second, it aims to analyze the impacts of urban freight transport regulations and operational characteristics on the cost attractiveness of using UCCs from the perspective of LSPs. In this context, a distinction is made between UCCs that are jointly operated by a group of LSPs and UCCs that are operated by third parties who offer their urban transport service for a fee. The main body of this dissertation is based on three research papers. The first paper focuses on jointly operated UCCs run by a group of cooperating LSPs. It presents a simulation model to analyze the financial impacts on LSPs participating in such a scheme, with a particular focus on urban freight transport regulations. A case study is used to analyze the operation of a jointly operated UCC in scenarios involving three freight transport regulations. The second and third papers take a different perspective on UCCs by focusing on third-party-operated UCCs. In contrast to the first paper, they present an evaluation approach in which the decision to use UCCs is integrated with the vehicle route planning of LSPs. In addition to addressing the basic version of this integrated routing problem, known as the vehicle routing problem with transshipment facilities (VRPTF), the second paper presents problem extensions that incorporate time windows, fleet size and mix decisions, and refined objective functions. To heuristically solve the basic problem and the new problem variants, an adaptive large neighborhood search (ALNS) heuristic with an embedded local search heuristic and a set partitioning problem (SPP) component is presented. Furthermore, various factors influencing the cost attractiveness of UCCs, including time windows and usage fees, are analyzed using a real-world case study. The third paper extends the work of the second paper by incorporating daily and entrance-based city toll schemes and enabling multi-trip routing. A mixed-integer linear programming (MILP) formulation of the resulting problem is proposed, as well as an ALNS solution heuristic. Moreover, a real-world case study with three European cities is used to analyze the impact of the two city toll systems in different operational contexts.
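
    For orientation, the core loop of an adaptive large neighborhood search (the metaheuristic used in the second and third papers) typically has the shape sketched below. The destroy/repair interface, scoring constants, and simulated-annealing acceptance rule are generic placeholders, not the papers' specific design:

```python
import math
import random

def alns(initial, destroy_ops, repair_ops, cost,
         iters=10_000, temp=100.0, cooling=0.999):
    """Generic ALNS skeleton: pick destroy/repair operators with
    adaptive weights, accept candidates via simulated annealing."""
    current = best = initial
    weights_d = [1.0] * len(destroy_ops)
    weights_r = [1.0] * len(repair_ops)

    for _ in range(iters):
        i = random.choices(range(len(destroy_ops)), weights_d)[0]
        j = random.choices(range(len(repair_ops)), weights_r)[0]
        candidate = repair_ops[j](destroy_ops[i](current))

        delta = cost(candidate) - cost(current)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = candidate                # accept the candidate
            score = 5.0 if delta < 0 else 1.0  # reward improvement
            if cost(candidate) < cost(best):
                best, score = candidate, 10.0  # reward a new global best
            # Exponentially smoothed operator weights (the adaptive layer).
            weights_d[i] = 0.8 * weights_d[i] + 0.2 * score
            weights_r[j] = 0.8 * weights_r[j] + 0.2 * score
        temp *= cooling

    return best
```

    A concrete VRPTF implementation would plug in problem-specific operators (e.g., removing customers from routes and reinserting them directly or via a UCC) and a routing-cost function that includes transshipment fees.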

    Managing global virtual teams in the London FinTech industry

    Get PDF
    Today, the number of organisations adopting virtual working arrangements has exploded, and the London FinTech industry is no exception. During recent years, FinTech companies have increasingly developed virtual teams as a means of connecting and engaging geographically dispersed workers, lowering costs, and enabling greater speed and adaptability. As the first study in the United Kingdom regarding global virtual team management in the FinTech industry, this DBA research seeks answers to the question, “What makes for the successful management of a global virtual team in the London FinTech industry?”. The Straussian grounded-theory method was chosen because this qualitative approach lets participants have their own voice and offers some flexibility, while also allowing the researcher to have preconceived ideas about the research undertaking. The research work makes the case for appreciating the voice of people with lived experience. Ten London-based FinTech managers with considerable experience running virtual teams agreed to take part in this study. These managers had spent time working at large, household-name firms with significant global reach, and one had recently become founder and CEO of his own firm, taking on clients and hiring contract staff from around the world. At least eight of the other participants were senior ‘Heads’ of various technology teams and one was a Managing Director at a ‘Big Four’ consultancy. All had spent years (and many still were) running geographically distributed teams with members as far away as Pacific Asia, and all were keen to discuss that breadth of experience and the challenges they faced. Results from these in-depth interviews suggested that there are myriad reasons for forming a global virtual team, from providing 24-hour, follow-the-sun service to locating the most cost-effective resources with the highest skills. The interviews also confirmed that virtual management brings unique challenges and that new techniques are required to help virtual managers navigate them. Managing a global virtual team requires much more than the traditional management competencies. Based on discussion with the respondents, a set of practical recommendations for global virtual team management was developed, covering a wide range of issues related to recruitment and selection, team building, developing standard operating procedures, communication, motivation, performance management, and building trust.

    Developing automated meta-research approaches in the preclinical Alzheimer's disease literature

    Get PDF
    Alzheimer’s disease is a devastating neurodegenerative disorder for which there is no cure. A crucial part of the drug development pipeline involves testing therapeutic interventions in animal disease models. However, promising findings in preclinical experiments have not translated into clinical trial success. Reproducibility has often been cited as a major issue affecting biomedical research, where experimental results in one laboratory cannot be replicated in another. By using meta-research (research on research) approaches such as systematic reviews, researchers aim to identify and summarise all available evidence relating to a specific research question. By conducting a meta-analysis, researchers can also combine the results from different experiments statistically to understand the overall effect of an intervention and to explore reasons for variations seen across different publications. Systematic reviews of the preclinical Alzheimer’s disease literature could inform decision making, encourage research improvement, and identify gaps in the literature to guide future research. However, due to the vast amount of potentially useful evidence from animal models of Alzheimer’s disease, it remains difficult to make sense of and utilise this data effectively. Systematic reviews are common practice within evidence-based medicine, yet their application to preclinical research is often limited by the time and resources required. In this thesis, I develop, build upon, and implement automated meta-research approaches to collect, curate, and evaluate the preclinical Alzheimer’s literature. I searched several biomedical databases to obtain all research relevant to Alzheimer’s disease. I developed a novel deduplication tool to automatically identify and remove duplicate publications identified across different databases with minimal human effort. I trained a crowd of reviewers to annotate a subset of the publications identified and used this data to train a machine learning algorithm to screen the remaining publications for relevance. I developed text-mining tools to extract model, intervention, and treatment information from publications, and I improved existing automated tools to extract reported measures to reduce the risk of bias. Using these tools, I created a categorised database of research in transgenic Alzheimer’s disease animal models and created a visual summary of this dataset on an interactive, openly accessible online platform. Using the techniques described, I also identified relevant publications within the categorised dataset to perform systematic reviews of two key outcomes of interest in transgenic Alzheimer’s disease models: (1) synaptic plasticity and transmission in hippocampal slices and (2) motor activity in the open field test. Over 400,000 publications were identified across biomedical research databases, of which 230,203 were unique. In a performance evaluation across different preclinical datasets, the automated deduplication tool I developed identified over 97% of duplicate citations and had an error rate similar to that of human performance. When evaluated on a test set of publications, the machine learning classifier trained to identify relevant research in transgenic models was highly sensitive (capturing 96.5% of relevant publications) and excluded 87.8% of irrelevant publications. Tools to identify the model(s) and outcome measure(s) within the full text of publications may reduce the burden on reviewers and were found to be more sensitive than searching only the title and abstract of citations. Automated tools to assess risk-of-bias reporting were highly sensitive and have the potential to monitor research improvement over time. The final dataset of categorised Alzheimer’s disease research contained 22,375 publications, which were then visualised in the interactive web application. Within the application, users can see how many publications report measures to reduce the risk of bias and how many have been classified as using each transgenic model, testing each intervention, and measuring each outcome. Users can also filter to obtain curated lists of relevant research, allowing them to perform systematic reviews at an accelerated pace, with reduced effort required to search across databases and a reduced number of publications to screen for relevance. Both systematic reviews and meta-analyses highlighted failures to report key methodological information within publications. Poor transparency of reporting limited the statistical power I had to understand the sources of between-study variation. However, some variables were found to explain a significant proportion of the heterogeneity. Transgenic animal model had a significant impact on results in both reviews. For certain open field test outcomes, the wall colour of the open field arena and the reporting of measures to reduce the risk of bias were found to impact results. For in vitro electrophysiology experiments measuring synaptic plasticity, several electrophysiology parameters, including the magnesium concentration of the recording solution, were found to explain a significant proportion of the heterogeneity. Automated meta-research approaches and curated web platforms summarising preclinical research have the potential to accelerate the conduct of systematic reviews and maximise the potential of existing evidence to inform translation.
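
    As a rough sketch of the fuzzy matching that citation deduplication relies on (the thesis tool's actual matching rules are more elaborate), using only the Python standard library and an assumed similarity threshold:

```python
import difflib
import re

def normalise(citation: str) -> str:
    """Lowercase and strip punctuation so trivially different renderings
    of the same title compare as equal."""
    return re.sub(r"[^a-z0-9 ]", "", citation.lower()).strip()

def deduplicate(titles: list[str], threshold: float = 0.9) -> list[str]:
    """Keep the first occurrence of each title; drop later entries whose
    normalised form is near-identical to one already kept."""
    kept: list[str] = []
    for title in titles:
        cand = normalise(title)
        is_dup = any(
            difflib.SequenceMatcher(None, cand, normalise(k)).ratio() >= threshold
            for k in kept
        )
        if not is_dup:
            kept.append(title)
    return kept

records = [
    "Tau pathology in transgenic mice: a systematic review.",
    "Tau Pathology in Transgenic Mice - A Systematic Review",
    "Open field behaviour in APP/PS1 models",
]
print(deduplicate(records))  # the second entry is dropped as a duplicate
```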