6,318 research outputs found

    Beam scanning by liquid-crystal biasing in a modified SIW structure

    A fixed-frequency beam-scanning 1D antenna based on Liquid Crystals (LCs) is designed for use in 2D scanning with lateral alignment. The 2D array environment demands full decoupling of adjacent 1D antennas, which often conflicts with the LC requirement of DC biasing; the proposed design accommodates both. The LC medium is placed inside a Substrate Integrated Waveguide (SIW) modified to work as a Groove Gap Waveguide, with radiating slots etched in the upper broad wall so that the structure radiates as a Leaky-Wave Antenna (LWA). This allows effective application of the DC bias voltage needed to tune the LCs. At the same time, the RF field remains laterally confined, making it possible to place several antennas in parallel and achieve 2D beam scanning. The design is validated by simulation employing the actual properties of a commercial LC medium.
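
    The fixed-frequency scanning mechanism can be summarized with the textbook leaky-wave pointing relation; this is a standard result, not stated in the abstract, and the dielectric-filled-waveguide dispersion below is an illustrative approximation rather than the paper's exact model:

```latex
% Main-beam angle \theta_m of a leaky-wave antenna, measured from broadside:
%   \sin\theta_m \approx \beta / k_0 .
% For a TE10-like mode in a dielectric-filled guide of effective width a_eff,
% the phase constant depends on the (LC-tunable) permittivity \varepsilon_r.
% Biasing the LC changes \varepsilon_r, hence \beta and \theta_m, at a fixed
% operating frequency -- the claimed beam-scanning mechanism.
\sin\theta_m \approx \frac{\beta(\varepsilon_r)}{k_0},
\qquad
\beta(\varepsilon_r) = \sqrt{k_0^2\,\varepsilon_r - \left(\frac{\pi}{a_\mathrm{eff}}\right)^2}
```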

    Ideas Explosion! How Church Can Create More Ideas

    In this project, I address my NPO: a healthy church nurtures a culture of humble exploration and responsive innovation. One key finding is that there is not enough material on how a healthy church can process and explore ideas. In addition, church leaders often feel ill-equipped for a process that requires creativity. Grounded in the belief that God made everyone creative, especially the church, this project is a book that details a system for exploring ideas, properly discerning strengths and weaknesses, and preparing leaders to make informed decisions. I created the Gather-Discern-Decide (GDD) Process from research findings, personal experiences, and feedback to guide ideation for churches. The project targets pastors and leaders of congregationally led churches with 400 or fewer members. The book currently contains six chapters plus an introduction and a conclusion. Part 1 (chapters 1 and 2) seeks to establish humanity's partnership with God in the work of creation and includes four values that a healthy ideation process reinforces in a congregation. Part 2 (chapters 3-6) details the GDD Process and its ability to leverage low-risk creative endeavors into larger, riskier church goals. The GDD Process helps build a culture of humble exploration and responsive innovation.

    Knowledge-based Modelling of Additive Manufacturing for Sustainability Performance Analysis and Decision Making

    Additive manufacturing (AM) has been considered viable for complex geometries, topology-optimized parts, and parts that are otherwise difficult to produce using conventional manufacturing processes. Despite these advantages, one of the prevalent challenges in AM has been the poor capability of producing functional parts at production volumes that are competitive with traditional manufacturing. Modelling and simulation are powerful tools that can help shorten the design-build-test cycle by enabling rapid analysis of various product designs and process scenarios. Nevertheless, the capabilities and limitations of traditional and advanced manufacturing technologies define the bounds for new product development. Thus, it is important that designers have access to methods and tools that enable them to model and simulate product performance and the associated manufacturing process performance in order to realize functional, high-value products. The motivation for this dissertation research stems from the ongoing development of a novel high-temperature superconducting (HTS) magnet assembly, which operates in a cryogenic environment. Its complexity requires the convergence of multidisciplinary expertise during design and prototyping. The research applies knowledge-based modelling to aid manufacturing process analysis and decision making in the design of the mechanical components of the HTS magnet. Further, it explores the feasibility of using AM in the production of the HTS magnet assembly. The developed approach uses product-process integrated modelling based on physical experiments to generate quantitative and qualitative information that defines process-structure-property-performance interactions for given material-process combinations. The resulting interactions are then integrated into a graph-based model that can aid in design-space exploration and thus assist early design and manufacturing decision-making. To this end, test components are fabricated using two metal AM processes: wire arc additive manufacturing and selective laser melting. Metal alloys commonly used in structural applications (stainless steel, mild steel, high-strength low-alloy steel, aluminium, and copper alloys) are tested for their mechanical, thermal, and electrical properties. In addition, microstructural characterization of the alloys is performed to better understand the impact of manufacturing process parameters on material properties. The integrated modelling approach combines the collected experimental data, existing analytical and empirical relationships, and other data-driven models (e.g., finite element models, machine learning models) into a decision support system that enables optimal selection of material, manufacturing technology, process parameters, and other control variables for attaining the desired structure, properties, and performance of the final printed component. Manufacturing decision making is performed through the implementation of a probabilistic model, i.e., a Bayesian network, which is robust, modular, and adaptable to other manufacturing systems and product designs. The ability of the model to improve the throughput and quality of additive manufacturing processes will advance sustainable manufacturing goals.
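
    A minimal sketch of the kind of Bayesian-network decision support the abstract describes, assuming a toy discretization: all variable names, states, and probabilities below are illustrative placeholders, not values from the thesis. It uses the pgmpy library (the model class is named DiscreteBayesianNetwork in newer pgmpy releases):

```python
# Toy Bayesian network for AM process selection, in the spirit of the abstract.
# All states and CPD numbers are hypothetical placeholders. Requires pgmpy.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Process choice influences porosity; porosity and material influence strength.
model = BayesianNetwork([("Process", "Porosity"),
                         ("Porosity", "Strength"),
                         ("Material", "Strength")])

cpd_process = TabularCPD("Process", 2, [[0.5], [0.5]])     # 0=WAAM, 1=SLM
cpd_material = TabularCPD("Material", 2, [[0.5], [0.5]])   # 0=steel, 1=aluminium
cpd_porosity = TabularCPD("Porosity", 2,                   # 0=low, 1=high
                          [[0.7, 0.9],                     # P(low | WAAM), P(low | SLM)
                           [0.3, 0.1]],
                          evidence=["Process"], evidence_card=[2])
cpd_strength = TabularCPD("Strength", 2,                   # 0=acceptable, 1=poor
                          [[0.9, 0.6, 0.8, 0.4],
                           [0.1, 0.4, 0.2, 0.6]],
                          evidence=["Porosity", "Material"], evidence_card=[2, 2])
model.add_cpds(cpd_process, cpd_material, cpd_porosity, cpd_strength)
assert model.check_model()

# Query: probability of acceptable strength if we pick SLM for aluminium.
infer = VariableElimination(model)
print(infer.query(["Strength"], evidence={"Process": 1, "Material": 1}))
```

    Because each conditional probability table is a separate module, experimental data for a new alloy or process can update one table without rebuilding the network, which is one sense in which such a model is "modular and adaptable".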

    Computed tomography-based quantification of aortic valve calcification and its association with complications after transcatheter aortic valve implantation (TAVI)

    Background: Severe aortic valve calcification (AVC) has generally been recognized as a key factor in the occurrence of adverse events after transcatheter aortic valve implantation (TAVI). To date, however, a consensus on a standardized calcium detection threshold for aortic valve calcium quantification in contrast-enhanced computed tomography angiography (CTA) is still lacking. The present thesis compared two approaches to quantifying AVC in CTA scans in terms of their power to predict adverse events and survival after a TAVI procedure.   Methods: The extensive dataset included 198 characteristics for each of the 965 prospectively included patients who had undergone TAVI between November 2012 and December 2019 at the German Heart Center Berlin (DHZB). AVC quantification in CTA scans was performed at a fixed Hounsfield unit (HU) threshold of 850 HU (HU 850 approach) and at a patient-specific threshold set by multiplying the mean luminal attenuation of the ascending aorta by 2 (+100 % HU_Aorta approach). The primary endpoint was a composite of post-TAVI outcomes (paravalvular leak ≥ mild, implant-related conduction disturbances, 30-day mortality, post-procedural stroke, annulus rupture, and device migration). The Akaike information criterion was used to select variables for the multivariable regression model, and multivariable analysis was carried out to determine the predictive power of the investigated approaches.   Results: Multivariable analyses showed that the fixed threshold of 850 HU (calcium volume cut-off 146 mm³) was unable to predict the composite clinical endpoint post-TAVI (OR = 1.13, 95 % CI 0.87 to 1.48, p = 0.35). In contrast, the +100 % HU_Aorta approach (calcium volume cut-off 1421 mm³) enabled independent prediction of the composite clinical endpoint post-TAVI (OR = 2, 95 % CI 1.52 to 2.64, p = 9.2 × 10⁻⁷). No significant difference in the Kaplan-Meier survival analysis was observed for either approach.   Conclusions: The patient-specific calcium detection threshold +100 % HU_Aorta is more predictive of the post-TAVI adverse events included in the composite clinical endpoint than the fixed HU 850 approach. For the +100 % HU_Aorta approach, an aortic valve calcium volume cut-off of 1421 mm³ had the highest predictive value.
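
    The patient-specific thresholding rule is simple enough to sketch in code. The following is a hypothetical illustration of the "+100 % HU_Aorta" quantification described in the abstract (threshold = 2 × mean luminal attenuation of the ascending aorta); the function name, array layout, and segmentation masks are assumptions for illustration, not the thesis pipeline:

```python
# Hypothetical sketch of patient-specific aortic valve calcium quantification.
# Assumes the CTA volume and segmentation masks are already available as arrays.
import numpy as np

def aortic_valve_calcium_volume(cta_hu, valve_mask, aorta_lumen_mask, voxel_mm3):
    """Return calcium volume (mm^3) above the patient-specific threshold.

    cta_hu           : 3D array of CTA attenuation values in HU
    valve_mask       : boolean mask of the aortic valve region
    aorta_lumen_mask : boolean mask of the ascending-aorta lumen
    voxel_mm3        : volume of one voxel in mm^3
    """
    # "+100 % HU_Aorta": double the mean luminal attenuation of the aorta.
    threshold = 2.0 * cta_hu[aorta_lumen_mask].mean()
    calcified = (cta_hu > threshold) & valve_mask
    return calcified.sum() * voxel_mm3

# Per the thesis, a valve calcium volume above ~1421 mm^3 under this threshold
# was the most predictive cut-off for the composite endpoint (reported result,
# not derived here).
```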

    Examples of works for practicing staccato technique on the clarinet

    The stages of strengthening clarinet staccato technique were applied through the study of repertoire pieces. Rhythm and nuance exercises that speed up staccato passages are included. The most important aim of the study is not staccato practice alone but also attention to the precision of simultaneous finger-tongue coordination. To make the staccato work more productive, etude study was incorporated into the repertoire work. Careful attention to these exercises, together with the inspiring effect of staccato practice, added a new dimension to the performer's musical identity. Each stage of eight original repertoire studies is described, with each stage intended to reinforce the next level of performance and technique. The study reports in which areas staccato technique is used and what results were obtained, and it lays out how notes are shaped through finger and tongue coordination and within what practice discipline this takes place. Reed, notation, diaphragm, fingers, tongue, nuance, and discipline were found to form an inseparable whole in staccato technique. A literature review of studies on staccato was conducted; it showed that repertoire-based studies of clarinet staccato technique are scarce, while the survey of method books found etude studies to be more common. Accordingly, exercises for speeding up and strengthening clarinet staccato technique are presented. It was observed that interspersing repertoire work among staccato etudes relaxes the mind and increases motivation. Choosing the right reed for staccato practice is also emphasized: a suitable reed was found to increase tongue speed, and the right choice depends on the reed producing sound easily. If the reed does not support the power of tonguing, a more suitable reed should be chosen. Interpreting a piece from beginning to end while practicing staccato can be difficult; in this respect, the study showed that observing the given musical nuances eases tonguing performance. Passing the acquired knowledge and experience on to future generations in a formative way is encouraged. The study explains how to work through new pieces and how to master staccato technique, with the aim of resolving it in a shorter time. Committing the exercises to memory is as important as teaching the fingers their places. A work that emerges as the result of such determination and patience will raise achievement to still higher levels.

    On the Principles of Evaluation for Natural Language Generation

    Natural language processing is concerned with the ability of computers to understand natural language texts, arguably one of the major bottlenecks in the pursuit of general Artificial Intelligence. Given the unprecedented success of deep learning technology, the natural language processing community has been almost entirely focused on practical applications, with state-of-the-art systems emerging and competing for human-parity performance at an ever-increasing pace. For that reason, fair and adequate evaluation and comparison, responsible for ensuring trustworthy, reproducible and unbiased results, have long occupied the scientific community, not only in natural language processing but also in other fields. A popular example is the ISO-9126 evaluation standard for software products, which outlines a wide range of evaluation concerns, such as cost, reliability, scalability, and security. The European project EAGLES-1996, an acclaimed extension of ISO-9126, laid out the fundamental principles specifically for evaluating natural language technologies, and these underpin succeeding methodologies in natural language evaluation. Natural language processing encompasses an enormous range of applications, each with its own evaluation concerns, criteria and measures. This thesis cannot hope to be comprehensive; it specifically addresses evaluation in natural language generation (NLG), arguably one of the most human-like natural language applications. In this context, research on quantifying day-to-day progress with evaluation metrics lays the foundation of the fast-growing NLG community. However, previous works have failed to deliver high-quality metrics in several scenarios, such as evaluating long texts and evaluating without human references, and, more prominently, these studies are limited in scope, lacking a holistic view of principled NLG evaluation. In this thesis, we aim for a holistic view of NLG evaluation from three complementary perspectives, driven by the evaluation principles in EAGLES-1996: (i) high-quality evaluation metrics, (ii) rigorous comparison of NLG systems for properly tracking progress, and (iii) understanding of evaluation metrics. To this end, we identify the current challenges arising from the inherent characteristics of these perspectives, and then present novel metrics, rigorous comparison approaches, and explainability techniques for metrics to address the identified issues. We hope that our work on evaluation metrics, system comparison and explainability for metrics inspires more research towards principled NLG evaluation, and contributes to fair and adequate evaluation and comparison in natural language processing.
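
    On the "rigorous comparison of NLG systems" point, a common baseline technique is paired bootstrap resampling over a shared test set. The sketch below shows that standard procedure under assumed per-segment metric scores; it is not the thesis's specific comparison method:

```python
# Paired bootstrap resampling: a standard significance test for comparing two
# text-generation systems scored segment-by-segment on the same test set.
import random

def paired_bootstrap(scores_a, scores_b, n_resamples=10_000, seed=0):
    """Estimate P(system A outscores system B) over resampled test sets.

    scores_a, scores_b : per-segment metric scores for the same segments
    """
    assert len(scores_a) == len(scores_b)
    rng = random.Random(seed)
    n, wins = len(scores_a), 0
    for _ in range(n_resamples):
        idx = [rng.randrange(n) for _ in range(n)]  # resample segments with replacement
        if sum(scores_a[i] for i in idx) > sum(scores_b[i] for i in idx):
            wins += 1
    return wins / n_resamples

# Usage: if paired_bootstrap(metric_a, metric_b) > 0.95, system A's advantage
# is unlikely to be an artifact of the particular test-set sample.
```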

    Recent Advances in Single-Particle Tracking: Experiment and Analysis

    This Special Issue of Entropy, titled “Recent Advances in Single-Particle Tracking: Experiment and Analysis”, contains a collection of 13 papers on different aspects of single-particle tracking, a popular experimental technique that has deeply penetrated molecular biology and statistical and chemical physics. Presenting original research, yet written in an accessible style, the collection will be useful both for newcomers to the field and for more experienced researchers looking for a reference. Several papers are written by authorities in the field, and the topics cover experimental setups, analytical methods for tracking-data analysis, machine learning approaches to the data and, finally, some more general issues related to diffusion.
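
    As a concrete example of the kind of tracking-data analysis the collection covers, the sketch below computes the time-averaged mean squared displacement (MSD) of a single trajectory, the standard first step for diagnosing diffusion; the synthetic trajectory and parameter values are illustrative, not data from the Special Issue:

```python
# Time-averaged MSD of one trajectory: for free diffusion in d dimensions,
# MSD(lag) ~ 2 * d * D * lag * dt, so D can be fitted from the slope.
import numpy as np

def time_averaged_msd(positions, max_lag):
    """positions: (T, d) array of coordinates; returns MSD for lags 1..max_lag."""
    msd = np.empty(max_lag)
    for lag in range(1, max_lag + 1):
        disp = positions[lag:] - positions[:-lag]  # all displacements at this lag
        msd[lag - 1] = (disp ** 2).sum(axis=1).mean()
    return msd

# Synthetic 2D Brownian trajectory with D = 1: expect MSD(lag) ~ 4 * lag * dt.
rng = np.random.default_rng(0)
dt, D = 1.0, 1.0
steps = rng.normal(scale=np.sqrt(2 * D * dt), size=(1000, 2))
trajectory = np.cumsum(steps, axis=0)
print(time_averaged_msd(trajectory, max_lag=10))
```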

    Technologies and Applications for Big Data Value

    This open access book explores cutting-edge solutions and best practices for big data and data-driven AI applications in the data-driven economy. It gives the reader a basis for understanding how technical issues can be overcome to deliver real-world solutions in major industrial areas. The book starts with an introductory chapter that positions the following chapters in terms of their contributions to the technology frameworks that are key elements of the Big Data Value Public-Private Partnership and the upcoming Partnership on AI, Data and Robotics. The remainder of the book is arranged in two parts. The first part, “Technologies and Methods”, contains horizontal contributions of technologies and methods that enable data value chains to be applied in any sector. The second part, “Processes and Applications”, details experience reports and lessons from using big data and data-driven approaches in processes and applications. Its chapters are co-authored with industry experts and cover domains including health, law, finance, retail, manufacturing, mobility, and smart cities. Contributions emanate from the Big Data Value Public-Private Partnership and the Big Data Value Association, which have acted as the nucleus of the European data community, bringing together businesses and leading researchers to harness the value of data for the benefit of society, business, science, and industry. The book is of interest to two primary audiences: first, undergraduate and postgraduate students and researchers in fields including big data, data science, data engineering, machine learning, and AI; and second, practitioners and industry experts engaged in data-driven systems and software design and deployment projects who are interested in employing these advanced methods to address real-world problems.