Digital Workflows and Material Sciences in Dental Medicine
The trend of digitalization is an omnipresent phenomenon nowadays, in social life and in the dental community. Advances in digital technology have fostered research into new dental materials for use in these workflows, particularly in prosthodontics and oral implantology. CAD/CAM technology has been the game changer for the production of tooth-borne and implant-supported (monolithic) reconstructions: from optical scanning, to on-screen design, to rapid prototyping by milling or 3D printing. In this context, the continuous development and rapid progress of digital workflows and dental materials open new opportunities in dentistry. The objective of this Special Issue is to provide an update on current knowledge, with state-of-the-art theory and practical information on digital workflows, to gauge the uptake of technological innovations in dental materials science. In addition, emphasis is placed on identifying future research needs to manage the continuing rise of digitalization in combination with dental materials and to accomplish their clinical translation. This Special Issue welcomes all types of studies and reviews considering the perspectives of the various stakeholders with regard to digital dentistry and dental materials.
Recommended from our members
Active timing margin management to improve microprocessor power efficiency
Improving power/performance efficiency is critical for today's microprocessors. From edge devices to datacenters, lower power or higher performance always yields better systems, measured by lower cost of ownership or longer battery life. This thesis studies improving microprocessor power/performance efficiency by optimizing the pipeline timing margin. In particular, it focuses on improving the efficacy of Active Timing Margin, an emerging technology that dynamically adjusts the margin.
Active timing margin trims down the pipeline timing margin with a control loop that adjusts voltage and frequency based on real-time chip environment monitoring. The key insight of this thesis is that in order to maximize active timing margin's efficiency benefits, synergistic management from processor architecture design and system software scheduling is needed. To that end, this thesis covers the major consumers of pipeline timing margin: temperature, voltage, and process variation. For temperature variation, the thesis proposes a table-lookup-based active timing margin mechanism and an associated temperature management scheme to minimize power consumption. For voltage variation, the thesis characterizes the factors limiting adaptive clocking's power savings and proposes application scheduling to maximize total system power reduction. For process variation, the thesis proposes core-level adaptive clocking reconfiguration to automatically expose inter-core variation, and discusses workload scheduling and throttling management to control critical application performance.
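The table-lookup mechanism described above can be sketched in miniature as a lookup from a sensed operating condition to a (voltage, frequency) point. This is only an illustrative model, not the thesis's actual mechanism: the table values, temperature bands, and function names below are all hypothetical.

```python
# Minimal sketch of a table-lookup-based active timing margin loop.
# All thresholds and operating points are invented for illustration:
# hotter silicon is slower, so the voltage margin grows with temperature.
MARGIN_TABLE = [
    # (max_temp_C, voltage_V, freq_GHz)
    (50, 0.90, 3.0),
    (70, 0.95, 3.0),
    (85, 1.00, 2.8),
    (105, 1.05, 2.6),
]

def select_operating_point(temp_c: float) -> tuple:
    """Pick the lowest-margin (voltage, frequency) pair safe at this temperature."""
    for max_temp, volt, freq in MARGIN_TABLE:
        if temp_c <= max_temp:
            return volt, freq
    # Above the hottest band: fall back to the most conservative point.
    return MARGIN_TABLE[-1][1], MARGIN_TABLE[-1][2]

def control_step(sensor_temp_c: float) -> dict:
    """One iteration of the control loop: sense, look up, apply."""
    volt, freq = select_operating_point(sensor_temp_c)
    return {"voltage": volt, "frequency": freq}
```

A real implementation would run in firmware against on-die sensors; the point of the sketch is only that a lookup table replaces a slow analytical margin computation in the control loop.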
The author believes the optimizations presented in this thesis can benefit a variety of processor architectures, as the conclusions are based on solid measurements of state-of-the-art processors, and the object of study, active timing margin, was already widely deployed in the latest microprocessors at the time this thesis was written.

Electrical and Computer Engineering
Modern computing: Vision and challenges
Over the past six decades, the computing systems field has experienced significant transformations, profoundly impacting society with landmark developments such as the Internet and the commodification of computing. Underpinned by technological advancements, computer systems, far from being static, have been continuously evolving and adapting to cover multifaceted societal niches. This has led to new paradigms such as cloud, fog, and edge computing and the Internet of Things (IoT), which offer fresh economic and creative opportunities. Nevertheless, this rapid change poses complex research challenges, especially in maximizing potential and enhancing functionality. As such, to maintain an economical level of performance that meets ever-tighter requirements, one must understand the drivers of new model emergence and expansion, and how contemporary challenges differ from past ones. To that end, this article investigates and assesses the factors influencing the evolution of computing systems, covering established systems and architectures as well as newer developments, such as serverless computing, quantum computing, and on-device AI on edge devices. Trends emerge when one traces the technological trajectory, including the rapid obsolescence of frameworks due to business and technical constraints, a move towards specialized systems and models, and varying approaches to centralized and decentralized control. This comprehensive review of modern computing systems looks ahead to the future of research in the field, highlighting key challenges and emerging trends, and underscoring their importance in cost-effectively driving technological progress.
Persuasive Language Used in Presentations by Information Technology Students
The aim of this thesis is to frame the concept of rhetoric and persuasive language. The thesis begins with a rhetorical analysis and defines the rhetorical situation. It also provides information about the art of persuasion, its history, and its use. A survey of characteristic persuasive devices constitutes a fundamental part of the thesis, dealing not only with the use of persuasive language structures but also with specific vocabulary. The analytical part of the thesis converts theory into practice through a detailed analysis and comparison of persuasive devices used in presentations by information technology students. It focuses on persuasive techniques such as emotive emphasis in speech, the use of personal pronouns, linking signals, triples, and flattery.
Blockchain based energy transactions for a prosumer community
PhD thesis in Information Technology

Integration of solar micro-generation capabilities in domestic contexts is on the rise, leading to the creation of prosumer communities that generate part of the energy they consume. Prosumer communities require a decentralized, transparent, and immutable transaction system in order to extract value from their surplus energy generation and usage flexibility. The aim of this study is to develop frameworks and methods to create such a prosumer transaction system, with self-enforcing smart contracts to facilitate trading of energy assets such as electricity units, energy flexibility incentives, and storage credits.
Blockchain is a transparent, distributed ledger for consensus-based transaction processing, maintained by a network of peer nodes. Hyperledger Fabric is a blockchain platform that offers the added benefits of lower operating cost, faster transaction processing, authentication-based access control, and support for self-enforcing smart contracts.
This thesis investigates the applicability of Hyperledger Fabric to tokenize and transact energy assets in a unified transaction system. Data-driven approaches to implementing an incentive-based energy flexibility system for peak mitigation on the blockchain are also investigated.
To this end, the stakeholders of such a transaction management system were identified and their business relationships and interactions described. Energy assets were encapsulated into blockchain tokens, and algorithms were developed and encoded into self-enforcing smart contracts based on the stakeholder relationships. A unified transaction framework was proposed to bring on board all the stakeholders, their trading relationships, and the assets being transacted. Tokens and methods in the transaction system were implemented in fungible and non-fungible versions, and the versions were critically compared in terms of application area, design, algorithmic complexity, performance, advantages, and disadvantages. Further, with a focus on energy flexibility applications, a prosumer research dataset was analysed to gain insights into production and consumption behaviours. Based on these insights, a data-driven approach for peak mitigation was proposed and implemented on the Hyperledger Fabric blockchain.
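The fungible/non-fungible contrast drawn above can be sketched as two tiny ledger models. Hyperledger Fabric chaincode is normally written in Go, Node.js, or Java; the plain-Python classes below only illustrate the ledger logic, and every class, method, and field name is an assumption for illustration, not the thesis's implementation.

```python
class FungibleLedger:
    """Interchangeable units (e.g. kWh credits): only balances matter."""

    def __init__(self):
        self.balances = {}  # owner -> amount

    def mint(self, owner, amount):
        self.balances[owner] = self.balances.get(owner, 0) + amount

    def transfer(self, sender, receiver, amount):
        # A smart contract would enforce this check on-chain.
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount


class NonFungibleLedger:
    """Distinct assets (e.g. a specific flexibility incentive): identity matters."""

    def __init__(self):
        self.owners = {}  # token_id -> owner

    def mint(self, token_id, owner):
        if token_id in self.owners:
            raise ValueError("token already exists")
        self.owners[token_id] = owner

    def transfer(self, token_id, sender, receiver):
        if self.owners.get(token_id) != sender:
            raise ValueError("sender does not own this token")
        self.owners[token_id] = receiver
```

The design trade-off the thesis compares is visible even at this scale: fungible tokens need only balance arithmetic, while non-fungible tokens must track per-asset ownership, which raises storage and lookup costs but preserves asset identity.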
The thesis thus addresses different aspects of a blockchain-based prosumer transaction system, and shows the feasibility of the proposed approaches through implementation and performance testing of proofs of concept.
Transformative Effects of IoT, Blockchain and Artificial Intelligence on Cloud Computing: Evolution, Vision, Trends and Open Challenges
Cloud computing plays a critical role in modern society and enables a range of applications, from infrastructure to social media. Such systems must cope with varying load and evolving usage, reflecting society's interaction with and dependency on automated computing systems, whilst satisfying Quality of Service (QoS) guarantees. Enabling these systems is a cohort of conceptual technologies, synthesised to meet the demands of evolving computing applications. In order to understand the current and future challenges of such systems, there is a need to identify the key technologies enabling future applications. In this study, we explore how three emerging paradigms (Blockchain, IoT, and Artificial Intelligence) will influence future cloud computing systems. Further, we identify several technologies driving these paradigms and invite international experts to discuss the current status and future directions of cloud computing. Finally, we propose a conceptual model for cloud futurology to explore the influence of emerging paradigms and technologies on the evolution of cloud computing.
Strategies of development and maintenance in supervision, control, synchronization, data acquisition and processing in light sources
Programa Oficial de Doutoramento en Tecnoloxías da Información e as Comunicacións. 5032V01

[Abstract]
Particle accelerators and photon sources are constantly evolving, adopting cutting-edge technologies to push the limits forward and explore new domains. Control systems are a crucial part of these installations and are required to provide flexible solutions for new, challenging experiments, with different kinds of detectors, setups, sample environments, and procedures.
Experiment proposals grow more ambitious with each call and often go a step beyond the capabilities of the instrumentation. Detectors must be faster and more efficient, with higher resolution and bandwidth, and able to synchronize with detectors of all kinds (scalar, one- or two-dimensional), taking their singularities into account and homogenizing data acquisition.
This work examines the control and data acquisition systems of particle accelerators and X-ray and light sources, and explores new requirements and challenges regarding synchronization and data acquisition bandwidth, as well as optimization and cost-efficiency in design, operation, and support. It also studies different solutions depending on the environment.
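The homogenization of heterogeneous detectors mentioned above can be illustrated by grouping readouts onto a shared trigger timestamp. This is a hypothetical sketch only: the detector names, data layout, tolerance, and alignment rule below are assumptions for illustration, not the systems studied in the thesis.

```python
def align_on_trigger(streams, tolerance_s=1e-6):
    """Group readouts whose timestamps fall within tolerance_s of a common trigger.

    streams maps a detector name to a list of (timestamp_s, payload) tuples,
    each list sorted by timestamp; the first stream defines the trigger times.
    """
    names = list(streams)
    reference = streams[names[0]]
    aligned = []
    for t_ref, _ in reference:
        event = {}
        for name in names:
            # Pick this detector's readout closest to the trigger time.
            ts, payload = min(streams[name], key=lambda r: abs(r[0] - t_ref))
            if abs(ts - t_ref) <= tolerance_s:
                event[name] = payload
        if len(event) == len(names):  # keep only complete events
            aligned.append((t_ref, event))
    return aligned
```

In a real facility this matching happens in hardware timing systems rather than in software after the fact; the sketch only shows the kind of bookkeeping that "integrating each detector's singularities" implies, since scalar, 1D, and 2D detectors report at different rates and latencies.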