Multimedia delivery in the future internet
The term "Networked Media" implies that all kinds of media, including text, images, 3D graphics, audio
and video, are produced, distributed, shared, managed and consumed on-line through various networks,
such as the Internet, fibre, WiFi, WiMAX, GPRS, 3G and so on, in a convergent manner [1]. This white
paper is the contribution of the Media Delivery Platform (MDP) cluster and aims to cover the
challenges of Networked Media in the transition to the Future Internet.
The Internet has evolved and changed the way we work and live. End users of the Internet have been confronted
with a bewildering range of media, services and applications, and with technological innovations concerning
media formats, wireless networks, and terminal types and capabilities. There is little evidence that the pace
of this innovation is slowing. Today, over one billion users access the Internet on a regular basis, more
than 100 million users have downloaded at least one (multi)media file, and over 47 million of them do so
regularly, searching in more than 160 Exabytes of content. In the near future these numbers are expected
to rise exponentially. Internet content is expected to grow by at least a factor of six, rising
to more than 990 Exabytes before 2012, fuelled mainly by the users themselves. Moreover, it is envisaged
that in the near- to mid-term future, the Internet will provide the means to share and distribute (new)
multimedia content and services with superior quality and striking flexibility, in a trusted and personalised
way, improving citizens' quality of life, working conditions, edutainment and safety.
In this evolving environment, new transport protocols, new multimedia encoding schemes, cross-layer in-network
adaptation, machine-to-machine communication (including RFIDs), rich 3D content, as well as
community networks and the use of peer-to-peer (P2P) overlays, are expected to generate new models of
interaction and cooperation and to support enhanced perceived quality of experience (PQoE) and
innovative applications "on the move", such as virtual collaboration environments, personalised services/
media, virtual sport groups, on-line gaming and edutainment. In this context, interaction with content,
combined with interactive/multimedia search capabilities across distributed repositories, opportunistic P2P
networks and dynamic adaptation to the characteristics of diverse mobile terminals, is expected to
contribute towards such a vision.
Based on work that has taken place in a number of EC co-funded projects, in Framework Programme 6 (FP6)
and Framework Programme 7 (FP7), a group of experts and technology visionaries have voluntarily
contributed to this white paper, aiming to describe the status, the state of the art, the challenges and the way
ahead in the area of content-aware media delivery platforms.
6G secure quantum communication: a success probability prediction model
© 2024 The Author(s). This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY), https://creativecommons.org/licenses/by/4.0/. The emergence of 6G networks initiates significant transformations in the communication technology landscape, and the melding of quantum computing (QC) with 6G networks promises an array of benefits, particularly in secure communication. Adapting QC to 6G requires a rigorous focus on numerous critical variables. This study aims to identify key variables in secure quantum communication (SQC) in 6G and to develop a model for predicting the success probability of 6G-SQC projects. To achieve these objectives, we identified key 6G-SQC variables from the existing literature and collected training data by conducting a questionnaire survey. We then analyzed these variables using an optimization model, a Genetic Algorithm (GA), with two different prediction methods: the Naïve Bayes Classifier (NBC) and Logistic Regression (LR). The results of the success probability prediction models indicate that as 6G-SQC matures, project success probability significantly increases and costs are notably reduced. Furthermore, the best fitness rankings for each 6G-SQC project variable determined using NBC and LR indicated a strong positive correlation (rs = 0.895). The t-test results (t = 0.752, p = 0.502 > 0.05) show no significant differences between the rankings calculated using the two prediction models (NBC and LR). The results reveal that the developed success probability prediction model, based on 15 identified 6G-SQC project variables, highlights the areas where practitioners need to focus to facilitate the cost-effective and successful implementation of 6G-SQC projects. Peer reviewed.
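The abstract's rank-agreement check can be sketched with Spearman's rank correlation. This is a minimal illustration only: the two rankings below are hypothetical stand-ins, not the study's actual NBC and LR rankings of the 15 project variables.

```python
# Illustrative sketch: Spearman's rank correlation between two rankings,
# the statistic the study uses to compare NBC- and LR-based orderings.
# The ranking data below are made up for demonstration.

def spearman_rs(rank_a, rank_b):
    """Spearman's rho for two rankings with no ties:
    rs = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1))."""
    n = len(rank_a)
    d2 = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical fitness rankings of 15 variables by two prediction models
nbc_rank = [1, 2, 3, 5, 4, 6, 8, 7, 9, 10, 12, 11, 13, 14, 15]
lr_rank  = [2, 1, 3, 4, 5, 7, 6, 8, 10, 9, 11, 12, 14, 13, 15]

rs = spearman_rs(nbc_rank, lr_rank)
print(f"rs = {rs:.3f}")  # close agreement between the two rankings
```

A value near 1 indicates that both models order the variables almost identically, which is the kind of agreement (rs = 0.895) the study reports.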
Systems approaches to modelling pathways and networks.
Peer reviewed. Preprint.
Study of Finite Elements-based reliability and maintenance algorithmic methodologies analysis applied to aircraft structures and design optimization
This thesis presents the development of a research methodology oriented to the analysis of an aircraft structure in terms of operational reliability and maintainability requirements regarding its airworthiness. The study focuses on modern commercial aircraft models, carrying out market research and model selection according to different criteria. The study then develops a practical implementation consisting of the design approach of the aircraft airframe and main structural components for subsequent numerical analysis and simulation. The numerical simulations are computed by applying the Finite Element Method to the main structural systems of the aircraft and establishing boundary conditions. These simulations allow the development of a computational study of linear, non-linear and transient simulations of static loads, buckling, modal analysis, temperature, fatigue and thermal stress of individual structures and the full assembly under different conditions. Finally, these results are assessed and exported to a Matlab code, which computes an algorithmic methodology in order to approach the operational reliability and safety of the aircraft under the studied conditions. The thesis concludes with a review of airworthiness regulations and a proposal of research paths and further development of the methodology implemented.
Potential identification and industrial evaluation of an integrated design automation workflow.
Purpose - The paper aims to raise awareness in the industry of design automation tools, especially in early design phases, by demonstrating along a case study the seamless integration of a prototypically implemented optimization, supporting design space exploration in the early design phase, with a product configurator in operational use, supporting the drafting and detailing of the solution predominantly in the later design phase. Design/methodology/approach - Based on a comparison of modeled as-is and to-be processes of ascent assembly designs with and without design automation tools, an automation roadmap is developed. Using qualitative and quantitative assessments, the potentials and benefits, as well as acceptance and usage aspects, are evaluated. Findings - Engineers tend to consider design automation for routine tasks. Yet, prototypical implementations support the communication and identification of potential in the early stages of the design process to explore solution spaces. In this context, choosing from and interactively working with automatically generated alternative solutions emerged as a particular focus. Translators, enabling automatic downstream propagation of changes and thus ensuring consistency with regard to change management, were also evaluated to be of major value. Research limitations/implications - A systematic validation of design automation in design practice is presented. For generalization, more case studies are needed. Further, the derivation of appropriate metrics needs to be investigated to normalize validation of design automation in future research. Practical implications - Integration of design automation in early design phases has great potential for reducing costs in the market launch. Prototypical implementations are an important ingredient for evaluating actual usage and acceptance before implementing a live system.
Originality/value - There is a lack of systematic validation of design automation tools supporting early design phases. In this context, this work contributes a systematically validated industrial case study. Technology transfer supporting early design phases is important because of its high leverage potential.
Assessing gaps and needs for integrating building performance optimization tools in net zero energy buildings design
This paper summarizes a study undertaken to reveal potential challenges and opportunities for integrating optimization tools in net zero energy building (NZEB) design. The paper reviews current trends in simulation-based building performance optimization (BPO) and outlines major criteria for the selection and evaluation of optimization tools, based on an analysis of users' needs for tool capabilities and requirement specifications. The review is carried out by means of a literature review of 165 publications and interviews with 28 optimization experts. The findings are based on an inter-group comparison between experts. The aim is to assess the gaps and needs for integrating BPO tools in NZEB design. The findings indicate a breakthrough in using evolutionary algorithms to solve highly constrained envelope, HVAC and renewable-energy optimization problems. A simple genetic algorithm solved many design and operation problems and allowed measuring the improvement in the optimality of a solution against a base case. Evolutionary algorithms are also easily adapted to solve a particular optimization problem more effectively. However, existing limitations include model uncertainty, computation time, difficulty of use and a steep learning curve. Some future directions anticipated or needed for the improvement of current tools are presented. Peer reviewed.
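The simple genetic algorithm the abstract refers to can be sketched in a few lines. This is an illustrative toy only: the two "envelope parameters" and the quadratic cost function stand in for a real building-performance simulation, and the operator choices (truncation selection, averaging crossover, Gaussian mutation) are one common textbook variant, not the tools surveyed in the paper.

```python
# Toy genetic-algorithm sketch: minimizing a hypothetical energy-cost
# function over two made-up envelope parameters. A real BPO setup would
# call a building simulation here instead.
import random

random.seed(42)  # reproducible run

def energy_cost(ind):
    insulation, window_ratio = ind
    # Stand-in objective with its optimum at insulation=0.3, ratio=0.4
    return (insulation - 0.3) ** 2 + (window_ratio - 0.4) ** 2

def ga(pop_size=30, generations=50, sigma=0.1):
    pop = [[random.uniform(0.05, 0.5), random.uniform(0.1, 0.9)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=energy_cost)
        parents = pop[:pop_size // 2]              # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]        # crossover
            child = [g + random.gauss(0, sigma) for g in child]  # mutation
            children.append(child)
        pop = parents + children                   # elitist replacement
    return min(pop, key=energy_cost)

best = ga()
print(best, energy_cost(best))
```

Keeping the best parents each generation (elitism) is what makes the improvement against a base case monotone and easy to measure, as the abstract describes.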
Automatic online algorithm selection for optimization in cyber-physical production systems
Shrinking product lifecycles, progressing market penetration of innovative product technologies, and increasing demand for product individualization lead to frequent adjustments of production processes and thus to an increasing demand for frequent optimization of production processes. Offline solutions are not always available, and even the optimization problem class itself may have changed in terms of the value landscape of the objective function: Parameters may have been added, the locations of optimal values and the values themselves may have changed. This thesis develops an automatic solution to the algorithm selection problem for continuous optimization. Furthermore, based on the evaluation of three different real-world use cases and a review of well-known architectures from the field of automation and cognitive science, a system architecture suitable for use in large data scenarios was developed. The developed architecture has been implemented and evaluated on two real-world problems: A Versatile Production System (VPS) and Injection Molding Optimization (IM). The developed solution for the VPS was able to automatically tune the feasible algorithms and select the most promising candidate, which significantly outperformed the competitors. This was evaluated by applying statistical tests based on the generated test instances using the process data and by performing benchmark experiments. This solution was extended to the area of multi-objective optimization for the IM use case by specifying an appropriate algorithm portfolio and selecting a suitable performance metric to automatically compare the algorithms. This allows the automatic optimization of three largely uncorrelated objectives: cycle time, average volume shrinkage, and maximum warpage of the parts to be produced. The extension to multi-objective handling for IM optimization showed a huge benefit in terms of manual implementation effort, as most of the work could be done by configuration. 
The implementation effort was reduced to selecting optimizers and hypervolume computation.
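The hypervolume metric mentioned above can be illustrated for the two-objective case. This is a minimal sketch of the standard 2-D hypervolume indicator for minimization; the objective vectors and reference point below are invented for demonstration and are not data from the thesis.

```python
# Illustrative 2-D hypervolume indicator (minimization): the area of
# objective space dominated by a Pareto front, bounded by a reference
# point. Used to compare multi-objective optimizers automatically.

def hypervolume_2d(front, ref):
    """Area dominated by a 2-D front (minimization) up to `ref`."""
    # Keep only points that strictly dominate the reference point
    pts = sorted(p for p in front if p[0] < ref[0] and p[1] < ref[1])
    hv, prev_y = 0.0, ref[1]
    for x, y in pts:              # sweep in order of increasing x
        if y < prev_y:            # point adds a new non-dominated strip
            hv += (ref[0] - x) * (prev_y - y)
            prev_y = y
    return hv

# Made-up objective vectors, e.g. (cycle time, volume shrinkage)
front = [(1.0, 4.0), (2.0, 2.0), (3.0, 1.0)]
print(hypervolume_2d(front, ref=(4.0, 5.0)))  # prints 8.0
```

A larger hypervolume means the front dominates more of the objective space, which is why it serves as a single scalar for ranking algorithm portfolios.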
GPT Models in Construction Industry: Opportunities, Limitations, and a Use Case Validation
Large Language Models (LLMs) trained on large data sets came into prominence
in 2018 after Google introduced BERT. Subsequently, different LLMs such as GPT
models from OpenAI have been released. These models perform well on diverse
tasks and have been gaining widespread applications in fields such as business
and education. However, little is known about the opportunities and challenges
of using LLMs in the construction industry. Thus, this study aims to assess GPT
models in the construction industry. A critical review, expert discussion and
case study validation are employed to achieve the study objectives. The
findings revealed opportunities for GPT models throughout the project
lifecycle. The challenges of leveraging GPT models are highlighted and a use
case prototype is developed for materials selection and optimization. The
findings of the study would be of benefit to researchers, practitioners and
stakeholders, as it presents research vistas for LLMs in the construction
industry. Comment: 58 pages, 20 figures.
- …