Rapid prototyping for biomedical engineering: current capabilities and challenges
A new set of manufacturing technologies has emerged in the past decades to address market requirements in a customized way and to provide support for research tasks that require prototypes. These new techniques and technologies are usually referred to as rapid prototyping and manufacturing technologies, and they allow prototypes to be produced in a wide range of materials with remarkable precision in a couple of hours. Although they have been rapidly incorporated into product development methodologies, they are still under development, and their applications in bioengineering are continuously evolving. Rapid prototyping and manufacturing technologies can assist at every stage of the development process of novel biodevices, helping to address problems arising from the devices' interactions with biological systems and to ensure that design decisions are tested carefully. This review focuses on the main fields of application for rapid prototyping in biomedical engineering and health sciences, as well as on the most remarkable challenges and research trends.
Developments in the tools and methodologies of synthetic biology.
Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices, or systems. However, biological systems are generally complex and unpredictable, and are therefore intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology aims to unify a body of knowledge from several foundational scientific fields within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled, and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled, and its functionality tested. At each stage of the design cycle, an expanding repertoire of tools is being developed. In this review, we highlight several of these tools in terms of their applications and benefits to the synthetic biology community.
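The modeling stage of the forward-design cycle described above can be illustrated with a minimal sketch: a deterministic simulation of a hypothetical repressible promoter under Hill-type repression, integrated with simple Euler steps. All parameter names and values here are illustrative assumptions, not taken from the reviewed work.

```python
def simulate_expression(repressor, k_max=10.0, K=1.0, n=2.0,
                        degradation=0.5, dt=0.01, t_end=20.0):
    """Steady protein level under Hill-type repression.

    dP/dt = k_max / (1 + (R/K)^n) - degradation * P

    All parameter values are illustrative assumptions.
    """
    p = 0.0
    # Repressor concentration R is held fixed, so production is constant.
    production = k_max / (1.0 + (repressor / K) ** n)
    for _ in range(int(t_end / dt)):
        p += dt * (production - degradation * p)  # forward Euler step
    return p

# More repressor -> lower steady-state expression.
print(simulate_expression(repressor=0.0))  # unrepressed: approaches k_max/degradation
print(simulate_expression(repressor=5.0))  # strongly repressed
```

A model like this lets a design be analyzed (e.g. choosing K and n for a desired switching threshold) before anything is assembled in the lab.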
21st Century Simulation: Exploiting High Performance Computing and Data Analysis
This paper identifies, defines, and analyzes the limitations imposed on Modeling and Simulation by outmoded paradigms in computer utilization and data analysis. The authors then discuss two emerging capabilities to overcome these limitations: High Performance Parallel Computing and Advanced Data Analysis. First, parallel computing, in supercomputers and Linux clusters, has proven effective by providing users an advantage in computing power, characterized as a ten-year lead over the use of single-processor computers. Second, advanced data analysis techniques are both necessitated and enabled by this leap in computing power. JFCOM's JESPP project is one of the few simulation initiatives to effectively embrace these concepts. The challenges facing the defense analyst today have grown to include the need to consider operations among non-combatant populations, to focus on impacts to civilian infrastructure, to differentiate combatants from non-combatants, and to understand non-linear, asymmetric warfare. These requirements stretch both current computational techniques and data analysis methodologies. In this paper, documented examples and potential solutions are advanced, and the authors discuss paths to successful implementation based on their experience. Reviewed technologies include parallel computing, cluster computing, grid computing, data logging, operations research, database advances, data mining, evolutionary computing, genetic algorithms, and Monte Carlo sensitivity analyses. The modeling and simulation community has significant potential to provide more opportunities for training and analysis. Simulations must include increasingly sophisticated environments, better emulations of foes, and more realistic civilian populations. Overcoming the implementation challenges will produce dramatically better insights for trainees and analysts. High Performance Parallel Computing and Advanced Data Analysis promise increased understanding of future vulnerabilities, helping to avoid unneeded mission failures and unacceptable personnel losses. The authors set forth road maps for rapid prototyping and adoption of advanced capabilities, and discuss the beneficial impact of embracing these technologies, as well as the risk mitigation required to ensure success.
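One of the reviewed techniques, Monte Carlo sensitivity analysis, can be sketched in a few lines. The toy "mission outcome" model below, its parameters, and its coefficients are all hypothetical stand-ins invented for illustration; the approach, not the model, is the point: sample a stochastic model many times at different settings of one input and compare the mean outcomes.

```python
import random
import statistics

def mission_outcome(civilian_density, sensor_range, rng):
    """Hypothetical toy model: score for correctly differentiating
    combatants from non-combatants (all coefficients invented)."""
    noise = rng.gauss(0.0, 0.05)
    # Higher sensor range helps; higher civilian density hurts.
    score = 0.5 + 0.3 * sensor_range - 0.4 * civilian_density + noise
    return min(max(score, 0.0), 1.0)

def sensitivity(param_low, param_high, fixed_density, n=10_000, seed=42):
    """Monte Carlo estimate of how the mean outcome shifts when one
    input (sensor_range) moves from a low to a high setting."""
    rng = random.Random(seed)
    low = [mission_outcome(fixed_density, param_low, rng) for _ in range(n)]
    high = [mission_outcome(fixed_density, param_high, rng) for _ in range(n)]
    return statistics.mean(high) - statistics.mean(low)

delta = sensitivity(param_low=0.2, param_high=0.8, fixed_density=0.5)
print(f"Estimated sensitivity to sensor range: {delta:+.3f}")
```

Real applications of this kind run far richer simulations at each sample, which is exactly where the parallel computing capability discussed above pays off: the samples are independent and trivially distributable across cluster nodes.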
A comparison of processing techniques for producing prototype injection moulding inserts.
This project investigates processing techniques for producing low-cost moulding inserts used in the particulate injection moulding (PIM) process. Prototype moulds were made by additive and subtractive processes, as well as a combination of the two. The general motivation was to reduce the entry cost for users considering PIM.
PIM cavity inserts were first made by conventional machining from a polymer block using a Pocket NC desktop mill. PIM cavity inserts were also made by fused filament deposition modelling using a Tiertime UP Plus 3D printer.
The injection moulding trials revealed surface-finish and part-removal defects. The feedstock was a titanium metal blend, which is brittle compared with commodity polymers; this, combined with mesoscale features, small cross-sections, and complex geometries, was considered the main source of the problems. For both processing methods, fixes were identified and implemented to test this hypothesis; these took the form of a blended approach combining the additive and subtractive processes.
The parts produced by the three processing methods are investigated, and their respective merits and issues are discussed.
Reducing risk in pre-production investigations through undergraduate engineering projects.
This poster is the culmination of final-year Bachelor of Engineering Technology (B.Eng.Tech) student projects in 2017 and 2018. The B.Eng.Tech is a level-seven qualification that aligns with the Sydney Accord for a three-year engineering degree and is hence internationally benchmarked. The enabling mechanism of these projects is the industry connectivity that creates real-world projects and highlights the benefits of process investigation at the technologist level.
The methodologies we use are basic and transparent, with enough depth of technical knowledge to ensure the industry partners gain from the collaboration. The process we use minimizes the disconnect between the student and the industry supervisor while maintaining the academic freedom of the student and the commercial sensitivities of the supervisor.
The general motivation for this approach is to reduce industry's entry cost in considering new technologies, thereby reducing risk to core business and shareholder profits.
The poster presents several images and interpretive dialogue to explain the positive and negative aspects of the student process.
Virtual and augmented reality as enablers for improving the service on distributed assets
The evolution of Augmented and Virtual Reality is enabling new solutions. This paper addresses the creation of applications to support service and maintenance of distributed systems. This approach could be applied to devices provided as a service for industrial and individual use, and could introduce new capabilities in terms of training for operators, control, and remote service support. The paper presents a case study devoted to leading the introduction of these innovative solutions in industrial and health care systems.
New Horizons: Pioneering Pharmaceutical R&D with Generative AI from lab to the clinic -- an industry perspective
The rapid advance of generative AI is reshaping the strategic vision for R&D across industries. The unique challenges of pharmaceutical R&D will see applications of generative AI deliver value along the entire value chain, from early discovery to regulatory approval. This perspective reviews these challenges and takes a three-horizon approach to explore the generative AI applications already delivering impact, the disruptive opportunities which are just around the corner, and the longer-term transformation which will shape the future of the industry. Selected applications are reviewed for their potential to drive increased productivity, accelerate timelines, improve the quality of research, data, and decision making, and support a sustainable future for the industry. Recommendations are given for Pharma R&D leaders developing a generative AI strategy today which will lay the groundwork for getting real value from the technology and safeguarding future growth. Generative AI is today providing new, efficient routes to accessing and combining organisational data to drive productivity. Next, this impact will reach clinical development, enhancing the patient experience, driving operational efficiency, and unlocking digital innovation to better tackle the future burden of disease. Looking to the furthest horizon, rapid acquisition of rich multi-omics data, which capture the 'language of life', in combination with next-generation AI technologies, will allow organisations to close the loop around phases of the pipeline through rapid, automated generation and testing of hypotheses from bench to bedside. This provides a vision for the future of R&D with sustainability at the core, with reduced timescales and reduced dependency on resources, while offering new hope to patients to treat the untreatable and ultimately cure diseases.
Comment: 21 pages, 4 figures