Development of low frequencies, insulating thick diaphragms for power MEMS applications
Major challenges of micro thermal machines are thermal insulation and, in the case of a sliding piston, mechanical tolerances. Switching from a piston to a membrane in microengines can alleviate the latter and leads to planar architectures. However, thermal isolation calls for very thick structures, whose resonant frequencies are too high and detrimental to engine performance, so a compromise between thermal and mechanical requirements would normally have to be made. Instead, by exploiting fluid-structure interaction with an incompressible fluid contained in a cavity sealed by a deformable diaphragm, it is possible to design a thick, low-frequency, insulating diaphragm. The design involves a simple planar geometry that is easy to manufacture with standard microelectronics methods. An analytical fluid-structure model is proposed and theoretically validated. Experimental structures are realized and tested, and the model is in agreement with the experimental results. A dimensionless model is proposed for designing hybrid fluid structures for micromachines.
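As an illustrative sketch only (a generic single-degree-of-freedom added-mass picture, not the analytical model proposed in the paper), fluid loading lowers a diaphragm's resonance as

f_0 = \frac{1}{2\pi}\sqrt{\frac{k}{m_d + m_f}} = \frac{f_{\mathrm{vac}}}{\sqrt{1 + m_f/m_d}},

where k and m_d are the effective stiffness and mass of the diaphragm, f_{\mathrm{vac}} its in-vacuo resonant frequency, and m_f the added mass of the incompressible fluid entrained in the cavity. If m_f is much larger than m_d, even a stiff, thick (and hence thermally insulating) diaphragm can exhibit a low resonant frequency.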
Going on-line on a shoestring: An experiment in concurrent development of requirements and architecture
A number of on-line applications were built for a small university using a micro-sized development team. Four ideas were tested during the project: the Twin Peaks development model, the use of fully functional prototypes in the requirements elicitation process, some core practices of Extreme Programming, and the use of open-source software in a production environment. Certain project management techniques and their application to a micro-sized development effort were also explored. These ideas and techniques proved effective in developing many significant Internet and networked applications in a short time and at very low cost.
Integrating information and knowledge for enterprise innovation
It has been widely accepted that enterprise integration can be a source of socio-technical and cultural problems within organisations wishing to provide a focussed end-to-end business service. It can lead to a “straitjacketing” of business process architectures, suppressing responsive business re-engineering and competitive advantage for some companies. Accordingly, the current typology and emergent forms of Enterprise Resource Planning (ERP) and Enterprise Application Integration (EAI) technologies are set in the context of information and knowledge integration philosophies, and key influences and trends in emerging IS integration choices for end-to-end, cost-effective and flexible knowledge integration are examined. As touch points across and outside organisations proliferate, via workflow- and relationship-management-driven value innovation, knowledge refinement and knowledge integration pose challenges to maximising the potential of innovation and sustainable success within enterprises, given the increasing propensity for data fragmentation and the lack of effective information management in the face of information overload. Furthermore, the IS mediation inherent in decision making and workflow-based business processes provides the basis for evaluating the effects of information and knowledge integration. The authors therefore propose a conceptual, holistic evaluation framework that encompasses these ideas, and argue that such trends, and their implications for enterprise IS integration to engender sustainable competitive advantage, require fundamental re-thinking.
DeSyRe: on-Demand System Reliability
The DeSyRe project builds on-demand adaptive and reliable Systems-on-Chip (SoCs). As fabrication technology scales down, chips are becoming less reliable, thereby incurring increased power and performance costs for fault tolerance. To make matters worse, power density is becoming a significant limiting factor in SoC design in general. In the face of such changes in the technological landscape, current solutions for fault tolerance are expected to introduce excessive overheads in future systems. Moreover, attempting to design and manufacture a totally defect- and fault-free system would impact heavily, even prohibitively, on design, manufacturing, and testing costs, as well as on system performance and power consumption. In this context, DeSyRe delivers a new generation of systems that are reliable by design at well-balanced power, performance, and design costs. In our attempt to reduce the overheads of fault tolerance, only a small fraction of the chip is built to be fault-free. This fault-free part is then employed to manage the remaining fault-prone resources of the SoC. The DeSyRe framework is applied to two medical systems with high safety requirements (measured using the IEC 61508 functional safety standard) and tight power and performance constraints.
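As a hedged illustration of that management principle only (not the DeSyRe framework or its actual interfaces; all class and function names below are hypothetical, and a simple known-answer test stands in for whatever fault-detection mechanism the project uses), a fault-free controller might screen the fault-prone processing elements and dispatch work only to those that still pass:

import random

class ProcessingElement:
    """A fault-prone resource: it may develop a permanent fault and then return wrong results."""
    def __init__(self, pe_id, fault_prob=0.05):
        self.pe_id = pe_id
        self.fault_prob = fault_prob
        self.faulty = False

    def compute(self, x):
        if not self.faulty and random.random() < self.fault_prob:
            self.faulty = True  # a permanent fault strikes this PE
        return x + 1 if not self.faulty else x  # wrong result once faulty

class FaultFreeManager:
    """Stands in for the small fault-free part of the chip: it screens the
    fault-prone PEs with a known-answer test and dispatches work only to
    PEs that still pass."""
    def __init__(self, pes):
        self.pes = pes

    def healthy(self):
        # Known-answer test: a healthy PE must map 41 to 42.
        return [pe for pe in self.pes if pe.compute(41) == 42]

    def run(self, jobs):
        results = []
        for job in jobs:
            ok = self.healthy()
            if ok:  # dispatch to the first PE that passed the test
                pe = ok[0]
                results.append((pe.pe_id, pe.compute(job)))
        return results

if __name__ == "__main__":
    manager = FaultFreeManager([ProcessingElement(i) for i in range(4)])
    print(manager.run(list(range(10))))

The point of the sketch is the division of labour: the manager itself is assumed trustworthy, while every other resource is treated as potentially faulty and is used only after being checked.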
Benchmarking performance management systems
The Balanced Scorecard and associated performance management approaches have become widely practiced and popular management reporting methods in recent times. Moreover, enabling technology, which assists in the delivery and personalisation of corporate performance information, is having a deeper and more rapid impact than ever before. This paper presents a brief comparative benchmarking study of leading enterprise performance management systems. The author also discusses the merits of bespoke internet technology development and out-of-the-box portal functionalities. An analysis of the key business drivers and implementation risks of such approaches, illustrated via a case study example, concludes the paper.
A review of advances in pixel detectors for experiments with high rate and radiation
The Large Hadron Collider (LHC) experiments ATLAS and CMS have established
hybrid pixel detectors as the instrument of choice for particle tracking and
vertexing in high rate and radiation environments, as they operate close to the
LHC interaction points. With the High Luminosity-LHC upgrade now in sight, for
which the tracking detectors will be completely replaced, new generations of
pixel detectors are being devised. They have to address enormous challenges in
terms of data throughput and radiation levels, ionizing and non-ionizing, that
harm the sensing and readout parts of pixel detectors alike. Advances in
microelectronics and microprocessing technologies now enable large scale
detector designs with unprecedented performance in measurement precision (space
and time), radiation hard sensors and readout chips, hybridization techniques,
lightweight supports, and fully monolithic approaches to meet these challenges.
This paper reviews the world-wide effort on these developments.