4 research outputs found
Exploiting Hardware Abstraction for Parallel Programming Framework: Platform and Multitasking
With the parallelism provided by their fine-grained architecture, hardware accelerators on Field Programmable Gate Arrays (FPGAs) can significantly improve the performance of many applications. However, designers are required to have excellent hardware programming skills and specialized optimization techniques to fully explore the potential of FPGA resources. Intermediate frameworks layered above the hardware circuits have been proposed to improve either performance or productivity by leveraging parallel programming models beyond the multi-core era.
In this work, we propose the PolyPC (Polymorphic Parallel Computing) framework, which aims to enhance productivity without sacrificing performance. It helps designers develop parallelized applications and implement them on FPGAs. The PolyPC framework implements a custom hardware platform on which programs written in an OpenCL-like programming model can be launched. Additionally, the PolyPC framework extends vendor-provided tools into a complete development environment, including an intermediate software framework and automatic system builders. Designers' programs can be either synthesized as hardware processing elements (PEs) or compiled to executable files running on software PEs. Thanks to nontrivial features such as re-loadable PEs and independent group-level schedulers, multitasking is enabled for both software and hardware PEs to improve the efficiency of utilizing hardware resources.
The PolyPC framework is evaluated in terms of performance, area efficiency, and multitasking. The results show a maximum speedup of 66x over a dual-core ARM processor and 1043x over a high-performance MicroBlaze, with 125x the area efficiency. Priority-aware scheduling delivers a significant improvement in the response time of high-priority tasks. The overheads of multitasking are evaluated to analyze the trade-offs. With the help of the design flow, OpenCL application programs are converted into executables for the PEs through front-end source-to-source transformation and back-end synthesis/compilation, and the framework itself is generated from users' specifications.
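The priority-aware dispatch behaviour described above can be sketched as a small scheduler model. This is a minimal illustrative sketch, assuming a single ready queue from which the highest-priority task is dispatched to a free PE first; the names (`Task`, `schedule`) and the queue structure are hypothetical assumptions, not PolyPC's actual API.

```python
import heapq

# Hypothetical sketch of priority-aware scheduling over processing
# elements (PEs). Names and structure are illustrative assumptions.

class Task:
    def __init__(self, name, priority):
        self.name = name
        self.priority = priority  # lower number = higher priority

def schedule(tasks):
    """Return task names in the order a priority-aware scheduler
    would dispatch them to a free PE (highest priority first)."""
    # The index i breaks ties so Task objects are never compared.
    heap = [(t.priority, i, t) for i, t in enumerate(tasks)]
    heapq.heapify(heap)
    order = []
    while heap:
        _, _, t = heapq.heappop(heap)
        order.append(t.name)
    return order

# A high-priority task jumps ahead of earlier low-priority ones.
print(schedule([Task("fir", 2), Task("alarm", 0), Task("fft", 1)]))
# → ['alarm', 'fft', 'fir']
```

In this toy model, the improved response time for high-priority work falls out directly: a late-arriving urgent task is still dispatched before any queued lower-priority task.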
Design of asynchronous microprocessor for power proportionality
PhD Thesis
Microprocessors continue to get exponentially cheaper for end users following Moore’s
law, while the costs involved in their design keep growing, also at an exponential rate.
The reason is the ever increasing complexity of processors, which modern EDA tools
struggle to keep up with. This puts further performance scaling at a high risk to the
reliability of the system. To keep this risk low, yet improve the performance,
CPU designers try to optimise various parts of the processor. Instruction Set Architecture
(ISA) is a significant part of the whole processor design flow, whose optimal design
for a particular combination of available hardware resources and software requirements
is crucial for building processors with high performance and efficient energy utilisation.
This is a challenging task involving a lot of heuristics and high-level design decisions.
Another issue impacting CPU reliability is continuous scaling for power consumption. For
the last decades CPU designers have been mainly focused on improving performance, but
“keeping energy and power consumption in mind”. The consequence was the development
of energy-efficient systems, where energy was considered a resource whose
consumption should be optimised. As CMOS technology was progressing, with feature
size decreasing and power delivered to circuit components becoming less stable, the
energy resource turned from an optimisation criterion into a constraint, sometimes a critical
one. At this point power proportionality becomes one of the most important aspects
in system design. Developing methods and techniques which will address the problem
of designing a power-proportional microprocessor, capable of adapting to varying operating
conditions (such as low or even unstable voltage levels) and application requirements
at runtime, is one of today’s grand challenges. In this thesis this challenge is addressed
by proposing a new design flow for the development of an ISA for microprocessors, which
can be altered to suit a particular hardware platform or a specific operating mode. This
flow uses an expressive and powerful formalism for the specification of processor instruction
sets called the Conditional Partial Order Graph (CPOG). The CPOG model captures
large sets of behavioural scenarios for a microarchitectural level in a computationally
efficient form amenable to formal transformations for synthesis, verification and automated
derivation of asynchronous hardware for the CPU microcontrol. The feasibility of
the methodology, the novel design flow, and a number of optimisation techniques was proven
in a full-size asynchronous Intel 8051 microprocessor and its demonstrator silicon. The
chip showed the ability to work in a wide range of operating voltage and environmental
conditions. Depending on application requirements and the power budget, our ASIC supports
two operating modes: one optimised for energy consumption and the other for
performance. This was achieved by extending a traditional datapath structure with an
auxiliary control layer for adaptable and fault tolerant operation. These and other optimisations
resulted in a reconfigurable and adaptable implementation, which was proven
by measurements, analysis and evaluation of the chip.
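As a rough illustration of the CPOG idea mentioned above, the toy sketch below guards the vertices and arcs of a graph with Boolean conditions over control variables; fixing those variables projects the graph onto one concrete partial order (one behavioural scenario, e.g. one instruction). The encoding and names here are assumptions for illustration only, not the thesis's actual formalism or tooling.

```python
# Toy CPOG-style projection: vertices/arcs carry Boolean guards over
# control variables; an assignment selects one scenario. Illustrative
# assumption, not the actual CPOG machinery.

def project(vertices, arcs, assignment):
    """Keep only vertices and arcs whose guard evaluates to True
    under the given variable assignment."""
    live = {v for v, guard in vertices.items() if guard(assignment)}
    order = [(a, b) for (a, b), guard in arcs.items()
             if guard(assignment) and a in live and b in live]
    return live, order

# Two scenarios share fetch/decode; variable x selects between an
# ALU operation and a memory access.
vertices = {
    "fetch":  lambda env: True,
    "decode": lambda env: True,
    "alu":    lambda env: env["x"],
    "mem":    lambda env: not env["x"],
}
arcs = {
    ("fetch", "decode"): lambda env: True,
    ("decode", "alu"):   lambda env: env["x"],
    ("decode", "mem"):   lambda env: not env["x"],
}

live, order = project(vertices, arcs, {"x": True})
print(sorted(live))  # → ['alu', 'decode', 'fetch']
print(order)         # → [('fetch', 'decode'), ('decode', 'alu')]
```

The point of the overlay is that many scenarios share one graph, so common structure (here, fetch and decode) is specified once, which is what makes the representation compact and amenable to formal transformation.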
The Conflicted Mission of the United States Bureau of Biological Survey, 1885-1940: Wildlife, Uncertainty, and Ambivalence
The United States Bureau of Biological Survey, initially founded as the Division of Economic Ornithology and Mammalogy within the Department of Agriculture in 1885, began with a focus on scientific research. Its principal responsibilities were mapping the North American continent's geographical distribution of flora and fauna and determining which animal species were beneficial or injurious to agriculture. Soon, however, the Survey took on new assignments. By the first decade of the twentieth century, the federal bureau was controlling predators and rodents, protecting wildlife on big game reservations and avian refuges, and enforcing wildlife legislation. These added responsibilities resulted in a conflicted mission for the Survey: Since the bureau had to both kill (through predator and rodent control) and protect wildlife, it could not build unequivocal, long-lasting alliances with groups of constituents that would support the Survey. Stockmen supported predator and rodent control yet were critical of wildlife protection. Sport hunters welcomed the avian refuges but often opposed the enforcement of hunting regulations. Scientists and conservationists endorsed wildlife protection but disapproved of predator and rodent control. Furthermore, states, other federal agencies, and residents living near the refuges and reservations often had their own ideas about wildlife and the acceptable use of land designated for wildlife protection, sometimes welcoming the Survey, sometimes opposing it, and sometimes demonstrating a combination of support and resistance. Thus, the Survey's relationships with states, other bureaus, local citizens, and different groups of constituents were ambivalent and uncertain. The uncertainty was further exacerbated by the lack of basic knowledge of wildlife, a reflection of the incipient fields of wildlife science and game management.
Working within the constraints of a conflicted mission, divided authority between state and federal government over the management of wildlife, a wavering base of support, and limited scientific understanding of wildlife, the Survey faced its responsibilities with a high degree of uncertainty and was pulled in multiple directions.