10 research outputs found
Learning from Prototypes
Structured methods for the analysis and design of information systems have largely focused on representations and control mechanisms for the outcomes of the design process. Prototyping methods are more sensitive to critiques during the design process itself but do not preserve knowledge about it explicitly. In this paper, a systems architecture called REMAP is presented that accumulates design process knowledge to manage systems evolution. To accomplish this, REMAP acquires and maintains dependencies among the design decisions made during a prototyping process. It includes a model for learning general design rules from such dependencies, which can be applied to prototype refinement, systems maintenance, and design re-use.
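To make the dependency idea concrete, here is a minimal sketch, with all names hypothetical (the paper does not give REMAP's actual data structures), of recording dependencies among design decisions and finding what a revision affects:

```python
# Hypothetical sketch of a dependency store for design decisions, in the
# spirit of REMAP's dependency maintenance. Names are illustrative only.

class DesignDecision:
    def __init__(self, name, rationale):
        self.name = name
        self.rationale = rationale
        self.depends_on = []  # decisions this one presupposes

    def add_dependency(self, other):
        self.depends_on.append(other)

def affected_by(decisions, changed):
    """Return every decision that transitively depends on `changed`,
    i.e., the decisions to revisit when `changed` is revised."""
    hit, frontier = set(), [changed]
    while frontier:
        current = frontier.pop()
        for d in decisions:
            if current in d.depends_on and d not in hit:
                hit.add(d)
                frontier.append(d)
    return hit

d1 = DesignDecision("use-relational-store", "team expertise")
d2 = DesignDecision("normalise-schema", "follows from storage choice")
d2.add_dependency(d1)
print([d.name for d in affected_by([d1, d2], d1)])  # ['normalise-schema']
```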
Occidental Versus Oriental I.S. Professionals' Perceptions on Key Factors for Motivation
A comparison of perceptions of analysts and programmers in Singapore versus the United States identifies many more similarities than dissimilarities. The Singapore sample consisted of 1,179 persons (31% of the entire I.S. population). The U.S. database comprises more than 8,000 persons. Similarities were statistically significant on 9 of 12 factors compared for systems analysts and on five of 13 factors compared for programmers. On six of the eight factors where programmers are significantly different, changes underway have a strong likelihood of eliminating those differences. On the most important factor that distinguishes I.S. professionals in the U.S. from other U.S. professionals, individual growth need strength (GNS), Singaporean I.S. professionals are not significantly different from their U.S. counterparts. This is the first of six studies comparing American I.S. professionals to I.S. professionals in Singapore; it finds that Singaporean I.S. professionals perceive motivational issues much like their American counterparts.
Millimetre wave imaging for concealed target detection
Concealed weapon detection (CWD) has been a hot topic as concern about public safety increases. A variety of approaches for the detection of concealed objects on the human body have been suggested and developed, based on earth magnetic field distortion, inductive magnetic field, acoustic and ultrasonic, electromagnetic resonance, MMW (millimetre wave), THz, infrared, and x-ray technologies. Among them, MMW holographic imaging is considered a promising approach due to the relatively high penetration and high resolution that it can offer. Typical concealed target detection methods fall into two categories: resonance-based target identification techniques and imaging-based systems. In the former, the complex natural resonance (CNR) frequencies associated with a given target are extracted and used for identification, but this technique suffers from a high false alarm rate. Microwave/millimetre wave imaging systems can in turn be categorized into two types: passive systems and active systems. For active microwave/millimetre wave imaging, the microwave holographic imaging approach was adopted in this thesis. Such a system can operate at either a single frequency or multiple frequencies (wide band). An active, coherent, single-frequency millimetre wave imaging system based on the theory of microwave holography was developed. Drawing on literature surveys and first-hand experimental results, this thesis aims to provide system-level parameter determination to aid the development of a target detection imager. The goal is approached step by step in seven chapters, with topics and issues addressed ranging from reviewing past work; identifying the best candidate technology, i.e. MMW holographic imaging combined with the resonance-based target recognition technique; constructing the 94 GHz MMW holographic prototype imager; experimentally investigating trade-offs among system parameters; evaluating imager performance; examining low-profile components and image enhancement techniques; and assessing the feasibility of the resonance-based technique, to a system implementation based on the parameters and results achieved. The task set forth at the beginning is completed with an entire system design at the end.
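For context, the single-frequency reconstruction that this style of imager typically relies on (standard in the microwave holography literature; the notation below is assumed, not quoted from the thesis) back-propagates the complex field s(x, y) measured on the aperture plane at standoff distance z_0:

```latex
% Standard single-frequency holographic reconstruction (assumed notation):
% s(x,y): measured complex field on the aperture plane, z_0: standoff
% distance, k = 2\pi/\lambda: free-space wavenumber.
f(x,y) \;=\; \mathcal{F}^{-1}_{2\mathrm{D}}\!\left[\,\mathcal{F}_{2\mathrm{D}}\{s(x,y)\}\; e^{-\mathrm{j}\,k_z z_0}\right],
\qquad k_z \;=\; \sqrt{4k^2 - k_x^2 - k_y^2}.
```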
SSTAC/ARTS Review of the Draft Integrated Technology Plan (ITP). Volume 2: Propulsion Systems
The topics addressed are: (1) space propulsion technology program overview; (2) space propulsion technology program fact sheet; (3) low thrust propulsion; (4) advanced propulsion concepts; (5) high-thrust chemical propulsion; (6) cryogenic fluid management; (7) NASA CSTI earth-to-orbit propulsion; (8) advanced main combustion chamber program; (9) earth-to-orbit propulsion turbomachinery; (10) transportation technology; (11) space chemical engines technology; (12) nuclear propulsion; (13) spacecraft on-board propulsion; and (14) low-cost commercial transport.
A computer-aided design for digital filter implementation
GraphflowDB: Scalable Query Processing on Graph-Structured Relations
Finding patterns over graph-structured datasets is ubiquitous and integral to a wide range of analytical applications, e.g., recommendation and fraud detection. When expressed in the high-level query languages of database management systems (DBMSs), these patterns correspond to many-to-many join computations, which generate very large intermediate relations during query processing and degrade the performance of existing systems.
This thesis argues that modern query processors need to adopt two novel techniques to be efficient on growing many-to-many joins: (i) worst-case optimal join algorithms; and (ii) factorized representations. Traditional query processors generate join plans that use binary joins, which in each iteration take two relations, base or intermediate, and join them to produce a new relation. The theory of worst-case optimal joins has shown that this style of join processing can be provably suboptimal and hence generate unnecessarily large intermediate results. This can be avoided on cyclic join queries if the join is performed in a multi-way fashion, one join attribute at a time. As its first contribution, this thesis proposes the design and implementation of a query processor and optimizer that can generate plans mixing worst-case optimal joins (i.e., attribute-at-a-time joins) and binary joins (i.e., table-at-a-time joins). In contrast to prior approaches, whose novel join optimizers require solving hard computational problems, such as computing low-width hypertree decompositions of queries, our join optimizer is cost-based and uses a traditional dynamic programming approach with a new cost metric.
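To illustrate the contrast, here is a toy attribute-at-a-time evaluation of the cyclic triangle query R(a,b) JOIN S(b,c) JOIN T(c,a); it is an illustrative sketch of the general idea, not GraphflowDB's operator code:

```python
# Toy attribute-at-a-time (worst-case optimal style) join for the triangle
# query Q(a,b,c) = R(a,b) JOIN S(b,c) JOIN T(c,a). Illustrative sketch only.

def triangles(R, S, T):
    # Index each relation for constant-time extension lookups.
    R_by_a, S_by_b, T_pairs = {}, {}, set(T)
    for a, b in R:
        R_by_a.setdefault(a, set()).add(b)
    for b, c in S:
        S_by_b.setdefault(b, set()).add(c)

    out = []
    # Extend one join attribute at a time: a, then b, then c.
    for a in R_by_a:                      # bind a
        for b in R_by_a[a]:               # bind b, consistent with R(a,b)
            for c in S_by_b.get(b, ()):   # bind c, consistent with S(b,c)
                if (c, a) in T_pairs:     # check T(c,a) before emitting
                    out.append((a, b, c))
    return out

# A binary-join plan would first materialise R JOIN S, which can be far
# larger than the final result; this plan never builds that intermediate.
print(triangles(R={(1, 2)}, S={(2, 3)}, T={(3, 1)}))  # [(1, 2, 3)]
```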
On acyclic queries, or acyclic parts of queries, the generation of large intermediate results sometimes cannot be avoided. Yet the theory of factorization has shown that such intermediate results are often highly compressible if they contain multi-valued dependencies between join attributes. Factorization proposes two relation representation schemes, called f- and d-representations, to represent the large intermediate results generated under many-to-many joins in a compressed format. Existing proposals to adopt factorized representations require designing processing on fully materialized general tries and novel operators that operate on entire tries, which are not easy to adopt in existing systems. As a second contribution, we describe the implementation of a novel query processing approach we call factorized vector execution, which adopts f-representations. Factorized vector execution extends traditional vectorized query processors to use multiple blocks of vectors instead of a single block, allowing us to factorize intermediate results and delay or even avoid Cartesian products. Importantly, our design ensures that every core operator in the system still performs computations on vectors. As a third contribution, we further describe how to extend our factorized vector execution model with novel operators to adopt d-representations, which extend f-representations with cached and reused sub-relations. Our design here is based on nested hash tables that can point to sub-relations instead of copying them, and on directed acyclic graph-based query plans.
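As a toy illustration of why f-representation-style factorization compresses many-to-many joins, consider the two-hop pattern (a)->(b)->(c): a hub with n in-neighbours and m out-neighbours contributes n*m flat tuples but only n+m factorized entries. The sketch below is illustrative only; GraphflowDB's factorized vectors are richer than this:

```python
# Flat join result vs. an f-representation-style factorized result for the
# two-hop pattern (a)->(b)->(c) over an edge list. Illustrative sketch only.

from collections import defaultdict
from itertools import product

edges = [("u1", "h"), ("u2", "h"), ("h", "v1"), ("h", "v2"), ("h", "v3")]

out_nbrs, in_nbrs = defaultdict(list), defaultdict(list)
for src, dst in edges:
    out_nbrs[src].append(dst)
    in_nbrs[dst].append(src)

# Flat: one tuple per (a, b, c) combination, i.e., n*m tuples per hub b.
flat = [(a, b, c) for b in in_nbrs if b in out_nbrs
        for a, c in product(in_nbrs[b], out_nbrs[b])]

# Factorized: per b, keep the two independent vectors, n+m entries per hub.
factorized = {b: (in_nbrs[b], out_nbrs[b]) for b in in_nbrs if b in out_nbrs}

print(len(flat))        # 6 flat tuples (2 * 3)
print(factorized["h"])  # (['u1', 'u2'], ['v1', 'v2', 'v3'])
```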
All of our techniques are implemented in the GraphflowDB system, which was developed over the years to facilitate the research in this thesis. We demonstrate that GraphflowDB's query processor can outperform existing approaches and systems by orders of magnitude on both micro-benchmarks and end-to-end benchmarks. The designs proposed in this thesis adopt the common-wisdom query processing techniques of pipelining, vector-based execution, and morsel-driven parallelism to ensure easy adoption in existing systems. We believe the design can serve as a blueprint for how to adopt these techniques in existing DBMSs to make them more efficient on workloads with many-to-many joins.
MS FT-2-2 7 Orthogonal polynomials and quadrature: Theory, computation, and applications
Quadrature rules find many applications in science and engineering. Their analysis is a classical area of applied mathematics and continues to attract considerable attention. This seminar brings together speakers with expertise in a large variety of quadrature rules. It is the aim of the seminar to provide an overview of recent developments in the analysis of quadrature rules. The computation of error estimates and novel applications are also described.
Generalized averaged Gaussian quadrature and applications
A simple numerical method for constructing the optimal generalized averaged Gaussian quadrature formulas will be presented. These formulas exist in many cases in which real positive Gauss-Kronrod formulas do not exist, and can be used as an adequate alternative to estimate the error of a Gaussian rule. We also investigate the conditions under which the optimal averaged Gaussian quadrature formulas and their truncated variants are internal.
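In symbols, the error-estimation use mentioned here takes the familiar form, with G_n the n-point Gauss rule for a measure dλ and Ĝ_{2n+1} the (2n+1)-node optimal generalized averaged rule (notation assumed, not taken from the abstract):

```latex
% Error of the n-point Gauss rule estimated by the generalized averaged rule:
\int f(t)\,\mathrm{d}\lambda(t) \;-\; G_n f \;\approx\; \widehat{G}_{2n+1} f \;-\; G_n f .
```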
Intensely distributed nanoscience: co-ordinating scientific work in a large multi-sited cross-disciplinary nanomedical project
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London.

This thesis is concerned with the study of biomedical scientific research work that is intensely distributed, i.e. socially distributed across multiple institutions, sites, and disciplines. Specifically, this PhD probes the ways in which scientists co-operating on multi-sited cross-disciplinary projects design, use and maintain information-based resources to conduct and co-ordinate their experimental activities. The research focuses on the roles of information artefacts, i.e. the tools, media and devices used to store, track, display, and retrieve information in paper or electronic format, in helping the scientists integrate their activities to achieve concerted action.
To examine how scientists in globally distributed settings organise and co-ordinate their scientific work using information artefacts, a multi-method multi-sited study informed by different ethnographic perspectives was conducted, focused on a large European cross-disciplinary translational research project in nanodiagnostics. Situated interviews with project scientists, participant observations and participatory learning exercises were designed and deployed. From the data analysis, several abstractions were developed to represent how the joint utilisation of key information artefacts supports the co-ordination of experimental activities. Subsequently, a framework was developed to highlight key interactional strategies that need to be managed by experimenters when using artefacts to organise their work co-operatively. This framework was then used as a guiding device to identify innovative ways to design future digital interactive systems to support the co-ordination of intensely distributed scientific work.
From this study, several key findings came to light. We identify that the experimental protocol acts as a co-ordinative map that is co-designed dynamically to disseminate various instantiations of experimental executions across sites. We have also shed light on the ways the protocol, the lab book and the material log are used jointly to support the articulation of scientific work. The protocol and the lab book are used both locally and across co-operating sites to support four repeatability and reproducibility levels that are key to experimental validation. The use of the local protocol/lab book dyads at each site is further integrated with that of a centralised material log artefact to enable a system of exchange of scientific content (e.g. experimental processes, intermediate results and observations) and experimental materials (both physical materials and key information). We have found that this integration into a co-ordinative cluster supports awareness and the articulation of experimental activities both locally and across remote labs. From this understanding, we have derived several sensitising tensions to frame the strategies that scientific practitioners need to manage when designing their multi-sited experimental work, and that technologists should consider when designing systems to support them: (1) formalisation / flexibility; (2) articulability / local appropriateness; (3) scrutiny / tinkering; (4) accountability / applicability; (5) traceability / improvisation; and (6) lastingness / immediacy. Lastly, based on these tensions, we have suggested a number of implications for the design of interactive information artefacts that can help manage both local and multi-sited co-ordination in intensely distributed scientific projects.