Spatial agency practice in Tai O Village: Colonial legacies and spatial-architectural approaches to collaborative urban futures
This paper discusses spatial agency practice within a living lab in Hong Kong. Lab members work in Tai O Village, a historic fishing settlement receiving increased attention due to its remnant vernacular housing. The article presents the historical and policy context for ongoing casework conducted with stakeholders in Tai O: a brief history of the village, recent policy developments, and the inherent conflicts arising from the interaction of the two. The third section of the article describes informal settlement land tenure conflicts as a historical phenomenon in Hong Kong. The paper follows this case-specific discussion with a review of the global literature on selected regularisation and settlement upgrading efforts from around the world. These reviews support the article's thesis that third sector and design-led efforts are critically applicable methods for addressing informal settlement conflicts that persist due to colonial legacy policies and political inertia. The final section presents ongoing living lab research and initiatives, including collaborative monitoring projects and strategic development proposals. Each living lab initiative presented elaborates the article's thesis on the interaction between architecture, research, and governance in negotiating complex development transitions. The article contributes to architectural scholarship by summarising the unique interactions between history, policy, economics, and demography that engendered the development situation in Tai O. Further, it reflects upon response development methods through architectural science and spatial agency practice, including the role of architectural representation products and discursive distinctions at the boundaries between architectural practice and spatial agency practice.
Educational Technology as Seen Through the Eyes of the Readers
In this paper, I present the evaluation of a novel knowledge domain
visualization of educational technology. The interactive visualization is based
on readership patterns in the online reference management system Mendeley. It
comprises 13 topic areas, spanning psychological, pedagogical, and
methodological foundations, learning methods and technologies, and social and
technological developments. The visualization was evaluated with (1) a
qualitative comparison to knowledge domain visualizations based on citations,
and (2) expert interviews. The results show that the co-readership
visualization provides an up-to-date representation of pedagogical and psychological
research in educational technology. Furthermore, the co-readership analysis
covers more areas than comparable visualizations based on co-citation patterns.
Areas related to computer science, however, are missing from the co-readership
visualization and more research is needed to explore the interpretations of
size and placement of research areas on the map.
Comment: Forthcoming article in the International Journal of Technology Enhanced Learning
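The co-readership analysis underlying such a visualization can be illustrated with a minimal sketch: counting, for every pair of documents, how many readers have saved both to their libraries. The document IDs and toy libraries below are hypothetical, not Mendeley data or API calls:

```python
from collections import Counter
from itertools import combinations

def co_readership(libraries):
    """Count, for each pair of documents, how many readers saved both."""
    pairs = Counter()
    for docs in libraries:
        # Each unordered pair within one reader's library co-occurs once.
        for a, b in combinations(sorted(set(docs)), 2):
            pairs[(a, b)] += 1
    return pairs

# Toy libraries for three readers (document IDs are hypothetical).
libs = [["d1", "d2", "d3"], ["d1", "d2"], ["d2", "d3"]]
print(co_readership(libs)[("d1", "d2")])  # 2: two readers saved both d1 and d2
```

Pairs with high counts are treated as similar, and the resulting similarity matrix is what a clustering or map-layout step would consume.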
Fast, Sparse Matrix Factorization and Matrix Algebra via Random Sampling for Integral Equation Formulations in Electromagnetics
Many systems designed by electrical & computer engineers rely on electromagnetic (EM) signals to transmit, receive, and extract either information or energy. In many cases, these systems are large and complex. Their accurate, cost-effective design requires high-fidelity computer modeling of the underlying EM field/material interaction problem in order to find a design with acceptable system performance. This modeling is accomplished by projecting the governing Maxwell equations onto finite dimensional subspaces, which results in a large matrix equation representation (Zx = b) of the EM problem. In the case of integral equation-based formulations of EM problems, the M-by-N system matrix, Z, is generally dense. For this reason, when treating large problems, it is necessary to use compression methods to store and manipulate Z. One such sparse representation is provided by so-called H^2 matrices. At low-to-moderate frequencies, H^2 matrices provide a controllably accurate data-sparse representation of Z.
The scale at which problems in EM are considered "large" is continuously being redefined upward. This growth in problem scale is happening not only in EM but across all other sub-fields of computational science as well. The pursuit of increasingly large problems is unwavering in all these sub-fields, and it has long outpaced the rate of advancement in the processing and storage capabilities of computing hardware. Computational science communities now face the limitations of the standard linear algebraic methods that have been relied upon for decades to run quickly and efficiently on modern hardware: this common set of algorithms can only produce reliable results quickly and efficiently for small to mid-sized matrices that fit into the memory of the host computer. The drive to pursue larger problems has even begun to outpace the reasonable capabilities of these common numerical algorithms; the deterministic numerical linear algebra algorithms that have carried matrix computation this far have proven inadequate for many problems of current interest. Computational science communities are therefore focusing on improvements in their mathematical and software approaches to push advancement further. Randomized numerical linear algebra (RandNLA) is an emerging area that both academia and industry believe to be a strong candidate for overcoming the limitations faced when solving massive and computationally expensive problems.
This thesis presents results of recent work that uses a random sampling method (RSM) to implement algebraic operations involving multiple H^2 matrices. Significantly, this work is done in a manner that is non-invasive to an existing H^2 code base for filling and factoring H^2 matrices. The work presented thus expands the existing code's capabilities with minimal impact on existing (and well-tested) applications. In addition to this work with randomized H^2 algebra, improvements in sparse factorization methods for the compressed H^2 data structure are also presented. The reported developments in filling and factoring H^2 data structures assist in, and allow for, the further pursuit of large and complex problems in computational EM (CEM) within simulation code bases that utilize the H^2 data structure.
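As a rough illustration of the RandNLA idea behind such sampling methods (not the thesis's actual algorithm), the sketch below compresses a smooth, well-separated interaction block, the kind of far-field block an H^2 representation stores in low-rank form, using a randomized range finder; the kernel and sizes are invented for the example:

```python
import numpy as np

def randomized_lowrank(A, k, oversample=5, seed=0):
    """Randomized range finder: A ≈ Q @ B with k + oversample columns."""
    rng = np.random.default_rng(seed)
    omega = rng.standard_normal((A.shape[1], k + oversample))  # random test matrix
    Q, _ = np.linalg.qr(A @ omega)  # orthonormal basis for the sampled column space
    return Q, Q.T @ A               # store (m x k') and (k' x n) factors instead of A

# A smooth kernel over well-separated point sets is numerically low-rank,
# which is exactly why far-field blocks compress well.
x = np.linspace(0.0, 1.0, 200)
A = 1.0 / (2.0 + np.subtract.outer(x, x))  # illustrative interaction kernel
Q, B = randomized_lowrank(A, k=15)
print(np.linalg.norm(A - Q @ B) / np.linalg.norm(A))  # small relative error
```

Storing the two thin factors replaces 200 x 200 entries with 2 x (200 x 20), and the same sampling idea extends to sums and products of compressed blocks without touching how the blocks were originally filled.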
Visualisation techniques, human perception and the built environment
Historically, architecture has developed a wealth of visualisation techniques throughout the evolution of structural design, with Virtual Reality (VR) being a relatively recent addition to the toolbox. To date, the effectiveness of VR has been demonstrated from conceptualisation through to final stages and maintenance; however, its full potential has yet to be realised (Bouchlaghem et al., 2005). According to Dewey (1934), perceptual integration was predicted to be transformational, as the observer would be able to 'engage' with the virtual environment. However, environmental representations predominantly focus on vision, despite evidence that the experience is multisensory. In addition, there is a marked lack of research exploring the complex interaction between environmental design and the user, such as the role of attention or conceptual interpretation. This paper identifies the potential of VR models to aid communication for the built environment, with specific reference to human perception issues.
Enhanced LFR-toolbox for MATLAB and LFT-based gain scheduling
We describe recent developments and enhancements of the LFR-Toolbox for MATLAB for building LFT-based uncertainty models and for LFT-based gain scheduling. A major development is the new LFT-object definition, which supports a large class of uncertainty descriptions: continuous- and discrete-time uncertain models, regular and singular parametric expressions, and more general uncertainty blocks (nonlinear, time-varying, etc.). By associating names with uncertainty blocks, the reusability of generated LFT-models and the user-friendliness of manipulating LFR-descriptions have been greatly increased. Significant gains in computational efficiency and numerical accuracy have been achieved by employing efficient and numerically robust Fortran implementations of order reduction tools via mex-function interfaces. These enhancements, in conjunction with improved symbolic preprocessing, generally lead to faster generation of LFT-models with significantly lower orders. Scheduled gains can be viewed as LFT-objects, and two techniques for designing such gains are presented. Analysis tools are also considered.
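As background, a linear fractional transformation (LFT) closes an uncertainty or scheduling block Δ around a partitioned coefficient matrix. A minimal sketch of evaluating an upper LFT, with illustrative matrices rather than the toolbox's actual objects, is:

```python
import numpy as np

def upper_lft(M11, M12, M21, M22, Delta):
    """Evaluate the upper LFT F_u(M, Δ) = M22 + M21 Δ (I - M11 Δ)^{-1} M12."""
    I = np.eye(M11.shape[0])
    # solve() forms (I - M11 Δ)^{-1} M12 without an explicit inverse.
    return M22 + M21 @ Delta @ np.linalg.solve(I - M11 @ Delta, M12)

# Scalar illustration: M11 = 0.5, M12 = M21 = 1, M22 = 2 realises the
# parameter-dependent gain 2 + δ/(1 - 0.5δ), which equals 4 at δ = 1.
M11, M12, M21, M22 = (np.array([[v]]) for v in (0.5, 1.0, 1.0, 2.0))
print(upper_lft(M11, M12, M21, M22, np.array([[1.0]]))[0, 0])  # 4.0
```

A scheduled gain in this framework is just such an object whose Δ collects the scheduling parameters, so evaluating the gain at an operating point reduces to one LFT evaluation.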
Supporting collaboration within the eScience community
Collaboration is a core activity at the heart of large-scale co-operative scientific experimentation. In order to support the emergence of Grid-based scientific collaboration, new models of e-Science working methods are needed.

Scientific collaboration involves the production and manipulation of various artefacts. Based on work done in the software engineering field, this paper proposes models and tools to support the representation and production of such artefacts. It is necessary to provide facilities to classify, organise, acquire, process, share, and reuse artefacts generated during collaborative working. The concept of a "design space" will be used to organise scientific design and the composition of experiments, and methods such as self-organising maps will be used to support the reuse of existing artefacts.

It is proposed that this work can be carried out and evaluated in the UK e-Science community, using an "industry as laboratory" approach to the research, building on the knowledge, expertise, and experience of those directly involved in e-Science.
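The self-organising maps mentioned in the abstract can be sketched minimally as follows; the artefact "features" here are hypothetical numeric vectors, since the paper does not fix a representation:

```python
import numpy as np

def train_som(data, grid=(4, 4), iters=500, lr0=0.5, sigma0=1.5, seed=0):
    """Minimal self-organising map: place high-dimensional items on a 2-D grid."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)
    weights = rng.standard_normal((rows * cols, data.shape[1]))
    for t in range(iters):
        x = data[rng.integers(len(data))]
        bmu = int(np.argmin(((weights - x) ** 2).sum(axis=1)))  # best-matching unit
        frac = t / iters
        lr = lr0 * (1.0 - frac)              # decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 0.1  # shrinking neighbourhood radius
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        h = np.exp(-d2 / (2.0 * sigma ** 2))  # Gaussian neighbourhood kernel
        weights += lr * h[:, None] * (x - weights)
    return weights, coords

# Two artificial clusters of artefact feature vectors should land on
# different grid units, so nearby units hold reusable, similar artefacts.
rng = np.random.default_rng(1)
data = np.vstack([np.zeros((20, 3)), np.ones((20, 3))])
data += 0.05 * rng.standard_normal((40, 3))
weights, coords = train_som(data)
```

After training, finding artefacts similar to a query is a nearest-unit lookup on the grid, which is what makes the map useful for browsing and reuse.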
Computer mediated colour fidelity and communication
Developments in technology have meant that computer-controlled imaging devices are becoming more powerful and more affordable. Despite their increasing prevalence, computer-aided design and desktop publishing software has failed to keep pace, leading to disappointing colour reproduction across different devices. Although there has been a recent drive to incorporate colour management functionality into modern computer systems, in general this is limited in scope and fails to properly consider the way in which colours are perceived. Furthermore, differences in viewing conditions or representation severely impede the communication of colour between groups of users.

The approach proposed here is to provide WYSIWYG colour across a range of imaging devices through a combination of existing device characterisation and colour appearance modelling techniques. In addition, to further facilitate colour communication, various common colour notation systems are defined by a series of mathematical mappings. This enables both the implementation of computer-based colour atlases (which have a number of practical advantages over physical specifiers) and the interrelation of colour represented in hitherto incompatible notations.

Together with the proposed solution, details are given of a computer system which has been implemented. The system was used by textile designers for a real task. Prior to undertaking this work, designers were interviewed in order to ascertain where colour played an important role in their work and where it was found to be a problem. A summary of the findings of these interviews is given, together with a survey of existing approaches to the problems of colour fidelity and communication in colour computer systems. As background to this work, the topics of colour science and colour imaging are introduced.
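The device characterisation step mentioned above can be illustrated with the fixed sRGB characterisation: gamma-decode the device values, then apply a 3x3 linear map into the device-independent CIE XYZ space. Real devices require measured profiles, and the appearance-modelling stage is omitted here:

```python
import numpy as np

# Standard sRGB (D65) linear-RGB to CIE XYZ matrix.
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])

def srgb_to_xyz(rgb):
    """Characterise an sRGB display: gamma-decode, then map linear RGB to XYZ."""
    rgb = np.asarray(rgb, dtype=float)
    linear = np.where(rgb <= 0.04045,
                      rgb / 12.92,
                      ((rgb + 0.055) / 1.055) ** 2.4)
    return SRGB_TO_XYZ @ linear

print(srgb_to_xyz([1.0, 1.0, 1.0]))  # ≈ D65 white point (0.9505, 1.0000, 1.0890)
```

Chaining one device's forward characterisation with another's inverse is the basic mechanism behind cross-device WYSIWYG colour; the appearance model then corrects for viewing conditions on top of this.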