On Birthing Dancing Stars: The Need for Bounded Chaos in Information Interaction
While computers causing chaos is a common social trope, nearly the entire history of computing is dedicated to generating order. Typical interactive information retrieval tasks ask computers to support the traversal and exploration of large, complex information spaces. The implicit assumption is that they are to support users in simplifying the complexity (i.e., in creating order from chaos). But for some types of task, particularly those that involve the creative application or synthesis of knowledge or the creation of new knowledge, this assumption may be incorrect. It is increasingly evident that perfect order—and the systems we create with it—supports highly-structured information tasks well, but provides poor support for less-structured tasks. We need digital information environments that help create a little more chaos from order to spark creative thinking and knowledge creation. This paper argues for the need for information systems that offer what we term 'bounded chaos', and offers research directions that may support the creation of such interfaces.
From 3D Models to 3D Prints: an Overview of the Processing Pipeline
Due to the wide diffusion of 3D printing technologies, geometric algorithms
for Additive Manufacturing are being invented at an impressive speed. Each
single step, in particular along the Process Planning pipeline, can now count
on dozens of methods that prepare the 3D model for fabrication, while analysing
and optimizing geometry and machine instructions for various objectives. This
report provides a classification of this huge state of the art, and elicits the
relation between each single algorithm and a list of desirable objectives
during Process Planning. The objectives themselves are listed and discussed,
along with possible needs for tradeoffs. Additive Manufacturing technologies
are broadly categorized to explicitly relate classes of devices and supported
features. Finally, this report offers an analysis of the state of the art while
discussing open and challenging problems from both an academic and an
industrial perspective.
Comment: European Union (EU); Horizon 2020; H2020-FoF-2015; RIA - Research and Innovation action; Grant agreement N. 68044
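Slicing, i.e. cutting the 3D model into planar layers, is one canonical step of the Process Planning pipeline that the report classifies. A minimal sketch of how one layer contour is extracted from a triangle mesh (illustrative only, not code from the report; it ignores degenerate vertex-on-plane cases):

```python
# Minimal planar slicing sketch: intersect a triangle mesh with the
# horizontal plane z = h to obtain the contour segments of one layer.

def slice_triangle(tri, h):
    """Return the segment where triangle `tri` crosses the plane z = h,
    or None if it does not cross. `tri` is three (x, y, z) vertices."""
    pts = []
    for i in range(3):
        (x1, y1, z1), (x2, y2, z2) = tri[i], tri[(i + 1) % 3]
        if (z1 - h) * (z2 - h) < 0:           # edge straddles the plane
            t = (h - z1) / (z2 - z1)          # linear interpolation factor
            pts.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
    return tuple(pts) if len(pts) == 2 else None

def slice_mesh(triangles, h):
    """Collect all intersection segments of the mesh at height h."""
    return [s for s in (slice_triangle(t, h) for t in triangles) if s]

# A unit tetrahedron sliced halfway up: the base never crosses the
# plane, while each of the three side faces contributes one segment.
tet = [
    ((0, 0, 0), (1, 0, 0), (0, 1, 0)),
    ((0, 0, 0), (1, 0, 0), (0, 0, 1)),
    ((1, 0, 0), (0, 1, 0), (0, 0, 1)),
    ((0, 1, 0), (0, 0, 0), (0, 0, 1)),
]
layer = slice_mesh(tet, 0.5)
print(len(layer))  # 3
```

Real slicers must additionally chain these segments into closed polygons and handle vertices lying exactly on the plane, which is where much of the algorithmic variety surveyed in the report lies.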
GeantV: Results from the prototype of concurrent vector particle transport simulation in HEP
Full detector simulation was among the largest CPU consumer in all CERN
experiment software stacks for the first two runs of the Large Hadron Collider
(LHC). In the early 2010's, the projections were that simulation demands would
scale linearly with luminosity increase, compensated only partially by an
increase of computing resources. The extension of fast simulation approaches to
more use cases, covering a larger fraction of the simulation budget, is only
part of the solution due to intrinsic precision limitations. The remainder
corresponds to speeding-up the simulation software by several factors, which is
out of reach using simple optimizations on the current code base. In this
context, the GeantV R&D project was launched, aiming to redesign the legacy
particle transport codes in order to make them benefit from fine-grained
parallelism features such as vectorization, but also from increased code and
data locality. This paper presents extensively the results and achievements of
this R&D, as well as the conclusions and lessons learnt from the beta
prototype.
Comment: 34 pages, 26 figures, 24 tables
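The core of the fine-grained parallelism the abstract mentions is "basketizing": grouping many particles so that the same operation is applied to a whole batch, exposing SIMD vectorization and improving data locality. A pure-NumPy illustration of the scalar-versus-vector contrast (a sketch of the idea, not GeantV code):

```python
# Sketch of basketized transport: step a whole group of particles with
# one array operation instead of looping over them one at a time.
import numpy as np

rng = np.random.default_rng(0)
n = 1024
pos = rng.uniform(-1.0, 1.0, size=(n, 3))            # particle positions
dirs = rng.normal(size=(n, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)  # unit directions

def step_scalar(pos, dirs, ds):
    """Legacy style: one particle advanced per loop iteration."""
    out = pos.copy()
    for i in range(len(pos)):
        out[i] = pos[i] + ds * dirs[i]
    return out

def step_vector(pos, dirs, ds):
    """Basketized style: the whole batch advanced in one operation."""
    return pos + ds * dirs

assert np.allclose(step_scalar(pos, dirs, 0.1), step_vector(pos, dirs, 0.1))
```

Both produce identical results; the batched form is what lets compilers and vector units do the work in wide registers, which is the effect GeantV set out to exploit in the full transport code.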
Cognitive test-bed for wireless sensor networks
Cognitive Wireless Sensor Networks are an emerging technology with a vast potential to avoid traditional wireless problems such as poor reliability, interference and spectrum scarcity in Wireless Sensor Networks. Cognitive Wireless Sensor Network test-beds are an important tool for future developments, protocol strategy testing and algorithm optimization in real scenarios. A new cognitive test-bed for Cognitive Wireless Sensor Networks is presented in this paper. This work in progress includes both the design of a cognitive simulator for networks with a high number of nodes and the implementation of a new platform with three wireless interfaces and cognitive software for extracting real data. Finally, as future work, a remote programmable system and the planning for the physical deployment of the nodes in the university building are presented.
Advanced engineering tools for next generation substation automation systems: the added value of IEC 61850 and the InPACT project
Automation systems according to IEC 61850 are a powerful solution for station automation. Engineering of such distributed systems is however a non-trivial task which requires different approaches and enhanced tool support. In this paper the authors (i) present how IEC 61850 is viewed and is being adopted by a utility and a vendor, (ii) discuss its engineering potential and current issues, (iii) point out global requirements for next generation tools, (iv) present the InPACT project, which is tackling some of these concerns, and (v) propose key elements of visual languages as one contributing enhancement.
A particle finite element method for fluid-related problems in civil engineering
The work presented in this Thesis is a set of developments focused on the Particle Finite Element Method (PFEM) and its applicability to several fields in Civil Engineering. The PFEM had already been proven to be a powerful tool for free surface flows with large deformations and domain separation, but its application to actual engineering problems requires many more advances. The interaction between the fluid and many solids in contact with each other, the erosion of soils and the transport of small particles are some of these advances, and they are the main topics addressed in this document. Apart from these, other developments related to the fluid solution are included, which are intended to go deeper than ever before into the practical use of the PFEM and its implementation. The PFEM is first introduced, and the author's developments improving the fluid dynamics solution, together with other simple capabilities added to it, are described. Three main chapters then focus on a) the fluid-structure interaction algorithm with contact, b) the erosion of soils and c) the transport of particles. Other applications of the method are then explained, along with a list of the research projects to which this thesis has been linked.
Measuring Interaction Design before Building the System: a Model-Based Approach
Early prototyping of user interfaces is an established good practice in interactive system development. However, prototypes cover only some usage scenarios, and questions dealing with the number of required steps, possible interaction paths or the impact of possible user errors can be answered only for the specific scenarios, and only after tedious manual inspection. We present a tool (MIGTool) that transforms models of the behavior of a user interface into a graph, upon which usage scenarios can be easily specified and used by MIGTool to compute possible interaction paths. Metrics based on possible paths, with or without user navigation errors, can then be computed. For example, when analyzing four mail applications, we show that Gmail has three times as many shortest routes, has twice as many routes that include a single user error, has routes with 13 fewer steps, but also has optimal routes with the smallest probability of being chosen. Without MIGTool, this kind of analysis could only be done after building some prototype of the system, and then only for specific scenarios by manually tracing user actions and the resulting changes to the screens. With MIGTool, the exploration of the suitability of a design with respect to different scenarios, or the comparison of different design alternatives against a single scenario, can be done with just a partial specification of the user interface behavior. This is made possible by the ability to associate scenario steps with required user actions as defined in the model, by an efficient strategy to identify complete execution traces that users can follow, and by computing a range of diverse metrics on these results.
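The underlying idea is standard graph search over a model of the interface: screens are nodes, user actions are edges, and the "number of required steps" metric is a shortest path. A small sketch with hypothetical screen names (not MIGTool's actual model or API):

```python
# Model a UI as a directed graph (screen -> screens reachable with one
# user action) and compute the fewest-actions route between two screens
# with breadth-first search. Screen names here are made up for
# illustration.
from collections import deque

ui = {
    "inbox":   ["compose", "message"],
    "message": ["inbox", "reply"],
    "compose": ["inbox", "sent"],
    "reply":   ["sent"],
    "sent":    ["inbox"],
}

def shortest_path(graph, start, goal):
    """BFS: return the path with the fewest user actions, or None."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(shortest_path(ui, "inbox", "sent"))  # ['inbox', 'compose', 'sent']
```

Enumerating all shortest routes, or routes containing exactly one wrong action, are variations of the same traversal, which is how path-based metrics like those reported for the four mail applications can be computed before any prototype exists.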
Inferring Concise Specifications of APIs
Modern software relies on libraries and uses them via application programming
interfaces (APIs). Correct API usage as well as many software engineering tasks
are enabled when APIs have formal specifications. In this work, we analyze the
implementation of each method in an API to infer a formal postcondition.
Conventional wisdom is that, if one has preconditions, then one can use the
strongest postcondition predicate transformer (SP) to infer postconditions.
However, SP yields postconditions that are exponentially large, which makes
them difficult to use, either by humans or by tools. Our key idea is an
algorithm that converts such exponentially large specifications into a form
that is more concise and thus more usable. This is done by leveraging the
structure of the specifications that result from the use of SP. We applied our
technique to infer postconditions for over 2,300 methods in seven popular Java
libraries. Our technique was able to infer specifications for 75.7% of these
methods, each of which was verified using an Extended Static Checker. We also
found that 84.6% of resulting specifications were less than 1/4 page (20 lines)
in length. Our technique was able to reduce the length of SMT proofs needed for
verifying implementations by 76.7% and reduced prover execution time by 26.7%.
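The exponential growth the abstract refers to comes from branching: under the strongest postcondition transformer, each two-way branch doubles the number of path disjuncts. A toy illustration of the growth, and of the intuition behind collapsing disjuncts that constrain the result identically (hypothetical names throughout; this is not the paper's algorithm):

```python
# Each two-way branch doubles the number of SP path disjuncts, so n
# sequential branches yield 2**n of them. Collapsing disjuncts with the
# same effect on the result is the intuition behind a concise spec.

def sp_paths(n_branches):
    """Enumerate path conditions for n independent two-way branches."""
    paths = [[]]
    for i in range(n_branches):
        paths = [p + [f"b{i}"] for p in paths] + \
                [p + [f"!b{i}"] for p in paths]
    return paths

def merge_equivalent(paths, effect):
    """Group paths whose (symbolic) effect on the result is identical."""
    groups = {}
    for p in paths:
        groups.setdefault(effect(p), []).append(p)
    return groups

paths = sp_paths(4)
print(len(paths))  # 16 disjuncts for just four branches

# If only branch b0 actually affects the result, the 16 disjuncts
# collapse into 2 equivalence classes: a concise two-case postcondition.
concise = merge_equivalent(paths, lambda p: "b0" in p)
print(len(concise))  # 2
```

The real technique works on the structured form of SP-generated predicates rather than on explicit path sets, but the payoff is the same: a specification a human or an SMT prover can digest.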
Kontextsensitivität für den Operationssaal der Zukunft (Context Awareness for the Operating Room of the Future)
The operating room of the future is a topic of high interest. In this thesis, which is among the first in the recently defined field of Surgical Data Science, three major topics for automated context awareness in the OR of the future are examined: improved surgical workflow analysis, the newly developed event impact factors, and, as an application combining these and other concepts, the unified surgical display.