Document semantics: Two approaches
SGML introduced the DTD concept to formally describe a document's syntax and structure.
One of its main characteristics is that it is purely declarative
and fully independent of the document's future processing (typesetting,
formatting, translation/transformation).
In this context, SGML has become the international standard to follow.
Sooner or later, a document has to be processed. In order to do that, we
need to associate semantics with the document's structure.
In a compiler context, semantics is normally separated into two kinds: static and
dynamic.
Drawing a parallel with document processing, we can regard the
document's decorated tree (as recognized by an SGML analyzer) as the
static semantics, and the document tree's transformation and/or reaction
as the dynamic semantics.
Pursuing this idea, we present and discuss a study of the
relationship between SGML, DAST (Decorated Abstract
Syntax Tree), and algebraic specification tools, in order to better
understand how to formally process documents in general and how to
specify and build generic document processing tools.
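The separation described above can be sketched in a few lines: a decorated tree carries the attributes computed by the analyzer (static semantics), while interchangeable tag-indexed actions supply the processing (dynamic semantics). This is a minimal illustrative sketch, not the paper's formalism; all names (`Node`, `process`, the `to_text` actions) are assumptions.

```python
# Hypothetical sketch: a decorated document tree (static semantics) processed
# by pluggable, tag-indexed actions (dynamic semantics). Illustrative only.

class Node:
    def __init__(self, tag, children=None, attrs=None):
        self.tag = tag                  # element name, as a DTD would declare it
        self.children = children or []  # ordered sub-elements
        self.attrs = attrs or {}        # decorations computed by the analyzer

def process(node, actions):
    """Apply a tag-indexed action to each node, bottom-up (tree transformation)."""
    results = [process(child, actions) for child in node.children]
    return actions.get(node.tag, lambda n, rs: rs)(node, results)

# One possible interpretation ("typesetting"), defined apart from the tree itself.
doc = Node("para", [Node("text", attrs={"content": "Hello"}),
                    Node("emph", [Node("text", attrs={"content": "world"})])])
to_text = {
    "text": lambda n, rs: n.attrs["content"],
    "emph": lambda n, rs: "*" + "".join(rs) + "*",
    "para": lambda n, rs: " ".join(rs) + "\n",
}
print(process(doc, to_text))  # -> Hello *world*
```

Swapping `to_text` for another action table (e.g. one emitting HTML) changes the dynamic semantics without touching the tree, which is the independence the abstract emphasizes.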
Rutgers CAM2000 chip architecture
This report describes the architecture and instruction set of the Rutgers CAM2000 memory chip. The CAM2000 combines features of Associative Processing (AP), Content Addressable Memory (CAM), and Dynamic Random Access Memory (DRAM) in a single chip package that is not only DRAM-compatible but also capable of applying simple massively parallel operations to memory. This document reflects the current status of the CAM2000 architecture and is continually updated as the architecture and instruction set evolve.
Modelling, Visualising and Summarising Documents with a Single Convolutional Neural Network
Capturing the compositional process which maps the meaning of words to that
of documents is a central challenge for researchers in Natural Language
Processing and Information Retrieval. We introduce a model that is able to
represent the meaning of documents by embedding them in a low dimensional
vector space, while preserving distinctions of word and sentence order crucial
for capturing nuanced semantics. Our model is based on an extended Dynamic
Convolutional Neural Network, which learns convolution filters at both the
sentence and document level, hierarchically learning to capture and compose low
level lexical features into high level semantic concepts. We demonstrate the
effectiveness of this model on a range of document modelling tasks, achieving
strong results with no feature engineering and with a more compact model.
Inspired by recent advances in visualising deep convolutional networks for
computer vision, we present a novel visualisation technique for our document
networks which not only provides insight into their learning process, but
can also be interpreted to produce a compelling automatic summarisation
system for texts.
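The hierarchical scheme described above can be sketched as follows: filters convolve word vectors within each sentence, and the resulting sentence vectors are convolved again to form a document embedding. This is a minimal numpy illustration under assumed dimensions, filter counts, and max-over-time pooling; it is not the paper's exact architecture.

```python
# Illustrative sketch of sentence-level then document-level convolution.
# All dimensions, the pooling choice, and the tanh nonlinearity are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def conv_and_pool(seq, filters):
    """Narrow 1-D convolution along the sequence, then max-over-time pooling."""
    n_filters, width, dim = filters.shape
    feats = []
    for f in filters:
        # Feature map: one response per window of `width` consecutive items.
        fm = [float(np.sum(seq[i:i + width] * f)) for i in range(len(seq) - width + 1)]
        feats.append(max(fm))               # keep the strongest response
    return np.tanh(np.array(feats))         # fixed-size vector: (n_filters,)

sent_filters = rng.normal(size=(3, 2, 4))   # 3 filters over word vectors (dim 4)
doc_filters = rng.normal(size=(6, 2, 3))    # 6 filters over sentence vectors (dim 3)

doc = [rng.normal(size=(5, 4)) for _ in range(3)]  # 3 sentences of 5 word vectors
sent_vecs = np.stack([conv_and_pool(s, sent_filters) for s in doc])  # (3, 3)
doc_vec = conv_and_pool(sent_vecs, doc_filters)                      # (6,)
print(doc_vec.shape)  # (6,)
```

Because pooling collapses variable-length sequences to fixed-size vectors at each level, documents with different sentence counts and lengths all map into the same low-dimensional space.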
A model-driven approach to broaden the detection of software performance antipatterns at runtime
Performance antipatterns document bad design patterns that have a negative
influence on system performance. In our previous work we formalized such
antipatterns as logical predicates defined over four views: (i) the static
view, which captures the software elements (e.g. classes, components) and the
static relationships among them; (ii) the dynamic view, which represents the
interactions (e.g. messages) that occur between the software entities
to provide the system functionalities; (iii) the deployment view, which describes
the hardware elements (e.g. processing nodes) and the mapping of the software
entities onto the hardware platform; (iv) the performance view, which collects
specific performance indices. In this paper we present a lightweight
infrastructure that is able to detect performance antipatterns at runtime
through monitoring. The proposed approach pre-evaluates such predicates,
identifies the antipatterns whose static, dynamic and deployment sub-predicates are
satisfied by the current system configuration, and defers the
verification of the performance sub-predicates to runtime. The proposed infrastructure
leverages model-driven techniques to generate probes for monitoring the
performance sub-predicates and detecting antipatterns at runtime.
Comment: In Proceedings FESCA 2014, arXiv:1404.043
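The detection scheme described above can be sketched as a conjunction of sub-predicates over the four views, where the static, dynamic and deployment parts are filtered offline against the current configuration and only the performance part is checked at runtime. The antipattern, its thresholds, and all names below are illustrative assumptions, not predicates from the paper.

```python
# Hypothetical sketch: an antipattern as a conjunction of view sub-predicates.
# Thresholds and keys are invented for illustration.

def blob_antipattern(static, dynamic, deployment, performance):
    """'Blob'-style check: a component with many relationships and heavy
    messaging that also shows high utilisation in the monitored indices."""
    structural = (static["links"] > 10            # static view: many relationships
                  and dynamic["messages"] > 100   # dynamic view: heavy interaction
                  and deployment["shares_node"])  # deployment view: co-located entities
    if not structural:
        return False   # filtered out offline: no runtime probe needed
    # Performance sub-predicate, verified at runtime via monitoring probes.
    return performance["utilisation"] > 0.8

config = {"links": 12, "messages": 250, "shares_node": True}
monitored = {"utilisation": 0.93}
print(blob_antipattern(config, config, config, monitored))  # True
```

The early `return False` mirrors the paper's motivation for pre-evaluation: candidates whose structural sub-predicates fail the current configuration never need runtime monitoring at all.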
Revisiting the Provision of Nanoscale Precision of Cutting on the Basis of Dynamic Characteristics Modeling of Processing Equipment
The article deals with the issues related to the development of processing equipment providing nanoscale cutting precision in turning and milling. A dynamic model of the machine is built to solve this task. This makes it possible to take into account the dynamic characteristics of the existing or
designed equipment and the errors of the machine's dynamic setting, and thereby to provide processing precision in the nanometer range.
When citing the document, use the following link: http://essuir.sumdu.edu.ua/handle/123456789/3634