Solving the TTC 2011 Compiler Optimization Task with metatools
The authors' "metatools" are a collection of tools for generic programming.
This includes generating Java sources from mathematically well-founded
specifications, as well as the creation of strictly typed document object
models for XML encoded texts. In this context, almost every computer-internal
structure is treated as a "model", and every computation is a kind of model
transformation.
This concept differs significantly from "classical model transformation"
executed by specialized tools and languages. Therefore it seemed promising to
the organizers of the TTC 2011, as well as to the authors, to apply metatools
to one of the challenges, namely to the "compiler optimization task". This is a
report on the resulting experiences.
Comment: In Proceedings TTC 2011, arXiv:1111.440
Spoon: Program Analysis and Transformation in Java
In this research report, we present Spoon, a framework for program transformation and static analysis in Java. More precisely, Spoon is an open and extensible Java compiler, written in pure Java by using compile-time reflection techniques. We take advantage of the new features added by Java 5, particularly annotations and generics. Using annotations within the Spoon framework allows the programmer to extend the Java language without defining new syntactic elements, and in such a way that it is naturally supported by IDEs for Java 5 and greater. Generics, as a priceless complement, allow for the well-typing of Spoon programs that implement the programmers' language extensions. Enforcing typing naturally provides better IDE support (such as static checks, completion, documentation, and navigation), and also allows us to define a pure Java template mechanism, which we use as a tool to define well-typed and straightforward program transformations. In addition to its basic transformation capabilities, Spoon comes with a partial evaluation engine that is used to calculate the control flow of the program and to simplify the results of template-based transformations for correctness, optimization, and readability. In order to demonstrate the usability and usefulness of our framework, we present three applications, which have been chosen to cover most of Spoon's features: a translator from Java 1.4 programs into well-typed Java 5 programs, an efficient template-based AOP extension, and an automatic implementation and validation of the visitor pattern.
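The core idea the abstract describes, selecting transformation targets through runtime-retained annotations rather than new syntax, can be sketched in plain Java. This is a minimal illustration of the mechanism, not Spoon's actual API; the annotation name and classes are hypothetical.

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

// Hypothetical annotation marking methods as targets of a language
// extension; illustrative only, not part of Spoon's real API.
@Retention(RetentionPolicy.RUNTIME)
@interface LogCalls {}

class Service {
    @LogCalls
    void run() {}
    void helper() {}
}

class AnnotationScan {
    // Collect the names of methods carrying the annotation, the way an
    // annotation-driven processor would select transformation targets.
    static List<String> annotatedMethods(Class<?> c) {
        List<String> out = new ArrayList<>();
        for (Method m : c.getDeclaredMethods())
            if (m.isAnnotationPresent(LogCalls.class)) out.add(m.getName());
        return out;
    }
}
```

Because the annotation is ordinary Java 5 syntax, any IDE understands it without special support, which is the point the abstract makes.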
Open Programming Language Interpreters
Context: This paper presents the concept of open programming language
interpreters and the implementation of a framework-level metaobject protocol
(MOP) to support them. Inquiry: We address the problem of dynamic interpreter
adaptation to tailor the interpreter's behavior to the task to be solved and to
introduce new features to fulfill unforeseen requirements. Many languages
provide a MOP that to some degree supports reflection. However, MOPs are
typically language-specific, their reflective functionality is often
restricted, and the adaptation and application logic are often mixed, which
hinders the understanding and maintenance of the source code. Our system
overcomes these limitations. Approach: We designed and implemented a system to
support open programming language interpreters. The prototype implementation is
integrated in the Neverlang framework. The system exposes the structure,
behavior and the runtime state of any Neverlang-based interpreter with the
ability to modify it. Knowledge: Our system provides complete control over an
interpreter's structure, behavior, and runtime state. The approach is
applicable to every Neverlang-based interpreter. Adaptation code can
potentially be reused across different language implementations. Grounding:
With a prototype implementation, we focused on evaluating feasibility. The
paper shows that our approach effectively addresses problems commonly found in the
research literature. We have a demonstrative video and examples that illustrate
our approach on dynamic software adaptation, aspect-oriented programming,
debugging and context-aware interpreters. Importance: To our knowledge, our
paper presents the first reflective approach targeting a general framework for
language development. Our system provides full reflective support for free to
any Neverlang-based interpreter. We are not aware of any prior application of
open implementations to programming language interpreters in the sense defined
in this paper. Rather than substituting other approaches, we believe our system
can be used as a complementary technique in situations where other approaches
present serious limitations.
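The "open implementation" idea described above, exposing an interpreter's behavior so it can be adapted from the outside without editing its source, can be approximated in a few lines of Java using a dynamic proxy. This is a heavily simplified stand-in for a framework-level MOP, with hypothetical names; the real system operates inside the Neverlang framework.

```java
import java.lang.reflect.Proxy;

// A toy interpreter interface standing in for a Neverlang-based one.
interface Interpreter { String eval(String expr); }

class BaseInterpreter implements Interpreter {
    public String eval(String expr) { return "result:" + expr; }
}

class OpenInterpreter {
    // Wrap an interpreter so its behavior can be observed and adapted
    // without touching its source -- the adaptation logic (here, tracing)
    // stays cleanly separated from the application logic.
    static Interpreter traced(Interpreter inner, StringBuilder log) {
        return (Interpreter) Proxy.newProxyInstance(
            Interpreter.class.getClassLoader(),
            new Class<?>[]{Interpreter.class},
            (proxy, method, args) -> {
                log.append("call:").append(method.getName()).append(';');
                return method.invoke(inner, args);
            });
    }
}
```

The same wrapping technique could host debugging or context-aware behavior, the use cases the abstract lists, by swapping in a different invocation handler.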
Bringing ultra-large-scale software repository mining to the masses with Boa
Mining software repositories provides developers and researchers a
chance to learn from previous development activities and apply that
knowledge to the future. Ultra-large-scale open source repositories
(e.g., SourceForge with 350,000+ projects, GitHub with 250,000+
projects, and Google Code with 250,000+ projects) provide an extremely
large corpus to perform such mining tasks on. This large corpus allows
researchers the opportunity to test new mining techniques and
empirically validate new approaches on real-world data. However, the
barrier to entry is often extremely high. Researchers interested in
mining must know a large number of techniques, languages, tools, etc.,
each of which is often complex. Additionally, performing mining at
the scale proposed above adds additional complexity and often is
difficult to achieve.
The Boa language and infrastructure were developed to solve these
problems. We provide users a domain-specific language tailored for
software repository mining and allow them to submit queries via our
web-based interface. These queries are then automatically
parallelized and executed on a cluster, analyzing a dataset containing
almost 700,000 projects, history information from millions of
revisions, millions of Java source files, and billions of AST nodes.
The language also provides an easy-to-comprehend visitor syntax to
ease writing source code mining queries. The underlying
infrastructure contains several optimizations, including query
optimizations to make single queries faster as well as a fusion
optimization to group queries from multiple users into a single query.
The latter optimization is important as Boa is intended to be a
shared, community resource. Finally, we show the potential benefit of
Boa to the community by reproducing a previously published case
study and performing a new case study on the adoption of Java language
features.
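Boa's visitor syntax is its own DSL, but the traversal pattern it offers for mining source trees can be mirrored in plain Java. The sketch below counts nodes of a given kind in a toy AST, assuming a hypothetical `Node` type; it only illustrates the visitor-style traversal, not Boa's actual language or data model.

```java
import java.util.List;

// Toy AST node: a kind label plus children, standing in for the parsed
// source trees (billions of AST nodes) that Boa exposes to queries.
class Node {
    final String kind;
    final List<Node> children;
    Node(String kind, Node... kids) {
        this.kind = kind;
        this.children = List.of(kids);
    }
}

class MiningVisitor {
    // Count nodes of a given kind, e.g. method declarations per file --
    // the shape of a typical repository-mining query.
    static int count(Node root, String kind) {
        int n = root.kind.equals(kind) ? 1 : 0;
        for (Node c : root.children) n += count(c, kind);
        return n;
    }
}
```

In Boa itself, such a traversal would be a few lines of visitor syntax, and the infrastructure would parallelize it across the whole corpus automatically.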
A User-Centered Approach to Landing Page Optimization in a Software-as-a-Service Business
There are two essential steps in the digital marketing process: acquisition and conversion. Acquisition describes the efforts of getting a potential buyer to visit a business's website. Conversion is concerned with convincing that prospect, who has arrived on the website, to take a desired action, that is, to convert. The process of improving conversions is called conversion rate optimization (CRO). While marketers increasingly understand the importance of optimizing their website for conversion, CRO is often done only in a quantitative way, relying on web metrics and visitor behavior. This limited approach does not consider the reasons behind visitors' behavior, their underlying needs, and their way of thinking when evaluating products and services online. Yet those reasons are crucial to understand when optimizing for conversion.
The objective of this study is to investigate how methods from user-centered design can aid in uncovering the needs and thought process of website visitors evaluating a Software-as-a-Service solution online. Additionally, the visitor's overall buying process is studied. The study is conducted as semi-structured interviews and retrospective testing with six recent website visitors interested in the SaaS service. Thematic analysis and customer journey mapping are used to analyze the interview data.
The results indicate that visitor needs mostly concern inquiring about service-related information, such as performance or features, as well as the pricing range. Additionally, aspects such as ease of getting started, service flexibility and quality support had a strong influence. It was found that most of these aspects are typical for successful SaaS solutions. The overall decision-making process of choosing a SaaS solution proved to be fairly unstructured. However, being present in the minds of potential customers before they feel the need to search for solutions actively seems to be crucial in order to be considered. In addition to that, the first impression of a business's online presence also largely impacts visitor trust and consideration. Regarding the final decision making, it is to be noted that technical visitors are strong influencers, but the final provider selection is a collaborative effort. Concerning the page itself, visitor conversion is generally favored when presenting relevant content to visitors in a relevant order, while leaving out irrelevant content.
Evaluation of the quality of Alexa’s metrics
Alexa is a tool that can easily be confused by name with Amazon's voice device, but in reality it is a web traffic tool. Very little is known about how it functions and where it gets its data from. With so little information available, how is it possible to know whether the tool is of good value or not? Comparing Alexa with other tools such as Google Analytics gives insight into the quality of its metrics and makes it possible to judge its transparency, reliability, trustworthiness and flexibility. To achieve this, a state of the art on the subject was compiled, covering elements relating to the metrics, the tools and the methods; this gave the study a direction. It led the way to a much more practical side of the project: actually dealing with and assessing data. With a call sent out to multiple networks, a sample of 10 websites was created; they all varied greatly, but they also held important information that would help answer the research questions. A strict work methodology was followed to ensure the data would not be tainted and remained usable, in order to facilitate its analysis; it also ensured no backtracking would be necessary. The findings were not as striking as expected, as some results were more similar than originally predicted, although the correlation between the numbers was very low. Hardly any websites from the sample presented results that were consistently similar, apart from one; there was also one metric whose data bore no resemblance across the different tools. In addition to the results derived from the data and charts, numerous limitations of the tools were identified, and it was obvious that they made it harder to reach conclusive results. Even though Alexa presents itself as a useful tool for the everyday individual, it does have quite a few limitations that a more substantial tool does not possess.
There are evidently also improvements to be made when it comes to the standardization of such tools in order to make their use easier for all. Not all the results found in this study were conclusive, but the door is open for a more in-depth project that would answer the additional questions that came up.
Towards an aspect weaving BPEL engine
This position paper proposes the use of dynamic aspects and
the visitor design pattern to obtain a highly configurable and
extensible BPEL engine. Using these two techniques, the
core of this infrastructural software can be customised to
meet new requirements and add features such as debugging,
execution monitoring, or changing to another Web Service
selection policy. Additionally, it can easily be extended to
cope with customer-specific BPEL extensions. We propose
the use of dynamic aspects not only on the engine itself
but also on the workflow in order to tackle the problems of
Web Service hot deployment and hot fixes to long running
processes. In this way, composing a Web Service "on-the-fly"
means weaving its choreography interface into the workflow.
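The visitor design pattern the position paper relies on lets new engine features (debugging, monitoring, alternative selection policies) be added as new visitors, without editing the activity classes themselves. The sketch below shows the pattern over hypothetical BPEL-like activities; the class names are illustrative, not any real BPEL engine's API.

```java
// Activities accept a visitor; extensions are written as visitors.
interface Activity { <R> R accept(Visitor<R> v); }

class Invoke implements Activity {
    final String service;
    Invoke(String s) { service = s; }
    public <R> R accept(Visitor<R> v) { return v.visit(this); }
}

class Sequence implements Activity {
    final Activity[] steps;
    Sequence(Activity... s) { steps = s; }
    public <R> R accept(Visitor<R> v) { return v.visit(this); }
}

interface Visitor<R> { R visit(Invoke i); R visit(Sequence s); }

// One possible extension: an execution monitor that lists invoked
// services -- added without touching Invoke or Sequence.
class Tracer implements Visitor<String> {
    public String visit(Invoke i) { return "invoke:" + i.service + ";"; }
    public String visit(Sequence s) {
        StringBuilder b = new StringBuilder();
        for (Activity a : s.steps) b.append(a.accept(this));
        return b.toString();
    }
}
```

A customer-specific BPEL extension would add a new Activity subtype and extend the Visitor interface accordingly, which is where the paper's dynamic aspects come in to avoid recompiling the core engine.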
A Survey of Positioning Systems Using Visible LED Lights
© 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
As the Global Positioning System (GPS) cannot provide satisfying performance in indoor environments, indoor positioning technology, which utilizes indoor wireless signals instead of GPS signals, has grown rapidly in recent years. Meanwhile, visible light communication (VLC) using light devices such as light emitting diodes (LEDs) has been deemed a promising candidate in the heterogeneous wireless networks that may collaborate with radio frequency (RF) wireless networks. In particular, light-fidelity has great potential for deployment in future indoor environments because of its high throughput and security advantages. This paper provides a comprehensive study of a novel positioning technology based on visible white LED lights, which has attracted much attention from both academia and industry. The essential characteristics and principles of this system are discussed in depth, and relevant positioning algorithms and designs are classified and elaborated. This paper undertakes a thorough investigation into current LED-based indoor positioning systems and compares their performance through many aspects, such as test environment, accuracy, and cost. It presents indoor hybrid positioning systems combining VLC and other systems (e.g., inertial sensors and RF systems). We also review and classify outdoor VLC positioning applications for the first time. Finally, this paper surveys major advances as well as open issues, challenges, and future research directions in VLC positioning systems.