Migrating C/C++ Software to Mobile Platforms in the ADM Context
Software technology is constantly evolving, and application development therefore requires adapting software components and applications to align with new paradigms such as Pervasive Computing, Cloud Computing, and the Internet of Things. In particular, many desktop software components need to be migrated to mobile technologies. This migration faces many challenges due to the proliferation of different mobile platforms. Developers usually build applications tailored to each type of device, expending considerable time and effort. As a result, new programming languages are emerging to integrate the native behaviors of the different platforms targeted in development projects. In this direction, the Haxe language allows writing mobile applications that target all major mobile platforms. Novel technical frameworks for information integration and tool interoperability, such as Architecture-Driven Modernization (ADM) proposed by the Object Management Group (OMG), can help to manage a huge diversity of mobile technologies. The Architecture-Driven Modernization Task Force (ADMTF) was formed to create specifications and promote industry consensus on the modernization of existing applications. In this work, we propose a migration process from C/C++ software to different mobile platforms that integrates ADM standards with Haxe. We exemplify the different steps of the process with a simple case study, the migration of the "Mandelbrot Set" C++ application. The proposal was validated in the Eclipse Modeling Framework, considering that some of its tools and run-time environments are aligned with ADM standards.
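The paper names the Mandelbrot Set application only as a small case study. As a rough illustration of the kind of self-contained C++ code such a migration would take as input (a sketch of ours, not the paper's actual case-study source), an escape-time Mandelbrot renderer can be written as:

```cpp
#include <complex>
#include <cstdio>

// Escape-time iteration count for a point c in the complex plane.
// Returns max_iter if the point appears to belong to the Mandelbrot set.
int mandelbrot(std::complex<double> c, int max_iter = 100) {
    std::complex<double> z = 0;
    for (int i = 0; i < max_iter; ++i) {
        z = z * z + c;
        if (std::norm(z) > 4.0)  // |z| > 2 means the orbit escapes
            return i;
    }
    return max_iter;
}

int main() {
    // Render a coarse ASCII view of the set on [-2,1] x [-1,1].
    for (int row = 0; row < 24; ++row) {
        for (int col = 0; col < 72; ++col) {
            double re = -2.0 + 3.0 * col / 71.0;
            double im = -1.0 + 2.0 * row / 23.0;
            std::putchar(mandelbrot({re, im}) == 100 ? '#' : '.');
        }
        std::putchar('\n');
    }
    return 0;
}
```

Because logic like this is pure computation with no platform-specific APIs, an equivalent Haxe port can be cross-compiled to each major mobile target from a single codebase, which is the role Haxe plays in the proposed process.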
Migrating software to mobile technology: a model driven engineering approach
Nowadays, organizations face the problem of having to modernize or replace their legacy software. This software represents a long-standing investment of money, time, and other resources, and replacing it carries high risk. The purpose of reengineering is to adapt software in a disciplined way in order to improve its quality in aspects such as operability, functionality, or performance. The focus of reengineering is on improving an existing system with a higher return on investment than would be achieved by developing a new system.

In the context of reengineering, the term "legacy" has traditionally been associated with software that has survived several generations of developers, administrators, and users. New technologies and paradigms enter the market at an increasing rate, which motivates a growing demand to adapt even recently developed systems. Mobile Computing is crucial to harnessing the potential of these new paradigms. Smartphones are the most widely used computing platform worldwide. They come with a variety of sensors (GPS, accelerometer, digital compass, microphone, and camera), enabling a wide range of applications in Pervasive Computing, Cloud Computing, and the Internet of Things (IoT).
ENIGMA and global neuroscience: A decade of large-scale studies of the brain in health and disease across more than 40 countries.
This review summarizes the last decade of work by the ENIGMA (Enhancing NeuroImaging Genetics through Meta Analysis) Consortium, a global alliance of over 1400 scientists across 43 countries, studying the human brain in health and disease. Building on large-scale genetic studies that discovered the first robustly replicated genetic loci associated with brain metrics, ENIGMA has diversified into over 50 working groups (WGs), pooling worldwide data and expertise to answer fundamental questions in neuroscience, psychiatry, neurology, and genetics. Most ENIGMA WGs focus on specific psychiatric and neurological conditions; other WGs study normal variation due to sex and gender differences, or development and aging; still other WGs develop methodological pipelines and tools to facilitate harmonized analyses of "big data" (i.e., genetic and epigenetic data, multimodal MRI, and electroencephalography data). These international efforts have yielded the largest neuroimaging studies to date in schizophrenia, bipolar disorder, major depressive disorder, post-traumatic stress disorder, substance use disorders, obsessive-compulsive disorder, attention-deficit/hyperactivity disorder, autism spectrum disorders, epilepsy, and 22q11.2 deletion syndrome. More recent ENIGMA WGs have formed to study anxiety disorders, suicidal thoughts and behavior, sleep and insomnia, eating disorders, irritability, brain injury, antisocial personality and conduct disorder, and dissociative identity disorder. Here, we summarize the first decade of ENIGMA's activities and ongoing projects, and describe the successes and challenges encountered along the way. We highlight the advantages of collaborative large-scale coordinated data analyses for testing reproducibility and robustness of findings, offering the opportunity to identify brain systems involved in clinical syndromes across diverse samples and associated genetic, environmental, demographic, cognitive, and psychosocial factors.
Conceptualizing a framework for cyber-physical systems of systems development and deployment
Cyber-physical systems (CPS) refer to the next generation of embedded ICT systems that are interconnected and collaborative, and that provide users and businesses with a wide range of smart applications and services. Software in CPS applications ranges from small systems to large systems, also known as Systems of Systems (SoS), such as smart grids and smart cities. CPSoS require managing massive amounts of data, being aware of their emergent behavior, and scaling out to progressively evolve and add new systems. Cloud computing supports processing and storing massive amounts of data, hosting and delivering services, and configuring self-provisioned resources. Therefore, cloud computing is the natural candidate to meet CPSoS needs. However, the diversity of platforms and the low-level cloud programming models make it difficult to find a common solution for the development and deployment of CPSoS. This paper presents the architectural foundations of a cloud-centric framework for automating the development and deployment of CPSoS service applications, with the aim of converging towards a common open service platform for CPSoS applications. The framework relies on the well-known qualities of the microservices architecture style, the autonomic computing paradigm, and the model-driven software development approach. Its implementation and validation are ongoing in two European and national projects.
Formal support for model driven development with graph transformation techniques
Also published online by CEUR Workshop Proceedings (CEUR-WS.org, ISSN 1613-0073). In this paper we give an overview of our approach for Model Driven Development (MDD), based on graph transformation techniques. In MDD, models are the primary assets in the development process. They are used not only for documentation, but also for analysis, simulation, and the generation of code and test cases. Thus, model transformation becomes a central activity. As models can be formally described as attributed, typed graphs, we can use formal graph transformation techniques for their manipulation. In this paper, we give an overview of the different kinds of model transformation and suitable graph transformation techniques. Moreover, graph transformation can be combined with meta-modelling for further expressivity. Some of these techniques have been recently implemented in the meta-modelling tool AToM3. We use the tool to introduce an example in the component-based modelling and simulation area.
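Since the abstract's key premise is that models are attributed, typed graphs manipulated by transformation rules, a tiny sketch can make the idea concrete. The following C++ fragment (ours, unrelated to AToM3's actual implementation) encodes a minimal typed graph and applies a single relabelling rule as a model-to-model transformation:

```cpp
#include <string>
#include <vector>
#include <iostream>

// A node of a typed, attributed graph: a type tag plus one attribute.
struct Node { std::string type; std::string attr; };
// A directed edge between node indices.
struct Edge { int src, dst; };

struct Graph {
    std::vector<Node> nodes;
    std::vector<Edge> edges;
};

// A trivial transformation rule: every node matching fromType is
// rewritten in place to toType (a pure relabelling rule; real
// graph-transformation rules can also add and delete nodes and edges).
void applyRule(Graph& g, const std::string& fromType, const std::string& toType) {
    for (Node& n : g.nodes)
        if (n.type == fromType)
            n.type = toType;
}

int main() {
    // A miniature "model": two components connected by one edge.
    Graph model{{{"Component", "producer"}, {"Component", "consumer"}},
                {{0, 1}}};

    // Rewrite the platform-independent "Component" type into a
    // platform-specific "Process" type, as a model-to-model step.
    applyRule(model, "Component", "Process");

    for (const Node& n : model.nodes)
        std::cout << n.type << "(" << n.attr << ")\n";
}
```

Real graph-transformation rules additionally carry a left-hand-side pattern over nodes and edges, and may create or delete elements; the point here is only that once models are graphs, model transformations become ordinary graph algorithms.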
Grain-size distribution in the mantle wedge of subduction zones
Author Posting. © American Geophysical Union, 2011. This article is posted here by permission of American Geophysical Union for personal use, not for redistribution. The definitive version was published in Journal of Geophysical Research 116 (2011): B10203, doi:10.1029/2011JB008294.

Mineral grain size plays an important role in controlling many processes in the mantle wedge of subduction zones, including mantle flow and fluid migration. To investigate the grain-size distribution in the mantle wedge, we coupled a two-dimensional (2-D) steady-state finite element thermal and mantle-flow model with a laboratory-derived grain-size evolution model. In our coupled model, the mantle wedge has a composite olivine rheology that incorporates grain-size-dependent diffusion creep and grain-size-independent dislocation creep. Our results show that all subduction settings lead to a characteristic grain-size distribution, in which grain size increases from 10-100 μm at the most trenchward part of the creeping region to a few centimeters in the subarc mantle. Despite the large variation in grain size, its effect on the mantle rheology and flow is very small, as >90% of the deformation in the flowing part of the creeping region is accommodated by grain-size-independent dislocation creep. The predicted grain-size distribution leads to a downdip increase in permeability by ~5 orders of magnitude. This increase is likely to promote greater upward migration of aqueous fluids and melts where the slab reaches ~100 km depth compared with shallower depths, potentially providing an explanation for the relatively uniform subarc slab depth. Seismic attenuation derived from the predicted grain-size distribution and thermal field is consistent with the observed seismic structure in the mantle wedge at many subduction zones, without requiring a significant contribution by the presence of melt. Funding for this research was provided by the National Science Foundation through a MARGINS Postdoctoral Fellowship (NSF OCE-0840800) and NSF grant EAR-0854673.
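The composite rheology above sums a grain-size-dependent diffusion-creep strain rate with a grain-size-independent dislocation-creep strain rate, so the dislocation-creep share of the total deformation grows as grains coarsen. The sketch below (C++, with placeholder prefactors and exponents chosen only to make the crossover visible, not the paper's calibrated olivine flow laws) illustrates that partitioning:

```cpp
#include <cmath>
#include <cstdio>

// Composite flow law: total strain rate = diffusion creep + dislocation creep.
// Diffusion creep scales as sigma * d^(-m) and weakens with grain growth;
// dislocation creep scales as sigma^n and is independent of grain size d.
// All constants below are illustrative placeholders, not olivine data.
double diffusionCreep(double sigma, double d) {
    const double A_diff = 1.0e-8, m = 3.0;   // placeholder prefactor, exponent
    return A_diff * sigma * std::pow(d, -m);
}

double dislocationCreep(double sigma) {
    const double A_disl = 1.0e-16, n = 3.5;  // placeholder prefactor, exponent
    return A_disl * std::pow(sigma, n);
}

int main() {
    const double sigma = 1.0e6;  // differential stress in Pa (illustrative)
    // Sweep grain size from 10 micrometers to a few centimeters.
    for (double d = 1e-5; d <= 3e-2; d *= 10.0) {
        double e_diff = diffusionCreep(sigma, d);
        double e_disl = dislocationCreep(sigma);
        double frac = e_disl / (e_diff + e_disl);
        std::printf("d = %8.0e m  dislocation-creep fraction = %.3f\n", d, frac);
    }
}
```

With these illustrative constants the dislocation-creep fraction rises from near zero at 10 μm to roughly 90% at centimeter grain sizes, mirroring (by construction) the qualitative partitioning the abstract reports for the flowing part of the creeping region.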