Forum Session at the First International Conference on Service Oriented Computing (ICSOC03)
The First International Conference on Service Oriented Computing (ICSOC) was held in Trento, December 15-18, 2003. The focus of the conference, Service Oriented Computing (SOC), is the emerging paradigm for distributed computing and e-business processing that has evolved from object-oriented and component computing to enable building agile networks of collaborating business applications distributed within and across organizational boundaries. Of the 181 papers submitted to the ICSOC conference, 10 were selected for the forum session, which took place on December 16, 2003. The papers were chosen for their technical quality, originality, relevance to SOC, and suitability for a poster presentation or a demonstration. This technical report contains the 10 papers presented during the forum session at the ICSOC conference. In particular, the last two papers in the report were submitted as industrial papers.
Abmash: Mashing Up Legacy Web Applications by Automated Imitation of Human Actions
Many business web-based applications do not offer application programming
interfaces (APIs) to enable other applications to access their data and
functions in a programmatic manner. This makes their composition difficult (for
instance to synchronize data between two applications). To address this
challenge, this paper presents Abmash, an approach to facilitate the
integration of such legacy web applications by automatically imitating human
interactions with them. By automatically interacting with the graphical user
interface (GUI) of web applications, the system supports all forms of
integrations including bi-directional interactions and is able to interact with
AJAX-based applications. Furthermore, the integration programs are easy to
write since they deal with end-user, visual user-interface elements. The
integration code is simple enough to be called a "mashup".
Comment: Software: Practice and Experience (2013)
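The GUI-imitation idea above can be sketched in miniature. The following is an illustrative Python example, not Abmash's actual Java API: it scrapes a data row out of one legacy application's HTML interface so the values could then be pushed into another application. The page markup is a made-up assumption for the example.

```python
from html.parser import HTMLParser

# Minimal sketch (not Abmash's actual API): extract visible table rows
# from one legacy app's HTML page so they can be fed to another app.
class RowExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_cell = False   # currently inside a <td>?
        self.current = []      # cells of the row being read
        self.rows = []         # all completed rows

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True
        elif tag == "tr":
            self.current = []

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False
        elif tag == "tr" and self.current:
            self.rows.append(tuple(self.current))

    def handle_data(self, data):
        if self.in_cell and data.strip():
            self.current.append(data.strip())

page = "<table><tr><td>Alice</td><td>alice@example.com</td></tr></table>"
extractor = RowExtractor()
extractor.feed(page)
print(extractor.rows)  # [('Alice', 'alice@example.com')]
```

A real integration would fetch the page over HTTP and then replay clicks and form submissions against the second application; the point here is only that the "integration program" works against the user-visible interface rather than an API.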
ACCOUNTING AND FINANCIAL STATEMENTS AUTO ANALYSIS SYSTEM
This project was motivated by the need to revolutionize the generation of financial statements and the financial analysis process, thus speeding up business decision making. The research questions were:
1) How can machine learning increase the speed of financial statement preparation and automate financial statement analysis?
2) How can businesses balance the benefits of automating financial analysis with potential concerns around privacy, data security, and bias?
3) Can the Java J2EE framework provide a reliable running environment for machine learning?
The findings were:
1) Machine learning can significantly increase the accuracy and speed of financial analysis. Using machine learning algorithms, financial data can be processed and analyzed in real time, allowing for quicker and more precise financial analysis. Machine learning models can identify patterns and trends in financial data that may not be easily detectable by humans, leading to more accurate financial statements and analysis. Additionally, machine learning can automate repetitive tasks in the financial analysis process, saving time and resources for businesses.
2) Businesses need to carefully balance the benefits of automating financial analysis with potential concerns around privacy, data security, and bias. While machine learning can offer significant advantages in terms of accuracy and speed, it also requires handling sensitive financial data. Therefore, it is crucial for businesses to implement robust data security measures to protect against potential data breaches and ensure compliance with privacy regulations. Additionally, businesses need to be mindful of potential biases in machine learning algorithms, as biased algorithms can result in biased financial analysis. Regular audits and monitoring of machine learning models should be conducted to address and mitigate any potential biases.
3) The Java J2EE framework can provide a reliable running environment for machine learning. Java J2EE (Java 2 Platform, Enterprise Edition) is a widely used and mature framework for developing enterprise applications, including machine learning applications. It offers the scalability, reliability, and security features that are essential for running machine learning algorithms in a production environment. Java J2EE provides robust support for distributed computing, allowing for efficient processing of large financial datasets. Furthermore, it offers a wide range of libraries and tools for implementing machine learning algorithms, making it a viable choice for running machine learning applications in the financial industry.
The conclusions were:
1) Machine learning has the potential to significantly increase the accuracy and speed of financial analysis, thereby revolutionizing the generation of financial statements and the financial analysis process. Various machine learning algorithms, such as decision trees, random forests, and deep learning algorithms, can be utilized to identify patterns, trends, and hidden risks in financial data, leading to more informed and efficient business decision making.
2) Businesses need to carefully balance the benefits of automating financial analysis with potential concerns around privacy, data security, and bias. While machine learning can offer significant advantages in terms of accuracy and speed, there are ethical considerations that need to be addressed, such as ensuring data privacy, implementing effective data security measures, and mitigating biases in the machine learning algorithms used in financial analysis. Businesses should adopt a responsible approach to machine learning implementation, weighing the potential risks and benefits.
3) The Java J2EE framework can provide a reliable running environment for machine learning applications, but further research is needed to evaluate the performance and scalability of machine learning models in this framework. Identifying potential optimizations for running machine learning applications at scale in the Java J2EE framework can lead to more efficient and effective implementation of machine learning in financial analysis and decision-making processes. Further research in this area can contribute to the development of robust and scalable machine learning applications for financial analysis in the business domain.
Areas for further study include:
1) Exploring different machine learning algorithms and techniques to further improve the accuracy and speed of financial analysis.
2) Conducting research on the impact of machine learning on financial decision making and business performance.
3) Investigating methods for addressing and mitigating biases in machine learning algorithms used in financial analysis.
4) Evaluating the effectiveness of different data security measures in protecting sensitive financial data in machine learning applications.
5) Studying the performance and scalability of machine learning models in the Java J2EE framework and identifying potential optimizations for running machine learning applications at scale.
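As a minimal illustration of the rule-based end of the techniques discussed above, the sketch below hand-rolls a decision stump that flags a financial statement based on two common ratios. The thresholds (1.0 and 2.0) and the choice of ratios are assumptions made for the example; they are not accounting standards or the project's actual model.

```python
# Hedged sketch: a single-level decision rule ("decision stump") over
# two ratios computed from a statement. Real systems would learn such
# thresholds from labeled data (e.g. with decision trees).
def flag_statement(current_assets, current_liabilities, total_debt, equity):
    current_ratio = current_assets / current_liabilities
    debt_to_equity = total_debt / equity
    if current_ratio < 1.0:          # illustrative liquidity threshold
        return "flag: liquidity risk"
    if debt_to_equity > 2.0:         # illustrative leverage threshold
        return "flag: leverage risk"
    return "ok"

print(flag_statement(500, 400, 300, 600))   # ok
print(flag_statement(300, 400, 300, 600))   # flag: liquidity risk
```

A trained decision tree generalizes this idea by choosing the split variables and thresholds automatically from historical statements.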
A Programming Model for Hybrid Workflows: combining Task-based Workflows and Dataflows all-in-one
This paper tries to reduce the effort of learning, deploying, and integrating
several frameworks for the development of e-Science applications that combine
simulations with High-Performance Data Analytics (HPDA). We propose a way to
extend task-based management systems to support continuous input and output
data to enable the combination of task-based workflows and dataflows (Hybrid
Workflows from now on) using a single programming model. Hence, developers can
build complex Data Science workflows with different approaches depending on the
requirements. To illustrate the capabilities of Hybrid Workflows, we have built
a Distributed Stream Library and a fully functional prototype extending COMPSs,
a mature, general-purpose, task-based, parallel programming model. The library
can be easily integrated with existing task-based frameworks to provide support
for dataflows. Also, it provides a homogeneous, generic, and simple
representation of object and file streams in both Java and Python; enabling
complex workflows to handle any data type without dealing directly with the
streaming back-end.
Comment: Accepted in Future Generation Computer Systems (FGCS). Licensed under CC-BY-NC-N
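The combination of task-style functions with a continuous data stream can be sketched with Python standard-library primitives. This is an illustrative producer/consumer example, not the COMPSs or Distributed Stream Library API: one task keeps emitting items into a stream while a second task processes them as they arrive.

```python
import queue
import threading

# Illustrative hybrid-workflow sketch: a "simulation" task streams
# partial results to an "analytics" task instead of waiting for the end.
def producer(stream, n):
    for i in range(n):
        stream.put(i)          # emit each partial result as it is produced
    stream.put(None)           # end-of-stream marker

def consumer(stream, results):
    while True:
        item = stream.get()
        if item is None:       # stream closed
            break
        results.append(item * item)  # per-element analytics step

stream, results = queue.Queue(), []
t1 = threading.Thread(target=producer, args=(stream, 5))
t2 = threading.Thread(target=consumer, args=(stream, 5 * [0])[:0] or results)
t1.start(); t2.start(); t1.join(); t2.join()
print(results)  # [0, 1, 4, 9, 16]
```

In a real hybrid workflow the two ends would run as distributed tasks on different nodes, with the stream object hiding the streaming back-end, but the control structure is the same.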
Badger: Complexity Analysis with Fuzzing and Symbolic Execution
Hybrid testing approaches that involve fuzz testing and symbolic execution
have shown promising results in achieving high code coverage, uncovering subtle
errors and vulnerabilities in a variety of software applications. In this paper
we describe Badger - a new hybrid approach for complexity analysis, with the
goal of discovering vulnerabilities which occur when the worst-case time or
space complexity of an application is significantly higher than the average
case. Badger uses fuzz testing to generate a diverse set of inputs that aim to
increase not only coverage but also a resource-related cost associated with
each path. Since fuzzing may fail to execute deep program paths due to its
limited knowledge about the conditions that influence these paths, we
complement the analysis with a symbolic execution, which is also customized to
search for paths that increase the resource-related cost. Symbolic execution is
particularly good at generating inputs that satisfy various program conditions
but by itself suffers from path explosion. Therefore, Badger uses fuzzing and
symbolic execution in tandem, to leverage their benefits and overcome their
weaknesses. We implemented our approach for the analysis of Java programs,
based on Kelinci and Symbolic PathFinder. We evaluated Badger on Java
applications, showing that our approach is significantly faster in generating
worst-case executions compared to fuzzing or symbolic execution on their own.
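The cost-guided fuzzing half of this idea can be sketched in a few lines. The following Python example illustrates the general technique, not Badger itself: it mutates inputs and keeps any mutant that increases a resource-related cost, here the comparison count of insertion sort, so the search climbs toward worst-case inputs.

```python
import random

# Instrumented target: insertion sort that reports its comparison count,
# the "resource-related cost" the fuzzer tries to maximize.
def insertion_sort_cost(xs):
    xs, cost = list(xs), 0
    for i in range(1, len(xs)):
        j = i
        while j > 0:
            cost += 1
            if xs[j - 1] <= xs[j]:
                break
            xs[j - 1], xs[j] = xs[j], xs[j - 1]
            j -= 1
    return cost

def fuzz_worst_case(seed, rounds=2000, rng=random.Random(0)):
    best, best_cost = list(seed), insertion_sort_cost(seed)
    for _ in range(rounds):
        mutant = list(best)
        i, j = rng.randrange(len(mutant)), rng.randrange(len(mutant))
        mutant[i], mutant[j] = mutant[j], mutant[i]   # mutation: swap two items
        cost = insertion_sort_cost(mutant)
        if cost > best_cost:                          # keep cost-increasing inputs
            best, best_cost = mutant, cost
    return best, best_cost

best, cost = fuzz_worst_case(list(range(8)))
print(cost)  # well above the sorted-input cost of 7, toward the worst case of 28
```

Badger's contribution is to break out of the local optima such greedy search hits by handing promising inputs to symbolic execution, which can solve the path conditions the mutator cannot stumble into.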
A Survey on IT-Techniques for a Dynamic Emergency Management in Large Infrastructures
This deliverable is a survey of the IT techniques that are relevant to the three use cases of the project EMILI. It describes the state of the art in four complementary IT areas: data cleansing, supervisory control and data acquisition, wireless sensor networks, and complex event processing. Even though the deliverable's authors have tried to avoid overly technical language and have tried to explain every concept referred to, the deliverable might still seem rather technical to readers who are as yet unfamiliar with the techniques it describes.
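As a small illustration of the complex event processing area the survey covers, the sketch below raises a composite alarm when several high-temperature sensor events fall inside one sliding time window. The event names, window size, and threshold are assumptions made for the example, not anything defined by EMILI.

```python
from collections import deque

# Illustrative CEP sketch: detect three "temperature_high" events
# within a 10-second sliding window and emit an alarm timestamp.
def detect(events, window=10.0, needed=3):
    recent, alarms = deque(), []
    for timestamp, kind in events:
        if kind != "temperature_high":
            continue                       # unrelated simple events are ignored
        recent.append(timestamp)
        while recent and timestamp - recent[0] > window:
            recent.popleft()               # expire events outside the window
        if len(recent) >= needed:
            alarms.append(timestamp)       # composite event detected
    return alarms

events = [(0, "temperature_high"), (2, "door_open"),
          (4, "temperature_high"), (9, "temperature_high"),
          (25, "temperature_high")]
print(detect(events))  # [9]
```

Production CEP engines express such patterns declaratively and handle many concurrent windows, but the sliding-window aggregation above is the core mechanism.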