Using RPA for data generation using OCR platforms in Mediterranean University of Albania
The increase in the amount of data today has led to the use of computer applications to manage processes precisely. Robotic process automation (RPA), also known as software robotics, uses automation technologies to mimic the back-office tasks of human workers, such as extracting data, filling in forms and moving files. Optical character recognition (OCR), sometimes referred to as text recognition, extracts and repurposes data from scanned documents, camera images and image-only PDFs. OCR systems use a combination of hardware and software to convert physical, printed documents into machine-readable text: hardware such as an optical scanner or a specialized circuit board copies or reads the text, and software typically handles the advanced processing. Process Automation in Azure Automation allows you to automate frequent, time-consuming and error-prone management tasks, helping you focus on work that adds business value. In this paper, I will use the above-mentioned technologies to realize an automatic data generation process for the construction of an online library. In addition, the level of data accuracy will be studied in the automation of data generation from PDF files into MySQL. The application will be built with an HTML front end, a PHP back end and a MySQL database. These tests will be done by inserting more than 17,000 books in PDF format.
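The extraction-to-database step of such a pipeline can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the "Title | Author | Year" record layout, the `books` table, and its column names are assumptions, and the OCR stage itself is stubbed out as plain text input.

```python
import re

def parse_book_record(ocr_text: str) -> dict:
    """Parse one OCR'd catalogue line of the assumed form 'Title | Author | Year'."""
    parts = [p.strip() for p in ocr_text.split("|")]
    if len(parts) != 3 or not re.fullmatch(r"\d{4}", parts[2]):
        raise ValueError(f"unrecognised record: {ocr_text!r}")
    return {"title": parts[0], "author": parts[1], "year": int(parts[2])}

def to_insert_sql(record: dict) -> str:
    """Render a MySQL INSERT for the parsed record (naive quote escaping, for the sketch only)."""
    title = record["title"].replace("'", "''")
    author = record["author"].replace("'", "''")
    return (f"INSERT INTO books (title, author, year) "
            f"VALUES ('{title}', '{author}', {record['year']});")

record = parse_book_record("Programming PHP | Rasmus Lerdorf | 2002")
print(to_insert_sql(record))
```

In a real deployment the generated statement would be replaced by a parameterised query, and measuring how often `parse_book_record` rejects an OCR'd line gives one simple proxy for the data-accuracy level the abstract sets out to study.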
Active and passive reduction of high order modes in the gravitational wave detector GEO 600
[no abstract]
Benchmarking VisualStudio.NET for the development and implementation of a manufacturing execution system
The focus of this thesis is to show the utility of Microsoft's .NET framework in developing and implementing an MES. The manufacturing environment today, more than ever, is working towards achieving better yields, productivity, quality and customer satisfaction. Companies such as Dell are rapidly outgrowing their competition due to better management of their product lifecycles. The time between receiving a new order and shipping the final product is getting shorter. Historically, business management applications such as Enterprise Resource Planning (ERP) systems and Customer Relationship Management (CRM) systems have been implemented without much importance given to operational and shop-floor needs. The fact is that these business systems can be successful only when they are properly integrated with real-time data from the shop floor, which is the core of any manufacturing set-up. A Manufacturing Execution System, or MES, is this link between the shop floor and the top floor. MESA International defines an MES as "systems that deliver information enabling the optimization of production activities from order launch to finished goods". Thus, an MES provides the right information to the right people at the right time, in the right format, to help them make well-informed decisions. A necessity for an efficient MES is therefore a high capability of integration with the existing systems on the operational level. This is where Microsoft's VS.NET fits in. Microsoft defines .NET as "a set of software technologies for connecting information, people, systems and devices". The vision of .NET is to enable the end user to connect to information from any place, at any time, using any device, in a manner that is independent of the platform on which the service is based.
The building block of the .NET framework is the Common Language Runtime, or CLR, which is capable of converting data from its original format into a format understandable to .NET and then using that format to interface with its client. This feature of .NET holds the key in the context of MES development and implementation. The aim of this applied research is to design an MES using VS.NET to control the working of a Flexible Manufacturing System (FMS), namely CAMCELL. The architecture used for the MES will then be gauged against an MES implementation done previously using Siemens' PC-based automation technology and Visual FoxPro. This study will integrate the Siemens technology with the .NET framework to enhance the resulting MES efficiency. The shop-floor details, or real-time data collection, will be handled using the databases from WinCC, and data aggregation and manipulation will be done within the .NET framework. The software architecture used for this study will achieve vertical integration between the CAMCELL ERP layer, the MES layer and the control layer. The study will demonstrate how the data stored in a high-level ERP database can be converted into useful information for the control layer for process control, and also how real-time information gathered from the control layer can be filtered into useful information up to the ERP layer to facilitate decision making. VS.NET user interface screens will be proposed to support these activities. The performance of the proposed architecture will be compared to that from previous studies, thus benchmarking VS.NET for the implementation of the MES.
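The vertical integration the abstract describes, raw control-layer data rolled up into information useful at the ERP layer, can be sketched in a few lines. This is an illustrative Python sketch only (the thesis itself targets VS.NET and WinCC databases); the station names, order IDs, and event shape are invented for the example.

```python
from collections import defaultdict

# Hypothetical real-time shop-floor events, standing in for WinCC-style records.
shop_floor_events = [
    {"station": "CAMCELL-1", "order": "ORD-42", "good": 18, "scrap": 2},
    {"station": "CAMCELL-2", "order": "ORD-42", "good": 25, "scrap": 0},
    {"station": "CAMCELL-1", "order": "ORD-43", "good": 10, "scrap": 5},
]

def aggregate_for_erp(events):
    """Roll raw control-layer counts up into per-order production and yield figures."""
    totals = defaultdict(lambda: {"good": 0, "scrap": 0})
    for e in events:
        totals[e["order"]]["good"] += e["good"]
        totals[e["order"]]["scrap"] += e["scrap"]
    report = {}
    for order, t in totals.items():
        produced = t["good"] + t["scrap"]
        report[order] = {"produced": produced, "yield": t["good"] / produced}
    return report

print(aggregate_for_erp(shop_floor_events))
```

The same idea runs in the other direction as well: an ERP-level order quantity would be decomposed into per-station work instructions for the control layer.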
Recommended from our members
Improvement and deployment of the web-based database management system for computer science graduate program
In 2005 Mr. Dung Tien Vu completed a master's project for the California State University, San Bernardino Computer Science Department in which he designed a web-based database of department graduate student information. This project was designed to alter the system so that it conforms with current campus regulations, to make some improvements to the system, and to deploy it.
A Windows-based roadway maintenance management system
A Maintenance Management System (MMS), as the name suggests, is a system used to manage maintenance activities on any infrastructure system or facility, such as a road network. Maintenance of a road network involves the repair, construction and improvement of pavements and right-of-way elements that deteriorate due to usage, exposure to the environment and various other causes, such as impact during crashes and vandalism. Management of maintenance activities is critical to maintaining the level of service of the road network, to public safety and welfare, and to effective and efficient fiscal planning. Since this involves large expenses, a system that improves efficiency is not only useful but also necessary. The development of a prototype roadway MMS is presented in this thesis. Basic information, such as the names of personnel, streets, equipment and materials, the wage rates, the lengths of the street segments and the maintenance activities, needs to be entered into the system before details about projects can be stored. The application has the appropriate functions to allow the user to input such data. The user can obtain a summary report showing the total cost, the breakdown of the cost by personnel, equipment and material, and the amount that is left to be paid for each project. The user can also query the database and view the different tables. (Abstract shortened by UMI.)
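The summary report the abstract describes, total cost broken down by personnel, equipment and material with a remaining-budget figure, can be sketched as a small aggregation. The categories come from the abstract; the rates, quantities and budget value below are invented for illustration.

```python
def project_cost_summary(entries, budget):
    """Summarise maintenance project costs by resource category.

    `entries` is a list of (category, quantity, unit_rate) tuples.
    """
    summary = {"personnel": 0.0, "equipment": 0.0, "material": 0.0}
    for category, quantity, unit_rate in entries:
        summary[category] += quantity * unit_rate
    total = sum(summary.values())
    summary["total"] = total
    summary["remaining"] = budget - total  # amount left to be paid on the project
    return summary

entries = [
    ("personnel", 40, 25.0),   # hours worked x wage rate
    ("equipment", 8, 150.0),   # hours used x rental rate
    ("material", 12, 80.0),    # units consumed x unit price
]
print(project_cost_summary(entries, budget=5000.0))
```

In the actual Windows application these figures would be computed by database queries over the stored project records rather than in memory.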
A computational approach to the identification of lineage-specific bacterial genes and a determination of their biological significance
EThOS - Electronic Theses Online Service, United Kingdom
Development of an open-source and open-data energy system optimization model for the analysis of the European energy mix
The abstract is in the attachment.
Classifying and responding to network intrusions
Intrusion detection systems (IDS) have been widely adopted within the IT community, as
passive monitoring tools that report security related problems to system administrators.
However, the increasing number and evolving complexity of attacks, along with the
growth and complexity of networking infrastructures, has led to overwhelming numbers of
IDS alerts, which allow a significantly smaller timeframe for a human to respond. The need
for automated response is therefore very much evident. However, the adoption of such
approaches has been constrained by practical limitations and administrators' consequent
mistrust of systems' abilities to issue appropriate responses.
The thesis presents a thorough analysis of the problem of intrusions, and identifies false
alarms as the main obstacle to the adoption of automated response. A critical examination
of existing automated response systems is provided, along with a discussion of why a new
solution is needed. The thesis determines that, while the detection capabilities remain
imperfect, the problem of false alarms cannot be eliminated. Automated response
technology must take this into account, and instead focus upon avoiding the disruption of
legitimate users and services in such scenarios. The overall aim of the research has
therefore been to enhance the automated response process, by considering the context of an
attack, and to investigate and evaluate a means of making intelligent response decisions.
The realisation of this objective has included the formulation of a response-oriented
taxonomy of intrusions, which is used as a basis to systematically study intrusions and
understand the threats detected by an IDS. From this foundation, a novel Flexible
Automated and Intelligent Responder (FAIR) architecture has been designed, as the basis
from which flexible and escalating levels of response are offered, according to the context
of an attack. The thesis describes the design and operation of the architecture, focusing
upon the contextual factors influencing the response process, and the way they are
measured and assessed to formulate response decisions. The architecture is underpinned by
the use of response policies which provide a means to reflect the changing needs and
characteristics of organisations.
The main concepts of the new architecture were validated via a proof-of-concept prototype
system. A series of test scenarios were used to demonstrate how the context of an attack
can influence the response decisions, and how the response policies can be customised and
used to enable intelligent decisions. This helped to prove that the concept of flexible
automated response is indeed viable, and that the research has provided a suitable
contribution to knowledge in this important domain.
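The core idea of context-sensitive, escalating response can be sketched as a tiny policy function. This is not the FAIR architecture itself: the response levels, the scoring rule, and the thresholds below are illustrative assumptions standing in for the configurable response policies the thesis describes.

```python
def choose_response(alert_confidence: float, target_criticality: str) -> str:
    """Pick an escalating response level from alert confidence and target context.

    A low-confidence alert (a likely false alarm) gets a passive, non-disruptive
    response; higher confidence, or a critical target, escalates the response.
    """
    levels = ["notify_admin", "increase_logging", "rate_limit_source", "block_source"]
    # Contextual weighting: attacks on critical assets are treated more severely.
    score = alert_confidence + (0.3 if target_criticality == "critical" else 0.0)
    if score < 0.4:
        return levels[0]
    if score < 0.6:
        return levels[1]
    if score < 0.8:
        return levels[2]
    return levels[3]

print(choose_response(0.35, "normal"))    # low confidence: passive notification only
print(choose_response(0.65, "critical"))  # confident alert on a critical asset: block
```

Keeping the thresholds in an editable policy, rather than hard-coded as here, is what lets the response behaviour reflect the changing needs of an organisation.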
Assessing the impact of computer use on landscape architecture professional practice: efficiency, effectiveness, and design creativity
Landscape architects claim that computers are efficient and effective presentation tools.
However, to date, no one has evaluated the impact of computer use on the nature and
quality of design in a practice setting. To further explore this issue, a trial was
conducted with landscape architecture students in which they worked in conventional,
mixed and digital media. Results indicated that although computer use was efficient in
some tasks, the nature of the design process did not yield itself effectively yet to
complete computerization. In addition, to assess the impact of computer use more
broadly on office practice today, a survey was conducted of over 100 Chapter Executive
Members of the American Society of Landscape Architects in the United States of
America. Survey results indicated that computer use has permeated all areas of landscape
architecture practice, and that it has genuinely improved drawing quality and capability.
However, it has not significantly impacted the artistic or creative aspects. Few
respondents believed the computer can improve these facets of the profession or that
traditional practice methods will be totally replaced by the computer.The results suggest that academic and professional sectors of landscape architecture
must help educate existing professionals to fully grasp the benefits of current and
emerging computer technologies and to prepare the future professionals for an
increasingly digital practice.
MSL Framework: (Minimum Service Level Framework) for cloud providers and users
Cloud computing enables parallel computing and has emerged as an efficient technology for meeting the challenges of the rapid growth of data that we experience in this Internet age. It is an emerging technology that offers subscription-based services and provides different models, such as IaaS, PaaS and SaaS, to cater to the needs of different user groups. The technology has enormous benefits, but there are serious concerns and challenges related to the lack of uniform standards, or the non-existence of a minimum benchmark for the level of services offered across the industry, to provide an effective, uniform and reliable service to cloud users. As cloud computing gains popularity, organizations and users are having problems adopting the service due to the lack of a minimum service level framework which could act as a benchmark in the selection of a cloud provider and provide quality of service according to the user's expectations. The situation becomes more critical due to the distributed nature of the service provider, which can offer the service from any part of the world. Because no minimum service level framework exists to act as a benchmark for a uniform service across the industry, serious concerns have been raised recently: security and data privacy breaches; authentication and authorization issues; lack of third-party audit; identity management problems; variable standards for integrity, confidentiality and data availability; no uniform incident response and monitoring standards; interoperability and lack of portability standards; lack of standards for infrastructure protection services; and weak governance and compliance standards. Due to confusion and the absence of universally agreed SLAs for a service model, different qualities of service are being provided across the cloud industry. Currently there is no uniform performance model agreed by all stakeholders that can provide performance criteria to measure, evaluate and benchmark the level of services offered by various cloud providers in the industry. With the implementation of the General Data Protection Regulation (GDPR) and demand from cloud users for Green SLAs that provide better resource allocation mechanisms, there will be serious implications for cloud providers and their consumers due to the lack of uniformity in SLAs and the variable standards of service offered by various cloud providers.
This research examines weaknesses in the service level agreements offered by various cloud providers and the impact that the absence of a uniformly agreed minimum service level framework has on the adoption and usage of cloud services. The research is focused on a higher education case study and proposes a conceptual model, based on a uniform minimum service model, that acts as a benchmark for the industry to ensure quality of service to cloud users in higher education institutions and to remove the barriers to the adoption of cloud technology. The proposed Minimum Service Level (MSL) framework provides a set of minimum, uniform standards in the key areas of concern raised by the participants from the HE institution, standards which are essential to cloud users and provide a minimum quality benchmark that can become a uniform standard across the industry. The proposed model produces cloud computing implementation evaluation criteria in an attempt to reduce the adoption barrier of cloud technology and to set minimum uniform standards to be followed by all cloud providers regardless of their hosting location, so that their performance can be measured, evaluated and compared across the industry to improve the overall QoS (Quality of Service) received by cloud users, remove the adoption barriers and concerns of cloud users, and increase competition across the cloud industry.
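The benchmarking role of such a framework can be sketched as a simple compliance check of a provider's SLA against a set of minimum service levels. The three criteria and their threshold values below are invented for illustration; the thesis derives the actual set from the higher-education case study.

```python
# Hypothetical minimum service levels acting as the industry-wide benchmark.
MSL = {"availability": 99.9, "incident_response_hours": 4, "third_party_audit": True}

def meets_msl(provider_sla: dict) -> list:
    """Return the list of MSL criteria that a provider's SLA fails to meet."""
    failures = []
    if provider_sla.get("availability", 0) < MSL["availability"]:
        failures.append("availability")
    if provider_sla.get("incident_response_hours", float("inf")) > MSL["incident_response_hours"]:
        failures.append("incident_response_hours")
    if not provider_sla.get("third_party_audit", False):
        failures.append("third_party_audit")
    return failures

provider = {"availability": 99.5, "incident_response_hours": 8, "third_party_audit": True}
print(meets_msl(provider))  # ['availability', 'incident_response_hours']
```

Because every provider is scored against the same minimum levels regardless of hosting location, the failure lists are directly comparable, which is precisely what a uniform benchmark is meant to enable.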