SCADA System Testbed for Cybersecurity Research Using Machine Learning Approach
This paper presents the development of a Supervisory Control and Data
Acquisition (SCADA) system testbed used for cybersecurity research. The testbed
consists of a water storage tank's control system, which is a stage in the
process of water treatment and distribution. Sophisticated cyber-attacks were
conducted against the testbed. During the attacks, the network traffic was
captured, and features were extracted from the traffic to build a dataset for
training and testing different machine learning algorithms. Five traditional
machine learning algorithms were trained to detect the attacks: Random Forest,
Decision Tree, Logistic Regression, Naive Bayes and KNN. Then, the trained
machine learning models were built and deployed in the network, where new tests
were made using online network traffic. The performance obtained during the
training and testing of the machine learning models was compared to the
performance obtained during the online deployment of these models in the
network. The results show the efficiency of the machine learning models in
detecting the attacks in real time. The testbed provides a good understanding
of the effects and consequences of attacks on real SCADA environments.
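The training stage the abstract describes can be sketched with scikit-learn. Everything below — the feature values, labels, and model settings — is invented for illustration and is not the paper's dataset or configuration:

```python
# Hypothetical sketch: training the five classifiers named in the abstract
# on synthetic stand-ins for features extracted from captured SCADA traffic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
n = 1000
# Toy features (e.g. packet rate, payload size, register value);
# labels: 0 = normal traffic, 1 = attack.
X = rng.normal(size=(n, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0.3).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "Random Forest": RandomForestClassifier(random_state=0),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Logistic Regression": LogisticRegression(),
    "Naive Bayes": GaussianNB(),
    "KNN": KNeighborsClassifier(),
}
# Offline train/test accuracy; the paper additionally redeploys the trained
# models on live traffic and compares the two scores.
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.3f}")
```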
DeepCEL0 for 2D Single Molecule Localization in Fluorescence Microscopy
In fluorescence microscopy, Single Molecule Localization Microscopy (SMLM)
techniques aim at localizing with high precision high density fluorescent
molecules by stochastically activating and imaging small subsets of blinking
emitters. Super Resolution (SR) plays an important role in this field since it
makes it possible to go beyond the intrinsic light diffraction limit. In this work, we
propose a deep learning-based algorithm for precise molecule localization of
high density frames acquired by SMLM techniques, whose ℓ2-based loss
function is regularized by positivity and ℓ0-based constraints. The ℓ0
norm is relaxed through its Continuous Exact ℓ0 (CEL0)
counterpart. The resulting approach, named DeepCEL0, is parameter-free, more
flexible, and faster than other state-of-the-art methods, and provides more
precise molecule localization maps. We validate our approach on
both simulated and real fluorescence microscopy data.
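For context, the CEL0 relaxation referred to above replaces the ℓ0 count with a continuous penalty. Below is a minimal sketch assuming the standard CEL0 formula of Soubies et al.; the parameters lam and a are illustrative, not values from the paper:

```python
import numpy as np

def cel0_penalty(x, lam, a):
    """Elementwise Continuous Exact l0 (CEL0) relaxation of lam * ||x||_0.

    Assumed form: lam - (a^2 / 2) * (|x| - sqrt(2*lam)/a)^2 for
    |x| <= sqrt(2*lam)/a, and lam otherwise.
    """
    x = np.asarray(x, dtype=float)
    thresh = np.sqrt(2.0 * lam) / a
    inside = np.abs(x) <= thresh
    return np.where(inside, lam - 0.5 * a**2 * (np.abs(x) - thresh) ** 2, lam)

# The relaxation agrees with lam * ||x||_0 at the extremes:
print(cel0_penalty([0.0], lam=1.0, a=1.0))   # ~0 at x = 0 (up to float error)
print(cel0_penalty([10.0], lam=1.0, a=1.0))  # saturates at lam for large |x|
```

Unlike the raw ℓ0 count, this penalty is continuous in x, which is what makes it usable inside a trainable loss.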
10181 Abstracts Collection -- Program Development for Extreme-Scale Computing
From May 2nd to May 7th, 2010, the Dagstuhl Seminar 10181
``Program Development for Extreme-Scale Computing''
was held in Schloss Dagstuhl -- Leibniz Center for Informatics.
During the seminar, several participants presented their current
research, and ongoing work and open problems were discussed. Abstracts of
the presentations given during the seminar as well as abstracts of
seminar results and ideas are put together in this paper.
Links to extended abstracts or full papers are provided, if available.
Local spatiotemporal modeling of house prices: a mixed model approach
The real estate market has long provided an active application area for spatial–temporal modeling and analysis and it is well known that house prices tend to be not only spatially but also temporally correlated. In the spatial dimension, nearby properties tend to have similar values because they share similar characteristics, but house prices tend to vary over space due to differences in these characteristics. In the temporal dimension, current house prices tend to be based on property values from previous years and in the spatial–temporal dimension, the properties on which current prices are based tend to be in close spatial proximity. To date, however, most research on house prices has adopted either a spatial perspective or a temporal one; relatively little effort has been devoted to situations where both spatial and temporal effects coexist. Using ten years of house price data in Fife, Scotland (2003–2012), this research applies a mixed model approach, semiparametric geographically weighted regression (GWR), to explore, model, and analyze the spatiotemporal variations in the relationships between house prices and associated determinants. The study demonstrates that the mixed modeling technique provides better results than standard approaches to predicting house prices by accounting for spatiotemporal relationships at both global and local scales
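The local-fit idea behind GWR can be sketched as weighted least squares with a distance-decay kernel. The data, kernel, and bandwidth below are invented; the paper's semiparametric (mixed) GWR additionally keeps some coefficients global, which this toy omits:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
coords = rng.uniform(0, 10, size=(n, 2))      # hypothetical property locations
floor_area = rng.uniform(50, 200, size=n)     # a single toy predictor
# The price effect of floor area drifts over space (the "local" relationship):
local_slope = 1.0 + 0.1 * coords[:, 0]
price = 20.0 + local_slope * floor_area + rng.normal(0, 5, size=n)

X = np.column_stack([np.ones(n), floor_area])  # [intercept, floor area]

def gwr_coefficients(site, bandwidth=2.0):
    """Weighted least squares at one site, Gaussian distance kernel."""
    d = np.linalg.norm(coords - site, axis=1)
    sw = np.sqrt(np.exp(-0.5 * (d / bandwidth) ** 2))  # sqrt of kernel weights
    beta, *_ = np.linalg.lstsq(X * sw[:, None], price * sw, rcond=None)
    return beta  # [local intercept, local floor-area coefficient]

west = gwr_coefficients(np.array([1.0, 5.0]))
east = gwr_coefficients(np.array([9.0, 5.0]))
print(west[1], east[1])  # slope recovered as larger in the east
```

Repeating this fit at every location yields a surface of local coefficients, which is what lets GWR expose spatial variation that a single global regression averages away.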
HOW TO INCREASE EMPLOYEE’S DISCIPLINARY IN FACULTY OF MEDICINE OF DIPONEGORO UNIVERSITY
Human capital plays an important role in an organization: it is the heart of organizational strategy, and many factors are embedded in it. Public service employees are government human capital, as distinct from contract employees. The quality of public service employees has recently become a major issue; it is widely known that public service employees are lacking in discipline, and the issue most discussed regarding their discipline is absenteeism. This study presents the factors that influence public service employees' discipline in the Faculty of Medicine of Diponegoro University.
To support decision making aimed at increasing the discipline of Faculty of Medicine of Diponegoro University employees, this study proposes several models analyzed by structural equation modeling (SEM). The study population comprises administrative staff in the Faculty of Medicine of Diponegoro University, both public service employees and contract employees. The respondents are 120 employees, who were given questionnaires related to the study.
The data analysis shows that human capital is influenced by knowledge sharing, empowerment, and workplace environment, and that human capital positively influences employees' discipline.
Keywords: public service employees, knowledge sharing, empowerment, workplace environment, human capital, discipline
You Are What You Eat: A Preference-Aware Inverse Optimization Approach
A key challenge in the emerging field of precision nutrition entails
providing diet recommendations that reflect both the (often unknown) dietary
preferences of different patient groups and known dietary constraints specified
by human experts. Motivated by this challenge, we develop a preference-aware
constrained-inference approach in which the objective function of an
optimization problem is not pre-specified and can differ across various
segments. Among existing methods, clustering models from machine learning are
not naturally suited for recovering the constrained optimization problems,
whereas constrained inference models such as inverse optimization do not
explicitly address non-homogeneity in given datasets. By harnessing the
strengths of both clustering and inverse optimization techniques, we develop a
novel approach that recovers the utility functions of a constrained
optimization process across clusters while providing optimal diet
recommendations as cluster representatives. Using a dataset of patients' daily
food intakes, we show how our approach generalizes stand-alone clustering and
inverse optimization approaches in terms of adherence to dietary guidelines and
partitioning observations, respectively. The approach makes diet
recommendations by incorporating both patient preferences and expert
recommendations for healthier diets, leading to structural improvements in both
patient partitioning and nutritional recommendations for each cluster. An
appealing feature of our method is its ability to consider infeasible but
informative observations for a given set of dietary constraints. The resulting
recommendations correspond to a broader range of dietary options, even when
they limit unhealthy choices.
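The core inverse-optimization step — finding an objective under which an observed decision is (near-)optimal — can be illustrated on a toy two-dimensional feasible region. All numbers below are invented, and the clustering and infeasible-observation handling described above are omitted:

```python
# Toy inverse optimization: given an observed choice x0 in a known feasible
# region, search candidate cost directions c for one that minimizes the
# optimality gap c.x0 - min_x c.x (for a minimization problem, a linear
# objective attains its optimum at a vertex, so the vertices suffice).
import numpy as np

# Feasible "diet" region given by its vertices (e.g. nutrient bounds):
vertices = np.array([[0, 0], [4, 0], [4, 3], [0, 3]], dtype=float)
x0 = np.array([4.0, 0.0])   # observed choice: a corner favouring food 1

best_gap, best_c = np.inf, None
for theta in np.linspace(0, 2 * np.pi, 360, endpoint=False):
    c = np.array([np.cos(theta), np.sin(theta)])  # unit-norm candidate cost
    gap = c @ x0 - (vertices @ c).min()           # optimality gap of x0
    if gap < best_gap:
        best_gap, best_c = gap, c

print(best_c, best_gap)  # a direction under which x0 is optimal (gap ~ 0)
```

The recovered direction is the "utility" explaining the observation; the paper's method performs this recovery per cluster rather than per observation, so that each patient segment gets its own objective.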
Replication as a strategy in capital intensive industries
Researchers describing replication strategies have proposed theoretical constructs that are positively associated with successful replication. As a rigorous quantitative exploration of replication in capital intensive industries, this study is the first of its kind and seeks to test the applicability of the theoretical frameworks. Responses to questionnaires sent to petrochemical refining sites, coupled with an independent performance metric (the Solomon Associates Comparative Performance Assessment Index), were used to model the impact of replication practices on site performance. This model is used to show that firms attempting to centrally define an Arrow Core suffer a performance penalty. Furthermore, the model shows that a clear differentiation between the phases of exploration and exploitation is not a requirement for successful replication in capital intensive industries. The model helps to explain why barriers exist preventing the conceptualisation of the core capabilities within capital intensive industries; why companies seeking to locally control deleterious practices are negatively impacted compared to those implementing centralised mechanisms; and why the effective use of a template yields a performance advantage even in the absence of a well defined Arrow Core. The analysis also suggests appropriate practices for managers seeking to expand in capital intensive sectors. Dissertation (MBA), University of Pretoria, 2010. Gordon Institute of Business Science (GIBS).
A grouping hyper-heuristic framework: application on graph colouring
Grouping problems are hard-to-solve combinatorial optimisation problems which require partitioning of objects into a minimum number of subsets while a given objective is simultaneously optimised. Selection hyper-heuristics are high-level general-purpose search methodologies that operate on a space formed by a set of low-level heuristics rather than solutions. Most of the recently proposed selection hyper-heuristics are iterative and make use of two key methods which are employed successively: heuristic selection and move acceptance. In this study, we present a novel generic selection hyper-heuristic framework containing a fixed set of reusable grouping low-level heuristics and an unconventional move acceptance mechanism for solving grouping problems. This framework deals with one solution at a time at any given decision point during the search process. Also, a set of high quality solutions, capturing the trade-off between the number of groups and the additional objective for the given grouping problem, is maintained. The move acceptance mechanism embeds a local search approach which is capable of progressing improvements on those trade-off solutions. The performance of different selection hyper-heuristics with various components under the proposed framework is investigated on graph colouring as a representative grouping problem. Then, the top performing hyper-heuristics are applied to a benchmark of examination timetabling instances. The empirical results indicate the effectiveness and generality of the proposed framework, enabling grouping hyper-heuristics to achieve high quality solutions in both domains. ©2015 Elsevier Ltd. All rights reserved.
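The heuristic-selection / move-acceptance loop can be illustrated on a tiny graph colouring instance. The two low-level heuristics and the acceptance rule below are simplistic stand-ins, not the framework's actual components (which include a trade-off archive and local search inside the acceptance step):

```python
import random

# A small graph containing a triangle (0,1,2), so at least 3 colours needed.
edges = {(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (4, 0)}
nodes = range(5)

def conflicts(col):
    return sum(1 for u, v in edges if col[u] == col[v])

def num_colours(col):
    return len(set(col.values()))

def recolour_random_node(col):          # low-level heuristic 1
    c = dict(col)
    c[random.choice(list(nodes))] = random.randrange(len(nodes))
    return c

def merge_two_colours(col):             # low-level heuristic 2 (grouping move)
    c = dict(col)
    used = list(set(c.values()))
    if len(used) >= 2:
        a, b = random.sample(used, 2)
        for v in c:
            if c[v] == a:
                c[v] = b
    return c

random.seed(0)
current = {v: v for v in nodes}         # start with one colour per node
heuristics = [recolour_random_node, merge_two_colours]
for _ in range(5000):
    h = random.choice(heuristics)       # heuristic selection (uniform random)
    candidate = h(current)
    # move acceptance: accept if no worse on (conflicts, colour count)
    if (conflicts(candidate), num_colours(candidate)) <= \
            (conflicts(current), num_colours(current)):
        current = candidate

print(num_colours(current), conflicts(current))  # small, conflict-free colouring
```

The point of the hyper-heuristic view is that only `conflicts`/`num_colours` and the low-level moves are problem-specific; the selection and acceptance loop is reusable across grouping problems such as examination timetabling.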
Enhanced Community-Based Routing for Low-Capacity Pocket Switched Networks
Sensor devices and the emergent networks that they enable are capable of transmitting information
between data sources and a permanent data sink. Since these devices have low-power and intermittent
connectivity, latency of the data may be tolerated in an effort to save energy for certain classes of data.
The BUBBLE routing algorithm developed by Hui et al. in 2008 provides consistent routing by employing a
model which computes individual nodes' popularity from sets of nodes and then uses these popularity values
for forwarding decisions. This thesis considers enhancements to BUBBLE based on the hypothesis that nodes
do form groups and certain centrality values of nodes within these groups can be used to improve routing
decisions further.
Built on this insight, there are two algorithms proposed in this thesis. First is the Community-Based-
Forwarding (CBF), which uses pairwise group interactions and pairwise node-to-group interactions as a
measure of popularity for routing messages. By using a different measure of popularity than BUBBLE
as an additional factor in determining message forwarding, CBF is a more conservative routing scheme
than BUBBLE. Thus, it provides consistently superior message transmission and delivery performance, at
an acceptable delay cost, in resource-constrained environments.
To overcome this delay cost, the concept of a unique interaction pattern within groups of nodes is
introduced into CBF, which is further refined into an enhanced algorithm known as
Hybrid-Community-Based-Forwarding (HCBF). Utilizing this factor channels messages along the entire path
with consideration for a higher probability of contact with the destination group and the destination node.
Overall, the major contribution of this thesis is to design and evaluate an enhanced social based routing
algorithm for resource-constrained Pocket Switched Networks (PSNs), which will optimize energy consumption
related to data transfer. It will do so by explicitly considering features of communities in order to reduce
packet loss while maintaining a high delivery ratio and low delay.
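The community-and-popularity forwarding idea can be sketched as a single decision function. The rule, community labels, and centrality values below are illustrative, not the thesis's actual CBF/HCBF algorithms:

```python
# Sketch of a BUBBLE-style forwarding rule: hand a message over only when the
# encountered node is in the destination's community, or is more "popular"
# than the current carrier. In practice the community labels and centrality
# scores are computed from contact traces; here they are invented inputs.

def should_forward(carrier, candidate, destination, community, centrality):
    """Return True if the message should move from carrier to candidate."""
    if community[candidate] == community[destination]:
        if community[carrier] == community[destination]:
            # Both inside the destination's community: use local popularity.
            return centrality[candidate] > centrality[carrier]
        return True  # entering the destination's community is always progress
    # Outside the destination's community: climb the popularity gradient.
    return centrality[candidate] > centrality[carrier]

community = {"a": 1, "b": 1, "c": 2, "d": 2}
centrality = {"a": 0.9, "b": 0.2, "c": 0.5, "d": 0.1}

print(should_forward("b", "a", "d", community, centrality))  # True: a is more popular
print(should_forward("a", "c", "d", community, centrality))  # True: c is in d's community
print(should_forward("c", "b", "d", community, centrality))  # False: leaves d's community
```

Being selective about when to hand over copies is what makes such schemes conservative with transmissions, which is the energy argument the thesis develops.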