SLA-Oriented Resource Provisioning for Cloud Computing: Challenges, Architecture, and Solutions
Cloud computing systems promise to offer subscription-oriented,
enterprise-quality computing services to users worldwide. With the increased
demand for delivering services to a large number of users, they need to offer
differentiated services to users and meet their quality expectations. Existing
resource management systems in data centers are yet to support Service Level
Agreement (SLA)-oriented resource allocation, and thus need to be enhanced to
realize cloud computing and utility computing. In addition, no work has been
done to collectively incorporate customer-driven service management,
computational risk management, and autonomic resource management into a
market-based resource management system to target the rapidly changing
enterprise requirements of Cloud computing. This paper presents vision,
challenges, and architectural elements of SLA-oriented resource management. The
proposed architecture supports integration of market-based provisioning policies
and virtualisation technologies for flexible allocation of resources to
applications. The performance results obtained from our working prototype
system show the feasibility and effectiveness of SLA-based resource
provisioning in Clouds.
Comment: 10 pages, 7 figures, Conference Keynote Paper: 2011 IEEE
International Conference on Cloud and Service Computing (CSC 2011, IEEE
Press, USA), Hong Kong, China, December 12-14, 2011
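The abstract's market-based provisioning idea can be illustrated with a hypothetical sketch (not the paper's actual architecture or prototype): a toy admission policy that ranks VM requests by the revenue each offers per core and admits them until data-center capacity is exhausted. All names and numbers here are illustrative assumptions.

```python
# Hypothetical sketch of market-oriented VM provisioning: admit requests
# in order of revenue per core until capacity runs out.
from dataclasses import dataclass

@dataclass
class Request:
    name: str
    cores: int      # cores the requested VM needs
    revenue: float  # payment the customer offers

def provision(requests, capacity):
    """Greedy admission: highest revenue-per-core first; reject the rest."""
    accepted, used = [], 0
    for r in sorted(requests, key=lambda r: r.revenue / r.cores, reverse=True):
        if used + r.cores <= capacity:
            accepted.append(r.name)
            used += r.cores
    return accepted

reqs = [Request("A", 4, 8.0), Request("B", 2, 6.0), Request("C", 4, 4.0)]
print(provision(reqs, 6))  # B (3.0/core) and A (2.0/core) fit; C is rejected
```

A real SLA-oriented allocator would also weigh deadlines and violation penalties, but the same ranking-and-admission skeleton applies.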
Cloudbus Toolkit for Market-Oriented Cloud Computing
This keynote paper: (1) presents the 21st century vision of computing and
identifies various IT paradigms promising to deliver computing as a utility;
(2) defines the architecture for creating market-oriented Clouds and computing
atmosphere by leveraging technologies such as virtual machines; (3) provides
thoughts on market-based resource management strategies that encompass both
customer-driven service management and computational risk management to sustain
SLA-oriented resource allocation; (4) presents the work carried out as part of
our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a
Service software system containing SDK (Software Development Kit) for
construction of Cloud applications and deployment on private or public Clouds,
in addition to supporting market-oriented resource management; (ii)
internetworking of Clouds for dynamic creation of federated computing
environments for scaling of elastic applications; (iii) creation of 3rd party
Cloud brokering services for building content delivery networks and e-Science
applications and their deployment on capabilities of IaaS providers such as
Amazon along with Grid mashups; (iv) CloudSim supporting modelling and
simulation of Clouds for performance studies; (v) Energy Efficient Resource
Allocation Mechanisms and Techniques for creation and management of Green
Clouds; and (vi) pathways for future research.
Comment: 21 pages, 6 figures, 2 tables, Conference paper
A Review on Progress and Problems of Quantum Computing as a Service (QCaaS) in the Perspective of Cloud Computing
Cloud computing is a globally established system, whereas quantum computing is a hypothetical model still under tentative analysis. Cloud systems have weaknesses in security, processing, backup, and vicinity, and quantum computing illustrates some revolutionary solutions to overcome these weaknesses. Most researchers are optimistic that quantum computing will improve cloud systems, but it is not easy to combine these two different systems. We show two quantum approaches, quantum cryptography and blind quantum computing, for securing cloud computing. Quantum cryptography secures user data transmission and communication through the cloud from hackers, and blind quantum computing secures data processing in the cloud from instant eavesdropping or access by any malicious cloud provider or third party. This paper's major target is to show the advantages and disadvantages of quantum computing from the viewpoint of integrating it with cloud systems, and to review some current improvements in quantum computing and computation.
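The quantum-cryptography approach the abstract mentions is usually grounded in quantum key distribution. As a hypothetical illustration (a classical simulation, not real quantum hardware or the paper's own scheme), the sifting step of the BB84 protocol can be sketched: sender and receiver keep only the bits for which their randomly chosen bases happened to match.

```python
# Classical simulation of BB84 key sifting: positions where Alice's
# encoding basis matches Bob's measurement basis survive into the key.
import random

def bb84_sift(n, seed=0):
    rng = random.Random(seed)
    bits    = [rng.randint(0, 1) for _ in range(n)]  # Alice's raw key bits
    a_bases = [rng.choice("ZX") for _ in range(n)]   # Alice's encoding bases
    b_bases = [rng.choice("ZX") for _ in range(n)]   # Bob's measurement bases
    # With matching bases (and no eavesdropper), Bob measures Alice's bit
    # exactly; mismatched bases yield a random result and are discarded.
    return [b for b, ab, bb in zip(bits, a_bases, b_bases) if ab == bb]

key = bb84_sift(16)
print(len(key), key)  # roughly half the positions survive sifting
```

In the full protocol, Alice and Bob additionally compare a sample of the sifted bits: an eavesdropper's measurements disturb the qubits and show up as an elevated error rate.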
Investigation of Factors Affecting Immunotherapy Treatment Results by Binary Logistic Regression and Classification Analysis
There are many factors that affect the success of immunotherapy treatment. In addition to clinical examination, investigating these factors with different methods gives researchers prior knowledge and saves time. In this study, the aim was to evaluate the application of logistic regression and data mining methods for assessing the success of post-immunotherapy treatment. Binary logistic regression analysis with WTA and classification analysis with Weka were used to evaluate whether immunotherapy treatment was successful for warts. A decision tree structure was also examined to determine the variables that affect classification success. According to the logistic regression result, the model is significant (p = 0.022 < 0.05). The classification accuracy for the logistic regression model was calculated as 85.56%, which shows that the model is successful. Data mining experiments were carried out with different classification algorithms; the best result was found with decision trees (the J48 algorithm), with an 85.56% accuracy rate. According to the J48 decision tree structure, the variables affecting treatment outcome were time, number of warts, and age, respectively. The study results show that both methods yield parallel results. The decision tree algorithm can be used as an alternative to classical statistical models; in particular, where clinical research is limited, it will benefit researchers in areas such as transition to analysis, preliminary information gathering, time, and effort.
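The binary logistic regression step of the workflow above can be sketched in a minimal, self-contained way. This is a hypothetical example on synthetic data (not the study's dataset, and plain gradient descent rather than the WTA/Weka tools the abstract names), using the same three predictors: treatment time, number of warts, and age.

```python
# Hypothetical sketch: binary logistic regression fitted by stochastic
# gradient descent, predicting treatment success from (time, warts, age).
import math
import random

def fit_logistic(X, y, lr=0.1, epochs=2000):
    w = [0.0] * (len(X[0]) + 1)            # intercept followed by weights
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1 / (1 + math.exp(-z))     # sigmoid
            g = p - yi                     # gradient of the log-loss
            w[0] -= lr * g
            for j, xj in enumerate(xi):
                w[j + 1] -= lr * g * xj
    return w

def predict(w, xi):
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1 if z > 0 else 0

# Synthetic patients: (treatment time in months, wart count, age).
random.seed(1)
X = [(random.uniform(1, 12), random.randint(1, 10), random.randint(15, 60))
     for _ in range(80)]
y = [1 if t < 7 else 0 for t, _, _ in X]   # shorter treatment -> success
scaled = [(t / 12, n / 10, a / 60) for t, n, a in X]
w = fit_logistic(scaled, y)
acc = sum(predict(w, xi) == yi for xi, yi in zip(scaled, y)) / len(X)
print(round(acc, 2))
```

The decision-tree comparison in the study (J48, i.e. C4.5) would instead split on one variable at a time, which is why it exposes the importance ordering (time, warts, age) directly in its structure.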
Artificial Intelligence in Engineering Risk Analytics
Risks exist in every aspect of our lives and can mean different things to different people. While negative in general, they always cause a great deal of potential damage and inconvenience for stakeholders. Recent engineering risks include the Fukushima nuclear plant disaster from the 2011 tsunami, a year that also saw earthquakes in New Zealand, tornados in the US, and floods in both Australia and Thailand. Earthquakes, tornados (not to mention hurricanes), and floods are repetitive natural phenomena, but the October 2011 floods in Thailand were the worst in 50 years, impacting supply chains including those of Honda, Toyota, Lenovo, Fujitsu, Nippon Steel, Tesco, and Canon. Human-induced tragedies included a clothing factory fire in Bangladesh in 2012 that left over 100 dead; Wal-Mart and Sears supply chains were downstream customers. The events of Bhopal in 1984, Chernobyl in 1986, Exxon Valdez in 1989, and the Gulf oil spill of 2010 were tragic accidents. There are also malicious events, such as the Tokyo sarin attack in 1995, the World Trade Center and Pentagon attacks in 2001, and terrorist attacks on subways in Madrid (2004), London (2005), and Moscow (2010). The news brings us reports of such events all too often. The next step up in intensity is war, which seems always to be with us in some form somewhere in the world. Complex human systems also cause problems: the financial crisis resulted in recession in all aspects of the economy. Risk analytics has become an important topic in today's more complex, interrelated global environment, replete with threats from natural, engineering, economic, and technical sources (Olson and Wu, 2015).