25 research outputs found

    Cloud Computing, Clickwrap Agreements, and Limitation on Liability Clauses: A Perfect Storm?

    “To the cloud!” trumpets a commercial by Microsoft, whose aim is to herd customers, and their checkbooks, into the cloud computing fold. But Microsoft, and other cloud providers like Amazon and Google, might inadvertently be doing just the opposite. It is not for lack of security or even early adopter apprehension that potential customers might turn away. Nor is it a lack of fantastic, cost-saving applications of cloud technology. Rather, the problem is buried deep within these tech giants’ clickwrap agreements—the ones that customers rarely read and to which they invariably click “I Agree.” Hidden in these agreements are limitation on liability clauses, veritable safe harbors for cloud providers and submerged icebergs for the unwary cloud customer. Often, these clauses wholly abrogate a customer’s right to recover damages for his provider’s wrongful acts. In other words, a provider could purposefully delete its customers’ data or shut down its users’ websites, leaving the aggrieved customers with no cause of action and no right to recover. While limitation on liability clauses are not new to the contract law vernacular, their inclusion in cloud computing agreements is particularly troublesome. The amount of potential liability that customers may waive through a half-cocked click is as enormous as it is troubling. While courts have recently held these clauses enforceable in other Internet-related areas, they should be wary of blindly applying that precedent and enforcing these clauses in the cloud computing context.

    Diversifying the genomic data science research community

    Over the past 20 years, the explosion of genomic data collection and the cloud computing revolution have made computational and data science research accessible to anyone with a web browser and an internet connection. However, students at institutions with limited resources have received relatively little exposure to curricula or professional development opportunities that lead to careers in genomic data science. To broaden participation in genomics research, the scientific community needs to support local education and research programs at underserved institutions (UIs). These include community colleges, historically Black colleges and universities, Hispanic-serving institutions, and tribal colleges and universities that support ethnically, racially, and socioeconomically underrepresented students in the United States. We have formed the Genomic Data Science Community Network to support students, faculty, and their networks in identifying opportunities and broadening access to genomic data science. These opportunities include expanding access to infrastructure and data, providing UI faculty development opportunities, strengthening collaborations among faculty, recognizing UI teaching and research excellence, fostering student awareness, developing modular and open-source resources, expanding course-based undergraduate research experiences (CUREs), building curriculum, supporting student professional development and research, and removing financial barriers through funding programs and collaborator support.

    CLOUD COMPUTING

    Cloud computing is a new technology that helps us use the cloud to meet our computation needs. The cloud refers to a scalable network of computers that work together, much like the Internet. An important element of cloud computing is that we shift the processing, management, storage, and implementation of our data from local machines into the cloud, which helps improve efficiency. Because it is a new technology, it has both advantages and disadvantages, and these are scrutinized in this article. Some vanguards of this technology are then studied. Afterwards, we find that cloud computing will play an important role in our future lives.

    Challenges and Approaches in Green Data Center

    Cloud computing is a fast-evolving area of information and communication technologies (ICTs) that has created new environmental issues. Cloud computing technologies have a wide range of applications due to their scalability, dependability, and trustworthiness, as well as their ability to deliver high performance at a low cost. The cloud computing revolution is altering modern networking, offering economic and technological benefits as well as potential environmental benefits. These innovations can improve energy efficiency while simultaneously reducing carbon emissions and e-waste, traits that can make cloud computing more environmentally friendly. Green cloud computing is the science and practice of properly designing, manufacturing, using, and disposing of computers, servers, and associated subsystems such as displays, printers, storage devices, and networking and communication systems while minimising or eliminating environmental impact. The most significant reasons for a data centre review are to understand capacity, dependability, durability, algorithmic efficiency, resource allocation, virtualization, power management, and other elements. The green cloud design aims to reduce data centre power consumption; the main advantage of a green cloud computing architecture is that it ensures real-time performance while reducing the energy consumption of the internet data centre (IDC). This paper analyses the difficulties faced by data centres, such as capacity planning and management, up-time and performance maintenance, energy efficiency and cost cutting, and real-time monitoring and reporting. A solution to the identified problems based on a DCIM (data centre infrastructure management) system is also presented. Finally, the paper discusses the market report's coverage of green data centres, green computing principles, and future research challenges. This comprehensive green cloud analysis will assist green research fellows in learning about green cloud concerns and understanding future research challenges in the field.
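    Since this abstract centres on cutting data centre power consumption, a worked example of a standard efficiency metric may help make the goal concrete: Power Usage Effectiveness (PUE), the ratio of total facility energy to IT equipment energy. The sketch below is illustrative only; the energy figures are assumptions, not measurements from the paper.

```python
# Illustrative sketch: Power Usage Effectiveness (PUE), a common data-centre
# efficiency metric. The energy figures below are assumed for demonstration only.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy (the ideal value is 1.0)."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical monthly readings for a small data centre.
total_kwh = 1_800_000  # IT load plus cooling, lighting, and power distribution
it_kwh = 1_200_000     # servers, storage, and networking only

print(f"PUE = {pue(total_kwh, it_kwh):.2f}")  # 1.50: 0.5 kWh of overhead per kWh of IT work
```

    Real-time monitoring of the kind a DCIM system provides is what makes a metric like this trackable over time, which is why the paper pairs monitoring and reporting with energy efficiency.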

    Cyberinfrastructure Deployments on Public Research Clouds Enable Accessible Environmental Data Science Education

    Modern science depends on computers, but not all scientists have access to the scale of computation they need. A digital divide separates scientists who accelerate their science using large cyberinfrastructure from those who do not, or who do not have access to the compute resources or learning opportunities to develop the skills needed. The exclusionary nature of the digital divide threatens equity and the future of innovation by leaving people out of the scientific process while over-amplifying the voices of a small group who have resources. However, there are potential solutions: recent advancements in public research cyberinfrastructure and resources developed during the open science revolution are providing tools that can help bridge this divide. These tools can enable access to fast and powerful computation with modest internet connections and personal computers. Here we contribute another resource for narrowing the digital divide: scalable virtual machines running on public cloud infrastructure. We describe the tools, infrastructure, and methods that enabled successful deployment of a reproducible and scalable cyberinfrastructure architecture for a collaborative data synthesis working group in February 2023. This platform enabled 45 scientists with varying data and compute skills to leverage 40,000 hours of compute time over a 4-day workshop. Our approach provides an open framework that can be replicated for educational and collaborative data synthesis experiences in any data- and compute-intensive discipline.
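    As a rough sense check on the scale reported above (45 scientists, 40,000 compute hours, 4 days), the back-of-the-envelope calculation below assumes that "hours of compute time" means aggregate core or VM hours rather than wall-clock time; that reading is an assumption, not a definition taken from the paper.

```python
# Back-of-the-envelope check of the reported workshop scale.
# Assumes "hours of compute time" means aggregate core/VM hours.

scientists = 45
compute_hours = 40_000
workshop_days = 4

wall_clock_hours = workshop_days * 24                      # 96 hours of workshop time
avg_concurrent_units = compute_hours / wall_clock_hours    # ~417 cores/VMs busy on average
per_scientist = avg_concurrent_units / scientists          # ~9 concurrent cores/VMs each

print(f"~{avg_concurrent_units:.0f} units running continuously, ~{per_scientist:.1f} per scientist")
```

    Under that assumption, each participant effectively had around nine cores or machines working around the clock, which gives a feel for why elastic public cloud capacity matters for workshops like this.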

    Big data y NoSQL: su rol en la revolución del cloud computing y sus retos hacia la estandarización

    The boom in the use of information technologies (IT) oriented toward cloud computing has generated large amounts and varieties of data, or Big Data, in the data centres of the organizations that provide these services, creating challenges for IT developers and administrators in managing those resources efficiently. Faced with this phenomenon, relational databases are regarded by IT professionals as an obstruction to delivering the essential characteristics of cloud computing. This has driven a series of new approaches and technologies for data management, known as non-relational or NoSQL databases, designed to provide speed, scalability, high availability, and elasticity, facilitating data management in cloud computing solutions. This article presents the concepts of Big Data, its main technological tools for data management, the NoSQL movement, their role in the cloud computing revolution, and the challenges these technologies face on the road to standardization.
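    To make the contrast with relational schemas concrete, here is a minimal sketch of the schemaless, document-oriented style the abstract describes, using the pymongo driver for MongoDB; the connection URI and the database and collection names are hypothetical placeholders, not systems mentioned in the article.

```python
# Minimal sketch of schemaless document storage in a NoSQL store (MongoDB via pymongo).
# The URI, database, and collection names are placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
events = client["bigdata_demo"]["events"]

# Documents with different shapes can live in the same collection --
# no ALTER TABLE is needed when a new field shows up in the incoming data.
events.insert_one({"user": "u1", "action": "login", "device": "mobile"})
events.insert_one({"user": "u2", "action": "upload", "bytes": 1_048_576, "region": "us-east"})

print(events.count_documents({"action": "login"}))
```

    This schema flexibility, together with horizontal scaling, is what NoSQL systems trade against the strict consistency guarantees and standardised query language of relational databases, which is the standardization challenge the article points to.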

    Barriers to the adoption of cloud services by SMEs in South Africa.

    Master’s Degree. University of KwaZulu-Natal, Durban. This research looks at the barriers faced by small and medium enterprises (SMEs) in South Africa in accessing cloud computing services. It draws on a survey of a sample of businesses mainly located in Durban. The upswing in entrepreneurship activity experienced in South Africa implies that SMEs have the potential to play an even greater role in driving the economic growth of the country. These gains can only be consolidated if the country can reduce the high failure rate among SMEs; it has been asserted that 75% of new SMEs do not become established firms. Information and communication technology (ICT) is acknowledged as an enabler of enterprise growth, and some studies have focused on the role that ICT in general plays in SME growth in South Africa. Few, however, have delved deeper and looked at ICT adoption in the cloud computing context and the barriers encountered. In this area, some studies show that strides made in the cloud computing revolution have, in theory, levelled the playing field by providing SMEs with access to affordable ICT services, giving them a competitive advantage similar to, or beyond, that of bigger firms. The focus of this study was on ascertaining the current uptake of cloud computing services by SMEs in South Africa, with a view to identifying the barriers that hinder full adoption. A total of 210 SMEs were chosen for this study based on their profile; of the 210 questionnaires sent, only 43 responses were received, a response rate of roughly 20%. The data were captured in a QuestionPro database. From the analysis, using the unified theory of acceptance and use of technology (UTAUT) framework, it was found that Facilitating Conditions (FC) significantly influenced SME Use Behaviour for cloud computing technology, while Performance Expectancy (PE), Effort Expectancy (EE), and Social Influence (SI) had no statistically significant influence on Behavioural Intention (BI) to adopt cloud services. However, Behavioural Intention significantly influenced Use Behaviour (UB) for cloud services. The sample used in this study was fairly small due to resource constraints, so it is the researcher’s recommendation that a follow-up study with a larger sample be conducted if resources allow. The most significant limitation was the unreliable contact listings for SMEs, which left many SMEs unreachable.
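    For readers unfamiliar with how UTAUT constructs are usually tested, the sketch below shows one conventional approach: regressing Behavioural Intention on PE, EE, and SI, and Use Behaviour on FC and BI. The data are synthetic placeholders, not the survey responses from this thesis, and the modelling choice is an illustration rather than the author's exact method.

```python
# Illustrative sketch of a UTAUT-style analysis with synthetic data.
# Column names mirror the constructs named in the abstract; values are random placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 43  # same order of magnitude as the 43 survey responses
df = pd.DataFrame({
    "PE": rng.normal(size=n), "EE": rng.normal(size=n),
    "SI": rng.normal(size=n), "FC": rng.normal(size=n),
})
df["BI"] = 0.1 * df["PE"] + rng.normal(size=n)                    # behavioural intention
df["UB"] = 0.6 * df["FC"] + 0.5 * df["BI"] + rng.normal(size=n)   # use behaviour

# Model 1: do PE, EE, and SI predict Behavioural Intention?
bi_model = sm.OLS(df["BI"], sm.add_constant(df[["PE", "EE", "SI"]])).fit()
# Model 2: do FC and BI predict Use Behaviour?
ub_model = sm.OLS(df["UB"], sm.add_constant(df[["FC", "BI"]])).fit()

print(bi_model.summary())
print(ub_model.summary())
```

    With real survey data, coefficients that are significant in the second model but not the first would mirror the thesis finding that FC and BI drive use while PE, EE, and SI do not reach significance.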

    The Internet after Aereo: How to Save Innovation from the Public Performance Right

    The Supreme Court's decision in American Broadcasting Companies, Inc. v. Aereo, Inc. overturned the Second Circuit's rule that separate copies create separate performances without clarifying the scope of a performance. The decision creates significant ambiguity surrounding the public performance right and potentially massive liability for cloud-computing companies. Since cloud computing allows customers to run programs remotely from a company's servers, two independent customers watching different copies of the same movie from the same cloud results in the cloud conducting a public performance. This Note examines this problem, concludes that the current public performance regime has become obsolete, and proposes a new bright-line safe harbor for cloud-computing companies based on the fair use doctrine, dubbed the Fair Performance Doctrine.