    A New Way of Speaking: Jonathan Safran Foer’s Everything Is Illuminated and Effective Forms of Holocaust Literature

    This paper examines the use of broken English, magic realism, and nonlinearity in Jonathan Safran Foer's novel Everything Is Illuminated, arguing that such nontraditional narrative techniques are appropriate for depicting the Holocaust, which cannot be represented by conventional narrative form. The paper also examines the ethics of creating a fictional work about the Holocaust: I argue that Foer distinguishes clearly between that which is meant to be taken literally and that which is meant to be understood symbolically, and that he in no way compromises the essential core of truth of what occurred during the Nazi genocide.

    The English sonnet in the nineteenth century

    Thesis (M.A.)--Boston University

    Compressed Distributed Gradient Descent: Communication-Efficient Consensus over Networks

    Network consensus optimization has received increasing attention in recent years and has found important applications in many scientific and engineering fields. To solve network consensus optimization problems, one of the best-known approaches is the distributed gradient descent method (DGD). However, in networks with slow communication rates, DGD performs poorly on high-dimensional network consensus problems because of the communication bottleneck. This motivates us to design a communication-efficient DGD-type algorithm based on compressed information exchanges. Our contributions in this paper are threefold: i) we develop a communication-efficient algorithm called amplified-differential compression DGD (ADC-DGD) and show that it converges under any unbiased compression operator; ii) we rigorously prove the convergence performance of ADC-DGD and show that it matches that of DGD without compression; iii) we reveal an interesting phase-transition phenomenon in the convergence speed of ADC-DGD. Collectively, our findings advance the state of the art of network consensus optimization theory.
    Comment: 11 pages, 11 figures, IEEE INFOCOM 201
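    The setting this abstract describes can be illustrated with a toy sketch: nodes run distributed gradient descent over a mixing matrix but broadcast only quantized copies of their iterates. Everything below is an illustrative assumption rather than the paper's ADC-DGD algorithm itself: the stochastic-rounding compressor (chosen because it is unbiased, the property the abstract requires), the quadratic local objectives, and the 4-node ring topology are all stand-ins.

    ```python
    import numpy as np

    def stochastic_round(x, step, rng):
        """Unbiased stochastic rounding to a grid of width `step`:
        E[q(x)] = x, the compressor property the abstract assumes.
        (Illustrative choice; ADC-DGD admits any unbiased operator.)"""
        scaled = x / step
        low = np.floor(scaled)
        frac = scaled - low
        return step * (low + (rng.random(x.shape) < frac).astype(float))

    def compressed_dgd(b, W, steps=400, lr=0.05, comp_step=0.05, seed=0):
        """Toy distributed gradient descent over mixing matrix W: node i
        holds f_i(x) = 0.5 * ||x - b[i]||^2 and exchanges only compressed
        (quantized) copies of its iterate with its neighbors. The consensus
        optimum is the mean of the rows of b."""
        rng = np.random.default_rng(seed)
        n, d = b.shape
        x = np.zeros((n, d))
        for _ in range(steps):
            q = stochastic_round(x, comp_step, rng)   # compressed broadcast
            x = W @ q - lr * (x - b)                  # mix, then local step
        return x

    # 4-node ring with a doubly stochastic mixing matrix
    W = np.array([[0.5, 0.25, 0.0, 0.25],
                  [0.25, 0.5, 0.25, 0.0],
                  [0.0, 0.25, 0.5, 0.25],
                  [0.25, 0.0, 0.25, 0.5]])
    b = np.array([[1., 2.], [3., 4.], [5., 6.], [7., 8.]])
    x = compressed_dgd(b, W)
    ```

    With a constant step size, every node's iterate lands in a small neighborhood of the network-wide mean of the b vectors, despite only quantized values ever crossing the links.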

    Shaking Up Traditional Training With Lynda.com

    Supporting the diverse technology training needs on campus while resources continue to dwindle is a challenge many of us continue to tackle. Institutions from small liberal arts campuses to large research universities are providing individualized training and application support 24/7 by subscribing to the lynda.com Online Training Library® and marketing the service to various combinations of faculty, staff, and students. As a supplemental service on most of our campuses, lynda.com has allowed us to extend support to those unable to attend live lab-based training, those who want advanced-level training, those who want training on specialized applications, and those who want to learn applications that are not in high demand. The service also provides cost-effective professional development opportunities for everyone on campus: from our own trainers and technology staff, who are developing new workshops, learning new software versions, or picking up new areas of expertise from project management to programming, to administrative and support staff who are trying to improve their skills in an ever-tighter economic environment. In this panel discussion, you will hear about different licensing approaches, ways of raising awareness about lynda.com on our campuses, lessons learned through implementation, reporting capabilities, and advice we would give to other campuses looking to offer this service.

    Communication-Efficient Network-Distributed Optimization with Differential-Coded Compressors

    Network-distributed optimization has attracted significant attention in recent years due to its ever-increasing applications. However, the classic decentralized gradient descent (DGD) algorithm is communication-inefficient for large-scale, high-dimensional network-distributed optimization problems. To address this challenge, many compressed DGD-based algorithms have been proposed. However, most existing works have high complexity and assume compressors with bounded noise power. To overcome these limitations, in this paper we propose a new differential-coded compressed DGD (DC-DGD) algorithm. The key features of DC-DGD are: i) DC-DGD works with general SNR-constrained compressors, relaxing the bounded-noise-power assumption; ii) the differential-coded design attains the same convergence rate as the original DGD algorithm; and iii) DC-DGD has the same low-complexity structure as the original DGD thanks to a self-noise-reduction effect. Moreover, these features inspire us to develop a hybrid compression scheme that offers a systematic mechanism for minimizing the communication cost. Finally, we conduct extensive experiments to verify the efficacy of the proposed DC-DGD algorithm and the hybrid compressor.
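    The differential-coding idea in this abstract can be sketched generically: each node transmits a compressed difference between its current iterate and a shared running estimate, so the signal being compressed (and hence the SNR-proportional noise) shrinks as the iterates converge, which is the intuition behind the self-noise-reduction effect. This is a minimal sketch under stated assumptions, not the paper's DC-DGD: the additive SNR-constrained compressor model, the quadratic local objectives, and the update ordering are all illustrative choices.

    ```python
    import numpy as np

    def noisy_compress(x, snr, rng):
        """SNR-constrained compressor: additive noise whose norm is a fixed
        fraction (1/snr) of the signal norm. A stand-in for the general
        compressor class the abstract mentions; the paper's model may differ."""
        noise = rng.normal(size=x.shape)
        scale = np.linalg.norm(x) / (snr * max(np.linalg.norm(noise), 1e-12))
        return x + scale * noise

    def differential_coded_dgd(b, W, steps=400, lr=0.05, snr=10.0, seed=0):
        """Toy DGD with differential coding: node i holds
        f_i(x) = 0.5 * ||x - b[i]||^2 and transmits a compressed *difference*
        between its iterate and a shared running estimate h. Because every
        node applies the same updates to h, all nodes agree on h without
        extra communication, and the compressed signal shrinks as the
        iterates converge (the self-noise-reduction idea)."""
        rng = np.random.default_rng(seed)
        n, d = b.shape
        x = np.zeros((n, d))
        h = np.zeros((n, d))   # public estimates, identical at all nodes
        for _ in range(steps):
            # transmit compressed differences; everyone updates the estimates
            c = np.stack([noisy_compress(x[i] - h[i], snr, rng)
                          for i in range(n)])
            h = h + c
            x = W @ h - lr * (x - b)   # mix public estimates, local step
        return x
    ```

    As the iterates settle, x - h goes to zero, the injected compression noise vanishes with it, and the method reaches the same neighborhood of the consensus optimum as uncompressed DGD.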