Forking Belief in Cryptocurrency: A Tax Non-Realization Event
When the community of believers in a cryptocurrency splits into two, the currency may experience a “hard fork” and split into two independent currencies. Indeed, like fiat currencies, cryptocurrencies only have value if people believe they have value and are willing to use them in transactions. Hard forks do not create new value unless they inspire new belief; otherwise, they merely split the value of the original currency between the two new currencies. The Internal Revenue Service is wrong to conclude in Revenue Ruling 2019–24 that the value of the new currency resulting from a hard fork constitutes gross income in the hands of coin owners. A hard fork is properly understood as a division of each coin of the original currency into two resulting coins and is no more a taxable event than when a property owner subdivides a larger parcel of land into two smaller lots. The appropriate question is not whether there is income, but how the owner’s basis in the original property should be split between the two resulting parts. After delving into the nature of hard forks and exploring the governing law, the author suggests an approach for basis allocation.
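The basis-allocation question can be made concrete. Below is a minimal Python sketch assuming the owner's original basis is apportioned pro rata by the relative fair market values of the two coins just after the fork, by analogy to the land-subdivision treatment the abstract invokes; this is an illustrative assumption, not necessarily the specific method the article proposes.

```python
def allocate_basis(original_basis, fmv_original_chain, fmv_new_chain):
    """Split the owner's basis in a pre-fork coin between the two
    resulting coins, pro rata by fair market value (FMV) after the
    fork. Illustrative assumption: FMVs are per-coin values observed
    shortly after the hard fork."""
    total_fmv = fmv_original_chain + fmv_new_chain
    if total_fmv == 0:
        raise ValueError("post-fork values must not both be zero")
    basis_original = original_basis * fmv_original_chain / total_fmv
    basis_new = original_basis * fmv_new_chain / total_fmv
    return basis_original, basis_new

# Example: a coin bought for $500 forks; the legacy chain trades at
# $240 and the new chain at $60. No income is recognized; the $500
# basis is split $400 / $100 between the two resulting coins.
print(allocate_basis(500.0, 240.0, 60.0))  # (400.0, 100.0)
```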
Using the High Productivity Language Chapel to Target GPGPU Architectures
It has been widely shown that GPGPU architectures offer large performance gains compared to their traditional CPU counterparts for many applications. The downside to these architectures is that the current programming models present numerous challenges to the programmer: lower-level languages, explicit data movement, loss of portability, and challenges in performance optimization. In this paper, we present novel methods and compiler transformations that increase productivity by enabling users to easily program GPGPU architectures using the high productivity programming language Chapel. Rather than resorting to different parallel libraries or annotations for a given parallel platform, we leverage a language that has been designed from first principles to address the challenge of programming for parallelism and locality. This also has the advantage of being portable across distinct classes of parallel architectures, including desktop multicores, distributed memory clusters, large-scale shared memory, and now CPU-GPU hybrids. We present experimental results from the Parboil benchmark suite which demonstrate that codes written in Chapel achieve performance comparable to the original versions implemented in CUDA.
Funding: NSF CCF 0702260; Cray Inc. Cray-SRA-2010-01696; 2010-2011 Nvidia Research Fellowship.
A Semester Long Classroom Course Mimicking a Software Company and a New Hire Experience for Computer Science Students Preparing to Enter the Software Industry
Students in a Computer Science degree program must learn to code before they can be taught software engineering skills. This core skill set consists of the constructs of various languages and the ability to create short programs or applications as independent assignments, arriving at solutions that use the skills covered in that course's language (Chatley & Field, 2017). As upperclassmen, students are often allowed to apply these skills in newer ways and have the opportunity to work on longer, more involved assignments, though frequently still independently or in small groups of two to three students. Once these students graduate and enter the software industry, they will find that most companies follow specific development methodologies, from one of the many forms of Agile through Waterfall, while working in large groups or teams where each developer is responsible for specific pieces of functionality, participating in design meetings and code reviews, and using a code versioning system, such as git, and a project management system, such as Jira, all in a very collaborative environment. This study will develop a course that allows students to apply these skills in a more realistic setting while remaining on campus, and will monitor the students' beliefs about their preparedness for the world outside the computer science building.
Communication problems in the parish ministry: an action research study of fifty Protestant ministers in a New England city
Thesis (Ph.D.)--Boston University
Designed within the framework of communication theory and utilizing psychological methods, a study of the Protestant parish ministry has been made, yielding basic vocational information and permitting the formulation of five major problems confronting the ministry and the Church today.
Vital information about each communication which took place on one weekday and one Sunday in the life of each minister was systematically collected by a self-recording technique. All topics of conversation were gathered for content analysis. The methodology of this study differs from others made in this field in deriving primary data of communication from records made close to the moment of action, employing extensive followup interviews, and representing a virtually complete sample of the professional Protestant ministry in a greater urban area of 150,000 people.
Primary data were transferred to IBM punch cards and processed electronically; statistical results are presented in thirty-two tables and illustrations. Analysis is made of the contents, means, motives, and personal network of communication, providing answers to the questions: What does the minister talk about? By what means does he communicate with others? To whom does he communicate, and why?
On the basis of primary and interview data five crucial problems of communication are formulated: (1) the problem of specialization, arising from the desire for vocational fulfilment, but largely frustrated by an overwhelming need for non-professional services; (2) the problem of supply and demand, which calls attention to the Church's failure to provide facilities and manpower to meet rising demands, and the inadequacy of a communication network corresponding roughly to a single wheel with all lines converging at the center, the minister; (3) the problem of selectivity or bias, indicating the degree to which the minister's conscious and unconscious preferences interfere with the establishment of truly cosmopolitan and inclusive Christian communities; (4) the problem of superficiality, indicated by the heavy predominance of incidental contents, brief contacts, and impersonal means of communication; and (5) the problem of sensitivity, of remaining a sensitive receiver of communication in spite of the serious barriers created by status, schedule, and preoccupation with parish detail.
Judging from his communications, the major role of the minister today is that of pastor, with the role of administrator a close competitor. In actual conversation as well as by stated preference, ministers move away from administrative functions toward pastoral, while parishioners and others call upon them more often for administrative than for pastoral services. Heavy involvement in committees and groups, averaging one fourth of all working time, gives new prominence to the organizational role, a role in which the minister is not well adjusted, and lacks strategy.
The more professional ministry involving direct attention to religion in either pastoral, priestly, preaching, or teaching situations is mainly a Sunday phenomenon, all of these put together accounting for only 11% of weekday conversation. A large proportion of ministers in this study are dissatisfied with their present vocational roles. Yet, they seem unable to cut out spheres of major activity and competence for themselves, and to interpret this specialization to their congregations.
In outlining a strategy of communication to meet the situation, the author begins with a consideration of the pastor's own motives and vocational goals, laying particular stress upon the hazards of the Messiah complex. In asserting the need for multiple foci of communication, he analyzes the effects of staff on the actual communications of the pastor, and critically evaluates the staff-solution to the minister's dilemma. Greatest hope is seen in a renewal of a true ministry of all believers under the direction of a spiritual overseer, the pattern of communication which prevailed in the New Testament Church.
In Pursuit of Eye Tracking for Visual Landscape Assessments
Visual quality and impact assessments have historically relied on experts to formally evaluate the visual properties of a landscape. In contrast, environmental psychologists have studied subjective landscape preferences using ratings and surveys. These two approaches represent, respectively, the “objectivist” and “subjectivist” paradigms within visual landscape research. A gap, however, exists between these approaches: actual observation behaviors. In this paper, we argue for the inclusion of eye-tracking research in visual landscape assessments as a critical bridge between objective landscape qualities and subjective visual experiences. We describe the basics of eye-tracking methods and data types to introduce the role of eye movements in landscape preference formation. Three-dimensional immersive virtual environments are particularly useful for collecting these types of data, as they allow for quantification of the viewed environment’s spatial and scene metrics in addition to providing eye-tracking capabilities at sufficient resolutions. These environmental and behavioral data can then be consolidated and analyzed within existing GIS platforms to draw conclusions about environmental influences on observation behaviors. While eye tracking may eventually contribute directly to the practice of visual quality or impact assessments, the near-term benefits of this work will most likely center around contributing to the objectivity and defensibility of assessments through validation and methodological recommendations.
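To make the consolidation step concrete, the Python sketch below (with hypothetical record fields) shows the kind of join the authors describe: fixation records from an eye tracker are mapped to labeled scene regions so that observation behavior can be summarized per landscape feature. The region-lookup function here is a stand-in for a GIS overlay or virtual-scene query, not any particular platform's API.

```python
from collections import defaultdict

# Hypothetical fixation records: gaze position (x, y) in the viewed
# scene and fixation duration in milliseconds.
fixations = [
    {"x": 120, "y": 340, "duration_ms": 210},
    {"x": 480, "y": 300, "duration_ms": 450},
    {"x": 495, "y": 310, "duration_ms": 180},
]

def region_of(x, y):
    """Stand-in for a GIS overlay / immersive-scene query that maps
    a gaze coordinate to a labeled landscape element."""
    return "water" if x < 300 else "tree_canopy"

# Aggregate dwell time per landscape feature: a simple observation-
# behavior metric that can be related to preference ratings.
dwell = defaultdict(int)
for f in fixations:
    dwell[region_of(f["x"], f["y"])] += f["duration_ms"]

print(dict(dwell))  # {'water': 210, 'tree_canopy': 630}
```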
The gift of the algorithm: beyond autonomy and control
This piece brings together participation, algorithmic composition, and augmentation (as a mechanism by which people can work together to augment and support a composer’s workflow). The performance is about understanding the ways in which composition and performance can be understood socially, aesthetically, and scientifically. This performance becomes a piece of research and design in its own right, a more experimental manifestation of HCI, but it also demonstrates and disrupts conventional production and performance by making the multiple layers of practice and provenance obvious. *See Program notes for a fuller description of the piece for public consumption. We also aim to discuss this further and give a demo at the Performance workshop to which we have submitted.
This is part of the ongoing research of the FAST project and aims to engage the wider interdisciplinary Audio Mostly community.
• Program notes
This piece expands upon Chamberlain’s work into compositional practices that explore autonomy and control, and builds upon the Numbers into Notes system as developed by De Roure. The piece (which is an evolving work) uses the symbolism of the gift to frame parts of the interactions that have occurred in its development. Individuals are given the chance to create an algorithm. This is made into a physical entity (containing a sequence), which is then gifted to the composer; these gifts are combined and used to compose a piece. The piece is then performed and given back to the audience (live), some of whom created the original algorithms. The performance creates a gift, a souvenir, a memento of the experience that some of the audience members can take away. The performance also acts as a way in which we can understand the interplay between algorithms, art, performance, provenance and participation.
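As an illustration of the kind of algorithm an audience member might gift, the Python sketch below generates a short pitch sequence from an integer sequence, here Fibonacci numbers reduced modulo 12 and mapped onto semitones above middle C. This is in the spirit of sequence-to-pitch systems like Numbers into Notes, but the specific mapping is an assumption chosen for illustration, not that system's actual implementation.

```python
def fibonacci(n):
    """Return the first n Fibonacci numbers."""
    seq, a, b = [], 0, 1
    for _ in range(n):
        seq.append(a)
        a, b = b, a + b
    return seq

def to_midi_pitches(numbers, base_note=60):
    """Map integers onto the 12 semitones above a base MIDI note
    (60 = middle C) by reducing each number modulo 12."""
    return [base_note + (n % 12) for n in numbers]

# A gifted "algorithm": a seed sequence becomes a melodic fragment
# the composer can weave into the performed piece.
print(to_midi_pitches(fibonacci(16)))
```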
Adaptively Managing Lake Powell Releases to Respond to Reservoir Inflow and Evaporation
Across the Western U.S., reservoir levels are declining towards protection elevations because of aridity, amplified by long-standing operations that release more water than inflow. The purpose of this paper is to sketch the effects of reservoir operations that release less water than inflow minus evaporation. We use the case of Glen Canyon Dam/Lake Powell on the Colorado River, U.S.A. We programmed a new rule into the trusted basin simulation model maintained by the U.S. Bureau of Reclamation. The rule releases 95% of inflow in each time step. The 5% reduction is typically greater than reservoir evaporation. Lake Powell levels stabilize and recover within a few years across a range of assumptions for hydrology and upstream diversions. Technical and political challenges remain to implementing into existing operations the idea of releasing less water than inflow minus evaporation.
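The operating rule itself reduces to a simple mass balance. The Python sketch below, with hypothetical inflow and evaporation figures, shows why releasing 95% of inflow stabilizes storage whenever the retained 5% exceeds evaporation; it is a toy illustration, not the Bureau of Reclamation basin simulation model the authors actually modified.

```python
# Toy mass-balance simulation of a "release 95% of inflow" rule.
# All volumes in million acre-feet (MAF) per year; the inflow and
# evaporation figures are illustrative, not Lake Powell data.
storage = 10.0  # starting reservoir storage
inflows = [8.0, 6.5, 9.0, 7.0, 5.5, 8.5]

for year, inflow in enumerate(inflows, start=1):
    release = 0.95 * inflow       # the proposed rule
    evaporation = 0.03 * storage  # assumed to scale with storage
    storage += inflow - release - evaporation
    print(f"year {year}: release={release:.2f} MAF, "
          f"storage={storage:.2f} MAF")

# Storage grows whenever the retained 5% of inflow (0.05 * inflow)
# exceeds evaporation, so levels stabilize and then recover.
```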