Communications for Next Generation single chip computers
It is the thesis of this report that much of what is presently thought to require specialized VLSI functions might instead be achieved by combinations of fast general purpose single chip computers with upgraded communication facilities. To this end, the characteristics of applications of this nature are first surveyed briefly and some working principles established. In the light of these, three different chip philosophies are explored in some detail. This study shows that some upgrading of typical single chip I/O will definitely be necessary, but that this upgrading does not have to be complex and that true multiprocessor-multibus operation could be achieved without excessive cost.
Universal Quantum Computation
We study quantum computers and their impact on computability. First, we summarize the history of computer science. Although many works have contributed to its present success, only a few articles have determined the direction of computer science and industry. We choose articles by A. M. Turing and D. Deutsch: A. M. Turing proposed the basic architecture of modern computers, while D. Deutsch proposed an architecture for the next generation of computers, called quantum computers. Second, we study the architecture of modern computers using Turing machines. Despite its simple structure, the Turing machine embodies the basic design of modern computers. Then we study quantum computers. Quantum computers are believed to be the next generation of computers and are expected to deliver a breakthrough in processing speed. We study what gives quantum computers such high processing speed. Third, we study how quantum computers achieve this speed, using Shor’s algorithm as an example. This algorithm allows quantum computers to factor natural numbers into primes. Finally, we discuss a possible impact of quantum computers on the notion of computability.
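The factoring step that the abstract refers to can be illustrated classically. The sketch below is not from the paper; it shows only the number-theoretic skeleton of Shor’s algorithm for an odd composite with two distinct prime factors, and the function names are illustrative. The brute-force `find_order` helper stands in for the order-finding step that a quantum computer performs with the quantum Fourier transform, which is where the speedup lies.

```python
# Classical sketch of the number-theoretic core of Shor's algorithm.
# Order finding is done by brute force here; on a quantum computer this is
# the step accelerated by the quantum Fourier transform.
from math import gcd
from random import randrange

def find_order(a: int, n: int) -> int:
    """Smallest r > 0 with a**r % n == 1 (stand-in for the quantum step)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n: int) -> int:
    """Return a nontrivial factor of an odd composite n with two distinct prime factors."""
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d                      # lucky guess already shares a factor
        r = find_order(a, n)
        if r % 2 == 0:
            d = gcd(pow(a, r // 2, n) - 1, n)
            if 1 < d < n:
                return d                  # a**(r/2) - 1 shares a factor with n

print(shor_factor(15))  # prints 3 or 5
```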
Big Data, Digitization, and Social Change (Ubiquity Symposium)
The term “big data” is something of a misnomer. Every generation of computers since the 1950s has been confronted with problems where the data was far too large for the memory and processing power available. This seemed like an inconvenience of the technology that would someday be resolved when the next generation of computers came along. So what is different about big data today? The revolution is happening at the convergence of two trends: the expansion of the internet into billions of computing devices, and the digitization of almost everything. The internet gives us access to vast amounts of data. Digitization creates digital representations for many things once thought to be beyond the reach of computing technology. The result is an explosion of innovation in network-based big data applications and the automation of cognitive tasks. This revolution is introducing what Brynjolfsson and McAfee call the “Second Machine Age.” This symposium will examine this revolution from a number of angles.
Reference and Information Services for the Next Generation
In their article, “Born with the Chip,” Abram and Luther discuss the next generation of library users. At 81 million, NextGens are second in size only to the boomers. Born between 1982 and 2002, this next generation represents an underserved user group that may not be well understood by current libraries. This generation, which grew up using computers, does not think of them as technology but as part of its everyday culture. Abram and Luther (2004) reveal the following key points that explain the significant impact this user group will have on the services that libraries provide.
Machine Understanding of Human Behavior
A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next generation computing, which we will call human computing, should be about anticipatory user interfaces that are human-centered and built for humans based on human models. These interfaces should transcend the traditional keyboard and mouse to include natural, human-like interactive functions, including understanding and emulating certain human behaviors such as affective and social signaling. This article discusses a number of components of human behavior, how they might be integrated into computers, and how far we are from realizing the front end of human computing, that is, how far we are from enabling computers to understand human behavior.
UDRI Researchers Develop Glasses-Mounted Display, Next Generation of Wearable Computers
News release announcing that a wearable computer known as a glasses-mounted display, or GMD, may soon be as familiar as a Game Boy.
Quantum computers for optimizing performance
Computers reduce human work, and much effort is concentrated on enhancing their performance to advance the technology. Various methods have been developed to enhance the performance of computers. The performance of a computer is based on its architecture, and computer architecture differs across devices such as microcomputers, minicomputers, mainframes, laptops, tablets, and mobile phones. While each device has its own architecture, the majority of these systems are built on Boolean algebra. In this study, a few basic concepts used in quantum computing are discussed. It is known that quantum computers possess no transistors or chips while being roughly 100 times faster than a common classical silicon computer. Scientists believe that quantum computers are the next generation of classical computers.
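To make the contrast with Boolean-algebra machines concrete, here is a minimal sketch, not taken from the paper, of one basic quantum-computing concept the abstract alludes to: a qubit written as a two-component state vector and put into equal superposition by a Hadamard gate, versus a classical bit that holds exactly one Boolean value at a time. The variable names are illustrative.

```python
# Illustrative contrast between a classical Boolean bit and a qubit state vector.
import numpy as np

# Classical bit: exactly one of two Boolean values at any moment.
classical_bit = 0

# Qubit: a length-2 complex vector of amplitudes over the basis states |0> and |1>.
qubit = np.array([1.0, 0.0], dtype=complex)        # starts in |0>

# Hadamard gate: maps |0> to (|0> + |1>) / sqrt(2), an equal superposition.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

qubit = H @ qubit
probabilities = np.abs(qubit) ** 2                  # measurement statistics

print(probabilities)   # [0.5 0.5]: equal chance of reading 0 or 1
```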