Supporting users' influence in gamification settings and game live-streams
Playing games has long been important to mankind. One reason for this is the associated autonomy: players can decide many aspects on their own and can shape the experience. Game-related sub-fields in which this autonomy is questionable have emerged within Human-Computer Interaction: in this thesis, we consider gamification and game live-streams, and we support the users' influence at runtime. We hypothesize that this should affect the perception of autonomy and should lead to positive effects overall. Our contribution is three-fold: first, we investigate crowd-based, self-sustaining systems in which the user's influence directly impacts the outcome of the system's service. We show that users are willing to expend effort in such systems even without additional motivation, but that gamification is still beneficial here. Second, we introduce "bottom-up" gamification, i.e., the idea of self-tailored gamification. Here, users have full control over the gamification used in a system, i.e., they can set it up as they see fit at the system's runtime. Through user studies, we show that this has positive behavioral effects and thus adds to the ongoing efforts to move away from "one-size-fits-all" solutions. Third, we investigate how to make gaming live-streams more interactive, and how viewers perceive this. We also consider shared game control settings in live-streams, in which viewers have full control, and we contribute options to support viewers' self-administration here.
Building information modeling – A game changer for interoperability and a chance for digital preservation of architectural data?
Digital data associated with the architectural design-and-construction process is an essential resource alongside, and even beyond, the lifecycle of the construction object it describes. Despite this, digital architectural data remains largely neglected in digital preservation research, and vice versa, digital preservation is so far neglected in the design-and-construction process. In the last five years, Building Information Modeling (BIM) has seen growing adoption in the architecture and construction domains, marking a large step towards much-needed interoperability. The open standard IFC (Industry Foundation Classes) is one way in which data is exchanged in BIM processes. This paper presents a first digital-preservation-based look at BIM processes, highlighting the history and adoption of the method as well as the open file format standard IFC as one way to store and preserve BIM data
Sincronização em sistemas integrados a alta velocidade (Synchronization in high-speed integrated systems)
Doutoramento em Engenharia Electrotécnica (doctoral thesis in Electrical Engineering). Distributing the clock simultaneously everywhere (low skew) and periodically
everywhere (low jitter) in high-performance Integrated Circuits (ICs) has become an increasingly difficult and time-consuming task, due to technology scaling. As transistor dimensions shrink and more functionality is packed into an IC, clock precision becomes increasingly affected by Process, Voltage and Temperature (PVT) variations. This thesis addresses the problem of clock uncertainty in high-performance ICs, in order to determine the limits of the synchronous design paradigm.
In pursuit of this main goal, this thesis proposes four new uncertainty models, with different underlying principles and scopes. The first model targets uncertainty in static CMOS inverters. The main advantage of this model is that it depends only on parameters that can easily be obtained. Thus, it can provide information on upcoming constraints very early in the design stage. The second model addresses uncertainty in repeaters with RC interconnects, allowing the designer to optimise the repeater's size and spacing, for a given uncertainty budget, with low computational effort. The third model can be used to predict jitter accumulation in cascaded repeaters, like clock trees or delay lines. Because it takes into consideration correlations among variability sources, it can also be useful to promote floorplan-based power and clock distribution design in order to minimise jitter accumulation. A fourth model is proposed to analyse uncertainty in systems with multiple synchronous domains. It can be easily incorporated in an automatic tool to determine the best topology for a given application or to evaluate the system's tolerance to power-supply noise.
Finally, using the proposed models, this thesis discusses clock precision trends. Results show that limits in clock precision are ultimately imposed by dynamic uncertainty, which is expected to continue increasing with technology scaling. Therefore, it advocates the search for solutions at other abstraction levels, and not only at the physical level, that may increase system performance with a smaller impact on the assumptions behind the synchronous design paradigm
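The role of correlation in jitter accumulation, central to the third model above, can be illustrated with a small Monte Carlo sketch. The stage count, per-stage noise, and trial count below are hypothetical, not values from the thesis: independent per-stage noise accumulates like √N, while fully correlated noise (e.g. a shared supply) accumulates like N.

```python
import numpy as np

rng = np.random.default_rng(0)
n_stages, n_trials, sigma = 32, 20000, 1.0  # hypothetical chain and noise level

# Uncorrelated per-stage delay noise: total jitter grows like sqrt(N)
uncorr = rng.normal(0.0, sigma, (n_trials, n_stages)).sum(axis=1)

# Fully correlated noise (e.g. a shared supply rail): grows like N
shared = rng.normal(0.0, sigma, (n_trials, 1))
corr = np.repeat(shared, n_stages, axis=1).sum(axis=1)

print(uncorr.std(), corr.std())  # ≈ sigma*sqrt(32) ≈ 5.66 vs sigma*32 = 32
```

This is why accounting for correlation matters: a pessimistic uncorrelated model can understate supply-induced jitter by a factor of √N.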
Characterization and mitigation of process variation in digital circuits and systems
Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2009. Cataloged from PDF version of thesis. Includes bibliographical references (p. 155-166). Process variation threatens to negate a whole generation of scaling in advanced process technologies, due to performance and power spreads of greater than 30-50%. Mitigating this impact requires a thorough understanding of the variation sources, magnitudes and spatial components at the device, circuit and architectural levels. This thesis explores the impacts of variation at each of these levels and evaluates techniques to alleviate them in the context of digital circuits and systems. At the device level, we propose isolation and measurement of variation in the intrinsic threshold voltage of a MOSFET using sub-threshold leakage currents. Analysis of the measured data, from a test-chip implemented on a 0.18 µm CMOS process, indicates that variation in MOSFET threshold voltage is a truly random process dependent only on device dimensions. Further decomposition of the observed variation reveals no systematic within-die variation components nor any spatial correlation. A second test-chip capable of characterizing spatial variation in digital circuits is developed and implemented in a 90 nm triple-well CMOS process. Measured variation results show that the within-die component of variation is small at high voltages but is an increasing fraction of the total variation as the power-supply voltage decreases. Once again, the data shows no evidence of within-die spatial correlation and only weak systematic components. Evaluation of adaptive body-biasing and voltage scaling as variation mitigation techniques proves voltage scaling more effective in performance modification, with reduced impact on idle power compared to body-biasing.
Finally, the addition of power-supply voltages in a massively parallel multicore processor is explored to reduce the energy required to cope with process variation. An analytic optimization framework is developed and analyzed; using a custom simulation methodology, the total energy of a hypothetical 1K-core processor based on the RAW core is reduced by 6-16% with the addition of only a single voltage. Analysis of yield versus required energy demonstrates that a combination of disabling poor-performing cores and additional power-supply voltages results in an optimal trade-off between performance and energy. by Nigel Anthony Drego. Ph.D
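The finding that threshold-voltage variation depends only on device dimensions is consistent with the widely used Pelgrom mismatch model, σ(ΔVt) = A_Vt / √(W·L). A minimal sketch follows; the A_Vt coefficient is a hypothetical illustrative value, not a number measured in the thesis:

```python
import math

def sigma_vt_mismatch(a_vt, w_um, l_um):
    """Pelgrom model: sigma(dVt) = A_Vt / sqrt(W * L).

    a_vt in mV*um, W and L in um, result in mV."""
    return a_vt / math.sqrt(w_um * l_um)

# Hypothetical A_Vt for illustration: a near-minimum device shows far
# more Vt spread than a larger one, as the area scaling predicts.
print(sigma_vt_mismatch(5.0, 0.5, 0.18))  # small device, mV
print(sigma_vt_mismatch(5.0, 2.0, 1.0))   # larger device, mV
```

The inverse-√(area) scaling is exactly the "dependent only on device dimensions" behaviour the measurements report.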
A wide dynamic range high-Q high-frequency bandpass filter with an automatic quality factor tuning scheme
An 80 MHz bandpass filter with a tunable quality factor of 16∼44, using an improved transconductor circuit, is presented. A noise-optimized biquad structure for a high-Q, high-frequency bandpass filter is proposed. The quality factor of the filter is tuned using a new quality factor locked loop algorithm. It was shown that a second-order quality factor locked loop is necessary and sufficient to tune the quality factor of a bandpass filter with zero steady-state error. The accuracy, mismatch, and sensitivity analysis of the new tuning scheme was performed and analyzed. Based on the proposed noise-optimized filter structure and the new quality factor tuning scheme, a biquad filter was designed and fabricated in a 0.25 μm BiCMOS process. The measured results show that the biquad filter achieves an SNR of 45 dB at an IMD of 40 dB. The P-1dB compression point and IIP3 of the filter are -10 dBm and -2.68 dBm, respectively. The proposed biquad filter and quality factor tuning scheme consume 58 mW and 13 mW of power at a 3.3 V supply. Ph.D. Committee Chair: Allen Phillip; Committee Member: Hasler Paul; Committee Member: Keezer David; Committee Member: Kenny James; Committee Member: Pan Ronghu
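The claim that a second-order (type-2) loop suffices for zero steady-state error can be illustrated with a toy discrete-time simulation. The plant model, loop gains, and drift below are hypothetical stand-ins for the fabricated circuit, chosen only to show the doubled integration rejecting a constant Q disturbance:

```python
# Toy discrete-time model of a type-2 Q-locking loop: the controller's
# integrator plus the integrating plant give two poles at z = 1, so the
# steady-state Q error goes to zero even under a constant drift.
q_target = 30.0       # desired quality factor
drift = 0.01          # per-step environmental drift in Q (hypothetical)
k1, k2 = 0.2, 0.02    # proportional and integral loop gains (hypothetical)
q, integ = 20.0, 0.0  # initial filter Q and integrator state

for _ in range(5000):
    e = q_target - q       # Q error seen by the tuning loop
    integ += k2 * e        # loop integrator
    u = k1 * e + integ     # control effort applied to the biquad
    q += 0.1 * u + drift   # plant: Q integrates control, drifts upward

print(abs(e))  # steady-state error driven to ~0 despite the drift
```

A first-order (single-integrator-equivalent) loop would settle with a constant residual error under the same drift, which is the "necessary" half of the statement.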
An Analog Multiphase Self-Calibrating DLL to Minimize the Effects of Process, Supply Voltage, and Temperature Variations
Delay locked loops have been found to be useful tools in such applications as computing, TDCs, and communications. These systems can be found in space exploration vehicles and satellites, which operate in extreme environments. Unfortunately, in these environments supply voltage and temperature will not be constant; therefore, they must be taken into consideration when designing a DLL. Furthermore, solar radiation, in conjunction with the varying environmental aspects, could cause the delay locked loop to lose its locked state.
Delay locked loops are inherently good at tracking these environmental aspects, but in order to do so, the voltage-controlled delay line must exhibit a very large gain, which translates to a large capture range. Should charged particles hit a key node in the DLL (e.g. the control voltage), the DLL would lose lock and would have to recapture it. Depending on the severity of the fluctuation, this relocking process could easily take on the order of many microseconds, assuming the bandwidth was kept low to minimize jitter. To date, no delay locked loops have been published for extreme environment applications.
In many other extreme-environment circuits, calibration techniques have been applied to minimize environmental effects. While multiple calibration methods related to delay locked loops have been published, none of them were intended for extreme environments. Furthermore, none of these methods are directly suitable for an analog multiphase delay locked loop.
The self-calibrating DLL in this work includes an all-digital calibration circuit as well as a system transient monitor. The coarse calibration helps minimize global process, voltage, and temperature errors for an analog multiphase DLL. The system monitor detects any transients that might cause the DLL to unlock, allowing the DLL to be recalibrated to the new environmental conditions. The presented measurement results demonstrate that the DLL can be used in extreme environments such as space, or in other extreme-environment applications
Long distance free-space quantum key distribution
In the age of information and globalisation, secure communication as well as the protection of sensitive data against unauthorised access are of utmost importance. Quantum cryptography currently provides the only way to exchange a cryptographic key between two parties in an unconditionally secure fashion. Owing to losses and noise of today's optical fibre and detector technology, quantum cryptography is at present limited to distances below a few hundred kilometres. In principle, larger distances could be subdivided into shorter segments, but the required quantum repeaters are still beyond current technology. An alternative approach for bridging larger distances is a satellite-based system that would enable secret key exchange between two arbitrary points on the globe using free-space optical communication.
The aim of the presented experiment was to investigate the feasibility of satellite-based global quantum key distribution. In this context, a free-space quantum key distribution experiment over a real distance of 144 km was performed. The transmitter and the receiver were situated at 2500 m altitude on the Canary Islands of La Palma and Tenerife, respectively. The small and compact transmitter unit generated attenuated laser pulses that were sent to the receiver via a 15 cm optical telescope. The receiver unit for polarisation analysis and detection of the sent pulses was integrated into an existing mirror telescope designed for classical optical satellite communications. To ensure the required stability and efficiency of the optical link in the presence of atmospheric turbulence, the two telescopes were equipped with a bi-directional automatic tracking system.
Still, due to stray light and high optical attenuation, secure key exchange would not be possible using attenuated pulses in connection with the standard BB84 protocol. The photon number statistics of attenuated pulses follows a Poissonian distribution. Hence, by removing a photon from any pulse containing two or more photons, an eavesdropper could measure its polarisation without disturbing the polarisation state of the remaining pulse. In this way, an eavesdropper can gain information about the key without introducing detectable errors. To protect against such attacks, the presented experiment employed the recently developed method of using additional "decoy" states, i.e., the intensity of the pulses created by the transmitter was varied in a random manner. By analysing the detection probabilities of the different pulses individually, a photon-number-splitting attack can be detected. Thanks to the decoy-state analysis, the secrecy of the resulting quantum key could be ensured despite the Poissonian nature of the emitted pulses. For a channel attenuation as high as 35 dB, a secret key rate of up to 250 bit/s was achieved.
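The photon-number-splitting argument rests on the Poissonian statistics of attenuated pulses: among non-empty pulses, a fraction P(n ≥ 2 | n ≥ 1) carries more than one photon and is vulnerable. A small sketch for a mean photon number μ; the μ values below are illustrative, not the experiment's actual settings:

```python
import math

def multiphoton_fraction(mu):
    """P(n >= 2 | n >= 1) for Poissonian pulses with mean photon number mu."""
    p0 = math.exp(-mu)  # probability of an empty pulse
    p1 = mu * p0        # probability of a single-photon pulse
    return (1.0 - p0 - p1) / (1.0 - p0)

# Weaker pulses leave a smaller exploitable multi-photon fraction,
# at the cost of a lower raw key rate; decoy states let the legitimate
# parties bound this fraction without throttling mu to tiny values.
for mu in (0.1, 0.5):
    print(mu, multiphoton_fraction(mu))
```

For μ = 0.1 roughly 5% of non-empty pulses are multi-photon, rising to about 23% at μ = 0.5, which is why the unmonitored BB84 key rate degrades so quickly with channel loss.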
Our outdoor experiment was carried out under real atmospheric conditions and with a channel attenuation comparable to that of an optical link from the ground to a satellite in low Earth orbit. Hence, it demonstrates the feasibility of satellite-based quantum key distribution using a technologically comparatively simple system
CIRCUS 2001 Conference Proceedings: New Synergies in Digital Creativity. Conference for Content Integrated Research in Creative User Systems
CIRCUS (Content Integrated Research For Creative User Systems) was an ESPRIT Working Group, originally set up in 1988 as one of the very last additional actions in Framework 4, under DG III. Its purpose was to develop models for collaborative work between artists (the term here used in its widest sense) and technologists (ditto) and to promote these models by whatever means available. While some have criticised this aim as implicitly promoting a 1950s agenda of building bridges across C.P. Snow’s ‘two cultures’, there is no such intention here, rather that technology, particularly computer and communications technology (ICT), is irresistibly intruding into what is normally thought of as creative work (and so practised by artists) and that, like any new technique, this has to be understood by its potential practitioners in terms of its true strengths and limitations. The specific problem that computer technology poses is that it is in principle malleable to such an extent that the limitations on its form and functionality are still barely understood, yet the people charged with the task of making the technology available have little or no understanding of the needs of creative users. What the artist usually sees is a tool which is in principle capable of being harnessed to creative ends but in practice resists being so applied. Quite often the tool is shaped more by blind economic forces than by a clear response to a specific, here creative, need.
CIRCUS came into existence as a forum in which both artists and technologists could work out how best to play to the strengths of ICT and how to apply both creative and technological solutions (possibly both together) to its limitations. In particular the then new Framework V programme invited projects in such areas as new media but required them to be addressed in essentially the same old way, by technologists working towards commercialisation. The only obvious exception to this was in the area of cultural heritage which, incidentally, CIRCUS was also capable of reviewing. The scope for effective participation by artists was thus limited by an essentially technological agenda although everybody at the time, the participants of CIRCUS and programme managers in DG III, believed that we could do far better than this, and to develop new models of working which could inform the nature of Framework VI or even the later stages of F V. It is fair to say that everyone involved was excited by the idea of doing something quite new (and iconoclastic), not least the expanding of the expertise base on which future Frameworks could draw.
It is also fair to say that, while not ultimately wholly original, the CIRCUS agenda was an ambitious one, and the WG has had a chequered history peppered with misunderstandings perpetrated by the very people who, one might have thought, would give the WG their strongest support. The CIRCUS idea has been aired before, specifically at the University of Illinois at Urbana-Champaign, the MIT Media Lab (and its imitators), and a recent IEEE forum. However, a near total change in participation, fuelled by natural migration and a switch to DG XIII, has resulted in the CIRCUS agenda being restarted on at least one occasion and a fairly regular questioning of the principles on whose elucidation we are engaged. While this is no bad thing in principle, in practice we haven’t learned anything new from these periodic bouts of self-examination other than a reinforcement of the value of our goals. On the other hand, it is evident that we have made progress and have moved on a long way from where we started. A recent experience of a workshop whose agenda appeared to be to form another version of CIRCUS, this time with an overwhelmingly technological (DG III) membership, demonstrates that they have a CIRCUS-worth of work to do before they will have reached where we are now. (Foreword of CIRCUS for Beginners)
Remote Sensing of Earth Resources: A literature survey with indexes (1970 - 1973 supplement). Section 1: Abstracts
Abstracts of reports, articles, and other documents introduced into the NASA scientific and technical information system between March 1970 and December 1973 are presented in the following areas: agriculture and forestry, environmental changes and cultural resources, geodesy and cartography, geology and mineral resources, oceanography and marine resources, hydrology and water management, data processing and distribution systems, instrumentation and sensors, and economic analysis