Tracking and Retexturing Cloth for Real-Time Virtual Clothing Applications
Abstract. In this paper, we describe a dynamic texture overlay method from monocular images for real-time visualization of garments in a virtual mirror environment. Similar to looking into a mirror when trying on clothes, we create the same impression, but for virtually textured garments. The mirror is replaced by a large display that shows the mirrored image of a camera capturing, e.g., the upper body of a person. By estimating the elastic deformations of the cloth from a single camera in the 2D image plane and recovering the illumination of the textured surface of a shirt in real time, an arbitrary virtual texture can be realistically augmented onto the moving garment so that the person appears to wear the virtual clothing. The result is a combination of the real video and the newly augmented model, yielding a realistic impression of the virtual piece of cloth.
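The illumination-recovery step described above can be sketched as a quotient-image operation: dividing the observed image by the (aligned) original texture isolates the shading, which can then modulate the new texture. The function below is a minimal illustration under strong simplifying assumptions (grayscale images, deformation already solved, names of my own choosing), not the paper's actual pipeline.

```python
import numpy as np

def retexture(observed, original_texture, new_texture):
    """Swap a known texture for a new one while keeping the shading.

    observed:         grayscale image of the garment wearing original_texture
    original_texture: the same texture, geometrically aligned to observed
    new_texture:      replacement texture, also aligned
    (All arrays share one shape; alignment/deformation is assumed solved.)
    """
    eps = 1e-6
    # Recover per-pixel illumination as the ratio observed / texture.
    shading = observed / (original_texture + eps)
    # Re-light the new texture with the recovered shading.
    return np.clip(shading * new_texture, 0.0, 1.0)
```

In the real system the hard part is precisely what is assumed away here: estimating the 2D elastic deformation that aligns texture and image frame by frame.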
A Literature Review of Bounding Volume Hierarchies Focused on Collision Detection
A bounding volume is a common method to simplify object representation by composing geometric shapes that enclose the object; it encapsulates complex objects by means of simple volumes and is widely used in collision detection applications and in ray tracing for rendering algorithms. Bounding volumes are popular in computer graphics and computational geometry. The most popular bounding volumes are spheres, Oriented Bounding Boxes (OBBs), and Axis-Aligned Bounding Boxes (AABBs); in addition, the literature includes ellipsoids, cylinders, sphere packings, sphere shells, k-DOPs, convex hulls, point clouds, and minimal bounding boxes, among others. A Bounding Volume Hierarchy is usually a tree in which the complete object is represented with a tighter fit at every level of the hierarchy. Additionally, each bounding volume has costs associated with construction, update, and interference tests. For instance, spheres are invariant to rotation and translation, so they do not require updating, unlike AABBs, which are not rotation-invariant; their construction and interference tests are more straightforward than those of OBBs, but their tightness is lower than that of other bounding volumes. Finally, two polyhedra are compared using seven different algorithms, five of which are public collision detection libraries.
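The interference tests mentioned above are simple to state for the two cheapest volumes. The sketch below (illustrative, not taken from any of the reviewed libraries) shows why sphere and AABB overlap tests are considered inexpensive: one squared-distance comparison, or one interval comparison per axis.

```python
def spheres_overlap(c1, r1, c2, r2):
    # Spheres overlap iff the distance between centers is at most
    # the sum of the radii; compare squared values to avoid a sqrt.
    d2 = sum((a - b) ** 2 for a, b in zip(c1, c2))
    return d2 <= (r1 + r2) ** 2

def aabbs_overlap(min1, max1, min2, max2):
    # AABBs overlap iff their intervals overlap on every axis.
    return all(lo1 <= hi2 and lo2 <= hi1
               for lo1, hi1, lo2, hi2 in zip(min1, max1, min2, max2))
```

Note the trade-off the abstract describes: the sphere test is rotation-invariant for free, while the AABB test assumes both boxes were re-fitted after any rotation.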
Stakeholder network as a determinant to the degree of synchronization between a firm’s values and its stakeholder management strategies. A comparison between public and private companies using mission statements and corporate charitable donations.
Stakeholder theory and stakeholder management theories have gained popularity among practitioners and scholars in recent decades for both their normative and positive power. Intuitively, it is easy to assume that firms that manage for stakeholders utilize various stakeholder management strategies to realize their corporate values. Thus, this study intends to examine the degree of synchronization, or the lack thereof, between a firm's publicly endorsed values and the values embedded in its CSR stakeholder management activity, specifically charitable donations. More importantly, due to the different sizes and nature of the stakeholder networks faced by private and public firms, we expect the levels of synchronization to differ between the two, with the distinction that such values stray further from each other for public firms. We found that public and private firms differ in the levels of synchronization between their endorsed values and their charitable recipient organizations' values on many semantic and psychological domains (17 categories). Interestingly, contrary to our initial hypothesis, the level of discrepancy is greater for private firms than for public firms on most domains (16 categories), which invites further research into determinants of firms' behavior affected by institutionalized rituals.
Quickest Sequence Phase Detection
A phase detection sequence is a length-n cyclic sequence such that the location of any length-ℓ contiguous subsequence can be determined from a noisy observation of that subsequence. In this paper, we derive bounds on the minimal possible ℓ in the limit of large n, and describe some sequence constructions. We further consider multiple phase detection sequences, where the location of any length-ℓ contiguous subsequence of each sequence can be determined simultaneously from a noisy mixture of those subsequences. We study the optimal trade-offs between the lengths of the sequences, and describe some sequence constructions. We compare these phase detection problems to their natural channel coding counterparts, and show a strict separation between the fundamental limits in the multiple sequence case. Both adversarial and probabilistic noise models are addressed. (To appear in the IEEE Transactions on Information Theory.)
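For intuition, in the noiseless special case a phase detection sequence is simply a cyclic sequence whose fixed-length windows are all distinct, so observing one window pins down the phase uniquely; a binary de Bruijn sequence is a classic example of such a construction. The sketch below illustrates this noiseless case only (it is my own toy example, not a construction from the paper).

```python
def locate(cyclic_seq, window):
    """Return all cyclic positions where `window` occurs in `cyclic_seq`.

    If every window of that length is distinct, the list has one entry,
    and the phase is determined unambiguously.
    """
    n, k = len(cyclic_seq), len(window)
    doubled = cyclic_seq + cyclic_seq  # unwrap the cycle
    return [i for i in range(n) if doubled[i:i + k] == window]

# De Bruijn sequence B(2, 3): each of the 8 binary 3-mers appears
# exactly once cyclically, so any length-3 window fixes the phase.
seq = "00010111"
```

The noisy setting studied in the paper is what makes the problem interesting: the windows must remain distinguishable even after corruption, which drives ℓ above this noiseless minimum.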
Proposal of a Framework for the Design of Bio-Inspired Development Processes Based on Tensegrity Structures
Evolutionary algorithms take biological evolution as the metaphor for their modus operandi: they consider populations (sets) of individuals (solutions to problems), where each individual is characterized by its genotype (the set of parameters that make up the solution) and is assigned a fitness that measures how well adapted it is (how good the solution is). The point is that the different classes of evolutionary algorithms appeared in the second half of the twentieth century, at a time when the complexity of living beings was interpreted as the complexity of their corresponding genotypes [35]. Thus, evolutionary algorithms usually place the emphasis on designing a good genotype, and the transformation from genotype to phenotype is usually trivial, following the paradigm of this biological interpretation.
Today, the enigma of the development of living beings is gradually being deciphered, and the genotype is slowly losing its starring role. The complexity of living beings is increasingly attributed to their developmental process, which the genotype modulates and coordinates rather than directs [35]. This has motivated the application of this paradigm to various disciplines, such as neural networks [1, 17], autonomous agents [16], and hardware design [15]. In this work, we set out to explore this paradigm from the point of view of the computational design of structures.
More concretely, this work aims to sketch a framework for studying bio-inspired development processes for tensegrity structures. The nature of this work can be understood as an exploration of the space of possibilities in the initial stages of the thesis.
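The population/genotype/fitness loop described above can be made concrete with a minimal evolutionary algorithm. The sketch below uses a toy OneMax objective and illustrative parameter choices of my own; it stands in for no specific algorithm from the thesis, and its genotype-to-phenotype map is exactly the trivial one the abstract criticizes.

```python
import random

random.seed(0)  # reproducible toy run

def fitness(genotype):
    # Toy objective (assumption): maximize the number of ones (OneMax).
    return sum(genotype)

def evolve(pop_size=20, length=16, generations=50, mutation_rate=0.05):
    # Population: a set of individuals; genotype: a bit vector.
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)  # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < mutation_rate else g
                     for g in child]           # point mutation
            children.append(child)
        pop = children
    return max(pop, key=fitness)

best = evolve()
```

A developmental (bio-inspired) variant would replace the identity genotype-to-phenotype step with a growth process that the genotype merely modulates, which is the direction the proposed framework explores.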
Design and Analysis of Decoy Systems for Computer Security
This dissertation is aimed at defending against a range of internal threats, including eavesdropping on network taps, placement of malware to capture sensitive information, and general insider threats to exfiltrate sensitive information. Although the threats and adversaries may vary, in each context where a system is threatened, decoys can be used to deny critical information to adversaries, making it harder for them to achieve their target goal. The approach leverages deception and the use of decoy technologies to deceive adversaries and trap nefarious acts. This dissertation proposes a novel set of properties for decoys to serve as design goals in the development of decoy-based infrastructures. To demonstrate their applicability, we designed and prototyped network and host-based decoy systems. These systems are used to evaluate the hypothesis that network and host decoys can be used to detect inside attackers and malware. We introduce a novel, large-scale automated creation and management system for deploying decoys. Decoys may be created in various forms, including bogus documents with embedded beacons, credentials for various web and email accounts, and bogus financial information that is monitored for misuse. The decoy management system supplies decoys for the network and host-based decoy systems. We conjecture that the utility of the decoys depends on the believability of the bogus information; we demonstrate the believability through experimentation with human judges. For the network decoys, we developed a novel trap-based architecture for enterprise networks that detects "silent" attackers who are eavesdropping on network traffic. The primary contributions of this system are the ease of automatically injecting large amounts of believable bait and the integration of various detection mechanisms in the back-end.
We demonstrate our methodology in a prototype platform that uses our decoy injection API to dynamically create and dispense network traps on a subset of our campus wireless network. We present results of a user study that demonstrates the believability of our automatically generated decoy traffic, and results from a statistical and information-theoretic analysis to show the believability of the traffic when automated tools are used. For host-based decoys, we introduce BotSwindler, a novel host-based bait injection system designed to delude and detect crimeware by forcing it to reveal itself during the exploitation of monitored information. Our implementation of BotSwindler relies upon an out-of-host software agent to drive user-like interactions in a virtual machine, seeking to convince malware residing within the guest OS that it has captured legitimate credentials. To aid in the accuracy and realism of the simulations, we introduce a novel, low-overhead approach, called virtual machine verification, for verifying whether the guest OS is in one of a predefined set of states. We provide empirical evidence to show that BotSwindler can be used to induce malware into performing observable actions and demonstrate how this approach is superior to that used in other tools. We present results from a user study to illustrate the believability of the simulations, and show that financial bait information can be used to effectively detect compromises through experimentation with real credential-collecting malware. We present results from a statistical and information-theoretic analysis to show the believability of simulated keystrokes when automated tools are used to distinguish them. Finally, we introduce and demonstrate an expanded role for decoys in educating users and measuring organizational security through experiments with approximately 4000 university students and staff.
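The bait-credential idea underlying these systems can be illustrated with a minimal honeytoken store: mint credentials that no legitimate workflow ever uses, then flag any attempt to use one. The class and method names below are illustrative assumptions of mine, not the dissertation's actual decoy management system.

```python
import secrets

class DecoyCredentialStore:
    """Minimal honeytoken sketch: any use of a minted decoy is an alert."""

    def __init__(self):
        self._decoys = set()

    def mint(self, username):
        # Issue a bogus credential; it is never handed to a legitimate flow,
        # so its appearance in a login attempt implies eavesdropping/theft.
        token = f"{username}:{secrets.token_hex(8)}"
        self._decoys.add(token)
        return token

    def observe_login(self, credential):
        # True means a decoy was used: treat as an intrusion signal.
        return credential in self._decoys

store = DecoyCredentialStore()
bait = store.mint("alice")
```

The dissertation's contribution lies in what this sketch omits: making such bait believable at scale and integrating the detection back-end.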
Outdata-ed museums: creating ethical and transparent data collection processes in museums
UK museums are contradictory sites of education and community outreach, and emblems of colonial legacy and elitism. Physical and socioeconomic barriers prevent meaningful engagement for audiences, but particularly marginalised peoples. To identify and overcome these barriers, museums and cultural institutions are seeking technological solutions that capture and analyse personal data. However, current legislation and attitudes towards personal data also risk perpetuating exclusionary barriers. Many governments and organisations use personal data to suppress, undermine, and violently target minoritised or marginalised communities whilst upholding the status quo that marginalised them in the first place. This inequality is further entrenched by the powerlessness most people feel in the face of how data is collected and used on a day-to-day basis.
Drawing on Human Computer Interaction, Human Geography and New Museology, this PhD thesis seeks a solution to these concerns that empowers museums to safely collect the data they need whilst enabling audiences to become active in their own data curation. Using co-creative principles, input is sought from museums and audiences to answer three questions:
• How are discourses and practices surrounding personal data negotiated, defined, perpetuated, and resisted in museums?
• What is the value of personal data to museums and audiences?
• Can mutually beneficial and transparent data exchange foster meaningful, long-term relationships between museums and audiences?
To address these questions, a novel theoretical framework is generated that explores museums as place, technology as mediator, and relational personal data through a lens of power. Four sequential studies are then conducted utilising a post-structural feminist epistemology. The first study presents a content analysis of privacy policies to explore what data museums typically collect and how that information is conceptualised and shared with audiences, showing that museums collect a broad range of quantitative data but inadequately express to audiences what, how, or why. The second study presents a workshop with museum staff to determine what data would benefit the museum and what prevents it from being captured. It shows that museums seek qualitative, behavioural data but are limited by resource constraints. The third study uses workshop-style activities to ask audiences to conceptualise the value of their desirable data and speculate on different ways for their data to be used in the museum. The study highlights barriers to data engagement, including fatigue and lack of understanding, and shows trust and transparency to be key motivators in data sharing. The fourth study uses a novel methodology to speculate a data-enabled museum visit, from which a technology probe called 'MuNa' is developed and tested in a virtual museum visit with real audiences. Evaluation shows how transparency and trust can be synchronously developed through meaningful engagement with data. This is shown to increase the engagement of audiences with both museum and data, fostering long-term, meaningful relationships between venue and visitor and the creation of data subjects able to advocate for their own data rights.
The implications of this research reach across each of its disciplines and into the everyday practices of cultural organisations and audiences. By contributing novel paradigms for understanding the museum visit experience, including different stakeholder perspectives on museums, technology, and personal data, the thesis presents evidence of an equitable, sustainable, data-enabled future.