
    Asian migrant writers in Australia and the negotiation of the third space

    This thesis is a comparative study of three selected texts by Australian novelists: Yasmine Gooneratne’s A Change of Skies (1991), Adib Khan’s Seasonal Adjustments (1994) and Brian Castro’s Birds of Passage (1983). All three writers explore the experiences and perceptions of their protagonists in relating to the landscape, people and cultural traditions within the Australian context into which they have migrated from different Asian countries. Brian Castro’s central characters, Lo Yun Shan and Seamus O’Young, are drawn from two contexts: the former from nineteenth-century China, while the latter is a contemporary Australian-born Chinese. Gooneratne’s and Khan’s protagonists hail from South-East Asian contexts, which are again interestingly different, Gooneratne’s character being from Sri Lanka and Khan’s from Bangladesh. From the multiplicity of cultures from which these texts emerge, with their inevitable movements of the protagonists between the originary and adoptive homes, there seems to be a reaching towards a necessary ‘inter’ space, what Homi Bhabha calls the ‘Third Space’. In terms of perception of identity and belonging, this borderline position would appear to be crucial to the diasporic condition (1994, p. 53). While this study explores the problematics, accommodations, resolutions and synergies involved in the experience of negotiating this liminal space and living what Rushdie calls a ‘translated’ existence (1991, p. 17), the focus is on particular processes crucial to that translation. My study will suggest that the arrival at the ‘Third Space’ is represented neither as a benign experience of adaptation to a different sense of home nor as a sense of being relegated to a state of permanent loss and alienation. Rather, it will be apparent that the migrant experience is more mosaic than formulaic, resisting neat definitions of movement from an initial sense of estrangement from the host nation to accommodation and assimilation within the new society.
It seems that each individual character is poised on different and differing configurations of cultural allegiances and identities within the ‘Third Space’. The representation and perception of the ‘Third Space’ ‘in relation to the performance of identity as iteration and the recreation of self…[particularly in terms of] the desire for recognition’ (Bhabha, 2004, p. 12) appears more diverse than originally envisaged by Bhabha. There appears to be a plurality of articulations within this formulation, suggesting it is not a single, homogenous in-between space but a constellation of ‘Third Spaces’, fluid and changing, overriding the possibility of a ‘happy hybridity’ which, in any case, most theorists in the field find an untenable concept. The tracing of this highly complex, inter-related and entangled plethora of experiences which constitute the fate of the migrant will be explored in depth and detail in this thesis. Finally, no arrival at certain certainties is promised at its conclusion; only, possibly, a heightening of awareness, an expansion of understanding. This provides an opportunity to revisit, indeed to rethink, the complexities of migrant experience as not only transcending the dichotomies of insider/outsider and belonging/alterity which are encoded in the narrative of a nation, but also affirming the processes of hybridity as crucial to the formation of a ‘double selved’ identity.

    Intelligent Circuits and Systems

    ICICS-2020 is the third conference initiated by the School of Electronics and Electrical Engineering at Lovely Professional University, exploring recent innovations by researchers working on the development of smart and green technologies in the fields of Energy, Electronics, Communications, Computers, and Control. ICICS gives innovators a platform to identify new opportunities for the social and economic benefit of society. The conference bridges the gap between academia, R&D institutions, social visionaries, and experts from all strata of society, enabling them to present their ongoing research activities and fostering research relations between them. It provides opportunities for the exchange of new ideas, applications, and experiences in the field of smart technologies, and for finding global partners for future collaboration. ICICS-2020 was conducted in two broad categories: Intelligent Circuits & Intelligent Systems, and Emerging Technologies in Electrical Engineering.

    Hardware/software architectures for iris biometrics

    Nowadays, the need to identify the users of facilities and services has become quite important, not only to determine who accesses a system and/or service, but also to determine which privileges should be granted to each user. For such identification, Biometrics is emerging as a technology that provides a high level of security while being convenient and comfortable for the citizen. Most biometric systems are based on computer solutions, where the identification process is performed by servers or workstations, whose cost and processing time make them infeasible in some situations. However, Microelectronics can provide a suitable solution without the need for complex and expensive computer systems. Microelectronics is a subfield of Electronics and, as the name suggests, is concerned with the study, development and/or manufacture of electronic components, i.e. integrated circuits (ICs). We have focused our research on a specific area of Microelectronics: hardware/software co-design. This technique is widely used for developing specialised devices with high computational cost. It relies on using both hardware and software in an effective way, thus obtaining devices that are faster than software-only solutions and smaller than devices that implement every process in dedicated hardware. The question of how to obtain an effective solution for Biometrics is addressed by considering all the different aspects of these systems. In this Thesis, we have made two important contributions, both related to Iris Biometrics: the first is a verification system based on an ID token, and the second a search engine for massive recognition systems. The first contribution is a proposal for a biometric system architecture based on ID tokens in a distributed system.
In this contribution, we specify design considerations for the system and describe the functionalities of the elements that form it, such as the central servers and/or the terminals. The terminal is left with only the task of acquiring the initial biometric raw data, which is transmitted under cryptographic protection to the token, where the whole biometric process is performed. The ID token architecture is based on hardware/software co-design. The proposed architecture, independent of the biometric modality, divides the biometric process between hardware and software in order to provide more functionality than existing tokens. This partition considers not only the reduction in computation time that hardware can provide, but also the reduction of area and power consumption, the increase in security levels, and the effects on recognition performance across the whole design. To validate the proposal, we have implemented an ID token based on Iris Biometrics following these premises. We have developed the modules of an iris algorithm on both hardware and software platforms to obtain the data needed for an effective combination of the two. We have also studied different alternatives for solving the partitioning problem in hardware/software co-design, with results pointing to tabu search as the fastest algorithm for this purpose. Finally, with all the data obtained, we have derived different architectures according to different constraints. We have presented architectures where time is the major requirement, obtaining 30% less processing time than an all-software solution. Likewise, another solution has been proposed which requires less area and power consumption. When recognition performance is the most important constraint, two architectures have been presented: one which also tries to minimise processing time, and another which reduces hardware area and power consumption.
Regarding security, we have also presented two architectures that treat time and hardware area as secondary requirements. Finally, we have presented an overall architecture in which all these factors are considered together. These architectures have allowed us to study how hardware improves security against authentication attacks, how recognition performance is influenced by the lack of floating-point operations in hardware modules, and how hardware reduces processing time while software reduces hardware area and power consumption. The other singular contribution is the development of a search engine for massive identification schemes, where time is a major constraint because the comparison must be performed against millions of users. We initially proposed two implementations: a centralised architecture, where the memories are connected to the microprocessor although the comparison is performed by a dedicated hardware co-processor, and a second approach, where the memory driver is connected directly to the hardware co-processor. This last architecture has shown us the importance of correctly connecting the elements used when time is a major requirement. A graphical representation of the different aspects covered in this Thesis is presented in Fig. 1, where the relations between the topics studied can be seen. The main topics, Biometrics and hardware/software co-design, have been studied and several of their aspects described, such as the different biometric modalities, where we have focused on Iris Biometrics, and the security of these systems. Hardware/software co-design has been studied by presenting different design alternatives and by identifying the most suitable configuration for ID tokens.
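The massive-identification comparison described above is, at its core, a nearest-template search over binary iris codes. A minimal software sketch of such a matcher is given below; it uses the masked fractional Hamming distance that is standard in iris recognition, but the code length, threshold value and function names are illustrative assumptions rather than the implementation developed in the thesis.

```python
import numpy as np

def hamming_distance(code_a, code_b, mask_a, mask_b):
    """Fractional Hamming distance between two binary iris codes,
    counting only bit positions that are valid in both masks."""
    valid = mask_a & mask_b
    disagreeing = (code_a ^ code_b) & valid
    n_valid = int(valid.sum())
    if n_valid == 0:
        return 1.0  # no usable bits: treat as maximally distant
    return float(disagreeing.sum()) / n_valid

def search(probe, probe_mask, gallery, gallery_masks, threshold=0.32):
    """Linear scan of the template gallery; returns the index of the
    best match below the decision threshold, or None if nobody matches."""
    best_idx, best_hd = None, threshold
    for i, (code, mask) in enumerate(zip(gallery, gallery_masks)):
        hd = hamming_distance(probe, code, probe_mask, mask)
        if hd < best_hd:
            best_idx, best_hd = i, hd
    return best_idx
```

It is this inner XOR-and-count loop, repeated over millions of templates, that a dedicated hardware co-processor can accelerate, which is why the placement of the memory connection relative to that co-processor dominates total search time in the two architectures compared above.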
All the data obtained from this analysis has allowed us to make two main proposals: the first focuses on the development of a fast search engine device, and the second combines all the factors relating to both fields in the hardware/software design of ID tokens. Both approaches have been implemented to show the feasibility of our proposal. Finally, further work and conclusions are presented as a result of the investigation carried out in this thesis.
-----------------------------------------------------------------------------------------
Nowadays, identifying the users of premises or services is gaining importance, not only to grant access but also to assign each user the corresponding privileges. Biometrics is an emerging technology which, besides performing these identification functions, provides higher levels of security than other methods in use, as well as being more convenient for the user. Most biometric systems are based on personal computers or servers; however, Microelectronics can provide suitable solutions for these systems at lower cost and complexity. Microelectronics is a field of Electronics which, as its name suggests, is based on the study, development and/or manufacture of electronic components, also called integrated circuits. We have focused our research on a specific field of Microelectronics called hardware/software co-design. This technique is used in the development of specialised devices with high computational cost. It is based on dividing the tasks to be performed between hardware and software, obtaining devices that are faster than those based on only one of the two platforms, and smaller than those based exclusively on hardware.
This thesis attempts to address the question of how to create solutions applicable to Biometrics. We have proposed two important contributions: one for verification systems that rely on identification devices, and a second one proposing the development of a massive search system. The first contribution is a methodology for developing a distributed system based on identification devices. In our proposal, the identification system consists of a central service provider, terminals and the devices themselves. The proposed terminals only acquire the sample needed for identification, since it is the devices themselves that carry out the identification process. The devices rely on an architecture based on hardware/software co-design, in which the biometric processes are performed on one of the two platforms regardless of the biometric modality involved. The division of tasks is arranged so that the designer can choose which parameters to emphasise, and thus different architectures can be obtained depending on whether the aim is to optimise processing time, area or power consumption, to minimise identification errors, or even to increase the security of the system by implementing in hardware those modules most susceptible to attack by intruders. To demonstrate this proposal, we have implemented one of these devices based on an iris recognition algorithm. We have developed all the modules of this algorithm in both hardware and software, and then combined them in search of architectures that satisfy given requirements. We have likewise studied different alternatives for solving the proposed partitioning problem, based on genetic algorithms, simulated annealing and tabu search.
With the data obtained from the preceding study and from the implemented modules, we have obtained an architecture that reduces execution time by 30%; another that reduces the area and power consumption of the device; two architectures that avoid loss of precision and thus minimise identification errors, one seeking to reduce area as far as possible and the other aiming for minimal processing time; two architectures that seek to increase security while minimising either time or area; and, finally, an architecture in which all the above factors are weighted equally. The second contribution of the thesis concerns the development of a search engine for massive identification. The premise of this proposal is to minimise time as far as possible so that users do not have to wait long to be identified. To this end we have proposed two alternatives: a classical architecture in which the memories are connected to a central microprocessor, which in turn communicates with a co-processor performing the comparison functions; and a second alternative in which the memories are connected directly to that co-processor, removing the microprocessor from the comparison process. Both proposals are compared and analysed, showing the importance of a correct and appropriate connection of the different elements that form a system. Fig. 2 shows the different topics covered in this thesis and the relations between them. The main topics studied are Biometrics and hardware/software co-design; different aspects of each are described, such as the various biometric modalities, focusing on iris Biometrics, and the security of these systems.
In the case of hardware/software co-design, a state of the art is presented covering various alternatives for the development of embedded systems, the work proposed by other authors in the field of co-design, and, finally, the characteristics that identification devices must satisfy as embedded systems. With all this information, we proceed to the development of the proposals described above and the work carried out. Finally, conclusions and future work are put forward on the basis of the research performed.
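The hardware/software partitioning task that the thesis attacks with genetic algorithms, simulated annealing and tabu search can be illustrated with a small sketch: each biometric module is assigned to hardware or software so as to minimise total processing time under an area budget. The module names, costs and budget below are invented for illustration; the thesis's real cost model also accounts for power, recognition performance and security.

```python
import math

# Hypothetical per-module costs: (software time, hardware time, hardware area).
# The names and numbers are invented for illustration only.
MODULES = {
    "segmentation":    (50.0, 12.0, 300),
    "normalisation":   (20.0,  6.0, 150),
    "feature_extract": (80.0, 15.0, 450),
    "matching":        (30.0,  8.0, 200),
}
AREA_BUDGET = 700  # available hardware area, arbitrary units

def cost(assignment):
    """Total processing time of a partition; partitions that exceed
    the area budget are infeasible and get infinite cost."""
    time = sum(MODULES[m][1] if hw else MODULES[m][0]
               for m, hw in assignment.items())
    area = sum(MODULES[m][2] for m, hw in assignment.items() if hw)
    return time if area <= AREA_BUDGET else math.inf

def tabu_search(iterations=100, tabu_tenure=3):
    current = {m: False for m in MODULES}          # start all-software
    best, best_cost = dict(current), cost(current)
    tabu = {}  # module -> iteration until which flipping it is tabu
    for it in range(iterations):
        candidates = []
        for m in MODULES:                          # neighbourhood: flip one module
            neigh = dict(current)
            neigh[m] = not neigh[m]
            c = cost(neigh)
            # a tabu move is only allowed by aspiration (it beats the best so far)
            if tabu.get(m, -1) < it or c < best_cost:
                candidates.append((c, m, neigh))
        if not candidates:
            continue
        c, m, current = min(candidates, key=lambda x: x[0])
        tabu[m] = it + tabu_tenure
        if c < best_cost:
            best, best_cost = dict(current), c
    return best, best_cost
```

The tabu list is what distinguishes this from plain hill climbing: recently flipped modules are temporarily forbidden, so the search can accept worsening moves and escape local optima instead of oscillating, which is consistent with the abstract's finding that tabu search reached good partitions fastest among the three metaheuristics tried.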

    The origami of desire: unfolding and refolding the desiring self (F)

    This thesis celebrates the emancipatory potential of writing, which can be a tool for creating alternative worlds and, as Françoise Lionnet says, for ‘reappropriating the past so as to transform our understanding of ourselves’. I use fictional and auto/biographical texts, read through Deleuzian theories of desire and subjectivity, to argue that we can use our powers of thought and expression to change our understanding of self and others and to live more creatively and joyfully. Traditionally, women have lived secondary lives, shaped and repressed by hierarchical and patriarchal codes of behaviour and thought; many still live like this. Desire has been defined (at least in the dominant Platonic tradition of Western philosophy) in terms of lack and loss; in this binary paradigm, desire is a secondary function of language and culture, and the subject is opposed to, and constituted by, the other. Gilles Deleuze and Félix Guattari propose instead that desire is a primary, impersonal connective force that flows through all life in a Spinozan universe that is composed of one substance, distributed on the intersecting yet distinct planes of the virtual or invisible and the material or visible. On these planes, a multiplicity of incorporeal events and organised forms and subjectivities proliferate and connect in a dance of difference and repetition — always folding, unfolding, refolding, becoming. The thesis is structured as a métissage or assemblage, a braiding of different narrative strands: theory, the literature — fictional and non-fictional — of medieval Heian Japanese court women, and my own auto/biographical writing.
My central proposition is that desire is immanent creative energy that produces folds of time, memory, material forms and subjectivity that, ‘like origami, can be unfolded and refolded into different shapes.’ I use the figure of origami as a recurring motif to describe and explore the ever-changing process of the construction of selfhood, which is both active and reactive, self- and other-folded. This process is illustrated in the literature of Heian women, whose lives were controlled to an extreme degree, but who, in their closeted interiors, created an extraordinary body of confessional and fictional literature, much of which is still extant, translated, studied and enjoyed. Desire, in this labyrinthine world, is active and masculine, yet the literature is a découverture of men’s penetrative and exploitative use of women for their gratification, and a celebration of the women’s hidden desires (for emotional satisfaction and security, for personal freedom, for spiritual fulfilment) and rich imaginative lives. This year, on November 1, Japan celebrates the thousandth anniversary of the creation of The Tale of Genji, the world’s first novel, considered by many scholars and readers to be a masterpiece. Such is the power of the imagination that a subjugated woman could produce a work that is transformative in its creative power throughout and beyond one thousand years of world history and culture. Within the braided narrative of the origami of desire, stories of my life are framed by reflections that theorise the themes of failed subjectivity — a construction of femininity within the bourgeois paradigm of woman as mother to her children and wife (and mother) to her husband, without ‘a life, sex and desires of her own’. The exclusion and censorship of female desire from this subjectification led, in my case, to a pursuit of love that resulted in the loss of my children.
The causes and effects of that loss in my life and theirs are narrated in several memoirs, and the interpretive narrative seeks to unfold the old dysfunctional and hegemonic forms of desire and repression that produced this failure and perform an autopoiesis or re-creation of self in different, freer, more fluid forms. This thesis is a mise-en-abîme of stories of self and others folded within the main narrative of desire as origami. Works of fiction and memoirs present narrated worlds that reflect the ‘real’ world we inhabit, creating stories within stories where, like Alice through the looking-glass, we can see much that is the same, yet much that is different. We return to the everyday world changed in subtle ways, and we can use our perceptions and affective responses to refold ourselves and the way we react to circumstances. The narration of self and other in memoir and fiction is a way in which we can reinterpret the world and thereby change it, becoming worthy of what happens, becoming the offspring of our own events, so that, as Deleuze puts it, we can have one more birth.

    Digital signal conditioning on multiprocessor systems

    An important application area of modern computer systems is that of digital signal processing. This discipline is concerned with the analysis or modification of digitally represented signals, through the use of simple mathematical operations. A primary need of such systems is that of high data throughput. Although optimised programmable processors are available, system designers are now looking towards parallel processing to gain further performance increases. Such parallel systems may be easily constructed using the transputer family of processors. However, although these devices are comparatively easy to program, they possess a general von Neumann core and so are relatively inefficient at implementing digital signal processing algorithms. The power of the transputer lies in its ability to communicate effectively, not in its computational capability. The converse is true of specialised digital signal processors. These devices have been designed specifically to implement the type of small data-intensive operations required by digital signal processing algorithms, but have not been designed to operate efficiently in a multiprocessor environment. This thesis examines the performance of both types of processors with reference to a common signal processing application, multichannel filtering. The transputer is examined in both uniprocessor and multiprocessor configurations, and its performance analysed. A theoretical model of program behaviour is developed, in order to assess the performance benefits of particular code structures and the effects of such parameters as data block size. The transputer implementation is contrasted with that of the Motorola DSP56001 digital signal processor. This device is found to be much more efficient at implementing such algorithms on a single device, but provides limited multiprocessor support. Using the conclusions of this assessment, a hybrid multiprocessor has been designed.
This consists of a transputer controlling a number of signal processors, communicating through shared memory, separating the tasks of computation and communication. Forcing the transputer to communicate through shared memory causes problems, and these have been addressed. A theoretical performance model of the system has been produced. A small system has been constructed, and is currently running performance test software.
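Multichannel filtering, the benchmark application used to compare the transputer and the DSP56001, amounts to running an independent FIR filter over each channel's sample stream. A minimal single-processor reference sketch is given below; the coefficients and function names are illustrative, not the thesis's benchmark code.

```python
import numpy as np

def fir_filter(samples, coeffs):
    """Direct-form FIR filter: y[n] = sum_k h[k] * x[n-k]."""
    taps = len(coeffs)
    # Prepend zeros so the first outputs see a zero-filled delay line.
    padded = np.concatenate([np.zeros(taps - 1), samples])
    return np.array([np.dot(coeffs, padded[n:n + taps][::-1])
                     for n in range(len(samples))])

def multichannel_filter(channels, coeffs):
    """Filter every channel independently; this per-channel independence
    is what makes the workload easy to spread across processors."""
    return [fir_filter(ch, coeffs) for ch in channels]
```

Because each channel is filtered independently, the outer loop parallelises naturally across processors, while the multiply-accumulate inner product is the operation that dedicated DSP hardware is built to execute efficiently; this is the tension between communication-oriented and computation-oriented devices that the hybrid design above tries to resolve.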

    Understanding Lists: Umberto Eco’s Rhetoric of Communication and Signification

    This project, Understanding Lists: Umberto Eco’s Rhetoric of Communication and Signification, begins and ends with an observation and warning suggested throughout Eco’s work: lists are the origin of culture, and the Internet as the Mother of All Lists threatens to end culture. To understand this warning, I turn to Eco’s work on lists, contextualized within a 2009 exhibition at the Musée du Louvre and in an illustrated collection, The Infinity of Lists. This project offers an analysis of Eco’s understanding of lists concurrent with his commentary on the social and cultural implications of the algorithm-obsessed Internet age. To understand his argument, this project collects hints of insight throughout his corpus. In Eco’s cultural aesthetics, he celebrates the notion of openness that invites and encourages audience participation in the interpretation of texts with multiple possibilities. With his interpretive semiotics, Eco offers a theory of culture grounded in signification and communication. Signification consists of the codes of culture that make meaning and interpretive response possible. Communication is the labor of sign production and interpretation. Throughout his literary praxis, Eco implements these theoretical notions in story-form, and with his fifth novel, The Mysterious Flame of Queen Loana, affirms the mutual necessity of communication and signification. Ultimately, Eco urges us to list as a response to the threats of algorithmic processing of big data that displaces and replaces the human interpreter. For Eco, listing is a form of communication that requires the labor to wade through information, activate codes of signification, and interpret cultural meaning.

    Critical Programming: Toward a Philosophy of Computing

    Beliefs about the relationship between human beings and computing machines and their destinies have alternated from heroic counterparts to conspirators of automated genocide, from apocalyptic extinction events to evolutionary cyborg convergences. Many fear that people are losing key intellectual and social abilities as tasks are offloaded to the everywhere of the built environment, which is developing a mind of its own. If digital technologies have contributed to forming a dumbest generation and ushering in a robotic moment, we all have a stake in addressing this collective intelligence problem. While digital humanities continue to flourish and introduce new uses for computer technologies, the basic modes of philosophical inquiry remain in the grip of print media, and default philosophies of computing prevail, or experimental ones propagate false hopes. I cast this as-is situation as the post-postmodern network dividual cyborg, recognizing that the rational enlightenment of modernism and the regressive subjectivity of postmodernism now operate in an empire of extended-mind cybernetics combined with techno-capitalist networks forming societies of control. Recent critical theorists identify a justificatory scheme foregrounding participation in projects, valorizing social network linkages over heroic individualism, and commending flexibility and adaptability through lifelong learning over stable career paths. It seems to reify one possible, contingent configuration of global capitalism as if it were the reflection of a deterministic evolution of commingled technogenesis and synaptogenesis. To counter this trend I offer a theoretical framework focused on the phenomenology of software and code, joining social critiques with textuality and media studies, the former proposing that theory be done through practice, and the latter seeking to understand their schematism of perceptibility by taking into account engineering techniques like time axis manipulation.
The social construction of technology makes additional theoretical contributions, dispelling closed-world, deterministic historical narratives and requiring that voices be given to the engineers and technologists who best know their subject area. This theoretical slate has recently been deployed to produce rich histories of computing, networking, and software, to inform the nascent disciplines of software studies and code studies, and to guide ethnographers of software development communities. I call my syncretism of these approaches the procedural rhetoric of diachrony in synchrony, recognizing that multiple explanatory layers operating in their individual temporal and physical orders of magnitude simultaneously undergird post-postmodern network phenomena. Its touchstone is that the human-machine situation is best contemplated by doing, which as a methodology for digital humanities research I call critical programming. Philosophers of computing explore working code places by designing, coding, and executing complex software projects as an integral part of their intellectual activity, reflecting on how developing theoretical understanding necessitates iterative development of code as it does other texts, and how resolving coding dilemmas may clarify or modify provisional theories as our minds struggle to intuit the alien temporalities of machine processes.