
    Rigorous direct and inverse design of photonic-plasmonic nanostructures

    Designing photonic-plasmonic nanostructures with desirable electromagnetic properties is a central problem in modern photonics engineering. Since the choice of available materials is limited, engineering the geometry of optical materials at both the element and array levels becomes the key to solving this problem. In this thesis, I present my work on the development of novel methods and design strategies for photonic-plasmonic structures and metamaterials, including novel Green’s matrix-based spectral methods for predicting the optical properties of large-scale nanostructures of arbitrary geometry. Moving from engineering elements to arrays, I begin my thesis by addressing toroidal electrodynamics as an emerging approach to enhance light absorption in designed nanodisks by geometrically creating anapole configurations using high-index dielectric materials. This work demonstrates, for the first time, enhanced absorption rates driven by multipolar decomposition of current distributions involving toroidal multipole moments. I also present my work on designing helical nano-antennas using the rigorous Surface Integral Equations method. The helical nano-antennas feature unprecedented beam-forming and polarization tunability controlled by their geometrical parameters, and can be understood from the array perspective. In these projects, optimization of optical performance is translated into a systematic study of identifiable geometric parameters. However, while array-geometry engineering presents multiple advantages, including physical intuition, versatility in design, and ease of fabrication, there is currently no rigorous and efficient solution for designing complex resonances in large-scale systems from an available set of geometrical parameters.
In order to achieve this important goal, I developed an efficient numerical code based on the Green’s matrix method for modeling scattering by arbitrary arrays of coupled electric and magnetic dipoles, and I demonstrate its relevance to the design of light localization and scattering resonances in deterministic aperiodic geometries. I show how universal properties driven by the aperiodic geometries of the scattering arrays can be obtained by studying the spectral statistics of the corresponding Green’s matrices, and how this approach leads to novel metamaterials for the visible and near-infrared spectral ranges. Within the thesis, I also present my collaborative work as examples of direct and inverse design of nanostructures for photonics applications, including plasmonic sensing, optical antennas, and radiation shaping.
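    The spectral approach described above can be sketched in a few lines. The following is a minimal illustration, not the thesis's actual code: it builds the standard scalar (non-dyadic) Green's matrix for N point scatterers, whose complex eigenvalues encode the detunings and decay rates of the collective scattering resonances. The wavenumber, array size, and the use of random point positions as a stand-in for a deterministic aperiodic pattern are all assumptions made here for brevity.

    ```python
    import numpy as np

    def green_matrix(positions, k):
        """Scalar Green's matrix for N point scatterers at the given positions.

        Off-diagonal entries couple scatterers j and l through the free-space
        scalar Green's function exp(i*k*r)/(k*r); the diagonal is the
        resonant self-term (taken as i in this convention).
        """
        n = len(positions)
        G = np.zeros((n, n), dtype=complex)
        for j in range(n):
            for l in range(n):
                if j == l:
                    G[j, l] = 1j  # resonant self-interaction term
                else:
                    r = np.linalg.norm(positions[j] - positions[l])
                    G[j, l] = np.exp(1j * k * r) / (k * r)
        return G

    # Random 3D point pattern as a simplified stand-in for a deterministic
    # aperiodic array (the thesis studies structured point sets instead).
    rng = np.random.default_rng(0)
    pts = rng.uniform(0.0, 10.0, size=(100, 3))

    G = green_matrix(pts, k=2 * np.pi)
    eigvals = np.linalg.eigvals(G)
    # The imaginary part of each eigenvalue relates to the decay rate of a
    # collective mode; long-lived (spatially localized) resonances show up
    # as eigenvalues with small width.
    ```

    Studying how the distribution of these eigenvalues changes with the array geometry is what "spectral statistics of the Green's matrix" refers to; the full vector (electric plus magnetic dipole) version replaces the scalar entries with 3x3 dyadic blocks.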

    100spaces: a fervent exercise in scenic flexibility, sustainability, and pedagogy

    Using 100spaces as the base for a case study, this thesis investigates the importance of including scenic designers within the dance and theatre pedagogy program. This methodical examination of parallel themes of immersivity, causality, rhythm and movement, sensory awareness and application, ecology and sustainability, and performativity of audiences and performers alike demonstrates a strong correlation between the core objectives of scenic design and pedagogy. The 100spaces project raises serious questions about the essentiality of accounting for time within collaborations at universities, and the benefits of learning from one another. Although university-level pedagogy studies in dance and theatre are currently taught with academic, theoretical, and philosophical concepts in mind, these studies would be notably enhanced by the addition of the design abilities, dramaturgical creativity, and ecological awareness acquired and practiced by scenic designers and scenographers. Thus, a conclusion can be made that, given the proper amount of concerted collaborative time, the curricula of both programs would benefit from the inclusion of the other.

    The Evolution of Language Universals: Optimal Design and Adaptation

    Inquiry into the evolution of syntactic universals is hampered by severe limitations on the available evidence. Theories of selective function nevertheless lead to predictions of local optimality that can be tested scientifically. This thesis refines a diagnostic, originally proposed by Parker and Maynard Smith (1990), for identifying selective functions on this basis and applies it to the evolution of two syntactic universals: (1) the distinction between open and closed lexical classes, and (2) nested constituent structure. In the case of the former, it is argued that the selective role of the closed class items is primarily to minimise the amount of redundancy in the lexicon. In the case of the latter, the emergence of nested phrase structure is argued to have been a by-product of selection for the ability to perform insertion operations on sequences, a function that plausibly pre-dated the emergence of modern language competence. The evidence for these claims is not just that these properties perform plausibly fitness-related functions, but that they appear to perform them in a way that is improbably optimal. A number of interesting findings follow when examining the selective role of the closed classes. In particular, case, agreement, and the requirement that sentences have subjects are expected consequences of an optimised lexicon, the theory thereby relating these properties to natural selection for the first time. It also motivates the view that language variation is confined to parameters associated with closed class items, in turn explaining why parameter conflicts fail to arise in bilingualism. The simplest representation of sequences that is optimised for efficient insertions can represent both nested constituent structure and long-distance dependencies in a unified way, thus suggesting that movement is intrinsic to the representation of constituency rather than an 'imperfection'.
The basic structure of phrases also follows from this representation and helps to explain the interaction between case and theta assignment. These findings bring together a surprising array of phenomena, reinforcing the representation's correctness as the basis of syntactic structures. The diagnostic overcomes shortcomings in the approach of Pinker and Bloom (1990), who argued that the appearance of 'adaptive complexity' in the design of a trait could be used as evidence of its selective function; however, there is no reason to expect the refinements of natural selection to increase complexity in any given case. Optimality considerations are also applied in this thesis to filter theories of the nature of unobserved linguistic representations as well as theories of their functions. In this context, it is argued that, despite Chomsky's (1995) resistance to the idea, it is possible to motivate the guiding principles of the Minimalist Program in terms of evolutionary optimisation, especially if we allow the possibility that properties of language were selected for non-communicative functions and that redundancy is sometimes costly rather than beneficial.
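    The idea that nesting falls out of a sequence representation optimised for insertion can be illustrated with a toy example. This sketch is not the thesis's formalism; the cons-pair encoding and the word strings are hypothetical choices made purely to show that a representation supporting cheap local splicing automatically yields nested, tree-like structure.

    ```python
    # A cons-pair (nested tuple) encoding of a sequence. Inserting after a
    # cell is a local, constant-time splice that leaves the shared tail
    # untouched -- and every insertion deepens the nesting, so tree-like
    # constituent structure emerges as a by-product of the operation itself.

    def cons(head, tail):
        return (head, tail)

    def insert_after(cell, item):
        """Splice `item` in immediately after the head of `cell`."""
        head, tail = cell
        return cons(head, cons(item, tail))

    seq = cons("the", cons("dog", None))   # ("the", ("dog", None))
    seq = insert_after(seq, "big")         # ("the", ("big", ("dog", None)))
    ```

    The point of the toy is only that the data structure best suited to efficient insertion is already hierarchical, which is the shape of the by-product argument the abstract makes.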

    New Foundation in the Sciences: Physics without sweeping infinities under the rug

    It is widely known at the frontiers of physics that the practice of “sweeping under the rug” has been the norm rather than the exception. In other words, the leading paradigms have a strong tendency to be hailed as the only game in town. For example, renormalization group theory was hailed as a cure for the infinity problem in QED. A well-known passage describes Feynman’s own assessment: “What the three Nobel Prize winners did, in the words of Feynman, was to get rid of the infinities in the calculations. The infinities are still there, but now they can be skirted around . . . We have designed a method for sweeping them under the rug.” [1] And Paul Dirac wrote in a similar vein: “Hence most physicists are very satisfied with the situation. They say: Quantum electrodynamics is a good theory, and we do not have to worry about it any more. I must say that I am very dissatisfied with the situation, because this so-called good theory does involve neglecting infinities which appear in its equations, neglecting them in an arbitrary way. This is just not sensible mathematics. Sensible mathematics involves neglecting a quantity when it turns out to be small—not neglecting it just because it is infinitely great and you do not want it!” [2] Similarly, dark matter and dark energy were elevated as a plausible way to solve the crisis in prevalent Big Bang cosmology. That is why we choose the theme New Foundations in the Sciences, in order to emphasize the necessity of introducing a new set of approaches in the sciences, be it physics, cosmology, consciousness, etc.