43,611 research outputs found

    Optimization of micromachined reflex klystrons for operation at terahertz frequencies

    New micromachining techniques now provide us with the technology to fabricate reflex klystron oscillators with dimensions suitable for operation in the terahertz region of the electromagnetic spectrum. For these devices to succeed, accurate designs are required, since the optimization of certain parameters is critical to obtaining useful amounts of ac power. Classical models for device design have long been in existence, but these are no longer valid at terahertz frequencies. For this reason, we have developed a simulation tool specifically aimed at the design of terahertz-frequency reflex klystrons. The tool, based on a Monte Carlo algorithm, includes loss mechanisms and takes into account the main peculiarities expected for device operation at terahertz frequencies. In this study, the tool is used to investigate the influence of the electron beam aperture angle and the cavity dimensions (particularly the grid spacing) on ac power generation. The results demonstrate that aperture angles of less than 10° are necessary to optimize the output power. It is also found that the power output is highly sensitive to the distance between the grids.
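
    The kind of particle-based bunching estimate that such a tool builds on can be sketched in a few lines. The Python snippet below is a generic, textbook-style Monte Carlo illustration, not the simulation tool described above; the beam voltage, gap voltage and repeller transit angle are assumed values chosen near a standard operating mode. It samples electron entry phases, applies the gap velocity modulation and the velocity-dependent repeller transit, and estimates the fundamental component of the returned beam current, which sets the ac power delivered to the cavity.

        import numpy as np

        # Generic Monte Carlo sketch of reflex-klystron bunching (illustrative only,
        # not the design tool from the abstract). Electrons enter the RF gap at
        # uniformly distributed phases, pick up a velocity modulation, are turned
        # around in the repeller region, and return after a transit angle
        # proportional to their velocity. The magnitude of the fundamental Fourier
        # component of the returned current measures the quality of the bunching.

        rng = np.random.default_rng(0)

        N = 200_000              # number of sampled electrons
        V0 = 600.0               # beam voltage in volts (assumed value)
        V1 = 94.0                # RF gap voltage amplitude in volts (assumed value)
        theta0 = 7.5 * np.pi     # dc repeller transit angle, near the 3 + 3/4 cycle mode

        phi = rng.uniform(0.0, 2.0 * np.pi, N)      # entry phase of each electron
        v = np.sqrt(1.0 + (V1 / V0) * np.sin(phi))  # normalized velocity after the gap
        theta = theta0 * v                          # return transit angle

        # Fundamental harmonic of the bunched return current, normalized to the dc current
        i1_over_I0 = 2.0 * np.abs(np.mean(np.exp(-1j * (phi + theta))))
        print(f"|i1|/I0 ≈ {i1_over_I0:.3f} (Bessel-function theory: about 1.16 at optimum bunching)")

    A full device model would additionally resolve the beam aperture angle, the grid spacing and the loss mechanisms, which is where the abstract reports the strongest sensitivities of the output power.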

    The Threat of Exclusion and Relational Contracting

    Relational contracts have been shown to mitigate moral hazard in labor and credit markets. A central assumption in most theoretical and experimental studies is that, upon misbehaving, agents can be excluded from their current source of income and have to resort to less attractive outside options. This threat of exclusion is unrealistic in many environments, especially in credit and investment contexts. We examine experimentally the emergence and time structure of relational contracts when the threat of exclusion is weakened. We focus on bilateral credit relationships in which strategic default is possible. We compare a weak exclusion treatment, in which defaulting borrowers can reinvest borrowed funds, with a strong exclusion treatment, in which defaulting borrowers must liquidate borrowed funds. We find that under weak exclusion more relationships break down in early periods and credit relationships are more likely to “start small”.

    Phenomenological research on professional knowledge and educational relationship building

    Following Dewey’s (1997) and Schwab’s (2013) ideas, Clandinin & Connelly (1992) developed their notion of the teacher as curriculum maker, that is, the “teacher not so much as a maker of curriculum but as a part of it and to imagine a place for contexts, culture (Dewey’s notion of interaction), and temporality (both past and future contained in Dewey’s notion of continuity)” (p. 365). In this way, teachers are not seen as mere implementers of curricular plans but as part of the curriculum-making process. In other words, they understand that students create their curriculum through their experience at school as they interact with teachers and the environment. Therefore, the educational relationship creates the framework in which learning can take place and students can build knowledge (Atkinson, 2015); that is, relationships generate meeting places that allow the making and reshaping of curriculum. If teaching takes place in the relationship, it entails recognition (and acceptance) of the other person, of their otherness. It supposes trying to come into relation with the other, and it also implies accepting the uncertainty that otherness carries. Education is therefore not about implementing an educational programme in order to achieve (pre)determined results. It is not about intervening on students; rather, it is an experience of relationship in which each one constructs their own story (Molina, Blanco & Arbiol, 2016). In short, curriculum is made through experiences that are lived in relation and, therefore, we could say that education is an act of relationship (Piussi, 2006). In this way, education does not require that teachers have the most appropriate knowledge and programme for every situation; the educational experience is unpredictable and ineffable, and we cannot fully anticipate or face it (Van Manen, 2015). Thus, teaching requires becoming aware of how we build relationships and how we see the other person (Contreras, 2002). Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech

    Statistical properties of a filtered Poisson process with additive random noise: Distributions, correlations and moment estimation

    Filtered Poisson processes are often used as reference models for intermittent fluctuations in physical systems. Such a process is here extended by adding a noise term, either as a purely additive term to the process or as a dynamical term in a stochastic differential equation. The lowest order moments, probability density function, auto-correlation function and power spectral density are derived and used to identify and compare the effects of the two different noise terms. Monte Carlo studies of synthetic time series are used to investigate the accuracy of model parameter estimation and to identify methods for distinguishing the noise types. It is shown that the probability density function and the three lowest order moments provide accurate estimates of the parameters, but are unable to separate the noise types. The auto-correlation function and the power spectral density also provide methods for estimating the model parameters, as well as being capable of identifying the noise type. The number of times the signal crosses a prescribed threshold level in the positive direction also promises to be able to differentiate the noise type. Comment: 34 pages, 25 figures
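
    As a minimal illustration of the reference model discussed above (a sketch only, not the authors' code; the intermittency parameter and the noise-to-signal ratio below are assumed values), the following Python snippet synthesizes a filtered Poisson process of one-sided exponential pulses with exponentially distributed amplitudes, adds a purely additive Gaussian noise term, and compares the lowest-order sample moments with their analytical values:

        import numpy as np
        from scipy.signal import lfilter

        # Minimal sketch (not the authors' code): a filtered Poisson process built
        # from one-sided exponential pulses with exponentially distributed
        # amplitudes arriving at Poisson-distributed times, plus a purely additive
        # Gaussian noise term. Sample moments are compared with analytical values.

        rng = np.random.default_rng(1)

        dt = 1e-2       # time step, in units of the pulse duration tau_d = 1
        T = 1e4         # record length
        gamma = 0.5     # intermittency parameter: tau_d / (mean pulse waiting time)
        eps = 0.1       # additive-noise to signal rms ratio (assumed value)

        n = int(T / dt)
        n_pulses = rng.poisson(gamma * T)     # Poisson-distributed number of pulses
        forcing = np.zeros(n)
        np.add.at(forcing, rng.integers(0, n, n_pulses), rng.exponential(1.0, n_pulses))

        # One-sided exponential pulse shape exp(-t/tau_d), applied as a first-order
        # recursive filter: x[k] = exp(-dt) * x[k-1] + forcing[k]
        x = lfilter([1.0], [1.0, -np.exp(-dt)], forcing)
        x += eps * np.sqrt(gamma) * rng.normal(size=n)    # purely additive noise

        m, v = x.mean(), x.var()
        s = np.mean((x - m) ** 3) / v ** 1.5
        print(f"mean     ≈ {m:.3f}  (Campbell's theorem: gamma = {gamma})")
        print(f"variance ≈ {v:.3f}  (gamma * (1 + eps^2) = {gamma * (1 + eps**2):.3f})")
        print(f"skewness ≈ {s:.3f}  (2/sqrt(gamma) ≈ {2 / np.sqrt(gamma):.2f} without noise)")

    The same synthetic signal can then be used to estimate the auto-correlation function, the power spectral density or the rate of upward threshold crossings, the quantities the abstract identifies as capable of separating the two noise types.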

    Holographic non-computers

    We introduce the notion of a holographic non-computer as a system which exhibits parametrically large delays in the growth of complexity, as calculated within the Complexity-Action proposal. Some known examples of this behavior include extremal black holes and near-extremal hyperbolic black holes. Generic black holes in higher-dimensional gravity also show non-computing features. Within the 1/d expansion of General Relativity, we show that large-d scalings which capture the qualitative features of complexity, such as a linear growth regime and a plateau at exponentially long times, also exhibit an initial computational delay proportional to d. While consistent for large AdS black holes, the required 'non-computing' scalings are incompatible with thermodynamic stability for Schwarzschild black holes, unless they are tightly caged. Comment: 23 pages, 7 figures. V3: References added. Figures updated. New discussion of small black holes in the canonical ensemble
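
    For orientation only: in the Complexity-Action proposal referred to above, complexity is computed from the gravitational action of the Wheeler-DeWitt patch, and the "non-computer" behaviour corresponds to the familiar late-time linear growth setting in only after a parametrically large delay. Schematically (with the delay scaling taken from the abstract, not derived here),

        \mathcal{C}(t) = \frac{I_{\mathrm{WDW}}(t)}{\pi\hbar},
        \qquad
        \left.\frac{d\mathcal{C}}{dt}\right|_{t \gg t_{\mathrm{delay}}} \sim \frac{2M}{\pi\hbar},
        \qquad
        t_{\mathrm{delay}} \propto d,

    with the growth eventually saturating at a plateau on timescales of order e^{S}, as described in the abstract.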