    Book announcements

    First passage sets of the 2D continuum Gaussian free field

    We introduce the first passage set (FPS) of constant level $-a$ of the two-dimensional continuum Gaussian free field (GFF) on finitely connected domains. Informally, it is the set of points in the domain that can be connected to the boundary by a path on which the GFF does not go below $-a$. It is, thus, the two-dimensional analogue of the first hitting time of $-a$ by a one-dimensional Brownian motion. We provide an axiomatic characterization of the FPS, a continuum construction using level lines, and study its properties: it is a fractal set of zero Lebesgue measure and Minkowski dimension 2 that is coupled with the GFF $\Phi$ as a local set $A$ so that $\Phi + a$ restricted to $A$ is a positive measure. One of the highlights of this paper is identifying this measure as a Minkowski content measure in the non-integer gauge $r \mapsto \vert\log(r)\vert^{1/2} r^{2}$, by using Gaussian multiplicative chaos theory. Comment: the first version also contained arXiv:1805.09204, which is now a paper on its own; the third version is an all-around improved version; 42 pages; 8 figures.
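    For orientation, the gauged Minkowski content mentioned above can be written, up to normalization conventions (a hedged reading of the abstract, not a quotation of the paper's exact definition), as a limit of rescaled Lebesgue measures of the $r$-neighbourhoods $A_r$ of the FPS $A$:

        \[
            \mu_A(S) \;=\; \lim_{r \to 0} \vert\log(r)\vert^{1/2}\, \mathrm{Leb}\bigl(S \cap A_r\bigr),
        \]

    which can be finite and non-trivial only because the gauge $r \mapsto \vert\log(r)\vert^{1/2} r^{2}$ vanishes more slowly than the usual $r^{2}$ of a full-dimensional planar set; the coupling described above then identifies $(\Phi + a)$ restricted to $A$ with a multiple of $\mu_A$.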

    Simulating daily field crop canopy photosynthesis: an integrated software package

    Photosynthetic manipulation is seen as a promising avenue for advancing field crop productivity. However, progress is constrained by the lack of connection between leaf-level photosynthetic manipulation and crop performance. Here we report on the development of a model of diurnal canopy photosynthesis for well-watered conditions, built by upscaling biochemical models of C3 and C4 photosynthesis to the canopy level using the simple and robust sun-shade leaf representation of the canopy. The canopy model was integrated over the time course of the day to simulate diurnal canopy photosynthesis. Rationality analysis of the model showed that it simulated the expected responses in diurnal canopy photosynthesis and daily biomass accumulation to key environmental factors (i.e. radiation, temperature and CO2), canopy attributes (e.g. leaf area index and leaf angle) and canopy nitrogen status (i.e. specific leaf nitrogen and its profile through the canopy). This Diurnal Canopy Photosynthesis Simulator (DCaPS) was developed into a web-based application to enhance the usability of the model. Applications of the DCaPS package for assessing likely canopy-level consequences of changes in photosynthetic properties, and their implications for connecting photosynthesis with crop growth and development modelling, are discussed.
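    The upscaling-and-integration idea described above can be illustrated with a short, self-contained sketch (a toy model for orientation only: the sinusoidal irradiance, the rectangular-hyperbola light response, the crude shaded-leaf light level and all parameter values are placeholder assumptions, not the DCaPS equations):

        import numpy as np

        def sun_shade_canopy_photosynthesis(lai=5.0, k=0.5, i0_max=2000.0,
                                            amax=30.0, alpha=0.06, daylength=14.0):
            """Toy diurnal sun/shade canopy photosynthesis integrator.

            lai       : leaf area index (m2 leaf per m2 ground)
            k         : canopy light-extinction coefficient (Beer's law)
            i0_max    : midday incident PAR (umol photons m-2 s-1), placeholder
            amax      : light-saturated leaf photosynthesis (umol CO2 m-2 s-1)
            alpha     : initial quantum yield (mol CO2 per mol photons)
            daylength : photoperiod (h)
            Returns daily gross canopy photosynthesis (mol CO2 m-2 ground per day).
            """
            hours = np.linspace(0.0, daylength, 200)          # time steps through the day
            i0 = i0_max * np.sin(np.pi * hours / daylength)   # sinusoidal diurnal irradiance

            # Sunlit/shaded partitioning of leaf area via Beer's law
            lai_sun = (1.0 - np.exp(-k * lai)) / k
            lai_shade = lai - lai_sun

            # Mean PAR per unit sunlit / shaded leaf area (crude placeholders)
            i_sun = k * i0
            i_shade = 0.1 * i0

            def leaf_photo(i_leaf):
                # Rectangular-hyperbola light response standing in for the
                # C3/C4 biochemical leaf models used by the actual package
                return amax * alpha * i_leaf / (alpha * i_leaf + amax)

            a_canopy = leaf_photo(i_sun) * lai_sun + leaf_photo(i_shade) * lai_shade
            dt_h = hours[1] - hours[0]                        # hours per time step
            # Sum over the day; convert umol s-1 to mol h-1 via 3600 * 1e-6
            return a_canopy.sum() * dt_h * 3600.0 * 1e-6

        print(f"Daily canopy photosynthesis: {sun_shade_canopy_photosynthesis():.2f} mol CO2 m-2")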

    A Mathematical Framework of Human Thought Process: Rectifying Software Construction Inefficiency and Identifying Characteristic Efficiencies of Networked Systems Via Problem-solution Cycle

    Problem: The lack of a theory to explain the human thought process latently affects the general perception of problem-solving activities. The present study set out to theorize the human thought process (HTP) in order to ascertain, in general, the effect of problem-solving inadequacy on efficiency.

    Method: To theorize HTP, basic human problem-solving activities were investigated through the lens of the problem-solution cycle (PSC). The scope of the PSC investigation was focused on the inefficiency problem in software construction and on the latent characteristic efficiencies of a similar networked system. In order to analyze these PSC activities, three mathematical quotients and a messaging wavefunction model similar to Schrödinger's electronic wavefunction model were derived for four intrinsic brain traits, namely intelligence, imagination, creativity and language. These were substantiated using appropriate empirical verifications. Firstly, statistical analysis of the intelligence, imagination and creativity quotients was done using empirical data with global statistical views from: 1. the 1994-2004 CHAOS reports (Standish Group International's surveys of software development project successes and failures); 2. 2000-2009 Global Creativity Index (GCI) data based on the 3Ts of economic development (technology, talent and tolerance indices) from 82 nations; 3. other varied localized success and failure surveys covering 1994-2009 and 1998-2010 respectively. These statistical analyses were done using a spliced decision Sperner system (SDSS) to show that the averages of all empirical data on software production successes and failures within the specified periods are in excellent agreement with the theoretically derived values. Further, the catalytic effect of creativity (thought catalysis) in the human thought process is outlined and shown to be consistent with newly discovered branch-like nerve cells in the brains of mice (similar to the human brain). Secondly, the networked communication activities of the language trait during the PSC were scrutinized statistically using journal-journal citation data from 13 randomly selected major chemistry journals of 1984. With the aid of the aforementioned messaging wave formulation, computer simulations of message-phase "thermograms" and "chromatograms" were generated to provide messaging line spectra reflecting the behavioral messaging activities of the network under study.

    Results: Theoretical computations stipulated an efficiency of 66.67% due to interactions of the intelligence, imagination and creativity traits (multi-computational skills) and 33.33% due to the networked linkages of the language trait (aggregated language skills). The worldwide software production and economic data used were normally distributed with a significance level α of 0.005; thus there existed a permissible error of 1% attributable to the significance level of the normally distributed data. Of the brain-trait quotient statistics, the imagination quotient (IMGQ) score was 52.53% from the 1994-2004 CHAOS data analysis and 54.55% from the 2010 GCI data; their average reasonably approximated the 50th percentile of the cumulative distribution of problem-solving skills. The creativity quotient score was 0.99% from the 1994-2004 CHAOS data and 1.17% from the 2010 GCI data, averaging to nearly 1%. The chance of creativity and intelligence working together as joint problem-solving skills was consistently found to average 11.32% (1994-2004 CHAOS: 10.95%; 2010 GCI: 11.68%). Also, the empirical data analysis showed that the language inefficiency of thought flow η′(τ) was 35.0977% for the 1994-2004 CHAOS data and 34.9482% for the 2010 GCI data, averaging around 35%. On the success and failure of software production, statistical analysis of empirical data showed 63.2% average efficiency for successful software production (1994-2012) and 33.94% average inefficiency for failed software production (1998-2010). On the whole, software production projects had a bound efficiency approach level (BEAL) of 94.8%. In the messaging wave analysis of the 13 journal-to-journal citations, the messaging phase-space graphs indicated a fundamental frequency (probable minimum message state) of 11.

    Conclusions: By comparison, using the cutoff level of printed editions of the Journal Citation Reports to substitute for missing data values is inappropriate; however, values from the optimizing method(s) harmonized with the fundamental frequency inferred from the message wave analysis using informatics wave equation analysis (IWEA). Due to its evenly spaced chronological data snapshots, application of the SDSS technique inherently diminishes the difficulty associated with handling large data volumes (big data) for analysis. From the CHAOS and GCI data analysis, the averaged CRTQ scores indicate that only about 1 percent of the entire human race, on average, can be considered exceptionally creative. However, in the art of software production, the siphoning effect of the existing latent language inefficiency suffocates the process of solution creation, bounding its efficiency at 66.67%. With a BEAL value of 94.8% and a basic human error of 5.2%, it can reasonably be said that software production projects have delivered efficiently within the existing latent inefficiency. Consequently, by inference from the average language inefficiency of thought flow, an average language efficiency of 65% exists in the process of software production worldwide. This correlates very strongly with the existing average software production efficiency of 63.2%, around which the software crisis has, on average, stagnated since the inception of software creation. The persistent dismal performance of software production is attributable to the existing central focus on the use of a multiplicity of programming languages. Acting as an "efficiency buffer", the latter minimizes changes to efficiency in software production, thereby limiting software production efficiency theoretically to 66.67%. From both theoretical and empirical perspectives, this latently shrouds software production in a deficit maximum attainable efficiency (DMAE). The software crisis can only be improved drastically through policy-driven adoption of a universal standard supporting a very minimal number of programming languages. On average, the proposed universal standardization could save the world an estimated 6 trillion US dollars per year, which is currently lost through the existing inefficient software industry.
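    The efficiency bookkeeping implied by the figures above can be made explicit; the pairing of the BEAL value with the ratio of observed to bound efficiency is an inference from the quoted numbers, not a statement taken from the study:

        \[
            66.67\% \ (\text{multi-computational skills}) \;+\; 33.33\% \ (\text{aggregated language skills}) \;=\; 100\%,
        \]
        \[
            \eta'(\tau) \;\approx\; \tfrac{1}{2}\,(35.0977\% + 34.9482\%) \;\approx\; 35\%,
            \qquad 100\% - 35\% \;=\; 65\% \;\approx\; 63.2\% \ (\text{observed efficiency}),
        \]
        \[
            \mathrm{BEAL} \;=\; \frac{63.2\%}{66.67\%} \;\approx\; 94.8\% \;=\; 100\% - 5.2\% \ (\text{basic human error}).
        \]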

    Structural vibration energy harvesting via bistable nonlinear attachments

    A vibration-based bistable electromagnetic energy harvester coupled to a directly excited host structure is theoretically and experimentally examined. The primary goal of the study is to investigate the potential benefit of the bistable element for harvesting broadband, low-amplitude vibration energy. The considered system consists of a grounded, weakly damped, linear oscillator (LO) coupled to a lightweight, damped oscillator by means of an element that provides both cubic nonlinear and negative linear stiffness components, together with electromechanical coupling elements. Single and repeated impulses of varying amplitude applied to the LO are the vibration energy sources considered. A thorough sensitivity analysis of the system's key parameters provides design insights for a bistable nonlinear energy harvesting (BNEH) device able to attain robust harvesting efficiency. Energy localization into the bistable attachment is achieved by exploiting three main BNEH dynamical regimes, namely periodic cross-well, aperiodic (chaotic) cross-well, and in-well oscillations. For the experimental investigation of the performance of the bistable device, the nonlinear and negative linear terms in the mechanical coupling are physically realized by exploiting the transverse displacement of a buckled slender steel beam; the electromechanical coupling is accomplished by an electromagnetic transducer.
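    A minimal numerical sketch of this kind of system is given below, assuming a lumped two-degree-of-freedom model in which the electromagnetic transducer is represented by an equivalent electrical damping acting on the relative velocity; the parameter values and the simple energy accounting are illustrative placeholders, not the identified properties of the experimental beam device:

        import numpy as np
        from scipy.integrate import solve_ivp, trapezoid

        # Illustrative parameters (placeholders, not the paper's identified values)
        m1, c1, k1 = 1.0, 0.02, 100.0    # grounded, weakly damped linear oscillator (LO)
        m2, c2 = 0.05, 0.01              # lightweight bistable attachment
        k_neg, k_cub = -5.0, 5.0e4       # negative linear + cubic coupling stiffness
        c_e = 0.02                       # equivalent electrical damping of the transducer

        def rhs(t, y):
            x1, v1, x2, v2 = y
            z, vz = x2 - x1, v2 - v1                           # relative coordinates
            f = k_neg * z + k_cub * z**3 + (c2 + c_e) * vz     # bistable + dissipative coupling force
            a1 = (-c1 * v1 - k1 * x1 + f) / m1
            a2 = -f / m2
            return [v1, a1, v2, a2]

        # Impulsive excitation of the LO: initial velocity, zero displacements
        y0 = [0.0, 0.5, 0.0, 0.0]
        sol = solve_ivp(rhs, (0.0, 20.0), y0, max_step=1e-3)

        vz = sol.y[3] - sol.y[1]
        harvested = trapezoid(c_e * vz**2, sol.t)              # energy dissipated in the electrical load
        input_energy = 0.5 * m1 * y0[1]**2
        print(f"Harvested fraction of input energy: {harvested / input_energy:.2%}")

    Sweeping the initial impulse velocity in such a sketch qualitatively reproduces the distinction between in-well and cross-well responses mentioned above.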

    Buffer Engineering for M|G|∞ Input Processes

    We suggest the M|G|∞ input process as a viable model for representing the heavy correlations observed in network traffic. Originally introduced by Cox, this model represents the busy-server process of an M|G|∞ queue with Poisson inputs and general service times distributed according to $G$, and provides a large and versatile class of traffic models. We examine various properties of the M|G|∞ process, focusing particularly on its rich correlation structure. The process is shown to effectively portray short- or long-range dependence simply by controlling the tail of the distribution $G$.

    In an effort to understand the dynamics of a system supporting M|G|∞ traffic, we study the large buffer asymptotics of a multiplexer driven by an M|G|∞ input process. Using the large deviations framework developed by Duffield and O'Connell, we investigate the tail probabilities for the steady-state buffer content. The key step in this approach is the identification of the appropriate large deviations scaling. This scaling is shown to be closely related to the forward recurrence time of the service time distribution, and a closed-form expression is derived for the corresponding limiting log-moment generating function associated with the input process. Three different regimes are identified.

    The results are then applied to obtain the large buffer asymptotics under a variety of service time distributions. In each case, the derived asymptotics are compared with simulation results. While the general functional form of the buffer asymptotics may be derived via large deviations techniques, direct arguments often provide a more precise description when the input traffic is heavily correlated. Even so, several significant inferences may be drawn from the functional dependencies of the tail buffer probabilities. The asymptotics already indicate a sub-exponential behavior in the case of heavily correlated traffic, in sharp contrast to the geometric decay usually observed for Markovian input streams. This difference, along with a shift in the explicit dependence of the asymptotics on the input and output rates $r_{in}$ and $c$, from $\rho = r_{in}/c$ when $G$ is exponential to $\Delta = c - r_{in}$ when $G$ is sub-exponential, clearly delineates the heavy- and light-tailed cases. Finally, comparison with similar asymptotics for a different class of input processes indicates that buffer sizing cannot be adequately determined by appealing solely to the short- versus long-range dependence characterization of the input model used.
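    As a complement, the heavy-correlation behaviour can be illustrated with a small discrete-time simulation sketch, assuming Poisson session arrivals, Pareto session durations (one unit of work per active session per slot) and a constant-rate server; the parameters lam, tail_index and c are illustrative choices, and the buffer is the standard Lindley recursion rather than the paper's large-deviations machinery:

        import numpy as np

        rng = np.random.default_rng(0)

        def mginfty_busy_servers(lam=4.0, horizon=100_000, tail_index=1.5):
            """Discrete-time M|G|infinity input: Poisson(lam) session arrivals per slot;
            each session stays active for a Pareto-distributed number of slots, giving
            long-range dependence when tail_index < 2."""
            delta = np.zeros(horizon + 1, dtype=np.int64)
            arrivals = rng.poisson(lam, size=horizon)
            for t in range(horizon):
                n = arrivals[t]
                if n == 0:
                    continue
                durations = np.ceil(rng.pareto(tail_index, size=n) + 1.0).astype(np.int64)
                delta[t] += n                                                # sessions switching on
                np.add.at(delta, np.minimum(t + durations, horizon), -1)     # sessions switching off
            return np.cumsum(delta[:horizon])                                # busy-server count b_t

        def buffer_content(b, c):
            """Lindley recursion q_t = max(q_{t-1} + b_t - c, 0) for service rate c."""
            q, acc = np.empty(len(b)), 0.0
            for t, bt in enumerate(b):
                acc = max(acc + bt - c, 0.0)
                q[t] = acc
            return q

        b = mginfty_busy_servers()
        q = buffer_content(b, c=18.0)        # mean input rate lam * E[duration] (~14.5) stays below c
        for x in (50, 100, 200, 400):
            print(f"P[Q > {x}] ~ {np.mean(q > x):.4f}")

    Plotting log P[Q > x] against x from such a run typically shows the slowly decaying, non-geometric tail that the sub-exponential asymptotics above describe for heavy-tailed $G$.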

    The computational magic of the ventral stream: sketch of a theory (and why some deep architectures work).

    This paper explores the theoretical consequences of a simple assumption: the computational goal of the feedforward path in the ventral stream -- from V1, V2, V4 and to IT -- is to discount image transformations, after learning them during development.
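    One simplified way to read "discounting learned image transformations" is to store, during a development phase, the orbit of a template under a transformation family and then pool the input's similarities to that orbit; the following toy sketch (1-D signals, circular shifts as the transformation group, max pooling) is a hedged illustration of that idea rather than the paper's actual construction:

        import numpy as np

        rng = np.random.default_rng(1)

        def transform(signal, shift):
            # Toy transformation family: circular shifts of a 1-D "image"
            return np.roll(signal, shift)

        # "Development": memorize the full orbit of one template under the group
        template = rng.standard_normal(64)
        orbit = np.stack([transform(template, s) for s in range(64)])
        orbit_norms = np.linalg.norm(orbit, axis=1)

        def signature(signal):
            # Invariant signature: pool (max) the normalized dot products of the
            # input with every stored transformed template
            dots = orbit @ signal / (orbit_norms * np.linalg.norm(signal))
            return dots.max()

        probe = rng.standard_normal(64)
        shifted = transform(probe, 17)
        # The two signatures agree (up to floating-point error): the shift is discounted
        print(signature(probe), signature(shifted))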