502 research outputs found
Mediating Cognitive Transformation with VR 3D Sketching during Conceptual Architectural Design Process
Communication for information synchronization during the conceptual design phase requires designers to employ more intuitive digital design tools. This paper presents the findings of a feasibility study on using a VR 3D sketching interface to replace current non-intuitive CAD tools. We used a sequential mixed-methods design comprising a qualitative case study and a cognitive, protocol-analysis-based quantitative experiment. First, the case study was conducted to understand how novice designers make intuitive decisions; it documented the failure of conventional sketching methods in articulating complicated design ideas and the shortcomings of current CAD tools for intuitive ideation. The case study's findings then provided the theoretical foundation for testing the feasibility of using a VR 3D sketching interface during design. The second phase evaluated designers' spatial cognition and collaboration at six levels: "physical-actions", "perceptual-actions", "functional-actions", "conceptual-actions", "cognitive synchronizations", and "gestures". The results confirmed the hypotheses and showed that the tangible 3D sketching interface improved novice designers' cognitive and collaborative design activities. In summary, this paper presents the influence of current external representation tools on designers' cognition and collaboration and provides the theoretical foundations for implementing a VR 3D sketching interface. It contributes to transforming the conceptual architectural design phase from analogue to digital by proposing a new VR design interface, a transformation intended to close the gap between the analogue conceptual design process and the digital engineering stages of building design, thereby expediting the overall digital design process.
Quantifying the role of stochasticity in the development of autoimmune disease
In this paper, we propose and analyse a mathematical model for the onset and development of autoimmune disease, with particular attention to stochastic effects in the dynamics. Stability analysis yields parameter regions associated with normal cell homeostasis or with sustained periodic oscillations. The variance of these oscillations and the effects of stochastic amplification are also explored. Theoretical results are complemented by experiments in which experimental autoimmune uveoretinitis (EAU) was induced in B10.RIII and C57BL/6 mice. For both strains, we discuss peculiarities of disease development and the levels of variation in T cell populations within a population of genetically identical organisms, as well as a comparison with model outputs.
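As a rough illustration of the kind of stochastic effect discussed above, the following minimal Python sketch simulates a hypothetical two-population birth-death system whose deterministic version has a stable focus, so that intrinsic noise sustains the oscillations ("stochastic amplification"). The reaction scheme and the parameters b, k, d, K are illustrative assumptions, not the model analysed in the paper.

# Hypothetical minimal sketch (not the paper's model): demographic noise can
# sustain oscillations where the deterministic system only shows damped
# oscillations around homeostasis ("stochastic amplification").
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative parameters: A = target tissue cells, T = autoreactive T cells
b, k, d, K = 1.0, 0.005, 0.5, 400.0

def gillespie(A, T, t_max=100.0):
    t, traj = 0.0, [(0.0, A, T)]
    while t < t_max:
        rates = np.array([
            b * A * max(1.0 - A / K, 0.0),  # tissue renewal        A -> A+1
            k * A * T,                      # killing + expansion    A-1, T+1
            d * T,                          # T cell death           T -> T-1
        ])
        total = rates.sum()
        if total == 0.0:
            break
        t += rng.exponential(1.0 / total)
        r = rng.choice(3, p=rates / total)
        if r == 0:
            A += 1
        elif r == 1:
            A -= 1
            T += 1
        else:
            T -= 1
        traj.append((t, A, T))
    return np.array(traj)

traj = gillespie(A=300, T=60)
# The deterministic counterpart dA/dt = bA(1 - A/K) - kAT, dT/dt = kAT - dT
# spirals into a stable fixed point; the stochastic trajectory keeps
# fluctuating around it, which is the variance the analysis would quantify.
print(traj[-5:])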
Time-delayed model of autoimmune dynamics
Among the various environmental factors associated with triggering or exacerbating autoimmune responses, an important role is played by infections. A breakdown of immune tolerance arising as a by-product of the immune response against such infections is one of the major causes of autoimmune disease. In this paper we analyse the dynamics of the immune response, with particular emphasis on the role of the time delays characterising the infection and the immune response, as well as on interactions between different types of T cells and the cytokines that mediate their behaviour. Stability analysis of the model provides insights into how different model parameters affect the dynamics. Numerical stability analysis and simulations are performed to identify basins of attraction of different dynamical states and to illustrate the behaviour of the model in different regimes.
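As a hedged worked example of how such delay-induced stability boundaries are typically located (a generic delayed negative-feedback equation, not the specific model analysed in the paper), linearising about a steady state gives

\[ \dot{x}(t) = -a\,x(t) + b\,x(t-\tau), \qquad a > 0,\; b < -a, \]

with transcendental characteristic equation

\[ \lambda + a - b\,e^{-\lambda\tau} = 0. \]

Purely imaginary roots \(\lambda = \pm i\omega\) require \(b\cos(\omega\tau) = a\) and \(b\sin(\omega\tau) = -\omega\), hence \(\omega = \sqrt{b^{2}-a^{2}}\) and a critical delay

\[ \tau_c = \frac{1}{\omega}\arccos\!\left(\frac{a}{b}\right). \]

For \(\tau < \tau_c\) the steady state is stable, while for \(\tau > \tau_c\) a pair of roots crosses the imaginary axis and sustained oscillations emerge; this is the kind of transition that the stability analysis above maps across parameter space.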
Boosting the HP Filter for Trending Time Series with Long Range Dependence
This paper extends recent asymptotic theory developed for the Hodrick-Prescott (HP) filter and the boosted HP (bHP) filter to long range dependent time series that have fractional Brownian motion (fBM) limit processes after suitable standardization. Under general conditions it is shown that the asymptotic form of the HP filter is a smooth curve, analogous to the finding in Phillips and Jin (2021) for integrated time series and series with deterministic drifts. Boosting the filter using the iterative procedure suggested in Phillips and Shi (2021) leads, under well-defined rate conditions, to a consistent estimate of the fBM limit process, or of the fBM limit process with an accompanying deterministic drift when such a drift is present. A stopping criterion is used to automate the boosting algorithm, giving a data-determined method for practical implementation. The theory is illustrated in simulations and in two real data examples that highlight the differences between simple HP filtering and the use of boosting. The analysis is assisted by employing a uniformly and almost surely convergent trigonometric series representation of fBM.
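For readers unfamiliar with the filters being analysed, the Python sketch below implements the standard HP smoother S = (I + lam D'D)^{-1} and the iterative boosted HP recursion of Phillips and Shi (2021), f^(m) = f^(m-1) + S(y - f^(m-1)); the stopping rule used here is a simple placeholder assumption, not the data-determined criterion studied in the paper.

# Minimal sketch of HP and boosted HP (bHP) filtering; the stopping rule is an
# illustrative placeholder, not the criterion analysed in the paper.
import numpy as np

def hp_smoother(T, lam=1600.0):
    # S = (I + lam * D'D)^{-1}, D the (T-2) x T second-difference matrix
    D = np.zeros((T - 2, T))
    for i in range(T - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    return np.linalg.inv(np.eye(T) + lam * (D.T @ D))

def boosted_hp(y, lam=1600.0, max_iter=100):
    S = hp_smoother(len(y), lam)
    trend = S @ y                                # ordinary HP trend (m = 1)
    for m in range(2, max_iter + 1):
        new_trend = trend + S @ (y - trend)      # re-filter the residual
        # assumed stopping rule: stop when the fitted trend barely changes
        if np.max(np.abs(new_trend - trend)) < 1e-4 * np.std(y):
            return new_trend, m
        trend = new_trend
    return trend, max_iter

# Example: a random walk with drift, mimicking a stochastic trend
rng = np.random.default_rng(1)
y = np.cumsum(0.05 + rng.standard_normal(200))
hp_trend = hp_smoother(len(y)) @ y
bhp_trend, iters = boosted_hp(y)
print(iters, np.mean((y - hp_trend) ** 2), np.mean((y - bhp_trend) ** 2))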
Asymptotic theory for near integrated processes driven by tempered linear processes
In an early article on near-unit root autoregression, Ahtola and Tiao (1984) studied the behavior of the score function in a stationary first order autoregression driven by independent Gaussian innovations as the autoregressive coefficient approached unity from below. The present paper develops asymptotic theory for near-integrated random processes and associated regressions, including the score function, in more general settings where the errors are tempered linear processes. Tempered processes are stationary time series that have a semi long memory property in the sense that the autocovariogram of the process resembles that of a long memory model for moderate lags but eventually diminishes exponentially fast according to the presence of a decay factor governed by a tempering parameter. When the tempering parameter is sample size dependent, the resulting class of processes admits a wide range of behavior that includes long memory, semi-long memory, and short memory processes. The paper develops asymptotic theory for such processes and associated regression statistics, thereby extending earlier findings that fall within certain subclasses of processes involving near-integrated time series. The limit results relate to tempered fractional processes that include tempered fractional Brownian motion and tempered fractional diffusions of the second kind. The theory is extended to provide the limiting distribution for autoregressions with such tempered near-integrated time series, thereby enabling analysis of the limit properties of statistics of particular interest in econometrics, such as unit root tests, under more general conditions than existing theory. Some extensions of the theory to the multivariate case are reported.
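To make the "semi long memory" notion concrete, the Python sketch below simulates an ARTFIMA-style tempered fractionally integrated series, an illustrative choice rather than the paper's exact specification: the moving-average weights of a fractional process are damped by a factor e^{-lam*j}, so the autocovariance resembles long memory at moderate lags and decays exponentially in the tail.

# Hedged sketch: x_t = (1 - e^{-lam} L)^{-d} eps_t, with MA weights
# psi_j = e^{-lam*j} * Gamma(j+d) / (Gamma(d) * Gamma(j+1)).
import numpy as np

def tempered_frac_weights(d, lam, n):
    psi = np.empty(n)
    psi[0] = 1.0
    for j in range(1, n):
        psi[j] = psi[j - 1] * np.exp(-lam) * (j - 1 + d) / j
    return psi

def simulate(d=0.4, lam=0.01, T=2000, burn=2000, seed=0):
    rng = np.random.default_rng(seed)
    psi = tempered_frac_weights(d, lam, T + burn)
    eps = rng.standard_normal(T + burn)
    x = np.convolve(eps, psi)[: T + burn]   # truncated MA(inf) representation
    return x[burn:]                         # drop burn-in observations

x = simulate()
# With lam of order 1/T the path is hard to distinguish from long memory;
# with larger lam the autocovariance dies off exponentially (short memory).
print(x[:5])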
Norms of anthropometric, body composition measures and prevalence of overweight and obesity in urban populations of Iran
Background and aim: Developing national norms for anthropometric and body composition measures, and determining the prevalence of overweight and obesity, is a necessity for today's societies because of their association with chronic diseases. This study was designed and conducted to provide such norms for urban populations of Iran. Methods: In this descriptive-analytical cross-sectional study, 991 men and 1188 women aged 15 to 64 years were recruited by convenience sampling from the cities of Ardabil, Isfahan, Ahvaz, Tehran, Rasht, Kerman and Mashhad. Body mass index (BMI), waist circumference (WC), waist-to-hip ratio (WHR), waist-to-height ratio (WHtR) and body fat percentage were measured for all participants. Data were analysed in SPSS using independent t-tests, adjusted partial correlation coefficients and one-way analysis of variance. Results: Based on BMI, 49% of men and 53% of women were overweight or obese, with 10.2% of men and 18.6% of women classified as obese. In every age group, men had a lower body fat percentage than women (P<0.001). In both men and women, the prevalence of overweight was highest in the 40-49 year age group and the prevalence of obesity was highest in those over 50 years of age. Conclusion: In addition to providing national norms, the findings show a high prevalence of general and abdominal overweight and obesity in both sexes in urban populations of Iran, indicating the need for continuous monitoring and for intervention programmes to control and prevent obesity-related disorders such as diabetes.
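The indices reported in the study are simple ratios of body measurements; the short Python sketch below records their definitions together with conventional WHO-style BMI cut-offs, which are assumed standard reference values rather than figures quoted from the article.

# Anthropometric indices used in the study; the BMI categories follow the
# common WHO reference cut-offs (assumed here, not quoted from the article).
def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2    # body mass index, kg/m^2

def whr(waist_cm, hip_cm):
    return waist_cm / hip_cm            # waist-to-hip ratio

def whtr(waist_cm, height_cm):
    return waist_cm / height_cm         # waist-to-height ratio

def bmi_category(b):
    if b < 18.5:
        return "underweight"
    if b < 25.0:
        return "normal"
    if b < 30.0:
        return "overweight"
    return "obese"

print(bmi(82, 1.75), bmi_category(bmi(82, 1.75)))   # ~26.8 -> overweight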
Single-shot quantum memory advantage in the simulation of stochastic processes
Stochastic processes underlie a vast range of natural and social phenomena. Some processes such as atomic decay feature intrinsic randomness, whereas other complex processes, e.g. traffic congestion, are effectively probabilistic because we cannot track all relevant variables. To simulate a stochastic system's future behaviour, information about its past must be stored and thus memory is a key resource. Quantum information processing promises a memory advantage for stochastic simulation that has been validated in recent proof-of-concept experiments. Yet, in all past works, the memory saving would only become accessible in the limit of a large number of parallel simulations, because the memory registers of individual quantum simulators had the same dimensionality as their classical counterparts. Here, we report the first experimental demonstration that a quantum stochastic simulator can encode the relevant information in fewer dimensions than any classical simulator, thereby achieving a quantum memory advantage even for an individual simulator. Our photonic experiment thus establishes the potential of a new, practical resource saving in the simulation of complex systems.
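For context on where the dimensional saving comes from, the following is a generic construction used throughout this literature (the standard quantum stochastic-simulation encoding, not details specific to this experiment). Each classical causal state i is mapped to a quantum memory state

\[ \lvert \sigma_i \rangle = \sum_{x} \sqrt{P(x \mid i)}\, \lvert x \rangle \lvert \sigma_{\lambda(i,x)} \rangle , \]

where \(\lambda(i,x)\) is the successor state after output x. With many simulators running in parallel, the relevant memory cost is the von Neumann entropy

\[ C_q = S\!\Big(\sum_i \pi_i \lvert \sigma_i\rangle\!\langle \sigma_i\rvert\Big) \;\le\; C_\mu = -\sum_i \pi_i \log \pi_i , \]

with equality only when the memory states are mutually orthogonal. For a single simulator, the resource highlighted above is instead the dimension \(d_q = \dim \operatorname{span}\{\lvert\sigma_i\rangle\}\); the reported advantage corresponds to a process for which \(d_q\) is strictly smaller than the number of classical causal states.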