The Hubble Hypothesis and the Developmentalist's Dilemma
Developmental psychopathology stands poised at the close of the 20th century on the horns of a major scientific dilemma. The essence of this dilemma lies in the contrast between its heuristically rich open system concepts on the one hand, and the closed system paradigm it adopted from mainstream psychology for investigating those models on the other. Many of the research methods, assessment strategies, and data analytic models of psychology's paradigm are predicated on closed system assumptions and explanatory models. Thus, they are fundamentally inadequate for studying humans, who are unparalleled among open systems in their wide-ranging capacities for equifinal and multifinal functioning. Developmental psychopathology faces two challenges in successfully negotiating the developmentalist's dilemma. The first lies in recognizing how the current paradigm encourages research practices that are antithetical to developmental principles, yet continue to flourish. I argue that the developmentalist's dilemma is sustained by long-standing, mutually enabling weaknesses in the paradigm's discovery methods and scientific standards. These interdependent weaknesses function like a distorted lens on the research process by variously sustaining the illusion of theoretical progress, obscuring the need for fundamental reforms, and both constraining and misguiding reform efforts. An understanding of how these influences arise and take their toll provides a foundation and rationale for engaging the second challenge. The essence of this challenge will be finding ways to resolve the developmentalist's dilemma outside the constraints of the existing paradigm by developing indigenous research strategies, methods, and standards with fidelity to the complexity of developmental phenomena.
Meta-evaluation of the impacts and legacy of the London 2012 Olympic Games and Paralympic Games: Developing methods paper
This report brings together the interim findings from the Developing Meta-Evaluation Methods study, which is being undertaken in conjunction with the Meta-Evaluation of the Impacts and Legacy of the London 2012 Olympic Games and Paralympic Games.
The work on methods is funded by the Economic and Social Research Council (ESRC). The aim of this paper is to review the existing evidence on conducting meta-evaluation, and to provide guidance appropriate both to the Meta-Evaluation of the Games and to other meta-evaluation studies.
Verdict functions in testing with a fault domain or test hypotheses
In state-based testing it is common to include verdicts within test cases, the result of the test case being the verdict reached by the test run. In addition, approaches that reason about test effectiveness, or that produce tests guaranteed to find certain classes of faults, are often based on either a fault domain or a set of test hypotheses. This paper considers how the presence of a fault domain or test hypotheses affects our notion of a test verdict. The analysis reveals the need for new verdicts that provide more information than the current verdicts, and for verdict functions that return a verdict based on a set of test runs rather than a single test run. The concepts are illustrated in the contexts of testing from a non-deterministic finite state machine and the testing of a datatype specified using an algebraic specification language, but are potentially relevant whenever fault domains or test hypotheses are used.
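The idea of a verdict function over a set of runs, rather than a single run, can be sketched as follows. This is an illustrative Python sketch, not the paper's notation: the specification, the fairness-style hypothesis, and all names are assumptions made here for the example.

```python
from enum import Enum

class Verdict(Enum):
    PASS = "pass"
    FAIL = "fail"
    INCONCLUSIVE = "inconclusive"

# Hypothetical non-deterministic specification: for each input, the set
# of outputs the specification allows.
SPEC = {
    "a": {"0", "1"},   # input "a" may non-deterministically yield "0" or "1"
    "b": {"1"},
}

def verdict(runs):
    """Assign a verdict to a *set* of test runs.

    Each run is an (input, observed_output) pair. A single allowed
    observation only warrants 'inconclusive'; under a fairness-style test
    hypothesis (repeating a test eventually exercises every allowed
    non-deterministic choice), observing every allowed output for each
    exercised input strengthens the verdict to 'pass'.
    """
    observed = {}
    for inp, out in runs:
        if out not in SPEC.get(inp, set()):
            return Verdict.FAIL          # any disallowed output fails outright
        observed.setdefault(inp, set()).add(out)
    if all(observed[inp] == SPEC[inp] for inp in observed):
        return Verdict.PASS
    return Verdict.INCONCLUSIVE
```

For example, the single run `("a", "0")` is inconclusive on its own, while the run set covering both allowed outputs of `"a"` yields a pass.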
Fast and Lean Immutable Multi-Maps on the JVM based on Heterogeneous Hash-Array Mapped Tries
An immutable multi-map is a many-to-many, thread-friendly map data structure with expected fast insert and lookup operations. This data structure is used in applications that process graphs or many-to-many relations, as applied in static analysis of object-oriented systems. When processing such big data sets, the memory overhead of the data structure encoding itself becomes a usage bottleneck. Motivated by reuse and type safety, libraries for Java, Scala and Clojure typically implement immutable multi-maps by nesting sets as the values within the keys of a trie map. Encoded this way, based on our measurements, the expected byte overhead for a sparse multi-map adds up to around 65 B per stored entry, which renders it infeasible to compute with effectively on the JVM.

In this paper we propose a general framework for Hash-Array Mapped Tries on the JVM which can store type-heterogeneous keys and values: a Heterogeneous Hash-Array Mapped Trie (HHAMT). Among other applications, this allows for a highly efficient multi-map encoding by (a) not reserving space for empty value sets and (b) inlining the values of singleton sets, while maintaining (c) a type-safe API.

We detail the necessary encoding and optimizations to mitigate the overhead of storing and retrieving heterogeneous data in a hash-trie. Furthermore, we evaluate HHAMT specifically for the application to multi-maps, comparing it to state-of-the-art encodings of multi-maps in Java, Scala and Clojure. We isolate key differences using microbenchmarks and validate the resulting conclusions on a real-world case in static analysis. The new encoding brings the per key-value storage overhead down to 30 B: a 2x improvement. With additional inlining of primitive values it reaches a 4x improvement.
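The singleton-inlining idea behind optimizations (a) and (b) can be sketched in a few lines. This toy Python class only illustrates the space-saving principle (pay for a set only when a key has two or more values); it is not the actual immutable, trie-based HHAMT encoding, and all names here are hypothetical.

```python
class InliningMultiMap:
    """Sketch of the singleton-inlining idea: a key with exactly one
    value stores that value directly in a heterogeneous slot; only keys
    with two or more values allocate a set. A real heterogeneous
    encoding such as HHAMT distinguishes the two cases with type tags
    (bitmaps) rather than runtime isinstance checks, and is immutable.
    """

    def __init__(self):
        self._data = {}  # key -> inlined value, or frozenset of values

    def put(self, key, value):
        cur = self._data.get(key)
        if cur is None:
            self._data[key] = value                     # inline singleton
        elif isinstance(cur, frozenset):
            self._data[key] = cur | {value}             # already a set
        elif cur != value:
            self._data[key] = frozenset({cur, value})   # promote to set

    def get(self, key):
        cur = self._data.get(key)
        if cur is None:
            return frozenset()
        return cur if isinstance(cur, frozenset) else frozenset({cur})
```

The design choice mirrors the paper's observation: in sparse multi-maps most keys map to a single value, so never allocating a nested set for them removes most of the per-entry overhead.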
Computational model for evaluating the state of geomechanical systems during computing experiments
Purpose. To create a model that integrates the diverse features identified in rock mass behavior, by differentiating various theories and real phenomena into a single information-analytical flow.

Methods. System analysis of the results of computational experiments, based on a recursive methodology for assessing the accuracy of the obtained results under different methods of geometric and physical description of the individual simulated elements in the computational domain.

Findings. Sample tables were obtained containing the acceptable values of weight characteristics for the various simulated elements in the generalized computational domain. A recursive algorithm for analyzing the efficiency of describing the studied objects when solving geomechanics problems with grid-based numerical methods was formulated and implemented as a computational module. The authors created a system for assessing the results of computational experiments during full-scale investigations, which provides a comprehensive analysis of changes in the rock mass state during operation of the selected support system. The conditions for combining the design characteristics of the simulated support elements, functioning as a single load-carrying system under dynamic redistribution of forces, were obtained.

Originality. The resulting generalized model of a mine working, and of the elements affecting its condition, makes it possible to determine with high accuracy the nature of changes in the stress-strain state of the geotechnological system regardless of the initially a priori specified limitations.
Practical implications. The unified approach can be used in the search for the optimal parameters of combined supports for mine workings located both within and beyond the zone of mining operations.

This work would not have been possible without the support of the "DTEK Energo" company. The studies were performed within the framework of state-maintained scientific research topics. The authors express special gratitude to professors of the National Mining University, Volodymyr Bondarenko and Iryna Kovalevska, for their support in conducting the research.
Investigating biocomplexity through the agent-based paradigm.
Capturing the dynamism that pervades biological systems requires a computational approach that can accommodate both the continuous features of the system environment and the flexible, heterogeneous nature of component interactions. This presents a serious challenge for the more traditional mathematical approaches, which assume component homogeneity in order to relate system observables using mathematical equations. While the homogeneity condition does not lead to loss of accuracy when simulating various continua, it fails to offer detailed solutions when applied to systems with dynamically interacting heterogeneous components. As the functionality and architecture of most biological systems are a product of multi-faceted individual interactions at the sub-system level, continuum models rarely offer much beyond qualitative similarity. Agent-based modelling is a class of algorithmic computational approaches that rely on interactions between Turing-complete finite-state machines, or agents, to simulate macroscopic properties of a system from the bottom up. In recognizing the heterogeneity condition, these approaches offer suitable ontologies for the system components being modelled, thereby succeeding where their continuum counterparts tend to struggle. Furthermore, being inherently hierarchical, they are quite amenable to coupling with other computational paradigms. The integration of an agent-based framework with continuum models is arguably the most elegant and precise way of representing biological systems. Although still in its nascence, agent-based modelling has been utilized to model biological complexity across a broad range of biological scales, from cells to societies. In this article, we explore the reasons that make agent-based modelling the most precise approach to modelling biological systems that tend to be non-linear and complex.
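The bottom-up, heterogeneity-preserving style the authors describe can be illustrated with a minimal toy model. The model below (a threshold-adoption cascade on a ring of agents) and all of its names are illustrative assumptions made here, not taken from the article; its point is simply that each agent carries its own parameter, which a mean-field equation would average away.

```python
def simulate(thresholds, seed_adopters, steps=10):
    """Toy agent-based sketch: agents sit on a ring and each carries its
    *own* adoption threshold (per-agent heterogeneity). An agent adopts
    once the fraction of its two neighbors that have adopted meets its
    threshold; the macroscopic adoption pattern emerges from these
    purely local rules rather than from a global equation.
    """
    n = len(thresholds)
    adopted = [i in seed_adopters for i in range(n)]
    for _ in range(steps):
        nxt = list(adopted)
        for i in range(n):
            if adopted[i]:
                continue
            neighbors = [adopted[(i - 1) % n], adopted[(i + 1) % n]]
            if sum(neighbors) / len(neighbors) >= thresholds[i]:
                nxt[i] = True
        adopted = nxt  # synchronous update of all agents
    return adopted
```

With uniform thresholds of 0.5 a single seed cascades around the whole ring, while thresholds of 1.0 (both neighbors required) block any spread; the same macroscopic divergence from identical initial conditions is what continuum homogeneity assumptions cannot express.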