Trueing
Even in areas of philosophy of science that don’t involve formal treatments of truth, one’s background view of truth still centrally shapes views on other issues. I offer an informal way to think about truth as trueing, like trueing a bicycle wheel. This holist approach to truth provides a way to discuss knowledge products like models in terms of how well-trued they are to their target. Trueing emphasizes: the process by which models are brought into true; how the idealizations in models are not false but rather like spokes in appropriate tension to achieve a better-trued fit to target; and that this process is not accomplished once and done forever, but instead requires upkeep and ongoing fine-tuning. I conclude by emphasizing the social importance of being a pragmatist about truth in order to accurately answer questions about science such as, “but do we really know that…
Rank-based linkage I: triplet comparisons and oriented simplicial complexes
Rank-based linkage is a new tool for summarizing a collection of objects
according to their relationships. These objects are not mapped to vectors, and
"similarity" between objects need be neither numerical nor symmetrical. All
an object needs to do is rank nearby objects by similarity to itself, using a
Comparator which is transitive, but need not be consistent with any metric on
the whole set. Call this a ranking system on the object set. Rank-based
linkage is applied to the K-nearest neighbor digraph derived from a ranking
system. Computations occur on a 2-dimensional abstract oriented simplicial
complex whose faces are among the points, edges, and triangles of the line
graph of the undirected K-nearest neighbor graph on the object set. The
algorithm builds an edge-weighted linkage graph in which the weight of a link
is called the in-sway between its two objects. Take the links whose in-sway
is at least some threshold, and partition the objects into components of the
resulting graph, for varying threshold. Rank-based linkage is a functor from
a category of out-ordered digraphs to a category of partitioned sets, with
the practical consequence that augmenting the set of objects in a
rank-respectful way gives a fresh clustering which does not "rip apart" the
previous one. The same holds for single linkage clustering in the metric
space context, but not for typical optimization-based methods. Open
combinatorial problems are presented in the last section. Comment: 37 pages,
12 figures
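The overall shape of the construction can be loosely illustrated in code. The sketch below is NOT the paper's algorithm: `mutual_rank_weight` is a hypothetical stand-in for the in-sway, and only the outline — rank-only input, a K-nearest-neighbor digraph, thresholded links, connected components — follows the abstract.

```python
def knn_digraph(rankings, k):
    """rankings[x] lists the other objects in x's order of similarity to x.
    Returns the directed edges from each object to its k nearest."""
    return {(x, y) for x, ranked in rankings.items() for y in ranked[:k]}

def mutual_rank_weight(rankings, x, y):
    """A stand-in edge weight (NOT the paper's in-sway): smaller combined
    rank positions mean a stronger link."""
    return rankings[x].index(y) + rankings[y].index(x)

def cluster(rankings, k, max_weight):
    """Keep mutual k-NN links with weight <= max_weight, then return the
    connected components of the resulting undirected link graph."""
    edges = knn_digraph(rankings, k)
    links = {(x, y) for (x, y) in edges if (y, x) in edges
             and mutual_rank_weight(rankings, x, y) <= max_weight}
    # connected components via union-find with path halving
    parent = {x: x for x in rankings}
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a
    for x, y in links:
        parent[find(x)] = find(y)
    comps = {}
    for x in rankings:
        comps.setdefault(find(x), set()).add(x)
    return list(comps.values())
```

Varying `max_weight` plays the role of the varying in-sway threshold: raising it merges components, never splitting previously formed ones.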
Dependently Typing R Vectors, Arrays, and Matrices
The R programming language is widely used in large-scale data analyses. It
contains especially rich built-in support for dealing with vectors, arrays, and
matrices. These operations feature prominently in the applications that form
R's raison d'être, making their behavior worth understanding. Furthermore,
ostensibly for programmer convenience, their behavior in R is a notable
extension over the corresponding operations in mathematics, thereby offering
some challenges for specification and static verification.
We report on progress towards statically typing this aspect of the R
language. The interesting aspects of typing, in this case, warn programmers
about violating bounds, so the types must necessarily be dependent. We explain
the ways in which R extends standard mathematical behavior. We then show how
R's behavior can be specified in LiquidHaskell, a dependently-typed extension
to Haskell. In the general case, actually verifying library and client code is
currently beyond LiquidHaskell's reach; therefore, this work provides
challenges and opportunities both for typing R and for progress in
dependently-typed programming languages. Comment: 10 pages
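The "extension over the corresponding operations in mathematics" is largely R's recycling rule for element-wise arithmetic. As a rough illustration (written in Python rather than R, and independent of the paper's LiquidHaskell development; `r_add` is a name invented here), the sketch below mimics how R recycles the shorter vector and warns when the lengths are not multiples of each other:

```python
import warnings

def r_add(xs, ys):
    """Element-wise addition with R-style recycling: the shorter operand is
    repeated cyclically; a zero-length operand yields a zero-length result."""
    if not xs or not ys:
        return []
    if len(xs) < len(ys):
        xs, ys = ys, xs  # addition commutes, so follow the longer operand
    if len(xs) % len(ys) != 0:
        warnings.warn("longer object length is not a multiple "
                      "of shorter object length")
    return [x + ys[i % len(ys)] for i, x in enumerate(xs)]
```

For example, `r_add([1, 2, 3, 4], [10, 20])` gives `[11, 22, 13, 24]` — exactly the behavior that forces a dependent treatment of lengths in a static type system.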
Testing the nomological network for the Personal Engagement Model
The study of employee engagement has been a key focus of management for over three decades. The academic literature on engagement has generated multiple definitions, but there are two primary models of engagement: the Personal Engagement Model (PEM) of Kahn (1990) and the Work Engagement Model (WEM) of Schaufeli et al. (2002). While the former is cited by most authors as the seminal work on engagement, research has tended to focus on individual elements of the model, and most theoretical work on engagement has used the WEM to consider the topic.
The purpose of this study was to test all the elements of the nomological network of the PEM to determine whether the complete model of personal engagement is viable. This was done using data from a large, complex public sector workforce. Survey questions were designed to test each element of the PEM and administered to a sample of the workforce (n = 3,103). The scales were tested and refined using confirmatory factor analysis, and then the model was tested to determine the structure of the nomological network. This was validated and the generalisability of the final model was tested across different work and organisational types.
The results showed that the PEM is viable but there were differences from what was originally proposed by Kahn (1990). Specifically, of the three psychological conditions deemed necessary for engagement to occur, meaningfulness, safety, and availability, only meaningfulness was found to contribute to employee engagement. The model demonstrated that employees experience meaningfulness through both the nature of the work that they do and the organisation within which they do their work. Finally, the findings were replicated across employees in different work types and different organisational types.
This thesis makes five contributions to the engagement paradigm. It advances engagement theory by testing the PEM and showing that it is an adequate representation of engagement. A model for testing the causal mechanism for engagement has been articulated, demonstrating that meaningfulness in work is a primary mechanism for engagement. The research has shown the key aspects of the workplace in which employees experience meaningfulness: the nature of the work that they do and the organisation within which they do it. It has demonstrated that this is consistent across organisation and work types. Finally, it has developed a reliable measure of the different elements of the PEM which will support future research in this area.
Self-Supervised Learning to Prove Equivalence Between Straight-Line Programs via Rewrite Rules
We target the problem of automatically synthesizing proofs of semantic
equivalence between two programs made of sequences of statements. We represent
programs using abstract syntax trees (AST), where a given set of
semantics-preserving rewrite rules can be applied on a specific AST pattern to
generate a transformed and semantically equivalent program. In our system, two
programs are equivalent if there exists a sequence of applications of these
rewrite rules that leads to rewriting one program into the other. We propose a
neural network architecture based on a transformer model to generate proofs of
equivalence between program pairs. The system outputs a sequence of rewrites,
and the validity of the sequence is simply checked by verifying it can be
applied. If no valid sequence is produced by the neural network, the system
reports the programs as non-equivalent, ensuring by design no programs may be
incorrectly reported as equivalent. Our system is fully implemented for a given
grammar which can represent straight-line programs with function calls and
multiple types. To efficiently train the system to generate such sequences, we
develop an original incremental training technique, named self-supervised
sample selection. We extensively study the effectiveness of this novel training
approach on proofs of increasing complexity and length. Our system, S4Eq,
achieves 97% proof success on a curated dataset of 10,000 pairs of equivalent
programs. Comment: 30 pages including appendix
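The "no false positives by design" property comes from the checking side: a proposed proof is a sequence of (rule, position) steps, and it certifies equivalence only if replaying it actually rewrites one program into the other. The sketch below illustrates that replay-checking idea; the program representation and the two rules are hypothetical simplifications, not S4Eq's grammar or rule set.

```python
def swap_independent(prog, i):
    """Swap statements i and i+1 when neither mentions the other's target
    (a crude substring dependence test, for illustration only)."""
    (v1, e1), (v2, e2) = prog[i], prog[i + 1]
    if v1 not in e2 and v2 not in e1 and v1 != v2:
        return prog[:i] + [(v2, e2), (v1, e1)] + prog[i + 2:]
    return None  # rule does not apply here

def drop_add_zero(prog, i):
    """Rewrite 'v = e + 0' into 'v = e'."""
    v, e = prog[i]
    if e.endswith(" + 0"):
        return prog[:i] + [(v, e[:-4])] + prog[i + 1:]
    return None

RULES = {"swap": swap_independent, "add0": drop_add_zero}

def check_proof(src, dst, steps):
    """Replay the proposed rewrite sequence; accept only on exact match, so
    an invalid sequence can never certify two programs as equivalent."""
    prog = list(src)
    for rule, pos in steps:
        prog = RULES[rule](prog, pos)
        if prog is None:
            return False  # a step failed to apply: proof rejected
    return prog == list(dst)
```

A neural generator can then propose `steps` freely: a wrong proposal is simply rejected, never accepted in error.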
A Design Science Research Approach to Smart and Collaborative Urban Supply Networks
Urban supply networks are facing increasing demands and challenges and thus constitute a relevant field for research and practical development. Supply chain management holds enormous potential and relevance for society and everyday life, as the flows of goods and information are important economic functions. As a heterogeneous field, supply chain management research has a literature base that is difficult to manage and navigate. Disruptive digital technologies and the implementation of cross-network information analysis and sharing drive the need for new organisational and technological approaches. Practical issues are manifold and include megatrends such as digital transformation, urbanisation, and environmental awareness.
A promising approach to solving these problems is the realisation of smart and collaborative supply networks. The growth of artificial intelligence applications in recent years has led to a wide range of applications in a variety of domains. However, the potential of artificial intelligence utilisation in supply chain management has not yet been fully exploited. Similarly, value creation increasingly takes place in networked value creation cycles that have become continuously more collaborative, complex, and dynamic as interactions in business processes involving information technologies have become more intense.
Following a design science research approach, this cumulative thesis comprises the development and discussion of four artefacts for the analysis and advancement of smart and collaborative urban supply networks. This thesis aims to highlight the potential of artificial intelligence-based supply networks, to advance data-driven inter-organisational collaboration, and to improve last-mile supply network sustainability. Based on thorough machine learning and systematic literature reviews, reference and system dynamics modelling, simulation, and qualitative empirical research, the artefacts provide a valuable contribution to research and practice.
Rational-approximation-based model order reduction of Helmholtz frequency response problems with adaptive finite element snapshots
We introduce several spatially adaptive model order reduction approaches tailored to non-coercive elliptic boundary value problems, specifically, parametric-in-frequency Helmholtz problems. The offline information is computed by means of adaptive finite elements, so that each snapshot lives in a different discrete space that resolves the local singularities of the analytical solution and is adjusted to the considered frequency value. A rational surrogate is then assembled adopting either a least-squares or an interpolatory approach, yielding a function-valued version of the standard rational interpolation method (V-SRI) and the minimal rational interpolation method (MRI). In the context of building an approximation for linear or quadratic functionals of the Helmholtz solution, we perform several numerical experiments to compare the proposed methodologies. Our simulations show that, for interior resonant problems (whose singularities are encoded by poles), the V-SRI and MRI methods work comparably well. Instead, when dealing with exterior scattering problems, whose frequency response is mostly smooth, the V-SRI method seems to be the best-performing one.
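The linearized fitting idea behind such rational surrogates can be sketched in a few lines. The example below is generic rational fitting, not the V-SRI or MRI algorithms: the degrees (2 over 2), the normalisation (leading denominator coefficient fixed to 1), and the choice of as many samples as unknowns are all assumptions made for illustration. The key trick is that the nonlinear condition p(k)/q(k) ≈ f(k) is replaced by the linear condition p(k) − f(k)·q(k) ≈ 0 at the sampled frequencies.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting (pure Python)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            fac = M[r][c] / M[c][c]
            M[r] = [mr - fac * mc for mr, mc in zip(M[r], M[c])]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rational_surrogate(f, ks):
    """Fit p(k)/q(k) with deg p = deg q = 2, normalising the k^2 coefficient
    of q to 1, by enforcing p(k) - f(k) q(k) = 0 at five sample points."""
    A, b = [], []
    for k in ks:
        fk = f(k)
        # unknowns: p0, p1, p2, q0, q1   (q2 = 1 fixed)
        A.append([1.0, k, k * k, -fk, -fk * k])
        b.append(fk * k * k)
    p0, p1, p2, q0, q1 = solve(A, b)
    return lambda k: (p0 + p1 * k + p2 * k * k) / (q0 + q1 * k + k * k)
```

A function with an exact degree-(2,2) representation, such as f(k) = 1/(k² − 1), is reproduced by the surrogate away from the sample points, which is the behavior one hopes for near resonances.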
Stability of space-time isogeometric methods for wave propagation problems
This thesis aims at investigating the first steps toward an unconditionally
stable space-time isogeometric method, based on splines of maximal regularity,
for the linear acoustic wave equation. The unconditional stability of
space-time discretizations for wave propagation problems is a topic of
significant interest, by virtue of the advantages of space-time methods
compared with more standard time-stepping techniques. In the case of continuous
finite element methods, several stabilizations have been proposed. Inspired by
one of these works, we address the stability issue by studying the isogeometric
method for an ordinary differential equation closely related to the wave
equation. As a result, we provide a stabilized isogeometric method whose
effectiveness is supported by numerical tests. Motivated by these results, we
conclude by suggesting an extension of this stabilization tool to the
space-time isogeometric formulation of the acoustic wave equation. Comment: Master's thesis
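To see why stability of discretizations for wave-type problems is delicate, consider the model ODE u'' + ω²u = 0, an ordinary differential equation closely related to the wave equation. The sketch below uses standard central (leapfrog) differencing — not the thesis's isogeometric scheme — which is only conditionally stable: the numerical solution stays bounded when ω·dt < 2 and blows up otherwise, motivating the search for unconditionally stable methods.

```python
def central_difference(omega, dt, steps, u0=1.0, v0=0.0):
    """March u'' = -omega^2 u with the explicit central difference scheme
    u_{n+1} = 2 u_n - u_{n-1} - (omega*dt)^2 u_n and return the largest
    |u_n| observed, as a crude boundedness indicator."""
    u_prev = u0
    # Taylor start for the first step: u(dt) ~ u0 + dt*v0 - dt^2 omega^2 u0 / 2
    u = u0 + dt * v0 - 0.5 * dt * dt * omega * omega * u0
    peak = max(abs(u_prev), abs(u))
    for _ in range(steps):
        u_prev, u = u, 2.0 * u - u_prev - (omega * dt) ** 2 * u
        peak = max(peak, abs(u))
    return peak
```

With ω = 1 and dt = 0.5 the peak stays near 1 over thousands of steps; with dt = 2.5 the recurrence has a characteristic root of modulus greater than one and the solution grows geometrically.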
The place where curses are manufactured: four poets of the Vietnam War
The Vietnam War was unique among American wars. To pinpoint its uniqueness, it was necessary to look for a non-American voice that would enable me to articulate its distinctiveness and explore the American character as observed by an Asian. Takeshi Kaiko proved to be most helpful. From his novel, Into a Black Sun, I was able to establish a working pair of 'bookends' from which to approach the poetry of Walter McDonald, Bruce Weigl, Basil T. Paquet and Steve Mason. Chapter One is devoted to those seemingly mismatched 'bookends,' Walt Whitman and General William C. Westmoreland, and their respective anthropocentric and technocentric visions of progress and the peculiarly American concept of the "open road" as they manifest themselves in Vietnam. In Chapter Two, I analyze the war poems of Walter McDonald. A pilot writing primarily about flying, McDonald manifests in his poetry General Westmoreland's technocentric vision of the 'road' as determined by and made manifest through technology. Chapter Three focuses on the poems of Bruce Weigl. The poems analyzed portray the literal and metaphorical descent from the technocentric, 'numbed' distance of aerial warfare to the world of ground warfare, and the initiation of a 'fucking new guy,' who discovers the contours of the self's interior through a set of experiences that lead from aerial insertion into the jungle to the degradation of burning human feces. Chapter Four, devoted to the thirteen poems of Basil T. Paquet, focuses on the continuation of the descent begun in Chapter Two. In his capacity as a medic, Paquet details in his poems the quotidian tasks of tending the maimed, the mortally wounded and the dead. The final chapter deals with Steve Mason's Johnny's Song and his depiction of the plight of Vietnam veterans back in "The World," still trapped inside the interior landscape of their individual "ghettoes" of the soul created by their war-time experiences.
Implementing Health Impact Assessment as a Required Component of Government Policymaking: A Multi-Level Exploration of the Determinants of Healthy Public Policy
It is widely understood that the public policies of ‘non-health’ government sectors have greater impacts on population health than those of the traditional healthcare realm. Health Impact Assessment (HIA) is a decision support tool that identifies and promotes the health benefits of policies while also mitigating their unintended negative consequences. Despite numerous calls to do so, the Ontario government has yet to implement HIA as a required component of policy development. This dissertation therefore sought to identify the contexts and factors that may both enable and impede HIA use at the sub-national (i.e., provincial, territorial, or state) government level.
The three integrated articles of this dissertation provide insights into specific aspects of the policy process as they relate to HIA. Chapter one details a case study of purposive information-seeking among public servants within Ontario’s Ministry of Education (MOE). Situated within Ontario’s Ministry of Health (MOH), chapter two presents a case study of policy collaboration between health and ‘non-health’ ministries. Finally, chapter three details a framework analysis of the political factors supporting health impact tool use in two sub-national jurisdictions – namely, Québec and South Australia.
MOE respondents (N=9) identified four components of policymaking ‘due diligence’, including evidence retrieval, consultation and collaboration, referencing, and risk analysis. As prospective HIA users, they also confirmed that information is not routinely sought to mitigate the potential negative health impacts of education-based policies. MOH respondents (N=8) identified the bureaucratic hierarchy as the brokering mechanism for inter-ministerial policy development. As prospective HIA stewards, they also confirmed that the ministry does not proactively flag the potential negative health impacts of non-health sector policies. Finally, ‘lessons learned’ from case articles specific to Québec (n=12) and South Australia (n=17) identified the political factors supporting tool use at different stages of the policy cycle, including agenda setting (‘policy elites’ and ‘political culture’), implementation (‘jurisdiction’), and sustained implementation (‘institutional power’).
This work provides important insights into 'real life' policymaking. By highlighting existing facilitators of and barriers to HIA use, the findings offer a useful starting point from which proponents may tailor context-specific strategies to sustainably implement HIA at the sub-national government level.