9 research outputs found

    Engineering Stress Culture in Project-Based Engineering Programs

    Background: This research paper examines engineering stress culture in the context of project-based learning engineering programs at the university level. Multiple authors have reported that the culture of engineering and engineering education can be stressful and exclusive. A study conducted by Jensen and Cross [9] found that measures of inclusion such as Department Caring and Department Pride were negatively correlated with stress, anxiety, and depression. We used the approach developed by Jensen and Cross to examine stress culture in the context of three project-based learning engineering programs. Purpose: Our goal was to establish a baseline of measures of mental health (stress, anxiety, and depression), professional identity, and inclusion among students in entirely project-based engineering and computer science programs. Design/Method: Our study used the instruments developed by Jensen and Cross to gather data from the perspective of students pursuing integrated engineering and computer science degrees in entirely project-based learning environments. Data collection and analysis were informed by the methodology of Jensen and Cross, allowing us to establish baseline measures of stress culture within project-based learning environments in engineering and computer science. Results: We present results from statistical analyses reporting measures of mental health (stress, anxiety, depression), professional identity, and perceptions of inclusion among students pursuing engineering and computer science degrees in entirely project-based learning environments. Students in the project-based programs reported less stress and depression and a stronger vision of an engineering career than students in the Jensen and Cross study. The anxiety and professional identity results were comparable to the original Jensen and Cross results. Conclusions: Although the sample size for this study is smaller than that of the original Jensen and Cross study, the results show the strong potential impact of project-based engineering programs. Future work will examine changes as a function of time and population size, as well as triangulate and support the quantitative results with qualitative data.
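    As a concrete illustration of the kind of analysis involved, the sketch below correlates inclusion measures with mental-health scales. It is a minimal Pearson-correlation sketch assuming Likert-style scale scores; the column names and data are invented placeholders, not the study's actual instrument or results.

        # Hypothetical sketch: correlating inclusion measures with
        # mental-health scales. Data and column names are illustrative only.
        import pandas as pd
        from scipy import stats

        survey = pd.DataFrame({
            "department_caring": [4.2, 3.8, 4.5, 2.9, 3.1],
            "department_pride":  [4.0, 3.5, 4.8, 2.7, 3.3],
            "stress":            [2.1, 2.8, 1.9, 3.9, 3.5],
            "anxiety":           [1.8, 2.5, 1.7, 3.6, 3.2],
            "depression":        [1.5, 2.2, 1.4, 3.4, 3.0],
        })

        for inclusion in ["department_caring", "department_pride"]:
            for outcome in ["stress", "anxiety", "depression"]:
                r, p = stats.pearsonr(survey[inclusion], survey[outcome])
                print(f"{inclusion} vs {outcome}: r={r:.2f}, p={p:.3f}")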

    From Biological to Synthetic Neurorobotics Approaches to Understanding the Structure Essential to Consciousness (Part 2)

    We have been left with a big challenge: to articulate consciousness and also to prove it in an artificial agent against a biological standard. After introducing Boltuc’s h-consciousness in the last paper, we briefly reviewed some salient neurology in order to sketch less a standard than a series of targets for artificial consciousness, “most-consciousness” and “myth-consciousness.” With these targets on the horizon, we began reviewing the research program pursued by Jun Tani and colleagues in isolating the formal dynamics essential to either. In this paper, we describe Tani’s research program in detail, in order to make the clearest case for artificial consciousness in these systems. In the next paper, the third in the series, we will return to Boltuc’s naturalistic non-reductionism in light of the neurorobotics models introduced (alongside some others), and evaluate them more completely.

    Design Science Research Methodology: An Artefact-Centric Creation and Evaluation Approach

    Adaptation of the Design Science Research methodology has never been easy. There have always been concerns regarding the validity of design science, the evaluation of the artefacts generated therewith, and the subsequent claims of the researchers. To address these problems we propose an artefact-centric creation and evaluation methodology for design science research. The methodology begins with observation, followed by theory building, and then by an interwoven artefact creation and artefact evaluation process. The artefact creation process focuses on the creation of key artefacts, including conceptual models, processes, conceptual frameworks, system frameworks, architectures, and implementations. The artefact evaluation process is tightly interwoven with the artefact creation process and evaluates the artefacts both independently and against the prior artefacts that influenced their creation. In this paper we briefly discuss the application of this methodology to the ‘Sustainable Business Transformation’ design science research project.
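    A rough sketch of the interwoven create-and-evaluate loop described above follows. The artefact kinds mirror the list in the abstract, but the Artefact class and the placeholder evaluate() criterion are illustrative assumptions, not the authors' formal method.

        # Minimal sketch of the interwoven artefact creation/evaluation loop.
        # The evaluation criterion is a placeholder, not the paper's method.
        from dataclasses import dataclass, field

        @dataclass
        class Artefact:
            kind: str                                       # e.g. "architecture"
            influences: list = field(default_factory=list)  # prior artefacts

        def evaluate(artefact: Artefact) -> bool:
            # Evaluate the artefact on its own and against the prior
            # artefacts that influenced it (placeholder check).
            return all(isinstance(prior, Artefact) for prior in artefact.influences)

        pipeline = ["conceptual model", "process", "conceptual framework",
                    "system framework", "architecture", "implementation"]

        created = []
        for kind in pipeline:
            artefact = Artefact(kind, influences=list(created))
            if evaluate(artefact):      # evaluation interwoven with creation
                created.append(artefact)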

    Values-Based Transformative Games: From the Physical to the Digital

    In the context of game-based learning, learning is often limited to basic literacies such as math and reading, even though several educational institutions acknowledge the importance of values education. In this paper we discuss how to bring values into a game. We discuss the design and implementation of a customisable version of the popular board game Snakes and Ladders to teach values to the young (ages 0-8). A value has been defined as “a centrally held, enduring belief which guides actions and judgements across specific situations…”. This implies that there is an inherent element of choice or decision-making in demonstrating one's values. We discuss the process of adapting the Snakes and Ladders board game to a physical artefact by applying a Values-based Transformative Games Design Model, and further digitising the artefact to make it more accessible. A prototype of the digital artefact is presented to demonstrate the concept. The Insider Action Game Design Research methodology is applied to create the physical artefact, given the researcher's involvement in volunteer work on values-based education for the young. The findings of this research are of immediate benefit to those wishing to introduce a digitised version of a simple and popular board game to teach values to young children. The values-based questions used in the game are easy to adapt, so the game has the potential to be extended to various other basic literacies, as well as to different types of values such as sustainability and cultural values. The Values-based Transformative Games design model can also be adapted and improved with further research.
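    To make the game mechanic concrete, here is a minimal sketch of one way a digitised Snakes and Ladders round with values-based question squares might work. The board layout, questions, and turn logic are invented for illustration and are not the prototype described in the paper.

        # Illustrative sketch: Snakes and Ladders with values checkpoints.
        # Board layout and questions are placeholders, easy to customise.
        import random

        SNAKES = {16: 6, 48: 30, 62: 19}      # head square -> tail square
        LADDERS = {3: 22, 11: 26, 40: 59}     # bottom square -> top square
        VALUE_QUESTIONS = {                   # square -> question (customisable)
            10: "A friend drops their toys. What do you do?",
            25: "You find a pencil that is not yours. What do you do?",
        }

        def take_turn(position: int) -> int:
            position += random.randint(1, 6)            # roll the die
            position = LADDERS.get(position, position)  # climb a ladder
            position = SNAKES.get(position, position)   # slide down a snake
            if position in VALUE_QUESTIONS:             # values checkpoint
                print("Question:", VALUE_QUESTIONS[position])
            return min(position, 100)                   # overshoot is truncated

        pos = 0
        while pos < 100:
            pos = take_turn(pos)
        print("Reached the final square!")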

    Computing science and the demarcation problem in philosophy of science: an epistemological investigation

    Advisors: Silvio Seno Chibeni, Marcelo Esteban Coniglio. Master's dissertation (Mestre em Filosofia), Instituto de Filosofia e Ciências Humanas, Universidade Estadual de Campinas. Abstract: This dissertation addresses the thesis that computing science has a peculiar epistemic hybridism, with properties sometimes similar to mathematics, sometimes similar to the natural sciences, depending on the selected disciplinary focus. This thesis is approached through an analysis of the epistemic nature underlying computing science and of its current status in scientific and academic environments, from the perspective of the philosophy of science, particularly Thomas Kuhn's criterion of scientific demarcation. With this goal in mind, two key issues are investigated. The first is a genealogical examination of computing science in relation to other fields of knowledge in the context of the philosophy of science. The second is an examination of the problem of demarcation itself by means of a few criteria from contemporary philosophy of science. We understand computing science as a field of knowledge with its own autonomous identity, yet with simultaneous intersections with mathematics, physics and engineering. As such, some of these intersections between domains are addressed, as they bear elements for reflections of a fundamentally philosophical character on important topics of computational knowledge. Finally, we analyse the hypothesis that the constitutive process of computing science was, in a way, a by-product of an epistemic crisis, essentially Kuhnian in character, that occurred in logical-mathematical knowledge during the period known as logical empiricism. In order to illuminate the above questions, a brief and systematic analysis of the demarcation problem is also provided, taking into account some of the main criteria and arguments already put forward in the contemporary literature. Epistemic intersections between computing science, mathematics and the natural sciences are explored for the main groups of computing science subdisciplines in view of their epistemic singularities, namely: theoretical computing science, which encompasses algorithms, computational complexity and classes of problems; software engineering, which covers architectural aspects of computational systems; heuristic computing fields such as artificial intelligence; and, finally, a few distinctive aspects separating quantum computing from classical computing.

    A theory for software development built using knowledge management techniques and models

    [Abstract] This thesis shows, and demonstrates, how the fundamental equation of knowledge, applied to knowledge itself, improves it. Naturally, this is how scientists work in their research, but the aim here is to prove that, by using knowledge management techniques, any specialist in a subject can take advantage of that equation. In this work, the domain of software development was chosen as the setting in which to investigate the proposed thesis. First, it was found that the well-known "gap" between hardware development and software development is essentially a consequence of the former being supported by a true engineering discipline, and therefore by a science that backs and grounds it (electronics and physics, respectively), whereas the latter is still more an art than an engineering discipline, with no scientific theory behind it. The proposal, therefore, is to present a theory that turns software development into a true engineering discipline. With this in mind, the formal and material adequacy conditions for any theory were established. Next, using the Löwenheim-Skolem theorem and von Neumann's generation of the ordinal numbers from the empty set, the feasibility of such a theory is demonstrated. Then, taking functional programming as the domain, and currying in particular, the viability of the theory is verified. Finally, a theory is proposed that, meeting the requirements demanded of any theory, grounds software development. Moreover, the proposed theory is broad and robust enough to be applied to any information system, including DNA and the brain. To test it, crucial experiments are proposed in all of these domains which are, supposedly, capable of falsifying it. The following concrete results were obtained: A) Establishment of the computational limits at 10^50 operations per second and 10^31 bits of memory for a "mentefact" of 1 kg. B) The conjunction of the Löwenheim-Skolem theorem and von Neumann's proposal for generating the ordinals is sufficient to establish a theory for software development. In passing, and as an added result, it is shown that Kronecker's dictum on the creation of the integers must be amended as follows: God created the empty set; man made the rest. C) The idea of functional programming due to Frege and Schönfinkel, later developed by Curry, establishes the effectiveness of an axiomatic theory for software development. D) A theory with two constructs and three postulates is proposed, not only for software development but for any information system. E) Finally, and as a side effect, it is shown how Leibniz plagiarised the Spaniard Caramuel in the creation of the binary number system.
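    Since the viability argument rests on currying, the following minimal sketch shows the idea itself: rewriting a two-argument function as a chain of one-argument functions. It illustrates the general technique only, not the thesis's formal construction.

        # Currying: a two-argument function becomes a chain of
        # one-argument functions, as in Schönfinkel's and Curry's work.
        def add(x, y):
            return x + y

        def curried_add(x):
            def inner(y):
                return x + y
            return inner

        assert add(2, 3) == curried_add(2)(3) == 5
        increment = curried_add(1)   # partial application falls out for free
        assert increment(41) == 42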

    Operational analysis of sequence diagram specifications

    This thesis is concerned with operational analysis of UML 2.x sequence diagram specifications. By operational analysis we mean analysis based on a characterization of the executions of sequence diagrams, or in other words an operational semantics for sequence diagrams. We define two methods for analysis of sequence diagram specifications – refinement verification and refinement testing – and both are implemented in an analysis tool we have named ‘Escalator’. Further, we take the first steps toward extending our approach with support for availability analysis. In order to facilitate operational analysis, we define an operational semantics for UML 2.x sequence diagrams. The operational semantics is loyal to the intended semantics of UML, and is proven to be sound and complete with respect to the denotational semantics for sequence diagrams defined in STAIRS – a framework for stepwise development based on refinement of sequence diagram specifications. The operational semantics has a formalized meta-level, on which we define execution strategies. This meta-level allows us to make distinctions between positive and negative behavior, between potential and universal behavior, and between potential and mandatory choice, all of which are inherently difficult in an operational semantics. Based on the operational semantics and its formalized meta-level, we define trace generation, test generation and test execution. Further, based on a formalization of refinement in STAIRS, the trace generation is used to devise a method for refinement verification, and the test generation and the test execution are used to define a method for refinement testing. Both are methods for investigating whether or not a sequence diagram specification is a correct refinement of another sequence diagram specification. The operational semantics, the refinement verification and the refinement testing are implemented with the term rewriting language Maude, and these implementations are integrated in the Escalator tool. In addition, Escalator provides a graphical user interface for working with sequence diagram specifications and for running the analyses. In order to facilitate availability analysis, we define a conceptual model for service availability where the basic properties of availability are identified. Further, we extend the operational semantics with support for one class of these basic properties, namely real-time properties, and outline how the operational semantics extended with time can be applied to devise methods for timed analysis of sequence diagram specifications.
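    To give a flavour of refinement checking over trace sets, here is a hedged sketch assuming a STAIRS-style definition in which a specification is a pair (p, n) of positive and negative trace sets, and a refinement must keep negative traces negative while positive traces either stay positive or become negative. This simplified set-inclusion check is an assumption made for illustration; the thesis implements the actual semantics and analyses in Maude.

        # Sketch: refinement of interaction obligations as set inclusion.
        # (p, n) = (positive traces, negative traces); simplified assumption.
        def refines(abstract, concrete):
            p1, n1 = abstract
            p2, n2 = concrete
            return n1 <= n2 and p1 <= (p2 | n2)

        spec = ({"send.recv", "send.timeout.recv"}, {"recv.send"})
        refined = ({"send.recv"}, {"recv.send", "send.timeout.recv"})
        assert refines(spec, refined)  # narrowing potential behaviour is allowed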

    An evaluation methodology and framework for semantic web services technology

    Software engineering has been driven for decades by the trend towards component-based development and loose coupling. Service-oriented architectures, and Web Services in particular, are the latest product of this long-running development. Semantic Web Services (SWS) apply the paradigms of the Semantic Web to Web Services to allow more flexible and dynamic service usage. Numerous frameworks for realizing SWS have been put forward in recent years, but their relative advantages and general maturity are not easy to assess. This dissertation presents a solution to this issue. It defines a general methodology and framework for SWS technology evaluation, as well as concrete benchmarks to assess the functional scope and performance of various approaches. The presented benchmarks have been executed within an international evaluation campaign. The thesis thus comprehensively covers theoretical, methodological and practical results regarding the evaluation and assessment of SWS technologies.
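    As an illustration of what such a benchmark might look like operationally, the sketch below runs the same discovery goal against several frameworks and records functional correctness and latency. The adapter functions, goal, and expected result are hypothetical placeholders, not the dissertation's actual benchmark suite.

        # Hypothetical benchmark harness: same goal, several SWS frameworks,
        # measuring functional correctness and wall-clock performance.
        import time

        def run_benchmark(frameworks, goal, expected):
            results = []
            for name, discover in frameworks.items():
                start = time.perf_counter()
                returned = discover(goal)                  # framework under test
                elapsed = time.perf_counter() - start
                results.append({
                    "framework": name,
                    "correct": set(returned) == expected,  # functional scope
                    "seconds": elapsed,                    # performance
                })
            return results

        # Toy adapters standing in for real SWS framework bindings.
        frameworks = {"fw_a": lambda g: {"s1", "s2"}, "fw_b": lambda g: {"s1"}}
        print(run_benchmark(frameworks, goal="book-flight", expected={"s1", "s2"}))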