7,507 research outputs found

    Volumetric Techniques for Product Routing and Loading Optimisation in Industry 4.0: A Review

    Industry 4.0 has become a crucial part of the majority of processes, components, and related modelling and predictive tools that allow a more efficient, automated, and sustainable approach to industry. The availability of large quantities of data, and the advances in IoT, AI, and data-driven frameworks, have enhanced data gathering, assessment, and the extraction of actionable information, resulting in better decision-making. Product picking and its subsequent packing is an important area that has drawn increasing attention from the research community. However, depending on the context, some of the related approaches tend to be either highly mathematical or tied to a specific setting. This article aims to provide a survey of the main methods, techniques, and frameworks relevant to product packing and to highlight the main properties and features that should be further investigated to ensure a more efficient and optimised approach.
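    As a concrete illustration of the kind of volumetric heuristic such surveys cover, the sketch below implements first-fit decreasing by volume in Python. It is a minimal sketch under assumed simplifications: items and containers are reduced to scalar volumes, and the function names are hypothetical; real loading optimisation must also handle item geometry, orientation, and stability.

        def first_fit_decreasing(item_volumes, container_volume):
            """Assign each item to the first opened container with enough spare volume."""
            containers = []   # remaining free volume per opened container
            assignment = []   # (item volume, container index) pairs
            for vol in sorted(item_volumes, reverse=True):
                for i, free in enumerate(containers):
                    if vol <= free:
                        containers[i] -= vol
                        assignment.append((vol, i))
                        break
                else:  # no existing container fits: open a new one
                    containers.append(container_volume - vol)
                    assignment.append((vol, len(containers) - 1))
            return assignment, len(containers)

        items = [0.6, 0.3, 0.8, 0.2, 0.5, 0.4]    # item volumes in m^3 (toy data)
        print(first_fit_decreasing(items, 1.0))   # assignments and container count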

    Contract-Based Design of Dataflow Programs

    Quality and correctness are becoming increasingly important aspects of software development, as our reliance on software systems in everyday life continues to grow. Highly complex software systems are today found in critical appliances such as medical equipment, cars, and telecommunication infrastructure. Failures in these kinds of systems may have disastrous consequences. At the same time, modern computer platforms are increasingly concurrent, as the computational capacity of modern CPUs is improved mainly by increasing the number of processor cores. Computer platforms are also becoming increasingly parallel, distributed, and heterogeneous, often involving special processing units, such as graphics processing units (GPUs) or digital signal processors (DSPs), for performing specific tasks more efficiently than is possible on general-purpose CPUs. These modern platforms allow implementing increasingly complex functionality in software. Cost-efficient development of software that efficiently exploits the power of such platforms while ensuring correctness is, however, a challenging task. Dataflow programming has become popular in the development of safety-critical software in many domains of the embedded community. For instance, in the automotive domain, the dataflow language Simulink has become widely used in model-based design of control software. For more complex functionality, however, this model of computation may not be expressive enough. In the signal processing domain, more expressive, dynamic models of computation have attracted much attention. These models of computation have, however, not gained as significant an uptake in safety-critical domains, largely because it is challenging to provide guarantees regarding, e.g., timing or determinism under these more expressive models of computation. Contract-based design has become a widespread way to specify and verify correctness properties of software components. A contract consists of assumptions (preconditions) regarding the input data and guarantees (postconditions) regarding the output data. By verifying a component with respect to its contract, it is ensured that the output fulfils the guarantees, assuming that the input fulfils the assumptions. While contract-based verification of traditional object-oriented programs has been researched extensively, verification of asynchronous dataflow programs has not been studied to the same extent. In this thesis, a contract-based design framework tailored specifically to dataflow programs is proposed. The proposed framework supports both an extensive subset of the discrete-time Simulink synchronous language and a more general, asynchronous and dynamic, dataflow language. The proposed contract-based verification techniques are automatic, guided only by user-provided invariants, and based on encoding dataflow programs in existing, mature verification tools for sequential programs, such as the Boogie guarded command language and its associated verifier. It is shown how dataflow programs, with components implemented in an expressive programming language with support for matrix computations, can be efficiently encoded for such a verifier. Furthermore, it is also shown that contract-based design can be used to improve the runtime performance of dataflow programs by allowing more scheduling decisions to be made at compile time. All the proposed techniques have been implemented in prototype tools and evaluated on a large number of different programs. Based on the evaluation, the methods were shown to work in practice and to scale to real-world programs.
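    As a minimal illustration of the contract idea described above, the Python sketch below checks an assumption/guarantee pair at the boundary of a toy dataflow actor. All names and the actor itself are hypothetical; the thesis discharges such contracts statically with the Boogie verifier rather than checking them at run time as done here.

        from dataclasses import dataclass
        from typing import Any, Callable, List

        @dataclass
        class Contract:
            assumption: Callable[[List[float]], bool]                 # precondition on input tokens
            guarantee: Callable[[List[float], List[float]], bool]     # postcondition on output tokens

        def run_actor(actor, contract, tokens):
            """Fire a dataflow actor once, checking its contract at the boundary."""
            assert contract.assumption(tokens), "input violates assumption"
            out = actor(tokens)
            assert contract.guarantee(tokens, out), "output violates guarantee"
            return out

        # Toy example: a moving-average actor that consumes 4 tokens and produces 1.
        avg_contract = Contract(
            assumption=lambda ins: len(ins) == 4 and all(x >= 0 for x in ins),
            guarantee=lambda ins, outs: len(outs) == 1 and min(ins) <= outs[0] <= max(ins),
        )
        print(run_actor(lambda ins: [sum(ins) / len(ins)], avg_contract, [1.0, 2.0, 3.0, 4.0]))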

    Accelerating finite state machine-based testing using reinforcement learning

    Testing is a crucial phase in the development of complex systems, and this has led to interest in automated test generation techniques based on state-based models. Many approaches use models that are types of finite state machine (FSM). The corresponding test generation algorithms typically require that certain test components, such as reset sequences (RSs) and preset distinguishing sequences (PDSs), have been produced for the FSM specification. Unfortunately, the generation of RSs and PDSs is computationally expensive, which limits the scalability of such FSM-based test generation algorithms. This paper addresses this scalability problem by introducing a reinforcement learning framework: the Q-Graph framework for model-based testing (MBT). We show how this framework can be used in the generation of RSs and PDSs, and we consider both (potentially partial) timed and untimed models. The proposed approach was evaluated using three types of FSMs: randomly generated FSMs, FSMs from a benchmark, and an FSM of an Engine Status Manager for a printer. In experiments, the proposed approach was much faster and used much less memory than state-of-the-art methods for computing PDSs and RSs.
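    To illustrate the flavour of such an approach, the sketch below uses tabular Q-learning to search for a reset sequence of a toy FSM: the learning state is the set of states the machine could currently occupy, and an episode succeeds when that set collapses to a singleton. The FSM, rewards, and hyperparameters are illustrative assumptions, not the paper's Q-Graph framework.

        import random
        from collections import defaultdict

        # Hypothetical 4-state FSM (not from the paper): transition[state][input].
        TRANSITION = {
            0: {"a": 1, "b": 0},
            1: {"a": 2, "b": 0},
            2: {"a": 3, "b": 0},
            3: {"a": 3, "b": 3},
        }
        INPUTS = ["a", "b"]

        def step(belief, x):
            """Apply input x to every state the machine might currently be in."""
            return frozenset(TRANSITION[s][x] for s in belief)

        def learn_reset_sequence(episodes=2000, alpha=0.5, gamma=0.9, eps=0.2):
            q = defaultdict(float)  # Q[(belief, input)]
            for _ in range(episodes):
                belief = frozenset(TRANSITION)  # machine could be in any state
                for _ in range(20):             # bound episode length
                    if len(belief) == 1:
                        break
                    if random.random() < eps:
                        x = random.choice(INPUTS)
                    else:
                        x = max(INPUTS, key=lambda i: q[(belief, i)])
                    nxt = step(belief, x)
                    reward = 1.0 if len(nxt) == 1 else -0.01  # reward collapsing the set
                    best_next = 0.0 if len(nxt) == 1 else max(q[(nxt, i)] for i in INPUTS)
                    q[(belief, x)] += alpha * (reward + gamma * best_next - q[(belief, x)])
                    belief = nxt
            # Greedy rollout extracts the learned reset sequence.
            belief, seq = frozenset(TRANSITION), []
            while len(belief) > 1 and len(seq) < 20:
                x = max(INPUTS, key=lambda i: q[(belief, i)])
                seq.append(x)
                belief = step(belief, x)
            return seq, belief

        seq, final = learn_reset_sequence()
        print("reset sequence:", seq, "reaches state", set(final))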

    Formally verified animation for RoboChart using interaction trees

    RoboChart is a core notation in the RoboStar framework. It is a timed and probabilistic domain-specific, state machine-based language for robotics. RoboChart supports shared variables and communication across entities in its component model, and it has a formal denotational semantics given in CSP. The semantic technique of Interaction Trees (ITrees) represents the behaviours of reactive and concurrent programs interacting with their environments. The recent mechanisation of ITrees, of an ITree-based CSP semantics, and of a Z mathematical toolkit in Isabelle/HOL enables new applications of verification and animation for state-rich process languages such as RoboChart. In this paper, we use ITrees to give RoboChart a novel operational semantics, implement it in Isabelle, and use Isabelle's code generator to produce verified and executable animations. We illustrate our approach using models of an autonomous chemical detector and of a patrol robot, both exhibiting nondeterminism and using shared variables. With animation, we show two concrete scenarios for the chemical detector, in which the robot encounters different environmental inputs, and three for the patrol robot, in which its calibrated position lies in different corridor sections. We also verify that the animated scenarios are trace refinements of the CSP denotational semantics of the RoboChart models using FDR, a refinement model checker for CSP. This ensures that our approach of resolving nondeterminism using CSP operators with priority is sound and correct.
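    The sketch below gives a rough Python rendering of the interaction-tree idea: a process is either a returned value or a visible event with a continuation, and animation resolves events against an environment. The constructor names Ret and Vis follow the ITree literature; the toy detector and driver are illustrative assumptions (the paper's animations are generated from Isabelle/HOL, not Python).

        from dataclasses import dataclass
        from typing import Any, Callable, Dict

        @dataclass
        class Ret:            # terminated computation carrying a result
            value: Any

        @dataclass
        class Vis:            # visible event awaiting the environment's answer
            event: str
            cont: Callable[[Any], "Ret | Vis"]

        def animate(tree, responses: Dict[str, Any]):
            """Run an ITree against a fixed environment (event -> answer)."""
            while isinstance(tree, Vis):
                answer = responses[tree.event]
                print(f"event {tree.event!r} -> {answer!r}")
                tree = tree.cont(answer)
            return tree.value

        # Toy detector: read a gas level, then decide whether to raise an alarm.
        detector = Vis("gas?", lambda level:
                       Vis("alarm!", lambda _: Ret("stopped")) if level > 5
                       else Ret("patrolling"))

        print(animate(detector, {"gas?": 7, "alarm!": None}))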

    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    LIPIcs, Volume 251, ITCS 2023, Complete Volume.

    Space-Efficient Parameterized Algorithms on Graphs of Low Shrubdepth

    Dynamic programming on various graph decompositions is one of the most fundamental techniques used in parameterized complexity. Unfortunately, even for concepts as simple as path or tree decompositions, such dynamic programming uses space that is exponential in the decomposition's width, and there are good reasons to believe that this is necessary. However, it has been shown that in graphs of low treedepth it is possible to design algorithms that achieve polynomial space complexity without requiring worse time complexity than their counterparts working on tree decompositions of bounded width. Here, treedepth is a graph parameter that, intuitively speaking, takes into account both the depth and the width of a tree decomposition of the graph, rather than the width alone. Motivated by the above, we consider graphs that admit clique expressions with bounded depth and label count, or equivalently, graphs of low shrubdepth. Here, shrubdepth is a bounded-depth analogue of cliquewidth, in the same way as treedepth is a bounded-depth analogue of treewidth. We show that in this setting as well, bounding the depth of the decomposition is a deciding factor for improving the space complexity. Precisely, we prove that on $n$-vertex graphs equipped with a tree-model (a decomposition notion underlying shrubdepth) of depth $d$ and using $k$ labels, we can solve:
    - Independent Set in time $2^{O(dk)} \cdot n^{O(1)}$ using $O(dk^2 \log n)$ space;
    - Max Cut in time $n^{O(dk)}$ using $O(dk \log n)$ space; and
    - Dominating Set in time $2^{O(dk)} \cdot n^{O(1)}$ using $n^{O(1)}$ space, via a randomized algorithm.
    We also establish a lower bound, conditional on a certain assumption about the complexity of Longest Common Subsequence, which shows that at least in the case of Independent Set the exponent of the parametric factor in the time complexity has to grow with $d$ if one wishes to keep the space complexity polynomial.
    Comment: Conference version to appear at the European Symposium on Algorithms (ESA 2023).

    Current and Future Challenges in Knowledge Representation and Reasoning

    Knowledge Representation and Reasoning is a central, longstanding, and active area of Artificial Intelligence. Over the years it has evolved significantly; more recently it has been challenged and complemented by research in areas such as machine learning and reasoning under uncertainty. In July 2022, a Dagstuhl Perspectives Workshop was held on Knowledge Representation and Reasoning. The goal of the workshop was to describe the state of the art in the field, including its relation to other areas, and its shortcomings and strengths, together with recommendations for future progress. We developed this manifesto based on the presentations, panels, working groups, and discussions that took place at the Dagstuhl Workshop. It is a declaration of our views on Knowledge Representation: its origins, goals, milestones, and current foci; its relation to other disciplines, especially to Artificial Intelligence; and its challenges, along with key priorities for the next decade.

    Enhancing the forensic comparison process of common trace materials through the development of practical and systematic methods

    Ongoing advancement in forensic trace evidence has driven the development of new and objective methods for comparing various materials. While many standard guides have been published for use in trace laboratories, several areas still require a more comprehensive understanding of error rates, and there is an urgent need to harmonize methods of examination and interpretation. Two critical areas are the forensic examination of physical fits and the comparison of spectral data, both of which depend heavily on the examiner's judgment. The long-term goal of this study is to advance and modernize the comparative process of physical fit examinations and spectral interpretation. This goal is pursued through several avenues: 1) improvement of quantitative-based methods for various trace materials, 2) scrutiny of the methods through interlaboratory exercises, and 3) addressing fundamental aspects of the discipline using large experimental datasets, computational algorithms, and statistical analysis. A substantial new body of knowledge has been established by analyzing population sets of nearly 4,000 items representative of casework evidence. First, this research identifies material-specific relevant features for duct tapes and automotive polymers. Then, this study develops reporting templates that facilitate thorough and systematic documentation of an analyst's decision-making process and minimize risks of bias. It also establishes criteria for a quantitative edge similarity score (ESS) for tapes and automotive polymers that yields relatively high accuracy (85% to 100%) and, notably, no false positives. Finally, the practicality and performance of the ESS method for duct tape physical fits are evaluated by forensic practitioners through two interlaboratory exercises. Across these studies, accuracy using the ESS method ranges between 95% and 99%, and again no false positives are reported. The practitioners' feedback demonstrates the method's potential to assist in training and to improve peer verifications. This research also develops and trains computational algorithms to support analysts making decisions on sample comparisons. The automated algorithms show the potential to provide objective and probabilistic support for determining a physical fit, with accuracy comparable to that of the analysts. Furthermore, additional models are developed to extract edge feature information from the systematic comparison templates for tapes and textiles, providing insight into the relative importance of each comparison feature. A decision tree model is developed to assist physical fit examinations of duct tapes and textiles, and it demonstrates performance comparable to that of trained analysts. The computational tools also evaluate the suitability of partial sample comparisons, simulating situations where portions of an item are lost or damaged. Finally, an objective approach to interpreting complex spectral data is presented. A comparison metric consisting of spectral angle contrast ratios (SCAR) is used as a model to assess more than 94 different-source and 20 same-source electrical tape backings. The SCAR metric achieves a discrimination power of 96% and demonstrates the capacity to capture the variability between different-source samples and within same-source samples. Application of a random-forest model allows for the automatic detection of primary differences between samples. The developed threshold could assist analysts in making decisions on the spectral comparison of chemically similar samples. This research provides the forensic science community with novel approaches to comparing materials commonly seen in forensic laboratories. The outcomes of this study are anticipated to offer forensic practitioners new and accessible tools for incorporation into current workflows, facilitating systematic and objective analysis and interpretation of forensic materials and supporting analysts' opinions.
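    As a rough illustration of the spectral-angle core of a SCAR-style comparison, the Python sketch below computes the angle between two spectra treated as vectors and scales it against a within-source baseline. The contrast-ratio normalisation shown is an assumption for illustration only; it is not the dissertation's exact formulation, and the toy spectra are invented.

        import numpy as np

        def spectral_angle(a, b):
            """Angle (radians) between two spectra treated as vectors."""
            a, b = np.asarray(a, float), np.asarray(b, float)
            cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
            return float(np.arccos(np.clip(cos, -1.0, 1.0)))

        def contrast_ratio(query, candidate, baseline_angle):
            """Assumed normalisation: angle relative to within-source variability."""
            return spectral_angle(query, candidate) / baseline_angle

        # Toy spectra: a same-source replicate pair and a different-source query.
        k1 = [0.10, 0.90, 0.40, 0.20]
        k2 = [0.12, 0.88, 0.41, 0.19]             # replicate of k1's source
        q  = [0.70, 0.20, 0.10, 0.60]
        baseline = spectral_angle(k1, k2)         # within-source angle
        print(contrast_ratio(q, k1, baseline))    # ratio >> 1 suggests different source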

    Investigating the learning potential of the Second Quantum Revolution: development of an approach for secondary school students

    In recent years we have witnessed important changes: the Second Quantum Revolution is in the spotlight of many countries, and it is creating a new generation of technologies. To unlock its potential, several countries have launched strategic plans and research programs that finance and set the pace of research and development of these new technologies (such as the Quantum Flagship and the National Quantum Initiative Act). The increasing pace of technological change also challenges science education and institutional systems, requiring them to help prepare new generations of experts. This work is placed within physics education research and contributes to the challenge by developing an approach and a course about the Second Quantum Revolution. The aims are to promote quantum literacy and, in particular, to value the Second Quantum Revolution from a cultural and educational perspective. The dissertation is articulated in two parts. In the first, we unpack the Second Quantum Revolution from a cultural perspective and shed light on its main revolutionary aspects, which are elevated to the rank of principles implemented in the design of a course for secondary school students and for prospective and in-service teachers. The design process and the educational reconstruction of the activities are presented, as well as the results of a pilot study conducted to investigate the impact of the approach on students' understanding and to gather feedback for refining and improving the instructional materials. The second part explores the Second Quantum Revolution as a context for introducing some basic concepts of quantum physics. We present the results of an implementation with secondary school students that investigates whether, and to what extent, external representations can promote students' understanding and acceptance of quantum physics as a personally reliable description of the world.