16 research outputs found

    Tensile Specimen Punch

    There is significant demand for tensile specimens among students in the MET 426 Lab. The solution is to facilitate specimen production in the lab. This project designed a punch and die to take raw sheet material and produce tensile specimens. The project was conceived, analyzed, and designed in the Central Washington University (CWU) Mechanical Engineering Technology department. Working within the constraints of the university's resources, all parts were made in the CWU machine shop. The design includes the base, which holds the specimen in place, and the punch, which removes the material. Two dies and two springs are placed in the middle of the base to push the punch back to its initial position. Threaded fasteners hold the punch housing in place and prevent oscillation. The punch housing supports the punch, keeping it stable and aligned with the die. The project is complete when the punch and die can be mounted to the arbor, the specimen can be supported on the base, and the punch can remove the desired material. Testing produced tensile specimens meeting the requirements of the project. The result is a 45% increase in part production, as the new design has twice the capacity.

    Study on the Design and Structural Control of Carbon Particle-Polymer Hybrids Using Electric Fields

    Nagaoka University of Technology (national university corporation)

    Towards an Improved Understanding of Software Vulnerability Assessment Using Data-Driven Approaches

    Software Vulnerabilities (SVs) can expose software systems to cyber-attacks, potentially causing enormous financial and reputational damage for organizations. There have been significant research efforts to detect these SVs so that developers can promptly fix them. However, fixing SVs is complex and time-consuming in practice, so developers usually do not have sufficient time and resources to fix all SVs at once. As a result, developers often need SV information, such as exploitability, impact, and overall severity, to prioritize fixing the more critical SVs. The information required for fix planning and prioritization is typically provided in the SV assessment step of the SV lifecycle. Recently, data-driven methods have been increasingly proposed to automate SV assessment tasks. However, the existing studies on data-driven SV assessment still have numerous shortcomings that hinder their application in practice. This PhD thesis contributes to the growing literature on data-driven SV assessment by investigating and addressing the constant changes in SV data, as well as the lack of consideration of source code and developers' needs, that impede the practical applicability of the field. Specifically, this thesis makes the following five contributions. (1) We systematize the knowledge of data-driven SV assessment to reveal the best practices of the field and the main challenges affecting its application in practice. We then propose various solutions to these challenges to better support real-world applications of data-driven SV assessment. (2) We first demonstrate the existence of concept drift (changing data) in the descriptions of SV reports that current studies have mostly used for predicting Common Vulnerability Scoring System (CVSS) metrics.
We augment report-level SV assessment models with subwords of terms extracted from SV descriptions to help the models more effectively capture the semantics of ever-increasing SVs. (3) We also identify that SV reports are usually released after SVs are fixed. We therefore propose using vulnerable code to enable earlier SV assessment, without waiting for SV reports. We are the first to use Machine Learning techniques to predict CVSS metrics at the function level, leveraging the vulnerable statements that directly cause SVs together with their context in code functions. The performance of our function-level SV assessment models is promising, opening up research opportunities in this new direction. (4) To support the continuous integration practices of modern software development, we present a novel deep multi-task learning model, DeepCVA, that simultaneously and efficiently predicts multiple CVSS assessment metrics at the commit level, specifically using vulnerability-contributing commits. DeepCVA is the first work that enables practitioners to perform SV assessment as soon as vulnerable changes are added to a codebase, supporting just-in-time prioritization of SV fixing. (5) Besides code artifacts produced from a software project of interest, SV assessment tasks can also benefit from crowdsourced SV information on developer Question and Answer (Q&A) websites. We automatically retrieve large-scale security/SV-related posts from these Q&A websites and apply a topic modeling technique to the posts to distill developers' real-world SV concerns, which can then be used for data-driven SV assessment. Overall, we believe that this thesis has provided evidence-based knowledge and useful guidelines for researchers and practitioners to automate SV assessment using data-driven approaches. Thesis (Ph.D.) -- University of Adelaide, School of Computer Science, 202

    Densely Packed Linear Assemblies of Carbon Nanotube Bundles in Polysiloxane-Based Nanocomposite Films

    Linear assemblies of carbon nanotubes (LACNTs) were fabricated and controlled in polysiloxane-based nanocomposite films, and the effects of the LACNTs on the thermal and electrical properties of the films were investigated. CNTs were dispersed by mechanical stirring and sonication in a polysiloxane prepolymer. The homogeneous suspensions were cast on polyamide spacers and aligned into linear assemblies by applying DC and switching-DC electric fields before the mixture cross-linked. Densely packed LACNTs fixed to the composite film surfaces were fabricated with various structures and thicknesses, depending on the DC and switching-DC conditions. Polymer nanocomposites with different LACNT densities exhibited enhanced thermal and electrical conductivities and high optical transmittance, making them promising structural materials for electronic components in automotive and aerospace applications.

    TextANIMAR: Text-based 3D Animal Fine-Grained Retrieval

    3D object retrieval is an important yet challenging task that has drawn increasing attention in recent years. While existing approaches have made strides in addressing this issue, they are often limited to restricted settings such as image and sketch queries, which are unfriendly interactions for many users. To overcome these limitations, this paper presents a novel SHREC challenge track focusing on text-based fine-grained retrieval of 3D animal models. Unlike previous SHREC challenge tracks, the proposed task is considerably more challenging, requiring participants to develop innovative approaches to tackle the problem of text-based retrieval. Despite the increased difficulty, we believe that this task has the potential to drive useful applications in practice and facilitate more intuitive interactions with 3D objects. Five groups participated in our competition, submitting a total of 114 runs. While the results obtained in our competition are satisfactory, we note that the challenges presented by this task are far from being fully solved. As such, we provide insights into potential areas for future research and improvement. We believe this work can help push the boundaries of 3D object retrieval and facilitate more user-friendly interactions via vision-language technologies. Comment: arXiv admin note: text overlap with arXiv:2304.0573
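The ranking core of text-based 3D retrieval can be sketched as follows. This is only a sketch under assumptions: the challenge submissions used full vision-language pipelines, whereas here random vectors stand in for real text and 3D-model embeddings, and the model names are hypothetical.

```python
import numpy as np

# Hypothetical precomputed embeddings for 3D animal models (in practice
# produced by a vision-language encoder); random vectors stand in here.
rng = np.random.default_rng(0)
model_ids = ["horse_01", "cat_02", "elephant_03"]
model_emb = rng.normal(size=(3, 8))

def rank_models(query_emb, model_emb, model_ids):
    """Rank 3D models by cosine similarity to a text-query embedding."""
    q = query_emb / np.linalg.norm(query_emb)
    m = model_emb / np.linalg.norm(model_emb, axis=1, keepdims=True)
    scores = m @ q                      # cosine similarity per model
    order = np.argsort(scores)[::-1]    # best match first
    return [(model_ids[i], float(scores[i])) for i in order]

query_emb = rng.normal(size=8)  # stand-in for an encoded text query
ranked = rank_models(query_emb, model_emb, model_ids)
print(ranked)
```

Fine-grained retrieval adds the hard part on top of this ranking step: learning embeddings in which subtle textual distinctions (pose, species, attributes) separate otherwise similar 3D models.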

    Software for the frontiers of quantum chemistry:An overview of developments in the Q-Chem 5 package

    This article summarizes technical advances contained in the fifth major release of the Q-Chem quantum chemistry program package, covering developments since 2015. A comprehensive library of exchange–correlation functionals, along with a suite of correlated many-body methods, continues to be a hallmark of the Q-Chem software. The many-body methods include novel variants of both coupled-cluster and configuration-interaction approaches along with methods based on the algebraic diagrammatic construction and variational reduced density-matrix methods. Methods highlighted in Q-Chem 5 include a suite of tools for modeling core-level spectroscopy, methods for describing metastable resonances, methods for computing vibronic spectra, the nuclear–electronic orbital method, and several different energy decomposition analysis techniques. High-performance capabilities including multithreaded parallelism and support for calculations on graphics processing units are described. Q-Chem boasts a community of well over 100 active academic developers, and the continuing evolution of the software is supported by an “open teamware” model and an increasingly modular design

    Equation Solving and Its Difficulties

    The aim of this degree project is to identify students' difficulties with equation solving in the Mathematics B course at upper secondary school, and to examine students' solution methods in order to see whether they use the effective method when solving equations. To shed light on problems surrounding equation solving and students' solution methods, a test with exercises was given to upper secondary students. The study was carried out at four upper secondary schools. The results show that students face a considerable number of difficulties when solving equations, among them applying arithmetic rules and laws. The difficulties stem from students' limited or deficient understanding and interpretation of algebraic expressions. Almost none of the participating students had any idea of, or strategy for, solving equations efficiently. To a large extent, students choose to solve equations with a formal solution method (the non-effective solution method).