
    Reconstructing the Dedicatory Experience: Flexibility and Limitation in the Ancient Greek Dedicatory Process

    Identifying the factors that affected dedicatory practices has long been an area of consideration in the study of ancient Greek religion. However, this discussion is largely dominated by two concepts: divine specialization and appropriateness. Whereas the former assumes that divine beings had responsibilities specific to them and that this specialization limited the range of offerings a deity could receive, the latter assumes that worshippers selected gifts not only in accordance with those divine specializations but also based on preconceived notions of the gender roles of worshippers and deities alike. In addition, there is a tendency to deprive worshippers of their agency and, thus, their ability to shape their own dedicatory experience. This study reconsiders the role that worshippers played in the dedicatory process by reconceptualizing it as a series of choices. It thus considers the flexibility and limitation of ancient Greek dedicatory practices by identifying the factors that affected a worshipper's experience when offering gifts to divine beings. It also examines a wider range of sources, considering a fresh and broader selection of literary sources coupled with archaeological and epigraphical evidence. By bringing together material from the Geometric to the Hellenistic period from across the Greek world, this dissertation creates a more nuanced reconstruction of the dedicatory process and thus demonstrates that each worshipper had a unique dedicatory experience when offering a gift to a divine being. Factors that did restrict worshippers in their choices included regulations limiting access to sanctuaries and areas within them; personal aspects of worshippers, such as social status, membership in certain groups, and gender; and the inheritance of a vow. A careful review of the evidence suggests that notions of specialization and appropriateness were less limiting than previously thought. Worshippers could dedicate an offering of their choice to a deity or hero because these were flexible beings, capable of aiding worshippers in a variety of activities. Similarly, the gender of the worshipper and the deity did not necessarily dictate the choice of gift.

    Development and validation of an instrument to measure perceived service quality of an academic library in Costa Rica

    Service management involves the responsibility of ensuring the effectiveness of business operations in terms of meeting customer requirements. A good service is judged not only by whether it meets customer requirements but also by the way customers perceive and interpret the received service. To know how effective a service is, its quality can be measured. For this aim it is necessary to target the actual service elements to improve and to weigh the evaluation of service elements relative to the importance that customers place on them. The literature shows that service quality outcomes and their measurement depend on the type of service setting, situation, needs, and other factors. General instruments to measure perceived service were developed in the context of the main dimensions proposed by general service quality models. However, it is important to develop new instruments that are directly targeted to the contextual reality. Based upon conceptual models, the goal of this study is to target the actual service elements that customers of an academic library in Costa Rica deem important. Using the identified elements, the dimensions of service quality are developed and validated to measure user-perceived service. The study also discusses how appropriable knowledge of service quality can spur the innovative capacity to improve library services.
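The weighting idea described above (evaluating service elements relative to the importance customers place on them) can be sketched as an importance-weighted gap score in the SERVQUAL tradition. This is a minimal illustration, not the instrument developed in the study; the dimension names, ratings, and weights below are all hypothetical.

```python
# Hedged sketch: a SERVQUAL-style importance-weighted gap score.
# NOT the validated instrument from the study; all numbers are illustrative.
def weighted_gap_score(expectations, perceptions, weights):
    """Importance-weighted mean of (perception - expectation) gaps."""
    assert len(expectations) == len(perceptions) == len(weights)
    gaps = [(p - e) * w for e, p, w in zip(expectations, perceptions, weights)]
    return sum(gaps) / sum(weights)

# Illustrative 1-7 Likert ratings for three hypothetical dimensions
# (staff responsiveness, collection access, physical environment).
expect = [6.5, 6.0, 5.5]
perceive = [5.8, 6.2, 5.0]
importance = [0.5, 0.3, 0.2]
score = weighted_gap_score(expect, perceive, importance)
# A negative score indicates perceived service falls short of expectations.
```

A gap-based score of this kind is one common way to turn element-level ratings into a single quality indicator; the study's own dimensions would replace the placeholders.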

    Profile monitoring via sensor fusion: The use of PCA methods for multi-channel data

    Continuous advances in sensor technology and real-time computational capability are leading to data-rich environments that improve industrial automation and machine intelligence. When multiple signals are acquired from different sources (i.e., multi-channel signal data), two main issues must be faced: (i) the reduction of data dimensionality, to make the overall signal analysis system efficient and actually applicable in industrial environments, and (ii) the fusion of all the sensor outputs, to achieve a better comprehension of the process. In this frame, multi-way principal component analysis (PCA) is a multivariate technique that can perform both tasks. The paper investigates two main multi-way extensions of traditional PCA for multi-channel signals: one based on unfolding the original data set, and one based on multi-linear analysis of the data in their tensorial form. The approaches proposed for data modelling are combined with appropriate control charting to achieve multi-channel profile data monitoring. The developed methodologies are demonstrated with both simulated and real data. The real data come from an industrial sensor fusion application in waterjet cutting, where different signals are monitored to detect faults affecting the most critical machine components.
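The "unfolding" variant mentioned above can be sketched in a few lines: each multi-channel profile (channels × time) is vectorized into one long row, and ordinary PCA is then applied via SVD. The array shapes and random data below are illustrative, not the paper's waterjet data.

```python
import numpy as np

# Hedged sketch of unfolding-based multi-way PCA; data are synthetic.
rng = np.random.default_rng(0)
n_samples, n_channels, n_time = 30, 3, 50
X = rng.standard_normal((n_samples, n_channels, n_time))

X_unf = X.reshape(n_samples, n_channels * n_time)   # unfold: one row per sample
Xc = X_unf - X_unf.mean(axis=0)                     # mean-center each variable
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)   # PCA via SVD
scores = U * S                                      # principal-component scores
explained = S**2 / np.sum(S**2)                     # variance fraction per PC
```

In a monitoring context, the leading scores (or a T²/Q statistic built from them) would then feed the control charts; the tensorial variant would instead decompose X without flattening it.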

    Modeling spatial point processes in video-imaging via Ripley's K-function: an application to spatter analysis in additive manufacturing

    For an increasing number of applications, the quality and stability of manufacturing processes can be determined via image and video-image data analysis, and new techniques are required to extract and synthesize the relevant information enclosed in big sensor data to draw conclusions about the process and the final part quality. This paper focuses on video-image data where the phenomenon under study is captured by a point process whose spatial signature is of interest. A novel approach is proposed that combines spatial data modeling via Ripley's K-function with Functional Analysis of Variance (FANOVA), i.e., Analysis of Variance on functional data. The K-function synthesizes the spatial pattern information in a function while preserving the capability to capture changes in the process behavior. The method is applicable to quantities and phenomena that can be represented as clusters, or clouds, of spatial points evolving over time. In our case, the motivating case study concerns the analysis of spatter ejections caused by the laser-material interaction in Additive Manufacturing via Laser Powder Bed Fusion (L-PBF). The spatial spread of spatters, captured in the form of point particles through in-situ high-speed machine vision, can be used as a proxy to select the best conditions to avoid defects (pores) in the manufactured part. The proposed approach is shown to be not only an efficient way to translate the high-dimensional video-image data into a lower-dimensional format (the K-function curves), but also more effective than benchmark methods in detecting departures from a stable, in-control state.
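As a rough illustration of the K-function curves mentioned above, the sketch below implements a naive Ripley's K estimator (no edge correction) on a synthetic point cloud in the unit square; under complete spatial randomness K(r) is close to πr². The paper's actual pipeline (spatter point clouds from in-situ imaging, FANOVA on the resulting curves) is not reproduced here.

```python
import numpy as np

# Hedged sketch: naive Ripley's K estimator, ignoring edge correction.
def ripley_k(points, radii, area=1.0):
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                  # exclude self-pairs
    lam = n / area                               # estimated point intensity
    # K(r) = (1 / (lambda * n)) * number of ordered pairs closer than r
    return np.array([(d <= r).sum() / (lam * n) for r in radii])

rng = np.random.default_rng(1)
pts = rng.uniform(0, 1, size=(500, 2))           # roughly CSR in the unit square
radii = np.array([0.05, 0.10])
k_hat = ripley_k(pts, radii)                     # approx. pi * r**2 under CSR
```

Evaluating the estimator on a grid of radii turns each frame's point cloud into one functional observation, which is the low-dimensional format the abstract refers to.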

    Hierarchical metamodeling: Cross validation and predictive uncertainty

    At Esaform 2013 a hierarchical metamodeling approach was presented, able to combine the results of numerical simulations and physical experiments into a unique response surface, which is a "fusion" of both data sets. The method was presented with respect to the structural optimization of a steel tube, filled with an aluminium foam, intended as an anti-intrusion bar. The prediction yielded by a conventional way of metamodeling the results of FEM simulations can be considered trustworthy only if the accuracy of the numerical models has been thoroughly tested and the simulation parameters have been sufficiently calibrated. On the contrary, the main advantage of a hierarchical metamodel is to yield a reliable prediction of the response variable to be optimized, even in the presence of not completely calibrated or accurate FEM models. To demonstrate these statements, in this paper the authors compare the prediction ability of a "fusion" metamodel based on under-calibrated simulations with a conventional approach based on calibrated FEM results. Both metamodels are cross-validated with a "leave-one-out" technique, i.e., by excluding one experimental observation at a time and assessing the predictive ability of the model. Furthermore, the paper demonstrates how the hierarchical metamodel is able to provide not only an average estimated value for each excluded experimental observation, but also an estimate of the uncertainty of the prediction of the average value.
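The leave-one-out scheme described above can be sketched generically: refit the model on all observations but one, predict the held-out point, and collect the errors. A simple quadratic polynomial stands in for the metamodel here; the paper's actual fusion of FEM and experimental data is not reproduced.

```python
import numpy as np

# Hedged sketch of leave-one-out cross-validation with an illustrative
# polynomial surrogate (NOT the paper's hierarchical metamodel).
def loo_errors(x, y, degree=2):
    errors = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i                  # drop one observation
        coeffs = np.polyfit(x[mask], y[mask], degree)  # refit on the rest
        pred = np.polyval(coeffs, x[i])                # predict held-out point
        errors.append(y[i] - pred)
    return np.array(errors)

x = np.linspace(0, 1, 12)
y = 3 * x**2 - x + 0.05 * np.sin(20 * x)               # synthetic response
err = loo_errors(x, y)
rmse = np.sqrt(np.mean(err**2))                        # overall predictive error
```

The hierarchical metamodel adds to this a per-point predictive variance, so each held-out error can be judged against its own uncertainty band rather than only through the pooled RMSE.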

    Spatio-temporal Analysis of Thermal Profiles in Extrusion-based Additive Manufacturing

    Extrusion-based Additive Manufacturing (AM) processes have recently gained increasing attention in the scientific and industrial communities because of the wide range of processible materials (from thermoplastics to composites and biomaterials), printable volumes, and industrial applications. As with many other AM processes, persistent problems with process stability and repeatability still limit industrial adoption, as these problems can significantly impact the final part quality. In this framework, a recent research trend aims at developing in-situ monitoring solutions for in-line defect detection, in a zero-waste production perspective. Among the existing in-situ sensing techniques, many studies have shown that in-situ thermography represents a viable solution to describe the temperature dynamics and validate thermal models, but very few approaches have been proposed to quantitatively study the temperature evolution in order to quickly detect process instabilities. This paper presents a new approach to quickly analyse the temporal dynamics of temperature in the printed layer while providing a spatial mapping of temperature inhomogeneities. Compared with previous methods, the main novelty is the combination of both the spatial and the temporal signature in a synthetic mapping that makes it possible to detect unstable or unusual process behaviour. To show the effectiveness of the proposed solution, a real case study of Big Area Additive Manufacturing (BAAM) for composite materials is considered. The study shows that the proposed method can clearly enhance defect detection and represents a new solution for detecting anomalous areas where thermal profiles behave differently from the surrounding areas. The same methodology highlighted the complexity of the thermal evolution in the BAAM case study and enabled the detection of local flaws, i.e., hot and cold spots.
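The hot/cold-spot idea can be illustrated with a toy version of the mapping: collapse a thermal video into a per-pixel temporal mean and flag pixels whose mean deviates strongly from the frame-wide behaviour. The actual spatio-temporal statistic in the paper is more refined; the array shapes, temperatures, and threshold below are assumptions.

```python
import numpy as np

# Hedged sketch: flag thermally anomalous pixels in a synthetic thermal video.
rng = np.random.default_rng(2)
frames = 20.0 + rng.standard_normal((50, 64, 64))  # (time, height, width), deg C
frames[:, 30:33, 40:43] += 5.0                     # inject a 3x3 synthetic hot spot

mean_map = frames.mean(axis=0)                     # temporal mean per pixel
z = (mean_map - mean_map.mean()) / mean_map.std()  # standardized deviation map
hot = z > 3.0                                      # hot-spot mask
cold = z < -3.0                                    # cold-spot mask
```

Averaging over time suppresses frame-to-frame noise, so even a modest temperature offset in a few pixels stands out sharply in the standardized map.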

    Robust in-line qualification of lattice structures manufactured via laser powder bed fusion

    The shape complexity enabled by AM would require new part inspection systems (e.g., x-ray computed tomography), which translate into qualification times and costs that may not be affordable. However, the layerwise nature of the process potentially allows qualification tasks to be anticipated in-line and in-process, leading to quick detection of defects from their onset stage. This opportunity is particularly attractive for lattice structures, whose industrial adoption has considerably increased thanks to AM. This paper presents a novel methodology to model the quality of lattice structures at the unit-cell level while the part is being built, using high-resolution images of the powder bed for in-line geometry reconstruction and identification of deviations from the nominal shape. The methodology is designed to translate complex 3D shapes into 1D deviation profiles that capture the "geometrical signature" of the cell together with the reconstruction uncertainty.
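A minimal version of the 3D-to-1D reduction can be sketched on a single slice: measured boundary points of a nominally circular cross-section are expressed as radial deviations versus angle, yielding a 1D profile whose mean and spread summarize systematic error and measurement noise. The geometry and noise model below are purely illustrative, not the paper's reconstruction pipeline.

```python
import numpy as np

# Hedged sketch: collapse a 2D contour into a 1D radial-deviation profile.
rng = np.random.default_rng(3)
theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)  # sampling angles
r_nominal = 1.0                                          # nominal radius
# Synthetic "measured" contour: a 3-lobed form error plus measurement noise.
r_measured = r_nominal + 0.02 * np.sin(3 * theta) + 0.005 * rng.standard_normal(360)

deviation = r_measured - r_nominal   # the 1D "geometrical signature"
bias = deviation.mean()              # systematic over/under-sizing
spread = deviation.std()             # proxy for reconstruction uncertainty
```

Stacking such profiles layer by layer gives a compact per-cell record that can be monitored as the build progresses, instead of waiting for post-process CT.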

    Improved Signal Characterization via Empirical Mode Decomposition to Enhance in-line Quality Monitoring

    The machine tool industry is facing the need to increase the sensorization of production systems to meet evolving market demands. This leads to increasing interest in in-process monitoring tools that allow fast detection of faults and unnatural process behaviours during the process itself. Nevertheless, the analysis of sensor signals poses several challenges. One major challenge is the complexity of signal patterns, which often exhibit a multiscale content, i.e., a superimposition of both stationary and non-stationary fluctuations at different time-frequency levels. Among time-frequency techniques, Empirical Mode Decomposition (EMD) is a powerful method to decompose any signal into its embedded oscillatory modes in a fully data-driven way, without any ex-ante basis selection. Because of this, it might be used effectively for automated monitoring and diagnosis of manufacturing processes. Unfortunately, it usually yields an over-decomposition, with single oscillation modes split across more than one scale (an effect also known as "mode mixing"). The literature lacks effective strategies to automatically synthesize the decomposition into a minimal number of physically relevant and interpretable components. This paper proposes a novel approach to achieve a synthetic decomposition of complex signals through the EMD procedure. A new criterion is proposed to group together multiple components associated with a common time-frequency pattern, aimed at summarizing the information content into a minimal number of modes that may be easier to interpret. A real case study in waterjet cutting is presented to demonstrate the benefits and the critical issues of the proposed approach.
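The grouping step can be illustrated independently of any particular EMD implementation: given oscillatory components (e.g., IMFs from an EMD run), merge those whose dominant frequencies fall in the same band. The octave-based criterion below is an assumption for illustration, not the paper's actual grouping rule.

```python
import numpy as np

# Hedged sketch: merge components sharing a dominant-frequency band, as a
# stand-in for the paper's mode-grouping criterion.
def dominant_freq(component, fs):
    spectrum = np.abs(np.fft.rfft(component))
    freqs = np.fft.rfftfreq(len(component), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

def group_components(components, fs, tol_octave=0.5):
    """Merge components whose dominant frequencies are < tol_octave octaves apart."""
    comps = sorted(components, key=lambda c: dominant_freq(c, fs))
    groups = [[comps[0]]]
    for c in comps[1:]:
        f_prev = dominant_freq(groups[-1][-1], fs)
        f_cur = dominant_freq(c, fs)
        if np.log2(f_cur / f_prev) < tol_octave:
            groups[-1].append(c)          # same time-frequency pattern: merge
        else:
            groups.append([c])            # distinct mode: start a new group
    return [np.sum(g, axis=0) for g in groups]

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
# Three components; the first two share a ~50 Hz pattern (mode mixing).
c1 = np.sin(2 * np.pi * 50 * t) * (t < 0.5)
c2 = np.sin(2 * np.pi * 52 * t) * (t >= 0.5)
c3 = np.sin(2 * np.pi * 200 * t)
merged = group_components([c1, c2, c3], fs)   # two modes instead of three
```

Here the two mode-mixed 50 Hz fragments recombine into a single component, while the 200 Hz tone stays separate, which is the intended effect of the synthesis step.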

    Evaluating Legacy System Migration Technologies through Empirical Studies

    We present two controlled experiments conducted with master's students and practitioners, and a case study conducted with practitioners, to evaluate the use of MELIS (Migration Environment for Legacy Information Systems) for the migration of legacy COBOL programs to the web. MELIS has been developed as an Eclipse plug-in within a technology transfer project conducted with a small software company [16]. Over the last 30 years the partner company has developed and marketed several COBOL systems that need to be migrated to the web, due to increasing customer requests. The goal of the technology transfer project was to define a systematic migration strategy and the supporting tools to migrate these COBOL systems to the web, and to make the partner company an owner of the developed technology. The goal of the controlled experiments and case study was to evaluate the effectiveness of introducing MELIS in the partner company and to compare it with traditional software development environments. The results of the overall experimentation show that the use of MELIS increases productivity and reduces the gap between novice and expert software engineers.

    A techno-economic approach for decision-making in metal additive manufacturing: metal extrusion versus single and multiple laser powder bed fusion

    This work presents a decision-making methodology that allows quantitative and qualitative decision variables to be merged for selecting the optimal metal Additive Manufacturing (AM) technology. The approach is applied to two competing technologies in the metal AM industry, i.e., the metal extrusion AM process (metal FFF) and the Laser Powder Bed Fusion process (LPBF) with single and multiple lasers, which represents the benchmark solution currently on the market. A comprehensive techno-economic comparison is presented in which the two processes are analysed in terms of process capabilities (quality, ease of use, setup time, range of possible materials, etc.) and costs, considering two different production scenarios and different part geometries. In the first scenario, the AM system is assumed to be dedicated to a single part production, while in the second scenario the AM system is assumed to be saturated, as it is devoted to producing a wide mix of part types. For each scenario, two different part types made of 17–4 PH stainless steel are considered as a reference to investigate the effect of shape complexity, part size, and production times when choosing between metal FFF and LPBF. The first part type is an extrusion die, representing typical shapes of interest in the tooling industry, while the second is an impeller, which can be used in many different industrial sectors, ranging from oil and gas to aerospace. To include both quantitative and qualitative criteria, a decision-making model based on the Analytic Hierarchy Process (AHP) is proposed as the enabling tool. The proposed approach makes it possible to determine the most effective solution depending on the production configuration and part type, and can be used as a guideline and extended to include other technologies in the field of metal AM. On the other hand, a critical discussion of the selected criteria and the achieved results highlights the pros and cons of the competing technologies, thus identifying existing limits and directions for future research.
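The AHP step that merges quantitative and qualitative criteria can be sketched with the standard row-geometric-mean approximation of the priority vector. The comparison matrix below uses hypothetical judgements on Saaty's 1-9 scale, not the study's actual assessments of metal FFF versus LPBF.

```python
import numpy as np

# Hedged sketch of AHP priority derivation via normalized row geometric means
# (a common approximation of the principal eigenvector).
def ahp_priorities(pairwise):
    gm = np.prod(pairwise, axis=1) ** (1.0 / pairwise.shape[1])
    return gm / gm.sum()

# Illustrative criteria: cost, quality, setup time (hypothetical judgements).
A = np.array([
    [1.0,   3.0, 5.0],   # cost vs. quality, cost vs. setup time
    [1/3.0, 1.0, 3.0],
    [1/5.0, 1/3.0, 1.0],
])
w = ahp_priorities(A)    # weights sum to 1; cost ranks highest in this example
```

In a full AHP model the same procedure is repeated for the alternatives under each criterion, and the resulting local priorities are combined with the criterion weights to rank the technologies.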