
    An n log n algorithm for determining the congruity of polyhedra

    This paper describes an algorithm for determining whether two polyhedra are congruent. The asymptotic time complexity of the algorithm is bounded by a constant times n log n, where n is the number of edges of the polyhedra. It is also shown that, under some conditions, the problem of partial congruity can be solved in O(n^2) time.
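
    The paper's polyhedral algorithm is not reproduced in the abstract, but the flavor of near-linear congruence testing can be illustrated with its classic 2D analogue: encode each polygon as a cyclic sequence of edge lengths and turn angles, and test whether one encoding is a cyclic rotation of the other. A minimal sketch (my illustration, not the paper's method; direct congruence only, no reflections):

        # Sketch (2D analogue, not the paper's polyhedral algorithm):
        # congruence of simple polygons via cyclic sequence matching.
        import math

        def encode(poly):
            """Cyclic (edge length, turn angle) signature, rounded to
            tolerate floating-point noise."""
            n = len(poly)
            sig = []
            for i in range(n):
                ax, ay = poly[i]
                bx, by = poly[(i + 1) % n]
                cx, cy = poly[(i + 2) % n]
                length = math.hypot(bx - ax, by - ay)
                turn = (math.atan2(cy - by, cx - bx)
                        - math.atan2(by - ay, bx - ax))
                sig.append((round(length, 9),
                            round(math.remainder(turn, math.tau), 9)))
            return sig

        def congruent(p, q):
            """True iff q's signature is a cyclic rotation of p's."""
            sp, sq = encode(p), encode(q)
            if len(sp) != len(sq):
                return False
            doubled = sp + sp  # "doubled list" trick for cyclic rotations
            return any(doubled[i:i + len(sq)] == sq for i in range(len(sp)))

    The scan above is kept simple and is quadratic in the worst case; with a linear-time string matcher such as KMP over the doubled signature, the whole test runs in O(n). The polyhedral case is harder because the edge structure forms a graph rather than a single cycle.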

    A reverse engineering approach to measure the deformations of a sailing yacht

    In this work, a multidisciplinary study aimed at measuring the permanent deformations of the hull of a regatta sailing yacht is described. In particular, a procedure was developed to compare two different surfaces of the hull of a small sailing yacht designed and manufactured at the University of Palermo. The first surface represents the original CAD model, while the second was obtained by means of a reverse engineering approach. The reverse engineering process was performed through an automatic close-range photogrammetry survey, which yielded very accurate measurements of the hull, followed by a 3D modelling step in the well-known 3D computer graphics software Rhinoceros. The reverse engineering model was checked through two different procedures implemented in the graphical algorithm editor Grasshopper. The first procedure compares the photogrammetric measurements with the rebuilt surface, in order to verify that the reverse engineering process led to reliable results. The second measures the deviations between the original CAD model and the rebuilt surface of the hull, making it possible to highlight any permanent deformation of the hull due to errors during the production phase or to excessive loads during use. The results demonstrate that the developed procedure is efficient and gives detailed information on the deviation values of the two compared surfaces.
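
    The deviation measurement between a surveyed point set and a reference surface can be approximated by nearest-neighbour distances once both are sampled as point clouds. A minimal sketch (my illustration, not the authors' Grasshopper procedure; file names are hypothetical):

        # Sketch: surface-deviation check via nearest-neighbour distances.
        # Assumes both the CAD model and the photogrammetric survey have
        # been sampled/exported as Nx3 point arrays; this is an
        # illustration, not the authors' Grasshopper pipeline.
        import numpy as np
        from scipy.spatial import cKDTree

        def deviations(reference_points, measured_points):
            """Distance from each measured point to its nearest reference point."""
            tree = cKDTree(reference_points)
            dists, _ = tree.query(measured_points)
            return dists

        # Hypothetical file names, for illustration only.
        ref = np.loadtxt("cad_hull_sample.xyz")
        mea = np.loadtxt("photogrammetry_hull.xyz")
        d = deviations(ref, mea)
        print(f"mean {d.mean():.3f} mm, max {d.max():.3f} mm, "
              f"95th pct {np.percentile(d, 95):.3f} mm")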

    Congruence Testing of Point Sets in 4-Space

    We give a deterministic O(n log n)-time algorithm to decide if two n-point sets in 4-dimensional Euclidean space are the same up to rotations and translations. It has been conjectured that O(n log n) algorithms should exist for any fixed dimension. The best algorithms in d-space so far are a deterministic algorithm by Brass and Knauer [Int. J. Comput. Geom. Appl., 2000] and a randomized Monte Carlo algorithm by Akutsu [Comp. Geom., 1998]. They take time O(n^2 log n) and O(n^(3/2) log n), respectively, in 4-space. Our algorithm exploits many geometric structures and properties of 4-dimensional space.
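
    For illustration (this is a necessary-condition filter, not the paper's decision algorithm): translations can be factored out by centring each set at its centroid, after which congruence requires equal multisets of point-to-centroid distances. The remaining hard part, deciding rotation equivalence in O(n log n), is the paper's contribution.

        # Necessary-condition filter for congruence up to rotation and
        # translation: centre both sets, compare sorted centroid distances.
        # Passing this test does NOT prove congruence; failing disproves it.
        import numpy as np

        def maybe_congruent(a, b, tol=1e-9):
            a = np.asarray(a, float) - np.mean(a, axis=0)  # remove translation
            b = np.asarray(b, float) - np.mean(b, axis=0)
            if a.shape != b.shape:
                return False
            ra = np.sort(np.linalg.norm(a, axis=1))  # rotation-invariant radii
            rb = np.sort(np.linalg.norm(b, axis=1))
            return bool(np.all(np.abs(ra - rb) <= tol))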

    Semantic processing with and without awareness. Insights from computational linguistics and semantic priming.

    During my PhD, I explored how native speakers access semantic information from lexical stimuli, and whether consciousness plays a role in the process of meaning construction. In a first study, I exploited the metaphor linking time and space to assess the specific contribution of linguistically-coded information to the emergence of priming. Time is metaphorically arranged on either the horizontal or the sagittal axis in space (Clark, 1973), but only the latter comes up in language (e.g., "a bright future in front of you"). In a semantic categorization task, temporal target words (e.g., earlier, later) were primed by spatial words that were processed either consciously (unmasked) or unconsciously (masked). With visible primes, priming was observed for both lateral and sagittal words; yet only the latter led to a significant effect when the primes were masked. Thus, unconscious word processing may be limited to those aspects of meaning that emerge in language use. In a second series of experiments, I tried to better characterize these aspects by taking advantage of Distributional Semantic Models (DSMs; Marelli, 2017), which represent word meaning as vectors built upon word co-occurrences in large textual databases. I compared state-of-the-art DSMs with Pointwise Mutual Information (PMI; Church & Hanks, 1990), a measure of local association between words that is based merely on their surface co-occurrence. In particular, I tested how the two indexes perform on a semantic priming dataset comprising visible and masked primes, and different stimulus onset asynchronies between the two stimuli. Subliminally, neither predictor alone elicited significant priming, although participants who showed some residual prime visibility showed larger effects. Post-hoc analyses showed that for subliminal priming to emerge, the additive contribution of both PMI and DSM was required. Supraliminally, PMI outperformed DSM in fitting the behavioral data. According to these results, what has traditionally been thought of as unconscious semantic priming may mostly rely on local associations based on shallow word co-occurrence. Of course, masked priming is only one possible way to model unconscious perception. In an attempt to provide converging evidence, I also tested overt and covert semantic facilitation by presenting prime words in the unattended vs. attended visual hemifield of brain-injured patients suffering from neglect. In seven sub-acute cases, the data show more solid PMI-based than DSM-based priming in the unattended hemifield, confirming the results obtained from healthy participants. Finally, in a fourth work package, I explored the neural underpinnings of semantic processing as revealed by EEG (Kutas & Federmeier, 2011). As the behavioral results of the previous study were much clearer when the primes were visible, I focused on this condition only. Semantic congruency was dichotomized in order to compare the ERPs evoked by related and unrelated pairs. Three different types of semantic similarity were taken into account: in a first category, primes and targets often co-occurred but were far apart in the DSM (e.g., cheese-mouse); in a second category, the two words were close in the DSM but not likely to co-occur (e.g., lamp-torch); as a control condition, we added a third category with pairs that were both high in PMI and close in the DSM (e.g., lemon-orange). Mirroring the behavioral results, we observed a significant PMI effect in the N400 time window; no such effect emerged for DSM. (A toy PMI computation is sketched after the references below.)
    References
    Church, K. W., & Hanks, P. (1990). Word association norms, mutual information, and lexicography. Computational Linguistics, 16(1), 22-29.
    Clark, H. H. (1973). Space, time, semantics, and the child. In Cognitive Development and the Acquisition of Language (pp. 27-63). Academic Press.
    Kutas, M., & Federmeier, K. D. (2011). Thirty years and counting: Finding meaning in the N400 component of the event-related brain potential (ERP). Annual Review of Psychology, 62, 621-647.
    Marelli, M. (2017). Word-Embeddings Italian Semantic Spaces: A semantic model for psycholinguistic research. Psihologija, 50(4), 503-520.
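
    For illustration, PMI (Church & Hanks, 1990) is the log ratio between the observed co-occurrence probability of two words and the probability expected if they were independent: PMI(x, y) = log2(p(x, y) / (p(x) p(y))). A minimal sketch with invented counts:

        # Pointwise Mutual Information from corpus counts
        # (Church & Hanks, 1990). Counts below are invented.
        import math

        def pmi(count_xy, count_x, count_y, total):
            p_xy = count_xy / total
            p_x, p_y = count_x / total, count_y / total
            return math.log2(p_xy / (p_x * p_y))

        # e.g. a pair co-occurring 25x more often than chance:
        print(pmi(count_xy=50, count_x=1_000, count_y=2_000,
                  total=1_000_000))  # log2(25) ~ 4.64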

    Analyzing Genre in Post-Millennial Popular Music

    This dissertation approaches the broad concept of musical classification by asking a simple if ill-defined question: “what is genre in post-millennial popular music?” Alternately covert or conspicuous, the issue of genre infects music, writings, and discussions of many stripes, and has become especially relevant with the rise of ubiquitous access to a huge range of musics since the fin du millénaire. The dissertation explores not just popular music made after 2000, but popular music as experienced and structured in the new millennium, including aspects from a wide chronological span of styles within popular music. Specifically, with the increase of digital media and the concomitant shifts in popular music creation, distribution, and access, popular music categorization has entered a novel space, with technologies like internet radio, streaming services, digital audio workstations, and algorithmic recommendations providing a new conception of how musical types might be understood and experienced. I attempt to conceptualize this novel space of genre with what I call a genre-thinking or a genreme, a term which is meant to capture the ways that musical categorization infiltrates writings about, experiences of, and the structures connecting genres. This dissertation comprises four main chapters, each of which takes a slightly different perspective and approach towards questions concerning genre in popular music of the post-millennial era. Chapter 1 provides a general survey and summary of music theory’s and musicology’s discourses on musical categorization and genre. After describing the “problem of genre,” I outline the main issues at stake and the chief strategies previous authors have employed. This involves describing the closely intertwined facets of the “who” of genre (is a musical category defined by music, a musician, an audience, the industry?) and the “how” of genre (is it a contract, a definition, a pattern, a system, an experience?). By asking these questions, I open new approaches to understanding and analyzing genre’s role in both the structure and potential experiences of post-millennial popular music. Chapter 2 takes on the digital compositional practice of mashups—most prevalent in the first decade of the 2000s—in an attempt to understand genre as a crucial element of meaning-formation and creation. Previous mashup scholars have tended to focus on the ironic, subversive, or humorous juxtapositions of the particular samples or artists which get layered together. However, this leaves out the broad, exceptionally potent acts of signification that are possible even when a listener lacks knowledge of the specific autosonic source materials. By incorporating methodologies from musical semiotics and topic theory, I create a field of “interaction methods” to explain the dynamic relations between samples, exploding the analytical potential for signification and collaboration in mashups. These interaction methods are placed in dialogue with formal analysis to show ways that artists, samples, and genres intermingle in this form of digital musicking. Chapters 3 and 4 then progress chronologically into the second decade of the new millennium, taking a twinned approach to our contemporary world of streaming services and online musical cultures. First, I pursue a brief musicological and sociological exploration of current discourses engaged with genre in the 2010s, outlining the ways that critics, fans, and musicians deploy stylistic terms and musical categories.
    A somewhat paradoxical position emerges in which genre is both in a state of decline and a state of proliferation, simultaneously atrophying yet employed in increasingly abundant and sophisticated manners. I then describe how this contradictory state fits into sociological research on “omnivorousness” and musical taste. The following chapter investigates how these perceptions and linguistic usages of genre compare to the two main ways that Spotify classifies its artists. This quantitative analysis reveals some potential systemic patterns of bias that shed light on genre’s paradoxical position: whether genre is dead or not depends on who is classifying the music and who is being classified. These two chapters map out my concept of “#genre,” which I employ to describe the multivalent genre-thinking we currently inhabit.

    A biomechanics-based articulation model for medical applications

    Computer graphics entered the medical world especially after the arrival of 3D medical imaging. Computer graphics techniques are already integrated into the diagnostic procedure through visual three-dimensional analysis of computed tomography, magnetic resonance and even ultrasound data. The representations they provide, nevertheless, are static pictures of the patient's body, lacking functional information. We believe that the next step in computer-assisted diagnosis and surgery planning depends on the development of functional 3D models of the human body. It is in this context that we propose a model of articulations based on biomechanics. Such a model is able to simulate joint functionality in order to allow for a number of medical applications. It was developed with the following requirements in mind: it must be simple enough to be implemented on a computer yet realistic enough to allow for medical applications, and it must be visual, so that applications can explore the joint in a 3D simulation environment. We therefore propose to combine kinematic motion for the parts that can be considered rigid, such as bones, with physical simulation of the soft tissues. We also deal with the interaction between the different elements of the joint, for which we propose a specific contact management model. Our kinematic skeleton is based on anatomy. Special care has been taken to include anatomical features such as axis displacements, range-of-motion control, and joint coupling. Once a 3D model of the skeleton is built, it can be driven by data coming from motion capture or specified by a specialist, a clinician for instance. Our deformation model is an extension of the classical mass-spring system. A spherical volume is considered around mass points, and mechanical properties of real materials can be used to parameterize the model. Viscoelasticity, anisotropy and non-linearity of the tissues are simulated. In particular, we propose a method to configure the mass-spring matrix such that the objects behave according to a predefined Young's modulus (a minimal sketch of this idea is given below). A contact management model is also proposed to deal with the geometric interactions between the elements inside the joint. After having tested several approaches, we propose a new method for collision detection which measures, in constant time, the signed distance to the closest point for each point of two meshes liable to collide. We also propose a method for collision response which acts directly on the surface geometry, such that the physical behavior relies on the propagation of reaction forces produced inside the tissue. Finally, we propose a 3D model of a joint combining the three elements: anatomical skeleton motion, biomechanical soft-tissue deformation, and contact management. On top of that we built a virtual hip joint and implemented a set of medical application prototypes. Such applications allow for assessment of stress distribution on the articular surfaces, range-of-motion estimation based on ligament constraints, ligament elasticity estimation from clinically measured range of motion, and pre- and post-operative evaluation of stress distribution. Although our model provides physicians with a number of useful variables for diagnosis and surgery planning, it should be improved for effective clinical use. Validation has been done partially; a global clinical validation is still necessary. Patient-specific data remain difficult to obtain, especially individualized mechanical properties of tissues. The characterization of material properties in our soft-tissue model could also be improved by including control over the shear modulus.
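
    One standard way to tie spring stiffness to a target Young's modulus, sketched here as a general illustration rather than the thesis's specific mass-spring matrix construction, is to treat each spring as a truss element, giving k = E * A / L for modulus E, cross-sectional area A and rest length L:

        # Sketch: explicit mass-spring step with stiffness derived from a
        # target Young's modulus via the truss relation k = E * A / L.
        # General idea only, not the thesis's matrix construction.
        import numpy as np

        def spring_stiffness(E, area, rest_length):
            """Axial stiffness of a spring standing in for a rod of modulus E [Pa]."""
            return E * area / rest_length

        def step(pos, vel, masses, springs, dt=1e-4, damping=0.2):
            """One explicit Euler step. springs: list of (i, j, rest_len, k)."""
            forces = np.zeros_like(pos)
            for i, j, rest, k in springs:
                d = pos[j] - pos[i]
                length = np.linalg.norm(d)
                f = k * (length - rest) * d / length  # Hooke's law along the spring
                forces[i] += f
                forces[j] -= f
            acc = forces / masses[:, None] - damping * vel
            vel = vel + dt * acc
            return pos + dt * vel, vel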

    Development and Validation of a Markerless Radiostereometric Analysis (RSA) System

    A markerless radiostereometric analysis (RSA) system was developed to measure three-dimensional (3D) skeletal kinematics using biplanar fluoroscopy. A virtual set-up was created, in which the fluoroscope foci and image planes were positioned. Computed tomography (CT) was used to create 3D bone models that were imported into the virtual set-up and manually moved until their projections, as viewed from the two foci, matched the two images. The accuracy of the markerless RSA system in determining relative shoulder kinematic translations and orientations was evaluated against the “gold standards” of a precision cross-slide table and a standard RSA system, respectively. Average root mean squared errors (RMSEs) of 0.082 mm and 1.18° were found. In an effort to decrease subjects' radiation exposure, the effect of lowering CT dosage on markerless RSA accuracy was evaluated. Acceptable accuracies were obtained using bone models derived from one-ninth of the normal radiation dose.
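
    The core geometric operation in such a virtual set-up is the central projection of bone-model points from each fluoroscope focus onto its image plane. A minimal sketch under an idealized pinhole assumption (real systems add calibration and distortion terms not described in the abstract):

        # Sketch: project 3D points from a fluoroscope focus onto a planar
        # detector (idealized pinhole geometry, no calibration/distortion).
        import numpy as np

        def project(points, focus, plane_point, plane_normal):
            """Intersect rays focus->point with the image plane; returns Nx3."""
            n = plane_normal / np.linalg.norm(plane_normal)
            rays = points - focus                         # ray directions
            t = ((plane_point - focus) @ n) / (rays @ n)  # ray parameter at plane
            return focus + t[:, None] * rays

        def rmse(a, b):
            """Root mean squared error between matched point sets (e.g. mm)."""
            return np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1)))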

    Multicriteria Decision Analysis in Improving Quality of Design in Femoral Component of Knee Prostheses: Influence of Interface Geometry and Material

    Knee prostheses, as medical products, require careful application of quality and design tools to ensure the best performance. Therefore, quality function deployment (QFD) was proposed as a quality tool to systematically integrate consumers' expectations with the needs perceived by the medical and design teams, and to explicitly address the translation of customer needs into engineering characteristics. In this study, a full factorial design of experiments (DOE) method was combined with finite element analysis (FEA) to evaluate the effect of the inner contours of the femoral component on the mechanical stability of the implant and on the biomechanical stresses within the implant components and adjacent bone areas, while preserving the outer contours, for a standard Co-Cr alloy and a promising functionally graded material (FGM). The ANOVA revealed that the inner shape of the femoral component influenced the performance measures, with the angle between the distal and anterior cuts and the angle between the distal and posterior cuts being the most influential factors. In the final ranking of alternatives, using multicriteria decision analysis (MCDA), the designs with FGM ranked ahead of the Co-Cr femoral components, and the original Co-Cr design was not the best choice of femoral component even among the designs with the same material.
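
    The final ranking step can be illustrated with the simplest MCDA scheme, a normalized weighted sum. The criteria, weights and scores below are invented for illustration; the study's actual MCDA variant and data are not given in the abstract:

        # Sketch: weighted-sum MCDA ranking of design alternatives.
        # All numbers are invented for illustration.
        import numpy as np

        criteria = ["stress shielding", "interface stress", "stability"]
        weights = np.array([0.4, 0.35, 0.25])     # must sum to 1
        benefit = np.array([False, False, True])  # True = higher is better

        designs = {"Co-Cr original": [1.9, 24.0, 0.80],
                   "Co-Cr modified": [1.7, 21.0, 0.78],
                   "FGM modified":   [1.2, 18.0, 0.83]}

        M = np.array(list(designs.values()), float)
        # Min-max normalize each criterion to [0, 1], flipping cost criteria.
        norm = (M - M.min(0)) / (M.max(0) - M.min(0))
        norm[:, ~benefit] = 1.0 - norm[:, ~benefit]
        scores = norm @ weights
        for name, s in sorted(zip(designs, scores), key=lambda t: -t[1]):
            print(f"{name}: {s:.3f}")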

    Load distribution in flat reciprocal structures

    The elements in conventional structures are arranged in a clear hierarchy, so that load transmission is logical and follows the usual structural order. In reciprocal structures, by contrast, each element has to support all of the others, in a less intuitive pattern of load transmission. The purpose of this paper is to understand exactly how load is transmitted between elements, quantifying this analytically by developing a new method applicable to a flat structure composed of a basic unit with any number of nexors. It is based on determining the increase in load to which the members in a reciprocal structure are subjected by calculating the coefficient k, or “transference coefficient”. The value of k, and therefore the load transferred between members, falls with the number of nexors, with the proximity of point loads to the exterior supports, and with the size of the central space in the structure.
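
    As a toy illustration of a transference coefficient (my derivation for an idealized symmetric unit, not the paper's general method): let each beam span L with its exterior support at x = 0 and its bearing point on the next beam at x = L, carry an external point load P at x = p, and receive the previous beam's reaction R at x = c. Taking moments about the exterior support gives R*L = P*p + R*c, and symmetry (each beam passes on the same reaction it receives) yields k = R/P = p/(L - c), which indeed shrinks as loads move toward the exterior supports:

        # Toy symmetric flat reciprocal unit (illustration only, not the
        # paper's method). Each beam: exterior support at x=0, rests on the
        # next beam at x=L, point load P at x=p, incoming reaction at x=c.
        #   Moments about x=0:  R*L = P*p + R*c   =>   k = R/P = p / (L - c)
        def transference_coefficient(p, c, L):
            """k = load passed to the next beam per unit of applied load."""
            assert 0 <= p <= L and 0 <= c < L
            return p / (L - c)

        # Loads nearer the exterior support (smaller p) transfer less load:
        print(transference_coefficient(p=0.3, c=0.6, L=1.0))  # 0.75
        print(transference_coefficient(p=0.1, c=0.6, L=1.0))  # 0.25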
