
    Mining Historical Advertisements in Digitised Newspapers

    Historians have turned their focus to newspaper articles as a proxy for public discourse, while advertisements remain an understudied source of digitized information. This chapter shows how historians can use computational methods to work with extensive collections of advertisements. Firstly, it analyzes metadata to better understand the different types of advertisements, which come in a wide range of shapes and sizes. Information on the size and position of advertisements can be used to construct particular subsets of advertisements. Secondly, it describes how textual information can be extracted from historical advertisements, which can subsequently be used for a historical analysis of trends and particularities. For this purpose, we present a case study based on cigarette advertisements.
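    The following is a minimal sketch of the kind of metadata-driven subsetting and keyword filtering the chapter describes, assuming Python with pandas; the column names, thresholds, and sample records are invented for illustration and do not reflect the actual dataset.

        import pandas as pd

        # Hypothetical metadata table: one row per digitised advertisement,
        # with layout fields (names are illustrative only).
        ads = pd.DataFrame({
            "ad_id": [1, 2, 3, 4],
            "width_mm": [40, 120, 60, 250],
            "height_mm": [50, 180, 40, 300],
            "page": [1, 7, 3, 12],
            "ocr_text": [
                "Brand X cigarettes, mild and aromatic ...",
                "Steamship tickets to New York ...",
                "Brand Y cigarettes on sale ...",
                "Grand furniture clearance ...",
            ],
        })

        # Use size and page position to construct a subset, e.g. large
        # display advertisements appearing early in the newspaper.
        ads["area_cm2"] = ads["width_mm"] * ads["height_mm"] / 100
        display_ads = ads[(ads["area_cm2"] > 50) & (ads["page"] <= 8)]

        # A crude keyword filter on the OCR text as a starting point for a
        # trend analysis of cigarette advertising.
        cigarette_ads = display_ads[
            display_ads["ocr_text"].str.contains("cigarette", case=False)
        ]
        print(cigarette_ads[["ad_id", "area_cm2", "page"]])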

    Relative motion at the bone-prosthesis interface

    Bone ingrowth in porous surfaces of human joint implants is a desired condition for long-term fixation in patients who are physically active (such as in sport or work). It is generally recognized that little actual bone ingrowth occurs. The best clinical results report that between 10 and 20% of the total prosthetic surface in contact with bone features good bone ingrowth. One inhibiting factor is the relative motion of the bone with respect to the implant during load-bearing. This study mathematically investigated the interface micromotion (transverse reversible relative motion) between a flat metal tibial prosthetic surface of a prototype implant and the bone at the resection site. The aim was to assess the effect of perimeter fixation versus midcondylar pin fixation, and the effects of plate thickness and plate stiffness.

    Results showed that in the prototype design the largest reversible relative bone motion occurred at the tibial eminence. By design, the skirt fixation at the perimeter would prevent bone motion. A PCA (Howmedica Inc.) prosthesis, which has been widely used clinically, was chosen as a control because its fixation by two pegs beneath the condyles is a common variation on the general design of a relatively thick and stiff metal tibial support tray with pegs in each condylar area. The PCA tibial prosthesis showed the largest bone motion at the perimeter along the midcondylar mediolateral line, with zero motion at the pegs. Maximum relative bone motion was 37 μm for the prototype and 101 μm for the control. Averaged values showed the prototype to have 38% of the relative reversible bone motion of the control (PCA).
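    As a back-of-the-envelope check on the reported figures, the ratio of the peak micromotions can be computed directly; note that the 38% quoted above refers to averaged values over the interface, not to this ratio of maxima.

        # Maximum reversible relative bone motion reported above (micrometres).
        prototype_max_um = 37
        control_max_um = 101  # PCA control

        ratio_of_maxima = prototype_max_um / control_max_um
        print(f"Prototype peak micromotion is {ratio_of_maxima:.0%} of the control's peak")
        # Prints roughly 37%; the 38% figure in the text is based on averaged,
        # not peak, micromotion.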

    A multimodal turn in Digital Humanities. Using contrastive machine learning models to explore, enrich, and analyze digital visual historical collections

    Until recently, most research in the Digital Humanities (DH) was monomodal, meaning that the object of analysis was either textual or visual. Seeking to integrate multimodality theory into the DH, this article demonstrates that recently developed multimodal deep learning models, such as Contrastive Language-Image Pre-training (CLIP), offer new possibilities to explore and analyze image–text combinations at scale. These models, which are trained on image and text pairs, can be applied to a wide range of text-to-image, image-to-image, and image-to-text prediction tasks. Moreover, multimodal models show high accuracy in zero-shot classification, i.e. predicting unseen categories across heterogeneous datasets. Based on three exploratory case studies, we argue that this zero-shot capability opens the way for a multimodal turn in DH research. Furthermore, multimodal models allow scholars to move past the artificial separation of text and images that was dominant in the field and to analyze multimodal meaning at scale. However, we also need to be aware of the specific (historical) biases of multimodal deep learning that stem from biases in the data used to train these models.
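    A minimal zero-shot classification sketch along the lines the article describes, using the Hugging Face transformers interface to CLIP; the model checkpoint, candidate labels, and image path are illustrative assumptions rather than the authors' actual setup.

        from PIL import Image
        import torch
        from transformers import CLIPModel, CLIPProcessor

        model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
        processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

        image = Image.open("scanned_page_illustration.jpg")  # hypothetical input
        candidate_labels = [
            "a newspaper advertisement",
            "a political cartoon",
            "a portrait photograph",
            "a map",
        ]

        inputs = processor(text=candidate_labels, images=image,
                           return_tensors="pt", padding=True)
        with torch.no_grad():
            outputs = model(**inputs)

        # logits_per_image holds the similarity of the image embedding to each
        # text embedding; softmax turns these into pseudo-probabilities.
        probs = outputs.logits_per_image.softmax(dim=1).squeeze()
        for label, p in zip(candidate_labels, probs.tolist()):
            print(f"{label}: {p:.3f}")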

    An interferometric study of the post-AGB binary 89 Herculis. II Radiative transfer models of the circumbinary disk

    The presence of disks and outflows is widespread among post-AGB binaries. In the first paper of this series, a surprisingly large fraction of the optical light was found to be resolved in the 89 Her post-AGB system. The data showed this flux to arise from close to the central binary. Scattering off the inner rim of the circumbinary disk, or in a dusty outflow, were suggested as two possible origins. With detailed dust radiative transfer models of the disk we aim to discriminate between these two configurations. By including Herschel/SPIRE photometry, we extend the SED such that it now fully covers UV to sub-mm wavelengths. The MCMax radiative transfer code is used to create a large grid of disk models. Our models include a self-consistent treatment of dust settling as well as of scattering. A Si-rich composition with two additional opacity sources, metallic Fe or amorphous C, is tested. The SED is fit together with mid-IR (MIDI) visibilities as well as the optical and near-IR visibilities of Paper I, to constrain the structure of the disk and in particular of its inner rim. The near-IR visibility data require a smooth inner rim, here obtained with a two-power-law parameterization of the radial surface density distribution. A model can be found that fits all the IR photometric and interferometric data well, with either of the two continuum opacity sources. Our best-fit passive models are characterized by a significant amount of mm-sized grains, which are settled to the midplane of the disk. No single disk model fits our data at optical wavelengths, however, the reason being the opposing constraints imposed by the optical and near-IR interferometric data. A geometry in which a passive, dusty, puffed-up circumbinary disk is present can reproduce all the IR observations of 89 Her, but not the optical ones. Another dusty component, an outflow or halo, therefore needs to be added to the system.
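    To make the disk parameterization concrete, the sketch below implements an illustrative two-power-law radial surface density profile of the kind used to obtain a smooth inner rim; the functional form and all numerical values are placeholders and will differ from the actual MCMax setup of the paper.

        import numpy as np

        def surface_density(r_au, r_in=2.5, r_mid=6.0, r_out=1000.0,
                            p_in=2.0, p_out=-1.0, sigma_mid=100.0):
            """Illustrative two-power-law surface density (arbitrary units).

            Rises as r**p_in from the inner radius up to a turnover radius
            r_mid, then falls as r**p_out further out, so the inner rim is
            smooth rather than a sharp wall. All parameters are placeholders.
            """
            sigma = np.where(
                r_au < r_mid,
                sigma_mid * (r_au / r_mid) ** p_in,
                sigma_mid * (r_au / r_mid) ** p_out,
            )
            return np.where((r_au >= r_in) & (r_au <= r_out), sigma, 0.0)

        r = np.array([3.0, 6.0, 50.0])  # radii in AU
        print(surface_density(r))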

    Towards online relational schema transformations

    Current relational database systems are ill-equipped for changing the structure of data while the database is in use. This is a real problem for systems for which we expect 24/7 availability, such as telecommunication, payment, and control systems. As a result, developers tend to avoid making changes because of the downtime consequences. The urgency to solve this problem is evident from the multitude of tools developed in industry, such as pt-online-schema-change(1) and oak-online-alter-table(2). Also, MySQL recently added limited support for online schema changes(3).

    Contributions: We want to draw the attention of the database community to the problem of online schema changes. We have defined requirements for online schema change mechanisms, and we have experimentally investigated existing solutions. Our results show that current solutions are unsatisfactory for complex schema changes. We propose lazy schema changes as a solution.

    Experimental Setup: To assess the performance and behaviour of existing mechanisms for online schema changes, we have developed an experiment based on the standard TPC-C benchmark. For each of the relational schema transformation classes that we have identified, we chose a representative transformation for the TPC-C schema. We perform the schema change online while the TPC-C benchmark is running, and measure the impact on the TPC-C transaction throughput. We have performed our experiment on PostgreSQL, which does not support online schema changes, on MySQL, which supports basic online schema changes, and using pt-online-schema-change on MySQL, as a representative of tools that use triggers to allow online schema changes.

    Results: We found that existing solutions are inadequate except for the simplest of schema changes. Some single-relation transformations can be performed transactionally and online. However, existing solutions do not allow schema transformations to be composed using transactions. As a result, in complex transformations, intermediate states can be exposed to database programs, which are non-trivial to handle correctly. A secondary problem is that these solutions are much slower than offline transformations, which may not be acceptable for certain applications.

    Proposal: We propose a more fundamental solution based on lazy schema transformations. The main idea is that a schema change can be described as a view on the existing schema, which can be materialized lazily to perform the schema transformation. The data in the new schema is immediately accessible by computing parts of the view on demand. For a large number of cases we expect that this approach allows schema transformations without any downtime and with minimal impact on running transactions, while the ACID properties are maintained. Moreover, lazy transformations can naturally be composed as transactions, allowing complex online schema transformations. We are developing an implementation of these ideas based on a persistent functional language.
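    A toy sketch of the "schema change as a view" idea, using Python's built-in sqlite3 module purely for illustration; the authors' proposal targets online, transactional transformations in a persistent functional language, which plain SQLite views do not provide, and the example schema is invented.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
            CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, city TEXT);
            INSERT INTO customer VALUES (1, 'Alice', 'Utrecht'), (2, 'Bob', 'Enschede');

            -- The target schema splits addresses into their own relation.
            -- Expressed as views over the old schema, the new schema is
            -- queryable immediately, before any data has been copied.
            CREATE VIEW customer_v2 AS SELECT id, name FROM customer;
            CREATE VIEW address_v2  AS SELECT id AS customer_id, city FROM customer;
        """)

        # Applications can already read through the new schema ...
        print(con.execute("SELECT * FROM address_v2").fetchall())

        # ... while materialization into real tables can happen lazily or
        # incrementally in the background (done in one step here for brevity).
        con.executescript("CREATE TABLE address AS SELECT * FROM address_v2;")
        print(con.execute("SELECT * FROM address").fetchall())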