13 research outputs found

    Pygmy Myosotis morphology data Table 2

    No full text
    Field descriptions/definitions for the data worksheet

    Supplementary Table 2

    No full text
    Comparison of characters differentiating several putative Myosotis taxa from the M. pygmaea species group based on herbarium specimen data. For explanation of character numbers see Table 2

    Supplementary Table 4

    No full text
    Comparison of characters of common garden specimens of Myosotis pygmaea species group individuals; values given are mean (range). Measurements taken from live plants. See Table 2 for explanation of character numbers. (R) = reproductive, (V) = vegetative. P values that are significant at P < 0.01 are denoted with an asterisk (“*”), including calyx length at fruiting between M. drucei and M. pygmaea (compare with Table S7)

    Supplementary Table 3

    No full text
    Comparison of herbarium specimen data showing differences between selected species and some putative Myosotis taxa within the M. pygmaea species group

    Data from: Bolstering species delimitation in difficult species complexes by analyzing herbarium and common garden morphological data: a case study using the New Zealand native Myosotis pygmaea species group (Boraginaceae)

    No full text
    Species delimitation in recent radiations is challenging because these species often display overlap in their expression of morphological characters. Here we analyze morphological characters measured from field-collected herbarium specimens and compare them to measurements from live plants grown in a common garden to determine reliable characters that could be used to delimit species in the Myosotis pygmaea (Boraginaceae) species group in New Zealand. This species complex is of primary interest because it includes many threatened species as well as several taxonomically indeterminate entities. The common garden experiment revealed high levels of morphological plasticity within the M. pygmaea species group, as plants in the common garden grew to be strikingly larger than those in the field. The M. pygmaea species complex was found to be a morphologically definable group, and several taxonomically indeterminate entities were placed as morphologically similar either to the M. pygmaea species group or to other species complexes. In multidimensional scaling analyses of morphological data, of the five named species that make up the M. pygmaea species group, three formed separate clusters (M. pygmaea, M. glauca, and M. brevis), and the two others were indistinguishable from each other (M. antarctica and M. drucei). This study represents an important step towards a planned integrative taxonomic revision of the M. pygmaea species group, and highlights the value of morphological data collected from a common garden experiment

    Pygmy Myosotis morphological data

    No full text
    Dataset of morphological characters measured on pygmy Myosotis and other bracteate-prostrate Myosotis species

    The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction

    No full text
    In this paper, we present a multimodal dataset for affective computing research acquired in a human-computer interaction (HCI) setting. An experimental mobile and interactive scenario was designed and implemented based on a gamified generic paradigm for the induction of dialog-based, HCI-relevant emotional and cognitive load states. It consists of six experimental sequences, inducing Interest, Overload, Normal, Easy, Underload, and Frustration. Each sequence is followed by subjective feedback to validate the induction, a respiration baseline to level off the physiological reactions, and a summary of results. Further, prior to the experiment, three questionnaires related to emotion regulation (ERQ), emotional control (TEIQue-SF), and personality traits (TIPI) were collected from each subject to evaluate the stability of the induction paradigm. Based on this HCI scenario, the University of Ulm Multimodal Affective Corpus (uulmMAC), consisting of two homogeneous samples of 60 participants and 100 recording sessions, was generated. We recorded 16 sensor modalities, including 4 × video, 3 × audio, and 7 × biophysiological, depth, and pose streams; additional labels and annotations were also collected. After recording, all data were post-processed and checked for technical and signal quality, resulting in the final uulmMAC dataset of 57 subjects and 95 recording sessions. The evaluation of the reported subjective feedback shows significant differences between the sequences, consistent with the induced states, and the analysis of the questionnaires shows stable results. In summary, our uulmMAC database is a valuable contribution to the field of affective computing and multimodal data analysis: acquired in a mobile interactive scenario close to real HCI, it comprises a large number of subjects and allows transtemporal investigations. Validated via subjective feedback and checked for quality issues, it can be used for affective computing and machine learning applications