
    Manual of Criminal Law and Procedure

    Intended to aid Alaska law enforcement officers in the performance of their duties in the field, this manual was designed to provide brief, quick access to major points of substantive and procedural criminal law. The manual contained discussion and procedural guidelines for investigatory stops, identification procedures including line-ups, arrest, search and seizure, and interrogation, as well as discussion of justification for the use of nondeadly and deadly force whether by peace officers or civilians, culpability, entrapment, trial preparation, and media relations. The section on substantive criminal law deals with a selection of crimes most likely to be encountered by "street" officers, as defined in the recently enacted Revised Alaska Criminal Code (effective January 1, 1980), describing the elements of each crime, investigative hints, and, where relevant, differences from previous provisions of the criminal code.
    Alaska Department of Law Grant No. 78-A-014
    Contents: Introduction / Criminal Procedures / Substantive Criminal Law / Justification / Culpability / Entrapment / Trial Preparation / Media Relations / Appendices

    Comparing Cognitive Theories of Learning Transfer to Advance Cybersecurity Instruction, Assessment, and Testing

    The cybersecurity threat landscape evolves quickly, continually, and consequentially, which makes the transfer of cybersecurity learning crucial. We compared how different recognized “cognitive” transfer theories might help explain and synergize three aspects of cybersecurity education: teaching and training in diverse settings, assessing learning formatively and summatively, and testing and measuring achievement, proficiency, and readiness. We excluded newer sociocultural theories and their implications for inclusion, as we explore those theories elsewhere. We first summarized the history of cybersecurity education and proficiency standards in light of transfer theories. We then explored each theory and reviewed the most relevant cybersecurity education research; in some cases, we broadened our search to computing education. We concluded that (a) archaic differential transfer theories are still influential but have negative implications to be avoided, (b) constructionist theories are popular in K-12 settings but raise issues for assessment and transfer, (c) many embrace a general cognitive science perspective that can resolve tensions between modern cognitive-associationist and cognitive-constructivist theories that are popular with innovators, and (d) new perceptual and coordinative theories have potential worth exploring. These insights should support “generative” cybersecurity learning that transfers readily and widely to future classes, tests, and workplaces. They should also be beneficial when designing and using cyber “ranges” and other hyper-realistic simulations, where transfer assumptions inform costly design decisions and undergird the validity of performance as evidence of proficiency.

    The Importance of extraction protocol on the analysis of novel waste sources of lignocellulosic biomass

    As the utilization and consumption of lignocellulosic biomass increases, so too will the need for an adequate supply of feedstock. To meet these needs, novel waste feedstock materials will need to be utilized. Exploitation of these novel feedstocks will require information both on the effects of solvent extraction on the subsequent analysis of potential novel feedstocks and on how accurate current methodologies are in determining the composition of novel lignocellulosic feedstocks, particularly the carbohydrate and lignin fractions. In this study, the effects of extraction with 95% ethanol, water, and both sequentially were examined on novel feedstocks including tree foliage, tree bark and spent mushroom compost. Chemical analyses were carried out to determine the moisture content, ash, extractives, post-hydrolysis sugars, Klason lignin (KL) and acid-soluble lignin (ASL) within the selected feedstocks. The effect of extraction was seen most strongly for Klason lignin, with a strong association between higher Klason lignin levels and greater amounts of non-removed extractives (tree foliage and bark). Higher Klason lignin levels are reported to be due to the condensation of non-removed extractives during hydrolysis; hence, the lower Klason lignin determinations obtained following extraction are the more accurate ones. In addition, total sugar determinations were lower following extraction, because non-cell-wall carbohydrates are soluble; thus, the determinations following extraction are more accurate representations of structural cell-wall polysaccharides such as cellulose. Such determinations will assist in determining the best way to utilize novel feedstocks such as those analyzed in this work.
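
    The mass-balance reasoning above can be made concrete with a toy calculation. The sketch below uses entirely made-up numbers and a hypothetical condensed_fraction parameter; it is only meant to illustrate why non-removed extractives inflate a Klason lignin determination and why the post-extraction value is the more accurate one.

    # Toy mass balance (hypothetical numbers) showing how non-removed
    # extractives inflate a Klason lignin (KL) determination.
    def percent_of_sample(mass_g, sample_g):
        """Express a constituent mass as a percentage of the oven-dry sample."""
        return 100.0 * mass_g / sample_g

    sample_g = 1.000          # oven-dry biomass analysed, g
    true_lignin_g = 0.250     # structural (cell-wall) lignin, g
    extractives_g = 0.120     # waxes, phenolics, non-cell-wall sugars, g
    condensed_fraction = 0.5  # assumed share of extractives condensing during hydrolysis

    # Unextracted sample: part of the extractives condenses during acid
    # hydrolysis and is weighed together with the Klason lignin residue.
    kl_unextracted = percent_of_sample(
        true_lignin_g + condensed_fraction * extractives_g, sample_g)

    # Extracted sample: extractives were removed beforehand, so the residue
    # reflects structural lignin only (reported on a whole-biomass basis).
    kl_extracted = percent_of_sample(true_lignin_g, sample_g)

    print(f"KL without prior extraction: {kl_unextracted:.1f}% of dry matter")
    print(f"KL after extraction:         {kl_extracted:.1f}% of dry matter")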

    Genomic selection using random regressions on known and latent environmental covariates

    Key message: The integration of known and latent environmental covariates within a single-stage genomic selection approach provides breeders with an informative and practical framework to utilise genotype by environment interaction for prediction into current and future environments.
    This paper develops a single-stage genomic selection approach which integrates known and latent environmental covariates within a special factor analytic framework. The factor analytic linear mixed model of Smith et al. (2001) is an effective method for analysing multi-environment trial (MET) datasets, but it has limited practicality since the underlying factors are latent, so the modelled genotype by environment interaction (GEI) is observable rather than predictable. The advantage of using random regressions on known environmental covariates, such as soil moisture and daily temperature, is that the modelled GEI becomes predictable. The integrated factor analytic linear mixed model (IFA-LMM) developed in this paper includes a model for predictable and observable GEI in terms of a joint set of known and latent environmental covariates. The IFA-LMM is demonstrated on a late-stage cotton breeding MET dataset from Bayer CropScience. The results show that the known covariates predominantly capture crossover GEI and explain 34.4% of the overall genetic variance. The most notable covariates are maximum downward solar radiation (10.1%), average cloud cover (4.5%) and maximum temperature (4.0%). The latent covariates predominantly capture non-crossover GEI and explain 40.5% of the overall genetic variance. The results also show that the average prediction accuracy of the IFA-LMM is [Formula: see text] higher than that of conventional random regression models for current environments and [Formula: see text] higher for future environments. The IFA-LMM is therefore an effective method for analysing MET datasets which also utilises crossover and non-crossover GEI for genomic prediction into current and future environments. This is becoming increasingly important with the emergence of rapidly changing environments and climate change.
    Supplementary information: The online version contains supplementary material available at 10.1007/s00122-022-04186-w.
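
    To make the distinction between observable and predictable GEI concrete, the toy simulation below sketches the structure the abstract describes: genotype-specific random regressions on known environmental covariates plus a factor analytic term for latent covariates. All dimensions, variable names and effect distributions are illustrative assumptions, not the IFA-LMM's actual specification or fitting procedure.

    import numpy as np

    rng = np.random.default_rng(0)
    n_geno, n_env, n_cov, n_fac = 50, 8, 3, 2

    W = rng.normal(size=(n_env, n_cov))   # known covariates per environment (scaled)
    B = rng.normal(size=(n_geno, n_cov))  # genotype slopes on known covariates
    L = rng.normal(size=(n_env, n_fac))   # latent environmental covariates (loadings)
    F = rng.normal(size=(n_geno, n_fac))  # genotype scores on the latent factors

    # Genotype-by-environment effects for the observed environments: the first
    # term is driven by known covariates, the second by latent covariates.
    G = B @ W.T + F @ L.T                 # n_geno x n_env

    # For a new environment only its known covariates are available, so only
    # the regression part of the GEI can be predicted; the latent part cannot
    # be estimated without trial data from that environment.
    w_new = rng.normal(size=n_cov)
    g_pred_new = B @ w_new                # predictable component, one value per genotype

    print(G.shape, g_pred_new.shape)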

    Genomic selection strategies for clonally propagated crops

    For genomic selection (GS) in clonal breeding programs to be effective, parents should be selected based on genomic predicted cross-performance unless dominance is negligible. Genomic prediction of cross-performance enables efficient exploitation of the additive and dominance value simultaneously. Here, we compared different GS strategies for clonally propagated crops with diploid (or diploid-like) meiotic behavior, using strawberry as an example. We used stochastic simulation to evaluate six combinations of three breeding programs and two parent selection methods. The three breeding programs included (1) a breeding program that introduced GS in the first clonal stage, and (2) two variations of a two-part breeding program with one and three crossing cycles per year, respectively. The two parent selection methods were (1) parent selection based on genomic estimated breeding values (GEBVs) and (2) parent selection based on genomic predicted cross-performance (GPCP). Selection of parents based on GPCP produced faster genetic gain than selection based on GEBVs because it reduced inbreeding as the dominance degree increased. The two-part breeding programs with one and three crossing cycles per year using GPCP always produced the most genetic gain unless dominance was negligible. We conclude that (1) in clonal breeding programs with GS, parents should be selected based on GPCP, and (2) a two-part breeding program with parent selection based on GPCP to rapidly drive population improvement has great potential to improve the breeding of clonally propagated crops.
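
    As an illustration of the difference between the two parent selection criteria, the sketch below computes a mid-parent GEBV (additive effects only) and a genomic predicted cross-performance for a single cross of biallelic-marker genotypes under Mendelian segregation with additive and dominance effects. The marker coding, effect distributions and function names are assumptions for illustration, not the simulation software or exact formulation used in the study.

    import numpy as np

    rng = np.random.default_rng(1)
    n_loci = 200
    p1 = rng.integers(0, 3, size=n_loci)    # parent 1 genotypes, coded 0/1/2 allele counts
    p2 = rng.integers(0, 3, size=n_loci)    # parent 2 genotypes
    a = rng.normal(size=n_loci)             # additive marker effects
    d = rng.normal(scale=0.5, size=n_loci)  # dominance marker effects

    def cross_genotype_probs(g1, g2):
        """Per-locus progeny genotype probabilities P(0), P(1), P(2) for a cross."""
        q1, q2 = g1 / 2.0, g2 / 2.0         # probability each parent transmits the '1' allele
        pr2 = q1 * q2
        pr0 = (1 - q1) * (1 - q2)
        return pr0, 1 - pr0 - pr2, pr2

    def gpcp(g1, g2, a, d):
        """Expected progeny genotypic value of the cross (additive + dominance)."""
        pr0, pr1, pr2 = cross_genotype_probs(g1, g2)
        return float(np.sum(pr0 * (-a) + pr1 * d + pr2 * a))

    def gebv(g, a):
        """Genomic estimated breeding value from additive effects, centred coding."""
        return float(np.sum((g - 1) * a))

    print("GPCP of cross:   ", gpcp(p1, p2, a, d))
    print("Mid-parent GEBV: ", 0.5 * (gebv(p1, a) + gebv(p2, a)))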

    A Unifying Model of Genome Evolution Under Parsimony

    We present a data structure called a history graph that offers a practical basis for the analysis of genome evolution. It conceptually simplifies the study of parsimonious evolutionary histories by representing both substitutions and double cut and join (DCJ) rearrangements in the presence of duplications. The problem of constructing parsimonious history graphs thus subsumes related maximum parsimony problems in the fields of phylogenetic reconstruction and genome rearrangement. We show that tractable functions can be used to define upper and lower bounds on the minimum number of substitutions and DCJ rearrangements needed to explain any history graph. These bounds become tight for a special type of unambiguous history graph called an ancestral variation graph (AVG), which constrains in its combinatorial structure the number of operations required. Finally, we demonstrate that for a given history graph G, a finite set of AVGs describes all parsimonious interpretations of G, and that this set can be explored with a few sampling moves.
    Comment: 52 pages, 24 figures
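
    The abstract names three ingredients a history graph ties together: segment labels (for substitutions), ancestor/descendant branches (for duplications and descent) and adjacencies between segment sides (for DCJ rearrangements). The sketch below is a minimal, hypothetical encoding of those ingredients only; it is not the paper's formal definition of a history graph or an AVG.

    from dataclasses import dataclass, field
    from typing import Optional

    LEFT, RIGHT = 0, 1  # the two sides of a segment

    @dataclass
    class Segment:
        label: Optional[str] = None          # base(s) carried by this segment, if labelled
        parent: Optional["Segment"] = None   # branch to the ancestral segment
        children: list = field(default_factory=list)
        adjacencies: dict = field(default_factory=dict)  # side -> (other segment, other side)

        def add_child(self, child: "Segment") -> None:
            child.parent = self
            self.children.append(child)

        def connect(self, side: int, other: "Segment", other_side: int) -> None:
            # An adjacency joins one side of this segment to a side of another;
            # a DCJ operation cuts and rejoins such adjacencies.
            self.adjacencies[side] = (other, other_side)
            other.adjacencies[other_side] = (self, side)

    # Example: an ancestral segment duplicated into two descendants, one of
    # which carries a substitution (a different label), with the descendants
    # joined by an adjacency.
    anc = Segment(label="A")
    left_copy, right_copy = Segment(label="A"), Segment(label="G")
    anc.add_child(left_copy)
    anc.add_child(right_copy)
    left_copy.connect(RIGHT, right_copy, LEFT)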