995 research outputs found

    Analysis of Synthetic and Natural Cannabinoids in the Forensic Field Applying High-Resolution Mass Spectrometry

    Multidimensional challenges arise in the field of forensic chemistry and toxicology from the ongoing emergence of synthetic cannabinoids (SCs) as well as the increasing legalization and medicalization of Cannabis sativa (C. sativa). This work addresses these challenges from different angles through the application of state-of-the-art mass spectrometry. "Phase I In vitro Metabolic Profiling of the Synthetic Cannabinoid Receptor Agonists CUMYL-THPINACA and ADAMANTYL-THPINACA" (study I) investigated the in vitro metabolic fate of two SCs. As data on the metabolism of newly emerging SCs are typically scarce, in vitro metabolism studies are required to identify suitable screening targets. An in silico assisted workflow aided the identification and structure elucidation of metabolites. Both SCs were found to be extensively metabolized, and suitable screening targets were proposed. Additionally, investigation of the cytochrome P450 (CYP) isoenzymes involved provided valuable information on potential metabolic drug-drug interactions and the possible influence of CYP polymorphisms. "Adulteration of low-delta-9-tetrahydrocannabinol products with synthetic cannabinoids: Results from drug checking services" (study II) presents data on the phenomenon of low-THC cannabis products adulterated with SCs. Since 2020, such products have increasingly been detected in Switzerland and various European countries. Users' unawareness of the presence of SCs, combined with the typically higher potencies of SCs compared to Δ9-tetrahydrocannabinol (THC), has raised public health concerns. Cannabis samples and data on the drugs' effects obtained from three drug checking services were investigated. A comprehensive screening method for SCs applying high-resolution mass spectrometry (HRMS) was developed and validated, and the carrier material was characterized with regard to its THC and cannabidiol (CBD) contents. Data obtained from the drug checking services included user self-reports of adverse effects after consumption of the respective adulterated and non-adulterated cannabis products. Products containing SCs carried increased risks of adverse effects, in particular cardiovascular and psychological effects, compared to regular cannabis products. The role of drug checking services as a market monitoring tool and as a source of information on the effects of emerging new psychoactive substances (NPS) was highlighted. "Beyond Δ9-tetrahydrocannabinol and cannabidiol: Chemical differentiation of cannabis varieties applying targeted and untargeted analysis" (study III) presents the development and validation of a comprehensive analytical method for the determination of major and minor cannabinoids in cannabis inflorescences. Minor cannabinoids are gaining interest for various applications, ranging from improved product characterization and differentiation of cannabis varieties to bioanalytical questions in the medico-legal field. Samples derived from 18 cannabis varieties grown and stored under standardized conditions were characterized by targeted and untargeted analyses using HRMS. Multivariate statistics, e.g. principal component analysis, were applied to investigate similarities and differences between varieties. The presented methods allowed for a refined representation of the chemical differences, i.e. chemical fingerprints, between varieties, expanding traditionally applied classification systems based on THC and CBD alone.
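
    For readers unfamiliar with the multivariate step mentioned in study III, the following minimal Python sketch illustrates how a matrix of cannabinoid concentrations could be autoscaled and projected onto its first two principal components to compare varieties. The toy data, feature list, and library choices (NumPy, scikit-learn) are assumptions made for illustration and do not reproduce the study's actual pipeline.

```python
# Minimal sketch: PCA on a hypothetical cannabinoid concentration matrix.
# Sample values and feature names are invented for illustration only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows = inflorescence samples, columns = cannabinoid concentrations (e.g. mg/g).
features = ["THC", "CBD", "CBG", "CBC", "THCV", "CBN"]
X = np.array([
    [180.0,  2.1, 5.3, 1.2, 0.8, 0.4],   # variety A, sample 1
    [175.5,  2.4, 5.0, 1.1, 0.9, 0.5],   # variety A, sample 2
    [ 60.2, 90.7, 3.8, 2.5, 0.3, 0.2],   # variety B, sample 1
    [ 58.9, 93.1, 4.1, 2.3, 0.4, 0.3],   # variety B, sample 2
])

# Autoscale so minor cannabinoids contribute despite low absolute levels.
X_scaled = StandardScaler().fit_transform(X)

# Project onto the first two principal components (the "chemical fingerprint" view).
pca = PCA(n_components=2)
scores = pca.fit_transform(X_scaled)

print("explained variance ratio:", pca.explained_variance_ratio_)
for sample, (pc1, pc2) in zip(["A1", "A2", "B1", "B2"], scores):
    print(f"{sample}: PC1={pc1:+.2f}, PC2={pc2:+.2f}")
```

    In such a score plot, samples of the same variety would be expected to cluster together, which is the intuition behind comparing chemical fingerprints rather than THC and CBD contents alone.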

    TAKSONOMIJA METODA AKADEMSKOG PLAGIRANJA (A Taxonomy of Academic Plagiarism Methods)

    The article gives an overview of the plagiarism domain, with a focus on academic plagiarism. It defines plagiarism, explains the origin of the term as well as plagiarism-related terms, identifies the extent of the plagiarism domain, and then focuses on the subdomain of text documents, for which it reviews current classifications and taxonomies and proposes a more comprehensive classification according to several criteria: origin and purpose, technical implementation, consequences, complexity of detection, and the number of linguistic sources. The article proposes a new classification of academic plagiarism; describes the sorts and methods of plagiarism, its types and categories, and the approaches and phases of plagiarism detection; and classifies the methods and algorithms for plagiarism detection. Although the title explicitly targets the academic community, the article is sufficiently general and interdisciplinary to be useful to many other professionals, such as software developers, linguists, and librarians.

    Fingerprinting: A Study in Cognitive Bias and its Effects on Latent Fingerprint Analysis

    The forensic science world contains a variety of disciplines that cover a wide range of sciences, from chemistry to physics to biology. Today, DNA is considered the "gold standard" of forensics because of the amount of information that can be collected from it. Before DNA, however, fingerprinting was the pinnacle of forensic science because of the uniqueness of each print and its unchanging nature. The different patterns, along with the smaller ridge details, allow examiners to categorize and differentiate between two similar fingerprints using the standard ACE-V method adopted by all fingerprint examiners. Experts long believed that fingerprints were infallible and could therefore always be relied upon in court. This changed when the practice came under scrutiny due to various cases in which errors occurred within the fingerprint method. Fingerprint examination is a subjective process that leaves room for potential bias to affect the examiner. The biases that can affect fingerprint examiners can be both conscious and subconscious, making them difficult to avoid. Cognitive neuroscientist Dr. Itiel Dror explains that bias is part of the human experience and cannot be prevented, only lessened. In addition, there are many other shortcomings within the fingerprinting method that have been studied to determine how and why they occur. Through these studies, various solutions, including a blind-verification method, have been developed to address these issues. The cases of the Madrid bombing and Shirley McKie are two prime examples of the mistakes that can be made when bias is allowed to interfere.

    New Statistical Algorithms for the Analysis of Mass Spectrometry Time-Of-Flight Mass Data with Applications in Clinical Diagnostics

    Mass spectrometry (MS) based techniques have emerged as a standard for large-scale protein analysis. The ongoing progress in terms of more sensitive machines and improved data analysis algorithms has led to a constant expansion of its fields of application. Recently, MS was introduced into clinical proteomics with the prospect of early disease detection using proteomic pattern matching. Analyzing biological samples (e.g. blood) by mass spectrometry generates mass spectra that represent the components (molecules) contained in a sample as masses and their respective relative concentrations. In this work, we are interested in those components that are constant within a group of individuals but differ markedly between individuals of two distinct groups. These distinguishing components, which depend on a particular medical condition, are generally called biomarkers. Since not all biomarkers found by the algorithms are of equal (discriminating) quality, we are only interested in a small biomarker subset that, as a combination, can be used as a fingerprint for a disease. Once a fingerprint for a particular disease (or medical condition) is identified, it can be used in clinical diagnostics to classify unknown spectra. In this thesis we have developed new algorithms for the automatic extraction of disease-specific fingerprints from mass spectrometry data. Special emphasis has been put on designing highly sensitive methods with respect to signal detection. Thanks to our statistically based approach, our methods are able to detect signals, such as hormones, even below the noise level inherent in data acquired by common MS machines. To provide collaborating groups with access to these new classes of algorithms, we have created a web-based analysis platform that provides all necessary interfaces for data transfer, data analysis and result inspection. To prove the platform's practical relevance, it has been utilized in several clinical studies, two of which are presented in this thesis. These studies showed that our platform is superior to commercial systems with respect to fingerprint identification. As an outcome of these studies, several fingerprints for different cancer types (bladder, kidney, testicle, pancreas, colon and thyroid) have been detected and validated. The clinical partners in fact emphasize that these results would have been impossible with a less sensitive analysis tool (such as the currently available systems). In addition to the issue of reliably finding and handling signals in noise, we faced the problem of handling very large amounts of data, since an average dataset for one individual is about 2.5 gigabytes in size and we have data from hundreds to thousands of persons. To cope with these large datasets, we developed a new framework for a heterogeneous (quasi) ad-hoc Grid - an infrastructure that allows the integration of thousands of computing resources (e.g. desktop computers, computing clusters or specialized hardware, such as IBM's Cell processor in a PlayStation 3).
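
    To make the fingerprint idea more concrete, the following deliberately simplified Python sketch selects a handful of discriminating features from simulated spectra and uses them jointly as a classifier. The feature scoring (a two-sample t-test) and the classifier (logistic regression) are placeholder choices for illustration only; they are not the statistical algorithms developed in this thesis, which are designed to detect far weaker signals near or below the noise level.

```python
# Simplified illustration of the "fingerprint" idea: pick a small subset of
# discriminating m/z features and use them jointly to classify spectra.
# Data, scoring and classifier are assumptions chosen for brevity.
import numpy as np
from scipy.stats import ttest_ind
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

n_features = 500                      # binned m/z intensities per spectrum
healthy = rng.normal(1.0, 0.2, size=(40, n_features))
disease = rng.normal(1.0, 0.2, size=(40, n_features))
disease[:, [42, 137, 310]] += 0.6     # three synthetic "biomarker" peaks

X = np.vstack([healthy, disease])
y = np.array([0] * 40 + [1] * 40)

# Score every feature by how strongly it separates the two groups.
t_stat, _ = ttest_ind(X[y == 0], X[y == 1], axis=0)
fingerprint = np.argsort(-np.abs(t_stat))[:3]   # keep the top 3 features
print("selected m/z bins:", sorted(fingerprint))

# Use only the fingerprint features as a combined classifier.
clf = LogisticRegression().fit(X[:, fingerprint], y)
print("training accuracy:", clf.score(X[:, fingerprint], y))
```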

    Emergent quality issues in the supply of Chinese medicinal plants: A mixed methods investigation of their contemporary occurrence and historical persistence

    Quality issues that emerged centuries ago in Chinese medicinal plants (CMP) were investigated to explore why they still persist in an era of advanced analytical testing and extensive legislation, so that a solution to improve CMP quality could be proposed. This is important for the 85% of the world's population who rely on medicinal plants (MP) for primary healthcare, considering the adverse events, including fatalities, that arise from such quality issues. CMP are the most prevalent medicinal plants globally. This investigation used mixed methods, including 15 interviews with CMP expert key informants (KI), together with thematic analysis that identified the main CMP quality issues, why they persisted, and informed solutions. An unexplained case example, Eleutherococcus nodiflorus (EN), was analysed through the collection of 106 samples of EN, its known toxic adulterant Periploca sepium (PS), and a related substitute, Eleutherococcus senticosus (ES), across mainland China, Taiwan and the UK. The authenticity of the samples was determined using high-performance thin-layer chromatography. Misidentification, adulteration, substitution and toxicity were the main CMP quality issues identified. Adulteration was found to be widespread globally, with 57.4% of EN samples found to be authentic and 24.6% adulterated with the cardiotoxic PS, mostly at markets and traditional pharmacies. The EN study further highlighted that CMP quality issues persist because the laboratory-bound nature of the analytical methods and testing currently used leaves gaps in detection throughout much of the supply chain. CMP quality could be more effectively tested with patented analytical technology (PAT) and simpler field-based testing, including indicator strip tests. Education highlighting the long-term economic value and communal benefit of delivering better quality CMP to consumers was recommended over the financial motivation for actions that lead to the persistence of well-known and recurrent CMP quality issues.

    Application of Liquid Chromatography in Food Analysis

    Food products are very complex mixtures consisting of naturally occurring compounds and other substances, generally originating from technological processes, agrochemical treatments, or packaging materials. However, food is no longer just a biological necessity for survival. Society demands healthy and safe food, but it is also increasingly interested in other quality attributes more related to the origin of the food, the agricultural production processes used, the presence or absence of functional compounds, etc. Improved methods for the determination of authenticity, standardization, and efficacy of nutritional properties in natural food products are required to guarantee their quality and for the growth and regulation of the market. Nowadays, liquid chromatography with ultraviolet detection, or coupled to mass spectrometry and high-resolution mass spectrometry, is among the most powerful techniques for addressing food safety issues and guaranteeing food authenticity in order to prevent fraud. The aim of this book is to gather review articles and original research papers focused on the development of analytical techniques based on liquid chromatography for the analysis of food. The book comprises six scientific contributions, five original research manuscripts and one review article, dealing with the use of liquid chromatography techniques for the characterization and analysis of feed and food, including fruits, extra virgin olive oils, confectionery oils, sparkling wines and soybeans.

    Technological innovations in the collection and analysis of three-dimensional footwear impression evidence.

    The development of digital 3D trace recovery in the fields of geology and archaeology has highlighted transferable methods that could be used for the recovery of 3D footwear impressions under the umbrella of forensic science. This project uses a portfolio of experiments and case studies to explore the veracity and application of SfM photogrammetry (i.e., DigTrace) within forensic footwear analysis. This portfolio-based research includes published papers integrated into conventional chapters. A method for comparing the accuracy and precision of different measurement methods is developed and introduced, giving a comparative view of multiple recovery techniques. A range of simulated crime scene and laboratory-controlled experiments has been conducted to compare different recovery methods, such as casting, photography and SfM photogrammetry, for accuracy, practicality and effectiveness. In addition, a range of common and less common footwear-bearing substrates has been compared using SfM as well as other methods. One of the key findings shows that the DigTrace SfM photogrammetry software reliably produces accurate forensic results, regardless of the camera used for the initial photography and in a multitude of environments. This includes, but is not limited to, soil, sand, snow, and other less obvious substrates such as food items, household items and, in particular, carpet. The thesis also shows that SfM photogrammetry provides a superior solution for the recovery of 'difficult to cast' footwear impressions. This finding allows for 3D recovery of impressions that would otherwise only have been photographed in 2D. More generally, this project shows that 3D recovery is preferable to 2D and aids in the identification of individual characteristics and subsequent positive analysis. Overall, the thesis concludes that SfM photogrammetry is a viable and accurate solution for the recovery of 3D footwear impressions, both as an alternative and as a replacement for 2D photography and conventional 3D casting. SfM 3D recovery provides increased visualisation of footwear evidence and individualising marks. Digital evidence obtained in this way integrates with the increasingly sophisticated search algorithms being used within the UK's National Footwear Database and allows rapid file sharing, retrieval and evidence sharing. Moreover, the technique offers significant cost savings in terms of time, equipment and resources. It is the author's opinion, having consulted a wide audience of footwear examiners and crime scene employees, that this technique should, and can, be adopted quickly by forces in the UK and USA and disseminated for use.
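
    As a toy illustration of how accuracy and precision might be compared across recovery methods, the sketch below computes the bias and spread of repeated measurements against a known reference dimension. The reference length and the measurement values are invented placeholders, not data from this project, and the comparison is far simpler than the published method.

```python
# Illustrative sketch only: compare recovery methods by accuracy (mean error
# against a known reference length) and precision (spread of repeated measures).
# The reference value and measurements are invented, not thesis data.
import statistics

REFERENCE_MM = 300.0   # hypothetical known heel-to-toe length of a test impression

measurements_mm = {
    "2D photography":      [301.8, 302.1, 301.5, 302.4],
    "dental stone cast":   [300.6, 300.9, 300.4, 300.7],
    "SfM photogrammetry":  [300.1, 299.9, 300.2, 300.0],
}

for method, values in measurements_mm.items():
    accuracy = statistics.mean(values) - REFERENCE_MM   # systematic error (bias)
    precision = statistics.stdev(values)                # repeatability
    print(f"{method:>20}: bias={accuracy:+.2f} mm, sd={precision:.2f} mm")
```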

    Simulator of Distributed Datasets for Pulse-wave DDoS Attacks

    The ever-increasing scale and frequency of Distributed Denial-of-Service (DDoS) attacks, as well as the emergence of new forms of attack such as pulse-wave DDoS attacks, highlight the importance of ensuring that mitigation capabilities keep up with the escalating threat posed by DDoS attacks. To that end, much work has been done on the generation of DDoS datasets, which form the basis for developing effective mitigation tools such as Intrusion Detection Systems (IDS). However, existing datasets typically represent a single, victim-centric viewpoint, which has limitations compared to a distributed dataset that provides multiple different perspectives on an attack. This thesis therefore implements a simulator for distributed datasets specifically focused on pulse-wave DDoS attacks, for which no datasets are currently publicly available. The simulator provides high flexibility and configurability in the types of use cases that can be modeled, allowing for the creation of different topologies and attack compositions. The evaluation demonstrates the tool's capability to create a wide range of diverse datasets that exhibit different characteristics with regard to metrics commonly used in a DDoS attack's fingerprint. As such, this thesis represents a significant step towards enabling a better understanding of pulse-wave DDoS attacks and thereby the development of improved tools to help defend against them.
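
    To make the pulse-wave pattern concrete, the sketch below generates a per-second packet-rate series that alternates short high-rate bursts with idle gaps, as observed from several hypothetical vantage points. The vantage points, parameters, and traffic split are assumptions for illustration and do not reflect the simulator's actual design or configuration options.

```python
# Illustrative sketch of a pulse-wave traffic pattern: short high-rate bursts
# separated by idle gaps, observed from several vantage points. All names and
# parameters are invented placeholders, not the simulator's configuration.
import random

DURATION_S = 120          # total simulated time in seconds
PULSE_LEN_S = 10          # length of each attack burst
GAP_LEN_S = 20            # pause between bursts
PEAK_PPS = 500_000        # packets per second reaching the victim during a burst
VANTAGE_POINTS = ["edge-router-1", "edge-router-2", "victim-gateway"]

random.seed(0)
series = {vp: [] for vp in VANTAGE_POINTS}

for t in range(DURATION_S):
    in_pulse = (t % (PULSE_LEN_S + GAP_LEN_S)) < PULSE_LEN_S
    total = PEAK_PPS if in_pulse else 0
    # Split the attack volume unevenly across the two upstream observers; the
    # victim gateway sees the full aggregate, the edge routers see partial views.
    shares = [random.uniform(0.3, 0.7), 0.0, 1.0]
    shares[1] = 1.0 - shares[0]
    for vp, share in zip(VANTAGE_POINTS, shares):
        series[vp].append(int(total * share))

for vp in VANTAGE_POINTS:
    print(vp, series[vp][:35], "...")
```

    Recording the same attack from several coordinated observation points in this way is what distinguishes a distributed dataset from the single, victim-centric captures that most existing datasets provide.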