    Estimating unreported catches in Norwegian fisheries

    A discard ban for fish was introduced in Norway in 1987, requiring that all commercial catches be landed and reported. In theory, this regulation creates a full record of total removals from all fisheries. However, exemptions and varying compliance rates create a risk that unreported catches still occur. Estimating unreported catches of all species across multiple fisheries is a large task, complicated by the many influential factors related to fishery-specific regulations, market demands, fishing gear, and species biology. There is therefore a need to standardise the estimation procedure, but this requires compromises that affect bias and precision differently across species, and these effects must be understood if the results are to be used as scientific advice. In Norwegian fisheries, the largest source of detailed data on unreported catches is the Norwegian Reference Fleet, a group of active fishing vessels that are paid to sample their catches at sea. However, participation in the programme is voluntary, so there is uncertainty about how representative the Norwegian Reference Fleet is of the wider fisheries. In such a complex system, it is important to address uncertainties in the entire estimation process, including those arising from the sampling data and the estimators used. The aim of this thesis is to develop standardised estimators for unreported catches in Norwegian fisheries. To identify current knowledge gaps, global best practices for estimating unreported catches were collated and applied to Norwegian fisheries. From this, two research paths were identified. First, there is a need to understand the quality of the data collected by the Norwegian Reference Fleet; given the available data, this was confined to quantifying the representativeness of samples. Second, previous studies estimating unreported catches have used bespoke model-based approaches to improve predictive performance, whereas the simple design-based approaches in routine use rest on assumptions that have not yet been tested. There is therefore a need to evaluate the assumptions behind the current design-based estimators. To evaluate representativeness, the sampling design of the Norwegian Reference Fleet was simulated using reported catches, for which fleet-level information is available. The simulation study showed that nonprobability sampling of vessels in the Norwegian Reference Fleet tends to overestimate reported catches, although the bias remains within the bounds of the variation expected under probability sampling. Representativeness varied greatly across species and years, and there was evidence that the estimators traditionally used for unreported catches may introduce bias because their assumptions are unmet. These results support the development of improved estimators and a more conservative treatment of uncertainty. Applying a cluster-based estimator that better describes the true variation between sampled vessels produces a more realistic, albeit more uncertain, estimate of unreported catches. The same applies to the additional uncertainty incurred when converting numbers of fish to biomass, which requires an extra modelling step because information on fish weights is lacking. The current methodology for estimating discards in coastal fisheries is restricted by the fishery-level data used to extrapolate estimated discard rates.
However, ongoing developments in mandatory reporting requirements suggest that future model-based approaches could improve discard estimates. Therefore, an exploratory model was fitted to the sampling data to identify potentially important variables that explain variation in discarding. This model can then inform variable selection in a future model-based approach once fishery-level data collection has improved. The estimation methodologies presented in this thesis form the basis of a national routine for estimating unreported catches in Norwegian fisheries. Quantifying the bias of the estimators and accounting for additional, important sources of uncertainty provides a standardised design-based estimator for unreported catches in Norwegian fisheries. Predictive performance is now supported by quantitative evidence, and further improvements, such as accounting for rare occurrences and producing size-based estimates, have been identified to optimise the estimators in the future. Furthermore, the lessons learnt throughout this doctoral research highlight the importance of creating a standardised framework for estimating unreported catches. This ensures that improvements are centralised rather than hidden within individual case studies.
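    To make the design-based approach concrete, the sketch below shows a generic combined ratio estimator with a vessel-level (cluster) bootstrap, in the spirit of the estimators discussed above. It runs on simulated numbers; the vessel count, discard rates, and fleet landings are invented for illustration and are not results from the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated sample of Reference Fleet vessels (clusters): landed and discarded
# weight (kg) of one species per vessel. All numbers are illustrative only.
n_vessels = 25
landed = rng.lognormal(mean=9.0, sigma=0.6, size=n_vessels)   # landed weight per vessel
discarded = landed * rng.beta(2, 38, size=n_vessels)          # ~5% mean discard rate

fleet_landings = 2.0e7  # total reported landings (kg) for the whole fleet, assumed known


def ratio_estimate(landed, discarded, fleet_landings):
    """Combined ratio estimator: fleet discards = fleet landings * (sum d_i / sum l_i)."""
    return fleet_landings * discarded.sum() / landed.sum()


point = ratio_estimate(landed, discarded, fleet_landings)

# Cluster (vessel-level) bootstrap: resample whole vessels so that between-vessel
# variation, the dominant source of uncertainty, is propagated into the interval.
boot = []
for _ in range(2000):
    idx = rng.integers(0, n_vessels, n_vessels)
    boot.append(ratio_estimate(landed[idx], discarded[idx], fleet_landings))
lo, hi = np.percentile(boot, [2.5, 97.5])

print(f"estimated unreported catch: {point/1e3:.0f} t (95% CI {lo/1e3:.0f}-{hi/1e3:.0f} t)")
```

    Resampling vessels rather than individual hauls is what distinguishes the cluster-based interval from a naive one; it widens the interval when vessels differ systematically in their discard rates.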

    Report of the Comprehensive Fishery Evaluation Working Group [ICES Headquarters, 25 June - 4 July, 1997]

    Contributors: Bjarte Bogstad, Kjellrun Hiis Hauge, Tore Jakobsen, Knut Korsbrekke, Sigurd Tjelmeland

    Report of the Working Group on North Atlantic Salmon [ICES Headquarters, 3- 13 April, 2002]

    Contributors: Lars Petter Hansen, Marianne Holm

    End-to-end anomaly detection in stream data

    Nowadays, huge volumes of data are generated at increasing velocity through various systems, applications, and activities. This increases the demand for stream and time series analysis that reacts to changing conditions in real time, improving efficiency and quality of service delivery as well as safety and security in the private and public sectors. Despite its very rich history, time series anomaly detection is still one of the vital topics in machine learning research and is receiving increasing attention. Identifying hidden patterns and selecting an appropriate model that fits the observed data well and also carries over to unobserved data is not a trivial task. Due to the increasing diversity of data sources and associated stochastic processes, this pivotal data analysis task faces challenges such as complex latent patterns, concept drift, and overfitting, which can mislead the model and cause a high false-alarm rate. Handling these challenges leads advanced anomaly detection methods to adopt sophisticated decision logic, turning them into opaque and inexplicable black boxes. Contrary to this trend, end-users expect transparency and verifiability in order to trust a model and the outcomes it produces. Pointing users to the most anomalous or malicious regions of a time series, and to the causal features, could also save them time, energy, and money. For these reasons, this thesis addresses the crucial challenges in an end-to-end pipeline for stream-based anomaly detection through three essential phases: behavior prediction, inference, and interpretation. The first phase focuses on devising a time series model that achieves high average accuracy as well as low error deviation. On this basis, we propose higher-quality anomaly detection and scoring techniques that use the related context to reclassify observations and prune unjustified events. Last but not least, we make the predictive process transparent and verifiable by providing meaningful reasoning behind its results, based on concepts understandable to a human. The provided insight can pinpoint the anomalous regions of a time series and explain why the current status of a system has been flagged as anomalous. Stream-based anomaly detection research is a principal area of innovation to support our economy, security, and even the safety and health of societies worldwide. We believe our proposed analysis techniques can contribute to building a situational awareness platform and open new perspectives in a variety of domains like cybersecurity and health
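    As a minimal illustration of the prediction-plus-scoring idea described above (not the methods developed in the thesis), the sketch below forecasts each point with an exponentially weighted mean, scores residual z-values online, and flags only runs of consecutive exceedances as a crude stand-in for context-based pruning of isolated, unjustified events. The thresholds and the toy signal are assumptions made purely for demonstration.

```python
import numpy as np


def stream_anomaly_flags(x, alpha=0.1, z_thresh=3.0, min_run=3):
    """Online detector: forecast with an exponentially weighted mean, compute a
    residual z-score, and flag a point only when the score stays above z_thresh
    for at least min_run consecutive steps (simple contextual pruning)."""
    mean, var = x[0], 1.0
    flags = np.zeros(len(x), dtype=bool)
    run = 0
    for t, xt in enumerate(x):
        resid = xt - mean                         # one-step-ahead forecast error
        z = resid / np.sqrt(var + 1e-9)
        run = run + 1 if abs(z) > z_thresh else 0
        if run >= min_run:
            flags[t - min_run + 1 : t + 1] = True  # mark the whole anomalous run
        # update the forecast with the information seen so far
        mean = (1 - alpha) * mean + alpha * xt
        var = (1 - alpha) * var + alpha * resid ** 2
    return flags


# Toy usage: a noisy sine wave with an injected level shift around t = 600.
rng = np.random.default_rng(0)
t = np.arange(1000)
x = np.sin(t / 20) + 0.1 * rng.standard_normal(1000)
x[600:630] += 2.0
print(np.flatnonzero(stream_anomaly_flags(x))[:5])
```

    Requiring a run of exceedances is only one way to use context; richer approaches can condition on covariates or seasonality, but the structure of predict, score, and prune stays the same.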

    Remote sensing technology applications in forestry and REDD+

    Advances in close-range and remote sensing technologies are driving innovations in forest resource assessment and monitoring at varying scales. Data acquired with airborne and spaceborne platforms provide high(er) spatial resolution, more frequent coverage, and more spectral information. Recent developments in ground-based sensors have advanced 3D measurements, low-cost permanent systems, and community-based monitoring of forests. The UNFCCC REDD+ mechanism has advanced the remote sensing community and the development of forest geospatial products that countries can use for international reporting and national forest monitoring. However, an urgent need remains to better understand the options and limitations of remote and close-range sensing techniques in the field of forest degradation and forest change. Therefore, we invite scientists working on remote sensing technologies, close-range sensing, and field data to contribute to this Special Issue. Topics of interest include: (1) novel remote sensing applications that can meet the needs of forest resource information and REDD+ MRV, (2) case studies applying remote sensing data for REDD+ MRV, (3) time-series algorithms and methodologies for forest resource assessment at spatial scales ranging from the tree to the national level, and (4) novel close-range sensing applications that can support sustainable forestry and REDD+ MRV. We particularly welcome submissions on data fusion

    Aspects of the pharmacokinetics of itraconazole and voriconazole in the tuatara (Sphenodon punctatus) and application in the treatment of an emerging fungal disease

    Tuatara (Sphenodon punctatus) are unique, cold-adapted reptiles endemic to New Zealand. Recently, captive tuatara have been found to be affected by an emerging fungal pathogen, Paranannizziopsis australasiensis. P. australasiensis causes dermatitis in tuatara and has caused fatal systemic mycosis in a bearded dragon (Pogona vitticeps) and in aquatic file snakes (Acrochordus spp.). The discovery of P. australasiensis infections has prevented the release of tuatara from several captive institutions to offshore islands, and has negative implications for the long-term health and welfare of the animals. A review of the literature revealed that infections caused by organisms related to P. australasiensis are being recognised worldwide as emerging pathogens of reptiles. Little is known about the epidemiology of these often-fatal infections, and treatment with a range of antifungals has met with varying success. There has been little research on antifungal use in reptiles, and none on how environmental temperature affects the pharmacokinetics of antifungals. This study investigated the microbiological characteristics of P. australasiensis, primarily the growth rate of the fungus at different temperatures and the minimum inhibitory concentration (MIC) of various antifungal agents for P. australasiensis. The optimal growth temperature for P. australasiensis was found to span 20°C to 30°C, with scant growth at 12°C, moderate growth at 15°C, and no growth at 37°C. The MICs of antifungals were tested at room temperature and at 37°C and did not differ significantly. MICs of itraconazole and voriconazole for three isolates of P. australasiensis were low, at 0.12 mg/L for itraconazole and <0.008 mg/L for voriconazole. The single- and multiple-dose pharmacokinetics of itraconazole and voriconazole in tuatara were investigated at 12°C and 20°C, the low and high ends of the tuatara’s preferred optimal temperature zone (POTZ). Results showed statistically significant differences in antifungal elimination half-life between temperatures. With the aid of population pharmacokinetic modelling, optimal dosing regimens for both antifungals were developed for tuatara of different weights. It was established that tuatara should be treated at 20°C, the high end of the POTZ, to facilitate rapid attainment of therapeutic antifungal concentrations, improve clinical outcomes, and reduce the risk of adverse effects. While itraconazole demonstrated more predictable pharmacokinetics than voriconazole in tuatara, itraconazole treatment was associated with significant adverse effects, including elevated bile acid and uric acid concentrations and weight loss. While voriconazole appears to be safer, its pharmacokinetics are less predictable, with high inter-individual variability in tuatara administered the same dose rate (a phenomenon also observed in humans). While voriconazole may be a useful antifungal in clinically affected tuatara, where dosage can be adjusted based on the response to treatment, its use in an asymptomatic quarantine setting may be limited. The use of higher voriconazole doses may increase the likelihood of maintaining therapeutic concentrations in all treated animals; however, the risk of adverse effects increases concomitantly. Furthermore, there are currently no published reports of successful treatment of P. australasiensis in tuatara with voriconazole. 
This study also established haematologic and biochemical reference ranges in a group of tuatara. These demonstrated variability in several parameters based on sex and season, and will be a useful tool for assessing health and disease in these and other tuatara
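    For readers unfamiliar with the pharmacokinetic quantities involved, the sketch below evaluates a standard one-compartment oral-absorption (Bateman) model to show how the elimination half-life, which the study found to differ between 12°C and 20°C, shapes the concentration-time curve relative to the reported itraconazole MIC of 0.12 mg/L. The dose, bioavailability, absorption rate, volume of distribution, and half-life values are hypothetical placeholders, not the population estimates derived in the thesis.

```python
import numpy as np


def conc_oral_1cmt(t, dose_mg_per_kg, F, ka, ke, V_L_per_kg):
    """Plasma concentration (mg/L) over time for a one-compartment model with
    first-order absorption and elimination (Bateman function)."""
    return (F * dose_mg_per_kg * ka) / (V_L_per_kg * (ka - ke)) * (
        np.exp(-ke * t) - np.exp(-ka * t)
    )


# Hypothetical parameters purely for illustration.
dose, F, ka, V = 5.0, 0.5, 0.2, 10.0   # mg/kg, fraction absorbed, 1/h, L/kg
mic = 0.12                             # itraconazole MIC reported in the study (mg/L)

t = np.linspace(0, 240, 2401)          # hours after a single oral dose
for label, half_life_h in [("faster elimination (shorter half-life)", 48.0),
                           ("slower elimination (longer half-life)", 120.0)]:
    ke = np.log(2) / half_life_h       # elimination rate constant from half-life
    c = conc_oral_1cmt(t, dose, F, ka, ke, V)
    above = t[c >= mic]
    if above.size:
        print(f"{label}: Cmax = {c.max():.2f} mg/L, "
              f"above MIC from {above.min():.0f} h to {above.max():.0f} h")
    else:
        print(f"{label}: concentration never reaches the MIC at this dose")
```

    Population pharmacokinetic modelling, as used in the thesis, fits such structural models to sparse data from many individuals and then simulates candidate dosing regimens; the snippet above only illustrates the single-dose building block.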