
    A Trade-Based Analysis of the Economic Impact of Non-Compliance with Illegal, Unreported and Unregulated Fishing: The Case of Vietnam

    Illegal, unreported and unregulated (IUU) fishing is a threat to the sustainable use of fishing resources. To eliminate destructive fishing practices, the whole value chain of fish trade needs to be well regulated, and trade-related policy measures show potential for contributing towards the elimination of unsustainable fishing. The EU's programme to combat IUU fishing, together with its measures against countries that exploit, produce and export fishery products of illegal fishing origin, is indispensable in addressing these harmful trends, which concern the whole world and the fishing community in particular. The programme's flagship instrument is a warning card system. The EU is a very important trading partner for Vietnam and a major importer of Vietnam's fish products, of which seafood plays an important role; the EU market helps pave the way for Vietnamese seafood to enter the world market. Vietnam's seafood exports to the EU increased sharply over the past 20 years, from USD 90 million in 1999 to nearly USD 1.5 billion in 2017 (before decreasing to closer to USD 1.3 billion in 2019). The year 2017 marked a critical turning point for Vietnam's fisheries, when the EU issued a yellow card warning to Vietnam for not cooperating and not making sufficient efforts to combat IUU fishing. Following the warning, the EU made nine recommendations to improve the Vietnamese fisheries management system. Over the past two years, the Government of Vietnam, the ministries and the entire Vietnamese fishing community have worked actively to meet the EU's recommendations and have the yellow card removed. The EU has appreciated Vietnam's efforts to combat IUU exploitation; however, the yellow card has not yet been removed. Over the same two years, the quantity of seafood exported to the EU has decreased significantly, showing the immediate impact of the yellow card warning on Vietnam's seafood industry. Yet this is only the part of the negative impact that is visible in export figures. The IUU yellow card warning will have many other consequences, and the impact will be more serious if Vietnam does not have the yellow card removed soon, or if it receives a red card warning.

    Lossy Kernelization for (Implicit) Hitting Set Problems

    We revisit the complexity of polynomial-time pre-processing (kernelization) for the d-Hitting Set problem. This is one of the most classic problems in Parameterized Complexity in its own right and, furthermore, it encompasses several other of the most well-studied problems in this field, such as Vertex Cover, Feedback Vertex Set in Tournaments (FVST) and Cluster Vertex Deletion (CVD). In fact, d-Hitting Set encompasses any deletion problem to a hereditary property that can be characterized by a finite set of forbidden induced subgraphs. With respect to bit size, the kernelization complexity of d-Hitting Set is essentially settled: there exists a kernel with O(k^d) bits (O(k^d) sets and O(k^{d-1}) elements), and this is tight by the result of Dell and van Melkebeek [STOC 2010, JACM 2014]. Still, the question of whether there exists a kernel for d-Hitting Set with fewer elements has remained one of the major open problems in Kernelization. In this paper, we first show that if we allow the kernelization to be lossy, with a loss qualitatively better than the best possible approximation ratio of polynomial-time approximation algorithms, then one can obtain kernels where the number of elements is linear for every fixed d. Further, based on this, we present our main result: we show that there exist approximate Turing kernelizations for d-Hitting Set that even beat the established bit-size lower bounds for exact kernelizations; in fact, we use a constant number of oracle calls, each with "near linear" (O(k^{1+ε})) bit size, which is almost the best one could hope for. Lastly, for two special cases of implicit 3-Hitting Set, namely FVST and CVD, we obtain "best of both worlds" type results: (1+ε)-approximate kernelizations with a linear number of vertices. In terms of size, this substantially improves the exact kernels of Fomin et al. [SODA 2018, TALG 2019], with simpler arguments.
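
For readers unfamiliar with the problem, the following is a minimal sketch (not taken from the paper) of the folklore greedy d-approximation for d-Hitting Set, the kind of polynomial-time approximation whose ratio the lossy kernels above aim to beat: whenever a set is not yet hit, all of its at most d elements are taken into the solution. The function name and input representation are illustrative assumptions.

```python
# Folklore greedy d-approximation for d-Hitting Set (illustrative sketch,
# not the kernelization from the paper). Each input set has at most d
# elements; whenever a set is unhit, all of its elements are added, losing
# a factor of at most d compared to an optimal hitting set.
def greedy_d_hitting_set(sets):
    """sets: iterable of iterables of hashable elements (each of size <= d)."""
    solution = set()
    for s in sets:
        if not solution.intersection(s):   # set s is not yet hit
            solution.update(s)             # take every element of s
    return solution

# Example: a 3-Hitting Set instance (all sets of size <= 3)
instance = [{1, 2, 3}, {2, 4, 5}, {5, 6, 7}, {1, 6, 8}]
print(greedy_d_hitting_set(instance))      # a hitting set of size <= 3 * OPT
```

The sets that trigger an update are pairwise disjoint, so any optimal solution must contain at least one element from each of them, which yields the factor-d guarantee; the lossy kernels in the paper target a qualitatively smaller loss than this factor d.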

    Quantization Error Correction Schemes for Lattice-Reduction Aided MIMO Detectors

    Lattice-reduction-aided (LRA) linear detectors are known to achieve near-optimal performance at low complexity. However, one weakness of LRA detectors is that their quantization step is not optimal. Based on simulation results, we show that most detection errors in LRA linear detectors are due to quantization errors. We then propose two methods to correct these quantization errors. In the first method, sphere detectors are introduced to correct quantization errors at low additional complexity. As a second approach, we propose a list quantization scheme which generates a list of candidate symbols from the original LRA-estimated symbols. From these listed symbols, decisions are made according to the minimum Euclidean distance between the received and estimated points. It is shown by simulations that both methods provide significant BER performance improvements at only a small additional complexity.
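
As a rough illustration of the list-quantization idea described above (not the authors' exact algorithm), the sketch below performs zero-forcing detection in a lattice-reduced domain, rounds the result, additionally perturbs each coordinate by ±1 to build a small candidate list, and picks the candidate with minimum Euclidean distance to the received vector. The unimodular matrix T is assumed to come from an LLL-style reduction computed elsewhere; in the toy example it is simply the identity, and the constellation, noise level and function names are illustrative assumptions.

```python
# Illustrative numpy sketch of LRA zero-forcing with a simple list-quantization
# step; a toy reconstruction of the general idea, not the paper's scheme.
import numpy as np

def lra_zf_list_detect(y, H, T, symbols):
    """y: received vector, H: channel matrix, T: unimodular matrix from an
    LLL-style reduction (so H_red = H @ T), symbols: 1-D array of real
    constellation points assumed to lie on an integer grid."""
    H_red = H @ T
    z_hat = np.linalg.pinv(H_red) @ y          # ZF estimate in the reduced domain
    z_round = np.round(z_hat)                  # plain quantization (error-prone)

    # Build a small candidate list by perturbing each coordinate by -1 and +1.
    candidates = [z_round.copy()]
    for i in range(len(z_round)):
        for delta in (-1.0, 1.0):
            z_pert = z_round.copy()
            z_pert[i] += delta
            candidates.append(z_pert)

    best_s, best_dist = None, np.inf
    for z_cand in candidates:
        s_cand = T @ z_cand                    # map back to the symbol domain
        # remap each coordinate to the nearest valid constellation point
        s_cand = symbols[np.argmin(np.abs(symbols[:, None] - s_cand[None, :]), axis=0)]
        dist = np.linalg.norm(y - H @ s_cand)  # minimum-Euclidean-distance rule
        if dist < best_dist:
            best_s, best_dist = s_cand, dist
    return best_s

# Toy 2x2 real-valued example with 4-PAM symbols and T = I (i.e. no reduction).
rng = np.random.default_rng(0)
symbols = np.array([-3.0, -1.0, 1.0, 3.0])
H = rng.normal(size=(2, 2))
s = rng.choice(symbols, size=2)
y = H @ s + 0.1 * rng.normal(size=2)
print("sent:", s, "detected:", lra_zf_list_detect(y, H, np.eye(2), symbols))
```

In an actual LRA detector, T would be the unimodular transform returned by LLL reduction of H, which makes the reduced channel better conditioned and the rounding step far more reliable than in this degenerate T = I demonstration.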