The Quality of Audits - A Comparative Study of Auditing and Certification Schemes in the Food Sector
Despite the increasing number of certification schemes in the food sector, there is little research on the economics of certification. Given the schemes' rapid growth and the actors' lack of experience, the actors must cope with various forms of opportunistic behaviour. This paper describes the basic structure of certification schemes and the information-economics reasons why they are created. Subsequently, an institutionalist model is presented that includes several starting points for increasing the quality of certification systems, based mainly on analogies with financial auditing.

Keywords: Audit quality, New Institutional Economics, Certification schemes, Organic Certification, Industrial Organization
TRACEABILITY, TRADE AND COOL: LESSONS FROM THE EU MEAT AND POULTRY INDUSTRY
The traditional food supply chain is arranged as a complex array of producers, handlers, processors, manufacturers, distributors, and retailers. As the food supply chain grew in complexity over time, little emphasis was placed on preserving information about the origin of raw materials and their transformation, often by multiple handlers, into consumer-ready products. This paper provides case illustrations of the implementation of information systems supporting traceability in Europe. Emphasis is on firm-level costs and benefits as well as the broader market-structure and governance issues inherent in the information economics of the firm.

Keywords: traceability, economics of the firm, information systems, internet, food supply chain, Food Consumption/Nutrition/Food Safety, International Relations/Trade, Livestock Production/Industries
Computational Complexity versus Statistical Performance on Sparse Recovery Problems
We show that several classical quantities controlling compressed sensing
performance directly match classical parameters controlling algorithmic
complexity. We first describe linearly convergent restart schemes on
first-order methods solving a broad range of compressed sensing problems, where
sharpness at the optimum controls convergence speed. We show that for sparse
recovery problems, this sharpness can be written as a condition number, given
by the ratio between true signal sparsity and the largest signal size that can
be recovered by the observation matrix. In a similar vein, Renegar's condition
number is a data-driven complexity measure for convex programs, generalizing
classical condition numbers for linear systems. We show that for a broad class
of compressed sensing problems, the worst case value of this algorithmic
complexity measure taken over all signals matches the restricted singular value
of the observation matrix which controls robust recovery performance. Overall,
this means in both cases that, in compressed sensing problems, a single
parameter directly controls both computational complexity and recovery
performance. Numerical experiments illustrate these points using several
classical algorithms.

Comment: Final version, to appear in Information and Inference.
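As a rough illustration of the kind of restarted first-order method the abstract refers to, the sketch below runs FISTA on the LASSO formulation of a sparse recovery problem and restarts it on a fixed schedule, resetting the momentum at each restart. This is a generic toy version, not the paper's scheme: the paper's restarts are tied to the sharpness at the optimum, whereas all parameters here (`lam`, the restart schedule, the problem sizes) are illustrative choices.

```python
import numpy as np

def fista(A, b, lam, x0, n_iter):
    """FISTA (accelerated proximal gradient) for
    min_x 0.5*||A x - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(n_iter):
        g = A.T @ (A @ y - b)              # gradient of the smooth part at y
        z = y - g / L
        # soft-thresholding = prox of lam*||.||_1 with step 1/L
        x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum step
        x, t = x_new, t_new
    return x

def restarted_fista(A, b, lam, n_restarts, inner_iter):
    """Fixed-schedule restarts: warm-start x and reset momentum each cycle."""
    x = np.zeros(A.shape[1])
    for _ in range(n_restarts):
        x = fista(A, b, lam, x, inner_iter)
    return x

# Toy compressed sensing instance: s-sparse signal, Gaussian measurements.
rng = np.random.default_rng(0)
n, d, s = 40, 100, 3
A = rng.standard_normal((n, d)) / np.sqrt(n)
x_true = np.zeros(d)
x_true[:s] = 1.0
b = A @ x_true

x_hat = restarted_fista(A, b, lam=0.01, n_restarts=5, inner_iter=200)
err = np.linalg.norm(x_hat - x_true)
print(f"recovery error: {err:.4f}")
```

When the instance is well conditioned (here, 40 Gaussian measurements of a 3-sparse signal in dimension 100), the restarted iteration recovers the true support up to the small soft-thresholding bias, consistent with the abstract's point that the same conditioning quantity governs both recovery and convergence speed.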