In the field: Coase, an exemplar in the tradition of Smith, Marshall and Ostrom
This paper argues that Coase provides the primary 20th century exemplar of the grounding of analytical developments in economics in direct fieldwork observation. In particular, his focus on the business enterprise, its internal functions (including decision-making), and its external relations (including contracting) has provided a stimulus for radical developments in microeconomics, and in managerial and decision economics in particular. The argument is developed by a stylization of the development of economics, referring to Adam Smith in the 18th century, Alfred Marshall in the 19th century, Ronald Coase in the 20th century, and Elinor Ostrom in the 21st century.
Survey on Combinatorial Register Allocation and Instruction Scheduling
Register allocation (mapping variables to processor registers or memory) and
instruction scheduling (reordering instructions to increase instruction-level
parallelism) are essential tasks for generating efficient assembly code in a
compiler. In the last three decades, combinatorial optimization has emerged as
an alternative to traditional, heuristic algorithms for these two tasks.
Combinatorial optimization approaches can deliver optimal solutions according
to a model, can precisely capture trade-offs between conflicting decisions, and
are more flexible at the expense of increased compilation time.
This paper provides an exhaustive literature review and a classification of
combinatorial optimization approaches to register allocation and instruction
scheduling, with a focus on the techniques that are most applied in this
context: integer programming, constraint programming, partitioned Boolean
quadratic programming, and enumeration. Researchers in compilers and
combinatorial optimization can benefit from identifying developments, trends,
and challenges in the area; compiler practitioners may discern opportunities
and grasp the potential benefit of applying combinatorial optimization.
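The enumeration technique named above can be illustrated with a toy sketch (the program, interference graph, and spill costs here are hypothetical, not from the survey): exhaustively enumerate assignments of variables to two registers or memory, reject any assignment in which interfering variables share a register, and keep the one with the cheapest total spill cost.

```python
from itertools import product

# Hypothetical toy instance: optimal register allocation by exhaustive
# enumeration, one of the combinatorial techniques the survey covers.
# Variables live at the same time "interfere" and must not share a
# register; a variable that gets no register is spilled to memory.
interference = {("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")}
variables = ["a", "b", "c", "d"]
registers = ["r0", "r1"]                  # two physical registers
spill_cost = {"a": 4, "b": 3, "c": 5, "d": 1}

def cost(assignment):
    """Total spill cost, or None if interfering variables share a register."""
    for u, v in interference:
        lu = assignment[variables.index(u)]
        lv = assignment[variables.index(v)]
        if lu == lv and lu != "mem":
            return None                   # conflict: shared register
    return sum(spill_cost[v]
               for v, loc in zip(variables, assignment) if loc == "mem")

# Enumerate every placement of each variable into r0, r1, or memory,
# then pick the cheapest conflict-free assignment.
best = min(
    (a for a in product(registers + ["mem"], repeat=len(variables))
     if cost(a) is not None),
    key=cost,
)
print(dict(zip(variables, best)))
```

Because a, b, and c mutually interfere, they would need three registers; with only two available, the cheapest repair is to spill b (cost 3), which enumeration finds without any heuristic.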
Accelerated Predictive Healthcare Analytics with Pumas, a High Performance Pharmaceutical Modeling and Simulation Platform
Pharmacometric modeling establishes causal, quantitative relationships between administered dose, tissue exposures, desired and undesired effects, and a patient's risk factors. These models are employed to de-risk drug development and guide precision-medicine decisions. Recent technological advances have made it easy to collect detailed, real-time data; however, pharmacometric tools have not been designed to handle heterogeneous big data and complex models, and their estimation methods are outdated for modern healthcare challenges. We set out to design a platform that facilitates domain-specific modeling and its integration with modern analytics, to foster innovation and readiness for the data deluge in healthcare.
New specialized estimation methodologies have been developed that allow dramatic performance advances in areas that have not seen major improvements in decades. New ODE solver algorithms, such as coefficient-optimized higher-order integrators and automatic stiffness-detection algorithms that are robust to frequent discontinuities, give rise to performance improvements of up to 4x across a wide range of stiff and non-stiff systems seen in pharmacometric applications. These methods combine with JIT compiler techniques that further specialize the solution process for individual systems, allowing statically-sized optimizations and discrete sensitivity analysis via forward-mode automatic differentiation, to further enhance the accuracy and performance of the solving and parameter-estimation process. We demonstrate that when all of these techniques are combined with a validated clinical-trial dosing mechanism and a non-compartmental analysis (NCA) suite, real applications such as NLME parameter estimation see run times halved while retaining the same accuracy. Meanwhile, in areas with less prior software optimization, such as optimal experimental design, we see orders-of-magnitude performance enhancements. Together, we present a fast, modern domain-specific modeling framework that lays a platform for innovation via upcoming integrations with modern analytics.
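The idea of discrete sensitivity analysis via forward-mode automatic differentiation can be sketched generically (this is an illustrative toy, not Pumas code): dual numbers carry a derivative through every step of an explicit Euler solve of dy/dt = -k*y, so the integrator returns both the solution and its sensitivity dy/dk.

```python
# Minimal sketch of forward-mode AD through a discrete ODE solve.
# A Dual carries (value, derivative); arithmetic on Duals applies the
# chain rule, so the Euler loop needs no changes to produce dy/dk.

class Dual:
    """Dual number: value + eps * deriv, with eps**2 == 0."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __radd__, __rmul__ = __add__, __mul__

def euler(f, y0, t0, t1, n):
    """Fixed-step explicit Euler; works unchanged for floats and Duals."""
    h, y, t = (t1 - t0) / n, y0, t0
    for _ in range(n):
        y = y + h * f(t, y)
        t += h
    return y

k = Dual(0.5, 1.0)                        # seed dk/dk = 1
y = euler(lambda t, y: Dual(-1.0) * k * y, Dual(1.0), 0.0, 1.0, 1000)
print(y.val, y.der)                       # y(1) ~ exp(-0.5), dy/dk ~ -exp(-0.5)
```

Differentiating the discrete solver steps themselves (rather than the continuous equation) is what "discrete sensitivity analysis" refers to, and the exact analytic sensitivity of dy/dt = -k*y, namely dy/dk = -t*exp(-k*t), confirms the result at t = 1.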
Efficient Methods for Natural Language Processing: A Survey
Recent work in natural language processing (NLP) has yielded appealing
results from scaling model parameters and training data; however, using only
scale to improve performance means that resource consumption also grows. Such
resources include data, time, storage, or energy, all of which are naturally
limited and unevenly distributed. This motivates research into efficient
methods that require fewer resources to achieve similar results. This survey
synthesizes and relates current methods and findings in efficient NLP. We aim
to provide both guidance for conducting NLP under limited resources, and point
towards promising research directions for developing more efficient methods.
Comment: Accepted at TACL, pre-publication version
Recent applications of a single quadrupole mass spectrometer in 11C, 18F and radiometal chemistry
Mass spectrometry (MS) has longstanding applications in radiochemistry laboratories, stemming from carbon dating. However, research on the development of radiotracers for molecular imaging with either positron emission tomography (PET) or single photon emission computed tomography has yet to take full advantage of MS. This inertia has been attributed to the relatively low concentrations of radiopharmaceutical formulations and to lack of access to the required MS equipment, given the high costs of purchasing and maintaining specialized MS systems. To date, single quadrupole (SQ) MS coupled to liquid chromatography (LC) has been the main form of MS used in radiochemistry laboratories. These LC–MS systems are primarily used for assessing the chemical purity of radiolabeling precursors or standard molecules, but also have applications in the determination of metabolites. Herein, we highlight personal experiences using a compact SQ-MS in our PET radiochemistry laboratories to monitor the small amounts of carrier observed in most radiotracer preparations, even at high molar activities. We demonstrate the use of an SQ-MS to observe the low-mass, non-radioactive species that form alongside the radiotracer from these trace amounts of carrier. We also describe a pre-concentration system for detecting dilute radiopharmaceutical formulations and for metabolite analyses by SQ-MS. Selected examples where SQ-MS was critical for optimizing radiochemical reactions and for unequivocal characterization of radiotracers are showcased. We further illustrate examples where SQ-MS can be applied to identify radiometal complexes and to develop a new purification methodology for Pd-catalyzed radiofluorination reactions, shedding light on the identity of metal complexes present in the labelling solution.