Principles for the selection and integration of educational multimedia materials
This paper sets out to clarify the decision framework for the selection and integration of educational multimedia material into courses. Two main areas are discussed. The first involves matching the educational principles inherent in the multimedia artefact to the aims of the course; the opposition between instructionist and constructivist approaches is particularly highlighted. The second concerns the models used to integrate the multimedia component into the overall course. These models are classified by how they distribute responsibility for explicit educational structuring between the multimedia system and the course tutor. The paper does not set out prescriptive rules; rather, it aims to inform and articulate the decision space for the tutor.
A prescriptive cost model for demand shaping: an application for target costing
Costing tools and traditional cost models are used primarily to calculate costs. These models would be more relevant, however, if used as decision-making support tools; that is, they should allow ex-ante rather than ex-post analyses. Cost models tend to follow a linear logic of resources-activities-products (as in Activity Based Costing), whereas the uncertainty, variability and dynamics of the current market demand cost models that help decision makers define which resources are needed to satisfy market needs (as in Target Costing), i.e. a reverse logic. Such models can be designated prescriptive cost models and require significant computational resources to address the complexity of the problems to which they can be applied. Prescriptive analysis aims to recommend actions based on specified or desired results, and it is the most evolved stage of business analytics, beyond descriptive and predictive approaches. This paper presents and discusses a prescriptive cost model applied in the context of Target Costing. The relevance and validity of this approach are discussed and several opportunities for further work are presented.
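The "reverse logic" the abstract describes can be sketched in a few lines: start from what the market will pay, subtract the required margin, and work backwards to an allowable cost. This is only a minimal illustration of the idea; the figures, component names, and proportional allocation rule below are invented, not taken from the paper's model.

```python
# Minimal sketch of Target Costing's reverse logic: derive the allowable
# cost from the market price and required margin, then allocate the
# cost-reduction gap across components. All numbers are hypothetical.

def target_costing(market_price, margin_rate, estimated_costs):
    """Return the allowable total cost and the reduction each component
    must deliver.

    estimated_costs: dict mapping component -> currently estimated cost.
    The reduction burden is shared in proportion to estimated cost
    (one simple allocation rule among many possible ones).
    """
    target_cost = market_price * (1 - margin_rate)   # allowable total cost
    current_total = sum(estimated_costs.values())
    gap = current_total - target_cost                # reduction needed overall
    reductions = {c: gap * cost / current_total
                  for c, cost in estimated_costs.items()}
    return target_cost, reductions

price = 100.0          # price the market will accept
margin = 0.20          # desired profit margin
costs = {"materials": 55.0, "labour": 25.0, "overhead": 10.0}
target, cuts = target_costing(price, margin, costs)
print(f"target cost: {target:.2f}")
for comp, cut in cuts.items():
    print(f"{comp}: reduce by {cut:.2f}")
```

Note how the computation runs from desired result back to required resources, the opposite direction of a resources-activities-products costing.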
Why don’t pesticide applicators protect themselves? Exploring the use of personal protective equipment among Colombian smallholders
The misuse of personal protective equipment (PPE) during pesticide application was investigated among smallholders in Colombia. The integrative agent-centered (IAC) framework and a logistic regression approach were adopted. The results suggest that the descriptive social norm significantly influenced PPE use. The following were also important: (1) having experienced pesticide-related health problems; (2) age; (3) the share of pesticide application carried out; and (4) the perception of PPE hindering work. Interestingly, the influence of these factors differed for different pieces of PPE. Since conformity to the social norm is a source of rigidity in the system, behavioral change may take the form of a discontinuous transition. In conclusion, five suggestions for triggering a transition towards more sustainable PPE use are formulated: (1) diversifying targets/tools; (2) addressing structural aspects; (3) sustaining interventions in the long term; (4) targeting farmers' learning-by-experience; and (5) targeting PPE use on a collective level.
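The kind of logistic-regression analysis the abstract describes can be sketched as follows. The records, predictor values, and coefficients below are synthetic and invented for illustration; only the predictor names (social norm, past health problems, age) echo the abstract.

```python
import math
import random

# Hedged sketch of a logistic regression of PPE use on a few predictors.
# All data are synthetic; this is not the study's dataset or model.

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit weights (with intercept at w[0]) by plain gradient descent."""
    n_features = len(X[0])
    w = [0.0] * (n_features + 1)
    for _ in range(epochs):
        grad = [0.0] * (n_features + 1)
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / len(X) for wj, g in zip(w, grad)]
    return w

# Synthetic records: [descriptive social norm (0-1), past health
# problems (0/1), standardised age]; outcome y = 1 if PPE is used.
X = [[0.9, 1, 0.2], [0.8, 0, -0.5], [0.2, 0, 1.0], [0.1, 1, 0.3],
     [0.7, 1, -1.0], [0.3, 0, 0.8], [0.95, 0, 0.0], [0.15, 0, -0.2]]
y = [1, 1, 0, 0, 1, 0, 1, 0]
w = fit_logistic(X, y)
# In this toy data the social-norm predictor separates users from
# non-users, so its fitted weight comes out strongly positive, loosely
# mirroring the abstract's finding that the descriptive norm drives use.
print("intercept and weights:", [round(v, 2) for v in w])
```

In practice one would use a library implementation (e.g. statsmodels) and report odds ratios per predictor, but the fitting logic is the same.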
Bootstrap Robust Prescriptive Analytics
We address the problem of prescribing an optimal decision in a framework where its cost depends on uncertain problem parameters that need to be learned from data. Earlier work by Bertsimas and Kallus (2014) transforms classical machine learning methods that merely predict from supervised training data into prescriptive methods taking optimal decisions specific to a particular covariate context. Their prescriptive methods factor in additional observed contextual information on a potentially large number of covariates to take context-specific actions which are superior to any static decision. Any naive use of limited training data may, however, lead to gullible decisions over-calibrated to one particular data set. In this paper, we borrow ideas from distributionally robust optimization and the statistical bootstrap of Efron (1982) to propose two novel prescriptive methods based on (nw) Nadaraya-Watson and (nn) nearest-neighbors learning which safeguard against overfitting and lead to improved out-of-sample performance. Both resulting robust prescriptive methods reduce to tractable convex optimization problems and enjoy limited disappointment on bootstrap data. We illustrate the data-driven decision-making framework and our novel robustness notion on a small news vendor problem as well as a small portfolio allocation problem.
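The flavour of a Nadaraya-Watson prescriptive rule on a newsvendor problem can be sketched as follows. This is a deliberately simplified illustration of the general idea (kernel-weighted empirical costs plus bootstrap resampling), not the paper's exact robust formulation; the bandwidth, cost parameters, candidate grid, and data are all invented.

```python
import math
import random

# Hedged sketch: order quantity from a Gaussian-kernel-weighted empirical
# newsvendor cost, plus a crude worst-case-over-bootstraps variant. Not
# the paper's method; an illustration of the ingredients only.

random.seed(1)

UNDERAGE, OVERAGE = 4.0, 1.0   # cost per unit of unmet / excess demand

def newsvendor_cost(q, d):
    return UNDERAGE * max(d - q, 0) + OVERAGE * max(q - d, 0)

def nw_weights(x0, xs, bandwidth=0.5):
    """Gaussian-kernel weights centred on the covariate context x0."""
    ks = [math.exp(-0.5 * ((x0 - x) / bandwidth) ** 2) for x in xs]
    s = sum(ks)
    return [k / s for k in ks]

def nw_decision(x0, xs, ds, grid):
    """Order quantity minimising the kernel-weighted empirical cost."""
    w = nw_weights(x0, xs)
    return min(grid, key=lambda q: sum(wi * newsvendor_cost(q, di)
                                       for wi, di in zip(w, ds)))

def bootstrap_robust_decision(x0, xs, ds, grid, n_boot=200):
    """Pick the q with the smallest worst-case weighted cost across
    bootstrap resamples of the training data."""
    def worst_cost(q):
        worst = 0.0
        for _ in range(n_boot):
            idx = [random.randrange(len(xs)) for _ in range(len(xs))]
            w = nw_weights(x0, [xs[i] for i in idx])
            cost = sum(wi * newsvendor_cost(q, ds[i])
                       for wi, i in zip(w, idx))
            worst = max(worst, cost)
        return worst
    return min(grid, key=worst_cost)

# Synthetic data: demand rises with the covariate x.
xs = [random.uniform(0, 2) for _ in range(60)]
ds = [10 + 5 * x + random.gauss(0, 1) for x in xs]
grid = [q / 2 for q in range(20, 50)]   # candidate orders 10.0 .. 24.5

q_nw = nw_decision(1.0, xs, ds, grid)
q_rb = bootstrap_robust_decision(1.0, xs, ds, grid)
print("plain NW order:", q_nw, " bootstrap-robust order:", q_rb)
```

The robust variant guards against the "gullible", over-calibrated decisions the abstract warns about by requiring the chosen order to perform acceptably on every resampled version of the data, not just the one observed sample.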
Decision theory and real estate development: a note on uncertainty
Real estate development appraisal is a quantification of future expectations. The appraisal model relies upon the valuer/developer having an understanding of the future in terms of the future marketability of the completed development and the future cost of development. In some cases the developer has some degree of control over the possible variation in the variables, as with the cost of construction through the choice of specification. However, other variables, such as the sale price of the final product, are totally dependent upon the vagaries of the market at the completion date. To try to address the risk of an outcome different from the one expected (modelled), the developer will often carry out a sensitivity analysis on the development. However, traditional sensitivity analysis has generally only looked at the best and worst scenarios and has focused on the anticipated or expected outcomes. This does not take into account uncertainty and the range of outcomes that can happen. A fuller analysis should include examination of the uncertainties in each of the components of the appraisal and account for the appropriate distributions of the variables. Similarly, as many of the variables in the model are not independent, the variables need to be correlated. This requires a standardised approach, and we suggest that the use of a generic forecasting software package, in this case Crystal Ball, allows the analyst to work with an existing development appraisal model set up in Excel (or other spreadsheet) and with a predetermined set of probability distributions. Without a full knowledge of risk, developers are unable to determine the anticipated level of return that should be sought to compensate for the risk. This model allows the user a better understanding of the possible outcomes for the development.
Ultimately, the final decision will be made relative to current expectations and current business constraints, but by assessing the upside and downside risks more appropriately, the decision maker should be better placed to make a more informed and "better" decision.
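The style of analysis the note advocates can be sketched without any proprietary tooling: sample the appraisal inputs from probability distributions, with sale price and build cost correlated, and read the risk off the simulated profit distribution. The distributions, correlation, and figures below are illustrative assumptions, not the authors' numbers (their worked approach uses Crystal Ball over an Excel appraisal model).

```python
import math
import random

# Hedged sketch of a Monte Carlo development appraisal with correlated
# inputs, in place of best/worst-case sensitivity analysis. All figures,
# distributions and the correlation are invented for illustration.

random.seed(42)

def simulate_profit(n=10000, rho=0.5):
    """Simulate the profit distribution with correlated price and cost."""
    profits = []
    for _ in range(n):
        z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
        # Cholesky trick: zc is standard normal with correlation rho to z1
        zc = rho * z1 + math.sqrt(1 - rho ** 2) * z2
        sale_price = 12.0 + 1.5 * z1    # gross development value, in millions
        build_cost = 8.0 + 1.0 * zc     # construction cost, in millions
        land_and_fees = 2.0             # treated as fixed here
        profits.append(sale_price - build_cost - land_and_fees)
    return profits

profits = sorted(simulate_profit())
mean = sum(profits) / len(profits)
downside = profits[int(0.05 * len(profits))]   # 5th-percentile outcome
p_loss = sum(p < 0 for p in profits) / len(profits)
print(f"mean profit: {mean:.2f}m, 5% downside: {downside:.2f}m, "
      f"P(loss) = {p_loss:.1%}")
```

Unlike a two-scenario sensitivity table, the output is a full distribution, so the developer can read off a probability of loss and a downside percentile rather than a single best/worst pair.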
The design research pyramid: a three layer framework
To support knowledge-based design development, considerable research has been conducted from various perspectives and at different levels. Research on knowledge-based design support systems, on generic design artefact and design process modelling, and on the inherent quality of design knowledge itself are examples of these perspectives. The structure underlying this research is not disparate but ordered. This paper provides an overview of some ontologies of design knowledge and a layered research framework for knowledge-based engineering design support. Three layers of research are clarified in this pattern: knowledge ontology, design knowledge model, and application. Specifically, the paper highlights ontologies of design knowledge by giving a set of classifications of design knowledge from different points of view. Within the discussion of design knowledge content ontology, two topologies, i.e. teleological and evolutionary, are identified.
