John R. Watret, Ph.D.
Chancellor Worldwide Campus
Dr. John Watret was named chancellor of Embry-Riddle's Worldwide Campus in 2012. Previously, beginning in 2010, he served as executive vice president and chief academic officer of the Worldwide Campus.
As chancellor, he provides leadership and sets strategic direction for Embry-Riddle Worldwide, which offers programs and schedules designed for working adults. Watret oversees all academic and operational functions of the campus, which comprises more than 150 classroom locations in the United States and internationally. More than 25,000 students are served annually through both traditional and online instruction.
Watret joined Embry-Riddle in 1989, and over the years held a number of management and faculty positions at the Daytona Beach Campus, including associate provost, associate chancellor, associate dean of academics and assistant, associate and full professor of mathematics. In the early 1990s, he took a brief leave of absence to serve as head of the department of mathematics for Texas A&M's branch campus in northern Japan. In 2006, Watret became associate vice president and chief academic officer for the Worldwide Campus.
During his tenure as a faculty member in the mathematics department, Watret was known as a dedicated and skilled instructor, winning Embry-Riddle's Outstanding Teaching Award in 1996. He is the author of several publications. He was one of the lead faculty who developed the Integrated Curriculum in Engineering (ICE) program through a grant from the Boeing Company. He continues to be active nationally in graduate education by serving on the executive committee of the Conference of Southern Graduate Schools.
Watret holds a Ph.D. in Mathematics and an M.S. in Mathematics, both from Texas A&M University, as well as a B.Sc. in Mathematics (honors) from Heriot-Watt University, Edinburgh, Scotland. He also has a private pilot's license.
Executive Leadership Panel Discussion: Moderated by Marc Bernier
The Executive Leadership Panel, Bio and Photo Links:
Richard H. Heist, Chief Academic Officer, ERAU
Nancee I. Bailey, Vice President for Student Affairs, ERAU
John R. Watret, Chancellor of Worldwide, ERAU
Becky L. Vasquez, Chief Technology Officer, ERAU
How is model-related uncertainty quantified and reported in different disciplines?
How do we know how much we know? Quantifying uncertainty associated with our
modelling work is the only way we can answer how much we know about any
phenomenon. With quantitative science now highly influential in the public
sphere and the results from models translating into action, we must support our
conclusions with sufficient rigour to produce useful, reproducible results.
Incomplete consideration of model-based uncertainties can lead to false
conclusions with real world impacts. Despite these potentially damaging
consequences, uncertainty consideration is incomplete both within and across
scientific fields. We take a unique interdisciplinary approach and conduct a
systematic audit of model-related uncertainty quantification from seven
scientific fields, spanning the biological, physical, and social sciences. Our
results show no single field is achieving complete consideration of model
uncertainties, but together we can fill the gaps. We propose opportunities to
improve the quantification of uncertainty through use of a source framework for
uncertainty consideration, model type specific guidelines, improved
presentation, and shared best practice. We also identify shared outstanding
challenges (uncertainty in input data, balancing trade-offs, error propagation,
and defining how much uncertainty is required). Finally, we make nine concrete
recommendations for current practice (following good practice guidelines and an
uncertainty checklist, presenting uncertainty numerically, and propagating
model-related uncertainty into conclusions), future research priorities
(uncertainty in input data, quantifying uncertainty in complex models, and the
importance of missing uncertainty in different contexts), and general research
standards across the sciences (transparency about study limitations and
dedicated uncertainty sections of manuscripts).
Comment: 40 Pages (including supporting information), 3 Figures, 2 Boxes, 1 Table
Insights into the quantification and reporting of model-related uncertainty across different disciplines
Quantifying uncertainty associated with our models is the only way we can express how much we know about any phenomenon. Incomplete consideration of model-based uncertainties can lead to overstated conclusions with real-world impacts in diverse spheres, including conservation, epidemiology, climate science, and policy. Despite these potentially damaging consequences, we still know little about how different fields quantify and report uncertainty. We introduce the "sources of uncertainty" framework, using it to conduct a systematic audit of model-related uncertainty quantification from seven scientific fields, spanning the biological, physical, and political sciences. Our interdisciplinary audit shows no field fully considers all possible sources of uncertainty, but each has its own best practices alongside shared outstanding challenges. We make ten easy-to-implement recommendations to improve the consistency, completeness, and clarity of reporting on model-related uncertainty. These recommendations serve as a guide to best practices across scientific fields and expand our toolbox for high-quality research.