Opening Up: Why Are We Not Using More Open Educational Resources?
Open Educational Resources (OER) have the potential to reduce the student dropout rate, make faculty course design easier, and allow faculty to contribute to and participate in the OER undertaking that is swiftly spreading through our nation. In addition, OERs can save students $3,000 over the course of their education. This work examines findings from a survey distributed via listservs and by direct email to faculty. The survey posed questions to determine why faculty were having a difficult time accepting and using OERs in their own classrooms. The major issue discovered was trust: 141 respondents felt that OERs could benefit students, but also felt that they had no oversight of the content used in the open resources. I propose a method of peer review that helps build the trust faculty need to begin using more OERs in their courses.
KanbanFlow
A review of KanbanFlow, a tool that helps organize projects. It was used for some time prior to this report to assess its usefulness for both education and librarianship.
A Novel Approach for Library Materials Acquisition using Discrete Particle Swarm Optimization
The academic library materials acquisition problem is a challenge for librarians, since libraries cannot obtain enough funding from their universities while the price of materials inflates greatly. In this paper, we analyze an integer mathematical model that selects materials for acquisition so as to maximize the average preference value as well as the budget execution rate under practical restrictions. The objective is to improve the Discrete Particle Swarm Optimization (DPSO) algorithm by adding a Simulated Annealing step to reduce premature convergence. Furthermore, the algorithm is implemented in a multithreaded environment. The experimental results show the efficiency of this approach.
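The paper's exact model, parameters, and threading scheme are not given in the abstract, but the core idea, a binary (discrete) PSO whose position updates pass through a simulated-annealing acceptance test, can be sketched briefly. The following Python sketch is illustrative only: the preference values, prices, budget, objective weights, and PSO/SA parameters are all made up.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical problem data: preference values and prices for candidate
# materials, plus a total acquisition budget (not from the paper).
pref = rng.uniform(1, 10, size=50)      # preference value of each material
price = rng.uniform(20, 200, size=50)   # price of each material
budget = 2000.0

def fitness(x):
    cost = price @ x
    if cost > budget or x.sum() == 0:
        return -1e9          # infeasible: over budget or empty selection
    # Weighted mix of average preference and budget execution rate, as a
    # stand-in for the paper's objective (the weight 5.0 is made up).
    return (pref @ x) / x.sum() + 5.0 * cost / budget

n_particles, n_items, iters = 20, len(pref), 300
pos = rng.integers(0, 2, size=(n_particles, n_items)).astype(float)
vel = rng.normal(0, 1, size=(n_particles, n_items))
pbest = pos.copy()
pbest_fit = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()
temp = 1.0                   # simulated-annealing temperature

for t in range(iters):
    for i in range(n_particles):
        r1, r2 = rng.random(n_items), rng.random(n_items)
        vel[i] = 0.7 * vel[i] + 1.5 * r1 * (pbest[i] - pos[i]) + 1.5 * r2 * (gbest - pos[i])
        # Binary PSO: the sigmoid of each velocity component gives the
        # probability of selecting the corresponding item.
        cand = (rng.random(n_items) < 1 / (1 + np.exp(-vel[i]))).astype(float)
        # SA-style acceptance: occasionally keep a worse candidate to escape
        # premature convergence, with probability decaying as temp cools.
        delta = fitness(cand) - fitness(pos[i])
        if delta > 0 or rng.random() < np.exp(delta / max(temp, 1e-9)):
            pos[i] = cand
        if fitness(pos[i]) > pbest_fit[i]:
            pbest_fit[i], pbest[i] = fitness(pos[i]), pos[i].copy()
    gbest = pbest[pbest_fit.argmax()].copy()
    temp *= 0.99             # cooling schedule

print(f"best fitness: {fitness(gbest):.2f}, cost: {price @ gbest:.0f}")

The SA acceptance step is what counters premature convergence: early on, at high temperature, the swarm occasionally keeps worse selections and keeps exploring; as the temperature cools, it behaves like plain DPSO.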
Comparing TensorFlow Deep Learning Performance Using CPUs, GPUs, Local PCs and Cloud
Deep learning is a very computationally intensive task. Traditionally, GPUs have been used to speed up computations by several orders of magnitude. TensorFlow is a deep learning framework designed to improve performance further by running on multiple nodes in a distributed system. While TensorFlow has only been available for a little over a year, it has quickly become the most popular open-source machine learning project on GitHub. The open-source version of TensorFlow was originally capable of running only on a single node, while only Google's proprietary version could leverage distributed systems. This has now changed. In this paper, we compare the performance of TensorFlow running on different single-node and cloud-node configurations. As an example, we train a convolutional neural network to detect the number of cells in early mouse embryos. From this research, we have found that a local node with a high-performance local GPU is still the best option for most people who do not have the resources to design bigger system implementations.
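The embryo images and the exact network from the paper are not available from the abstract, so the following is a minimal sketch of the kind of benchmark it describes, written against the current tf.keras API (the paper predates it), with dummy data standing in for the cell-count labels; the layer sizes and hyperparameters are arbitrary.

import numpy as np
import tensorflow as tf

# Dummy stand-in data: 64x64 grayscale images labeled with a cell count
# class of 0-3 (the embryo images from the paper are not reproduced here).
x = np.random.rand(256, 64, 64, 1).astype("float32")
y = np.random.randint(0, 4, size=256)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 1)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(4, activation="softmax"),  # one class per count
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# TensorFlow places operations on a GPU automatically when one is visible,
# so timing the same fit() call on a CPU-only machine, a local GPU machine,
# and a cloud node mirrors the paper's comparison.
print("GPUs visible:", tf.config.list_physical_devices("GPU"))
model.fit(x, y, epochs=2, batch_size=32)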
Highway infrastructure and building information modelling in the UK
Traditional methods of design are becoming less relevant and prevalent, due to the institutionalising of Building Information Modelling (BIM) within statutory regulations and the huge amount of data that BIM presents to practice, especially in 3D models. This can be seen in the A1 Dishforth-to-Barton road infrastructure improvement scheme, which comprises the A1 Dishforth-to-Leeming and A1 Leeming-to-Barton schemes. The traditional method of design was central to the A1 Dishforth-to-Leeming scheme, and BIM was central to the A1 Leeming-to-Barton scheme. This report therefore presents a comparative study of the traditional and BIM methods in relation to the A1 Dishforth-to-Barton improvement scheme through the perceptions of key professionals involved in the project. A qualitative research study was conducted through the use of an open-ended questionnaire intended to bridge gaps in perceptions and understanding of both methods. A judgemental sampling technique was used to select experienced respondents who understood and had participated in the A1 Dishforth-to-Barton road infrastructure improvement scheme. The study reveals an incontrovertibly complementary nature of the two methods, and finds that realisation of the 2016 mandate appears doubtful due to a lack of standardization, training, and awareness. It is highly recommended that a statutory incentivization framework for BIM be conceptualised and considered for implementation, to attract and encourage small-scale participants. Of high priority is the subsidization of in-house training by local authorities and of localized joint ventures by smaller companies for specialist training.
Albiglutide and cardiovascular outcomes in patients with type 2 diabetes and cardiovascular disease (Harmony Outcomes): a double-blind, randomised placebo-controlled trial
Background:
Glucagon-like peptide 1 receptor agonists differ in chemical structure, duration of action, and in their effects on clinical outcomes. The cardiovascular effects of once-weekly albiglutide in type 2 diabetes are unknown. We aimed to determine the safety and efficacy of albiglutide in preventing cardiovascular death, myocardial infarction, or stroke.
Methods:
We did a double-blind, randomised, placebo-controlled trial in 610 sites across 28 countries. We randomly assigned patients aged 40 years and older with type 2 diabetes and cardiovascular disease (at a 1:1 ratio) to groups that received either a subcutaneous injection of albiglutide (30–50 mg, based on glycaemic response and tolerability) or a matched volume of placebo once a week, in addition to their standard care. Investigators used an interactive voice or web response system to obtain treatment assignment, and patients and all study investigators were masked to their treatment allocation. We hypothesised that albiglutide would be non-inferior to placebo for the primary outcome of the first occurrence of cardiovascular death, myocardial infarction, or stroke, which was assessed in the intention-to-treat population. If non-inferiority was confirmed by an upper limit of the 95% CI for a hazard ratio of less than 1·30, closed testing for superiority was prespecified. This study is registered with ClinicalTrials.gov, number NCT02465515.
Findings:
Patients were screened between July 1, 2015, and Nov 24, 2016. 10 793 patients were screened and 9463 participants were enrolled and randomly assigned to groups: 4731 patients were assigned to receive albiglutide and 4732 patients to receive placebo. On Nov 8, 2017, it was determined that 611 primary endpoints and a median follow-up of at least 1·5 years had accrued, and participants returned for a final visit and discontinuation from study treatment; the last patient visit was on March 12, 2018. These 9463 patients, the intention-to-treat population, were evaluated for a median duration of 1·6 years and were assessed for the primary outcome. The primary composite outcome occurred in 338 (7%) of 4731 patients at an incidence rate of 4·6 events per 100 person-years in the albiglutide group and in 428 (9%) of 4732 patients at an incidence rate of 5·9 events per 100 person-years in the placebo group (hazard ratio 0·78, 95% CI 0·68–0·90), which indicated that albiglutide was superior to placebo (p<0·0001 for non-inferiority; p=0·0006 for superiority). The incidence of acute pancreatitis (ten patients in the albiglutide group and seven patients in the placebo group), pancreatic cancer (six patients in the albiglutide group and five patients in the placebo group), medullary thyroid carcinoma (zero patients in both groups), and other serious adverse events did not differ between the two groups. There were three (<1%) deaths in the placebo group that were assessed by investigators, who were masked to study drug assignment, to be treatment-related and two (<1%) deaths in the albiglutide group.
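As a reading aid only (not part of the trial's analysis code), the prespecified closed-testing rule from the Methods can be expressed as a few lines of Python, here applied to the reported hazard ratio of 0·78 (95% CI 0·68–0·90) and superiority p value of 0·0006:

# Hierarchical gate: superiority is tested only if non-inferiority holds.
# Inputs are taken from the abstract; the helper itself is illustrative.
def closed_test(hr_upper_ci: float, superiority_p: float,
                ni_margin: float = 1.30, alpha: float = 0.05) -> str:
    if hr_upper_ci >= ni_margin:
        return "non-inferiority not shown"
    if superiority_p < alpha:
        return "non-inferior and superior"
    return "non-inferior only"

print(closed_test(hr_upper_ci=0.90, superiority_p=0.0006))
# -> non-inferior and superior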
Interpretation:
In patients with type 2 diabetes and cardiovascular disease, albiglutide was superior to placebo with respect to major adverse cardiovascular events. Evidence-based glucagon-like peptide 1 receptor agonists should therefore be considered as part of a comprehensive strategy to reduce the risk of cardiovascular events in patients with type 2 diabetes.
Funding:
GlaxoSmithKline
Historical biogeography of the leopard (Panthera pardus) and its extinct Eurasian populations
Background: Resolving the historical biogeography of the leopard (Panthera pardus) is a complex issue, because patterns inferred from fossils and from molecular data lack congruence. Fossil evidence supports an African origin, and suggests that leopards were already present in Eurasia during the Early Pleistocene. Analysis of DNA sequences, however, suggests a more recent, Middle Pleistocene shared ancestry of Asian and African leopards. These contrasting patterns led researchers to propose a two-stage hypothesis of leopard dispersal out of Africa: an initial Early Pleistocene colonisation of Asia and a subsequent replacement by a second colonisation wave during the Middle Pleistocene. The status of Late Pleistocene European leopards within this scenario is unclear: were these populations remnants of the first dispersal, or do the last surviving European leopards share more recent ancestry with their African counterparts?
Results: In this study, we generate and analyse mitogenome sequences from historical samples that span the entire modern leopard distribution, as well as from Late Pleistocene remains. We find a deep bifurcation between African and Eurasian mitochondrial lineages (~ 710 Ka), with the European ancient samples as sister to all Asian lineages (~ 483 Ka). The modern and historical mainland Asian lineages share a relatively recent common ancestor (~ 122 Ka), and we find one Javan sample nested within these.
Conclusions: The phylogenetic placement of the ancient European leopard as sister group to Asian leopards suggests that these populations originate from the same out-of-Africa dispersal which founded the Asian lineages. The coalescence time found for the mitochondrial lineages aligns well with the earliest undisputed fossils in Eurasia, and thus encourages a re-evaluation of the identification of the much older putative leopard fossils from the region. The relatively recent ancestry of all mainland Asian leopard lineages suggests that these populations underwent a severe population bottleneck during the Pleistocene. Finally, although only based on a single sample, the unexpected phylogenetic placement of the Javan leopard could be interpreted as evidence for exchange of mitochondrial lineages between Java and mainland Asia, calling for further investigation into the evolutionary history of this subspecies.
Comparison of Orbit Propagators in the Research and Development Goddard Trajectory Determination System (R&D GTDS). Part I: Simulated Data
R&D GTDS is Draper Laboratory's research-based orbit determination testbed. This analysis tool evolved from its R&D counterpart at the Goddard Space Flight Center. Dr. Paul Cefola, Program Manager at Draper Laboratory and Lecturer at the Massachusetts Institute of Technology, has overseen the development and expansion of this testbed (by a team of scientists at Draper Laboratory and a continuing string of graduate students at MIT) over the past twenty years (see Fonte). In its current form, R&D GTDS is capable of performing... This paper evaluates the performance of various orbit propagation theories for artificial earth satellites in different orbital regimes. Specifically, R&D GTDS's Cowell (numerical), DSST (semianalytical), and SGP, SGP4, and Brouwer-Lyddane (analytic) orbit propagators are compared for decaying circular (~200 km perigee height), low-altitude circular (590 km perigee height), high-altitude circular (1340 km perigee height), Molniya, and geosynchronous orbits. All test cases implement a one-orbital-period differential correction fit to simulated data derived from a Cowell truth trajectory. These fits are followed by a one-orbital-period predict with the DC solve-for vector. Trajectory comparisons are made with the Cowell truth trajectory over both the fit and predict spans. Computation time and RMS errors are used as comparison metrics. The Unix-based version of R&D GTDS (NPS SUN Sparc 10) is the test platform used in this analysis.
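R&D GTDS itself cannot be reproduced here, but the fit/predict comparison methodology can be illustrated at toy scale. In the Python sketch below (illustrative only: the orbit, force model, step size, and spans are made up, and the differential correction step is omitted), a Cowell-style numerical integration with J2 serves as the truth trajectory, an analytic two-body circular propagation plays the role of the simpler theory, and position RMS is computed separately over a one-period fit span and a one-period predict span.

import numpy as np

MU = 398600.4418               # km^3/s^2, Earth gravitational parameter
J2, RE = 1.08263e-3, 6378.137  # J2 coefficient and Earth radius (km)

def accel(r):
    # Cowell-style force model: two-body gravity plus the J2 perturbation.
    x, y, z = r
    rn = np.linalg.norm(r)
    a = -MU * r / rn**3
    k = 1.5 * J2 * MU * RE**2 / rn**5
    a += k * np.array([x * (5 * (z / rn)**2 - 1),
                       y * (5 * (z / rn)**2 - 1),
                       z * (5 * (z / rn)**2 - 3)])
    return a

def rk4_step(r, v, dt):
    # One fourth-order Runge-Kutta step of the coupled (r, v) system.
    k1r, k1v = v, accel(r)
    k2r, k2v = v + 0.5 * dt * k1v, accel(r + 0.5 * dt * k1r)
    k3r, k3v = v + 0.5 * dt * k2v, accel(r + 0.5 * dt * k2r)
    k4r, k4v = v + dt * k3v, accel(r + dt * k3r)
    return (r + dt / 6 * (k1r + 2 * k2r + 2 * k3r + k4r),
            v + dt / 6 * (k1v + 2 * k2v + 2 * k3v + k4v))

a0 = 6978.0                    # km: ~600 km altitude circular orbit (made up)
n = np.sqrt(MU / a0**3)        # two-body mean motion, rad/s
period = 2 * np.pi / n
dt = 10.0
steps = int(2 * period / dt)   # one fit period followed by one predict period

r, v = np.array([a0, 0, 0.0]), np.array([0, np.sqrt(MU / a0), 0.0])
err = []
for i in range(steps):
    r, v = rk4_step(r, v, dt)                  # "truth" (numerical, with J2)
    th = n * (i + 1) * dt                      # analytic two-body solution
    r_kep = a0 * np.array([np.cos(th), np.sin(th), 0.0])
    err.append(np.linalg.norm(r - r_kep))

half = steps // 2
rms = lambda e: float(np.sqrt(np.mean(np.square(e))))
print(f"fit-span RMS:     {rms(err[:half]):8.2f} km")
print(f"predict-span RMS: {rms(err[half:]):8.2f} km")

As in the paper's results, the error of the simpler theory grows over the predict span, which is what the RMS-over-span metric is designed to expose.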
Teaching Quantum Computing with Videos
Quantum computing will likely change our world as we know it today. By bringing together fields such as quantum mechanics, mathematics, and computer science in their construction and application, quantum computers hold the promise of a tremendous increase in processing power compared to present-day technology. For an educator, teaching this topic requires consideration of the fact that quantum computing is not only based on hard-to-teach aspects of physics, math, and computing, but is itself still in the early stages of its development. What is certain is that algorithms such as Shor's and Grover's have already been developed specifically for quantum computers. Both algorithms benefit from quantum properties such as superposition and entanglement, which are at the core of the new technology. This paper is presented in a bottom-up structure, starting with an explanation of the basic component, the qubit, before moving to superposition and entanglement and ending with a lucid explanation of both algorithms. Due to the complex nature of the topic, teaching this subject requires that certain measures be taken to reach students at all levels and help them feel comfortable with the material. The course is meant for anyone who wants to learn quantum computing, and it is recommended that attendees come with no prior background so that everyone starts from the same baseline. To welcome a variety of students into the course, the explanations of the topics avoid the excessive amount of math common to other courses in quantum computing. The recommended teaching approach uses videos and a curriculum broken down so that topics are made more general; the discussion of that effort is the main focus.
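To make the bottom-up progression concrete, here is a minimal Python sketch (not from the paper or its videos) of the first two concepts in that sequence, using plain numpy state vectors: a Hadamard gate puts a single qubit into superposition, and a CNOT then entangles two qubits into a Bell state.

import numpy as np

# Single-qubit basis state |0> and the Hadamard gate.
ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Superposition: H|0> = (|0> + |1>)/sqrt(2), so each outcome has probability 1/2.
plus = H @ ket0
print("P(0), P(1) after H:", np.abs(plus)**2)

# Two-qubit system: H on the first qubit, then CNOT (first qubit controls).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
state = CNOT @ np.kron(plus, ket0)   # Bell state (|00> + |11>)/sqrt(2)

# Entanglement: only |00> and |11> have nonzero probability, so measuring
# one qubit fixes the outcome of the other.
print("P(00), P(01), P(10), P(11):", np.round(np.abs(state)**2, 3))

Superposition and entanglement, demonstrated here in a dozen lines, are exactly the properties that Shor's and Grover's algorithms exploit, which is why the paper places them before the algorithms in the teaching sequence.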
Quantum computing will likely change our world as we know it today. In bringing together fields such as quantum mechanics, mathematics, and computer science in their construction and application, quantum computers hold a promise of tremendous increase in processing power when compared to present day technology. For an educator, teaching this topic requires that consideration be given to the fact that quantum computing is not only based on hard-to-teach aspects of physics, math, and computing, but that quantum computing itself is still in the early stages of its development. What is certain is that concepts such as Shor’s and Grover’s algorithms have already been developed specifically for use with quantum computing. Both algorithms are the beneficiaries of quantum properties such as superposition and entanglement which are at the core of the new technology. This paper is presented in a bottom up structure starting with the explanation of the basic component, the qubit, before moving to superposition and entanglement and ending with a lucid explanation of both of the algorithms. Due to the complex nature of the topic, teaching this subject requires that certain measures be taken to teach all students at all levels and have them feel comfortable within themselves about the topic. The course is meant for anyone who wants to learn quantum computing. It is furthermore recommended that those attending have a blank slate so we start with new students. We would like to see a variety of students taking and passing the course, with that in mind, the explanations of the topics are done without employing an excessive amount of math common to other courses in quantum computing. The recommended approach for teaching is through the use of videos and a curriculum which is broken down so topics are made more general and the discussion of that effort is the main focu