
    Business Model validation for a marketplace of lab network initiatives

    In the field of science, technology, engineering, and mathematics (STEM), the use of laboratories to support teaching is a common requirement, not just a possibility. With the rise of the internet, teaching laboratories have changed from 'traditional' hands-on equipment to configurations that allow remote use of the experiment materials. In recent years, online labs (e.g., laboratories of universities or research institutes) have gradually been integrated into 'networks' of labs, with the objective of making them more economically viable; otherwise they would be short-lived, given the high cost of their development and maintenance. While research on online labs has focused on didactic and technical aspects, there seem to be no in-depth studies on the financial sustainability of the technical solutions developed. Moreover, online solutions subvert the traditional pattern whereby access is limited to individuals working within the organizations that operate the labs. Indeed, online laboratories can also be used by professionals and companies interested in research and development, testing, and training activities. The authors of this article frame the problem from the perspective of the servitization of university and research-institution labs, through a new business model: a marketplace capable of coordinating the network of labs. To do this, an analysis of the intention to use an online lab marketplace and the activities made available by the online labs is conducted. The analysis involves entrepreneurs and practitioners from companies in diverse industries in northern Italy. The analysis is twofold. First, it presents a survey of the intention to use university labs and lab network initiatives (LNIs) in a business environment. Second, it assesses the usefulness of a marketplace service that technically manages the relationship between service provider and buyer beyond purely educational aspects.

    Towards Making World-Class Universities: Case Study of the Role of Information and Communication Technology

    Characteristics of a world-class university include indicators such as quality of faculty, research reputation, talented undergraduates, international presence, proper use of resources, alliances and networks, coverage of many disciplines, technological sophistication, good management practices, and internationalism in all aspects of the university. Thus, economic activity, innovation, international diversity, institutional indicators and research indicators are manifestations of a world-class university. In today's digitally connected world, it is impossible to attain this status without a world-class Information and Communication Technology (ICT) infrastructure. This paper presents the impact of ICT activities on the rankings of three Nigerian universities, thus supporting their quest for world-class status.

    Fomento de las competencias experimentales utilizando recursos complementarios [Fostering experimental competences using complementary resources]

    [EN] The use of ICT in the academic context is a reality in the world we live in. The young generation of students are digital natives, immersed in a virtual world for a considerable part of their day. This has an impact on their lives, including on their education. In undergraduate engineering education, laboratory classes are an integral part of the curriculum. These days, many laboratory classes combine traditional hands-on labs with online labs (remote and virtual labs) and several experimental resources. A "blended" or "hybrid" approach to experimental learning seems the most effective for students' experimental learning and the development of competences. Still, this technologically mediated resource affects the way students learn, and the literature still lacks works characterizing didactical implementations that use a "blended" or "hybrid" approach and their impact on students' learning and on the way they construct their knowledge. In the electric and electronic engineering topic, and using the remote laboratory VISIR, very few works reported in the literature describe small-scale didactical experiments. The problem that motivated this work was the need to understand the impact that different didactical approaches using this methodology (the simultaneous use of several experimental resources) have on students' academic results. Ultimately, this work intends to help fill a gap identified in the literature: identifying factors (including possible student characteristics) that affect students' learning and engagement in the electric and electronic circuits topic when using the remote lab VISIR along with other complementary resources. To accomplish this end, four research questions were posed, each taking into account a set of factors in a specific field of inquiry and their influence on students' results. The first research question addressed how the several experimental resources could be combined and the effect of this combination on students. The second dealt with the influence of the characteristics of the proposed VISIR tasks on students' results. The third tackled aspects of teacher mediation that could be linked to better student performance. Finally, the last research question investigated whether there were student characteristics more associated with good learning outcomes and engagement. Considering these objectives, a multi-case study research methodology was chosen, using a mixed-methods approach and resorting mainly to questionnaires, interviews, documental analysis and observation as data-gathering methods, and to statistical analysis (descriptive and inferential) and content analysis as data-analysis techniques. A large-scale study was conducted, covering 26 courses (43 didactical implementations using VISIR in total, as some courses ran more than one edition), comprising 1794 students and involving 52 different teachers. The study took place in several higher education institutions (and, to a lesser extent, in some technological and high schools) in Argentina, Brazil and Portugal. In the southern hemisphere these didactical implementations took place in the 2016 and 2017 academic years, while in the northern hemisphere it was possible to collect data from three semesters between the 2016/17 and 2018/19 academic years.
The study focused on analysing each didactical implementation (its characteristics and the teachers' usage and perception) and the corresponding students' results (usage, academic results and perception). Ethical questions were addressed to guarantee both students' and teachers' privacy when using participants' data. These data were used only for the purposes of this study, and participation was reported anonymously, as can be observed both in the information collected for the analysis and in the transcripts throughout the text. The study included the analysis of the data collected from various sources, the interpretation of the results using several analysis techniques, and their convergence in a process of triangulation. These results, discussed against the literature, allowed the four research questions to be answered as completely as possible. Based on them, conclusions were drawn to identify factors that may foster students' learning and engagement. The study also contributed to the advancement of knowledge in this research area. It allowed the conclusion that VISIR and this methodology can be as useful for introductory courses as for more advanced ones (dealing with this topic), as long as teachers plan the didactical implementation according to the type of course and the students' background. Moreover, this VISIR-based methodology can be applied with great success to courses that have no experimental component and whose contents are not directly related to the electricity and electronics topic. In such courses (e.g., theoretical mathematics courses), VISIR can be used for contextualization, providing more interesting and appealing learning environments. Finally, both teachers' perceptions and students' results suggest that VISIR's target audience is the students who require more support in their learning, that is, those still struggling with difficulties rather than the more proficient students.
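The inferential side of the analysis described above can be pictured with a minimal sketch: comparing the academic results of students who made heavy use of VISIR against those who used it little, with a non-parametric test. The file name, column names and usage threshold below are illustrative assumptions, not the study's actual instruments or data.

```python
# Hypothetical sketch: compare final grades of high- vs. low-VISIR-usage students.
# File name, column names and the usage threshold are assumptions for illustration.
import pandas as pd
from scipy import stats

df = pd.read_csv("visir_implementation.csv")          # one row per student (assumed layout)
high = df[df["visir_tasks_completed"] >= 10]["final_grade"]
low = df[df["visir_tasks_completed"] < 10]["final_grade"]

# Descriptive statistics for each group
print(high.describe(), low.describe(), sep="\n")

# Non-parametric comparison, since grade distributions are often non-normal
u_stat, p_value = stats.mannwhitneyu(high, low, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.4f}")
```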

    Enhancing EJsS with extension plugins

    Easy JavaScript Simulations (EJsS) is an open-source tool that allows teachers with limited programming experience to straightforwardly bundle an interactive computer science or engineering simulation into an HTML + JavaScript webpage. Its prominent place in physics (where it has won several prizes) should not hinder its application in other fields (such as building the front end of remote laboratories or learning analytics) once part of the functionality of EJsS has been adapted to them. To facilitate the future inclusion of new functionalities in EJsS, this paper presents a new version of the tool that allows EJsS to be enhanced, incorporating new tools and changing its graphical user interface, by means of extension plugins (special software libraries). To illustrate the benefits of this distributable, self-contained, non-intrusive strategy, the paper (a) discusses the new methodological possibilities that the plugins bring to EJsS developers and users, and (b) presents three plugins: one to support plugin management and the other two to easily set up a streamlined remote laboratory. Moreover, the paper also presents the main characteristics of that remote lab, so that readers can take advantage of EJsS and the three plugins to quickly set up new online experiments for their students.
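The general idea of an extension plugin as described above can be illustrated with a small sketch. This is not the EJsS plugin API (EJsS itself is a Java/JavaScript authoring tool); the Python classes and method names below are hypothetical and only show the pattern: a core application discovers self-contained plugin libraries and lets them contribute new tools and user-interface entries without modifying the core.

```python
# Generic illustration of the extension-plugin pattern; NOT the actual EJsS API.
# All class and method names here are hypothetical.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class Plugin:
    name: str
    # Menu label -> action contributed by the plugin
    menu_entries: Dict[str, Callable[[], None]] = field(default_factory=dict)


@dataclass
class CoreApp:
    plugins: List[Plugin] = field(default_factory=list)

    def register(self, plugin: Plugin) -> None:
        # The plugin is self-contained: the core only stores it and exposes its entries.
        self.plugins.append(plugin)

    def menu(self) -> Dict[str, Callable[[], None]]:
        # The user interface is assembled by merging entries from every registered plugin.
        entries: Dict[str, Callable[[], None]] = {}
        for p in self.plugins:
            entries.update(p.menu_entries)
        return entries


app = CoreApp()
app.register(Plugin("remote-lab", {"Open remote lab": lambda: print("connecting...")}))
app.menu()["Open remote lab"]()
```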

    Topics in Educational Cyber-Physical Labs: Configurations, Data Collection and Analysis

    Recent advances in remote sensing and actuation technologies, coupled with the large reach of the internet, have allowed for the emergence of applications such as cyber-physical labs. Cyber-physical labs are the digital, remotely accessible equivalent of the lab equipment students use in school to experiment with, accessed through web-based interfaces such as web applications. Students, teachers and lab owners derive value from these systems; they are our stakeholders. Students are the intended users, teachers are the educational content curators and lab owners are the service providers. In this thesis, we take a close look at issues pertaining to cyber-physical labs and propose new approaches to address them. We also analyze the use of such systems in a MOOC, to detect the impact of students' experimental behavior on their academic performance. First, we tackle the generation of web apps interfacing cyber-physical labs. This is the equivalent of preparing experiments for teachers by arranging the same equipment for multiple experiments. We propose an extension to the Smart Device Specification for cyber-physical labs, and a tool which generates these apps based on it. The automatically generated apps implement the functions necessary to use a cyber-physical lab and are ready to be integrated into online learning platforms. Next, we investigate issues related to the collection and retrieval of the data students generate through their interaction with cyber-physical labs. We consider the needs of students and lab owners. Through questionnaires sent to both parties, we elicit the requirements for an activity-tracking infrastructure composed of a vocabulary and an architectural model. The proposed vocabulary ensures that value can be derived from the recorded activity, and the proposed architecture addresses the privacy and data access issues pertaining to students and lab owners respectively. We evaluate our proposal with two example cyber-physical labs. Last, we collect the interaction data from a cyber-physical lab used in a MOOC. We devise computational analyses of the students' activity statistics, in search of indicators of academic performance. We find that high- and low-performing students show statistically different activity statistics. We then sequence the steps students took in an experiment, and do not find any statistically significant patterns distinguishing low- and high-performing students. This analysis provides insights into the usage of installed facilities to service potentially massive access to limited resources (lab installations), and sheds light on possible indicators of academic performance.
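An activity-tracking record and the kind of per-student statistics mentioned above might look like the following minimal sketch. The field names are illustrative assumptions, not the vocabulary actually proposed in the thesis.

```python
# Hypothetical sketch of activity-tracking records and per-student activity statistics.
# Field names are illustrative, not the thesis's proposed vocabulary.
from collections import Counter, defaultdict
from datetime import datetime, timezone

events = [
    {"student": "s1", "lab": "pendulum", "action": "set_parameter",
     "timestamp": datetime(2020, 3, 1, 10, 0, tzinfo=timezone.utc)},
    {"student": "s1", "lab": "pendulum", "action": "start_experiment",
     "timestamp": datetime(2020, 3, 1, 10, 1, tzinfo=timezone.utc)},
    {"student": "s2", "lab": "pendulum", "action": "start_experiment",
     "timestamp": datetime(2020, 3, 1, 10, 2, tzinfo=timezone.utc)},
]

# Activity statistics: how many actions of each type every student performed.
stats = defaultdict(Counter)
for e in events:
    stats[e["student"]][e["action"]] += 1
print(dict(stats))
```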

    Peeking into the black box: visualising learning activities

    Learning analytics has emerged as the discipline that fosters the learning process based on monitored data. As learning is a complex process that is not limited to a single environment, it benefits from a holistic approach in which events in different contexts and settings are observed and combined. This work proposes an approach to increase this coverage. Detailed information is obtained by combining logs from an LMS with events recorded on a virtual machine given to the students. A set of visualisations is then derived from the collected events, showing previously hidden aspects of an experience that can be presented to the teaching staff for their consideration. The visualisations focus on different learning outcomes, such as self-learning, use of industrial tools, time management, information retrieval, collaboration, etc. Depending on the information to convey, different types of visualisations are considered, ranging from graphs to sunbursts and from scatter plots to heatmaps. Work partially funded by the projects: Adaptation of learning scenarios in the .LRN platform based on Contextualized Attention Metadata (CAM) (DE2009-0051), Learn3 ("Plan Nacional de I+D+I" TIN2008-05163/TSI), EEE ("Plan Nacional de I+D+I" TIN 2011-28308-C03-01), and eMadrid: Investigación y desarrollo de tecnologías para el e-learning en la Comunidad de Madrid (S2009/TIC-1650).
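The combination of LMS logs with virtual-machine events and their rendering as a visualisation can be pictured with the sketch below. The file names and column names are assumptions for illustration; the heatmap shown is only one of the visualisation types the paper mentions.

```python
# Hypothetical sketch: merge LMS logs with VM events and plot an activity heatmap.
# File names and column names are assumptions for illustration only.
import pandas as pd
import matplotlib.pyplot as plt

lms = pd.read_csv("lms_log.csv", parse_dates=["timestamp"])    # assumed: student, action, timestamp
vm = pd.read_csv("vm_events.csv", parse_dates=["timestamp"])   # assumed: student, tool, timestamp

events = pd.concat([lms.assign(source="LMS"), vm.assign(source="VM")], ignore_index=True)
events["hour"] = events["timestamp"].dt.hour
events["weekday"] = events["timestamp"].dt.day_name()

# Heatmap of activity by weekday and hour, across both environments.
pivot = events.pivot_table(index="weekday", columns="hour",
                           values="source", aggfunc="count").fillna(0)
plt.imshow(pivot, aspect="auto", cmap="viridis")
plt.yticks(range(len(pivot.index)), pivot.index)
plt.xlabel("hour of day")
plt.ylabel("weekday")
plt.colorbar(label="events")
plt.show()
```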

    Fine-Grained Workflow Interoperability in Life Sciences

    Recent decades have witnessed an exponential increase in available biological data due to advances in key technologies for the life sciences. Specialized computing resources and scripting skills are now required to deliver results in a timely fashion: desktop computers or monolithic approaches can no longer keep pace with either the growth of available biological data or the complexity of analysis techniques. Workflows offer an accessible way to counter this trend by facilitating the parallelization and distribution of computations. Given their structured and repeatable nature, workflows also provide a transparent process that satisfies the strict reproducibility standards required by the scientific method.
One of the goals of our work is to assist researchers in accessing computing resources without the need for programming or scripting skills. To this end, we created a toolset able to integrate any command-line tool into workflow systems. Out of the box, our toolset supports two widely used workflow systems, and our modular design allows for seamless additions in order to support further workflow engines. Recognizing the importance of early and robust workflow design, we also extended a well-established, desktop-based analytics platform that contains more than two thousand tasks (each being a building block for a workflow), allows easy development of new tasks, and is able to integrate external command-line tools. We developed a converter plug-in that offers a user-friendly mechanism to execute workflows on distributed high-performance computing resources, an exercise that would otherwise require technical skills typically not associated with the average life scientist's profile. Our converter extension generates virtually identical versions of the same workflows, which can then be executed on more capable computing resources. That is, not only did we leverage the capacity of distributed high-performance resources and the convenience of a workflow engine designed for personal computers, but we also circumvented the computing limitations of personal computers and the steep learning curve associated with creating workflows for distributed environments. Our converter extension has immediate applications for researchers, and we showcase our results by means of three use cases relevant for life scientists: structural bioinformatics, immunoinformatics and metabolomics.
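The core idea of wrapping an arbitrary command-line tool as a workflow task can be illustrated with a minimal sketch. This is not the toolset described in the abstract; the function name, interface and chaining convention below are assumptions chosen for illustration.

```python
# Generic sketch of wrapping a command-line tool as a workflow step.
# Not the toolset described above; names and interfaces are illustrative assumptions.
import subprocess
from pathlib import Path


def run_tool(executable: str, args: list[str], workdir: Path) -> Path:
    """Run one command-line step and return the directory holding its outputs."""
    workdir.mkdir(parents=True, exist_ok=True)
    result = subprocess.run([executable, *args], cwd=workdir,
                            capture_output=True, text=True, check=True)
    # Keep the captured output as provenance, supporting reproducibility.
    (workdir / "stdout.log").write_text(result.stdout)
    return workdir


# Two chained steps form a minimal workflow: each step reads the previous step's outputs.
step1 = run_tool("echo", ["preprocessing done"], Path("wf/step1"))
step2 = run_tool("echo", [f"analysing outputs in {step1}"], Path("wf/step2"))
```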

    Exploring Information Technologies to Support Shotgun Proteomics

    Shotgun proteomics refers to the direct analysis of complex protein mixtures to create a profile of the proteins present in the cell. These profiles can be used to study the underlying biological basis of cancer development. Closely studying the profiles as the cancer proliferates reveals the molecular interactions in the cell and gives researchers clues to potential drug targets for treating the disease. A little more than a decade old, shotgun proteomics is a relatively new form of discovery, one that is data intensive and requires complex data analysis. Early studies indicated a gap between the ability to analyze biological samples with a mass spectrometer and the information systems available to process and analyze this data. This thesis reflects on an automated proteomic information system at the University of Colorado Central Analytical Facility, where investigators are using cutting-edge proteomic techniques to analyze melanoma cell lines responsible for skin cancer in patients. The thesis provides insight into key design processes in the development of an Oracle relational database and automation system to support high-throughput shotgun proteomics in the facility. It also discusses significant contributions, technologies, software, a data standard, and leaders in the field developing solutions and products in proteomics.
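A relational schema for storing shotgun-proteomics results might look like the following minimal sketch. The table and column names are assumptions, not the facility's actual Oracle design, and sqlite3 stands in for the Oracle database purely for illustration.

```python
# Hypothetical sketch of a relational schema for shotgun-proteomics results.
# sqlite3 stands in for the Oracle database; table and column names are assumptions.
import sqlite3

conn = sqlite3.connect("proteomics.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS sample (
    sample_id   INTEGER PRIMARY KEY,
    cell_line   TEXT NOT NULL,
    acquired_on TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS protein_hit (
    hit_id         INTEGER PRIMARY KEY,
    sample_id      INTEGER NOT NULL REFERENCES sample(sample_id),
    accession      TEXT NOT NULL,       -- protein identifier reported by the search engine
    spectral_count INTEGER NOT NULL
);
""")
conn.execute("INSERT INTO sample (cell_line, acquired_on) VALUES (?, ?)",
             ("melanoma_A375", "2010-06-01"))
conn.commit()
```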

    Network Security Concepts, Dangers, and Defense Best Practices

    In today's highly interconnected world, network security has become a critical aspect of protecting organizations from cyber-attacks. The increasing sophistication of attackers and their ability to exploit software and firmware vulnerabilities pose significant dangers to the security of networks. However, many organizations neglect the essential steps required to secure their networks, leading to an increased risk of security breaches. In this research article, we address this issue by investigating network security concepts, potential dangers, and practical defense strategies. We begin by exploring the different types of cyber-attacks and their sources, highlighting the various ways attackers exploit network vulnerabilities. We also examine the reasons why organizations often overlook network security and the consequences of not prioritizing it. To better understand the complexity of network security, we categorize the different security concerns using the CIA (confidentiality, integrity, and availability) triad. This approach allows us to identify the various areas of vulnerability and their potential impact on network security. Next, we focus on the most crucial basic concepts and steps involved in various network security operations. We outline the best practices and practical approaches organizations can take to improve their network security, including implementing security policies and procedures, using encryption and authentication methods, and conducting regular security assessments. By highlighting the importance of network security and providing practical guidance on how organizations can defend against cyber-attacks, we hope to raise awareness and help prevent security breaches. Keywords: Network, Internet, Security, Security Threats, IP Address, Network Attack, Attackers. DOI: 10.7176/CEIS/14-2-03. Publication date: March 31st 202
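The integrity and authenticity concerns of the CIA triad mentioned above, and the use of authentication methods as a defense practice, can be illustrated with a minimal sketch using a keyed message-authentication code. Key handling is deliberately simplified here; a real deployment would manage and rotate keys through a proper key-management process.

```python
# Minimal sketch: message authentication with HMAC, illustrating the integrity and
# authenticity aspects of the CIA triad. Key handling is simplified for brevity.
import hashlib
import hmac
import secrets

key = secrets.token_bytes(32)                  # shared secret (must be distributed securely)
message = b"config-change: open port 443"

tag = hmac.new(key, message, hashlib.sha256).digest()

# The receiver recomputes the tag and compares it in constant time.
ok = hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).digest())
print("message authentic:", ok)
```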