
    Study Examining New Technologies and Sustainable Development with a Focus on Social Entrepreneurship

    Today's economy is characterized by rapid change, and adaptability is the key to success in such an environment. The history of economic theory shows that industrial development and the advancement of a community's economy rest on new ideas and innovations: without being at the forefront of science and innovation, a country is unlikely to progress along the path of development. Entrepreneurship is an important tool for achieving this goal. In an economic system based on entrepreneurship, innovators and the owners of ideas are among the most important drivers of the system's advancement. Entrepreneurship is closely related to economic and social development, and it is today considered an important indicator of development in developing countries. Given the special role and position of entrepreneurs in economic growth and community development, many governments in developed and leading countries are attempting to cultivate community members with entrepreneurial characteristics, aiming to maximize opportunities and exploit research achievements in order to promote entrepreneurship education and entrepreneurial activity. By promoting entrepreneurship and providing an environment conducive to growth and development, it becomes possible to address current problems associated with entrepreneurship, as well as the unemployment of university graduates and of the many other unemployed persons. Research has shown that entrepreneurship can contribute to economic growth through a variety of channels. In the new theories of growth, knowledge spillovers play this role: once the economy reaches a steady state, per-capita income growth is possible only through knowledge growth, which yields more efficient production technologies with greater productivity.
With this context in mind, the intersection of social entrepreneurship, technology development, and sustainable development is of great importance in today's world. Developing entrepreneurship is essential to meeting these needs and achieving these goals.

    Composable local memory organisation for streaming applications on embedded MPSoCs

    Multi-Processor Systems on a Chip (MPSoCs) are suitable platforms for the implementation of complex embedded applications. An MPSoC is composable if the functional and temporal behaviour of each application is independent of the absence or presence of other applications. Composability is required for application design and analysis in isolation, and for integration with linear effort. In this paper we propose a composable organisation for the top level of a memory hierarchy. This organisation preserves the short (one-cycle) access time desirable for a processor's frequent local accesses and enables the predictability demanded by real-time applications. We partition the local memory into two blocks: one private, for local tile data, and one shared, for inter-tile data communication. To avoid interference between applications, we instantiate one such shared local memory block and one Remote Direct Memory Access (RDMA) engine for each application running on the processor. We implement this organisation on an MPSoC with two processors on an FPGA. On this platform we execute a composition of applications consisting of a JPEG decoder and a synthetic application. Our experiments indicate that an application's timing is not affected by the behaviour of another application; composability is thus achieved. Moreover, the use of the RDMA component leads to a 45% performance increase on average for a number of workloads covering a large range of communication/computation ratios.
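
    The per-application partitioning described above can be sketched in a few lines. The following is a toy software model, not code from the paper (all names, such as AppMemory and RdmaEngine, are illustrative): each application owns a private block, a shared block, and a dedicated RDMA engine, so no access by one application can touch another's state.

```python
class RdmaEngine:
    """Per-application RDMA engine: copies words into the application's
    own shared block without involving the processor."""
    def __init__(self, shared_block):
        self.shared = shared_block

    def transfer_in(self, offset, words):
        # Write incoming inter-tile data into this application's shared block.
        for i, w in enumerate(words):
            self.shared[offset + i] = w


class AppMemory:
    """One application's view of the local memory: a private block for
    local tile data and a shared block for inter-tile communication."""
    def __init__(self, private_size, shared_size):
        self.private = [0] * private_size      # local tile data
        self.shared = [0] * shared_size        # inter-tile data
        self.rdma = RdmaEngine(self.shared)    # dedicated RDMA engine


# One AppMemory per application: no block is ever shared between
# applications, which is what makes the organisation composable.
jpeg = AppMemory(private_size=256, shared_size=64)
synthetic = AppMemory(private_size=256, shared_size=64)

jpeg.rdma.transfer_in(0, [1, 2, 3])
# The synthetic application's blocks are untouched by the transfer:
assert synthetic.shared == [0] * 64
```

    In hardware the same isolation argument applies cycle by cycle: because each application's accesses hit only its own blocks and RDMA engine, its timing cannot depend on the other applications.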

    Uganda's experience in Ebola virus disease outbreak preparedness, 2018-2019.

    BACKGROUND: Since the declaration of the 10th Ebola Virus Disease (EVD) outbreak in the Democratic Republic of the Congo (DRC) on 1 August 2018, several neighboring countries have been developing and implementing preparedness efforts to prevent cross-border transmission of EVD and to enable timely detection, investigation, and response in the event of a confirmed outbreak in the country. We describe Uganda's experience in EVD preparedness. RESULTS: On 4 August 2018, the Uganda Ministry of Health (MoH) activated the Public Health Emergency Operations Centre (PHEOC) and the National Task Force (NTF) for public health emergencies to plan, guide, and coordinate EVD preparedness in the country. The NTF selected an Incident Management Team (IMT), which constituted a National Rapid Response Team (NRRT) that supported activation of the District Task Forces (DTFs) and District Rapid Response Teams (DRRTs); together they assessed levels of preparedness in 30 designated high-risk districts, comprising category 1 (20 districts) and category 2 (10 districts). The MoH, with technical guidance from the World Health Organisation (WHO), led EVD preparedness activities and worked together with other ministries and partner organisations to enhance community-based surveillance systems, develop and disseminate risk-communication messages, engage communities, reinforce EVD screening and infection-prevention measures at Points of Entry (PoEs) and in high-risk health facilities, construct and equip EVD isolation and treatment units, and establish coordination and procurement mechanisms. CONCLUSION: As of 31 May 2019, there was no confirmed case of EVD in Uganda, and the country has continued to make significant and verifiable progress in EVD preparedness. There is a need to sustain these efforts, not only for EVD preparedness but across the entire spectrum of a multi-hazard framework.
These efforts strengthen country capacity and compel the country to commit resources for preparedness and management of incidents at the source, while effectively cutting the costs of a "fire-fighting" approach during public health emergencies.

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION: Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE: We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS: Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively.
Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations imposed by low testing proportions. We also highlight the early-warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION: Sustained investment in diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can serve as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    Power analysis side channel attacks: the processor design-level context

    The rapid increase in the use of embedded systems for performing secure transactions has proportionally increased the security threats faced by such devices. A side channel attack, a sophisticated security threat to embedded devices such as smartcards, mobile phones, and PDAs, exploits external manifestations such as processing time, power consumption, and electromagnetic emission to identify the internal computations. A power analysis attack, introduced by Kocher in 1998, is used by adversaries to eavesdrop on confidential data while the device is executing a secure transaction. The adversary observes the power trace dissipated/consumed by the chip during the encryption/decryption of an AES cryptographic program and predicts the secret key used for encryption by extracting the necessary information from the power trace. Countermeasures proposed to overcome power analysis include data masking, table masking, current flattening, circuit-level solutions, dummy instruction insertion, balancing bit-flips, etc. All these techniques are either susceptible to higher-order side channel attacks, not sufficiently generic to cover all encryption algorithms, or burden the system with high area cost, run time, or energy consumption. The initial solution presented in this thesis is a HW/SW-based randomised instruction injection technique, which infuses random instructions at random places during the execution of an application. Such randomisation obfuscates the secure information in the power profile, preventing the adversary from extracting the critical power segments for analysis. Further, the author devised a systematic method to measure the security level of a power sequence and used it to determine the number of random instructions needed to suitably confuse the adversary.
The proposed processor model costs 1.9% in additional area for a SimpleScalar processor, and on average 29.8% in runtime and 27.1% in additional energy consumption for six industry-standard cryptographic algorithms. This design is extended to a processor architecture which automatically detects the execution of the most common encryption algorithms, scrambles the power waveform by adding randomly placed instructions with random register accesses, and stops injecting instructions when it is safe to do so. This approach has lower overheads than previous solutions and avoids software instrumentation, allowing programmers with no special knowledge to use the system. The extended processor model costs an additional 1.2% in area, and on average 25% in runtime and 28.5% in energy overheads for industry-standard cryptographic algorithms. Because random injections can be removed by averaging over a large number of samples (their random nature means that a large number of samples will eliminate the noise), the author proposes a multiprocessor 'algorithmic' balancing technique. This technique uses a dual-processor architecture in which two processors execute the same program in parallel but with complementary intermediate data, thus balancing the bit-flips. The second processor works in conjunction with the first for balancing only when encryption is performed; both processors carry out independent tasks when no encryption is being performed. Both DES and AES cryptographic programs are investigated for balancing, and the author shows that this technique is economical while completely preventing power analysis attacks. The signature detection unit used in the instruction injection approach to capture encryption is also utilised here. This multiprocessor balancing approach reduces performance by only 0.42% and 0.94% for AES and DES, respectively; the hardware increase is 2X only while balancing is performed.
Further, several future extensions of the balancing approach are proposed, introducing random swapping of encryption iterations between cores. FPGA implementations of these processor designs are briefly described at the end of the thesis.
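
The 'algorithmic' balancing idea can be illustrated with a toy Hamming-weight power model, a common leakage model in power analysis. The code and names below are illustrative, not from the thesis: when a second core computes the bitwise complement of every intermediate value, the combined Hamming weight is constant, so a leakage model of this kind reveals nothing about data or key.

```python
WORD_BITS = 8
MASK = (1 << WORD_BITS) - 1

def hamming_weight(x):
    # Number of set bits: a standard first-order power leakage model.
    return bin(x).count("1")

def round_step(state, key_byte):
    # Stand-in for one cipher round's intermediate computation.
    return (state ^ key_byte) & MASK

def balanced_round(state, key_byte):
    """Core 1 computes the real intermediate; core 2 computes its bitwise
    complement. A Hamming-weight power model then sees a constant total,
    independent of both the data and the key."""
    v1 = round_step(state, key_byte)
    v2 = v1 ^ MASK  # complementary intermediate on the second core
    total_weight = hamming_weight(v1) + hamming_weight(v2)
    return v1, total_weight

# The modelled leakage (total Hamming weight) is identical for every input:
weights = {balanced_round(s, k)[1] for s in range(256) for k in (0x3A, 0xC5)}
assert weights == {WORD_BITS}
```

Real attacks and countermeasures are of course more involved (noise, higher-order leakage, imperfect balancing in silicon), but this is the core reason complementary intermediate data flattens a bit-flip-based power profile.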

    Fine-Grained Checkpoint Recovery for Application-Specific Instruction-Set Processors


    Composability and Predictability for Independent Application Development, Verification, and Execution

    System-on-chip (SoC) design is getting increasingly complex, as a growing number of applications are integrated in modern systems. Some of these applications have real-time requirements, such as a minimum throughput or a maximum latency. To reduce cost, system resources are shared between applications, making their timing behavior inter-dependent. Real-time requirements must hence be verified for all possible combinations of concurrently executing applications, which is not feasible with commonly used simulation-based techniques. This chapter addresses this problem using two complexity-reducing concepts: composability and predictability. Applications in a composable system are completely isolated and cannot affect each other's behavior, enabling them to be verified independently. Predictable systems, on the other hand, provide lower bounds on performance, allowing applications to be verified using formal performance analysis. We present five techniques to achieve composability and/or predictability in SoC resources and explain their implementation for the processors, interconnect, and memories in our platform.
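
    One classic way to make a shared resource composable is time-division multiplexing (TDM) arbitration. The sketch below is a minimal illustrative model, not taken from the chapter (the actual five techniques differ in detail): each application owns fixed slots in a repeating table, so the service it receives is independent of the other applications' load.

```python
def tdm_schedule(table, queues, cycles):
    """Simulate a TDM arbiter for a shared resource.

    table: slot index -> application id owning that slot.
    queues: per-application count of pending requests.
    Returns how many requests each application got served."""
    served = {app: 0 for app in queues}
    for cycle in range(cycles):
        app = table[cycle % len(table)]  # owner of this cycle's slot
        if queues[app] > 0:              # only the owner may be served
            queues[app] -= 1
            served[app] += 1
    return served

table = ["A", "B", "A", "C"]  # application A owns half the slots

# A's service is identical whether B and C are idle or fully saturated,
# which is exactly the isolation that composability demands:
light = tdm_schedule(table, {"A": 100, "B": 0, "C": 0}, 40)
heavy = tdm_schedule(table, {"A": 100, "B": 100, "C": 100}, 40)
assert light["A"] == heavy["A"] == 20
```

    The same table also yields predictability: an application owning k of n slots is guaranteed at least k service opportunities every n cycles, a lower bound usable in formal performance analysis.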