    A synthetic biology based cell line engineering pipeline

    An ideal host cell line for deriving cell lines with high recombinant protein production should be stable, predictable, and amenable to rapid cell engineering or other forms of phenotypic manipulation. In the past few years we have employed genomic information to identify “safe harbors” for exogenous gene integration in Chinese hamster ovary (CHO) cells, deployed systems modeling and optimization to design pathways and control strategies that modify important aspects of recombinant protein productivity, and established a synthetic biology approach to implement genetic changes, all with the goal of creating a pipeline for producing “designer” cell lines. CHO cells are the preferred platform for protein production. However, the Chinese hamster genome is unstable in its ploidy and is subject to long and short deletions, duplications, and translocations. In addition, gene expression is subject to epigenetic changes, including DNA methylation, histone modification, and heterochromatin invasion, further complicating transgene expression for protein production in cell lines. With these issues in mind, we set out to engineer a CHO cell line highly amenable to stable protein production using a synthetic biology approach.

    We compiled karyotyping and chromosome number data for several CHO cell lines and sublines, identified genomic regions with a high frequency of copy number gain and loss using comparative genome hybridization (CGH), and verified structural variants using sequencing data. We further used ATAC (Assay for Transposase-Accessible Chromatin) sequencing to study chromatin accessibility and epigenetic stability within the CHO genome. RNA-seq data from multiple cell lines were also used to identify regions with high transcriptional activity. Analysis of these data allowed the identification of several “safe harbor” loci that could be used for cell engineering. Based on these results, we engineered an IgG-producing cell line with a single copy of the product transgene as a template cell line. The product gene site is flanked by sequences for recombinase-mediated cassette exchange, allowing easy substitution of the IgG-producing gene with an alternative product gene. Furthermore, a “landing pad” for multi-gene cassette insertion was integrated into the genome at an additional site. Together, these sites allowed new cell lines producing a fusion protein and erythropoietin to be derived from the template cell line.

    To enable rapid assembly of product transgenes and genetic elements for engineering cell attributes into multi-gene cassettes, we adopted a Golden Gate-based synthetic biology approach. The assembly of genetic parts into multi-gene cassettes in a LEGO-like fashion allowed different combinations of genes, under the control of various promoters, to be generated quickly for introduction into the template cell line. Using this engineered CHO cell line, we set out to study metabolism and product protein glycosylation for cell engineering. To guide the selection of genetic elements for cell engineering, we developed a multi-compartment kinetic model, as well as a flux model of energy metabolism and glycosylation. Transcriptome meta-data were used extensively to identify the genes and isoforms expressed in the cell line and to estimate enzyme levels in the model. The flux model was used to identify, and the LEGO-like platform to implement, genetic changes that can alter the glycosylation pattern of the IgG produced by the template cell line. Concurrently, we employed a systems optimization approach to identify genetic alterations in the metabolic pathway that guide cell metabolism toward a favorable state. The model predictions are being implemented experimentally using the synthetic biology approach. In conclusion, we have illustrated a pipeline for rational cell line engineering that integrates genomic science, systems engineering, and synthetic biology approaches. The promise, technical challenges, and possible limitations will be discussed in this presentation.
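
    A note on the modeling step: flux models of the kind mentioned above are commonly solved as flux balance problems, i.e., a linear program over a stoichiometric matrix S with the steady-state constraint S v = 0 and bounds on the fluxes v. The Python sketch below illustrates that computation on a hypothetical two-metabolite, four-reaction toy network; the reactions, bounds, and objective are illustrative assumptions, not the authors' energy-metabolism or glycosylation model.

        # Minimal flux balance analysis (FBA) sketch: maximize product flux
        # subject to steady-state mass balance S @ v = 0 and flux bounds.
        # Toy network (hypothetical): uptake -> GLC, glycolysis GLC -> PRE,
        # product PRE -> secreted protein, biomass PRE -> growth.
        import numpy as np
        from scipy.optimize import linprog

        S = np.array([
            [1, -1,  0,  0],   # GLC balance
            [0,  1, -1, -1],   # PRE balance
        ])

        # Flux bounds (arbitrary units): uptake capped at 10, and a minimum
        # biomass (growth) demand of 2 is enforced.
        bounds = [(0, 10), (0, 10), (0, None), (2, None)]

        # linprog minimizes, so negate the objective to maximize product flux.
        c = [0, 0, -1, 0]
        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        print(dict(zip(["uptake", "glycolysis", "product", "biomass"], res.x)))
        # Expected optimum: uptake = glycolysis = 10, product = 8, biomass = 2.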

    An intergenerational study of perceptions of changes in active free play among families from rural areas of Western Canada

    Background: Children's engagement in active free play has declined across recent generations. Therefore, the purpose of this study was to examine perceptions of intergenerational changes in active free play among families from rural areas. We addressed two research questions: (1) How has active free play changed across three generations? (2) What suggestions do participants have for reviving active free play? Methods: Data were collected via 49 individual interviews with members of 16 families (15 grandparents, 16 parents, and 18 children) residing in rural areas/small towns in the Province of Alberta (Canada). Interview recordings were transcribed verbatim and subjected to thematic analysis guided by an ecological framework of active free play. Results: Factors that depicted the changing nature of active free play were coded in the themes of less imagination/more technology, safety concerns, surveillance, other children to play with, purposeful physical activity, play spaces/organized activities, and the good parenting ideal. Suggestions for reviving active free play were coded in the themes of enhance facilities to keep kids entertained, provide more opportunities for supervised play, create more community events, and decrease use of technology. Conclusions: These results reinforce the need to consider multiple levels of social ecology in the study of active free play, and highlight the importance of community-based initiatives to revive active free play in ways that are consistent with contemporary notions of good parenting.

    Biodiversity Monitoring in Long-Distance Food Supply Chains: Tools, Gaps and Needs to Meet Business Requirements and Sustainability Goals

    Rampant loss of biodiversity and ecosystem services undermines the resilience of food systems. Robust knowledge of impacts is the first step toward taking action, but long-distance food supply chains and indirect effects on and around farms make understanding impacts a challenge. This paper looks at the tools available to businesses in the food industry, especially retailers, to monitor and assess the biodiversity performance of their products. It groups tools according to their general scope to evaluate what is monitored (processes on-site, pressures on landscapes, impacts on species), at what scale (specific products, company performance, country-wide consumption levels), and against which baseline (pristine nature, alternative scenarios, governance targets). Altogether, we find that criteria for biodiversity are either missing or weak in certification and standards, business accounting and reporting systems, and scientific modelling and analysis (biodiversity footprints). At the same time, massive investments have been made to strengthen existing tools, develop new ones, increase uptake, and improve their effectiveness. We argue that business can and must take a leading role in mitigating biodiversity impacts in partnership with policy makers and customers. Zero-deforestation commitments, for example, will need to be upheld by supporting changed practices in consumption (e.g., choice editing), and combating degradation within agricultural systems will require a shift toward more regenerative forms of farming (e.g., with norms embedded in robust standard systems). Operational targets are integral to monitoring biodiversity performance across all scales.

    Assessing the Sustainability of EU Timber Consumption Trends: Comparing Consumption Scenarios with a Safe Operating Space Scenario for Global and EU Timber Supply

    The growing demand for wood to meet EU renewable energy targets has increasingly come under scrutiny for potentially increasing EU import dependence and inducing land use change abroad, with associated impacts on the climate and biodiversity. This article builds on research accounting for levels of primary timber consumption (e.g., toward forest footprints) and developing reference values for benchmarking sustainability (e.g., toward land use targets) in order to improve systemic monitoring of timber and forest use. Specifically, it looks at future trends to assess how current EU policy may impact forests at the EU and global scales. Future demand scenarios are based on projections derived and adapted from the literature to depict developments under different scenario assumptions. Results reveal that by 2030, EU consumption levels on a per capita basis are estimated to be increasingly disproportionate compared to the rest of the world. EU consumption scenarios based on meeting around a 40% share of the EU renewable energy targets with timber would overshoot both the EU and global reference value ranges for sustainable supply capacities in 2030. Overall, the findings support literature pointing to an increased risk of problem shifting related to both how much timber is sourced for meeting renewable energy targets and where it is sourced. It is argued that a sustainable level of timber consumption should be characterized by a balance between supply (what the forest can provide on a sustainable basis) and demand (how much is used on a per capita basis, considering the concept of fair shares). To this end, future research should close data gaps, increase methodological robustness, and address the socio-political legitimacy of the safe operating space concept as a basis for targets. Re-use of timber within the economy should be supported to increase supply options.
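
    To make the benchmarking logic concrete, the short Python sketch below compares a projected consumption scenario with a per-capita reference value range for sustainable supply. All numbers are placeholders for illustration; they are not the article's published scenario figures or reference values.

        # Hedged sketch: classify a consumption scenario against a per-capita
        # "safe operating space" corridor. All figures below are hypothetical.
        def classify(consumption_m3: float, population: int,
                     safe_range: tuple) -> str:
            per_capita = consumption_m3 / population
            low, high = safe_range
            if per_capita > high:
                return f"{per_capita:.2f} m3/capita overshoots the range {low}-{high}"
            if per_capita < low:
                return f"{per_capita:.2f} m3/capita is below the range {low}-{high}"
            return f"{per_capita:.2f} m3/capita is within the range {low}-{high}"

        # Hypothetical 2030 scenario: 900 million m3 consumed by 450 million
        # people, against an assumed corridor of 1.2-1.6 m3 per capita per year.
        print(classify(900e6, 450_000_000, (1.2, 1.6)))  # 2.00 m3/capita overshoots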

    ExSeisDat: A set of parallel I/O and workflow libraries for petroleum seismology

    Seismic data-sets are extremely large and are broken into data files ranging in size from 100s of GiBs to 10s of TiBs and larger. The parallel I/O for these files is complex due to the amount of data along with the varied and multiple access patterns within individual files. Properties of legacy file formats, such as the de facto standard SEG-Y, also decrease developer productivity when working with these files. SEG-Y files embed their own internal layout, which can conflict with traditional, file-system-level layout optimization schemes. Additionally, as seismic files continue to increase in size, memory bottlenecks will be exacerbated, creating the need for smart I/O optimization not only to increase the efficiency of reads and writes, but to manage memory usage as well. The ExSeisDat (Extreme-Scale Seismic Data) set of libraries addresses these problems through the development and implementation of easy-to-use, object-oriented libraries that are portable and open source, with bindings available in multiple languages. The lower-level parallel I/O library, ExSeisPIOL (Extreme-Scale Seismic Parallel I/O Library), targets SEG-Y and other proprietary formats, simplifying I/O by internally interfacing with MPI-I/O and other I/O interfaces. The I/O is explicitly handled; end users only need to define the memory limits, the decomposition of I/O across processes, and the data access patterns when reading and writing data. ExSeisPIOL bridges the layout gap between the SEG-Y file structure and the file system organization. The higher-level parallel seismic workflow library, ExSeisFlow (Extreme-Scale Seismic workFlow), leverages ExSeisPIOL, further simplifying I/O by implicitly handling all I/O parameters, thus allowing geophysicists to focus on domain-specific development. Operations in ExSeisFlow focus on prestack processing and can be performed on single traces, individual gathers, and entire surveys, including out-of-core sorting, binning, filtering, and transforming. To optimize memory management, a workflow reads in only the data pertinent to the operations being performed rather than an entire file. A smart caching system manages the read data, discarding it when it is no longer needed in the workflow. As the libraries are optimized to handle spatial and temporal locality, they are a natural fit for burst buffer technologies, particularly DDN’s Infinite Memory Engine (IME) system. With appropriate access semantics, or through direct exploitation of the low-level interfaces, the ExSeisDat stack on IME delivers a significant improvement in I/O performance over standalone parallel file systems like Lustre.
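
    To make the explicit-decomposition idea concrete, the Python sketch below performs the kind of block-decomposed, collective trace read that ExSeisPIOL manages internally, written directly against MPI-I/O via mpi4py. It does not use the ExSeisDat API: the header sizes follow the standard SEG-Y layout, while the known trace count, known samples-per-trace, and big-endian IEEE sample format are simplifying assumptions (real SEG-Y data often stores IBM floats, which need extra conversion).

        # Block-decomposed, collective trace read from a SEG-Y-style file.
        # Illustrative only; not the ExSeisDat/ExSeisPIOL API.
        import numpy as np
        from mpi4py import MPI

        TEXT_HDR = 3200     # textual file header (bytes)
        BIN_HDR = 400       # binary file header (bytes)
        TRACE_HDR = 240     # per-trace header (bytes)

        def read_my_traces(path: str, ntraces: int, ns: int) -> np.ndarray:
            comm = MPI.COMM_WORLD
            rank, size = comm.Get_rank(), comm.Get_size()

            trace_bytes = TRACE_HDR + 4 * ns           # header + 4-byte samples
            # Contiguous block decomposition of traces across ranks.
            per_rank = (ntraces + size - 1) // size
            first = rank * per_rank
            count = max(0, min(per_rank, ntraces - first))

            fh = MPI.File.Open(comm, path, MPI.MODE_RDONLY)
            buf = np.empty(count * trace_bytes, dtype=np.uint8)
            offset = TEXT_HDR + BIN_HDR + first * trace_bytes
            fh.Read_at_all(offset, buf)                # collective read of this rank's slice
            fh.Close()

            # Strip the per-trace headers and reinterpret the samples as
            # big-endian IEEE floats (an assumption; IBM floats are common).
            data = np.ascontiguousarray(buf.reshape(count, trace_bytes)[:, TRACE_HDR:])
            return data.view(">f4").reshape(count, ns)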

    High-Intensity Interval Training Improves Cognitive Flexibility in Older Adults

    Introduction: Regular aerobic exercise is associated with better executive function in older adults. It is unclear whether high-intensity interval training (HIIT) elicits superior improvements in executive function in older adults compared with moderate-intensity continuous training (MICT) or resistance training (RT). We hypothesized that HIIT would augment executive function more than MICT and RT. Methods: Sixty-nine older adults (age: 68 ± 7 years) performed six weeks (three days/week) of HIIT (2 × 20 min bouts alternating between 15 s intervals at 100% of peak power output (PPO) and passive recovery (0% PPO); n = 24), MICT (34 min at 60% PPO; n = 19), or whole-body RT (eight exercises, 2 × 10 repetitions; n = 26). Cardiorespiratory fitness (i.e., V̇O2max) and executive function were assessed before and after each intervention via a progressive maximal cycle ergometer protocol and the Stroop Task, respectively. Results: The V̇O2max findings revealed a significant group by time interaction (p = 0.001) in which all groups improved following training, but HIIT and MICT improved more than RT. From pre- to post-training, no interaction was observed in the naming condition of the Stroop Task (p > 0.10). However, a group by time interaction was observed in switching (cognitive flexibility), with only the HIIT group exhibiting a faster reaction time (from 1250 ± 50 to 1100 ± 50 ms; p < 0.001). Conclusion: Despite similar improvements in cardiorespiratory fitness, HIIT, but not MICT or RT, enhanced cognitive flexibility in older adults. Exercise programs should consider using HIIT protocols in an effort to combat cognitive decline in older adults.
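
    For concreteness, the interval arithmetic of the HIIT protocol above (two 20-minute bouts alternating 15 s of work at 100% PPO with 15 s of passive recovery) can be tallied as in the Python snippet below; this is illustrative bookkeeping only, not part of the study's methods.

        # HIIT session structure from the abstract: 2 x 20 min bouts of
        # 15 s work (100% PPO) alternating with 15 s passive recovery (0% PPO).
        BOUTS, BOUT_S, WORK_S, REST_S = 2, 20 * 60, 15, 15

        cycles_per_bout = BOUT_S // (WORK_S + REST_S)   # 40 work/rest cycles
        total_work_min = BOUTS * cycles_per_bout * WORK_S / 60
        print(f"{cycles_per_bout} cycles per bout, "
              f"{total_work_min:.0f} min of work across the session")  # 20 min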

    A scoping review on the conduct and reporting of scoping reviews

    Background: Scoping reviews are used to identify knowledge gaps, set research agendas, and identify implications for decision-making. The conduct and reporting of scoping reviews are inconsistent in the literature. We conducted a scoping review to identify: papers that utilized and/or described scoping review methods; guidelines for reporting scoping reviews; and studies that assessed the quality of reporting of scoping reviews. Methods: We searched nine electronic databases for published and unpublished literature on scoping review papers, scoping review methodology, and reporting guidance for scoping reviews. Two independent reviewers screened citations for inclusion. Data abstraction was performed by one reviewer and verified by a second reviewer. Quantitative (e.g. frequencies of methods) and qualitative (i.e. content analysis of the methods) syntheses were conducted. Results: After screening 1525 citations and 874 full-text papers, 516 articles were included, of which 494 were scoping reviews. The 494 scoping reviews were disseminated between 1999 and 2014, with 45% published after 2012. Most of the scoping reviews were conducted in North America (53%) or Europe (38%), and reported a public source of funding (64%). The number of studies included in the scoping reviews ranged from 1 to 2600 (mean of 118). Using the Joanna Briggs Institute methodology guidance for scoping reviews, only 13% of the scoping reviews reported the use of a protocol, 36% used two reviewers for selecting citations for inclusion, 29% used two reviewers for full-text screening, 30% used two reviewers for data charting, and 43% used a pre-defined charting form. In most cases, the results of the scoping review were used to identify evidence gaps (85%), provide recommendations for future research (84%), or identify strengths and limitations (69%). We did not identify any guidelines for reporting scoping reviews or studies that assessed the quality of scoping review reporting. Conclusion: The number of scoping reviews conducted per year has steadily increased since 2012. Scoping reviews are used to inform research agendas and identify implications for policy or practice. As such, improvements in reporting and conduct are imperative. Further research on scoping review methodology is warranted and, in particular, there is a need for a guideline to standardize reporting.