Mathematics, Department of

The Rooted SCJ Median with Single Gene Duplications

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2018-10-01
Abstract: 

The median problem is a classical problem in genome rearrangements. It aims to compute a gene order that minimizes the sum of the genomic distances to k ≥ 3 given gene orders. This problem is intractable except in the related Single-Cut-or-Join and breakpoint rearrangement models. Here we consider the rooted median problem, where we assume one of the given genomes to be ancestral to the median, which is itself ancestral to the other genomes. We show that in the Single-Cut-or-Join model with single gene duplications, the rooted median problem is NP-hard. We also describe an Integer Linear Program for solving this problem, which we apply to simulated data, showing high accuracy of the reconstructed medians.
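In the SCJ model, a genome can be represented as a set of gene-extremity adjacencies, and the SCJ distance between two genomes over the same gene content is the size of the symmetric difference of their adjacency sets; the median score is then the sum of these distances. A minimal sketch of that idea (the representation and names are illustrative, not the chapter's code, and it ignores the duplication machinery):

```python
# Sketch of the SCJ distance on adjacency sets (illustrative, not the
# chapter's implementation; duplications are not handled here).
def scj_distance(genome_a, genome_b):
    """SCJ distance = size of the symmetric difference of adjacency sets."""
    return len(genome_a ^ genome_b)

def scj_median_score(candidate, genomes):
    """Total SCJ distance from a candidate median to the k given genomes."""
    return sum(scj_distance(candidate, g) for g in genomes)

# Adjacencies as frozensets of gene extremities, e.g. ("1", "h") = head of gene 1.
A = {frozenset([("1", "h"), ("2", "t")]), frozenset([("2", "h"), ("3", "t")])}
B = {frozenset([("1", "h"), ("2", "t")]), frozenset([("2", "h"), ("4", "t")])}
print(scj_distance(A, B))  # 2: one adjacency unique to each genome
```

The median problem asks for the adjacency set minimizing `scj_median_score`; the NP-hardness result above shows that, once single gene duplications enter the rooted setting, no efficient exact algorithm is expected.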

Document type: 
Book chapter
Counting, Generating, Analyzing and Sampling Tree Alignments

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2018-03-03
Abstract: 

Pairwise ordered tree alignments are combinatorial objects that appear in important applications, such as RNA secondary structure comparison. However, the usual representation of tree alignments as supertrees is ambiguous, i.e. two distinct supertrees may induce identical sets of matches between identical pairs of trees. This ambiguity is uninformative, and detrimental to any probabilistic analysis. In this work, we consider tree alignments up to equivalence. Our first result is a precise asymptotic enumeration of tree alignments, obtained from a context-free grammar by means of basic analytic combinatorics. Our second result focuses on alignments between two given ordered trees S and T. By refining our grammar to align specific trees, we obtain a decomposition scheme for the space of alignments, and use it to design an efficient dynamic programming algorithm for sampling alignments under the Gibbs-Boltzmann probability distribution. This generalizes existing tree alignment algorithms, and opens the door to a probabilistic analysis of the space of suboptimal alignments.

Document type: 
Article

Detecting Introgression in Anopheles Mosquito Genomes using a Reconciliation-Based Approach

Peer reviewed: 
No, item is not peer reviewed.
Date created: 
2018-08-03
Abstract: 

Introgression is an important evolutionary mechanism in insect and animal evolution. Current methods for detecting introgression rely on the analysis of phylogenetic incongruence, using either statistical tests based on expected phylogenetic patterns in small phylogenies or probabilistic modeling in a phylogenetic network context. Introgression leaves a phylogenetic signal similar to horizontal gene transfer, and it has been suggested that its detection can also be approached through the gene tree / species tree reconciliation framework, which accounts jointly for other evolutionary mechanisms such as gene duplication and gene loss. However, so far the use of a reconciliation-based approach to detect introgression has not been investigated on large datasets. In this work, we apply this principle to a large dataset of Anopheles mosquito genomes. Our reconciliation-based approach recovers the extensive introgression that occurs in the gambiae complex, although with some variations compared to previous reports. Our analysis also suggests a possible ancient introgression event involving the ancestor of An. christyi.

Document type: 
Conference presentation
Clustered Maximum Weight Clique Problem Instances - Data set

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2017-03
Document type: 
Dataset

Ancestral Gene Synteny Reconstruction Improves Extant Species Scaffolding

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2015
Abstract: 

We exploit the methodological similarity between ancestral genome reconstruction and extant genome scaffolding. We present a method, called ARt-DeCo, that constructs neighborhood relationships between genes or contigs, in both ancestral and extant genomes, in a phylogenetic context. It is able to handle dozens of complete genomes, including genes with complex histories, by using gene phylogenies reconciled with a species tree, that is, annotated with speciation, duplication and loss events. Reconstructed ancestral or extant synteny comes with a support computed from an exhaustive exploration of the solution space. We compare our method with a previously published one that pursues the same goal on a small number of genomes with universal unicopy genes. We then test it on the whole Ensembl database of 69 eukaryote species, proposing partial ancestral genome structures as well as a more complete scaffolding for many partially assembled genomes. We carefully analyze a couple of extant adjacencies proposed by our method, and show that they are indeed real links in the extant genomes that are missing from the current assemblies. On a reduced data set of 39 eutherian mammals, we estimate the precision and sensitivity of ARt-DeCo by simulating fragmentation in some well-assembled genomes and measuring how many adjacencies are recovered. We find a very high precision, while the sensitivity depends on the quality of the data and on the proximity of closely related genomes.

Document type: 
Article
Evolution of Genes Neighborhood Within Reconciled Phylogenies: An Ensemble Approach

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2015
Abstract: 

Context

The reconstruction of evolutionary scenarios for whole genomes in terms of genome rearrangements is a fundamental problem in evolutionary and comparative genomics. The DeCo algorithm, recently introduced by Bérard et al., computes parsimonious evolutionary scenarios for gene adjacencies, from pairs of reconciled gene trees. However, as for many combinatorial optimization algorithms, there can exist many co-optimal, or slightly sub-optimal, evolutionary scenarios that deserve to be considered.

Contribution

We extend the DeCo algorithm to sample evolutionary scenarios from the whole solution space under the Boltzmann distribution, and also to compute Boltzmann probabilities for specific ancestral adjacencies.

Results

We apply our algorithms to a dataset of mammalian gene trees and adjacencies, and observe a significant reduction of the number of syntenic conflicts observed in the resulting ancestral gene adjacencies.
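Under the Boltzmann distribution, each evolutionary scenario is sampled with probability proportional to exp(−score / kT), so co-optimal and slightly sub-optimal scenarios all receive mass. The extended DeCo algorithm achieves this inside its dynamic programming tables without enumerating scenarios; the sketch below only illustrates the target distribution over an explicitly enumerated set (all names and parameter values are illustrative):

```python
# Illustrative Boltzmann sampling over an explicit list of scenarios.
# The actual algorithm samples from DP tables without enumeration.
import math
import random

def boltzmann_sample(scenarios, scores, kT=1.0, rng=random):
    """Sample one scenario with probability proportional to exp(-score / kT)."""
    weights = [math.exp(-s / kT) for s in scores]
    total = sum(weights)
    r = rng.uniform(0.0, total)
    acc = 0.0
    for scenario, w in zip(scenarios, weights):
        acc += w
        if r <= acc:
            return scenario
    return scenarios[-1]  # guard against floating-point round-off

# Low kT concentrates mass on parsimonious scenarios; high kT flattens it.
rng = random.Random(42)
picks = [boltzmann_sample(["opt", "subopt"], [0.0, 5.0], kT=1.0, rng=rng)
         for _ in range(10)]
```

The temperature-like parameter kT controls how sharply the distribution favors parsimonious scenarios, which is what makes Boltzmann probabilities for specific ancestral adjacencies meaningful supports.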

Document type: 
Article
The Impact of Implementing a Test, Treat and Retain HIV Prevention Strategy in Atlanta among Black Men Who Have Sex with Men with a History of Incarceration: A Mathematical Model

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2015
Abstract: 

Background

Annually, 10 million adults transition through prisons or jails in the United States (US) and the prevalence of HIV among entrants is three times higher than that for the country as a whole. We assessed the potential impact of increasing HIV Testing/Treatment/Retention (HIV-TTR) in the community and within the criminal justice system (CJS) facilities, coupled with sexual risk behavior change, focusing on black men who have sex with men, aged 15–54 years, in Atlanta, USA.

Methods

We modeled the effect of a HIV-TTR strategy on the estimated cumulative number of new (acquired) infections and mortality, and on the HIV prevalence at the end of ten years. We additionally assessed the effect of increasing condom use in all settings.

Results

In the Status Quo scenario, at the end of 10 years, the cumulative numbers of new infections in the community, jail, and prison were 9246, 77, and 154 cases, respectively; HIV prevalence was 10,815, 69, and 152 cases, respectively; and the cumulative numbers of deaths were 2585, 18, and 34, respectively. By increasing HIV-TTR coverage, the cumulative number of new infections could decrease by 15% in the community, 19% in jail, and 8% in prison; HIV prevalence could decrease by 8%, 9%, and 7%, respectively; and mortality could decrease by 20%, 39%, and 18%, respectively. The model results also indicate that limited use of and access to condoms have contributed to HIV incidence and prevalence in all settings.

Conclusions

Aggressive implementation of a CJS-focused HIV-TTR strategy has the potential to interrupt HIV transmission and reduce mortality, with benefit to the community at large. To maximize the impact of these interventions, retention in treatment, including during the period after jail and prison release, and increased condom use are vital for decreasing the burden of the HIV epidemic in all settings.

Document type: 
Article
Randomized Controlled Ferret Study to Assess the Direct Impact of 2008–09 Trivalent Inactivated Influenza Vaccine on A(H1N1)pdm09 Disease Risk

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2014-01-27
Abstract: 

During spring-summer 2009, several observational studies from Canada showed increased risk of medically-attended, laboratory-confirmed A(H1N1)pdm09 illness among prior recipients of 2008–09 trivalent inactivated influenza vaccine (TIV). Explanatory hypotheses included direct and indirect vaccine effects. In a randomized placebo-controlled ferret study, we tested whether prior receipt of 2008–09 TIV may have directly influenced A(H1N1)pdm09 illness. Thirty-two ferrets (16/group) received 0.5 mL intra-muscular injections of the Canadian-manufactured, commercially-available, non-adjuvanted, split 2008–09 Fluviral or PBS placebo on days 0 and 28. On day 49 all animals were challenged (Ch0) with A(H1N1)pdm09. Four ferrets per group were randomly selected for sacrifice at day 5 post-challenge (Ch+5) and the rest followed until Ch+14. Sera were tested for antibody to vaccine antigens and A(H1N1)pdm09 by hemagglutination inhibition (HI), microneutralization (MN), nucleoprotein-based ELISA and HA1-based microarray assays. Clinical characteristics and nasal virus titers were recorded pre-challenge then post-challenge until sacrifice when lung virus titers, cytokines and inflammatory scores were determined. Baseline characteristics were similar between the two groups of influenza-naïve animals. Antibody rise to vaccine antigens was evident by ELISA and HA1-based microarray but not by HI or MN assays; virus challenge raised antibody to A(H1N1)pdm09 by all assays in both groups. Beginning at Ch+2, vaccinated animals experienced greater loss of appetite and weight than placebo animals, reaching the greatest between-group difference in weight loss relative to baseline at Ch+5 (7.4% vs. 5.2%; p = 0.01). At Ch+5 vaccinated animals had higher lung virus titers (log-mean 4.96 vs. 4.23pfu/mL, respectively; p = 0.01), lung inflammatory scores (5.8 vs. 2.1, respectively; p = 0.051) and cytokine levels (p>0.05). At Ch+14, both groups had recovered. 
Findings in influenza-naïve, systematically-infected ferrets may not replicate the human experience. While they cannot be considered conclusive to explain human observations, these ferret findings are consistent with direct, adverse effect of prior 2008–09 TIV receipt on A(H1N1)pdm09 illness. As such, they warrant further in-depth investigation and search for possible mechanistic explanations.

Document type: 
Article
Creating Groups with Similar Expected Behavioural Response in Randomized Controlled Trials: A Fuzzy Cognitive Map Approach

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2014
Abstract: 

Background

Controlling bias is key to successful randomized controlled trials for behaviour change. Bias can be generated at multiple points during a study, for example, when participants are allocated to different groups. Several methods of allocation exist to randomly distribute participants over the groups such that their prognostic factors (e.g., socio-demographic variables) are similar, in an effort to keep participants' outcomes comparable at baseline. Since it is challenging to create such groups when all prognostic factors are taken together, these factors are often balanced in isolation, or only the ones deemed most relevant are balanced. However, the complex interactions among prognostic factors may lead to a poor estimate of behaviour, causing unbalanced groups at baseline, which may introduce accidental bias.

Methods

We present a novel computational approach for allocating participants to different groups. Our approach automatically uses participants’ experiences to model (the interactions among) their prognostic factors and infer how their behaviour is expected to change under a given intervention. Participants are then allocated based on their inferred behaviour rather than on selected prognostic factors.

Results

In order to assess the potential of our approach, we collected two datasets regarding the behaviour of participants (n = 430 and n = 187). The potential of the approach on larger sample sizes was examined using synthetic data. All three datasets highlighted that our approach could lead to groups with similar expected behavioural changes.

Conclusions

The computational approach proposed here can complement existing statistical approaches when behaviours involve numerous complex relationships, and quantitative data is not readily available to model these relationships. The software implementing our approach and commonly used alternatives is provided at no charge to assist practitioners in the design of their own studies and to compare participants' allocations.

Document type: 
Article
Changing Risk Behaviours and the HIV Epidemic: A Mathematical Analysis in the Context of Treatment as Prevention

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2013
Abstract: 

Background

Expanding access to highly active antiretroviral therapy (HAART) has become an important approach to HIV prevention in recent years. Previous studies suggest that concomitant changes in risk behaviours may either help or hinder programs that use a Treatment as Prevention strategy.

Analysis

We consider HIV-related risk behaviour as a social contagion in a deterministic compartmental model, which treats risk behaviour and HIV infection as linked processes, where acquiring risk behaviour is a prerequisite for contracting HIV. The equilibrium behaviour of the model is analysed to determine epidemic outcomes under conditions of expanding HAART coverage along with risk behaviours that change with HAART coverage. We determined the potential impact of changes in risk behaviour on the outcomes of Treatment as Prevention strategies. Model results show that HIV incidence and prevalence decline only above threshold levels of HAART coverage, which depend strongly on risk behaviour parameter values. Expanding HAART coverage while simultaneously reducing risk behaviour acts synergistically to accelerate the drop in HIV incidence and prevalence. Above the thresholds, additional HAART coverage is always sufficient to reverse the impact of HAART optimism on incidence and prevalence. Applying the model to an HIV epidemic in Vancouver, Canada, showed no evidence of HAART optimism in that setting.
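The coupling described above, where risk behaviour spreads like a contagion and only at-risk individuals can acquire HIV, can be pictured with a schematic compartmental system. The sketch below is not the paper's actual model: compartment names, flows, and all parameter values are illustrative, and it uses a simple Euler step rather than an equilibrium analysis.

```python
# Schematic compartmental sketch (illustrative, not the paper's model):
# risk behaviour spreads as a social contagion (N -> S), only at-risk
# individuals can acquire HIV (S -> I), and HAART moves infected people
# to a treated class (I -> T) with reduced infectiousness.
def step(state, dt, beta_risk=0.3, beta_hiv=0.5, tau=0.2, eps=0.1):
    N, S, I, T = state
    total = N + S + I + T
    new_risk = beta_risk * N * (S + I + T) / total   # contagion of risk behaviour
    new_inf = beta_hiv * S * (I + eps * T) / total   # treated are less infectious
    on_haart = tau * I                               # HAART uptake (coverage knob)
    return (N - dt * new_risk,
            S + dt * (new_risk - new_inf),
            I + dt * (new_inf - on_haart),
            T + dt * on_haart)

# Forward-Euler integration from an illustrative initial population.
state = (900.0, 90.0, 10.0, 0.0)
for _ in range(1000):
    state = step(state, dt=0.01)
```

Raising the uptake rate `tau` in such a system plays the role of expanding HAART coverage, and lowering `beta_risk` mimics reduced risk behaviour; the paper's threshold results concern where the equilibria of the full model change, not a single simulated trajectory.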

Conclusions

Our results suggest that Treatment as Prevention has significant potential for controlling the HIV epidemic once HAART coverage reaches a threshold. Furthermore, expanding HAART coverage combined with interventions targeting risk behaviours amplifies the preventive impact, potentially driving the HIV epidemic to elimination.

Document type: 
Article