Mathematics, Department of

Clustered Maximum Weight Clique Problem Instances - Data set

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2017-03
Document type: 
Dataset

Ancestral Gene Synteny Reconstruction Improves Extant Species Scaffolding

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2015
Abstract: 

We exploit the methodological similarity between ancestral genome reconstruction and extant genome scaffolding. We present a method, called ARt-DeCo, that constructs neighborhood relationships between genes or contigs, in both ancestral and extant genomes, in a phylogenetic context. It is able to handle dozens of complete genomes, including genes with complex histories, by using gene phylogenies reconciled with a species tree, that is, annotated with speciation, duplication and loss events. Reconstructed ancestral or extant synteny comes with a support value computed from an exhaustive exploration of the solution space. We compare our method with a previously published one that pursues the same goal on a small number of genomes with universal unicopy genes. We then test it on the whole Ensembl database, proposing partial ancestral genome structures, as well as a more complete scaffolding for many partially assembled genomes, across 69 eukaryote species. We carefully analyze a couple of extant adjacencies proposed by our method and show that they are indeed real links in the extant genomes that are missing from the current assembly. On a reduced data set of 39 eutherian mammals, we estimate the precision and sensitivity of ARt-DeCo by simulating fragmentation in some well-assembled genomes and measuring how many adjacencies are recovered. We find a very high precision, while sensitivity depends on the quality of the data and on the proximity of closely related genomes.

Document type: 
Article

Evolution of Genes Neighborhood Within Reconciled Phylogenies: An Ensemble Approach

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2015
Abstract: 

Context

The reconstruction of evolutionary scenarios for whole genomes in terms of genome rearrangements is a fundamental problem in evolutionary and comparative genomics. The DeCo algorithm, recently introduced by Bérard et al., computes parsimonious evolutionary scenarios for gene adjacencies from pairs of reconciled gene trees. However, as with many combinatorial optimization algorithms, there can exist many co-optimal, or slightly sub-optimal, evolutionary scenarios that deserve to be considered.

Contribution

We extend the DeCo algorithm to sample evolutionary scenarios from the whole solution space under the Boltzmann distribution, and also to compute Boltzmann probabilities for specific ancestral adjacencies.
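
For readers unfamiliar with the framework, a brief sketch (with assumed notation, not taken from the paper): if $s$ ranges over the evolutionary scenarios for an adjacency and $c(s)$ is the parsimony cost of $s$, a Boltzmann sampler draws scenarios with probability

$$P(s) = \frac{e^{-c(s)/kT}}{Z}, \qquad Z = \sum_{s'} e^{-c(s')/kT},$$

where $kT > 0$ is a temperature parameter: as $kT \to 0$ the distribution concentrates on parsimonious scenarios, while large $kT$ approaches a uniform distribution over the whole solution space. The Boltzmann probability of a specific ancestral adjacency is then the total probability of the scenarios that contain it.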

Results

We apply our algorithms to a dataset of mammalian gene trees and adjacencies, and observe a significant reduction in the number of syntenic conflicts in the resulting ancestral gene adjacencies.

Document type: 
Article

The Impact of Implementing a Test, Treat and Retain HIV Prevention Strategy in Atlanta among Black Men Who Have Sex with Men with a History of Incarceration: A Mathematical Model

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2015
Abstract: 

Background

Annually, 10 million adults transition through prisons or jails in the United States (US), and the prevalence of HIV among entrants is three times higher than that for the country as a whole. We assessed the potential impact of increasing HIV Testing/Treatment/Retention (HIV-TTR) in the community and within criminal justice system (CJS) facilities, coupled with sexual risk behavior change, focusing on black men who have sex with men, aged 15–54 years, in Atlanta, USA.

Methods

We modeled the effect of an HIV-TTR strategy on the estimated cumulative number of new (acquired) infections, on mortality, and on HIV prevalence at the end of ten years. We additionally assessed the effect of increasing condom use in all settings.

Results

In the Status Quo scenario, at the end of 10 years, the cumulative numbers of new infections in the community, jail and prison were 9246, 77 and 154 cases, respectively; HIV prevalence was 10815, 69 and 152 cases, respectively; and the cumulative numbers of deaths were 2585, 18 and 34 cases, respectively. By increasing HIV-TTR coverage, the cumulative number of new infections could decrease by 15% in the community, 19% in jail, and 8% in prison; HIV prevalence could decrease by 8%, 9% and 7%, respectively; and mortality could decrease by 20%, 39% and 18%, respectively. Based on the model results, limited use of, and access to, condoms have contributed to HIV incidence and prevalence in all settings.

Conclusions

Aggressive implementation of a CJS-focused HIV-TTR strategy has the potential to interrupt HIV transmission and reduce mortality, with benefits to the community at large. To maximize the impact of these interventions, retention in treatment, including during the period after jail and prison release, and increased condom use are vital for decreasing the burden of the HIV epidemic in all settings.

Document type: 
Article

Randomized Controlled Ferret Study to Assess the Direct Impact of 2008–09 Trivalent Inactivated Influenza Vaccine on A(H1N1)pdm09 Disease Risk

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2014-01-27
Abstract: 

During spring-summer 2009, several observational studies from Canada showed an increased risk of medically-attended, laboratory-confirmed A(H1N1)pdm09 illness among prior recipients of the 2008–09 trivalent inactivated influenza vaccine (TIV). Explanatory hypotheses included direct and indirect vaccine effects. In a randomized placebo-controlled ferret study, we tested whether prior receipt of 2008–09 TIV may have directly influenced A(H1N1)pdm09 illness. Thirty-two ferrets (16/group) received 0.5 mL intra-muscular injections of the Canadian-manufactured, commercially-available, non-adjuvanted, split 2008–09 Fluviral vaccine or PBS placebo on days 0 and 28. On day 49 all animals were challenged (Ch0) with A(H1N1)pdm09. Four ferrets per group were randomly selected for sacrifice at day 5 post-challenge (Ch+5) and the rest were followed until Ch+14. Sera were tested for antibody to vaccine antigens and A(H1N1)pdm09 by hemagglutination inhibition (HI), microneutralization (MN), nucleoprotein-based ELISA and HA1-based microarray assays. Clinical characteristics and nasal virus titers were recorded pre-challenge and then post-challenge until sacrifice, when lung virus titers, cytokines and inflammatory scores were determined. Baseline characteristics were similar between the two groups of influenza-naïve animals. An antibody rise to vaccine antigens was evident by ELISA and HA1-based microarray but not by HI or MN assays; virus challenge raised antibody to A(H1N1)pdm09 by all assays in both groups. Beginning at Ch+2, vaccinated animals experienced greater loss of appetite and weight than placebo animals, reaching the greatest between-group difference in weight loss relative to baseline at Ch+5 (7.4% vs. 5.2%; p = 0.01). At Ch+5 vaccinated animals had higher lung virus titers (log-mean 4.96 vs. 4.23 pfu/mL, respectively; p = 0.01), lung inflammatory scores (5.8 vs. 2.1, respectively; p = 0.051) and cytokine levels (p > 0.05). At Ch+14, both groups had recovered. Findings in influenza-naïve, systematically-infected ferrets may not replicate the human experience. While they cannot be considered conclusive to explain the human observations, these ferret findings are consistent with a direct, adverse effect of prior 2008–09 TIV receipt on A(H1N1)pdm09 illness. As such, they warrant further in-depth investigation and a search for possible mechanistic explanations.

Document type: 
Article

Creating Groups with Similar Expected Behavioural Response in Randomized Controlled Trials: A Fuzzy Cognitive Map Approach

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2014
Abstract: 

Background

Controlling bias is key to successful randomized controlled trials for behaviour change. Bias can be generated at multiple points during a study, for example, when participants are allocated to different groups. Several allocation methods exist to randomly distribute participants over the groups such that their prognostic factors (e.g., socio-demographic variables) are similar, in an effort to keep participants' outcomes comparable at baseline. Since it is challenging to create such groups when all prognostic factors are taken together, these factors are often balanced in isolation, or only the ones deemed most relevant are balanced. However, the complex interactions among prognostic factors may lead to a poor estimate of behaviour, causing unbalanced groups at baseline, which may introduce accidental bias.

Methods

We present a novel computational approach for allocating participants to different groups. Our approach automatically uses participants’ experiences to model (the interactions among) their prognostic factors and infer how their behaviour is expected to change under a given intervention. Participants are then allocated based on their inferred behaviour rather than on selected prognostic factors.
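
As a rough illustration of the inference step, the sketch below iterates a small fuzzy cognitive map in Python; the concepts, weights, and the sigmoid update rule (a common FCM convention) are illustrative assumptions, not details taken from the study.

```python
import numpy as np

def fcm_infer(W, state, clamped=(), steps=100, tol=1e-6):
    """Iterate a fuzzy cognitive map until activations stabilise.

    W[i, j] is the causal weight of concept i on concept j; `state`
    holds initial activations in [0, 1]; indices in `clamped` are held
    fixed (e.g. an intervention that stays switched on).
    """
    state = np.asarray(state, dtype=float).copy()
    for _ in range(steps):
        # Common FCM update: weighted sum of inputs, sigmoid-squashed.
        new = 1.0 / (1.0 + np.exp(-(state @ W)))
        for i in clamped:
            new[i] = state[i]
        if np.max(np.abs(new - state)) < tol:
            return new
        state = new
    return state

# Hypothetical 3-concept map: intervention -> motivation -> behaviour.
W = np.array([[0.0, 0.8, 0.0],
              [0.0, 0.0, 0.7],
              [0.0, 0.0, 0.0]])

off = fcm_infer(W, [0.0, 0.5, 0.5], clamped=[0])
on = fcm_infer(W, [1.0, 0.5, 0.5], clamped=[0])
print("expected change in behaviour:", on[2] - off[2])
```

Participants would then be allocated so that groups have similar distributions of the inferred behaviour change, rather than similar marginal prognostic factors.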

Results

To assess the potential of our approach, we collected two datasets on participants' behaviour (n = 430 and n = 187). The potential of the approach on larger sample sizes was examined using synthetic data. All three datasets showed that our approach can lead to groups with similar expected behavioural changes.

Conclusions

The computational approach proposed here can complement existing statistical approaches when behaviours involve numerous complex relationships, and quantitative data is not readily available to model these relationships. The software implementing our approach and commonly used alternatives is provided at no charge to assist practitioners in the design of their own studies and to compare participants' allocations.

Document type: 
Article

Changing Risk Behaviours and the HIV Epidemic: A Mathematical Analysis in the Context of Treatment as Prevention

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2013
Abstract: 

Background

Expanding access to highly active antiretroviral therapy (HAART) has become an important approach to HIV prevention in recent years. Previous studies suggest that concomitant changes in risk behaviours may either help or hinder programs that use a Treatment as Prevention strategy.

Analysis

We consider HIV-related risk behaviour as a social contagion in a deterministic compartmental model, which treats risk behaviour and HIV infection as linked processes, where acquiring risk behaviour is a prerequisite for contracting HIV. The equilibrium behaviour of the model is analysed to determine epidemic outcomes under expanding HAART coverage together with risk behaviours that change in response to that coverage, and we determined the potential impact of changes in risk behaviour on the outcomes of Treatment as Prevention strategies. Model results show that HIV incidence and prevalence decline only above threshold levels of HAART coverage, and these thresholds depend strongly on risk behaviour parameter values. Expanding HAART coverage while simultaneously reducing risk behaviour acts synergistically to accelerate the drop in HIV incidence and prevalence. Above the thresholds, additional HAART coverage is always sufficient to reverse the impact of HAART optimism on incidence and prevalence. Applying the model to an HIV epidemic in Vancouver, Canada, showed no evidence of HAART optimism in that setting.
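
A toy sketch of the linked-process structure described above (compartments and parameters are made up, not the paper's model): risk behaviour spreads as a contagion, only individuals with risk behaviour can acquire HIV, and HAART reduces infectiousness.

```python
from scipy.integrate import solve_ivp

def rhs(t, y, beta_b, beta_h, eps, tau):
    """Toy compartments: N = no risk behaviour, S = risk behaviour and
    HIV-negative, I = HIV-positive untreated, T = HIV-positive on HAART."""
    N, S, I, T = y
    pop = N + S + I + T
    risk = S + I + T
    uptake = beta_b * N * risk / pop                # risk behaviour as contagion
    foi = beta_h * (I + eps * T) / max(risk, 1e-9)  # HAART cuts infectiousness
    return [-uptake, uptake - foi * S, foi * S - tau * I, tau * I]

# Made-up rates (per year): behaviour uptake, HIV transmission,
# relative infectiousness on HAART, HAART uptake.
params = (0.10, 0.30, 0.08, 0.50)
sol = solve_ivp(rhs, (0, 50), [900.0, 95.0, 5.0, 0.0], args=params)
N, S, I, T = sol.y[:, -1]
print(f"HIV prevalence after 50 years: {(I + T) / (N + S + I + T):.1%}")
```

Raising the HAART uptake rate `tau` slows incidence in this toy model; combining it with a lower behaviour contact rate `beta_b` illustrates the synergy described above.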

Conclusions

Our results suggest that Treatment as Prevention has significant potential for controlling the HIV epidemic once HAART coverage reaches a threshold. Furthermore, expanding HAART coverage combined with interventions targeting risk behaviours amplifies the preventive impact, potentially driving the HIV epidemic to elimination.

Document type: 
Article

On the Topological and Uniform Structure of Diversities

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2013
Abstract: 

Diversities have recently been developed as multiway metrics admitting clear and useful notions of hyperconvexity and tight span. In this note, we consider the analytical properties of diversities, in particular the generalizations of uniform continuity, uniform convergence, Cauchy sequences, and completeness to diversities. We develop conformities, a diversity analogue of uniform spaces, which abstract these concepts in the metric case. We show that much of the theory of uniform spaces admits a natural analogue in this new structure; for example, conformities can be defined either axiomatically or in terms of uniformly continuous pseudodiversities. Just as diversities can be restricted to metrics, conformities can be restricted to uniformities. We find that these two notions of restriction, which are functors in the appropriate categories, are related by a natural transformation.
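
For reference, the underlying definition (in the standard formulation of Bryant and Tupper): a diversity is a pair $(X, \delta)$, where $\delta$ assigns a non-negative real to each finite subset of $X$ and satisfies

$$\delta(A) = 0 \iff |A| \le 1,$$
$$\delta(A \cup C) \le \delta(A \cup B) + \delta(B \cup C) \quad \text{whenever } B \ne \emptyset.$$

Restricting $\delta$ to pairs, $d(a, b) = \delta(\{a, b\})$, yields an ordinary metric; this is the restriction to metrics mentioned above.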

Document type: 
Article

Analyzing The Impact Of Social Factors On Homelessness: A Fuzzy Cognitive Map Approach

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2013
Abstract: 

Background

The forces which affect homelessness are complex and often interactive in nature. Social forces such as addictions, family breakdown, and mental illness are compounded by structural forces such as a lack of available low-cost housing, poor economic conditions, and insufficient mental health services. Together these factors impact levels of homelessness through their dynamic relations. Historical models, which are static in nature, have been only marginally successful in capturing these relationships.

Methods

Fuzzy Logic (FL) and fuzzy cognitive maps (FCMs) are particularly suited to modeling complex social problems such as homelessness, due to their inherent ability to represent intricate, interactive systems that are often described in vague conceptual terms, and to organize them into a specific, concrete form (i.e., the FCM) that can be readily understood by social scientists and others. Using FL, we converted information taken from recently published, peer-reviewed articles on a select group of factors related to homelessness, and calculated the strength of influence (weights) between pairs of factors. We then used these weighted relationships in an FCM to test the effects of increasing or decreasing individual factors or groups of factors. The results of these trials were explainable according to current empirical knowledge related to homelessness.

Results

Prior graphic maps of homelessness have been of limited use due to the dynamic nature of the concepts related to homelessness. The FCM technique captures greater degrees of dynamism and complexity than static models, allowing relevant concepts to be manipulated and their interactions explored. This, in turn, allows for a much more realistic picture of homelessness. Through network analysis of the FCM, we determined that Education exerts the greatest force in the model and hence most strongly impacts the dynamism and complexity of a social problem such as homelessness.
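
A minimal sketch of this kind of network analysis, with an invented FCM fragment (all nodes and weights are illustrative, not the study's map): concepts are ranked by total connection strength in the weighted digraph.

```python
import networkx as nx

# Hypothetical FCM fragment; every node and weight is illustrative.
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("Education", "Employment", 0.7),
    ("Education", "Mental illness", -0.3),
    ("Employment", "Homelessness", -0.6),
    ("Mental illness", "Homelessness", 0.5),
    ("Addictions", "Homelessness", 0.6),
])

# Total absolute in/out strength: a simple proxy for the "force"
# a concept exerts within the map.
strength = {
    n: sum(abs(d["weight"]) for *_, d in G.in_edges(n, data=True))
       + sum(abs(d["weight"]) for *_, d in G.out_edges(n, data=True))
    for n in G
}
for concept, s in sorted(strength.items(), key=lambda kv: -kv[1]):
    print(f"{concept}: {s:.2f}")
```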

Conclusions

The FCM built to model the complex social system of homelessness reasonably represented reality for the sample scenarios created. This confirmed that the model works and that a search of peer-reviewed academic literature is a reasonable foundation upon which to build such a model. Further, we determined that the directions and strengths of the relationships between the concepts included in this map are a reasonable approximation of their action in reality. However, dynamic models are not without limitations and must be acknowledged as inherently exploratory.

Document type: 
Article

Linearization of Ancestral Multichromosomal Genomes

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2012
Abstract: 

Background

Recovering the structure of ancestral genomes can be formalized in terms of properties of binary matrices such as the Consecutive-Ones Property (C1P). The Linearization Problem asks to extract, from a given binary matrix, a maximum-weight subset of rows that satisfies such a property. This problem is intractable in general, and in particular when the ancestral genome is expected to contain only linear chromosomes or a unique circular chromosome. In the present work, we consider a relaxation of this problem, which allows ancestral genomes to contain several chromosomes, each either linear or circular.

Results

We show that, when restricted to binary matrices of degree two, which correspond to adjacencies, the genomic characters used in most ancestral genome reconstruction methods, this relaxed version of the Linearization Problem is polynomially solvable using a reduction to a matching problem. This result holds in the more general case where columns have bounded multiplicity, which models possibly duplicated ancestral genes. We also prove that for matrices with rows of degrees 2 and 3, without multiplicity and without weights on the rows, the problem is NP-complete, thus tracing sharp tractability boundaries.

Conclusions

As was the case for the breakpoint median problem, also used in ancestral genome reconstruction, relaxing the definition of a genome turns an intractable problem into a tractable one. The relaxation is suited to some biological contexts, such as bacterial genomes with several replicons, possibly partially assembled. Our algorithms can also be used as heuristics for hard variants. More generally, this work opens a way to better understand linearization results for ancestral genome structure inference.
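
To illustrate the matching connection in its simplest setting, here is a sketch under the assumption that adjacencies join gene extremities (heads and tails) with unit multiplicity: a set of adjacencies in which each extremity is used at most once is exactly a matching, and its connected components are paths and cycles, i.e. linear and circular chromosomes. The bounded-multiplicity case needs the fuller reduction described in the paper.

```python
import networkx as nx

# Candidate ancestral adjacencies between gene extremities
# ('h' = head, 't' = tail); genes and weights are illustrative.
adjacencies = [
    (("g1", "h"), ("g2", "t"), 0.9),
    (("g2", "h"), ("g3", "t"), 0.8),
    (("g1", "h"), ("g3", "h"), 0.4),  # conflicts with the first adjacency
    (("g3", "h"), ("g1", "t"), 0.3),  # would close a circular chromosome
]

G = nx.Graph()
G.add_weighted_edges_from(adjacencies)

# A multichromosomal genome uses each extremity at most once, so a
# maximum-weight linearizable subset is a maximum-weight matching.
kept = nx.max_weight_matching(G)
print("retained adjacencies:", sorted(kept))
```

On this toy input the heaviest matching keeps the first, second and fourth adjacencies, which together form a single circular chromosome over g1, g2 and g3.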

Document type: 
Article