Quantifying information transfer in the GPCR signalling system
G-protein coupled receptors (GPCRs) constitute one of the most widespread classes of receptors used by cells to obtain information about their external environment. However, it was still debated whether the GPCR signalling system can measure more than the presence or absence of a compound binding to the receptor. Together with the team of Vladimir Katanaev of the Department of Pharmacology and Toxicology at UNIL, we used tools from information theory to quantify the amount of information that the GPCR system can provide to a cell, and showed that it can distinguish between multiple concentration levels. The paper has just been published in Nature Communications.
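To give a flavour of the quantification (a toy sketch under assumed numbers, not the paper's actual analysis pipeline): the mutual information between ligand concentration and a discretized cellular response measures, in bits, how finely the system resolves concentration. One bit corresponds to a binary presence/absence readout; more than one bit means multiple concentration levels can be distinguished. The channel matrix below is hypothetical.

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) in bits from a joint probability table p(x, y)."""
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)  # marginal over concentrations
    py = joint.sum(axis=0, keepdims=True)  # marginal over responses
    nz = joint > 0                         # skip zero cells (0 * log 0 = 0)
    return float((joint[nz] * np.log2(joint[nz] / (px * py)[nz])).sum())

# Hypothetical response channel: rows = 4 ligand concentrations,
# columns = probabilities of 4 discretized response levels.
p_resp = np.array([[0.9, 0.1, 0.0, 0.0],
                   [0.1, 0.8, 0.1, 0.0],
                   [0.0, 0.1, 0.8, 0.1],
                   [0.0, 0.0, 0.1, 0.9]])
joint = p_resp / 4.0                 # uniform prior over the 4 concentrations
bits = mutual_information(joint)     # ~1.30 bits: more than an on/off readout
```

With this (made-up) noise level the channel transmits about 1.3 bits, i.e. it resolves more than two input states; the published analysis estimates such quantities from measured single-cell responses rather than an assumed channel matrix.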
Disease Module Identification DREAM Challenge
Many bioinformatics methods have been proposed for reducing the complexity of large gene or protein networks into relevant subnetworks or modules. Yet, how such methods compare to each other in terms of their ability to identify disease-relevant modules in different types of networks remains poorly understood. We launched the “Disease Module Identification DREAM Challenge”, an open competition to comprehensively assess module identification methods across diverse protein-protein interaction, signalling, gene co-expression, homology, and cancer-gene networks. Assessing 75 contributed module identification methods revealed novel top-performing algorithms, and the resulting resource of modules corresponds to core disease-relevant pathways for studying human disease biology (https://synapse.org/modulechallenge and https://www.biorxiv.org/content/10.1101/265553v1).
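As a minimal illustration of the kind of objective many module identification methods optimize (Newman modularity; this is a generic textbook score, not any contributed method's algorithm, and the challenge itself scored modules by disease relevance), here is a plug-in modularity computation on a toy network:

```python
def modularity(adj, communities):
    """Newman modularity Q of a node partition on an undirected graph.

    adj: dict mapping node -> set of neighbour nodes (no self-loops).
    communities: list of disjoint node sets covering all nodes.
    """
    m2 = sum(len(nbrs) for nbrs in adj.values())          # 2m (twice edge count)
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    label = {v: i for i, comm in enumerate(communities) for v in comm}
    q = 0.0
    for v, nbrs in adj.items():
        for u in adj:
            if label[v] == label[u]:
                a = 1.0 if u in nbrs else 0.0
                q += a - deg[v] * deg[u] / m2              # observed minus expected edge
    return q / m2

# Toy network: two triangles {0,1,2} and {3,4,5} joined by the edge 2-3.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
       3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
q_split = modularity(adj, [{0, 1, 2}, {3, 4, 5}])  # = 5/14, about 0.357
q_whole = modularity(adj, [set(adj)])              # = 0.0 for the trivial partition
```

The two-module partition scores higher than lumping all nodes together, matching the visible structure; real methods search over partitions of networks with thousands of nodes rather than evaluating a hand-picked one.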
PhenoMeNal: processing and analysis of metabolomics data in the cloud.
As a member of the PhenoMeNal (Phenome and Metabolome aNalysis) consortium, we contributed to an advanced and complete solution for setting up Infrastructure-as-a-Service (IaaS) that brings workflow-oriented, interoperable metabolomics data analysis platforms into the cloud. PhenoMeNal seamlessly integrates a wide array of existing open-source tools that are tested and packaged as Docker containers through the project's continuous integration process and deployed on a Kubernetes orchestration framework. It also provides a number of standardised, automated, and published analysis workflows.