Feb. 5, 2016
Special Presentation "Riding the Metagenomic Data Wave"
Metagenomic experiments are generating massive amounts of data. Technological advances have made it increasingly cheaper to produce sequence data than to store, manage, and analyse it. We are now moving towards the $100 genome, which comes with a $100,000 analysis price tag - how can we deal with this data wave efficiently and effectively? The Metagenomics group at DTU is involved in several metagenomics and microbiome projects, covering both humans and other animals. To address the ever-increasing omics data flow, we have focused on three activities: 1) learning from the data to discover new concepts for restructuring metagenomic data; 2) using machine learning algorithms to develop prediction tools; and 3) designing and building a supercomputer dedicated to life science data sets. This talk touched on these three directions and presented some of our ongoing research in metagenomics and big data. - Professor Thomas Sicheritz-Ponten, Ph.D.
Professor Thomas Sicheritz-Ponten, Ph.D., is Head of the Metagenomics group and Section Leader for Genomic Diversity at the Center for Biological Sequence Analysis (CBS), Department of Systems Biology, Technical University of Denmark (DTU), where he is also Professor in Metagenomics.