TECBio trainee Jenea Adams to present TR&D1-supported work at two conferences

Ms. Jenea Adams, a visiting undergraduate student from the University of Dayton, worked with Ivet Bahar and her team this summer using ProDy to investigate the characteristics of binding and allosteric sites in proteins. In her 10-week project, Jenea applied the Anisotropic Network Model (ANM), perturbation response scanning (PRS), and statistical and conformational analyses to 768 protein complexes from the PDB, looking for correlations between residues' roles as sensors or effectors and their locations within the three-dimensional protein structures. She found that sensor residues have a higher propensity than their effector counterparts to reside at protein-protein interfaces. Her results offer additional insight into the mechanisms of allostery and may inform more effective drug design. Jenea's work was selected for presentation at two conferences – the Research Experiences for Undergraduates (REU) Symposium in Arlington, VA (https://www.cur.org/what/events/students/reu/) and the Annual Biomedical Research Conference for Minority Students (ABRCMS; http://www.abrcms.org/).
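
For readers curious how such an analysis might look in practice, here is a minimal sketch built on ProDy's ANM tools, with the perturbation response scan written out by hand following a common convention (response of residue i to a perturbation at residue j, from the covariance matrix). The PDB identifier, cutoff, and mode count are illustrative placeholders, not parameters from Jenea's study.

    import numpy as np
    from prody import parsePDB, ANM

    # Build an anisotropic network model on the C-alpha atoms of a complex
    calphas = parsePDB('1ake').select('calpha')   # placeholder PDB ID
    anm = ANM('complex')
    anm.buildHessian(calphas, cutoff=15.0)
    anm.calcModes(n_modes=20)

    # Perturbation response scanning from the 3N x 3N covariance matrix:
    # the response of residue i to a perturbation at residue j is the
    # squared Frobenius norm of the corresponding 3x3 covariance block
    cov = anm.getCovariance()
    n = cov.shape[0] // 3
    prs = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            prs[i, j] = (cov[3*i:3*i+3, 3*j:3*j+3] ** 2).sum()
    prs /= prs.diagonal()[:, None]    # normalize by each self-response

    sensitivity = prs.mean(axis=1)    # row averages: sensor propensity
    effectiveness = prs.mean(axis=0)  # column averages: effector propensity

Residues scoring high in sensitivity or effectiveness can then be mapped back onto the structure and compared with annotated interface residues.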

2018 TECBio Summer REU Program Students

Top (left to right): Gabrielle LaRosa, Dominique Cantave, Gabrielle Coffing, Naina Balepur, Jenea Adams, Jason Dennis, Alex Ludwig, Tom Dougherty. Bottom: Lauren Petrina, Olivia Campos, Joseph Monaco, Caleb Reagor, Gabriella Gerlach, Jack Zhao.

She Zhang to give Special Session talk at ISMB 2018

She Zhang, a student in Dr. Ivet Bahar's lab, will present a Special Session talk at this year's ISMB conference on Saturday, July 7, at 5:45 pm.

Chromosomal dynamics predicted by an elastic network model explains genome-wide accessibility and long-range couplings

Abstract:

Understanding the three-dimensional (3D) architecture of chromatin and its relation to gene expression and regulation is fundamental to understanding how the genome functions. Advances in Hi-C technology now permit us to study 3D genome organization, but we still lack an understanding of the structural dynamics of chromosomes. The dynamic couplings between regions separated by large genomic distances (>50 Mb) have yet to be characterized. We adapted a well-established protein-modeling framework, the Gaussian Network Model (GNM), to model chromatin dynamics using Hi-C data. We show that the GNM can identify spatial couplings at multiple scales: it can quantify the correlated fluctuations in the positions of gene loci, find large genomic compartments and smaller topologically-associating domains (TADs) that undergo en bloc movements, and identify dynamically coupled distal regions along the chromosomes. We show that the predictions of the GNM correlate well with genome-wide experimental measurements. We use the GNM to identify novel cross-correlated distal domains (CCDDs) representing pairs of regions distinguished by their long-range dynamic coupling and show that CCDDs are associated with increased gene co-expression. Together, these results show that GNM provides a mathematically well-founded unified framework for modeling chromatin dynamics and assessing the structural basis of genome-wide observations. 
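
To make the abstract's central construction concrete, here is a minimal, self-contained sketch (not the authors' code) of how a GNM Kirchhoff matrix might be built directly from a Hi-C contact matrix, with contact frequencies standing in for spring constants. The function name and the choice of 20 modes are assumptions for illustration, and a real analysis would first balance or normalize the raw contact map.

    import numpy as np

    def gnm_cross_correlations(contact_map, n_modes=20):
        """Cross-correlations between genomic loci from a GNM whose
        Kirchhoff matrix is derived from a symmetric Hi-C contact map."""
        # Off-diagonal Kirchhoff elements: negative contact frequencies;
        # diagonal elements make every row sum to zero (Laplacian form)
        kirchhoff = -np.asarray(contact_map, dtype=float)
        np.fill_diagonal(kirchhoff, 0.0)
        np.fill_diagonal(kirchhoff, -kirchhoff.sum(axis=1))

        # Soft (low-frequency) modes dominate the dynamics; the zero
        # eigenvalue (uniform translation) is skipped
        evals, evecs = np.linalg.eigh(kirchhoff)
        evals, evecs = evals[1:n_modes + 1], evecs[:, 1:n_modes + 1]

        # Covariance of locus fluctuations: pseudoinverse over kept modes
        cov = (evecs / evals) @ evecs.T
        d = np.sqrt(np.diag(cov))
        return cov / np.outer(d, d)   # values in [-1, 1]

In this picture, strongly positive entries far from the diagonal of the returned matrix would flag candidate pairs of dynamically coupled distal regions of the kind the abstract calls CCDDs.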

Behind the Boom in Machine Learning

Terry Sejnowski, computational neuroscientist at the Salk Institute for Biological Studies, president of the Neural Information Processing Systems (NIPS) Foundation, and co-leader of the MMBioS TR&D2 project, was interviewed at the NIPS Conference in December about growth in machine learning.

Machine learning is a core technology driving advances in artificial intelligence. This week, some of its earliest practitioners and many of the world's top AI researchers are in Long Beach, CA, for the field's big annual gathering—the Neural Information Processing Systems (NIPS) conference. In all, some 7,700 people are attending what has become AI's version of high tech's glitzy South by Southwest and the electronic device industry's even bigger annual CES conference.

It's NIPS' 31st year; the conference originally drew just a few hundred participants — computer scientists, physicists, mathematicians and neuroscientists, all interested in AI. Terrence Sejnowski, a computational neuroscientist at the Salk Institute for Biological Studies and president of the NIPS Foundation, spoke with Axios about growth in the field and what's next.

How machine learning has grown since NIPS' start in the '80s: "Over that period what happened was a convergence of a number of different factors, one of them being the fact that computers got a million times faster. Back then we could only study little toy networks with a few hundred units. But now we can study networks with millions of units. The other thing was the training sets — you need to have examples of what it is you're trying to learn. The internet made it possible for us to get millions of training examples relatively easily, because there's so many images, abundant speech examples, and so forth, that you can download from the internet. Finally, there were breakthroughs along the way in the algorithms that we used to make them more efficient. We understood them a lot better in terms of something called regularization, which is how to keep the network from memorizing — you want it to generalize."
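
For the curious, the "regularization" Sejnowski mentions can be as simple as penalizing large network weights during training. The toy regression below is a schematic, hypothetical illustration of that idea (L2 weight decay), not any particular system he describes; all numbers are arbitrary.

    import numpy as np

    rng = np.random.default_rng(0)
    X, y = rng.normal(size=(100, 20)), rng.normal(size=100)
    w = np.zeros(20)
    lam, lr = 0.1, 0.01   # penalty strength and learning rate (arbitrary)

    for _ in range(1000):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        grad += lam * w                    # L2 penalty shrinks the weights
        w -= lr * grad

The penalty term keeps the model from fitting (memorizing) noise in the training examples, which is what encourages generalization.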

Read more here.
