Imagine trying to describe a city using only drone photos, or judging a restaurant based on the menu font. That’s how scientists have long been forced to study cancer: peering through microscopes, crunching datasets, trying to understand cells by viewing one layer at a time.
Now, a new deep-learning tool called CellLENS is flipping that model inside out, zooming into cancer cells with unprecedented clarity.
Developed by a powerhouse team from MIT, Harvard, Yale, Stanford, and the University of Pennsylvania, and led by Bokai Zhu of the Broad Institute and the Ragon Institute, CellLENS (Cell Local Environment and Neighborhood Scan) merges three perspectives into one (see the sketch after this list):
- What genes a cell expresses
- Where it lives in the tumor
- What it looks like under the microscope
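To make that concrete, here is a minimal sketch in Python of how one cell’s three views might be bundled together. It is not the authors’ code; the class, field names, and array shapes are all illustrative assumptions.

```python
# A minimal sketch (not the authors' code) of the three views CellLENS fuses
# per cell. The class name, field names, and shapes are illustrative guesses.
from dataclasses import dataclass
import numpy as np

@dataclass
class CellRecord:
    expression: np.ndarray      # what the cell expresses: one value per measured marker
    xy: tuple[float, float]     # where it lives: coordinates in the tissue section
    image_patch: np.ndarray     # what it looks like: a small crop centered on the cell

# One hypothetical cell: 30 markers, a position, and a 32x32 image crop.
rng = np.random.default_rng(0)
cell = CellRecord(
    expression=rng.random(30),
    xy=(1024.5, 880.0),
    image_patch=rng.random((32, 32)),
)
print(cell.expression.shape, cell.xy, cell.image_patch.shape)
```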
The tool creates a detailed atlas of the tumor, grouping cells not by first impressions, but by behavior and biology. That means two identical-looking cells can be accurately separated if one’s quietly suppressing the immune system while the other’s mounting an attack.
This smart system blends two AI superpowers, convolutional neural networks (which read each cell’s appearance in the image) and graph neural networks (which read its neighborhood), to create a detailed digital snapshot of every single cell. Even if two cells look like identical twins under the microscope, the AI can still tell if one is acting like a hero at the tumor’s edge while the other is just blending in quietly. It doesn’t just group cells by looks; it groups them by biology, behavior, and social scene.
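Here is a toy sketch of that two-network idea, again with invented details rather than CellLENS’s actual architecture: a small CNN turns each cell’s image patch into a vector, a hand-rolled one-step graph layer (a stand-in for a real graph neural network) lets each cell absorb its spatial neighbors’ expression, and the two vectors are concatenated into one per-cell embedding. Every layer size, shape, and the graph construction are assumptions made for the sketch.

```python
# Toy two-network fusion, not CellLENS itself: CNN for appearance,
# a simple neighbor-averaging graph layer for spatial context.
import torch
import torch.nn as nn

class ImageEncoder(nn.Module):
    """CNN: 'what the cell looks like' -> a compact vector."""
    def __init__(self, dim: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),   # 32x32 patch -> 8 channels of 4x4
            nn.Flatten(),
            nn.Linear(8 * 4 * 4, dim),
        )

    def forward(self, patches: torch.Tensor) -> torch.Tensor:
        return self.net(patches)

class NeighborhoodEncoder(nn.Module):
    """One graph-convolution step: average the expression of each cell's
    spatial neighbors, then project. A stand-in for a real GNN layer."""
    def __init__(self, n_genes: int, dim: int = 16):
        super().__init__()
        self.proj = nn.Linear(n_genes, dim)

    def forward(self, expr: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        mixed = adj @ expr / deg   # each cell sees its neighborhood's profile
        return self.proj(mixed)

n_cells, n_genes = 5, 30
patches = torch.rand(n_cells, 1, 32, 32)             # image view
expr = torch.rand(n_cells, n_genes)                  # expression view
adj = (torch.rand(n_cells, n_cells) > 0.5).float()   # spatial neighbor graph

fused = torch.cat([ImageEncoder()(patches),
                   NeighborhoodEncoder(n_genes)(expr, adj)], dim=1)
print(fused.shape)   # (5, 32): one fused fingerprint per cell
```

Clustering those fused embeddings, rather than the raw images alone, is what would let two look-alike cells land in different groups when their neighborhoods and expression differ.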
“Before, we’d just say, ‘Here’s a T cell,’” Zhu explains. “Now we can say, ‘Here’s a T cell, and it’s engaged in battle at this specific tumor border.’”
It’s not just cell spotting; it’s cell storytelling.
Many cancer treatments stumble because they target cells without understanding their spatial strategy. For instance, immune therapies often miss their mark if the target cells only huddle at the tumor’s edge. CellLENS fills in those blanks, revealing who’s where, doing what, and why it matters.
Applied to healthy tissues and cancers like lymphoma and liver tumors, CellLENS uncovered rare immune cell types and decoded their silent choreography, showing how their positions in the tissue shaped their roles in either fighting disease or quietly fueling it.
“I’m extremely excited by the potential of new AI tools, like CellLENS, to help us more holistically understand aberrant cellular behaviors within tissues,” says co-author Alex K. Shalek, the director of the Institute for Medical Engineering and Science (IMES), the J. W. Kieckhefer Professor in IMES and Chemistry, and an extramural member of the Koch Institute for Integrative Cancer Research at MIT, as well as an Institute member of the Broad Institute and a member of the Ragon Institute.
“We can now measure a tremendous amount of information about individual cells and their tissue contexts with cutting-edge, multi-omic assays. Effectively leveraging that data to nominate new therapeutic leads is a critical step in developing improved interventions. When coupled with the right input data and careful downstream validations, such tools promise to accelerate our ability to positively impact human health and wellness.”
Journal Reference:
- Bokai Zhu, Sheng Gao, Shuxiao Chen, et al. “CellLENS enables cross-domain information fusion for enhanced cell population delineation in single-cell spatial omics data.” Nature Immunology (2025). DOI: 10.1038/s41590-025-02163-1