
The Nerd Skills Problem

How a professor is helping support Homes of Hope India

PLEASE SUBSCRIBE TO THE BRAND NEW HOMES OF HOPE SUBSTACK

https://homesofhopeindia.substack.com/

I recognize the shape of this ask. Someone with a mission too large for their bandwidth encounters someone with technical skills and a conscience, and the question gets asked, gently but unmistakably: Can you help? I have been on the receiving end of this question. Most people with nerd skills have. The question sounds simple. It never is.

The call came from Doug, who had just sent footage from India: dormitories, coconut palms, and a young woman sitting in an empty dining hall describing a shelf where she used to keep her things as if it were the most precious real estate in the world. Dilraj was on the call too, a computer scientist and businessman who understood systems the way an engineer understands bridges — load-bearing, failure-prone, consequential. And then there was Nik, an Associate Teaching Professor at Northeastern, a man who teaches engineers to tell stories, who runs a small nonprofit called Humanitarians AI on the premise that ethical artificial intelligence can amplify the work of people doing necessary things.

I watched the footage. I said yes before the call was over.

This is a story about what happened next. But it is also a story about a particular kind of professional reckoning — what it means to be asked to use your skills for something that will not optimize your citations, will not advance your career in any legible way, and will not produce outcomes measurable before the next annual review. It is about what professors do when the work that actually needs doing exists entirely outside the syllabus.


What the Mission Actually Is

To understand the ask, you have to understand the scale of the mission it served.

Homes of Hope India-US has been operating since 2006, when journalist Paul Wilkes traveled to Kochi and met a six-year-old girl named Reena wearing sunglasses in a dusty playground. The syndicate using her to beg had plunged a darning needle into her eye. She smiled at him anyway. What Wilkes built in response now spans 35 residences, schools, and empowerment centers across India, run in partnership with nine congregations of Catholic sisters. More than 5,000 girls have passed through these homes. In 2024, approximately 3,000 children were served. The organization spends 98 cents of every dollar on the mission itself.

The problem it addresses is not sentimental. Organized begging syndicates — what researchers and law enforcement call the “beggar mafia” — control an estimated 300,000 children across India, using kidnapping, deliberate maiming, and drug-induced sedation to maximize earnings at traffic signals and temple gates. In Mumbai alone, the industry generates more than £20 million annually. The children see almost none of it.

One girl’s education at a Homes of Hope school costs approximately $940 per year.

Nik’s nonprofit, Humanitarians AI, was not built to fight child trafficking. It was built to demonstrate that ethical technology can serve missions like this one — not by replacing the human infrastructure but by making it visible to people who would support it if they only knew it existed. The question Doug’s call posed was technical in its surface form. At its core, it was moral: Will you use what you know for this?


The Inadequate First Response and Why It Mattered Anyway

The obvious move was a newsletter. A Substack account, some articles, a YouTube channel. Nik had software that could take Doug’s footage and convert it into longform prose — New Yorker register, specific detail, no hype. It was a start. It was also, measured against 500,000 vulnerable girls on the streets of India, almost nothing.

They knew that. They said so. And then they started anyway.

This is where the nerd skills question gets interesting, and where most accounts of “technology for good” go wrong. The standard narrative runs as follows: brilliant technical mind encounters intractable human problem, applies computational power, achieves scale previously impossible, saves significant portion of humanity. It is a clean story. It is also almost never how it works.

What Nik was actually doing was building content infrastructure. Doug’s documentary footage becomes Substack articles. Articles become YouTube videos. Videos become the evidence that moves people from awareness to action. The technology is not a solution to child trafficking. It is a solution to a much smaller, more tractable problem: the gap between a mission that has been working for nineteen years and the donors, advocates, and policymakers who do not yet know it exists.

I find myself thinking about this gap often. The organizations doing the most necessary work are frequently the worst positioned to tell anyone about it. Not because they lack stories — Kratjeshiri, who came back to the dining hall where she had eaten every meal for years, who described the cord the girls used to wear, who said this is my happy place while smiling in the room where she had survived — but because converting lived experience into distributable narrative requires skills that most frontline workers were neither trained in nor have time for. This is precisely the gap that a professor who teaches engineers to tell stories is positioned to fill.


The Specific Shape of the Contribution

What Nik brought to this collaboration is worth being precise about, because precision is what distinguishes useful technical contributions from the kind that look impressive in grant reports and accomplish little in the field.

He brought a methodology for transforming raw field materials — footage, interviews, production logs sent daily from India — into content that earns attention without lying about what’s inside. He brought a framework for what his course calls narrative pedagogy: the idea that the most durable way to transmit complex information is not through bullet points or infographics but through the kind of story that makes a reader feel, temporarily, like they are in the room. He brought, critically, a no-fabrication standard — a commitment that every claim in every article would be verifiable, that no composite characters would be invented to make the story more affecting, that the real people and real details would be allowed to do the work.

Doug is in India now, sending footage — a girl’s braids, a dormitory hallway, flip-flops on a tile floor. Paul Wilkes, who is 86 years old, still reviews every article before it publishes. Dilraj brings two decades of systems thinking to a problem that has defeated purely punitive approaches for generations. The Catholic sisters run the homes daily, year after year, as surrogate families for children who arrived with nothing. Nik’s contribution is not to replace any of this. It is to make it legible — to build the pipeline through which the story travels from a dining hall in Kerala to a donor in Massachusetts who will never visit India but might, if the story reaches them clearly enough, fund another year of someone’s education.

That is the nerd skills problem in its honest form. Not transformation. Translation.


What This Reveals About the Work We Choose

Ask yourself what you would have said when Doug’s call came in. The footage was real. The mission was nineteen years old and demonstrably effective. The ask was specific: Can you use what you know to help more people know this exists?

The honest answer, for most people with technical skills and institutional affiliations and quarterly obligations, is: I would have found a reason not to. Not from malice. From the ordinary friction of a professional life structured around deliverables that are measured and missions that are not. Humanitarians AI was built on the wager that this friction could be reduced — that you could create an organizational structure that made saying yes easier than saying no, that you could demonstrate, project by project, that ethical technology could serve missions that deserve serving.

The Substack is a proof of concept. So is this article. So is every video that gets made from Doug’s footage and sent into the world with a title that earns the click without lying about what’s inside.

Kratjeshiri came back to the dining hall. She described a cord, a shelf, a room that had been everything. She said it was her happy place. Not because the years were easy. Because she had somewhere to come back to.

The tool is ready. What’s needed now is for more people to know it exists — and for more people with nerd skills to be honest about what those skills are actually for.


Tags: Humanitarians AI, Homes of Hope India, ethical AI nonprofits, narrative journalism technology, child trafficking Kerala awareness
