Part 4 · The Missing Signal

India’s welfare system went digital with a simple promise. On the ground, people talked less about algorithms and more about apps they could not open, forms they could not read, and the operators who made it all go through.

Field insight: Intermediaries did not disappear. They changed shape, from local fixers to portal agents and data entry operators.

The Myth of the App-First Citizen

Most state portals assume smartphone ownership and confidence with user interfaces. In reality, many households share a single basic phone. Some rely entirely on an employer’s device. For them, an app-first government can feel like an invitation-only one.

Phone access among respondents: own smartphone ~62%; shared or basic phone ~28%; employer's device or none ~10%.

“I applied online” often means a nephew uploaded someone’s Aadhaar. When that application is rejected, the person who needs the benefit has no easy way to check status or appeal.

Construction worker, Mumbai
Takeaway: Digitization can concentrate navigation power in the hands of those fluent in bureaucratic and digital language. Digital payments spread quickly, but welfare access lagged because navigation, not devices, is the binding constraint.

When Algorithms Help and When They Hurt

Screening models can speed verification and reduce some manual bias. They can also mirror historical patterns. If past approvals cluster by district, caste, or occupation, a model can quietly learn that pattern as fairness unless someone actively audits and tunes it.

Rejections where a reason was given: ~22%; no reason given: ~78%.
Note: Efficiency gains can exclude people when decisions are opaque or cannot be contested.

“At least when the officer says no, I can ask why. The computer doesn’t talk.”

Respondent, Maharashtra

The Missing Ingredient: Transparency

Trust rarely comes from full automation in context-rich welfare settings; it comes from understanding. When citizens can see how decisions are made, and operators can explain them, confidence begins to rebuild.

How SARAL differs

Applicants receive plain-language outcomes:

  • “Your application was paused because income proof was missing.”
  • “To appeal, send ‘APPEAL’ to 720XXX within 7 days.”
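The messages above can be sketched as a small decision-code-to-message mapping. This is a hypothetical illustration, not SARAL's actual codebase: the decision codes, template text, and function names are my assumptions; the shortcode and 7-day window come from the example messages.

```python
# Hypothetical sketch: mapping machine decision codes to the kind of
# plain-language SMS outcomes described above. Codes and templates are
# illustrative assumptions, not SARAL's real schema.

OUTCOME_TEMPLATES = {
    "MISSING_INCOME_PROOF":
        "Your application was paused because income proof was missing.",
    "APPROVED":
        "Your application was approved.",
}

APPEAL_HINT = "To appeal, send 'APPEAL' to {shortcode} within {days} days."

def outcome_message(code: str, shortcode: str = "720XXX", days: int = 7) -> str:
    """Return a citizen-facing message for a machine decision code."""
    body = OUTCOME_TEMPLATES.get(
        code, "Your application needs review. Please visit your nearest centre."
    )
    # Every non-approval carries a concrete appeal path, not just a verdict.
    if code != "APPROVED":
        body += " " + APPEAL_HINT.format(shortcode=shortcode, days=days)
    return body

print(outcome_message("MISSING_INCOME_PROOF"))
```

The point of the sketch is the shape, not the code: every outcome resolves to a sentence a person can act on, and every pause or rejection names its appeal channel.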

How SARAL works for operators:

  • Operators see which rules fired, what flags were raised, and where discretion applies.
  • Any discretion or override is logged so teams can learn when and why judgment was used. The goal is shared accountability and learning, not surveillance.

Designing for Inclusion, Not Perfection

Inclusive intake starts with listening — a lesson from SmartAgro → Project Nagrik. Simple channels like voice prompts, SMS check-ins, and human verification counters often achieve the most valuable outcome: participation.

Applied themselves: ~36%; applied with help: ~64%.

“If the machine listens, I don’t mind if it decides.”

Respondent, Mumbai

From Insight to Action: Reimagining Welfare Access

The question isn’t machines versus humans; it’s which systems earn trust. A model that hides behind algorithms deepens distance; one that explains itself invites inclusion.

A bias-aware, citizen-visible approach

  • SMS/IVR intake for universal access
  • Rules mapping that explains every decision
  • Anomaly checks to surface potential bias
  • Audit trails linking citizen and operator actions

This is the road toward SARAL, where transparency restores trust and welfare systems learn to speak the language of their citizens.

This brings Phase 1 of the blog series to a close.

I’ve tried to capture the everyday challenges citizens face while navigating India’s welfare systems. Drawing on these insights, I began developing SARAL v1, a system built to make access more transparent and inclusive.

The journey doesn’t end here. Field testing begins later this year, and results will be shared publicly in early 2026.

Parth Mody
Engineer & Data Scientist — Building AI for Governance