You sold your iris for PHP 3,000. Now what?

EDITORIAL: Iris scanning is often described as one of the most secure forms of biometric identification. The science behind it is impressive: every person’s iris pattern is unique, stable, and hard to forge. But technical strength alone doesn’t make a system safe. When people are asked to trade something permanent from their body for a few thousand pesos, the deal exposes deeper problems in consent, fairness, and data protection.


The recent order from the National Privacy Commission (NPC) against World App brings this issue closer to home. It raises a question that goes beyond any single company: when is “secure” data collection actually safe for the people involved?

Biometric identifiers like iris scans can unlock major advances in authentication and digital identity, but they also carry lasting risks when used carelessly. In the Philippines, where many are drawn to quick incentives, a PHP 3,000 payment for something as sensitive as a person’s iris shows how privacy can be undervalued and misunderstood.

Iris scanning is one of the most secure biometrics

The iris, the colored ring around our pupils, carries complex and essentially random patterns that make it highly distinctive. Studies show that iris recognition has far lower error rates than other biometrics such as fingerprints. Its structure also stays consistent throughout a person's life, unlike facial features that can change with age or health.

Advanced systems use near-infrared imaging and anti-spoofing checks to verify that a live person is being scanned, not a photo or replica. This makes iris scanning reliable for high-security use cases, such as border control and access management in sensitive facilities.
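To see why matching is so accurate, and why a leaked template is so sensitive, here is a minimal sketch of how a Daugman-style matcher compares two binary iris codes using a Hamming-distance threshold. The code length, noise rate, and threshold below are illustrative assumptions, not values from World App or any production system.

```python
import numpy as np

# Hypothetical iris codes: in Daugman-style systems, the iris texture is
# encoded into a fixed-length binary template ("iris code").
rng = np.random.default_rng(seed=7)
enrolled_code = rng.integers(0, 2, size=2048, dtype=np.uint8)

# A fresh scan of the same eye differs only slightly (sensor noise, eyelids),
# while a different person's iris looks essentially random by comparison.
same_eye = enrolled_code.copy()
noisy_bits = rng.random(2048) < 0.08      # flip ~8% of bits to mimic noise
same_eye[noisy_bits] ^= 1
different_eye = rng.integers(0, 2, size=2048, dtype=np.uint8)

def hamming_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Fraction of bits that disagree between two binary iris codes."""
    return float(np.count_nonzero(a != b)) / a.size

THRESHOLD = 0.32  # illustrative decision threshold, not a vendor value

for label, probe in [("same eye", same_eye), ("different eye", different_eye)]:
    d = hamming_distance(enrolled_code, probe)
    verdict = "match" if d < THRESHOLD else "no match"
    print(f"{label}: distance={d:.3f} -> {verdict}")
```

In this toy run the same eye lands well below the threshold while a stranger's code sits near 0.5, which is why false matches are rare. It also shows the flip side: whoever holds the enrolled template can keep matching against it indefinitely.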

When strength becomes risk

The same qualities that make iris data powerful also make it risky. Unlike passwords, an iris cannot be changed. If raw images or the templates derived from them leak, the compromise is permanent. Affected users cannot simply "reset" their identity.

The danger increases when organizations collect more data than needed, or when their privacy policies fail to specify how and why the data is processed. In the World App case, the NPC found issues with consent, transparency, and bundling of privacy notices—warning that users were not fully informed about what they were agreeing to.

Why PHP 3,000 isn’t worth your iris

Offering a small sum for a biometric scan may sound like easy money, but the math doesn't add up. The cost of running a proper iris system (secure cameras, encrypted databases, audit trails, and compliance with privacy laws) runs far higher than any one-time payment to users. A PHP 3,000 payout, much less one paid in crypto tokens whose value swings with the market, doesn't buy the safeguards that make a biometric program trustworthy.


More importantly, consent tied to financial reward isn’t truly free. In low-income settings, incentives can blur the line between choice and necessity. People may agree not because they understand the risks, but because they need the money. That makes the consent ethically and legally weak.

The real price of privacy

Collecting biometric data responsibly requires far more than token payments. It means conducting privacy impact assessments, commissioning independent audits, implementing transparent retention policies, and registering properly with local regulators. Claims of "instant deletion" or "secure storage" need proof, not just marketing language.

The NPC order against World App serves as a reminder: data protection is not a checkbox but a continuing responsibility. Every organization handling sensitive personal data must prove that it can be trusted with what it collects.

The Rundown

Iris scanning can be a secure way to verify identity, but only when managed with care, compliance, and transparency. The technology’s strength doesn’t cancel out the risks of misuse or exploitation. PHP 3,000 for a scan may sound appealing, but it’s a fraction of the real value of what people give away.

Should companies be allowed to offer money in exchange for immutable biometric data? And how should regulators ensure that Filipinos’ most private identifiers remain truly private?

Carl walked away from a corporate marketing career to build WalasTech from the ground up—now he writes no-fluff tech stories as its Founder and Editor-in-Chief. When news breaks, he’s already typing. Got a tip? Hit him up at [email protected].