April 25, 2025
Take a walk through any major shopping mall in Johannesburg, Cape Town, or
Durban, and there's a good chance your face is being scanned, logged, and
analysed — all without your knowledge. Facial recognition technology (FRT) is
quietly expanding its reach into South Africa’s public and semi-public spaces, raising
urgent constitutional and ethical questions that we cannot afford to ignore.
Across South Africa, residential estates now routinely deploy facial recognition at
their entrances, capturing and cataloguing the biometrics of visitors and domestic
workers without meaningful consent or transparency. Private security firms and body
corporate administrators claim this is in the interest of safety — but rarely explain
how the data is stored, for how long, or who has access to it. Meanwhile, our own
cell phones unlock with our faces and fingerprints, creating a digital identity trail that
often stretches far beyond what users understand.
At first glance, the promise of FRT seems benign, even beneficial: enhanced
security, quicker criminal identification, and safer communities. But
behind this technological convenience lies an uncomfortable truth — we are
sleepwalking into a surveillance society without the legal guardrails to protect our
most basic rights, chief among them: the right to anonymity.
The Right to Privacy in the South African Constitution
Although our law contains no explicit “right to anonymity”, the
Constitution robustly protects the right to privacy enshrined in section 14. Our
courts have embraced this right and have affirmed that section 14 extends beyond
the home to places where individuals have a legitimate expectation of privacy, even
in public.
The Constitutional Court in Investigating Directorate: Serious Economic Offences v
Hyundai Motor Distributors (Pty) Ltd established that the right to privacy, as
guaranteed in the Constitution, did not relate solely to an individual within his or her
intimate space. It held that when people moved beyond this established “intimate
core”, they still retained a right to privacy in their social capacities. Thus, when
people are in their offices, in their cars or even on their mobile phones, they still
retain a right to be left alone unless certain conditions are satisfied.
The right to exist in public without being constantly identified is therefore not a luxury
— but the oxygen of a healthy democracy. It is what allows activists, whistleblowers,
or even ordinary citizens to gather, dissent, and exist without fear of reprisal.
A Legal Vacuum with Dangerous Implications
The Protection of Personal Information Act 4 of 2013 (“POPIA”) governs how
personal information is processed. Section 26 of POPIA provides that biometric data,
such as facial images, qualifies as “special personal information”, the processing of
which is generally prohibited unless strict conditions are met — including explicit
consent (section 27(1)(a)), a legal obligation (section 27(1)(b)), or where it is
necessary for the proper performance of a public law duty.
Further, section 18 requires that data subjects be notified when their personal
information is collected — including details about the purpose, the responsible party,
and their rights to access or object. In practice, this is almost never done in the
context of facial recognition systems, especially those embedded in estate gates,
mall entrances, or crowd monitoring tools.
Additionally, section 10 mandates that personal information be adequate, relevant,
and not excessive in relation to the purpose for which it is processed. Blanket, real-time surveillance
of faces in public — with no specific threat, target, or lawful purpose — arguably fails
this proportionality test.
Yet despite these protections on paper, enforcement remains toothless, and the
public is largely unaware that such practices may already be unlawful.
Beyond Consent: What About Choice?
The argument that “you can always choose not to enter” doesn’t hold water. Public
spaces and quasi-public spaces (like malls and residential estates) are part of daily
life. Opting out of them is not feasible — nor should it be necessary. The burden
must shift to those deploying the technology, not the citizens forced to live under its
gaze.
Moreover, the global track record of facial recognition technology is fraught with bias.
Studies show that it is less accurate when identifying women and people of colour —
a chilling thought in a country still grappling with the legacy of racial injustice.
What Should Be Done?
We urgently need a public conversation and legislative action. At a minimum:
- The use of FRT by both public and private actors should require transparency and prior impact assessments.
- Real-time facial recognition in public spaces should be limited or banned unless authorised by a court.
- A clear system of redress and accountability must exist when data is misused or incorrect identifications occur.
Conclusion
The march of technology is inevitable, but the surrender of rights is not. We must
insist on a future where safety does not come at the cost of freedom and privacy —
and where surveillance does not eclipse civil liberties. South Africa has the legal
foundations to protect its citizens from unchecked surveillance. What it lacks — for
now — is the political will.
It’s time we stopped treating privacy as a passive right and started defending it as an
active one.