Today, 1 May 2026, the Privacy Amendment Act 2025 brings IPP 3A into force. If you run outbound AI voice campaigns in New Zealand, this is the day your compliance posture quietly changes. Most platforms have not noticed. We have.
Here is the full picture: every Information Privacy Principle, what it means for AI voice agents, the cross-border trap that catches almost everyone, and a practical compliance checklist you can hand to your team this afternoon.
Why this matters now (IPP 3A is live)
The Privacy Act 2020 replaced the 1993 Act and introduced cross-border restrictions, mandatory breach notifications, compliance notices, and extraterritorial reach. The Privacy Amendment Act 2025 added IPP 3A, which fires today.
IPP 3A changes one specific thing: when you collect personal information about a person from a third-party source instead of from the person themselves, you must tell that person. For AI voice operators running outbound campaigns from CRM data, scraped lead lists, or partner referrals, this is the new rule. The agent's opening disclosure now needs to state, in some form, where the prospect's name and number came from.
The Privacy Act is technology-neutral. There is no AI exemption. Every IPP applies whether the caller is a human, a recorded message, or a Claude-powered conversational agent. The Office of the Privacy Commissioner (OPC, also known as Te Mana Mātāpono Matatapu) made this position explicit in its September 2023 AI guidance and has not softened since.
The 13 IPPs in plain English
What each IPP says, what an AI voice operator must do, and the most common failure mode.
IPP 1, Purpose of collection. Only collect personal information for a lawful, defined purpose. Write down the purpose ("book appointments", "qualify mortgage rollover leads") before you deploy. Do not collect date of birth or address unless directly required. Common failure: recording calls "just in case" with no defined retention purpose.
IPP 2, Source. Collect from the person themselves where practicable. Capture in realtime on the call, not from data brokers with murky consent chains. Common failure: outbound from purchased lists where the original consent has decayed.
IPP 3, Notice when collecting directly. Tell the caller, before or at collection, that you are collecting, why, who will receive it, whether it is mandatory, what happens if they refuse, and how they can access or correct it. Practically: a clear opening line at call start. "Hi, this is Sara, an AI assistant calling from Smith Plumbing. This call is recorded so our team can confirm details. Press zero anytime to speak to a person."
IPP 3A, Indirect collection notice (live 1 May 2026). When the data did not come from the caller themselves, you still notify them. Outbound agents using CRM or third-party leads must include where the information came from. "Your number was provided to us by ABC Brokers under their referral programme."
IPP 4, Manner of collection. Not unlawful, unfair, or unreasonably intrusive. Extra protection applies for children. Common failure: AI agents named "Sarah" without disclosing they are AI when asked directly. The Fair Trading Act layer compounds the risk.
IPP 5, Storage and security. Reasonable safeguards against loss, unauthorised access, modification, or disclosure. Encrypt recordings in transit and at rest, enforce role-based access on transcripts, require multi-factor auth for admin, and run vendor security reviews. The December 2025 ruling against two PAK'nSAVE stores (CE/0420 [2025] NZPrivCmr3) was an IPP 5 breach. Public S3 buckets and unencrypted CRM fields are a guaranteed finding when the Commissioner asks.
IPP 6, Access. Individuals can request their information. Agencies respond within 20 working days. You must be able to retrieve any caller's transcript and recording, indexed by phone number. "We cannot search recordings by caller" is not a legal defence.
IPP 7, Correction. Individuals can correct their information, or attach a statement noting a dispute. You need a process to fix AI-extracted fields when the agent misheard a name, address, or quoted price. Otherwise the wrong data ends up in your CRM and downstream automations.
IPP 8, Accuracy before use. Take reasonable steps to ensure information is accurate, current, and complete before using it. The OPC's AI guidance is explicit: ensure human review before acting on AI outputs. Do not auto-quote prices off a transcript. Do not auto-route triage off a sentiment score alone.
IPP 9, Retention. Do not keep information longer than the lawful purpose requires. Set auto-delete on recordings (90 days for most call types, 12 months if you have a clear retention reason). Indefinite retention is one of the most common findings.
IPP 10, Use. Use information only for the original purpose, or a directly related one. Do not feed customer recordings into a vendor's voice-cloning model without fresh consent. The OPC AI guidance flags this directly: reusing information for AI training may violate IPP 10.
IPP 11, Disclosure. Do not disclose to third parties unless the original purpose covers it or an exception applies. List your vendors (Twilio, Retell, OpenAI, Anthropic, ElevenLabs) in your privacy policy. Have data processing agreements in place. Common failure: pushing transcripts to a marketing analytics tool without disclosure.
IPP 12, Cross-border disclosure. New in 2020. Detailed below.
IPP 13, Unique identifiers. Do not assign or use another agency's unique identifier. Do not ask callers for IRD, NHI, driver's licence, or passport numbers to identify them in your system. Use your own customer ID instead.
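Several of these principles reduce to data-layer plumbing. Here is a minimal sketch, assuming a hypothetical in-memory store (the field names and the 90-day window are illustrative, not prescribed), of IPP 6 retrieval by phone number and IPP 9 auto-deletion:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical store; a real deployment would query your recording database.
RETENTION_DAYS = 90  # documented lawful purpose, e.g. booking confirmation audit

recordings = [
    {"phone": "+6421000001", "recorded_at": datetime(2026, 1, 10, tzinfo=timezone.utc), "transcript": "..."},
    {"phone": "+6421000002", "recorded_at": datetime(2026, 4, 20, tzinfo=timezone.utc), "transcript": "..."},
]

def find_by_phone(store, phone):
    """IPP 6: retrieve every recording for a caller, indexed by phone number."""
    return [r for r in store if r["phone"] == phone]

def purge_expired(store, now=None):
    """IPP 9: keep only recordings inside the documented retention window."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [r for r in store if r["recorded_at"] >= cutoff]
```

In production this would be a scheduled job against the real recording store; the point is that both retrieval and deletion are indexed and automatic, so "we cannot search recordings by caller" never has to be your answer.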
Cross-border: the LLM problem
This is the trap. Every NZ AI voice agent stack today routes audio and transcripts through US-hosted services. OpenAI for the language model, Anthropic for Claude, ElevenLabs for the voice, Deepgram or AssemblyAI for the speech-to-text, often Twilio for the telephony. Every one of those API calls is a cross-border disclosure of personal information.
IPP 12 says you may disclose overseas only if you can rely on one of six grounds:
1. Express informed consent from the individual, knowing the recipient may not give comparable protection.
2. The overseas recipient operates in NZ and is subject to the Privacy Act. Rare for US LLM vendors.
3. Comparable privacy laws in the recipient's jurisdiction. The US is not considered comparable by default.
4. A prescribed binding scheme.
5. A prescribed country (none prescribed yet).
6. Contractual safeguards: model clauses that bind the overseas recipient to Privacy Act-equivalent protection.
Practically, you need both: sign data processing agreements incorporating model clauses with every US vendor in your stack, AND get express consent in your call-start disclosure. The OPC published its Model Contractual Clauses in 2022. Belt and braces is the only safe answer.
This is also why we recommend explicit IPP 3 disclosure at the start of every call: it doubles as the express-consent ground for IPP 12.
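That belt-and-braces rule is easy to enforce mechanically. Below is a sketch of a vendor registry gate; the vendor names come from this article, but the registry structure and the `may_disclose` helper are hypothetical illustrations, not a real API:

```python
# Hypothetical vendor registry: each overseas processor must have signed
# model clauses (IPP 12 ground 6) AND the call must have captured express
# consent in the opening disclosure (ground 1). Require both: belt and braces.
VENDORS = {
    "twilio":     {"model_clauses_signed": True},
    "elevenlabs": {"model_clauses_signed": True},
    "openai":     {"model_clauses_signed": False},  # illustrative gap: no DPA yet
}

def may_disclose(vendor: str, caller_consented: bool) -> bool:
    """Fail closed: disclose offshore only when both IPP 12 grounds hold."""
    v = VENDORS.get(vendor)
    return bool(v and v["model_clauses_signed"] and caller_consented)
```

Wiring a check like this in front of every outbound API call means a missing DPA or a non-consenting caller fails closed, rather than silently leaking personal information offshore.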
Call recording: what NZ allows
NZ is a one-party consent jurisdiction under the Crimes Act 1961, sections 216A to 216C. If your AI agent is a party to the call, recording it without disclosing is legal under the Crimes Act. But IPP 3 still requires you to disclose the recording and the purpose.
The disclosure formula at call start covers both:
1. Identify the call as automated or AI-driven.
2. Identify the business calling.
3. State the purpose.
4. State that the call is recorded and why.
5. Offer a path to a human or an opt-out.
A workplace note: Lane Neave and Wynn Williams have both published commentary on cases where secret workplace recording, even where lawful under the Crimes Act, breached the good-faith duty under the Employment Relations Act. Recording employee calls without telling them is risky regardless of caller consent.
The OPC's eight AI expectations
The OPC published AI guidance in September 2023, updated 2024. Eight clear expectations:
1. Senior leadership sign-off before deploying AI on personal information.
2. Necessity and proportionality. Is AI actually needed for this task?
3. Privacy Impact Assessment (PIA) before deployment.
4. Transparency. Tell people how AI is used, when, and why.
5. Engage with Māori on the impacts on information as taonga.
6. Accuracy procedures, including human review before AI outputs trigger action.
7. Do not let the AI retain or disclose personal information beyond the defined purpose.
8. Audit your training data. Verify it did not breach IPPs 1 to 4.
The Commissioner's contact for AI questions is ai@privacy.org.nz. The framing throughout is technology-neutral: same rules, no carve-out.
What enforcement looks like in 2026
Penalties under the 2020 Act are still low by global standards. The maximum fine for failure to notify a serious breach is $10,000. There are no GDPR-style turnover-based fines. The Commissioner has publicly indicated she would like that to change.
Enforcement so far is visible mainly in complaint volume, which is up 21% year on year to 1,598 cases in 2024-25. There is no published OPC decision yet specifically on AI voice agents or AI call recording. That is first-mover risk: the first sloppy operator to hit the news will define the precedent.
Practical compliance checklist
Hand this to your team. Each item is binary: it is true of your agents today, or it isn't.
At deployment time
- Purpose of collection documented before go-live (IPP 1).
- Privacy Impact Assessment completed, with senior leadership sign-off.
- Data processing agreements with model clauses signed with every overseas vendor (IPP 12).
- Retention period documented and auto-delete configured on recordings (IPP 9).
- Recordings encrypted in transit and at rest, with role-based access on transcripts (IPP 5).
On every call
- The agent identifies itself as AI and names the business.
- Purpose and recording are disclosed at call start (IPP 3).
- The source of the prospect's details is stated on third-party-sourced outbound calls (IPP 3A).
- A path to a human or an opt-out is offered.
Ongoing
- Any caller's transcript and recording can be retrieved by phone number within 20 working days (IPP 6).
- A process exists to correct AI-extracted fields (IPP 7).
- Human review happens before AI outputs trigger action (IPP 8).
- No recordings are reused for model training without fresh consent (IPP 10).
- The vendor list is disclosed in the privacy policy and reviewed annually (IPP 11).
If your platform doesn't tick all of these, you are not legally compliant under the 2020 Act. Most NZ AI voice deployments we have audited fail on IPP 12 (no model clauses) and IPP 9 (indefinite retention).
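Because every item is binary, the checklist is scriptable. A toy self-audit sketch (item names and values are illustrative; the two False entries mirror the most common audit failures noted above):

```python
# Hypothetical self-audit: each checklist item is a boolean, and the
# deployment passes only when all of them are true.
checklist = {
    "purpose_documented": True,      # IPP 1
    "pia_signed_off": True,          # OPC expectation
    "call_start_disclosure": True,   # IPP 3 / IPP 3A
    "model_clauses_signed": False,   # IPP 12 gap
    "retention_auto_delete": False,  # IPP 9 gap
}

gaps = [item for item, done in checklist.items() if not done]
compliant = not gaps
print("compliant" if compliant else f"gaps: {gaps}")
```

Running something like this in CI for every agent configuration turns "are we compliant?" from a quarterly scramble into a build-time check.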
Frequently asked questions
My AI voice agent disclosed it was AI. Am I covered?
That covers part of IPP 3 and IPP 4. You still need IPP 12 model clauses, IPP 9 retention, IPP 5 security, and the IPP 3A indirect-source notice if applicable. Disclosure alone is not the whole compliance posture.
Do I need to do a Privacy Impact Assessment?
The OPC's AI guidance treats PIAs as effectively mandatory for any AI deployment touching personal information. They are not statutorily required for every system, but the Commissioner expects them. If your agent ever ends up in a complaint, the first question will be "did you run a PIA?".
What if my LLM vendor refuses to sign model clauses?
Most major vendors (OpenAI, Anthropic, Google, Twilio) will sign DPAs with cross-border clauses on enterprise contracts. If yours will not, you are relying on caller consent (IPP 12 ground 1) which is fragile because the consent has to be informed. A disclosed call recording with US LLM processing is a defensible position; an undisclosed one is not.
Does this apply to inbound calls only, or outbound too?
Both. Inbound calls collect personal information from the caller (IPPs 1 to 11 apply directly). Outbound calls disclose to the prospect that you have their number from somewhere (IPP 3A from today, plus IPPs 1 to 11 for any new info you collect).
What about Te Reo Māori callers and te ao Māori considerations?
The OPC's AI guidance specifically calls out engagement with Māori on the impacts on information as taonga. If your business serves Māori communities, expect the Commissioner to ask whether you consulted iwi or kaupapa Māori advisors during the PIA. Multilingual voice support, including Te Reo, is the technical floor; the cultural floor is engagement.
My agent stores recordings for 12 months. Is that too long?
It depends on the lawful purpose. For complaint resolution and audit, 12 months is usually defensible. For booking confirmations, 90 days is more typical. Whatever you pick, document it, automate the deletion, and review annually. Indefinite retention is the failure mode.
What about Australian callers calling our NZ business?
The Privacy Act 2020 applies to your business as the agency. Your AU-based callers are also covered by the Australian Privacy Act 1988 (APPs) on top. Your compliance posture has to satisfy both. We covered the AU side in Australian telemarketing law for AI voice agents.
Where can I find the IPPs in full?
The Privacy Act 2020 is at legislation.govt.nz. The OPC's plain-English IPP hub is at privacy.org.nz. The OPC AI guidance is at the same site under the AI A-Z topic.
Run a Privacy Act audit on your current voice agent
Bring your current opening disclosure, retention policy, and vendor list. We run the IPP-by-IPP check, identify the gaps, and rebuild the agent's compliance layer in under a week.
Leonardo Garcia-Curtis
Founder & CEO at Waboom AI. Building voice AI agents that convert.