Every second Australian board paper I have read this year opens with the same sentence: ransomware is on the rise. It's true, but it's also lazy. The more useful question — the one that actually changes how you prepare — is how it has changed. Because what we are responding to in 2025 is not the same incident we were responding to in 2021, and a lot of uplift programs are still quietly pointed at the wrong target.
Here's what the last twelve months have actually looked like from the DFIR side of the desk, and what I'd ask a board to do about it this quarter.
The shape of the threat, from primary sources
ASD's most recent public figures, in the Annual Cyber Threat Report 2023–24, recorded 121 ransomware incidents the agency responded to across that financial year — about 11% of all reported incidents, with ransomware making up the majority of extortion-related cases the agency touched (ASD, Annual Cyber Threat Report 2023–24). Those are the incidents ASD was brought into. The reported universe is larger, and the unreported universe is larger still.
On the privacy side, the OAIC's Notifiable Data Breaches Report: July to December 2024 recorded the highest half-year figures since the NDB scheme started, with malicious or criminal attacks driving 69% of notifications and ransomware a material slice of that (OAIC, NDB Report: July to December 2024). The signal in that report is less about the headline count and more about the mix: ransomware notifications now routinely include a separate data-theft limb, because operators are stealing data before they encrypt, and the encryption itself is increasingly optional.
If your incident response plan still treats encryption as the event and exfiltration as an aggravating factor, your plan is pointing at 2020.
What has actually changed
Double extortion is now the baseline, not the twist
The groups we tracked across engagements in the last year — and the ones named openly in vendor threat reporting, including a continued stream of LockBit, Akira, and BlackSuit-branded activity — routinely split their monetisation across two levers: encrypt the estate to halt operations, and publish stolen data to force a settlement even if backups are clean. In several matters I've seen this year, the operator skipped encryption entirely and went straight to data-leak-site listing. Pure extortion. No ransomware binary to reverse.
That reframes the whole containment problem. You cannot "restore from backup" your way out of an exfil-only matter. The evidentiary question becomes what left, when, to where, and for whom, and that question is answered by egress telemetry and endpoint forensics, not by your backup vendor's dashboard.
Initial access is still boring, and that should tell you something
Across the incidents I've worked, initial access in 2024–25 has been overwhelmingly mundane: credentials bought from an infostealer broker, a phishing payload that survived an under-tuned email gateway, a public-facing appliance missed in a patching cycle, an MFA prompt worn down by repeated push. None of this is new and none of it is sophisticated. It persists because the controls that stop it are unglamorous: conditional access tuned to actually deny, MFA that is phish-resistant rather than push-based, an asset inventory that matches reality, and an offboarding process that actually revokes tokens.
If the uplift program on your desk is built around a shiny detection technology but has a known gap in one of those four areas, the budget is in the wrong place.
Identity is the new perimeter, and the logs are rarely ready
Once inside, the pattern I see most often is not lateral malware; it is lateral identity. An attacker lands on one endpoint, steals a session token or Kerberos material, and becomes a user. From there the activity is indistinguishable from a legitimate admin until you go looking. The problem is that most of the environments I walk into do not retain identity-provider logs at the granularity or duration needed to reconstruct that movement after the fact. Ninety days of sign-in logs is not enough for a forensic story that needs to run back to an initial compromise four months earlier.
Fix the logging before the incident, not during it. Sign-in logs, token issuance and refresh events, admin role assignments, conditional access policy changes — those are the DFIR breadcrumbs, and in Australian tenants they are often one licence tier away from being useful.
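To make the point concrete, here is a minimal sketch of the kind of question those breadcrumbs answer. The record format and field names are invented for illustration (real identity-provider exports, such as Entra ID sign-in logs, carry far more fields); the two checks are the ones that matter in an engagement: has this account signed in from a source it has never used before, and do the retained logs actually reach back to the suspected initial access?

```python
from datetime import datetime

# Hypothetical, simplified sign-in records for illustration only.
SIGNINS = [
    {"user": "admin.jane", "ip": "203.0.113.7",   "ts": "2025-03-01T09:14:00"},
    {"user": "admin.jane", "ip": "203.0.113.7",   "ts": "2025-03-02T09:02:00"},
    {"user": "admin.jane", "ip": "198.51.100.23", "ts": "2025-03-02T09:05:00"},
]

def first_seen_ips(events):
    """Flag sign-ins from a source IP the account has never used before."""
    seen, flagged = {}, []
    for e in sorted(events, key=lambda e: e["ts"]):
        known = seen.setdefault(e["user"], set())
        if e["ip"] not in known:
            if known:  # ignore the account's very first recorded sign-in
                flagged.append(e)
            known.add(e["ip"])
    return flagged

def retention_covers(oldest_log: str, suspected_compromise: str) -> bool:
    """Can the retained logs reach back to the suspected initial access?"""
    return datetime.fromisoformat(oldest_log) <= datetime.fromisoformat(suspected_compromise)

if __name__ == "__main__":
    for e in first_seen_ips(SIGNINS):
        print("new source IP for", e["user"], "->", e["ip"], "at", e["ts"])
    # Ninety days of retention cannot answer a four-month-old question:
    print("coverage ok:", retention_covers("2025-01-01T00:00:00", "2024-11-15T00:00:00"))
```

The second function is the one boards should care about: if it returns false for your oldest retained sign-in log against a realistic dwell time, the forensic story will have a hole in it before the incident even starts.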
The regulatory ground has moved under you
This is where the practitioner advice has to give way to the legal frame, because ransomware response in Australia is now a reporting problem as well as a technical one, and the reporting clocks are not aligned.
Cyber Security Act 2024 (Cth) — mandatory ransomware payment reporting. The Cyber Security Act 2024 (Cth) received Royal Assent on 29 November 2024. The Cyber Security (Ransomware Payment Reporting) Rules 2025 commenced on 30 May 2025, imposing a 72-hour reporting obligation on any "reporting business entity" that makes, or has a payment made on its behalf, in response to a cyber extortion demand (Cyber Security Act 2024 (Cth); Cyber Security (Ransomware Payment Reporting) Rules 2025). The threshold catches entities with an annual turnover above AUD 3 million, and all responsible entities for critical infrastructure assets under the Security of Critical Infrastructure Act 2018 (Cth). There is no minimum payment amount. The report goes to ASD via the ACSC portal. Home Affairs has flagged an education-first period running through 31 December 2025 before active enforcement begins, but the obligation itself is live right now.
Privacy Act 1988 (Cth) — NDB. If personal information was accessed or exfiltrated and serious harm is likely, the Notifiable Data Breaches scheme under Part IIIC of the Privacy Act 1988 (Cth) obliges you to assess expeditiously and notify the OAIC and affected individuals "as soon as practicable". That is not a 72-hour clock; it is an assessment obligation with a soft 30-day outer bound, and the regulator's expectation is that you move faster where the facts allow.
APRA CPS 234 — for regulated entities. APRA Prudential Standard CPS 234 Information Security requires APRA-regulated entities to notify APRA within 72 hours of becoming aware of a material information security incident (APRA, CPS 234). The clock starts on awareness of materiality, not on confirmation of scope.
You can end up inside three separate reporting clocks within a week of a single intrusion — ASD (payment), OAIC (personal information), and APRA (material information security incident) — with different triggers and different information expectations. If your incident response runbook does not name which lawyer, which executive, and which form belongs to each clock, the runbook is not finished.
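The misaligned clocks are easier to see laid out as data. This is a deliberately coarse sketch, not legal advice: the real obligations turn on legal thresholds (turnover, "serious harm", materiality) that need counsel, and the NDB entry below models only the 30-day assessment outer bound, not the "as soon as practicable" expectation.

```python
from datetime import datetime, timedelta

# Simplified model of the three reporting clocks discussed above.
# Trigger names are illustrative; the windows are the statutory outer bounds.
CLOCKS = {
    "ASD ransomware payment report":   ("payment_made",      timedelta(hours=72)),
    "APRA CPS 234 notification":       ("materiality_aware", timedelta(hours=72)),
    "OAIC NDB assessment outer bound": ("breach_suspected",  timedelta(days=30)),
}

def deadlines(triggers: dict) -> dict:
    """Map each clock whose trigger has fired to its latest notification time."""
    return {
        name: triggers[trigger] + window
        for name, (trigger, window) in CLOCKS.items()
        if trigger in triggers
    }

if __name__ == "__main__":
    fired = {
        "materiality_aware": datetime(2025, 6, 2, 14, 0),
        "payment_made":      datetime(2025, 6, 3, 9, 0),
    }
    for name, due in sorted(deadlines(fired).items(), key=lambda kv: kv[1]):
        print(f"{name}: notify by {due:%Y-%m-%d %H:%M}")
```

Note what the model makes obvious: the three clocks start from three different events, so a single intrusion can fire them days apart, and the runbook has to track each one independently.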
What I'd actually do this quarter
This isn't a checklist. It's what I would push for if I were sitting inside an Australian mid-market business reading the above and deciding where next quarter's effort goes.
Shorten the distance between awareness and decision. The single biggest driver of blast radius I see is not a technical control; it is the hour or two lost between "something is wrong" and "we are now running an incident, with these people, under this authority". Write the activation criteria down, name the decision-makers, and rehearse the call.
Stand up an exfil-centric detection story. Outbound volume baselining on your egress points, data loss prevention on the sensitive data stores you already know about, and — critically — a retention period on network and identity logs that matches the dwell times you are actually seeing in public reporting. Thirty days is not enough. Aim for twelve months on the signals that matter.
Get your legal structure agreed before the incident. Whoever your breach counsel is going to be, have that relationship in place and documented. For matters that are likely to involve regulators, litigation, or forensic findings that could be tendered, the engagement structure on day zero determines what your work product looks like on day ninety. That is a conversation to have in a quiet week, not at 2am on a Sunday.
Decide the payment question in daylight. Have an internal position on whether you would pay, under what conditions, who approves it, and what the sanctions-screening and Cyber Security Act reporting process looks like if you do. Deciding under duress, with the clock running, is how governance failures happen.
Test the backups you think you have. I have lost count of the number of engagements where the backups existed but were domain-joined, on the same credential plane as production, and encrypted alongside the primary estate. Offline, immutable, and periodically restored is the only configuration that actually counts.
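The outbound volume baselining mentioned above does not need a product to prototype. Here is a minimal statistical sketch, with invented numbers, of the core idea: build a baseline from a host's ordinary daily egress and flag the day that sits far outside it. Real deployments work per-host and per-destination with much richer features; this is the shape of the check, not the check itself.

```python
import statistics

# Hypothetical daily outbound volumes (GB) for one host over a fortnight.
# The final day models a bulk exfiltration event.
daily_egress_gb = [2.1, 1.9, 2.4, 2.0, 2.2, 1.8, 2.3,
                   2.1, 2.0, 1.9, 2.2, 2.1, 2.0, 41.7]

def egress_outliers(volumes, z=3.0):
    """Flag days whose outbound volume sits more than `z` standard
    deviations above a baseline built from the *other* days."""
    outliers = []
    for i, v in enumerate(volumes):
        rest = volumes[:i] + volumes[i + 1:]
        mean = statistics.fmean(rest)
        stdev = statistics.stdev(rest)
        if stdev and (v - mean) / stdev > z:
            outliers.append((i, v))
    return outliers

if __name__ == "__main__":
    for day, gb in egress_outliers(daily_egress_gb):
        print(f"day {day}: {gb} GB outbound is well above baseline")
```

Excluding each day from its own baseline is the small design choice that matters: a 40 GB exfil day would otherwise inflate the mean and standard deviation enough to hide itself. The same leave-one-out logic applies whatever tooling ends up running the check in production.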
Working with us on a ransomware matter
If you want a second pair of eyes on your ransomware readiness — runbook, logging posture, legal structure, or the reporting-clock question specifically — our DFIR practice is set up for exactly that kind of engagement, and we can structure it through our associated legal practice where a matter needs to be run carefully from the outset. Details are at /#services, or reach us directly via /#contact.
The next twelve months of ransomware in Australia will not look like the last twelve. Plan for the attacker you are going to meet, not the one the slide deck is still using.
Is this happening to you right now?
Artificer Cyber deploys within hours. No retainer required — we scope the engagement on first contact and follow strict integration protocols with your legal advisors from the start.