The SolarWinds SUNBURST Attack: How Clean Source Code Produced a Backdoor

In December 2020, FireEye discovered that a state-sponsored threat actor had compromised SolarWinds' Orion network management platform and distributed a backdoor to approximately 18,000 organisations worldwide. The US government attributed the operation with "high confidence" to Russia's Foreign Intelligence Service (SVR), a threat actor also tracked as APT29 (Advanced Persistent Threat 29) and Cozy Bear.
Of those 18,000 organisations that received the trojanised update, fewer than 100 were selected for further exploitation. The confirmed targets included the US Department of the Treasury, Department of State, and Department of Homeland Security (including CISA itself). The Department of Energy (including the National Nuclear Security Administration), National Institutes of Health, and 27 US Attorneys' offices were also compromised. SolarWinds' own customer base included 425 of the US Fortune 500, all ten of the top US telecommunications companies, and all five branches of the US military.
The operation ran for roughly 14 months before discovery, from at least late 2019 through December 2020. SolarWinds' stock dropped approximately 25% in two days and 35% by the end of the month.
What everyone thinks happened
The headline version is "Russian hackers broke into SolarWinds." That's technically correct but misses the part that made this operation different from every other supply chain compromise that came before it.
The attackers didn't break into SolarWinds' network and modify the Orion software. They didn't tamper with the source code repository. They didn't alter the compiled binaries after the fact. SolarWinds confirmed in their SEC (Securities and Exchange Commission) filing that the malicious code was "not present in the source code repository of the Orion products." If you had pulled the Orion source from version control at any point during the compromise, you would have found clean code.
The operation targeted the build system itself, not the application code. A separate piece of malware sat on the build server and waited for a compilation event. When it detected that Orion was being built, it swapped in backdoored source files at the exact moment the compiler needed them, then restored the originals when the build completed. The resulting DLL (Dynamic Link Library) was digitally signed with SolarWinds' legitimate code signing certificate because it had been built by SolarWinds' legitimate build infrastructure.
That distinction matters because it broke every assumption the industry held about supply chain trust. Code review would not have caught it because the code was clean. Binary analysis against the source repository would have shown a mismatch, but that's a check most organisations were not performing in 2020. The signed certificate meant every downstream security tool that verified signatures gave the DLL a clean bill of health.
What actually happened
The attack used three distinct malware components, each with a specific role in the chain. CrowdStrike tracked the overall operation as StellarParticle, while Microsoft designated the threat actor as NOBELIUM.
SUNSPOT: the build system injector
SUNSPOT was the implant that lived on SolarWinds' build server. It persisted through a scheduled task that executed at boot and created a mutex (a mutual exclusion lock that prevents duplicate instances from running) to make sure only one copy ran at a time.
Its sole job was continuous surveillance of the build environment. Every second, SUNSPOT checked for running instances of MsBuild.exe, the Microsoft build engine. It didn't look for the process name directly. Instead, it used an algorithm called ElfHash to calculate a hash of each running process name and compared that against a hardcoded value (0x53D525). If the hash matched, SUNSPOT spawned a new thread to check whether the build was compiling Orion specifically.
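Hashed process-name checks like this are cheap to implement. The sketch below uses the classic PJW/ELF string hash as a stand-in; it is not claimed to reproduce SUNSPOT's exact variant or its 0x53D525 constant, so the target value is derived from the sketch itself rather than hardcoded:

```python
def elf_hash(name: str) -> int:
    """Classic PJW/ELF string hash over a case-folded process name."""
    h = 0
    for ch in name.lower():
        h = ((h << 4) + ord(ch)) & 0xFFFFFFFF
        high = h & 0xF0000000
        if high:
            h ^= high >> 24
            h &= ~high & 0xFFFFFFFF
    return h

# In the malware only the numeric constant would be embedded, so the
# string "msbuild.exe" never appears in the binary.
TARGET_HASH = elf_hash("msbuild.exe")  # illustrative, not SUNSPOT's 0x53D525

def is_build_engine(process_name: str) -> bool:
    return elf_hash(process_name) == TARGET_HASH
```

Comparing hashes rather than strings means neither the scanned-for process name nor the security tooling it implies is visible to anyone running strings over the implant.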
When it found an Orion build, SUNSPOT read the build process's command-line arguments by accessing its Process Environment Block (PEB), a Windows data structure containing process configuration, to locate the solution directory. It then found the specific source file it needed: InventoryManager.cs. Before replacing it, SUNSPOT verified the file's MD5 hash to confirm compatibility and prevent build failures. Only after that verification did it swap in the backdoored version containing the SUNBURST payload.
The injection process was surgically clean in every detail. SUNSPOT wrapped the malicious code in compiler directives (#pragma warning disable/restore) to suppress any warnings the new code might trigger. It backed up the original file with a .bk extension, wrote the replacement to a .tmp file first, and then swapped it in. When the build completed, the original source file was restored. The build log showed nothing unusual, and neither did the source repository. The output was a legitimately compiled, legitimately signed DLL containing a backdoor.
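The backup-and-swap file handling can be outlined as a defensive-analysis sketch. This is illustrative Python, not SUNSPOT's actual code; the MD5 precondition and the .bk/.tmp naming follow the published description:

```python
import hashlib
import os
import shutil

def md5_of(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest()

def swap_in(target: str, replacement: bytes, expected_md5: str) -> bool:
    # Only proceed if the on-disk source matches the expected version,
    # so an incompatible file never causes a visible build failure.
    if md5_of(target) != expected_md5:
        return False
    shutil.copy2(target, target + ".bk")   # back up the original
    tmp = target + ".tmp"
    with open(tmp, "wb") as f:             # stage the replacement first
        f.write(replacement)
    os.replace(tmp, target)                # then swap it into place
    return True

def restore(target: str) -> None:
    # After the build completes, put the original source back.
    os.replace(target + ".bk", target)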
SUNSPOT's own build timestamp was 20 February 2020. SolarWinds confirmed that the attackers performed a test injection of innocuous code in October 2019, three months before they began inserting the actual backdoor. They tested the mechanism, confirmed it worked without breaking the build, and then waited.
SUNBURST: the backdoor
SUNBURST was the payload that shipped to SolarWinds' customers inside the trojanised DLL (SolarWinds.Orion.Core.BusinessLayer.dll). It did nothing immediately after installation: it waited 12 to 14 days (the exact threshold was randomly selected from that interval) before any network activity, measuring the delay by comparing the filesystem write time of its own assembly against the current time.
Before reaching out to its command and control (C2) infrastructure, SUNBURST performed extensive environment checks. It maintained a blocklist of security tools, identified not by their names but by hashed values of process names, service names, and driver names. This meant the strings "CrowdStrike", "FireEye", "SentinelOne", or "Carbon Black" never appeared in the malware's code. The blocklist included debuggers (Wireshark, Process Explorer, dnSpy, x64dbg), endpoint protection (Windows Defender, CrowdStrike Falcon, FireEye, ESET, Carbon Black), and SentinelOne's kernel driver. If SUNBURST detected certain security tools, it would halt execution entirely. For others, it attempted to disable them through registry modifications.
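Storing hashes instead of strings is simple to implement. SUNBURST reportedly used a modified FNV-1a; the sketch below uses standard 64-bit FNV-1a with hypothetical list entries to show the pattern:

```python
FNV_OFFSET = 0xCBF29CE484222325
FNV_PRIME = 0x100000001B3

def fnv1a_64(name: str) -> int:
    h = FNV_OFFSET
    for byte in name.lower().encode():
        h = ((h ^ byte) * FNV_PRIME) & 0xFFFFFFFFFFFFFFFF
    return h

# Built here from strings for readability; a shipped binary would embed
# only the numeric constants, keeping every tool name out of the code.
BLOCKLIST = {fnv1a_64(n) for n in ("wireshark.exe", "procexp.exe", "dnspy.exe")}

def should_halt(running_process_names) -> bool:
    return any(fnv1a_64(p) in BLOCKLIST for p in running_process_names)
```

An analyst who extracts the constants still has to brute-force candidate names against the hash function to learn which tools the malware fears, which is exactly why researchers had to do that work to publish the blocklist.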
The C2 communication used a domain generation algorithm (DGA), though calling it a DGA understates what it did. Traditional DGAs generate random-looking domains to make C2 infrastructure harder to block. SUNBURST's DGA was a fully functional two-way communication channel that encoded intelligence about each victim into the DNS (Domain Name System) queries themselves.
The primary C2 domain was avsvmcloud[.]com, registered well in advance of the operation. SUNBURST generated subdomains that encoded the victim's Active Directory domain name and installed security products. The subdomain structure followed a pattern: an encoded identifier, followed by appsync-api, followed by a cloud region name (eu-west-1, us-west-2, us-east-1, or us-east-2), followed by the C2 domain. This made the traffic look like legitimate cloud API calls.
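The hostname assembly can be sketched as follows. Standard Base32 stands in for SUNBURST's custom encoding scheme, which this sketch does not attempt to reproduce:

```python
import base64
import random

REGIONS = ("eu-west-1", "us-west-2", "us-east-1", "us-east-2")

def encode_victim_id(ad_domain: str) -> str:
    # Stand-in encoding: SUNBURST used a custom substitution scheme,
    # not the standard Base32 shown here.
    return base64.b32encode(ad_domain.encode()).decode().rstrip("=").lower()

def c2_hostname(ad_domain: str) -> str:
    """Assemble <encoded-id>.appsync-api.<region>.avsvmcloud.com so the
    DNS query resembles a legitimate cloud API lookup."""
    sub = encode_victim_id(ad_domain)
    return f"{sub}.appsync-api.{random.choice(REGIONS)}.avsvmcloud.com"
```

Because the victim identifier rides in the query name itself, the attackers learned who had installed the backdoor from passive DNS resolution alone, before any second-stage traffic existed to inspect.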
The DNS response from the attacker-controlled server determined what happened next. It could instruct the backdoor to continue passive beaconing, stop all activity permanently, or transition to active mode. In active mode, SUNBURST switched to HTTP (Hypertext Transfer Protocol) for C2, with message payloads under 10,000 bytes encoded as JSON documents that mimicked SolarWinds' own Orion Improvement Program telemetry. The full command set included running and stopping processes, reading and writing files and registry keys, collecting system information, uploading data, and restarting devices.
This is the point where the 18,000-to-100 funnel becomes visible. Every organisation that installed the trojanised update sent DNS beacons that told the attackers which company they were, what Active Directory domain they ran, and what security products they had installed. The attackers could then select exactly which targets to promote to active exploitation. The remaining 17,900 organisations never received a second-stage payload. Their SUNBURST installations continued beaconing and were eventually instructed to shut down.
TEARDROP and Raindrop: the second stage
For the roughly 100 organisations selected for further exploitation, the attackers deployed TEARDROP and Raindrop. Both were custom loaders for Cobalt Strike, the commercial penetration testing framework that has become standard tooling in both legitimate assessments and real-world attacks.
From TEARDROP onward, the operation transitioned to hands-on-keyboard activity by human operators. The attackers used Cobalt Strike for lateral movement, deployed Mimikatz for credential harvesting, and targeted SAML (Security Assertion Markup Language) token signing certificates. With a stolen SAML signing certificate, they could forge authentication tokens for any user in the organisation's Microsoft 365 or Azure environment, bypassing MFA (multi-factor authentication) entirely. The CISA advisory noted that the attackers specifically targeted email accounts of IT staff and incident responders, monitoring the very people who might detect them.
How FireEye found it
The discovery was accidental, and it came from a different angle than anyone expected.
On 8 December 2020, FireEye publicly disclosed that a state-sponsored threat actor had breached its own network and stolen its red team tools. The investigation into that breach started because the attackers registered a new device to a FireEye employee's MFA system using stolen credentials. The system alerted the employee about the unregistered device, and the security team investigated.
That investigation led FireEye's analysts to examine their SolarWinds Orion servers. They decompiled and reverse-engineered the entire Orion platform. On 12 December, FireEye notified SolarWinds that Orion was compromised. On 13 December, FireEye publicly disclosed the SUNBURST backdoor, and CISA issued Emergency Directive ED 21-01 ordering all federal civilian agencies to disconnect or power down SolarWinds Orion products immediately. The compliance deadline was 12:00 PM EST the next day.
If that MFA alert had not fired, or if the employee had dismissed it, the timeline would have been different. The attackers had been inside FireEye's network for several months already.
Myth vs fact
Myth: The attackers modified the Orion source code.
SolarWinds confirmed in their SEC filing and subsequent investigation that the malicious code was "not present in the source code repository." SUNSPOT replaced source files only during the build process, while the compiler was running, and restored the originals when the build finished. Anyone reviewing the source repository at any time would have seen clean code. The compromise existed only in the brief window between source file replacement and compilation.
Myth: Every organisation that installed the update was compromised.
Fewer than 18,000 organisations out of SolarWinds' 300,000 customer base received the trojanised update (only active Orion maintenance customers running specific versions were affected). Of those 18,000, fewer than 100 received second-stage exploitation. The SUNBURST backdoor functioned as a screening and triage tool for the operators. It reported back to the attackers, who then chose which organisations warranted further attention. The White House fact sheet cited "more than 16,000 computer systems worldwide" that were affected, but "affected" meant they received the trojanised DLL, not that they were actively exploited.
Myth: Better antivirus would have caught it.
The trojanised DLL was digitally signed with SolarWinds' legitimate code signing certificate. Signature-based antivirus checks the certificate and trusts signed binaries from known vendors. Behavioural detection might have flagged the DNS beaconing pattern, but the traffic was designed to look like legitimate Orion telemetry. The 12-to-14-day dormancy period meant the backdoor was inactive during the window when new software installations are most closely monitored. The attackers built the operation around the assumption that the signed certificate would bypass perimeter defences, and they were right.
Myth: This was a novel attack technique with no precedent.
Supply chain attacks were not new when SolarWinds was discovered. The NotPetya attack in 2017 spread through a compromised update mechanism for Ukrainian accounting software. CCleaner was trojanised in 2017 through a similar build system compromise. What distinguished SolarWinds was the precision of the targeting (18,000 to 100), the sophistication of the C2 channel, and the choice of victim: a network management platform with administrative access to the infrastructure it monitors. The technique was known, but the execution was exceptional in both precision and discipline.
What would have stopped this
Preventing a supply chain attack at this level of sophistication is different from preventing a phishing email or a misconfigured firewall. The defences that matter operate at a different layer.
Build system integrity verification. The gap that made everything else possible was that no independent system verified whether the compiled output matched the checked-in source code. Reproducible builds, where the same source always produces the same binary, would have revealed the discrepancy. If SolarWinds had independently compiled Orion from the repository and compared the hash against the distributed binary, the mismatch would have been detectable. This was not standard practice in 2020 for most software vendors. It's becoming standard now, partly because of this incident.
Software Bill of Materials (SBOM). An SBOM is a formal inventory of every component in a software product, including dependencies, versions, and cryptographic hashes. If Orion's consumers had received an SBOM with the expected hash of every DLL in the package, they could have verified that what they received matched what SolarWinds intended to ship. The concept of software component inventories existed well before the SolarWinds incident. The attack created the political momentum to mandate it.
Network segmentation and egress filtering for outbound traffic. SUNBURST's DNS beaconing to avsvmcloud[.]com was the first external communication the backdoor made. Organisations that monitored and restricted outbound DNS traffic to known resolvers, or that analysed DNS query patterns for anomalous subdomain structures, had a detection opportunity. The UK's National Cyber Security Centre (NCSC) guidance specifically recommended reviewing DNS logs as part of the response. Network monitoring tools that baseline normal Orion traffic patterns could have flagged the new DNS destinations.
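One crude but workable heuristic for encoded-data subdomains is label length and character entropy. The thresholds below are illustrative, not tuned values; real detection baselines per network:

```python
import math
from collections import Counter

def shannon_entropy(s: str) -> float:
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in Counter(s).values())

def looks_encoded(hostname: str, max_label_len: int = 20,
                  entropy_threshold: float = 3.5) -> bool:
    """Flag queries whose leftmost label is unusually long or high-entropy,
    a rough tell for data smuggled into subdomains."""
    label = hostname.split(".")[0]
    return len(label) > max_label_len or shannon_entropy(label) > entropy_threshold
```

A heuristic like this produces false positives on legitimate CDN and telemetry hostnames, which is why it works best as one signal feeding a baseline of what each monitored host normally resolves.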
SAML token monitoring. The post-compromise phase relied heavily on forging SAML tokens to access cloud services. Monitoring for anomalous SAML token usage, tokens issued outside normal patterns, tokens for accounts that don't normally use federated authentication, tokens issued from infrastructure that shouldn't be issuing them, would have detected the lateral movement into Microsoft 365 and Azure environments. CISA's advisory specifically highlighted SAML token abuse as a primary post-compromise technique.
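The monitoring described above can be reduced to a few rule-of-thumb checks. Everything here is a sketch with hypothetical issuer names and thresholds; real detection correlates issuance logs on the federation server with token use at the service provider:

```python
from datetime import datetime, timedelta

TRUSTED_ISSUERS = {"https://sts.corp.example.com"}  # hypothetical federation server
MAX_VALIDITY = timedelta(hours=1)                   # typical short-lived assertion

def token_suspicious(issuer: str, not_before: datetime,
                     not_on_or_after: datetime,
                     account_uses_federation: bool) -> bool:
    """Flag assertions from unexpected issuers, with unusually long
    validity windows, or for accounts that never use federated sign-in."""
    if issuer not in TRUSTED_ISSUERS:
        return True
    if not_on_or_after - not_before > MAX_VALIDITY:
        return True
    return not account_uses_federation
```

The long-validity check matters because forged "Golden SAML" tokens are often minted with generous lifetimes the legitimate federation server would never issue.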
Zero trust for administrative tools. SolarWinds Orion, by design, had administrative access to every system it monitored. That's the product's entire purpose and the reason organisations deploy it. But it also means a compromise of Orion gives the attacker the same access that Orion has. Treating network management tools as high-value targets with their own segmented network access, dedicated monitoring, and restricted egress is a defensive posture that limits the blast radius of a supply chain compromise.
What changed after
The political response was substantial and arrived quickly by government standards. On 15 April 2021, the White House formally attributed the operation to Russia's SVR and signed Executive Order 14024. The sanctions package targeted six Russian technology companies supporting SVR cyber operations. The US also sanctioned 32 entities and individuals connected to election interference, and expelled 10 Russian diplomatic personnel from Washington. US financial institutions were prohibited from purchasing new Russian sovereign debt after June 2021.
The regulatory consequences took longer to arrive. In October 2023, the SEC charged SolarWinds and its CISO, Timothy Brown, with fraud for allegedly misrepresenting the company's security posture to investors. The complaint alleged that SolarWinds' public statements about its security practices did not match internal communications about known gaps. The case was settled and dismissed in November 2025, but it established a precedent: CISOs can face personal liability for the gap between what an organisation claims about its security and what the organisation actually practices.
The attack accelerated the SBOM movement across the entire federal supply chain. Executive Order 14028 on Improving the Nation's Cybersecurity, signed in May 2021, required software vendors selling to the federal government to provide SBOMs. The National Institute of Standards and Technology (NIST) published updated guidance on software supply chain security. These weren't new ideas, but SolarWinds gave them urgency.
The Cyber Safety Review Board (CSRB) was established partly in response to SolarWinds, though its first formal review covered the Log4j vulnerability rather than SolarWinds directly. The board's creation represented a shift toward treating significant cyber incidents with the same post-incident analysis framework that aviation uses for crashes.
The trust problem
The SolarWinds operation worked because it exploited the trust model that the entire software industry runs on. Organisations trust their vendors, vendors sign their software, and security tools trust anything that carries a valid signature. That chain of trust is also a chain of vulnerability, and SUNBURST demonstrated exactly how to exploit it.
The source code was clean, the build system was compromised, the binary was signed, and the update mechanism was entirely legitimate. Every verification step that an organisation could reasonably perform in 2020 would have passed the trojanised DLL as trustworthy. The attackers understood those verification steps and built their operation to survive each one.
The 18,000-to-100 selection ratio tells you something about the operators behind this. They had access to 18,000 networks and chose to exploit fewer than 100. That is not a smash-and-grab operation by any measure. It's intelligence collection with the discipline to avoid detection by not being greedy. Most affected organisations will never know the answer to one question: did the attackers look at their beaconed data and decide they weren't interesting enough, or had they simply not gotten to them yet before FireEye's MFA alert brought the whole operation down?
For any organisation running third-party software with administrative access to their network, the question SolarWinds raised hasn't gone away. You're trusting not just the vendor's code, but the vendor's build system, the vendor's signing infrastructure, and every system that sits between the developer writing code and the binary landing on your servers. The attack surface is not the software itself but the entire pipeline that produces and distributes it.