An Australian intelligence agency and its United States counterpart have released guidelines for enterprises on how to defend web servers from web shell exploits.
Web shell malware gives remote attackers access to a targeted web server, where they can execute arbitrary code. Web shells are typically created either by adding a malicious file to a web application or by modifying one of its existing files. Once installed, attackers can use the compromised web server to steal data, launch attacks on the site’s visitors, or hop to other machines within the organization’s network.
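Conceptually, a web shell can be as small as a script that forwards an attacker-supplied request parameter straight to the operating system. The hypothetical sketch below shows that pattern outside any real web framework, purely to clarify the mechanism; the parameter name and function are illustrative, not taken from the guidance.

```python
# Illustrative only: the essence of a web shell is request-parameter-to-shell.
import subprocess

def handle_request(params: dict) -> str:
    """Simulates a web shell handler: 'cmd' arrives in the HTTP request."""
    cmd = params.get("cmd", "")
    # The attacker-controlled string is executed on the server's OS shell.
    result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    return result.stdout

# An attacker requesting ?cmd=whoami would receive the command's output back.
print(handle_request({"cmd": "echo compromised"}))
```

Because the command travels inside an ordinary HTTP request and the output inside an ordinary response, the exchange is hard to distinguish from legitimate application traffic.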
Attackers target both web-facing servers and non-internet-accessible systems, such as internal content management systems or network device management interfaces, the National Security Agency and the Australian Signals Directorate said in the jointly issued Cybersecurity Information Sheet. Internal web applications are “often more susceptible to compromise” because they may be behind on patches or have fewer restrictions compared to external-facing applications, the agencies said.
“Malicious cyber actors are increasingly leveraging this type of malware to get consistent access to compromised networks while using communications that blend in well with legitimate traffic,” according to the information sheet. “Web shell malware is a long-standing, pervasive threat that continues to evade many security tools.”
Attackers typically target vulnerabilities in web applications and web technologies to install the malware. The guidance lists multiple known CVEs in web technologies, including Microsoft SharePoint (CVE-2019-0604) and Exchange Server (CVE-2020-0688), Citrix products (CVE-2019-19781), Atlassian Confluence (CVE-2019-3396 and CVE-2019-3398) and Crowd (CVE-2019-11580), WordPress “Social Warfare” Plugin (CVE-2019-9978), Progress Telerik UI (CVE-2019-18935, CVE-2017-11317 and CVE-2017-11357), Zoho ManageEngine (CVE-2020-10189 and CVE-2019-8394), and Adobe ColdFusion (CVE-2018-15961). Attackers can also upload the malicious code to already-compromised systems. The attack activity blends in with legitimate traffic because the requests appear to come from existing applications and tools.
“This means attackers might send system commands over HTTPS or route commands to other systems, including to your internal networks, which may appear as normal network traffic,” the guidance said.
The guidance describes how to detect web shells, such as by comparing files on the web server against the versions stored in a secure location (the “known good” copies) to make sure they haven’t been modified. Several operating-system-specific tools can be used to look for differences between files, and the NSA has also provided a PowerShell script to handle this task.
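The known-good comparison can be sketched in a few lines. This hedged example assumes you keep a pristine copy of the deployed application in a secure location; the directory layout and function names are illustrative, not part of the NSA tooling.

```python
# Sketch: flag files on the web server that were added or modified
# relative to a "known good" copy kept in a secure location.
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash a file's contents so comparisons don't rely on timestamps."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def diff_against_known_good(deployed: Path, known_good: Path) -> list:
    """Return relative paths of files that are new or differ from known-good."""
    suspicious = []
    for f in deployed.rglob("*"):
        if not f.is_file():
            continue
        rel = f.relative_to(deployed)
        good = known_good / rel
        # A file with no known-good counterpart, or with different
        # contents, warrants manual review.
        if not good.is_file() or sha256(f) != sha256(good):
            suspicious.append(str(rel))
    return suspicious
```

Hash-based comparison catches modified files even when attackers preserve timestamps, which is one reason the guidance favors content comparison over metadata checks.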
While the malicious components are typically obfuscated, some network characteristics will stick out as being unusual. For example, attackers at first are unlikely to know which user agents or IP addresses are typical for that environment, so some of their requests will be anomalous. Web shells routing attacker traffic will also typically default to the web server’s user agent and IP address, which is also unusual. Attackers may leave behind another clue if they neglect to disguise the “Referer” headers of web shell requests as normal traffic, or if they send requests with missing or unusual Referer headers.
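Those two signals, a rarely seen user agent and a missing Referer, can be checked against ordinary access logs. A minimal sketch, assuming combined-format log lines; the regex, field layout, and rarity threshold are illustrative assumptions, not from the guidance.

```python
# Sketch: flag access-log entries with a rare user agent or missing Referer.
import re
from collections import Counter

# Matches the tail of a combined-format log line:
# "<request>" <status> <bytes> "<referer>" "<user agent>"
LOG_RE = re.compile(
    r'"(?P<request>[^"]*)" \d{3} \d+ "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def flag_anomalies(lines, min_agent_count=2):
    """Return requests whose user agent is rare or whose Referer is absent."""
    parsed = [m for m in (LOG_RE.search(line) for line in lines) if m]
    agent_counts = Counter(m.group("agent") for m in parsed)
    flagged = []
    for m in parsed:
        rare_agent = agent_counts[m.group("agent")] < min_agent_count
        no_referer = m.group("referer") in ("", "-")
        if rare_agent or no_referer:
            flagged.append(m.group("request"))
    return flagged
```

In practice a baseline of normal user agents would be built over a longer window; the per-batch count here simply illustrates the idea.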
Administrators can also look for login activity from unexpected regions at unexpected times, or for unusually large responses from a web application, as that could indicate data exfiltration.
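The large-response check is simple to prototype once per-request response sizes can be extracted from server logs. A sketch, where the comparison against a multiple of the median size is an illustrative heuristic of my own, not a threshold from the guidance:

```python
# Sketch: flag request URIs whose response size is far larger than typical,
# a possible sign of data exfiltration through a web shell.
from statistics import median

def large_responses(sizes, factor=10.0):
    """sizes maps request URI -> response size in bytes.

    Returns URIs whose response dwarfs the typical (median) response.
    """
    typical = median(sizes.values())
    return [uri for uri, s in sizes.items() if s > factor * typical]
```

A median-based baseline is used deliberately: unlike the mean, the median is not dragged upward by the very outliers the check is trying to catch.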
The guidance also covers protection, such as blocking access to ports and services from outside the network, or restricting access to a specific set of ports and services. Segregating networks so that internet-facing web servers can't reach internal servers would prevent attackers from hopping around the network. Assigning the least amount of privileges to web applications (least privilege) makes it harder for attackers to cause much damage through a web shell.
Mitigation involves using intrusion detection/prevention systems, web traffic flow detection, and file-integrity monitoring to find files that have been modified and then replace them with correct versions.
Along with the guidance, the NSA's GitHub repository also includes the aforementioned PowerShell script to compare an application in production with a "known-good" version, Splunk queries for detecting anomalous URLs in web traffic, an Internet Information Services (IIS) log analysis tool, network traffic signatures for common web shells, and a list of commonly exploited web application vulnerabilities. There are also instructions for identifying unexpected network flows and abnormal process invocations, as well as host intrusion prevention system rules for blocking changes to web-accessible directories. The Open Web Application Security Project (OWASP) also has a set of core intrusion prevention system (IPS) rules that enterprise defenders should apply.