
Installing the Oracle CPU Can Lead to a False Sense of Security

If you installed the latest Oracle CPU and believe that this alone makes you secure, think again. Without enabling and properly configuring the Serialization global filter, flaws in the Java platform remain fully exploitable even after installing the October 2017 Java CPU. The problem persists in the January 2018 CPU.

An integral part of Oracle’s strategy to improve the security of its software is the quarterly release of security patches called Critical Patch Updates (CPUs). Java SE is one of the products that receives quarterly security patches in an Oracle CPU. Each CPU fixes identified security vulnerabilities that could allow attackers to compromise a system in a variety of ways and with varying severity. Given the criticality of several of the addressed vulnerabilities, Oracle strongly advises its customers in every CPU release to apply the Critical Patch Update as soon as possible.

More specifically, Oracle highlights the following disclaimer in every CPU:

“Oracle continues to periodically receive reports of attempts to maliciously exploit vulnerabilities for which Oracle has already released fixes. In some instances, it has been reported that attackers have been successful because targeted customers had failed to apply available Oracle patches. Oracle therefore strongly recommends that customers remain on actively-supported versions and apply Critical Patch Update fixes without delay.”

Consequently, organizations rush to install the Critical Patch Update every time a new one is released in order to maintain their security posture. By installing the Oracle CPU, organizations become compliant, which leads them to believe that they have successfully mitigated or even eliminated their exposure.

In addition to the release of the new software binaries, Oracle also releases a risk matrix of the security vulnerabilities (CVEs) that have been fixed in each new Oracle CPU. This way, customers have more visibility on what components are affected by a new security patch. The release notes of each new CPU contain a high-level summary of the most important changes and fixes.

Simply Installing the CPU Does Not Secure All Vulnerabilities!

In the case of the latest Java CPU, believing that the Java Virtual Machine (JVM) is automatically secured against all the reported CVEs by only installing the Oracle CPU is smoke and mirrors. What is not documented in the CPU information is that several Java deserialization vulnerabilities are not automatically fixed simply by installing the Oracle CPU. In addition to applying the CPU binaries, some manual configuration is also required to mitigate these attacks.

Without this manual configuration process, these security vulnerabilities are exposed to deserialization attacks.

The fact that this critical information is missing from the CPU release notes and all other public information may lead to a catastrophic attack, because organizations install the Critical Patch Update expecting it to resolve each specific security vulnerability automatically. A false sense of security is created because organizations are never told that they must also make manual configuration changes to effectively enable the required protection.

The whole matter becomes even more complicated because some of the Java deserialization vulnerabilities require manual configuration changes while others do not. Importantly, this information is missing from every publicly released Oracle document, and there is no indication of which components are fixed automatically and require no configuration at all.

How to Secure All Vulnerabilities

The confusion stems from the way Oracle has decided to address Java deserialization vulnerabilities, a class of vulnerabilities inherent in the serialization protocol. Oracle introduced the Serialization Filtering mechanism (JEP 290) in an attempt to solve these issues. Given the complex nature of Java deserialization vulnerabilities, it is no surprise that Serialization Filtering can become an equally complex solution. As explained in more detail in an earlier article, Serialization Filtering attempts to solve the problem with three types of stream filters:

  1. A global filter applies to all deserialization endpoints and serializable classes of the Java Virtual Machine. It can be enabled either with a system property on the command line or with a security property in the java.security properties file. The global filter is disabled by default; users must manually enable and configure it according to their needs. Since no source code changes are needed for this filter, this security control can be used by DevSecOps teams and system administrators.
  2. Custom filters apply to specific serializable classes of the JVM. Source code changes are needed to set them, so this security control is targeted primarily at developers. In general, custom filters are not externally configurable and are enabled by default.
  3. The built-in filters apply to specific JVM classes related to the RMI Registry and the Distributed Garbage Collector. The built-in filters are enabled by default, and they can also be externally configured if needed.
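As a sketch of the developer-facing custom filters described above (assuming Java 9 or later, where java.io.ObjectInputFilter is public API; the Java 8 CPU builds expose a backport under sun.misc instead, with different class names; the class, method, and filter pattern here are ours, purely for illustration), a per-stream filter can allow-list which classes a specific ObjectInputStream may deserialize:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InvalidClassException;
import java.io.ObjectInputFilter;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

public class CustomFilterExample {

    // Round-trips an object through serialization under a per-stream
    // allow-list filter: java.util.Date is accepted, every other class
    // is rejected by the terminal "!*" pattern.
    static boolean allowed(Object obj) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
                oos.writeObject(obj);
            }
            try (ObjectInputStream ois =
                     new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
                // Developer-set custom filter applying to this one stream only
                ois.setObjectInputFilter(
                    ObjectInputFilter.Config.createFilter("java.util.Date;!*"));
                ois.readObject();
                return true;
            }
        } catch (InvalidClassException rejected) {
            return false; // the filter reported status REJECTED
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println("Date allowed:          " + allowed(new java.util.Date()));
        System.out.println("StringBuilder allowed: " + allowed(new StringBuilder("x")));
    }
}
```

Because the filter is installed in source code on a particular stream, it must be maintained by developers, which is exactly why Oracle can enable such filters by default in its own classes while the global filter remains opt-in.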

So, how is the Serialization Filtering mechanism used to address the deserialization specific CVEs of each Java Critical Patch Update?

This is where it gets confusing. Some CVEs are addressed using custom filters and fixes and some others are addressed using the global filter. Let’s break it down for each CVE in the October 2017 CPU.

CVE             Component               Solution
CVE-2017-10345  JceKeyStore             Custom Filter
CVE-2017-10347  SimpleTimeZone          Custom Fix
CVE-2017-10281  Java Util Collections   Global Filter
CVE-2017-10357  ObjectInputStream       Custom Fix

Why is this confusing?

Because the custom filters and custom fixes are enabled by default and remediate the deserialization vulnerabilities with no further action. The global filter, however, is not enabled by default, and there is no official recommendation that it must be explicitly enabled. Without enabling and properly configuring the global filter, the affected vulnerable classes are fully exploitable in the Java platform. And in this case, we are talking about standard collection classes in java.util, such as HashMap and ArrayList, that are used by virtually every enterprise Java application in the world.

The Massive Importance of Manually Tuning Serialization Filtering’s Global Filter

Starting from the October Oracle CPU, organizations should not rely solely on installing the quarterly CPU. In order for Java to be secured against deserialization exploits, the Serialization Filtering global filter must also be enabled and configured. This is not optional. It is a mandatory action in order to mitigate the aforementioned CVE and any other CVE in the future that solely depends on the global filter. Any organization that has a Java application in production must enable the Serialization global filter as soon as possible in addition to installing the latest Critical Patch Update.

There are two alternative ways to enable the serialization global (process-wide) filter:

  • By setting the system property jdk.serialFilter
  • By uncommenting and setting the security property jdk.serialFilter in <JRE>/lib/security/java.security
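As an illustration of the two options (the limit values and the package name below are placeholders, not Oracle recommendations):

```shell
# Option 1: per process, as a system property on the command line
java -Djdk.serialFilter='maxarray=100000;maxdepth=20;!org.example.blocked.*' -jar app.jar

# Option 2: for the whole JVM installation, in <JRE>/lib/security/java.security,
# uncomment the jdk.serialFilter line and set the same pattern:
#   jdk.serialFilter=maxarray=100000;maxdepth=20;!org.example.blocked.*
```

Note that the system property, when set, supersedes the security property, so a per-process setting can override the installation-wide one.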

To address the unbounded memory allocation vulnerabilities in Collection classes (CVE-2017-10281), the maxarray configuration option of the global filter must be set. This option limits the maximum array length that is allowed to be deserialized. However, choosing a value for maxarray can itself be confusing. First, Oracle does not recommend a specific value. Organizations must either profile their applications and define a limit applicable to their environments or specify an arbitrarily large value.

Second, the value set might cause confusing behavior in specific Collection classes. For example, let’s assume that we set the global filter’s maxarray to 1000. Assuming that an ArrayList instance with 1000 elements is deserialized, the deserialization operation will be allowed since the number of elements in the ArrayList does not exceed the limit set by the global filter.

Now, let’s try the same experiment with a HashMap instead. If a HashMap instance with 1000 elements is deserialized, should the deserialization operation be allowed by the same global filter? Intuitively, the answer would be yes. However, the HashMap instance will be rejected. The reason is that the underlying array backing a HashMap of 1000 elements has a length of 2048 (the next power of two above 1000 divided by the default load factor of 0.75), so it exceeds the limit and is rejected by the filter. These are low-level details that should have been abstracted away to reduce confusion, allow ease of use, and achieve greater adoption.
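The ArrayList/HashMap asymmetry above can be reproduced with a short test (a sketch assuming Java 9 or later, where java.io.ObjectInputFilter is public API; on Java 8 CPU builds the backported filter classes differ, and the class and method names here are ours):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InvalidClassException;
import java.io.ObjectInputFilter;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class SerialFilterDemo {

    // Serializes obj, then deserializes it under a maxarray=1000 filter.
    // Returns true if the filter accepts the stream, false if it rejects it.
    // The same pattern could be applied process-wide with
    // -Djdk.serialFilter=maxarray=1000 instead of per stream.
    static boolean accepted(Object obj) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
                oos.writeObject(obj);
            }
            try (ObjectInputStream ois =
                     new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
                ois.setObjectInputFilter(
                    ObjectInputFilter.Config.createFilter("maxarray=1000"));
                ois.readObject();
                return true;
            }
        } catch (InvalidClassException rejected) {
            return false; // an array length exceeded the maxarray limit
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }

    static List<Integer> bigList() {
        List<Integer> list = new ArrayList<>();
        for (int i = 0; i < 1000; i++) list.add(i);
        return list;
    }

    static Map<Integer, Integer> bigMap() {
        Map<Integer, Integer> map = new HashMap<>();
        for (int i = 0; i < 1000; i++) map.put(i, i);
        return map;
    }

    public static void main(String[] args) {
        // ArrayList(1000): backing array checked at length 1000 -> accepted
        System.out.println("ArrayList accepted: " + accepted(bigList()));
        // HashMap(1000): backing table rebuilt at length 2048 -> rejected
        System.out.println("HashMap accepted:   " + accepted(bigMap()));
    }
}
```

Running this with the same element count and the same filter pattern makes the surprise concrete: the limit is checked against the internal array length the class reconstructs, not against the logical element count.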

Therefore, it is highly recommended to enable the JVM’s Serialization global filter and to make sure you understand low-level details such as the above when configuring it, both to achieve the necessary protection and to avoid inadvertently breaking the application’s legitimate functionality.


Author:

Apostolos Giannakidis

Security Architect

Apostolos drives the research and design of the security features of Waratek’s RASP container. Before joining Waratek in 2014, Apostolos worked at Oracle for two years, focusing on destructive testing across the entire Oracle technology stack and on security testing of the Solaris operating system. Apostolos has more than 10 years of experience in the software industry and holds an MSc in Computer Science from the University of Birmingham.
