cve,link,title,description,vendor,products,score,severity,epss,cisa,cisa_published,article,ransomware,exploited,exploited_date,poc,trended,trended_no_1,trended_no_1_date,published,trended_score
CVE-2024-23945,https://securityvulnerability.io/vulnerability/CVE-2024-23945,Application Security Flaw in Apache Hive and Spark Affecting Cookie Signature Verification,"An application security flaw exists in Apache Hive and Apache Spark concerning the improper handling of signed cookies. When cookie signature verification fails, the correct signature is exposed to users, potentially enabling malicious actors to alter the cookie's value. The vulnerability traces back to the CookieSigner logic introduced in Apache Hive via HIVE-9710 starting from version 1.2.0 and in Apache Spark through SPARK-14987 from version 2.0.0. The exposure of these signatures can result in unauthorized access and further exploitation of the application, raising significant security concerns for users relying on these platforms.",Apache,"Apache Hive,Apache Spark",,,0.0004400000034365803,false,,false,false,false,,,false,false,,2024-12-23T15:26:54.477Z,278
CVE-2024-39928,https://securityvulnerability.io/vulnerability/CVE-2024-39928,Apache Linkis Random String Security Vulnerability,"In Apache Linkis <= 1.5.0, a random string security vulnerability exists in the Spark EngineConn: the token used when starting Py4j is generated with Commons Lang's RandomStringUtils, which does not produce cryptographically secure values. Users are recommended to upgrade to version 1.6.0, which fixes this issue.",Apache,Apache Linkis Spark Engineconn,,,0.0004299999854993075,false,,false,false,false,,,false,false,,2024-09-25T01:15:00.000Z,0
CVE-2023-40195,https://securityvulnerability.io/vulnerability/CVE-2023-40195,Apache Airflow Spark Provider Deserialization Vulnerability RCE,"The vulnerability involves the deserialization of untrusted data in the Apache Airflow Spark Provider, allowing authorized users to execute arbitrary code on the Airflow node by misconfiguring it to interact with a malicious Spark server. Prior to version 4.1.3, this issue wasn’t clearly documented, which may have led administrators to unknowingly grant too much access. It’s crucial that administrators review their configurations and restrict permissions to configure Spark hooks exclusively to trusted individuals, as this can mitigate the risk of exploitation.",Apache,Apache Airflow Spark Provider,8.8,HIGH,0.0012499999720603228,false,,false,false,false,,,false,false,,2023-08-28T08:15:00.000Z,0
CVE-2023-40272,https://securityvulnerability.io/vulnerability/CVE-2023-40272,Apache Airflow Spark Provider Arbitrary File Read via JDBC,"The Apache Airflow Spark Provider prior to version 4.1.3 contains a vulnerability that enables attackers to exploit insecure connection parameters. This flaw may allow unauthorized file access on the Airflow server, potentially leading to unauthorized data exposure and compromise. To mitigate this risk, users are urged to update to a secure version of the software.",Apache,Apache Airflow Spark Provider,7.5,HIGH,0.000590000010561198,false,,false,false,false,,,false,false,,2023-08-17T14:15:00.000Z,0
CVE-2023-32007,https://securityvulnerability.io/vulnerability/CVE-2023-32007,Apache Spark: Shell command injection via Spark UI,"The Apache Spark UI possesses an access control vulnerability where the configuration option spark.acls.enable can be manipulated to allow arbitrary user impersonation.
When enabled, a malicious actor could exploit a flaw in the HttpSecurityFilter, potentially gaining unauthorized access to execute shell commands as the user running Spark. This vulnerability affects specific, unsupported versions of Apache Spark and poses a serious risk for organizations still using these outdated versions. Users are strongly advised to upgrade to version 3.4.0 or above to mitigate this risk.",Apache,Apache Spark,8.8,HIGH,0.006750000175088644,false,,false,false,false,,,false,false,,2023-05-02T09:15:00.000Z,0
CVE-2023-22946,https://securityvulnerability.io/vulnerability/CVE-2023-22946,Apache Spark proxy-user privilege escalation from malicious configuration class,"In Apache Spark versions prior to 3.4.0, applications using spark-submit can specify a 'proxy-user' to run as, limiting privileges. By providing malicious configuration-related classes on the classpath, however, the application can execute code with the privileges of the submitting user. This affects architectures relying on proxy-user, for example those using Apache Livy to manage submitted applications. Update to Apache Spark 3.4.0 or later, and ensure that spark.submit.proxyUser.allowCustomClasspathInClusterMode is set to its default of ""false"", and is not overridden by submitted applications.",Apache,Apache Spark,6.4,MEDIUM,0.0013699999544769526,false,,false,false,false,,,false,false,,2023-04-17T08:15:00.000Z,0
CVE-2023-28710,https://securityvulnerability.io/vulnerability/CVE-2023-28710,Apache Airflow Spark Provider Arbitrary File Read via JDBC,"An input validation issue exists in the Apache Airflow Spark Provider, affecting versions prior to 4.0.1, which may lead to unauthorized access or manipulation of data. Proper validation mechanisms are crucial to ensure that user inputs do not lead to unintended consequences within the application.",Apache,Apache Airflow Spark Provider,7.5,HIGH,0.001560000004246831,false,,false,false,false,,,false,false,,2023-04-07T15:15:00.000Z,0
CVE-2022-40954,https://securityvulnerability.io/vulnerability/CVE-2022-40954,Apache Airflow Spark Provider RCE that bypasses restrictions to read arbitrary files,"Improper Neutralization of Special Elements used in an OS Command ('OS Command Injection') vulnerability in Apache Airflow Spark Provider, Apache Airflow allows an attacker to read arbitrary files in the task execution context, without write access to DAG files. This issue affects Spark Provider versions prior to 4.0.0. It also impacts any Apache Airflow versions prior to 2.3.0 in case Spark Provider is installed (Spark Provider 4.0.0 can only be installed for Airflow 2.3.0+).
Note that you need to manually install Spark Provider version 4.0.0 on top of an Airflow 2.3.0+ installation that has a lower version of the Spark Provider, in order to get rid of the vulnerability.",Apache,"Apache Airflow Spark Provider,Apache Airflow",5.5,MEDIUM,0.0005799999926239252,false,,false,false,false,,,false,false,,2022-11-22T00:00:00.000Z,0
CVE-2022-31777,https://securityvulnerability.io/vulnerability/CVE-2022-31777,Apache Spark XSS vulnerability in log viewer UI Javascript,"A stored cross-site scripting (XSS) vulnerability in Apache Spark 3.2.1 and earlier, and 3.3.0, allows remote attackers to execute arbitrary JavaScript in the web browser of a user, by including a malicious payload into the logs which would be returned in logs rendered in the UI.",Apache,Apache Spark,5.4,MEDIUM,0.0007099999929778278,false,,false,false,false,,,false,false,,2022-11-01T00:00:00.000Z,0
CVE-2022-33891,https://securityvulnerability.io/vulnerability/CVE-2022-33891,Apache Spark shell command injection vulnerability via Spark UI,"The Apache Spark UI offers the possibility to enable ACLs via the configuration option spark.acls.enable. With an authentication filter, this checks whether a user has access permissions to view or modify the application. If ACLs are enabled, a code path in HttpSecurityFilter can allow someone to perform impersonation by providing an arbitrary user name. A malicious user might then be able to reach a permission check function that will ultimately build a Unix shell command based on their input, and execute it. This will result in arbitrary shell command execution as the user Spark is currently running as. This affects Apache Spark versions 3.0.3 and earlier, versions 3.1.1 to 3.1.2, and versions 3.2.0 to 3.2.1.",Apache,Apache Spark,8.8,HIGH,0.9714599847793579,true,2023-03-07T00:00:00.000Z,false,false,true,2023-03-07T00:00:00.000Z,true,false,false,,2022-07-18T00:00:00.000Z,0
CVE-2021-38296,https://securityvulnerability.io/vulnerability/CVE-2021-38296,Apache Spark Key Negotiation Vulnerability,"Apache Spark supports end-to-end encryption of RPC connections via ""spark.authenticate"" and ""spark.network.crypto.enabled"". In versions 3.1.2 and earlier, it uses a bespoke mutual authentication protocol that allows for full encryption key recovery. After an initial interactive attack, this would allow someone to decrypt plaintext traffic offline. Note that this does not affect security mechanisms controlled by ""spark.authenticate.enableSaslEncryption"", ""spark.io.encryption.enabled"", ""spark.ssl"", ""spark.ui.strictTransportSecurity"". Update to Apache Spark 3.1.3 or later.",Apache,Apache Spark,7.5,HIGH,0.000539999979082495,false,,false,false,false,,,false,false,,2022-03-10T08:20:12.000Z,0
CVE-2020-9480,https://securityvulnerability.io/vulnerability/CVE-2020-9480,,"In Apache Spark 2.4.5 and earlier, a standalone resource manager's master may be configured to require authentication (spark.authenticate) via a shared secret. When enabled, however, a specially-crafted RPC to the master can succeed in starting an application's resources on the Spark cluster, even without the shared key. This can be leveraged to execute shell commands on the host machine.
This does not affect Spark clusters using other resource managers (YARN, Mesos, etc.).",Apache,Apache Spark,9.8,CRITICAL,0.023229999467730522,false,,false,false,false,,,false,false,,2020-06-23T21:50:51.000Z,0
CVE-2019-10099,https://securityvulnerability.io/vulnerability/CVE-2019-10099,,"Prior to Spark 2.3.3, in certain situations Spark would write user data to local disk unencrypted, even if spark.io.encryption.enabled=true. This includes cached blocks that are fetched to disk (controlled by spark.maxRemoteBlockSizeFetchToMem); in SparkR, using parallelize; in PySpark, using broadcast and parallelize; and use of Python UDFs.",Apache,Apache Spark,7.5,HIGH,0.001230000052601099,false,,false,false,false,,,false,false,,2019-08-07T16:18:46.000Z,0
CVE-2018-11760,https://securityvulnerability.io/vulnerability/CVE-2018-11760,,"When using PySpark, it's possible for a different local user to connect to the Spark application and impersonate the user running the Spark application. This affects versions 1.x, 2.0.x, 2.1.x, 2.2.0 to 2.2.2, and 2.3.0 to 2.3.1.",Apache,Apache Spark,5.5,MEDIUM,0.0004199999966658652,false,,false,false,false,,,false,false,,2019-02-04T17:29:00.000Z,0
CVE-2018-17190,https://securityvulnerability.io/vulnerability/CVE-2018-17190,,"In all versions of Apache Spark, its standalone resource manager accepts code to execute on a 'master' host, that then runs that code on 'worker' hosts. The master itself does not, by design, execute user code. A specially-crafted request to the master can, however, cause the master to execute code too. Note that this does not affect standalone clusters with authentication enabled. While the master host typically has less outbound access to other resources than a worker, the execution of code on the master is nevertheless unexpected.",Apache,Apache Spark,9.8,CRITICAL,0.040720000863075256,false,,false,false,false,,,false,false,,2018-11-19T14:00:00.000Z,0
CVE-2018-11804,https://securityvulnerability.io/vulnerability/CVE-2018-11804,,"Spark's Apache Maven-based build includes a convenience script, 'build/mvn', that downloads and runs a zinc server to speed up compilation. It has been included in release branches since 1.3.x, up to and including master. This server will accept connections from external hosts by default. A specially-crafted request to the zinc server could cause it to reveal information in files readable to the developer account running the build. Note that this issue does not affect end users of Spark, only developers building Spark from source code.",Apache,Apache Spark,7.5,HIGH,0.0023799999617040157,false,,false,false,false,,,false,false,,2018-10-24T00:00:00.000Z,0
CVE-2018-11770,https://securityvulnerability.io/vulnerability/CVE-2018-11770,,"From version 1.3.0 onward, Apache Spark's standalone master exposes a REST API for job submission, in addition to the submission mechanism used by spark-submit. In standalone, the config property 'spark.authenticate.secret' establishes a shared secret for authenticating requests to submit jobs via spark-submit. However, the REST API does not use this or any other authentication mechanism, and this is not adequately documented. In this case, a user would be able to run a driver program without authenticating, but not launch executors, using the REST API. This REST API is also used by Mesos, when set up to run in cluster mode (i.e., when also running MesosClusterDispatcher), for job submission.
Future versions of Spark will improve documentation on these points, and prohibit setting 'spark.authenticate.secret' when running the REST APIs, to make this clear. Future versions will also disable the REST API by default in the standalone master by changing the default value of 'spark.master.rest.enabled' to 'false'.",Apache,Apache Spark,4.2,MEDIUM,0.9665600061416626,false,,false,false,true,2019-10-09T16:41:39.000Z,true,false,false,,2018-08-13T00:00:00.000Z,0
CVE-2018-8024,https://securityvulnerability.io/vulnerability/CVE-2018-8024,,"In Apache Spark 2.1.0 to 2.1.2, 2.2.0 to 2.2.1, and 2.3.0, it's possible for a malicious user to construct a URL pointing to a Spark cluster's UI's job and stage info pages, and if a user can be tricked into accessing the URL, it can be used to cause a script to execute and expose information from the user's view of the Spark UI. While some browsers like recent versions of Chrome and Safari are able to block this type of attack, current versions of Firefox (and possibly others) do not.",Apache,Apache Spark,5.4,MEDIUM,0.0004600000102072954,false,,false,false,false,,,false,false,,2018-07-12T13:29:00.000Z,0
CVE-2018-1334,https://securityvulnerability.io/vulnerability/CVE-2018-1334,,"In Apache Spark 1.0.0 to 2.1.2, 2.2.0 to 2.2.1, and 2.3.0, when using PySpark or SparkR, it's possible for a different local user to connect to the Spark application and impersonate the user running the Spark application.",Apache,Apache Spark,4.7,MEDIUM,0.0004199999966658652,false,,false,false,false,,,false,false,,2018-07-12T13:29:00.000Z,0
CVE-2017-12612,https://securityvulnerability.io/vulnerability/CVE-2017-12612,,"In Apache Spark 1.6.0 until 2.1.1, the launcher API performs unsafe deserialization of data received by its socket. This makes applications launched programmatically using the launcher API potentially vulnerable to arbitrary code execution by an attacker with access to any user account on the local machine. It does not affect apps run by spark-submit or spark-shell. The attacker would be able to execute code as the user that ran the Spark application. Users are encouraged to update to version 2.2.0 or later.",Apache,Spark,7.8,HIGH,0.0004199999966658652,false,,false,false,false,,,false,false,,2017-09-13T16:00:00.000Z,0
CVE-2017-7678,https://securityvulnerability.io/vulnerability/CVE-2017-7678,,"In Apache Spark before 2.2.0, it is possible for an attacker to take advantage of a user's trust in the server to trick them into visiting a link that points to a shared Spark cluster and submits data including MHTML to the Spark master, or history server. This data, which could contain a script, would then be reflected back to the user and could be evaluated and executed by MS Windows-based clients. It is not an attack on Spark itself, but on the user, who may then execute the script inadvertently when viewing elements of the Spark web UIs.",Apache,Spark,6.1,MEDIUM,0.000699999975040555,false,,false,false,false,,,false,false,,2017-07-12T13:00:00.000Z,0
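
The following is a minimal Python sketch, not part of the dataset, showing how the records above can be consumed programmatically; it assumes the table has been saved locally under the hypothetical filename spark_cves.csv and relies only on column names taken from the header row.

import csv

# Load the CVE records; the csv module handles the quoted multi-line description fields.
with open("spark_cves.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Keep only entries that carry a CVSS score, then rank the most severe first.
scored = sorted((r for r in rows if r["score"]), key=lambda r: float(r["score"]), reverse=True)

for r in scored:
    # Some older records have an empty title field; fall back to the advisory link.
    print(r["cve"], r["score"], r["severity"], r["title"] or r["link"])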