Issue Description:
I'm unable to use an environment variable that contains a password within my Spark application when deploying with the Spark Operator.

Approaches Taken:

Environment Variable Approach:
I passed the password as an environment variable and then tried to reference that variable from the Spark configuration. The Spark application ignored the environment variable set in the configuration.

Alternative Approaches:

Adding spark-defaults.conf in /opt/spark/conf:
Using an init-container, I created a spark-defaults.conf file in the /opt/spark/conf directory with a single property for the password: spark.myapp.password=my_password_value

Using Spark ConfigMap:
I mounted a ConfigMap via spec.sparkConfigMap. The ConfigMap successfully created a configuration file in the /etc/spark/conf directory, but the application continued to read its configuration from /opt/spark/conf/spark.properties, ignoring the file generated by the ConfigMap.

Expected Behavior:
The Spark application should be able to read and use the environment variable in the Spark configuration. Alternatively, it should be able to read the configuration from the ConfigMap file in the /etc/spark/conf directory.

Actual Behavior:
The Spark application ignores the environment variable set in the configuration, and it ignores the configuration file created by the ConfigMap in /etc/spark/conf.

Potential Solution:
I considered creating an additional Spark configuration file and configuring the Spark application to use both configuration files. However, I cannot find a way to achieve this within the Spark Operator setup.
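For reference, a minimal sketch of the SparkApplication manifest for the approaches above might look like the following. The names (application, image, main class, ConfigMap, Secret, and variable names) are hypothetical placeholders, not values from the original report:

```yaml
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: my-app                              # hypothetical name
spec:
  type: Scala
  mode: cluster
  image: my-registry/my-spark-app:latest    # hypothetical image
  mainClass: com.example.MyApp              # hypothetical class
  mainApplicationFile: local:///opt/spark/jars/my-app.jar
  # Mounts the named ConfigMap onto /etc/spark/conf in the driver and
  # executor pods (the attempt described above).
  sparkConfigMap: my-spark-conf             # hypothetical ConfigMap name
  driver:
    cores: 1
    env:
      - name: MYAPP_PASSWORD                # hypothetical variable name
        valueFrom:
          secretKeyRef:
            name: my-app-secret             # hypothetical Secret
            key: password
  executor:
    instances: 1
```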
The Spark Operator internally calls spark-submit, and spark-submit does not support environment-variable substitution in the Spark conf. One option is to hard-code the application password into spec.sparkConf, although this is not secure. Alternatively, you could modify your application to fetch the password from an environment variable at runtime, which is the more secure practice.
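The suggested application-side change could be sketched like this: instead of resolving the password through the Spark conf, the application reads it directly from an environment variable injected into the pod (for example via the driver's env settings). The variable name MYAPP_PASSWORD is an assumption for illustration:

```python
import os


def get_app_password() -> str:
    """Read the application password from an environment variable
    injected into the driver/executor pods, rather than from
    spark-defaults.conf or spec.sparkConf."""
    password = os.environ.get("MYAPP_PASSWORD")  # hypothetical variable name
    if password is None:
        raise RuntimeError("MYAPP_PASSWORD is not set in the pod environment")
    return password
```

The application then calls get_app_password() wherever it previously read spark.myapp.password from the Spark conf, so the secret never has to appear in the submitted configuration.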