containers:
  - name: spark
    image: gcr.io/google_containers/spark:v2.3.0
    command:
      - "spark-submit"
      - "--master"
      - "yarn"
      - "--deploy-mode"
      - "client"
      - "--conf"
      - "spark.executor.instances=2"
      - "--conf"
      # placeholder S3 credentials; in production, inject these from a Kubernetes Secret
      - "spark.hadoop.fs.s3a.access.key=mykey"
      - "--conf"
      - "spark.hadoop.fs.s3a.secret.key=mysecret"
      - "/mnt/spark.py"
To verify that the application is actually writing its log, create the file and follow it:

$ touch /tmp/myapp.log
$ tail -f /tmp/myapp.log
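In an Argo Workflow, the same check can run as a sidecar that tails the file, so the application log also shows up in `argo logs`. A sketch under the assumption that the main container and the sidecar share an emptyDir volume mounted at /tmp (the template, volume, and sidecar names are placeholders):

spec:
  volumes:
    - name: shared-logs              # hypothetical emptyDir shared between containers
      emptyDir: {}
  templates:
    - name: spark-job
      container:
        image: gcr.io/google_containers/spark:v2.3.0
        # ... spark-submit command as above ...
        volumeMounts:
          - name: shared-logs
            mountPath: /tmp
      sidecars:
        - name: log-tailer
          image: busybox
          command: [sh, -c, "touch /tmp/myapp.log && tail -f /tmp/myapp.log"]
          volumeMounts:
            - name: shared-logs
              mountPath: /tmp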
On the YARN side, the executors run inside Docker containers, so the host log directory also has to be mapped into the executor container via the Docker runtime settings:

spec:
  sparkConf:
    'spark.executorEnv.YARN_CONTAINER_RUNTIME_TYPE': 'docker'
    'spark.executorEnv.YARN_CONTAINER_RUNTIME_DOCKER_IMAGE': 'myapp:latest'
    'spark.executorEnv.YARN_CONTAINER_RUNTIME_DOCKER_MOUNTS': '/var/log/myapp:/var/log/myapp'
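For reference, this sparkConf block sits under the spec of a SparkApplication resource managed by the Spark Operator. A minimal surrounding manifest could look like the following, where the metadata name, image, and application file are placeholders:

apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: myapp                                  # placeholder name
spec:
  type: Python
  mode: cluster
  image: myapp:latest                          # assumed application image
  mainApplicationFile: local:///mnt/spark.py   # assumed script location
  sparkConf:
    'spark.executorEnv.YARN_CONTAINER_RUNTIME_TYPE': 'docker'
    'spark.executorEnv.YARN_CONTAINER_RUNTIME_DOCKER_IMAGE': 'myapp:latest'
    'spark.executorEnv.YARN_CONTAINER_RUNTIME_DOCKER_MOUNTS': '/var/log/myapp:/var/log/myapp'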
With the steps above, the "Argo Workflow + Spark Operator + application logs not generated" problem can be resolved.