[EDP][Spark] Configure cluster for external hdfs
The Oozie EDP engine updates the /etc/hosts file on cluster nodes as necessary when a job execution references an external HDFS. The Spark EDP engine should include this functionality as well.
(Originally this feature was associated only with URLs referenced via data_sources, which is why it was omitted for Spark, since Spark does not reference data_sources. However, now that data_source reference substitution is supported in job_configs, Spark should implement this as well.)
Also see https:/
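As a rough illustration of the behavior described above, the engine would need to scan a job's job_configs for external HDFS URLs and produce /etc/hosts entries for the namenodes they reference. The sketch below is a minimal, hypothetical outline; the function names (`external_hdfs_hosts`, `hosts_entry_for_hdfs_url`) and the injectable `resolve` parameter are assumptions for illustration, not Sahara's actual API.

```python
# Hypothetical sketch of the /etc/hosts configuration step described
# above. Names and structure are illustrative only.
import socket
from urllib.parse import urlparse


def external_hdfs_hosts(job_configs):
    """Collect hostnames of external HDFS URLs found anywhere in the
    job's configs, params, or args (the sections Sahara's EDP job
    configuration typically carries)."""
    hosts = set()
    for section in ("configs", "params", "args"):
        values = job_configs.get(section) or {}
        if isinstance(values, dict):
            values = values.values()
        for value in values:
            if isinstance(value, str) and value.startswith("hdfs://"):
                host = urlparse(value).hostname
                if host:
                    hosts.add(host)
    return hosts


def hosts_entry_for_hdfs_url(url, resolve=socket.gethostbyname):
    """Return an '/etc/hosts'-style line mapping the HDFS namenode's
    hostname to its IP address, or None if the URL has no host.
    The resolver is injectable so callers can resolve from whichever
    node performs the lookup."""
    host = urlparse(url).hostname
    if not host:
        return None
    return "%s %s" % (resolve(host), host)
```

In practice each generated entry would then be appended to /etc/hosts on every cluster node before the Spark job runs, mirroring what the Oozie engine already does for data_source URLs.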
Blueprint information
- Status: Not started
- Approver: Sergey Lukjanov
- Priority: Undefined
- Drafter: Trevor McKay
- Direction: Needs approval
- Assignee: None
- Definition: New
- Series goal: None
- Implementation: Unknown
- Milestone target: None
- Started by:
- Completed by:
Whiteboard
Gerrit topic: https:/
Addressed by: https:/