[EDP] Add a Spark job type (instead of overloading Java)

Registered by Trevor McKay

Spark EDP was initially implemented using the Java job type. However, it will be better to support a dedicated Spark job type for several reasons:

* the semantics are slightly different: Spark requires a "main" application jar and supporting libs are optional, whereas the Java job type treats all jars uniformly as libs
* the Spark job type may someday support Python apps
* the possible config set for Spark will be different from Java (although they both use edp.java.main_class)
* Spark/Java may diverge in different ways in the future

This will need support in sahara-api and the dashboard; there should be no data model
or client impact.
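The distinction above can be illustrated with a sketch of the job and job-execution payloads a Spark job type implies. This is a minimal, hypothetical example: the field names ("mains", "libs", "job_configs") follow Sahara's EDP API conventions but are assumptions here, and the class name and IDs are placeholders, not values from this blueprint.

```python
import json

def make_spark_job(name, main_binary_id, lib_binary_ids=None):
    """Build an illustrative Spark job payload.

    Unlike the Java job type, Spark distinguishes one mandatory "main"
    application jar from optional supporting libs.
    """
    return {
        "name": name,
        "type": "Spark",              # the new job type proposed by this blueprint
        "mains": [main_binary_id],    # required main application jar
        "libs": lib_binary_ids or [], # supporting libs are optional for Spark
    }

def make_spark_execution(cluster_id, main_class, args=None):
    """Build an illustrative job execution; per the blueprint, Spark
    reuses edp.java.main_class even though its other configs differ
    from Java's."""
    return {
        "cluster_id": cluster_id,
        "job_configs": {
            "configs": {"edp.java.main_class": main_class},
            "args": args or [],
        },
    }

# Placeholder IDs and class name, for illustration only.
job = make_spark_job("spark-wordcount", "<main-jar-binary-id>")
run = make_spark_execution("<cluster-id>", "example.SparkWordCount",
                           ["swift://container/input"])
print(json.dumps(job, indent=2))
```

Keeping "mains" separate from "libs" is what lets the Spark engine pass the application jar and its dependencies to spark-submit differently, which the single flat jar list of the Java job type cannot express.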

Blueprint information

Status:
Complete
Approver:
Sergey Lukjanov
Priority:
High
Drafter:
Trevor McKay
Direction:
Approved
Assignee:
Trevor McKay
Definition:
Approved
Series goal:
Accepted for juno
Implementation:
Implemented
Milestone target:
2014.2
Started by:
Sergey Lukjanov
Completed by:
Sergey Lukjanov

Related branches

Sprints

Whiteboard

Gerrit topic: https://review.openstack.org/#q,topic:bp/edp-spark-job-type,n,z

Addressed by: https://review.openstack.org/110727
    [EDP] Add a Spark job type (instead of overloading Java)

Addressed by: https://review.openstack.org/107871
    Implement EDP for a Spark standalone cluster

Addressed by: https://review.openstack.org/110791
    Add a Spark job type for EDP

Work Items

