Processing data collections in Data Flow
The Data Flow mechanism should be able to deal with collections, not only with fixed parameters. A typical scenario is running the same task multiple times with different sets of parameters. For instance, to create 100 VMs it would be ugly to have 100 separate tasks in a workflow definition, so the task should be smart enough to iterate over an input collection containing the parameter sets.
The current idea is as follows:
attach-vm:
  for-each:
    vminfo: $.vms
  action: create_vm name={$.vminfo.name} flavor={$.vminfo.flavor}
So 'for-each' just makes the task run over an array of elements, where each element may be any structure (e.g. a dictionary).
See https:/
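For illustration, input data like the following could drive the task above (a minimal sketch; the exact shape of $.vms is an assumption, not taken from the blueprint):

# Hypothetical input: $.vms is a list of parameter sets, one per VM.
# On each iteration 'vminfo' is bound to one element, so create_vm
# runs once per VM with that element's name and flavor.
vms:
  - name: web-1
    flavor: m1.small
  - name: web-2
    flavor: m1.medium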
Blueprint information
- Status: Complete
- Approver: Renat Akhmerov
- Priority: High
- Drafter: Renat Akhmerov
- Direction: Approved
- Assignee: Nikolay Makhotkin
- Definition: New
- Series goal: Accepted for juno
- Implementation: Implemented
- Milestone target: 2015.1
- Started by: Renat Akhmerov
- Completed by: Renat Akhmerov
Whiteboard
Note: a very similar case:
A: start_server_1 (type == nova.server)
B: do something else
C: start_server_2 (type == nova.server)
How do we get different parameters into A and C via the context?
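One way 'for-each' could address this (a sketch only; the context keys $.servers_a and $.servers_b are hypothetical): each task iterates over its own collection, so A and C receive different parameter sets from the same context.

# Sketch: A and C each read their own collection from the context.
start_server_1:
  for-each:
    vminfo: $.servers_a
  action: create_vm name={$.vminfo.name} flavor={$.vminfo.flavor}

start_server_2:
  for-each:
    vminfo: $.servers_b
  action: create_vm name={$.vminfo.name} flavor={$.vminfo.flavor}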
Gerrit topic: https:/
Addressed by: https:/
Implement for-each task property
Addressed by: https:/
Add validation of the 'for-each' task property DSL