
Commit c3600e6

Implement subset of the Common Workflow Language tool and workflow formats.
This should support a subset of [draft-3](http://www.commonwl.org/draft-3/) and [v1.0](http://www.commonwl.org/v1.0/) tools.

CWL Support (Tools):
--------------------

- Implemented integer, long, float, double, boolean, string, and File parameters, arrays of Files, `["null", <simple_type>]` union parameters, Any-type parameters, and simple records. More complex unions and lists of datatypes may still be unsupported (unions of two or more non-null parameters, unions of `["null", Any]`, etc.).
- ``InlineJavascriptRequirement``s are supported to define output files (see the ``test_cat3`` test case).
- ``EnvVarRequirement``s are supported (see the ``test_env_tool1`` and ``test_env_tool2`` test cases).
- Expression tools are supported (see the ``parseInt-tool`` test case).
- Shell tools are also supported (see the record output test case).
- Default File values are very un-Galaxy and have been hacked in to work with tools.

CWL Support (Workflows):
------------------------

- Simple connections and tool execution.
- MultipleInputFeatureRequirements to glue together multiple file inputs into a File[], or multiple File[] into a single flat File[] (nested merge is still a TODO).
- Simple scatter semantics when they match Galaxy's implicit scatters (e.g. count-lines3), and simple scatters over non-File parameters.
- Simple subworkflows (e.g. count-lines10).

Previously Worked:
------------------

The following things worked with draft-3 and a previous version of cwltool but need to be updated.

- Draft 3 `CreateFileRequirement`s were supported (see the `test_rename` test case).
- Secondary files were supported at least partially; see the `index1` and `showindex1` CWL tools created to verify this, as well as the `test_index1` test case.
- Docker integration is only partial (a simple docker pull is supported), so `cat3-tool.cwl` works for example. The full semantics of CWL Docker support have yet to be implemented. The remaining work is straightforward and tracked in the meta-issue #1684.

Remaining Work
--------------

The work remaining is vast and will be tracked at https://github.com/common-workflow-language/galaxy/issues for the time being.

Implementation Notes:
---------------------

Tools:

- Non-File CWL outputs are represented as ``expression.json`` files. Traditionally Galaxy hasn't supported non-File outputs from tools, but CWL Galaxy has work in progress on bringing native Galaxy support for such outputs (common-workflow-lab#27).
- CWL secondary files are stored in a ``__secondary_files__`` directory inside the dataset's extra_files_path directory.
- The tool execution API has been extended with an ``inputs_representation`` parameter that can now be set to "cwl". The ``cwl`` representation for running tools corresponds to the CWL job json format, with ``{"class": "File", "path": "/path/to/file"}`` inputs replaced by ``{"src": "hda", "id": "<dataset_id>"}``. Code for building these requests from CWL job json is available in the test class; a request sketch follows this list.
- Since the CWL <-> Galaxy parameter translation may change over time - for instance if Galaxy develops or refines parameter classes - the CWL state and CWL state version are tracked in the database, so that for reruns, etc. we can hopefully update the Galaxy state from an older version to a newer one.
- CWL allows output parameters to be either ``File`` or non-``File`` and determined at runtime, so ``galaxy.json`` is used to dynamically adjust the output extension as needed for non-``File`` parameters.
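To make the ``inputs_representation`` parameter concrete, here is a minimal request sketch. It assumes a running local Galaxy, an existing history, and an already-uploaded dataset; the tool id, history id, dataset id, and API key are placeholders, and the payload keys mirror Galaxy's standard tool-execution API as described above.

```python
import requests

GALAXY_URL = "http://localhost:8080"
API_KEY = "<api key>"  # placeholder

# In CWL job json this input would be {"class": "File", "path": "/path/to/file"};
# through the Galaxy API the File is referenced as an existing dataset instead.
payload = {
    "tool_id": "cat3-tool",            # placeholder CWL tool id
    "history_id": "<history id>",      # placeholder
    "inputs_representation": "cwl",
    "inputs": {
        "file1": {"src": "hda", "id": "<dataset id>"},
    },
}

response = requests.post(
    "%s/api/tools" % GALAXY_URL,
    params={"key": API_KEY},
    json=payload,
)
response.raise_for_status()
print(response.json())
```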
Workflows:

- This work serializes embedded and referenced tools into the database. This allows reuse and tracing without requiring the path to exist forever on the filesystem, but it will have problems with default file references in workflows.
- Implements re-mapping of CWL workflow connections to Galaxy input connections.
- Fixes tool serialization for jobs for path-less tools (such as embedded tools).
- Hacks tool state during workflow import for CWL.
- The sort of dynamic shaping of inputs that CWL allows has required enhancing Galaxy's map/reduce machinery to allow mapping over dynamic collections that don't yet exist at the time of tool execution and need to be created on the fly. This commit creates them as HDCAs, but they should likely be something else that doesn't appear in the history panel.
- Multi-input scattering, but only scatterMethod == "dotproduct" is currently supported. The other scatter methods (nested_crossproduct and flat_crossproduct) are not used by workflows in the GA4GH challenge.

Implementation Description:
---------------------------

The reference implementation Python library (mainly developed by Peter Amstutz - https://github.com/common-workflow-language/common-workflow-language/tree/master/reference) is used to load tool files ending with ``.json`` or ``.cwl``, and proxy objects are created to adapt these tools to Galaxy representations. In particular, input and output descriptions are loaded from the tool.

When the tool is submitted, a specialized tool class is used to build a cwltool-compatible job description from the supplied Galaxy inputs, and the CWL reference implementation is used to generate a CWL reference implementation Job object. A command line is generated from this Job object. As a result, Galaxy largely does not need to worry about the details of command-line adapters, expressions, etc.

Galaxy writes a description of the CWL job that it can reload to the job working directory. After the process is complete (on the Galaxy compute server, but outside the Docker container), this representation is reloaded and the dynamic outputs are discovered and moved to fixed locations as expected by Galaxy. CWL allows for much more expressive output locations than Galaxy, for better or worse, and this step uses cwltool to adapt CWL outputs to Galaxy outputs.

Currently all ``File`` outputs are sniffed to determine a Galaxy datatype. CWL allows refinement on this, and that remains work to be done: 1) CWL should support EDAM declaration of types, and Galaxy should provide a mapping to core datatypes to skip sniffing if types are found. 2) For finer-grained control within Galaxy, extensions to CWL should allow setting actual Galaxy output types on outputs. (The distinction between fastq and fastqsanger in Galaxy is very important, for instance.)

Implementation Links:
---------------------

Hundreds of commits have been rebased into this one, so the details of individual parts of the implementation and how they built on each other are not entirely clear. To see the original ideas behind individual features, here are some relevant links:

- Implement merge_nested link semantics for workflow steps (common-workflow-lab@a903abd).
- Implement subworkflows in CWL (common-workflow-lab@9933c3c).
- MultipleInputFeatureRequirements:
  - Second attempt: common-workflow-lab@ed8307f
  - First attempt: common-workflow-lab@ae11f56
- Basic, implicit dotproduct scattering of workflows (common-workflow-lab@d1ad64e).
- Simple input StepInputExpressionRequirements (common-workflow-lab@819a27b).
- StepInputExpressionRequirements for multiple inputs (common-workflow-lab@5e7f622).
- Record types in CWL (common-workflow-lab@e6be28a).
- Rework the original approach at mapping CWL state to tool state (common-workflow-lab@669ea55).
- Rework the approach at mapping CWL state to tool state again to use "FieldTypeToolParameter"s - implements default values, optional parameters, and union types for workflow inputs (common-workflow-lab@d1ca22f).

Testing:
--------

    % git clone https://github.com/common-workflow-language/galaxy.git
    % cd galaxy
    % git checkout cwl-1.0

Start Galaxy.

    % GALAXY_RUN_WITH_TEST_TOOLS=1 sh run.sh

Open http://localhost:8080/ and see the CWL test tools (along with all Galaxy test tools) in the left-hand tool panel.

To go a step further and actually run CWL jobs within their designated Docker containers, copy the following minimal Galaxy job configuration file to ``config/job_conf.xml``. (Adjust the ``docker_sudo`` parameter based on how you execute Docker.)

https://gist.github.com/jmchilton/3997fa471d1b4c556966

Run the API tests demonstrating the various CWL demo tools with the following commands:

```
./run_tests.sh -api test/api/test_tools_cwl.py
./run_tests.sh -api test/api/test_workflows_cwl.py
./run_tests.sh -api test/api/test_cwl_conformance_v1_0.py
```

The first two execute various tool and workflow test cases manually crafted during implementation of this work. The third is an auto-generated test case class that contains Python tests for every CWL conformance test found in the reference specification.

Issues and Contact
------------------

Report issues at https://github.com/common-workflow-language/galaxy/issues and feel free to ping jmchilton on the CWL [Gitter channel](https://gitter.im/common-workflow-language/common-workflow-language).
1 parent 00b1f9b commit c3600e6


73 files changed: +4096 −409 lines changed

lib/galaxy/config.py

Lines changed: 2 additions & 1 deletion
@@ -324,9 +324,10 @@ def __init__( self, **kwargs ):
         # These are not even beta - just experiments - don't use them unless
         # you want yours tools to be broken in the future.
         self.enable_beta_tool_formats = string_as_bool( kwargs.get( 'enable_beta_tool_formats', 'False' ) )
+        # Should CWL artifacts be loaded with strict validation enabled.
+        self.strict_cwl_validation = string_as_bool( kwargs.get( 'strict_cwl_validation', 'True') )
         # Beta containers interface used by GIEs
         self.enable_beta_containers_interface = string_as_bool( kwargs.get( 'enable_beta_containers_interface', 'False' ) )
-
         # Certain modules such as the pause module will automatically cause
         # workflows to be scheduled in job handlers the way all workflows will
         # be someday - the following two properties can also be used to force this
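For context, the new flag follows the same boolean-parsing pattern as the surrounding options. The sketch below, with a hand-built kwargs dict standing in for values loaded from the Galaxy config file, shows how a ``strict_cwl_validation = False`` setting would be interpreted.

```python
from galaxy.util import string_as_bool  # same helper used in config.py above

# Stand-in for the kwargs dict Galaxy builds from its config file.
kwargs = {"strict_cwl_validation": "False"}

# Defaults to strict validation ('True') when the option is absent.
strict = string_as_bool(kwargs.get("strict_cwl_validation", "True"))
assert strict is False
```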

lib/galaxy/dataset_collections/matching.py

Lines changed: 12 additions & 0 deletions
@@ -16,8 +16,10 @@ class CollectionsToMatch( object ):
 
     def __init__( self ):
         self.collections = {}
+        self.uses_ephemeral_collections = False
 
     def add( self, input_name, hdca, subcollection_type=None, linked=True ):
+        self.uses_ephemeral_collections = self.uses_ephemeral_collections or not hasattr( hdca, "hid" )
         self.collections[ input_name ] = bunch.Bunch(
             hdca=hdca,
             subcollection_type=subcollection_type,
@@ -45,6 +47,7 @@ def __init__( self ):
         self.linked_structure = None
         self.unlinked_structures = []
         self.collections = {}
+        self.uses_ephemeral_collections = False
 
     def __attempt_add_to_linked_match( self, input_name, hdca, collection_type_description, subcollection_type ):
         structure = get_structure( hdca, collection_type_description, leaf_subcollection_type=subcollection_type )
@@ -69,12 +72,21 @@ def structure( self ):
             effective_structure = effective_structure.multiply( linked_structure )
         return None if effective_structure.is_leaf else effective_structure
 
+    @property
+    def implicit_inputs( self ):
+        if not self.uses_ephemeral_collections:
+            # Consider doing something smarter here.
+            return list( self.collections.items() )
+        else:
+            return []
+
     @staticmethod
     def for_collections( collections_to_match, collection_type_descriptions ):
         if not collections_to_match.has_collections():
             return None
 
         matching_collections = MatchingCollections()
+        matching_collections.uses_ephemeral_collections = collections_to_match.uses_ephemeral_collections
         for input_key, to_match in collections_to_match.items():
             hdca = to_match.hdca
             collection_type_description = collection_type_descriptions.for_collection_type( hdca.collection.collection_type )
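The ephemeral-collection flag is driven purely by whether the matched object has an ``hid`` attribute. The following standalone sketch (with stand-in classes, not real Galaxy models) illustrates the intended behavior of that check and of ``implicit_inputs``.

```python
class FakeHDCA(object):
    """Stand-in for a real HistoryDatasetCollectionAssociation (has an hid)."""
    hid = 1


class EphemeralCollection(object):
    """Stand-in for a collection created on the fly during CWL scheduling (no hid)."""


def uses_ephemeral(hdca):
    # Mirrors the check added in CollectionsToMatch.add() above.
    return not hasattr(hdca, "hid")


assert uses_ephemeral(FakeHDCA()) is False
assert uses_ephemeral(EphemeralCollection()) is True
# When any input collection is ephemeral, MatchingCollections.implicit_inputs
# returns [] so no implicit-input links are recorded for the output collection.
```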

lib/galaxy/dependencies/pinned-requirements.txt

Lines changed: 3 additions & 0 deletions
@@ -72,3 +72,6 @@ pysam==0.8.4+gx5
 
 # Chronos client
 chronos-python==0.38.0
+
+# For CWL support.
+cwltool==1.0.20170727112954

lib/galaxy/jobs/__init__.py

Lines changed: 8 additions & 1 deletion
@@ -782,6 +782,10 @@ def can_split( self ):
         # Should the job handler split this job up?
         return self.app.config.use_tasked_jobs and self.tool.parallelism
 
+    @property
+    def is_cwl_job( self ):
+        return self.tool.tool_type == "cwl"
+
     def get_job_runner_url( self ):
         log.warning('(%s) Job runner URLs are deprecated, use destinations instead.' % self.job_id)
         return self.job_destination.url
@@ -886,10 +890,13 @@ def get_special( ):
         # if the server was stopped and restarted before the job finished
         job.command_line = unicodify(self.command_line)
         job.dependencies = self.tool.dependencies
+        param_dict = tool_evaluator.param_dict
+        job.cwl_command_state = param_dict.get('__cwl_command_state', None)
+        job.cwl_command_state_version = param_dict.get('__cwl_command_state_version', None)
         self.sa_session.add( job )
         self.sa_session.flush()
         # Return list of all extra files
-        self.param_dict = tool_evaluator.param_dict
+        self.param_dict = param_dict
         version_string_cmd_raw = self.tool.version_string_cmd
         if version_string_cmd_raw:
             version_command_template = string.Template(version_string_cmd_raw)

lib/galaxy/jobs/command_factory.py

Lines changed: 7 additions & 1 deletion
@@ -6,6 +6,7 @@
 )
 
 from galaxy import util
+from galaxy.util import bunch
 from galaxy.jobs.runners.util.job_script import (
     check_script_integrity,
     INTEGRITY_INJECTION,
@@ -175,8 +176,13 @@ def __handle_work_dir_outputs(commands_builder, job_wrapper, runner, remote_comm
     if 'working_directory' in remote_command_params:
         work_dir_outputs_kwds['job_working_directory'] = remote_command_params['working_directory']
     work_dir_outputs = runner.get_work_dir_outputs( job_wrapper, **work_dir_outputs_kwds )
-    if work_dir_outputs:
+    if work_dir_outputs or job_wrapper.is_cwl_job:
         commands_builder.capture_return_code()
+        if job_wrapper.is_cwl_job:
+            metadata_script_file = join(job_wrapper.working_directory, "relocate_dynamic_outputs.py")
+            relocate_contents = 'from galaxy_ext.cwl.handle_outputs import relocate_dynamic_outputs; relocate_dynamic_outputs()'
+            write_script(metadata_script_file, relocate_contents, bunch.Bunch(check_job_script_integrity=False))
+            commands_builder.append_command("python %s" % metadata_script_file)
         copy_commands = map(__copy_if_exists_command, work_dir_outputs)
         commands_builder.append_commands(copy_commands)
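For reference, the helper script written into the job working directory ends up containing exactly the one-liner assigned to ``relocate_contents`` above; it is appended to the job command line and runs on the compute node after the tool completes, outside any Docker container.

```python
# relocate_dynamic_outputs.py (generated into the job working directory)
from galaxy_ext.cwl.handle_outputs import relocate_dynamic_outputs; relocate_dynamic_outputs()
```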

lib/galaxy/managers/collections.py

Lines changed: 4 additions & 2 deletions
@@ -71,8 +71,10 @@ def create( self, trans, parent, name, collection_type, element_identifiers=None
             name=name,
         )
         if implicit_collection_info:
-            for input_name, input_collection in implicit_collection_info[ "implicit_inputs" ]:
-                dataset_collection_instance.add_implicit_input_collection( input_name, input_collection )
+            implicit_inputs = implicit_collection_info[ "implicit_inputs" ]
+            if implicit_inputs:
+                for input_name, input_collection in implicit_inputs:
+                    dataset_collection_instance.add_implicit_input_collection( input_name, input_collection )
             for output_dataset in implicit_collection_info.get( "outputs" ):
                 if output_dataset not in trans.sa_session:
                     output_dataset = trans.sa_session.query( type( output_dataset ) ).get( output_dataset.id )

lib/galaxy/managers/tools.py

Lines changed: 7 additions & 0 deletions
@@ -62,6 +62,13 @@ def create_tool(self, tool_payload):
             if tool_id is None:
                 tool_id = str(uuid)
 
+            tool_version = representation.get("version", None)
+            tool_hash = build_tool_hash(representation)
+            value = representation
+        elif tool_format in ["CommandLineTool", "ExpressionTool"]:
+            # CWL tools
+            uuid = None
+            tool_id = representation.get("id", None)
             tool_version = representation.get("version", None)
             tool_hash = build_tool_hash(representation)
             value = representation

lib/galaxy/managers/workflows.py

Lines changed: 27 additions & 1 deletion
@@ -22,6 +22,7 @@
 from galaxy.util.sanitize_html import sanitize_html
 from galaxy.workflow.steps import attach_ordered_steps
 from galaxy.workflow.modules import module_factory, is_tool_module_type, ToolModule, WorkflowModuleInjector
+from galaxy.tools.cwl import workflow_proxy
 from galaxy.tools.parameters.basic import DataToolParameter, DataCollectionToolParameter, RuntimeValue, workflow_building_modes
 from galaxy.tools.parameters import visit_input_values, params_to_incoming
 from galaxy.jobs.actions.post import ActionBox
@@ -196,6 +197,18 @@ def build_workflow_from_dict(
     ):
         # Put parameters in workflow mode
         trans.workflow_building_mode = True
+        if data and "src" in data and data["src"] == "from_path":
+            from galaxy.tools.cwl import workflow_proxy
+            wf_proxy = workflow_proxy(data["path"])
+            tool_reference_proxies = wf_proxy.tool_reference_proxies()
+            for tool_reference_proxy in tool_reference_proxies:
+                # TODO: Namespace IDS in workflows.
+                # TODO: Don't duplicately load these tools.
+                self.app.dynamic_tool_manager.create_tool({
+                    "representation": tool_reference_proxy.to_persistent_representation(),
+                })
+            data = wf_proxy.to_dict()
+
         # If there's a source, put it in the workflow name.
         if source:
             name = "%s (imported from %s)" % ( data['name'], source )
@@ -277,6 +290,10 @@ def update_workflow_from_dict(self, trans, stored_workflow, workflow_data):
     def _workflow_from_dict(self, trans, data, name, **kwds):
         if isinstance(data, string_types):
             data = json.loads(data)
+        if "src" in data:
+            assert data["src"] == "path"
+            wf_proxy = workflow_proxy(data["path"])
+            data = wf_proxy.to_dict()
 
         # Create new workflow from source data
         workflow = model.Workflow()
@@ -834,7 +851,7 @@ def __module_from_dict( self, trans, steps, steps_by_external_id, step_dict, **k
         """
         step = model.WorkflowStep()
         # TODO: Consider handling position inside module.
-        step.position = step_dict['position']
+        step.position = step_dict.get('position', {"left": 0, "top": 0})
         if step_dict.get("uuid", None) and step_dict['uuid'] != "None":
             step.uuid = step_dict["uuid"]
         if "label" in step_dict:
@@ -858,6 +875,15 @@ def __module_from_dict( self, trans, steps, steps_by_external_id, step_dict, **k
         # Stick this in the step temporarily
         step.temp_input_connections = step_dict['input_connections']
 
+        if "inputs" in step_dict:
+            for input_dict in step_dict["inputs"]:
+                step_input = model.WorkflowStepInput()
+                step_input.name = input_dict["name"]
+                step_input.merge_type = input_dict.get("merge_type", step_input.default_merge_type)
+                step_input.scatter_type = input_dict.get("scatter_type", step_input.default_scatter_type)
+                step_input.value_from = input_dict.get("value_from", None)
+                step.inputs.append(step_input)
+
         # Create the model class for the step
         steps.append( step )
         steps_by_external_id[ step_dict[ 'id' ] ] = step
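As a rough sketch of the new import path (an assumption for illustration only: how the ``data`` dict reaches ``build_workflow_from_dict`` is outside this diff), the code above reacts to a payload of this shape:

```python
# Illustrative payload only; "src" and "path" are the keys checked above.
data = {
    "src": "from_path",
    "path": "/path/to/count-lines10-wf.cwl",  # placeholder CWL workflow path
}
# With such a payload, build_workflow_from_dict():
#   1. loads the workflow via workflow_proxy(data["path"]),
#   2. registers each referenced tool with app.dynamic_tool_manager.create_tool(),
#   3. replaces `data` with wf_proxy.to_dict() and continues the normal import.
```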

lib/galaxy/model/__init__.py

Lines changed: 38 additions & 0 deletions
@@ -767,6 +767,8 @@ def to_dict( self, view='collection', system_details=False ):
             # System level details that only admins should have.
             rval['external_id'] = self.job_runner_external_id
             rval['command_line'] = self.command_line
+            rval['cwl_command_state'] = self.cwl_command_state
+            rval['cwl_command_state_version'] = self.cwl_command_state_version
 
         if view == 'element':
             param_dict = dict( [ ( p.name, p.value ) for p in self.parameters ] )
@@ -3229,6 +3231,10 @@ def __init__(
         if not populated:
             self.populated_state = DatasetCollection.populated_states.NEW
 
+    @property
+    def allow_implicit_mapping(self):
+        return self.collection_type != "record"
+
     @property
     def populated( self ):
         top_level_populated = self.populated_state == DatasetCollection.populated_states.OK
@@ -3822,12 +3828,14 @@ def __init__( self ):
         self.tool_inputs = None
         self.tool_errors = None
         self.position = None
+        self.inputs = []
         self.input_connections = []
         self.config = None
         self.label = None
         self.uuid = uuid4()
         self.workflow_outputs = []
         self._input_connections_by_name = None
+        self._inputs_by_name = None
 
     @property
     def unique_workflow_outputs(self):
@@ -3863,6 +3871,12 @@ def input_connections_by_name(self):
             self.setup_input_connections_by_name()
         return self._input_connections_by_name
 
+    @property
+    def inputs_by_name(self):
+        if self._inputs_by_name is None:
+            self.setup_inputs_by_name()
+        return self._inputs_by_name
+
     def setup_input_connections_by_name(self):
         # Ensure input_connections has already been set.
 
@@ -3875,6 +3889,17 @@ def setup_input_connections_by_name(self):
             input_connections_by_name[input_name].append(conn)
         self._input_connections_by_name = input_connections_by_name
 
+    def setup_inputs_by_name(self):
+        # Ensure input_connections has already been set.
+
+        # Make connection information available on each step by input name.
+        inputs_by_name = {}
+        for step_input in self.inputs:
+            input_name = step_input.name
+            assert input_name not in inputs_by_name
+            inputs_by_name[input_name] = step_input
+        self._inputs_by_name = inputs_by_name
+
     def create_or_update_workflow_output(self, output_name, label, uuid):
         output = self.workflow_output_for(output_name)
         if output is None:
@@ -3929,6 +3954,19 @@ def log_str(self):
         return "WorkflowStep[index=%d,type=%s]" % (self.order_index, self.type)
 
 
+class WorkflowStepInput( object ):
+
+    default_merge_type = "merge_flattened"
+    default_scatter_type = "dotproduct"
+
+    def __init__( self ):
+        self.id = None
+        self.name = None
+        self.default_value = None
+        self.merge_type = self.default_merge_type
+        self.scatter_type = self.default_scatter_type
+
+
 class WorkflowStepConnection( object ):
     # Constant used in lieu of output_name and input_name to indicate an
     # implicit connection between two steps that is not dependent on a dataset
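A quick illustration of the new step-input model follows. This is a sketch assuming it is run from a Galaxy virtualenv with ``lib/`` importable; no database session is needed for these in-memory defaults.

```python
from galaxy import model

step = model.WorkflowStep()
step_input = model.WorkflowStepInput()
step_input.name = "file1"
step.inputs.append(step_input)

# Defaults chosen to match the most common CWL link-merge and scatter semantics.
assert step_input.merge_type == "merge_flattened"
assert step_input.scatter_type == "dotproduct"
assert step.inputs_by_name["file1"] is step_input
```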

lib/galaxy/model/mapping.py

Lines changed: 21 additions & 3 deletions
@@ -498,6 +498,8 @@
     Column( "object_store_id", TrimmedString( 255 ), index=True ),
     Column( "imported", Boolean, default=False, index=True ),
     Column( "params", TrimmedString( 255 ), index=True ),
+    Column( "cwl_command_state", JSONType, nullable=True ),
+    Column( "cwl_command_state_version", Integer, default=1 ),
     Column( "handler", TrimmedString( 255 ), index=True ) )
 
 model.JobStateHistory.table = Table(
@@ -839,6 +841,19 @@
     # Column( "input_connections", JSONType ),
     Column( "label", Unicode( 255 ) ) )
 
+
+model.WorkflowStepInput.table = Table(
+    "workflow_step_input", metadata,
+    Column( "id", Integer, primary_key=True ),
+    Column( "workflow_step_id", Integer, ForeignKey( "workflow_step.id" ), index=True ),
+    Column( "name", Unicode( 255 ) ),
+    Column( "merge_type", TEXT ),
+    Column( "scatter_type", TEXT ),
+    Column( "value_from", JSONType ),
+    Column( "value_from_type", TEXT ),
+    Column( "default_value", JSONType ) )
+
+
 model.WorkflowRequestStepState.table = Table(
     "workflow_request_step_states", metadata,
     Column( "id", Integer, primary_key=True ),
@@ -2275,7 +2290,12 @@ def simple_mapping( model, **kwds ):
                           backref="workflow_steps" ),
     annotations=relation( model.WorkflowStepAnnotationAssociation,
                           order_by=model.WorkflowStepAnnotationAssociation.table.c.id,
-                          backref="workflow_steps" )
+                          backref="workflow_steps"),
+) )
+
+mapper( model.WorkflowStepInput, model.WorkflowStepInput.table, properties=dict(
+    workflow_step=relation( model.WorkflowStep,
+                            backref="inputs"),
 ) )
 
 mapper( model.WorkflowOutput, model.WorkflowOutput.table, properties=dict(
@@ -2352,8 +2372,6 @@ def simple_mapping( model, **kwds ):
     input_step_parameters=relation( model.WorkflowRequestInputStepParmeter ),
     input_datasets=relation( model.WorkflowRequestToInputDatasetAssociation ),
     input_dataset_collections=relation( model.WorkflowRequestToInputDatasetCollectionAssociation ),
-    #output_datasets=relation( model.WorkflowInvocationOutputDatasetAssociation ),
-    #output_dataset_collections=relation( model.WorkflowInvocationOutputDatasetCollectionAssociation ),
     subworkflow_invocations=relation( model.WorkflowInvocationToSubworkflowInvocationAssociation,
                                       primaryjoin=( ( model.WorkflowInvocationToSubworkflowInvocationAssociation.table.c.workflow_invocation_id == model.WorkflowInvocation.table.c.id ) ),
                                       backref=backref("parent_workflow_invocation", uselist=False),
