Currently, mara pipelines are always executed locally. But I would like to have an option to execute them somewhere else, e.g. in another environment where other resources are more readily available.
The idea
So I came up with the idea of execution contexts. Here is the rough idea:
one can define an execution context for a pipeline or for a specific task
the execution context then defines where the shell command shall be executed
it should be possible to define multiple execution contexts within one pipeline
an execution context has an "enter" / "exit" method which gives the option to spin up or release the required resources for the execution context
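The "enter" / "exit" idea above maps naturally onto Python's context manager protocol, which guarantees that the exit step runs even if a task fails. A minimal sketch (the class name and flag are illustrative, not mara code):

```python
class DummyExecutionContext:
    """Illustrative context showing the enter/exit lifecycle (not mara code)."""

    def __init__(self):
        self.is_active = False

    def __enter__(self):
        # spin up the required resources here (e.g. start a container)
        self.is_active = True
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        # release the resources here (e.g. stop the container),
        # even when the body of the `with` block raised an exception
        self.is_active = False
        return False  # do not swallow exceptions


with DummyExecutionContext() as context:
    assert context.is_active  # resources are available inside the context
```

The `with` statement is what an executor would wrap around the shell commands of a task, so resource setup and teardown stay out of the task code itself.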
The current idea is to support the following execution contexts:
BashExecutionContext - local bash (this is the current default behavior)
SshBashExecutionContext - remote bash execution via ssh
DockerExecutionContext - docker exec with optional start/stop of a container
Possible other options (Out of scope)
This concept could be extended in the future to add other options like:
executing a job on a remote server, spinning up predefined cloud resources beforehand and releasing them afterwards
executing a pipeline in another pipeline engine e.g. Airflow
These ideas are just noted here and are out of scope for this issue.
Blueprint for the ExecutionContext base class
```python
class ExecutionContext:
    """The execution context for a shell command"""

    def __init__(self):
        self.is_active: bool = False

    def __enter__(self):
        """Enters the execution context."""
        return self

    def __exit__(self, type, value, traceback) -> bool:
        """Exits the execution context, freeing up used resources."""
        return True

    def run_shell_command(self, shell_command: str) -> bool:
        """Executes a shell command in the context"""
        pass
```
I know about the option mara_pipelines.config.bash_command_string, but this wasn't enough for me because I need to be able to use multiple execution contexts on the same server and do not want to use multiple mara config files.
It would be nice to have a SQL execution context as well. This context would then run e.g. the ExecuteSQL command via the python DB API (see mara/mara-db#71).
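A SQL execution context could hold the DB API connection for the duration of the context and hand statements to it. The following sketch uses the stdlib `sqlite3` module as a stand-in for whatever connection mara-db would provide; the class and method names are hypothetical:

```python
import sqlite3


class SqlExecutionContext:
    """Illustrative SQL execution context (not mara code); sqlite3 stands in
    for the DB API connection that mara-db would supply."""

    def __init__(self, database: str = ':memory:'):
        self.database = database
        self.connection = None

    def __enter__(self):
        # acquire the DB API connection when entering the context
        self.connection = sqlite3.connect(self.database)
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        # release the connection when leaving the context
        self.connection.close()
        return False

    def run_sql_statement(self, sql: str) -> bool:
        # an ExecuteSQL command would pass its statement to this method
        self.connection.execute(sql)
        self.connection.commit()
        return True


with SqlExecutionContext() as context:
    context.run_sql_statement('CREATE TABLE t (x INTEGER)')
```

The benefit over shelling out to a SQL client is that the connection is opened once per context instead of once per command.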
The development would require more refactoring than the current implementation, which just patches the way bash commands are executed.