- Implement a new optional parameter, module_stream, to allow a scratch module
build's stream name to be set from the command line when a YAML modulemd file
is also submitted.
- Validate that the module_name and module_stream parameters can only be
specified along with a YAML modulemd file (see the validation sketch after
this list).
- Add tests to verify that module_stream sets the stream name correctly.
- Add tests to verify that module_name and module_stream are only allowed along
with a YAML modulemd file.
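The validation could look roughly like the sketch below; the function and
parameter names are hypothetical and only illustrate the rule described
above, not the actual MBS implementation.

    def validate_scratch_build_params(params):
        # Hypothetical check: module_name/module_stream only make sense when a
        # YAML modulemd file is submitted alongside the scratch build request.
        has_yaml = bool(params.get("modulemd"))
        for param in ("module_name", "module_stream"):
            if params.get(param) and not has_yaml:
                raise ValueError(
                    "The '%s' parameter can only be used when submitting a "
                    "YAML modulemd file" % param
                )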
Signed-off-by: Merlin Mathesius <mmathesi@redhat.com>
Commit 98b1ac79 ensures the message is sent after data changes are
committed to the database. Hence, these two workarounds can be removed.
Signed-off-by: Chenxiong Qi <cqi@redhat.com>
In MBS, there are two cases in which a message is sent when a module build
moves to a new state. One is when a new module build is created, specifically
via ModuleBuild.create, as a user submits a module build. The other is when a
module build transitions to a new state via ModuleBuild.transition. This
commit handles these two cases in slightly different ways.
For the former, the existing code is refactored by moving the publish call
outside ModuleBuild.create.
For the latter, the message is sent from a SQLAlchemy ORM after_commit event
hook rather than immediately inside ModuleBuild.transition.
Both of these changes ensure the message is sent only after the changes have
been committed to the database successfully. The backend can then be
confident that the database already contains the module build data when it
receives a message.
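As an illustration of the after_commit approach, here is a minimal sketch
using SQLAlchemy's event API; the message-queueing attribute and the publish
helper are hypothetical stand-ins for MBS's actual publishing code.

    from sqlalchemy import event
    from sqlalchemy.orm import Session


    def publish(topic, msg):
        # Stand-in for the real call that publishes to the message bus.
        print("publish", topic, msg)


    @event.listens_for(Session, "after_commit")
    def _send_messages_after_commit(session):
        # Messages queued on the session during the transaction (e.g. by
        # ModuleBuild.transition) are published only after the commit succeeds.
        for topic, msg in getattr(session, "_messages_to_send", []):
            publish(topic, msg)
        session._messages_to_send = []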
Signed-off-by: Chenxiong Qi <cqi@redhat.com>
This also includes `from __future__ import absolute_import`
in every file so that the imports are consistent in Python 2 and 3.
The Python 2 tests fail without this.
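For example, with the future import in place a bare import resolves the same
way under Python 2 and 3 (a generic illustration, not a specific MBS module):

    from __future__ import absolute_import

    # Without absolute_import, Python 2 would prefer a sibling module named
    # "logging" inside the same package if one existed; with it, this always
    # refers to the standard library, matching Python 3 behaviour.
    import logging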
This moves the code used by the backend and API to common/submit.py,
the code used just by the API to web/submit.py, and the code used
just by the backend to scheduler/submit.py.
This puts backend specific code in either the builder or scheduler
subpackage. This puts API specific code in the new web subpackage.
Lastly, any code shared between the API and backend is placed in the
common subpackage.
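The resulting layout looks roughly like this (only the submit.py locations
are taken from the change above; the rest of the tree is illustrative):

    module_build_service/
        common/      # shared by the API and the backend, e.g. common/submit.py
        web/         # API-only code, e.g. web/submit.py
        scheduler/   # backend-only code, e.g. scheduler/submit.py
        builder/     # backend-only builder code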
This merges the configuration from conf/config.py into
module_build_service/config.py. This also greatly simplifies the logic
in `init_config`. Additionally, `init_config` is no longer aware of
Flask. This will allow us to eventually break up the configuration
between the API and the backend.
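A minimal sketch of what a Flask-unaware init_config can look like; the class
and environment-variable names are hypothetical, the point being simply that
no Flask app object is needed:

    import os


    class BaseConfiguration(object):
        debug = False
        num_workers = 1


    class DevConfiguration(BaseConfiguration):
        debug = True


    def init_config():
        # Pick a configuration purely from the environment, with no Flask app
        # involved, so the backend can reuse the same function.
        profile = os.environ.get("MBS_CONFIG_PROFILE", "production")
        if profile == "development":
            return DevConfiguration()
        return BaseConfiguration()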
The following handler arguments are not used at all:
1. `build_id` in handlers/components.py:build_task_finalize
2. `build_name` in handlers/tags.py:tagged
Add a route_task function to route Celery tasks to different queues.
If we can figure out which module build a task runs for by checking
the task arguments, then we route the task to a queue named:

    "mbs-{}".format(module_build_id % num_workers)

"num_workers" has a default value of 1 and can be changed in
backend_config.py. If the module build id can't be determined, the task is
routed to the default queue, which is named "mbs-default".
While setting up the workers, the number of workers should match
"num_workers" in the config, and each worker will listen on two queues:
1. mbs-default
2. mbs-{number} # for example, the first worker listens on "mbs-0"
With this design, all tasks for a particular module build are routed
to the same queue and run serially on the same worker.
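A minimal sketch of the routing idea follows; the keyword argument used to
find the build id and the NUM_WORKERS constant are hypothetical stand-ins for
the real task signatures and the "num_workers" config value:

    NUM_WORKERS = 1  # stands in for "num_workers" from backend_config.py


    def route_task(name, args, kwargs, options, task=None, **kw):
        # Hypothetical lookup: assume the build id is passed as a keyword
        # argument to the task.
        module_build_id = kwargs.get("module_build_id")
        if module_build_id is None:
            # Unknown build: fall back to the queue every worker listens on.
            return {"queue": "mbs-default"}
        return {"queue": "mbs-{}".format(module_build_id % NUM_WORKERS)}

The router would be registered via Celery's task_routes setting (for example
app.conf.task_routes = (route_task,)), and the first worker could then be
started listening on both of its queues with
"celery -A <app> worker -Q mbs-default,mbs-0".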