feat: Per-user workflow libraries in multiuser mode #114
Copilot wants to merge 8 commits into lstein/feature/workflow-isolation-in-multiuser-mode from
Conversation
…r ownership checks, is_public endpoint, schema regeneration, frontend UI Co-authored-by: lstein <111189+lstein@users.noreply.github.com>
…ix error handling in SaveWorkflowAsDialog, rename test helper function Co-authored-by: lstein <111189+lstein@users.noreply.github.com>
lstein
left a comment
The migration isn't working properly. On startup, I am seeing the following:
[2026-03-02 21:39:40,270]::[InvokeAI]::INFO --> Using torch device: NVIDIA Graphics Device
[2026-03-02 21:39:40,272]::[InvokeAI]::INFO --> cuDNN version: 90701
[2026-03-02 21:39:40,548]::[InvokeAI]::INFO --> Patchmatch initialized
[2026-03-02 21:39:41,923]::[InvokeAI]::INFO --> InvokeAI version 6.11.1.post1
[2026-03-02 21:39:41,924]::[InvokeAI]::INFO --> Root directory = /home/lstein/invokeai-lstein
[2026-03-02 21:39:41,924]::[InvokeAI]::INFO --> Initializing database at /home/lstein/invokeai-lstein/databases/invokeai.db
[2026-03-02 21:39:41,926]::[InvokeAI]::INFO --> Database update needed
[2026-03-02 21:39:41,926]::[InvokeAI]::INFO --> Backing up database to /home/lstein/invokeai-lstein/databases/invokeai_backup_20260302-213941.db
[2026-03-02 21:39:41,932]::[InvokeAI]::INFO --> Database updated successfully
[2026-03-02 21:39:41,932]::[InvokeAI]::INFO --> JWT secret loaded from database
[2026-03-02 21:39:42,018]::[ModelManagerService]::INFO --> [MODEL CACHE] Calculated model RAM cache size: 12786.88 MB. Heuristics applied: [1, 2].
[2026-03-02 21:39:42,020]::[ModelInstallService]::INFO --> Restoring incomplete installs
[2026-03-02 21:39:42,021]::[ModelInstallService]::INFO --> Finished restoring incomplete installs
[a bunch of lines deleted]
File "/home/lstein/Projects/InvokeAI-lstein/invokeai/app/api_app.py", line 48, in lifespan
ApiDependencies.initialize(config=app_config, event_handler_id=event_handler_id, loop=loop, logger=logger)
File "/home/lstein/Projects/InvokeAI-lstein/invokeai/app/api/dependencies.py", line 202, in initialize
ApiDependencies.invoker = Invoker(services)
^^^^^^^^^^^^^^^^^
File "/home/lstein/Projects/InvokeAI-lstein/invokeai/app/services/invoker.py", line 14, in __init__
self._start()
File "/home/lstein/Projects/InvokeAI-lstein/invokeai/app/services/invoker.py", line 31, in _start
self.__start_service(getattr(self.services, service))
File "/home/lstein/Projects/InvokeAI-lstein/invokeai/app/services/invoker.py", line 20, in __start_service
start_op(self)
File "/home/lstein/Projects/InvokeAI-lstein/invokeai/app/services/workflow_records/workflow_records_sqlite.py", line 33, in start
self._sync_default_workflows()
File "/home/lstein/Projects/InvokeAI-lstein/invokeai/app/services/workflow_records/workflow_records_sqlite.py", line 472, in _sync_default_workflows
workflow_from_db = self.get(workflow_from_file.id).workflow
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/Projects/InvokeAI-lstein/invokeai/app/services/workflow_records/workflow_records_sqlite.py", line 38, in get
cursor.execute(
sqlite3.OperationalError: no such column: user_id
[2026-03-02 21:39:42,023]::[uvicorn.error]::ERROR --> Application startup failed. Exiting.
…on user_id to match migration 27 pattern Co-authored-by: lstein <111189+lstein@users.noreply.github.com>
Fixed in commit
lstein
left a comment
Something is still wrong with the migration. The schema is not being updated with the user_id column. Here is the full stack trace:
[2026-03-02 22:30:03,509]::[InvokeAI]::INFO --> Using torch device: NVIDIA Graphics Device
[2026-03-02 22:30:03,511]::[InvokeAI]::INFO --> cuDNN version: 90701
[2026-03-02 22:30:03,787]::[InvokeAI]::INFO --> Patchmatch initialized
[2026-03-02 22:30:05,166]::[InvokeAI]::INFO --> InvokeAI version 6.11.1.post1
[2026-03-02 22:30:05,166]::[InvokeAI]::INFO --> Root directory = /home/lstein/invokeai-lstein
[2026-03-02 22:30:05,166]::[InvokeAI]::INFO --> Initializing database at /home/lstein/invokeai-lstein/databases/invokeai.db
[2026-03-02 22:30:05,167]::[InvokeAI]::INFO --> Database update needed
[2026-03-02 22:30:05,167]::[InvokeAI]::INFO --> Backing up database to /home/lstein/invokeai-lstein/databases/invokeai_backup_20260302-223005.db
[2026-03-02 22:30:05,173]::[InvokeAI]::INFO --> Database updated successfully
[2026-03-02 22:30:05,173]::[InvokeAI]::INFO --> JWT secret loaded from database
[2026-03-02 22:30:05,260]::[ModelManagerService]::INFO --> [MODEL CACHE] Calculated model RAM cache size: 12786.88 MB. Heuristics applied: [1, 2].
[2026-03-02 22:30:05,263]::[ModelInstallService]::INFO --> Restoring incomplete installs
[2026-03-02 22:30:05,263]::[ModelInstallService]::INFO --> Finished restoring incomplete installs
[2026-03-02 22:30:05,265]::[uvicorn.error]::ERROR --> Traceback (most recent call last):
File "/home/lstein/invokeai-lstein/.venv/lib/python3.12/site-packages/starlette/routing.py", line 694, in lifespan
async with self.lifespan_context(app) as maybe_state:
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/contextlib.py", line 210, in __aenter__
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/invokeai-lstein/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 209, in merged_lifespan
async with original_context(app) as maybe_original_state:
^^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/contextlib.py", line 210, in __aenter__
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/invokeai-lstein/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 209, in merged_lifespan
async with original_context(app) as maybe_original_state:
^^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/contextlib.py", line 210, in __aenter__
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/invokeai-lstein/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 209, in merged_lifespan
async with original_context(app) as maybe_original_state:
^^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/invokeai-lstein/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 209, in merged_lifespan
async with original_context(app) as maybe_original_state:
^^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/contextlib.py", line 210, in __aenter__
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/invokeai-lstein/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 209, in merged_lifespan
async with original_context(app) as maybe_original_state:
^^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/contextlib.py", line 210, in __aenter__
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/Projects/InvokeAI-lstein/invokeai/app/api_app.py", line 48, in lifespan
ApiDependencies.initialize(config=app_config, event_handler_id=event_handler_id, loop=loop, logger=logger)
File "/home/lstein/Projects/InvokeAI-lstein/invokeai/app/api/dependencies.py", line 202, in initialize
ApiDependencies.invoker = Invoker(services)
^^^^^^^^^^^^^^^^^
File "/home/lstein/Projects/InvokeAI-lstein/invokeai/app/services/invoker.py", line 14, in __init__
self._start()
File "/home/lstein/Projects/InvokeAI-lstein/invokeai/app/services/invoker.py", line 31, in _start
self.__start_service(getattr(self.services, service))
File "/home/lstein/Projects/InvokeAI-lstein/invokeai/app/services/invoker.py", line 20, in __start_service
start_op(self)
File "/home/lstein/Projects/InvokeAI-lstein/invokeai/app/services/workflow_records/workflow_records_sqlite.py", line 33, in start
self._sync_default_workflows()
File "/home/lstein/Projects/InvokeAI-lstein/invokeai/app/services/workflow_records/workflow_records_sqlite.py", line 472, in _sync_default_workflows
workflow_from_db = self.get(workflow_from_file.id).workflow
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/Projects/InvokeAI-lstein/invokeai/app/services/workflow_records/workflow_records_sqlite.py", line 38, in get
cursor.execute(
sqlite3.OperationalError: no such column: user_id
[2026-03-02 22:30:05,265]::[uvicorn.error]::ERROR --> Application startup failed. Exiting.
Fixed in commit. The root cause: Python's `sqlite3` implicit transaction handling for DDL. The fix switches the DDL to use `executescript()`, which always issues a COMMIT before running its statements, so the `ALTER TABLE`s are applied deterministically.
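The hazard here is that, under Python's legacy transaction control, a DML statement opens an implicit transaction and any DDL issued afterwards through `cursor.execute()` joins it, so a later rollback can silently undo an `ALTER TABLE`. A minimal self-contained sketch of this general failure mode (the actual InvokeAI trigger additionally involves VIRTUAL generated columns, which this sketch does not reproduce):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE workflow_library (workflow_id TEXT PRIMARY KEY, workflow TEXT)")

def columns(conn: sqlite3.Connection) -> list[str]:
    return [row[1] for row in conn.execute("PRAGMA table_info(workflow_library)")]

# The INSERT opens an implicit transaction; the ALTER TABLE issued via
# execute() joins that transaction, and the rollback undoes both.
conn.execute("INSERT INTO workflow_library VALUES ('w1', '{}')")
conn.execute("ALTER TABLE workflow_library ADD COLUMN user_id TEXT NOT NULL DEFAULT 'system'")
conn.rollback()
added_via_execute = "user_id" in columns(conn)  # False: the ALTER was rolled back

# executescript() first issues a COMMIT, so its DDL is not hostage to
# whatever implicit transaction happens to be open on the connection.
conn.execute("INSERT INTO workflow_library VALUES ('w2', '{}')")
conn.executescript("ALTER TABLE workflow_library ADD COLUMN user_id TEXT NOT NULL DEFAULT 'system';")
conn.rollback()
added_via_script = "user_id" in columns(conn)  # True: the column survives
```

Any later `SELECT ... user_id ...` against the first connection state would raise exactly the `sqlite3.OperationalError: no such column: user_id` seen in the trace above.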
Force-pushed 6eb87a1 to 89ef06d
lstein
left a comment
User isolation is working well. Now please implement workflow sharing. In the workflow "Details" panel, please add a "Shared workflow" checkbox. When this checkbox is selected, the workflow will be available in a read-only fashion to all users under the "Browse Workflows" section of the Workflow Library panel. Also auto-add the tag "shared" to these workflows so that they can be easily filtered on.
Only the owner of a workflow can edit or delete it. The exception is the Administrator, who can edit or delete any user's workflow.
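The access rule requested here (owner-or-admin for writes, public-or-owner-or-admin for reads) can be sketched as a pair of predicates. All names below are illustrative, not the actual InvokeAI API:

```python
from dataclasses import dataclass

@dataclass
class User:
    user_id: str
    is_admin: bool = False

@dataclass
class Workflow:
    workflow_id: str
    user_id: str          # owner
    is_public: bool = False

def can_modify(user: User, workflow: Workflow) -> bool:
    """Only the owner may edit/delete; an admin may modify any workflow."""
    return user.is_admin or workflow.user_id == user.user_id

def can_view(user: User, workflow: Workflow) -> bool:
    """Shared (public) workflows are readable by everyone, but stay read-only for non-owners."""
    return workflow.is_public or can_modify(user, workflow)
```

Note that sharing a workflow widens `can_view` without widening `can_modify`, which is exactly the "read-only for all users" semantics described above.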
@copilot
…dit/delete, fix tests Co-authored-by: lstein <111189+lstein@users.noreply.github.com>
Done in commit
Here's how the Details panel looks with the new checkbox:
... Fixed in commit
All 13 tests now pass.
Summary
In multiuser mode, all users shared a single workflow library. This PR isolates workflows per-user, adds a "Shared Workflows" section for publicly shared workflows, and provides controls to share/unshare workflows.
Backend
- Migration 28 adds `user_id` (DEFAULT `'system'`) and `is_public` (DEFAULT `FALSE`) columns + indexes to `workflow_library`. Uses `executescript()` for the DDL statements so that transaction management is handled explicitly (`executescript` always issues COMMIT first), which avoids edge cases in Python's `sqlite3` implicit transaction handling for DDL on tables that contain VIRTUAL generated columns. A post-check raises a clear `RuntimeError` if the columns were not actually added, preventing silent failures. No cross-module imports in the migration callback.
- `WORKFLOW_LIBRARY_DEFAULT_USER_ID` constant added to `workflow_records_common.py` to avoid magic strings across service and base layers
- `workflow_records_*`: all query methods (`create`, `get_many`, `counts_by_*`, `get_all_tags`) accept `user_id` and `is_public` filters; new `update_is_public()` method — which also automatically adds the `"shared"` tag when `is_public=true` and removes it when `is_public=false`
- `workflows.py` router: `CurrentUserOrDefault`; `list_workflows`/`counts_by_*`/`get_all_tags` automatically scope `user`-category results to the current user in multiuser mode (bypassed when `is_public=true` is explicitly requested); ownership checks on `GET` (non-owner blocked unless public/default/admin), `PATCH`, `DELETE`, and thumbnail endpoints; new `PATCH /api/v1/workflows/i/{workflow_id}/is_public` endpoint
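The migration shape described above — `executescript()` for the DDL plus a `PRAGMA table_info` post-check that fails loudly — can be sketched as follows. The index names and the `migrate` entry point are illustrative assumptions; the real migration runs inside InvokeAI's migration framework, which is omitted here:

```python
import sqlite3

# Column/table names are from the PR; the index names are hypothetical.
DDL = """
ALTER TABLE workflow_library ADD COLUMN user_id TEXT NOT NULL DEFAULT 'system';
ALTER TABLE workflow_library ADD COLUMN is_public BOOLEAN NOT NULL DEFAULT FALSE;
CREATE INDEX IF NOT EXISTS idx_workflow_library_user_id ON workflow_library(user_id);
CREATE INDEX IF NOT EXISTS idx_workflow_library_is_public ON workflow_library(is_public);
"""

def migrate(conn: sqlite3.Connection) -> None:
    # executescript() commits any pending transaction before running,
    # sidestepping sqlite3's implicit-transaction handling of DDL.
    conn.executescript(DDL)
    # Post-check: raise immediately rather than letting a skipped ALTER
    # surface later as "no such column: user_id" at query time.
    cols = {row[1] for row in conn.execute("PRAGMA table_info(workflow_library)")}
    missing = {"user_id", "is_public"} - cols
    if missing:
        raise RuntimeError(f"Migration failed to add columns: {sorted(missing)}")
```
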
- `openapi.json` updated; `schema.ts` regenerated via `make frontend-typegen`
- `WorkflowRecordOrderBy` gains `is_public`; `WorkflowLibraryView` gains `'shared'`
- `WorkflowLibrarySideNav`; `WorkflowList` routes the `'shared'` view to an `is_public=true` query
- `WorkflowGeneralTab` (Details panel): new `ShareWorkflowCheckbox` component below the Tags field — visible to the workflow owner and admins in multiuser mode; toggles `is_public` via the API
- `WorkflowListItem`: owners see a `ShareWorkflowToggle` switch; public workflows display a "Shared" badge; `EditWorkflow` and `DeleteWorkflow` buttons are now gated behind `isOwner || is_admin` so non-owners cannot edit or delete others' workflows
- `SaveWorkflowAsDialog`: "Share workflow" checkbox — marks the new workflow public immediately after creation
- `WorkflowSortControl` + sort options updated to include `is_public`
- `sharedWorkflows`, `shareWorkflow` added to `en.json`

Single-user mode behavior is completely unchanged — no user filtering is applied when `multiuser` is `false`.

Related Issues / Discussions
QA Instructions
Multiuser mode (`multiuser: true` in config):
- the `"shared"` tag is auto-added to the workflow

Single-user mode: verify all existing workflow operations behave identically to before.
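The auto-tagging behavior being checked here — `update_is_public()` keeping the `"shared"` tag in sync with the `is_public` flag, in both directions — reduces to a small pure function. This is an illustrative sketch with a hypothetical helper name, not the service's actual code:

```python
SHARED_TAG = "shared"

def sync_shared_tag(tags: list[str], is_public: bool) -> list[str]:
    """Return tags with the 'shared' tag present iff the workflow is public."""
    # Drop any existing occurrence first so the operation is idempotent.
    tags = [t for t in tags if t != SHARED_TAG]
    if is_public:
        tags.append(SHARED_TAG)
    return tags
```

Making the operation idempotent matters for QA: toggling the checkbox repeatedly must not accumulate duplicate `"shared"` tags or remove unrelated user tags.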
Merge Plan
Migration 28 is additive (new nullable/defaulted columns); safe to deploy without data loss. Existing workflows are assigned `user_id = 'system'` and `is_public = false`.

The `workflow_library` table in real databases has accumulated schema changes from earlier migrations (VIRTUAL generated columns added via `ALTER TABLE`). Migration 28 uses `executescript()` rather than `cursor.execute()` for its DDL to ensure deterministic commit behaviour on such tables regardless of Python version or SQLite version.

Checklist
- "What's New" copy (if doing a release after this PR)